Lead Software Engineer Resume

SUMMARY

  • 14 years’ experience in Information Technology with demonstrated expertise in delivering cost-effective, integrated and optimized business solutions in the Financial Services domain.
  • Areas of Strength - Data Analysis, Data Integration, Database Development and Data Warehousing.
  • Experience in leading and executing projects end to end which includes design, development, configuration, implementation, provision of quality assurance and production support.
  • Experience in building high-performance distributed batch processing applications on the Hadoop platform using Big Data technologies - Apache Hive, Impala, Sqoop, HDFS, Oozie, Apache Spark and Scala. Hands-on experience with Hadoop ecosystem tools and processing frameworks such as Spark and MapReduce.
  • Experience in Scala programming, including developing Spark programs with the Scala API (RDD, DataFrame, Dataset).
  • Experience in building data integration applications using - Informatica PowerCenter/IDQ, Java Spring Batch, Spring Integrations, UNIX Shell Scripting.
  • Experience in building Data Warehouse applications in Oracle. In-depth knowledge of Dimension Modeling.
  • Excellent communicator and liaison between business and technology, with reputation for managing and delivering multiple, large-scale projects.
  • Experience working with Project Managers/Business Partners for Project Planning, Financial Planning, Resource Planning, Estimation, Risk/Issue Management and Status reporting.
  • Experience in Team Management, Delivery Management, Partner Management. Experience onboarding, mentoring, training and managing development teams. Experience working in onshore/offshore delivery model.
  • Experience working in Agile and Waterfall based development methodologies.
  • Experience in building Big Data applications on Hadoop platform using technologies - Apache Hive and Spark.
  • Experience using the Apache Hive query language to transform and aggregate data on Hadoop. Good knowledge of Partitioning and Bucketing.
  • Experience in developing Apache Spark programs using the Scala API (RDD, DataFrame, Dataset) for data processing, as illustrated in the sketch following this summary.
  • Experience working with Impala to create reports for business users.
  • Used Apache Sqoop for data ingestion from RDBMS sources into HDFS. Knowledge of streaming data ingestion technologies like Kafka and Flume.
  • Used Oozie for workflow orchestration.
  • Experience in Java Spring Batch programming using the Spring Framework. Used Spring Batch and Spring Integration for developing batch processing applications. Good knowledge of Object Oriented Programming concepts and Spring Dependency Injection.
  • Experience in working with Relational Database Management Systems. In-depth knowledge of Dimension Modeling.
  • Experience in developing Oracle SQL, Stored Procedures, Functions, Packages, Triggers, and Materialized Views.
  • Experience in Oracle SQL and PL/SQL Performance Tuning. Expert knowledge of Oracle Optimizer, Joins, Indexes, Hints, Partitioning and Parallelism.
  • Experience in developing ETL workflows using Informatica PowerCenter 9.X/8.X and IDQ. Worked extensively with the Informatica client tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor - as well as IDQ.
  • Experience in optimizing Informatica Sessions, Mappings, Transformations, Sources, and Targets. Used pushdown optimization techniques, Informatica session partitioning features to improve performance of data intensive jobs.
  • Experience in UNIX Shell Scripting. Used extensively for job automation and scheduling. In-depth knowledge of secure file transmission protocols like SFTP, SCP and SSH.
  • Experience using Tableau Data Visualization tool.
  • Experience in working with the ControlM Enterprise Manager tool for job scheduling and monitoring. Used extensively for designing and developing jobs on distributed platforms.
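
For illustration, the following is a minimal Spark/Scala sketch of the kind of partitioned aggregation described in the summary above, written against the Spark 1.6-era API listed in the project environments below. The table and column names (txn_raw, txn_agg, account_id, trade_dt, amount) are hypothetical.

    // Minimal sketch (Spark 1.6-era API); table and column names are hypothetical.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext
    import org.apache.spark.sql.functions.sum

    object TxnAggregate {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("TxnAggregate"))
        val hiveContext = new HiveContext(sc)

        // DataFrame API: total transaction amount per account per trade date.
        val daily = hiveContext.table("txn_raw")
          .groupBy("trade_dt", "account_id")
          .agg(sum("amount").as("total_amount"))

        // Persist as a Hive table partitioned by trade date, so downstream
        // HQL and Impala queries benefit from partition pruning.
        daily.write.partitionBy("trade_dt").saveAsTable("txn_agg")

        sc.stop()
      }
    }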

TECHNICAL SKILLS

Database: Oracle 11g (Exadata)/10g/9i, Oracle SQL, PL/SQL

Programming: Java, Spring Framework, Scala

Big Data: Hadoop, MapReduce, Hive, Sqoop, Oozie, Spark, Impala

Job Scheduling Tools: BMC ControlM

Source Code Control: IBM RTC

Server-Side Scripting: UNIX Shell Scripting

Database Tools: TOAD, SQL Developer, PL/SQL Developer

ETL Tools: Informatica PowerCenter 9.1, Informatica IDQ

Data Visualization Tools: Tableau

Development Tools: Spring Tool Suite, Eclipse

Development Methodologies: Waterfall, Agile

PROFESSIONAL EXPERIENCE

Confidential

Lead Software Engineer

Responsibilities:

  • Involved in customer business requirement analysis. Analyzed customer sales interaction, customer asset, sales volume and market metrics data.
  • Translated the requirements into a feasible technical solution.
  • Developed Sqoop scripts to extract customer sales interaction, customer assets, sales volume and market metrics data from various Oracle data sources and ingest into Hadoop cluster.
  • Developed User Defined Functions in Hive for processing transaction data (a sketch follows this list).
  • Developed HQL queries to perform aggregations of the ingested data.
  • Developed queries in Impala for end user reporting.
  • Developed Oozie workflows for orchestrating data ingestion using Sqoop and data processing using Hive.
  • Developed ControlM jobs to schedule Big Data jobs on the Hadoop platform.
  • Worked closely with the QA team and Business Users during SIT and UAT phases to resolve defects and performance issues and to perform data validation.
  • Created and published install and build deployment plans. Worked closely with the release management team during release weekends. Performed code and data validations after install.
  • Involved in post-production support to address any issues arising out of the production install.
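
For illustration, a minimal sketch of the kind of Hive UDF used here. The implementation language is not stated in the original, so Scala is assumed, and the class name and normalization logic are hypothetical; in practice the compiled class is packaged in a JAR and registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before use in HQL.

    // Hypothetical Hive UDF: normalize free-form transaction type codes so
    // that downstream HQL aggregations can group on a consistent key.
    import org.apache.hadoop.hive.ql.exec.UDF
    import org.apache.hadoop.io.Text

    class NormalizeTxnType extends UDF {
      def evaluate(input: Text): Text =
        if (input == null) null
        else new Text(input.toString.trim.toUpperCase)
    }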

Environment: RedHat Linux, Cloudera Hadoop Distribution 5.3, Hive, Impala, Hue, Sqoop, Oozie, Oracle 11g, Spring Tool Suite, Maven, ControlM Enterprise Manager, IBM RTC.

Confidential

Big Data Developer

Responsibilities:

  • Involved in customer requirement analysis. Performed analysis of web click data generated by Confidential customer-facing applications.
  • Developed jobs to collect and ingest web click data feeds from web servers into the Hadoop cluster using NFS gateways.
  • Developed programs in Apache Spark using the Scala API to aggregate and transform web click data and filter out bot traffic (see the sketch after this list).
  • Developed Spark programs to integrate web click data sourced from web servers with customer transaction data from the RDBMS system.
  • Developed Sqoop scripts to extract customer profile and transaction data from Oracle into the Hadoop cluster.
  • Developed ControlM jobs to schedule Spark jobs on the Cloudera platform.
  • Worked closely with the QA team and Business Users during SIT and UAT phases to resolve defects and performance issues and to perform data validation.
  • Created and published install and build deployment plans. Worked closely with the release management team during release weekends. Performed code and data validations after install.
  • Involved in post-production support to address any issues arising out of the production install.
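
For illustration, a minimal Scala sketch of the bot filtering and click/transaction integration described above, written against Spark 1.6 / Scala 2.10 as listed in the environment. The HDFS paths, column names and user-agent heuristic are assumptions.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object ClickstreamJoin {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ClickstreamJoin"))
        val sqlContext = new SQLContext(sc)

        // Web click feeds landed on HDFS via the NFS gateway (path assumed).
        val clicks = sqlContext.read.json("/data/clicks/current")

        // Drop obvious bot traffic with a simple user-agent heuristic.
        val human = clicks.filter(!clicks("user_agent").rlike("(?i)bot|crawler|spider"))

        // Customer transaction data previously ingested from Oracle via Sqoop.
        val txns = sqlContext.read.parquet("/data/customer_txn")

        // Join clickstream with transactions on the customer identifier.
        human.join(txns, "customer_id").write.parquet("/data/click_txn_joined")

        sc.stop()
      }
    }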

Environment: RedHat Linux, Cloudera Hadoop Distribution 5.3, Apache Spark 1.6.2, Scala 2.10, Impala, Sqoop, HUE, Oozie, Oracle 11g, Spring Tool Suite, Maven, ControlM Enterprise Manager, IBM RTC.

Confidential

Technical Lead

Responsibilities:

  • Worked with Business users to gather requirements. Worked closely with the Project Management Team on size, timeline and operational cost estimation.
  • Performed Data Analysis and collaborated with the data engineering team to create Data Models in Oracle.
  • Developed data flow reference diagrams and technical design documents for project development.
  • Developed technical solutions using Java Spring batch and Spring Integration, Oracle SQL and PL/SQL to process intraday fund pricing and performance data coming from vendor source systems.
  • Developed Informatica workflows to extract Fund Pricing data from OLTP database and load into Data Warehouse.
  • Performance-tuned long-running Oracle SQL/PL/SQL and Informatica workflow jobs in production.
  • Developed ControlM jobs for scheduling and running Spring Batch jobs on the Linux platform.
  • Worked closely with the QA team and Business Users during SIT and UAT phases to resolve defects and performance issues.
  • Created and published install and build deployment plans. Worked closely with the release management team during release weekends. Performed code and data validations after install.
  • Involved in post-production support to address any issues arising out of the production install.

Environment: RedHat Linux, Java Spring Framework 4.0, Java 1.8, Oracle 11g, SQL Developer, Spring Tool Suite, IBM RTC, Maven, ControlM Enterprise Manager.

Confidential

Technical Lead

Responsibilities:

  • Worked with Business users to gather requirements. Worked closely with the Project Management Team on size, timeline and operational cost estimation.
  • Performed Data Analysis and collaborated with the data engineering team to create Data Models in Oracle.
  • Developed data flow reference diagrams and technical design documents for project development.
  • Developed Informatica IDQ workflows to extract customer leads data from source files, perform address standardization using the IDQ Address Doctor, and load the results into Oracle.
  • Developed ControlM jobs for scheduling and running Informatica IDQ jobs on the Linux platform.
  • Worked closely with the QA team and Business Users during SIT and UAT phases to resolve defects and performance issues.
  • Created and published install and build deployment plans. Worked closely with the release management team during release weekends. Performed code and data validations after install.
  • Involved in post-production support to address any issues arising out of the production install.

Environment: RedHat Linux, Oracle 11g, SQL Developer, IBM RTC, Maven, Informatica PowerCenter/IDQ 9.1, ControlM Enterprise Manager.

Confidential

Technical Lead

Responsibilities:

  • Performed Data Analysis of Confidential financial and non-financial feeds containing Accounts, Trades and Client Contact information.
  • Worked with business users to gather business requirements and developed a feasible technical solution to process the incoming feeds.
  • Collaborated with data engineering team to create Data Models in OLTP and OLAP systems.
  • Developed data flow reference diagrams and technical design documents for project development.
  • Developed technical solutions using Java Spring batch and Spring Integration, Oracle SQL and PL/SQL to extract transaction data coming in financial feeds and store in OLTP system.
  • Developed Informatica workflows to extract transaction data from OLTP database and load into dimensions and facts on Data Warehouse.
  • Involved in performance tuning of long-running Oracle PL/SQL and Informatica workflow jobs in production.
  • Developed ControlM jobs for scheduling and running Spring Batch jobs on the Linux platform.
  • Worked closely with the QA team and Business Users during SIT and UAT phases to resolve defects and performance issues.
  • Created and published install and build deployment plans. Worked closely with the release management team during release weekends. Performed code and data validations after install.
  • Involved in post-production support to address any issues arising out of the production install.

Environment: RedHat Linux, Java Spring Framework 3.0, Java 1.7, Oracle 11g, SQL Developer, Spring Tool Suite, IBM RTC, Maven, ControlM Enterprise Manager.

Confidential

Associate Consultant

Responsibilities:

  • Involved in development and enhancement of the Oracle Financial Services banking automation product - FLEXCUBE. FLEXCUBE is a state-of-the-art graphical user interface (GUI) based banking automation system facilitating automation of a wide range of banking activities.
  • Involved in gathering and documenting customer requirements.
  • Analyzed customer requirements and developed low-level technical design documents to translate customer requirements into feasible technical solutions.
  • Developed workflow diagrams to describe the process flow and system functionality.
  • Reviewed technical design documents and workflow diagrams with product architects and project managers to determine the implementation timelines.
  • Developed programs/modules as per the technical design document within the planned schedule.
  • Adhered to product development guidelines and processes to develop programs/modules meeting the quality expectations of customers.
  • Performed peer-to-peer code reviews during the development phase and provided feedback on issues and concerns.
  • Created and maintained quality metrics for project defects, effort variance and task estimates.
  • Adhered to the defined Change Control Process of Oracle/customer. Complied with Oracle/customer audit and compliance requirements.
  • Developed test strategies to perform end-to-end unit testing of individual programs and modules. Performed test data setup in the FLEXCUBE backend Oracle database for unit testing.
  • Worked closely with the Quality Assurance team during system integration testing rounds to address any bugs or issues in the programs.
  • Supported the User Acceptance Testing phase of FLEXCUBE modules and addressed any bugs in FLEXCUBE modules.
  • Documented system functionalities and maintained the documents in the company’s knowledge management portals for reference and training purposes.

Confidential

Software Engineer

Responsibilities:

  • Developed and enhanced Oracle Stored Procedures, Functions, Packages, Sequences and Triggers.
  • Developed and enhanced Java applications using JSP, Servlets and HTML in MVC architecture.
  • Developed batch jobs in UNIX Shell for bulk data processing in the UNIX environment.
  • Documented and maintained system functionality. Created and executed unit test cases.
  • Supported integrated testing. Worked closely with end users to resolve functional defects.
