
Lead Developer Resume


SUMMARY:

  • 10+ years of experience in the analysis, design, development, and testing of data integration logic using ETL, cloud, and visualization tools: Informatica PowerCenter, AWS services, Tableau, and IBM Cognos. Core specialties are Business Intelligence and Analytics, data warehousing, data integration, and data migration from on-prem to cloud, with strong experience in process optimization and performance tuning using Oracle, Informatica, Unix, and AWS services.
  • Currently pursuing a master's degree in Information Technology; excellent written and verbal communication skills, a diverse technical background, strong analytical skills, and a proven ability to quickly learn and adapt to new technologies.
  • Strong knowledge of the financial services (mortgage-backed securities and loan accounts) and insurance domains.
  • Experience in developing and performance tuning Informatica mappings to extract, transform, and load data to and from various sources such as XML, flat files, CSV, Oracle, and Netezza.
  • Experience in implementing ELT logic using AWS services such as EMR, EC2, Lambda, Step Functions, Hive, Glue Catalog, Spectrum, Athena, S3, SNS, SQS, Spark SQL, CloudWatch (Events, Logs, and Metrics), and CloudFormation templates based on the functional requirements (a representative orchestration sketch follows this list).
  • Strong experience in analyzing and designing data migration strategies to load data from an on-prem data warehouse (Netezza) to a cloud data warehouse (Redshift).
  • Migrated Informatica Workflows, Sessions, Mapplets, Mappings and Autosys JIL scripts from On-prem to AWS for scheduled execution.
  • Strong experience in developing and performance tuning SQL queries, PL/SQL procedures and functions, AutoSys JILs, and UNIX shell scripts, and in Erwin data modeling.
  • Experience in dashboard development and performance tuning in Tableau versions 9.0, 9.2, 10.3, and 10.5.
  • Experience in IBM Cognos Framework Manager, Report Studio and Workspace Advanced.
  • Good experience with TDM/automation testing tools such as Cucumber and Data Testing Framework.
  • Automated job execution using UCD, Jenkins pipelines, Autosys, and Unix shell scripts by identifying and scheduling job dependencies.
  • Collaborated with technical and business end-users to understand, document, and bridge business requirements to technical requirements; designed and implemented solutions to ensure that the product fulfills the given requirements.
  • Provided extensive support for Production Support, System Integration Testing and User Acceptance Testing activities.
  • AWS Certified Developer - Associate (DVA).
  • Member of Upsilon Pi Epsilon (UPE) UMGC Chapter.
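
The ELT pattern described above is typically event-driven. Below is a minimal, illustrative sketch of one such flow: an S3 object-created event triggers a Lambda function that starts a Step Functions state machine and notifies subscribers through SNS. The environment variable names and payload shape are hypothetical placeholders, not details taken from the projects on this resume.

```python
import json
import os
import urllib.parse

import boto3

sfn = boto3.client("stepfunctions")
sns = boto3.client("sns")

# Hypothetical configuration; real ARNs would come from the CloudFormation stack.
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]
NOTIFY_TOPIC_ARN = os.environ["NOTIFY_TOPIC_ARN"]

def handler(event, context):
    """Lambda entry point, invoked by an S3 ObjectCreated notification."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

    # Start the Step Functions state machine that runs the downstream ELT steps.
    execution = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"bucket": bucket, "key": key}),
    )

    # Tell subscribed consumers that a load has started for this file.
    sns.publish(
        TopicArn=NOTIFY_TOPIC_ARN,
        Message=f"ELT started for s3://{bucket}/{key}",
    )
    return {"executionArn": execution["executionArn"]}
```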

TECHNICAL SKILLS:

Cloud Services: AWS EMR, EC2, Lambda, Step Functions, Hive, Glue Catalog, Spectrum, Athena, S3, SNS, SQS, Spark SQL, CloudWatch (Events, Logs, and Metrics), and CloudFormation templates

ETL Technologies: Informatica PowerCenter 10.x/9.x, Informatica Data Transformation Studio

Database: Oracle 11g/12c/19c, MS SQL Server 2005/2008/2012, and PostgreSQL

Data Warehouse: IBM Netezza and AWS Redshift

Visualization & Reporting: Tableau 9.0, 9.2, 10.3, and 10.5; IBM Cognos 9.x, 10.x; Microsoft SSRS 2008/2012

Scripting: Unix shell scripting, Python, and Ruby

Scheduling: Autosys, Informatica Scheduler, and AWS CloudWatch

Code Repository: Git, Bitbucket, and SVN

Other Applications: Jira, Confluence, and Rally

PROFESSIONAL EXPERIENCE:

Confidential

Lead Developer

Responsibilities:

  • Develop ETL programs using Informatica PowerCenter, PL/SQL, and Unix scripts; schedule the workflows using Autosys; load data into the Oracle database and the Netezza data warehouse; and vend the data to downstream consumers to facilitate advanced analytical reporting.
  • Identify and implement the right AWS services, such as EMR, EC2, Secrets Manager, CloudWatch, S3, SNS, SQS, Lambda, and Step Functions, for the different functional components, and migrate/re-host existing on-prem infrastructure to the cloud.
  • Design CloudFormation templates and update configuration files.
  • Migrate Informatica workflows, sessions, mapplets, mappings, and Autosys JIL scripts from on-prem to the cloud for scheduled execution.
  • Design and develop solutions to migrate on-prem data to the cloud using Python.
  • Verify data migrated between Oracle 19c (on-prem) and Amazon RDS for Oracle (on-cloud) using a custom-built DTF tool (a simplified reconciliation sketch follows this list).
  • Automate job execution using Jenkins pipelines and Unix shell scripts by identifying and scheduling job dependencies, transform and move high-volume data files using advanced UNIX utilities, and triage data anomalies in higher environments.
  • Use the AWS services available with Python and Spark to provide on-cloud data warehouse solutions in Redshift and PostgreSQL.
  • Develop an automated test suite in Python to ensure the programs conform to the business requirements and industry standards.
  • Maintain the code in Bitbucket, the team's version control system.
  • Provide extensive support for Production Support, System Integration Testing, and User Acceptance Testing activities.
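
For illustration, the source-to-target verification mentioned above can be reduced to a row-count reconciliation. The sketch below assumes the python-oracledb driver and uses hypothetical DSNs and table names; the actual verification was done with a custom-built DTF tool, which is not reproduced here.

```python
import os

import oracledb  # python-oracledb; an assumption, the actual DTF tool is proprietary

# Hypothetical DSNs and table names -- illustrative placeholders only.
SOURCE_DSN = "onprem-db-host:1521/ORCLPDB1"                       # Oracle 19c on-prem
TARGET_DSN = "example.abc.us-east-1.rds.amazonaws.com:1521/ORCL"  # RDS for Oracle
TABLES = ["LOAN_ACCOUNTS", "MBS_POSITIONS"]

def row_count(dsn: str, table: str) -> int:
    """Return COUNT(*) for a table; credentials come from the environment."""
    with oracledb.connect(
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        dsn=dsn,
    ) as conn:
        with conn.cursor() as cur:
            # Table names are trusted constants here, not user input.
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            return cur.fetchone()[0]

def reconcile() -> None:
    """Compare row counts table by table and flag mismatches."""
    for table in TABLES:
        src, tgt = row_count(SOURCE_DSN, table), row_count(TARGET_DSN, table)
        print(f"{table}: source={src} target={tgt} {'OK' if src == tgt else 'MISMATCH'}")

if __name__ == "__main__":
    reconcile()
```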

Confidential

Team Member

Responsibilities:

  • Develop ETL programs in Informatica PowerCenter to load data from flat files to the database for the client's advanced analytical reporting needs.
  • Develop test suites, test cases, and test scripts using SQL/PL-SQL components to extensively test the extract, transform, and load logic and validate that the programs are fully in accordance with the business functional requirements (an illustrative test sketch follows this list).
  • Create ETL mappings in Informatica PowerCenter to convert the data into the format required for the reports and dashboards.
  • Load data from flat files to the landing tables for data quality checks using HexaRule, and profile the data using Profiler, checking for potential codes and constants and performing length, pattern, and value profiling.
  • Gather and document the requirements for the creation of the Cognos reports and dashboards.
  • Coordinate with the ETL and Tableau teams on the creation of the target tables needed for the reports and dashboards.
  • Create data quality reports and dashboards in Tableau to provide insight into the issues.
  • Create support documents for the Tableau dashboards and reports.
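
As a hedged illustration of the SQL-driven ETL validation described above, the pytest sketch below runs basic completeness checks against a landing table. The table, columns, and connection settings are hypothetical; the real test scripts were SQL/PL-SQL components tied to the business functional requirements.

```python
import os

import oracledb
import pytest

# Hypothetical landing table and mandatory columns -- illustrative only.
LANDING_TABLE = "STG_POLICY"
NOT_NULL_COLUMNS = ["POLICY_ID", "EFFECTIVE_DATE"]

@pytest.fixture(scope="module")
def cursor():
    # Credentials are read from the environment; the variable names are assumptions.
    conn = oracledb.connect(
        user=os.environ["ETL_TEST_USER"],
        password=os.environ["ETL_TEST_PASSWORD"],
        dsn=os.environ["ETL_TEST_DSN"],
    )
    with conn.cursor() as cur:
        yield cur
    conn.close()

def test_landing_table_not_empty(cursor):
    # The load should have staged at least one row.
    cursor.execute(f"SELECT COUNT(*) FROM {LANDING_TABLE}")
    assert cursor.fetchone()[0] > 0

@pytest.mark.parametrize("column", NOT_NULL_COLUMNS)
def test_mandatory_columns_have_no_nulls(cursor, column):
    # Mandatory business keys must be fully populated after the load.
    cursor.execute(f"SELECT COUNT(*) FROM {LANDING_TABLE} WHERE {column} IS NULL")
    assert cursor.fetchone()[0] == 0
```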

Confidential

Team Member

Responsibilities:

  • Understand the client's data and create rules in HexaRule to check the quality of the data.
  • Profile the data using Profiler, checking for potential codes and constants and performing length, pattern, and value profiling.
  • Develop ETL mappings in Informatica PowerCenter to load data from flat files to the database for the client's advanced analytical reporting needs.
  • Develop test suites, test cases, and test scripts using SQL/PL-SQL components to extensively test the extract, transform, and load logic and validate that the programs are fully in accordance with the business functional requirements.
  • Create ETL mappings in Informatica PowerCenter to convert the data into the format required for the reports and dashboards.
  • Load data from flat files to the landing tables for data quality checks using HexaRule, and profile the data using Profiler, checking for potential codes and constants and performing length, pattern, and value profiling.
  • Gather and document the requirements for the creation of the Cognos reports and dashboards.
  • Coordinate with the ETL and Tableau teams on the creation of the target tables needed for the reports and dashboards.
  • Create data quality reports and dashboards in Tableau to provide insight into the issues.
  • Create support documents for the Tableau dashboards and reports.
  • Load data from flat files to the landing tables for data quality checks using Automaton.

Confidential

Team Member

Responsibilities:

  • Load data from flat files to the landing tables for data quality checks using HexaRule, and profile the data using Profiler, checking for potential codes and constants and performing length, pattern, and value profiling.
  • Gather and document the requirements for the creation of the Cognos reports and dashboards.
  • Coordinate with the ETL and Cognos teams on the creation of the target tables needed for the reports and dashboards.
  • Create data quality reports and dashboards to provide insight into the issues.
  • Create a Framework Manager model in Cognos 10.2.2 for the Impala database.
  • Create reports and dashboards in the Cognos 10.2.2 environment per the requirements.

Confidential

Team Member

Responsibilities:

  • Analyze the complexity of the reports using Hexaware's IP tool BIMA to estimate effort.
  • Load data from flat files to the landing tables for data quality checks using HexaRule, and profile the data using Profiler, checking for potential codes and constants and performing length, pattern, and value profiling.
  • Gather and document the requirements for the creation of the Cognos reports and dashboards.
  • Coordinate with the ETL and Cognos teams on the creation of the target tables needed for the reports and dashboards.
  • Create data quality reports and dashboards to provide insight into the issues.
  • Understand the reports created in Hyperion Interactive Reporting (BRIO) 8.5.
  • Migrate the simple and medium-complexity reports using Hexaware's IPs SLA and RLA.
  • Create reports in the Cognos 10.2.2 environment per the requirements.
