
Sr. ETL Developer Resume


Princeton, NJ

SUMMARY

  • Around 6 years of experience in Information Technology. Experience in Software Development Life Cycle (SDLC) methodologies like Waterfall and Agile.
  • Experienced in using ETL tools including Informatica Power Center 10.x/9.x/8.x, Power Mart, Power Exchange, Oracle Exadata, Data Marts, Workflow Manager, Repository Manager, Administration Console and Informatica Data Quality (IDQ).
  • Proven experience in the design, review and implementation of highly scalable and reliable applications for large volumes of distributed data using Big Data technologies like Apache Spark, Spark SQL, Hadoop, Scala, etc.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Used various Informatica Power Center and Data Quality transformations such as Aggregator, Source Qualifier, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, XML Parser, Labeler, Parser, Address Validator, Match, Merge, Comparison and Standardizer to perform various data loading and cleansing activities.
  • Worked on various applications using Python-integrated IDEs such as Eclipse.
  • Extensively used the Informatica data masking transformation to mask NPI data (SSN, birth date, account number, etc.) in Dev and QA environments.
  • Strong experience with Informatica tools using real-time Change Data Capture (CDC) and MD5-based change detection (a sketch of the MD5 pattern follows this list).
  • Experienced in using advanced Informatica concepts like Pushdown Optimization (PDO).
  • Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD).
  • Created mappings using different IDQ transformations like Parser, Standardizer, Match, Labeler and Address Validator.
  • Working experience on different source systems for the planned data marts, such as the Sales Data Mart and Marketing Data Mart.
  • Experienced in working with Hadoop/Big Data storage and analytical frameworks over the Amazon AWS cloud using tools like SSH and PuTTY.
  • Used the Address Validator transformation to validate source data against reference data and standardize addresses.
  • Experience in data manipulation, ETL processes, and statistical analysis using SAS.
  • Experienced in Teradata SQL Programming.
  • Worked with Teradata utilities like FastLoad, MultiLoad, TPump and Teradata Parallel Transporter.
  • Experience in using transformations and creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows and processing tasks using Informatica Designer / Workflow Manager.
  • Experienced in scheduling Informatica jobs using scheduling tools like Tidal, Autosys and Control-M.
  • More than one year of experience in Java, J2EE, Web Services, SOAP, HTML and XML related technologies, demonstrating strong analytical and problem-solving skills, computer proficiency and the ability to follow through with projects from inception to completion.
  • Extensive experience in Netezza database design and workload management.
  • Experienced in writing SQL, PL/SQL programming, Stored Procedures, Functions, Triggers, Views and Materialized Views.
  • Helped the offshore team with Object-Oriented implementation.
  • Translated research questions into statistical models for estimation and testing.
  • Good command of databases such as Oracle, Teradata 13, SQL Server 2008, Netezza and MS Access 2003.
  • Data Modeling: knowledge of dimensional data modeling, Star Schema, Snowflake Schema, and FACT and Dimension tables.
  • Experienced in writing and tuning complex SQL queries, Triggers and Stored Procedures in SQL Server, Oracle Exadata and Teradata.
  • Proficient in applying MVC/object-oriented techniques, principles, frameworks, and event-driven analysis and design in both client/server and N-tier distributed environments.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Maintained and monitored various production servers with Oracle 11g R1, 11g R2, 10g, and Exadata servers.
  • Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Worked with the development team to design, implement and test a web-based timesheet application used in the organization, developed in object-oriented Java/J2EE.
  • Experience in using Golden Gate to support data replication in the Oracle database environment.
  • Worked with other developers to meet company deadlines, using Java, Ruby on Rails, MySQL, MongoDB, Redis, jQuery, Ajax, Git and Agile practices.
  • Deftly executed multi-resource projects following the Onsite-Offshore model while serving as a mentor for junior team members.
  • Excellent communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.
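
A minimal Python sketch of the MD5 change-detection pattern referenced above. The key and column names are hypothetical; the idea is to hash the tracked columns of each incoming row and compare against the hash persisted with the previous load, so unchanged rows can be skipped.

    import hashlib

    def row_md5(row, columns):
        """Join the tracked column values with a delimiter and hash them."""
        raw = "|".join(str(row.get(col, "")) for col in columns)
        return hashlib.md5(raw.encode("utf-8")).hexdigest()

    def detect_changes(incoming_rows, stored_hashes, key_col, tracked_cols):
        """Split incoming rows into inserts and updates by comparing hashes.

        stored_hashes maps a business key to the MD5 value persisted on the
        target row during the previous load (names here are illustrative).
        """
        inserts, updates = [], []
        for row in incoming_rows:
            new_hash = row_md5(row, tracked_cols)
            old_hash = stored_hashes.get(row[key_col])
            if old_hash is None:
                inserts.append(row)      # brand-new key
            elif old_hash != new_hash:
                updates.append(row)      # tracked columns changed
            # equal hashes: unchanged row, nothing to load
        return inserts, updates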

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.x/8.x, Power Exchange 9.1/8.1, Informatica Developer 9.5

Languages: SQL, PL/SQL, C, C++, XML, HTML, Visual Basic 6.0

Operating Systems: UNIX, Windows XP/2007, Sun Solaris

Tools: PL/SQL, SQL, Developer 2000, Oracle Developer Suite, SQL Plus, Toad 8.x/9.x/10.x, SQL*Loader, MultiLoad, Teradata 12/13, SQL Assistant

Databases: Oracle 11g/10g, SQL Server 2000, NoSQL, Teradata 14/12, DB2

Job Scheduling: Autosys, TWS (Tivoli Workload Scheduler)

BI Tools: Cognos 9, QlikView 8.5, Business Objects XI R3

PROFESSIONAL EXPERIENCE

Confidential, Milwaukee, WI

Sr ETL Developer

Responsibilities:

  • Design & Development of ETL mappings using Informatica Power Center 10.1.
  • Developed complex ETL code to formulate business rules for Member and Provider data.
  • Developed and maintained ETL (Extract, Transform and Load) mappings to extract data from the source Netezza database and load it into Oracle tables.
  • Developed Workflows and sessions associated with the mappings using Workflow Manager.
  • Worked with the Netezza database on the Windows platform and contributed to building the customized ELT framework using shell scripting.
  • Maintained an Oracle Exadata X2-2 Quarter Rack.
  • Worked on transformations like Transaction Control, Lookup, Router, Update Strategy and Sequence Generator.
  • Utilized Big Data technologies to produce technical designs, and prepared architectures and blueprints for Big Data implementation.
  • Responsible for building scalable distributed data solutions using Big Data technologies like Apache Hadoop, Spark, Drools, Scala and Java.
  • Worked with statistical analysts on NCCI/ISO requirements and on how the client wanted them delivered.
  • Championed the introduction of a new strategic data warehouse and technology stack, with emphasis on conceptual enterprise data modeling, dimensional modeling, data warehousing architecture best practices and the introduction of Data Integration/ETL tools. Interfaced extensively with the Digital, Customer Experience, Sales, Servicing, Analytics and Financial Reporting areas to gather data requirements; the environment was Vertica, Python/Alteryx and Tableau.
  • Used statistical predictive and prescriptive methods for research and applied e-commerce best practices for online sales to new and existing customers.
  • Created Stored Procedures to move data from Source to Work tables and from Work tables to the Data Mart.
  • Subject matter expert on methodologies for implementing statistical analysis solutions.
  • Performed Teradata stored procedure/view/BTEQ development and participated in code review meetings.
  • Involved in Implementation of SCD1 and SCD2 data load strategies.
  • Incrementally built out the data to support evolving data needs by adopting the Agile methodology.
  • Developed a Python application to request data from different vendor APIs and process the JSON responses (see the first sketch after this list).
  • Wrote data validation scripts using complex SQL and Fluid Query against Netezza.
  • Worked with various file formats such as delimited text files, clickstream log files, Apache log files, Avro files, JSON files, and XML files with XSDs.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote some Java code to capture global map variables and use them in the job.
  • Developed SQL scripts using NZSQL for Netezza and procedures for the business rules using Unix shell.
  • Developed a Python application for automating report generation using Flask, SQLAlchemy and MS SQL Server (see the second sketch after this list).
  • Designed and developed several SQL Server Stored Procedures, Triggers and Views.
  • Used analytical and Windowing functions of Netezza to implement complex business logic.
  • Replicated and extracted data and applied it to production using Golden Gate.
  • Reviewed all development queries and performed optimization and query performance tuning using various techniques for the Netezza database.
  • Involved in change request design and implementation, and the related testing, migration and documentation process. Translated the business process specification into Informatica mapping designs for building the data mart.
  • Involved in testing, identification and documentation of bugs, and applying fixes to applications.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Migrated the code into QA (Testing) and supported the QA team and UAT (User Acceptance Testing).
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Conducted code reviews on the code developed by my teammates before moving it into QA.
  • Involved in developing test automation scripts using Perl/Python.
  • Actively participated in Scrum meetings, review meetings and developed test scenarios.
  • Created a team-specific Agile process flow in JIRA to move tasks from one activity to the next.
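
A brief, hedged sketch of the vendor-API pattern from the bullets above: request a page of data and keep only the fields the downstream load needs. The endpoint, parameters and field names are assumptions for illustration, not the actual vendor contract.

    import json
    import urllib.request

    def fetch_vendor_records(base_url, api_key, page=1):
        """Request one page of records from a hypothetical vendor REST API."""
        url = f"{base_url}/records?page={page}&api_key={api_key}"
        with urllib.request.urlopen(url, timeout=30) as resp:
            payload = json.load(resp)  # parse the JSON response body
        # keep only the fields the downstream load expects (names are illustrative)
        return [
            {"id": rec["id"], "name": rec.get("name", "").strip()}
            for rec in payload.get("records", [])
        ]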
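A second sketch, for the Flask/SQLAlchemy report-automation bullet: a single endpoint that queries a (hypothetical) ETL audit table and returns the result as JSON. The connection string, table and columns are illustrative stand-ins, not the project's actual schema.

    from flask import Flask, jsonify
    from sqlalchemy import create_engine, text

    app = Flask(__name__)
    # illustrative connection string; a real one would target the MS SQL Server
    engine = create_engine("mssql+pyodbc://user:password@reporting_dsn")

    @app.route("/reports/daily-load-counts")
    def daily_load_counts():
        """Serve per-table row counts loaded today from a hypothetical audit table."""
        sql = text(
            "SELECT table_name, rows_loaded "
            "FROM etl_audit WHERE load_date = CAST(GETDATE() AS DATE)"
        )
        with engine.connect() as conn:
            rows = conn.execute(sql).fetchall()
        return jsonify([{"table": r[0], "rows": r[1]} for r in rows])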

Environment: Informatica Power Center 10.1, SQL, Netezza, Ruby, Teradata, NoSQL, Golden Gate, Python, Oracle 11g, Tidal, Unix, JIRA and SVN.

Confidential, Princeton, NJ

Sr. ETL Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Designed table structure in Netezza.
  • Designed various mappings and Mapplets using different transformation techniques such as Key Generator, Match, Labeler, Case Converter, Standardizer and Address Validator.
  • Developed new mapping designs using various tools in Informatica like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Exadata Database Machine administration.
  • Implementation of Exadata features like Smart Scan, Smart Flash Cache, HCC, etc.
  • Used proprietary software to produce statistical output, including descriptive and summary statistics.
  • Developed the mappings using transformations in Informatica according to technical specifications.
  • Created complex mappings that involved implementation of Business Logic to load data in to staging area.
  • Wrote Python scripts to parse XML documents and load the data into the database (see the sketch after this list).
  • Implement Data Quality Rules using Informatica Data Quality (IDQ) to check correctness of the source files and perform the data cleansing/enrichment.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and score cards for the users using IDQ.
  • Built several reusable components on IDQ using Parsers, Standardizers and Reference Tables.
  • Developed merge jobs in Python to extract and load data into a MySQL database, using a test-driven approach for developing applications.
  • Used organic growth, moving averages and other statistical methods to compare current results against last year's.
  • Used Informatica reusability at various levels of development.
  • Involved in Database migrations from legacy systems, SQL server to Oracle and Netezza.
  • Developed mappings/sessions using Informatica Power Center 9.6.1 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Created mappings and Mapplets according to business requirements using the Informatica Big Data edition, deployed them as applications and exported them to Power Center for scheduling.
  • Created an NZLOAD process to load data into the Data Mart in Netezza.
  • Work directly with Business and Product engineering teams to devise design solutions for enhancements and new ETL development under different DataMart applications.
  • Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Built reports according to user requirements.
  • Experienced in loading data between Netezza tables using NZSQL utility.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Developed applications as proof of concept using Ruby on Rails, JavaScript, HTML, CSS, SQL, Node, Backbone, etc.
  • Developed integration services using SOA, Web Services, SOAP and WSDL.
  • Handled and resolved Golden Gate unique constraint collisions.
  • Wrote UNIX shell scripts to load data from flat files into the Netezza database.
  • Scheduling Informatica jobs and implementing dependencies if necessary using Autosys.
  • Performed performance tuning at the source, target, mapping and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
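
A small, hedged sketch of the XML-to-database loading pattern from the Python bullet above. The element and column names are illustrative, and sqlite3 stands in for the actual target database.

    import sqlite3
    import xml.etree.ElementTree as ET

    def load_members(xml_path, db_path):
        """Parse <member> elements from an XML document and insert them into a table."""
        tree = ET.parse(xml_path)
        rows = [
            (m.findtext("id"), m.findtext("name"), m.findtext("city"))
            for m in tree.getroot().iter("member")
        ]
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS member (id TEXT, name TEXT, city TEXT)"
            )
            conn.executemany("INSERT INTO member VALUES (?, ?, ?)", rows)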

Environment: Informatica Power Center 9.6.1, IDQ 9.6.1, Oracle 11g/10g, Teradata V2R5, Vertica, TOAD for Oracle, SQL Server 2008, PL/SQL, DB2, Netezza, Golden Gate, Agile, Ruby, Python, SQL, NoSQL, Erwin 4.5, Business Objects, Unix Shell Scripting (PERL), UNIX (AIX), Windows XP and Autosys.

Confidential, Melbourne, FL

ETL Developer

Responsibilities:

  • Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
  • Translate requirements and high-level design into detailed functional design specifications.
  • Extracted data from various centers with the data in different systems like Oracle Database and SQL Server and loaded the data into Teradata staging using Informatica Power Center 9.5.
  • Involved in Migration from Informatica Power Center 9.1 to Informatica Power Center 9.5
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension.
  • Supported other projects, including database migrations to Exadata.
  • Developed complex Informatica mappings to implement a change data capture mechanism using SCD Type-2 effective date and time logic.
  • Migrated databases from non-Exadata to Exadata and AWS platforms.
  • Created critical reusable transformations, mapplets and worklets wherever necessary.
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Wrote complex SQL Queries involving multiple tables with joins and also generated queries to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Implemented restart strategy and error-handling techniques to recover failed sessions (see the sketch after this list).
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Used ER Studio to analyze and optimize database and data warehouse structure.
  • Used Autosys scheduler to schedule and run the Informatica workflows on a daily/weekly/monthly basis.
  • Developed Oracle PL/SQL Package, procedure, function and trigger.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Assisted BI developers in fetching reports using QlikView 8.x.
  • Reviewed the defects raised by the UAT team and followed up on critical defects with the team to ensure they were fixed.
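
A generic, hedged sketch of the restart strategy referenced above: retry a failed ETL task a bounded number of times with a pause between attempts, then surface the failure to the scheduler. The function names and limits are illustrative, not the project's actual implementation.

    import logging
    import time

    log = logging.getLogger("etl")

    def run_with_restart(task, max_attempts=3, wait_seconds=60):
        """Run an ETL task callable, restarting it after transient failures."""
        for attempt in range(1, max_attempts + 1):
            try:
                return task()
            except Exception as exc:
                log.error("attempt %d/%d failed: %s", attempt, max_attempts, exc)
                if attempt == max_attempts:
                    raise                # retries exhausted: surface the failure
                time.sleep(wait_seconds)  # back off before restarting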

Environment: Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.5, Oracle 11g, MySQL, UNIX, PL/SQL, SQL, Unix Shell Scripts, Embarcadero ER Studio Data Architect, NoSQL, TOAD 10.1.1, Autosys, PuTTY, Erwin 8.0, QlikView 8.5 and SQL*Loader.

Confidential

ETL Developer

Responsibilities:

  • Interacted with Business Analysts for Requirement gathering, understanding the Requirements, Explanation of technical probabilities and Application flow.
  • Developed ETL mappings, transformations using Informatica Power Center 9.1
  • Extracted data from flat files, DB2 and loaded the data into Oracle staging using Informatica Power Center.
  • Designed and created complex source to target mapping using various transformations inclusive of but not limited to Sorter, Aggregator, Joiner, Filter, Source Qualifier, Expression and Router Transformations.
  • Extensively used the Lookup and Update Strategy transformations while working with Slowly Changing Dimensions (SCD), and performed reading and loading of high-volume Type 2 dimensions.
  • Extensively used Informatica debugger to figure out the problems in mappings. Also involved in troubleshooting existing ETL bugs.
  • Implemented incremental loading of mappings using Mapping Variables and Parameter Files (see the sketch after this list).
  • Experienced in designing and developing Informatica IDQ environment.
  • Used Mapping Parameters and Mapping Variables based on business rules provided.
  • Wrote PL/SQL Procedures for data extractions, transformation and loading.
  • Assisted in Data Modeling and Dimensional Data Modeling.
  • Involved in performance tuning by determining bottlenecks at various points such as targets, sources, mappings, sessions or the system, which led to better session performance.
  • Scheduled workflows on a daily basis for incremental data loading.
  • Designed and developed UNIX shell scripts as part of the ETL process to automate the data load processes to target.
  • Developed BTEQ and MLOAD scripts to load data into the Teradata data mart.
  • Built data movement processes that load data from databases and files into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FASTLOAD, FASTEXPORT and MULTILOAD.
  • Involved in Unit Testing and User Acceptance Testing (UAT) to check that the data loaded into targets, extracted from different source systems, was accurate and met user requirements.
  • Maintained Version Control using Clear Case.
  • Scheduling jobs using Autosys to automate the Informatica Sessions.
  • Provided Production Support at the end of every release.
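
A minimal Python sketch of the incremental-load pattern referenced above. A small JSON file stands in for the Informatica parameter file, and the table and column names are hypothetical; the point is that each run extracts only rows newer than the watermark recorded by the previous run.

    import json
    from pathlib import Path

    # a JSON file stands in for the Informatica parameter file
    PARAM_FILE = Path("load_params.json")

    def read_watermark():
        """Return the high-water mark recorded by the previous run (or a floor)."""
        if PARAM_FILE.exists():
            return json.loads(PARAM_FILE.read_text())["last_loaded_at"]
        return "1900-01-01 00:00:00"

    def write_watermark(value):
        PARAM_FILE.write_text(json.dumps({"last_loaded_at": value}))

    def incremental_extract(conn):
        """Select rows newer than the watermark (sqlite3-style DB-API connection)."""
        cur = conn.execute(
            "SELECT id, updated_at FROM source_table WHERE updated_at > ?",
            (read_watermark(),),
        )
        rows = cur.fetchall()
        if rows:
            write_watermark(max(r[1] for r in rows))  # advance the mark
        return rows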

Environment: Informatica Power Center 9.1, Oracle 10g, SQL Server 2000/2008, UNIX, COBOL, ERWIN 3.5, Shell script, Rapid SQL, Toad, Teradata 12, Visio, Autosys, Clear Case.

Confidential

Jr ETL Developer

Responsibilities:

  • Used Informatica Power Center 8.6 for migrating data from various OLTP databases and other applications to the Radar Store Data Mart.
  • Worked with different sources like Relational, Mainframe (COBOL), XML, flat files (CSV) loaded the data into Oracle staging.
  • Created complex Informatica mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Java, Joiner and Sequence generator transformations.
  • Created and used parameter files to perform different load processes using the same logic.
  • Extensively used PL/SQL for the creation of stored procedures and worked with XML Targets, XSDs and DTDs.
  • Filtered changed data using Power Exchange CDC and loaded it to the target.
  • Defined Target Load Order Plan and Constraint based loading for loading data appropriately into multiple Target Tables.
  • Used different Tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait and Control) in the creation of workflows.
  • Utilized the Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE) utilities introduced with Informatica Version 8.
  • Performed performance tuning at the source, target, mapping and session levels.
  • Involved in modifying already existing UNIX scripts and used them to automate the scheduling process.
  • Coordinated with testing team to make testing team understand business and transformation rules being used throughout ETL process.

Environment: Informatica Power Center 8.6, Oracle 10g, MS SQL Server 2008, MS Access, MS Excel 2008, SQL Developer, Windows, Unix/Linux.
