
Sr. Informatica/ETL Developer Resume


Centreville, VA

SUMMARY

  • Over 8 years of IT experience in planning, analysis, design, implementation, development, maintenance, and support of production environments across domains including Insurance, Healthcare, Financial, and Retail, with a strong conceptual background in database development and data warehousing.
  • Highly proficient in the development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter 10.2.0/9.6/9.5/9.1/9.0/8.x.
  • Proficient in integrating various data sources and multiple relational databases (Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, flat files) into the staging area, ODS, Data Warehouse, and Data Marts.
  • Superior SQL skills, with the ability to write and interpret complex SQL statements; skilled in SQL optimization, ETL debugging, and performance tuning.
  • Experience with AWS (Amazon Web Services), including S3 buckets and Redshift (the AWS data warehouse service).
  • Slowly Changing Dimensions Management including Type 1, 2, 3, Hybrid Type 3, De-normalization, Cleansing, Conversion, Aggregation, Performance Optimization.
  • Extensive experience in developing Informatica Mappings/Mapplets using various transformations for extraction, transformation, and loading of data from multiple sources to the Data Warehouse, and in creating Workflows with Worklets and Tasks and scheduling the Workflows.
  • Experience in working with Designer, Work Flow Manager, Work Flow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, Mapplets, Mappings, Workflows, Sessions, Re-usable Transformations, Shortcuts, Import and Export utilities.
  • Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica Power Mart/Power Center with flat files, Oracle, SQL Server, and Teradata.
  • Experience working on Data quality tools Informatica IDQ 9.1.
  • Experience working in multi-terabytes data warehouse using Databases like Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2008, MS Excel and Flat files.
  • Experience in Relational Modeling and Dimensional Data Modeling using Star and Snowflake schemas, denormalization, normalization, and aggregations.
  • Very strong in SQL and PL/SQL, extensive hands on experience in creation of database tables, triggers, sequences, functions, procedures, packages, and SQL performance-tuning.
  • Proficiency in data warehousing techniques like data cleansing, Slowly Changing Dimension phenomenon, Surrogate key assignment, change data capture.
  • Good understanding of ETL/Informatica standards and best practices, including Slowly Changing Dimensions (SCD1, SCD2, and SCD3).
  • Experience in test coordination, writing test cases, executing test scripts, and logging defects in Quality Center (QC).
  • Working closely with ETL developers and other leads during development and support of BI application.
  • Experience with data extraction, transformation, and loading (ETL) from disparate data sources and multiple relational databases like Oracle and DB2 UDB; worked on integrating data from flat files, CSV files, and XML files into a common reporting and analytical data model using Erwin.
  • Worked extensively in various kinds of queries such as Sub-Queries, Correlated Sub-Queries, and Union Queries for Query tuning.
  • Extensively worked on Data migration, Data cleansing and Data Staging of operational sources using ETL processes and providing data mining features for data warehouses.
  • Hands-on experience with Informatica PowerExchange and Informatica IDQ.
  • Hands-on experience using query tools like TOAD, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Developed UNIX scripts for dynamic generation of Files & for FTP/SFTP transmission.
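The Slowly Changing Dimension handling listed above can be sketched outside Informatica as well. Below is a minimal, illustrative Python version of SCD Type 2 change detection (expire-and-insert); the column names (`customer_id`, `city`, `eff_date`, `end_date`, `is_current`) are hypothetical, and a real implementation would live in PowerCenter mappings built with Lookup and Update Strategy transformations:

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply SCD Type 2 logic: expire the current row when a tracked
    attribute changes, and insert a new current row.

    dimension: list of dicts with keys
        customer_id, city, eff_date, end_date, is_current
    incoming:  list of dicts with keys customer_id, city
    """
    today = today or date.today()
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["customer_id"])
        if cur is None:
            # brand-new key: insert as the current row
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif cur["city"] != rec["city"]:
            # tracked attribute changed: expire the old row, insert a new one
            cur["end_date"] = today
            cur["is_current"] = False
            dimension.append({**rec, "eff_date": today,
                              "end_date": None, "is_current": True})
        # unchanged rows are left alone (no Type 1 overwrite here)
    return dimension
```

A Type 1 variant would simply overwrite `cur["city"]` in place instead of expiring the row.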

TECHNICAL SKILLS

Operating System: UNIX, Windows, MS-DOS

Language/Tools: SQL, PL/SQL, C, C++

Scheduling Tools: Autosys, Control-M, Informatica Scheduler

ETL Tools: Informatica Power Center 10.x/9.x/8.x, ETL Informatica Cloud, SSIS

Database: MS SQL Server, Oracle 8i/9i/10g, RDBMS DB2, Netezza, Teradata, PostgreSQL, Redshift

Scripting: Shell Scripting, Python

Data Modeling Tools: Microsoft Visio, ERWIN 9.3/7.5

Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)

Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1

Tools & Utilities: TOAD, SQL Developer, SQL*Loader, Putty

Cloud Computing: Amazon Web Services (AWS), S3, RDS, Redshift, SNS

Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Snaplogic, AWS, Appworx

Defect Tracking Tools: ALM, Quality Center

Reporting Tools: IBM Cognos, Tableau 9

PROFESSIONAL EXPERIENCE

Confidential, Centreville, VA

Sr. Informatica/ETL Developer

Responsibilities:

  • Coordinated with business analysts to analyze the business requirements and designed and reviewed the implementation plan.
  • Responsible for designing, developing, and testing the processes needed to extract data from operational databases and transform and load it into the data warehouse using Informatica PowerCenter.
  • Followed ETL standards -Audit activity, Job control tables and session validations.
  • Created Complex Mappings to load data using transformations like Source Qualifier, Expression, Aggregator, Dynamic Lookup, Connected and unconnected lookups, Joiner, Sorter, Filter, Stored Procedures, Sequence, Router and Update Strategy.
  • Created different jobs using UNIX shell scripting to call the workflow by using Command tasks.
  • Wrote Oracle SQL queries for joins and other table modifications.
  • Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2) logic.
  • Tuned complex mappings to reduce total ETL processing time.
  • Extensively used TOAD to test, debug SQL and PL/SQL Scripts, packages, stored procedures and functions.
  • Designed and developed ETL code using Informatica mappings to load data from heterogeneous source systems (flat files, XML, CSV) into an Oracle staging area, then into the data warehouse, and finally into Data Mart tables for reporting.
  • Extracted and transformed data from various sources like Teradata and relational databases (Oracle, SQL Server).
  • Created SnapLogic jobs for pulling data from Salesforce.
  • Analyzed source data coming from multiple source systems; designed and developed the data warehouse model flexibly to accommodate future business needs.
  • Analyzed existing systems, conceptualized and designed new ones, and deployed innovative solutions with high standards of quality.
  • Development of ETL code to extract data from multiple sources and load to Data warehouse using Informatica and load data into AWS Redshift.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, code enhancements.
  • Automated and scheduled Oracle and Informatica batch jobs using Control-M, with file watchers and daily, weekly, and on-demand schedules.
  • Parameterized all variables and connections at all levels in UNIX.
  • Performed Developer testing, Functional testing, Unit testing and created Test Plans and Test Cases.
  • Create Unit Test Case document and capture Unit test results for each source system.
  • Worked in an Agile environment with daily stand-ups and sprints, tracking stories in JIRA.

Environment: Informatica PowerCenter 10, Control M, Oracle11g, Toad, Redshift, Razor SQL, WinSCP, Composite, Netsuite, Snaplogic, UNIX and TWS.
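The UNIX jobs above that kick off workflows via Command tasks typically wrap PowerCenter's `pmcmd` utility. A minimal sketch of such a wrapper in Python follows; the service, domain, and workflow names are hypothetical, and the exact `pmcmd startworkflow` flags should be checked against the installed PowerCenter version:

```python
import subprocess

def build_pmcmd_start(service, domain, user, password, folder, workflow):
    """Assemble a pmcmd startworkflow invocation of the kind a UNIX
    Command task or wrapper script would run (flag names follow common
    pmcmd usage; verify against your PowerCenter version)."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-p", password,
            "-f", folder, "-wait", workflow]

def run_workflow(cmd, dry_run=True):
    """Run the assembled command; with dry_run=True just return the
    command line for inspection instead of executing pmcmd."""
    if dry_run:
        return " ".join(cmd)
    return subprocess.run(cmd, check=True).returncode
```

In practice the wrapper would also read credentials from a secured file rather than passing them inline.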

Confidential, Parsippany, NJ

Sr. Informatica Developer

Responsibilities:

  • Interacted with end users to gather business requirements and reporting needs and created the Business Requirement Document.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Extracted the data from the flat files, DB2, SQL server and other RDBMS databases into staging area and populated onto Data warehouse. Worked on Flat Files and XML, DB2, Oracle as sources.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Generated ABAP programs to load data into Oracle from SAP source systems.
  • Customized ABAP programs according to business requirements to load data from SAP source systems.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Worked on performance tuning by creating views in Oracle and implemented transformation logics in database using views.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables.
  • Modified existing mappings for enhancements of new business requirements.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Tested all the modules and transported data to target Warehouse tables, scheduled, ran extraction and load process and monitor sessions and batches by using Informatica Workflow Manager and log files.
  • Precise Documentation was done for all mappings and workflows.
  • Responsible for ETL process under development, test and production environments.
  • Wrote test plans, test cases, test scripts, and test scenarios for quality releases in the SOA and maintenance releases.
  • Executed test plans in unit testing and supported system testing, volume testing, and user testing; also provided production support by monitoring daily processes.

Environment: Informatica Power Center 9.6.1, Control M, Oracle11g, Toad, DB2, WinSCP, WinSQL, ERWIN, UNIX and TWS.

Confidential, Portland, OR

Informatica Developer/Admin

Responsibilities:

  • Provided technical leadership and developed new business opportunities.
  • Supported day to day activities of the Data Warehouse.
  • Analyzed the business requirements and functional specifications.
  • Extracted data from the Oracle database and spreadsheets, staged it in a single place, and applied business logic to load it into the central Oracle database.
  • Used Informatica Power Center 9.1/8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Created procedures to truncate data in the target before the session run.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
  • Used Teradata utility scripts (BTEQ, FastLoad, MultiLoad, and FastExport) to load data from various source systems into Teradata.
  • Working knowledge of Oracle 11g, SQL Server, Teradata, Netezza, DB2, MySQL, and UNIX shell scripting.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Involved in Unit testing, System testing to check whether the data loads into target are accurate.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Followed Informatica recommendations, methodologies and best practices.
  • Fine-tuned Informatica transformations and workflows for better performance.
  • Involved in the performance tuning of Informatica mappings by using Informatica Push Down optimization (PDO).
  • Used Informatica Metadata Manager to show data lineage.
  • Collaborated with remote offshore team, creating the requirement documents, verifying coding standards and conducting code reviews.

Environment: Informatica Power Center 9.1/8.6, Power Exchange, DB2, Tivoli, SQL Server, Linux.

Confidential, Nashville, TN

Informatica Developer

Responsibilities:

  • Created Informatica mappings using various transformations like XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in Informatica Designer.
  • Worked extensively with the Teradata database using BTEQ scripts.
  • Involved in all phases of the SDLC: designing, coding, testing, and deploying ETL components of the data warehouse and integrated Data Mart.
  • Created subscriptions for source to target mappings and replication methods using IBM CDC tool.
  • Used NZSQL scripts and NZLOAD commands to load data.
  • Experience in DataStage upgrade and migration projects, from planning to execution.
  • Analyzed heterogeneous data from various systems, such as pm and Salesforce.com, and validated it in the ODS (Operational Data Store).
  • Worked with Informatica IDQ Data Analyst and Developer, using various data profiling techniques to cleanse and match/remove duplicate data.
  • Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ.
  • Designed and developed IDQ solutions for data profiling. Implemented Address Doctor as Address Validator transformation for data profiling in IDQ.
  • Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
  • Created mappings using pushdown optimization to achieve good performance in loading data into Oracle and Teradata.
  • Developed various SQL queries using joins, sub-queries & analytic functions to pull the data from various relational DBs i.e. Oracle, Teradata & SQL Server.
  • Worked with Cleanse, Parse, Standardization, Validation, and Scorecard transformations.
  • Created pre-session and post-session commands and pre-SQL and post-SQL statements in Informatica.
  • Used UNIX scripts for file management as well as in the FTP process.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings to PROD.
  • Provided production support for the Informatica process, troubleshooting and debugging errors.

Environment: Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, UNIX Shell Scripting, Windows 2000/2003, SQL Server 2005/2008, Salesforce.com, Web Services.
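The IDQ profiling and deduplication work described above (column profiling, primary-key candidacy checks, match/remove-duplicate rules) can be illustrated with a small Python sketch; the column and row structures are hypothetical stand-ins for what IDQ's profiling and Match transformations do at scale:

```python
def profile_column(rows, col):
    """Simple column profile: null count, distinct count, and whether the
    column could serve as a primary key (all values present and unique)."""
    values = [r.get(col) for r in rows]
    non_null = [v for v in values if v is not None]
    return {"nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
            "pk_candidate": len(non_null) == len(values)
                            and len(set(non_null)) == len(values)}

def dedupe(rows, match_cols):
    """Keep the first row for each combination of match-column values,
    mimicking a basic match/remove-duplicates rule."""
    seen, out = set(), []
    for r in rows:
        key = tuple(r.get(c) for c in match_cols)
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out
```

Real IDQ matching is fuzzier (phonetic and edit-distance strategies); the exact-key version here only shows the shape of the rule.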

Confidential, Texas

DBA /ETL Developer

Responsibilities:

  • Performed business analysis and requirements gathering and converted the results into technical specifications.
  • Involved in designing the data mart (star-schema dimensional model) after analyzing the various source systems and the final Business Objects reports.
  • Designed and developed all the slowly changing dimensions to hold history data in the data mart.
  • Developed all the ETL data loads in Informatica PowerCenter to load data from the source database into the various dimensions and facts in the MIS data mart.
  • Implemented Slowly Changing Dimensions (Type 2) while loading data into dimension tables to hold history.
  • Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Used Informatica data services to profile and document the structure and quality of all data.
  • Extensively used Informatica transformations like Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, along with all the transformation properties.
  • Extensively used Various Data Cleansing and Data Conversion Functions in various transformations.
  • Translated Business processes into Informatica mappings for building Data marts by using Informatica Designer which populated the Data into the Target Star Schema on Oracle 10g Instance.
  • Followed the required client security policies and required approvals to move the code from one environment to other.
  • Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
  • Worked on SFDC session log error files to investigate errors and debug issues.
  • Worked with Informatica PowerExchange as well as Informatica Cloud to load data into Salesforce.com.
  • Developed Informatica mappings, mapping configuration tasks, and taskflows using Informatica Cloud Services (ICS).
  • Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules.
  • Created Informatica complex mappings with PL/SQL procedures/functions to build business rules to load data.
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Created automated scripts to perform data cleansing and data loading.
  • Performed complex defect fixes in environments such as UAT and SIT to ensure proper delivery of the developed jobs to the production environment.
  • Attended daily status call with internal team and weekly calls with client and updated the status report.

Environment: Informatica PowerCenter 9.x, Informatica Cloud, Oracle 10g, PL/SQL, Teradata, Linux, Control-M.

Confidential, St. Louis, MO

Informatica Developer

Responsibilities:

  • Developed internal and external Interfaces to send the data in regular intervals to Data warehouse systems.
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Involved in discussion of user and business requirements with business team.
  • Performed data migration in different sites on regular basis.
  • Involved in upgrade of Informatica from 9.1 to 9.5.
  • Portfolio Management Enhancement: analyzed and developed business requirements; designed and created the database and the processes to load data from Broadridge using Informatica, SQL Server, etc.; fixed defects; wrote complex stored procedures to generate data for PM web reports.
  • Created complex mappings using Unconnected Lookup, Sorter, and Aggregator and Router transformations for populating target tables in efficient manner.
  • Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
  • Provided work-bucket hour estimation and budgeting for each story (Agile process) and communicated status to the PM.
  • Was responsible for Performance Tuning at the transformation Level and Session level.
  • Creation of tables, packages, mappings, batch jobs, roles and users in Informatica MDM Hub.
  • Worked with business users and analysts to understand the requirements for mastering data obtained from various sources and loading the golden records into the target MDM database.
  • Involved in MDM Design and Developmental activities.
  • Added new sources to the existing MDM implementation.
  • Experienced in creating and configuring Address Doctor and Identity Match reference data components for the MDM Hub and Cleanse Server(s).
  • Experienced in integrating Informatica MDM with Informatica PowerCenter and IDQ.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
  • Analyzed session log files in session failures to resolve errors in mapping or session configuration.
  • Wrote various UNIX shell scripts to schedule data cleansing scripts, run loading processes, and automate map execution.
  • Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
  • Worked under Agile methodology and used the Rally tool to track tasks.
  • Performed bulk data imports and created stored procedures, functions, views and queries.

Environment: Informatica MDM, Autosys, Oracle11g, SAP, Toad, WinSQL, ERWIN, UNIX.
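The session-log analysis mentioned above (scanning failed-session logs to pinpoint mapping or configuration errors) is often scripted. Here is a minimal Python sketch; the severity codes and sample log lines are illustrative, since the exact log layout varies by Informatica version:

```python
import re

# Informatica session logs flag problems with severity markers such as
# ERROR in each message line; the pattern below is a simplified stand-in.
ERROR_RE = re.compile(r"\b(ERROR|FATAL)\b[:\s]+(.*)")

def extract_errors(log_lines):
    """Return (severity, message) pairs for error-level log lines."""
    hits = []
    for line in log_lines:
        m = ERROR_RE.search(line)
        if m:
            hits.append((m.group(1), m.group(2).strip()))
    return hits
```

A wrapper like this can feed a nightly report so failed sessions are triaged before business hours.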

Confidential

ETL/SQL Developer

Responsibilities:

  • Co-ordinated Joint Application Development (JAD) sessions with Business Analysts and source developer for performing data analysis and gathering business requirements.
  • Developed technical specifications of the ETL process flow.
  • Designed the Source - Target mappings and involved in designing the Selection Criteria document.
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
  • Used Informatica PowerCenter to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Used T-SQL to query the SQL Server 2000 database for data validation and data conditioning.
  • Worked extensively with SSIS to import, export, and transform data between databases.
  • Implemented Informatica Framework for (dynamic parameter file generation, start, failed and succeeded emails for an integration, Error handling and Operational Metadata Logging).
  • Implemented sending of Post-Session Email once data is loaded.
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
  • Scheduled various daily and monthly ETL loads using Autosys.
  • Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Involved in unit testing and documentation of the ETL process.
  • Involved in Production Support in resolving issues and bugs.

Environment: Informatica Power Center 8.6, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX.
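The dynamic parameter-file generation mentioned in the framework work above can be sketched in Python. The `[Folder.WF:workflow.ST:session]` header style follows common PowerCenter parameter-file conventions, but the folder, workflow, and parameter names here are hypothetical and should be checked against your environment:

```python
def write_param_file(path, folder, workflow, session, params):
    """Generate a PowerCenter-style parameter file section and write it
    to disk; returns the generated text for logging or inspection."""
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    text = "\n".join(lines) + "\n"
    with open(path, "w") as fh:
        fh.write(text)
    return text
```

A scheduler step would call this before each run so mapping parameters (run dates, source paths) are refreshed without editing the workflow.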
