
Sr. ETL/Informatica Developer Resume


Austin, TX

SUMMARY:

  • 7 years of technical and functional experience in Decision Support Systems and data warehousing, implementing ETL (Extract, Transform and Load) using Informatica PowerCenter 10.1.1/9.6/9.1.
  • Experience in various stages of the System Development Life Cycle (SDLC) and approaches such as Waterfall and Agile models.
  • Experience in design and development of ETL methodology supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.6.1/9.5.1/9.1/8.6/7.1 and PowerExchange 9.1.
  • Experience in OLTP modeling (2NF, 3NF) and OLAP dimensional modeling (Star and Snowflake) using Erwin Standard Edition/r7.3/4/3.5 (conceptual, logical, and physical data models).
  • Experienced in Oracle database development using PL/SQL, Stored Procedures, Functions and Packages.
  • Created mappings in Mapping Designer to load data from various sources using complex transformations such as Transaction Control, Lookup (connected and unconnected), Joiner, Sorter, Aggregator, Update Strategy, Filter and Router.
  • Experience designing, developing, testing, reviewing and optimizing Informatica MDM implementations.
  • Database experience using Oracle, SQL Server, Azure SQL Server.
  • Onshore Informatica MDM SME responsible for architecture, end-to-end integration and delivery.
  • Experienced in scheduling Sequence and parallel jobs using DataStage Director.
  • Provided pharmacy drug/prescriber data, FDA AHFS drug details, quality metrics and health outcomes.
  • Experience in DTS, SSIS, Import/Export Wizard, SQL Server Enterprise Manager and SQL Query Analyzer.
  • Managed and updated databases as per business requirements.
  • Worked on real-time integration between the MDM Hub and external applications using PowerCenter.
  • Developmental experience on Windows platform.
  • Performed data integration with SFDC and Microsoft Dynamics CRM using Informatica Cloud.
  • Used Informatica BDM/IDQ 10 to ingest data from AWS S3 raw to S3 refine, and from refine to Redshift.
  • Responsible for identifying and fixing bottlenecks through performance tuning on the Netezza database.
  • Developed, deployed and supported complex SSIS packages for data migration and transfer from Oracle/Access/Excel sheets using SQL Server SSIS.
  • Experience writing daily batch jobs using UNIX/Perl shell scripts and developing complex UNIX/Perl shell scripts for ETL automation.
  • Enforced and promoted standards and best practices in data modeling efforts.
  • Experience in scheduling Workflows or jobs to run at specified time intervals.
  • Hands-on experience with source code management (version control) tools such as Git and Subversion to maintain Informatica code.
  • Built logical and physical data models.
  • Extensively worked on Power Center Mapping Designer, Mapplet Designer, Transformation developer, Workflow Manager, Repository Manager and Workflow Monitor.
  • Experienced in working with various scheduling tools like Autosys, Control-M and Informatica Scheduler.
  • Experience with automation/integration tools like Jenkins.
  • Excellent communication and interpersonal skills.
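The UNIX shell automation described above typically wraps Informatica's pmcmd command-line utility; a minimal sketch of such a wrapper (the service, domain, folder and workflow names are placeholder assumptions, and the command is echoed as a dry run rather than executed):

```shell
#!/bin/sh
# Minimal sketch of a batch wrapper around Informatica's pmcmd utility.
# Service, domain, folder and workflow names are placeholders; the command
# is echoed (dry run) rather than executed.
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="DOM_DEV"
INFA_FOLDER="DW_LOADS"

run_workflow() {
    wf="$1"
    # -wait blocks until the workflow finishes so the exit code is meaningful
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN" \
         "-uv INFA_USER -pv INFA_PASS -f $INFA_FOLDER -wait $wf"
}

run_workflow "wf_load_dim_customer"
```

In a real batch job the echoed command would be executed and its exit status checked before triggering downstream jobs.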

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1.1/9.6/9.1, Informatica Intelligent Cloud Services (IICS).

Database: Oracle 11g/10g/9i/8i, SQL Server, DB2, Netezza, Azure SQL Server.

Languages: SQL, PL/SQL, UNIX/Perl

Operating Systems: Windows, UNIX

Software / Applications: MS XP/NT, MS Office Suite

Tools: TOAD

PROFESSIONAL EXPERIENCE:

Confidential, Austin, TX

Sr. ETL/Informatica Developer

Responsibilities:

  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources to analyze the content, quality and structure of source data during mapping development.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, Flat Files etc.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined Target Load Order Plan for loading data into Target Tables.
  • Used mapplets and reusable transformations to prevent redundancy of transformation usage and improve maintainability.
  • Created complex Informatica mappings as well as simple mappings with complex SQL, based on business user requirements.
  • Fine-tuned the session performance using session partitioning for long-running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Developed optimized PL/SQL code for server-side packages to centralize application logic; stored procedures were created in the database and fired when database contents changed.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Analysed the Target Data mart for accuracy of data for the pre-defined reporting needs.
  • Conducted unit testing of all ETL mappings as well as helped QA team in conducting their testing.
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed unit and grid integration testing and validated results with end users.
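Sessions scheduled this way are usually parameterized through a PowerCenter parameter file generated just before the run; a sketch of that step (the folder, workflow, session and parameter names are illustrative placeholders):

```shell
#!/bin/sh
# Sketch: emit a PowerCenter parameter file ahead of an Autosys-scheduled run.
# Folder/workflow/session names and the parameters are illustrative only.
PARAM_FILE="/tmp/wf_load_sales.param"

write_params() {
    run_date="$1"
    # Header format: [folder.WF:workflow.ST:session]; $$vars are mapping params
    cat > "$PARAM_FILE" <<EOF
[DW_LOADS.WF:wf_load_sales.ST:s_m_load_sales]
\$\$RUN_DATE=$run_date
\$\$SRC_DIR=/data/inbound
EOF
}

write_params "2020-01-01"
```

The Autosys job would then point pmcmd at this file via its parameter-file option so each run picks up the current load date.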

Environment: Informatica Power Center 10, Oracle 11g, SQL, PL/SQL, Oracle Sql Developer Tool, SQL Server, Flat Files, XML, Autosys, UNIX Shell Scripting, Subversion.

Confidential, Atlanta, GA

Sr. ETL/Informatica Developer

Responsibilities:

  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using different objects and functions that the tool supports.
  • Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Worked on Hadoop Bigdata development environment. Experience in Big Data development using open- source tools including Hadoop Core, Hive, Scala and Spark.
  • Created reports on Informatica CDC workflow performance.
  • Wrote Teradata SQL, BTEQ, MLoad, OLELoad, FastLoad and FastExport for ad-hoc queries, and built UNIX shell scripts to run ETL interfaces (BTEQ, FastLoad or FastExport) via Hummingbird and Control-M.
  • Defined, configured and optimized various MDM processes including staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using the Informatica MDM Hub Console.
  • Interacted with Business Analysts to finalize AML requirements (Banking) and documented the UDD, TDD for coding.
  • Responsible for investigation, characterization and communication of build and release problems, implementing corrective and preventive actions. Resolved all issues tracked as JIRA tickets on a priority basis.
  • Developed Stored Procedures and used them in Stored Procedure transformation for data processing and have used data migration tools.
  • Developed Perl scripts for jobs to run the informatica workflows.
  • Worked with business analysts to produce the source-to-target document mapping the juris data to the existing data structures.
  • Extracted data from various source systems like Oracle, SQL Server and DB2 into the landing zone, then loaded it onto the AWS S3 raw bucket using a Java copy command.
  • Worked on data cleansing using the cleanse functions in informatica MDM.
  • Participated in Data Model design sessions with Data modeler, BSA and provided valuable input impacting the ETL build work.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Data integration with SFDC and Microsoft Dynamics CRM using Informatica Intelligent Cloud Services (IICS).
  • Developed the audit activity for all the cloud mappings.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Created and ran workflows using Workflow Manager in Informatica; maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Migrated Informatica jobs to Hadoop Sqoop jobs and loaded data into the desired database.
  • Documented Informatica mappings in Excel spreadsheets.
  • Tuned the Informatica mappings for optimal load performance.
  • Developed Custom Audit tables in SQL server to capture SSIS package logging information.
  • Analyzed, identified and fixed bad data, and imported data from Salesforce CRM to Oracle; handled upstream data integration and migration processes in predefined schemas.
  • Provided technical input to BSA while creating the STM document.
  • Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
  • Extensively involved in data extraction, transformation and loading (ETL process) from source to target systems using Informatica.
  • Created and configured workflows and sessions to transport data to target warehouse Oracle tables using Informatica Workflow Manager; worked with Informatica web services and web-portal applications.
  • Worked in ETL architecture for integrating data at real time and batch processing to populate the warehouse and implemented it using Oracle Data Integrator.
  • Proficient in data integration of various data sources with multiple relational databases like Oracle 10g, MS SQL Server, DB2, AWS S3, Redshift, VSAM files and flat files into the staging area, ODS, data warehouse and data marts.
  • Hands-on experience with Windows 32-bit commands, quoting and escaping.
  • Created file-watcher jobs to set up the dependency between Cloud and PowerCenter jobs.
  • Managed the migration of SQL Server 2008 databases to SQL Server 2012.
  • Generated reports using OBIEE 10.1.3 for future business use.
  • Created SQL scripts to load the custom data into Development, Test and production Instances using Import/Export. Created scripts to create custom Tables and Views.
  • Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Constantly interacted with business users to discuss requirements.
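Teradata extracts of the kind described above are commonly driven by generated BTEQ scripts; a sketch of that generation step (the logon string, table and file names are placeholders, and the script is only written here, not executed):

```shell
#!/bin/sh
# Sketch: generate a BTEQ script for an ad-hoc Teradata extract that a
# Control-M job would then pass to bteq. Logon details, table and file
# names are placeholders; the script is written but not executed here.
BTEQ_SCRIPT="/tmp/extract_claims.bteq"

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.EXPORT REPORT FILE=/data/outbound/claims.txt;
SELECT claim_id, claim_amt FROM dw.claims WHERE load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

echo "generated $BTEQ_SCRIPT"
```

A scheduler wrapper would then invoke `bteq < "$BTEQ_SCRIPT"` and inspect the exit code; FastLoad and FastExport runs follow the same generate-then-invoke pattern.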

Environment: Informatica PowerCenter 10.1, Informatica Repository Manager, OBIEE, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SQL Developer, Java/J2EE, Struts, JDBC, JUnit, ANT, HTML, DHTML, JSP, JavaScript, XML, Oracle, Apache Tomcat, MS Excel.

Confidential, Washington, DC

Informatica Developer

Responsibilities:

  • Understood ETL process and business rules that need to be implemented.
  • Worked on the Informatica PowerCenter 7.1.1 tool - Source Analyzer, Warehouse Designer, Mapping Designer, mapplets and transformations.
  • Developed mappings in Informatica to load the data from various data sources into the Data Marts, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
  • Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica Power Center.
  • Involved in creating the UNIX scripts and jobs to handle Informatica workflows and Teradata utilities like TPT scripts.
  • Created Netezza SQL scripts to verify that tables loaded correctly.
  • Initialized the AML daily screening process by loading data from UDM CDS to the profile DB.
  • Analyzed change requests (CRs) as per requests from Team Track/JIRA.
  • Development, enhancement and testing of Informatica MDM configurations.
  • Developed IDQ mappings using various transformations like Labeler, Standardizer, Case Converter, Match and Address Validator.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Designed and developed ETL processes using Informatica Intelligent Cloud Services (IICS).
  • Assisted BSA in implementing the FI global file transfer mechanism standards (SFTP).
  • Actively supported data migration for FDA drugs list from Medispan to our database system.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Worked with complex mappings having an average of 15 transformations.
  • Supported data validation and combining with our claims data to surface the updated FDA information at the claims level.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Coded Unix Scripts to capture data from different relational systems to flat files to use them as source file for ETL process and to schedule the automatic execution of workflows.
  • Implemented slowly changing dimensions (SCD) for some of the Tables as per user requirement.
  • Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
  • Created and configured Workflows, Worklets and Sessions to transport the data to target using Informatica Workflow Manager.
  • Involved in extraction, cleansing, and loading of data into Relational database (Oracle database) from flat files and DB2 tables.
  • Extensively involved in performance tuning at source, target, mapping, session, and system levels by analyzing the reject data.
  • Performed Unit testing and Functional Testing of Informatica mappings and Workflows.
  • Performed ETL & database code migrations across environments.
  • Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.
  • Documented technical specifications, business requirements and functional specifications for the development of Informatica mappings to load data into various tables, and defined ETL standards.
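Flat-file loads via SQL*Loader, as above, revolve around a control file; a sketch of generating one (the table, column and path names are placeholders, and the sqlldr call is echoed as a dry run rather than executed):

```shell
#!/bin/sh
# Sketch: build a SQL*Loader control file for a pipe-delimited flat-file load.
# Table, column and path names are placeholders; the sqlldr call is echoed
# (dry run) rather than executed.
CTL_FILE="/tmp/load_customer.ctl"

cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE '/data/inbound/customer.dat'
APPEND INTO TABLE stg_customer
FIELDS TERMINATED BY '|'
(cust_id, cust_name, cust_city)
EOF

echo "sqlldr userid=etl_user@DWDB control=$CTL_FILE log=/tmp/load_customer.log"
```

After the load, the generated log file would be checked for rejected rows before the downstream mappings run.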

Environment: Informatica Power Center 7.1.1, Oracle 8.1.7.4, SQL*Plus, PL/SQL, TOAD 7.1, UNIX, Windows XP.

Confidential

ETL Developer

Responsibilities:

  • Installation of MS SQL Server and Installation of SQL Client software on Windows.
  • Implemented the calculations to aggregate physical measures using Management Studio.
  • Designed, Developed and Deployed reports in MS SQL Server 2008/2008R2.
  • Experienced in developing T-SQL (DDL, DML) statements using dynamically generated SQL in SQL Server 2008/2008 R2.
  • Created and configured workflows, worklets & Sessions to transport the data to target systems using Informatica Workflow Manager.
  • Used mapplets and reusable transformations to prevent redundancy of transformation usage and improve maintainability.
  • Fine-tuned the session performance using session partitioning for long-running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Created test cases for unit test, system integration test and UAT to check the data.
  • Used various Oracle index techniques like B*-tree and bitmap indexes to improve query performance, and created scripts to update table statistics for better explain plans.
  • Responsible for loading data into warehouse using Oracle Loader for history data.
  • Responsible for moving the mappings and sessions from development repository to testing repository box.
  • Troubleshooting production issues.
  • Identified and defined the scoped assignments using script commands.
  • Performed data integrity and sanity checks using backend SQL queries for data validation and data quality.
  • Created complex Stored Procedures, DTS packages, triggers, cursors, tables, views and SQL joins and statements for applications.
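The sanity checks mentioned above usually reduce to reconciling source and target row counts after a load; a sketch of the comparison step (in practice the counts come from backend SQL queries, shown here as plain arguments):

```shell
#!/bin/sh
# Sketch: post-load sanity check comparing source and target row counts.
# In practice the counts would come from backend SQL queries; only the
# reconciliation logic is shown here.
check_counts() {
    src="$1"
    tgt="$2"
    if [ "$src" -eq "$tgt" ]; then
        echo "PASS: $src rows reconciled"
    else
        echo "FAIL: source=$src target=$tgt"
        return 1
    fi
}

check_counts 1500 1500
```

The non-zero return on mismatch lets a scheduler halt dependent jobs and raise an alert instead of propagating bad data.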

Environment: Informatica Power Center 9.5, Erwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle Sql Developer Tool, SQL Server 2008, Mainframe, JCL, Dimension Tool, MKS Integrity.
