
Informatica/IDQ Developer Resume


New York City, NY

PROFESSIONAL SUMMARY:

  • 7+ years of professional experience as a software developer in Information Technology, including analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design.
  • Over 6 years of experience using the ETL tool Informatica PowerCenter 9.5.1/9.0.1/8.x/7.x.
  • Good domain knowledge of HR, Healthcare, Insurance, Pharmacy, Petroleum, and Financial systems.
  • Strong Data warehousing experience specializing in ETL Concepts and RDBMS.
  • Designed and documented ETL frameworks with best practices, performance tuning techniques, naming conventions, and node configurations. Prepared the full migration process and framework for Oracle and DataStage migration projects.
  • Strong knowledge of the IDQ Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, and Repository.
  • Expert-level data integration skills using Informatica PowerCenter to design, develop, implement, and optimize ETL mappings, transformations, and workflows that move data from multiple sources, including flat files, RDBMS tables, and XML files, into the Operational Data Store (ODS), Data Warehouse, and Data Marts.
  • Experience with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Highly proficient in T-SQL for developing complex stored procedures, triggers, functions, views, indexes, cursors, SQL joins, and dynamic SQL queries (a brief illustrative T-SQL sketch follows this summary).
  • Experience in integration of various data sources such as Oracle, SQL Server, Sybase, IBM DB2, Teradata, MS Excel and Flat Files into staging area, ODS.
  • Worked with Dimensional Data warehouses in Star and Snowflake Schemas.
  • Experience in extracting data from the Facets claims application into the staging area.
  • Experience in using the Facets claims application to open and add generations and to enter and save information.
  • Extensive experience with ETL tool Informatica in designing and developing complex Mappings, Mapplets, Transformations, Workflows, Worklets, and scheduling the Workflows and sessions.
  • Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
  • Experience in extracting, transforming, and loading (ETL) data from Excel and flat files using DTS and SSIS.
  • Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate one time, Daily, Monthly and Yearly Loading of Data.
  • Expertise in the Claims, Subscriber/Member, Plan/Product, Provider, Commissions, and Billing modules of Facets.
  • Implemented Slowly changing dimensions and change data capture using Informatica.
  • Extensively developed Complex mappings using various transformations such as Unconnected / Connected lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Union and more.
  • Experience in data profiling and developing data quality rules using Informatica Data Quality (IDQ).
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files. Hands-on experience maintaining code versions using Visual SourceSafe / Informatica versioning.
  • Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments.
  • Worked on IDQ tools for data profiling, data enrichment and standardization.
  • Experience developing mappings in IDQ to load cleansed data into target tables using various IDQ transformations. Experience in data profiling and analyzing scorecards to design the data model.
  • Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.
  • Experience in developing UNIX Shell scripts that are used by ETL processes.
  • Strong in SQL, T-SQL, PL/SQL, SQL*LOADER, SQL*PLUS, MS-SQL & PRO*C
  • Good understanding with all phases of SDLC (System Development Life Cycle) including Planning, Analysis, Design, Implementation and Maintenance.
  • Experience importing/exporting data between different sources such as Oracle, Access, and Excel using the SSIS/DTS utilities.
  • Experienced with coordinating cross-functional teams, project management and presenting technical ideas to diverse groups.
  • Strong analytical, problem-solving, communication, learning and team skills.
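
As a minimal, illustrative sketch of the T-SQL work summarized above (object and parameter names such as dbo.Claims and @StatusCode are assumptions, not drawn from any specific engagement):

    -- Stored procedure combining a parameterized filter with validated dynamic SQL.
    CREATE PROCEDURE dbo.usp_GetClaimsByStatus
        @StatusCode VARCHAR(10),
        @SortColumn SYSNAME = 'ClaimDate'
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Dynamic SQL is limited to the ORDER BY column; QUOTENAME guards against injection.
        DECLARE @sql NVARCHAR(MAX) =
            N'SELECT ClaimID, MemberID, ClaimAmount, ClaimDate
              FROM dbo.Claims
              WHERE StatusCode = @StatusCode
              ORDER BY ' + QUOTENAME(@SortColumn) + N';';

        EXEC sp_executesql @sql, N'@StatusCode VARCHAR(10)', @StatusCode = @StatusCode;
    END;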

TECHNICAL SKILLS:

ETL Technology: Informatica PowerCenter 9.5.1/9.0.1/8.6.1/8.0.1/7.5/7.1, Control-M, IDQ, MDM, Autosys, SharePoint, Erwin

Data Warehouse: ODS, Normalized Data Mart, Dimensional Star and Snowflake Modeling

Databases: Oracle 11g/10g/9i, Teradata, MS SQL Server 2008/7.0/2000, Sybase, DB2

Programming: SQL, SQL Developer, T-SQL, PL/SQL, Toad 9.2/8.6, SQL*Plus, MS SQL Server Integration Services (SSIS), Microsoft Office, HTML, UNIX Scripting, Core Java.

Operating Systems: HP-UX, Sun Solaris 2.6/2.4, Linux, Windows XP/7

EXPERIENCE:

Confidential, New York City, NY

Informatica/IDQ Developer

Responsibilities:

  • As part of the SDLC process, created the Software Requirements Specification (SRS) and System Design Specification (SDS) documents before the development phase.
  • Designed and developed Informatica mappings to Extract, Transform and Load data into target tables.
  • Worked with Memory cache for Static and Dynamic cache to enhance throughput of sessions containing Rank, Lookup, Joiner, Filter, Sorter, Normalizer and Aggregator transformations.
  • Extensively worked on Informatica Designer and Workflow Manager.
  • Extensively used almost all transformations of Informatica including Lookups, Stored Procedures, Update Strategy and others.
  • Worked with Informatica Data Quality (IDQ) 9.1 for data cleansing, data matching, and data conversion.
  • Designed/Developed IDQ reusable mappings to match accounting data based on demographic information.
  • Worked with IDQ on data quality: cleansing data, removing unwanted data, and verifying data correctness and robustness.
  • Created tasks in the Workflow Manager, and exported and executed IDQ mappings.
  • Worked on IDQ parsing, standardization, matching, and web services.
  • Imported the mappings developed in data quality (IDQ) to Informatica designer.
  • Worked on the Informatica Analyst tool (IDQ) to generate scorecard reports for data issues. Extensively worked on performance tuning of ETL procedures and processes.
  • Expertise in creating Packages using SQL Server Integration Services (SSIS).
  • Developed complex T-SQL queries and designed SSIS packages to load the data into warehouse.
  • Developed mappings for Type 1 and Type 2 Slowly Changing Dimensions (SCD); an illustrative T-SQL equivalent of the Type 2 logic follows this list.
  • Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Tuned performance of existing SQL statements and PL/SQL code.
  • Worked with Session Logs, Workflow Logs and Debugger for Error Handling and Troubleshooting in all environments.
  • Involved in Data Profiling using Informatica Data Quality (IDQ).
  • Involved in creating extract programs for various clients including Aetna, HMS, Milliman, CNP and Pharmacy Advisor.
  • Worked with various claim adjudication systems such as RxClaim, Recap, and RxAmerica, which together process over 12 million records on a daily basis.
  • Used SSIS to create ETL packages (.dtsx files) to validate, extract, transform and load data to data warehouse databases, data mart databases.
  • Executed software projects for Health Care domains. Expertise in designing and creating Mappings, sessions and Workflows.
  • Accepted inbound transactions from multiple sources using FACETS.
  • Supported integrated EDI batch processing and real-time EDI using FACETS.
  • Worked with T-SQL to create tables, views, triggers, and stored procedures.
  • Responsible for tuning T-SQL procedures, triggers, and other database objects.
  • Worked on tight Service Level Agreements with the downstream applications.
  • Set up claim processing data for different Facets modules.
  • Involved in scheduling jobs in Maestro to meet SLAs defined with clients across the various adjudication systems.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and scorecards for the users using IDQ.
  • Involved in fixing batch job failures caused by various issues across 26 applications and 16,000+ jobs.
  • Worked with Tivoli Maestro for scheduling purposes and fixing jobs using the logs from Maestro.
  • Worked extensively in the UNIX environment and Informatica to fix jobs.
  • Involved in deployment of jobs to the production environment.
  • Involved in various outages, major implementations and handled multiple post upgrade situations.
  • Worked on configuring SFTP protocols on UNIX servers.
  • Prepared the complete data mapping for all the migrated jobs using SSIS.
  • Responsible for resolution of tickets reported by application users
  • Extensively involved in team meetings to fix production issues on a daily basis.
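
A hedged T-SQL sketch of the Type 2 SCD pattern referenced above, expressed as set-based SQL rather than as the actual Informatica mapping; the dimension and staging names (dbo.DimMember, stg.Member) and tracked columns are assumptions:

    -- Step 1: expire the current row for members whose tracked attributes changed.
    UPDATE d
    SET    d.CurrentFlag = 'N',
           d.EffectiveEndDate = GETDATE()
    FROM   dbo.DimMember AS d
    JOIN   stg.Member    AS s ON s.MemberID = d.MemberID
    WHERE  d.CurrentFlag = 'Y'
      AND (s.PlanCode <> d.PlanCode OR s.GroupID <> d.GroupID);

    -- Step 2: insert a new current version for new members and for those just expired.
    INSERT INTO dbo.DimMember
           (MemberID, PlanCode, GroupID, EffectiveStartDate, EffectiveEndDate, CurrentFlag)
    SELECT s.MemberID, s.PlanCode, s.GroupID, GETDATE(), '9999-12-31', 'Y'
    FROM   stg.Member AS s
    LEFT JOIN dbo.DimMember AS d
           ON d.MemberID = s.MemberID AND d.CurrentFlag = 'Y'
    WHERE  d.MemberID IS NULL;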

Environment: Informatica PowerCenter 9.5.1, Power Exchange, IDQ, TOAD, Facets, UNIX (Sun Solaris), ERWIN, Oracle, Teradata, DB2, SQL Server Integration Services (SSIS), SQL*Plus, MS SQL Server 2008, T-SQL, PL/SQL, Windows XP, TWS (Tivoli Maestro), Teradata Viewpoint, HPSM & Heat.

Confidential, Long Beach CA

Informatica/IDQ Developer

Responsibilities:

  • Responsible for requirement definition and analysis in support of Data Warehousing efforts.
  • Worked with lead Business analyst to identify the source systems.
  • Worked with the Data Modeler during the designing of Logical and physical models.
  • Worked with client team while designing requirements document.
  • Developed various scripts to support EIP platform and get the health check of EIP platform.
  • Performed Data cleansing and data scrubbing on the source systems.
  • Implemented transformations on the source systems as per requirements and loaded them into a Landing zone.
  • Extensively used ETL Tool Informatica to load data from Flat Files to landing tables in Oracle server.
  • Worked with the client to perform extensive validation of landing tables.
  • Implemented Star schema for this Data Warehouse.
  • Developed ETL mappings, transformations using Informatica PowerCenter 9.1/9.5.1 to load the data from landing tables to Dimension tables.
  • Identified and eliminated duplicates in datasets through IDQ components (an illustrative SQL equivalent of the deduplication rule follows this list).
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Implemented Slowly Changing Dimensions (SCD-2) for loading the dimension tables.
  • Developed mappings to load the data into Fact Tables.
  • Profiled source data using the IDQ tool to understand source system data representations, formats, and data gaps. Created the exception handling process and worked on best practices and standards for exception handling routines.
  • Extensively worked with Teradata in data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad.
  • Used IDQ for data cleansing, tuning, and profiling, and implemented reports and dashboards to display DQ results.
  • Wrote BTEQ scripts to transform data. Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Worked on error handling and performance tuning of Teradata queries and of the FastLoad, MultiLoad, and TPump utilities.
  • Extensively used the Informatica client tools Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Informatica Repository Manager, and Informatica Workflow Manager.
  • Performed data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Created tasks in the Workflow Manager, and exported and executed IDQ mappings.
  • Responsible for error handling using session logs and reject files in the Workflow Monitor.
  • Developed and tested all the Informatica mappings and update processes.
  • Extensively worked with the Debugger for handling the data errors in the mapping designer.
  • Hands on experience with mappings from varied transformation logics like Unconnected and Connected Lookups, Router, Aggregator, Filter, Joiner, Update Strategy.
  • Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.
  • Created SSIS Reusable Packages to extract data from Multi formatted Flat files, Excel, XML files into UL Database and DB2 Billing Systems.
  • Developed, deployed, and monitored SSIS Packages.
  • Created complex Stored Procedures, Triggers, Cursors, Tables and other SQL Joins and Statements for Applications by using T-SQL.
  • Extensively used T-SQL in constructing User defined Functions, Views, Indexes, User Profiles, Relational Database Models, Data Dictionaries, and Data Integrity.
  • Created events and tasks in the workflows using the Workflow Manager.
  • Created sessions and arranged them in various workflows in the workflow manager.
  • Responsible for tuning ETL procedures to optimize load and query Performance.
  • Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
  • Reviewed QA Test Plans and provided technical support during QA and Stage testing (UAT).
  • Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
  • Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
  • Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
  • Developed Unit test cases and Unit test plans to verify the data loading process.
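
As an illustration of the deduplication rule mentioned above (IDQ implements matching and consolidation through its own transformations; this SQL equivalent and the landing/stage object names are assumptions):

    -- Keep the most recently loaded row per natural key and discard older duplicates.
    INSERT INTO stage.Customer (CustomerID, FirstName, LastName, Email)
    SELECT CustomerID, FirstName, LastName, Email
    FROM (
        SELECT CustomerID, FirstName, LastName, Email,
               ROW_NUMBER() OVER (PARTITION BY CustomerID
                                  ORDER BY load_ts DESC) AS rn
        FROM landing.Customer
    ) t
    WHERE rn = 1;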

Environment: Informatica PowerCenter 9.1.0/9.5.1, Oracle 11g, delimited files, UNIX shell script, IDQ, Windows 7, Toad for Oracle 11g, Teradata, SSIS, T-SQL, PL/SQL, SQL Server 2008.

Confidential, Louisville, KY

ETL Developer

Responsibilities:

  • Responsible for documentation, version control of schema and version release.
  • Analyzed specifications and identified source data needs to be moved to data warehouse, participated in the Design Team and user requirement gathering meetings.
  • Interpreted logical and physical data model for business users to determine common data definitions and establish referential integrity of the system.
  • Participated in the analysis of development environment of Extraction process and development architecture of ETL process.
  • Coordinated with customer in finding the sources and targets for data conversion.
  • Involved in the preparation of documentation for ETL standards, Procedures and Naming conventions as per ETL standards.
  • Designed and developed Informatica mappings using Informatica 9.1, enabling extraction, transformation, and loading of the data into target tables in Teradata.
  • Created new Informatica Mappings with Source qualifier, Union, Aggregator, connected and unconnected lookups, Filter, Update Strategy, Rank, Stored Procedure, Expression and Sequence Generator transformations while transforming the Sales/Marketing data.
  • Created SSIS packages for loading data coming from various interfaces such as OMS, Orders, Adjustments, and Objectives, and used multiple transformations in SSIS to collect data from various sources.
  • Created SSRS reports for Franchise Health Reports.
  • Worked on SSIS Package, DTS Import/Export for transferring data from Database (Oracle and Text format data) to SQL Server.
  • Created Reusable transformations and Mapplets for use in Multiple Mappings.
  • Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
  • Provided 24x7 on-call support, which included monitoring morning and nightly jobs and emergency production fixes.
  • Created multiple universes and resolved loops by creating table aliases and contexts.
  • Used session partitions, Dynamic cache memory and Index caches for improving performance of Informatica server.
  • Extracted data from SQL server Source Systems and loaded into Oracle Target tables.
  • Involved in writing shell scripts for automating pre-session, post-session processes and batch execution at required frequency using power center server manager.
  • Involved in writing BTEQ scripts (an illustrative Teradata SQL transform of the kind such scripts run follows this list).
  • Involved in the loading and Scheduling of jobs to be run in the Batch process.
  • Optimized and performed Tuning in mappings to achieve higher response times.
  • Involved in the migration of existing ETL process to Informatica Power center.
  • Created effective Test data and developed thorough Unit test cases to ensure successful execution of the data loading processes.
  • Organized data in reports using Filters, sorting, ranking data with alerts.
  • Created reports using business object functionality like queries, slice and dice, drill down, functions and formulas.
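
A minimal sketch of the kind of set-based Teradata SQL a BTEQ script in this project would wrap with logon and error-checking dot commands; the schema, table, and column names are assumptions:

    -- Load yesterday's order delta from staging into the sales fact.
    INSERT INTO dw.SalesFact (OrderID, ProductKey, OrderDate, Quantity, NetAmount)
    SELECT o.OrderID,
           p.ProductKey,
           o.OrderDate,
           o.Quantity,
           o.GrossAmount - COALESCE(o.DiscountAmount, 0) AS NetAmount
    FROM   stg.Orders    AS o
    JOIN   dw.DimProduct AS p ON p.ProductCode = o.ProductCode
    WHERE  o.OrderDate >= CURRENT_DATE - 1;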

Environment: Informatica Power Center 9.1.0/8.6.1, Oracle 10g, Teradata, MS SQL SERVER 2000, T-SQL, SQL, SSIS, PL/SQL, SQL*Loader, UNIX Shell Script.

Confidential, Greensboro, NC

ETL Informatica Developer

Responsibilities:

  • Worked closely with business analysts and gathered functional requirements. Designed technical design documents for ETL process.
  • Developed ETL mappings, transformations using Informatica Power Center 9.0.1/8.6.1.
  • Implemented Change Data Capture (CDC) process to load into the staging area.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Workflow Manager.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, & Excel.
  • Developed reusable Mapplets, Transformations and user defined functions.
  • Extensively used Mapping Debugger to handle the data errors in the mapping designer.
  • Experience using transformations such as Normalizer, Unconnected/Connected Lookups, Router, Aggregator, Joiner, Update Strategy, Union, Sorter, and reusable transformations.
  • Created event wait and event raise, email, command tasks in the workflows manager.
  • Responsible for tuning ETL procedures to optimize load and query Performance.
  • Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters (see the illustrative SQL override sketch after this list).
  • Used Informatica Power Exchange for loading/retrieving data from mainframe system.
  • Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
  • Involved in writing shell scripts for file transfers, file renaming and concatenating files.
  • Created debugging sessions for error identification by creating break points and monitoring the debug data values in the mapping designer.
  • Developed Unit test cases and Unit test plans to verify the data loading process.
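
A hedged sketch of the kind of Source Qualifier SQL override used for incremental loads; the parameter name $$LAST_EXTRACT_DATE and the source table and columns are assumptions for illustration:

    -- Pull only rows changed since the previous successful run.
    SELECT ord.ORDER_ID,
           ord.CUSTOMER_ID,
           ord.ORDER_STATUS,
           ord.LAST_UPDATE_TS
    FROM   SRC_ORDERS ord
    WHERE  ord.LAST_UPDATE_TS >
           TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')
    -- The parameter file supplies $$LAST_EXTRACT_DATE; a mapping variable with
    -- SETMAXVARIABLE (or a post-session assignment) advances it after a successful run.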

Environment: Informatica PowerCenter 9.0.1/8.6.1, Oracle 11g/10g, Sybase, delimited files, UNIX shell script, Windows XP, Toad for Oracle, SQL Server 2008.

Confidential

ETL Informatica Developer

Responsibilities:

  • Extensively worked on Informatica Designer, Workflow Manager, and Workflow Monitor as a senior Informatica Developer.
  • Extracted data stored in Oracle 11g, Oracle 10g and Oracle 9i and Flat files and loaded data into Oracle Data warehouse
  • Worked with Source Analyzer, Data Warehouse Designer, Workflow monitor, Mapping Designer, Mapplets, and Transformation Developer in Informatica designer
  • Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate daily, weekly, and monthly loading of data.
  • Extensively used transformations like Aggregator, Expression, Sorter, Sequence Generator, Joiner, Filter, Router, Rank, Look up and Update Strategy transformations to model various standardized business processes
  • Used Informatica features to implement Type I, II changes in slowly changing dimension tables
  • Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger
  • Participated in reconciling data drawn from multiple systems across the company, such as Oracle 11g, Oracle 10g, and flat files, into the Oracle data warehouse (a simple reconciliation query sketch follows this list).
  • Involved in moving mappings and transformations between different Informatica folders and repositories along with the DBA team.
  • Experience using different transformations like Aggregator, Lookup (connected and unconnected), Filter, Expression, Router, Update Strategy and Sequence Generator for data transformations.
  • Created and Configured Workflows, Work lets and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Developed reusable transformations to load data from various data sources to the DW.
  • Fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Scheduled Informatica sessions and workflows using the Informatica scheduler to meet the SLAs requested for each business process.
  • Performed extensive debugging and performance tuning of mappings, sessions and workflows including partitioning, memory tuning and cache management.
  • Created and maintained various project related documents like high level design documents etc.
  • Created, executed and documented associated test cases.
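
As a small, illustrative example of the kind of reconciliation check used when comparing source extracts against the warehouse (table and column names are assumptions):

    -- Compare row counts and amount totals between the staging extract and the fact table.
    SELECT 'STG_ORDERS' AS source_name, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM   stg_orders
    UNION ALL
    SELECT 'FACT_ORDERS', COUNT(*), SUM(order_amount)
    FROM   fact_orders;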

Environment: Informatica Power Center 8.6.1/7.1 (Informatica Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Oracle 11g, flat files, ODBC, Windows NT, UNIX, shell scripts, Toad 7.5, Apex explorer/loader, SQL.
