
Informatica Developer/Production Support Analyst Resume


Minneapolis, MN

SUMMARY

  • Nine years of experience in analysis, design, development, implementation and troubleshooting of Data Mart/Data Warehouse applications using ETL tools such as Informatica PowerCenter 10.x/9.x/8.x/7.x and IICS.
  • Experienced in implementing and delivering projects using industry best practices; exposed to software development methodologies such as the Software Development Life Cycle (SDLC).
  • Experience working with business users to understand and analyze project requirements and to coordinate work efforts across multiple teams.
  • Experienced in creating mappings, sessions and workflows in Informatica Intelligent Cloud Services (IICS).
  • Responsible for troubleshooting failed tasks in Informatica Intelligent Cloud Services (IICS).
  • Expertise in data extraction, transformation and loading methodologies for corporate-wide ETL solutions using Informatica PowerCenter 10.x/9.x/8.x/7.x on UNIX and Windows platforms.
  • Good experience working with Teradata utilities like FastLoad, MultiLoad, FastExport and BTEQ.
  • Experienced in extracting data from APIs into tables using the HTTP and XML Parser transformations.
  • Experience in Integration of various data feeds from sources like Oracle, SQL Server and Flat Files.
  • Experience working on applications based on Java/J2EE.
  • Experience in loading data from flat files into staging and then into ODS tables.
  • Good knowledge of MDM to create clean and consistent master data.
  • Created rules and scorecards for data cleansing, profiling and overall data quality checks.
  • Experience capturing data changes via Change Data Capture (CDC) and SCD Type 1 and Type 2 loads with the help of MD5 checksums (see the SQL sketch after this summary).
  • Good experience working with various transformations such as Source Qualifier, Expression, Router, Filter, Lookup, Transaction Control, Update Strategy, Union, Aggregator and Sequence Generator.
  • Extensive experience in all phases of Data Warehouse project life cycle.
  • Proficient in the development of ETL (Extract, Transform, and Load) processes, with a good understanding of source-to-target data mapping and the ability to define and capture metadata and business rules.
  • Good experience extracting data from Snowflake, AWS S3, AWS RDS, MongoDB and Salesforce.
  • Created mappings using Mapping Designer to load data from Salesforce, transform it according to business rules and load it back to Salesforce.
  • Experienced in designing, developing, testing, performance tuning and debugging existing ETL processes.
  • Knowledge and understanding of dimensional data modeling using star schema and snowflake modeling, fact and dimension tables, and physical and logical data modeling.
  • Experience in using Erwin Data Modeler to perform data modeling.
  • Knowledge of the Kimball and Inmon modeling approaches.
  • Experience implementing slowly changing dimension methodology to keep track of historical data.
  • Experience in implementing complex business rules by creating optimized and efficient mappings with appropriate transformations.
  • Experience in creating database objects such as procedures, views, packages, functions and triggers using PL/SQL and T-SQL.
  • Developed PL/SQL packages, shell scripts to execute the packages, and PL/SQL blocks to debug the packages in the production environment; migrated bulk data.
  • Proficient with TOAD 9.6.1, Oracle SQL Developer 2.1.1.64 and Windows/UNIX interfaces.
  • Extensively used ETL techniques to load data from SQL Server, Oracle and flat files into an Oracle database. Created views and schemas for the application as part of data modeling.
  • Good experience working with scheduling tools like Autosys, Control-M and Informatica scheduler.
  • Provided integration and post-production support for ETL and reporting systems.
  • Experience in creating labels and migrating code across environments.
  • Experience in ETL implementation and Production Support.
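
The MD5-based SCD handling mentioned above can be illustrated with a minimal SQL sketch. The table and column names (CUSTOMER_STG, CUSTOMER_DIM, ROW_MD5) are hypothetical, and ROW_MD5 is assumed to have been populated in the PowerCenter mapping with MD5() over the concatenated Type 2 attributes.

    -- Illustrative sketch only: hypothetical staging/dimension tables; ROW_MD5
    -- is assumed to be populated by MD5() in an Expression transformation.

    -- Expire the current dimension row when the incoming hash differs (SCD Type 2).
    UPDATE customer_dim d
       SET d.current_flag     = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.row_md5    <> d.row_md5);

    -- Insert a new current version for new and changed customers.
    INSERT INTO customer_dim
           (customer_id, customer_name, address, row_md5,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.customer_name, s.address, s.row_md5,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.row_md5      = s.row_md5
                          AND d.current_flag = 'Y');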

TECHNICAL SKILLS

  • Informatica PowerCenter 10.2/9.1.0/8.6.1/8.5/8.1
  • Informatica Workflow Manager
  • IICS
  • Informatica Workflow Monitor
  • Informatica Repository Manager
  • CA Workload/Autosys
  • Oracle 11g/10g/9i/8i
  • SQL Server
  • MongoDB
  • AWS RDS
  • Star Schema Modeling
  • Snowflake Schema Modeling
  • Fact and Dimensions Tables
  • Physical and Logical Data Modeling
  • Erwin
  • Autosys (Unix Backend)
  • CA Workload
  • Control-M
  • Tidal
  • AWS
  • XML
  • UNIX Shell Scripting
  • SQL, PL/SQL, T-SQL, SQL*Plus
  • SSRS/SSIS
  • ODBC
  • TOAD
  • JIRA
  • Snowflake
  • Salesforce
  • Confluence
  • Polarion
  • Waterfall
  • Agile (SCRUM)

PROFESSIONAL EXPERIENCE

Confidential - Minneapolis, MN

Informatica Developer/Production Support Analyst

Responsibilities:

  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager and Workflow Monitor
  • Worked remotely through Citrix
  • Translated high-level design specifications into simple ETL coding and mapping standards
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse
  • Created mapping documents to outline data flow from sources to targets
  • Kept daily records of progress while working remotely
  • Responsible for creating new data models to remove confidential information about insurance agents and add new identifiers
  • Responsible for unit testing all development before migrating it to the QA team
  • Responsible for developing fixes for any defects raised by the QA team as per business requirements
  • Involved in dimensional modeling (star schema) of the data warehouse and used CA Workload to design the business process, dimensions and measured facts as per business requirements
  • Extracted the data from Oracle and SQL Server Databases
  • Altered Data tables in Oracle and SQL Server in accordance with Business Requirements Documents
  • Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer
  • Created reusable PL/SQL stored procedures to disable and enable constraints for bulk loads of target tables in Oracle (see the sketch after this list).
  • Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP systems.
  • Developed mapping parameters and variables to support SQL overrides
  • Created mapplets for reuse across different mappings
  • Developed mappings to load into staging tables and then to Dimensions and Facts
  • Used existing ETL standards to develop these mappings
  • Worked on different workflow tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Assignment and Timer, as well as worklets and workflow scheduling
  • Created sessions and configured workflows to extract data from various sources, transform it and load it into the data warehouse.
  • Modified existing mappings for enhancements of new business requirements
  • Used the Debugger to test mappings and fix workflow or job failures
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
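
A minimal sketch of the kind of reusable constraint-toggling procedure referenced above; the procedure name is hypothetical, and in practice it would be invoked from pre- and post-session stored procedure tasks around the bulk load.

    -- Illustrative sketch only: procedure name is hypothetical.
    -- Disables or enables all foreign-key constraints on a target table
    -- so a bulk load can run without row-by-row constraint checking.
    CREATE OR REPLACE PROCEDURE toggle_fk_constraints (
        p_table_name IN VARCHAR2,
        p_action     IN VARCHAR2   -- 'DISABLE' or 'ENABLE'
    ) AS
    BEGIN
        FOR c IN (SELECT table_name, constraint_name
                    FROM user_constraints
                   WHERE table_name      = UPPER(p_table_name)
                     AND constraint_type = 'R')
        LOOP
            EXECUTE IMMEDIATE 'ALTER TABLE ' || c.table_name || ' ' ||
                              p_action || ' CONSTRAINT ' || c.constraint_name;
        END LOOP;
    END toggle_fk_constraints;
    /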

Environment: Citrix, Informatica 10.x/9.x, IICS, AWS, Snowflake, Oracle 11g, PL/SQL, SQL Server, CA Workload, Polarion

Confidential - Woonsocket, RI

Informatica Developer/Production Support Analyst

Responsibilities:

  • Provided development and technical support for HR and Payroll related data and loaded this data from PeopleSoft to Oracle database.
  • Created mappings to load raw data from PeopleSoft (Flat files) to staging tables (Oracle). This data was then loaded to various dimensions and facts to be used for report development (Cognos) for business users.
  • Captured data changes with the help of CDC and SCD Type 1 and Type 2 mappings, and used MD5 checksums for dimension, fact and error-record loading.
  • Worked with the Lookup transformation using both connected and unconnected options.
  • Worked with various lookup caches such as static, dynamic and persistent caches.
  • Created scripts to unzip raw data files, load data into staging tables, move the zipped files to an archive folder and create list files dynamically.
  • Responsible for the design, development, testing and documentation of Informatica mappings, PL/SQL code and transformations, based on standards.
  • Worked extensively on PL/SQL procedures to load data.
  • Extensively used various transformations like Union, Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
  • Based on error messages from the logs, reviewed Informatica mappings and stored procedures to identify the cause of batch job failures and propose an appropriate solution; documented failures for future reference (a logging sketch follows this list).
  • Worked with business users to troubleshoot faulty data in reports.
  • Extensively worked with the Joiner transformation using all four join types: normal, master outer, detail outer and full outer.
  • Used parameters and variables in PowerCenter code to avoid hardcoding values.
  • Worked with business analysts and business users to capture business requirements before working on mappings.
  • Performed tuning at the ETL and database levels to make loads run faster.
  • Responsible for supporting multiple high-availability business applications and their users. Additional responsibilities included, but were not limited to, troubleshooting production data issues and failures, managing critical incidents and bridge calls, automating repetitive tasks, performing problem analysis, maintaining the knowledge base and documentation, and implementing process improvements.
  • Extensively worked with aggregate functions like Avg, Min, Max, Count, First, and Last in the Aggregator Transformation.
  • Used Filter transformation at the early stages of the mapping for the performance tuning of the mapping.
  • Worked on performance issues by reviewing both Informatica and Oracle logs to increase process efficiency.
  • Good experience in scheduling jobs with Autosys.
  • Coordinated with other technology groups to resolve incidents and problems, and escalated complex problems to the next level of support as required by documented procedures. Acted as incident manager to ensure resolution and follow-up.
  • Developed and gathered metrics to assist in problem analysis and resolution, and supported efforts to improve system monitoring tools and processes to enhance service delivery and the support model for end users.
  • Participated in problem analysis and incident management procedure meetings to drive process improvements. Ensured timely communication to the impacted parties during an incident.
  • Provided post-implementation and system checkout support for changes to production applications and services, and provided information and updates on tasks and incidents in the appropriate systems and knowledge base tools.
  • Responded to inquiries, requests for support or information, and escalations from Level 1 support or other technology and business users. Responded to production incidents and helped resolve or escalate them in a timely manner.
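
A minimal sketch of the failure-documentation idea mentioned above, assuming a hypothetical BATCH_ERROR_LOG table and procedure name; the autonomous transaction lets the entry survive even when the failing load itself is rolled back.

    -- Illustrative sketch only: table and procedure names are hypothetical.
    CREATE OR REPLACE PROCEDURE log_batch_failure (
        p_job_name  IN VARCHAR2,
        p_error_msg IN VARCHAR2
    ) AS
        PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
        INSERT INTO batch_error_log (job_name, error_msg, logged_at)
        VALUES (p_job_name, SUBSTR(p_error_msg, 1, 4000), SYSTIMESTAMP);
        COMMIT;   -- required inside an autonomous transaction
    END log_batch_failure;
    /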

Environment: Informatica 10.x/9.x, IICS, PL/SQL, Oracle 11g, UNIX, T-SQL, SQL, AWS, SQL Server, Rapid SQL, TOAD 8.6, Erwin

Confidential - Deerfield, IL

Sr. ETL Developer

Responsibilities:

  • Created ETL mappings and database objects to load source data into the PINIA (Power Information Network Integrated Analytics) warehouse and the PIN Explorer data mart.
  • Coordinated with source teams from ADVENT, CTSOFT, DATA CONSULTANTS, DEALER TRACK and REYNOLDS on any data inconsistencies and transfer/FTP issues.
  • Wrote SQL queries and PL/SQL code and performed query-level performance tuning (a representative tuning sketch follows this list).
  • Wrote ad hoc SQL queries to generate data from tables based on requirements.
  • Modified existing mappings and workflows to accommodate changes suggested by the business.
  • Used Informatica transformations such as Lookup, Joiner and Update Strategy, among others, for data processing.
  • Understood and developed resolutions for any data cleansing or code issues that could hinder DSS efficiency.
  • Led problem analysis and resolution for critical business applications, services and technology products.
  • Created mappings, mapplets and reusable transformations using transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator and Update Strategy, along with worklets.
  • Used Metadata Manager and Data Analyzer to generate ETL run reports that were used to understand process performance and overall job schedules.
  • Created scheduled jobs using the Tidal scheduler to start data processing and to send success/failure notification alerts.
  • Coordinated weekly QA and production release activities.
  • Raised CCB tickets through the e-help portal for data refreshes and other maintenance requests.
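
A representative sketch of the query-level tuning step mentioned above; the SALES_FACT and DEALER_DIM names are hypothetical, and the point is simply to capture and review the execution plan of an ad hoc extract query before promoting it.

    -- Illustrative sketch only: SALES_FACT / DEALER_DIM are hypothetical names.
    EXPLAIN PLAN FOR
    SELECT d.dealer_name,
           SUM(f.sale_amount) AS total_sales
      FROM sales_fact f
      JOIN dealer_dim d
        ON d.dealer_key = f.dealer_key
     WHERE f.sale_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -1)
     GROUP BY d.dealer_name;

    -- Review the plan for full scans, join order and estimated cardinalities.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);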

Environment: Informatica 9.x, Oracle 10g, PL/SQL, UNIX, Oracle 11g, SQL, T-SQL, TOAD 8.6, Erwin

Confidential - Irving, TX

ETL Developer

Responsibilities:

  • Worked on the maintenance and enhancements for VMware Entitlements related data mart.
  • Coordinated monthly roadmap releases to push enhanced/new Informatica code to production.
  • Developed and maintained ETL (Data Extraction, Transformation and Loading) mappings using Informatica Designer 8.6 to extract the data from multiple source systems that comprise databases like Oracle 10g, SQL Server 7.2, flat files to the Staging area, EDW and then to the Data Marts.
  • Tested database packages developed using PL/SQL (a smoke-test sketch follows this list).
  • Created mappings using different lookups, including connected, unconnected and dynamic lookups, with different caches such as the persistent cache.
  • Performed impact analysis of changes made to existing mappings and provided feedback.
  • Created mappings using reusable components such as worklets and mapplets built from reusable transformations.
  • Participated in providing project estimates for development team efforts, both offshore and onsite.
  • Coordinated and monitored project progress to ensure timely and complete delivery of the project.
  • Worked on Informatica Source Analyzer, Mapping Designer, mapplets and transformations.
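
A minimal sketch of the kind of package testing mentioned above, assuming a hypothetical ENTITLEMENT_PKG.LOAD_ENTITLEMENTS procedure and parameters; a block like this would be run from TOAD or SQL*Plus to verify a package change before handing it to QA.

    -- Illustrative sketch only: package, procedure and parameters are hypothetical.
    SET SERVEROUTPUT ON

    DECLARE
        v_rows_processed NUMBER;
    BEGIN
        entitlement_pkg.load_entitlements(p_batch_id => 12345,
                                          p_rows_out => v_rows_processed);
        DBMS_OUTPUT.PUT_LINE('Rows processed: ' || v_rows_processed);
    EXCEPTION
        WHEN OTHERS THEN
            DBMS_OUTPUT.PUT_LINE('Package test failed: ' || SQLERRM);
            RAISE;
    END;
    /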

Environment: Informatica 9.x, Oracle 10g, PL/SQL, UNIX, Oracle Data Integrator, Oracle 11g, SQL, TOAD 8.6, Erwin

Confidential - Springfield, IL

ETL Developer

Responsibilities:

  • Designed and developed Informatica mappings to address business requirements and bring SOR data into the customer data warehouse (CUSTDW) using flat files from different LOBs.
  • Developed mappings to bring new attributes into existing data loads and to set up completely new data feed loads.
  • Managed application support along with the offshore technical team.
  • Prepared and updated SIA documents required for new and existing feeds.
  • Coordinated and validated the upgrade of all CUSTDW database environments from Oracle 10g to 11g (a validation-query sketch follows this list).
  • Tested database packages developed using PL/SQL.
  • Developed and deployed Autosys JIL code to all environments.
  • Worked on existing issues and any development efforts queued on the CUSTDW SharePoint site.
  • Maintained and managed the SharePoint site for CUSTDW.
  • Coordinated efforts for the CUSTDW BCP Cassie Hill exit project, including partner notifications and providing any technical solutions required for smooth transmission.
  • Provided weekly status updates to both business and technical teams.
  • Raised work requests/change requests to move Informatica and Autosys code from lower environments to production and BCP.
  • Provided technical reports to the manager on Informatica-related production runs using Data Analyzer.
  • Notified and followed up with all partners to use the common domain name service for CUSTDW so that failover and any other database changes would be transparent and require less effort in the future.
  • Coordinated and supported the annual BCP failover activity for the CUSTDW production database.
  • Provided on-call support for production failures and emergencies.
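
One of the standard checks used when validating the Oracle 10g to 11g upgrade mentioned above: confirm that no schema objects were left invalid after the upgrade. This is a generic post-upgrade query, not a CUSTDW-specific script.

    -- Post-upgrade validation: any rows returned point to objects that need
    -- recompilation or further investigation.
    SELECT owner, object_type, COUNT(*) AS invalid_count
      FROM dba_objects
     WHERE status = 'INVALID'
     GROUP BY owner, object_type
     ORDER BY owner, object_type;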

Environment: Informatica 9.x, Oracle 10g, UNIX, TOAD 8.6, Erwin, Pac 2000, Autosys 4.5, PL/SQL
