Sr. Informatica Developer Resume

CT

SUMMARY

  • 8+ years of IT experience in System Analysis, Design, Development, Implementation and Testing of Database/Data Warehouse Applications on client/server technologies in the Pharma, Healthcare, Banking and Finance domains.
  • Experience in Extraction, Transformation and Loading (ETL) of Data into Data Warehouse using Informatica Power Center 9.x, 8.x, 7.x versions.
  • Designed Informatica objects (Mappings, Sessions and Workflows) and scheduled them using Informatica Power Center 9.6.1, 8.6, 8.1 and 7.1.
  • Adept in using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor) and the pmcmd utility to create, schedule and control workflows, tasks, and sessions.
  • Experience in designing and developing complex mappings from various Transformations like Source Qualifier, Joiner, Aggregator, Router, Filter, Expression, Lookup, Sequence Generator, Java Transformation, Update Strategy, XML Transformations and Web Services.
  • Extensively worked on Informatica SAP BAPI/IDOCS transformations, Web services transformations and Security configuration using V8.6.
  • Worked on Web Services as part of Informatica and established successful connections for external web service calls.
  • Expertise in OLTP/OLAP System Study, Analysis and E-R Modeling, developing Database Schemas like Star Schema and Snowflake Schema used in dimensional modeling.
  • Experience in Debugging sessions and mappings; Performance Tuning of the Sessions and mappings, implementing the complex business rules, optimizing the mappings.
  • Experience in extraction of data from various Heterogeneous sources (Relational database, Flat Files, XML, Excel) to load into Data Warehouse/data mart targets.
  • Expertise in databases, schema objects, Performance Tuning of SQL statements.
  • Implemented data warehousing techniques for Data Cleansing, Slowly Changing Dimensions (SCD) and Change Data Capture (CDC).
  • Experienced with the Informatica Data Quality (IDQ) tool for Data Cleansing.
  • Experienced with Pentaho reporting tool.
  • Good Experience in creating cubes by using Pentaho Schema Workbench.
  • Created visualizations and reports using Pentaho.
  • Created and maintained complex dashboard and Report Studio reports using Cognos Business Insight for quick analysis.
  • Upgraded reports from Cognos 8.4 to Cognos 10 and created new active reports in Cognos 10.
  • Extensive development, support and maintenance experience working in all phases of the Software Development Life Cycle (SDLC) especially in Data warehousing.
  • Experience with SQL Query Tuning, SQL Server/Oracle RDBMS.
  • Good understanding and working experience in Logical and Physical data models that capture current/future state data elements and data flows using Erwin.
  • Experience in production support of technical and performance issues.
  • Excellent communication and presentation skills using MS Office tools, including PowerPoint and Visio.
  • Independent, self-motivated, logical thinker and team player with an open mindset; proactive, adaptable and organized.
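The SCD handling mentioned above can be sketched in Python for illustration. This is not the Informatica implementation (which is built from Lookup and Update Strategy transformations); the `key`/`attr` field names and dates are hypothetical placeholders for a natural key and a tracked attribute.

```python
from datetime import date

# Illustrative Type 2 SCD logic over an in-memory dimension: expire the
# current version of a changed record and append a new open-ended version.
def apply_scd2(dim_rows, incoming, today=date(2024, 1, 1)):
    """Return the dimension rows after applying incoming changes as SCD2."""
    out = list(dim_rows)
    for rec in incoming:
        current = next((r for r in out
                        if r["key"] == rec["key"] and r["end_date"] is None),
                       None)
        if current is None:
            # New key: insert the first version
            out.append({**rec, "start_date": today, "end_date": None})
        elif current["attr"] != rec["attr"]:
            # Changed: close the old version, open a new one
            current["end_date"] = today
            out.append({**rec, "start_date": today, "end_date": None})
        # Unchanged records are left untouched
    return out
```

A Type 1 variant would simply overwrite `current["attr"]` in place instead of versioning.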

TECHNICAL SKILLS

Operating systems: MS-DOS, HP-UX, Windows 9x/NT/2000/XP, Mainframe, UNIX (Sun Solaris)

ETL and BI: Informatica Power Center 5.x/6.x/7.x/8.x/9.x, Syncsort DMExpress 6.5, Pentaho Data Integration 4.0, MS SQL Server DTS, SSIS & SSRS, Cognos 8, Cognos ReportNet, Power Exchange 9.0.1

Databases: Oracle 8i/9i/10g, DB2 UDB 9.1.6, Vertica 5.0, Teradata V2R5.01, MySQL, MS SQL Server 2005/2008, Netezza

Data Modeling: Erwin, Star-Schema, Relational Modeling and Snowflake-Schema Modeling

PROFESSIONAL EXPERIENCE

Sr. Informatica Developer

Confidential, CT

Responsibilities:

  • Worked with business owners in defining business requirements and creating data flow documentation.
  • Created efficient Staging and ODS ETL architecture.
  • Developed Informatica mappings by using SCD1 and SCD2.
  • Used Windows Task Scheduler to automate jobs
  • Upgraded Informatica 9.6.0 to 9.6.1 HF2.
  • Developed mappings in Informatica Power Center to load the data from various sources using transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router etc.
  • Responsible for identifying reusable logic to build several Mapplets which would be used in several mappings.
  • Created mappings to extract and de-normalize (flatten) data from XML files using multiple joiners with Informatica Power Center.
  • Created the transformation routines to transform and load the data.
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 components Edit Distance, Jaro Distance and Mixed Field Matcher, enabling the creation of a single view of customers and helping control mailing-list costs by preventing duplicate pieces of mail.
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica Power Center.
  • Was part of the scripting team, writing shell scripts to automate data migration from the ODS to the data warehouse.
  • Used global temporary and volatile temporary tables in Teradata, along with FastLoad, MultiLoad and error tables.
  • Created Secondary Indexes and Join Indexes in Teradata.
  • Hands-on experience with Teradata SQL and associated utilities like BTEQ, FastLoad, FastExport and MultiLoad.
  • Used Oracle joins, subqueries and nested queries in PL/SQL.
  • Used the Power Center Web Services Hub to build data integration functionality and expose it as web services.
  • Created and maintained complex dashboard and Report Studio reports using Cognos Business Insight for quick analysis.
  • Upgraded reports from Cognos 8.4 to Cognos 10 and created new active reports in Cognos 10
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance; enhanced existing production ETL scripts.
  • Experience in creating cubes by using Pentaho Schema Workbench.
  • Created ETL jobs using Pentaho Data Integration to handle the maintenance and processing of data.
  • Informatica Data Quality (IDQ 8.6.1) was the tool used for data quality measurement.
  • Gathered requirements from senior developers and stakeholders for implementing Pentaho Data Integration.
  • Extensive MDM experience; worked on different aspects of MDM - the consolidation, cleansing, sharing and governance of data.
  • Worked closely with BI, MDM-aware applications, other MDM teams, and Hub teams (Supplier Hub, Site Hub, Product Hub) and the Financial Modules.
  • Gave presentations on MDM initiatives to internal and external audiences to enable a 360-degree view of the customer.
  • Strong functional and strategic background; adept at working with cross-functional teams to effectively translate complex business requirements into system designs.
  • Developed Test Plans and written Test Cases to cover overall quality assurance testing.
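The IDQ duplicate-elimination step above rests on fuzzy string matching. A minimal Python sketch of the same idea (an edit-distance matcher with an illustrative threshold; the names and the `max_dist=2` cutoff are assumptions, not the IDQ configuration used):

```python
# Classic dynamic-programming Levenshtein edit distance.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

# Keep the first record of each near-duplicate cluster, dropping names
# within the (assumed) edit-distance threshold of an already-kept name.
def dedupe(names, max_dist=2):
    kept = []
    for n in names:
        if all(levenshtein(n.lower(), k.lower()) > max_dist for k in kept):
            kept.append(n)
    return kept
```

For example, `dedupe(["John Smith", "Jon Smith", "Jane Doe"])` collapses the first two names into one surviving record, which is the behavior that supports a single customer view.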

Environment: Informatica Power Center 9.x, Pentaho Data Integration 4.0, MS SQL Server DTS, SSIS & SSRS, Cognos 10, Erwin.

Informatica developer

Confidential, Somerset, NJ

Responsibilities:

  • Worked with architects and DBA in designing the rules engine and implementing the logical and physical model based on the rules and metadata.
  • Followed best practices in Informatica for EDW at various levels of SDLC.
  • Reporting to Manager, leading a cross functional team of 3 Functional Analysts and 5 Developers.
  • Extensively used ERWIN for Logical / Physical data modeling and Dimensional data modeling, and designed Star and Snowflake schemas.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Involved in massive data cleansing prior to data staging.
  • Designed and developed ETL routines using Informatica Power Center. Within the mappings, extensively used Lookups (connected and unconnected), Aggregator, Java, XML and Rank transformations, Mapplets, stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for data flow management into multiple targets.
  • Created complex mappings with shared objects/Reusable Transformations/Mapplets using mapping/Mapplet Parameters/Variables.
  • Gathered requirements from senior developers and stakeholders for implementing Pentaho Data Integration.
  • Created visualization chart and custom reports using Pentaho.
  • Evaluated existing Pentaho ETL process.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records with the Update Strategy transformation to populate the desired target rows.
  • Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into SQL Server database
  • Created sequential/concurrent Sessions/ Batches for data loading process and used Pre & Post Session SQL Script to meet business logic.
  • Configured workflows with an Email Task, which would send mail with the session log on failure of a session and for target failed rows.
  • Extensively used pmcmd commands on command prompt and executed Unix Shell scripts to automate workflows and to populate parameter files.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Used SQL tools like Query Analyzer and TOAD to run SQL queries and validate the data using Oracle join syntax.
  • Involved in Unit testing, User Acceptance Testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
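The pmcmd/shell automation described above typically populates a PowerCenter parameter file and then starts the workflow. A hedged Python sketch of that wrapper logic (the folder, workflow, service and file names are hypothetical; only the standard `[Folder.WF:workflow]` section header and common pmcmd flags are assumed):

```python
# Render a PowerCenter parameter-file section for one workflow.
# Mapping parameters use the $$Name=value convention.
def format_param_file(folder, workflow, params):
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

# Build the pmcmd argv a wrapper shell script would execute
# (returned as a list rather than run, so it can be inspected).
def pmcmd_start(service, domain, user, folder, workflow, param_file):
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain, "-u", user,
            "-f", folder, "-paramfile", param_file,
            "-wait", workflow]
```

In practice the rendered text is written to the parameter file and the argv is handed to the shell (with credentials supplied securely rather than inline).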

Environment: Informatica Power Center 8.x, Pentaho Data Integration 4.0, MS SQL Server DTS, SSIS & SSRS, Erwin, Toad

Informatica Developer

Confidential, Portland, OR

Responsibilities:

  • Reviewed code, design and test plans as appropriate throughout project lifecycle.
  • Involved in Analysis phase of the business requirement and design of the Informatica mappings using low level documents
  • Created mappings with different lookups, like connected, unconnected and dynamic lookups, with different caches such as persistent cache.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Involved in Informatica 9.1 upgrade testing.
  • Involved in requirement gathering, analysis and designing technical specifications for the data migration according to the business requirement.
  • Developed Technical Design Documents
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target based commit interval.
  • Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.
  • Worked with Type-I, Type-II and Type-III dimensions and Data warehousing Change Data Capture (CDC).
  • Wrote Unix scripts to back up the log files in QA and production
  • Created Mapplets for logic used repeatedly across mappings, such as date formatting and data type conversion.
  • Extensively worked with SQL queries, created stored procedures, packages, triggers, views using PL/SQL Programming.
  • Involved in optimization and performance tuning of Informatica objects and database objects to achieve better performance.
  • Used the Teradata Manager tool for monitoring and controlling the system, and modified MultiLoad scripts for Informatica using UNIX to load data into the IDW.
  • Performed Unit and Integration testing and wrote test cases
  • Worked extensively in defect remediation and supported the QA testing
  • Experience in taking repository backup and restoring it, starting and stopping Informatica services and worked with pmcmd commands.
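The session-log analysis mentioned above is the kind of step that is easy to automate. A rough Python sketch that pulls out the error lines a developer would inspect first (the log line format shown in the test is illustrative, not an actual PowerCenter log):

```python
# Return the lines of a session log that carry an ERROR marker.
# Real troubleshooting would also inspect bad files and error tables.
def find_errors(log_text):
    return [line for line in log_text.splitlines() if "ERROR" in line]

# Count lines per leading severity token, for quick success/failure stats.
def severity_counts(log_text):
    counts = {}
    for line in log_text.splitlines():
        sev = line.split(" ", 1)[0] if line else ""
        counts[sev] = counts.get(sev, 0) + 1
    return counts
```

A wrapper could run this over the QA and production log backups referenced above and flag sessions whose error count is nonzero.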

Environment: Informatica Power Center 8.x, MS SQL Server DTS, SSIS & SSRS, Erwin, Toad, UNIX, Oracle

Informatica Developer

Confidential, VA

Responsibilities:

  • Implemented best practices, reviewed code, design and test plans as appropriate throughout project lifecycle.
  • Developed Stored Procedures and invoked the same through Power Center Stored Procedure Transformation
  • Developed Reusable mapplets and Transformations for reusable business calculations. Developed workflows and tasks using Informatica Power Center Work flow Manager
  • Wrote proposals for a small project at HB Communications and Alstom Power.
  • Used Power Center for Extraction, Transformation and Loading data from heterogeneous source systems into the target data base.
  • Created DTS Packages using SQL Server 2000.
  • Used stored procedure, views and functions for faster processing of bulk volume of source data
  • Responsible for unit testing and Integration testing.
  • Assisted in mentoring internal staff on Informatica best practices and skill.
  • Responsible for Performance Tuning of Informatica Mapping and Tuned SQL Queries.
  • Responsible for multiple projects with cross functional teams and business processes.
  • Developed ETL process for the integrated data repository from external sources.
  • Provided production support by monitoring the processes running daily
  • Created Functional Specifications for the different Problem Logs to get the approval of the work.
  • Created Remedy Tickets for the work approval from different users.

Environment: Informatica Power Center 8.x, MS SQL Server DTS, SSIS & SSRS, Erwin, Toad, SQL server 2000

Data warehouse developer

Confidential

Responsibilities:

  • Worked with the customers and business analysts for requirements gathering, rules and metadata, understanding the business and designing the solution.
  • Worked with Data modeler on logical and physical model designs. Worked with DBA to set up development, test, stage and production environments
  • Created Repository using Repository Manager.
  • Worked Extensively on Informatica tools -Repository Manager, Designer and Server Manager.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Created the Source and Target Definitions using Informatica Power Center Designer.
  • Imported flat files into Designer, made modifications, used them in the mappings and loaded the data into Oracle tables.
  • Developed and tested all the backend programs, Informatica mappings and update processes.
  • Created and Monitored Batches and Sessions using Informatica Power Center Server.
  • Tuned the mappings to increase efficiency and performance; used Informatica Workflow Manager to create workflows and Workflow Monitor to monitor and run them.
  • Involved in production support, which also included troubleshooting of connectivity issues.
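The flat-file-to-Oracle flow above starts with parsing the delimited file. A minimal Python sketch of that parsing step (the pipe delimiter and column names are assumptions; in the actual project this was handled by the Designer flat-file import, not hand-written code):

```python
import csv
import io

# Parse a delimited flat file (given as text) into a list of dict rows,
# one dict per record, keyed by the header line's column names.
def load_flat_file(text, delimiter="|"):
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return list(reader)
```

The resulting rows would then be typed, validated and bulk-inserted into the Oracle target tables.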

Environment: Informatica Power Center 8.x, MS SQL Server DTS, SSIS & SSRS, Visio, Erwin, Toad.
