
Sr. ETL Informatica Developer Resume


Miami, FL

SUMMARY

  • 9+ years of IT experience in the analysis, design, development, testing and implementation of Informatica workflows for data warehouse/data mart, ETL and OLAP client/server applications.
  • ETL and data integration experience developing mappings and scripts using Informatica PowerCenter 9.x/8.x.
  • Solid understanding of ETL design principles and practical experience implementing ETL processes with Informatica.
  • Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C (including data structures in C), UNIX scripting and Python scripting.
  • Strong ETL experience with Informatica PowerCenter 9.x/8.x/7.x/6.x client tools (Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor) and server tools (Informatica Server, Repository Server Manager), as well as PowerMart.
  • Knowledge of full life cycle development of data warehouses.
  • Knowledge of advanced programming for data transformation (Java, C).
  • Ability to write complex SQL, stored procedures and UNIX shell scripts for ETL jobs and data analysis.
  • Used Informatica Data Quality (IDQ) to cleanse and format customer master data.
  • 4+ years of work with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files (see the BTEQ sketch at the end of this summary).
  • Good exposure to Informatica Cloud Services, including work with Oracle and Salesforce.
  • Experience with change data capture (CDC) and a strong understanding of OLAP and OLTP concepts.
  • Experience across the software development life cycle (SDLC), including development, testing and migration.
  • Proficient with databases and data sources including Oracle, SQL Server, IBM DB2, Teradata, Netezza, XML, Excel sheets and flat files.
  • Able to fully understand business rules from high-level design specifications and implement the corresponding data transformation methodologies.
  • Excellent working knowledge of UNIX shell scripting and of job scheduling with tools such as Control-M and Autosys.
  • Extensive work on data migration, data cleansing and data staging of operational sources using ETL processes, and on providing data mining features for data warehouses.
  • Strong experience creating transformations such as Aggregator, Expression, Update Strategy, Lookup, Joiner, Rank, Router and Source Qualifier in the Informatica Designer.
  • Comprehensive knowledge of dimensional data modelling (star and snowflake schemas, fact and dimension tables, physical and logical data models) and of design tools such as MS Visio.
  • Experience in the development and implementation of database, data warehousing, client/server and legacy applications involving data extraction, data transformation, data loading and data analysis.
  • Good understanding of source/target/field mapping and scheduling in Informatica Cloud Services, with experience creating connections for Oracle and Salesforce in Informatica Cloud.
  • Experience in maintaining data concurrency and data replication.
  • Familiarity with Master Data Management (MDM) concepts.
  • Strong knowledge of relational databases on platforms such as Windows, UNIX and Linux, using GUI tools like SQL Developer, SQL*Plus and Microsoft Visual
  • Analytical and Technical aptitude with the ability to solve complex problems.
  • Good knowledge of DevOps and version control.
  • Knowledge of the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume, Chukwa and Pentaho Kettle.
  • Knowledge of developing MapReduce programs to parse raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Good knowledge of Paxata.
  • Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
  • Expertise in creating mappings, trust and validation rules, match paths, match columns, match rules, merge properties and batch groups.
  • Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
  • Systematic and disciplined, with an analytical and logical approach to problem solving; able to work to tight schedules and manage time efficiently.
  • Excellent written, communication and analytical skills; able to perform independently as well as in a team; a quick learner, able to meet deadlines.
  • Solid experience in various types of testing, including smoke, functional, data quality, regression, performance and system integration testing.
  • Hands-on experience developing Type 1 and Type 2 dimensions, fact tables, star schema designs, operational data stores (ODS), levelling and other data warehouse concepts.
  • Experience in Informatica administration, including hands-on installation, configuration and upgrades of Informatica PowerCenter and Data Explorer in UNIX environments.
  • Experienced in handling various Informatica PowerCenter code migration methods (XML export/import, deployment groups and object copy).
  • Experience in handling Informatica repository backups, restores and failover of services between primary and backup nodes, and a good understanding of file transfer protocols such as FTP and SFTP.
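
As a small illustration of the Teradata utility work cited above, the sketch below shows a minimal BTEQ script that exports one day of rows to a flat file for downstream loading. All names (the tdprod system, stg_db.stg_orders, the output path) are hypothetical placeholders, not details from these projects.

    .LOGON tdprod/etl_user,etl_password;

    /* Export rows loaded today to a flat file (hypothetical names). */
    .EXPORT REPORT FILE = /data/out/orders.txt;
    SELECT order_id, customer_id, order_amt
    FROM   stg_db.stg_orders
    WHERE  load_dt = CURRENT_DATE;
    .EXPORT RESET;

    /* Signal failure to the calling shell script on any SQL error. */
    .IF ERRORCODE > 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;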

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.x/8.x, Informatica PowerExchange 9.1, B2B DX/DT v8.0, Informatica Cloud, WinSCP

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, Python

Operating Systems: Windows 7/Vista/XP/2003/2009/NT/98/95, MS-DOS, UNIX/Linux

Office Applications: Microsoft Word, Excel, Outlook, Access, Project, PowerPoint

Databases: SQL Server 2008/2005, Oracle 11g/10g/9i, MS Access 2003/2007/2010, XML, XSD, Teradata, DB2, Netezza

PROFESSIONAL EXPERIENCE

Confidential, Carlsbad, CA

ETL Informatica Data Quality Consultant

Responsibilities:

  • Interacting with business users and business analysts to gather, understand, analyse and document the requirements for reporting.
  • Converted business requirements into technical requirement documentation (LSD and ETL design documentation).
  • Designed the ETL framework, logging activity, control tables and error handling in collaboration with the ETL integration consultants and architects.
  • Developed Informatica technical mapping documents to pull data from the different source systems and integrate it.
  • Analysed the pharma data for all the products across the different source systems.
  • Worked with Informatica Data Quality (IDQ) and PowerCenter on data cleansing, data profiling, data quality measurement and data validation processing, including match, merge, weightage scoring and the deduplication process.
  • Built workflow recovery, with mappings designed for recovery, tuning and error handling.
  • Debugging and Performance Tuning of sources, targets, mappings and sessions.
  • Developed several reusable transformations, mapplets and workflow tasks.
  • Designed, developed, configured, deployed and implemented software applications, servers, packages and components customized to meet specific needs and requirements.
  • Extensively worked with Salesforce sources, reading data from and writing data back into Salesforce.
  • Involved in Unit, Performance and Informatica DVO Testing for data comparisons and data validations.
  • Experienced in IDE data management, having used all features of the tool.
  • Built Profile and Architect jobs in DataFlux.
  • Analysed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from the staging area to the data mart (a sketch follows this list).
  • Created scripts for new tables, views and queries for new enhancements in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data and better database performance.
  • Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
  • Performed SQL, PL/SQL and application tuning using tools such as EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
  • Created and maintained specifications and process documentation to produce the required data deliverables (data profiling, source-to-client maps, flows).
  • Involved in data profiling activities for column assessment, data inconsistency checks and natural key studies.
  • Extensively worked on Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
  • Identified bottlenecks and performed tuning using appropriate transformations when building mappings.
  • Developed various dimensions (SCD Type 1 and Type 2) and facts, and reconciled the facts.
  • Implemented CDC techniques to read the latest data from source systems incrementally.
  • Created static and dynamic parameter files.
  • Implemented ETL solutions to bring data from the Salesforce application into the data warehouse.
  • Built and implemented scripts for the Oracle GoldenGate Extract, Pump and Replicat processes.
  • Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.
  • Developed Informatica mappings, sessions and workflows.
  • Trained on MDM concepts.
  • Worked with the SAP BODS tool to perform manipulations and transformations of large, complex data volumes efficiently.
  • Created Informatica workflow processes for daily, weekly, monthly, yearly and ad hoc runs.
  • Fine-tuned the data load process for performance improvement, creating/replicating the SQL Server jobs that run the ETL process.
  • Monitored the day-to-day operation of newly developed and existing ETL processes.
  • Created Informatica Cloud Services connections and worked on Informatica Cloud with Oracle and Salesforce.
  • Used SQL queries and PL/SQL database programming (writing packages, stored procedures/functions and database triggers).
  • Worked as an Oracle Applications E-Business Suite consultant and technical developer on the design, development and implementation of Oracle Applications R12/11i.
  • Strong understanding of business flows, the underlying technical architecture and the table structures of the Oracle E-Business Finance Suite.
  • Hands-on experience in Data Conversion, Data Migration, Report Generation and Developing Interfaces within several modules in Oracle Applications.
  • Hands-on experience with the Oracle EBS 11i/R12 P2P modules (iProcurement, AP, PO, iSupplier).
  • Hands-on experience with the Oracle EBS 11i/R12 SCM, BOM and WIP modules.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
  • Extensively worked on database triggers, stored procedures, functions and database constraints; wrote complex stored procedures and triggers and optimized them for maximum performance.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Restricted data for particular users using Row level security and User filters.
  • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Developed Tableau workbooks from multiple data sources using Data Blending.
  • Optimized the mappings using various optimization techniques and debugged existing mappings with the Debugger to test and fix them.
  • Involved in Performance Tuning.
  • Responsible for monitoring source code and scheduling definition deployments to various environments, such as system integration testing, user acceptance testing, and production regions.
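
The sketch below illustrates, in minimal form, the staging-to-mart PL/SQL pattern referenced in the list above: a MERGE-based load wrapped in the framework's error handling. All object names (dm_sales_fact, stg_sales, etl_error_log) are hypothetical placeholders rather than actual project objects.

    -- Minimal staging-to-mart load sketch; all names are hypothetical.
    CREATE OR REPLACE PROCEDURE load_dm_sales AS
    BEGIN
      MERGE INTO dm_sales_fact t
      USING (SELECT sale_id, product_id, sale_amt, sale_dt
             FROM   stg_sales) s
      ON (t.sale_id = s.sale_id)
      WHEN MATCHED THEN
        UPDATE SET t.sale_amt = s.sale_amt, t.sale_dt = s.sale_dt
      WHEN NOT MATCHED THEN
        INSERT (sale_id, product_id, sale_amt, sale_dt)
        VALUES (s.sale_id, s.product_id, s.sale_amt, s.sale_dt);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        -- assumed control/error table, per the framework's error handling
        INSERT INTO etl_error_log (proc_name, err_msg, err_ts)
        VALUES ('LOAD_DM_SALES', SQLERRM, SYSTIMESTAMP);
        COMMIT;
        RAISE;
    END load_dm_sales;
    /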

Environment: Informatica PowerCenter 9.1 (HotFix 6), Informatica Data Quality (IDQ), Oracle 11g, Oracle Applications R12, Teradata, MS SQL Server, Informatica B2B DX, shell programming, MS Visio, WinSCP, Siebel, Salesforce, SAP, SAP BODS.

Confidential, FL

Informatica Data Quality Developer

Responsibilities:

  • Used Informatica PowerCenter 9.6.1 to extract, transform and load data from heterogeneous source systems into the target database.
  • Interacted with subject matter experts and data management team to get information about the business rules for data cleansing.
  • Designed and developed Technical and Business Data Quality rules in IDQ (Informatica Developer) and created the Score Card to present it to the Business users for a trending analysis (Informatica Analyst).
  • Used Informatica data quality (IDQ) in cleaning and formatting customer master data.
  • Built logical data objects (LDO) and developed various mappings, Mapplet/rules using Informatica data quality (IDQ) based on requirements to profile, validate and cleanse the data.
  • Worked on performance improvement areas, debugging issues and devising solutions to reduce processing times.
  • Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
  • Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets.
  • Designed various mappings and mapplets using transformations such as Key Generator, Match, Labeler, Case Converter, Standardizer, Address Validator, Parser and Lookup.
  • Configured AddressDoctor content on both PowerCenter and IDQ servers and helped users in building the scenarios.
  • Used the XML Parser transformation to read XML data.
  • Used the Oxygen tool to create DTD files from XML data.
  • Responsible for creating VIEWS in SharePoint.
  • Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.
  • Communicated effectively with other technology and product team members.
  • Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.
  • Developed Informatica mappings, sessions and workflows.
  • Worked closely with all Application/Development Teams that used Control-M Scheduling.
  • Worked directly with Application/Development on-call to fix issues of failed jobs in Control-M.
  • Coordinated installation of Applications/Development jobs into Control-M.
  • Coordinated procedures for requests for scheduling in Control-M for all Application/Development Teams.
  • Created numerous reports using Control-M Report Facility Tool.
  • Responsible for Performance tuning at various levels during the development.
  • Used the pmcmd command to call/run workflows from UNIX scripts.
  • Performed performance tuning on Informatica Workflows.
  • Involved in data validation, data quality monitoring and data mining.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extracted/loaded data from/into diverse source/target systems like SQL server, XML and Flat Files.
  • Used parameter files, mapping variables and mapping parameters for incremental loading (see the sketch after this list).
  • Managed post-production issues and delivered all assignments/projects within the specified timelines.
  • Extensive use of Persistent cache to reduce session processing time.
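
A minimal sketch of the incremental-loading pattern mentioned above: a Source Qualifier SQL override filtered on a mapping variable whose value is seeded from a parameter file and advanced after each successful run (for example with SETMAXVARIABLE). The variable, table and workflow names here are hypothetical.

    -- Source Qualifier override: read only rows changed since the last run.
    SELECT order_id, customer_id, order_amt, last_update_ts
    FROM   src_orders
    WHERE  last_update_ts > TO_DATE('$$LAST_EXTRACT_TS', 'MM/DD/YYYY HH24:MI:SS')

A matching parameter file section might look like:

    [DWH_FOLDER.WF:wf_orders_incr.ST:s_m_orders_incr]
    $$LAST_EXTRACT_TS=01/01/2015 00:00:00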

Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality (IDQ), Oracle 11g, SharePoint 2010, Oxygen 15.2, SoapUI 5.1.3, Autosys (job scheduler), SQL Server 2008 and UNIX shell scripting.

Confidential, Miami, FL

Sr. ETL Informatica Developer

Responsibilities:

  • Understanding the functional specifications and the documents related to the Architecture.
  • Understood the legacy systems' data and designed and built the target DB schema.
  • Identified suitable dimensions and facts for schema.
  • Helped users in building the scenarios.
  • Worked with Epic modules such as Cadence, as well as Cerner.
  • Worked on Extraction, Transformation and Loading of data using Informatica.
  • Experienced in Informatica Data Quality (IDQ) and PowerCenter: data cleansing, data profiling, data quality measurement and data validation processing.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in the Mapping Designer, with both Informatica PowerCenter and IDQ.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data (a sketch follows this list).
  • Performed profiling, matching, cleansing, parsing and redacting data using Informatica IDQ and implementing standards and guidelines.
  • Expert knowledge of healthcare BI applications, including Epic and Cerner.
  • Created profiles, scorecards, custom rules and reference tables using IDQ.
  • Created mappings using different IDQ transformations like Address Validator, Match, Labeler, Parser, and Standardizer.
  • Built application interfaces and web scraping scripts using OO design, UML modeling and dynamic data structures. Implemented discretization and binning, and data wrangling: cleaning, transforming, merging and reshaping data frames.
  • Determined optimal business logic implementations, applying best design patterns.
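
As an illustration of the PL/SQL business rules mentioned above (for example, called from a Stored Procedure transformation), the sketch below flags out-of-range claim amounts. The function name, thresholds and rule are hypothetical, not the project's actual logic.

    -- Hypothetical business rule: classify a claim amount for loading.
    CREATE OR REPLACE FUNCTION f_validate_claim_amt (
      p_claim_amt IN NUMBER
    ) RETURN VARCHAR2 AS
    BEGIN
      IF p_claim_amt IS NULL OR p_claim_amt <= 0 THEN
        RETURN 'REJECT';                -- unusable amount
      ELSIF p_claim_amt > 100000 THEN
        RETURN 'REVIEW';                -- route high values for manual review
      END IF;
      RETURN 'ACCEPT';
    END f_validate_claim_amt;
    /
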
Environment: Informatica PowerCenter 9.1, Informatica Data Quality (IDQ), Epic Ambulatory care, Toad, Oracle 10g, PL/SQL, RDBMS, UNIX shell programming, Python, Control-M.

Confidential, Milpitas, CA

ETL (Informatica) Consultant

Responsibilities:

  • Participated in all phases including Client Interaction, Design, Coding, Testing, Release, Support and Documentation.
  • Interacted with Management to identify key dimensions and Measures for business performance.
  • Involved in defining the mapping rules and identifying the required data sources and fields.
  • Created ER (Entity Relationship) diagrams.
  • Extensively used RDBMS and Oracle Concepts.
  • Performed dimensional modeling using star schemas (facts and dimensions).
  • Generated weekly and monthly report Status for the number of incidents handled by the support team.
  • Worked on data conversions and data loads using PL/SQL, and created measure objects and aggregations stored in MOLAP mode.
  • Involved in Performance Tuning.
  • Worked on a slowly changing dimension table to keep full history, which was used across the board (see the SCD Type 2 sketch after this list).
  • Used Aggregator, Expression, Lookup, Update Strategy, Router and Rank transformations.
  • Worked for a period in support activities (24x7 production support), monitoring jobs and working on enhancements and change requests.
  • Familiarity with Data Analysis and building Reports and Dashboards with OBIEE.
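
The sketch below shows the SCD Type 2 pattern referenced above in plain SQL: expire the current row when a tracked attribute changes, then insert a new open-ended version. The table, column and sequence names (dim_customer, stg_customer, cust_segment) are illustrative assumptions.

    -- Close out the current version of any customer whose attributes changed.
    UPDATE dim_customer d
    SET    d.end_dt = SYSDATE, d.curr_flag = 'N'
    WHERE  d.curr_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.cust_segment <> d.cust_segment);

    -- Insert a new current version for new and changed customers alike.
    INSERT INTO dim_customer
           (cust_sk, customer_id, cust_segment, eff_dt, end_dt, curr_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_segment,
           SYSDATE, TO_DATE('12/31/9999', 'MM/DD/YYYY'), 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.curr_flag = 'Y');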

Environment: Informatica PowerCenter 8.6, Informatica PowerConnect, RDBMS, Oracle 10g/11g, PL/SQL, Toad, UNIX scripting, OBIEE 10.1.3.4.1

Confidential

ETL Developer (Offshore)

Responsibilities:

  • Creating dimensions and facts in the physical data model.
  • Involved in designing the Data Mart model with Erwin using Star Schema methodology.
  • Used Aggregator, Expression, Lookup, Update Strategy, Router and Rank transformations.
  • Used Lookup transformations to access data from tables that are not sources for the mapping, and used unconnected Lookups to improve performance.
  • Created FTP connections and database connections for the sources and targets.
  • Loaded data into the interface tables from multiple data sources such as MS Access, SQL Server, text files and Excel spreadsheets using SQL*Loader, Informatica and ODBC connections.
  • Wrote a stored procedure to check source data against warehouse data, write any missing records to a spool table, and use that spool table as a lookup in a transformation (a sketch follows this list).
  • Implemented variables and parameters in transformations to calculate billing data in the billing domain.
  • Modified the existing batch process, shell scripts and PL/SQL procedures for effective logging of error messages into the log table.
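
A minimal sketch of the comparison procedure described above: rows present in the source but absent from the warehouse are written to a spool table that a Lookup transformation can then read. All object names are hypothetical.

    CREATE OR REPLACE PROCEDURE chk_missing_accounts AS
    BEGIN
      -- Capture source rows that have no match in the warehouse.
      INSERT INTO spool_missing_accounts (account_id, account_name)
      SELECT s.account_id, s.account_name
      FROM   src_accounts s
      WHERE  NOT EXISTS (SELECT 1
                         FROM   wh_accounts w
                         WHERE  w.account_id = s.account_id);
      COMMIT;
    END chk_missing_accounts;
    /
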
Environment: Informatica 9.1, 7.1, Erwin 4.5, RDBMS, Oracle 8i, PL/SQL, UNIX, Toad.

Confidential

Informatica Developer

Responsibilities:

  • Performed analysis and validation of the DB2 data received from California Track and the QCare-related Oracle data from Colorado, and integrated the various source systems.
  • Designed the ETL framework, logging activity, control tables and error handling in collaboration with the ETL integration consultants and architects.
  • Developed Informatica technical mapping documents to pull data from the different source systems and integrate it.
  • Built workflow recovery, with mappings designed for recovery, tuning and error handling.
  • Debugging and Performance Tuning of sources, targets, mappings and sessions.
  • Developed Type 2 slowly changing dimension mappings.
  • Created complex mappings using various transformation tasks and fine-tuned them.
  • Assisted in the use of exception handling, records, arrays, table partitioning and bulk collects (see the PL/SQL sketch after this list).
  • Created RFCs and SRs for enhancements to be deployed and obtained approvals.
  • Performed health checks to ensure the source and target data were accurate and valid.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
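
The sketch below illustrates the PL/SQL bulk-processing techniques listed above: a cursor fetched with BULK COLLECT into a collection of records, loaded with FORALL, and guarded with SAVE EXCEPTIONS error handling. Table names (stg_claims, dw_claims) are hypothetical.

    DECLARE
      TYPE t_claims IS TABLE OF stg_claims%ROWTYPE;  -- collection of records
      v_claims     t_claims;
      CURSOR c_claims IS SELECT * FROM stg_claims;
      bulk_errors  EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errors, -24381);    -- ORA-24381: FORALL errors
    BEGIN
      OPEN c_claims;
      LOOP
        FETCH c_claims BULK COLLECT INTO v_claims LIMIT 1000;
        EXIT WHEN v_claims.COUNT = 0;
        BEGIN
          FORALL i IN 1 .. v_claims.COUNT SAVE EXCEPTIONS
            INSERT INTO dw_claims VALUES v_claims(i);
        EXCEPTION
          WHEN bulk_errors THEN
            -- Log each failed row and keep processing remaining batches.
            FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
              DBMS_OUTPUT.PUT_LINE('Row ' ||
                SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ': ' ||
                SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
            END LOOP;
        END;
        COMMIT;
      END LOOP;
      CLOSE c_claims;
    END;
    /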

Environment: Informatica 8.6.1, Oracle RDBMS 10g, PL/SQL, DB2, WinSQL, MS SQL Server, shell programming, MS Visio, WinSCP, VSS, TOAD 9, UNIX AIX, Cygwin, Tivoli, Erwin, Remedy User.
