
Sr. Informatica/ETL Developer Resume


Dublin, CA

SUMMARY

  • Over 8 years of IT experience in design, analysis, application development, implementation, and maintenance in client/server architecture, Informatica PowerMart/PowerCenter, Windows 7/XP, Oracle 11g/10g, Greenplum 4, and SeaQuest.
  • 7+ years of strong experience as an ETL/Business Intelligence developer, including data warehouse development with Informatica PowerCenter 9 and 8.
  • Strong experience in developing Extraction, Transformation, and Loading (ETL) strategies using various ETL tools; handled a 3-terabyte data warehouse database.
  • Strong in source-to-target data mappings and Slowly Changing Dimension (SCD) mappings.
  • Expert in Informatica server and client tools, including Designer, Server Manager, Workflow Manager, Workflow Monitor, and Repository Manager, as well as Business Objects and OLAP.
  • Experienced in performance tuning of Informatica mappings and sessions.
  • Experience in writing UNIX scripts and Windows batch jobs.
  • Experience in database programming for data warehouses (star schemas); developed ETL strategies for these warehouses.
  • Wrote PL/SQL, SQL, triggers, and cursors in Oracle 10g/9i/8i. Expertise in performance tuning of triggers, stored procedures, and queries; created tables and views and imported them into Informatica.
  • Experience in writing functions in PostgreSQL (PG SQL) and using external tables in Greenplum.
  • Expertise in creating and scheduling Informatica jobs with Control-M, AutoSys, and cron.
  • High exposure to development, testing, debugging, implementation, documentation, user training, and production support.
  • Exceptional analytical and problem-solving skills; a team player with the ability to communicate effectively at all levels of the development process.

TECHNICAL SKILLS

ETL: Data Warehousing - Informatica PowerCenter/PowerMart 9.6.1/9.5/9.1/8.6.1/8.0/7.1.2/7.1.1/7.0/6.2/5.1.2

Modeling: Star Schema, Snowflake modeling, Fact and Dimension tables, ERwin 4.1/4.0/3.5.2, PowerDesigner 6.0, Oracle Designer

Reporting & BI Tools: Siebel Analytics 7.8.4/7.8.2/7.7, OBIEE 10.1.3.x

Languages: XML, SQL, Shell Scripting, PL/SQL, C, C++, Visual Basic 6.0, Developer 2000 (Forms 6/6i)

Operating Systems: Windows 95/98/2000/XP/Vista, Windows NT, UNIX

Databases: Greenplum 4, Oracle 11g/10g/9i/8i, MS Access 2000, MySQL, DB2, Teradata V2R6, SQL Server 2005/2008

PROFESSIONAL EXPERIENCE

Confidential, Dublin, CA

Sr Informatica/ETL Developer

Responsibilities:

  • Involved in the complete SDLC, collecting and documenting requirements. Wrote TSD documents that developers could easily understand and code from.
  • Working in an onsite/offshore model, helped the team understand the scope of the project and assisted with and developed Informatica code.
  • Developed Informatica mappings to consolidate data from 10 source systems, residing on various platforms, into the newly built data warehouse; these sources included fixed-width and delimited flat files, Excel spreadsheets, and Oracle and Greenplum tables.
  • Created mappings to load data using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, Router, and Union.
  • Prepared the key metrics and dimensions document; in the process, analyzed the System Requirement document and identified areas missed by the business and IT teams.
  • Identified the data sources and the computational logic for rolling up metrics across various dimensions by analyzing all the systems that feed the warehouse. Analyzed systems that had been in production for the past 10 years within just 20 days and was commended for it.
  • Generated sample data based on the data model design and loaded it into the database using UNIX shell scripts; this data was used for both data analysis and unit testing.
  • Formulated and implemented a historical load strategy covering multiple data sources into the data warehouse. Data concurrency, prioritization, comprehensiveness, completeness, and minimal impact to existing users were treated as its key attributes.
  • Led the definition of dimension-table information, including dimension category, data frequency (monthly, weekly, daily, etc.), data periodicity (incremental vs. historical), and identification of CRC (or MD5) change-detection columns.
  • Participated in logical data model review meetings and identified missing attributes, facts, and key measures. Updated the logical data model using ERwin.
  • Proposed a file validation mechanism to ensure correctness of data.
  • Worked on Change Data Capture (CDC) for Type 1 and Type 2 loads.
  • Worked on billion-record tables (performance tuning for historical loads).
  • Implemented an error handling strategy and audit information in the design.
  • Performed integration testing of the developed jobs and migrated jobs between environments.
  • Helped associate developers with analysis and development.
  • Performed performance tuning by adding hints and session-level partitions.
  • Created the job schedule component using the Control-M scheduler.
  • Involved in upgrading Informatica from 9.5 to 9.6.
  • Involved in production support for the data warehouse Informatica jobs.
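The MD5-based change detection and Type 2 CDC mentioned above can be sketched roughly as follows. This is a minimal illustrative Python sketch of the technique, not the project's actual Informatica logic; all column names (`cust_id`, `current_flag`, `eff_date`, etc.) are assumptions for the example.

```python
import hashlib
from datetime import date

# Illustrative sketch of MD5-based change detection for a Type 2 slowly
# changing dimension load. Column names are assumed for the example.

def row_md5(row, attr_cols):
    """Hash the tracked attribute columns so a change is detected by
    comparing one stored value instead of every column."""
    joined = "|".join(str(row[c]) for c in attr_cols)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def apply_scd2(dimension, incoming, key_col, attr_cols, load_date):
    """Type 2 semantics: expire the current row on change and insert a new
    version; unchanged rows (same MD5) are skipped."""
    current = {r[key_col]: r for r in dimension if r["current_flag"] == "Y"}
    for row in incoming:
        h = row_md5(row, attr_cols)
        existing = current.get(row[key_col])
        if existing is not None and existing["md5"] == h:
            continue  # no change detected, nothing to do
        if existing is not None:
            existing["current_flag"] = "N"   # expire the old version
            existing["end_date"] = load_date
        dimension.append({**row, "md5": h, "current_flag": "Y",
                          "eff_date": load_date, "end_date": None})
    return dimension
```

In the actual warehouse this comparison runs inside mappings against the stored CRC/MD5 column, but the decision logic (skip, expire, insert) is the same.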

Environment: Informatica PowerCenter 9.5/9.6.1, Greenplum 4, PostgreSQL, Oracle 11g, Teradata, flat files, OBIEE 10.1.3.x, Toad, pgAdmin III, WinSCP, UNIX/AIX, Windows 7, ERwin, BMC Control-M.

Confidential, San Ramon, CA

Sr. ETL/Informatica Developer

Responsibilities:

  • Worked in all project phases, from requirements gathering through deployment of the program.
  • Used the Informatica ETL tool for design and development of the code.
  • Sourced data from different work units spanning a wide variety of sources, including Oracle, flat files, and spreadsheets.
  • Developed ETL flows from source to stage, stage to work tables, and stage to target tables.
  • Imported source and target definitions in Source Analyzer and Target Designer, connecting to databases via relational and ODBC connections.
  • Developed Informatica mappings for complex business requirements using transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
  • Developed a Web Services transformation to call web services and capture their responses.
  • Worked on developing mapplets and reusable transformations to promote reuse and reduce effort.
  • Created workflows with Command tasks, worklets, Decision, and Event Wait tasks, and monitored sessions using Workflow Monitor.
  • Migrated Informatica folders from the development environment to the test and system test environments, and worked with administrators to migrate them to production.
  • Wrote UNIX scripts to validate input data and to trigger Informatica workflows.
  • Used Informatica parameter files to externalize business logic instead of hardcoding it in mappings.
  • Tuned mappings and mapplets for best performance on the ETL side, and created indexes and analyzed tables periodically on the database side.
  • Organized the data flow, developed many UC4 jobs to schedule the MINT program, and moved them to production.
  • As a primary resource on the production support team, joined emergency calls during application outages and resolved defects as they were raised.
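The pre-load input validation described above amounts to checking a delimited feed before the workflow runs. Here is a hedged Python sketch of that idea; the delimiter and expected column count are assumptions for the example.

```python
import csv
import io

# Illustrative sketch of input-file validation: reject a delimited feed
# whose rows do not match the expected column count before the load is
# triggered. Delimiter and column count are assumed for the example.

def validate_delimited(text, delimiter="|", expected_cols=4):
    """Return (ok, bad_line_numbers) for a delimited extract."""
    bad = []
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    for lineno, fields in enumerate(reader, start=1):
        if len(fields) != expected_cols:
            bad.append(lineno)
    return (len(bad) == 0, bad)
```

In practice a check like this sits in a UNIX shell wrapper that, only on success, starts the workflow through Informatica's `pmcmd startworkflow` command-line client.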

Environment: Informatica PowerCenter 9.0.1, Oracle 11g, XML, WebLogic Application Server, SoapUI 4.0.0, UNIX, Toad, SQL Developer, Web Services, ClearCase, UC4.

Confidential, Palo Alto, CA

Sr. ETL/Informatica Developer

Responsibilities:

  • Worked as technical lead and participated in all project phases, from requirements gathering through deployment (full SDLC).
  • Worked as part of an onsite/offshore model. As lead Informatica developer, took responsibility for creating checklists for coding standards, naming conventions, etc., and developed reusable code components to maintain standard ETL load practices, such as a table-analyze script, stage pre-load, file archiving, and retention of two months of archived files and logs.
  • Used full pushdown optimization (PDO) in Informatica for loading data.
  • Created mappings to load data using different transformations.
  • Troubleshot the ETL process developed for conversions and implemented various techniques to enhance performance.
  • Extensively involved in creating design documents for loading data into the data warehouse, and worked with the data modeler to update the warehouse model when needed.
  • Developed ETL processes to load data into the dimensional model from various sources using Informatica PowerCenter 9.5.1.
  • Developed mapplets and reusable transformations used across different mappings and folders.
  • Designed and developed the error handling mechanism used by all Informatica jobs that load data into the data warehouse.
  • Extensively used the Warehouse Designer of Informatica PowerCenter to create different target definitions.
  • Worked on billion-record tables (performance tuning for historical loads).
  • Created robust and complex workflows and worklets using Informatica Workflow Manager, and troubleshot data load problems.
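The shared error-handling mechanism mentioned above follows a common warehouse pattern: every load writes one audit record with its row counts and status, and rejected rows are diverted rather than failing the job. This Python sketch only illustrates that pattern; the field names and the in-memory "tables" are assumptions, not the project's actual design.

```python
from datetime import datetime

# Rough sketch of a shared error-handling/audit pattern: one audit entry
# per load run, rejects captured instead of aborting. Names are assumed.

audit_log = []     # stands in for an audit table
reject_rows = []   # stands in for a reject/error table

def run_load(job_name, rows, validate):
    """Load rows that pass `validate`; divert the rest to the reject area
    and record an audit entry either way."""
    loaded, rejected = [], []
    for row in rows:
        (loaded if validate(row) else rejected).append(row)
    reject_rows.extend({"job": job_name, "row": r} for r in rejected)
    audit_log.append({
        "job": job_name,
        "run_at": datetime.now().isoformat(timespec="seconds"),
        "read": len(rows),
        "loaded": len(loaded),
        "rejected": len(rejected),
        "status": "WARN" if rejected else "OK",
    })
    return loaded
```

Centralizing this in one reusable component is what lets every job in the warehouse report counts and failures uniformly.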

Environment: Informatica PowerCenter 9.5.1 HotFix 2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 11g, SeaQuest, HPDM, SQL Server, UNIX, Toad, Control-M.

Confidential, Dublin, CA

ETL/Informatica Developer

Responsibilities:

  • Involved in the complete SDLC, collecting and documenting requirements. Wrote TSD documents that developers could easily understand and code from.
  • Working in an onsite/offshore model, helped the team understand the scope of the project and assisted with and developed Informatica code.
  • Developed Informatica mappings to consolidate data from 10 source systems, residing on various platforms, into the newly built data warehouse; these sources included fixed-width and delimited flat files, Excel spreadsheets, and Oracle and Greenplum tables.
  • Created mappings to load data using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, Router, and Union.
  • Prepared the key metrics and dimensions document; in the process, analyzed the System Requirement document and identified areas missed by the business and IT teams.
  • Identified the data sources and the computational logic for rolling up metrics across various dimensions by analyzing all the systems that feed the warehouse. Analyzed systems that had been in production for the past 10 years within just 20 days and was commended for it.
  • Generated sample data based on the data model design and loaded it into the database using UNIX shell scripts; this data was used for both data analysis and unit testing.
  • Formulated and implemented a historical load strategy covering multiple data sources into the data warehouse. Data concurrency, prioritization, comprehensiveness, completeness, and minimal impact to existing users were treated as its key attributes.
  • Led the definition of dimension-table information, including dimension category, data frequency (monthly, weekly, daily, etc.), data periodicity (incremental vs. historical), and identification of CRC (or MD5) change-detection columns.
  • Participated in logical data model review meetings and identified missing attributes, facts, and key measures. Updated the logical data model using ERwin.
  • Proposed a file validation mechanism to ensure correctness of data.
  • Worked on Change Data Capture (CDC) for Type 1 and Type 2 loads.
  • Worked on billion-record tables (performance tuning for historical loads).
  • Implemented an error handling strategy and audit information in the design.
  • Performed integration testing of the developed jobs and migrated jobs between environments.
  • Helped associate developers with analysis and development.
  • Performed performance tuning by adding hints and session-level partitions.
  • Created the job schedule component using the Control-M scheduler.
  • Involved in production support for the data warehouse Informatica jobs.
  • Involved in upgrading Informatica from 8.6 to 9.1.

Environment: Informatica PowerCenter 9.1/8.6, Greenplum 4, Oracle 11g, flat files, Business Objects XI, Toad, pgAdmin III, WinSCP, UNIX/AIX, Windows 7, ERwin, Control-M.

Confidential, Carlsbad, CA

EDW Informatica Developer

Responsibilities:

  • Worked in all project phases, from requirements gathering through deployment of the program.
  • Used the Informatica ETL tool for design and development of the code.
  • Sourced data from 22 different work units spanning a wide variety of sources, including Oracle, flat files, MS Access, and spreadsheets.
  • Developed ETL flows from source to stage, stage to work tables, and stage to target tables.
  • Imported source and target definitions in Source Analyzer and Target Designer, connecting to databases via relational and ODBC connections.
  • Developed Informatica mappings for complex business requirements using transformations such as Normalizer, SQL Transformation, Expression, Aggregator, Joiner, Lookup, Sorter, Filter, and Router.
  • Worked on developing mapplets and reusable transformations to promote reuse and reduce effort.
  • Created workflows with Command tasks, worklets, Decision, and Event Wait tasks, and monitored sessions using Workflow Monitor.
  • Migrated Informatica folders from the development environment to the test and system test environments, and worked with administrators to migrate them to production.
  • Wrote PL/SQL procedures to reconcile financial data between source and target, automating testing phases and helping the business with preliminary validation.
  • Wrote UNIX scripts and environment files for Informatica.
  • Developed metadata-driven code for effective utilization and maintenance, using technical, business, and process metadata.
  • Used Informatica parameter files to externalize business logic instead of hardcoding it in mappings.
  • Generated BO reports to test standardized reports against business requirements.
  • Tuned mappings and mapplets for best performance on the ETL side, and created indexes and analyzed tables periodically on the database side.
  • Organized the data flow, developed many AutoSys jobs to schedule the MINT program, and moved them to production.
  • As a primary resource on the production support team, joined emergency calls during application outages and resolved defects as they were raised.
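The source-to-target financial reconciliation above boils down to comparing row counts and amount totals on both sides and flagging any mismatch. A hedged sketch of that idea follows; the project used PL/SQL against Oracle, and `sqlite3` appears here only to keep the sketch self-contained, with illustrative table and column names.

```python
import sqlite3

# Sketch of source-to-target reconciliation: compare row counts and amount
# totals, flag mismatches. Table/column names are illustrative.

def reconcile(conn, source, target, amount_col):
    """Return a comparison of COUNT(*) and SUM(amount) between two tables."""
    src = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {source}").fetchone()
    tgt = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {target}").fetchone()
    return {
        "count_match": src[0] == tgt[0],
        "amount_match": abs(src[1] - tgt[1]) < 0.005,  # tolerate sub-cent rounding
        "source": src,
        "target": tgt,
    }
```

Running a check like this after each load gives the business a quick preliminary validation before detailed testing starts.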

Environment: Informatica PowerExchange, PowerCenter 8.6, PowerConnect, Trillium, Data Explorer and Data Quality, DataFlux, ERwin 3.5, OBIEE 10.1.3, DB2, Oracle 10g, SQL*Loader, XML, UNIX, Windows NT, Toad, AutoSys.

Confidential, Fort Wayne, IN

Informatica Developer

Responsibilities:

  • Analyzed data flow requirements, developed a scalable architecture for staging and loading data, and translated business rules and functional requirements into ETL procedures.
  • Involved in dimensional modeling of the data warehouse; designed the business process, grain, dimensions, and measured facts.
  • Responsible for designing and introducing new fact and dimension tables into the existing model and deciding the granularity of fact tables.
  • Created the necessary repositories using Repository Manager to handle metadata in the ETL process.
  • Worked on Informatica PowerCenter for extraction, transformation, and loading (ETL) of data in the data warehouse.
  • Used the Informatica Designer, Source Analyzer, Warehouse Designer, and Mapping Designer.
  • Developed and documented data mappings/transformations, audit procedures, and Informatica sessions.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.
  • Worked in Informatica PowerCenter Workflow Manager to create sessions and worklets to run the logic embedded in the mappings.
  • Involved in scheduling Informatica jobs, UNIX scripts, and Oracle procedures.
  • Used SQL*Loader for bulk loading.
  • Migrated data from source to target effectively using the Update Strategy transformation.
  • Collaborated with DBAs and data architects on implementing database changes.
  • Assisted in the design and maintenance of the metadata environment.
  • Interacted with end users regularly to generate the required reports.
  • Involved in pipeline partitioning for the header and line tables.
  • Created partitions through the session wizard in Workflow Manager to increase performance.
  • Extensively worked with multidimensional expression (MDX) data.
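The pipeline partitioning above rests on a simple idea: split rows by a deterministic hash of a key so partitions can be processed in parallel without overlap. In PowerCenter this is configured in the session rather than coded, so the following Python sketch only illustrates the concept; the key and partition count are assumptions.

```python
import zlib

# Conceptual sketch of hash partitioning: rows are bucketed by a
# deterministic hash of a key column so each bucket can be processed in
# parallel. Key name and partition count are assumed for the example.

def partition_rows(rows, key, n_partitions):
    """Distribute rows into n_partitions buckets by key hash."""
    parts = [[] for _ in range(n_partitions)]
    for row in rows:
        bucket = zlib.crc32(str(row[key]).encode("utf-8")) % n_partitions
        parts[bucket].append(row)
    return parts
```

Because the hash is deterministic, the same key always lands in the same partition, which keeps per-key operations (aggregation, updates) safe to run partition-by-partition.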

Environment: Informatica PowerCenter 8.6, Teradata, Metadata Manager, UNIX, SQL Server, flat files, SQL*Loader, MDX, Windows XP, AutoSys

Confidential

ETL Developer

Responsibilities:

  • Created database objects and managed resources.
  • Exported and imported data using the Export/Import utilities.
  • Used EXPLAIN PLAN and Oracle optimizer hints to enhance the performance of SQL statements.
  • Implemented a backup strategy.
  • Generated completion messages and status reports using Informatica.
  • Tuned ETL procedures and star schemas to optimize load and query performance.
  • Designed and tested the dimension and ODS tables.
  • Built interfaces with the existing legacy system.
  • Carried out weekly reviews and technical reviews, and produced weekly progress reports.
  • Acted as primary point of contact between the customer and the core development team.
  • Acted as configuration controller.
  • Configured TNSNAMES.ORA and LISTENER.ORA for client/server connections.
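A TNSNAMES.ORA entry of the kind configured here maps a local alias to a database address, roughly as below. This is a generic Oracle Net local-naming fragment, not the project's actual configuration; the alias, host, port, and service name are placeholders.

```
ORCL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = orcl.example.com)
    )
  )
```

Clients then connect with the alias (`sqlplus user@ORCL`) while LISTENER.ORA on the server registers the matching service.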

Environment: Windows NT/2000, Oracle 8.1, Informatica, Oracle Developer 2000 Forms and Reports.
