
Informatica Tech Lead Dev Resume


Dallas, TX

PROFESSIONAL SUMMARY:

  • 10 years of experience designing, developing, and maintaining large business applications involving data migration, integration, conversion, and data warehousing.
  • Thorough domain knowledge of the banking, insurance (and reinsurance), healthcare, pharmacy, and telecom industries.
  • Experience with various versions of the Informatica PowerCenter client and server tools, and with Business Objects tools.
  • Worked extensively with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad.
  • Reviewed and assessed business requirements, identified gaps, defined business processes, and delivered project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expertise in data warehousing, ETL architecture, data profiling, business analytics warehouse (BAW), and Business Objects reporting.
  • Designed and developed Informatica mappings enabling the extraction, transport, and loading of data into target tables in Teradata.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and passed the data to Microsoft SharePoint.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Generated reports from the data mart using OBIEE and worked with Teradata.
  • Expertise in tuning the performance of Informatica mappings and sessions and in identifying performance bottlenecks.
  • Created pre-session and post-session scripts to ensure timely, accurate processing and balanced job runs.
  • Integrated various data sources, including SQL Server, Oracle, Teradata, flat files, and DB2 on mainframes.
  • Strong experience with Business Objects versions 6.5 through XI R1/R2 and XI R2 through XI 3.1.
  • Thorough knowledge of OLAP architectures such as DOLAP, MOLAP, ROLAP, and HOLAP.
  • In-depth knowledge of designing fact and dimension tables and physical and logical data models using ERwin 4.0, including forward and reverse engineering.
  • Experience creating UNIX shell scripts and Perl scripts.
  • Knowledge of report development using Business Objects.
  • Knowledge of installing and configuring the Informatica server with SQL Server and Oracle, and of Informatica administration tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
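The pre-/post-session balancing mentioned above can be sketched as a row-count reconciliation. This is an illustrative sketch only: in practice the counts would come from session logs and target-table queries, and every name here is hypothetical.

```python
# Illustrative post-session balancing check: verify that every source row
# is accounted for by the target load plus rejects, and fail the run if not.
# All function and variable names are hypothetical.

def balance_check(source_count: int, target_count: int, rejected_count: int = 0) -> bool:
    """Return True when source rows equal target rows plus rejected rows."""
    return source_count == target_count + rejected_count

def run_post_session(source_count: int, target_count: int, rejected_count: int = 0) -> str:
    """Raise on an out-of-balance load so the scheduler marks the job failed."""
    if not balance_check(source_count, target_count, rejected_count):
        raise RuntimeError(
            f"Balancing failed: source={source_count}, "
            f"target={target_count}, rejected={rejected_count}"
        )
    return "balanced"
```

A post-session command task would invoke a script like this and let a non-zero exit status fail the workflow.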

TECHNICAL SKILLS:

Data Warehousing:  Informatica PowerCenter, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage, ERwin

Business Intelligence Tools:  Business Objects, MicroStrategy, Cognos

Databases:  MS SQL Server, Oracle, Sybase, Teradata, MySQL, MS-Access, DB2

Database Tools:  SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Development Languages:  C, C++, XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Other Tools and Technologies:  MS Visual SourceSafe, PVCS, Autosys, Control-M, Remedy, Mercury Quality Center, StarTeam

PROFESSIONAL EXPERIENCE:

Confidential

Senior ETL Lead

Responsibilities:
  • The primary objective of this project was to develop a data warehouse and perform complex data migration.
  • Acted as Sr. ETL technical lead; created project plans and formulated the required solution design diagrams.
  • Led a team of 8 developers, guiding and helping them with coding and troubleshooting.
  • Worked on the CFLI project, creating mappings from flat files and connecting them with XMLs; migrated data from Oracle applications to SQL Server; and worked on the implementation of cloud-based ETL applications.
  • Used Teradata utilities such as TPump to load data.
  • Wrote Teradata BTEQ scripts to transform data; performed heavy Oracle source-system analysis using PL/SQL; integrated Salesforce into the ETL systems for finance reporting; and tested applications coded in Perl.
  • Developed batch automation for ETL/Informatica interface applications.
  • Involved in interface ETL batch setup, execution, and integration testing of all interface applications, and supported production implementation.
  • Supported distributed database data, including data refreshes, data backups, and executing database scripts.
  • Involved in executing and validating builds of Dev, Test, and Prod Informatica jobs; in mainframe batch execution; and in ChangeMan package creation, build, and promotion.
  • Scheduled and participated in meetings with development teams to understand the code-branching strategy (SCM), code build dependencies, and deployment processes.
  • Set up and supported a continuous integration pipeline (build and deploy) for the development and test environments.
  • Coordinated with external support teams (DBA, IDoc, MQ, etc.) for continuous integration/continuous deployment of the Commercial Lines Transformation program assets.
  • Participated in huddle board meetings and provided on-call support as required.
  • Developed shell scripts to automate source-file handling; used Informatica PowerExchange 9.5 connections to read data from SOAP and REST web services; used Informatica to perform ETL data migration into SQL Server; and wrote complex DDL and DML statements.
  • Used Informatica PowerExchange connections to read and extract data from legacy mainframe applications, loaded it into Oracle, and did advanced PL/SQL coding.
  • Created a successful integration of flat files, web services, and databases across the network; performed dimensional modeling; and connected Tableau to the modeled cubes to build and customize reports.
  • Acted as ETL technical lead and architect: created designs and data models, integrated flat files, web services, and databases across the network, performed dimensional modeling for property/casualty applications, and implemented interfaces connected to MDM.
  • As architect, developed a conceptual model in ERwin based on requirements analysis, and used Microsoft Excel to create and analyze custom reports.
  • As architect, developed normalized logical and physical database models to design an OLTP system for insurance applications.
  • Created the dimensional model for the reporting system by identifying the required dimensions and facts using ERwin; performed data migration from Microsoft CRM to SQL Server.
  • Drove the development and implementation of financial analytic tools developed in Python.
  • Designed entity-relationship diagrams for enterprise-wide use in government and civilian reporting, business analysis, and management decisions, and modeled very large datasets of more than 60 TB.
  • Built a data model encompassing executive systems, with details on current and needed interfaces.
  • Created several Informatica mappings in PowerCenter for all the EDW environments, ran and monitored the workflows, and connected to mainframe sources using Informatica PowerExchange, with legacy mainframe DB2 as the source database.
  • Used XMLSpy to read the XMLs coming across the integration systems, verified their formats, and passed the XML data between database systems.
  • Used the Informatica Big Data Management platform to process data using connectors, data integration transformations, and parsers.
  • Also used Informatica Big Data Management to support Hadoop and big-data-integrated applications.
  • Used Informatica B2B DX/DT Studio components such as Parser and Serializer, and created customized XML schemas configured through the Unstructured Data transformation in Informatica.
  • Excellent experience in XML, XSD, JavaScript, HTML, and CSS coding.
  • Accessed Informatica B2B DT Studio projects and created DT Studio scripts, uploaded to the server, for modifying existing Informatica schemas through the Unstructured Data transformation.
  • Created several design patterns and standards documents covering the ETL processes above; authored .tgp scripts with the B2B DT Studio authoring tools; and used the TrackWise tool for quality management.
  • Worked on Teradata FastLoad and MultiLoad, executed BTEQ scripts, used Teradata for the auditing process, and ran batch jobs with Teradata.
  • Used Informatica Data Quality for data profiling; tested the databases using PL/SQL; analyzed several mapping scenarios; and have SQL development expertise with DB2.
  • Created all the required paths and folders accessible from UNIX, ran the Informatica workflows from UNIX via crontab, and performed data analysis using advanced techniques.
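The XML format verification described above (done interactively with XMLSpy) can also be sketched as an automated pre-check before data is passed downstream. This is illustrative only; the element names (`policy`, `policyNumber`, `effectiveDate`) are hypothetical stand-ins.

```python
# Illustrative check that an incoming XML document parses and that every
# <policy> record carries the required child elements before the data is
# passed to downstream database systems. Element names are hypothetical.
import xml.etree.ElementTree as ET

REQUIRED_CHILDREN = {"policyNumber", "effectiveDate"}

def verify_policy_xml(xml_text: str) -> bool:
    """Return True only for well-formed documents with complete records."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False  # reject malformed XML outright
    for policy in root.iter("policy"):
        present = {child.tag for child in policy}
        if not REQUIRED_CHILDREN <= present:
            return False  # a record is missing a required element
    return True
```

A wrapper script would reject files failing this check into an error directory rather than loading them.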

Environment: Informatica Power Center 9.6/9.5 version, Informatica power exchange, Informatica DT Studio, Microsoft SSIS, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, Mercury Quality Center, MDM

Confidential, Dallas, TX

Informatica Tech Lead Dev

Responsibilities:
  • This federal project converted stored procedures into ETL logic using Informatica.
  • Created ETL mappings based on requirements derived from the stored-procedure documents for the ETL process.
  • Wrote Teradata macros and used various Teradata analytic functions.
  • Wrote extensive Teradata FastExport scripts to export various forms of data.
  • Developed the required data warehouse model using Star schema for the generalized model
  • Used forward engineering approach for designing and creating databases for OLAP model.
  • Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for Oracle applications
  • Used XMLSpy to read the XMLs coming across the integration systems, verified their formats, and passed the XML data between database systems; used Informatica PowerExchange to read data from SOAP and REST web services.
  • Migrated existing inbound processes from the legacy system to Oracle and did advanced PL/SQL coding.
  • Transferred the last 7 years of historical data from Oracle into Hive using Sqoop jobs during off-peak hours.
  • Worked with business teams and created Hive queries for ad hoc analysis.
  • Designed and developed Informatica mappings in PowerCenter, enabling the extraction, transport, and loading of data into target tables in SQL Server; wrote complex DML and DDL statements.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and created multi-dimensional dashboards and views using Tableau.
  • Designed and developed Informatica mappings for data loads and data cleansing; worked on a data integration project dealing with Oracle EBS systems.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations, and created customized dashboards using Tableau.
  • Excellent experience in XML, XSD, JavaScript, HTML, and CSS coding, used for various internal tasks.
  • Developed Perl and Shell scripts for upload of data feed into database
  • Developed reports for applications programmed in Python.
  • Designed and developed process to handle high volumes of data coming from Db2 Mainframes and high volumes of data loading in given load intervals.
  • Involved in testing stored procedures and functions, and in unit and integration testing of Informatica sessions, batches, and the target data.
  • Designed and developed table structures, stored procedures, and functions to implement business rules.
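The 7-year Oracle-to-Hive history migration above relied on Sqoop jobs run in yearly slices during off-peak hours. A minimal sketch of assembling such a job follows; the JDBC URL, table name, and date column are all hypothetical, and the real jobs would be launched by a scheduler, not shown here.

```python
# Illustrative assembly of a yearly Sqoop import command for moving
# historical Oracle data into Hive. Connection string, table, and the
# txn_date column are hypothetical placeholders.

def sqoop_import_cmd(table: str, year: int) -> list:
    """Build the argv list for one yearly Sqoop import into Hive."""
    return [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",  # hypothetical
        "--table", table,
        "--where", f"EXTRACT(YEAR FROM txn_date) = {year}",   # hypothetical column
        "--hive-import",
        "--hive-table", f"hist_{table.lower()}_{year}",
        "--num-mappers", "4",
    ]

# One command per year covers a 7-year history window, e.g. 2007-2013:
commands = [sqoop_import_cmd("TRANSACTIONS", y) for y in range(2007, 2014)]
```

Slicing by year keeps each off-peak run bounded and restartable if a window closes before the transfer finishes.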

Environment: Informatica Power Center 9.0/8.6, Web Services, SQL Server 2005/2000, Oracle 11i/10g, Teradata 12, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, StarTeam, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential

Informatica Senior Developer/Db2 Legacy

Responsibilities:
  • Acted as senior Informatica developer, creating mappings from flat files and connecting them with XMLs.
  • Created mappings connecting the XMLs with the payload DB2 mainframe databases and the WPS tables in the event-store databases.
  • Created a successful integration of flat files, web services (using Informatica PowerExchange), and databases across the network, and performed data migration for the applications.
  • Created several Informatica mappings in PowerCenter for all the EDW environments, and ran and monitored the workflows.
  • Created audit and control processes using Teradata metadata tables and ran the scripts using the audit-and-control statistics.
  • Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions, and packages to migrate data from a SQL Server database to an Oracle database.
  • Performed database administration of all database objects, including tables, clusters, indexes, views, sequences, packages, and procedures.
  • Implemented 11g and upgraded the existing database from Oracle 9i to Oracle 11g.
  • Involved in logical and physical database layout design.
  • Used XMLSpy to read the XMLs coming across the integration systems, verified their formats, and passed the XML data between database systems.
  • Created several design patterns, standards documents, and ETL strategies explaining the processes above.
  • Tested the databases, especially the source mainframe DB2, using PL/SQL, and analyzed several mapping scenarios.
  • Excellent experience in XML, XSD, JavaScript, HTML, and CSS coding, used for various internal tasks.
  • Created all the required paths and folders accessible from UNIX and ran the Informatica workflows from UNIX via crontab.
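The audit-and-controls step above recorded load statistics from Teradata metadata tables. A minimal sketch of producing those audit records from per-table counts follows; in practice the counts would come from metadata queries, and all names here are hypothetical.

```python
# Illustrative audit-and-controls step: given per-table source and target
# row counts (which would come from Teradata metadata tables), emit one
# audit record per table with an OK / OUT_OF_BALANCE status.
# Table names are hypothetical.

def build_audit_rows(source_counts: dict, target_counts: dict) -> list:
    """Return one audit record per source table, flagging imbalances."""
    rows = []
    for table, src in sorted(source_counts.items()):
        tgt = target_counts.get(table, 0)  # missing target table counts as 0
        rows.append({
            "table": table,
            "source_count": src,
            "target_count": tgt,
            "status": "OK" if src == tgt else "OUT_OF_BALANCE",
        })
    return rows
```

The resulting records would be inserted into an audit table so control reports can spot out-of-balance loads.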

Environment: Teradata 14, Erwin, Informatica Power Center 9.1, Informatica Power Exchange, SQL Server 2005/2000, Oracle 11, BAW, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Mercury Quality Center, MDM

Confidential, Columbus, Ohio

BI Business Objects/Informatica

Responsibilities:
  • Acted as technical lead in modeling, estimation, requirement analysis, and design of the mapping document, and in planning using ETL and BI tools, MDM, and Toad across various environment sources.
  • Acted as team lead coordinating offshore ETL development for the EDW and weblogs; deliverables included ERDs, data models, data flow diagrams, use cases, gap analysis, and process-flow documents; expert understanding of Ralph Kimball's dimensional-modeling methodology.
  • Created Business Objects dashboards using Crystal Xcelsius for senior management business decision-making on BO Mobile interfaces.
  • Performed Business Objects server management tasks using the Central Configuration Manager (CCM) and Central Management Console (CMC); worked in UNIX and Windows.
  • Loaded data into Teradata from legacy systems and flat files using complex scripts; also worked with XML, XSD, and DT Studio, and customized reports using IMM tools.
  • Excellent experience in XML, XSD, JavaScript, HTML, and CSS coding, used for various internal tasks.
  • Used XMLSpy to read the XMLs coming across the integration systems, verified their formats, and passed the XML data between database systems.
  • Integrated Business Objects with SAP BI reporting, which involved query building, filtering, free characteristics, restricted key figures and variables, and query variants using the BEx Analyzer.
  • Extensively worked on BO XI 3.1 for reporting.
  • Involved in ad hoc query development and data mining in Business Objects.
  • Extensively worked in InfoView to create Web Intelligence and Desktop Intelligence reports over the universes created.
  • Extensively worked on the Central Management Console (CMC) to configure the BO environment.
  • Defined BO classes and objects in the universe; defined cardinalities, contexts, joins, and aliases; and resolved loops in universes using table aliases and contexts.
  • Created highly complex custom line graphs and bar charts with shared data between main reports and sub-reports using Crystal Reports.
  • Created universes over BEx queries and InfoCubes of SAP BI/BW using Universe Designer.
  • Extensively worked in InfoView to create Web Intelligence reports over the universes created.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager, and passed the data to Microsoft SharePoint.
  • Used VB.NET to create user-defined functions in SSRS; created standard and data-driven subscriptions; used Report Manager for security; also worked with XML, XSD, and DT Studio.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations, and worked on the administration side of Oracle 11 and Oracle 10.
  • Generated reports from the data mart using SAP Business Objects universe interfaces.
  • Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the targets.
  • Performed configuration management to migrate Informatica mappings, sessions, and workflows from development to test to production environments.
  • Good knowledge of Teradata Manager, PMON, DBQL, SQL Assistant, and BTEQ.
  • Designed and developed processes to handle high volumes of data from mainframe DB2 databases and high-volume data loads within given load intervals.

Environment: Business Objects, Informatica Power Center 9.1, Informatica Power Exchange, Oracle Data Integrator, SAP BO 4.0, SQL Server 2005/2000, SSRS, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, StarTeam, Remedy, Maestro job scheduler, Mercury Quality Center, MDM

Confidential, Columbus, Ohio

Informatica Developer/SSIS SSRS SSAS Developer

Responsibilities:
  • Interacted with the business community and gathered requirements based on changing needs; incorporated the identified factors into Informatica mappings to build the data mart; worked on the Claims PolicyCenter migration.
  • Used XMLSpy to read the XMLs coming across the integration systems, verified their formats, and passed the XML data between database systems.
  • Acted as team lead coordinating offshore B2B PowerCenter and PowerExchange work, data conversion using a Gateway-like tool, and data migration using Toad; also used Informatica Data Quality for cleansing when dealing with advanced DB2 mainframes.
  • Experience designing cubes in Business Intelligence Development Studio (SSAS), editing cube properties including measure groups and measures, cube dimensions, dimension relationships, and calculations.
  • Enhanced database and data warehousing (OLAP, OLTP) cube functionality by creating KPIs, actions, and perspectives using SQL Server Analysis Services 2008 (SSAS).
  • Developed mappings to extract data from SQL Server, Oracle, Teradata 12, and flat files and load it into the data mart using PowerCenter; acted as PR4/PR3 programmer for manipulating CDC modules; also used Pentaho for ETL.
  • Developed common routine mappings; made use of mapping variables, mapping parameters, and variable functions; worked on Integration Services and Reporting Services (SSIS and SSRS).
  • Experience creating SSIS packages and migrating DTS packages from SQL Server 2000 to SQL Server 2005/2008.
  • Extensive ETL experience using DTS/SSIS for data extraction, transformation, and loading from OLTP systems to ODS and OLAP systems.
  • Used Informatica Workflow Manager to create, schedule, execute, and monitor sessions, worklets, and workflows; accessed mainframe DB2 data maintained by COBOL programs.
  • Wrote procedures and queries to retrieve data from the DWH and implemented them in the DM; connected Informatica with Teradata 12 and performed migrations.
  • Extracted and transferred data to and from the SQL Server database using utilities/tools such as Toad and BULK INSERT, and worked on the contingency plan using SQL queries.
  • Published and scheduled Business Objects reports to users using the scheduler, updated data to Microsoft SharePoint, and fixed issues with WinSCP interfaces.
  • Used stored procedures as data providers to retrieve data from scheduled tables and complex queries.
  • Designed the SAP Business Objects universe based on the XI repository, developed Business Objects reports and Crystal Reports, and documented the work using Microsoft Office.
  • Developed a centralized schema console using the business analytics warehouse (BAW), wrote analytical queries, designed and developed ELT (extract, load, transform) solutions, and ensured only checked data is loaded.
  • Developed core system components utilizing SQL, Oracle, Informatica, Maestro, and Harvest.
  • Wrote SQL queries, triggers, and PL/SQL procedures to apply and maintain the business rules.
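The "only checked data is loaded" rule above amounts to a validation gate ahead of the load step. A minimal sketch follows; the field names and rules (`customer_id` present, non-negative `amount`) are hypothetical examples of such checks, not the project's actual rules.

```python
# Illustrative pre-load validation gate: only records passing simple
# checks reach the load step; the rest are routed to a reject list.
# Field names and rules are hypothetical.

def is_checked(record: dict) -> bool:
    """A record passes when its key fields are present and sane."""
    return (
        bool(record.get("customer_id"))      # key must be non-empty
        and record.get("amount", -1) >= 0    # measure must be non-negative
    )

def split_for_load(records):
    """Partition records into (loadable, rejected)."""
    loadable = [r for r in records if is_checked(r)]
    rejected = [r for r in records if not is_checked(r)]
    return loadable, rejected
```

Rejected records would typically land in an error table with a reason code for later correction and reload.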

Environment: Informatica Power Center 9.0/8.6, SQL Server 2005/2000, Oracle 11i/10g, Teradata 12, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, StarTeam, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, MI

Informatica Developer with SSIS/SSRS/SSAS

Responsibilities:
  • Interacted with business community and gathered requirements based on changing needs.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Identified all the dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables; accessed AS/400 mainframe DB2 systems with COBOL.
  • Tested the reports like Drill Down, Drill up and pivot reports generated from Cognos.
  • Involved in deploying SSIS Package into Production and used Package configuration to export various package properties to make package environment independent.
  • Gathered report requirements and determined the best solution to provide the results, either a Reporting Services report or an automation-services framework built using VBScript practices.
  • Wrote complex SQL queries and stored procedures to create reports using SSRS 2005/2008.
  • Generated parameterized/Drilldown reports using SSRS 2005/2008
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Performed configuration management to migrate Informatica mappings, sessions, and workflows from development to test to production environments; troubleshot data issues with Oracle Warehouse Builder.
  • Developed Cubes using SQL Analysis Services (SSAS).
  • Deployed SSAS cubes to the production server; worked on the installation of Facets software.
  • Generated reports from the cubes by connecting to the Analysis Server from SSRS.
  • Experience in Developing and Extending SSAS Cubes, Dimensions and data source view SSAS-Data Mining Models and Deploying and Processing SSAS objects.
  • Used ProClarity to access and analyze cubes and dimensions, and worked on the integration of SharePoint with SSRS.
  • Created reports like Master/Detail reports, Cross Tab reports, slice and dice reports, and drill down reports.

Environment: Informatica 8.6, SQL Server 2000, Teradata 6, Oracle 9i, DB2, SQL, PL/SQL, Mainframes, Sun Solaris, UNIX Shell Scripts, Cognos 8, Erwin, Autosys, Remedy

Confidential, MI

Informatica developer

Responsibilities:
  • Worked closely with business users while gathering requirements, analyzing data and supporting existing reporting solutions.
  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL.
  • Created complex mapplets for reuse and deployed reusable transformation objects such as mapplets to avoid duplicating metadata, reducing development time.
  • Created synonyms for copies of time dimensions; used the Sequence Generator transformation to create sequences for generalized dimension keys, the Stored Procedure transformation for encoding and decoding functions, and the Lookup transformation to identify slowly changing dimensions.
  • Fine-tuned existing Informatica maps for performance; used MQ Series for passing distributed data; and worked on PowerCenter and PowerExchange B2B.
  • Worked on Informatica Designer tools: Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer and Server Manager to create and monitor sessions and batches.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts.
  • Involved in developing Informatica mappings and tuning them for better performance.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so that they can be corrected and reloaded into the target system.
  • Analyzed existing system and developed business documentation on changes required.
  • Made adjustments in Data Model and SQL scripts to create and alter tables.
  • Extensively involved in end-to-end testing of the system to ensure the quality of the adjustments made to accommodate the source-system upgrade.
  • Worked on various issues on existing Informatica Mappings to produce correct output.
  • Involved in intensive end user training (both Power users and End users in Report studio and Query studio) with excellent documentation support.
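The SCD Type II handling mentioned above (expire the current dimension row on a change, insert a new current row) can be sketched outside Informatica like this. The column names (`natural_key`, `attr`, `eff_date`, `end_date`, `is_current`) and the high date are hypothetical conventions, not the project's actual schema.

```python
# Illustrative SCD Type II logic: when a tracked attribute changes, close
# out the current dimension row and insert a new current row. Column
# names and the 9999-12-31 high date are hypothetical conventions.
HIGH_DATE = "9999-12-31"

def apply_scd2(dim_rows: list, incoming: dict, load_date: str) -> list:
    """Apply one incoming source record to the dimension row list."""
    current = next(
        (r for r in dim_rows
         if r["natural_key"] == incoming["natural_key"] and r["is_current"]),
        None,
    )
    if current is None:
        # brand-new key: insert as the current row
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": HIGH_DATE, "is_current": True})
    elif current["attr"] != incoming["attr"]:
        # tracked attribute changed: expire old row, insert new current row
        current["end_date"] = load_date
        current["is_current"] = False
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": HIGH_DATE, "is_current": True})
    # unchanged records fall through with no action
    return dim_rows
```

In a PowerCenter mapping the same decision is typically made with a Lookup on the current row, an Expression comparing attributes, and an Update Strategy routing insert vs. update.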

Environment: Informatica, Oracle 10g/9i, SQL, SQL Developer, Windows 2008 R2/7, Toad

Confidential, Seattle, WA

Business Objects / Informatica

Responsibilities:
  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying the sources, validating source data, developing the required logic and transformations, creating mappings, and loading the data into the business intelligence database.
  • Created DeskI and WebI reports using Business Objects functionality such as queries, slice and dice, drill down, @Functions, cross tabs, master/detail, and formulas.
  • Created prompts, conditions, and filters to improve report generation; also used alerts to improve the readability of the reports.
  • Developed custom BEx queries and workbooks using the BEx Analyzer; published queries in the BEx browser using Web Application Designer.
  • Created dashboards using Crystal Xcelsius for senior management business decision-making on BO Mobile interfaces.
  • Performed server management tasks using the Central Configuration Manager (CCM) and Central Management Console (CMC); worked in UNIX and Windows.
  • Worked on SAP BI BO reporting, which involved query building, filtering, free characteristics, restricted key figures and variables, and query variants using the BEx Analyzer.
  • Extensively worked on Business Objects XI 3.1 for reporting.
  • Involved in ad hoc query development and data mining.
  • Extensively worked in InfoView to create Web Intelligence and Desktop Intelligence reports over the universes created.
  • Expertise in creating Teradata MultiLoad, FastLoad, and TPump control scripts to load data into the BID.
  • Expertise in creating control files to define job dependencies and in scheduling using the Maestro tool.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
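The FastLoad control scripts mentioned above follow a fixed template, so they are often generated rather than hand-written. A minimal generator sketch follows; the database, table, file names, column types, and logon placeholder are all hypothetical, and a production script would also size SESSIONS and handle checkpoints.

```python
# Illustrative generator for a Teradata FastLoad control script of the
# kind described above. Database, table, file, and credential values are
# hypothetical placeholders; columns are typed VARCHAR for simplicity.

def fastload_script(db: str, table: str, data_file: str, columns: list) -> str:
    """Render a pipe-delimited FastLoad control script for one table."""
    cols = ",\n       ".join(f"{c} (VARCHAR(255))" for c in columns)
    vals = ", ".join(f":{c}" for c in columns)
    return f"""LOGON tdpid/etl_user,********;
BEGIN LOADING {db}.{table}
   ERRORFILES {db}.{table}_err1, {db}.{table}_err2;
SET RECORD VARTEXT "|";
DEFINE {cols}
FILE={data_file};
INSERT INTO {db}.{table} VALUES ({vals});
END LOADING;
LOGOFF;
"""
```

The generated text would be written to a `.fld` file and passed to the `fastload` utility by the Maestro-scheduled job.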

Environment: Business Objects, Informatica, ETL, Teradata V2R5 as the BID, Oracle 10g/9i/8i, HP-UX, Sun OS, Perl scripting, Erwin, PL/SQL, Maestro for scheduling
