Data Integration Consultant/Informatica Developer Resume
Indianapolis, IN
PROFESSIONAL SUMMARY:
- 9+ years of professional experience in the IT industry, including 8+ years of expertise in the development and implementation of Data Warehouse applications and architecture.
- Extensive experience in designing and developing ETL methodologies to support data transformation and processing in a corporate-wide environment using Informatica PowerCenter 9.5.1/9.1.0/8.6/8.5.1/8.1.1.
- Experience in TOAD, Explain Plan, Ref Cursors, Constraints, Triggers, Indexes (B-tree and Bitmap), Views, Inline Views, Materialized Views, Database Links, and Export/Import Utilities.
- Involved in the full development lifecycle from requirements gathering through development and support for retail, banking and communications industries using Informatica PowerCenter, Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor.
- Experience in developing strategies for Extraction, Transformation and Loading (ETL) mechanisms.
- Experience in creating complex parallel loads and dependencies using workflows.
- Knowledge of data warehousing techniques, Star/Snowflake schemas, ETL, fact and dimension tables, physical and logical data modeling, OLAP, and report delivery methodologies.
- Experience in integration of various data sources like SQL Server, Oracle, Flat Files, and Excel.
- Loaded EDW data from flat files and Oracle.
- Modified Cognos cubes for sales by region.
- Installation and configuration of Informatica servers and client environments.
- Experience in writing, testing, and implementing triggers, cursors, procedures, and functions at the database level using PL/SQL (a representative sketch follows this summary).
- Experience in debugging and performance tuning of sources, targets, mappings and sessions.
- Interacted with business partners to identify information needs and business requirement for reports.
- Familiar with Business Objects 5.x.
- Experience in high-end programming using SQL, PL/SQL, and SQL*Loader.
- Installed/configured Teradata Power Connect for Fast Export for Informatica.
- Excellent communication, interpersonal, analytical skills, and strong ability to perform as part of a team.
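As a brief illustration of the database-level PL/SQL work summarized above, a minimal audit-trigger sketch; every object name here is hypothetical, not drawn from any client engagement:

```sql
-- Minimal PL/SQL audit trigger sketch: records status changes on a
-- hypothetical orders table. All names are illustrative only.
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER UPDATE OF status ON orders
FOR EACH ROW
BEGIN
    INSERT INTO orders_audit (order_id, old_status, new_status, changed_on)
    VALUES (:OLD.order_id, :OLD.status, :NEW.status, SYSDATE);
END;
/
```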
TECHNICAL SKILLS:
RDBMS: Oracle 8i/9i/10g/11g, SQL Server 2005/2008 R2/2012/2016, MS Access, DB2, Teradata 13/14/16.
Data Warehousing Tools: Informatica PowerCenter 10.x/9.x/8.x, Business Objects 5.1/6.2, SSIS, SSRS, OBIEE, Cognos, E-Business Suite, and MicroStrategy.
Design / Application Tools: Erwin 3.5 / 4.2, ER Studio.
Programming / Scripting: SQL, PL/SQL, VC++, UNIX Shell Scripts, VBScript, Perl
Operating Systems: Windows 7/NT/2000/XP, UNIX, Sun Solaris V2.8
Methodologies: Data Modeling - Logical/Physical/Dimensional, Star/Snowflake
ETL: OLAP, Complete software development cycle
PROFESSIONAL EXPERIENCE:
Confidential, Indianapolis, IN
Data Integration Consultant/Informatica Developer
Responsibilities:
- Worked closely with business analysts to understand and document business needs for decision support data.
- Created the ETL performance expectations document based on the source data profile results.
- Captured data volumes, upsert and truncate-and-load strategies, etc., in the integration design document.
- Worked on data integration from Salesforce to Oracle using Informatica Cloud (IICS).
- Created source/target connections in IICS.
- Created mapping tasks in IICS.
- Created multiple mappings (source to target mappings) in Informatica Cloud for data integration.
- Incorporated the refresh strategy, historical-data maintenance, archiving strategies for the source flat files, Audit Balance and Control (ABC), etc., into the integration design document.
- Created the technical architecture (hardware and software) to support ETL.
- Configured Informatica Power Center GRID on Linux platform.
- Assigned master and worker nodes to GRID in Informatica platform.
- Created Informatica Data Quality plans, created rules, applied the rules to the IDQ plans, and incorporated the plans as mapplets in Informatica PowerCenter.
- Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
- Created High Level and Low-Level design document and ETL standards document.
- Involved in Extraction, Transformation and Loading (ETL) Process.
- Installed and configured Informatica 9.5.1 HF3 on Red Hat platform.
- Wrote a shell script to back up the repository weekly and archive files older than 30 days on Red Hat.
- Created the Visio diagram.
- Developed various T-SQL stored procedures, functions, and packages.
- Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
- Worked extensively on Autosys using the CA Workload Control Center and JIL Checker.
- Scheduled Informatica jobs using Autosys.
- Created dependencies in Autosys and inserted/updated Autosys jobs via the CA Workload Control Center.
- Performed T-SQL tuning, optimizing long-running report queries on SQL Server 2008.
- Generated sequences in Teradata using identity columns.
- Displayed sequence numbers in Teradata Studio using CSUM (see the sketch following this list).
- Retrieved data from Oracle EBS and loaded it into the SQL Server data warehouse.
- Worked with Oracle EBS tables such as GL_CODE_COMBINATIONS, GL_LEDGERS, GL_PERIODS, GL_JE_SOURCES_TL, AP_CHECKS_ALL, AP_INVOICES_ALL, PO_HEADERS_ALL, PO_LINES_ALL, RA_CUSTOMER_TRX_ALL, SO_LINES_INTERFACE_ALL, etc.
- Involved in unit testing, Integration testing and User acceptance testing of the mappings.
- Performance tuned Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
- Developed SSIS packages and migrated from Dev to Test and then to Production environment.
- Created the SFDC, Flat File and Oracle connections for AWS Cloud services.
- Optimized T-SQL queries and converted PL/SQL code to T-SQL.
- Standardized the T-SQL stored procedures per the organization’s standards.
- Applied TRY/CATCH blocks to the T-SQL procedures.
- Used MERGE statements in T-SQL for upserts into the target tables (see the sketch following this list).
- Made changes to SSRS financial reports based on users' input.
- Installed/configured Teradata Power Connect for Fast Export for Informatica.
- Involved heavily in creating customized Informatica data quality plans.
- Worked with address and names data quality.
- Used Proactive Monitoring for daily/weekly Informatica jobs.
- Customized the Proactive Monitoring dashboard with Informatica repository tables such as OPB_SESS_TASK_LOG.
- Resolved skewness in Teradata (see the redistribution sketch following this list).
- Created jobs in Informatica Data Replication (Fast Clone) to extract data from Oracle and load it into Teradata.
- Wrote Teradata BTEQ scripts extensively.
- Installed and configured the Amazon Redshift cloud data integration application for faster data queries.
- Created JDBC and ODBC connections to Amazon Redshift from the Connect Client tab of the console.
- Automated administrative tasks in Amazon Redshift such as provisioning and monitoring.
- Familiar with Amazon Redshift's columnar storage, data compression, and zone maps.
- Extracted data from complex hierarchical XML schemas for transformation and load into Teradata, and vice versa.
- Modified OBIA dashboard for metrics and standard reports.
- Resolved syntax differences between Teradata and Oracle and documented them.
- Scheduled the workflows to pull data from the source databases at weekly intervals.
- Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
- Created the FTP connection from Tidal to the source file server.
- Retrieved data from XML, Excel, and CSV files.
- Archived the source files with timestamp using Tidal Scheduler.
- Performance tuning on sources, targets, mappings and database.
- Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
- Worked as production support SME to investigate and troubleshoot data issues coming out of Weekly and Monthly Processes.
- Provided the business with daily production status reports covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
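A minimal sketch of the T-SQL upsert-with-error-handling pattern referenced above; dbo.dim_customer, stg.customer, and dbo.etl_error_log are assumed, illustrative names:

```sql
BEGIN TRY
    -- Upsert: update matching rows, insert new ones (hypothetical tables).
    MERGE dbo.dim_customer AS tgt
    USING stg.customer     AS src
       ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN
        UPDATE SET tgt.cust_name  = src.cust_name,
                   tgt.city       = src.city,
                   tgt.updated_at = GETDATE()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (customer_id, cust_name, city, updated_at)
        VALUES (src.customer_id, src.cust_name, src.city, GETDATE());
END TRY
BEGIN CATCH
    -- Log the failure, then re-raise so the scheduled job fails visibly.
    INSERT INTO dbo.etl_error_log (error_number, error_message, logged_at)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), GETDATE());
    THROW;
END CATCH;
```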
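A sketch of the Teradata sequence-generation bullets above, against an assumed customer table:

```sql
-- Identity column: Teradata assigns the surrogate key at insert time.
CREATE TABLE customer_dim (
    cust_key INTEGER GENERATED ALWAYS AS IDENTITY
             (START WITH 1 INCREMENT BY 1),
    cust_nm  VARCHAR(100)
);

-- CSUM: display a running sequence number in Teradata Studio,
-- ordered here by customer name.
SELECT CSUM(1, cust_nm) AS seq_no,
       cust_nm
FROM   customer_dim;
```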
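The skew resolution noted above generally means checking row distribution across AMPs and rebuilding the table on a more selective primary index; a sketch with assumed table and column names:

```sql
-- Inspect how evenly rows hash across AMPs on the current PI column
-- (assumed here to be region_cd on a staging table).
SELECT HASHAMP(HASHBUCKET(HASHROW(region_cd))) AS amp_no,
       COUNT(*)                                AS row_cnt
FROM   sales_stg
GROUP  BY 1
ORDER  BY 2 DESC;

-- The PI of a populated table cannot be altered, so rebuild on a
-- better-distributing composite PI and swap the tables afterwards.
CREATE TABLE sales_stg_new AS (SELECT * FROM sales_stg) WITH DATA
PRIMARY INDEX (order_id, order_line_no);
```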
Environment: Informatica PowerCenter 10.2/10.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica Cloud (IICS), Teradata 16, Amazon Web Services (AWS) cloud, Amazon Redshift cloud, Data Integrator 10, Business Objects, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2014/2017, DB2 8.0/7.0, Team Foundation Server, SQL Server Management Studio, Sun Solaris, Windows XP, Control-M.
Confidential, Houston, TX
Sr. Informatica Lead
Responsibilities:
- Worked closely with business analysts and end users to understand the existing business model and customer requirements, and prepared functional specifications based on business needs.
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform and Load) processes using Informatica PowerCenter.
- Expertise in building scripts using Transact-SQL for DDL and DML.
- Developed stored procedures in T-SQL that were invoked to load dimension and fact tables, and for error handling during the ETL process.
- Used Tidal scheduler to get the source file from the server using Tidal flat file FTP connection as well as power center FTP connection.
- Ran all workflows using the Tidal scheduler.
- Created T-SQL jobs using MERGE statements (update-else-insert).
- Worked on migration of mappings from DataStage to Informatica.
- Performed Unit testing and Data validation testing using Validation scripts.
- Created Data Validation document, Unit Test Case Document, Technical Design Document, Informatica Migration Request Document and Knowledge Transfer Document.
- Installed and configured Informatica PowerExchange for CDC and Informatica Data Quality (IDQ).
- Used IDQ's standard plans for address and name cleanup.
- Created custom plans for product-name discrepancy checks using IDQ and incorporated the plan as a mapplet into PowerCenter.
- Experience in Big Data with a deep understanding of the Hadoop Distributed File System (HDFS) ecosystem.
- Performed master data management using the Informatica MDM tool.
- Worked on batch data process flows and consolidation flags in Informatica MDM.
- Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
- Designed and developed Reports for the user Interface, according to specifications given by the Track leader.
- Created stored procedures (NZ-SQL).
- Scheduled the ELT load in Autosys.
- Debugged and corrected the xfr files developed by other ELT developers.
- Fixed numerous bugs with load issues.
- Optimized the NZ-SQL queries.
- Converted Oracle DDL to Netezza DDL (see the sketch following this list).
- Scheduled the ELT load in Control-M.
- Used transformations such as Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control, and Stored Procedure.
- Created Process Control and Metadata for Informatica jobs.
- Created the format of the unit test documents per Netezza Framework.
- Assisted ELT developers in creating the unit test documents.
- Managed user and folder permissions for the developers.
- Purged old repository objects weekly.
- Created a shell script for weekly repository backups.
- Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
- Did performance tuning on the ELT code developed by ELT developers.
- Debugged the framework error logs.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Resolved issues raised by client/actuarial users, validating data in the database and application functionality.
- Worked closely with QlikView developers.
- Involved in performance tuning at source, target, mapping and session level.
- Loaded Oracle tables from XML sources.
- Configured Informatica for the SAP Connector.
- Extracted data from SAP and loaded it into Oracle EBS.
- Introduced the concept of a data dashboard to track technical details, given the continuous requirement changes and rework needed.
- Optimized the BOXI dashboard SQL with aggregates and subqueries.
- Created and modified BO reports.
- Loaded data into the reporting layer's fact and dimension tables based on BOXI reporting requirements.
- Created Business Objects universes.
- Supported Integration testing by analyzing and fixing the issues.
- Created Unit Test Cases and documented the Unit Test Results.
- Extensively used Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
- Loaded Teradata tables using the TPump utility.
- Resolved skewness in Teradata.
- Analyzed some of the JCL scripts for job scheduling.
- Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
- Created the Cognos connections for web portal and content store.
- Developed a query subject in framework manager in Cognos.
- Used Informatica Data Quality tool for Standardization by referring to the database dictionary tables and populating the flat file dictionaries.
- Used Informatica Web Services to create work requests/work items for end users.
- Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
- Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
- Staged data in Oracle E-Business Suite staging tables using Informatica PowerCenter.
- Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
- Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
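A small sketch of the Oracle-to-Netezza DDL conversion mentioned above, on an assumed fact table; the key changes are NUMBER to NUMERIC and an explicit DISTRIBUTE ON clause:

```sql
-- Oracle source DDL (illustrative names):
CREATE TABLE sales_fact (
    sale_id   NUMBER(10)   NOT NULL,
    sale_amt  NUMBER(12,2),
    sale_dt   DATE
);

-- Netezza equivalent: NUMBER maps to NUMERIC, and DISTRIBUTE ON
-- spreads rows evenly across SPUs to avoid skew.
CREATE TABLE sales_fact (
    sale_id   NUMERIC(10)  NOT NULL,
    sale_amt  NUMERIC(12,2),
    sale_dt   DATE
)
DISTRIBUTE ON (sale_id);
```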
Environment: Informatica PowerCenter 9.5.1, Informatica Data Quality (IDQ) 9.1, Cognos, Oracle 11g, MDM, Netezza TwinFin 6 (production), Netezza TwinFin 3 and Netezza Skimmer (non-production), Business Objects XI (BOXI) reporting tool, SQL Server 2012/2008 R2, Flat files, TOAD, Windows XP, Control-M, Mainframe, Tidal scheduler, SFDC.
Confidential, Houston, TX
Sr. Informatica Developer
Responsibilities:
- Gathered requirements and performed business analysis.
- Analyzed the data models of legacy implementations, identifying the sources for various dimensions and facts for different data marts according to star schema design patterns.
- Worked with multiple sources such as relational tables, flat files, Excel sources, and ERP (SAP BW) sources for extraction using Source Analyzer/PowerExchange.
- Extensively involved in analysis, design, and modeling; worked on star schemas, data modeling, and data elements.
- Created CDC (change data capture) sources in PowerExchange and imported them into PowerCenter.
- Designed the ETL process for extracting data from heterogeneous source systems, transforming it, and loading it into the data mart.
- Created Logical and Physical models for production using Erwin 3.5.
- Worked with Informatica PowerConnect to get data from PeopleSoft XLATTABLE table and modules like General Ledger, Accounts Payable, Accounts Receivable, and Asset Management.
- Involved in upgrading Informatica from version 7.1.3 to 8.1.
- Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
- Performance tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions and workflows and eliminated them.
- Provided production support including error handling and validation of mappings, sessions and workflow.
- Extensively used Debugger Process to test data and applying Break Points while Session is running.
- Provided production support for Business Users and documented problems and solutions for running the workflow.
- Developed UNIX scripts for scheduling the jobs.
- Designed and developed Oracle PL/SQL procedures.
- Performance-tuned Oracle PL/SQL scripts (see the sketch following this list).
- Created Universe using Business Objects for pulling the data for analysis and reporting.
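A representative sketch of the PL/SQL tuning noted above: replacing row-by-row processing with BULK COLLECT/FORALL. stg_orders and dw_orders are hypothetical tables with matching structures:

```sql
DECLARE
    TYPE t_order_tab IS TABLE OF stg_orders%ROWTYPE;
    l_orders t_order_tab;
BEGIN
    -- One round trip fetches all staging rows instead of a
    -- row-at-a-time cursor loop.
    SELECT * BULK COLLECT INTO l_orders FROM stg_orders;

    -- FORALL binds the whole collection in a single bulk DML call.
    FORALL i IN 1 .. l_orders.COUNT
        INSERT INTO dw_orders VALUES l_orders(i);

    COMMIT;
END;
/
```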
Environment: Windows NT, Teradata 13, Oracle 9i, Informatica PowerCenter 9.1.0/PowerExchange 8.1/7.1.3, SQL Assistant, SQL Developer, Business Objects 5.1.1, Business Query for Excel, ERWIN 3.5.2, PL/SQL, TOAD.
Confidential, Dallas, TX
ETL Informatica/Talend Developer
Responsibilities:
- Translated business rules and functionality requirements into ETL procedures. Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy.
- Participated in a POC for Talend Open Studio.
- Created and executed ODS jobs using Talend Open Studio.
- Debugged numerous issues in Talend.
- Worked closely with administrators on the configuration of Talend Open Studio.
- Developed and tested all the backend programs, Informatica mappings and update processes.
- Developed Informatica mappings to load data into various dimensions and fact tables from various source systems.
- Developed and tested stored procedures, cursors, functions, and packages in PL/SQL for data ETL (a representative cursor sketch follows this list).
- Worked on retrieving data from FACETS repository for patient care program.
- Worked on identifying the entities and attributes in FACETS for eligibility, provider, plan etc.
- Worked on populating FACETS repository for patient billing, clinical documentation reporting.
- Used Power Exchange along with Power Center to leverage data by avoiding manual coding on data extraction programs.
- Created various active and passive transformations like Source Qualifier, Lookup, Router, Normalizer, Aggregator, Filter, Joiner, Expression and standard/reusable mappings using Informatica.
- Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Source, Target, Mapplets and Transformation objects.
- Responsible for developing and testing the new conformed dimensions that were used by the conformed fact.
- Used PowerCenter Workflow Manager to create sessions and used various tasks such as Session, Event Wait, Event Raise, and Email to run with the logic embedded in the mappings.
- Responsible for validating the Informatica mappings against the pre-defined ETL design standards.
- Developed incremental and updateable loading through Informatica mappings.
- Used debugger and breakpoints to view transformations output and debug mappings.
- Documented all the mappings and transformations involved in the ETL process.
- Used UNIX shell scripting for scheduling tasks.
- Extracted huge volumes of data from legacy systems and uploaded into Oracle using SQL*Loader and shell scripts.
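A minimal sketch of the cursor-driven PL/SQL ETL referenced in this list; all object names are assumptions:

```sql
DECLARE
    -- Cursor over staged member rows carrying a raw phone value
    -- (hypothetical staging/ODS tables).
    CURSOR c_members IS
        SELECT member_id, phone
        FROM   stg_member
        WHERE  phone IS NOT NULL;
BEGIN
    FOR r IN c_members LOOP
        -- Standardize the phone format by stripping punctuation.
        UPDATE ods_member
        SET    phone = REPLACE(REPLACE(REPLACE(r.phone, '-', ''), '(', ''), ')', '')
        WHERE  member_id = r.member_id;
    END LOOP;
    COMMIT;
END;
/
```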
Environment: Informatica 8.6.1, PowerExchange, Talend Open Studio 4.x/3.x, Oracle 8i, FACETS, SQL Server, PL/SQL Developer 5.0, UNIX shell scripts, PL/SQL, TOAD.
Confidential, Houston, TX
ETL Developer
Responsibilities:
- Developed, tested stored procedures, functions and packages in PL/SQL for Data ETL.
- Used the ETL tool Informatica PowerCenter 5.1 and 6.2 to create mappings to transform data.
- Created various active and passive transformations such as Source Qualifier, Lookup, Router, Stored Procedure, Aggregator, Filter, Joiner, and Expression, and built standard and reusable mappings using Informatica.
- Loaded operational data from heterogeneous sources into various data marts.
- Unit and integration tested Informatica Sessions, Batches, and Workflows.
- Documented the mappings and the transformations involved in ETL process.
- Involved in gathering business requirements, data sourcing and data transformation, data loading, SQL and performance tuning.
- Involved in writing UNIX scripts and used them to automate the scheduling process.
- Used mapping wizards to create slowly growing dimensions and slowly changing dimensions (an SCD Type 2 sketch follows this list).
- Recommended, designed and implemented strategies for managing the data in data warehouse.
- Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
- Responsible for creating folders and giving privileges to the new users.
- Enhanced session performance, and improved response times, through extensive performance tuning of the mappings, ETL Procedures and processes.
- Responsible for querying data from different database tables as per the requirement.
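For illustration, SQL equivalent to the Type 2 slowly-changing-dimension logic the mapping wizard generates; table, column, and sequence names are assumptions:

```sql
-- Step 1: expire the current version of any product whose attributes changed.
UPDATE dim_product d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_product s
                WHERE s.product_cd   = d.product_cd
                  AND s.product_name <> d.product_name);

-- Step 2: insert a fresh current row for new and changed products
-- (changed rows no longer have a current version after step 1).
INSERT INTO dim_product (product_key, product_cd, product_name,
                         eff_start_date, eff_end_date, current_flag)
SELECT dim_product_seq.NEXTVAL, s.product_cd, s.product_name,
       SYSDATE, NULL, 'Y'
  FROM stg_product s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_product d
                    WHERE d.product_cd   = s.product_cd
                      AND d.current_flag = 'Y');
```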
Environment: Windows NT, Oracle 8i, Teradata 9, Informatica PowerCenter 8.6.1, SQL Assistant, SQL Developer, UNIX shell scripts, PL/SQL, TOAD.