
Informatica Integration Developer Resume


Wayne, NJ

SUMMARY

  • Around eleven years of professional experience as a software developer, including over nine years designing and developing data warehouse, data migration, and data conversion projects as an Informatica ETL consultant, along with client/server application development.
  • Carried out the full data warehouse life cycle: requirement analysis, modeling and design, database development, and end-user application development.
  • Informatica ETL expert experienced in online analytics, system analysis, creating technical design documents and project plans, and managing data warehouse projects.
  • Strong understanding of Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM).
  • Attended the proof of concept (POC) for Oracle Data Integrator (ODI).
  • Experienced in creating data models, building Informatica ETL processes, dimensional data modeling (primarily star schema), performing data migrations, defining user interfaces, executing large projects, and assuring quality through deployment.
  • Developed staging areas to Extract, Transform and Load new data from the OLTP database to warehouse.
  • Strong in Dimensional Modeling, Star/Snow Flake Schema, Extraction, Transformation & Load and Aggregates.
  • Strong in converting business requirements to project documentation and technical specifications.
  • Extensive experience using business intelligence reporting tools such as Business Objects, Cognos BI Suite, and OBIEE for warehouses and data marts.
  • Sound knowledge of Data warehousing concepts and Informatica ETL tool.
  • Excellent communication, presentation, interpersonal, and analytical skills.

TECHNICAL SKILLS

Languages: C, Java (JDK 1.2, Servlets, JSP), J2EE, SQL, PL/SQL, T-SQL, NZ-SQL.

Data warehousing Tools: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.6.1/8.5.1/7.1, Informatica Power Exchange, Informatica Metadata Manager, Informatica Data Quality, Informatica Data Explorer, DataStage Server Edition, Talend Open Studio, Oracle Data Integrator, Talend Data Profiler, Business Objects 5.1/6.5, Cognos BI Suite 6.0/7.2, ReportNet 1.1, QlikView (Netezza), OBIEE.

Data Modeling Tools: Erwin, Embarcadero.

RDBMS: Oracle Exadata, Oracle 11g/10g/9i/8i/8.0, DB2, Oracle EBS 11g, Siebel 7.8, MS SQL Server 2012/2008 R2/2005, Sybase, Teradata (BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, FastExport), Netezza TwinFin 3/6/Skimmer.

Script Languages: Perl, JavaScript, Korn shell script

Application Servers: IBM WebSphere 5

Others: Citrix, Telnet, PL/SQL Developer, Toad 9.7.2/6.5, TPump, ISQL, Aginity Workbench, Management Studio, Visio Pro, COBOL, IMS, VSAM

Operating System: Windows XP/2000/NT, UNIX (AIX, HP-UX, SCO), Linux, Sun Solaris

PROFESSIONAL EXPERIENCE

Confidential, Wayne, NJ

Informatica Integration Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Worked on populating the SAP objects like Customer Master, Material Master, Vendor Master etc.
  • Created views in Oracle for SAP data pre-validations.
  • Developed Informatica mappings and tuned them for better performance.
  • Created the load readiness reports (LRR) to load into SAP.
  • Created scorecards for redundant data and created validation rules using Informatica.
  • Performed data profiling for column-level and cross-table data validation.
  • Parsed the target data in IDQ using parser transformation.
  • Worked on data integration from various sources, such as JDE-AUTO, JDE-IND, Marine, and Vessels, into SAP.
  • Created the SAP lookup tables for cross reference in Oracle.
  • Configured Power Connect for SAP and maintained sap.ini file entries.
  • Retrieved data from SAP ECC and installed/configured the associated ABAP code.
  • Configured LDAP connector for Informatica in administration console.
  • Created the SAP ABAP data dictionaries and mapped the underlying relational tables and views.
  • Created SAP ABAP data classes and indexes.
  • Worked with transparent and pooled tables of SAP ABAP.
  • Extensive knowledge of Master Data Management (MDM) concepts.
  • Extensive experience designing, managing, and administering MDM/DIW objects.
  • Involved in data integration, taking all success factors into consideration.
  • These success factors include historical data, the data archiving strategy, cache size calculations on the server, etc.
  • Created Mapplets and used them in different Mappings.
  • Used the Designer debugger to test the data flow and fix the mappings. Tuned Informatica mappings and sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and a comprehensive document for every project covering the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Retrieved AUTO data from DB2.
  • Worked extensively in PL/SQL to migrate the data from DB2 to Oracle database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Provided production support and issue resolution (see the pmcmd sketch after this list).
  • Integrated data from DB2 to Oracle.
  • Validated the source queries in DB2.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in the Source Qualifier, and session partitions to improve performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
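
A minimal sketch of the kind of pmcmd wrapper used for production-support workflow runs; the domain, service, folder, and workflow names are hypothetical placeholders:

    #!/bin/ksh
    # Hypothetical domain/service/folder/workflow names -- replace with the real ones.
    DOMAIN=Dom_ETL
    INT_SVC=IS_ETL
    FOLDER=SAP_CONVERSION
    WORKFLOW=wf_load_customer_master

    # Start the workflow and block until it finishes (-wait), so the exit code
    # reflects the workflow status.
    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$FOLDER" -wait "$WORKFLOW"

    if [ $? -ne 0 ]; then
        echo "Workflow $WORKFLOW failed" | mailx -s "ETL failure" oncall@example.com
        exit 1
    fi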

Environment: Informatica Power Center 9.5.1 HF2, Informatica Data Quality 9.5.1, SAP ECC 6.0, Oracle 11g, DB2, PL/SQL, Linux, PuTTY, WinSCP.

Confidential, Dallas, TX

Informatica Architect/Admin/Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Created the ETL performance expectations document based on the source data profile results.
  • Captured data volumes, upsert/truncate-and-load strategies, etc., in the integration design document.
  • Incorporated the refresh strategy, historical data maintenance, archiving strategies for the source flat files, Audit, Balance and Control (ABC), etc., in the integration design document.
  • Created the technical architecture (Hardware and Software) that will support ETL.
  • Configured Informatica Power Center GRID on Linux platform.
  • Assigned master and worker nodes to GRID in Informatica platform.
  • Created the Informatica data quality plans, created rules, applied Rules to IDQ plans and incorporated the plans as mapplets in Informatica Power Center.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Created high-level and low-level design documents and the ETL standards document.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Installed and configured Informatica 9.5.1 HF3 on Red Hat platform.
  • Wrote a shell script to take repository backups on a weekly basis and archive files older than 30 days on Red Hat (see the sketch after this list).
  • Created Visio diagrams.
  • Developed various T-SQL stored procedures, functions and packages.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
  • Worked extensively on Autosys using the CA workload center and JIL Checker.
  • Scheduled Informatica jobs using Autosys.
  • Created dependencies in Autosys and inserted/updated Autosys jobs on CA Workload Center.
  • Performed T-SQL tuning and optimized queries for reports with long execution times on SQL Server 2008.
  • Worked on Informatica BDE for retrieving data from Hadoop’s HDFS file system.
  • Solved T-SQL performance issues using Query Analyzer.
  • Optimized SQL queries, sub queries for SSRS reports.
  • Created the SSRS reports with multiple parameters.
  • Modified the data sets and data sources for SSRS reports.
  • Retrieved data from Oracle EBS and loaded into SQL Server data Warehouse.
  • Worked with Oracle EBS tables such as GL_CODE_COMBINATIONS, GL_LEDGERS, GL_PERIODS, GL_JE_SOURCES_TL, AP_CHECKS_ALL, AP_INVOICES_ALL, PO_HEADERS_ALL, PO_LINES_ALL, RA_CUSTOMER_TRX_ALL, SO_LINES_INTERFACE_ALL, etc.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Performance-tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
  • Developed SSIS packages and migrated from Dev to Test and then to Production environment.
  • Created the SFDC, Flat File and Oracle connections for AWS Cloud services.
  • Optimized the T-SQL queries and converted PL/SQL code to T-SQL.
  • Standardized the T-SQL stored procedures per the organization's standards.
  • Applied TRY/CATCH blocks to the T-SQL procedures.
  • Used the MERGE statement in T-SQL for upserts into the target tables.
  • Made changes to SSRS financial reports based on users' input.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Involved heavily in creating customized Informatica data quality plans.
  • Worked with address and names data quality.
  • Used Proactive monitoring for daily/weekly Informatica jobs.
  • Customized the Proactive Monitoring dashboard with Informatica repository tables like OPB_SESS_TASK_LOG, etc.
  • Resolved skewness in Teradata.
  • Created jobs for Informatica Data Replication (Fast Clone) to get data from Oracle and load it into Teradata.
  • Wrote Teradata BTEQ scripts extensively.
  • Installed and configured the Amazon Redshift cloud data integration application for faster data queries.
  • Created JDBC and ODBC connections in Amazon Redshift from the Connect Client tab of the console.
  • Automated administrative tasks of Amazon Redshift, such as provisioning, monitoring, etc.
  • Familiar with Amazon Redshift's columnar storage, data compression, and zone maps.
  • Extracted data from complex XML hierarchical schemas for transformation and load into Teradata and vice versa.
  • Resolved syntax differences between Teradata and Oracle and documented them.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Created the FTP connection from Tidal to the source file server.
  • Retrieved data from XML, Excel, and CSV files.
  • Archived the source files with timestamp using Tidal Scheduler.
  • Performed performance tuning on sources, targets, mappings, and the database.
  • Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
  • Worked as production support SME to investigate and troubleshoot data issues coming out of Weekly and Monthly Processes.
  • Worked with the business to provide a daily production status report covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
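
A sketch of the weekly repository backup script described above, assuming pmrep is on the PATH; the repository and domain names and the paths are placeholders:

    #!/bin/ksh
    # Weekly Informatica repository backup; archive copies older than 30 days.
    BACKUP_DIR=/infa/backups
    STAMP=$(date +%Y%m%d)

    # Connect to the repository, then write a dated backup file.
    pmrep connect -r REP_PROD -d Dom_ETL -n "$INFA_USER" -x "$INFA_PASS"
    pmrep backup -o "$BACKUP_DIR/REP_PROD_$STAMP.rep"

    # Compress (archive) backup files older than 30 days.
    find "$BACKUP_DIR" -name '*.rep' -mtime +30 -exec gzip {} \;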

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Informatica BDE, Amazon web services (AWS) cloud, Amazon RedShift cloud data integrator 10, Business Objects, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Sales Force dot com (SFDC), SQL Server 2008 R2/2012, DB2 8.0/7.0, Team Foundation Server, SQL Server Management studio, Sun Solaris, Windows XP, Control M.

Confidential, Houston

Informatica Architect/Developer

Responsibilities:

  • Responsible for requirement gathering, analysis, and end-user meetings.
  • Responsible for business requirement documents (BRDs) and for converting functional requirements into technical specifications.
  • Responsible for mentoring Developers and Code Review of Mappings developed by other developers.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Used the Informatica MDM (Siperian) tool to manage master data for the EDW.
  • Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the target database.
  • Pulled data from Epic Clarity, EDI and HIPAA compliance files (X12 files).
  • Loaded data to EDI system and clinical documentation.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Did change data capture (CDC) by using the MD5 function of Informatica.
  • Created the RPD for OBIEE.
  • Created custom IDQ plans and incorporated it into Power Center as mapplet.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Extracted data from various heterogeneous sources like Oracle, SQL Server, and flat files and loaded it into data marts using Informatica.
  • Resolved skewness in Teradata.
  • Optimized OBIEE dashboard queries.
  • Used Informatica Data Quality (IDQ) to format data from sources and load it into target databases according to business requirements.
  • Created jobs for Informatica Data Replication (Fast Clone) to get data from Oracle and load it into Teradata.
  • Created the visio diagram to show the job dependencies for Control-M jobs.
  • Inserted/updated the Informatica jobs in Control-M by invoking the pmcmd utility of Informatica.
  • Wrote Teradata BTEQ scripts extensively (see the sketch after this list).
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Loaded data from flat files to Big Data (1010 data).
  • Did bulk loading of Teradata tables using the TPump utility.
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Extensively used various active and passive transformations, like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
  • Responsible for best practices like naming conventions, performance tuning, and error handling.
  • Responsible for maintaining data quality and data consistency before loading into the ODS.
  • Responsible for performance tuning at the source, target, mapping, and session levels.
  • Created Business Objects universes.
  • Created a denormalized BO reporting layer for BO reports.
  • Solid expertise in using both connected and unconnected Lookup transformations.
  • Extensively worked with various lookup caches: static, dynamic, and persistent.
  • Responsible for identifying bottlenecks and fixing them with performance tuning.
  • Used Update Strategy expressions (DD_INSERT, DD_DELETE, DD_UPDATE, and DD_REJECT) to insert, delete, update, and reject rows based on the requirement.
  • Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
  • Responsible for Unit Testing and Integration testing of mappings and workflows.
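
A sketch of a BTEQ script driven from a shell wrapper, as referenced above; the TDPID, credentials, and table name are placeholders:

    #!/bin/ksh
    # Run an inline BTEQ script and propagate failures to the scheduler.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_pass;

    -- The real scripts carried the project's load and validation logic;
    -- a simple row count stands in for it here.
    SELECT COUNT(*) FROM edw.customer_dim;

    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;
    EOF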

Environment: Informatica Power Center 9.5.1/9.1.0 HF1, Informatica MDM, SAP BW, Big Data (1010data), Teradata 14, Oracle 10g, MS SQL Server, SAP BW/IDoc, Control-M, TOAD, SQL, PL/SQL, BO XI, OBIEE, Windows XP, UNIX

Confidential, Houston

Sr. Data Warehouse Consultant

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Upgraded Informatica from 9.5.1 to 9.6.1 on Linux servers for Dev/Test and Prod environments.
  • Developed Informatica mappings and tuned them for better performance.
  • Migrated data from Oracle 11g to Oracle Exadata.
  • Created the Oracle Exadata database, users, base tables, and views using a proper distribution key structure.
  • Worked on data integration from various sources into the ODS using ODI (Oracle Data Integrator).
  • Configured PowerExchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Calculated the KPIs and worked with the end users on OBIEE report changes.
  • Created the RPD for OBIEE.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Created Mapplets and used them in different Mappings.
  • Used the Designer debugger to test the data flow and fix the mappings. Tuned Informatica mappings and sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and a comprehensive document for every project covering the workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files, and Oracle and loaded it through Informatica.
  • Worked with crontab for job scheduling.
  • Provided production support and issue resolution.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in the Source Qualifier, and session partitions to improve performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts for repository backups, job scheduling via crontab, etc. (see the sketch after this list).
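
Sample crontab entries of the kind used for this scheduling; the script paths and times are placeholders:

    # Weekly repository backup, Sundays at 02:00.
    0 2 * * 0 /infa/scripts/backup_repository.ksh >> /infa/logs/backup.log 2>&1

    # Nightly warehouse load, weekdays at 23:30.
    30 23 * * 1-5 /infa/scripts/run_nightly_load.ksh >> /infa/logs/nightly_load.log 2>&1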

Environment: Informatica Power Center 9.6.1/9.5.1 HF2, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, SAP R/3, ODI (Oracle Data Integrator), SQL Server 2008 R2, Oracle 11g, Oracle Exadata, OBIEE, PL/SQL, Linux, PuTTY, WinSCP.

Confidential, Houston, TX

Sr. Informatica Architect/ T-SQL Developer

Responsibilities:

  • Created the architectural design of the Informatica /ETL and Data warehouse.
  • Applied patches to Informatica Servers.
  • Worked with the SQL DBAs on a collation change on SQL Server 2012.
  • Applied the Salesforce license to the domain for the development and production environments.
  • Created Informatica best practices document, mapping/Unit testing document templates.
  • Communicated with the networking team on ETL server upgrades to space, memory and processors.
  • Maintained Informatica servers, ensuring the Integration Services, repositories, and servers were up and running, and coordinated with networking teams for server reboots, etc.
  • Created ETL auditing reports for error handling/validations etc against Informatica ‘OPB’ repository tables.
  • Installed/Configured Informatica 9.1.0 HotFix1 on Development server.
  • Installed/Configured Informatica HotFix3 on development/production environment.
  • Applied an EBF on the Informatica server for SQL Server Native Client 11.0.
  • Provided production support for the daily, weekly, and monthly loads.
  • Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
  • Configured the SFDC application connection for the full-test and production environments.
  • Configured JD Edwards Power Exchange for Informatica.
  • Standardized the parameter file location for each project in BWParam folder of Informatica.
  • Deployed Informatica workflows from Development to UAT and Production.
  • Deleted old repository backup files and purged deleted objects from repository in both development and production environments.
  • Supported the development team on ETL standards, naming conventions, best practices and folder structures.
  • Installed/configured Teradata Power Connect for Fast Export for Informatica.
  • Migrated folders from Development to UAT to Production (see the pmrep sketch after this list).
  • Created shared/regular folders for projects and personal development.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
  • Designed and developed Reports for the user Interface, according to specifications given by the Track leader.
  • Worked on converting older ETL jobs, like Cast Iron and CA ADT, to bring the data into the data warehouse.
  • Involved in performance tuning at source, target, mapping and session level.
  • Loaded Oracle tables from XML sources.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded it into Oracle EBS.
  • Introduced the concept of a data dashboard to track the technical details, citing the continuous requirement changes and rework needed.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Retrieved data from SAP using Informatica Power Exchange.
  • Supported Integration testing by analyzing and fixing the issues.
  • Created Unit Test Cases and documented the Unit Test Results.
  • Resolved skewness in Teradata.
  • Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Used Informatica web services to create work requests/work Items for the end user.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
  • Staged data in Oracle E-Business Suite stage tables using Informatica Power Center.
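
A sketch of the pmrep calls behind a Development-to-UAT folder/deployment-group migration; the repository names, deployment group, and control file are placeholders:

    #!/bin/ksh
    # Connect to the source repository, then push a deployment group to UAT
    # using an XML deployment control file.
    pmrep connect -r REP_DEV -d Dom_ETL -n "$INFA_USER" -x "$INFA_PASS"
    pmrep deploydeploymentgroup -p DG_RELEASE_1 \
        -c /infa/deploy/dg_release_1_control.xml -r REP_UAT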

Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, Informatica Data Explorer 8.6, SQL Server 2008 R2/2012, Oracle 10g, Oracle EBS 12.0.4/5, JD Edwards, SAP, Salesforce.com (SFDC), Informatica/Windows scheduler, Force.com, Teradata 13, OBIEE, PL/SQL, Windows Server 2008 R2 Standard.

Confidential, Houston, TX

ETL Architect/ Informatica Lead

Responsibilities:

  • Extensively worked with the solution architect to implement the project from scratch.
  • Upgraded Informatica Power Center from 8.6.1 to 9.1.
  • Converted all the Invesco-specific utilities (File Manipulator, DB Wrapper, Program Wrapper, etc.) into Informatica jobs.
  • Purged the deleted objects from the repository in the DEV and QA environments.
  • Managed the growth of repository as part of normal administration maintenance.
  • Created the shell script to purge old repository backup files.
  • Worked with versioned objects.
  • Installed and configured Informatica Power Exchange for CDCand Informatica Data Quality (IDQ).
  • Extensively handled Informatica server related issues with the development teams.
  • Created various issue tickets with Informatica professional services.
  • Created SRs with Informatica Professional Services for workflow errors that required a service request to be opened.
  • Worked with the network support teams on both Windows and AIX servers for memory, storage, event log errors, and performance issues, and coordinated outages for maintenance.
  • Worked closely with the Cognos reports development team in configuring the cubes.
  • Created Cognos cubes and denormalized the data for faster access.
  • Customized the Cognos reports.
  • Provided upgrade and configuration support as part of administration.
  • Applied HotFix 5 on Informatica 8.6.1.
  • Created labels for each project for code deployments.
  • Created deployment groups for code migration from DEV to QA and from QA to Prod.
  • Created program wrapper/DB wrapper/File Manipulator jobs.
  • Attended the POC for Informatica Data Replication (IDR).
  • Configured Informatica data replication (IDR) fast clone.
  • Created jobs for Informatica Data Replication (Fast Clone) to get data from Oracle and load it into Teradata.
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Used Workflow Manager for creating and maintaining sessions and to monitor, edit, schedule, copy, abort, and delete sessions.
  • Extensively worked in performance tuning of the programs, ETL Mappings and processes.
  • Developed interfaces using UNIX shell scripts to automate the bulk load and update processes (see the sketch after this list).
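
A sketch of a bulk-load wrapper of the sort described above, here using Oracle SQL*Loader; the connect string, control file, and paths are placeholders:

    #!/bin/ksh
    # Automated bulk load: invoke SQL*Loader and fail loudly on a bad exit code.
    CTL=/etl/ctl/customer.ctl
    DATA=/etl/data/customer.dat
    LOG=/etl/logs/customer_load.log

    sqlldr userid=etl_user/etl_pass@ORCL control=$CTL data=$DATA log=$LOG direct=true

    if [ $? -ne 0 ]; then
        echo "Bulk load failed; see $LOG"
        exit 1
    fi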

Environment: Informatica PowerCenter 9.1, SQL Server 2008 R2, Oracle 11g, Sybase, SQL server Management Studio, MS Visual Studio 2010, SQL Developer for Oracle, Program Wrapper, File Manipulator, Cognos EBusiness Suite, Cognos Cubes, Teradata, Designer, SQL Server Migration Assistant (SSMA), CMS, ER-Studio, AIX, UNIX Shell Scripting, SQL, PLSQL, T-SQL, XML, Xqueries on SQL Server Database, Jira Bug tracker.

Confidential, AUSTIN, TX

Data Architect/DW Developer

Responsibilities:

  • Responsibilities include Production Implementation, Scheduling, Data Loading, Monitoring, Troubleshooting and Support for Global Operations Reporting using Informatica and Business Objects.
  • Extensively worked with the solution architect to implement the project from scratch.
  • Trained team members to run and monitor workflows in Informatica.
  • Led the change management team.
  • Created extensive T-SQL procedures.
  • Created SSIS packages to get the data from the flat files and load it into Oracle 11g.
  • Created the deployment document to be followed for migration of code from one environment to another.
  • Created the metadata layer for OBIEE.
  • Created batch scripts to rename, copy, and move the processed files (see the sketch after this list).
  • Designed conceptual, logical and physical data model using Erwin data modeling tool.
  • Administrative role and monitoring and support for several projects in Staging and Production environments for Informatica and OBIEE.
  • Created ETL mappings to ensure conformity, compliance with standards, and lack of redundancy.
  • Exceptional background in the analysis, design, development, implementation, and testing of data warehouses and software applications.
  • Designed and developed Informatica mappings, to load data into target tables.
  • Worked with static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created jobs for Informatica Data Replication (Fast Clone) to get data from Oracle and load it into Teradata.
  • Created Mapplets to be re-used in the Informatica mappings.
  • Designed and developed Informatica mappings for data loads and data cleansing. Extensively worked on Informatica Designer, Workflow Manager.
  • Extensively used most Informatica transformations, including Lookup, Stored Procedure, and Update Strategy.
  • Created SSIS packages to automate data from Flat files into Oracle database.
  • Created an SSIS package to get dynamic source file names using the ForEach Loop Container.
  • Used the Lookup, Merge, Data Conversion, Sort, etc., data flow transformations in SSIS.
  • Used Workflow Manager for creating and maintaining sessions and to monitor, edit, schedule, copy, abort, and delete sessions.
  • Extensively worked in performance tuning of the programs, ETL Mappings and processes.
  • Developed Interfaces using UNIX Shell Scripts to automate the bulk load & update Processes.
  • Made performance improvements to the database by building Partitioned tables, Index Organized Tables and Bitmap Indexes.
  • Developed the OBIEE RPD and DAC from end users' input.
  • Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules and security.
  • Designed and Developed Audit process to maintain the data integrity and data quality. Source to target audits were built to make sure accurate data is loaded to the warehouse and Internal Audits for checking the integrity of the data within the data warehouse.
  • Tested the data and data integrity among various sources and targets. Associated with Production support team in various performances related issues.
  • Strong knowledge of Oracle Business Intelligence Enterprise Edition.
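
A sketch of the rename/move batch script mentioned above; the directories and file pattern are placeholders:

    #!/bin/ksh
    # Move processed files to the archive directory with a timestamp suffix.
    SRC=/etl/inbound
    ARCHIVE=/etl/archive
    STAMP=$(date +%Y%m%d%H%M%S)

    for f in "$SRC"/*.csv; do
        [ -e "$f" ] || continue    # nothing to process
        mv "$f" "$ARCHIVE/$(basename "$f" .csv)_$STAMP.csv"
    done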

Environment: Erwin 4.2, Informatica Power Center 9.1/8.6.1, Informatica Data Replication 9.1.1 (IDR), Oracle 10g/11g, Teradata 13.1, SQL Developer, Crontab scheduler, SQL Server 2008 R2, SQL Server Management Studio, SQL Server Visual Studio 2007, PL/SQL, SQL Assistant, Windows XP, Sun Solaris 5.1.0, UNIX scripting, OBIEE, IBM Vendavo Profit Analyzer.

Confidential, HOUSTON, TX

Informatica/OBIEE lead

Responsibilities:

  • Extensively involved in gathering business requirements and translating them into technical requirements
  • Responsible for loading the data into base model repository and Dimensional model.
  • Used Informatica’s Data Transformation (B2B) tool to retrieve unstructured (xml) data.
  • Responsible for strictly maintaining naming standards and warehouse standards for future development.
  • Extensively wrote XQuery queries to get data from XML stored in SQL Server (see the sketch after this list).
  • Actively worked on Informatica Administration Console.
  • Coordinated and led the ETL development team and assisted the OBIEE team with RPD and dashboard customization.
  • Worked closely with the OBIEE team in building the RPD and dashboard.
  • Customized the OBIEE dashboard.
  • Configured SFDC license in administration console.
  • Wrote BTEQ scripts of Teradata extensively.
  • Retrieved data from JD Edwards and loaded into Oracle.
  • Created a customized OBIEE model in the RPD to bring the RPD data into the dashboard.
  • Configured replication jobs for the initial sync using the Data Replication console.
  • Created Informatica IDR jobs using the Data Replication console.
  • Scheduled the replication jobs using the replication console.
  • Extracted data from various sources, such as flat files and Oracle, using Informatica Power Center.
  • Designed and Developed Stored Procedures/Views in Oracle.
  • Created SSIS packages using BIDS.
  • Imported sources and targets for the SSIS packages into BIDS.
  • Achieved performance improvements by tuning SQL queries, XQuery queries, and extraction procedures between Oracle and Power Center.
  • Offered 24/7 production support for the application as needed.
  • Developed complex Power Center mappings and IDQ plans by using different transformations and components respectively to meet the data integration and data quality requirements for various clients.
  • Used the Informatica XML Source Qualifier transformation to read data from an XML (XSD) source and write data to an XML target.
  • Created and monitored workflows/sessions using Informatica Workflow Manager/Workflow Monitor to load data into target Oracle database
  • Performed unit testing, integration testing, and user acceptance testing to proactively identify data discrepancies and inaccuracies.
  • Involved in performance tuning at source, target, mapping and session level
  • Delivered and signed off on deliverables pertaining to the transactional data warehouse.
  • Migrated the Informatica Code using the deployment groups
  • Prepared design documents, ETL Specifications and migration documents
  • Introduced the concept of an OBIEE dashboard to track the technical details, citing the continuous requirement changes and rework needed.
  • Maintained daily Tech Tracker for the updates from the team regarding their objects, issues and progress
  • Involved in Informatica PowerCenter 8.1.1 SP4 Repository upgrade to Informatica Power center 8.6.1
  • Involved in providing Informatica Technical support to the team members, as well as, the business
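
A sketch of querying XML stored in SQL Server with XQuery via sqlcmd, as referenced above; the server, database, table, and XML shape are assumptions:

    #!/bin/ksh
    # Shred values out of an xml column (doc) with the value() XQuery method.
    sqlcmd -S sqlhost -d StagingDB -U etl_user -P etl_pass -Q "
    SELECT doc.value('(/Order/OrderId)[1]', 'int')                 AS order_id,
           doc.value('(/Order/Customer/Name)[1]', 'varchar(100)')  AS customer_name
    FROM   dbo.OrderXml;"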

Environment: Informatica Power center 9.1/8.6.1, Informatica Data Transformation (B2B), Informatica Data Quality 8.6.2, Informatica IDR, Oracle 11g/10g, SQL Server 2008, SSIS, Windows 2003/2008, Sun Solaris, Red Hat Linux, OBIEE, Micro Strategy, UNIX shell scripting.

Confidential, HOUSTON, TX

BI Lead / Netezza Implementation Consultant

Responsibilities:

  • Provided architectural design of Netezza to the client.
  • Provided Netezza framework design to the client (pros and cons).
  • Extensively assisted the Framework implementation team with time-to-time client requirements and WM specific framework design.
  • Interacted with the business users, assisted the ELT developers with code development.
  • Assisted the Informatica lead in understanding the framework requirement.
  • Worked closely with data modelers in requirement gatherings.
  • Assisted the ELT developers in understanding the mapping documents.
  • Created NZSQL procedures to be kicked off within the framework.
  • Extensively created shell scripts for the configuration of meta, ddl, data, and xfr files in the UNIX directories (see the sketch after this list).
  • Mentored the client on different BI tools.
  • Evaluated team members' performance.
  • Created stored procedures (NZ-SQL).
  • Scheduled the ELT load in Autosys.
  • Debugged and corrected the xfr files developed by other ELT developers.
  • Fixed numerous bugs with load issues.
  • Optimized the NZ-SQL queries.
  • Converted Oracle DDL to Netezza DDL.
  • Created the format of the unit test documents per Netezza Framework.
  • Assisted ELT developers in creating the unit test documents.
  • Managed user and folder permissions for the developers.
  • Purged old repository objects weekly.
  • Created shell script for repository backup weekly.
  • Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
  • Did performance tuning on the ELT code developed by ELT developers.
  • Debugged the framework error logs.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Resolved issues raised by the client/actuarial users, validating the data in the database or the application functionality.
  • Worked closely with QlikView developers.
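
A sketch of the shell-driven nzsql execution used within the framework; the host, database, credentials, and file layout are placeholders:

    #!/bin/ksh
    # nzsql reads connection settings from the NZ_* environment variables.
    export NZ_HOST=nzprod
    export NZ_DATABASE=EDW
    export NZ_USER=etl_user
    export NZ_PASSWORD=etl_pass

    # Execute a generated ddl file and stop on error.
    nzsql -f /etl/ddl/customer_dim.ddl
    if [ $? -ne 0 ]; then
        echo "nzsql failed for customer_dim.ddl"
        exit 1
    fi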

Environment: Informatica Power Center 8.6.1, Trillium Data Quality, Netezza TwinFin 6 (production), Netezza TwinFin 3 and Netezza Skimmer (non-production), QlikView reporting tool (Netezza), Oracle 11g/10g, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, MicroStrategy, Crystal Reports, UNIX as server, Autosys, and UNIX shell script.

Confidential, HOUSTON, TX

Sr. Informatica Administrator/Developer/Data Analyst

Responsibilities:

  • Installed and configured Informatica 8.6.1 on UNIX (SunOS 5.11), including the high availability option.
  • Configured the domain with various application services, such as Repository Services and Integration Services.
  • Purged deleted objects from the repository in the DEV and QA environments. The repository is approximately 850 MB and growing; to manage its growth, purging of deleted objects must be performed as part of normal administrative maintenance (one hour every other day).
  • Performed Informatica PowerCenter HotFix upgrade / TOMCAT upgrade to meet the Organization security standards.
  • Created and maintained users and groups for the Development and Quality Check repositories.
  • Maintained the repository, purged repository objects, and archived periodic repository backups.
  • Handled workflow errors that required a Service Request to Informatica for problem resolution; this sometimes meant backing up, zipping, and FTP-uploading the repository to the Informatica FTP site, where Informatica would examine it for inconsistencies and fix them, then downloading the fixed repository and restoring it in the troubled environment.
  • Revised the standards for the ETL design.
  • Gathered and analyzed business requirements for RS One, conforming to the business rules.
  • Designed and customized the business process using Informatica transformations, including SQL, Source Qualifier, Lookup, Aggregator (incremental update), Expression, Joiner, Filter, Router, and Update Strategy transformations.
  • Created reusable transformation logic and mapplets for use in multiple mappings, and worked with shared folders and shortcuts.
  • Developed an Informatica mapping to handle dynamic partitions, i.e., to create/add and archive partitions of Oracle database tables.
  • Implemented mappings, sessions, and workflows flexibly using parameter files/global parameters and Informatica best practices.
  • Worked with various relational and non-relational sources, such as flat files (direct/indirect), relational tables, and ERP systems.
  • Created SSIS packages to get the data from AS 400 (.csv files) into SQL Server 2008.
  • Used the Data Conversion transformation in SSIS to get the correct data types into the SQL Server database.
  • Loaded data into Interface tables of Oracle EBS.
  • Wrote extensive validation scripts before loading the data into Oracle EBS.
  • Configured SAP IDoc connector for Informatica.
  • Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the target database (see the FastLoad sketch after this list).
  • Did bulk loading of Teradata table using TPump utility.
  • Retrieved data from SAP IDocs using Informatica connector.
  • Imported data from AS400 into the dbo schema of SQL Server 2008 through the Import Wizard and saved it as an SSIS package.
  • Created SSIS packages for importing the XRef tables for Purchase orders conversion.
  • Developed and configured various mappings and workflows for reading and writing the data to JMS (JAVA Message Service) Queues, using Application Source qualifier.
  • Significantly involved in the analysis and implementation of performance tuning techniques for SQL, transformations, mappings, and sessions.
  • Developed and tuned DML/DDL SQL to implement data modeling changes; also developed T-SQL and PL/SQL procedures to handle the Informatica job metadata.
  • Provided the functional and technical specifications for designing customized workflows and their automation.
  • Implemented MDM and maintained data integrity by eliminating redundancy.
  • Implemented change data capture via multi-level scheduling of sessions and workflows.
  • Developed various shell scripts using pmcmd command line program, to execute and maintain the workflow jobs.
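
A sketch of a FastLoad run driven from shell, as referenced above; the TDPID, credentials, file, and table names are placeholders:

    #!/bin/ksh
    # Feed FastLoad an inline script: pipe-delimited file into a staging table.
    fastload <<EOF
    LOGON tdprod/etl_user,etl_pass;
    BEGIN LOADING edw.stg_customer ERRORFILES edw.stg_customer_e1, edw.stg_customer_e2;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(100))
    FILE = /etl/data/customer.dat;
    INSERT INTO edw.stg_customer (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF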

Environment: Informatica PowerCenter 8.6.1/8.1.1, Erwin Data Modeler, Teradata, SQL Server 2008 R2, SSIS, SSRS, Oracle 11g/10g, Oracle 11.X (EBS), SQL Server Management studio, TOAD 9.6, SAP IDoc, OBIEE Reporting, Windows NT/2000, UNIX SUNos 5.11, HP Load Runner 9.50.

Confidential, DENVER, CO

Sr. Informatica Lead/Administrator

Responsibilities:

  • Created Integration Requirements and Design Specification Document.
  • Provided architectural design for Informatica.
  • Defined and designed flow and standards for Informatica.
  • Presented all of the Informatica tools to the client, with their usage, advantages, and disadvantages, so they could decide which Informatica tools to proceed with.
  • Extracted data from the Salesforce legacy system, SalesVision, and Charles River (trading platform).
  • Documented ETL requirements, translating the STMs' business logic into ETL language.
  • Created projects and jobs in Talend Open Studio.
  • Used basic runs, debugged jobs, and used the metadata wizard, etc., in Talend Open Studio.
  • Led the offshore GDC team of ETL developers, providing them with an in-depth understanding of the architecture, ETL system design, and requirements.
  • Provided the real time solutions with Informatica mappings for the traders to instantaneously react to market opportunities.
  • Did extensive analysis for advanced trading analytics with drill-down capability (by trader, portfolio, etc.).
  • Analyzed data from commodity exchanges (ICE and NYMEX) and pricing sources (LIM and Platts).
  • Created ETL mappings for identification of arbitrage opportunities, optimize a portfolio in real-time, simulate transactions and automatically execute trade strategies with live feeds.
  • Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information that will enhance business decision-making.
  • Used Informatica data explorer tool (IDE) for data profiling.
  • Loaded the relational tables for trade decision support which is consumed by the dashboard for trade decisions.
  • Involved in designing the Data warehouse based on the requirement document using Informatica Power Center 8.6.1.
  • Created a stored procedure, called from Informatica, to fetch the sequence NEXTVAL from Oracle's DUAL table.
  • Created reusable expression transformation for Meta columns of Standardization area.
  • Masked data and populated to the limited trust zone using Data masking transformation of Informatica.
  • Used SQL and stored procedures.
  • Used Exceed tool for scheduling the Autosys jobs.
  • Debugged and corrected mappings created by GDC.
  • Fixed numerous bugs based on testers' inputs.
  • Created Visio documents for Autosys Production Schedule.
  • Created Production readiness document.
  • Created Autosys documents for one-time and daily data loads (see the JIL sketch after this list).
  • Used Exceed to execute the Autosys Jobs.
  • Upgraded Informatica Power Center 8.1.1 SP4 to 8.6.1.
  • Configured the Informatica Power Exchange add-on for SAP (Power Connect).
  • Created Cognos cubes and developed Cognos reports.
  • Used Informatica Data Quality tool for Standardization by referring to the database dictionary tables and populating the flat file dictionaries.
  • Good knowledge of the Oracle major upgrade from 10.2.0.4 to 11g.
  • Worked with tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Read CSV and tab-delimited files and worked with code pages.
  • Created .CSV files from excel spreadsheets and loaded into the target Oracle Database.
  • Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Sorter, Lookup, Joiner, and Aggregator transformations.
  • Wrote UNIX shell scripts extensively.
  • Mentored and tutored Informatica users on the Power Center product suite.
  • Created deployment groups for each iteration releases.
  • Created labels for deployment groups for migration.
  • Managed tools and services of Informatica.
  • Managed user and folder permissions for the developers.
  • Purged old repository objects weekly.
  • Created shell script for repository backup weekly.
  • Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
  • Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner.
  • Did performance tuning on the mappings developed by developers.
  • Wrote PL/SQL Packages, Stored procedures to implement business rules and validations in the actuarial system.
  • Looked up and read session, event and error logs for troubleshooting.
  • Created Informatica unit testing document.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Resolved issues raised by the client/actuarial users, validating the data in the database or the application functionality.
  • Worked closely with Cognos Developers for building cubes and upgrade from Cognos 8.3 to 8.4.1.
  • Optimized and tuned mappings, sessions, indexing, and partitioning for better performance and efficiency.
  • Created the reusable transformations for better performance.
  • Designed and implemented data verification and testing methods for the data warehouse.
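
A sketch of the Autosys JIL behind a daily load job, as referenced above; the job name, machine, owner, and script path are placeholders:

    /* Daily load: a command job (job_type: c) on the ETL server. */
    insert_job: edw_daily_load   job_type: c
    command: /infa/scripts/run_nightly_load.ksh
    machine: etlprod01
    owner: infa@etlprod01
    date_conditions: 1
    days_of_week: all
    start_times: "23:30"
    std_out_file: /infa/logs/edw_daily_load.out
    std_err_file: /infa/logs/edw_daily_load.err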

Environment: Informatica Power center 8.5.1, Informatica Metadata Manager, Informatica Data Quality 8.6.2, Informatica Data Explorer, Oracle 11g/10g, SalesForce, SalesVision, Charles River, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, Talend Open Studio 4.1, Talend Open Data Profiler, XML Sources, Cognos ePlanning 8.4.1, UNIX as Server and Citrix as Client, Exceed Autosys Tool and UNIX Shell Script.
