
BI Developer Resume


Costa Mesa, CA

OBJECTIVE

  • To obtain a position as an Informatica Developer in a fast-paced company where I can add value by leveraging around 10 years of IT experience, with expertise in the analysis, design, development, and implementation of data warehousing using ETL tools with Oracle, DB2, MS SQL Server, Sybase, and Teradata databases on Windows and UNIX platforms.

SUMMARY

  • Expert-level experience in data integration and data warehousing using the ETL tool Informatica PowerCenter 10.X/9.6/9.1/8.6 (Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, Sessions/Tasks, Worklets/Workflow Manager). Knowledge of the Informatica tools PowerExchange, PowerAnalyzer, and PowerConnect, and of data mart, OLAP, and OLTP concepts.
  • Extensively used enterprise data warehousing ETL/business intelligence methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter.
  • Expert in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, applications, COBOL sources, and Teradata.
  • Extensive experience in Hadoop, HDFS, Hive, Impala, HQL queries, and Sqoop.
  • Extensive data warehouse experience using Informatica PowerCenter and Informatica PowerExchange (CDC) for designing and developing transformations, mappings, and sessions.
  • Proficient with Informatica Big Data Developer and Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
  • Strong working experience with IDQ (Informatica Data Quality), developing plans for analysis, standardization, matching, and consolidation of data using different components.
  • Experience in integration of various data sources such as flat files, EDI files, IDocs, Oracle, SQL Server, SAP, and MS Access into the staging area.
  • Extensive experience with Ab Initio graphs.
  • Proficient in implementing Complex business rules by creating re-usable transformations, workflows/worklets and Mappings/Mapplets.
  • Experience in Performance Tuning of targets, mappings and sessions.
  • Thorough knowledge of database management concepts like conceptual, logical and physical data modeling and data definition, population and manipulation. Expertise in logical and physical database design and Data Modeling using data modeling tools like Erwin.
  • Experienced in using Tableau 10.4 and Business Objects XIR3/XIR2 to build user-defined queries and reports enabling drill-down and slice-and-dice analysis on multiple databases.
  • Configured SSIS packages using package logging, breakpoints, checkpoints, and event handlers to trap and fix errors.
  • Demonstrated expertise with ETL tools, including SQL Server Integration Services (SSIS) and Informatica, ETL package design, and RDBMS platforms such as SQL Server and Oracle.
  • Strong expertise in relational database systems such as Teradata, Oracle, SQL Server, MS Access, Sybase, and DB2; design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Highly proficient in writing, testing, and implementing triggers, stored procedures, functions, packages, and cursors using PL/SQL.
  • Experience in scheduling ETL jobs using Control-M and Tivoli.
  • Worked on Data Loading from ASCII flat files to Oracle database using SQL*Loader.
  • Created scripts using FastLoad and MultiLoad to load data into the Teradata data warehouse.
  • Experience in task automation using UNIX shell scripts, job scheduling (Autosys), and communicating with the server using pmcmd (see the sketch following this summary).
  • Good knowledge of SFDC.
  • Data warehousing domain experience including investment banking, credit cards, and pharmaceuticals. Proven ability to implement technology-based solutions for business problems.
  • Designed and wrote the scripts required to extract, transform, load (ETL), clean, and move data and metadata so it can be loaded into a data warehouse, data mart, or data store.
  • Developed staging areas to extract, transform, and load new data from the OLTP database into the warehouse; strong in dimensional modeling, star/snowflake schemas, extraction, transformation & load, and aggregates.
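
As a minimal illustration of the FastLoad and pmcmd automation described in the bullets above (the TDPID, credentials, table, folder, and workflow names are all hypothetical placeholders), a shell script might bulk-load a pipe-delimited extract into a Teradata staging table and then trigger the downstream PowerCenter workflow:

    #!/bin/sh
    # Sketch only: bulk-load a pipe-delimited file into an empty Teradata
    # staging table, then start the downstream PowerCenter workflow.
    # Assumes the FastLoad error tables do not already exist.
    fastload <<'EOF'
    LOGON tdpid/etl_user,etl_pass;
    DATABASE stg_db;
    BEGIN LOADING stg_db.stg_customer
        ERRORFILES stg_db.stg_customer_e1, stg_db.stg_customer_e2;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60))
    FILE = /data/in/customer.dat;
    INSERT INTO stg_db.stg_customer (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF
    [ $? -eq 0 ] || { echo "FastLoad failed" >&2; exit 1; }

    # Kick off the PowerCenter workflow and wait for completion.
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u etl_user -p etl_pass \
        -f DW_FOLDER -wait wf_load_dw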

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.X/9.X/8.X, SAP, PowerExchange 5.x, IDQ, SSIS.

OLAP/BI: Cognos, Spotfire, Cognos IWR, OBIEE, Business Objects 5.0/6.5.

Data Modeling: Erwin 4.0, Star-Schema Modeling, FACT and Dimension Tables

DBMS: Oracle 11g/10g/9i/8i, Microsoft Access, SQL Server 2005/2008, MS Excel, Flat Files, Teradata V13.0/12.0, Sybase

Languages: C, C++, Java, JavaScript, SQL, PL/SQL, T-SQL, HTML, DHTML, XML, UNIX shell scripting, Visual Basic, ASP, JSP, Macromedia software, JCL.

Scripting Languages: Java Script, VB Script and UNIX Shell Scripting.

Operating Systems: Windows 2008/2003/NT/XP, HP-UX, Linux, AIX

Design Skills: Object Oriented Analysis Design using UML.

Others: MS Project, MS Office Suite, TOAD (Tool for Oracle Application Developers).

PROFESSIONAL EXPERIENCE

Confidential, Costa Mesa, CA

BI Developer

Responsibilities:

  • Working on the enterprise-level data warehouse (EDW).
  • Working on different data marts and a data lake for different domains, e.g., Automotive, Emergency Roadside Assistance, Insurance, Travel, Membership, and Payments.
  • Working on Hadoop, Hive, Impala, and HQL queries for different databases (see the sketch after this list).
  • Extracting, transforming, and loading data through Informatica PowerCenter and Informatica Developer.
  • Creation of the design document for the data load process.
  • Loading data from/to Teradata, SQL Server, Oracle, Hadoop, Hive and API.
  • Export & Import of workflows in different environments.
  • Making changes in the configuration/Parameter files and performance tuning of existing Informatica jobs.
  • Data loading, data conversion, ensuring data validation and loading of error & audit tables.
  • Maintaining the data retention policy of the organization.
  • Scheduling the ETL jobs in Control M.
  • Working with Agile methodology, implemented in Clarizen and JIRA.
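
For illustration, a hedged sketch of running one HQL aggregate through Hive (batch) and Impala (interactive) from the shell; the database, table, and host names are placeholders, not the actual EDW objects:

    #!/bin/sh
    # Run the same HQL aggregate on Hive, then on Impala for a faster read.
    HQL="SELECT region, COUNT(*) AS members
         FROM membership_db.member_fact
         WHERE load_dt = '2019-06-30'
         GROUP BY region"

    hive -e "$HQL"                              # batch execution via Hive
    impala-shell -i impala-host:21000 -q "$HQL" # interactive run via Impala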

Environment: Informatica PowerCenter 10.2, Informatica Developer, Teradata, Oracle 11g, SQL Server, Hadoop, HDFS, Hive, Impala, Sqoop, pipe-delimited flat files, XML, JSON, WinSCP, Salesforce.com (SFDC), SnapLogic, Tableau, Business Objects, Erwin 7.2.

Confidential, NJ

Informatica Developer

Responsibilities:

  • Worked on the Outage Management System (OMS).
  • Customer data flows from InService 9.2 (the OMS database) to downstream applications.
  • Data from the OMS database is stored in a staging table at the granular level of customer account number and premise ID.
  • Worked on SnapLogic configuration for iPaaS.
  • Created AWS and SFDC connections in SnapLogic for real-time data integration.
  • Created the design document for the customer load process.
  • Extracted, transformed, and loaded data through Informatica.
  • Created Informatica mappings, mapplets, worklets, and workflows.
  • Loaded data from SQL Server, flat files, and Oracle into the Oracle database.
  • Created the SFDC, flat file, and Oracle connections for AWS cloud services.
  • Performance tuning of existing Informatica jobs.
  • Export and import of workflows across different environments.
  • Made changes to the configuration/parameter files already developed for OMS History, Proactive Communication, iFactor (outage maps), and Reliability Metric (see the parameter-file sketch after this list).
  • Data loading and data conversion (customer ID and premise ID patterns changed).
  • Ensured data validation.
  • Created and loaded error and audit tables.
  • One-time conversion of transactional tables in OMS Primary, Archival and NRT (Near Real Time) databases.
  • Weekly full load of customer data via the ETL process.
  • Maintaining the data retention policy of the organization.
  • Scheduling the ETL jobs in CA Workload Automation.
  • Created unit test cases and supported the testing team for issues.
  • Working with Agile methodology, tracked in JIRA.
  • Wrote VBScript for data loading.
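
As a hedged sketch of the parameter-file maintenance mentioned above (the folder, workflow, session, and parameter names are all hypothetical), a PowerCenter parameter file is keyed by folder/workflow/session headers and can be regenerated or patched from a shell script before each run:

    #!/bin/sh
    # Rewrite the parameter file read by the OMS History workflow.
    PARAM=/infa/params/wf_oms_history.par

    cat > "$PARAM" <<'EOF'
    [OMS_FOLDER.WF:wf_oms_history.ST:s_m_load_customer]
    $$RunDate=2019-06-30
    $DBConnection_Source=OMS_ORA
    $DBConnection_Target=EDW_SQLSRV
    EOF

    # Or patch a single value in place before the nightly run:
    sed -i 's/^\$\$RunDate=.*/$$RunDate=2019-07-01/' "$PARAM"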

Environment: Informatica PowerCenter 10.2, SAP HANA, Oracle 11g, SQL Server 2017, pipe-delimited flat files, Amazon Web Services (AWS) cloud, Amazon Redshift, Cloud Data Integrator 10, Business Objects, Erwin 7.2, Oracle Exadata, XML, Salesforce.com (SFDC), SnapLogic.

Confidential, CA

Informatica Developer

Responsibilities:

  • Created high-level and low-level design documents and the ETL standards document.
  • Involved in the extraction, transformation, and loading (ETL) process.
  • Extracted data from flat files, mainframes, DB2, and Oracle databases, and applied business logic to load it into the central Oracle database.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
  • Designed and developed Reports for the user Interface, according to specifications given by the Track leader.
  • Involved in performance tuning at source, target, mapping and session level.
  • Loaded Oracle tables from XML sources.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded into Oracle EBS
  • Introduced the concept of a data dashboard to track technical details, given the continuous requirement changes and rework needed.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers for OBIEE.
  • Created subject areas in containers for OBIEE.
  • Created narrative reports in OBIEE.
  • Retrieved data from SAP using Informatica Power Exchange.
  • Supported Integration testing by analyzing and fixing the issues.
  • Created Unit Test Cases and documented the Unit Test Results.
  • Resolved data skewness in Teradata.
  • Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Used Informatica web services to create work requests/work Items for the end user.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
  • Wrote shell scripts to back up the Informatica repository on a weekly basis (see the pmrep sketch after this list).
  • Used Perl scripts to archive older logs via an Informatica command task.
  • Wrote basic Perl scripts for profile checks.
  • Staged data in Oracle E-Business Suite stage tables using Informatica PowerCenter.
  • Worked extensively with SAP BW data as a source system for the data warehouse.
  • Worked on SAP financial data such as AP, AR, and GL.
  • Integrated Salesforce data into the target Oracle database using Informatica Cloud.
  • Validated the Salesforce target data in the Force.com application.
  • Created invoices, cash receipts, RMA, and RMA Start in Salesforce from Oracle EBS.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Created a customized OBIEE model in the RPD to retrieve data into the dashboard.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Performance tuning on sources, targets, mappings and database.
  • Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
  • Worked as a production support SME to investigate and troubleshoot data issues coming out of weekly and monthly processes.
  • Worked with the business to provide a daily production status report covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
  • Used database level Greenplum partitioning and Informatica hash partitioning.
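
A minimal sketch of the weekly repository backup mentioned above, assuming hypothetical repository, domain, and path names; pmrep connect followed by pmrep backup is the standard sequence:

    #!/bin/sh
    # Weekly Informatica repository backup, intended to run from cron.
    STAMP=$(date +%Y%m%d)

    pmrep connect -r REP_DEV -d DOMAIN_DEV -n admin_user -x admin_pass
    pmrep backup -o /infa/backups/REP_DEV_${STAMP}.rep -f   # -f overwrites

    # Example crontab entry (Sundays at 02:00):
    # 0 2 * * 0 /infa/scripts/repo_backup.sh >> /infa/logs/repo_backup.log 2>&1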

Environment: Informatica PowerCenter 10.2, SAP BW, SFDC, Oracle 11g, DB2, SQL Server 2017, ServiceNow, Toad 10, DbVisualizer, UNIX, Toad Data Modeler, Greenplum DB, SUSE Linux, shell/Perl scripting, SQL, Control-M.

Confidential, IA

Informatica Developer

Responsibilities:

  • Interacted actively with business analysts and data modelers on mapping documents and the design process for various sources and targets.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and XML Source Qualifier transformations.
  • Analyzed, designed, and implemented complex SQL stored procedures, ETL processes, and Informatica mappings.
  • Used Tidal scheduler to get the source file from the server using Tidal flat file FTP connection as well as power center FTP connection.
  • Migration of data from Oracle 11g to Oracle Exadata.
  • Created Oracle Exadata database, users, base table and views using proper distribution key structure.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Ran all workflows using the Tidal scheduler.
  • Worked on Pentaho Data Integration installation, including content management, execution, security, and scheduling.
  • Worked with the toolbar, perspective toolbar, sub-toolbar, and the design and view tabs of Pentaho Data Integration.
  • Created Pentaho ETL jobs and performed performance monitoring and logging.
  • Configured Pentaho's thin Kettle JDBC driver.
  • Executed transformations for debugging in Pentaho DI; worked on L2 and L3 production support/monitoring of the nightly ETL loads.
  • Implemented SCD Type 1 and Type 2 mappings to capture new changes and maintain historical data.
  • Provided technical assistance during the production phase of project development.
  • Defined and developed technical standards for data movement and transformation, and reviewed all designs to ensure those standards were met.
  • Handled extraction of various types of source files (flat files, XML standard source data of different transactions) and loading into the staging area.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded into Oracle EBS.
  • Configured PowerExchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Designed and wrote the scripts required to extract, transform, load (ETL), clean, and move data and metadata so it can be loaded into a data warehouse, data mart, or data store.
  • Installed and configured Informatica PowerExchange for CDC and Informatica Data Quality (IDQ).
  • Created CDC (change data capture) sources in PowerExchange and imported them into PowerCenter.
  • Used IDQ's standardized plans for address and name cleanup.
  • Created custom plans for product-name discrepancy checks using IDQ and incorporated the plan as a mapplet into PowerCenter.
  • Configured the Informatica PowerExchange add-on for SAP (PowerConnect).
  • Retrieved data from SAP IDocs using the Informatica connector.
  • Used the unstructured data option (PDF files, spreadsheets, Word documents, legacy formats, and print streams) to obtain normalized data using Informatica B2B DT/Data Exchange.
  • Used Informatica B2B DT/Data Exchange for structured data such as XML.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Experience in SQL Server and SSRS report migration from SQL Server 2000 to SQL Server 2005, SQL Server 2008 to SQL Server 2012, and SQL Server 2008 to SQL Server 2012 R2.
  • Created drill-down, drill-through, linked, and sub-reports using SSRS, and resolved issues and errors generated in SSRS.
  • Created new mappings and enhanced old mappings according to changes or additions to the business logic.
  • Converted Oracle DDLs to Netezza DDLs.
  • Created the format of the unit test documents per the Netezza framework.
  • Created NZ/SQL procedures on Netezza using Aginity Workbench.
  • Retrieved error logs on UNIX for Netezza data loads from Oracle to Netezza.
  • Optimized the NZ-SQL queries.
  • Optimized the BOXI dashboard SQL with aggregates and subqueries.
  • Retrieved data via the Simple Object Access Protocol (SOAP), using the existing XSD from an XML file, through the Web Services Hub.
  • Retrieved data from web services and validated the response using Informatica Expression transformations (date, ZIP code, and location formats).
  • Configured the Informatica Web Services Hub in the Administration Console.
  • Worked with Informatica web services.
  • Created scripts using FastLoad and MultiLoad to load data into the Teradata data warehouse.
  • Configured RPD and DAC.
  • Created the Visio diagram for Informatica workflows to be scheduled in DAC.
  • Scheduled workflows in DAC and populated the Reporting layer of OBIEE.
  • Maintained user accounts in DAC.
  • Created source system containers, customized the data warehouse load, and manipulated columns in data warehouse tables in DAC.
  • Created the custom logical/physical folders in DAC.
  • Configured Informatica On Demand for data synchronization from SFDC to Oracle through the cloud.
  • Installed and configured Informatica's ILM product suite for data masking (data privacy), file archive load, data discovery, and data visualization for Data Archive.
  • Created projects in ILM for data masking with parameters such as commit interval, encryption key, and degree of parallelism.
  • Created expression, encryption, and SSN-replacement data masking techniques in the Data Masking transformation of Informatica PowerCenter.
  • Performed data masking for the limited trust zone using the Data Masking transformation of Informatica PowerCenter.
  • Integrated data from Oracle to Salesforce (SFDC) using Informatica Cloud.
  • Experience in Big Data with a deep understanding of the Hadoop Distributed File System ecosystem (MapReduce, Pig, Hive, Sqoop, HBase, Cloudera Manager), ETL, and RDBMS (see the Sqoop sketch after this list).
  • Loaded data from flat files to Big Data (1010data).
  • Worked on migration of mappings from DataStage to Informatica.
  • Updated numerous BTEQ/SQL scripts, made appropriate DDL changes, and completed unit and system testing.
  • Validated the Salesforce target data in the Force.com application.
  • Created invoices, cash receipts, RMA, and RMA Start in Salesforce from Oracle EBS.
  • Configured the SFDC license in the Administration Console.
  • Created automated schedules to run tasks at specific times as needed to migrate data to/from SFDC, databases, and ERP.
  • This project involved developing a data warehouse using Informatica to transform Salesforce data extracted from various databases and flat files; the data warehouse was built to provide a standardized reporting process for the Salesforce data.
  • Extensive knowledge of Master Data Management (MDM) concepts.
  • Extensive experience in designing, managing, and administering MDM/DIW objects using the Kalido DIW/MDM 8.5/9 tool.
  • Worked on designing catalogs, categories, sub-categories, and user roles using Kalido MDM 9.
  • Extracted data from the Salesforce legacy system, SalasVision, and Charles River (trading platform).
  • Provided enterprise data warehousing solutions, including design and development of ETL processes, mappings, and workflows using Informatica PowerCenter.
  • Created the Salesforce connections in Informatica PowerCenter.
  • Expert in designing and scheduling complex SSIS packages for transferring data from multiple data sources to SQL Server.
  • Expert in creating, configuring, and fine-tuning ETL workflows designed in DTS and MS SQL Server Integration Services (SSIS).
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Worked on Windows PowerShell for automation of remote workstations.
  • Worked in the Ab Initio graphical development environment (GDE).
  • Created Ab Initio graphs using input (flat file), Reformat, Join, Rollup, Concatenate, and output components.
  • Worked on creating parallel Ab Initio jobs for faster reads.
  • Designed Informatica mappings based on Ab Initio code.
  • Fixed existing components by comparing the Informatica code with the Ab Initio graph.
  • Worked with the Component Object Model (COM) and Windows Management Instrumentation (WMI) using Windows PowerShell scripting.
  • Created Windows PowerShell scripts with cmdlets in the Windows PowerShell environment.
  • Used the pipeline in Windows PowerShell to enable one cmdlet to be piped into another.
  • Participated in design workshops, providing technical insight and knowledge.
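
As a hedged example of the Sqoop ingestion referenced in the Big Data bullet above (the JDBC URL, credentials, and table names are placeholders), an Oracle table can be imported into Hive with parallel mappers:

    #!/bin/sh
    # Import an Oracle table into a Hive staging table via Sqoop,
    # splitting the read across 4 mappers on the primary key.
    sqoop import \
      --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
      --username etl_user --password-file /user/etl/.ora_pass \
      --table SALES.ORDERS \
      --split-by ORDER_ID \
      --num-mappers 4 \
      --hive-import --hive-table stage.orders \
      --fields-terminated-by '|'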

Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality, Cloud, Oracle 11g, Netezza 7.2.0, Netezza TwinFin 6 (production), TwinFin 3 and Netezza Skimmer (non-production), Exadata, web services, DAC, Ab Initio, Teradata V13.0, Cognos BI 8.3, Informatica B2B Data Transformation (DT)/Data Exchange (DE), data masking, MDM, Relational Junction, DB2, flat files, SSIS, PL/SQL, SQL*Plus, TOAD, UNIX, SAP, shell scripting, Autosys, Big Data, Erwin 4.2, Pentaho, Tidal scheduler, Hadoop, SFDC, Windows PowerShell.

Confidential, TN

Informatica Developer

Responsibilities:

  • Ran batch cycles involving job triggers from Informatica, querying tables in Teradata/Oracle, SQL development, report generation, and claims-archiving maintenance.
  • Involved in creating stored procedures and using them in Informatica (see the sketch after this list).
  • Implemented the claims data conversion process to move claims from Claims Workbench to the CNG Navigator application.
  • Provided technical assistance during the production phase of project development.
  • Defined and developed technical standards for data movement and transformation, and reviewed all designs to ensure those standards were met.
  • Worked with the command-line program pmcmd to interact with the server to start and stop sessions and batches, stop the Informatica server, and recover sessions.
  • Designed workflows that use multiple sessions and command-line objects (used to run the UNIX scripts).
  • Created source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Provided enterprise data warehousing solutions, including design and development of ETL processes, mappings and workflows using Informatica’s PowerCenter.
  • Responsible for migration of the work from the dev environment to the testing environment.
  • Provided guidance and expertise to resolve technical issues related to DW tools, primarily Informatica.
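
For illustration, a hedged sketch of the stored-procedure work described above (the schema, table, and procedure names are hypothetical): a PL/SQL procedure compiled through SQL*Plus, which a PowerCenter Stored Procedure transformation or post-session call can then invoke:

    #!/bin/sh
    # Compile a claims-archiving procedure that Informatica can call.
    sqlplus -s etl_user/etl_pass@CLAIMSDB <<'EOF'
    CREATE OR REPLACE PROCEDURE archive_claims (p_cutoff IN DATE) AS
    BEGIN
      -- Move closed claims older than the cutoff into the archive table.
      INSERT INTO claims_archive
        SELECT * FROM claims
         WHERE status = 'CLOSED' AND closed_dt < p_cutoff;
      DELETE FROM claims
       WHERE status = 'CLOSED' AND closed_dt < p_cutoff;
      COMMIT;
    END archive_claims;
    /
    SHOW ERRORS
    EXIT
    EOF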

Environment: Informatica PowerCenter 9.0.1, Oracle 10g, Cognos BI 8.3, Relational Junction, DB2, flat files, PL/SQL, SQL*Plus, TOAD, UNIX, shell scripting, Control-M, Erwin 4.2.

Confidential, IA

Informatica Developer

Responsibilities:

  • Worked with Informatica PowerMart client tools like Source Analyzer, Warehouse Designer, and Mapping Designer.
  • Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Extracted data from flat files and loaded it into the EDW.
  • Worked on multiple inbound ASN files to create XML; created a list file with the entire inbound ASN file list on UNIX and used indirect loading and transformation methods in Informatica (see the sketch after this list).
  • Handled extraction of various types of source files (flat files, XML standard source data of different transactions) and loading into the staging area.
  • Wrote data loading stored procedures and functions in PL/SQL to move data from source systems into operational data storage.
  • Worked with the DBA to build dimension tables and fact tables.
  • Created Complex Mapping for generating the parameter files.
  • Created source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Designed and developed SSIS packages, stored procedures, configuration files, tables, views, and functions; implemented best practices to maintain optimal performance.
  • Utilized SSIS (SQL Server Integration Services) to produce a Data Warehouse for reporting.
  • Mainly involved in developing the star schema (facts and dimensions based on future requirements).
  • Developed source-to-target mappings and scheduled Informatica sessions.
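
A minimal sketch of the indirect-load setup described above, with hypothetical paths: the script collects all inbound ASN files into a list file, and the PowerCenter session's source filetype is set to Indirect so the session reads each listed file in turn:

    #!/bin/sh
    # Build the list file consumed by a session whose source filetype
    # is "Indirect"; Informatica then loads every file named in it.
    IN_DIR=/data/inbound
    LIST=/infa/srcfiles/asn_filelist.lst

    ls -1 ${IN_DIR}/ASN_*.dat > "$LIST" || { echo "no ASN files" >&2; exit 1; }
    echo "$(wc -l < "$LIST") ASN files queued for indirect load"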

Environment: Informatica PowerCenter 8.6, Oracle 10g, WLM, UNIX AIX, DB2, flat files, PL/SQL, SQL*Plus, TOAD, shell scripting, Tivoli, Erwin 4.2

Confidential, MN

ETL Developer lead

Responsibilities:

  • Involved in creating design documents for Informatica mappings based on business requirements.
  • Extracted data from flat files and loaded it into the EDW.
  • Developed complex transformations and mapplets using Informatica to extract, transform, and load data into data marts, the enterprise data warehouse (EDW), and the operational data store (ODS).
  • Used various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router to implement complex logic while coding a mapping.
  • Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Created tasks and workflows in the Workflow Manager and monitored sessions in the Workflow Monitor.
  • Tested scripts by running workflows and assisted in debugging failed sessions.
  • Implemented client-side validation using JavaScript.
  • Analyzed functional requirements provided by business analysts for the code changes.
  • Wrote UNIX scripts for unit testing the ETL code (see the sketch after this list).
  • Executed the test cases for the code changes.
  • Participated extensively in functional and technical meetings to design the architecture of the ETL load process.
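
As a hedged example of the unit-test scripts mentioned above (the logon string and table names are hypothetical), a simple source-to-target row-count reconciliation can be run against Teradata with BTEQ:

    #!/bin/sh
    # Unit-test helper: compare staging vs. target row counts.
    bteq <<'EOF' > counts.out
    .LOGON tdpid/etl_user,etl_pass;
    SELECT 'SRC' AS side, COUNT(*) FROM stg_db.stg_orders;
    SELECT 'TGT' AS side, COUNT(*) FROM edw_db.fact_orders;
    .LOGOFF;
    .QUIT;
    EOF
    grep -E 'SRC|TGT' counts.out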

Environment: Informatica PowerCenter 9.0.1, PowerExchange, Teradata V12.0, SQL Server 2008, Oracle 10g, SOA, TOAD, SSH, WLM, UNIX AIX, Windows XP.
