
Sr. ETL Informatica Developer Resume


Buffalo, NY

PROFESSIONAL SUMMARY:

  • 8+ years of focused experience in Information Technology with a strong background in developing BI applications in verticals such as Healthcare, Insurance, Finance, Banking, and Pharmacy.
  • Experience in analysis, design, and development of various business applications on different data warehousing platforms using Informatica 10/9.x/8.x/7.1, Oracle, Sybase, Teradata, and SQL Server.
  • Extensive experience using Informatica to implement ETL methodology for data extraction, transformation, and loading.
  • Used Informatica Power Center 10/9.x to extract, transform, and load data into the Netezza Data Warehouse from various sources such as Oracle and flat files.
  • Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Center, B2B Data Transformation, Informatica Data Quality, MDM, SSIS, and OBIEE.
  • Experience in the design, development, and maintenance of Business Intelligence applications using SQL Server Integration Services (SSIS) and SQL Server Reporting Services.
  • Extensive knowledge of relational and dimensional data modeling, star and snowflake schemas, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
  • Experience in using Informatica Client Tools - Designer, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor, Analyst, Developer tools.
  • Developed various mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
  • Worked closely with other IT team members, business partners, data stewards, stakeholders, steering committee members, and executive sponsors on all MDM and data governance related activities.
  • Hands-on experience with the Healthcare Effectiveness Data and Information Set (HEDIS) and P4P used by NCQA (National Committee for Quality Assurance).
  • Experience in creating High Level Design and Detailed Design documents in the design phase.
  • Experience in integration of various data sources like Oracle, DB2, MS SQL Server and Flat Files.
  • Experience in building packages to extract, transform, and load data (ETL) using SSIS; designed packages utilized for tasks and transformations such as Data Conversion and Pivot tables.
  • Extensive experience in requirements gathering, analysis, impact analysis, design, development, Quality Assurance, Quality Control, mainframe testing (unit testing, system testing, integration testing, and UAT support), QA testing implementation, project management, defect tracking, causal analysis of defects, and reporting of project status.
  • Hands-on experience in SAS Base, SAS Macros, UNIX, SQL, and Teradata, working with business users daily to resolve their technical queries.
  • Generated surrogate keys for composite attributes while loading data into dimension tables using the Surrogate Key Generator (see the sketch following this list).
  • Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals and transformation tuning.
  • Strong familiarity with master data and metadata management and associated processes.
  • Supported MDM initiatives using Oracle databases and Informatica Data Quality (IDQ).
  • Experience in SQL and PL/SQL - Stored Procedures, Triggers, Packages and Functions.
  • Extracted data from multiple operational sources for loading into the staging area, Data Warehouse, and data marts using CDC/SCD (Type 1/Type 2) loads.
  • Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
  • Extensively involved in UNIX shell programming.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Coordinated with offshore, onsite, QA, scheduling, change management, business, and inter-dependent teams.
  • Working knowledge of the test management tool HP ALM/QC.
  • Implemented performance-tuning techniques at application, database and system levels.
  • Skilled in developing Test Plans, Creating and Executing Test Cases.
  • Experience working in Scrum & agile methodology and ability to manage change effectively.
  • Excellent communication and interpersonal skills; quickly assimilate the latest technologies, concepts, and ideas.
  • Create, maintain, and monitor TIDAL jobs for QNXT and UNIX, SQL, FTP, and SFTP operations; respond to requests and escalate to the appropriate support group when necessary.
  • Worked with various scheduling solutions including Control-M, AutoSys, and Tidal Enterprise Scheduler.
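
As a concrete illustration of the surrogate key generation mentioned above, here is a minimal Oracle-flavored sketch; the sequence, table, and column names are hypothetical rather than from any specific project:

    -- Hypothetical sequence serving as the surrogate key generator
    CREATE SEQUENCE dim_customer_seq START WITH 1 INCREMENT BY 1;

    -- Load the dimension, deriving the surrogate key from the sequence
    -- while carrying the composite natural key (cust_no + source_system)
    INSERT INTO dim_customer (customer_sk, cust_no, source_system, cust_name)
    SELECT dim_customer_seq.NEXTVAL, s.cust_no, s.source_system, s.cust_name
    FROM   stg_customer s;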

TECHNICAL SKILLS:

ETL Tools/Data Modeling Tools: Informatica Power Center 10/9.x/8.x/7.1 (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor), Power Exchange, IDQ, MDM, Informatica Cloud, MSBI (SSIS), Erwin; metadata, fact and dimension tables, physical and logical data modeling, star join schema modeling.

Databases: Oracle 12c/11g/10g/9i, Teradata, MS SQL Server, MS Access

Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Batch Scripting

Operating Systems: UNIX, Linux, Windows Server 2008/2003

Job Scheduling: Informatica Scheduler, Tidal Enterprise Scheduler, Control-M, CA AutoSys

Tools: Toad, SQL Developer, Visio, Teradata SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, Buffalo, NY

Sr. ETL Informatica Developer

Responsibilities:

  • Provide debugging and troubleshooting support for SQL stored procedures, PowerShell scripts, triggers and Visualforce components.
  • Worked as a developer on the Claims and Members team to create multiple flat file and XML extracts for the Dept. of Stores: SITES, VENDORS, and RAWITEMS.
  • Worked on DRIVE II to import multiple flat files from the Dept. of Stores regarding RAWITEMS and loaded them into Oracle tables.
  • Played a major role in the Power Center upgrade from 9.6 to 10 and the Oracle upgrade from 11g to 12c.
  • Worked with multiple Informatica repositories, version control, deployment groups, and mapping and folder migrations.
  • Provided support to the OBIEE team in using EDW data and SQL to generate reports from multiple fact and dimension tables (see the reporting query sketch after this list).
  • Created standard application workbooks for each project that includes all the technical documentation and error handling procedures used by production support team.
  • Created and maintained EDW/ETL standards and procedure documents to keep all the mappings in sync.
  • Conducted unit tests and created test plans in conjunction with testing team to prevent production failures and to maintain data accuracy.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Made continuous improvements to the existing data warehouse by implementing design changes and database logic.
  • Expertise in creating all technical and functional requirements for various database projects.
  • Used AutoSys as the job scheduling software: received files, created automated file watchers, set up job dependencies, and handled job failure tickets.
  • Performed robust testing, unit testing, and negative testing, and then handed the data over to the QA team.
  • Designed and developed various transformation logic using Expression, Filter, Joiner, Lookup, Router, and Rank transformations.
  • Fixed invalid mappings and troubleshot technical problems in the database.
  • Worked closely with department heads and strategic planning committees to formulate strategic plans and directions.
  • Documented specifications, created step-by-step technical and user manuals with illustrations, and designed and implemented new processes and technology.
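
A minimal sketch of the star-schema reporting SQL of the kind supplied to the reporting team; the fact and dimension names below are hypothetical:

    -- Hypothetical fact/dimension join: claim totals by month and product
    SELECT d.calendar_month,
           p.product_name,
           SUM(f.claim_amount) AS total_claims
    FROM   fact_claims f
    JOIN   dim_date    d ON d.date_sk    = f.date_sk
    JOIN   dim_product p ON p.product_sk = f.product_sk
    GROUP BY d.calendar_month, p.product_name
    ORDER BY d.calendar_month, p.product_name;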

Environment: Informatica Power Center/MDM 10, Oracle 12c, SQL Server 2012, MS Access 2010, Toad, UNIX, WinSCP, PuTTY, Erwin, SQL, PL/SQL, AutoSys.

Confidential, Coventry, RI

Sr. ETL Informatica Developer

Responsibilities:

  • Provide debugging and troubleshooting support for SQL stored procedures, PowerShell scripts, triggers and Visualforce components.
  • Create validation queries to support quality control procedures that ensure data integrity in Data Warehouse.
  • Design, build and support methods to manage numerous data integration and reporting tasks. Tools include Informatica, MSSQL.
  • Drove the organization toward metadata-driven ETL procedures using automated, self-auditing scripts and tools, replacing human-driven data duplication in Explorer, Access, and Excel.
  • Created various Informatica mappings to validate transactional data against business rules, extract lookup values, and enrich the data per the mapping documents.
  • Developed various Informatica workflows to load data from upstream systems using different methodologies, i.e., trigger-based pull, direct pull, and file-based push.
  • Designed the ETL architecture for the Deposits product to process huge volumes of Deposits data on a daily basis.
  • Fine-tuned several long-running Informatica workflows and implemented various techniques for faster processing of high-volume data by creating parallel partitions and using Teradata FastExport and the Netezza Bulk Writer.
  • Developed various SQL queries using joins, sub-queries, and analytic functions to pull data from relational DBs, i.e., Oracle, Teradata, and SQL Server (see the analytic query sketch after this list).
  • Created complex DataMart views for the corresponding products.
  • Created various complex PL/SQL stored procedures to manipulate/reconcile the data and generate the dashboard reports.
  • Performed unit testing and prepared the deployment plan for the various objects by analyzing their interdependencies.
  • Developed several UNIX shell scripts for file archival and compression.
  • Created various AutoSys jobs for the scheduling of the underlying ETL flows.
  • Coordinated with various team members across the globe, i.e., application teams, Business Analysts, users, DBAs, and the Infrastructure team, to resolve technical and functional issues in UAT and PROD.
  • Created various technical documents required for knowledge transition of the application, including re-usable objects (Informatica and UNIX).
  • Worked on IDQ for data cleansing, data matching, data conversion and address standardization.
  • Involved in integrating changes into the workflow to both test and allow error handling using Informatica IDQ.
  • Overall responsibility for the day-to-day operations and oversight of the Metadata Repository.
  • Created Data objects, Quick Profiles, Custom Profiles and Drill Down on Profile Result using IDQ.
  • Created tables from profile columns using IDQ.
  • Handled all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning, and ongoing monitoring.
  • Loading data from large data files into Hive tables.
  • Importing and exporting data into HDFS and Hive using Sqoop.
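
A minimal sketch of an analytic-function query of the kind described above, assuming a hypothetical deposits staging table; ROW_NUMBER() keeps only the latest record per account:

    -- Hypothetical de-duplication: keep the latest record per account
    SELECT *
    FROM  (SELECT t.*,
                  ROW_NUMBER() OVER (PARTITION BY account_id
                                     ORDER BY effective_dt DESC) AS rn
           FROM   stg_deposits t)
    WHERE  rn = 1;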

Environment: Informatica Power Center 9.6.1, IDQ, Oracle 11g, SQL Server 2012, MS Access 2010, SQL*Loader, UNIX, WinSCP, PuTTY, Erwin 7.2, SQL, PL/SQL.

Confidential, Houston, TX

ETL Developer

Responsibilities:

  • Responsible for development, support, and maintenance of the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Built the dimension and fact table load processes and the reporting process using Informatica.
  • Involved in data analysis for source and target systems, with a good understanding of Data Warehousing concepts: staging tables, dimensions, facts, Star Schema, and Snowflake Schema.
  • Extracted data from various sources such as Oracle, SQL Server, and flat files, then transformed and loaded it into targets using Informatica.
  • Created mappings using transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (see the SCD Type 2 sketch after this list).
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
  • Enhanced PL/SQL procedures for bulk loading from source systems such as Teradata and SAP.
  • Identified and eliminated duplicates in datasets through the IDQ 9.6 components Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables a single view of customers and helps control mailing-list costs by preventing multiple pieces of mail.
  • Formatted/generated data to support the SAP Business Objects interface for users accessing data from the Data Warehouse.
  • Involved in performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
  • ETL experience in using SAP BODS, Cognos Data Manager and Informatica.
  • Worked with SAP and Oracle sources to process the data.
  • Worked with Job Alerts, Job Classes, Queues, Job Actions, Variables, Calendars, File Events, and on-demand email jobs, some of the features deployed in our TIDAL environment.
  • Used Power BI PowerPivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
  • Worked with T-SQL to create Tables, Views, and triggers and stored Procedures.
  • Used indexes to enhance the performance of individual queries and enhance the stored Procedures.
  • Created, maintained, and scheduled various reports in Power BI, such as tabular and matrix reports.
  • Experience in configuring and deploying SSRS.
  • MDM Developer:
  • Configured match rule set properties by enabling search by rules in MDM according to business rules.
  • Used the Informatica MDM Hub to run batch jobs.
  • Loaded data from Informatica MDM Landing to Stage.
  • Designed data hierarchy management for Informatica MDM.
  • Identified major data quality issues in MDM Production.
  • Involved in match rule sets and match columns for the Match and Merge process, improving data quality through Batch Analysis.
  • Implemented the pre-land and land process of loading the dataset into the Informatica MDM Hub.
  • Configured and documented the Informatica MDM Hub to perform loading, cleansing, matching, merging, and publication of MDM data.
  • Executed batch jobs and batch job groups using the Batch Viewer.
  • Implemented parallelism for executing batch jobs using the Merge Manager.
  • Performed Data Steward activities like manual merge/unmerge and uploading data.
  • Implemented auto merge and manual merge for the best version of truth.
  • Created the Data Validation document, Unit Test Case document, Technical Design document, Migration Request document, and Knowledge Transfer document.
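
For reference, a minimal SCD Type 2 sketch in plain Oracle-flavored SQL (the production logic lived in Informatica mappings); the table, column, and sequence names are hypothetical:

    -- Step 1: expire the current dimension row when a tracked attribute changed
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.effective_end_dt = SYSDATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.cust_no = d.cust_no
                   AND    s.cust_address <> d.cust_address);

    -- Step 2: insert a fresh current row for changed and brand-new keys
    INSERT INTO dim_customer
           (customer_sk, cust_no, cust_address,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.cust_no, s.cust_address,
           SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.cust_no = s.cust_no
                       AND    d.current_flag = 'Y');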

Environment: Informatica Power Center, MDM, SQL Server 2005 Enterprise Edition, SSIS, T-SQL, SSRS, Windows Server 2003, Oracle 9i.

Confidential, Malvern, PA

ETL Developer

Responsibilities:

  • Participated in requirement meetings with Business Analysts to understand analytical content needs.
  • Assisted in documenting these requirements, resolving ambiguities and conflicts, and ensuring requirements are complete.
  • Worked directly with Client Operations and the Client Tech team to understand business scope; performed requirement analysis, converted business requirements into technical terms, and worked with the offshore team on coding and unit testing.
  • Responsible for developing, supporting, and maintaining the ETL (Extract, Transform and Load) processes using Informatica Power Center; created and configured workflows, worklets, and sessions to transport data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Worked on SAP data migration for Human Resources and Finance, converting various objects (Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll, etc.) to generate reports from the SAP BI system.
  • Extensive ETL experience using DTS for data extractions, transformations, and loads.
  • Design database table structures for transactional and data sources.
  • Involved in data analysis for source and target systems, with a good understanding of Data Warehousing concepts: staging tables, dimensions, facts, Star Schema, and Snowflake Schema.
  • Initiated data modeling sessions to design and build/append appropriate data mart models supporting the reporting needs of applications.
  • Extracted data from various sources such as Oracle, SQL Server, and flat files, then transformed and loaded it into targets using Informatica.
  • Created source and target definitions, reusable transformations, mapplets, and worklets.
  • Created mappings using transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator, and Address Validator.
  • Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (see the incremental load sketch after this list).
  • Designed packages utilized for tasks and transformations such as Execute SQL Task, mapping the right and required data from source to destination, Data Flow Task, Data Conversion, and Foreach Loop Container.
  • Expert in using the workflow canvas for multiple transformations of data and in using features such as variables, package configurations, event handlers, error logging, and checkpoints to facilitate complex ETL logic and specifications.
  • Defined system trust and validation rules for base object columns.
  • Performed Data Steward activities such as manual merge/unmerge and uploading data.
  • Implemented auto merge and manual merge for the best version of truth.
  • Created a DEV/QA/Test environment and offered it to IT employees to learn how to build jobs, understand best practices, and utilize Tidal's features in their development.
  • Worked with Job Alerts, Job Classes, Queues, Job Actions, Variables, Calendars, File Events, and on-demand email jobs, some of the features deployed in our TIDAL environment.
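
A minimal sketch of the incremental loading pattern referenced above, driven by a hypothetical load-control table that records the previous high-water mark:

    -- Pull only rows changed since the last successful run
    SELECT s.*
    FROM   src_orders s
    WHERE  s.last_update_dt > (SELECT last_run_dt
                               FROM   etl_load_control
                               WHERE  job_name = 'ORDERS_INCR');

    -- After a successful load, advance the high-water mark
    UPDATE etl_load_control
    SET    last_run_dt = SYSDATE
    WHERE  job_name = 'ORDERS_INCR';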

Environment: Informatica Power Center 9.5/9.1, IDQ, SQL Server, UNIX, Toad, SQL Developer, SSIS, PuTTY, SFTP/FTP

Confidential, Dallas, TX

Sr. ETL Developer

Responsibilities:

  • Interacted with users and made changes to Informatica mappings according to business requirements.
  • Experienced in database programming for Data Warehouses (schemas); proficient in dimensional modeling (Star Schema modeling and Snowflake modeling).
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router.
  • Worked with Netezza/Teradata databases to implement data cleanup and performance-tuning techniques.
  • Experience integrating Netezza databases with Informatica and building load processes with Netezza bulk load utilities such as Netezza bulk read and write.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Generated unique keys for composite attributes while loading data into the Data Warehouse.
  • Created views using Cisco Data Virtualization (Composite) over data sources from the Exadata Data Warehouse and Oracle EBS; published the views for business users to extract data into Spotfire for different purposes.
  • Extensively used various data cleansing and data conversion functions such as LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR, and IIF in Expression transformations (see the cleansing sketch after this list).
  • Responsible for best practices such as naming conventions, performance tuning, and error handling.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Used Address validator transformation in IDQ.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created partitioned tables, partitioned indexes for manageability and scalability of the application. Made use of Post-Session success and Post-Session failure commands in the session task to execute scripts needed for cleanup and update purposes.
  • Developed various mappings and performed tuning using Oracle and SQL*Plus in the ETL process.
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Created Use-Case Documents to explain and outline data behavior.
  • Worked with the Informatica Developer (IDQ) tool to ensure data quality for consumers.
  • Used the Address Validator transformation to validate customer addresses from various countries via the SOAP interface.
  • Created PL/SQL programs (procedures, functions, packages, and cursors) to extract data from the target system.
  • Performed analysis, design, and implementation of batch processing workflows using Cisco Tidal Enterprise Scheduler; monitored daily cycles.
  • Scheduled batch jobs through the Tidal Scheduler or the UNIX batch server to retrieve output files and send them to requesting parties.
  • Utilized Tidal Enterprise Scheduler functions to establish job streams with complex dependencies, manage intricate calendar schedules, and perform Tidal agent installations with specific deliverables to 50-plus application teams throughout the environment.
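
A minimal Oracle-flavored sketch of those cleansing and conversion functions as they read in SQL (inside the mappings they were applied in Expression transformations; ISNULL and IIF are the Informatica counterparts of the NVL and DECODE used here); the table and columns are hypothetical:

    -- Hypothetical cleansing expressions: trim padding, default NULLs,
    -- standardize codes, and convert text dates
    SELECT LTRIM(RTRIM(cust_name))                     AS cust_name,
           NVL(cust_city, 'UNKNOWN')                   AS cust_city,
           DECODE(gender_cd, 'M', 'MALE',
                             'F', 'FEMALE', 'UNKNOWN') AS gender,
           SUBSTR(phone_no, 1, 3)                      AS area_code,
           TO_DATE(birth_dt_txt, 'YYYYMMDD')           AS birth_dt
    FROM   stg_customer;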

Environment: Informatica Power Center 9.1, IDQ, SAS, Oracle 11g, UNIX, PL/SQL, SQL*Plus, SQL Server 2008 R2, SSIS, Toad, MS Excel 2007.

Confidential, Walnut Creek, CA

Informatica Developer

Responsibilities:

  • Visualized the data architecture design from high level to low level and designed performance objects for each level.
  • Troubleshot database issues related to performance, queries, and stored procedures.
  • Develop and maintain data marts on an enterprise data warehouse to support various defined dashboards such as Imperative for Quality (IQ) program.
  • Designed and developed data models and database architecture by translating abstract relationships into logical structures.
  • Proficient in defining Key Performance Indicators (KPIs), facts, dimensions, and hierarchies, and in developing Star and Snowflake schemas.
  • Designed and developed complex Informatica mappings, extensively using flat files, with expressions, aggregators, filters, lookups, and stored procedures to ensure movement of data between various applications.
  • Extracted data from source systems to a DataMart running on Teradata.
  • Worked on extracting data from legacy systems such as mainframes and Oracle into Teradata.
  • Performed source data analysis and data profiling for data warehouse projects.
  • Design and implement all stages of data life cycle. Maintain coding and naming standards.
  • Developed end-to-end ETL processes for the Trade Management Data Mart using Informatica.
  • Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
  • Extensively worked with Slowly Changing Dimensions Type1 and Type2.
  • Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and partitioned the tables in Oracle for optimal performance.
  • Developed reconciliation scripts to validate the data loaded in the tables as part of unit testing (see the reconciliation sketch after this list).
  • Prepared SQL queries to validate the data in both source and target databases.
  • Prepared scripts to email records that did not satisfy the business rules (error records) to the business users who uploaded them.
  • Prepared UNIX shell scripts to process the file uploads, one of the sources of the data, moving the uploads through different stages (Landing, Staging, and Target tables).
  • Worked on TOAD and Oracle SQL to develop queries and create procedures.
  • Created the mapping specification, workflow specification, and operations guide for the Informatica projects, and the MFT run book, as part of the end-user documentation.
  • Experience working in agile methodology and ability to manage change effectively.
  • Assigned work to onshore and offshore developers and was responsible for providing technical expertise for the design and execution of ETL projects.
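
A minimal sketch of such a reconciliation query, comparing row counts and amount totals between hypothetical staging and target tables for the current load date:

    -- Hypothetical source-vs-target reconciliation for today's load
    SELECT 'STAGING' AS side, COUNT(*) AS row_cnt, SUM(trade_amt) AS total_amt
    FROM   stg_trades
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(trade_amt)
    FROM   fact_trades
    WHERE  load_dt = TRUNC(SYSDATE);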

Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Power Exchange, SQL, SSIS, Oracle 11g, Flat Files, UNIX Shell Scripting

Confidential

DW Engineer

Responsibilities:

  • Implement procedures to maintain, monitor, backup and recovery operations for ETL environment.
  • Conduct ETL optimization, troubleshooting and debugging.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, re-usable transformations.
  • Wrote complex SQL overrides for Source Qualifiers and Lookups in mappings (see the override sketch after this list).
  • Exported and imported data from other data sources such as flat files using Import/Export through SSIS.
  • Created SSIS packages to load data from flat files to the Data Warehouse using Lookup, Derived Column, Sort, Aggregate, and Pivot transformations and Slowly Changing Dimension.
  • Designed and developed validation scripts based on business rules to check the Quality of data loaded into EBS.
  • Created Data Flow Diagrams in Microsoft Visio for the full run and the reprocess partial run of the workflows to be built in Informatica, taking the dependencies into account.
  • Implemented best practices in ETL design and development, with the ability to load data into highly normalized tables and star schemas.
  • Designed and developed mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator, Web services Consumer, XML Generator Transformations.
  • Wrote UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Stored reformatted data from relational, flat file, and XML sources using Informatica (ETL).
  • Developed mappings to load data into slowly changing dimensions.
  • Involved in design reviews, code reviews, and test reviews, and gave valuable suggestions.
  • Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
  • Responsible for the offshore code delivery and review process.
  • Used Informatica to extract data from DB2, XML, and flat files and load it into Teradata.
  • Prepared SQL queries to validate the data in both source and target databases.
  • Extracted data from various sources, transformed it, and loaded it into targets using Informatica.
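
A minimal sketch of a Lookup SQL override of that kind, restricting the cache to current rows; the names are hypothetical:

    -- Hypothetical Lookup SQL override: cache only current dimension rows
    SELECT customer_sk, cust_no, source_system
    FROM   dim_customer
    WHERE  current_flag = 'Y'
    -- the trailing comment marker suppresses the ORDER BY that
    -- Informatica appends to an overridden lookup query
    ORDER BY cust_no, source_system --
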
Environment: Informatica Power Center 8.6, Oracle 9i, DB2, Sybase, Rapid SQL, SSIS, Erwin, UNIX.
