
ETL Informatica Consultant Resume


Buffalo, NY

SUMMARY:

  • 7+ years of focused experience in Information Technology with a strong background in developing BI applications in various verticals such as Health, Insurance, Finance and Pharmacy.
  • Experience in Analysis, Design, and Development of various business applications on different platforms in data warehousing using Informatica 10/9.x/8.x/7.1, Oracle, Sybase, Teradata, and SQL Server.
  • Extensive experience in using Informatica tool for implementation of ETL methodology in Data Extraction, Transformation and Loading.
  • Used Informatica Power Center 10/9.x to Extract, Transform and Load data into a Netezza data warehouse from various sources such as Oracle and flat files.
  • Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Center, B2B Data Transformation, Informatica Data Quality, MDM, SSIS, and OBIEE.
  • Experience in Business Intelligence applications design, development and Maintenance of Integration Services (SSIS), SQL Server Reporting Services.
  • Extensive knowledge of relational and dimensional data modeling, star schema/snowflake schema, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
  • Experience in using Informatica Client Tools - Designer, Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor, Analyst, Developer tools.
  • Developed various mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator.
  • Experience in creating High Level Design and Detailed Design in the design phase.
  • Experience in integration of various data sources like Oracle, DB2, MS SQL Server and Flat Files.
  • Experience in building SSIS packages to extract, transform and load data (ETL); designed packages utilized for tasks and transformations such as Data Conversion and Pivot tables.
  • Extensive experience in requirements gathering, analysis, impact analysis, design, development, quality assurance and quality control, mainframe testing (unit testing, system testing, integration testing and UAT support), QA testing and implementation, project management, defect tracking, causal analysis of defects, and reporting project status.
  • Hands-on experience in SAS Base, SAS Macros, UNIX, SQL and Teradata, working with business users daily to resolve their technical queries.
  • Generated surrogate keys for composite attributes while loading data into dimension tables using the Surrogate Key Generator.
  • Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals and transformation tuning.
  • Supported MDM initiatives using Oracle databases and Informatica Data Quality (IDQ).
  • Experience in SQL and PL/SQL - Stored Procedures, Triggers, Packages and Functions.
  • Extracted data from multiple operational sources and loaded the staging area, data warehouse and data marts using CDC and SCD (Type 1/Type 2) loads (an illustrative SCD Type 2 sketch follows this summary).
  • Extensively involved in writing UNIX shell scripts.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Coordinated with offshore, onsite, QA, scheduling, change management, business, and inter-dependent teams.
  • Working knowledge of the test management tool HP ALM/QC.
  • Implemented performance-tuning techniques at application, database and system levels.
  • Skilled in developing Test Plans, Creating and Executing Test Cases.
  • Experience working in Scrum & agile methodology and ability to manage change effectively.
  • Excellent communication and interpersonal skills; quick to assimilate the latest technologies, concepts and ideas.
  • Create, maintain and monitor Tidal jobs for QNXT and for UNIX, SQL, FTP and SFTP operations; respond to requests and escalate to the appropriate support group when necessary.
  • Worked with various scheduling solutions including Control-M, Autosys, and Tidal Enterprise Scheduler.
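
A minimal SCD Type 2 sketch of the dimension-load pattern mentioned above, using assumed table and column names (DIM_CUSTOMER, STG_CUSTOMER, a DIM_CUSTOMER_SEQ sequence) rather than any client schema:

    -- Close out the current version when a tracked attribute has changed
    UPDATE dim_customer d
       SET d.effective_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, status,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

In an Informatica mapping this logic typically lives in a Lookup plus Update Strategy flow; the SQL above only illustrates the intended end state of the dimension table.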

TECHNICAL SKILLS:

ETL Tools/Data Modeling Tools: Informatica Power Center 10/9.x/8.x/7.1 (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), Power Exchange, IDQ, MDM, Informatica Cloud, MSBI, Erwin; fact and dimension tables, physical and logical data modeling, star join schema modeling.

Databases: Oracle 12c/11g/10g/9i, Teradata, MS SQL Server, MS Access, SQL, PL/SQL

Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Batch Scripting

Operating Systems: UNIX, Windows Server 2008/2003, Linux

Job Scheduling: Informatica Scheduler, Tidal Enterprise Scheduler, Control M, CA Autosys

Tools: Toad, SQL Developer, Visio, Teradata SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, Buffalo, NY

ETL Informatica Consultant

Responsibilities:

  • Provide debugging and troubleshooting support for SQL stored procedures, PowerShell scripts, triggers and Visualforce components.
  • Worked as a developer for the Claims and Members team to create multiple flat file and XML extracts for Dept. of Stores, SITES, VENDORS, and RAWITEMS.
  • Worked on DRIVE II to import multiple flat files from Dept. of Stores regarding RAWITEMS and loaded them into Oracle tables.
  • Played a major role in the Power Center upgrade from 9.6 to 10 and the Oracle upgrade from 11g to 12c.
  • Working with multiple Informatica repositories, version control, deployment groups, mapping and folder migrations.
  • Provided support to the OBIEE team in using EDW data and SQL to generate reports from multiple fact and dimension tables for reporting needs (a sample star-schema query follows this list).
  • Created standard application workbooks for each project that includes all the technical documentation and error handling procedures used by production support team.
  • Created and maintained EDW/ETL standards and procedure documents to keep all the mappings in sync.
  • Conducted unit tests and created test plans in conjunction with testing team to prevent production failures and to maintain data accuracy.
  • Made continuous improvements to the existing data warehouse by implementing design changes and database logic.
  • Expertise in creating all technical and functional requirements for various database projects.
  • Used AutoSys as the job scheduling software for receiving files; created automated file watchers and job dependencies and handled job failure tickets.
  • Performed robust testing, unit testing and negative testing before handing the data over to the QA team.
  • Proficient in designing and developing various transformation logic utilizing the Expression, Filter, Joiner, Lookup, Router and Rank transformations.
  • Fixed invalid mappings and troubleshot technical problems in the database.
  • Worked closely with department heads and strategic planning committees to formulate strategic plans and directions.
  • Experience in the documentation of specifications, creating step-by-step technical and user manuals with illustrations, and designing and implementing new processes and technology.
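
A hedged example of the kind of star-schema reporting SQL supplied to the OBIEE team; the fact and dimension names (FCT_CLAIMS, DIM_DATE, DIM_PRODUCT) and measures are illustrative assumptions, not the actual EDW model:

    -- Monthly claim totals by product category from one fact and two dimensions
    SELECT d.calendar_month,
           p.product_category,
           SUM(f.claim_amount) AS total_claim_amount,
           COUNT(*)            AS claim_count
      FROM fct_claims  f
      JOIN dim_date    d ON d.date_sk    = f.date_sk
      JOIN dim_product p ON p.product_sk = f.product_sk
     GROUP BY d.calendar_month, p.product_category
     ORDER BY d.calendar_month, p.product_category;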

MDM Developer:

  • Configured match rule set properties by enabling search by rules in MDM according to the business rules.
  • Used Informatica MDM Hub to run batch jobs.
  • Loaded data from the Informatica MDM landing tables to the stage tables.
  • Designed data hierarchy management for Informatica MDM.
  • Identified major data quality issues in MDM production.
  • Worked on match rule sets and match columns for the match and merge process, improving data quality through batch analysis.
  • Implemented the pre-land and land process of loading the dataset into Informatica MDM Hub.
  • Configured and documented the Informatica MDM Hub to perform loading, cleansing, matching, merging and publication of MDM data.
  • Executed batch jobs and batch job group using Batch Viewer.
  • Implemented parallelism for executing batch jobs using Merge Manager.
  • Performed Data Steward activities like manual merge/unmerge and uploading data.
  • Implemented auto merge and manual merge to arrive at the best version of truth (a conceptual sketch follows this list).
  • Created the data validation document, unit test case document, technical design document, migration request document and knowledge transfer document.
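
A conceptual SQL illustration of trust-based survivorship ("best version of truth"): the MDM Hub applies trust and match/merge rules internally, so this is not its implementation, only a sketch of the idea against an assumed consolidated staging table and source ranking:

    -- Keep one surviving record per customer, preferring the most trusted source
    SELECT customer_id, first_name, last_name, src_system
      FROM (SELECT c.*,
                   ROW_NUMBER() OVER (
                     PARTITION BY c.customer_id
                     ORDER BY CASE c.src_system
                                WHEN 'CRM'     THEN 1  -- most trusted source
                                WHEN 'BILLING' THEN 2
                                ELSE                3
                              END,
                              c.last_update_dt DESC) AS rn
              FROM stg_customer_all_sources c)
     WHERE rn = 1;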

Environment: Informatica Power Center 10/9.6, MDM, Oracle 12c/11g, SQL Server 2012, MS Access 2010, SQL*Loader, UNIX, WinSCP, Putty, Erwin 7.2, SQL, PL/SQL.

Confidential, RI

ETL / Informatica Cloud Developer

Responsibilities:

  • Create validation queries to support quality control procedures that ensure data integrity in Data Warehouse.
  • Develop mappings, mapping configurations, data synchronization and data replication tasks with Informatica Cloud Services.
  • Developed the audit activity for all the cloud mappings.
  • Automated/scheduled the cloud jobs to run daily, with email notifications for any failures.
  • Generated automated statistics for the staging loads, comparing present-day row counts against the previous day's counts (see the count-comparison sketch after this list).
  • Performed data loading from source to target using Informatica Cloud.
  • Maintained daily backups of all Salesforce organizations in SQL Server 2012.
  • Provide debugging and troubleshooting support for SQL stored procedures, PowerShell scripts, Salesforce workflows, triggers and Visualforce components.
  • Developed various SQL queries using joins, sub-queries & analytic functions to pull the data from various relational DBs i.e. Oracle, Teradata & SQL Server.
  • Created complex DataMart views for the corresponding products.
  • Created various complex PL/SQL stored procedures to manipulate/reconcile the data and generate the dashboard reports.
  • Performed unit testing and prepared the deployment plan for the various objects by analyzing their interdependencies.
  • Worked as a part of AWS build team.
  • Created, configured and managed S3 buckets (storage).
  • Experience with AWS EC2, EMR and CloudWatch.
  • Developed several UNIX shell scripts for the files Archival & Compression.
  • Created various AutoSys jobs for the scheduling of the underlying ETL flows.
  • Co-ordinated with various team members across the globe i.e. Application teams, Business Analysts, Users, DBA and Infrastructure team to resolve any technical and functional issues in UAT and PROD.
  • Created various technical documents required for the knowledge transition of the application, covering re-usable objects (Informatica and UNIX).
  • Handled all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Loading data from large data files into Hive tables.
  • Importing and exporting data into HDFS and Hive using Sqoop.
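
A sketch of the day-over-day load statistics check described above, assuming a hypothetical ETL_LOAD_STATS audit table (TABLE_NAME, LOAD_DT, ROW_CNT) populated by the cloud jobs:

    -- Flag staging tables whose row count moved more than 10% versus the prior day
    SELECT cur.table_name,
           cur.row_cnt                AS today_cnt,
           prev.row_cnt               AS prior_cnt,
           cur.row_cnt - prev.row_cnt AS delta
      FROM etl_load_stats cur
      JOIN etl_load_stats prev
        ON prev.table_name = cur.table_name
       AND prev.load_dt    = cur.load_dt - 1
     WHERE cur.load_dt = TRUNC(SYSDATE)
       AND ABS(cur.row_cnt - prev.row_cnt) > 0.1 * prev.row_cnt;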

Environment: Informatica Cloud, Power Center 9.6.1, IDQ, Oracle 11g, SQL Server 2012, MS Access 2010, SQL*Loader, UNIX, WinSCP, Putty, Erwin 7.2, SQL, PL/SQL.

Confidential, TX

Sr. Informatica Developer

Responsibilities:

  • Extensively used SSIS Import/Export Wizard, for performing the ETL operations.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Resolved issues and Remedy tickets on a priority basis.
  • Installed Informatica Power Center 9.1 and 9.5.
  • Deployed code across the DEV, TST and PRD regions.
  • Creating new users and groups with required privileges at the repository level.
  • Documenting the Business rules and requirements.
  • Managed the various disk-space and high-volume alerts.
  • Scheduling the workflows from the Informatica Scheduler and monitoring the runs.
  • Built the dimension and fact table load processes and the reporting process using Informatica.
  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Worked on building and deploying Java code through Jenkins, used the Jenkins AWS CodeDeploy plugin to deploy to AWS, and worked on AWS cloud management.
  • Configured Elastic Load Balancers (ELB) with EC2 Auto Scaling groups; created monitors, alarms and notifications for EC2 hosts using CloudWatch.
  • Launched and configured Amazon EC2 (AWS) cloud servers using AMIs (Linux) and configured the servers for specified applications.
  • Used AWS to deploy the project on EC2 instances; implemented the business logic layer for MongoDB services.
  • Extracted data from various data sources such as Oracle, SQL Server, Flat files and transformed and loaded into targets using Informatica.
  • Created mappings using transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address Validator.
  • Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (see the incremental-extract sketch after this list).
  • Worked with Informatica Data Quality 9.6 (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 9.6.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and score cards for the users using IDQ.
  • Extensively worked on Data Cleansing, Data Analysis and Monitoring Using Data Quality.
  • Involved in the performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
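
A minimal sketch of the incremental (delta) extraction referenced above, using assumed SRC_ORDERS and ETL_LOAD_CONTROL tables; in a Power Center mapping the watermark would normally be supplied as a mapping parameter to the Source Qualifier filter:

    -- Pull only the rows changed since the last successful extract
    SELECT o.order_id, o.customer_id, o.order_amt, o.last_update_dt
      FROM src_orders o
     WHERE o.last_update_dt > (SELECT c.last_extract_dt
                                 FROM etl_load_control c
                                WHERE c.source_name = 'SRC_ORDERS');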

Environment: SQL Server 2005 Enterprise Edition, Azure, SSIS, T-SQL, SSRS, Windows Server 2003, Oracle 9i, Power BI

Confidential, TX

ETL Developer

Responsibilities:

  • Participated in requirement meetings with Business Analysts to understand analytical content needs.
  • Assisted in documenting these requirements, resolving ambiguities and conflicts, and ensuring requirements are complete.
  • Upgraded Informatica Power Center 9.1 to version 9.6.1 on Linux servers.
  • Installed and configured the Informatica DR environment platform.
  • Monitored resource utilization and proactively identified Informatica job performance and resource issues (network, disk usage, memory usage, server-side limitations).
  • Created shell scripts to automate backups and ETL execution. Worked on managing alerts for server failures using SMTP configuration.
  • Monitor and communicate the availability of UNIX and Informatica Environment. Follow-up on problem tickets.
  • Performed tuning of Informatica Mappings for optimum performance.
  • Extensive ETL experience using DTS/SSIS for Data Extractions, Transformations and Loads.
  • Design database table structures for transactional and data sources.
  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Initiate the data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
  • Created mappings, workflows and applications using Informatica BDE Developer.
  • Extensively used ETL methodology for supporting Data Extraction, transformations and loading processing, using Informatica BDE.
  • Worked on Control M to run Informatica and Hadoop jobs parallel.
  • Monitored mappings, workflows and applications using Informatica BDE Monitor.
  • Experience in integrating databases such as MongoDB and MySQL with web pages built in HTML, PHP and CSS to update, insert, delete and retrieve data with simple ad-hoc queries.
  • Extracted data from various data sources such as Oracle, SQL Server, Flat files, transformed, and loaded into targets using Informatica.
  • Created users, groups and gave read/write permissions on the respective Folders of repository.
  • Designed mappings that include restart logic (see the load-control sketch after this list).
  • Created mappings and used transformations such as Source Qualifier, Filter, Update Strategy, Lookup, Expression, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address Validator.
  • Created batch jobs using SAP Best Practices jobs to pull data from heterogeneous sources and create load files, which are used by LSMW/custom programs to load into SAP; created the data stores for the database.
  • Designed packages utilized for tasks and transformations such as Execute SQL Task, mapping the right and required data from source to destination, Data Flow Task, Data Conversion, and Foreach Loop Container.
  • Expert in using the workflow canvas for multiple transformations of data and in the use of features such as variables, package configurations, event handlers, error logging and checkpoints to implement various complex ETL logic and specifications.
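
A hedged sketch of the restart logic mentioned above: an assumed ETL_LOAD_CONTROL watermark table whose last-extract date only advances after a successful run, so a restarted workflow re-reads the same window instead of skipping data:

    -- Minimal load-control table supporting restartable loads (assumed design)
    CREATE TABLE etl_load_control (
        source_name     VARCHAR2(60) PRIMARY KEY,
        last_extract_dt DATE         NOT NULL,
        last_status     VARCHAR2(10) NOT NULL  -- RUNNING / SUCCESS / FAILED
    );

    -- Advance the watermark only after the load completes successfully
    UPDATE etl_load_control
       SET last_extract_dt = TRUNC(SYSDATE),
           last_status     = 'SUCCESS'
     WHERE source_name = 'SRC_ORDERS';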

Environment: Informatica Power Center 9.5/9.1, IDQ, SQL Server, UNIX, Toad, SQL Developer, SSIS, Putty, SFTP/FTP

Confidential, TX

ETL Developer

Responsibilities:

  • Interacted with users and made changes to Informatica mappings according to the business requirements.
  • Experienced in Database programming for Data Warehouses (Schemas), proficient in dimensional modeling (Star Schema modeling, and Snowflake modeling).
  • Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers and data flow management into multiple targets using Router.
  • Designed the ETL/Data Integration engine by meeting the highest standards for Data quality performance, scalability and modularity using the BDM developer, Data Profiling tools and scripting.
  • Used Informatica Analyst for data profiling and scorecarding to bring data quality issues upfront, developed Informatica BDM jobs/workflows to operationalize data delivery to data scientists, and worked with Netezza/Teradata databases to implement data cleanup and performance-tuning techniques.
  • Created dynamic BDM mappings to take care of dynamic schemas on read and write.
  • Experience with Netezza Database integration with Informatica and load processes with Netezza Bulk load utilities like Netezza Bulk read and write.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Lookup, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
  • Generated unique keys for composite attributes while loading data into the data warehouse.
  • Created views over Cisco Data Virtualization (Composite) data sources from the Exadata data warehouse and Oracle EBS, and published the views for business users to extract data into Spotfire for different purposes.
  • Extensively used data cleansing and data conversion functions such as LTRIM, RTRIM, ISNULL, IS_DATE, TO_DATE, DECODE, SUBSTR, INSTR and IIF in Expression transformations (SQL equivalents are sketched after this list).
  • Responsible for best practices like naming conventions, Performance tuning, and Error Handling.
  • Responsible for Performance Tuning at the Source level, Target level, Mapping Level and Session Level.
  • Created partitioned tables, partitioned indexes for manageability and scalability of the application. Made use of Post-Session success and Post-Session failure commands in the session task to execute scripts needed for cleanup and update purposes.
  • Created PL/SQL programs such as procedures, functions, packages and cursors to extract data from the target system.
  • Performed analysis, design and implementation of batch processing workflows using Cisco Tidal Enterprise Scheduler and monitored daily cycles.
  • Scheduled batch jobs through Tidal Scheduler or the Unix Batch server to retrieve output files successfully to be sent to requesting parties.
  • Utilized Tidal Enterprise Scheduler functions to establish job streams with complex dependencies, manage intricate calendar schedules, and perform Tidal agent installations with specific deliverables for 50-plus application teams throughout the environment.
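
SQL equivalents of the Expression-transformation cleansing logic listed above, written against an assumed STG_CUSTOMER_RAW staging table (column names are illustrative):

    -- Typical trim / default / type-conversion / decode cleansing in one pass
    SELECT LTRIM(RTRIM(cust_name))                      AS cust_name,
           NVL(LTRIM(RTRIM(addr_line1)), 'UNKNOWN')     AS addr_line1,
           TO_DATE(birth_dt_txt, 'YYYYMMDD')            AS birth_dt,
           DECODE(gender_cd, 'M', 'Male',
                             'F', 'Female',
                                  'Unknown')            AS gender,
           SUBSTR(phone_txt, INSTR(phone_txt, '-') + 1) AS phone_suffix
      FROM stg_customer_raw;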

Environment: Informatica Power Center 9.1, IDQ, SAS, Oracle 11g, UNIX, PLSQL, SQL* PLUS, SQL SERVER 2008 R2, SSIS, TOAD, MS Excel 2007

Confidential

Informatica Developer

Responsibilities:

  • Visualized the data architecture design from high level to low level and designed performance objects for each level.
  • Troubleshot database issues related to performance, queries and stored procedures.
  • Designed and developed data models and database architecture by translating abstract relationships into logical structures.
  • Proficient in defining Key Performance Metrics (KPIs), facts, dimensions, hierarchies and developing Star and Snow Flake schemas.
  • Extensively used flat files; designed and developed complex Informatica mappings using Expression, Aggregator, Filter, Lookup and Stored Procedure transformations to ensure movement of data between various applications.
  • Worked on SSIS script task, look up transformations and data flow tasks using T-SQL and Visual Basic (VB) scripts.
  • Extracted data from source systems to a Data Mart running on Teradata.
  • Worked on extracting data from legacy systems such as mainframes and Oracle into Teradata.
  • Source data analysis and data profiling for data warehouse projects.
  • Design and implement all stages of data life cycle. Maintain coding and naming standards.
  • Developed end-to-end ETL processes for Trade Management Data Mart Using Informatica.
  • Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
  • Extensively worked with Slowly Changing Dimensions Type1 and Type2.
  • Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Oracle for optimal performance.
  • Developed reconciliation scripts to validate the data loaded into the tables as part of unit testing (see the reconciliation sketch after this list).
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Prepared scripts to email records that did not satisfy the business rules (error records) to the business users who uploaded them.
  • Prepared UNIX shell scripts to process file uploads, one of the sources for the data, moving the uploads through the different stages (landing, staging and target tables).
  • Created the mapping specification, workflow specification and operations guide for the Informatica projects, and the MFT run book, as part of the end-user documentation.
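
A sketch of the reconciliation approach noted above, comparing daily counts and totals between an assumed staging table and fact table (names are illustrative, not the Trade Management model):

    -- Report only the days where source and target counts or amounts disagree
    SELECT NVL(s.trade_dt, t.trade_dt) AS trade_dt,
           s.src_cnt, t.tgt_cnt, s.src_amt, t.tgt_amt
      FROM (SELECT trade_dt, COUNT(*) AS src_cnt, SUM(trade_amt) AS src_amt
              FROM stg_trades GROUP BY trade_dt) s
      FULL OUTER JOIN
           (SELECT trade_dt, COUNT(*) AS tgt_cnt, SUM(trade_amt) AS tgt_amt
              FROM fct_trades GROUP BY trade_dt) t
        ON t.trade_dt = s.trade_dt
     WHERE NVL(s.src_cnt, -1) <> NVL(t.tgt_cnt, -1)
        OR NVL(s.src_amt, -1) <> NVL(t.tgt_amt, -1);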

Environment: Informatica Power Center 8.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Power Exchange, SQL, SSIS, Oracle 11g, Flat Files, UNIX Shell Scripting
