
Etl Informatica Data Integration Specialist Resume


PROFESSIONAL SUMMARY:

  • 11+ years of IT experience in Design, Development, Maintenance, Enhancements and Production Support of Business Intelligence Enterprise Applications.
  • IT experience in various industries such as Aviation applications (Reservation Systems, Ticketing Applications, DCS - Departure Control Systems), Banking, HRIS (Human Resource Information Systems), Media and Entertainment, Social Networking, and US Healthcare (Medicare & Medicaid).
  • Experience in developing enterprise BI solution applications using Informatica PowerCenter (8.x, 9.x & 10.x), Informatica Cloud Services - IaaS/SaaS/PaaS (Cloud Application Integration, Cloud Data Integration), Oracle 10g/11g/12c, Netezza 7.x, MS SQL Server 2008/2012, DB2, Teradata, Salesforce, Windows Batch, PowerShell, Unix shell scripts & Python scripts.
  • AWS Certified Solutions Architect - Associate.
  • Experience in adhering to software methodologies such as Waterfall, Agile, Domain-Driven Design and BDD (Behaviour-Driven Development) in developing data warehousing applications.
  • Experience and good knowledge in designing ETL models (logical data models & physical data models) and in forward and reverse engineering using Erwin & Visio for multi-platform applications.
  • Profound experience with the Informatica PowerCenter tool, Informatica Cloud Services, PowerExchange, the Informatica DT Studio platform, data masking tools and the DVO tool, plus good knowledge at the Informatica administration level.
  • Experience and good understanding in remodelling the enterprise BI application from the existing data model to the Netezza Hot Appliance Model & experience in implementing the physical data model in the Netezza database.
  • Experience in supporting the Attunity "Click-2-Load" solution for data replication on the Netezza IBM PureData System for Analytics.
  • Experience in handling XML file technologies such as the Informatica XML Parser & XML Generator transformations.
  • Experience in using Salesforce Lookup, Salesforce Picklist and Salesforce Merge transformations for Salesforce data integration.
  • Experience in writing reusable UNIX scripts for Informatica jobs, used for data movement and transformation purposes (a representative wrapper sketch follows this summary).
  • Good knowledge and experience in performance tuning of ETL Informatica jobs using pushdown optimization (PDO) and Informatica partitioning.
  • Experience in writing database objects such as views, indexes, stored procedures and triggers for Oracle, MS SQL Server and Netezza databases; good knowledge of PL/SQL and hands-on experience in writing complex SQL queries.
  • Experience in writing Teradata BTEQ, FastLoad and MultiLoad scripts for ETL purposes.
  • Experienced in using Teradata Parallel Transporter (TPT) connections in Informatica mappings.
  • Experience in data integration to Salesforce.com CRM (SFDC) using Informatica Cloud Services (Data Replication Tasks, Data Synchronization Tasks & Mapping Configuration Tasks) as well as Informatica PowerCenter.
  • Experience in data integration to/from the NetSuite Cloud ERP application using ICS connectors such as the NetSuite Analytics connector, NetSuite File Cabinet connector & NetSuite ODBC connector.
  • Experience in implementing cloud data warehouse solutions such as Amazon Redshift using ICS jobs.
  • Experience in implementing cloud data integration solutions to the Azure Blob data store and Azure Data Lake from various source systems.
  • Experience in implementing cloud data integration solutions to Snowflake cloud DB systems using ICS jobs.
  • Hands-on experience with scheduling tools such as Autosys, Informatica Scheduler, Control-M and UC4 Manager for ETL jobs.
  • Hands-on experience working with HP QC (Quality Center), Perforce versioning, and BMC Remedy & ServiceNow ticketing and change request management applications.
  • Knowledge of Big Data technologies such as Hadoop (HDFS) and related technologies such as Hive, MongoDB, Pig and NoSQL databases.
  • Knowledge of cloud ETL technologies such as Amazon Redshift and CloverETL.
  • Experience in leading teams and acting as an offshore-onsite team coordinator.
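
The reusable UNIX scripting referenced above can be illustrated with the following minimal sketch of a wrapper that starts a PowerCenter workflow through pmcmd; the domain, service, user, folder and workflow names are placeholders rather than values from any engagement described in this resume.

#!/bin/sh
# Minimal sketch of a reusable wrapper that starts a PowerCenter workflow with pmcmd.
# Domain, service and user are illustrative placeholders; the password is expected in
# the PM_PASSWORD environment variable rather than being hard-coded here.
INFA_DOMAIN="Domain_Example"
INFA_SERVICE="IS_Example"
INFA_USER="etl_user"
FOLDER="$1"        # PowerCenter folder name, passed by the caller
WORKFLOW="$2"      # workflow name, passed by the caller

pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -p "$PM_PASSWORD" \
    -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

if [ "$RC" -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $RC" >&2
    exit "$RC"
fi
echo "Workflow $WORKFLOW completed successfully"

A scheduler such as Autosys or UC4 would typically call a wrapper of this kind with the folder and workflow names as arguments, so the same script serves every workflow.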

TECHNICAL SKILLS:

ETL Tool: Informatica PowerCenter 10.1/9.6.1/9.1/8.6/8.1, Informatica Power Exchange, Informatica B2B DT Studio, Informatica TDM, Informatica DVO & Informatica Cloud Services.

Cloud Applications/DB: Salesforce.com (SFDC), NetSuite Cloud Application, AWS Redshift, AWS S3 storage

ETL Scheduling Tool: Informatica Scheduler, Autosys, Control-M, UC4 Manager & Tidal.

Databases: Netezza 7.x, Oracle 12c/11g/10g, Teradata, MS SQL Server 2008/2012, DB2, Hive, MongoDB.

DB Tools: Toad, SQL Developer, WinSQL, Squirrel Client, Aginity & SQL Assistant.

Other Tools: VersionOne, HP QC, BMC Remedy, JIRA, ServiceNow & Perforce.

Operating Systems: Windows Family, Linux and Solaris.

PROFESSIONAL EXPERIENCE:

Confidential

ETL Informatica Data Integration Specialist

Responsibilities:

  • As an Integration Developer, responsible for collaborating with various application teams & business owners to design and build sustainable solutions that meet business requirements and add significant business value.
  • Responsible for creating the sprint stories to design, build & test integration solutions, following Agile methodology.
  • Responsible for executing the build, test and deployment of the solutions to ensure they meet the design specifications and, ultimately, the business requirements.
  • Responsible for implementing Informatica Cloud Services solutions between all cloud applications.
  • Responsible for Informatica PowerCenter solutions/implementations for data warehousing purposes.
  • Develops new mappings/modifies the existing mappings as per new systems and requirements using Informatica PC 10.1.
  • Develops Cloud Services tasks (Replication/Synchronization/Mapping Configuration) to load the data into Salesforce (SFDC) objects.
  • Creating real-time integration jobs for SFDC (data synchronization/mapping tasks, creating SF outbound messages & assigning them to SF workflows) for real-time data integration from SF objects to downstream systems.
  • Creating ICS jobs to integrate the NetSuite ERP cloud application and on-premise DBs for data integration purposes using the NetSuite Analytics connector and NetSuite File Cabinet connector.
  • Creating the jobs to load the data to the AWS S3 service (a brief sketch follows this list).
  • Creating new DW solutions using AWS Redshift for new enterprise solutions.
  • Developing JavaScripts to create the flat files from NetSuite SavedSearch results in the File Cabinet folder, and deploying & scheduling the scripts within the NetSuite application.
  • Creating ICS jobs to integrate the NetSuite ERP cloud application and on-premise DBs using the ODBC connector for bulk data movement.
  • Creating ICS jobs to integrate SAP systems and NetSuite systems using IDoc connectors (Read/Write).
  • Creating ICS jobs to integrate JIRA systems and NetSuite systems using ICS standard connectors (Read/Write).
  • Creating the ICS jobs to store the data files in Azure Blob storage.
  • Creating the ICS jobs to write the data into the Azure Data Lake store.
  • Creating ICS jobs to integrate on-premise systems such as SQL Server and Teradata for enterprise finance reporting solutions.
  • Creating ICS jobs for Smartsheet systems for file attachment uploads.
  • Supporting admin activities for ICS environments.
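
A minimal sketch of the S3 load step mentioned above, using the AWS CLI; the bucket, prefix and file naming are hypothetical placeholders rather than details of the actual jobs.

#!/bin/sh
# Sketch only: push a daily extract file to S3 with the AWS CLI.
# Bucket name, prefix and file naming convention are hypothetical placeholders.
EXTRACT_FILE="/data/outbound/finance_extract_$(date +%Y%m%d).csv"
S3_TARGET="s3://example-etl-staging/finance/"

if [ ! -f "$EXTRACT_FILE" ]; then
    echo "Extract file $EXTRACT_FILE not found" >&2
    exit 1
fi

# aws s3 cp returns a non-zero exit code on failure, so the calling job can flag the load.
aws s3 cp "$EXTRACT_FILE" "$S3_TARGET" || exit 1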

Environment: - Informatica PowerCenter 10.1, Teradata, SQL Server, Informatica Cloud Services, SFDC, Film Track, NetSuite, SAP, UNIX & JIRA systems.

Confidential

ETL Informatica Consultant - Application Specialist.

Responsibilities:

  • As an application specialist, responsible for collaborating with business owners to design and build solutions that meet the bank's business and financial requirements and add significant business value.
  • Responsible for converting & deriving the new requirements received from business analysts and technical stakeholders for the ETL Saber2 application with respect to the new Basel Committee recommendations for regulators.
  • Responsible for Informatica PC solutions/implementation.
  • Develops new mappings/modifies the existing solutions using Informatica PC v9.6.1.
  • Creates various flat files for the Python calculators.
  • Creates multilevel NZ DB views for business users to access/verify RWA values.
  • Created parameter files with global, mapping, session and workflow variables using UNIX scripts (a sketch follows this list).
  • Creates UNIX scripts for updating/modifying the flat files for the Python calculators.
  • Updating the existing Python scripts for MAT adjustment calculations on the datasets.
  • Unit testing and System Integration Testing.
  • Provides support for production jobs.
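
A minimal sketch of the kind of UNIX script used to build a PowerCenter parameter file with global, workflow and session sections; the folder, workflow, session and variable names are hypothetical and simply follow standard parameter-file syntax.

#!/bin/sh
# Sketch: generate a PowerCenter parameter file with global, workflow and session sections.
# The folder, workflow, session and variable names below are hypothetical placeholders.
PARM_FILE="/infa/parmfiles/wf_load_rwa.parm"
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARM_FILE" <<EOF
[Global]
\$PMFailureEmailUser=etl_support@example.com

[FIN_FOLDER.WF:wf_load_rwa]
\$\$LOAD_DATE=$RUN_DATE

[FIN_FOLDER.WF:wf_load_rwa.ST:s_m_load_rwa]
\$\$SRC_FILE_DIR=/data/inbound/calculators
EOF

echo "Parameter file written to $PARM_FILE"

Regenerating the file on each run lets date-driven mapping variables such as the load date be refreshed before the workflow starts.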

Environment: - Informatica PowerCenter 9.6.1, Teradata, Netezza, Python, VersionOne & UNIX.

Confidential

ETL Informatica Consultant / Informatica Cloud Services Developer.

Responsibilities:

  • As Informatica Cloud Architect, responsible for deriving the new requirements from business users and technical stakeholders for ETL applications with respect to new business approaches.
  • Responsible for preparing the ETL specification documents, HLD & DLD, for new requirements.
  • Responsible for ETL solutions in ICS as an Informatica Cloud consultant.
  • Develops new mappings using Informatica PC v10.1.
  • Migrates existing jobs (mappings/sessions/workflows) from the previous v9.6.1 to PC v10.1.
  • Develops Cloud Services tasks (Replication/Synchronization/Mapping Configuration) to load the data into Salesforce (SFDC) objects.
  • Creating real-time integration jobs for SFDC (data synchronization/mapping tasks, creating SF outbound messages & assigning them to SF workflows) for real-time data integration from SF objects to downstream systems.
  • Created user-defined functions (UDFs) in Informatica PowerCenter to derive system/application rules and apply them to the datasets.
  • Develops XML files for downstream systems sourced from the SFDC system using ICS.
  • Develops mappings for data integration using Web Services with WSDLs.
  • Using the different data masking functions to mask the production data and make it available to lower environments for test data purposes.
  • Created parameter files with global, mapping, session and workflow variables using UNIX scripts.
  • Responsible for user administration & maintaining the Informatica Cloud Services Secure Agent on a Unix server for the Dev/QA environments (a health-check sketch follows this list).
  • Unit testing and System Integration Testing.
  • Scheduling the Informatica Cloud Services jobs using the Informatica Cloud task scheduler.
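
A rough sketch of a Secure Agent health check on a Unix host; the install path, the process name being checked and the infaagent control-script invocation are assumptions made purely for illustration.

#!/bin/sh
# Rough sketch: check that the Informatica Cloud Secure Agent is running, restart it if not.
# The install path and the infaagent control script location are illustrative assumptions.
AGENT_HOME="/opt/infaagent/apps/agentcore"

if pgrep -f agentcore > /dev/null; then
    echo "Secure Agent process is running"
else
    echo "Secure Agent not running - attempting restart" >&2
    cd "$AGENT_HOME" && ./infaagent startup
fi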

Environment: - Informatica PowerCenter 9.6.1, Informatica Cloud Services, Salesforce.com, Oracle 11g, UNIX.

Confidential

ETL Consultant - Informatica Cloud Services.

Responsibilities:

  • As Informatica Cloud Architect, responsible for deriving the new requirements based on a business data-driven method for ETL applications with respect to CentrisDirect.
  • Responsible for ETL solutions in ICS as an Informatica consultant.
  • Responsible for preparing the technical documents (HLD, DLD) for Informatica ETL mapping development for new requirements.
  • Directly interacted with business users & the client user team for remodelling the existing legacy system to IC services.
  • Developing the Informatica mappings to incorporate the business logic using the Informatica Cloud Services tool.
  • Loading the data from source data files to the SFDC CRM product using ICS mappings.
  • Creating jobs (data synchronization/mapping tasks, creating SF outbound messages & assigning them to SF workflows) for real-time data integration with SFDC integration systems using ICS.
  • Standardizing, de-duplicating, merging & relating the master data in Salesforce using Informatica Cloud Customer 360 (cloud MDM) functionality to get one view of the information.
  • Responsible for administering & maintaining the Informatica Cloud Services Secure Agent on Windows Server for all 3 environments.
  • Created parameter files for mapping variables using Batch scripts & PowerShell scripts.
  • Managing user groups for developers & testers in ICS servers.
  • Unit testing and System Integration Testing with respect to CentrisDirect functionality.
  • Scheduling the Informatica Cloud Services jobs using the Informatica Cloud task scheduler.

Environment: - Informatica Cloud Services, Salesforce.com, Windows Server.

Confidential

Senior ETL Informatica Consultant/Project Associate.

Responsibilities:

  • Developing ETL solutions as an Informatica consultant for new enhancements & supporting daily data loads.
  • Preparing the technical requirements for Informatica ETL mapping development on the Informatica 9.6.1/Teradata/Unix platform.
  • Developing the new mappings & driving the team to complete the development on time as per requirements.
  • Developing the Informatica mappings (SCD Type 1/2) to incorporate the business logic using Informatica PC 9.6.1, Informatica DT Studio & Informatica IDQ Developer tools & to load the source data into the staging tables available in Oracle 11g.
  • Modifying the existing Teradata BTEQ/FastLoad/MultiLoad scripts & TPT jobs for load balancing the data movement across database environments.
  • Developing the reusable UNIX scripts for executing the above Teradata utility jobs for data load and extraction purposes (a sketch follows this list).
  • Preparing the test data using Informatica Test Data Manager (TDM) for the required individual tables using data masking techniques, as a trial application for the testing team.
  • Testing the new developments/enhancements using the Cognizant in-house Platinum testing tools for logical testing & data quality testing purposes.
  • Providing daily monitoring support for the existing Informatica job services.
  • Understanding the Oracle GoldenGate data integration application errors & supporting the Oracle DBA team in fixing the issues.
  • Providing support for the D&B Integration Manager web service applications for the monthly data restore applications.
  • Preparing the new SFTP scripts & supporting the existing applications - the FTP solutions on Tumbleweed internal/external servers.
  • Responsible for accepting ad hoc modifications/new enhancements from business users for existing & new applications & incorporating the changes without affecting the current functionality.
  • Responsible for improving/maximising the ETL jobs' performance by modifying the queries/mappings and partitioning the sessions using performance-tuning techniques.
  • Led both the offshore development team & the support team.
  • Responsible for monitoring the Daily/Weekly/Monthly jobs that are scheduled using UC4 Manager.
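
A minimal sketch of a reusable wrapper around a Teradata BTEQ export, of the kind referenced above; the TDPID, logon user, password variable and table/file names are hypothetical placeholders supplied by the calling job.

#!/bin/sh
# Sketch: reusable wrapper that exports a table to a flat file via Teradata BTEQ.
# TDPID and logon user are placeholders; the password is read from TD_PASSWORD,
# and the output file and source table are passed in by the caller.
TDP="tdprod"
OUT_FILE="$1"
SRC_TABLE="$2"

bteq <<EOF
.LOGON ${TDP}/etl_user,${TD_PASSWORD};
.EXPORT REPORT FILE=${OUT_FILE};
SELECT * FROM ${SRC_TABLE};
.EXPORT RESET
.LOGOFF;
.QUIT;
EOF

[ $? -eq 0 ] || { echo "BTEQ export of ${SRC_TABLE} failed" >&2; exit 1; }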

Environment: - Informatica 9.6.1 PC, Informatica IDQ Developer, Oracle 11g, Teradata, UC4 Manager, Toad, SQL Assistant, Windows 7 & UNIX.

Confidential

ETL Informatica - Netezza Developer

Responsibilities:

  • Remodelling the existing business logic into new Netezza models.
  • Understanding the existing SQL Server stored procedure logic and converting it into ETL requirements.
  • XML generation process - identifying the required NZ source tables from the remodelled NZ tables.
  • Creating the hybrid mappings for the XML generation process for different frequencies.
  • Created B2B DT mappings to read the data from PDF files that contain health rules to apply on the data sets.
  • Identifying the rulesets to apply on each client's info along with members' & providers' info.
  • Created user-defined functions in PowerCenter to apply the rulesets identified for the data.
  • Validating the data received, generating the XML files for each client and transferring them to the required third parties/downstream systems.
  • Modifying the generated XML files using an XML formatter/validator/beautifier as per business owners'/third parties' requirements.
  • Preparing the UNIX scripts for SFTP of the XML files to different vendors on external servers (a sketch follows this list).
  • Created parameter files with global, mapping, session and workflow variables using Unix scripts.
  • Created an SSIS package for loading validated provider information into the MS SQL Server database.
  • Unit testing and System Testing of mappings.
  • Scheduling the ETL jobs using the Control-M scheduler.
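
A minimal sketch of an SFTP transfer script for the generated XML files, assuming batch-mode sftp with key-based authentication; the host, user and directory names are hypothetical.

#!/bin/sh
# Sketch: transfer generated XML files to an external vendor server using batch-mode sftp.
# Host, user and directory names are hypothetical; key-based authentication is assumed.
XML_DIR="/data/outbound/xml"
VENDOR_HOST="sftp.vendor.example.com"
VENDOR_USER="healthfeed"

BATCH_FILE=$(mktemp)
{
    echo "cd /inbound/xml"
    for f in "$XML_DIR"/*.xml; do
        [ -f "$f" ] && echo "put $f"
    done
    echo "bye"
} > "$BATCH_FILE"

sftp -b "$BATCH_FILE" "${VENDOR_USER}@${VENDOR_HOST}"
RC=$?
rm -f "$BATCH_FILE"
exit "$RC"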

Environment: - Informatica 9.1 PC, Oracle 11g, Netezza 7.0, MS SQL Server, Autosys, Toad, WinSQL, XML Reader, Windows XP & UNIX.

Confidential

Senior ETL Informatica Developer

Responsibilities:

  • Understanding the existing Oracle 10g PL/SQL procedures & re-engineering the logic into Informatica ETL requirements.
  • Extracted source definitions from Oracle and flat files.
  • Developed PC mappings and workflows as per the new Netezza DB models.
  • Converting Oracle stored procedures into Type 1, 2 & 4 mappings as per new business requirements.
  • Loaded data from sources into Netezza tables using the NZLOAD utility & PowerCenter mappings, and further loaded it into HDFS files using UNIX scripts for Big Data analytics (a sketch follows this list).
  • Created reusable transformations and mapplets to use in multiple mappings.
  • Created user-defined functions in Informatica PC to derive system/application rules.
  • Created parameter files with global, mapping, session and workflow variables using UNIX scripts.
  • Experience in supporting the Attunity "Click-2-Load" solution for data replication on the Netezza IBM PureData System for Analytics.
  • Validating the data using the Informatica DVO (Data Validation Option) tool as part of testing.
  • Unit testing and System Testing.
  • Created reusable and non-reusable sessions and Email tasks for on-success/on-failure mails.
  • Scheduling the ETL jobs using the Informatica scheduler.
  • Monitoring the daily/weekly DW ETL workflows.
  • Fixing the issues occurring in the daily ETL load.
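
A minimal sketch of the NZLOAD and HDFS load steps described above; the host, database, table and path names are hypothetical placeholders, and the Netezza password is assumed to come from an environment variable.

#!/bin/sh
# Sketch: load a pipe-delimited extract into a Netezza table with nzload, then copy the
# same file to HDFS for Big Data analytics. Host, database, table and path names are
# hypothetical placeholders; the Netezza password is read from the NZ_PASSWORD variable.
DATA_FILE="/data/staging/customer_dim.dat"

nzload -host nzprod01 -db EDW -u etl_user -pw "$NZ_PASSWORD" \
    -t CUSTOMER_DIM -df "$DATA_FILE" -delim '|' || exit 1

# A Hadoop client is assumed to be available on the ETL host.
hdfs dfs -put -f "$DATA_FILE" /warehouse/landing/customer_dim/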

Environment: - Informatica 9.0.1 PC, Oracle 11g, Netezza 7.x, WinSQL, Toad, Windows XP and UNIX.

Confidential

Senior ETL Informatica Developer

Responsibilities:

  • Preparation of the ETL HLD & LLD documents, and publishing and reviewing the same with various teams such as modelling, testing and the end users.
  • Created mappings using Informatica PowerCenter Designer 8.6 as per the ETL specification document & implemented a reject report for finding the rejected records along with the reason for rejecting them in the ETL.
  • Developed Informatica mappings for SCD Types 1, 2, 3 & 4 with two levels of staging to cleanse data.
  • Creating the multi-group sources (multi-group Application Source Qualifier) using Informatica PowerExchange for mainframe sources.
  • Worked specifically with the Normalizer transformation, mapping the incoming fixed-width files to COBOL copybooks and using the Normalizer transformation to normalize the data.
  • Loading the ODS & ADS tables available in Oracle 10g using the above Informatica mappings.
  • Unit testing and System Integration Testing.
  • Developed reusable transformations and mapplets.
  • Designed complex mappings involving target load order and constraint-based loading.
  • Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
  • Created and fine-tuned sessions by collecting performance details on the Informatica server for a session, evaluated the performance details and identified the reasons for slow performance in the source and the target databases.
  • Extensively used various performance-tuning techniques like pipeline partitioning to improve session performance.
  • Created parameter files with global, mapping, session and workflow variables using UNIX scripts.
  • Scheduling the Informatica jobs using the Autosys tool on a weekly/monthly basis.
  • Involved in system integration testing support and responsible for driving it until SIT sign-off.
  • Led a team of three members for new enhancements and delivering the tested code.
  • Interacted with business owners in status meetings to understand their requirements for enhancing and fixing the current issues in the existing system, and came up with recommendations on improvements to the current applications.

Environment: - Informatica 8.6.1 PC & PX, Oracle 10g, Netezza, Autosys, Toad, Squirrel, Windows XP & UNIX.

Confidential

ETL Informatica Developer

Responsibilities:

  • Creating the source flat files using Basic scripts from Unisys mainframes.
  • Preparing the UNIX scripts for SFTP to transfer the source data files to the Informatica servers.
  • Creating mappings using Informatica PowerCenter Designer 8.1 as per the ETL specification document & creating the jobs for the mappings.
  • Extensively worked with Informatica Designer, Workflow Manager and Workflow Monitor.
  • Worked with Source Qualifier, Sorter, Aggregator, Expression, Joiner, Filter, Sequence Generator, Router, Update Strategy, Lookup and Normalizer transformations.
  • Created complex joiners and transformations of all types as needed to pass data smoothly through the ETL maps.
  • Implemented Type 1/2/3 slowly changing dimension (SCD) logic.
  • Loading the data from flat files into Oracle tables using Informatica mappings.
  • Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
  • Used TOAD & its utilities to evaluate SQL execution.
  • Created parameter files with global, mapping, session and workflow variables using UNIX scripts.
  • Scheduled the jobs using the Informatica scheduler.
  • Unit testing and System Testing.
  • Monitored the workflows using Workflow Monitor.

Environment: - Informatica 8.1, Oracle 9i, Unisys M/F, Toad, Windows XP & UNIX.
