Informatica Intelligent Cloud Developer (IICS) / Integration Engineer Resume
TX
SUMMARY
- Certified Informatica Cloud Data and Application Integration R34 Professional.
- 8 years of professional experience with expertise in Design, Development and Implementation of Data Warehouse applications and Database business systems.
- Strong working experience in all phases of development, including extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using IICS Informatica Cloud (CIH, CDI, CAI) and Power Center (Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor).
- Developed PowerShell scripts that support the smooth flow of files through processes in Cloud Application Integration (CAI) and Cloud Data Integration (CDI).
- Created numerous mappings, mapping tasks, and taskflows in CDI (Cloud Data Integration) based on requirements.
- 2+ years of experience in IICS Informatica Cloud ETL development using CIH, CDI, and CAI.
- 4+ years of experience in Power Center ETL development using Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor.
- Experience converting SSIS jobs to IICS jobs.
- Created processes, sub-processes, and reusable processes, service connectors, and app connections to connect to Oracle web services, read files from local and SFTP folders using file parsers, and write to target folders.
- Hands-on experience pulling data from ServiceNow and pulling and loading data to SharePoint.
- Created multiple taskflows to load data from different sources into Salesforce using the Salesforce connector with Data Synchronization and Mapping tasks, using the Bulk API and Standard API as required.
- Uploaded files to Oracle UCM using XML payloads by converting them to base64, then called submitESSJobRequest to load the interface tables, submitImportRequest to load the base tables, and getESSJobStatus to check job status in Oracle Cloud (see the first sketch after this summary).
- Good experience using Informatica Power Center 10.1/9.1 and Informatica Cloud for extraction, transformation, and loading.
- Good knowledge of HP Vertica platform architecture and SQL Server.
- Hands-on experience with HP Vertica SQL analytics and with loading and exporting data.
- Hands-on experience with ETL testing: wrote test cases to verify the correctness of ETL flows by comparing record counts between source and target and, for SCD Type 2 loads, checking that old records are deactivated and new records activated (see the validation sketch after this summary); created test documents and submitted them in meetings for code-deployment approval.
- Experience with Vertica table/projection data modeling, optimizing projection segmentation, and table partitioning.
- 5+ years of UNIX shell scripting experience developing wrapper scripts for ETL and load jobs (see the wrapper sketch after this summary), creating environment files, and building jobs and job streams (schedules) for daily runs.
- Deep SQL skills with Oracle, HP Vertica, SQL Server, and Teradata.
- Experienced with Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT).
- Extensively used Informatica Workflow manager and Workflow monitor for creating and monitoring workflows, Worklets and sessions.
- Experience in maintenance and enhancement of existing data extraction, cleansing, transformation, and loading processes to improve efficiency.
- Experience with relational and dimensional models using Facts and Dimensions tables.
- Experience working with various databases, including Teradata, Vertica, Oracle, MS SQL Server, and flat files.
- Worked with SQL Server stored procedures and experienced in loading data into data warehouses/data marts using Informatica, SQL*Loader, and Export/Import utilities.
- Experience in dimensional modeling using Star and Snowflake schemas.
- Experience in UNIX shell scripting and TWS, and good knowledge of Autosys for scheduling workflows.
- Experience in writing Test plans, Test cases, Unit testing, System testing, Integration testing and Functional testing.
- Expertise in documenting the ETL process, Source to Target Mapping specifications, status reports and meeting minutes.
- Very good understanding of ‘Versioning’ and ‘Deployment Groups’ concepts in Informatica 9.X. Worked extensively with versioned objects and deployment groups.
- Good experience in data analysis, error handling, error remediation and impact analysis.
- Experience in Agile and Waterfall methodologies.
- Versatile team player with excellent analytical, communication and presentation skills.
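A minimal sketch of the Oracle Cloud load sequence summarized above, assuming a curl-based SOAP call to the ERP Integration Service; the host, credentials, file name, and UCM document account are placeholders, and the submitESSJobRequest/getESSJobStatus calls reuse the same envelope pattern:

```bash
#!/bin/bash
# Placeholders throughout: host, credentials, file name, UCM document account.
HOST="https://erp.example.com"
B64=$(base64 -w0 suppliers.csv)        # file content converted to base64

# Illustrative SOAP envelope for the UCM upload
cat > upload.xml <<EOF
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:typ="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/">
  <soapenv:Body>
    <typ:uploadFileToUcm>
      <typ:document>
        <typ:Content>${B64}</typ:Content>
        <typ:FileName>suppliers.csv</typ:FileName>
        <typ:ContentType>csv</typ:ContentType>
        <typ:DocumentAccount>fin\$/payables\$/import\$</typ:DocumentAccount>
      </typ:document>
    </typ:uploadFileToUcm>
  </soapenv:Body>
</soapenv:Envelope>
EOF

# Upload to UCM; submitESSJobRequest and getESSJobStatus follow the same pattern
curl -s -u "$ERP_USER:$ERP_PASS" -H 'Content-Type: text/xml' \
     -d @upload.xml "$HOST/fscmService/ErpIntegrationService"
```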
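A minimal sketch of the SCD Type 2 validation queries mentioned above, run here through vsql; the schema, table, and column names (stg.customer, dw.dim_customer, is_current) are hypothetical:

```bash
#!/bin/bash
# Hypothetical connection details and table names.
vsql -h "$DB_HOST" -U "$DB_USER" -w "$DB_PASS" <<'SQL'
-- 1) Source and active-target row counts should match
SELECT (SELECT COUNT(*) FROM stg.customer)                           AS src_cnt,
       (SELECT COUNT(*) FROM dw.dim_customer WHERE is_current = 'Y') AS tgt_cnt;

-- 2) SCD Type 2: each business key must have exactly one active record
SELECT customer_id
FROM   dw.dim_customer
GROUP  BY customer_id
HAVING SUM(CASE WHEN is_current = 'Y' THEN 1 ELSE 0 END) <> 1;

-- 3) Deactivated records must carry an end date
SELECT COUNT(*)
FROM   dw.dim_customer
WHERE  is_current = 'N' AND effective_end_date IS NULL;
SQL
```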
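And a minimal wrapper-script sketch of the kind described above, assuming pmcmd is on the path and the domain, service, and folder names come from a sourced environment file (all paths and names are placeholders):

```bash
#!/bin/bash
# Placeholders: environment file path, workflow name, log location.
. /etl/env/prod.env                     # sets INFA_DOMAIN, INFA_SERVICE, etc.

WF="wf_daily_load"
pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PASS" -f "$INFA_FOLDER" -wait "$WF"
RC=$?

if [ $RC -ne 0 ]; then
    echo "$(date '+%F %T') $WF failed with rc=$RC" >> /etl/logs/wrapper.log
    exit $RC
fi
echo "$(date '+%F %T') $WF completed" >> /etl/logs/wrapper.log
```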
TECHNICAL SKILLS
Data Warehousing/ ETL: Informatica PowerCenter 9x/10X, Informatica Cloud
Programming Languages: SQL, Shell script
Platforms: MS-DOS, Windows 9x/2000/XP, Red Hat Linux 4.0 and 5.3
Databases: Oracle 9i/10g/11g, Teradata, MySQL, SQL Server, MariaDB, Vertica (vSQL)
Modeling Tools: Oracle JDeveloper, MS Visio, Rational Rose
Application Packages: SQL*Loader, Database Import and Export
Reporting Tools: JasperSoft iReports 1.3.2
IDE: Oracle SQL Developer, Quest Toad, PL/SQL Developer, Eclipse, Nexus, Web services
Version Control: Tortoise SVN, GitHub, Informatica Power Center
Application: Microsoft Office 97/2000/XP/2003/2007 and Acrobat Professional.
Data Modeling: Erwin, Toad Data Modeler, Microsoft Visio
Quality Management: JIRA, HP Quality Center (QC), Inbuilt defect tracking systems
Incident/Log Tracking: BMC Remedy, Splunk, Sumo Logic
PROFESSIONAL EXPERIENCE
Confidential, Irving, TX
Informatica Intelligent Cloud Developer (IICS) / Integration Engineer
Responsibilities:
- Understanding the business rules and sourcing the data from multiple source systems.
- Interacted with users and supported multiple projects for the HTH data integration team (suppliers to Oracle, invoices to Oracle AP, manual journal entries to Oracle AP, etc.).
- Extracted raw data from SharePoint, SQL Server, MySQL, and flat files to staging tables, and loaded data to topics in Cloud Integration Hub (CIH), flat files, SharePoint, and SQL Server using Informatica Cloud.
- Converted SSIS jobs to IICS jobs with the help of BRD documents and Visio flow charts.
- Refreshed the mappings for any changes/additions to source attributes.
- Created Swagger files for connections to pull data from ServiceNow in CDI and for CIH to run PUB and SUB.
- Developed the audit tables for all the cloud mappings.
- Loaded customer, teller-transaction, and account data to Salesforce, performing upserts keyed on an External ID (hash key) to maintain SCD Type 1 within CDI using taskflows and the Bulk and Standard APIs (see the hash-key sketch after this list).
- Integrated data to and from Salesforce, SharePoint, ServiceNow, SQL Server, Oracle, Vertica, Teradata, flat files, and IICS CIH topics (PUB and SUB).
- Automated/scheduled the cloud jobs to run daily with email notifications for any failures and timeout errors.
- Created file mass ingestion tasks to move data files between FTP, SFTP, and local folders, and database mass ingestion tasks to load data from on-premises databases to the Salesforce cloud.
- Monitored and improved the performance of Sources, Targets and Mappings
- Documented the Sources, Targets and Mappings.
- Created file parser connections for a large number of files and processes to set up dependencies between the cloud jobs and their sources.
- Created and developed mappings to load data from staging tables and files to EDW data mart tables and CIH topics, and from topics to files, based on the source-to-staging mapping design document.
- Implemented Type1, Type2, CDC and Incremental load strategies.
- Used Exceptional mappings to generate dynamic parameter files after the staging loads.
- Used Shared folders to create reusable components (Source, target, Mapplets, email tasks).
- Wrote PowerShell scripts to streamline the flow of files and the ETL process.
- Performed testing by using SQL queries to validate the data after loading.
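A minimal sketch of the hash-key idea behind the External ID upserts above; the server, table, and column names are hypothetical, and in CDI the equivalent MD5 computation would typically live in an Expression transformation rather than ad hoc SQL:

```bash
#!/bin/bash
# Hypothetical server, database, and columns; requires the sqlcmd client.
sqlcmd -S "$SQL_HOST" -d staging -Q "
SELECT customer_no,
       CONVERT(VARCHAR(32),
               HASHBYTES('MD5', CONCAT(customer_no, '|', branch_cd)), 2) AS external_id_hash
FROM   stg.customer;"
```

The hash of the concatenated natural keys serves as the Salesforce External ID, so reloading the same source row updates rather than duplicates the target record (SCD Type 1).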
Confidential, Agoura Hills, CA
Informatica Intelligent Cloud Developer (IICS)
Responsibilities:
- Understanding the business rules and sourcing the data from multiple source systems.
- Interacted with users and supported multiple projects for NVABI team (Pet Resorts, Animal Hospitals, Equine).
- Extracted raw data from SharePoint, SQL Server, MySQL, and flat files to staging tables using Informatica Cloud.
- Developed cloud mappings to extract the data for different regions (Australia, the UK, and America).
- Refreshed the mappings for any changes/additions to source attributes.
- Developed the audit tables for all the cloud mappings.
- Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
- Generated automated stats with the staging loads, comparing current-day record counts to previous-day counts (see the sketch after this list).
- Monitored and improved the performance of Sources, Targets and Mappings
- Documented the Sources, Targets and Mappings.
- Created Firewatcher jobs to set up the dependency between the cloud jobs and their sources.
- Created and developed mappings to load data from staging tables to EDW data mart tables based on the source-to-staging mapping design document.
- Implemented Type1, Type2, CDC and Incremental load strategies.
- Followed ETL standards - Audit activity, Job control tables and session validations.
- Used Exceptional mappings to generate dynamic parameter files after the staging loads.
- Used Shared folders to create reusable components (Source, target, email tasks).
- Wrote PowerShell scripts to streamline the flow of files and the ETL process.
- Used SQL queries to validate the data after loading.
- Maintained and automated the production database to retain history backup tables for 30 days.
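A minimal sketch of the day-over-day count comparison described above, assuming a hypothetical audit.load_stats table populated by each staging load:

```bash
#!/bin/bash
# Hypothetical server, database, and audit table.
sqlcmd -S "$SQL_HOST" -d edw -Q "
SELECT t.table_name,
       t.row_count               AS today_cnt,
       y.row_count               AS yesterday_cnt,
       t.row_count - y.row_count AS delta
FROM   audit.load_stats t
JOIN   audit.load_stats y
  ON   y.table_name = t.table_name
 AND   y.load_date  = DATEADD(day, -1, t.load_date)
WHERE  t.load_date = CAST(GETDATE() AS date);"
```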
Confidential, Dallas, TX
IICS Cloud Developer /Vertica/Informatica Developer
Responsibilities:
- Interacted with Subject Matter Experts (SMEs) to gather business requirements for the application through one-on-one interviews, and worked on developing the business rules for cleansing. Created Teradata and Vertica table DDL scripts using Erwin for the stage and target environments.
- Designed and developed HP Vertica anchor tables and projections; analyzed query logs and made corrections to projections.
- Developed HP Vertica vSQL scripts for bulk and delta loading of stage and target tables using IICS Cloud Data Integration (see the vSQL sketch after this list).
- Developed scripts to copy data between various Vertica environments.
- Created end-to-end ETL data lineage documents that include source, target, interface, and transformation details at the table level.
- Developed Interface design specifications to source data as well as sending extracts
- Worked with GoldenGate SDEF files, parameter files, and various replicat commands (ggsci start, stop, altseq, info, logdump) to develop and troubleshoot data replication.
- Sourced data from Oracle and flat files and loaded it into target tables as SCD Type 1, Type 2, and full refresh.
- Extracted data from a wide variety of data sources, such as flat files and relational databases (Oracle and SQL Server).
- Extensively used Informatica Power Center 9.6/9.1 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
- Used Informatica Power Connect for SAP to pull data from SAP R/3.
- Designed Data transformation from Transient to Staging, Fact and Dimension Tables in the Warehouse.
- Used transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, and Update Strategy to implement business logic in the mappings.
- Loaded files into Teradata tables using BTEQ, FastLoad, and MultiLoad on a daily basis (see the BTEQ sketch after this list).
- Exported data from Teradata tables to files using the FastExport connection in Informatica and sent them to destination teams.
- Created Global temporary tables and local temporary tables based on requirement.
- Created workflows, worklets, and Assignment, Decision, Event Wait, Event Raise, and Email tasks, and scheduled tasks and workflows based on client requirements.
- Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
- Created sessions, batches for incremental load into staging tables, and schedule them to run daily/weekly/monthly.
- Extensively worked on performance tuning to increase data-load throughput (e.g., reading from flat files and writing to target flat files to identify bottlenecks).
- Created UNIX shell and Wrapper Scripts to run the INFA Workflows
- Used Autosys and cronjobs in UNIX environment for scheduling routine tasks.
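A minimal vSQL bulk-load sketch of the kind described above; the host, schema, table, and file paths are placeholders:

```bash
#!/bin/bash
# Placeholders: connection details, stage table, data and reject file paths.
vsql -h "$VERTICA_HOST" -U "$VERTICA_USER" -w "$VERTICA_PASS" <<'SQL'
-- Bulk load the stage table, capturing rejected rows for review
COPY stg.orders
FROM LOCAL '/data/incoming/orders.csv'
DELIMITER ','
SKIP 1                                   -- header row
REJECTED DATA '/data/reject/orders.rej'
DIRECT;                                  -- write straight to ROS for large loads
SQL
```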
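And a minimal daily BTEQ load sketch; the logon string, file path, and table are placeholders (FastLoad/MultiLoad would follow an analogous scripted pattern for larger volumes):

```bash
#!/bin/bash
# Placeholders: TDPID, credentials, input file, stage table.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pass;
.IMPORT VARTEXT ',' FILE = /data/incoming/accounts.csv;
.QUIET ON
.REPEAT *
USING (acct_no VARCHAR(18), acct_name VARCHAR(60), open_dt VARCHAR(10))
INSERT INTO stg.accounts (acct_no, acct_name, open_dt)
VALUES (:acct_no, :acct_name, :open_dt);
.LOGOFF;
.QUIT;
EOF
```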
Environment: Informatica Power Center 9.1.0/8.6.1, Oracle 10g, SQL Server 2005/2008, Autosys, TFS, 8.3/8.4.1, UNIX, Windows XP/Windows 7
Confidential
Informatica /Sql Developer
Responsibilities:
- Worked on complete SDLC (Software Development Life Cycle) from Extraction, Transformation and Loading of data using Informatica.
- Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, Mapplets, transformations, re-usable transformations.
- Worked on the Installation and configuration of Informatica Power center 8.6 (Server Manager, Designer and Repository manager).
- Mentored team members in using new features of Informatica Power Center 8.6.1 and PowerExchange.
- Worked with Informatica PowerExchange tools to give business users on-demand access.
- Transferred the mainframe data using Direct connect.
- Worked on Source Analyzer, Warehouse Designer, Mapping &Mapplets Designer and Transformations, Informatica Repository Manager, Workflow Manager and Workflow Monitor.
- Used transformations such as Source Qualifier, Router, SQL, Transaction Control, Sequence Generator, and Expression per business requirements.
- Created Workflows with Worklets, event wait, decision box, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
- Used Workflow Manager to create and maintain sessions, and Workflow Monitor to monitor, edit, schedule, copy, abort, and delete sessions.
- Provided detailed technical, process, and support documentation, including daily process rollback, detailed specifications, and comprehensive project documents covering the workflows and their dependencies.
- Monitored and improved the performance of Sources, Targets and Mappings
- Documented the Sources, Targets and Mappings.
- Wrote shell scripts for file transfers and file renaming, and several other database scripts executed from UNIX (see the sketch after this list).
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
- Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
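A minimal sketch of the file transfer and renaming scripts described above; hosts and directory paths are placeholders:

```bash
#!/bin/bash
# Placeholders: source host, landing, load, and archive directories.
STAMP=$(date +%Y%m%d)

# Pull the extract from the source host, then date-stamp it
scp etl@src-host:/outbound/orders.dat /data/incoming/
mv /data/incoming/orders.dat "/data/incoming/orders_${STAMP}.dat"

# Hand the stamped file to the load directory and keep an archive copy
cp "/data/incoming/orders_${STAMP}.dat" /data/load/
mv "/data/incoming/orders_${STAMP}.dat" /data/archive/
```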
Environment: Informatica Power Center 8.6, Oracle 10g, PL/SQL, UNIX, PERL, Erwin, TOAD, SQL Server 2005, Mainframes (Cobol, VSAM, DB2, File Aid, Flat files).
Confidential
Informatica Developer/Sql Developer
Responsibilities:
- Involved in the design and development of the data warehouse environment; liaised with business users and technical teams, gathered requirement specification documents, and identified data sources, targets, and report-generation needs.
- Worked on Informatica Power Center tool - Source Analyzer, warehouse designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
- Used Informatica Designer to design mappings that populated data into the target star schema on an Oracle instance.
- Optimized Query Performance, Session Performance and Reliability.
- Extensively used Router, Lookup (Connected/Unconnected), Aggregator, Expression and Update Strategy Transformations.
- Tuning the Mappings for Optimum Performance, Dependencies and Batch design.
- Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
- Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
- Used Agile methodology throughout the project, participated in daily Scrum meetings, and worked with the business.
- Involved in writing UNIX shell scripts for Informatica ETL tool to run the sessions.
Environment: Informatica Power Center 7.x, Oracle 9i, PL/SQL, UNIX, Erwin, TOAD, SQL Server, Cognos, Agile.