Informatica PowerCenter/IICS Developer Resume
McKinney, TX
SUMMARY
- 7+ years of experience in data integration and data warehousing using Informatica PowerCenter 10.2/9.6/9.1/8.6, Informatica PowerExchange 10.2/9.6, and Informatica Intelligent Cloud Services (IICS)
- Experience working with the Data Integration module in Informatica Intelligent Cloud Services.
- Experience working with cloud-based database solutions including Azure Synapse, Azure Data Lake Store, AWS Redshift, and Snowflake
- Experience working with traditional on-premises databases including Oracle, SQL Server, and Teradata
- Experience working on an ETL conversion project from SSIS to Informatica Cloud
- Expert in all phases of the Software Development Life Cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, and post-implementation support and maintenance
- Worked with different non-relational data sources such as flat files, XML files, and mainframe files.
- Worked extensively with Data migration, Data Cleansing, Extraction, Transformation and Loading of data from Multiple Sources to Data Warehouse
- Instrumental in establishing ETL naming standards and best practices throughout the ETL process (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports)
- Resolved Informatica performance tuning issues at the source, target, mapping, transformation, and session levels, and fine-tuned transformations to improve session performance.
- Experience in implementing complex business rules by creating reusable transformations, complex mapplets and mappings, and PL/SQL stored procedures and triggers
- Experience in creating ETL design documents; strong experience with complex PL/SQL packages, functions, cursors, indexes, views, and materialized views.
- Excellent communication, presentation, and project management skills; a team player and self-starter with the ability to work both independently and as part of a team.
- Extensive experience in UNIX Shell Scripting, AWK and file manipulation techniques
- Demonstrated ability in defining project goals and objectives, prioritizing tasks, developing project plans, and providing a framework for effective communication while maximizing responsiveness to change.
- Possess experience in working on concurrent projects in very demanding and high-pressure situations.
TECHNICAL SKILLS
- Star Schema
- Snowflake Schema
- Informatica PowerCenter 10.2, 9.6.1, 9.1.1, 8.6
- Informatica PowerExchange 10.2/9.6.1
- Informatica Intelligent Cloud Services (IICS)
- Oracle 11g, 10g, 9i, 8i
- Microsoft SQL Server 2008/2012/2016
- Teradata V13
- Snowflake
- Azure SQL Data Warehouse (Azure Synapse)
- Azure SQL Database
- AWS Redshift
- Windows Batch
- Unix Shell
- Tivoli (TWS)
- Control M
- Informatica Scheduler
- Agile & Waterfall
PROFESSIONAL EXPERIENCE
Confidential, McKinney, TX
Informatica PowerCenter/IICS Developer
Responsibilities:
- Developed Data Integration Platform components/processes using Informatica Cloud Platform, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Blob Storage technologies
- Analyzed the existing ETL data warehouse processes and ERP/non-ERP application interfaces and created design specifications based on the new target cloud data warehouse (Azure Synapse) and Data Lake Store
- Created ETL and data warehouse standards documents covering naming standards, ETL methodologies and strategies, standard input file formats, and data cleansing and preprocessing strategies.
- Created mapping documents with detailed source-to-target transformation logic and source and target column information.
- Designed, developed, and implemented ETL processes using IICS Data Integration
- Created IICS connections using various cloud connectors in IICS administrator.
- Extensively used performance tuning techniques while loading data into Azure Synapse using IICS
- Extensively used cloud transformations - Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, Union Transformations
- Extensively used cloud connectors: Azure Synapse (SQL DW), Azure Data Lake Store V3, Azure Blob Storage, Oracle, Oracle CDC, and SQL Server
- Converted legacy SSIS packages into Informatica mappings.
- Analyzed SSIS code for the Informatica conversion, reviewed the design solution with the team lead, and developed the source-to-target mapping (STM) based on the approved solution.
- Developed parameterized Cloud Data Integration mapping templates (database and table object parameterization) for stage, dimension (SCD Type 1, SCD Type 2, CDC, and incremental load), and fact load processes (see the SQL sketch at the end of this list)
- Extensively used parameters (input and in-out parameters), expression macros, and source partitioning
- Extensively used the Pushdown Optimization option to push transformation processing down to Azure Synapse (SQL DW) and take advantage of its compute capacity
- Extracted data from Snowflake to push the data into Azure warehouse instance to support reporting requirements.
- Performed loads into Snowflake instance using Snowflake connector in IICS for a separate project to support data analytics and insight use case for Sales team.
- Developed a CDC load process for moving data from PeopleSoft to the SQL data warehouse using Informatica Cloud CDC for Oracle
- Developed complex Informatica Cloud taskflows (parallel) orchestrating multiple mapping tasks and sub-taskflows.
- Developed Mass Ingestion tasks to ingest large datasets from on-premises sources into Azure Data Lake Store (file ingestion)
- Designed a data integration audit framework in Azure SQL DW to track data loads, support data platform workload management, and produce automated reports for SOX compliance (see the audit-table sketch after the Environment line).
- Worked with a team of 4 onshore and 6 offshore developers and prioritized project tasks.
- Involved in the development, unit testing, SIT, and UAT phases of the project.
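The dimension-load templates referenced above follow a standard SCD Type 2 pattern: expire the current dimension row when tracked attributes change, then insert a new current row. A minimal sketch of that set-based logic in generic T-SQL is shown below; stg.customer, dim.customer, and all column names are hypothetical placeholders, not the actual project schema, and the real loads were built as parameterized IICS mappings rather than hand-written SQL.

```sql
-- Step 1 (hypothetical tables): expire the current dimension row when a tracked attribute changed.
UPDATE dim.customer
SET    effective_end_date = CAST(GETDATE() AS DATE),
       is_current         = 0
WHERE  is_current = 1
  AND  EXISTS (SELECT 1
               FROM   stg.customer s
               WHERE  s.customer_id = dim.customer.customer_id
                 AND (s.customer_name    <> dim.customer.customer_name
                  OR  s.customer_segment <> dim.customer.customer_segment));

-- Step 2: insert a new current row for changed customers (expired above) and brand-new customers.
INSERT INTO dim.customer (customer_id, customer_name, customer_segment,
                          effective_start_date, effective_end_date, is_current)
SELECT s.customer_id, s.customer_name, s.customer_segment,
       CAST(GETDATE() AS DATE), CAST('9999-12-31' AS DATE), 1
FROM   stg.customer s
LEFT JOIN dim.customer d
       ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE  d.customer_id IS NULL;
```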
Environment: Informatica Intelligent Cloud Services, Informatica PowerCenter 10.2, Informatica PowerExchange 10.2, SSIS, Windows Secure Agent, Teradata v13.10, Azure Synapse (Azure SQL DW), Azure Data Lake Store, Azure SQL Database, Power BI reporting
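A minimal sketch of the kind of audit table such a framework could be built around is shown below, in generic T-SQL. The etl.load_audit name and its columns are illustrative assumptions rather than the project's actual schema, and a real Azure Synapse table would also specify distribution options.

```sql
-- Hypothetical audit table: one row per mapping-task execution, written at the start and end of each load.
CREATE TABLE etl.load_audit (
    audit_id       BIGINT IDENTITY(1,1) NOT NULL,
    workflow_name  VARCHAR(200) NOT NULL,
    target_table   VARCHAR(200) NOT NULL,
    load_start_ts  DATETIME2    NOT NULL,
    load_end_ts    DATETIME2    NULL,
    rows_read      BIGINT       NULL,
    rows_loaded    BIGINT       NULL,
    load_status    VARCHAR(20)  NOT NULL  -- e.g. RUNNING, SUCCEEDED, FAILED
);

-- Example entry recorded when a load completes; in practice the values come from the taskflow.
INSERT INTO etl.load_audit (workflow_name, target_table, load_start_ts, load_end_ts,
                            rows_read, rows_loaded, load_status)
VALUES ('tf_dim_customer_daily', 'dim.customer',
        '2021-06-01 02:00:00', '2021-06-01 02:07:15', 125000, 124998, 'SUCCEEDED');
```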
Confidential, Southfield MI
Informatica Developer
Responsibilities:
- Created high-level documents, source-to-target mapping documents, and detailed design documents for the entire ETL process.
- Extracted and loaded data from and into diverse source/target systems including Oracle, SQL Server, Salesforce, XML, and flat files
- Extracted data from various on-premises systems and pushed it to AWS Redshift using Informatica PowerCenter, which in turn fed analytics use cases.
- The project used most of the PowerCenter transformations, including Transaction Control, active and passive Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Unstructured Data, SQL transformation, and more.
- Implemented incremental/delta loads extensively using mapping variables, mapping parameters, and a parameter table.
- Wrote ETL code to support full loads for the initial run and incremental/delta loads for subsequent daily runs.
- Developed mappings to load data into the landing, staging, and publish layers with extensive use of the SCD Type I and SCD Type II development concepts.
- Developed SCD Type I and Type II loads using the MD5 hash function for change detection (see the SQL sketch after this list).
- Hands on experience working on profiling data using IDQ.
- Extracted data from and loaded data directly into Salesforce objects using Informatica PowerCenter.
- Worked with various session properties to extract data from Salesforce objects using the standard API and Bulk API.
- Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
- Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
- Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer
- Loaded data from unstructured file formats into the Oracle database using the Unstructured Data transformation.
- Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
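The MD5-based change detection mentioned above is typically computed in an Expression transformation (MD5 over the concatenated tracked columns) and compared against the hash stored on the target row. The sketch below shows the equivalent comparison in T-SQL using HASHBYTES; stg.account, dim.account, and the md5_hash column (stored as VARBINARY(16)) are hypothetical placeholders, not the actual project schema.

```sql
-- Flag new accounts and accounts whose tracked attributes changed since the last load.
SELECT s.account_id,
       s.account_name,
       s.account_status,
       HASHBYTES('MD5', CONCAT(s.account_name, '|', s.account_status)) AS new_md5_hash
FROM   stg.account s
LEFT JOIN dim.account d
       ON d.account_id = s.account_id AND d.is_current = 1
WHERE  d.account_id IS NULL                                                            -- new account
   OR  d.md5_hash <> HASHBYTES('MD5', CONCAT(s.account_name, '|', s.account_status));  -- changed account
```

Rows returned by such a query then drive either a Type 1 overwrite or a Type 2 expire-and-insert, depending on the target dimension.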
Confidential, Sacramento CA
Informatica PowerCenter Developer
Responsibilities:
- Developed a standard ETL framework to enable the reuse of similar logic across the board; involved in system documentation of data flow and methodology
- Extensively developed low-level designs (mapping documents) based on analysis of the different source systems
- Designed complex mappings, sessions and workflows in Informatica PowerCenter to interact with MDM and EDW.
- Designed and developed mappings to implement full/incremental loads from source systems (see the SQL sketch after this list).
- Designed and developed mappings to implement Type 1/Type 2 loads.
- Responsible for ETL requirement gathering and development with end-to-end support.
- Responsible for coordinating the DB changes required for ETL code development.
- Responsible for ETL code migration, DB code changes, and scripting changes to higher environments
- Responsible for supporting the code in the production and QA environments.
- Developed complex IDQ rules which can be used in Batch Mode.
- Developed an Address Validator transformation in IDQ to be invoked from Informatica PowerCenter mappings
- Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter
- Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly.
- Extensively used transformations like router, lookup, source qualifier, joiner, expression, sorter, XML, Update strategy, union, aggregator, normalizer and sequence generator
- Created reusable mapplets, reusable transformations and performed Unit tests over Informatica code.
- Provided daily status reports for all Informatica applications to the customer; monitored and tracked critical daily applications and code migrations during deployments.
- Responsible for reloads of Informatica applications data in production and closing user tickets and incidents.
- Identified performance issues and bottlenecks.
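As context for the full/incremental load mappings above, incremental extracts of this kind are commonly driven by a persisted watermark, for example an Informatica mapping variable or a parameter/control table. A minimal control-table sketch in generic T-SQL follows; etl.load_control, src.orders, and their columns are hypothetical names used only for illustration.

```sql
-- Hypothetical control table: one watermark row per source entity.
CREATE TABLE etl.load_control (
    source_table    VARCHAR(128) NOT NULL,
    last_extract_ts DATETIME2    NOT NULL
);

-- Incremental extract: pull only rows modified since the previous successful run.
SELECT o.*
FROM   src.orders o
JOIN   etl.load_control c
       ON c.source_table = 'src.orders'
WHERE  o.last_updated_ts > c.last_extract_ts;

-- After a successful load, advance the watermark (a robust implementation would
-- capture the MAX timestamp from the extracted set rather than the whole source).
UPDATE etl.load_control
SET    last_extract_ts = (SELECT MAX(last_updated_ts) FROM src.orders)
WHERE  source_table = 'src.orders';
```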
Confidential
Informatica Developer
Responsibilities:
- Participated in gathering and evaluating requirements, working with the application/data warehouse team and project managers to provide solutions to end users.
- Developed technical design and reporting solutions to influence business results; oversaw project performance throughout the life cycle from initiation to completion.
- Proficient in translating users' statements of needed system behavior and functionality into business and functional requirements
- Involved in data modeling using ER, star schema, and dimensional modeling
- Excellent understanding of OLTP/OLAP system study and analysis, and of developing database schemas such as star and snowflake schemas
- Exposure to reporting tools OBIEE and BI Publisher
- Developed ETL mappings from the given requirements and unit tested them accordingly
- Created technical design documents from business requirements
- Created data maps in Informatica Power Exchange to read and write data to the mainframe datasets.
- Created batch scripts for various project requirements, such as file validation, moving files from SharePoint to the Informatica server, and archiving files with date-time stamps
- Performance-tuned mappings, sources, targets, and transformations by optimizing caches for the Lookup, Joiner, Rank, Aggregator, and Sorter transformations; tuned Informatica session performance for data files by increasing buffer block size, data cache size, and sequence buffer length; and used an optimized target-based commit interval and pipeline partitioning to speed up mapping execution time.
- Reviewed Informatica ETL mappings/workflows and the SQL that populates data warehouse and data mart dimension and fact tables to ensure they accurately reflected business requirements (see the SQL sketch after this list).
- Created Informatica source and target instances and maintained shared folders so that shortcuts could be used across the project.
- Responsible for Unit Testing and Integration testing of mappings and workflows
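As a small illustration of the dimension and fact population logic reviewed above, the sketch below shows a typical fact load that resolves dimension surrogate keys via joins before inserting measures. The dw/stg table and column names are hypothetical, not the actual warehouse schema.

```sql
-- Hypothetical fact load: look up surrogate keys from the dimensions, then insert the measures.
INSERT INTO dw.fact_sales (date_key, customer_key, product_key, quantity, sales_amount)
SELECT d.date_key,
       c.customer_key,
       p.product_key,
       s.quantity,
       s.sales_amount
FROM   stg.sales s
JOIN   dw.dim_date     d ON d.calendar_date = s.sale_date
JOIN   dw.dim_customer c ON c.customer_id   = s.customer_id AND c.is_current = 1
JOIN   dw.dim_product  p ON p.product_id    = s.product_id  AND p.is_current = 1;
```

In a PowerCenter mapping, the same pattern is typically implemented with Lookup transformations against the dimensions rather than SQL joins.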