Sr. Informatica Developer Resume
Chicago, IL
SUMMARY
- 8+ years of IT experience with expertise in analysis, design, development and implementation of data warehouses, data marts and Decision Support Systems (DSS) using ETL tools with RDBMSs such as Oracle, MS SQL Server, Teradata, DB2 and the Snowflake cloud database on Windows and UNIX platforms.
- Strong experience in Informatica Cloud, PowerCenter 10.0/9.x/8.x and PowerExchange 10.0/9.x/8.x.
- Worked on multiple data synchronization patterns using Informatica Cloud: cloud application to cloud application, on-premise application to cloud application, and on-premise application to on-premise application.
- Data integration with SFDC and Microsoft Dynamics CRM using Informatica Cloud.
- Extensive exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.
- Strong experience with Informatica tools using real-time change data capture (CDC) and MD5 checksums.
- Experience in integrating various data sources such as Oracle, Teradata, mainframes, SQL Server, XML, flat files and JSON, with extensive experience on Oracle, Teradata and Snowflake.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.
- Very strong in data warehousing concepts such as Type I, II and III dimensions, facts, surrogate keys, ODS, staging areas and cubes; well versed in the Ralph Kimball and Bill Inmon methodologies (an SCD Type 2 load is sketched at the end of this summary).
- Solid understanding of data modeling concepts (dimensional and relational) such as star-schema and snowflake-schema modeling.
- Proficient in data analysis, data validation, data lineage, data cleansing, data verification and identifying data mismatches.
- Expert in writing complex SQL queries and PL/SQL and in optimizing queries in Oracle, SQL Server and Teradata; excellent in working with views, synonyms, indexes, partitioning, database joins, stored procedures, statistics and optimization.
- Well versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool space issues and applying compression to reclaim space.
- Expertise in SQL and PL/SQL programming, including views, analytical functions, stored procedures, functions and triggers.
- Experience in developing very complex mappings, reusable transformations, sessions and workflows using the Informatica ETL tool to extract data from various sources and load it into targets.
- Experience in tuning and scaling procedures for better performance by running explain plans and using approaches such as hints and bulk loads.
- Experience in performance tuning of ETL processes using pushdown optimization and other techniques; reduced execution time for huge data volumes on company merger projects. Created many mapplets, user-defined functions, reusable transformations and lookups.
- Experience in designing and developing variable-length EBCDIC VSAM files and COBOL copybooks using Informatica PowerExchange 9.6/9.1/8.6.
- Technical expertise in designing technical processes using internal modeling and working with analytical teams to create design specifications; successfully defined and designed critical ETL processes: extraction logic, job control audit tables, dynamic generation of session parameter files, file mover processes, etc.
- Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump and TPT, and tools such as SQL Assistant and Viewpoint.
- Expert in T-SQL coding and testing: functions, views, triggers, cursors, the data dictionary, stored procedures, etc.
- Assisted other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.
- Skilled in creating and maintaining ETL specification documents, use cases, source-to-target mappings, requirement traceability matrices and deployment artifacts; performing impact assessments; and providing effort estimates.
- Executed multi-resource projects following the onsite/offshore model while serving as a mentor for junior team members.
- Strong knowledge of the Hadoop ecosystem (HDFS, Hive, Impala, Hue, etc.).
- Exposure to the end-to-end SDLC and Agile methodology.
- Good communication and presentation skills; works well as an integral part of a team as well as independently; intellectually flexible and adaptive to change.
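Illustrative sketch of the SCD Type 2 pattern referenced above, in Oracle-flavored SQL. The dim_customer and stg_customer tables and the dim_customer_seq sequence are hypothetical names used only for illustration.

-- Hypothetical names; a sketch of the SCD Type 2 pattern, not production code.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.eff_end_dt  = CURRENT_DATE,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
  AND EXISTS (SELECT 1
              FROM   stg_customer s
              WHERE  s.customer_id = d.customer_id
              AND    s.address <> d.address);

-- Step 2: insert a new current version for changed or brand-new customers.
-- Changed customers no longer have a current row after Step 1, so the
-- anti-join picks them up along with customers never seen before.
INSERT INTO dim_customer (customer_sk, customer_id, address,
                          eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.current_flg = 'Y'
WHERE  d.customer_id IS NULL;

Expiring the old row before inserting keeps exactly one current_flg = 'Y' row per customer.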
TECHNICAL SKILLS
Data Warehousing / ETL Technology: Informatica Cloud, Informatica PowerCenter 10.0/9.x/8.x, Informatica PowerExchange 10.0/9.x/8.x
Database: Oracle 11g/10g, IBM DB2 UDB, MS SQL Server 2008/2012/2016, MS Access 2000, MySQL, Teradata 13/12/V2R5, Snowflake cloud data warehouse
Data modeling: Erwin 9.1/7.1
Languages: SQL, PL/SQL, XSD, XML, UNIX shell scripting
Tools: Microsoft Visio, TOAD, Oracle SQL Developer, WinSQL, WinSCP, Secure Shell Client, OBIEE 10g, SQL*Loader, MS Office, SmartFTP, UltraEdit, Autosys, Control-M
Operating Systems: Windows, UNIX
Reports: Cognos 10.0/9.0, QlikView, Tableau 9/10
Methodologies: Ralph Kimball’s star schema and snowflake schema; SDLC, Agile
Others: MS Word, MS Access, GitHub
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
Sr. Informatica Developer
Responsibilities:
- Involved in all phases of the SDLC, from requirement gathering, design, development and testing through production, user training and support for the production environment.
- Actively interacted with business users to record user requirements and perform business analysis.
- Translated requirements into business rules and made recommendations for innovative IT solutions.
- Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
- Parsed the high-level design spec into simple ETL coding and mapping standards.
- Developed data synchronization tasks that load data from a source to a target, applying transformations in transit.
- Created mappings that chain multiple complex operations (filters, joins, functions) together to build a complex integration process.
- Performed create, read, update and delete (CRUD) operations as well as bulk pulls and loads of data to third-party systems and on-premise databases.
- Worked with PowerCenter Designer tools to develop mappings and mapplets that extract and load data from flat files and an AWS cloud database.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
- Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated Snowflake from the S3 bucket using complex SQL queries (a sketch follows at the end of this section).
- Loaded diverse data types (structured, JSON, flat files, etc.) into the Snowflake cloud data warehouse.
- Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets and Transformations.
- Worked with various complex mappings; designed slowly changing dimensions Type 1 and Type 2.
- Maintained development, test and production mappings and handled migration using Repository Manager; involved in enhancement and maintenance activities of the data warehouse.
- Performance-tuned the process at the mapping, session, source and target levels.
- Implemented various improvements such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, validation techniques and load efficiency.
- Worked with SQL*Loader to load data from flat files obtained from various facilities.
- Created workflows containing Command, Email, Session, Decision and a wide variety of other tasks.
- Tuned mappings against criteria and created partitions when performance issues arose.
- Tested end to end to verify failures in the mappings using shell scripts.
- Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
- Resolved tickets based on priority levels raised by the QA team.
- Developed parameter files for passing values to the mappings for each type of client.
- Scheduled batches and sessions within Informatica using the Informatica scheduler and also wrote shell scripts for job scheduling.
- Understood the entire functionality and major algorithms of the project and adhered to the company testing process.
Environment: Informatica Cloud, Informatica PowerCenter 10.0, Tableau 10.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Snowflake cloud data warehouse, AWS S3 bucket, SQL Server 2012, Control-M, shell scripting, JSON, SQL*Loader
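Sketch of the S3-to-Snowflake load described in this section, in Snowflake SQL. The stage, table and column names (orders_stage, stg_orders, order_id, order_ts) are hypothetical, and credential/IAM setup is omitted.

-- Hypothetical names; credentials and storage-integration setup omitted.
CREATE OR REPLACE STAGE orders_stage
  URL = 's3://example-bucket/landing/orders/'
  FILE_FORMAT = (TYPE = JSON);

-- Copy JSON from S3 into a staging table, casting selected attributes
-- out of the semi-structured payload ($1 is the parsed JSON document).
COPY INTO stg_orders (order_id, order_ts, raw_payload)
FROM (
  SELECT $1:order_id::NUMBER,
         $1:order_ts::TIMESTAMP_NTZ,
         $1
  FROM @orders_stage
)
ON_ERROR = 'CONTINUE';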
Confidential, Des Moines, IA
Informatica Developer
Responsibilities:
- Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems into the target database.
- Created mappings using the Designer and extracted data from various sources, transforming it according to the requirements.
- Involved in extracting data from flat files and relational databases into the staging area.
- Migrated mappings, sessions and workflows from the development environment to test and then to UAT.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
- Created sessions, extracted data from various sources, transformed it according to the requirements and loaded it into the data warehouse.
- Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica PowerCenter Designer.
- Imported various heterogeneous files using the Informatica PowerCenter 8.x Source Analyzer.
- Developed several reusable transformations and mapplets that were used in other mappings.
- Prepared technical design documents and test cases.
- Involved in unit testing and resolution of the various bottlenecks encountered.
- Implemented various performance tuning techniques.
- Used Teradata as a source system.
- Worked with the Cognos team to build the data warehouse.
- Generated matrix reports, drill-down, drill-through, sub-reports, chart reports and multi-parameterized reports.
- Designed and developed complex ETL mappings making use of Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
- Used SQL tools such as TOAD to run SQL queries and validate the data loaded into the warehouse.
- Performed data integration and lead generation from Informatica Cloud into the Salesforce cloud.
- Created summarized tables, control tables and staging tables to improve system performance and to serve as a source for immediate recovery of the Teradata database (a sketch of this pattern follows this list).
- Extracted Salesforce CRM information into the BI data warehouse using the Force.com API / Informatica On Demand, integrating it with Oracle financial information for advanced reporting and analysis.
- Created stored procedures to transform the data and worked extensively in T-SQL and PL/SQL for the various transformations needed while loading data into the data warehouse.
- Developed transformation logic as per the requirements, created mappings and loaded data into the respective targets.
- Worked with Informatica Cloud Data Loader for Salesforce to reduce the time taken to import or export critical business information between Salesforce CRM and Force.com.
- Worked on profiling source data and determining all possible source data values and metadata characteristics.
- Responsible for identifying bottlenecks and fixing them through performance tuning.
- Extensively worked on unit testing of the Informatica code using SQL queries and the Debugger.
- Used the sandbox for testing and to ensure minimum code coverage for the application to be migrated to production.
- Used the pmcmd command to start, stop and ping the server from UNIX, and created UNIX shell scripts to automate the process.
- Used pmcmd to run workflows from the command-line interface.
- Improved performance at the mapping and session levels.
- Worked extensively with UNIX shell scripts for job execution and automation.
- Coordinated with the Autosys team to run Informatica jobs for loading historical data in production.
- Documented data mappings/transformations as per the business requirements.
- Created XML and Autosys JIL files for the developed workflows.
- Migrated code from development to test and, upon validation, to the pre-production and production environments.
- Provided technical assistance to business program users and developed programs for business and technical applications.
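Rough sketch of the control-table/staging pattern mentioned above, in Teradata SQL; etl_control, stg_policy and prod.policy are hypothetical names, not the project's actual objects.

-- Hypothetical names; a sketch of the pattern, not the project's actual DDL.
-- Clone the production table's structure as an empty staging table.
CREATE MULTISET TABLE stg_policy AS (SELECT * FROM prod.policy) WITH NO DATA;

-- One control row per batch lets a failed load be detected and re-driven.
INSERT INTO etl_control (batch_id, table_name, load_start_ts, status)
VALUES (1001, 'stg_policy', CURRENT_TIMESTAMP, 'RUNNING');

-- Refresh optimizer statistics on the join/load key after each bulk load.
COLLECT STATISTICS ON stg_policy COLUMN (policy_id);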
Confidential, Columbus, OH
ETL Informatica Developer
Responsibilities:
- Worked with the business team to gather requirements for projects and created strategies to handle them.
- Worked on project documentation, which included the functional, technical and ETL specification documents.
- Used Informatica for data profiling and data cleansing, applying rules and developing mappings to move data from source to target systems.
- Designed and implemented ETL mappings and processes as per company standards, using Informatica PowerCenter.
- Extensively worked on complex mappings involving slowly changing dimensions.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in Mapping Designer.
- Worked extensively on Informatica transformations such as Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update Strategy, Sequence Generator and Joiner.
- Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so they can be corrected and reloaded into the target system (see the sketch at the end of this section).
- Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
- Developed a Change Data Capture (CDC) mechanism using Informatica PowerExchange for some of the interfaces, based on the requirements and limitations of the project.
- Implemented performance and query tuning on all Informatica objects using SQL Developer.
- Worked on the ETL code migration process from DEV to ITE, QA and PRODUCTION.
- Created the design and technical specifications for the ETL process of the project.
- Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy and Stored Procedure transformations.
- Worked with SQL*Loader to load data from flat files obtained from various facilities.
- Loaded data from several flat files into staging using Teradata MLOAD, FLOAD and BTEQ.
- Worked with the release management team on approvals of change requests and incidents using the BMC Remedy incident tool.
- Worked with the infrastructure team to make sure the deployment was up to date.
- Provided 24x7 production support when necessary.
Environment: Informatica PowerCenter 8.6, Informatica PowerExchange 8.6, Oracle 11g, SQL, Erwin 5, UNIX crontab, Control-M, Remedy Incident Tool, UltraEdit, Teradata 14, Cognos, Teradata SQL Assistant
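One illustration of how the error-severity routing described in this section can look when expressed in SQL; the tables (etl_error_log, stg_rejects) and message patterns are hypothetical.

-- Hypothetical tables and patterns; sketches the severity-tagging idea only.
INSERT INTO etl_error_log (src_row_id, error_msg, severity, logged_ts)
SELECT r.src_row_id,
       r.error_msg,
       CASE
         WHEN r.error_msg LIKE '%primary key%' THEN 'FATAL'  -- blocks reload
         WHEN r.error_msg LIKE '%truncat%'     THEN 'WARN'   -- fix and retry
         ELSE 'INFO'
       END,
       CURRENT_TIMESTAMP
FROM   stg_rejects r;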
Confidential, Plano, TX
Informatica Developer / Data Analyst
Responsibilities:
- Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules using the different objects and functions the tool supports.
- Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
- Used various transformations such as Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
- Implemented slowly changing dimensions (SCDs) for some of the tables as per user requirements.
- Documented Informatica mappings in Excel spreadsheets.
- Tuned Informatica mappings for optimal load performance.
- Used the BTEQ, FEXP, FLOAD and MLOAD Teradata utilities to export and load data to/from flat files.
- Created and configured workflows and sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
- Worked with the UNIX team to write UNIX shell scripts customizing server scheduling jobs.
- Constantly interacted with business users to discuss requirements.
Environment: Informatica 8.1, Oracle 10g, SQL Server 2008, SQL, T-SQL, PL/SQL, TOAD, Erwin, UNIX, TortoiseSVN, flat files