Sr. ETL Developer/Informatica Architect Resume
Grand Rapids, MI
SUMMARY
- Over 14 years of IT experience in data warehousing, with emphasis on business requirements, application design, development, testing, implementation, and maintenance of data warehouses
- Involved in all phases of SDLC from analysis and planning to development and deployment
- Experience in OLTP Modeling and OLAP Dimensional modeling (Star and Snow Flake)
- Extensive experience with Informatica 10.2/9.6.1/9.1.1/8.x, Informatica PowerExchange 9.x and IICS
- Strong experience with various AWS services including S3 and RDS
- Strong experience in loading HR data from PeopleSoft (Flat file) to staging, Dimension, Fact and Error tables.
- Strong experience developing complex mappings using transformations like Source Qualifier, Filter, Expression, Joiner, Router, Union, Unconnected / Connected Lookups and Aggregator
- Good exposure to incremental loading using mapping variables
- Experience in Slowly Changing Dimensions (Type 1 and Type 2), MD5 and Surrogate keys
- Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router; also developed mappings using parameters and variables
- Good exposure to static, dynamic and lookup cache
- Good experience working on Salesforce CRM platform
- Designed and developed ETL logic for implementing CDC (Change Data Capture) using Informatica PowerExchange Oracle CDC concepts
- Worked with the Caterpillar data warehouse in Snowflake; HOLT CAT is also building its entire data warehouse in the cloud on Snowflake using the cloud ETL tool IICS (Informatica Intelligent Cloud Services)
- Extensive experience with all tasks in workflow manager to implement job control in Informatica to support dependencies in loading data in target systems including post session commands/SQL
- Strong skills in SQL, PL/SQL packages, functions, stored procedures, triggers and materialized views to implement business logic in oracle databases
- Experience with healthcare data in HIPAA formats including NCPDP, EDI 837/834/835 and HL7
- Working experience on databases Oracle, Teradata, SQL Server, MySQL and interfaces like SQL Loader, TOAD, Teradata SQL assistant/ console and ERWIN
- Experience in UNIX working environment, writing UNIX shell scripts for Informatica pre & post session operations
- Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work effectively as a team member as well as independently
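The MD5-based change detection mentioned for SCD Type 1/2 loads can be sketched as follows. This is an illustrative Python sketch only, not the actual Informatica mapping logic; the column names (`dept`, `grade`) are hypothetical.

```python
import hashlib

def row_hash(row: dict, tracked_cols: list) -> str:
    """Concatenate the tracked attribute values and hash them --
    the MD5 pattern used to detect changed rows in SCD loads.
    Column names here are hypothetical examples."""
    payload = "|".join(str(row.get(c, "")) for c in tracked_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# Compare an incoming source row against the stored dimension row:
incoming = {"emp_id": 101, "dept": "Finance", "grade": "E5"}
stored   = {"emp_id": 101, "dept": "HR",      "grade": "E5"}
cols = ["dept", "grade"]

changed = row_hash(incoming, cols) != row_hash(stored, cols)
```

Comparing one hash instead of every tracked column keeps the lookup comparison cheap, which is why the MD5-plus-surrogate-key pattern is common in dimension loads.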
TECHNICAL SKILLS
Data Warehouse Tools: Informatica PowerCenter 10.x/9.x/8.x, IICS, DataStage, Informatica PowerAnalyzer/PowerMart 9.x
Operating Systems: Windows XP/Vista/08/07, UNIX, Linux, IBM Mainframes, PuTTY, WinSCP
Databases: Oracle 11g/10g/9i, MySQL, SQL Server 2008/2005, Teradata, MS Access
Reporting Tools: Cognos, SAP BO, SSRS, SSAS
Dimensional Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Erwin
Database Tools: Oracle SQL Developer, SQL Plus, Toad, MS Office, Microsoft Visio
Languages: SQL, PL/SQL, XML, Unix Shell Scripting, Cobol
PROFESSIONAL EXPERIENCE
Confidential, Grand Rapids, MI
Sr. ETL Developer/Informatica Architect
Responsibilities:
- Provided development and technical support for Sales related data and loaded this data from Insight to Oracle database.
- Worked with Product Analyst and business users to clarify requirements and translate the requirement into technical specifications.
- Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents, and ETL specifications.
- Loaded HR data from PeopleSoft (Flat file) to staging, Dimension, Fact and Error tables.
- Integrated data from various source systems such as flat files, XML, Snowflake, AWS S3, AWS RDS, MongoDB, APIs and Salesforce.
- Performed extraction, transformation and loading using Informatica PowerCenter to build the data warehouse. Worked on Informatica PowerCenter tools including Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Extensive Tableau experience in an enterprise environment, including technical support, troubleshooting, report design and monitoring of system usage.
- Used reverse engineering in Erwin to understand the existing data model of the data warehouse.
- Involved in Relational and Dimensional Data Modeling Techniques to design ERWIN data models.
- Worked extensively in Informatica Designer to design a robust end-to-end ETL process involving complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure and Transaction Control for efficient extraction, transformation and loading of data to staging and then to the data mart (data warehouse), including the complex logic for computing the facts.
- Created SCD Type 1 and Type 2 mappings to load and maintain the data warehouse using MD5 and surrogate keys.
- Developed IICS Informatica Cloud mappings to extract the data from SFDC.
- Created IICS mapping tasks, data replication tasks and data synchronization tasks.
- Worked with Connected, Unconnected, Static, Dynamic and Persistent lookups.
- Extensively used the reusable transformation, mappings and codes using Mapplets for faster development and standardization.
- Created mapping documents based on the requirements.
- Designed and documented validation rules, error handling and test strategy of ETL process.
- Tuned Informatica mappings/sessions for better ETL performance by eliminating bottlenecks. Used Informatica ETL to load data from flat files, which includes fixed length as well as delimited files and SQL Server to the Data mart on Oracle database.
- Deployed and Managed highly scalable, fault tolerant and adaptive data pipelines to ingest and standardize data in Azure Datalake and Azure SQL Server.
- Integration of data from a variety of sources by building robust data pipelines and dynamic system using Python, Hive, Shell scripts and Databricks Spark.
- Implementation of ETL solution using Informatica BDM with batch framework, audit and reconciliation in a highly dynamic model.
- Build CICD Pipelines for code deployments using Azure Repos and Azure DevOps services.
- Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
- Created mappings, complex workflows with multiple sessions, worklets with consecutive/concurrent sessions for loading fact and dimension tables into data mart presentation layer.
- Enabled Identity Insert in the Informatica connection for the required tables so they could generate their own sequence numbers.
- Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce run time.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
- Strong knowledge of the FACETS tool and the healthcare domain; worked on modules such as Subscriber/Member, Groups, Enrollment, Claims, Billing, Accounting, Provider, MTM and Utilization Management.
- Involved in migration of maps from IDQ to PowerCenter
- Applied the rules and profiled the source and target table's data using IDQ
- Analyzed workflow, session, event and error logs for trouble shooting Informatica ETL process.
- Worked with Informatica Debugger to debug the mappings in Informatica Designer.
- Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows.
- Migrated Informatica ETL application and Database objects through various environments such as Development, Testing and Production environments.
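The Source Qualifier override with a mapping-variable watermark described above can be sketched in Python. This is an illustrative sketch only; the table name, timestamp column (`last_updated`), and the `$$LAST_RUN_TS`-style variable are hypothetical stand-ins for the actual mapping configuration.

```python
from datetime import datetime

def build_incremental_query(table: str, last_run: datetime) -> str:
    """Build a Source Qualifier-style SQL override that filters on a
    high-watermark timestamp, emulating a $$LAST_RUN_TS mapping variable.
    Table and column names are hypothetical."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE last_updated > TO_DATE('{last_run:%Y-%m-%d %H:%M:%S}', "
        f"'YYYY-MM-DD HH24:MI:SS')"
    )

# Example: only rows changed since the last successful run are extracted.
query = build_incremental_query("SRC_EMP", datetime(2020, 1, 1))
```

On each successful session run the watermark variable is advanced, so the next run picks up only newly changed rows, which is the incremental-loading pattern referenced in the summary.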
Environment: Informatica 10.x/9.x, IICS, Oracle 11g, UNIX, SQL, SQL Server, Rapid SQL, TOAD 8.6, Erwin.
Confidential, Long Beach, CA
Sr. Informatica Developer
Responsibilities:
- Designed ETL functional specifications and converted them into technical specifications.
- Participating in user meetings, gathering requirements, discussing the data issues with end users. Translating user inputs into ETL design docs.
- Preparing High Level Design documents.
- Preparing source to target mappings documents and mappings development.
- ETL mappings development & bug fixing.
- Modifying Mapping workflows using Informatica PowerCenter as per the Change Requests raised by client.
- Used various transformations like filter, expression, sequence generator, update strategy, joiner, router, and aggregator.
- Used complex transformations like Java transformation, transaction control transformation, SQL transformation and more.
- Designed and configured Azure Virtual Networks (VNets), subnets, Azure network settings, DHCP address blocks, DNS settings, security policies and routing.
- Deployed Azure IaaS virtual machines (VMs) and cloud services (PaaS role instances) into secure VNets and subnets.
- Conducted systems design, feasibility and cost studies and recommended cost-effective cloud solutions such as Amazon Web Services (AWS).
- Responsible for monitoring AWS resources using CloudWatch and application resources using Nagios.
- Created AWS Multi-Factor Authentication (MFA) for instance RDP/SSH logon; worked with teams to lock down security groups.
- Heavily involved in testing Snowflake to understand best possible way to use the cloud resources.
- Responsible for code Migration from development to System test and Production environments.
- Parsing high-level design specification to simple ETL coding along with mapping standards.
- Worked on Informatica Data Quality to resolve customer-related data issues
- Configuring and running sessions, tasks, and workflows.
- Migrating Informatica workflows/folders from one environment to another.
- Responsible for maintaining repository backups and their restorations.
- Defined relationships and cardinality between different database tables.
- Worked with DBA to setup the new Databases and Modify Databases for Reporting Models.
- Created bar charts in Tableau using data sets and added trend lines and forecasting, based on data set conditions predefined by the business.
- Created crosstabs to display the underlying data for various graphs and charts for further data analysis.
- Involved in creating reports based on Budgeting, Forecasting and planning.
- Deployed packages from a development environment to test environment.
- Working experience with Test Data Management tools HP Quality Center, HP ALM and LoadRunner
- Involved in Test Data Management with the testing team to build and run the test plans to improve data efficiency.
- Involved in developing Proof of Concept for Big data technology implementation.
- Performed data analysis on daily, weekly and monthly scheduled refreshes of data, based on business or system changes, to ensure that the published dashboards displayed accurate and up-to-date data.
Environment: Informatica PowerCenter 9.x, Oracle 11g, UNIX, XML, PLSQL, Erwin 4.0, Putty, Winscp, Windows XP/Vista, Quality Center.
Confidential, Minneapolis, MN
Sr. ETL/Informatica Developer
Responsibilities:
- Collaborated with Project Manager, Tech Lead, Developers, QA teams and Business SMEs to ensure delivered solutions optimally support the achievement of business outcomes.
- Designed and developed ETL Processes based on business rules, job control mechanism using Informatica Power Center.
- Data Warehouse Data modeling based on the client requirement using Erwin (Conceptual, Logical and Physical Data Modeling).
- Worked extensively on complex mappings using source qualifier, joiner, expressions, aggregators, filters, Lookup, update strategy and stored procedure transformations.
- Used workflow monitor to monitor the jobs, reviewed session/workflow logs that were generated for each session to resolve issues, used Informatica debugger to identify issues in mapping execution.
- Re-engineered lots of existing mappings to support new/changing business requirements.
- Used mapping variables, parameters, workflow variables & parameter files to support change data capture and automate workflow execution process.
- Created Workflow tasks in IICS Informatica Cloud
- Implemented cloud data integration using Azure HDInsight, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, Azure DevOps, Azure Storage, Azure Batch, Azure Functions, REST APIs, etc.
- Used SSIS to unite data from the existing system and performed transformations on MS SQL 2008.
- Extract, Transform, Load (ETL) development using SQL Server 2008 and SQL Server 2008 Integration Services (SSIS)
- Generated several drill-down and drill-through reports using SSRS.
- Evaluate business requirements to come up with Informatica mapping design that adheres to Informatica standards.
- Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
- Created mappings using different lookups (connected, unconnected and dynamic) with caches such as persistent cache
- Worked on end-to-end analysis of the EDW to answer questions/clarifications for ad hoc requests with supporting data and findings.
- Worked with NoSQL databases like MongoDB to save and retrieve data.
- Responsible for analyzing and solving data-related issues in Tableau dashboards.
- Developed mappings using PowerCenter and Informatica Cloud to extract data from Salesforce.
- Involved in analysis of source, target systems and data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
- Extracted data from heterogeneous sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files and transformed into a harmonized data store.
- Involved in cleansing and extraction of data and defined quality process using IDQ and Created dictionaries using Informatica Data Quality (IDQ) that was used to cleanse and standardized Data.
- Involved in writing UNIX shell scripts (Pre/Post Session commands) for the Sessions & wrote shell scripts to kickoff workflows, unscheduled workflows, get status of workflows.
- Performed Informatica administration like user, privileges, migrations, starting, stopping pmrep/pmserver. Backup and restore repository service.
- Extracted data from SAP and loaded it into Teradata and Oracle databases
- Created partitions, SQL override in source qualifier, session partitions for improving performance.
- Tuned SQL statements, Informatica mappings, used Informatica parallelism options to speed up data loading to meet defined SLA
- Developed a detailed Test Plan, Test strategy, Test Data Management Plan, Test Summary Report based on Business requirements specifications.
- Supported Informatica and non-Informatica code migration between environments (DEV/QA/PRD)
- Developed Oracle PL/SQL Packages, Procedures, Functions and Database Triggers.
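The use of mapping variables, workflow variables, and parameter files described above follows the standard Informatica parameter-file layout of bracketed section headers and `name=value` lines. As a rough illustration of that structure, here is a minimal Python sketch of a parser; the folder, workflow, and parameter names are hypothetical examples, not from an actual project.

```python
def parse_param_file(text: str) -> dict:
    """Parse an Informatica-style parameter file into
    {section: {param: value}}. Simplified sketch: ignores blank
    lines and ';' comments, splits on the first '='."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(";"):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]           # e.g. [Folder.WF:wf_name]
            params[section] = {}
        elif "=" in line and section:
            key, _, val = line.partition("=")
            params[section][key.strip()] = val.strip()
    return params

# Hypothetical parameter file for a daily load workflow:
sample = """\
[HR_FOLDER.WF:wf_daily_load]
$$LAST_RUN_TS=2020-01-01 00:00:00
$DBConnection_SRC=ORA_SRC
"""
params = parse_param_file(sample)
```

Keeping connections and watermark variables in a parameter file like this is what lets the same workflow run unchanged across DEV/QA/PRD, as noted in the migration bullet above.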
Environment: Informatica PowerCenter 8.6, Informatica PowerCenter, Power Exchange, Windows, IBM DB2 8.x, Mainframes, SQL Server 2007, Enterprise Architect, Metadata Manager, ER Studio, Oracle, SQL Plus, PL/SQL, Windows 2007.
Confidential, Denver, CO
Sr. ETL/Informatica Developer
Responsibilities:
- Maintaining the data coming from the OLTP systems.
- Developed and maintained complex Informatica mappings.
- Involved in analyzing and development of the Data Warehouse.
- Created complex mappings in Informatica PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Sorter, Lookup and Joiner transformations.
- Perform data validation tests using complex SQL statements.
- Worked with data warehouse staff to incorporate best practices from Informatica.
- Worked with Business Analysts in work sessions to translate business requirements into technical specifications, including data and process specifications.
- Created packages in SSIS with error handling.
- Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
- Designed ETL specification documents for all the projects.
- Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
- Extracted data from Flat files, DB2, SQL and Oracle to build an Operation Data Source. Applied business logic to load the data into Global Data Warehouse.
- Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
- Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
- Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
- Involved in the ETL design and its documentation.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
- Followed Star Schema to design dimension and fact tables.
- Experienced in handling slowly changing dimensions.
- Collect and link metadata from diverse sources, including relational databases Oracle, XML and flat files.
- Responsible for the development, implementation and support of the databases.
- Extensive experience with PL/SQL in designing, developing functions, procedures, triggers and packages.
- Worked on converting some of the PL/SQL scripts into Informatica mappings.
- Worked extensively on Unit Testing and preparing efficient unit test documentation for the developed code to make sure the test results match with the client requirement.
- Created E-mail notifications tasks using post-session scripts.
- Prepared detail documentation for the developed code for QA to be used as guide for future migration work.
- Worked with different source systems like MS SQL Server, DB2 and mainframe flat files.
- Implemented Type II slowly changing dimensions using date-time stamping.
- Created database structures, objects and their modification as and when needed.
- Investigated and fixed bugs that occurred in the production environment and provided on-call support.
- Performed Unit testing and maintained test logs and test cases for all the mappings.
- Testing for Data Integrity and Consistency.
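The Type II slowly changing dimension with date-time stamping mentioned above works by expiring the current row version and inserting a new one whenever a tracked attribute changes. A minimal Python sketch of that pattern follows; the key and attribute names (`emp_id`, `dept`) and the high-date convention are hypothetical illustrations, not the actual dimension design.

```python
from datetime import datetime

# Conventional "open-ended" end date marking the current row version.
HIGH_DATE = datetime(9999, 12, 31)

def apply_scd2(history: list, incoming: dict, now: datetime) -> list:
    """Type II date-time stamping: if the tracked attribute changed,
    close out the current version and append a new one.
    Column names are hypothetical."""
    current = next(
        (r for r in history
         if r["emp_id"] == incoming["emp_id"] and r["eff_end"] == HIGH_DATE),
        None,
    )
    if current and current["dept"] == incoming["dept"]:
        return history                    # no change: keep history as-is
    if current:
        current["eff_end"] = now          # expire the old version
    history.append({"emp_id": incoming["emp_id"],
                    "dept": incoming["dept"],
                    "eff_start": now, "eff_end": HIGH_DATE})
    return history

# An employee moves departments: two versions end up in the dimension.
history = []
apply_scd2(history, {"emp_id": 101, "dept": "HR"}, datetime(2021, 1, 1))
apply_scd2(history, {"emp_id": 101, "dept": "Finance"}, datetime(2021, 6, 1))
```

Point-in-time queries then select the version whose effective-date range covers the date of interest, which is what preserves history across changes.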
Environment: Informatica PowerCenter 8.6, Informatica PowerCenter, Power Exchange, Windows, IBM DB2 8.x, Mainframes, SQL Server 2007, Enterprise Architect, Metadata Manager, ER Studio, Oracle, SQL Plus, PL/SQL, Windows 2007.
Confidential, Owings Mills, MD
Informatica Developer
Responsibilities:
- Successfully developed and maintained ETL maps to extract, transform and load data from various data sources to the enterprise data warehouse called PMS (Product Management System). The data warehouse contained sales data, purchase data, valued-customer information and employee information. The project aimed to support decisions on new product improvements and analysis of existing products.
- Imported various Sources, Targets, and Transformations using Informatica Power Center Server Manager, Repository Manager and Designer.
- Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
- Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
- Used heterogeneous files from Oracle, Flat files and SQL server as source and imported stored procedures from oracle for transformations.
- Designed and coded maps, which extracted data from existing, source systems into the data warehouse.
- Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
- Analyzed business requirements, performed source system analysis, prepared technical design document and source to target data mapping document
- Developed complex mappings using Transformations like Lookup (Connected and Unconnected), Joiner, Sorter, Rank, Source Qualifier, Router, Union, Aggregator, Filter, and Expression in the Power Center Designer.
- Involved in troubleshooting performance issues in the mappings and Optimized performance by tuning the Informatica mappings as well as SQL.
- Worked closely with QA team during the testing phase and fixed bugs that were reported
- Extensively worked with various re-usable components like tasks, workflows, Worklets, Mapplets, and transformations
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager
- Worked with session logs, Informatica Debugger, and Performance logs for error handling to fix workflows and session failures.
- Created UNIX Shell Scripts for batch scheduling and loading process.
- Wrote PL/SQL procedures for processing business logic in the database. Tuned SQL queries for better performance.
- Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica Server Manager.
- Generated completion messages and status reports using Informatica Server manager.
- Tuned ETL procedures and STAR schemas to optimize load and query Performance.
Environment: Informatica Power Center 7.1.2, DB2 v8.0, SQL, Windows 2000, UNIX, SQL Server 2000, Oracle 8i, Flat files, SQL *Plus, Business Objects 5.1.6