ETL Developer Resume Profile
PROFESSIONAL SYNOPSIS
- Informatica Certified professional with around 8 years of experience in Data Warehouse/Data Mart implementations, spanning analysis, requirements gathering, effort estimation, ETL design, development, system integration testing, implementation, and production support.
- In-depth hands-on experience in ETL/ELT architecture and development.
- Worked on administrative activities such as creating deployment groups, assigning privileges, and performing code migrations.
- Experienced in performance tuning by implementing pipeline partitioning, pushdown optimization, persistent cache, etc.
- Experience in resolving ongoing maintenance issues, bug fixes, and bottlenecks at various levels, including sources, targets, mappings, and sessions.
- Implemented audit and alert strategies throughout ETL processes.
- Good knowledge of implementing Slowly Changing Dimensions (SCD) Type 1 and Type 2; see the sketch after this list.
- Strong in data warehousing concepts and methodologies, including Star Schema and Snowflake Schema dimensional models.
- Experience in integrating various data sources such as Oracle, XML, Excel, Teradata, DB2, VSAM, CSV, text files, and web services.
- Experience in creating reusable transformations and complex mappings, sessions, and workflows for high-volume, high-complexity processing.
- Capable of analyzing business requirements and creating technical specification documents.
- Hands-on experience using Informatica Data Quality (IDQ) transformations such as Parser, Address Validator, Standardizer, Labeler, Match, Case Converter, Merge, and Exception.
- Experience in parsing COBOL copybooks through Informatica.
- Worked on Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, and TPT to export and load data to/from different source systems, including flat files.
- Extensive experience using Informatica PowerCenter 9.5.1/8.6.1/7.x and Tableau.
- Hands-on experience in data profiling and UNIX shell scripting.
- Good knowledge of the OBIEE/Siebel Analytics Administration Tool, covering all three layers: Physical, BMM, and Presentation.
- Experience in scheduling Informatica jobs using Dollar Universe ($U), with good knowledge of Autosys.
- Monitored data quality and generated weekly/monthly/yearly statistics reports on production process success/failure rates for analysis as part of maintenance.
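A minimal sketch of the SCD Type 2 pattern referenced above, in Oracle-style SQL; customer_dim, customer_stg, and all column and sequence names are hypothetical:

    -- Close the current version of any customer whose tracked attributes changed
    -- (all table, column, and sequence names are hypothetical).
    UPDATE customer_dim d
       SET d.effective_end_date = SYSDATE,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.customer_city <> d.customer_city));

    -- Insert a new current row for changed customers and for brand-new customers;
    -- after the UPDATE above, neither group has a remaining current row.
    INSERT INTO customer_dim
        (customer_key, customer_id, customer_name, customer_city,
         effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_city,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y');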
TECHNICAL SKILLS
- RDBMS : Oracle 11g/10g/9i, Teradata, DB2, SQL Server
- Tools : TOAD, PuTTY, Power Term, Harvest, HP-QC, Kintana, SQL Assistant, Dollar Universe ($U), DAC, Autosys, Tidal, PVCS
- Business Intelligence : OBIEE 11g/10g, Tableau, P6 Analytics
- ETL Tools : Informatica PowerCenter 9.5.1/9.1/8.6/7.x/6.x, Ab Initio, IDQ 9.5.1
- Operating Systems : Windows NT/XP, UNIX AIX, AS/400
- Languages : C, SQL, PL/SQL, COBOL, Java, JCL
Confidential
ETL Developer
Environment: Informatica PowerCenter 9.5, Tableau, COBOL, Oracle 11g, Teradata, SQL Server
Scope:
IE Rewrite is an initiative to convert SAS legacy code into Informatica. Currently, SAS reads data from Clarity systems and generates the files that feed the ETL process; new code was developed to eliminate this dependency on the SAS system. The project keeps track of encounters generated at the institution. The data is used to identify when an encounter happened, at what facility and location, which medication order was prescribed, the dosage used per day, the maximum and minimum dose quantity, the minimum and maximum infusion given, the current status, the medication name, the therapy name, when it was closed, and so on.
- Translated business processes and SAS code into Informatica mappings to build the data mart.
- Used Informatica PowerCenter to load data from different sources such as flat files, COBOL files, Oracle, and Teradata into the Oracle data warehouse.
- Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance; see the sketch after this section.
- Developed reusable transformations and mapplets to use in multiple mappings.
- Implemented Slowly Changing Dimensions (SCD) methodology to keep track of historical data.
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Created pre-session and post-session shell scripts and email notifications.
- Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
- Involved in data quality checks by interacting with the business analysts.
- Performed unit testing and tuned the mappings for better performance.
- Maintained documentation of ETL processes to support knowledge transfer to other team members.
- Created Tableau extracts and developed reports using Tableau.
- Created sites on Tableau and assigned privileges to them.
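Illustrative only: with full pushdown optimization, PowerCenter translates mapping logic into SQL that runs inside the target database rather than in the Integration Service. A hypothetical sketch of the kind of set-based statement a filter-plus-aggregate mapping might push down (encounter_stg, encounter_daily_fact, and all columns are made-up names):

    -- Hypothetical SQL of the kind pushdown optimization generates:
    -- the Filter and Aggregator logic execute inside Oracle, not in the ETL engine.
    INSERT INTO encounter_daily_fact
        (facility_id, encounter_date, encounter_count, total_dose_qty)
    SELECT e.facility_id,
           TRUNC(e.encounter_ts),
           COUNT(*),
           SUM(e.dose_qty)
      FROM encounter_stg e
     WHERE e.status = 'CLOSED'
     GROUP BY e.facility_id, TRUNC(e.encounter_ts);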
Confidential
IT Engineer
Environment: Informatica PowerCenter 9.5.1, OBIEE, Oracle 10g/11g, Teradata, TOAD, SQL Assistant, UNIX, PVCS, Kintana, Dollar Universe
Scope:
Usage Analytics keeps track of the dollar amount spent versus the usage of IT. Management is interested in knowing both the most and the least accessed reports and dashboards with respect to application group, vendor, region, and customer. The plan is to put a threshold limit on certain groups of reports and dashboards; if a particular group or report falls below the threshold, it will be decommissioned or its users will be educated.
- Designed the ETL architecture.
- Implemented pushdown optimization (PDO), pipeline partitioning, and caching techniques for efficient loading.
- Used Informatica Data Quality (IDQ) transformations such as Parser, Address Validator, Standardizer, Labeler, Match, Case Converter, Merge, and Exception for data quality.
- Implemented automation, audit and alert strategies throughout ETL processes.
- Implemented Java Transformation logic to transform the data.
- Designed restartability into the ETL processes; see the sketch after this section.
- Identified data cleansing elements and prepared specifications for data cleansing.
- Extensively used Informatica to extract a wide range of data from different sources such as flat files, XML, Excel spreadsheets, Teradata, and Oracle into the data warehouse.
- Performed unit testing and tuned the mappings for better performance.
- Coordinated with business analysts to gather the new requirements and to work with the existing issues.
- Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
- Involved in data quality checks by interacting with the business analysts.
- Developed complex mappings to load Dimension and Fact tables.
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Created documents for complex transformations in ETL process to support knowledge transfer to other team members.
- Used Global/Local shortcuts to reuse objects without creating multiple copies in the repository and to inherit changes made to the source automatically.
- Scheduled the ETL Jobs using Dollar Universe.
- Created pre-session and post-session shell scripts and email notifications.
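A minimal sketch of the restartability and audit pattern mentioned above, assuming a hypothetical etl_batch_control table; on restart, completed steps are skipped and the load resumes from the failure point:

    -- Hypothetical control table: one row per workflow step per run.
    CREATE TABLE etl_batch_control (
        batch_id    NUMBER        NOT NULL,
        workflow_nm VARCHAR2(100) NOT NULL,
        step_nm     VARCHAR2(100) NOT NULL,
        status      VARCHAR2(10)  NOT NULL,  -- RUNNING / SUCCESS / FAILED
        start_ts    DATE          NOT NULL,
        end_ts      DATE,
        rows_loaded NUMBER
    );

    -- On restart, a pre-session step reads the steps already completed for the
    -- batch and skips them, resuming from the first step not marked SUCCESS.
    SELECT step_nm
      FROM etl_batch_control
     WHERE batch_id = :current_batch_id
       AND status   = 'SUCCESS';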
Confidential
Project Lead
Environment: Informatica PowerCenter 8.6.1, OBIEE, Oracle 9i/10g, DB2, TOAD 9.1, Windows XP, UNIX AIX, IBM Mainframe, Harvest, HP-QC, Autosys
Scope:
Claims data is used to help drive the product development process. Data about diagnosis and mortality is critical in new product development. Claims data is used to identify, determine, and support premium pricing increase requests to the state regulatory agencies and for pricing new products. Claims data is also used to better understand and project the performance of new healthcare/insurance products. This project deals with a huge amount of data, and performance is challenging.
- Involved in Extraction, Transformation and Loading of data using Informatica.
- Translated the business processes/requirements into Informatica mappings for building the data mart.
- Involved in Unit, System integration and User Acceptance Testing for the project.
- Extensively used Informatica to extract a wide range of data from different sources such as flat files, Excel spreadsheets, and Oracle into the Oracle data warehouse.
- Performed unit testing and tuned the mappings for better performance.
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Maintained documentation of ETL processes to support knowledge transfer to other team members.
- Used the Debugger to debug the mappings by setting Breakpoints across instances to identify the root cause of errors.
- Developed simple and complex mappings to load Dimension and Fact tables; see the sketch after this list.
- Worked on different tasks in Workflow Manager such as Sessions, Event Raise, Event Wait, Decision, Email, Command, and Worklets.
- Communicated with the client to provide weekly status report updates.
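A minimal sketch of the surrogate-key lookup pattern behind the dimension and fact mappings above; claim_stg, member_dim, product_dim, claim_fact, and all columns are hypothetical names:

    -- Resolve natural keys to surrogate keys while loading the fact table
    -- (all table and column names are hypothetical).
    INSERT INTO claim_fact
        (member_key, product_key, claim_date, claim_amount)
    SELECT m.member_key,
           p.product_key,
           s.claim_date,
           s.claim_amount
      FROM claim_stg   s
      JOIN member_dim  m ON m.member_id  = s.member_id
                        AND m.current_flag = 'Y'
      JOIN product_dim p ON p.product_cd = s.product_cd;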