Sr. Teradata/ETL Consultant Resume
SUMMARY
- Over 8 years of IT experience in the analysis, design, development, testing, and implementation of ETL/data warehouse systems for the banking and insurance domains.
- In-depth work experience in Ab Initio, Informatica PowerCenter, Informatica PowerExchange, PL/SQL, and IBM Cognos.
- Sound knowledge of and experience in ETL design and testing with Ab Initio, using data parallelism and Ab Initio data transform components.
- Expertise in creating real-time Change Data Capture (CDC), MQ, and web service workflows.
- Experience working with the cloud-based tool Salesforce.
- Expertise in developing SQL and PL/SQL code, including procedures, functions, and packages, to implement database business logic in Oracle.
- Expertise in working with relational databases such as Oracle Exadata, mainframe DB2, Microsoft SQL Server, MySQL, and Teradata.
- Exposure to the Hadoop ecosystem, including HDFS, MapReduce, HBase, Hive, Pig, Oozie, and ZooKeeper.
- Experience with the NoSQL databases Cassandra and MongoDB.
- Developed models and packages using Framework Manager and published them to the Cognos server.
- Developed reports and performed tuning (database-level aggregation, indexing, query optimization, and Framework Manager model improvements) to improve report output.
- Experience developing UNIX shell scripts to automate ETL processes.
- Knowledge of planning, designing, developing, and deploying data warehouses/data marts, with experience in both relational and multidimensional database design using the data modeling tool ERwin (physical and logical data models).
- Experience working with Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPump.
- Very strong in Teradata SQL query optimization techniques such as the EXPLAIN feature, COLLECT STATISTICS, secondary indexes, partitioned primary indexes (PPI), and volatile and global temporary tables (see the SQL sketch after this list).
- Experience resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance-tuning mappings and sessions.
- Worked with business stakeholders on the impact of implementing changes in the production environment.
- Extensive expertise in designing, developing and executing test scenarios and test cases.
- Conducted training sessions for users and prepared user manuals to help them better understand the designed system.
- Good experience as a team lead, having led teams of 12-15.
- Excellent interpersonal and communication skills, with experience working with senior-level managers, business people, and developers across multiple streams.
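A minimal sketch of the Teradata tuning workflow referenced above (PPI, COLLECT STATISTICS, EXPLAIN); all database, table, and column names here are illustrative placeholders, not from any actual engagement:

```sql
-- Illustrative PPI table: rows are hashed by acct_id and partitioned by month,
-- so date-bounded queries scan only the relevant partitions.
CREATE TABLE sandbox_db.sales_txn (
    txn_id   INTEGER NOT NULL,
    acct_id  INTEGER NOT NULL,
    txn_date DATE    NOT NULL,
    txn_amt  DECIMAL(18,2)
)
PRIMARY INDEX (acct_id)
PARTITION BY RANGE_N (txn_date BETWEEN DATE '2012-01-01'
                      AND DATE '2013-12-31' EACH INTERVAL '1' MONTH);

-- Collect statistics so the optimizer can estimate cardinalities accurately.
COLLECT STATISTICS ON sandbox_db.sales_txn COLUMN (acct_id);
COLLECT STATISTICS ON sandbox_db.sales_txn COLUMN (txn_date);

-- Inspect the plan (partition elimination, join strategy) before promoting.
EXPLAIN
SELECT acct_id, SUM(txn_amt) AS tot_amt
FROM   sandbox_db.sales_txn
WHERE  txn_date BETWEEN DATE '2013-01-01' AND DATE '2013-01-31'
GROUP BY acct_id;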
TECHNICAL SKILLS
ETL Tools: Ab Initio GDE 1.14.38 and 1.15.7.2, Co>Operating System 2.14.73 and 2.15, Ab Initio EME 2.14.70 and 2.15, Informatica PowerCenter (9.x, 8.x), PowerExchange (9.x, 8.x), Informatica Data Quality, Pentaho
BI Tools: Cognos 10 BI Series, SAP Business Objects, OBIEE, SSRS, Tableau
Databases: Oracle Exadata, Mainframe IBM DB2, Microsoft SQL Server, Sybase, Teradata
Development/Productivity Tools: Teradata SQL Assistant v11/v12/v13; Teradata utilities (FastLoad, MultiLoad, BTEQ, TPump, FastExport); Queryman (SQL Assistant), SQL Developer, SQL*Plus, TOAD
Data Modeling Tool: ERwin
Version Control Tools: ClearCase, Harvest, TortoiseSVN
Scheduling Tools: ESP Scheduler, Control-M, Autosys
Operating Systems: Windows, AIX (UNIX), mainframe z/OS
Languages: SQL, PL/SQL, Java, JSP, C, UNIX Shell Scripting, HTML, XML, jQuery
PROFESSIONAL EXPERIENCE
Sr. Teradata/ETL Consultant
Confidential
Technology: Ab Initio, Cognos 10.1, Autosys, Teradata 13.0, Oracle, UNIX Shell Scripting, HP Quality Center, TortoiseSVN
Responsibilities:
- Implemented Ab Initio plans and continuous flows for several applications.
- Enhanced existing Ab Initio applications, made graphs generic by incorporating parameters, and adopted best practices to improve graph performance.
- Developed reusable Informatica mapplets and worklets for audit balance framework.
- Extensively used the Teradata utilities BTEQ and FastExport to provide solutions that met the requirements.
- Involved in data migration between Teradata and Oracle.
- Developed complex BTEQ SQL scripts for loading/unloading data, applying transformations per business rules (a minimal BTEQ sketch follows this list).
- Developed MapReduce programs to parse raw data, populate tables, and store the refined data in partitioned tables; managed and reviewed Hadoop log files.
- Developed Apache Pig and Hive scripts to process HDFS data.
- Migrated the required data from Teradata into HDFS using Sqoop, and imported flat files of various formats into HDFS.
- Performed POCs to solve existing system/reporting problems using emerging technologies such as Hadoop.
- Created ETL design and mapping flow for highly complex requirements.
- Performed in-depth data analysis to determine what data is present in the source and how it should be loaded into the target.
- Involved in the Design, Development, Testing phases of the Data warehouse.
- Prioritized business requirements and translated them into functional and technical requirements.
- Involved in design and data model development.
- Created Informatica mappings for initial load and daily updates.
- Modified mappings as per the changed business requirements.
- Designed reusable transformations and shortcuts shared across different mappings.
- Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.
- Designed and developed mapplets and worklets for reusability; extensively used ETL to load data from heterogeneous sources such as flat files, Oracle tables, XML, and Teradata.
- Involved in fixing and testing stored procedures and functions, including unit and integration testing of stored procedures and the target data.
- Designed an audit strategy to reconcile data between the source system, the target system, and other parallel reporting systems.
- Involved in troubleshooting the load failure cases, including database problems.
- Analysed the query performance and designed the load process schedules.
- Involved in Production Deployment and Support.
- Created UNIX scripts to run workflows, check file availability, and send generated files to other systems for further processing.
- Converted the data mart from logical to physical design; defined data types, constraints, and indexes; and generated the schema in the database.
- Troubleshot problems by checking various logs.
- Involved in unit testing and integration testing.
- Created and executed testing strategies and test plans for any changes/upgrades to Informatica mappings.
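A minimal sketch of a BTEQ load script of the kind described above; the logon details, databases, tables, and the segment-code rule are all placeholders:

```sql
.LOGON tdpid/etl_user,password;           -- placeholder credentials
.SET ERROROUT STDOUT;

-- Refresh the work table from the day's staged feed, applying a sample rule.
DELETE FROM edw_db.customer_dim_wrk;

INSERT INTO edw_db.customer_dim_wrk
SELECT  src.cust_id,
        TRIM(src.cust_name),
        COALESCE(src.segment_cd, 'UNK'),  -- illustrative business rule
        CURRENT_DATE
FROM    stg_db.customer_stg src
WHERE   src.load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;          -- surface failures to the scheduler
.LOGOFF;
.QUIT 0;
```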
Teradata Data Warehouse Designer/Developer
Confidential
Technology: Ab Initio, Informatica PowerExchange 9.0 & 9.5, Informatica PowerCenter 9.0 & 9.5, ESP Scheduler, Teradata, DB2, UNIX Scripting
Responsibilities:
- Created high-level design (HLD) and low-level design (LLD) documents based on requirements.
- Performed in-depth data analysis to determine what data is present in the source and how it should be loaded into the target.
- Involved in data modeling.
- Created mappings with various types of sources and targets (XML, flat file, MQ, CDC, VSAM, database, web service).
- Created CDC registrations for real-time data.
- Created PWX data maps to fetch data from mainframe files.
- Monitored Client Information Warehouse jobs to make sure SLAs were met; ensured environment-related issues were reported immediately and followed up to closure.
- Extracted data from various sources such as Oracle, DB2, and SQL Server and loaded it into Teradata.
- Worked with Teradata 12 utilities such as BTEQ, FastLoad, and MultiLoad.
- Reviewed statistics and joins to improve Teradata SQL performance using DBQL and explain plans (see the DBQL sketch after this list).
- Created UNIX scripts to run workflows and to maintain real-time workflow dependencies.
- Developed Informatica mappings to load real-time data using PWX sources alongside standard sources.
- Also developed mappings involving mainframe file data extraction using PWX data maps.
- Developed list, crosstab, drill-through, master-detail, chart, and complex reports involving multiple prompts in Report Studio.
- Developed models and packages using Framework Manager and published them to the Cognos server.
- Developed reports and performed tuning (database-level aggregation, indexing, query optimization, and Framework Manager model improvements) to improve report output.
- Converted the data mart from logical to physical design; defined data types, constraints, and indexes; and generated the schema in the database.
- Troubleshot problems by checking various logs.
- Involved in unit testing and integration testing.
- Created and executed testing strategies and test plans for any changes/upgrades to Informatica mappings.
- Involved in UAT Support.
- Involved in peer reviews of LLDs, mappings, sessions, and PWX data maps.
- Monitored currently running jobs.
- Worked on performance tuning of queries in the project.
- Worked on MQ queue and channel setup for the project.
- Coordinated effectively with the team to deliver tasks on schedule and with quality.
- Coordinated status meetings between offshore and onshore teams.
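As a sketch of the DBQL-based review mentioned above, assuming query logging is enabled; the TOP-20 threshold is arbitrary, and view/column names vary slightly by Teradata release (e.g. DBC.QryLog on older systems):

```sql
-- Find the day's most CPU- and I/O-expensive requests for tuning review.
SELECT  TOP 20
        QueryID,
        UserName,
        AMPCPUTime,
        TotalIOCount,
        QueryText
FROM    DBC.QryLogV
WHERE   CAST(StartTime AS DATE) = CURRENT_DATE
ORDER BY AMPCPUTime DESC;
```

Candidates surfaced here would then be run through EXPLAIN and checked for stale or missing statistics on the join columns.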
ETL Technical Lead
Confidential
Technology: Ab Initio, Informatica PowerExchange 9.0, Informatica PowerCenter 8.6 & 9.0, ESP Scheduler, Teradata, UNIX Scripting, Harvest
Responsibilities:
- Developed new mappings or modified existing mappings according to new business requirements.
- Processed data in real time using Informatica PowerExchange, which is essential for banking.
- Monitored the daily warehouse load process and troubleshot time-sensitive job failures.
- Tuned code after verifying real-time data test results (see the reconciliation sketch after this list).
- Fixed incidents.
- Handled ad hoc loads.
- Implemented star schema grouping and dimension hierarchies in the Framework Manager model.
- Developed list, crosstab, drill-through, master-detail, chart, and complex reports involving multiple prompts in Report Studio.
- Worked closely with IGS (Informatica Global Support) after hot fixes were provided by Informatica.
- Carried out impact analysis for any new changes pushed to production.
- Tested and reviewed deliverables.
- Coordinated meetings between offshore and onshore teams.
- Coordinated effectively with the team to deliver tasks on schedule and with quality.
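One way to verify a real-time (PWX/CDC) feed of the kind described above is a windowed row-count reconciliation; this is a sketch only, with placeholder databases, table, and load-timestamp column:

```sql
-- Compare the last hour's row counts between the staged source copy and the
-- CDC-fed target; a mismatch flags replication lag or a dropped transaction.
SELECT  'staged_source' AS side, COUNT(*) AS row_cnt
FROM    stg_db.account_txn
WHERE   load_ts >= CURRENT_TIMESTAMP - INTERVAL '1' HOUR
UNION ALL
SELECT  'target', COUNT(*)
FROM    edw_db.account_txn
WHERE   load_ts >= CURRENT_TIMESTAMP - INTERVAL '1' HOUR;
```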
ETL Technical Leader
Confidential
Technology: Informatica PowerExchange, Informatica PowerCenter 9.0, ESP Scheduler, Oracle Exadata, UNIX Scripting, Harvest, BMC Remedy User, Salesforce
Responsibilities:
- Developed new mappings or modified existing mappings according to new business requirements.
- Processed data in real time using Informatica PowerExchange, which is essential for banking.
- Monitored the daily warehouse load process and troubleshot time-sensitive job failures.
- Tuned code after verifying real-time data test results.
- Fixed incidents.
- Handled ad hoc loads (see the MERGE sketch after this list).
- Worked closely with IGS (Informatica Global Support) after hot fixes were provided by Informatica.
- Carried out impact analysis for any new changes pushed to production.
- Tested and reviewed deliverables.
- Coordinated meetings between offshore and onshore teams.
- Coordinated effectively with the team to deliver tasks on schedule and with quality.
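On an Oracle Exadata target, an ad hoc correction load like those above is often an upsert; a minimal sketch, assuming a hypothetical staging table stg.customer_fix and target edw.customer_dim:

```sql
-- Apply an ad hoc correction file (staged to stg.customer_fix) to the target:
-- update rows that already exist, insert the ones that do not.
MERGE INTO edw.customer_dim tgt
USING stg.customer_fix src
   ON (tgt.cust_id = src.cust_id)
WHEN MATCHED THEN
    UPDATE SET tgt.segment_cd = src.segment_cd,
               tgt.updt_dt    = SYSDATE
WHEN NOT MATCHED THEN
    INSERT (cust_id, segment_cd, updt_dt)
    VALUES (src.cust_id, src.segment_cd, SYSDATE);

COMMIT;
```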
ETL Developer
Confidential
Technology: Informatica 8.5, Oracle 9i, Informatica PowerCenter 8.5.1, TOAD, Peregrine Service Center, ClearCase
Responsibilities:
- Performed production support activities.
- Performed root cause analysis for raised incidents.
- Handled ad hoc loads.
- Developed new mappings or modified existing mappings according to new business requirements.
- Carried out impact analysis for any new changes pushed to production.
- Coordinated UAT testing with stakeholders.
- Facilitated meetings between offshore and onshore teams.
Developer
Confidential
Technology: Ab Initio, Oracle 9i, TOAD, Visual SourceSafe
Responsibilities:
- Understood the existing business model and customer requirements.
- Developed Informatica mappings for data flows from source to target.
- Used various transformations (Joiner, Filter, Lookup, Expression, Aggregator, etc.) in mappings to perform the ETL process (see the SQL analogue after this list).
- Reviewed mappings for design flaws found as part of peer testing.
- Created effective test data and unit test cases to ensure successful execution of data-loading processes.
- Performed Informatica code migrations, testing, debugging and documentation.
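The Informatica transformations named above have direct SQL analogues; a sketch of the equivalent set logic, with illustrative schema, table, and column names:

```sql
-- "Joiner" = JOIN, "Lookup" = join to a reference table, "Filter" = WHERE,
-- "Expression"/"Aggregator" = column expressions with GROUP BY.
SELECT  o.cust_id,
        c.region_cd,                      -- Lookup against reference data
        SUM(o.order_amt) AS tot_amt       -- Aggregator
FROM    stg.orders o                      -- source
JOIN    ref.customer c                    -- Joiner
  ON    o.cust_id = c.cust_id
WHERE   o.status_cd = 'A'                 -- Filter: active orders only
GROUP BY o.cust_id, c.region_cd;
```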