Teradata/Informatica Developer Resume
Piscataway, NJ
SUMMARY
- Over 7 years of extensive experience in ETL (Extract, Transform, Load), Data Integration and Data Warehousing using Teradata and Informatica technologies.
- Participated in the full Software Development Life Cycle (SDLC) of data warehousing projects: project planning, business requirement analysis, data analysis, logical and physical database design, setting up the warehouse physical schema and architecture, developing reports, security, and deployment to end users.
- Extensively used different features of Teradata such as BTEQ, FastLoad, MultiLoad, SQL Assistant, Viewpoint, and DDL and DML commands. Very good understanding of Teradata UPI and NUPI, secondary indexes and join indexes.
- Working knowledge of data warehousing concepts such as Star and Snowflake schemas, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
- Extensive knowledge of Data Profiling using the Informatica Developer tool.
- Very good understanding of Teradata's shared-nothing MPP architecture, including nodes, AMPs, the BYNET, partitioning, and primary indexes.
- Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables and volatile tables (see the DDL sketch at the end of this list).
- Strong technical and analytical skills, with a clear understanding of the design goals of ER modeling for OLTP and Star and/or Snowflake schemas for OLAP and multidimensional cubes.
- Proficient in implementing complex business rules by creating reusable transformations, workflows/worklets and mappings/mapplets.
- Proficient in designing and developing ETL objects using Informatica PowerCenter with transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Router, Rank and Normalizer.
- Involved in data migration projects from DB2 and Oracle to Teradata. Created automated scripts to perform the migration using UNIX shell scripting, Oracle/Teradata SQL, and Teradata macros and procedures.
- Automated BTEQ report generation using UNIX scheduling tools on a weekly and monthly basis. Well versed in reading Explain plans and confidence levels, with a very good understanding of database skew. Knowledge of query performance tuning using Explain, Collect Statistics, compression, NUSI and join indexes, including sparse indexes (see the statistics/Explain sketch at the end of this list).
- Extensively worked with PMON/Viewpoint for Teradata performance monitoring and tuning. Well versed with the Teradata Analyst Pack, including Statistics Wizard, Index Wizard and Visual Explain. Experience programming with SQL and PL/SQL (stored procedures, functions, cursors and database triggers).
- Very good experience in Oracle database application development using Oracle 10g/9i/8i, SQL, PL/SQL and SQL*Loader.
- Strong Teradata SQL experience developing ETL with complex, tuned queries, including analytical functions and BTEQ scripts.
- Extensively used mapping variables, mapping parameters and dynamic parameter files for improved performance and increased flexibility; also worked with XML sources and targets.
- Experience designing and implementing Data Mart applications, mainly the transformation process, using Informatica.
- Developed workflows with worklets, event waits, assignments, conditional flows, and Email and Command tasks using Workflow Manager.
- Knowledge of pushdown optimization concepts and tuning Informatica objects for optimal execution times.
- Experienced in identifying performance bottlenecks and fixing code for optimization in Informatica and Oracle.
- Created UNIX shell scripts for Informatica pre- and post-session operations, database administration, and day-to-day activities such as monitoring network connections and database ping utilities.
- Tuned user queries by analyzing Explain plans, recreating user/driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
- Extensive experience implementing data cleanup procedures, transformations, scripts and stored procedures, and executing test plans to load data successfully into targets.
- Maintained Visual SourceSafe (VSS), quality metrics, knowledge management, and defect prevention and analysis.
- Created checklists for coding, testing and release to ensure a smooth, error-free project flow.
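Illustrative DDL for the table types above, wrapped in a BTEQ heredoc inside a shell script; the logon string, databases, tables and columns are all hypothetical:
    #!/bin/ksh
    # Hypothetical objects; credentials would normally come from a secured logon file.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    -- MULTISET table: allows duplicate rows, unlike a SET table
    CREATE MULTISET TABLE stg_db.customer_stg (
        cust_id    INTEGER,
        status_cd  CHAR(1)
    ) PRIMARY INDEX (cust_id);
    -- Volatile table: session-scoped scratch space for breaking up complex queries
    CREATE VOLATILE TABLE vt_active_cust AS (
        SELECT cust_id FROM stg_db.customer_stg WHERE status_cd = 'A'
    ) WITH DATA PRIMARY INDEX (cust_id) ON COMMIT PRESERVE ROWS;
    .LOGOFF;
    .QUIT;
    EOF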
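A minimal statistics-and-Explain sketch of the tuning approach described above, again with hypothetical object names:
    #!/bin/ksh
    # Collect statistics on join columns, then review the optimizer's plan.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    -- Statistics on join/PI columns help the optimizer choose join plans and expose skew
    COLLECT STATISTICS COLUMN (acct_id) ON edw.fact_txn;
    COLLECT STATISTICS COLUMN (txn_dt)  ON edw.fact_txn;
    -- EXPLAIN shows the step-by-step plan, row estimates and confidence levels
    EXPLAIN
    SELECT a.acct_id, SUM(t.txn_amt)
    FROM edw.fact_txn t
    JOIN edw.dim_acct a ON a.acct_id = t.acct_id
    GROUP BY a.acct_id;
    .LOGOFF;
    .QUIT;
    EOF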
TECHNICAL SKILLS
Primary Tools: Teradata SQL, Teradata Tools and Utilities, Informatica PowerCenter 9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), IBM Information Server 9.1/8.5/8.0.1, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000
Languages: Teradata SQL, BTEQ, SQL, C, C++, Shell Script
Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, TPT, SQL Assistant, Teradata Manager
Databases: Teradata 13/12/V2R6.2, Oracle 10g/9i, DB2/UDB, SQL Server
Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX
Data Modeling: Erwin, ER Studio
Scheduling tools: Control M, Autosys
PROFESSIONAL EXPERIENCE
Confidential, Piscataway, NJ
Teradata/Informatica Developer
Environment: Teradata 15.10/14.x/13.x, Oracle 10g, SQL Assistant, Informatica PowerCenter 9.6.1, Workflow Manager, Workflow Monitor, Target Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, BTEQ, TPT, FastLoad, MultiLoad, Control-M, UNIX, SSH (secure shell), TOAD, Erwin, WinSCP.
Responsibilities:
- Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes.
- Interfaced with various members of the technical and business teams to translate business reporting and data maintenance requirements into functional ETL code.
- Developed scripts to load data into the base tables in the EDW, from source to staging and from staging to target tables, using the Teradata FastLoad, MultiLoad and BTEQ utilities (see the FastLoad sketch at the end of this list).
- Performed application-level DBA activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts using the Teradata Visual Explain utility.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move it from staging into the base tables (see the macro sketch at the end of this list).
- Created and Configured Workflows, Worklets and Sessions to transport the data to target tables.
- Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimal column definitions (see the compression sketch at the end of this list).
- Extensively involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
- Created deployment groups to migrate code from one environment to another.
- Used Aginity and WinSQL to run queries for testing and validation of data.
- Created mappings to write Infusion data, CPT HCPC codes etc. to flat files for Care Centrix for reporting purposes.
- Worked on UNIX shell scripting to automate Informatica workflows.
- Used Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure and Update Strategy transformations to implement business logic in the mappings.
- Created reusable transformations and Mapplets to use in multiple mappings.
- Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
- Created Workflow, Worklet, Assignment, Decision, Event-Wait, Event-Raise and Email tasks, and scheduled tasks and workflows based on client requirements.
- Migrated ETL code from Dev to QA and then from QA to Prod for new release cycles.
- Developed testing strategies and performed in-depth testing to ensure data quality.
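A minimal FastLoad wrapper sketch for the staging loads above; the TDPID, credentials, file layout and table names are assumptions:
    #!/bin/ksh
    # Bulk-load a pipe-delimited file into an empty staging table (hypothetical names).
    fastload <<'EOF'
    SESSIONS 4;
    ERRLIMIT 25;
    LOGON tdprod/etl_user,etl_pwd;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(10)),
           cust_name (VARCHAR(50))
    FILE = /data/in/customer.dat;
    BEGIN LOADING stg_db.customer_stg
        ERRORFILES stg_db.cust_err1, stg_db.cust_err2;
    INSERT INTO stg_db.customer_stg (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF
    echo "FastLoad finished with return code $?"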
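A sketch of a staging-to-base macro of the kind described above; the objects and the load_dt column are hypothetical:
    #!/bin/ksh
    # Define and run a parameterized macro that moves one load date from staging to base.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    REPLACE MACRO edw.mv_stg_to_base (load_dt DATE) AS (
        INSERT INTO edw.customer_base
        SELECT * FROM stg_db.customer_stg
        WHERE load_dt = :load_dt;
    );
    -- Execute the macro for a single load date
    EXEC edw.mv_stg_to_base (DATE '2016-01-31');
    .LOGOFF;
    .QUIT;
    EOF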
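A multi-value compression sketch along the lines of the space-reduction work above; the compressed value lists are illustrative only:
    #!/bin/ksh
    # MVC stores the most frequent column values in the table header instead of each row.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    CREATE MULTISET TABLE edw.order_hist (
        order_id   INTEGER,
        status_cd  CHAR(1)  COMPRESS ('A','C','P'),
        region_cd  CHAR(2)  COMPRESS ('NE','SE','MW','WE')
    ) PRIMARY INDEX (order_id);
    .LOGOFF;
    .QUIT;
    EOF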
Confidential, Riverwoods, IL
Teradata/Informatica Developer
Environment: Teradata 13.10/14.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Informatica PowerCenter 9.5.1/9.6.1, UNIX, Autosys, WinSCP, SQL, PL/SQL, Workload Manager, MS Access.
Responsibilities:
- Performed query optimization with the help of Explain plans, collected statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
- Created shell scripts to send data to downstream systems using FTP/SFTP (see the SFTP sketch at the end of this list).
- Created Autosys JIL scripts and parameter files, and automated the ETL process by scheduling jobs using Autosys (see the JIL sketch at the end of this list).
- Extensively worked on design, development and unit testing of ETL code and Teradata load utilities such as MultiLoad, BTEQ and FastLoad that interfaced with mainframe technology.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Expertise in writing scripts for data extraction, transformation and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, TPump and the Teradata Parallel Transporter utility (TPT).
- Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor.
- Worked on Change Data Capture (CDC) using a CHKSUM routine to detect changes when no flag or date column is present to identify changed rows (see the hash-comparison sketch at the end of this list).
- Worked on reusable "tie-out" code to maintain data consistency: it compares source and target after ETL loading completes to validate that no data was lost during the ETL process.
- Worked on workflow tasks such as Session, Event-Raise, Event-Wait, E-mail, Command and Worklet, and on scheduling of workflows.
- Implemented Teradata MERGE statements to update huge tables, improving application performance (see the MERGE sketch at the end of this list).
- Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Worked on exporting data to flat files using Teradata FastExport (see the FastExport sketch at the end of this list).
- Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
- Applied in-depth knowledge of the Teradata cost-based query optimizer to identify potential bottlenecks.
- Used Joiner, Expression, connected and unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router and Sequence Generator transformations.
- Developed, documented and executed unit test plans for the components.
- Documented the developed code and ran the sessions and workflows while keeping track of source and target row counts.
- Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process.
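A hypothetical downstream-delivery script using sftp in batch mode; the host, user and paths are placeholders:
    #!/bin/ksh
    # Push an extract file to a downstream landing directory over SFTP.
    OUT_FILE=/data/out/claims_extract.dat
    # "-b -" tells sftp to read batch commands from stdin
    sftp -b - etl_user@downstream.host <<EOF
    cd /inbound/claims
    put $OUT_FILE
    bye
    EOF
    echo "sftp exited with return code $?"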
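A minimal Autosys JIL sketch piped into the jil command; the job, machine, owner and script names are placeholders:
    #!/bin/ksh
    # Define a command job that kicks off the nightly ETL wrapper script.
    jil <<'EOF'
    insert_job: edw_daily_load
    job_type: c
    command: /app/etl/scripts/run_daily_load.sh
    machine: etl_unix01
    owner: etluser
    start_times: "02:00"
    description: "Nightly EDW load"
    std_out_file: /app/etl/logs/edw_daily_load.out
    std_err_file: /app/etl/logs/edw_daily_load.err
    EOF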
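A change-detection sketch for the CDC work above; the actual CHKSUM routine is not shown in this resume, so this uses Teradata's built-in HASHROW over the non-key columns to illustrate the same idea (hypothetical tables):
    #!/bin/ksh
    # Flag staged rows whose non-key column hash differs from the target's.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    SELECT s.cust_id
    FROM stg_db.customer_stg s
    JOIN edw.customer_base  t
      ON t.cust_id = s.cust_id
    WHERE HASHROW(s.cust_name, s.addr_line, s.status_cd)
       <> HASHROW(t.cust_name, t.addr_line, t.status_cd);
    .LOGOFF;
    .QUIT;
    EOF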
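A hypothetical MERGE upsert of the kind described above; in Teradata the ON clause should cover the target's primary index:
    #!/bin/ksh
    # Update matching rows, insert new ones, in a single pass over the target.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    MERGE INTO edw.customer_base AS t
    USING stg_db.customer_stg   AS s
       ON t.cust_id = s.cust_id
    WHEN MATCHED THEN UPDATE
        SET cust_name = s.cust_name,
            status_cd = s.status_cd
    WHEN NOT MATCHED THEN INSERT
        (cust_id, cust_name, status_cd)
        VALUES (s.cust_id, s.cust_name, s.status_cd);
    .LOGOFF;
    .QUIT;
    EOF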
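A minimal FastExport sketch for the flat-file extracts above, with hypothetical objects and paths:
    #!/bin/ksh
    # Export a delimited result set to a flat file using multiple sessions.
    fexp <<'EOF'
    .LOGTABLE utl_db.fx_claims_log;
    .LOGON tdprod/etl_user,etl_pwd;
    .BEGIN EXPORT SESSIONS 4;
    .EXPORT OUTFILE /data/out/claims_extract.dat MODE RECORD FORMAT TEXT;
    SELECT TRIM(claim_id) || '|' || TRIM(CAST(claim_amt AS VARCHAR(20)))
    FROM edw.fact_claims;
    .END EXPORT;
    .LOGOFF;
    EOF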
Confidential, Charlotte,NC
Teradata/ETL Consultant
Environment: Teradata 14.10 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Informatica Power Center 9, UNIX, SQL, PL/SQL, Workload Manager, MS Access.
Responsibilities:
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
- Providing technical support and guidance to the offshore team to address complex business problems.
- Expertise in writing scripts for data extraction, transformation and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad and TPump.
- Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
- Analyzed large data sets by running Hive queries and Pig scripts.
- Created sessions and configured workflows to extract data from various sources, transform data, and load it into the enterprise data warehouse.
- Performed query optimization with the help of Explain plans, collected statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
- Dealt with initial, delta and incremental data as well as migration data loaded into Teradata.
- Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
- Developed new mappings and modified existing ones to meet new business requirements, loading into staging tables and then into target tables in the EDW; also created mapplets for reuse across mappings.
- Created data models for information systems by applying formal data modeling techniques.
- Strong expertise in physical modeling, with knowledge of primary, secondary, PPI and join indexes.
- Created Mappings and used transformations like Source Qualifier, Filter, Update Strategy, Lookup, Router, Joiner, Normalizer, Aggregator, Sequence Generator and Address validator.
- Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
- Created several Teradata SQL queries and reports from the data mart for UAT and user reporting, using SQL features such as GROUP BY, ROLLUP, RANK, CASE, UNION, subqueries, EXISTS, COALESCE and NULL handling (see the SQL sketch at the end of this list).
- Performed reverse engineering of physical data models from databases and SQL scripts.
- Worked on workflow tasks such as Session, Event-Raise, Event-Wait, E-mail, Command and Worklet, and on scheduling of workflows.
- Involved in tuning the mappings, sessions and the Source Qualifier query.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Managed all technical aspects of the ETL mapping process with other team members.
- Performed unit testing, created UNIX shell scripts, and provided on-call support.
- Created sessions and workflows to run with the logic embedded in the mappings.
- Actively participated in Scrum Meetings.
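A sketch of the SQL features above against a hypothetical sales mart:
    #!/bin/ksh
    # ROLLUP for subtotals/grand totals, then RANK as an ordered analytic function.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    -- ROLLUP adds subtotal and grand-total rows; COALESCE relabels their NULL keys
    SELECT COALESCE(region_cd, 'ALL')  AS region,
           COALESCE(product_cd, 'ALL') AS product,
           SUM(sales_amt)              AS total_sales
    FROM mart_db.sales_fact
    GROUP BY ROLLUP (region_cd, product_cd);
    -- Rank regions by total sales
    SELECT region_cd,
           SUM(sales_amt) AS total_sales,
           RANK() OVER (ORDER BY SUM(sales_amt) DESC) AS sales_rank
    FROM mart_db.sales_fact
    GROUP BY region_cd;
    .LOGOFF;
    .QUIT;
    EOF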
Confidential, Springfield,IL
Teradata/ETL Developer
Environment: Teradata 12.0/13.0, Informatica, BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, OBIEE 11g/10g, DB2, ERwin r7.3, IBM Mainframes MVS/OS, JCL, TSO/ISPF, ChangeMan, SPUFI, File-AID, COBOL, ZEKE, UNIX, FTP.
Responsibilities:
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Provided maintenance and support of Online and Batch Programs using COBOL, DB2, CICS, JCL.
- Extracted data from DB2 databases on the mainframe and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad and TPump (see the MultiLoad sketch at the end of this list).
- Architected and developed FastLoad and MultiLoad scripts; developed macros, stored procedures and BTEQ scripts that take a date range from the database to extract data (see the BTEQ sketch at the end of this list).
- Created JCL scripts for calling and executing BTEQ, FastExport, FastLoad and MultiLoad scripts.
- Wrote queries using SPUFI to extract data from various DB2 views for reporting purposes.
- Used SQL to query the databases, pushing as much of the data crunching as possible into Teradata with complex SQL.
- Converted table data from the DB2 region to the Teradata region using the FastLoad and MultiLoad utilities.
- Responsible for Coding, Unit Test Plans, Unit Test Results, Functional Testing and Regression Testing.
- Migrated mainframe DB2 data to Teradata for one of their critical applications.
- Synchronized all regions (PCR, Unit and System) while migrating changes from lower regions to acceptance.
- Wrote several DB2 Stored Procedure scripts to implement the business logic.
- Extensively involved in data transformation, validation, extraction and loading. Implemented various Teradata join types (inner, outer, self and cross joins) and join strategies (merge join, product join, nested join and row-hash join).
- Developed Teradata BTEQ scripts to implement business logic and worked on exporting data using Teradata FastExport.
- Wrote highly complex SQL to pull data from the Teradata EDW and create ad hoc reports for key business personnel within the organization.
- Created data models for information systems by applying formal data modeling techniques.
- Strong expertise in physical modeling, with knowledge of primary, secondary, PPI and join indexes.
- Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
- Performed reverse engineering of physical data models from databases and SQL scripts.
- Provided database implementation and database administrative support for custom application development efforts.
- Performance tuning and optimization of database configuration and application SQL by using Explain plans and Statistics collection based on UPI, NUPI, USI, and NUSI.
- Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Understood the entire functionality and major algorithms of the project and adhered to the company's testing process.
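A minimal MultiLoad sketch for the DB2-to-Teradata loads above; the layout, file and table names are assumptions:
    #!/bin/ksh
    # Apply an insert DML to a pipe-delimited DB2 extract via MultiLoad.
    mload <<'EOF'
    .LOGTABLE utl_db.ml_cust_log;
    .LOGON tdprod/etl_user,etl_pwd;
    .BEGIN IMPORT MLOAD TABLES edw.customer_base
        WORKTABLES  utl_db.cust_wt
        ERRORTABLES utl_db.cust_et utl_db.cust_uv;
    .LAYOUT cust_layout;
    .FIELD cust_id   * VARCHAR(10);
    .FIELD cust_name * VARCHAR(50);
    .DML LABEL ins_cust;
    INSERT INTO edw.customer_base (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    .IMPORT INFILE /data/in/db2_customer.dat
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;
    .END MLOAD;
    .LOGOFF;
    EOF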
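A date-range-driven BTEQ extract sketch with basic error handling, as described above; all names and paths are hypothetical:
    #!/bin/ksh
    # Shell variables supply the date range; BTEQ aborts with RC 8 on any SQL error.
    FROM_DT=2013-01-01
    TO_DT=2013-01-31
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_pwd;
    .EXPORT REPORT FILE = /data/out/acct_range.txt;
    SELECT acct_id, txn_dt, txn_amt
    FROM edw.fact_txn
    WHERE txn_dt BETWEEN DATE '$FROM_DT' AND DATE '$TO_DT';
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .EXPORT RESET;
    .LOGOFF;
    .QUIT 0;
    EOF
    echo "BTEQ finished with return code $?"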
Confidential, Stamford,CT
Informatica/Teradata Developer
Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Teradata, UNIX, Citrix, Toad, Putty, PL/SQL Developer
Responsibilities:
- Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL and Lookup (file and database) to develop robust mappings in the Informatica Designer.
- Worked on Teradata and its utilities - TPump and FastLoad - through Informatica; also created complex Teradata macros.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Exhaustively tested developed components.
- Worked on various enhancement activities and was involved in process improvement.
- Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, Workflow Monitor.
- Worked with PowerCenter versioning (check-in, check-out), querying the repository to retrieve specific objects, and maintaining object history.
- Worked on Ab Initio in order to replicate existing code in Informatica.
- Implemented Teradata MERGE statements to update huge tables, improving application performance.
- Created and managed PowerExchange directories such as the condense files directory and checkpoint directory.
- Developed wrapper shell scripts to call Informatica workflows using the pmcmd command, and created shell scripts to fine-tune the ETL flow of the Informatica workflows (see the pmcmd sketch at the end of this list).
- Involved in building tables, views and Indexes.
- Involved in ad hoc querying, quick deployment, and rapid customization, making it even easier for users to make business decisions.
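A hypothetical pmcmd wrapper of the kind described above; the service, domain, folder and workflow names are placeholders:
    #!/bin/ksh
    # Start a workflow and propagate its return code to the scheduler.
    INFA_USER=etl_user
    INFA_PWD=etl_pwd
    pmcmd startworkflow \
        -sv INT_SVC_DEV -d Domain_Dev \
        -u $INFA_USER -p $INFA_PWD \
        -f EDW_FOLDER -wait wf_daily_load
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "Workflow wf_daily_load failed with return code $RC"
        exit $RC
    fi
    echo "Workflow wf_daily_load completed successfully"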