
Teradata Developer Resume

Riverwoods, IL

SUMMARY:

  • 7+ years of experience as a Teradata Consultant in data warehousing on the Teradata database, covering design, development, testing, implementation, maintenance, and support of applications for the Banking, Financial, and Retail domains in environments such as Mainframes, Windows, and UNIX.
  • Extensive knowledge of Business Intelligence and Data Warehousing concepts, with emphasis on ETL and life cycle development.
  • Extensive experience with the Teradata Database, versions V2R6/V12/V13/V14.
  • ETL experience using Ab Initio.
  • Worked with Data Modelers, ETL staff, BI developers, and Business System Analysts in Business/Functional Requirements review and solution design.
  • Very strong knowledge of Teradata architecture and data model concepts.
  • Good knowledge of Application Programming Interfaces such as the ODBC and JDBC drivers for Teradata and the Teradata CLI.
  • Extensive experience with the Teradata utility pack. Very good understanding of the Teradata MPP architecture: Shared Nothing, Nodes, AMPs, BYNET, partitioning, primary indexes, etc. (see the DDL sketch after this list). Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands.
  • Very strong in Teradata SQL query optimization techniques.
  • Experience using OLAP tools such as Business Objects 6.0/XI R2 (Reports, Designer, Web Intelligence, InfoView, and Supervisor).
  • Used SAS SQL for retrieving data from single and multiple tables, creating and updating tables and views, and programming with the SQL procedure.
  • Worked with SAS ETL Studio to specify metadata for data sources and targets and to create jobs that specify how data is extracted, transformed, and loaded from sources to targets.
  • Very good at writing UNIX shell scripts.
  • Proficient in Data Modeling Techniques using Star Schema, Snowflake Schema, Fact and Dimension tables, RDBMS, Physical and Logical data modeling for Data Warehouse and Data Mart.
  • Expert in Business Modeling Techniques using Process Flow Modeling and Data Flow Modeling.
  • Strong MVS knowledge, JCL skills and IBM mainframe expertise.
  • Very good understanding of Software Development Life Cycle (SDLC).
  • Excellent communication skills in interacting with people at all levels across projects, and an active role in business analysis.
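
As context for the MPP architecture and data modeling bullets above, a minimal Teradata DDL sketch; the database, table, and column names are hypothetical:

    CREATE TABLE sales_dm.daily_sales            /* hypothetical mart table */
    (
        store_id   INTEGER NOT NULL,
        sale_date  DATE    NOT NULL,
        sale_amt   DECIMAL(18,2)
    )
    PRIMARY INDEX (store_id)                     /* drives hash distribution of rows across AMPs */
    PARTITION BY RANGE_N (sale_date BETWEEN DATE '2010-01-01'
                          AND DATE '2014-12-31'
                          EACH INTERVAL '1' MONTH);  /* enables partition elimination on date filters */

A non-unique primary index like this trades uniqueness enforcement for even distribution; a skewed PI column would overload a few AMPs.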

PROFESSIONAL EXPERIENCE:

Confidential, Riverwoods, IL

Teradata Developer

Responsibilities:

  • Interacted with the business analysts to understand the flow of the business.
  • Involved in gathering requirements from the business and design of physical data model.
  • Involved in Data Extraction, Transformation and Loading from source systems.
  • Involved in writing complex SQL queries based on the given requirements.
  • Loaded data into Teradata using FastLoad, BTEQ, FastExport, and MultiLoad (a FastLoad sketch appears after this list).
  • Wrote several Teradata BTEQ and TPT scripts for reporting purposes (see the BTEQ reporting sketch after this list).
  • Created test cases for unit testing, system integration testing, and UAT to check data quality.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Involved in the collection of statistics on important tables so that the Teradata Optimizer produces better plans.
  • Performance-tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes (see the tuning sketch after this list).
  • Developed Teradata macros that pull data from several sales tables, perform calculations and aggregations, and load the results into a results table (a macro sketch appears after this list).
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables
  • Performed Unit testing, Integration testing and generated various Test Cases.
  • Used several SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
  • Prepared job scheduling docs and the job stream list using Dollar U for code migration to test and production.
  • Created proper PIs, taking into consideration both planned access paths and even distribution of data across all available AMPs.
  • Developed various Ab Initio ETL graphs for data extraction from the Oracle database.
  • Extracted and loaded monthly data for customers.
  • Coordinated the Acquisition System process, which is designed to handle high-volume loads of data into tables; it essentially creates a work table and loads it using the Teradata FastLoad utility.
  • Worked with various XML files, using Ab Initio to load them into tables.
  • Worked with the web services team on the interface for communication between EPRS, CE/SE, and Dynamic Profile client requests from the EDW.
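
A minimal FastLoad sketch of the kind referenced above; the TDPID, credentials, file path, and staging table are hypothetical:

    LOGON tdpid/etl_user,password;               /* hypothetical TDPID and credentials */
    SET RECORD VARTEXT "|";                      /* pipe-delimited input file          */
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60)),
           open_dt   (VARCHAR(10))
    FILE = /data/in/customers.txt;
    BEGIN LOADING stage_db.cust_stg              /* empty staging table, no SIs        */
          ERRORFILES stage_db.cust_err1, stage_db.cust_err2;
    INSERT INTO stage_db.cust_stg (cust_id, cust_name, open_dt)
    VALUES (:cust_id, :cust_name, :open_dt);
    END LOADING;                                 /* phase 2: rows sorted and applied   */
    LOGOFF;

FastLoad requires an empty target table with no secondary indexes, which is why it is typically pointed at a staging table like this one.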
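
A BTEQ reporting sketch in the spirit of the bullets above, combining GROUP BY ROLLUP, CASE, and COALESCE; all object names are hypothetical:

    .LOGON tdpid/rpt_user,password
    .EXPORT REPORT FILE = /reports/monthly_sales.txt

    SELECT COALESCE(CAST(store_id AS VARCHAR(12)), 'ALL STORES') AS store
         , SUM(CASE WHEN EXTRACT(MONTH FROM sale_date) = 12
                    THEN sale_amt ELSE 0 END)                    AS december_sales
         , SUM(sale_amt)                                         AS total_sales
    FROM   sales_dm.daily_sales                  /* hypothetical table from the summary sketch */
    GROUP  BY ROLLUP (store_id);                 /* ROLLUP adds a grand-total row (NULL store)  */

    .EXPORT RESET
    .LOGOFF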
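
A tuning sketch for the EXPLAIN and statistics bullets above, against the same hypothetical table:

    /* Inspect the optimizer's plan before and after collecting statistics */
    EXPLAIN
    SELECT store_id, SUM(sale_amt)
    FROM   sales_dm.daily_sales
    WHERE  sale_date >= DATE '2013-01-01'
    GROUP  BY store_id;

    /* Give the optimizer column demographics for join and filter planning */
    COLLECT STATISTICS ON sales_dm.daily_sales COLUMN (store_id);
    COLLECT STATISTICS ON sales_dm.daily_sales COLUMN (sale_date);

Stale or missing statistics are a common cause of bad join plans, which is why collection was scheduled rather than left ad hoc.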
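
A macro sketch matching the sales aggregation bullet above; the results table, columns, and parameter are hypothetical:

    REPLACE MACRO sales_dm.monthly_store_agg (rpt_month DATE) AS (
        INSERT INTO sales_dm.sales_results (store_id, rpt_month, month_amt, txn_cnt)
        SELECT store_id
             , :rpt_month
             , SUM(sale_amt)                     /* monthly aggregate per store */
             , COUNT(*)
        FROM   sales_dm.daily_sales
        WHERE  sale_date BETWEEN :rpt_month AND ADD_MONTHS(:rpt_month, 1) - 1
        GROUP  BY store_id;
    );

    EXEC sales_dm.monthly_store_agg (DATE '2013-01-01');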

Environment: Teradata V13/V14, Oracle, MVS, XML, SAS 9.1, DB2, VBA Macros, Ab Initio, BTEQ, MLOAD, FLOAD, UNIX AIX, Business Objects XIR2, Affinium, Trillium, ERWIN

Confidential, Chicago, IL

Sr. Teradata Developer

Responsibilities:

  • Worked as a Data Modeler in the creation of the Enterprise Project Management (EPM) Data Mart for enterprise project-level reporting and analytics.
  • Conducted user requirement sessions to get the requirements for the EPM Analytics and reporting.
  • Had a walkthrough of the Prism (EPM vendor software) tables to better understand the attributes/elements presented in the software related to projects, project requests, service requests, and tasks.
  • Followed the enterprise standard of creating Normalized Standard Layer and Dimensional Presentation Layer for the Data Mart creation.
  • Did the data profiling using SQL and PL/SQL code to understand the data and relationships in the operational system (a profiling sketch appears after this list).
  • Created Logical and Physical Models for the EPM Standard Layer and Presentation Layer using Erwin Modeling tool and also created DDLs for DBA to create Oracle Structures.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, match and merge, creating DDL scripts, creating subject areas, publishing the model to PDF and HTML formats, and generating various data modeling reports.
  • Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions. Used Kimball methodologies for Star Schema Implementation.
  • Created Mapping Documents and ETL design documents for both Presentation Layer and Standard Layer. Followed Enterprise Level Metadata and Audit standards while creating design and source-to-target mapping documents.
  • Coordinated with ETL developers from the offshore partner in India to develop the Ab Initio ETL process that populates both the Standard and Presentation Layers. Reviewed the entire ETL process for performance tuning, audit, backfill, and go-forward runs. Did the integration testing in both layers, reviewed the audit checks, and validated the data against the operational EPM reports.
  • Handed over the production maintenance phase to Offshore Team.
  • Coordinated with the BO developer on the creation of the BO Universe, created templates for the daily/monthly reports, and helped create dashboards with useful KPIs.
  • Coordinated with BO developer to create several OLAP cubes for the dimensional analysis.
  • Conducted user sessions to help users understand the Presentation Layer structures and the cubes and reports available for analysis.
  • Created complete metadata (data lineage) from ETL to Reporting and loaded to Enterprise Metadata Repository.
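
A minimal data-profiling sketch of the kind described above; the staging schema and column names are hypothetical, and the same pattern runs against the Oracle source:

    /* Row count, cardinality, and null rate for a candidate key */
    SELECT COUNT(*)                                            AS total_rows
         , COUNT(DISTINCT project_id)                          AS distinct_projects
         , SUM(CASE WHEN project_id IS NULL THEN 1 ELSE 0 END) AS null_project_ids
    FROM   prism_stg.project_request;            /* hypothetical staging copy of a Prism table */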

Environment: Teradata-V12, MAINFRAMES MVS, TSO / ISPF / JCL, CA-7, BTEQ / BTEQWin, FastLoad, Visual Explain, Teradata Manager, SQL Assistant, Oracle 10g, PL/SQL

Confidential, Los Angeles, CA

Teradata Developer

Responsibilities:

  • Developed forward and backfill strategies to load data as per requirements. Created several DB2/UDB 7 database scripts to load backfill data to the Data Warehouse.
  • Worked within a team to populate Type I and Type II slowly changing dimension tables from several operational source files (a Type II sketch appears after this list).
  • Involved heavily in writing complex SQL queries based on the given requirements, such as complex DB2 joins, subqueries, stored procedures, macros, etc.
  • Created proper Teradata Primary Indexes (PIs), taking into consideration both planned access paths and even distribution of data across all available AMPs. Considering the business requirements and these factors, created appropriate Teradata NUSIs for fast, easy access to data.
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for several business users on an ad hoc and scheduled basis (see the volatile-table sketch after this list).
  • Involved in the collection of statistics on important tables so that the Teradata Optimizer produces better plans.
  • Performance-tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
  • Did the production support and maintenance of the existing Data Mart processes. Involved in enhancements to the existing Data Marts.
  • Extensively used DB access to DB2 and Oracle 9i via the SAS SQL pass-through facility. Extensively used SAS features such as data step, data _null_, sort, merge, append, transpose, tabulate, freq, means, summary, contents, copy, print, datasets, compare, report, format, proc sql, macros, symput, symget, %include, and ods to create various SAS reports.
  • Participated in unit and integration testing, and also participated in User Acceptance Testing (UAT).
  • Used existing Enterprise Balance and Control framework to do the audit checks with ETL processes.
  • Did SQL performance tuning by scanning through the EXPLAIN plans.
  • Involved in user training and support for data mining and reporting.
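
A simplified Type II sketch of the dimension loading described above; the table names are hypothetical, and the surrogate key is assumed to be an identity column:

    /* 1. Expire the current version of each changed customer */
    UPDATE dm.customer_dim
    SET    eff_end_dt = CURRENT_DATE - 1
         , curr_flag  = 'N'
    WHERE  curr_flag = 'Y'
    AND    cust_id IN (SELECT cust_id FROM stg.customer_delta);

    /* 2. Insert the new version with an open-ended effective range */
    INSERT INTO dm.customer_dim (cust_id, cust_name, eff_start_dt, eff_end_dt, curr_flag)
    SELECT cust_id, cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg.customer_delta;                   /* assumes the delta holds only changed/new rows */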
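
A BTEQ sketch of the volatile-table extracts mentioned above; all object names are hypothetical:

    .LOGON tdpid/rpt_user,password

    CREATE VOLATILE TABLE vt_hi_value AS (
        SELECT cust_id, SUM(bal_amt) AS total_bal
        FROM   edw.account                       /* hypothetical warehouse table */
        GROUP  BY cust_id
        HAVING SUM(bal_amt) > 100000
    ) WITH DATA
      PRIMARY INDEX (cust_id)
      ON COMMIT PRESERVE ROWS;                   /* keep rows for the life of the session */

    .EXPORT REPORT FILE = /extracts/hi_value_custs.txt
    SELECT c.cust_id, c.cust_name, v.total_bal
    FROM   vt_hi_value v
    JOIN   edw.customer c ON c.cust_id = v.cust_id;
    .EXPORT RESET
    .LOGOFF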

Environment: Teradata-V2R6, BTEQ / BTEQWin, FastLoad, MultiLoad, FastExport, Visual Explain, Queryman, Oracle 9i, PL/SQL, IBM-MAINFRAMES, TSO / ISPF, PeopleSoft, UNIX, Windows NT / 2000, NCR WorldMark 5200 MPP system with 16 CPUs and 4 nodes.

Confidential, Atlanta, GA

Teradata Developer

Responsibilities:

  • Conducted several JAD sessions with key business analysts to gather the requirements related to reports, KPIs, and data mining.
  • Performed extensive data profiling on the source data by loading a data sample into the database using the database load utilities.
  • Worked with Data Modelers to create a star schema model for the above subject areas and made sure that all the requirements can be generated from the models created.
  • Created High level and detail design documents for above data mart projects containing process flows, mapping document, initial and delta load strategies, Surrogate key generation, Type I and Type II dimension loading, Balance and control, Exception processing, Process Dependencies and scheduling.
  • Identified the required dependencies between ETL processes and triggers to schedule the Jobs to populate Data Marts on scheduled basis.
  • Did the performance tuning on the Ab Initio graphs to reduce the process time.
  • Designed and coordinated the backfill process to load 3 to 5 years of the data into Data Marts.
  • Participated in the evaluation of the Teradata DBMS to replace DB2/UDB 7 for the EDW. Created several large tables with real card portfolio data totaling 4 TB for the POC. With the help of the Teradata team in San Diego, created tables with the right primary index and partitioning on the 4-node Teradata system. Created several complex queries and ran them to get the performance measurements, compared the results with runs of the same queries on the UDB DB2 system, and presented the results to management.
  • Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts. Created proper Teradata Primary Indexes (PIs), taking into consideration both planned access paths and even distribution of data across all available AMPs. Considering the business requirements and these factors, created appropriate Teradata NUSIs for fast, easy access to data (a NUSI sketch appears after this list).
  • Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables to extract data for several business users on an ad hoc and scheduled basis.
  • Involved in the collection of statistics on important tables so that the Teradata Optimizer produces better plans.
  • Performance-tuned user queries by analyzing EXPLAIN plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
  • Implemented data-level, object-level, and package-level security in Framework Manager to achieve highly complex security standards.
  • Created master-detail reports, drill-through and custom prompting reports, and scheduled reports for efficient resource utilization.
  • Created query prompts, calculations, conditions, and filters; developed prompt pages and conditional variables; involved in testing and improving report performance.
  • Involved heavily in writing complex SQL queries based on the given requirements, such as complex DB2 joins, subqueries, stored procedures, macros, etc.
  • Helped business users by writing complex, efficient Teradata SQL to produce detailed extracts for data mining. Automated these extracts using BTEQ and UNIX shell scripting.
  • Involved in user training and support for data mining and reporting.
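
A NUSI sketch matching the indexing bullet above; table and column names are hypothetical:

    /* Non-unique secondary index to support frequent lookups by open date */
    CREATE INDEX idx_acct_open_dt (open_dt) ON edw.account;

    /* Statistics on the index help the optimizer decide when to use it */
    COLLECT STATISTICS ON edw.account INDEX idx_acct_open_dt;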

Environment: Teradata V2R6, Oracle9i, Teradata Priority Scheduler, Teradata SQL Assistant, FastLoad, MultiLoad, Teradata Administrator, BTEQ, MVS, UNIX, DB2, COBOL.

Confidential

ETL Developer

Responsibilities:

  • Involved in analysis of the Extraction, Transformation, and Loading (ETL) process.
  • Involved in creation of the mappings between sources and target tables.
  • Used different transformations such as Expression, Aggregator, Source Qualifier, Sequence Generator, Filter, Router, Update Strategy, and Lookup.
  • Created reusable transformations, sessions, and batches using Server Manager. Involved in debugging mappings and tuning them for better performance.
  • Involved in Unit testing as well as Functional testing, Performance Tuning of Queries.
  • Created staging tables, indexes, sequences, and views, and performed tuning such as analyzing tables and proper indexing.
  • Identifying and resolving the bottlenecks in source, target, transformations, mappings and sessions to improve performance.
  • Wrote reconciliation queries as per business requirements (a sketch appears below).
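
A minimal reconciliation sketch of the kind mentioned above; schema and table names are hypothetical:

    /* Compare row counts and amount totals between staging and target */
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
    FROM   stg.transactions
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
    FROM   dw.fact_transactions;

Any mismatch between the two rows flags a load problem before the business sees it.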

Environment: Informatica, Business objects, Oracle 9i, Windows NT, Teradata V2R6
