
Teradata Developer Resume


Durham, NC

Professional Summary

  • Over 6 years of IT experience and technical proficiency in Data Warehousing, combined with Data Analysis, Business Requirements Analysis, Application Design, Development & Testing, Data Profiling, Data Standardization & Quality Control.
  • Proficiency in Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimension Table design, Physical and Logical Data Modeling implementations.
  • Experience in using Teradata SQL Assistant, Teradata Administrator, PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump, on UNIX/Windows environments, and in running batch processes for Teradata.
  • Strong Data Warehousing experience in Application development & Quality Assurance testing using Informatica Power Center 9.1/8.6(Designer, Workflow Manager, Workflow Monitor), Power Exchange, OLAP, OLTP
  • Experience in creating complex Informatica Mappings using Source Qualifier, Expression, Router, Aggregator, Lookup, Normalize, and other transformations in Informatica and well versed in debugging an Informatica mapping using Debugger
  • Experience in Business Intelligence arena with Business Objects XI/6.5/5.
  • Experience in creating physical/logical data models, reverse engineering from RDBMS, forward engineering to create schemas, and complete compare using CA AllFusion ERwin 4.x.
  • Strong SQL experience in Coding and Query tuning. Well versed with Oracle SQL tools like TOAD, SQL developer and SQL Plus
  • Experienced in various Teradata utilities like FastLoad, MultiLoad, BTEQ, and Teradata SQL Assistant (a short BTEQ sketch follows this list).
  • Worked as a Teradata and ETL developer and ensured successful Data Warehouse implementations.
  • Loaded data using Teradata loader connections, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs.
  • Experience in designing error and exception handling procedures to identify, record and report errors.
  • Expert in debugging, troubleshooting, monitoring, and performance tuning.
  • Involved in Unit Test Case, Integration Test, User Acceptance Test (UAT) preparation and testing of the applications.
  • Hands on experience in using the MS Office Suite, MS Project plan, Visio, Erwin
  • Experience in setting up CVS modules and in checking files into and out of the CVS
  • Thorough experience in UNIX shell scripting.
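
A minimal BTEQ sketch of the utility usage above; the logon placeholders and the staging/target table names are hypothetical, chosen only to show the validation pattern:

    .LOGON tdpid/etl_user,password;

    /* Row-count reconciliation between staging and target */
    SELECT COUNT(*) FROM stage_db.customer_stg;
    SELECT COUNT(*) FROM dw_db.customer;

    /* Abort the batch with a non-zero return code if the last request failed */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;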

Technical Skills

Operating System

Windows NT/2000/XP, UNIX, Linux, Solaris

Languages

Java, SQL, PL/SQL

RDBMS

Oracle 11g/10g/9i/8.x, Teradata V2R5, V2R6, R12/R13.10

ETL

Informatica 9.1.x/8.6.1, Teradata SQL Assistant, BTEQ, FastLoad, FastExport, MultiLoad, TPump

Scripting

JavaScript, UNIX shell scripting

Server

Tomcat 5.x, Apache Web Server 2.x, BEA WebLogic 7.x

Tools

SQL*Loader, TOAD, PuTTY, WinSCP, MS Visio

Reporting

Business Objects 6.5, Cognos 8.0 BI

Scheduling tools

AutoSys, Tidal, Control-M

Professional Experience

Project #1

Feb ’11 - Present
Client: Confidential, Durham, NC
Title: Teradata Developer

Project Name: Frontier 13: Scalability, Transformation, Conversion (FTR 13 – STC)

Description: Frontier is one of the fastest-growing firms in the telecom industry. Frontier acquired 13 states of business from Verizon. The Frontier 13 (FTR 13) project deals with converting the Verizon data to Frontier standards and has three phases: Scalability, Transformation, and Conversion.
The overall purpose of the project is to capture raw information from the customer systems to use as a foundation for customer reporting.

Responsibilities:
As a Teradata/Informatica Developer, was responsible for:

  • Involved in requirements gathering and data gathering to support developers in handling the design specification.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Wrote and executed BTEQ scripts for validation and testing of sessions, for data-integrity checks between source and target databases, and for report generation.
  • Involved in loading data into Teradata from legacy systems and flat files using complex MLOAD and FASTLOAD scripts (see the MultiLoad sketch after this list).
  • Created Teradata external loader connections such as MLoad, MLoad Upsert and Update, and FastLoad while loading data into target tables in the Teradata database.
  • Created proper Primary Indexes (PIs), taking into consideration both planned access paths and even distribution of data across all available AMPs (see the DDL sketch after this list).
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created Mappings and scheduled Workflows using Informatica.
  • Loading data by using the Teradata loader connection, writing Teradata utilities scripts (Fastload, Multiload) and working with loader logs.
  • The mappings involved extensive use of transformations such as Aggregator, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator. Used the Debugger to test and fix mappings.
  • Tuned mappings using Power Center-Designer and used different logic to provide maximum efficiency and performance
  • Reduced Teradata space used by optimizing tables – adding compression where appropriate and ensuring optimum column definitions.
  • Performance tuned the workflows by identifying the bottlenecks in targets, sources, mappings, sessions and workflows and eliminated them.
  • Responsible for migration and production support.
  • Wrote hundreds of DDL scripts to create tables, views and indexes in the company Data Warehouse.
  • Prepared Unit Test Specification Requirements.
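
Illustrative of the MLOAD upsert loading described above; a minimal MultiLoad sketch in which the logon, input file, and table/column names are hypothetical:

    .LOGTABLE stage_db.ml_customer_log;
    .LOGON tdpid/etl_user,password;

    .BEGIN MLOAD TABLES dw_db.customer
           ERRORTABLES stage_db.et_customer stage_db.uv_customer;

    /* Layout of the pipe-delimited input file (VARTEXT requires VARCHAR fields) */
    .LAYOUT cust_layout;
    .FIELD in_cust_id   * VARCHAR(18);
    .FIELD in_cust_name * VARCHAR(60);

    /* Upsert: update the row if it exists, insert it otherwise */
    .DML LABEL upsert_cust
         DO INSERT FOR MISSING UPDATE ROWS;
    UPDATE dw_db.customer
       SET cust_name = :in_cust_name
     WHERE cust_id = :in_cust_id;
    INSERT INTO dw_db.customer (cust_id, cust_name)
    VALUES (:in_cust_id, :in_cust_name);

    .IMPORT INFILE /data/in/customer.dat
            FORMAT VARTEXT '|'
            LAYOUT cust_layout
            APPLY upsert_cust;

    .END MLOAD;
    .LOGOFF;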
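And a DDL sketch of the Primary Index and compression choices mentioned above; the table, columns, and compressed value lists are hypothetical, chosen only to show a NUPI plus multi-value column compression:

    CREATE SET TABLE dw_db.call_detail
    (
      acct_id    INTEGER NOT NULL,
      call_dt    DATE FORMAT 'YYYY-MM-DD' NOT NULL,
      call_type  CHAR(2) COMPRESS ('LD', 'LC', 'IN'),  -- multi-value compression on frequent codes
      state_cd   CHAR(2) COMPRESS ('NC', 'NY', 'TX'),
      call_secs  INTEGER COMPRESS 0                    -- compress the most common value
    )
    PRIMARY INDEX (acct_id);  -- NUPI: balances the planned access path against even AMP distribution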

Environment: Teradata R12/R13.10 (BTEQ, SQL Assistant, FastLoad, MultiLoad), Informatica Power Center 9.1/8.6, Oracle 11g, SQL, PL/SQL, UNIX shell scripts.

Project #2

Client: Confidential, NY Nov '09 – Jan '11
Title: Teradata Developer

Confidential is a leading global investment bank with a strong and profitable private-client franchise. Its businesses are mutually reinforcing. Leasing and asset finance transactions involve complex, long-term structures that are tailor-made to serve the specific legal and accounting requirements of the parties involved, as well as to provide for the general framework of the respective regions. To successfully close such transactions, AFL is made up of professionals with backgrounds in the banking, leasing, manufacturing, and consultancy sectors. This project was meant to hold raw data coming from the field, keep historical transactions, and satisfy the reporting needs of high-level management. It delivered a trusted data warehouse serving as a single fact store for those reporting needs.

Responsibilities:

  • Applied relational database theory and design, including logical and physical structures and data normalization techniques.
  • Involved in Designing and Development of logical and physical data models of systems that hold Terabytes of data.
  • Loaded data from several flat-file sources to the staging area using Teradata utilities such as MultiLoad, FastLoad, TPump, and FastExport.
  • Worked on tuning and troubleshooting the Teradata system at various levels.
  • Gathered requirements from the various systems' business users.
  • Created global and volatile temporary tables to load large volumes of data into the Teradata database (see the volatile-table sketch after this list).
  • Created, updated and maintained ETL technical documentation.
  • Built tables, views, UPIs, NUPIs, USIs, and NUSIs (see the index DDL sketch after this list).
  • Involved in SQL Tuning, Optimizer, Indexes, Table partitions, and clusters.
  • Written VBA macros with formulas and extracted Data and made Excel reports.
  • Written several Teradata BTEQ scripts to implement the business logic.
  • Interacting with SME’s and Business Analysts for clarifications.
  • Worked exclusively with Teradata SQL Assistant to interface with the Teradata database.
  • Designed and developed mapping, transformation logic and processes in Informatica for implementing business rules and standardization of source data from multiple systems into the data warehouse.
  • Performed query optimization with EXPLAIN plans, collected statistics, and worked with primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the migration process for Teradata scripts and shell scripts on the UNIX box.
  • Gained extensive experience in the Teradata data warehousing environment.
  • Involved in physical and logical design of the applications.
  • Worked with DBA’s for transition from Development to Testing and Testing to Production.
  • Used the Informatica Designer to develop processes for extraction, cleaning, transforming, and integrating.
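
A minimal sketch of the volatile-table and statistics work noted above (all names hypothetical); breaking a complex query into a volatile intermediate keeps the result session-local and gives the optimizer a smaller join to plan:

    CREATE VOLATILE TABLE vt_acct_bill AS
    (
      SELECT acct_id, SUM(bill_amt) AS tot_bill
      FROM   dw_db.billing
      GROUP  BY acct_id
    )
    WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;   -- keep the rows for the rest of the session

    /* Refresh statistics on the base join column, then check the plan */
    COLLECT STATISTICS ON dw_db.customer COLUMN (cust_id);

    EXPLAIN
    SELECT c.cust_name, v.tot_bill
    FROM   dw_db.customer c
    JOIN   vt_acct_bill v ON c.cust_id = v.acct_id;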
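And a sketch of the UPI/USI/NUSI builds listed above, on a hypothetical table:

    CREATE TABLE dw_db.subscriber
    (
      sub_id   INTEGER NOT NULL,
      ssn      CHAR(9),
      state_cd CHAR(2)
    )
    UNIQUE PRIMARY INDEX (sub_id);                          -- UPI: one-AMP hashed access

    CREATE UNIQUE INDEX (ssn) ON dw_db.subscriber;          -- USI on a unique lookup column
    CREATE INDEX idx_state (state_cd) ON dw_db.subscriber;  -- NUSI for equality scans on state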

Environment: Teradata V2R6/R12 (FastLoad, FastExport, TPump, MultiLoad, BTEQ), Informatica 8.6.1, Oracle 10g, SQL Server, shell scripts, UNIX, Erwin 4.0/6.0, SAS, Business Objects 5.1, IBM AIX, Control-M.

Project #3

Client: Confidential, Dallas, TX May '08 – Oct '09
Title: Teradata Developer


VarTec is a telecom company providing local and long-distance service to residential and small-business customers. The objective of the project was to build a single repository for VarTec's business with a Teradata enterprise data warehouse.

Responsibilities:

  • Involved in complete SDLC (System Development Life Cycle).
  • Exported data from UNIX to the mainframe for backup.
  • Created Mainframe datasets, JCLs, Procedures and libraries.
  • Developed performance utilization charts, optimized and tuned SQL and designed physical databases. Assisted developers with Teradata load utilities and SQL.
  • Converted batch jobs from the BULKLOAD utility to the TPump utility.
  • Researched Sources and identified necessary Business Components for Analysis.
  • Gathered the required information from the users.
  • Interacted with different system groups for analysis of systems.
  • Created tables, views in Teradata, according to the requirements.
  • Created proper PIs, taking into consideration both planned access paths and even distribution of data across all available AMPs.
  • Implemented slowly changing dimensions methodology to keep track of historical data.
  • Used Teradata utilities such as MultiLoad, TPump, and FastLoad to load data into the Teradata data warehouse from Oracle and DB2 databases.
  • Imported data from various Sources (DB2, Oracle, SQL Server), created mappings using Informatica Power Center 8.1 Designer (Source analyzer, Warehouse developer, Transformation developer, Mapplet designer, and Mapping designer).
  • Designed and developed Informatica mappings, mapplets, sessions, and workflows based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables. Used transformations like Joiner, Expression, Connected/Unconnected Lookup, Filter, Aggregator, Rank, Update Strategy, Router, and Sequence Generator.
  • Developed and scheduled Workflows using workflow designer, worklet designer in Workflow manager and monitored the results in Workflow monitor. Scheduled batch and sessions within Informatica using Informatica scheduler.
  • Involved in unit testing of mappings and mapplets, as well as integration testing and user acceptance testing.
  • Used PowerConnect to access data from mainframes.
  • Wrote wrapper shell (.ksh) scripts to unload and load data (serial/parallel) between flat files and the Teradata database.
  • Wrote conversion code per the business logic using BTEQ scripts.
  • Loaded data into the Teradata database using load utilities (FastExport, FastLoad, MultiLoad, and TPump).
  • Performance-tuned user queries.
  • Worked with the NCR GSC to resolve problems and tickets.
  • Wrote many Teradata macros to reduce channel traffic, simplify execution of frequently used SQL operations, and improve performance (see the macro sketch after this list).
  • Automated routine tasks using HP-UX shell scripts (shell scripting, awk, sed).
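
Illustrative of the Teradata macros mentioned above (a minimal sketch; the database, macro, and column names are hypothetical). A macro stores parameterized SQL on the server, so a frequent operation travels over the channel as one short EXEC request instead of the full statement text:

    REPLACE MACRO dw_db.get_acct_usage (in_acct_id INTEGER, in_from_dt DATE) AS
    (
      SELECT call_dt, SUM(call_secs) AS tot_secs
      FROM   dw_db.call_detail
      WHERE  acct_id = :in_acct_id
        AND  call_dt >= :in_from_dt
      GROUP  BY call_dt
      ORDER  BY call_dt;
    );

    /* One short request replaces the full SQL text on the channel */
    EXEC dw_db.get_acct_usage (1001, DATE '2009-01-01');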

Environment: Teradata V2R6 (BTEQ, MultiLoad, FastLoad, TPump), Informatica Power Center 8.1, DB2, UNIX, Windows NT.

Project #4

Confidential, India Jan '06 – Apr '08
Client: Confidential
Title: Teradata/Informatica Developer

Confidential was developed to analyze the effectiveness of marketing campaigns for the incentives and offerings of VODAFONE. The ETL process involved extracting data from SQL Server and Oracle sources and loading it into the Teradata Data Warehouse. The data was used to analyze customer opinions, loyalty, customer profiles, and customer satisfaction.

Responsibilities:

  • Used external loaders such as MultiLoad, TPump, and FastLoad to load data into the Teradata database. Wrote Teradata BTEQ and FastExport scripts to export data to various data marts (see the FastExport sketch after this list).
  • Designed ETL process flows for entire DWH application and developed data mapping spreadsheets to define transformation rules for each stage of ETL process.
  • Developed Informatica ETL code using various mappings and transformations for transport of data from legacy extract files to data mart as per the business requirements.
  • Developed reusable transformations and mapplets.
  • Worked with pmcmd to interact with the Informatica server.
  • Developed Shell scripts for Job Automation.
  • Responsible for migration of Informatica mappings, workflows between different environments.
  • Developed robust Informatica mappings and fine-tuned them to process millions of input records at the estimated throughput.
  • Developed Windows Batch scripts for setting customized commands.
  • Responsible for monitoring scheduled, running, completed, and failed sessions. Involved in debugging failed mappings and developing error-handling methods.
  • Extracted data from Oracle, SQL Server, Flat files, and DB2 source systems.
  • Involved in Unit testing and Integration Testing of the developed code.
  • Responsible for four types of performance tuning: mapping level, session level, source level, and target level.
  • Coordinated with the Actuate reporting team in modifying existing Informatica ETL code per new business requirements.
  • Applied concepts involving different types of indexes, joins, space usage, etc., for optimum Teradata performance.
  • Developed BTEQ/FastLoad/MultiLoad scripts for loading.
  • Designed and created several BTEQ scripts to apply business rules and manipulate and/or massage the data according to the requirements.
  • Designed/Developed scripts to move data from the staging tables to the target tables.
  • Designed the execution order of the jobs and scheduled them.
  • Fine-tuned SQL to optimize performance, spool space usage, and CPU usage.
  • Responsible for migration and production support.
  • Wrote hundreds of DDL scripts to create tables, views and indexes in the company Data Warehouse.
  • Provided on-call troubleshooting support for all applications run in the west coast data center, with responsibility for all applications involved in Data Warehouse Services.
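
A minimal FastExport sketch of the data-mart export scripts mentioned in the first bullet above; the logon, output path, and query are hypothetical:

    .LOGTABLE stage_db.fexp_resp_log;
    .LOGON tdpid/etl_user,password;

    .BEGIN EXPORT SESSIONS 4;

    .EXPORT OUTFILE /data/out/campaign_resp.dat
            MODE RECORD FORMAT TEXT;

    /* Cast columns to fixed-width CHAR so the TEXT records line up */
    SELECT CAST(campaign_id AS CHAR(10)),
           CAST(resp_cnt    AS CHAR(12))
    FROM   dw_db.campaign_response;

    .END EXPORT;
    .LOGOFF;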

Environment: Informatica Power Center/PowerMart 6.x, Informatica Power Exchange, Sun Solaris, Teradata Database V2R5.x, NCR servers, Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump).

Education

  • B.Tech in Electronics and Communications
