
Teradata Developer/ETL Testing Resume


TX

SUMMARY

  • Over seven (7+) years of comprehensive experience in Data Warehousing, spanning requirements analysis, application design, data modeling, development, testing and documentation.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination and development with Teradata/Oracle/SQL Server based relational databases.
  • 6+ years of experience using Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities such as BTEQ, FastLoad, MultiLoad and FastExport, with exposure to TPump, on UNIX/Windows environments, running batch processes for Teradata CRM as well as data testing (a minimal BTEQ batch sketch appears after this list).
  • 2+ years of experience with Visual Studio (SSAS/SSIS) and Visual SourceSafe (VSS), building, processing and managing OLAP cubes.
  • Expertise in database programming: writing SQL, stored procedures, functions, triggers and views in Teradata, Oracle, DB2 and MS Access.
  • Experience extracting, transforming and loading data from other databases and text files using the SQL*Loader utility and SQL Server ETL tools.
  • Experienced with data warehousing applications in the financial services, banking, insurance and retail industries.
  • Experienced in creating complex mappings using various transformations and in developing Extraction, Transformation and Loading (ETL) strategies using Informatica.
  • Experience in writing UNIX shell scripts to check database status, space usage and performance. Extensive use of crontab in UNIX environments for scheduling routine tasks.
  • Developed UNIX shell scripts using Korn shell, awk, sed and standard UNIX commands to perform net-change (incremental) data loads.
  • Involved in all the stages of the software development cycle, namely design, specification, coding, debugging, testing (test plan and test execution), documentation and maintenance of the programs.
  • Experience in application development using system analysis, design and modeling techniques like Unified Modeling language (UML), Sequence diagrams, Case diagrams, Entity Relationship Diagrams (ERD).
  • Good team player who works efficiently across multiple teams and products, and adapts easily to new systems and environments.
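
A minimal BTEQ batch sketch of the kind referenced above; the logon file, staging table, columns and file paths are illustrative assumptions rather than actual project artifacts:

    /* logon string kept outside the script in a protected file */
    .RUN FILE = /home/etl/.tdlogon;
    .SET WIDTH 200;

    /* export yesterday's staged rows to a flat file for data testing */
    .EXPORT REPORT FILE = /data/out/acct_delta.txt;
    SELECT acct_id, acct_status, load_dt
    FROM   edw_stg.acct_stage
    WHERE  load_dt = CURRENT_DATE - 1;
    .EXPORT RESET;

    /* stop the batch with a non-zero return code if the export failed */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;

A non-zero return code from .QUIT lets a scheduler such as crontab or AutoSys flag the failed step.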

TECHNICAL SKILLS

Data Warehousing: Teradata V2R4/V2R5/V2R6, Oracle 11g/10g/9i, SQL Server 2008/2005/2000, MS Access.

Operating Systems: HP-UX, Sun Solaris, Red Hat Linux, Windows 2003/2000/XP/NT, IBM AIX.

Languages: SQL, PL/SQL, VB.Net, C, C++, UNIX shell scripting, PERL.

DB Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, SQL*Loader, Exp/Imp.

Scheduling Tools: AutoSys, Control-M, crontab (UNIX).

Tools: CSV, VSS, Arcmain, Teradata Administrator, Visual Explain, SQL Assistant, TOAD, PuTTY, WinSCP, Cygwin, Oracle Developer 2000, SQL*Plus, TFS

Others: MS Office, Informatica PowerCenter 7.x/8.x, Telnet, BI Tools, Adobe Acrobat Reader/Writer, HTML, UML, SFTP, FTP, SCP, TCP/IP.

PROFESSIONAL EXPERIENCE

Confidential, TX

Teradata Developer/ETL Testing

Responsibilities:

  • Coded complex, highly optimized SQL against the core EDW, built on a Third Normal Form Financial Services Physical Data Model, to populate denormalized data marts per requirements.
  • Designed, created and regularly tuned physical database objects (tables, views, indexes) to support normalized and dimensional models.
  • Requirements analysis, data assessment, business process reengineering.
  • Provide ongoing support by developing processes and executing object migrations, security and access privilege setup and active performance monitoring.
  • Index maintenance & analysis.
  • Worked with ETL leads to formulate the ETL approach and make appropriate use of Teradata Tools and Utilities.
  • Created and maintained Teradata databases, users, tables, views, macros, triggers and stored procedures.
  • Created and maintained user profile definitions to govern resource allocation and performance.
  • Coded Teradata SQL using analytical (OLAP) functions and BTEQ.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle), with exposure to TPump.
  • Used Visual SourceSafe (VSS) to version and retain the latest copies of modified scripts.
  • Used Visual Studio to manage OLAP cubes and performed various cube operations.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Worked on complex queries to map the data as per the requirements.
  • Populated and refreshed Teradata tables using FastLoad, MultiLoad and FastExport for user acceptance testing and for loading history data into Teradata (a FastLoad sketch appears after this list).
  • Designed delta (incremental) loads for all fact tables, as sketched after this list.
  • Created stored procedures and macros.
  • Involved in data testing and in performance tuning of long-running queries.
  • Worked extensively on Teradata Parallel Transporter (TPT) scripting.
  • Explored ways to optimize existing ETL processes and enhance their performance.
  • Worked in SSAS, processing OLAP cubes and adding dimensions to them.
  • Worked on SSIS, creating packages to execute SQL scripts.
  • Developed applications that move data from legacy systems and external data sources into a staging area and, after transformation, load it into the Teradata enterprise data warehouse; the tools used extensively were MultiLoad, FastLoad, FastExport and BTEQ.
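
A minimal FastLoad sketch for the staging/UAT loads mentioned above; the TDPID, credentials placeholder, table names, file path and delimiter are illustrative assumptions:

    /* credentials shown as a placeholder only */
    LOGON tdprod/etl_user,<password>;
    DATABASE edw_stg;

    /* FastLoad requires an empty target table and two empty error tables */
    DROP TABLE acct_stage_err1;
    DROP TABLE acct_stage_err2;

    /* pipe-delimited flat file; with VARTEXT each input field is VARCHAR */
    SET RECORD VARTEXT "|";

    DEFINE acct_id     (VARCHAR(18)),
           acct_status (VARCHAR(2)),
           load_dt     (VARCHAR(10))
    FILE = /data/in/acct_stage.dat;

    BEGIN LOADING acct_stage
          ERRORFILES acct_stage_err1, acct_stage_err2
          CHECKPOINT 100000;

    INSERT INTO acct_stage ( acct_id, acct_status, load_dt )
    VALUES ( :acct_id, :acct_status, :load_dt );

    END LOADING;
    LOGOFF;

Because FastLoad requires an empty target table, it suits staging and history (re)loads rather than incremental refreshes of populated tables.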
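
And a hedged sketch of the fact-table delta-load pattern, using a Teradata MERGE; the fact table, staging table and key columns are assumptions for illustration:

    /* upsert only the changed or new rows from the current staging load */
    MERGE INTO edw.fact_acct_balance AS tgt
    USING ( SELECT acct_id, balance_dt, balance_amt
            FROM   edw_stg.acct_balance_stage
            WHERE  load_dt = CURRENT_DATE ) AS src
       ON  tgt.acct_id    = src.acct_id
       AND tgt.balance_dt = src.balance_dt
    WHEN MATCHED THEN UPDATE
         SET balance_amt = src.balance_amt
    WHEN NOT MATCHED THEN INSERT
         ( acct_id, balance_dt, balance_amt )
         VALUES ( src.acct_id, src.balance_dt, src.balance_amt );

    /* keep the optimizer informed after each incremental load */
    COLLECT STATISTICS ON edw.fact_acct_balance COLUMN ( acct_id );

The ON clause assumes (acct_id, balance_dt) is the primary index of the fact table; Teradata's MERGE requires the ON condition to match the target's primary index on equality.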

Environment: Teradata 13, MVS, Teradata SQL Assistant 6.1, Teradata Administrator 6.0, Visual Studio, VSS, Teradata Utilities (MultiLoad, FastLoad, FastExport, BTEQ, ARC), SSAS, SSIS.

Confidential, Pleasanton, CA

Teradata/ETL Testing Developer

Responsibilities:

  • Requirements analysis, data assessment, business process reengineering, and index maintenance and analysis; worked with ETL leads to formulate the ETL approach and make appropriate use of Teradata Tools and Utilities.
  • Created database schemas based on the logical models; experienced in planning the physical database and mapping data files.
  • Experience using modeling tools for logical and physical database design; maintained and updated database models, generated or modified database schemas, and supported data warehouse loading.
  • Highly proficient in writing loader scripts for BTEQ, MultiLoad, FastLoad and FastExport.
  • Designed code modules to support restartability and rerunnability while maintaining transaction integrity; optimized and tuned Teradata SQL to improve batch performance.
  • Responsible for the design and development of database models, including logical data modeling.
  • Used Visual SourceSafe (VSS) to version and retain the latest copies of modified scripts.
  • Used Visual Studio (SSAS) to manage OLAP cubes and performed various cube operations.
  • Used SQL Server 2005 to automate script runs for monthly loads.
  • Educated the team on Teradata visual tools such as Teradata Manager, Database Query Log (DBQL), TDQM and Visual Explain, on TPump usage in UNIX/mainframe environments, and on security topics such as access privileges and IP filtering.
  • Gained extensive knowledge of parallel jobs, server jobs and job sequences; used job sequence activities such as Execute Command, Nested Condition and Loop to line up parallel jobs, database commands and operating system commands.
  • Designed, modeled, developed and optimized complex work-in-progress (WIP) summary collections such as summary throughput by equipment and summary throughput by lot recipe.
  • Created a generic BTEQ script to load data into various target tables from flat files using a control-table mechanism (a sketch of the pattern follows this list).
  • Used Teradata Manager, Index Wizard and PMON utilities to improve performance.
  • Converted the data mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.
  • Developed various generic Perl scripts to generate schema files, SQL scripts and database scripts.
  • Implemented various purging techniques to free space in certain Teradata databases and to remove orphan records from collection tables (a small purge sketch also follows this list).
  • Provided support during the system test, Product Integration Testing and User Acceptance Testing.
  • Designed and Created the Business Objects Universes. Developed classes with Dimension, Detail & Measure Objects and specified the hierarchies.
  • Involved in configuring and installing Business Objects 6.5.
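
A hedged sketch of the control-table-driven BTEQ pattern mentioned above; the control table, feed name, staging table and file layout are assumptions for illustration, and in practice a wrapper process would substitute the file and table names from the control table into each generated run:

    /* control table that maps each flat-file feed to its target table */
    CREATE SET TABLE etl_ctl.load_control
    ( feed_name    VARCHAR(30)  NOT NULL,
      source_file  VARCHAR(200) NOT NULL,
      target_table VARCHAR(60)  NOT NULL,
      load_status  CHAR(1)      DEFAULT 'P',  /* 'P' = pending, 'C' = complete */
      last_load_ts TIMESTAMP(0) )
    UNIQUE PRIMARY INDEX ( feed_name );

    /* one generated instance of the generic BTEQ load */
    .RUN FILE = /home/etl/.tdlogon;

    .IMPORT VARTEXT '|' FILE = /data/in/acct_feed.dat;
    .QUIET ON
    .REPEAT *
    USING ( in_acct_id VARCHAR(18), in_status VARCHAR(2) )
    INSERT INTO edw_stg.acct_stage ( acct_id, acct_status, load_dt )
    VALUES ( :in_acct_id, :in_status, CURRENT_DATE );

    /* mark the feed complete so reruns skip it */
    UPDATE etl_ctl.load_control
    SET    load_status = 'C', last_load_ts = CURRENT_TIMESTAMP(0)
    WHERE  feed_name = 'ACCT_FEED';

    .IF ERRORCODE <> 0 THEN .QUIT 12;
    .LOGOFF;
    .QUIT 0;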
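
And a small sketch of the purge patterns referred to above; the retention window, work database and collection table names are illustrative only:

    /* age-based purge to reclaim perm space in a work database */
    DELETE FROM edw_work.acct_stage_hist
    WHERE  load_dt < CURRENT_DATE - 90;

    /* drop orphan rows from a collection table whose parent lot is gone */
    DELETE FROM wip.summary_throughput_lot
    WHERE  lot_id NOT IN ( SELECT lot_id FROM wip.lot_master );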

Environment: Teradata V2R5.0, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Visual Studio, Teradata SQL, Teradata EDW Roadmaps, Teradata Relationship Manager, SSIS, SSAS.

Confidential, TX

Teradata Developer

Responsibilities:

  • Implemented BTEQ and bulk-load jobs.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle).
  • Involved in database optimization.
  • Debugged problems encountered when migrating from Oracle to Teradata (conversion of data types, views, tables, etc.).
  • Created complex mappings to compare the Oracle and Teradata databases.
  • Developed FastLoad, MultiLoad and BTEQ scripts to load data from various data sources and legacy systems to Teradata.
  • Involved in the ongoing delivery of migrating client mini data warehouses and functional data marts from the Oracle environment to Teradata.
  • Used Workflow Manager to create, validate, test and run workflows and sessions, and to schedule them to run at specified times.
  • Created and enhanced Teradata stored procedures to generate automated testing SQL (a sketch appears after this list).
  • Analyzed production support documents to identify feasible run windows for jobs.
  • Made performance changes, such as plan caching, to speed the handling of transaction-processing requests.
  • Created Logical Data Model from the Source System study according to Business Requirements.
  • Create and execute test plans for Unit, Integration, and System test phases.
  • Automated related tasks by developing UNIX shell scripts used to maintain the core EDW.
  • Worked closely with Project Managers, Business Analysts, BI Architect, source system owners, Data Management/Data Quality team to ensure timely and accurate delivery of business requirements.
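
A hedged sketch of the kind of stored procedure used for the automated migration testing above; the staged Oracle extract, converted Teradata table and test-log table are assumptions for illustration:

    /* compare Oracle-extract and Teradata row counts for one load date
       and record the result for the testing team */
    CREATE PROCEDURE etl_ctl.check_acct_counts ( IN p_load_dt DATE )
    BEGIN
       DECLARE v_src_cnt DECIMAL(18,0);
       DECLARE v_tgt_cnt DECIMAL(18,0);

       /* rows staged from the Oracle extract for this load date */
       SELECT COUNT(*) INTO :v_src_cnt
       FROM   mig_stg.acct_oracle_extract
       WHERE  load_dt = :p_load_dt;

       /* rows landed in the converted Teradata table */
       SELECT COUNT(*) INTO :v_tgt_cnt
       FROM   edw.acct
       WHERE  load_dt = :p_load_dt;

       /* log the comparison so discrepancies can be researched */
       INSERT INTO etl_ctl.migration_test_log
       ( test_name, load_dt, src_rows, tgt_rows, run_ts )
       VALUES ( 'ACCT_ROW_COUNT', :p_load_dt, :v_src_cnt, :v_tgt_cnt,
                CURRENT_TIMESTAMP(0) );
    END;

A test harness or scheduled job would then call the procedure once per load date under test, for example CALL etl_ctl.check_acct_counts (DATE '2009-01-31');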

Environment: Informatica PowerCenter 7.1.3, Teradata V2R5.0, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Teradata SQL, Oracle 9i, UNIX Shell Scripting, Windows XP, TOAD, PL/SQL.

Confidential, Kansas City, KS

ETL Developer

Responsibilities:

  • Responsible for designing ETL strategy for both Initial and Incremental loads.
  • Responsible for writing Program specifications for developing mappings
  • Worked on collecting the data from pre-paid and post-paid source systems of Sprint Services.
  • Loaded data into Teradata using DataStage, FastLoad, BTEQ, FastExport, MultiLoad and Korn shell scripts.
  • Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata Physical Data Model.
  • Worked closely with the source team and users to validate the accuracy of the mapped attributes.
  • Troubleshot issues and created automatic script/SQL generators.
  • Sent statistics reports to senior management using pivot tables and Excel graphs.
  • Helped the reporting team by providing Teradata queries.
  • Fulfilled ad-hoc requests from management.
  • Analyzed specifications and identified the source data that needed to be moved to the data warehouse.
  • Involved in extracting, cleansing and transforming the data.
  • Responsible for writing PL/SQL procedures and functions to load the data marts.
  • Involved in tuning SQL queries for better performance; worked on database connections, SQL joins, views and aggregate conditions.
  • Knowledge of Ab Initio software for data analysis, batch processing and data manipulation.
  • Data Quality Analysis to determine cleansing requirements.
  • Involved in the performance tuning of Informatica mappings/sessions for large data files by increasing block size, data cache size, and commit interval.
  • Created mappings with different transformations, mapping parameters and variables
  • Used pmcmd commands in the UNIX scripts and responsible for development of test Cases for unit testing the Autosys jobs and UNIX scripts.
  • Worked on migration issues from development to testing environments and fixed the same.
  • Worked with the QA team to research on the issues they raised.
  • Responsible for project documentation.
  • Worked on preparing a summarization/aggregation table over the fact data (a small sketch follows this list).
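
A small sketch of that fact-summary pattern; the call-level fact table and the summary mart table are illustrative assumptions:

    /* monthly usage summary rolled up from the call-level fact table */
    CREATE TABLE mart.usage_monthly_smry AS
    ( SELECT  subscriber_id
           ,  EXTRACT(YEAR  FROM call_dt) AS usage_yr
           ,  EXTRACT(MONTH FROM call_dt) AS usage_mth
           ,  SUM(call_minutes)           AS total_minutes
           ,  COUNT(*)                    AS call_cnt
      FROM    edw.call_detail_fact
      GROUP BY 1, 2, 3
    ) WITH DATA
    PRIMARY INDEX ( subscriber_id );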

Environment: Informatica 6.1, Teradata V2R5, SFTP, BTEQ, Queryman, MultiLoad, FastExport, Oracle 9i, ERWIN, PL/SQL, Perl, SQL*Loader, Visual Basic, SAS, Business Objects, TOAD, Data Quality, Control-M, UNIX and Windows, SQL Server

Confidential

Teradata Developer

Responsibilities:

  • Developed Korn Shell scripts and complex SQL to load data into the data warehouse.
  • Analyzed test case results and documented test cases/plans.
  • Documented Program Specifications, Unit and Integration Test Specifications, Test Results.
  • Built the data movement processes that load data from DB2 into Teradata by developing Korn shell scripts with Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad, WinDDI and Queryman; reviewed and improved the design of the extract and load specifications.
  • Coded test SQLs and analyzed results (an example reconciliation query follows this list).
  • Developed Korn Shell scripts based on Load and Extract Specifications.
  • Addressed any testing discrepancies found in a timely manner.
  • Provided expertise and trained less experienced developers, creating the necessary training materials.
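
A hedged example of the kind of test SQL used to verify those DB2-to-Teradata loads; the staged extract and target tables are assumptions:

    /* rows present in the landed DB2 extract but missing from the target */
    SELECT cust_id, src_sys_cd
    FROM   stg.cust_extract
    MINUS
    SELECT cust_id, src_sys_cd
    FROM   edw.cust_dim;

    /* quick volume check by load date on the target table */
    SELECT load_dt, COUNT(*) AS row_cnt
    FROM   edw.cust_dim
    GROUP  BY load_dt
    ORDER  BY load_dt;

An empty result from the MINUS plus matching daily counts would indicate a clean load.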

Environment: Teradata V2R5, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, FTP, Toad, UNIX Shell Scripting
