Sr. Teradata Developer Resume
NC
PROFESSIONAL SUMMARY:
- 7+ years of experience in data migration and Enterprise Data Warehousing, including Teradata, UNIX and manual testing.
- Experience in Teradata database design, development, implementation and maintenance, mainly in large-scale Data Warehouse environments.
- Extensive experience in administration and maintenance of Dev, Stage, Prod and standby databases for DSS and Data Warehousing environments.
- Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance and support.
- Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
- Experience in different database architectures such as shared-nothing and shared-everything. Very good understanding of SMP and MPP architectures.
- Strong Teradata SQL, ANSI SQL coding skills.
- Extensively worked with the BTEQ, FastExport, FastLoad and MultiLoad Teradata utilities to export and load data to/from flat files.
- Worked on Moody’s RiskFrontier tool, which helps identify portfolio risks and opportunities that can improve strategic decision-making and performance in a banking environment.
- Hands-on experience in monitoring and managing the varying mixed workload of an active data warehouse using tools like PMON, Teradata Workload Analyzer, Teradata Dynamic Workload Manager and Teradata Manager.
- Extensively worked on Query tools like SQL Assistant, MS SQL Server, Aginity Netezza workbench and PLSQL Developer.
- Good knowledge of logical and physical modeling using Erwin. Hands-on experience in 3NF, Star/Snowflake schema design and de-normalization techniques.
- Extensively worked on query analysis and performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right Primary Index, scheduling collection of statistics, and adding secondary or join indexes.
- Proficient in converting logical data models to physical database designs in Data warehousing Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts and Data Analysis.
- Skillfully used the OLAP analytical power of Teradata with functions such as RANK, QUANTILE, CSUM, MSUM and GROUP BY GROUPING SETS to generate detailed reports for marketing teams (see the sketch after this summary).
- Extensively used derived tables, volatile tables and global temporary (GTT) tables in many of the BTEQ scripts.
- Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
- Experience in writing UNIX shell and PERL scripts to support and automate the ETL process.
- Experience in Oracle RDBMS Architecture.
- Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2, Oracle, and SQL Server and loaded into Oracle and Teradata DW.
- Involved in Unit Testing, Integration Testing and preparing test cases.
- Involved in production support activities 24/7 during on call and resolved database issues.
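The sketch below illustrates the volatile-table and OLAP-function usage referenced above; the database, table and column names (sales_dtl, acct_id, txn_amt) are placeholders, not actual project objects.

    /* Illustrative only: volatile work table plus OLAP-style ranking and cumulative sums */
    CREATE VOLATILE TABLE vt_sales AS
    (
      SELECT acct_id, txn_dt, txn_amt
      FROM   sales_dtl                                   -- hypothetical source table
      WHERE  txn_dt BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    SELECT acct_id,
           txn_dt,
           txn_amt,
           RANK() OVER (PARTITION BY acct_id ORDER BY txn_amt DESC) AS amt_rank,
           SUM(txn_amt) OVER (PARTITION BY acct_id ORDER BY txn_dt
                              ROWS UNBOUNDED PRECEDING)             AS cum_amt  -- ANSI equivalent of CSUM
    FROM   vt_sales;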
TECHNICAL SKILLS:
Databases: Teradata 13/14, MS-SQL Server, Netezza, Oracle.
DB Tools/Utilities: Teradata SQL Assistant 13/14, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager, Teradata Query Manager, Teradata Administrator, PMON, SQL Loader, TOAD 8.0, TeamViewer, Aginity Netezza Workbench, Moody’s Analytics (RiskFrontier).
Programming Languages: C, C++, SQL, PL/SQL, UNIX Shell and Perl Scripting.
ETL Tools: Ab Initio, Informatica, Big Data.
Data Modelling: Logical/Physical/Dimensional, Star/Snowflake, OLAP, ERWIN.
Scheduling Tools: Autosys, Crontab (UNIX)
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX, Z/OS.
PROFESSIONAL EXPERIENCE:
Confidential, NC
Sr. Teradata Developer/ Analyst
Responsibilities:
- Developed and maintained existing application portfolio and planned the project implementation.
- Developed and implemented the model changes for a work stream in an application.
- Worked on Moody’s RiskFrontier tool, which helps identify portfolio risks and opportunities that can improve strategic decision-making and performance in a banking environment.
- Actively managed the planning, organizing of activities for JIRA tickets.
- Organized meetings with the SMEs of dependent systems when changes were made to the existing system.
- Translated business requirements into system solutions as per Basel views.
- Managed and took lead in development to perform tasks in Enterprise Capital Management team (ECM).
- Developed BTEQ and load utility (FastLoad, MultiLoad, TPump) scripts as part of month-end release tickets.
- Created data manipulation and data definition scripts.
- Created and modified views and developed scripts to automate processes as per the requirements.
- Created UNIX scripts for various purposes such as FTP, archiving files and creating parameter files.
- Maintained the code repository using Subversion and handled production deployment along with troubleshooting activities.
- Developed scripts to load high volume data into tables using Fastload and Multiload utilities.
- Performed Teradata performance tuning via EXPLAIN plans, PPI, AJIs, indexes, collection of statistics or rewriting of the code.
- Involved in the analysis of the Issues and proposing the solution to the client.
- Involved in the analysis of test results, preparing test cases and test data, documenting the unit test cases and participated in end-end testing activities.
- Developed a front-end GUI (Control Panel) for work streams, through which business users can access the process and kick off runs during the month-end production process without developer involvement.
- Involved in process creation for the existing SABER to SABER2 migration, which is based on Quartz, Python, Netezza and Teradata databases.
- Analyzed data and implemented multi-value compression for optimal usage of space (see the sketch after this list).
- Involved in creating UNIX shell scripts/wrapper scripts used for scheduling jobs.
- Hands-on experience and verification during the Teradata 13.1 to Teradata 14.1 and JIRA upgrades.
- Involved in weekly meetings with the users for decision making for changing the existing programs for special processing.
- Developed the recommendations for continuous improvement in efficiency and effectiveness by following Bank's CAB Processes.
- Responsible for troubleshooting, identifying and resolving data issues. Worked with analysts to determine data requirements, identify data sources and provide estimates for task duration.
- Helped to document step by step process of running the calculator.
- Managed the offshore team, assigning daily tasks and organizing stand-up calls when required.
- Prepared detailed documents to be shared across the organization.
- Involved in 24x7 production support.
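A minimal sketch of the multi-value compression and statistics work described above, assuming hypothetical table and column names (ecm_db.acct_position, status_cd, region_cd) and placeholder compressed values:

    -- Illustrative DDL only: multi-value compression on low-cardinality columns
    CREATE MULTISET TABLE ecm_db.acct_position             -- hypothetical database/table
    (
      acct_id     DECIMAL(18,0) NOT NULL,
      as_of_dt    DATE          NOT NULL,
      status_cd   CHAR(2)       COMPRESS ('AC','CL','SU'), -- most frequent codes compressed
      region_cd   CHAR(2)       COMPRESS ('NE','SE','MW','WC'),
      balance_amt DECIMAL(18,2) COMPRESS (0)
    )
    PRIMARY INDEX (acct_id, as_of_dt);

    -- Statistics refreshed on a schedule so the optimizer has current demographics
    COLLECT STATISTICS ON ecm_db.acct_position COLUMN (acct_id);
    COLLECT STATISTICS ON ecm_db.acct_position COLUMN (as_of_dt);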
Environment: Teradata RDBMS 13, 14 (SQL Assistant, BTEQ, FastLoad, Teradata Administrator, MultiLoad, TPump, Power), UNIX (PuTTY, WinSCP 4.2.7, SSH Tectia), Quartz Desktop 2, MS SQL Server Management Studio 10.0.55, Oracle 11.6g, PL/SQL Developer 9.0, Notepad++ v6.3, Aginity Netezza Workbench 2.1, FTP, SFTP.
Confidential, GA
Sr. Teradata Developer
Responsibilities:
- Interacted with the Functional Analysts to understand the flow of the business.
- Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
- Development of scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
- Worked efficiently on Teradata Parallel Transporter and generated code.
- Generated custom JCL scripts for processing mainframe flat files and IBM DB2 data.
- Created various Teradata macros in SQL Assistant to serve the analysts.
- Responsible for trouble shooting, identifying and resolving data problems.
- Created proper PI taking into consideration both planned access and even distribution of data across all the available AMPS.
- Developed performance utilization charts, optimized and tuned SQL and designed physical databases, Teradata load utilities, SQL.
- Loaded and transferred large data from different databases into Teradata using MLoad and OLELoad.
- Created a series of Teradata macros for various applications in Teradata SQL Assistant (a representative sketch follows this list).
- Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
- Performance tuning for Teradata SQL statements using Teradata Explain command.
- Created several SQL queries and reports using the data mart for UAT and user reporting.
- Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
- Analyzing data and implementing the multi-value compression for optimal usage of space.
- Excellent experience in performance tuning and query optimization of the Teradata SQLs.
- Query Analysis using Explain for unnecessary product joins, confidence factor, join type, order in which the tables are joined.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
- Created test cases for Unit Test, System Integration Test and UAT to check the data.
- Performed scheduling techniques with ETL jobs using scheduling tools, jobs through pmcmd commands, based on the business requirement.
- Developed Shell Scripts for getting the data from source systems to load into Data Warehouse.
- Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
- Used PMON/Viewpoint to tune SQLs showing the highest AMP/IO skew on the Teradata server.
- Good exposure to onshore - offshore model.
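An indicative sketch of the kind of SQL Assistant macro described above; the macro, database and table names (edw_db.monthly_txn_summary, txn_detail) are assumptions for illustration only:

    -- Illustrative macro: parameterized monthly summary analysts can run from SQL Assistant
    REPLACE MACRO edw_db.monthly_txn_summary (in_month DATE) AS
    (
      SELECT store_id,
             COUNT(*)     AS txn_cnt,
             SUM(txn_amt) AS total_amt
      FROM   edw_db.txn_detail                            -- hypothetical EDW table
      WHERE  EXTRACT(YEAR  FROM txn_dt) = EXTRACT(YEAR  FROM :in_month)
        AND  EXTRACT(MONTH FROM txn_dt) = EXTRACT(MONTH FROM :in_month)
      GROUP  BY store_id;
    );

    EXEC edw_db.monthly_txn_summary (DATE '2013-06-01');  -- invoked by analysts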
Environment: NCR Teradata V13, BTEQ, DB2, Teradata SQL, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Viewpoint, PL/SQL, TOAD, ERWIN, Oracle SQL, JCL, UNIX, Shell scripting.
Confidential, DE
Teradata Developer
Responsibilities:
- Meetings with business/user groups to understand the business process and gather requirements. Extracted and analyzed the sample data from operational systems (OLTP system) to validate the user requirements. Created high level design documents.
- Loaded data into the Enterprise Data Warehouse using Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport and TPump in both mainframe and UNIX environments.
- Utilized BTEQ for report generation as well as for running batch jobs.
- Developed Teradata macros that pull data from several sales tables, perform calculations and aggregations, and load the output into a results table.
- Utilized the Teradata utilities to load data into the EDW from DB2 sources using JCL and COBOL scripts.
- Performed performance tuning of the existing Teradata SQL scripts for the OTL code.
- Utilized global temporary tables (GTTs) efficiently, reducing the run time of jobs (see the sketch after this list).
- Built tables with UPI, NUPI, USI and NUSI, along with macros and stored procedures.
- Well experienced in using Partition (Partition by key, partition by round robin) and Departition components (Concatenate, Gather, Interleave and Merge) to achieve data parallelism.
- Created common graphs to perform data conversions reusable across applications, using a parameterized approach with conditional DMLs.
- Troubleshot problems by checking sessions and error logs.
- Interacted with metadata to troubleshoot session related issues and performance issues, tuning of ETL Load process.
- Very good understanding of the several relational Databases such as Teradata, Oracle and DB2. Wrote several complex SQLs using sub queries, join types, temporary tables, OLAP functions etc.
- Successfully identified problems with the data, produced derived data sets, tables, listings and figures that analyzed the data to facilitate correction.
- Performed data manipulation by merging, appending, concatenating and sorting datasets.
- Extensively used the Teradata utilities like BTEQ, Fastload, Multiload, TPump, Fast Export, Teradata Parallel Transporter, DDL and DML Commands.
- Performance tuning for Teradata SQL statements using Teradata Explain command and Run stats.
- Created several Teradata SQL queries and reports using the data mart for UAT and user reporting. Used SQL features such as GROUP BY, ROLLUP, RANK, CASE, UNION, subqueries, EXISTS, COALESCE and NULL handling.
- Prepared Unit and Integration testing plans. Involved in Unit and Integration testing using the testing plans.
- Involved in creating UNIX shell scripts/wrapper scripts used for scheduling jobs.
- Involved in after implementation support, user training and data models walkthroughs with business/user groups.
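A hedged sketch of the global temporary table usage mentioned above; object names (edw_db.gtt_sales_stage, daily_sales) are hypothetical:

    -- Illustrative GTT: the definition persists in the dictionary, rows are private to each session
    CREATE GLOBAL TEMPORARY TABLE edw_db.gtt_sales_stage   -- hypothetical name
    (
      store_id  INTEGER,
      sale_dt   DATE,
      sale_amt  DECIMAL(15,2)
    )
    PRIMARY INDEX (store_id)
    ON COMMIT PRESERVE ROWS;

    -- Each job session stages its own slice of data, then aggregates into the results table
    INSERT INTO edw_db.gtt_sales_stage
    SELECT store_id, sale_dt, sale_amt
    FROM   edw_db.daily_sales
    WHERE  sale_dt = CURRENT_DATE - 1;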
Environment: Teradata 12, Erwin 7, Teradata Administrator, Teradata SQL Assistant, Teradata Visual Explain, BTEQ, MultiLoad, FastLoad, FastExport, MVS, UNIX Shell Scripts.
Confidential, Riverwoods, IL
Teradata Developer
Responsibilities:
- Involved in gathering business requirements, logical modelling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Transfer of large volumes of data using Teradata FastLoad, MultiLoad and T-Pump.
- Database-to-Database transfer of data (Minimum transformations) using ETL (Ab Initio).
- Fine tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.
- Creation of customized Mload scripts on UNIX platform for Teradata loads using Ab Initio.
- Sorted data files using UNIX Shell scripting.
- Fine tuning of Mload scripts considering the number of loads scheduled and volumes of load data.
- Used data profiler in ETL processes, Data integrators to ensure the requirements of the clients including checks on column property, column value, and referential integrity.
- Acted as a single resource with sole responsibility of Ab Initio - Teradata conversions.
- Extensively used Derived Tables, Volatile Table and Global Temporary tables in many of the ETL scripts.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Performance Tuning of sources, Targets, mappings and SQL queries in transformations.
- Created COBOL programs and worked on creating JCL scripts to extract data from Mainframes operational systems. Extracted data from mainframe DB2 tables.
- Created Primary Indexes (PI) for both planned access of data and even distribution of data across all the available AMPS.
- Created appropriate Teradata NUSI for smooth (fast and easy) access of data.
- Worked on exporting data to flat files using Teradata FastExport.
- Analyzed the Data Distribution and Reviewed the Index choices.
- In-depth expertise in the Teradata cost based query optimizer, identified potential bottlenecks.
- Worked with PPI Teradata tables and was involved in Teradata-specific SQL fine-tuning to increase performance of the overall ETL process (see the sketch after this list).
- Extensively used Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize and Denormalize.
- Prepared Unit and Integration testing plans.
- Involved in after implementation support, user training and data models walkthroughs with business/user groups.
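The sketch below illustrates the PPI and NUSI design work described in this list; the table and columns (dw_db.card_txn, card_nbr, mrch_cd) are placeholders rather than the client's actual objects:

    -- Illustrative PPI table: NUPI for access and even distribution, monthly range partitions on date
    CREATE MULTISET TABLE dw_db.card_txn                   -- hypothetical table
    (
      card_nbr  DECIMAL(18,0) NOT NULL,
      txn_dt    DATE          NOT NULL,
      mrch_cd   CHAR(4),
      txn_amt   DECIMAL(15,2)
    )
    PRIMARY INDEX (card_nbr)
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2010-01-01' AND DATE '2012-12-31'
                          EACH INTERVAL '1' MONTH);

    -- NUSI to support lookups by merchant code without a full-table scan
    CREATE INDEX (mrch_cd) ON dw_db.card_txn;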
Environment: Teradata 12, Ab Initio (GDE1.15, Co>Op Sys 2.15), Fastload, Multiload, FastExport, UNIX, Unix Shell scripts.
Confidential, IL
Teradata Developer
Responsibilities:
- Involved in database design/preparing SQL scripts to support the larger databases that involves terabytes of data.
- Coordinated with the database team to create the necessary data sources for PSG (Premier Services) and FA (Financial Accounts) using Metadata Utility.
- Involved in the design of complex campaigns for the Business users to accomplish different marketing strategies.
- Coordinated with the test team in the design of test cases and preparation of test data to work with different channels, and set up recency and timeout for the campaigns.
- Creation of BTEQ, Fast export, MultiLoad, TPump, Fast load scripts.
- Worked on complex queries to map the data as per the requirements.
- Participated in data model (Logical/Physical) discussions with Data Modelers and creating both logical and physical data models.
- Involved in investigating and resolving data issues across platforms and applications, including discrepancies of definition, format and function.
- Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, TPump, DDL commands and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the business analysts.
- Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata joins, stored procedures and macros.
- Extensively used Teradata SET tables, MULTISET tables, global temporary tables and volatile tables for loading/unloading (see the sketch after this list).
- Worked closely with CA7 schedulers to set up job streams through CA7 to run daily, weekly and monthly process jobs.
- Involved in creating UNIX Shell Wrappers to run the deployed Ab Initio scripts.
- Used Ab Initio components like Reformat, Scan, Rollup, Join, Sort, Partition By key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
- Developed complex Ab Initio XFRs and DMLs to derive new fields and meet various business requirements.
- Involved in providing 24/7 production support to various ETL applications.
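A minimal sketch of the SET/MULTISET and volatile table usage noted above, with hypothetical object names (camp_db.campaign_contact, contact_landing):

    -- Illustrative only: a SET table silently rejects exact duplicate rows; MULTISET allows them
    CREATE SET TABLE camp_db.campaign_contact              -- hypothetical objects
    (
      cust_id     INTEGER NOT NULL,
      camp_id     INTEGER NOT NULL,
      contact_dt  DATE
    )
    UNIQUE PRIMARY INDEX (cust_id, camp_id);

    -- Volatile work table used during a load/unload step; dropped automatically at session end
    CREATE VOLATILE MULTISET TABLE vt_contact_stage AS
    (
      SELECT cust_id, camp_id, contact_dt
      FROM   camp_db.contact_landing
    ) WITH DATA
    PRIMARY INDEX (cust_id)
    ON COMMIT PRESERVE ROWS;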
Environment: Teradata V2R6, Ab Initio (GDE 1.13, Co>Op Sys 2.13, EME), BTEQ, Fast Load, Multi Load, Fast Export, Teradata SQL Assistant, Teradata Visual Explain, CA7, UNIX, Windows.
Confidential
Teradata Developer
Responsibilities:
- Created UNIX shell scripts for FTP, merging files and sending success/failure mails.
- Created different types of reports, such as Master/Detail, Cross Tab and Chart.
- Used various Teradata Index techniques to improve the query performance and also used Teradata Explain to enhance the performance of incoming queries.
- Responsible for loading data into warehouse from different sources using Multiload and Fastload to load millions of records.
- Performed tuning and optimization of complex SQL queries using Teradata Explain. Created several custom tables, views and macros to support reporting and analytic requirements.
- Involved in creating Partitioned Primary Indexes, Secondary Indexes, Join Indexes and Hash Indexes for efficient access of data based on the requirements (see the sketch after this list).
- Performed data validation, data reconciliation on the target, source data.
- Prepared strategies for performance tuning and reusable components in the Ab Initio Co>Operating System.
- Extensively worked with the multi-file system and used Partition and Departition components.
- Involved in the preparation of the QA check list as per client provided standards.
- Performed coordination between the onsite and offshore teams.
- Acted as a key resource for support during the system, integration and user acceptance test phases.
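A hedged sketch of the join index work described in this last list; the aggregate join index and table names (rpt_db.ji_daily_sales, sales_detail) are illustrative assumptions:

    -- Illustrative aggregate join index: pre-aggregates detail rows so reporting queries
    -- can be satisfied without rescanning the base table
    CREATE JOIN INDEX rpt_db.ji_daily_sales AS             -- hypothetical names
      SELECT store_id,
             sale_dt,
             SUM(sale_amt) AS day_amt
      FROM   rpt_db.sales_detail
      GROUP  BY store_id, sale_dt
    PRIMARY INDEX (store_id);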