Teradata DBA Resume
Chicago, IL
PROFESSIONAL SUMMARY:
- 7+ years of IT experience, including 5 years of Teradata administration, analyzing business needs of clients, developing effective and efficient solutions, and ensuring client deliverables within committed deadlines.
- 4+ years of OLTP, ODS and EDW data modeling (logical and physical design, and schema generation) using Erwin, ER/Studio and other tools for Teradata, Oracle, DB2/UDB and MS SQL Server models & repositories.
- 3+ years administering large Teradata database systems in development, staging and production environments.
- 5 years of experience with Teradata 13/12/V2R6.2/V2R5, data load/unload utilities such as BTEQ, FastLoad, MultiLoad, TPump and FastExport, Teradata administrative utilities (Archive/Restore, Table Rebuild, CheckTable, Configuration, Reconfiguration, Filer, DIP), as well as OLAP, OLTP, ETL, BI, SyncSort for UNIX/Mainframes, and NCR 4300/5200 platforms.
- Proficient in Teradata V2R5 database design (conceptual and physical), Query optimization, Performance Tuning.
- 5+ years of experience implementing data warehouse and database applications with Ab Initio and Informatica ETL, along with data modeling and reporting tools, on Teradata, Oracle, DB2 and Sybase RDBMS.
- Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ and QueryMan).
- Extensive experience managing and leading complex data warehousing applications, including installations, integrations, upgrades and maintenance of ETL and data mart environments: Siebel 7.0/2000/5.5, Siebel Warehouse 6.3, Siebel EIM, OLAP, OLTP, Autosys, Control-M, Sybase, BI, data cleansing and data profiling tools.
- 4+ years of industry experience developing strategies for ETL (Extraction, Transformation and Loading) using the Ab Initio tool in complex, high-volume data warehousing projects on both Windows and UNIX.
- Strong hands-on experience with Ab Initio GDE (3.0/1.15/1.14/1.13) and Co>Operating System (3.0/2.15/2.14/2.13/2.12/2.11).
- Expert knowledge in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and Partitioning and De-partitioning components.
- Well versed with various Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
- 3 years of data cleansing experience using Trillium 7.6/6.5 (Converter, Parser & Geocoder), Firstlogic 4.2/3.6, UNIX shell scripting and SQL coding.
- Good Knowledge in Dimensional Data modeling, Star/Snowflake schema design, Fact and Dimensional tables, Physical and Logical data modeling.
- Experience with business intelligence reporting tools using Business Objects, Cognos and Hyperion.
- Experience in integrating various data sources.
- Experience in supporting large databases and troubleshooting problems.
- Experience in all phases of the SDLC, including system analysis, application design, development, testing and implementation of data warehouse and non-data warehouse projects.
TECHNICAL SKILLS:
Teradata Tools: ARCMAIN, BTEQ, Teradata SQL Assistant, Teradata Manager, PMON, Teradata Administrator.
ETL Tools: Ab Initio (GDE 3.0/1.15/1.14/1.13, Co>Op 3.0/2.15/2.14/2.13/2.12/2.11), Informatica PowerCenter 6.2/7.1/7.1.3, SSIS
DB Tools: SQL*Plus, SQL Loader, TOAD 8.0, BTEQ, Fast Load, Multiload, FastExport, SQL Assistant, Teradata Administrator, PMON, Teradata Manager
Databases: Teradata 13/12/V2R6.2/V2R5, Oracle 10g, DB2, MS SQL Server 2000/2005/2008, MS Access
Scheduling Tools: Autosys
Version Control Tools: Clear Case
Programming Languages: C, C++, Java, J2EE, Visual Basic, SQL, PL/SQL and UNIX Shell Scripting
Data Modeling/Methodologies: Logical/Physical/Dimensional, Star/Snowflake, ETL, OLAP, complete software development cycle, ERwin 4.0
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX
PROFESSIONAL EXPERIENCE:
Confidential, Chicago, IL
Teradata DBA
Responsibilities:
- Maintain and support Teradata architectural environment for EDW Applications.
- Generated monthly health reports to assess current performance management practices and recommend improved processes.
- Analyzed performance trends, projected future capacity needs, and identified bottlenecks to improve query performance. Strong experience with Teradata tools such as SQL Assistant, PMON, Teradata Administrator, Viewpoint server and the TARA backup server.
- Supported different application development teams with production support, query performance tuning, system monitoring, database needs and guidance.
- Involved in the physical data modeling of the MOST Retail and Non-Retail sales data for Abbott's pharmaceutical products.
- Created an Abbott Teradata standards and best practices document for the EDW and TD DBA teams to follow.
- Mentored Abbott's Oracle DBAs on Teradata best practices and the Teradata MPP architecture.
- Created several TD scripts for the ETL and DBA needs.
- Involved in the integration of Teradata with downstream consumption groups such as SAS, COGNOS and other BI groups.
- Mentored Abbott ETL teams on the correct usage of Teradata utilities such as TPT operators, FastLoad and MultiLoad.
- Involved in fixing and following up on integration issues between Teradata and the Informatica ETL tool.
- Involved in the migration of MOST Non-Retail application from Oracle to Teradata Database.
- Created MOST Non-Retail physical data model for Teradata.
- Created automated Teradata Macros, Teradata Procedures and Unix scripts to migrate initial data from Oracle to Teradata.
- Changed Informatica mappings to convert Oracle functions to Teradata.
- Changed Informatica sessions to point to Teradata TPT connections using Stream, Update and Load operators.
- Resolved the Teradata TPT connections issues and ODBC driver configuration issues.
- Did the Unit and Integration testing and migrated the code to Production.
- Used SQL to query the databases and performed as much processing as possible within Teradata, applying complex SQL query optimization (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
- Designed DDLs and efficient PIs along with identity keys to ensure even data distribution.
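A minimal sketch of this DDL pattern, using a hypothetical fact table (all names, types and identity settings are illustrative, not taken from the actual project):

```sql
-- Hypothetical fact table: an identity column as a surrogate key and a
-- NUPI on evenly distributed, frequently joined columns.
CREATE MULTISET TABLE sales_db.sales_fact
(
    sale_id    INTEGER GENERATED ALWAYS AS IDENTITY
               (START WITH 1 INCREMENT BY 1),
    store_id   INTEGER NOT NULL,
    product_id INTEGER NOT NULL,
    sale_date  DATE FORMAT 'YYYY-MM-DD',
    sale_amt   DECIMAL(18,2)
)
PRIMARY INDEX (store_id, product_id);
```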
- Assisted developers with coding and resolving join efficiency issues.
- Responsible for daily backup and restore to the Europe server after ETL loads complete on the US server.
- Developed statistics-collection macros and automated them to run on a defined frequency.
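One such scheduled statistics job might look like the following BTEQ sketch (the logon string, database and column names are placeholders):

```sql
.LOGON tdprod/dba_user,password_placeholder

-- Refresh statistics on the join and filtering columns of the fact table.
COLLECT STATISTICS ON sales_db.sales_fact COLUMN (sale_date);
COLLECT STATISTICS ON sales_db.sales_fact INDEX (store_id, product_id);

.LOGOFF
.QUIT
```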
- Highly successful in testing failover of nodes and vproc migration.
- Experienced in loading, archiving and restoring data based on user requests.
- Performance tuning, monitoring, UNIX shell scripting, and physical and logical database design.
- Worked with ETL users to provide access and create objects in the production environment.
- Worked on SQL tuning, releasing locks during backups, and using the Viewpoint server for monitoring.
- Controlled and tracked access to Teradata Database by granting and revoking privileges.
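A hedged example of the role-based grant/revoke pattern behind this (role, database and user names are hypothetical):

```sql
-- Grant rights once to a role, then control user access through the role.
CREATE ROLE etl_read_role;
GRANT SELECT ON sales_db TO etl_read_role;
GRANT etl_read_role TO etl_user01;

-- Withdraw access by revoking the role from the user.
REVOKE etl_read_role FROM etl_user01;
```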
- Planned releases, monitored performance and escalated further technical issues to Teradata.
Confidential, NY
Teradata DBA
Responsibilities:
- Involved in the day-to-day administration and architecture of the eMedNY Medicaid/Medicare data warehouse for the state of NY, housed on a Teradata system.
- Very good understanding of Medicare/Medicaid subject areas like Paid/Encounter/Denied Claims, Providers, Recipients etc.
- Involved in the enhancements to the eMedNY Physical data warehouse model.
- Involved in the day-to-day administration of a 4-node V2R6.2 TD production system and a 1-node V2R6.2 TD development system.
- Worked with ETL developers on a daily basis on ETL code promotions for the eMedNY Medicare/Medicaid data warehouse application.
- Monitored and performance-tuned eMedNY user ad hoc queries and standard reporting queries coming from the BI query tool.
- Implemented Join Indexes to improve the join performance of the major PPI and non-PPI fact tables. Eliminated unused secondary indexes to save around 200GB of space. Fixed the PI selection for some of the DW tables to improve performance.
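For illustration, an aggregate join index of this kind might be defined as below (table, column and index names are placeholders):

```sql
-- Pre-aggregate the fact table on the heavily joined dimension key.
CREATE JOIN INDEX dw_db.claim_by_provider_ji AS
SELECT provider_id, claim_month, SUM(paid_amt) AS total_paid
FROM   dw_db.claim_fact
GROUP BY provider_id, claim_month
PRIMARY INDEX (provider_id);

-- Unused secondary indexes are identified from access logs and dropped.
DROP INDEX claim_fact_nusi_1 ON dw_db.claim_fact;
```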
- Involved in the disaster recovery test of the eMedNY front-end application, which covers the TD eMedNY data warehouse application.
- Created several TD system usage reports for monitoring, capacity planning and higher management.
- Cleaned up unused profiles and roles and eliminated redundant user-level security rights.
- Implemented TD multi-value compression on major fact and dimension tables, saving around 1TB of space.
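A small sketch of the multi-value compression approach, with hypothetical claim-dimension columns and value lists:

```sql
-- The most frequent values of low-cardinality columns are compressed,
-- reclaiming perm space on large fact/dimension tables.
CREATE MULTISET TABLE dw_db.claim_dim
(
    claim_id     INTEGER NOT NULL,
    claim_status CHAR(10) COMPRESS ('PAID', 'DENIED', 'PENDING'),
    claim_type   CHAR(1)  COMPRESS ('I', 'P'),
    state_cd     CHAR(2)  COMPRESS ('NY', 'NJ', 'CT')
)
PRIMARY INDEX (claim_id);
```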
- Presently helping the new vendor for the eMedNY DW group with several data feeds and reports. Created TD scripts to provide sample form-and-fit data from 600 DW tables to the new vendor. Created scripts using TD stored procedures, BTEQ, FastExport and Unix to automate the extraction and secure transfer of 5 years of data, month by month, to the new vendor.
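One month of such a sample extract might be produced with a BTEQ export along these lines (the logon string, path, table and columns are placeholders):

```sql
.LOGON tdprod/extract_user,password_placeholder

-- Export a sample of one month's data to a flat file for the vendor.
.EXPORT REPORT FILE = /data/outbound/claim_fact_sample.txt
SELECT claim_id, claim_status, paid_amt
FROM   dw_db.claim_fact
WHERE  claim_month = DATE '2009-01-01'
SAMPLE 1000;

.EXPORT RESET
.LOGOFF
.QUIT
```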
- Fixed data-related issues in the existing DW using SQL and other TD scripts.
- Implemented Priority Scheduling to penalize resource-intensive queries and bad queries with missing join conditions. Created comparison reports showing the reduction in elapsed times for short queries and the increase in elapsed times for resource-intensive queries.
- Presently implementing TDQM to manage resources efficiently and give users better response times by preventing bad queries from entering the system and automatically demoting certain workloads.
Confidential
Teradata Application DBA
Responsibilities:
- Modification of views on Databases, Performance Tuning and Workload Management.
- Maintenance of access rights and role rights, Priority Scheduling, Dynamic Workload Manager, Database Query Log, Database Administrator, Partitioned Primary Index (PPI), multi-value compression analysis, usage collection and reporting of ResUsage and AmpUsage, security administration setup, etc.; led a team of developers working with different users on various complicated technical issues.
- Performed application-level DBA activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts using the Teradata Visual Explain utility.
- Performed Space Management for Perm & Spool Space.
- Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases and casting errors.
- Analyzed data and implemented multi-value compression for optimal usage of space.
- Query analysis using Explain for unnecessary product joins, confidence factors, join types, and the order in which the tables are joined.
- Collected multi-column statistics on all the non-indexed columns used during join operations and all columns used in residual conditions.
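An illustrative form of those statements (table and column names are placeholders):

```sql
-- Multi-column statistics on non-indexed join columns, plus single-column
-- statistics on a residual-condition column.
COLLECT STATISTICS ON dw_db.claim_fact COLUMN (provider_id, service_date);
COLLECT STATISTICS ON dw_db.claim_fact COLUMN (claim_status);
```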
- Granted and revoked object privileges to database users and roles.
- Very good understanding of Database Skew, PPI, Join Methods and Join Strategies, Join Indexes including sparse, aggregate and hash.
- Suggested tuning of DBS Control parameters such as FreeSpacePercent, CylindersSavedForPerm, DefragLowCylProd, MiniCylPackLowCylProd, etc.
- Extensively used the Teradata Analyst Pack, including Teradata Visual Explain, Teradata Index Wizard and Teradata Statistics Wizard.
- Extensively used derived tables, volatile tables and global temporary (GTT) tables in many of the ETL scripts.
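A minimal sketch of the volatile-table pattern used in such scripts (names and predicates are illustrative):

```sql
-- Session-local work table holding a filtered slice of the fact table;
-- it is dropped automatically at logoff.
CREATE VOLATILE TABLE vt_recent_claims AS
(
    SELECT claim_id, provider_id, paid_amt
    FROM   dw_db.claim_fact
    WHERE  service_date >= DATE '2009-01-01'
)
WITH DATA
PRIMARY INDEX (claim_id)
ON COMMIT PRESERVE ROWS;
```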
- Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
- Loaded flat files into the database using FastLoad and then used them in queries to perform joins.
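A minimal FastLoad sketch for such a flat-file load (logon, file path, table and column names are placeholders):

```sql
SESSIONS 4;
LOGON tdprod/load_user,password_placeholder;

-- Pipe-delimited input file loaded into an empty staging table.
SET RECORD VARTEXT "|";
DEFINE in_claim_id (VARCHAR(10)),
       in_paid_amt (VARCHAR(18)),
       FILE = /data/inbound/claims.txt;

BEGIN LOADING stage_db.claim_stg
      ERRORFILES stage_db.claim_stg_e1, stage_db.claim_stg_e2;
INSERT INTO stage_db.claim_stg (claim_id, paid_amt)
VALUES (:in_claim_id, :in_paid_amt);
END LOADING;
LOGOFF;
```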
- Developed new reports for end users and maintained existing ones. All the DSS and legacy system data is stored in Teradata tables.
- Used SQL to query the databases and performed as much processing as possible within Teradata, applying complex SQL query optimization (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
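Illustrative checks of this kind, with hypothetical table and column names: an EXPLAIN on a join, followed by a row-distribution query on the PI column.

```sql
-- Inspect the optimizer's join plan and confidence levels.
EXPLAIN
SELECT d.region_cd, SUM(f.sale_amt)
FROM   dw_db.sales_fact f
JOIN   dw_db.store_dim  d ON d.store_id = f.store_id
GROUP BY d.region_cd;

-- Check how evenly rows hash across AMPs for the chosen PI column.
SELECT HASHAMP(HASHBUCKET(HASHROW(store_id))) AS amp_no, COUNT(*) AS row_cnt
FROM   dw_db.sales_fact
GROUP BY 1
ORDER BY row_cnt DESC;
```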
- Used PMON and Teradata Manager to monitor the production system during the online day.
- Excellent experience in performance tuning and query optimization of the Teradata SQLs.
- Used UNIX shell scripts for automating tasks for BTEQ and other utilities.
- Used the NCR GNU Debugger (gdb) for crash dump analysis. Performed performance tuning, monitoring and index selection using PMON, Teradata Dashboard, Statistics Wizard, Index Wizard and Teradata Visual Explain to view the flow of SQL queries as icons and make join plans more effective and faster.
- Analyze root causes for failed batch jobs.
- Documentation of scripts, specifications and other processes.
Confidential
Teradata DBA
Responsibilities:
- Create and Maintain Teradata Databases, Users, Tables, Views, Macros and Stored Procedures using Teradata Administrator (WinDDI), SQL Assistant (Queryman), SQL DDL, SQL DML, SQL DCL, BTEQ, MLoad, Fastload, FastExport, TPUMP, Statistics Index and Visual Explain
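A hedged sketch of that kind of object creation (the owner hierarchy, names, sizes and password are placeholders):

```sql
-- Carve a child database out of a parent's perm space, then create a user.
CREATE DATABASE edw_stage FROM dbc AS
    PERM  = 50e9,
    SPOOL = 100e9;

CREATE USER etl_batch FROM edw_stage AS
    PERM = 0,
    PASSWORD = temp_password_placeholder,
    SPOOL = 50e9,
    DEFAULT DATABASE = edw_stage;
```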
- Supporting jobs running in production for failure resolution, tracking failure reasons and providing the best resolution in a timely manner.
- Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering both business requirements and these factors, created appropriate Teradata NUSIs for smooth (fast and easy) access of data.
- Working with OBIEE team to optimize various report run times by applying various TD performance techniques such as Partitioning, JIs, AJIs, USI and NUSI.
- The existing FT physical data model was built for even distribution by defining 10- to 20-column Primary Indexes without taking usage into consideration. Redesigned and optimized the Finance Transform physical DM by defining the right PIs and PPIs using the ELDM, and implemented multi-value compression and soft RIs where necessary.
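For illustration, a redesigned table of this kind might combine a usage-driven PI with a monthly PPI (names, dates and ranges are placeholders):

```sql
CREATE MULTISET TABLE fin_db.txn_fact
(
    account_id INTEGER NOT NULL,
    txn_date   DATE NOT NULL,
    txn_amt    DECIMAL(18,2)
)
PRIMARY INDEX (account_id)
-- Monthly range partitioning keeps date-bounded scans to a few partitions.
PARTITION BY RANGE_N (
    txn_date BETWEEN DATE '2008-01-01' AND DATE '2012-12-31'
    EACH INTERVAL '1' MONTH
);
```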
- Implemented various Teradata recommended best practices while defining Profiles, Roles, Alerts, Multi-Value Compression, and Backup.
- Created and implemented BAR strategy for the TD application and system databases.
- Created a Teradata standards and best practices document for the EDW and TD DBA teams to follow.
- Used Control-M to schedule backups, Collect Statistics, DBQL, AmpUsage and ResUsage jobs. Created and scheduled various Teradata-recommended performance, capacity, trending and activity reports, such as space, usage, access and security reports.
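One of these usage reports might be driven by a query such as the following (it assumes the standard DBC.AMPUsage view; the column selection is illustrative):

```sql
-- Aggregate CPU and I/O consumption by user, for capacity and trending reports.
SELECT  UserName,
        SUM(CpuTime) AS total_cpu_secs,
        SUM(DiskIO)  AS total_disk_io
FROM    DBC.AMPUsage
GROUP BY UserName
ORDER BY total_cpu_secs DESC;
```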
- Performed Teradata SQL troubleshooting and fine-tuning of scripts provided by analysts and developers.
- Working with Data Architecture group for the right approach on Integrated Data Warehouse environment using Manufacturing ILDM, integrated PDM and various Dependent Data marts and Semantic layers based on subject areas.
- Suggested tuning of DBS Control parameters such as MaxLoadTasks, DBSCacheThr, MaxParseTreeSegs, ReadAheadCount, FreeSpacePercent, CylindersSavedForPerm, DefragLowCylProd, MiniCylPackLowCylProd, etc.
- Used Teradata Manager and Teradata Administrator to Monitor and Manage Teradata Databases.
- Involved in various Operational Teradata DBA activities on daily basis.
- Performed application-level DBA activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts using the Teradata Visual Explain utility. Performed space management for perm and spool space.
- Interaction with different teams for finding failure of jobs running in production systems and providing solution, restarting the jobs and making sure jobs complete in the specified time window.
- Wrote Unix shell scripts to run ETL interfaces (BTEQ, MLoad, FastLoad and FastExport jobs).
- Used TSET to export information from target system to a specified file on a workstation
- Troubleshooting an export, import, or undo-import (cleanup) operation of a session using the TSET log file and summary reports
- Worked on exporting data to flat files using Teradata FastExport
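A minimal FastExport sketch for such a flat-file extract (the logtable, logon string, path and query are placeholders):

```sql
.LOGTABLE dba_util.fexp_txn_log;
.LOGON tdprod/extract_user,password_placeholder;
.BEGIN EXPORT SESSIONS 8;

-- Write the result set to a flat file; columns are cast to one text record.
.EXPORT OUTFILE /data/outbound/txn_fact.txt MODE RECORD FORMAT TEXT;
SELECT CAST(account_id AS VARCHAR(18)) || '|' ||
       CAST(txn_date   AS VARCHAR(10)) || '|' ||
       CAST(txn_amt    AS VARCHAR(20))
FROM   fin_db.txn_fact
WHERE  txn_date >= DATE '2012-01-01';

.END EXPORT;
.LOGOFF;
```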
- Analyzed the Data Distribution and Reviewed the Index choices
- In-depth expertise in the Teradata cost based query optimizer, identified potential bottlenecks
- Worked with PPI Teradata tables and was involved in Teradata specific SQL fine-tuning to increase performance of the overall ETL process
- Debugging and monitoring the code using GDB commands
- Built tables, views, UPIs, NUPIs, USIs and NUSIs.
- Design and manage troubleshooting/exceptional procedures such as handling hot AMPs, crash dumps, slowdowns, connectivity issues, etc.
- Interact with Teradata Global Support Centre for problem analysis and troubleshooting.
- Developed various Mappings to load the Fact and Dimension tables and set the order of the session to effectively utilize the tool’s ability to load different tables in parallel.
- Administered 40TB Teradata System with several ODS, EDW and BI applications accessing the data.
Confidential, Deerfield, IL
Teradata Developer/ETL Developer
Responsibilities:
- Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
- Used the following Ab Initio components in creating graphs: Dataset components (Input File, Output File, Lookup File and Intermediate File), Database components (Input Table, Output Table, Run SQL, Truncate Table), Transform components (Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan), Partitioning components (Broadcast, Partition by Expression, Partition by Key, Partition by Round Robin), and Gather Logs, Redefine Format, Replicate and Run Program components.
- Extensively used the Ab Initio tool’s feature of Component, Data and Pipeline parallelism.
- Configured the source and target database connections using .dbc files
- Used BTEQ and SQL Assistant (Query man) front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
- Implemented star-schema models for the above data marts. Identified the grain for the fact table. Identified and tracked the slowly changing dimensions and determined the hierarchies within the dimensions.
- Worked with DBA team to ensure implementation of the databases for the physical data models intended for the above data marts.
- Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering both business requirements and these factors, created appropriate Teradata NUSIs for smooth (fast and easy) access of data.
- Adopted an Agile software development methodology for the ETL of the above data marts.
- Created mapping documents from EDS to Data Mart. Created several loading strategies for fact and dimensional loading.
- Designed the mappings between sources (external files and databases) to Operational staging targets.
- Performed performance tuning of Teradata SQL statements using the Teradata Explain command.
- Created .dml files for specifying record format.
- Created checkpoints and phases to avoid deadlocks, tested the graphs with sample data, then committed the graphs and related files into the repository from the sandbox environment. Scheduled the graphs using Autosys and loaded the data into target tables from the staging area using SQL*Loader.
- Worked heavily with various built-in transform components to solve the slowly changing dimensional problems and creating process flow graphs using Ab Initio GDE and Co>Operating System.
- Analyzed the Data Distribution and Reviewed the Index choices
- Tuned Teradata SQL statements using Explain, analyzing data distribution among AMPs and index usage, collecting statistics and defining indexes.
- Extensively worked under the UNIX Environment using Shell Scripts.