Sr. Teradata Technical Specialist/Data Analyst Resume
NY
SUMMARY
- As a Teradata/Data Warehouse Consultant, I have provided leadership in developing enterprise-scale data warehouses that are closely aligned to business goals and objectives. I have built an experienced knowledge base with major DBMSs including Teradata, SQL Server, Oracle and Greenplum, on a foundation of around 9 years of computer software services in Teradata administration, performance monitoring/tuning, Teradata RDBMS internals, and data warehousing models and concepts, and have built working knowledge of the ETL/reporting tools Informatica, DataStage, Tableau and Cognos.
- Served as DBA Lead in Enterprise Data Warehouse (EDW) project meetings as well as EDW/Teradata infrastructure planning initiatives. Meetings covered Teradata database issues, project questions, security questions, and space and support issues. Participated in multiple software/hardware upgrades (software release upgrades, addition of nodes, hardware swaps), including preparation of upgrade project plans, change control documentation, and coordination and performance of pre- and post-upgrade tasks. Promoted consistency and efficiency among the DBA team through automation of routine tasks (Teradata security - CreateUserSQL macro; model implementation - database/views creation and load ID macros). Directed and controlled the activities related to database planning and development and the establishment of policies and procedures pertaining to its management, security, maintenance, and utilization. Performed security administration for the Teradata database.
- As a DBA lead, responsible for Teradata DBA coordination among the Teradata GSC teams and Datacenter CSR teams for all P1/P2/P3 issues. Tracked all Teradata security vulnerabilities and applied the relevant patches. Built a process with multiple checkpoints to identify risk, and handled various scenarios to avoid single points of failure at both the resource and system level.
PROFESSIONAL EXPERIENCE
Sr. Teradata Technical specialist/Data Analyst
Confidential
Responsibilities:
- Resolved various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD and FLOAD utilities and Unix shell scripts.
- Designed, developed, optimized and maintained database objects such as tables, views, macros, indexes and stored procedures.
- Worked on loading data from several flat file sources to staging using Teradata MLOAD, FLOAD and TPump.
- Worked on exporting data to flat files using Teradata FASTEXPORT.
- Query optimization (explain plans, collect statistics, Primary and Secondary indexes).
- Worked on SQL queries and tuned them to improve performance.
- Used volatile tables and derived queries to break complex queries into simpler ones (a sketch of this pattern follows this list).
- Monitored the ETL batch and tuned long-running ETL batch enrichment SQLs and reporting SQLs.
- Involved in developing Unit Test cases for the developed mappings.
- Used ETL methodology to support data extraction, transformation and loading processes.
- Involved in the logical and physical design of the database and creation of the database objects.
- Ensured data integrity and referential constraints for accurate database operations.
- Coded stored procedures, functions and ref cursors to store, retrieve and manipulate data from the database; encapsulated the functionality into packages for an easy interface to the front end.
- Built FastLoad and TPT scripts for full-refresh data loads.
- Involved in the performance tuning of the application through creation of necessary indexes.
- Fixed data quality-related issues in the ETL infrastructure.
- Released MultiLoad utility locks and dropped MultiLoad-related tables for aborted jobs.
- Fixed ETL shell scripts and deployed them to QA and PROD.
- Created a wrapper script to purge log files and added a 30-day retention period for them.
- Purged data from the QA environment to free up space.
- Refreshed the entire QA environment with production data for the SAP upgrade.
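The volatile-table pattern mentioned above, sketched with hypothetical table and column names (the actual queries were application-specific):

```sql
-- Illustrative sketch only: stg.daily_sales and dim.store are hypothetical names.
-- 1. Materialize the expensive intermediate result once in a volatile table.
CREATE VOLATILE TABLE vt_daily_sales AS
(
    SELECT store_id
         , sales_dt
         , SUM(sale_amt) AS total_amt
    FROM   stg.daily_sales
    WHERE  sales_dt BETWEEN DATE '2015-01-01' AND DATE '2015-01-31'
    GROUP BY store_id, sales_dt
)
WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

-- 2. Collect statistics so the optimizer estimates the join correctly.
COLLECT STATISTICS COLUMN (store_id) ON vt_daily_sales;

-- 3. Join the small pre-aggregated set instead of repeating the aggregation
--    inside every reporting query.
SELECT s.store_name
     , v.sales_dt
     , v.total_amt
FROM   vt_daily_sales v
JOIN   dim.store s
  ON   s.store_id = v.store_id;
```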
Teradata Lead Consultant
Confidential, NY
Responsibilities:
- Configured systems in EDW from end to end.
- Configured Teradata/Hadoop QueryGrid between the QA and PROD systems in both directions.
- Applied data protection including Transient Journal, Fallback and RAID.
- Configured an IntelliFlex system from end to end.
- Resolved deadlocks and handled DBQL maintenance, creating an archive DBQL system management database.
- Configured DBS Control settings for system configuration.
- Point of contact for the operations and engineering teams for all ETL/database-related outages.
- Point of contact for the Governors meeting monthly performance/issues deck.
- Teradata representative on Monthly/Quarterly/Yearly Audit meetings.
- Acted as Lead/PM for onsite and offshore L1/L2 teams. Established and built a streamlined process to ensure the team provided exceptional support to all application and project teams.
- Configured the MaxLoadAWT, MaxLoadTasks, DBQLFlushRate and related system fields.
- Took crash dumps and TSETs for Teradata support to analyze certain crashes.
- Configured Unity Director on all MS systems.
- Backup, Archive, and Recovery (BAR): designed, implemented, maintained and automated the appropriate jobs (cluster, single-stream and multi-stream) as required using TARA/NetBackup and BakBone NetVault.
- Worked with the application team to install Hadoop updates, patches and version upgrades as required.
- Customer representative for all Teradata SSM meetings for maintenance windows planning and approvals.
- Prepared blueprints and documented QueryGrid installation and testing on the Morgan Stanley Teradata and Hadoop systems.
- Approver for all PROD/non-PROD change tickets.
- Backup/Recovery (BAR) system lead Confidential.
- Responsible for L3 on-call escalations.
- Held weekly meetings with DBA team members.
- Involved in importing and exporting data between the RDBMS and HDFS using Sqoop.
- Completed the upgrade of the Morgan Stanley environment from Teradata 12 to Teradata 13.10 in the spring of 2014.
- Completed the upgrade of the Morgan Stanley environment from Teradata 13.10 to Teradata 14.10 in the summer of 2015.
- Completed the upgrade of the Morgan Stanley environments from Teradata 14.10 to Teradata 15.10 in the fall of 2016.
- Completed expansion of the Production and BCP systems from 10+5 nodes to 12+6 nodes.
- Coordinated with the vendor on data migration projects using NPARC.
- Led the company-wide disaster recovery drills on Teradata once yearly.
- Created Hive tables and analyzed the loaded data using Hive queries.
- Used Oozie job scheduler to automate the job flows.
- Provided 24/7 on-call support on a rotation basis.
- Extensive experience in Teradata workload management concepts such as TASM; worked with various user groups and developers to define TASM workload exceptions and implemented filters and throttles as required.
- Defined permanent space limits at both the database and user level for different business scenarios (a sketch of this kind of setup follows this list).
- Set up automated stats jobs in Stats Manager through Viewpoint.
- Worked on setting up PDCR 13.10/14.10/15.10 on Viewpoint.
- Set up the PDCR ten-minute, daily, weekly and yearly crontab jobs.
- Developed statistics macros and automated them to run at the required frequency.
- Designed the security model based on the business model and enforced security through roles.
- Involved in various operational Teradata DBA activities on a daily basis.
- Created and maintained Teradata databases, users, tables, views, macros and stored procedures using Teradata Administrator, SQL Assistant (Queryman), SQL DDL/DML/DCL, BTEQ, MultiLoad, FastLoad, FastExport, TPump, the Statistics and Index Wizards, and Visual Explain.
- Monitored sessions, nodes and vproc information, and aborted rogue queries in the system using Viewpoint.
- Created Entity Relationship Diagrams to document and organize logical process and data models of the corporate OLTP system using ERStudio and Erwin.
- Supported jobs running in production for failure resolution, tracking failure reasons and providing the best resolution in a timely manner.
- Managed space, including PERM, TEMP and SPOOL space, for users and databases; calculated capacity and forecast space needs proactively.
- Created and maintained technical documentation for all tasks performed, such as executing Pig scripts and Hive queries.
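A minimal sketch of the space-limit and role-based setup described above; the database names, space figures and password are placeholders, not the actual values used:

```sql
-- Illustrative sketch only: all names and sizes are hypothetical.
-- Parent database with explicit PERM/SPOOL limits and FALLBACK protection.
CREATE DATABASE edw_sales FROM dbc
    AS PERM = 500e9, SPOOL = 200e9, FALLBACK;

-- Profile to centralize spool/temp limits for a class of users.
CREATE PROFILE etl_profile AS SPOOL = 100e9, TEMPORARY = 50e9;

-- Role-based security: privileges go to the role, users are granted the role.
CREATE ROLE sales_read_role;
GRANT SELECT ON edw_sales TO sales_read_role;

-- Service user owning no data of its own, picking up the profile and role.
CREATE USER etl_sales FROM edw_sales
    AS PERM = 0,
       PASSWORD = temp_pwd_123,
       PROFILE = etl_profile;
GRANT sales_read_role TO etl_sales;
MODIFY USER etl_sales AS DEFAULT ROLE = sales_read_role;
```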
Sr. Teradata DBA/Modeler
Confidential, Jackson, MI
Responsibilities:
- Extensively worked on system utilities such as Ferret, PackDisk and Update Space.
- Analyzed data and implemented multi-value compression for optimal space usage and improved I/O reads (a compression and partitioning sketch follows this list).
- Implemented Block level compression on cold data.
- Extensively worked to identify bottlenecks across various aspects, such as data type and hash analysis, Latin vs. Unicode analysis, checking table skew across the system, and finding non-performing views based on their logic, as part of the performance tuning process.
- Provided performance tuning and physical and logical database design support in projects for Teradata systems.
- Participated in requirement-gathering sessions with business users and sponsors to understand and document the business requirements as well as the goals of the project.
- Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
- Analyzed the source systems to understand the source data and structure, along with a deeper understanding of business rules and data integration checks.
- Identified the facts and dimensions from the source systems and business requirements to be used for the data warehouse.
- Created the dimensional logical model with approximately 10 facts and 30 dimensions with 500 attributes using ER Studio.
- Created multiple sub-models, a helpful aid in a project dealing with large data models.
- Generated source code from database designs and constructed graphical models from existing databases or schemas using the forward and reverse engineering processes in ER Studio.
- Implemented the slowly changing dimension scheme (Type II) for most of the dimensions.
- Implemented standard naming conventions for the fact and dimension entities and attributes of the logical and physical models.
- Performed performance tuning, monitoring and index selection using PMON, the Statistics Wizard, the Index Wizard and Teradata Visual Explain to see the flow of SQL queries in the form of icons and make the join plans more effective and fast.
- Partitioned previously unpartitioned tables to reduce CPU and I/O usage.
- Expanded the partition expressions on tables to accommodate growing data.
- Reviewed the logical model with business users, the ETL team, DBAs and the testing team to provide information about the data model and business requirements.
- Created the DDL scripts using ER Studio and source-to-target mappings (S2T, for ETL) to bring the data from NS to the warehouse.
- Worked with DBAs to create the physical model and tables; scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes case by case for the facts and dimensions.
- Worked on model-based and data-based volumetric analysis to provide accurate space requirements to the production support team.
- Worked in Mercury Quality Center to track defects logged against the logical and physical models.
- Worked as the onsite project coordinator once the database design was finalized, in order to implement the data warehouse according to the implementation standards.
- Generated weekly reports on users with high CPU consumption and recommended query changes to reduce CPU usage.
- Generated monthly reports for cleaning up unwanted tables and backup tables.
- Took crash dumps and TSETs for Teradata support to analyze certain crashes.
- Supported different application development teams and production support with query performance tuning, system monitoring, database needs and guidance.
- Designed DDLs and efficient PIs along with Identity Keys for efficient data distribution.
- Developed statistics macros and automated them to run at the required frequency.
- Performance tuning, monitoring, UNIX shell scripting, and physical and logical database design.
- Highly successful in testing node failover and vproc migration.
- Performed online archives by selecting full online and full multi-stream online backups using NetBackup.
- Worked with ETL users to provide access and create objects in the production environment.
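A sketch of the compression and partitioning work referenced in this section; the table, columns and compressed value lists are hypothetical examples, not the actual production DDL:

```sql
-- Illustrative sketch only: edw.claim_fact and its columns are hypothetical.
CREATE MULTISET TABLE edw.claim_fact, FALLBACK
(
    claim_id     BIGINT NOT NULL,
    member_id    INTEGER NOT NULL,
    claim_dt     DATE NOT NULL,
    claim_status CHAR(1) COMPRESS ('A','C','D','P'),  -- multi-value compression
    claim_amt    DECIMAL(12,2) COMPRESS (0.00)
)
PRIMARY INDEX (claim_id)
PARTITION BY RANGE_N (claim_dt BETWEEN DATE '2010-01-01'
                                   AND DATE '2014-12-31'
                                   EACH INTERVAL '1' MONTH,
                      NO RANGE);

-- Expanding the partition expression as new months of data arrive.
ALTER TABLE edw.claim_fact
    MODIFY PRIMARY INDEX
    ADD RANGE BETWEEN DATE '2015-01-01' AND DATE '2015-12-31'
        EACH INTERVAL '1' MONTH;
```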
Sr. Teradata DBA/Lead Developer
Confidential, NY
Responsibilities:
- Interacted with users, Business system consultants and ETL developers to design the Physical Data Model.
- Involved in DBA activities including creation of users, spool and permanent space changes, checking table skew, implementing compression, handling user ID and access requests, and resolving spool space and long-runtime query issues.
- To automate monthly archive processes, added new tables to the infrastructure model and wrote several procedures, UNIX scripts and BTEQ scripts.
- As part of the Proactive Performance Process (PPP), wrote several SQL queries to identify unusual queries with high "impact CPU" and investigated problems manually for each query (a sketch of this kind of DBQL query follows this list).
- Designed system alerts through Teradata Manager to automate production system monitoring.
- Involved in designing the logical model for infrastructure changes to automate database administration processes.
- As part of monitoring the production system, aborted many queries and changed workloads to balance the total load on the system.
- Worked extensively in Teradata SQL Assistant to write and execute SQL queries.
- Attended training on the TARA GUI and Veritas NetBackup as Teradata no longer supports BakBone NetVault; worked on converting all prior NetVault archives to NetBackup archives.
- Used Viewpoint, Teradata Manager and PMON to monitor the system performance and load on production systems.
- Coordinated with business partners on archiving several production databases to meet growth requirements.
- Experienced in working with Teradata ETL tools such as FastLoad, MultiLoad, TPump and FastExport.
- Extensively used Teradata Priority Scheduler (TASM) to control system load, setting throttles on batch/load IDs to limit the total number of sessions running on the system and balance the CPU consumed across the system.
- Capacity planning and proactive monitoring to meet performance and growth requirements.
- Worked with the MicroStrategy and Affinium teams to review and tune SQL as part of performance tuning and query optimization.
- Used AtanaSuite to work on the compression of tables.
- Designed archival jobs using NetVault 8.0, considering archival resources such as the number of media servers and tapes, and the various limiting factors in transmitting/receiving data to the media.
- Highly involved in Technical upgrades and system planning.
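The kind of DBQL query written for the Proactive Performance Process, sketched with an assumed CPU threshold (the actual reports used the site's own cut-offs and history tables):

```sql
-- Illustrative sketch only: the 1,000-second threshold is an assumption.
-- Flag yesterday's queries with the highest "impact CPU"
-- (worst-AMP CPU scaled to all AMPs) together with their skew factor.
SELECT  l.UserName
      , l.QueryID
      , l.StartTime
      , l.AMPCPUTime
      , l.MaxAMPCPUTime * (HASHAMP() + 1)              AS ImpactCPU
      , l.MaxAMPCPUTime * (HASHAMP() + 1)
            / NULLIFZERO(l.AMPCPUTime)                 AS CPUSkew
FROM    DBC.QryLogV l
WHERE   CAST(l.StartTime AS DATE) = CURRENT_DATE - 1
  AND   l.AMPCPUTime > 1000
ORDER BY ImpactCPU DESC;
```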
Confidential, IL
Responsibilities:
- Extensively worked with SME/business analysts to get the business requirements.
- Translated the business requirements into technical specifications.
- Worked to build a semantic conceptual data model for the given application-specific requirements.
- Built a semantic model for the existing semantic production system using reverse engineering in ERwin.
- Worked on cross area projects as PDM resource to meet project deadlines in Navistar environment.
- Worked closely on the Teradata MLDM implementation at the EDW level and leveraged the semantic process to achieve business goals.
- Built detailed technical design documents for new subject areas, consisting of table/column-level details; this helped the team develop faster.
- Built very complex, time-critical application processes (CCFOT - Customer Case Fill On Time) to meet the business requirements.
- Extensively worked on performance tuning of the semantic process; proactively found long-running SQL using DBQL logs and identified bottleneck areas in the semantic process.
- Automated the process of generating basic collect-statistics statements for sets of tables in the system (see the sketch after this list).
- Built the code migration process and guidelines for smooth deployments across environments.
- Built the process and enforced the security protocol for various users based on their roles.
- Performed the space requirement analysis for project growth based on current and history of data.
- Worked closely with the Teradata MLDM (Manufacturing Logical Data Model) for the EDW implementation and built flexible semantic layer applications on the MLDM design.
- Built the ABC (Audit, Balancing & Control) process for the semantic layer for continuous improvement.
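A sketch of the collect-stats generation approach mentioned above; the database name and the choice to limit it to primary-index columns are assumptions for illustration:

```sql
-- Illustrative sketch only: 'EDW_SEMANTIC' is a placeholder database name.
-- Generate COLLECT STATISTICS statements for every primary-index column in the
-- database; the output is spooled to a file and executed through BTEQ on a schedule.
SELECT  'COLLECT STATISTICS COLUMN (' || TRIM(i.ColumnName) || ') ON '
        || TRIM(i.DatabaseName) || '.' || TRIM(i.TableName) || ';'  AS collect_stmt
FROM    DBC.IndicesV i
WHERE   i.DatabaseName = 'EDW_SEMANTIC'
  AND   i.IndexType IN ('P', 'Q')   -- primary index / partitioned primary index
ORDER BY i.TableName, i.ColumnPosition;
```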
Confidential
Application DBA
Responsibilities:
- Worked on application tuning to automate the detection of unusual queries, and manually worked on each problematic query.
- Performance Tuning, Query optimization (Explain plans, Collect statistics, Primary and Secondary indexes).
- Worked with users with high CPU consumption, per the monthly performance reports, to reduce total CPU usage and meet capacity planning requirements.
- Proactively involved in performance tuning opportunities.
- Worked extensively in Teradata SQL Assistant to write and execute SQL queries.
- Maintained and tuned Teradata production system queries.
- Designed system alerts through Teradata Manager to automate production system monitoring. Built tables and views using UPI, NUPI, USI, NUSI and PPI (see the sketch after this list).
- Upon the upgrade to V2R12, worked on converting scripts from PMCP to PDCR on UAT and coordinated with business users during the downtime.
- Performed tuning and runtime optimization of the database configuration and application SQL.
- Experienced in working with Teradata ETL tools such as FastLoad, MultiLoad, TPump and FastExport.
- Used Teradata Viewpoint to monitor system performance while under load.
- Worked on exporting data to flat files using Teradata FastExport.
- Worked with the users and testing teams to implement the business logic as expected.
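An illustration of the index and view work in this role, using hypothetical object names (the production DDL and view layer were application-specific):

```sql
-- Illustrative sketch only: edw.order_fact / edw_v.order_fact are hypothetical.
-- Base table: NUPI for distribution, PPI on the date, NUSI for customer lookups.
CREATE MULTISET TABLE edw.order_fact, FALLBACK
(
    order_id   BIGINT NOT NULL,
    cust_id    INTEGER NOT NULL,
    order_dt   DATE NOT NULL,
    order_amt  DECIMAL(12,2)
)
PRIMARY INDEX (order_id)
PARTITION BY RANGE_N (order_dt BETWEEN DATE '2008-01-01'
                                   AND DATE '2009-12-31'
                                   EACH INTERVAL '1' MONTH,
                      NO RANGE),
INDEX idx_cust (cust_id);   -- NUSI

-- 1:1 access view with a locking modifier so reporting reads do not block loads.
REPLACE VIEW edw_v.order_fact AS
LOCKING ROW FOR ACCESS
SELECT order_id, cust_id, order_dt, order_amt
FROM   edw.order_fact;
```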