
Teradata Performance DBA Resume


OBJECTIVE:

  • To understand the Business environment extensively and accordingly implement delivery excellence strategies to promote the growth of the business.

SUMMARY:

  • Nearly 8 years of IT experience in database administration and business intelligence, covering relational database management, design, data modeling, data migration, implementation and project management.
  • Over 5 years of experience in optimization, performance enhancement and procedures for improving load performance across schemas and databases.
  • Advanced expertise in Teradata DBA activities: Data Mover, Teradata Active System Management (TASM), PDCR, Teradata Studio Express, BAR activities, Viewpoint (Stats Manager) and tuning ETL queries; strong hands-on experience with day-to-day DBA activities; Teradata Certified DBA professional.
  • Hands-on experience with Unix scripts, SQL scripts, PL/SQL and stored procedures to ensure successful execution of ETL processes.
  • Stays updated with the latest tools and technologies in the data warehousing environment, including new Teradata release features and tools (UDA, Aster Analytics, Teradata Studio, etc.), and recommends their benefits and usage to team mates and project mates.
  • Expertise in installing Cloudera Hadoop (5.4.x) and managing the cluster.
  • Working knowledge on Informatica.
  • During my assignments, I have worked at various client locations: Stockholm (Sweden), Jakarta (Indonesia), Eindhoven (Netherlands), Copenhagen (Denmark) and Charlotte (U.S.), delivering outstanding performance with wide client exposure.
  • Currently working as a Teradata and Hadoop administrator for one of the biggest clients.

WORK EXPERIENCE:

Confidential

Teradata Performance DBA

Responsibilities:

  • Review design documents, mapping documents, DDLs and reporting queries, and provide recommendations.
  • Work on the top 10 bad queries of the week, based on runtime, CPU use and impact CPU.
  • Set up Unix scripts to generate daily system usage reports, suspected-query reports, canary queries and bad-query reports.
  • Revisit TASM quarterly and make necessary changes/recommendations.
  • Setup Datalabs
  • Automation of Stats collection using STATS Manager
  • Implementation of Password change policy - Half yearly.
  • Monthly meetings with the client on system performance, issues, new project releases and improvements made.
  • Proactively tune the system at the application level during loading.
  • Installation of Cloudera Manager on SLES 11.
  • Installation of Teradata sqoop connectors and Informatica connectors.
  • Installation of flume, Kafka, solr using Cloudera Manager.
  • Manage Clusters - Add/Delete, rename and Start/stop.
  • Import and export data using Sqoop to HDFS
  • Creating roles and granting access to the users.
  • Strong knowledge of Sqoop, Hive and Impala
  • Configure system health alerts
  • Configure replication of HBase, HDFS and Hive
  • Troubleshooting, tuning and solving Hadoop issues.
  • Manage resources with Cloudera Manager
  • Manage services and agents
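The weekly "top 10 bad queries" review above is typically driven by DBQL. A minimal sketch, assuming standard query logging into dbc.QryLogV is enabled; the seven-day window and the impact-CPU formula (max per-AMP CPU scaled by AMP count) are illustrative:

```sql
-- Top 10 queries of the past week, ranked by impact CPU.
-- HASHAMP() + 1 returns the number of AMPs in the system.
SELECT TOP 10
       UserName,
       QueryID,
       AMPCPUTime                                    AS TotalCPU,
       MaxAMPCPUTime * (HASHAMP() + 1)               AS ImpactCPU,
       (FirstRespTime - StartTime) DAY(4) TO SECOND  AS RunTime
FROM   dbc.QryLogV
WHERE  CAST(StartTime AS DATE) >= CURRENT_DATE - 7
ORDER  BY ImpactCPU DESC;
```

The same view can be re-sorted by RunTime or TotalCPU to produce the other two variants of the weekly report.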

Confidential

Lead Teradata DBA

Responsibilities:

  • Initially based at the client location in Eindhoven, Netherlands, to gather requirements.
  • Worked on a complete new setup: creation of users, databases, roles and profiles.
  • Setup of backups for production and development using TARA and NetBackup.
  • Generate reports through PDCR and send a monthly overview report on the production system to the client.
  • Creation of filters and throttles in TASM ruleset.
  • Involved in biweekly meetings with the client and vendors to understand the issues and upcoming activities in the project, and to plan for production migration.
  • Worked closely with the client to set up various processes for releases, deployments, statistics, user creation and setup of new environments.
  • Setup of backup policies and schedules for production and development.
  • Automation of user creation, password reset activity.
  • Implementation of online backups and incremental backups using partition tables.
  • Suggest Teradata best practices and recommendations to the client.
  • Keep the history data on HDFS landing area using Sqoop.
  • Manage the Hadoop cluster - Add/delete hosts
  • Start/Stop Clusters, agents and services.
  • Installation of Cloudera CDH 5.4.3 on RHEL 6.5
  • Import and export data using Sqoop to HDFS
  • Creating roles and granting access to the users.
  • Manage and control disk space usage on clusters.
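The new-environment setup described above (users, databases, roles, profiles) follows a standard Teradata pattern. A sketch of the DDL involved; all object names, the spool size and the password policy are hypothetical:

```sql
-- Profile carries the shared settings (spool, password rules).
CREATE PROFILE etl_prof AS
    SPOOL = 50e9,
    PASSWORD = (EXPIRE = 180);   -- half-yearly reset policy

-- Role bundles the access rights.
CREATE ROLE etl_read_role;
GRANT SELECT ON prod_db TO etl_read_role;

-- User is created from a parent database and tied to both.
CREATE USER etl_user FROM dbadmin AS
    PERM = 0,
    PASSWORD = temppass01,
    PROFILE = etl_prof,
    DEFAULT ROLE = etl_read_role;

GRANT etl_read_role TO etl_user;
```

Keeping rights on roles and settings on profiles is what makes the user-creation step easy to automate with a script.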

Confidential

Performance DBA

Responsibilities:

  • Analyze DBQL and ResUsage data to understand the system.
  • Identify skewed tables, suggest proper PIs, identify missing/stale statistics, suggest loading strategies for sourcing the data, schedule load windows during business off-hours, and make workload-management changes for various ETL accounts, partitioning, etc.
  • Manage concurrency by creating throttles.
  • Automation of stats collection through stored procedure.
  • Creation of secondary indexes and join indexes to improve query performance.
  • Create materialized views (physical tables) in the semantic layer, as the reporting SLA was less than 5 seconds.
  • Customize Teradata for better administration using DBQL data.
  • Worked on GCFR (Global Control Framework/Repository) in the environment.
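The statistics and join-index work above can be sketched as follows; table, column and index names are hypothetical:

```sql
-- Refresh statistics on the columns the optimizer relies on
-- (candidates for the automated stored-procedure collection).
COLLECT STATISTICS
    COLUMN (cust_id),
    COLUMN (sale_date)
ON prod_db.sales_fact;

-- Pre-join frequently combined tables so reporting queries
-- can be satisfied from the join index instead of a full join.
CREATE JOIN INDEX prod_db.sales_cust_ji AS
SELECT s.sale_id, s.cust_id, c.cust_name, s.amount
FROM   prod_db.sales_fact s
JOIN   prod_db.customer   c  ON s.cust_id = c.cust_id
PRIMARY INDEX (cust_id);
```

The optimizer picks the join index automatically when a query's coverage matches, which is how sub-5-second semantic-layer SLAs are usually met without changing the reports.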

Confidential

Responsibilities:

  • Create databases, users, roles and profiles as per the requirement and manage the system.
  • Work on system cleanup and DBA maintenance: backup tables, skewed tables, unused accounts and partition extension; work with development teams on the purge process.
  • Review and Deploy code in all the environments
  • Actively monitor the system using Viewpoint.
  • Viewpoint administrator - setup alerts, user creation and roles management.
  • Tuning long running and bad queries
  • Provide recommendations to application team on data loading options.
  • Automate data warehousing refreshes using Teradata Tools and Utilities
  • DBA Save process using DBQL Data
  • Manage and Partition large tables.
  • Responsible for backups - TARA, NetBackup.
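Managing and partitioning large tables, as mentioned above, normally means row partitioning on the date column that drives loads and purges. A sketch; the table name, columns, date range and interval are illustrative:

```sql
-- Month-per-partition layout: loads touch only the current
-- partition, and old months can be purged partition by partition.
CREATE TABLE prod_db.sales_fact
( sale_id   BIGINT        NOT NULL,
  sale_date DATE          NOT NULL,
  amount    DECIMAL(18,2)
)
PRIMARY INDEX (sale_id)
PARTITION BY RANGE_N(
    sale_date BETWEEN DATE '2010-01-01' AND DATE '2016-12-31'
              EACH INTERVAL '1' MONTH,
    NO RANGE);
```

The NO RANGE catch-all keeps out-of-range rows loadable instead of rejecting them, which simplifies the purge-process discussions with development teams.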

Confidential

Responsibilities:

  • Extensive use of DBC tables to manage the system and assist users.
  • Create and manage databases, users, space, access rights and profiles.
  • Enable query logging and access logging as per the requirement.
  • Use TSET to upload information requested by the Teradata CS team via T@YS.
  • Refreshed the data using FastExport, MultiLoad, FastLoad and ARC utilities.
  • Overseeing daily and weekly back-ups.
  • Maintenance activities like Check Table, Scandisk, Packdisk using Ferret Utilities
  • Reviewing the projects in all the phases and Migrating the database objects
  • Backups using Arcmain
  • Worked with BI Production Supporting team for production database issues.
  • Worked on basic Oracle DBA tasks - schema refreshes, tablespace management, generate AWR reports, monitor OEM etc.
  • Setup TASM for the environment and enhance/revisit quarterly based on new usage and requirements.
  • Implementation and migration of PDCR (Performance Data Collection and Reporting).
  • Involved in and handled various Teradata upgrades and migrations: TD 12 to TD 13.10, TD 13.10 to TD 14.10.
  • Used Data Mover to copy data from one server to another and automated the process to save business time - dual production system environment for reporting.
  • Automation of statistics through STATS MANAGER in Viewpoint.
  • Setup Alerts, monitor System response time, focus on Query spot light and tune the system.
  • Worked on setting up new systems/Teradata box: Users, Databases, roles, Profiles, Account strings, DBQL logging and including recommendations for DBS Control parameters, running DIP Scripts and enabling resource usage parameters (Resusage tables)
  • Manage and create Datalabs for AGILE projects
  • Setup backups for a complete new environment - policies and schedules (NetBackup using TARA).
  • Implementation of Online backups and incremental backups using partition tables.
  • Generate PDCR reports and present to client on overall system status, suggest and work on improvements accordingly.
  • Worked on Semantic Layer performance tuning.
  • Automation of basic DBA tasks (Password reset, User creation, Stats collection, access rights clean up, data copy and few other).
  • Installation of Cloudera CDH 5.4.5 on SLES 11.
  • Configure Hadoop system health alerts, Configure replication of HBase, HDFS and Hive
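The DBQL and access-logging setup described in the new-system bullets above boils down to a couple of statements. A sketch; the logging scope and SQL-text limit are illustrative choices, and access logging assumes the DIPACC script has been run:

```sql
-- Log SQL text and touched objects for every session.
BEGIN QUERY LOGGING WITH SQL, OBJECTS LIMIT SQLTEXT=10000 ON ALL;

-- Access logging on a production database (requires DBC.AccLogRule
-- from DIPACC to exist).
BEGIN LOGGING ON EACH ALL ON DATABASE prod_db;
```

The matching END QUERY LOGGING / END LOGGING statements reverse the rules; the collected DBQL rows are what PDCR summarizes for the monthly system-status reports.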
