
Database Administration Resume


Columbus, OH

SUMMARY

  • Around 12 years of experience in systems analysis, design, data modeling, and database administration in data warehouse environments. Experienced across multiple platforms in designing and creating physical models to support normalized or dimensional models, and in providing technical support for data warehouse environments covering administration, performance monitoring, tuning, and backup/recovery.
  • Major strengths include analytical and logical skills, problem-solving techniques, and alignment to customer requirements. Highly motivated and self-driven; able to work in a team or independently to the full satisfaction of the customer.

TECHNICAL SKILLS

Operating Systems: NCR MP-RAS, Windows, MS-DOS, IBM AIX, HP-UX, Sun Solaris, MVS/ESA 4.3, OS/390, z/OS

Languages: VS COBOL II, FOCUS, C/C++, Visual Basic, Korn shell scripting, Java, Perl, REXX

Databases: Teradata, Oracle, SQL Server, DB2, FOCUS, IMS/DB

Tools & Software: Abinitio, Syncsort, Informatica, TSO/ISPF, Lotus Notes, File-AID, Control-M 6.2, Developer 2000, Tamaris, Visual Studio 2000, HP Service Desk, SPUFI, ViaSoft, C:D (Connect:Direct), REXX, XPediter, CA-Realia II, TOAD, CICS, TSET, Teradata WinDDI/Administrator, Teradata Index/Statistics Wizard, Pinecone, TASM, Teradata Utilities, Teradata Manager

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Software Environment: Teradata 12, Windows, Scripting, JCL

Responsibilities:

  • Provided database administration support for the Tax Compliance and Discovery solution by setting up the database environment, creating the database objects, and setting up roles to manage access rights. Recommended query banding techniques for the portal and reporting modules to classify queries effectively and to clearly identify the source of requests for audit procedures (see the query banding sketch after this list).
  • Coordinated the configuration of TDP and Teradata utilities on the z/OS client. Set up the utility procs and JCL to aid the ETL team in developing scripts on the mainframe. Provided technical input and designed the ETL process using the Teradata Parallel Transporter utility.
  • Installed and configured the scheduling environment for the solution using Tivoli Workload Scheduler on Windows 2003 and Linux SLES10.
  • Performed physical data modeling for the Tax solution warehouse and set up Visual SourceSafe to facilitate versioning and migration of database objects between environments.
  • Performed data assessment by using Teradata Profiler on the staged data to identify any data patterns or anomalies.
  • Implemented the BAR solution using Teradata Tiered Archive and Restore Architecture (TARA) for TSM.
  • Coordinated with TACC on incident management to resolve issues. Worked with the customer systems group to apply eFixes to the Teradata libraries.
  • Created SQL execution audit reports using DBQL data and set up standard DBQL maintenance routines to archive and purge data based on customer-defined retention rules.
  • Wrote stored procedures to implement the Discovery module as part of the Tax compliance solution.
  • Tuned the reports and SQL during the initial migration of reports from the legacy system to the warehouse.
  • Involved in the administration of Tivoli Storage Manager, setting up the storage policies. Defined and documented the procedures for backup migration and offsite tape management.
  • Involved in planning the patch upgrade activities and coordinated with the customer team to identify and implement the upgrade tasks.
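
A minimal sketch of the query banding approach referenced above, with hypothetical band names and a hypothetical audit pull from the query log (DBQL object and column names can vary by Teradata release):

  -- Portal/reporting sessions tag their work before submitting queries.
  SET QUERY_BAND = 'ApplicationName=TaxPortal;Module=Discovery;' FOR SESSION;

  -- Audit side: pull the tagged requests back out of the query log.
  SELECT QueryBand, UserName, StartTime, QueryText
  FROM   DBC.DBQLogTbl
  WHERE  QueryBand LIKE '%ApplicationName=TaxPortal%';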

Confidential, Wilkesboro, NC

Software Environment: Teradata V2R6.2, Korn shell scripting, JCL

Responsibilities:

  • Performed Compression analysis and recovered about 2TB of usable space.
  • Involved in the planning and implementation of Backup and Recovery activities.
  • Identified and documented redundant or out-of-date backups and set up an automated process to simplify the creation of new backups.
  • Analyzed and presented the performance and resource benefits of implementing PPI on tables to avoid unnecessary full-table scans.
  • Evaluated the benefit of using PPI on some of the reporting queries and saved about 80% of resource usage (see the PPI sketch after this list). Implemented PPI maintenance routines by creating and scheduling mainframe jobs.
  • Implemented the database maintenance routines for the Dual system by using Macros.
  • Designed, developed and documented a password maintenance process to implement password synchronization for the batch users on the dual system.
  • Tuned MSI queries and carried out performance evaluation of user queries from the query log by measuring CPU and I/O usage.
  • Performed capacity analysis to accommodate new workloads introduced into the system.
  • Created weekly audit procedures to pull information from DBQL and AccessLog for evaluation of activity on the secure databases.
  • Created benchmark/ baseline jobs on mainframe using MLOAD/FLOAD to assist in performance and Sanity testing.
  • Involved in the activities of analyzing the existing workload in preparation for setting up TASM.
  • Performed physical modeling and assisted the warehouse team in evaluation of primary indexes for the supply chain demand management project.
  • Responsible for maintaining and monitoring the production Teradata system.
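
A hypothetical PPI sketch in the spirit of the evaluation above: a fact table partitioned by day with RANGE_N so that date-bounded reporting queries scan only the relevant partitions instead of the full table (all names are illustrative):

  CREATE TABLE sandbox_db.sales_fact
      ( sale_id      INTEGER NOT NULL
      , store_id     INTEGER
      , sale_date    DATE FORMAT 'YYYY-MM-DD' NOT NULL
      , sale_amount  DECIMAL(12,2) )
  PRIMARY INDEX ( sale_id )
  PARTITION BY RANGE_N ( sale_date BETWEEN DATE '2006-01-01'
                                       AND DATE '2007-12-31'
                                       EACH INTERVAL '1' DAY, NO RANGE );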

Confidential, Atlanta, GA

Software Environment: Teradata, Oracle, Informatica, Korn shell scripting, JCL

Responsibilities:

  • Involved in planning and implementation of the activities to upgrade database and TTU on various servers.
  • Implemented backup and recovery for the created tables using ARCMAIN from the MVS system.
  • Involved in planning and performing activities for Teradata upgrade including setting up the Point in time Backup plan.
  • Responsible for creating Monthly usage reports of the Production Teradata System.
  • Responsible for object and security management; implemented application-level permissions using roles (see the role sketch after this list).
  • Designed and implemented the data transfer solution, to transfer data from Production Teradata system to Test system using Shell Scripts and Teradata Utilities.
  • Involved in performance tuning of runaway queries and proactively monitored the system workload to identify application and database optimization opportunities.
  • Assisted the ETL team by providing Technical Assistance and helping in the fine tuning of queries. Served as technical advisor to ETL team for the design and development of ETL streams, including ETL performance considerations.
  • Defined and documented ETL standards to assist application teams in developing consistent ETL routines and to minimize errors.
  • Performed Database Request Analysis and Capacity planning.
  • Designed, configured and scheduled the database maintenance utilities. Implemented database monitoring utilities using Korn shell script and database procedures.
  • Installed and configured tools or utilities like Pinecone and Control-M.
  • Experienced in using various database utilities such as showlocks, lokdisp, Recovery Manager, and qrysession.
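
A minimal sketch of role-based, application-level permissions as referenced above (role, database, and user names are hypothetical):

  CREATE ROLE app_report_read;
  GRANT SELECT ON reporting_db TO app_report_read;

  CREATE ROLE app_etl_write;
  GRANT INSERT, UPDATE, DELETE ON staging_db TO app_etl_write;

  -- Users pick up permissions through role membership rather than direct grants.
  GRANT app_report_read TO report_user1;
  MODIFY USER report_user1 AS DEFAULT ROLE = app_report_read;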

Confidential, Richmond VA

Software Environment: Teradata, Oracle, Abinitio, Korn shell scripting

Responsibilities:

  • Performed Logical and Physical Data Modeling of the database to host the new presentation layer tables
  • Performed Table sizing and Capacity planning for the application.
  • Created Teradata objects in development, test, and production environments under change control.
  • Involved in fine-tuning by reviewing the primary index and evaluating the value of partitioning the data using PPI.
  • Involved in Monitoring of the System and Reporting Queries and evaluated the impact of using Join Index or Secondary Index.
  • Supported the ETL team by providing Technical Assistance to Development Group.
  • Implemented the Backup and Recovery for the tables created.
  • Estimated the Space Requirements for all the Enhancement requests involving the Teradata Database Objects.
  • Managed User Permissions on New Objects created by using Roles.
  • Monitored usage with Teradata Manager and various database utilities such as showlocks, lokdisp, Recovery Manager, and qrysession.
  • Evaluated and implemented multi-value compression on the fact tables and saved around 50% of the space (see the compression sketch after this list).
  • Analyzed the performance benefits of using a join index to speed up access and assisted the ETL team in designing the application.
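
A hypothetical multi-value compression sketch along the lines of the fact-table work above; repeated values declared in the COMPRESS list are stored once per column instead of in every row (all names and values are illustrative):

  CREATE TABLE sandbox_db.order_fact
      ( order_id     INTEGER NOT NULL
      , order_status CHAR(10)      COMPRESS ('OPEN', 'CLOSED', 'CANCELLED')
      , region_cd    CHAR(2)       COMPRESS ('NE', 'SE', 'MW', 'SW')
      , order_amt    DECIMAL(12,2) COMPRESS (0.00) )
  PRIMARY INDEX ( order_id );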

Confidential, Richmond VA

Software Environment: Teradata, Oracle, Abinitio, Korn shell scripting

Responsibilities:

  • Supported the Enhancements for the applications in Teradata database Environment by providing Technical Assistance to Development Group.
  • Automated the partitioned-data roll process based on the retention period set for each table (see the partition roll sketch after this list).
  • Created Teradata objects in development, test, and production environments under change control.
  • Estimated the Space Requirements for all the Enhancement requests involving the Teradata Database Objects.
  • Object and Security management.
  • Performance monitoring of database and tuning of applications.
  • Set up test environments in Oracle and Teradata to support the ETL team in testing enhancements.
  • Supported the development team by ensuring that grants and objects stayed in sync between the production and test environments on both Oracle and Teradata.
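
A hedged sketch of the partitioned-data roll referenced above: ranges that have aged past retention are dropped (and their rows deleted) while new future ranges are added. Table, dates, and intervals are hypothetical, and the exact ALTER TABLE partitioning syntax varies by Teradata release:

  ALTER TABLE sandbox_db.txn_history
  MODIFY PRIMARY INDEX
      DROP RANGE BETWEEN DATE '2004-01-01' AND DATE '2004-01-31' EACH INTERVAL '1' DAY
      ADD  RANGE BETWEEN DATE '2005-02-01' AND DATE '2005-02-28' EACH INTERVAL '1' DAY
  WITH DELETE;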

Confidential, Kansas City, KS

Software Environment: Teradata, Oracle, Abinitio, Korn shell scripting, JCL, SQL Server, COM+, Visual Basic

Responsibilities:

  • Performed Dimensional Data Modeling, Logical Data Modeling (LDM) and Physical Data Modeling (PDM).
  • Involved in preparing Data Demographic information to create an Extended Logical Data Model.
  • Involved in requirements analysis and responsible for creating technical specifications from functional specifications.
  • Wrote Teradata macros and stored procedures to implement the technical specifications, automate routine processes, and generate reports (see the macro sketch after this list).
  • Monitored Teradata system usage with PMON.
  • Created Teradata Tables, Views and Indexes according to the requirements.
  • Loaded data into Teradata tables using MLOAD and FASTLOAD, and implemented ETL using Abinitio and SQL.
  • Implemented Data recovery strategies using Teradata ARCMAIN (archive and recovery utility).
  • Involved in evaluating the V2R5 features and analyzed the applicability.
  • Involved in all the stages of the Application Development Life cycle.
  • Involved in the design of Abinitio graphs to extract and transform data from files on UNIX. Performed data validation using the transform components of Abinitio.
  • Involved in optimizing and fine-tuning Teradata applications by evaluating the choice of primary index, and suggested changes to decrease run times.
  • Optimized queries by monitoring CPU and spool space usage and evaluating the access plans generated by the optimizer.
  • Evaluated the Options and performance of using Join Index and Partition Primary Index.
  • Used Partition Components in Abinitio to leverage the Parallel Processing options.
  • Performed data loads into Teradata through Abinitio using database components.
  • Involved in the design and development of the PTC tool using Visual Basic.
  • COM+, the component services module, was used to house the software components for reusability and to provide a distributed computing environment.
  • SQL Server was used to house all the data from different sources and to serve as the data store for the tool.
  • Designed packages using the Data Transformation Services (DTS) of SQL Server to pull the data from Teradata tables.
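
A minimal sketch of the kind of Teradata macro used to automate routine tasks as described above (database, macro, table, and column names are hypothetical):

  CREATE MACRO sandbox_db.purge_old_rows ( keep_days INTEGER ) AS
  (
      DELETE FROM sandbox_db.report_staging
      WHERE  load_date < CURRENT_DATE - :keep_days;
  );

  -- Usage: remove staged rows older than 90 days.
  EXEC sandbox_db.purge_old_rows (90);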

Confidential, SBC St Louis, MO

Software Environment: Teradata, FOCUS, JCL

Responsibilities:

  • Involved in the design, development and testing of various new reports according to the specifications for different performance measurements.
  • Customized and maintained existing FOCUS Programs.
  • Retrieved data from multiple Teradata Views and Tables using BTEQ.
  • Created Master File Description (MFD) for FOCUS and Teradata databases.
  • Maintained FOCUS databases, including restructuring and rebuilding them.
  • Wrote BTEQ scripts to export data from Teradata tables for loading into FOCUS tables (see the BTEQ sketch after this list).
  • Loaded data into Teradata tables using utilities such as MLOAD and FASTLOAD after formatting the data using FOCUS and COBOL routines.
  • Formatted reports and placed them on the OMVS (UNIX) server to be presented on the web.
  • Monitored and administered month-end processes to generate reports.
  • Responsible for setting up JCL to run jobs on a regular basis (daily, weekly, and monthly) and FTP the output to the server.
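
A hedged BTEQ export sketch along the lines of the scripts described above; the logon, file path, view, and column names are placeholders:

  /* Export a Teradata view to a flat file for downstream FOCUS/COBOL formatting. */
  .LOGON tdpid/batch_user,password
  .EXPORT REPORT FILE = /data/extract/monthly_perf.txt

  SELECT region_cd, measure_nm, measure_val
  FROM   reporting_db.monthly_perf_v
  ORDER  BY 1, 2;

  .EXPORT RESET
  .LOGOFF
  .QUIT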

Confidential

Software Environment: DB2, MVS/ESA, CICS, JCL, VS COBOL II, Confidential (TAMARIS)

Responsibilities:

  • Worked as an analyst and IQA.
  • Devised effective solutions for Y2K compliance of modules.
  • Performed functional analysis of the information flow within the client’s business.
  • Prepared technical specifications.
  • Coded and tested new batch interfaces, reconciliation, and extraction programs to bring the existing legacy system on the client’s UNISYS machine in sync with the newly introduced system on their IBM mainframe.

Confidential

Responsibilities:

  • The project required migrating the client’s report generation software, coded in MarkIV, to FOCUS. Worked as an analyst and programmer on this project.
  • The project was completed in three phases:
  • Analysis of MarkIV Report Code running across IMS, GDGs, VSAM and flat files.
  • Coding reports in FOCUS
  • Functional and System testing
