
Database Engineer Resume Profile


Summary of Qualifications:

  • 18 Years' experience (1995-Present) as a full life cycle DBA/Developer/Architect responsible for Planning, Design, Implementation, Maintenance, and Documentation of Database Servers, Databases, Database Objects (Tables, Indexes, Views, Triggers, Stored Procedures), and full OLTP/OLAP Data Services Applications coded in T-SQL/ANSI SQL and/or MDX using MS SQL Server versions 4.21a, 6.x, 7.0, 2000, 2005, 2008, 2008 R2, and 2012, including connectivity/Data Pulls from Oracle 8i and 9i Data Sources.
  • Infrastructure planning/implementation, Requirements Analysis, Data Modeling using Erwin, ERStudio, and Visio to develop Normalized 3NF and Denormalized star and snowflake schemas, SQL Server Master Data Services (MDS), Analysis Services (SSAS), Integration Services (SSIS), Reporting Services (SSRS), Replication, Security, Performance Tuning, Disaster Recovery/High Availability solutions (Active/Passive and Active/Active Clusters, Log Shipping, Database Mirroring, and AlwaysOn High Availability Groups), establishment and monitoring of scheduled tasks (Database Integrity Checks, Index Rebuilds, Backups, ETL Processes, etc.), and Training IT personnel to assume the DBA/Developer Role.
  • 24 Years' experience (1989-Present) as a Systems Architect responsible for Planning, Design, Implementation, Maintenance, and Documentation of Novell 2.X, 3.X, and 4.X and Microsoft NT 3.X, 4.X, Windows 2000, 2003, 2008, 2008 R2, and 2012 Server based solutions, including specification/configuration of Server/SAN/LAN/WAN hardware and MS Infrastructure support services (AD/DNS/WINS/DHCP).
  • Tasks performed, tools utilized, and technologies implemented/configured include:
  • Planning, Design, Implementation, Maintenance, and Documentation of Physical and Virtualized Systems for DEV/QA/STAGING/PROD environments to support N-tier web application development/deployment. Typical environments consisted of Dell/HP hardware with or without VMware/MS Virtual Server. Optionally, Operations Management (Availability/Resource Consumption Monitoring/Reporting) solutions were implemented utilizing Microsoft's System Center (MOM/SCOM, SMS/SCCM) solution set.

Professional Experience:

Confidential

Data Warehouse Architect

  • Designing, developing, and deploying an End-to-End Solution using the Kimball Methodology and the MS SQL Server 2008 R2 BI Stack (SSIS, SSAS, and SSRS).
  • Worked with Education Data Subject Matter Experts to perform Requirements Gathering.
  • Developed/Documented the Dimensional Model using Erwin based on established requirements.
  • Coded and deployed the solution using ANSI-SQL/MDX and an Agile approach.

Confidential

Contract Sr. Data Architect

  • Prototyped a Master Data Services (MDS) Solution and presented it to Management, resulting in the decision to move towards an MDS-centric environment allowing centralized rules management/enforcement across numerous data import processes.
  • Participated in the design of a data export solution to help overcome scale limitations associated with the standard DrillingInfo Java Presentation-Tier, record-oriented data export solution.
  • Developed and Deployed the Solution using MS SQL Server 2008 R2.

Confidential

Contract Sr. Data Architect

  • Lead Data and Systems Architect for the Texas Windstorm Data Warehouse, responsible for maintaining the existing Data Warehouse, including the Reporting Services reports and Integration Services packages associated with the Nightly ETL and scheduled extracts. My responsibilities included troubleshooting load- and performance-related issues. Initially tasked with root cause analysis of an intermittent ETL (SSIS) package failure, I examined standard Performance Monitor counters during package execution, which indicated an I/O bottleneck related to the NAS configuration. Moving the destination database for the ETL pull to a different local set of disks alleviated the bottleneck, resulting in a consistently reliable Nightly ETL and DW load.
  • Backup DBA on call 24/7 for support issues on Transactional as well as Decision Support SQL Servers; my responsibilities included Performance Tuning of SQL Servers from the database through the operating system down to the hardware level.
  • Manager of the Data Warehouse DEV/QA VMware environment; my responsibilities included installation/configuration of VMware on the VMHosts as well as preparation of the VM sets for use. The preparation process consisted of creation/configuration of the VMs, installation/configuration of the Operating System and SQL Server (memory, disk space, processors, etc.), Database Restores, and the installation of Source Control (Subversion).
  • As Manager of the DW VMware environment, and at my Manager's request, I assisted IT in isolating a performance bottleneck (disk I/O) in the existing VMware vSphere POC environment, resulting in modification of the proposed 750K VMware vSphere Production implementation to provide substantially improved performance and greatly increased VMHost utilization.

Confidential

Contract Microsoft Trainer

As a contract Trainer certified to teach the Systems Engineering and Application Development Tracks, my primary focus was teaching the SQL Server-related curriculum.

Confidential

Contract Sr. Data Architect

  • Callaway Golf needed a Master Data Management solution. There were two Transactional Systems, both containing Consumer Data. The first supported Callaway's Microsoft Commerce Server implementation; the second supported Customer Service and Marketing Campaign Data Pulls. There was intentional duplication of some consumer attributes across systems, and substantial variance between the types of consumer attributes stored in each system.
  • The process of identifying duplicate consumers across both data sets combined was already being handled by a third-party data cleansing company on a Quarterly basis, in preparation for the Quarterly mail-out of approximately 750K Callaway Golf Magazines. This gave us a starting point for our MDM Process.
  • We sent every name/address pair found in both databases to be cleansed. We got our data back with duplicate groups and type flags: Individual Duplicates, Family Duplicates, or Multiple-Occupant Duplicates.
  • What the Data Set did not provide for was creation of the Master Record. We determined that, as part of the Master Data Management solution, we needed to bring the process in-house. The application being used was Data Quality, by SAP. I was responsible for the Installation, Configuration, and Operation of Data Quality.
  • The final component of the MDM Solution required Business Rules to be applied to each of the tables containing child records prior to deleting the Duplicate Consumer Record.
  • The previous Merge process worked at the consumer record level, ran for several days before completion, and failed to address some of the business rules, causing errors and orphaned records.
  • I obtained the requirements from the Business Owners and coded a solution that, using the Data Quality duplicate key values, performed the consolidation in a set-oriented fashion, running in less than 30 minutes and leaving nothing but Master Data.
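The set-oriented consolidation described above can be sketched as follows. This is an illustrative reconstruction only, using SQLite via Python rather than SQL Server, with hypothetical table and column names (consumer, consumer_order, dup_group) standing in for the actual Callaway schema; the point is the shape of the technique: elect a master per duplicate group, repoint child rows in one set-based UPDATE, then delete non-masters in one set-based DELETE.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: consumers carry the duplicate-group key values
# returned by the Data Quality matching run.
cur.executescript("""
CREATE TABLE consumer (
    consumer_id INTEGER PRIMARY KEY,
    name        TEXT,
    dup_group   INTEGER        -- same value = same physical consumer
);
CREATE TABLE consumer_order (
    order_id    INTEGER PRIMARY KEY,
    consumer_id INTEGER REFERENCES consumer(consumer_id)
);
INSERT INTO consumer VALUES (1, 'J. Smith',   100),
                            (2, 'John Smith', 100),
                            (3, 'A. Jones',   200);
INSERT INTO consumer_order VALUES (10, 1), (11, 2), (12, 3);
""")

# Repoint every child record to its group's master (lowest id, as a
# simple election rule) in a single set-based UPDATE -- no row-by-row loop.
cur.execute("""
UPDATE consumer_order
SET consumer_id = (SELECT MIN(c2.consumer_id)
                   FROM consumer c1
                   JOIN consumer c2 ON c2.dup_group = c1.dup_group
                   WHERE c1.consumer_id = consumer_order.consumer_id)
""")

# Delete every non-master duplicate, again as one set operation.
cur.execute("""
DELETE FROM consumer
WHERE consumer_id NOT IN (SELECT MIN(consumer_id)
                          FROM consumer GROUP BY dup_group)
""")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM consumer").fetchone()[0])   # 2 masters remain
print(cur.execute(
    "SELECT DISTINCT consumer_id FROM consumer_order ORDER BY consumer_id"
).fetchall())                                                       # [(1,), (3,)]
```

In the real solution each child table would get its own business rule applied before the delete; the sketch keeps one child table for brevity.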

Confidential

Contract Microsoft Trainer

As a contract Trainer certified to teach most of the courses in the Systems Engineering and Application Development Tracks, my primary focus was teaching the SQL Server-related curriculum.

Confidential

Contract Sr. DBA

  • While at Confidential, my responsibilities included assisting in maintaining uptime for several hundred SQL Servers in the Systems Integration Testing (SIT) Lab. My primary focus was the monitoring and troubleshooting of the Transactional Replication feeds across systems.
  • During that time, I saw a need for a Systems Management Solution. I presented the features/benefits of Microsoft Operations Manager (MOM) and received management's approval for a limited deployment (20 systems, primarily SQL Servers) as a Proof of Concept.
  • Implemented the Proof of Concept utilizing VMware Server on existing underutilized Windows 2003 Servers and presented the operations and reporting capabilities to Upper-Level Management, resulting in a Management Directive for controlled deployment to all 1,500 servers in SIT.

Confidential

Contract Microsoft Trainer

As a contract Trainer certified to teach the Systems Engineering and Application Development Tracks, my primary focus was teaching the SQL Server-related curriculum.

Confidential

Senior Database Engineer

  • As a member of the team that developed the State of Confidential Lottery, I participated in the development of a DW/BI solution for the analysis of Lottery Gameplay. The solution included SQL Server Replication to the Network Operations Center and a Partitioned View to ease maintenance on tables recording massive amounts of Transactional Gameplay Data from eight physical locations and tens of thousands of Video Lottery Terminals.
  • I also introduced the use of virtualization for QA as well as DEV by demonstrating its functionality to the CTO in a Proof of Concept, emulating multiple physical lottery locations, connected by routers, communicating with the Network Operations Center, on a single physical machine (VMHost). Each location emulated the multiple computers required by the Confidential Lottery Infrastructure, including AD/DNS/Web Services and multiple Active-Passive SQL Server Clusters.
  • This resulted in the CTO's decision to move forward with Virtualization by purchasing 6 VMHosts for QA and repurposing multiple machines in the Confidential Lottery Lab as additional VMHosts.
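A SQL Server 2000-era Partitioned View of the kind mentioned above is a UNION ALL view over per-range member tables, each carrying a CHECK constraint on the partitioning column. The following is a small illustrative sketch, not the actual lottery schema: it uses SQLite through Python rather than SQL Server, with invented table names, to show how rows split across monthly member tables remain queryable as one logical table while old months can be aged out by dropping a member table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical monthly member tables. In SQL Server, the CHECK constraint
# on each member table is what lets the optimizer prune partitions; here
# it simply documents which rows belong in which table.
cur.executescript("""
CREATE TABLE gameplay_2004_01 (
    play_id    INTEGER PRIMARY KEY,
    play_month TEXT NOT NULL CHECK (play_month = '2004-01'),
    amount     INTEGER
);
CREATE TABLE gameplay_2004_02 (
    play_id    INTEGER PRIMARY KEY,
    play_month TEXT NOT NULL CHECK (play_month = '2004-02'),
    amount     INTEGER
);
CREATE VIEW gameplay AS
    SELECT * FROM gameplay_2004_01
    UNION ALL
    SELECT * FROM gameplay_2004_02;

INSERT INTO gameplay_2004_01 VALUES (1, '2004-01', 5), (2, '2004-01', 10);
INSERT INTO gameplay_2004_02 VALUES (3, '2004-02', 20);
""")

# Queries hit the single logical view; maintenance (dropping an old month)
# touches only one member table plus a view re-definition.
total = cur.execute("SELECT SUM(amount) FROM gameplay").fetchone()[0]
print(total)  # 35
```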

Confidential

Contract Sr. Data Architect

As Lead Application Architect for the Set Analyzer Project, I developed a custom application for the Database Marketing Group, based on SQL Server 2000 and the Business Objects analytical tool Set Analyzer, to automate the Direct Mail Marketing Campaign Pulls. The application automates the process of generating Campaign Lists for the Direct Marketing Group, reducing the time required to determine one month's Direct Mail Marketing Campaigns from one month to 72 hours. Since deployment, its use has resulted in a minimum 80K/month reduction in postage costs by providing the ability to buy bulk postage.
