
Data Architecture And Development Manager Resume


Charlotte, NC

SUMMARY:

  • A highly productive data analytics manager, uniquely positioned with technical data warehousing and statistical modeling skill sets: over 6 years of experience in data analysis/statistical modeling, and over 13 years of experience in data warehousing, data mining, and data-driven application development.

TECHNICAL SKILLS:

Programming: Proficient in: R, SAS, Python, Hadoop (Scala, Spark/PySpark, HBase, Hive, Sqoop, Elasticsearch), SQL, T-SQL, PL/SQL, Microsoft Visual Studio .NET (all languages, all releases), Java, HTML5, CSS, C, C++, PHP, JavaScript, SOAP, Unix & Batch scripting, Objective-C/Xcode (iOS development)

Modeling: Experience with: Artificial Neural Networks, Clustering, Segmentation, Decision Trees/Random Forests, Support Vector Machines, Regression, Survival/Econometric Analysis, Ensemble Methodologies

Database: Netezza/Confidential PureData, Microsoft SQL Server, Oracle, DB2, MySQL, Microsoft Access, FileMaker Pro

Platforms: Microsoft Windows (all versions), Unix, Linux

Reporting/Dashboards: Tableau, QlikView, MicroStrategy, Business Objects, Pyramid Analytics, Microsoft PowerBI, SSRS

Software: SAS BI/Base/Enterprise Guide, R/RStudio, Jupyter, Zeppelin, SPSS, Weka, Gephi, Microsoft Visual Studio .NET, VMware, Toad/SQL*Plus/SQL Developer, Microsoft Office (including Office automation/VBA macros, Office 2013/365), Microsoft SQL Server (SSMS/SSIS/SSRS/SSAS), JUnit, NUnit, SharePoint (all versions), SharePoint Designer, WSAD (WebSphere Studio Application Developer), WAS (WebSphere Application Server), Eclipse, MSMQ, FileMaker Pro

PROFESSIONAL EXPERIENCE:

Data Architecture and Development Manager

Confidential - Charlotte, NC

Responsibilities:

  • Architected/designed, developed, and maintained production business/analytical/reporting layer of the Enterprise Data Warehouse
  • Abstracted dimensional structures that were agnostic to source systems, enabling information to be used and interpreted in the same way across all source systems
  • Acted as lead architect and developer for a new analytics platform (a cloud-based Hadoop environment)
  • Developed machine learning record linkage algorithms in Hadoop (Scala/Spark) as the first foundational pieces of this new analytics platform (record linkage for physicians, patients, and care locations), serving as the base Master Data for these domains
  • Instrumental in growing a team of 3 to a team of over 30 analysts/developers within 4 years
  • Spearheaded initiatives to bridge the knowledge/skill gap between analysts and technical team members:
  • Weekly “Office Hours” web meetings (for 130+ attendees) to answer questions about and increase understanding of the enterprise data warehouse (including how to join/relate various data assets in SQL statements)
  • Led cross-functional teams of analysts and database developers in analytical pursuits
  • Developed frameworks for building/deploying production-quality statistical models (including automation by deploying models developed in R to run directly within the Confidential Netezza data warehouse appliance)
  • Acted as a bridge between analysts with limited database experience and technical data warehouse developers, drawing on a thorough understanding of both domains
  • Hosted “office hours” meetings to educate other team members about data availability and how to use/join various data assets
  • Worked closely with internal customers to design and implement database and web application solutions for both business and clinical needs
  • Developed ASP.NET web forms as a means of collecting data from physician practices and hospitals (surveys, forms for updating practice information - such as addresses, employees and their roles, etc.)
  • Developed ETL processes for automated database updates, including aggregate processes to perform some calculations within the database to make reporting more efficient
  • Designed and implemented role- and location-based security to limit the data an employee could access based on their role and facility/practice
  • Led POCs with third party vendors to evaluate their products/solutions and their ability to fulfill the needs of the organization, and provided recommendations to drive acquisition of such products/solutions:
  • Data Visualization and Exploration Platform (Tableau selected after evaluation against Microsoft PowerBI, MicroStrategy, Cognos, SAP Lumira/HANA, QlikView, and Pyramid Analytics)
  • Hadoop/Analytics Platform (HDInsight/Hortonworks within Microsoft Azure selected after evaluation against Hortonworks within the internal data center, Cloudera, and Confidential BigInsights)
  • Acted as team lead and primary developer for the following projects:
  • Evaluated multiple cloud-based Hadoop vendor offerings on criteria ranging from cost and maintenance to out-of-the-box capabilities to form a recommendation
  • Worked with vendors to conduct pilot projects within their various environments
  • Assisted in forming the business case for funding considerations to build a Hadoop environment
  • Successfully acquired funding for the platform and approval to form a new team devoted to the platform
  • Led the team in architecting the environment and configuration management plan, along with development standards and processes for the environment
  • Established the plan for integration of existing relational enterprise data warehouse with Hadoop (including translation between traditional data models to those optimized for Hadoop)
  • Worked closely with the Population Health Analytics team to locate and acquire potential variables that could describe a 360-degree view of a patient’s use of our care locations and that could also feed segmentation of the patient population
  • Incorporated external data sources to supplement internal source systems for a more robust view of patient information than Confidential has ever had before
  • Designed and built a data mart to serve this data to reports and analytical needs
  • Worked with a team of analysts to build/evaluate segmentation models and the variables that best describe population segments, combining insights found from unsupervised cluster analysis with business knowledge/understanding to build a more robust supervised segmentation model
  • Designed and developed an Emergency Department data mart, dashboard, several reports, and a self-service BI tool (via Business Objects), including over 80 metrics tracking patient flow, bed management, Laboratory and Radiology orders, etc.
  • Leveraged this data mart to build an ED Utilization predictive model to forecast utilization across various facilities, identifying locations with low utilization whose providers could be reassigned temporarily to the high utilization facilities for better resource management
  • Worked closely with SMEs (Subject Matter Experts) in the various Emergency Departments across Confidential’s 30+ hospitals to define metrics and map out data points across their various software/database systems
  • Conducted sessions with representatives from the different hospitals to enable them to use the data mart as a self-service BI tool for their own ad-hoc data questions and reporting
  • Designed and developed a data mart and Tableau visualization for aggregated system-wide performance measures whose audience included executive-level CHS leaders
  • Worked closely with analysts/”Strategic Priority Owners” to define measures/requirements
  • Designed and developed a Transplant data mart for tracking pre-transplant activity (i.e. time between referral for a transplant and placement on the waitlist) and post-transplant activity (i.e. measures of survival/transplant success)
  • Leveraged the Transplant data mart for survival analysis, predicting when complications are likely to surface for transplant patients
  • Designed and developed a Critical Care data mart (including Virtual ICU data), with more than 30 metrics tracking hospital-acquired conditions, infection control, patient mortality rates, readmission rates, and cost per visit (both variable and fixed costs)
  • Created a web-based interactive dashboard allowing users to form their own comparisons and filters of the data
  • Designed and developed a System Metrics data mart, with more than 30 metrics tracking net revenue/percent margin, cost per visit, payer mix, length of stay, etc.
  • Created interactive data visualizations using the Microsoft Office 2013/Office 365 PowerBI stack as part of Microsoft’s Rapid Deployment Program (RDP), allowing early use of their BI tools in beta release
  • Designed and developed a Levine’s Cancer Institute data mart with more than 30 financial and clinical metrics
  • Created an interactive Flash dashboard using Xcelsius, which allowed for animated charts for high-level reporting but also included a self-service BI aspect permitting users with the correct access to drill down to the patient level to see every encounter used to derive each result
  • Designed and developed a Quality Metrics data mart and SSAS Cube with more than 15 quality metrics designed around Diabetes and Asthma population control
  • Worked closely with the Quality/Performance Enhancement teams to work through definitions for the metrics and included population
  • Conducted sessions with members of the Quality and Performance Enhancement teams to enable them to use the data mart/cube within the new Microsoft Office 2013 PowerBI stack to perform their own in-depth data exploration to assist in creating system-wide goals for the 2014 calendar year
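
The record linkage work above was done in Scala/Spark; a toy version of the matching idea can be sketched in Python. This is a hypothetical, simplified illustration (the record shapes, names, and threshold are invented, and a production pipeline would use blocking, multiple fields, and a trained model rather than a single string score):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1] (case-insensitive)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_records(source_a, source_b, threshold=0.85):
    """Link records from two source systems whose name fields are
    similar enough to be treated as the same real-world entity."""
    links = []
    for rec_a in source_a:
        best, best_score = None, 0.0
        for rec_b in source_b:
            score = similarity(rec_a["name"], rec_b["name"])
            if score > best_score:
                best, best_score = rec_b, score
        if best is not None and best_score >= threshold:
            links.append((rec_a["id"], best["id"], round(best_score, 2)))
    return links

# Hypothetical physician records from two source systems:
ehr = [{"id": 1, "name": "Jane A. Smith"}, {"id": 2, "name": "Dr. Robert Lee"}]
billing = [{"id": "B-77", "name": "Jane A Smith"}, {"id": "B-12", "name": "Rob Lee"}]
print(link_records(ehr, billing))  # → [(1, 'B-77', 0.96)]
```

Note that "Rob Lee" falls below the 0.85 threshold and is left unlinked; in practice such near-misses would be routed to a review queue or scored by the learned model.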

Research and Development Systems Engineer

Confidential - Charlotte, NC

Responsibilities:

  • Executed every phase of the software development lifecycle as the sole Software Engineer/Developer in the company (requirements gathering with customers, design, documentation, implementation, testing, maintenance, and support)
  • Developed an application to analyze transitions of chromatography columns for pharmaceutical/biotechnology manufacturing facilities to provide performance metrics to be used for process optimization/quality control and ensuring compliance with FDA standards
  • Developed an OPC client to provide control systems with real-time statistical analysis calculations to allow for programmatic adjustments to the control system while a process is running, which effectively resulted in real-time predictions of where batch processes were headed so that operators could proactively make adjustments to avoid negative outcomes
  • Collaborated with a team of engineers (not employed by Confidential, located in Cambridge, MA and Austin, TX) to create an interface/adapter bridging an MES/ERP web application with data from an OPM (Oracle Process Manufacturing) system as part of a Weigh and Dispense manufacturing process, leveraging MSMQ to serialize and queue transactions so that any failed transactions could be re-executed at a later time
  • Designed and implemented a machine-specific software copy protection mechanism that could easily be incorporated into all Confidential-developed applications, allowing trial periods of the fully functional applications while preventing users from uninstalling/reinstalling to restart the trial period
  • Developed web services and extensively used SOAP for making web service calls for validation and testing
  • Developed SQL-LIMS applications for Savannah River Site to track/store/analyze/report nuclear waste for ensuring compliance with US Department of Energy standards
  • Developed OPC clients as Windows Service applications to interface between Process Control Systems (particularly DeltaV) and other applications such as web portals for dashboards (Incuity, AMS Asset Portal, SharePoint, Iconics, Syncade), other control systems, MS Office, and web services
  • Developed an application, named Documentor, that accepts an .fhx file (a text file export of DeltaV configuration data and logic) and creates documentation in the form of HTML tables and Excel spreadsheets, and also generates Visio diagrams detailing the logic (resembling state machines) of the control system
  • Enhanced the Documentor application to ensure proper installation and operation in multiple locales (character sets, language settings, etc.), successfully delivering the application to customers all over the world (Netherlands, Switzerland, Brazil, Germany, United States, Canada)
  • Wrote user guides, design documents, and test protocols, and worked with customers (internal and external to Confidential) to create requirement specifications for all applications developed
  • Created internal system tools allowing co-workers to create lengthy design documents through MS Office automation in minutes instead of weeks
  • Acted as the technical support contact for all developed applications, issuing KBAs/hotfixes as needed
  • Developed automated reporting applications to combine data from multiple remote sources into a single report
  • Administered sessions to educate colleagues and customers on the use of developed applications
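
The real-time statistical layer described above (an OPC client feeding calculations back to the control system) can be illustrated with a minimal sketch. This is a hypothetical simplification (the class and parameter names are invented; the real system read live OPC tags): it keeps rolling statistics over recent process samples and raises an alarm when a new reading falls outside 3-sigma control limits, letting operators adjust before a batch drifts out of spec.

```python
from collections import deque

class RollingProcessMonitor:
    """Rolling-window SPC check: alarm when a new sample deviates
    more than 3 standard deviations from the recent window's mean."""

    def __init__(self, window=50):
        self.samples = deque(maxlen=window)

    def update(self, value):
        """Check the new value against current control limits,
        then add it to the window. Returns True on an alarm."""
        n = len(self.samples)
        alarm = False
        if n >= 2:
            mean = sum(self.samples) / n
            var = sum((x - mean) ** 2 for x in self.samples) / (n - 1)
            std = var ** 0.5
            alarm = std > 0 and abs(value - mean) > 3 * std
        self.samples.append(value)
        return alarm

monitor = RollingProcessMonitor(window=10)
for reading in [10.0, 10.1, 9.9, 10.0]:  # steady process readings
    monitor.update(reading)
print(monitor.update(15.0))  # → True (well outside the control limits)
```

A bounded `deque` keeps the check O(window) per sample, cheap enough to run inline with each OPC data change event.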

WebSphere Software Test Engineer (Co-op)

Confidential - Durham/RTP, NC

Responsibilities:

  • Worked almost exclusively on Unix/Linux environments
  • Set up Blade servers (including RAID configuration), cluster environments, and extensively used LDAP, DB2 and Oracle
  • Tested a SIP (Session Initiation Protocol) API developed by Confidential, including ensuring that the API met all criteria specified in RFC 3261
  • Created automated stress tests to send random bursts of messages over a long period of time to test load balancing and throughput of Confidential’s SIP API
  • Wrote and executed test cases against Confidential’s WebSphere Application Server, including testing on real and simulated mobile devices to ensure proper rendering of web pages
  • Wrote defect reports and worked closely with Software Engineers to set up environments for reproducing defects and verify fixes
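
The burst-style stress tests mentioned above can be sketched generically. The code below is an illustrative stand-in, not the original test harness (the real tests sent SIP messages over the network; here `send_message` is a hypothetical stub): it fires randomly sized bursts of messages through a worker pool and reports throughput.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def send_message(msg_id):
    """Hypothetical stand-in for sending one SIP request;
    a real test would use an actual network transport."""
    time.sleep(0.001)  # simulate network latency
    return msg_id

def burst_stress_test(total_messages=100, max_burst=10, workers=8, seed=42):
    """Send messages in randomly sized bursts and measure throughput."""
    random.seed(seed)
    sent = 0
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while sent < total_messages:
            burst = min(random.randint(1, max_burst), total_messages - sent)
            list(pool.map(send_message, range(sent, sent + burst)))
            sent += burst
    elapsed = time.perf_counter() - start
    return sent, sent / elapsed  # (messages sent, messages/second)
```

A long-running variant would loop with randomized idle gaps between bursts and assert that measured throughput stays above a service-level floor.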

Virtual Director

Confidential - Raleigh, NC

Responsibilities:

  • Collaborated with a team of 5 students to develop portions of a “Virtual Director” for video games, an artificial intelligence application for dynamically controlling camera placement and movement, as well as soundtrack nuances, to heighten player experience, narrative, and action (e.g. pivoting the camera around environmental structures to show upcoming obstacles, and adjusting volume and song selection to influence the mood of a particular situation)

Senior Design Project, CSC 492

Confidential - Raleigh, NC

Responsibilities:

  • Acted as team lead with a group of 4 other students to develop software for controlling acceleration, braking, and steering of an automated vehicle
  • Developed software to analyze video from a camera mounted on the vehicle, using filtration/noise reduction techniques to remove greyscale elements from video in order to accurately map boundaries of a road and send suggestions to the vehicle’s controls based on real-time and interpolated future road conditions
  • Developed methods of using alternative analyses of the video feed, selecting and applying different noise/filtration algorithms based on whether the vehicle was on- or off-road, including methods of collision avoidance for obstacles that might appear in the vehicle’s path
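
The greyscale-removal step above can be illustrated with a toy pixel filter. This is a hypothetical simplification of the idea (the real system operated on live video frames): a pixel whose R/G/B channels are nearly equal is treated as greyscale (pavement, shadows) and masked out, leaving colored features such as painted road boundaries for the mapping stage.

```python
def mask_greyscale(pixels, tolerance=12):
    """Return a mask over (r, g, b) pixels: True keeps a 'colored'
    pixel (e.g. a painted road boundary); False filters out a
    near-greyscale pixel whose channel spread is within tolerance."""
    mask = []
    for r, g, b in pixels:
        spread = max(r, g, b) - min(r, g, b)  # 0 for pure grey
        mask.append(spread > tolerance)
    return mask

# Grey pavement, a red road marking, and a faintly tinted shadow:
print(mask_greyscale([(100, 100, 100), (200, 50, 50), (120, 125, 118)]))
# → [False, True, False]
```

The `tolerance` parameter is the knob the on-road/off-road mode switch described above would tune per algorithm.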
