DevOps/DataOps Engineer Resume
SUMMARY
- DevOps/DataOps Engineer with more than 12 years of industry experience improving data efficiency and achieving continuous build and integration of code throughout the software development lifecycle.
- Broad knowledge of configuration management, CI/CD, data pipelines, and building dashboards for monitoring and analysis.
TECHNICAL SKILLS
Languages: Shell scripting, PL/SQL
Version Control: Git/GitHub
DevOps Tools: Jenkins, Docker, Ansible, Maven, Terraform, Kubernetes
App/Web Servers: Tomcat, Apache
Cloud Platforms: AWS (VPC, Route 53, IAM, EC2, S3, Subnets, Security Groups, ELB, Auto Scaling) and Google Cloud
Monitoring Tools: Splunk
Operating Systems: Linux and Windows
RDBMS/Data Warehouse: Oracle, MS SQL Server, Teradata, BigQuery
BI Visualization: Tableau, MS Power BI
SDLC Methodology: Waterfall, Agile
PROFESSIONAL EXPERIENCE
Confidential
DevOps/DataOps Engineer
Responsibilities:
- Design, implement, and maintain data pipeline tooling, developing best practices for data collection, ingestion, transformation, and storage.
- Communicate and collaborate with various teams (SMEs, developers, data scientists, and other stakeholders) to assess data needs and prioritize accordingly.
- Extract, transform, stage, and load data from multiple sources into BigQuery (see the load sketch after this list).
- Build, test, and maintain infrastructure and tools to support a consistent, automated development and release process for AI software solutions.
- Create and maintain Git workflows for version control (source code management) and coordinate build activities from development, QA, and UAT through to production.
- Develop and maintain automated CI/CD workflows and tools for code deployment using Git, Jenkins, Maven, Docker, Tomcat, Ansible, and Kubernetes (see the pipeline sketch after this list).
- Build and deploy Docker containers to migrate from a monolithic architecture to microservices.
- Provision servers and deploy features using Ansible.
- Orchestrate Docker container clusters using Kubernetes.
- Troubleshoot automation issues and find practical solutions that move projects forward in a timely manner.
- Write and debug Bash shell scripts to automate manual administration tasks and cloud deployments.
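
For illustration, a minimal sketch of the staged BigQuery load referenced above, using the gsutil and bq CLIs; the bucket, dataset, table, and column names are hypothetical placeholders:

```sh
#!/usr/bin/env bash
# Minimal sketch: stage an extract to Cloud Storage, load it into a raw
# BigQuery table, then transform it with a SQL step. All names are
# hypothetical placeholders, not actual project resources.
set -euo pipefail

BUCKET="gs://example-staging-bucket"   # hypothetical staging bucket
DATASET="analytics"                    # hypothetical dataset
TABLE="orders_raw"                     # hypothetical raw table

# Stage the extracted file to Cloud Storage
gsutil cp ./extracts/orders.csv "${BUCKET}/orders.csv"

# Load the staged file into a raw BigQuery table
bq load --autodetect --source_format=CSV --skip_leading_rows=1 --replace \
  "${DATASET}.${TABLE}" "${BUCKET}/orders.csv"

# Transform the raw data into a curated table
bq query --use_legacy_sql=false \
  "CREATE OR REPLACE TABLE ${DATASET}.orders AS
   SELECT order_id, customer_id, CAST(order_ts AS TIMESTAMP) AS order_ts
   FROM ${DATASET}.${TABLE}
   WHERE order_id IS NOT NULL"
```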
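
Similarly, a minimal sketch of the kind of build-and-deploy step a Jenkins job could invoke, assuming a hypothetical registry, service name, and Kubernetes deployment; in practice each step would typically run as a separate pipeline stage so failures surface individually:

```sh
#!/usr/bin/env bash
# Minimal sketch of a build-and-deploy step (Maven -> Docker -> Kubernetes).
# Registry, image, and deployment names are hypothetical placeholders.
set -euo pipefail

REGISTRY="registry.example.com/team"   # hypothetical registry
APP="sample-service"                   # hypothetical service; assumes the
                                       # container in the deployment shares
                                       # this name
TAG="$(git rev-parse --short HEAD)"    # tag the image with the commit hash

# Build and unit-test the artifact with Maven
mvn -B clean package

# Package the artifact into a Docker image and push it to the registry
docker build -t "${REGISTRY}/${APP}:${TAG}" .
docker push "${REGISTRY}/${APP}:${TAG}"

# Roll the new image out to the cluster and wait for the rollout to finish
kubectl set image "deployment/${APP}" "${APP}=${REGISTRY}/${APP}:${TAG}"
kubectl rollout status "deployment/${APP}"
```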
Confidential
Quality and Population Programmer/Analyst IV
Responsibilities:
- Developed a prototype for the HEDIS mart that led to the conversion of 1,500 SAS reports to SQL Server, increasing efficiency.
- Employed SQL and Tableau for data collection and analysis to deliver solutions in a timely fashion.
- Cut processing time by two-thirds by building an automation framework for the HEDIS data load (a minimal sketch follows this list).
- Led the design of the Quality and Population Analytics (QPA) Datamart by developing and maintaining SQL code to extract data from various data sources (Claims, Pharmacy, Clarity, Membership, and Third-Party Claims).
- Performed monthly quantitative reviews of all data for quality, worked with upstream data producers to track and improve data quality, and led additional research as needed to address unexpected findings.
- Supported production of the yearly, quarterly, and monthly HEDIS deliverables, Quality dashboard, and Benchmark reporting by increasing the availability of HEDIS reports from a yearly to a monthly run frequency and performing monthly quality checks on the data.
- Participated in the review, analysis, and reporting of HEDIS measure results to business teams, management, auditors, and regulatory and rating agencies.
- Led the design of database queries and reports for the HEDIS measures, QST, ECHO, and Quality Metrics suites of reports.
- Participated in the continuous development and improvement of the HEDIS measurement process.
- Developed documentation and created and executed work plans for programs, models, and reports, including business requirements, flow charts, and mapping documents.
- Consulted with and provided expertise to business partners and analysts outside the department on methods for extracting data from the QPA Datamart and its supported data sources.
- Assisted in educating other teams on the use of the QPA Datamart and in providing proper documentation.
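
For illustration, a minimal sketch of an automated, scripted HEDIS data load of the kind described above, driven by sqlcmd; the server, database, and script names are hypothetical placeholders:

```sh
#!/usr/bin/env bash
# Minimal sketch of an automated monthly HEDIS data load via sqlcmd.
# Server, database, and script names are hypothetical placeholders.
set -euo pipefail

SERVER="sqlprod01"          # hypothetical SQL Server instance
DB="QPA_Datamart"           # hypothetical datamart database
RUN_MONTH="$(date +%Y%m)"   # e.g. 202401

# Run each staged load script in order; -b aborts on the first SQL error,
# and -v passes the run month as a scripting variable ($(RunMonth) in SQL)
for script in 01_extract_claims.sql 02_load_membership.sql 03_build_measures.sql; do
  sqlcmd -S "$SERVER" -d "$DB" -b -v RunMonth="$RUN_MONTH" -i "sql/$script"
done

echo "HEDIS load for $RUN_MONTH completed."
```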
Confidential
Business Intelligence Analyst III
Responsibilities:
- Created & supported ad hoc and canned reports for internal & external customers by building queries using tools such as Cognos, Business Objects, SAS, SQL, Crystal, etc. to extract data from the Colorado Regional data store (CRDS) and Clarity.
- Worked with Business Partners to identify the questions they are seeking to understand; document report requirements and provided mentorship of data usage.
- Evaluated new reporting needs to ensure that steady, accurate and non-redundant, information is provided to end users.
- Contributed to Conducting and continuously improving an ongoing program to ensure reliability, validity and integrity of the data used for reporting and analysis.
- Built and proposed new business logic to the development of conformed data fields in CRDS and where applicable.
- Participated in defining new business rules, conducting analysis, and making recommendations to business partners based on the findings of the analysis.
- Developed basic data checks and tested proposed logic changes to ensure the reliability and accuracy of the Colorado Regional Data Store (a sample check appears after this list).
- Developed and maintained documentation on data content, access, criteria for populating BIS-supported data repositories, and known data problems.
- Contributed to the development of claims content documents, such as Claims 101 training for analysts and a source-to-target mapping document, to facilitate ETL development for All Claims.
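
For illustration, a minimal sketch of a basic scripted data check of the kind described above; the server, database, table, and column names are hypothetical placeholders:

```sh
#!/usr/bin/env bash
# Minimal sketch of a basic data check against a reporting store via sqlcmd.
# Server, database, table, and column names are hypothetical placeholders.
set -euo pipefail

SERVER="crdsprod01"   # hypothetical server hosting the regional data store
DB="CRDS"             # hypothetical database name

# Count claims rows with missing member IDs; -h -1 suppresses column headers,
# SET NOCOUNT ON suppresses the row-count message, tr strips whitespace
bad_rows=$(sqlcmd -S "$SERVER" -d "$DB" -h -1 -W -Q \
  "SET NOCOUNT ON; SELECT COUNT(*) FROM dbo.Claims WHERE MemberID IS NULL;" \
  | tr -d '[:space:]')

if [ "$bad_rows" -gt 0 ]; then
  echo "Data check failed: $bad_rows claims rows missing MemberID" >&2
  exit 1
fi
echo "Data check passed."
```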