Data Engineer Resume
Alexandria, VA
SUMMARY:
- A collaborative professional with substantial experience and expertise in big data analytics, data mining, establishing, measuring, and reporting on cloud and Hadoop architectures, and executing solutions for complex business problems involving large-scale data warehousing, real-time analytics, and reporting solutions.
- Extensive experience supporting multiple DBMSs (IBM DB2, Oracle, IBM IMS, SQL Server, PostgreSQL, MySQL, MongoDB), MongoDB and Hadoop database administration, capacity planning, application support, architectures, and IMS-to-DB2, VSAM-to-DB2, and IDMS-to-DB2 migration projects.
- Adept at managing multiple concurrent projects, being attentive to detail, and maintaining the ability to make rational decisions in pressure situations. Willingness to own responsibility and work to resolve issues.
- Ability to work with commitment, passion, and drive to exceed customers' expectations, both as an individual and as a team member, with the knowledge that each customer's expectations are unique. Willingness to take a lead role when required.
- Exceptional interpersonal skills with the ability to clearly communicate with non-technical and technical persons at all levels, both within and outside of the organization.
- Approximately three decades of core IT experience in three different countries (US, India, and Singapore) with varying cultures, environments, and work ethics.
- Some of the corporations that I have worked for are Confidential.
- Expert DB2 Database Administrator: database administration, systems administration, DB2 installation support, support for DB2 new-release availability, scalability, and performance enhancements, planning for migration, conversion, and fallback, support for the SMP/E steps involved in DB2 migrations, RSU support, DB2 capacity planning, and performance monitoring and tuning of DB2 subsystems and databases.
- Expert in Database Utility Jobs, using IBM, BMC and CA utilities (QUIESCE, RUNSTATS, REORG, STOSPACE, IMAGE COPY, FLASH (SUSPEND, COPY, RESUME), UNLOAD, DSNTIAUL, DSN1COPY), IBM Administration Tool, RC-Query, RC-Migrator, Data-Analyser, Catalog Manager, Omegamon, PBG, PBR, Inline LOB, CLOB, BLOB, XML etc.
- Expert Support for DB2 Ver 9, Ver 10, Ver 11, & Ver 12 utilities. High proficiency in DB2 Version Migrations, Expert Support for Disaster Recovery Exercise, 24X7 Support for DB2 Systems / Applications, Expert support for DB2 Systems Utility Jobs (DSNTIJMV, DSNTIJCA, DSNTIJIN, DSNTIJUZ, DSNTIJEX, DSNTIJVC, DSNTIJTM, CATENFM, CATMAINT, & DSN1CHKR), Estimating DB2 Storage needs for DB2 Migrations, Post Implementation validation and verification Support.
- CA-7, Zeke/Zebb, and Zena automated job schedulers for batch jobs (date- and time-driven scheduling, and event-driven scheduling).
- Support for database protocols, communication protocols, the communications database, installation considerations, estimating buffer pool sizes, storage, TCP/IP communication, etc.
- High proficiency in data sharing: backup and recovery architecture in data sharing, Parallel Sysplex performance, data-sharing performance, system-managed duplexing, planning, maintenance, and operational support for DB2 data sharing, and installation and enabling of Geographically Dispersed Parallel Sysplex (GDPS, across the Waukegan, IL and Fort Worth, TX data centers). Support for setting up performance expectations, working with RMF reports, performance monitoring and tuning, improving the performance of data-sharing applications, improving concurrency, tuning group buffer pools, etc.
- Expert support for operation, monitoring, installation, and trace setup with IBM Tivoli OMEGAMON for z/OS, BMC MainView, Detector, Apptune, DB2 PM, IBM InfoSphere Change Data Capture (CDC), and Classic Data Architect (CDA).
- Expert application support for easier development and integration of business applications; expert support for database design for optimal application performance, creation and development of physical and logical models (relational and XML), SQL fine-tuning, triggers, stored procedures, UDFs, locking and concurrency, tablespace lock escalations under application load, distributed processing, PeopleSoft, etc.
- Expert in data replication; support for UNICODE, ASCII, and EBCDIC translations; support for and implementation of data conversion.
- High Proficiency in IMS DBA utilities, IMS DB Control, DBRC, COPE, OMEGAMON/IMS, IMS Version Migration Support. Maintenance support to convert segments of database, changing data in segment, adding logical relationships, Reorg, Unload, Load Utilities, unidirectional, & bidirectional symbolic pointers, adding secondary Index, DEDB, DBDGEN, PCBGEN, ACBGEN, PHDAM, PHIDAM, HALDB, DFSURGU0, DFSURGL0, DFSURRL0, DFSURUL0, DFSRRC00, DFSUOCU0, Backup, DLI, DL2, CDC, CDA & Recovery.
- High proficiency in designing, building, and administering Oracle clustered server configurations supporting 10g and 11g Real Application Clusters (RAC) installations on Linux, Solaris, and AIX.
- Ability to develop stored procedures, functions, triggers, PL/SQL procedures, and Korn shell scripts.
- Installing and configuring Oracle Clusterware and database software (troubleshooting and handling common issues that arise during integration of the whole stack: bugs, network issues, configuration file issues, OCR issues, and de-installation and cleanup of Clusterware).
- Ability to work with customers on various RAC-related issues: RAC recovery, ASM, OCR corruption, voting disk loss, etc.
- Troubleshoot performance issues for the RAC instances (GC events)
- Backup and recovery issues (loss of OCR, voting disk, and Oracle Clusterware), as well as corruption of individual disk data blocks and loss of OS and network configuration files. Also used merge (incrementally updated) backups for terabyte-scale databases, and handled issues with block change tracking and the flash recovery area.
- Adding or removing nodes from a RAC cluster.
- Installing RAC and managing RAC environments efficiently.
- Proficient in RAC tuning.
- Developed parser and loader MapReduce applications to retrieve data from HDFS and store it in HBase and Hive.
- Imported data from MySQL into HDFS using Sqoop.
- Imported unstructured data into HDFS using Flume.
- Used Oozie to orchestrate the MapReduce jobs that extract data on a schedule.
- Wrote MapReduce Java programs to analyze log data for large-scale data sets (a representative sketch follows this summary list).
- Used the HBase Java API in Java applications (see the HBase loader sketch after this list).
- Automated all jobs for extracting data from different data sources such as MySQL and pushing the result sets to the Hadoop Distributed File System.
- Customized the parser/loader application for data migration to HBase.
- Developed Pig Latin scripts to extract data from the output files and load it into HDFS.
- Developed custom UDFs and implemented Pig scripts.
- Implemented MapReduce jobs using the Java API, Pig Latin, and HiveQL.
- Participated in the setup and deployment of Hadoop cluster.
- Hands-on design and development of an application using Hive UDFs.
- Responsible for writing Hive queries for analyzing data in the Hive warehouse using Hive Query Language (HQL).
- Provided support to data analysts in running Pig and Hive queries.
- Worked extensively with HiveQL and Pig Latin.
- Imported and exported data between MySQL/Oracle and Hive using Sqoop.
- Imported and exported data between MySQL/Oracle and HDFS.
- Configured HA cluster for both Manual failover and Automatic failover.
- Designed and built many applications to deal with vast amounts of data flowing through multiple Hadoop clusters, using Pig Latin and Java-based map-reduce.
- Specified cluster sizes, allocated resource pools, and configured the Hadoop distribution by writing the specifications in JSON format.
- Created a Solr schema from the indexer settings and implemented Solr index cron jobs.
- Experience in writing Solr queries for various search documents.
- Responsible for defining data flows within the Hadoop ecosystem and implementing them.
- Exported the result set from Hive to MySQL using Shell scripts.
- Developed HIVE queries for the analysts.
- Environment: Apache Hadoop, Hive, Hue, ZooKeeper, MapReduce, Sqoop, NiFi, Ambari, Crunch API, Pig 0.10 and 0.11, HCatalog, Unix, Java, JSP, Eclipse, Maven, SQL, HTML, XML, Oracle, SQL Server, MySQL.
- MongoDB database administration.
- MongoDB schema design.
- Installation, configuration, and deployment of MongoDB.
- Ability to support creating shards, replica sets, monitoring, and projections for MongoDB systems (a driver-level connection sketch follows this list).
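Illustrative MapReduce log-analysis sketch (referenced from the MapReduce bullet above): a minimal Hadoop job that counts HTTP status codes in access-log lines. The class names, the assumption that the logs are space-delimited with the status code in the ninth field, and the input/output paths are hypothetical and not the original production code.

```java
// Hypothetical log-analysis MapReduce job: counts HTTP status codes in
// space-delimited access-log lines (log layout and class names assumed).
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class StatusCodeCount {

    public static class LogMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text statusCode = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumes the status code is the 9th space-delimited field (common log format).
            String[] fields = value.toString().split(" ");
            if (fields.length > 8) {
                statusCode.set(fields[8]);
                context.write(statusCode, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "status-code-count");
        job.setJarByClass(StatusCodeCount.class);
        job.setMapperClass(LogMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. an HDFS log directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // e.g. an HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```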
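Illustrative HBase loader sketch (referenced from the HBase Java API bullet above): a minimal example of writing a parsed record into an HBase table with the standard client API. The table name "ingest_events", the column family "d", and the record layout are assumptions for illustration only.

```java
// Hypothetical HBase loader: writes a parsed record to an HBase table.
// Table name, column family, and record layout are assumptions.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseRecordLoader {

    private static final byte[] FAMILY = Bytes.toBytes("d");

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("ingest_events"))) {

            // In the real loader these records come from parsed HDFS files;
            // a single hard-coded record is used here for illustration.
            String[] record = {"row-0001", "customer_id", "42"};

            Put put = new Put(Bytes.toBytes(record[0]));
            put.addColumn(FAMILY, Bytes.toBytes(record[1]), Bytes.toBytes(record[2]));
            table.put(put);
        }
    }
}
```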
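Illustrative MongoDB replica-set sketch (referenced from the MongoDB bullet above): a minimal Java-driver example that connects to an assumed three-member replica set and runs a query with a projection. The host names, replica-set name, database, collection, and field names are assumptions, not an actual deployment.

```java
// Hypothetical replica-set client: connects to an assumed three-member MongoDB
// replica set and runs a projected query; hosts, database, collection, and
// field names are illustrative only.
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import static com.mongodb.client.model.Filters.eq;
import static com.mongodb.client.model.Projections.excludeId;
import static com.mongodb.client.model.Projections.fields;
import static com.mongodb.client.model.Projections.include;

public class MemberLookup {
    public static void main(String[] args) {
        String uri = "mongodb://mongo1:27017,mongo2:27017,mongo3:27017/"
                + "?replicaSet=rs0&readPreference=secondaryPreferred";
        try (MongoClient client = MongoClients.create(uri)) {
            MongoCollection<Document> members =
                    client.getDatabase("membership").getCollection("members");

            // Projection returns only the fields the report needs.
            for (Document doc : members.find(eq("status", "ACTIVE"))
                                       .projection(fields(include("memberId", "plan"), excludeId()))) {
                System.out.println(doc.toJson());
            }
        }
    }
}
```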
TECHNICAL SKILLS:
Programming Languages: PL/SQL, COBOL, JCL, DB2, IMS, SMP/E, REXX, CICS, UNIX shell programming, SQL, C++, Linux.
Software/Databases: Python, Java, Scala, SQL, Pig Latin, MongoDB, DB2, Oracle 11g/10g/9i/8i on UNIX and NT, Hadoop, JSON, BSON, IMS, Easytrieve Plus, CICS, MQ Series, IMS/DC
DB2 Tools & Utilities: DB2 Administration Tool, IBM Tivoli OMEGAMON, IBM utilities, BMC utilities, CA tools & utilities, Data Studio, BMC MainView, QMF, Visual Explain, ER Studio, JCL, CA Log Analyzer, BMP, PSB, DBD, PeopleSoft DB2 DBA.
Oracle Utilities: DBArtisan, Erwin, TOAD, SQL Station, Quest Software tools (Spotlight, SQL Lab Vision, Storage and Performance Management), Recovery Manager (RMAN), Tivoli ADSM for Oracle on AIX, Veritas NetBackup for Oracle on Solaris and NT, DBVerify, Exp/Imp, expdp/impdp, Server Manager, SQL*Loader, LogMiner, Statspack, and Oracle Enterprise Manager Grid Control.
Big Data Tools: Hadoop (Hortonworks, MapR, and Cloudera/CDH distributions), MapReduce, YARN, Hive, HDFS, Pig, HBase, Flume, Sqoop, R (regression, predictive analysis, data mining, sentiment analysis), Hue, Cloudera Manager, Apache Ambari, NiFi, Solr, Adobe SiteCatalyst, RapidMiner, Mahout, Tableau, MS Office, Progress 4GL, SQL (T-SQL, PL/SQL, MySQL, Hive SQL, PostgreSQL), MDX, UNIX shell, Awk, Java, Perl, Python, RDBMS, NetBeans, Visual Studio, ETL.
Operating Systems: MVS/ESA, z/OS, AIX, IBM z13 series, HP 9000, Red Hat Linux, UNIX (Sun SPARC Solaris, AIX)
Systems: Cloud Computing, Clustered Computing, Distributed File Systems, Business Intelligence Systems, Data Mining Systems, Reporting and Dash boarding Systems.
Web Applications: J2EE
PROFESSIONAL EXPERIENCE:
Confidential, Alexandria, VA
Data Engineer
Responsibilities:
- CBP is a federal agency. I am currently involved as a Database Engineer providing support for Big Data and Oracle (OLAP and OLTP) and for the Datacom modernization effort for the Passenger and Cargo Systems.
Confidential, Mason, OH
Consultant Database Administrator
Responsibilities:
- Anthem is the nation's largest healthcare insurance provider, with a membership of over 50 million. I was involved as a Database Administrator with the modernization of legacy IMS and DB2 database systems to Oracle, in addition to everyday monitoring, maintenance, enhancements, and new database development for applications.
Confidential, Chicago, IL
Sr Database Administrator
Responsibilities:
- HCSC is one of the nation's premier healthcare insurance providers, with a membership of over 13 million. I was involved as a Database Administrator responsible for zero-downtime operation, supporting maintenance, enhancements, and new database development for applications, DB2 migrations, DB2 systems, performance tuning, and capacity planning for applications and DB2 systems.
- Responsible for high availability and accessibility of databases; this effort involves 24x7 support of DB2 subsystems for application interfaces on z/OS and the web.
- Extended support to Oracle, Hadoop, and MongoDB alongside DB2 to embrace a multi-DBMS support model and the paradigm of managing open-source software and commodity hardware to reduce costs.
- Supported setting up a cluster with 3 master nodes, 47 worker nodes, and 15 edge nodes. Developed a process to ingest structured and semi-structured data into HBase using Sqoop, and helped applications with data analysis using HQL (a representative HiveServer2 query sketch follows this list). Helped support Flume ingestion, configuring sources, channels, and sinks, and set up automated backups using Oozie.
- Supported management's paradigm of globalization of IT services and embraced methodologies such as DevOps, Scrum, Waterfall, and Extreme Programming. The future-state architecture contains the mainframe footprint while transitioning to open-source platforms and architectures.
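Illustrative HiveServer2 query sketch (referenced from the HQL bullet above): a minimal Java JDBC example of the kind of HQL aggregation run for application data analysis. The host, database, table, columns, and credentials are assumptions and do not reflect the actual environment; the hive-jdbc driver is assumed to be on the classpath.

```java
// Hypothetical HiveServer2 client: runs an HQL aggregation over an assumed
// claims table via JDBC; host, database, table, and columns are illustrative.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ClaimsByMonthReport {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:hive2://hiveserver2.example.com:10000/claims_db";
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT claim_month, COUNT(*) AS claim_count "
                   + "FROM claims GROUP BY claim_month ORDER BY claim_month")) {
            while (rs.next()) {
                System.out.println(rs.getString("claim_month") + "\t" + rs.getLong("claim_count"));
            }
        }
    }
}
```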
Confidential, Bloomington, IL
DB2 Database Administrator/ Data Specialist
Responsibilities:
- State Farm Corporate South is the centralized headquarters for the company's data processing functions. State Farm is the nation's leading auto insurer.
- Was involved as a project DBA in the FSS group with maintenance, enhancements, and new database development for applications. Responsible for physical database design, development of databases, maintenance jobs, and performance validation using SQL Ease, BMC MainView, and Explain.
- DB2 Administration Tool, CA tools, Platinum Products, BMC Mainview, BMC tools, IBM tools, SQL Ease etc.
- UNICODE, EBCDIC, ASCII translations, Information Integrator, DPROP.
Confidential, Baltimore, MD
DB2 Database Administrator/ High Performance Design Analyst
Responsibilities:
- Confidential supports the complete range of services related to Social Security Administration’s major systems modernization initiatives.
- Involved with analysis and design of the DB2 Version 8 release implementation on very large Social Security Administration databases and the migration of databases from BDAM to DB2. The objective was a 15% performance gain with DB2 Version 8. The analysis also covered the impact of EBCDIC-to-UNICODE storage on applications and on DB2 systems.
- DB2 Administration Tool, CA tools, Platinum Products, BMC Mainview, BMC tools, IBM tools
Confidential
Senior Consultant
Responsibilities:
- Sogeti provides systems integration and other technology services, offering expertise in areas such as data migration, software testing, applications management, and database administration.
- Data migration & Database Administration.
Confidential, Columbia, SC
Consultant
Responsibilities:
- Expert support for installing, analyzing, and testing new IBM DB2 releases (version upgrades to V7.0), enabling parallel processing, and the annual disaster recovery exercise. Provided expert operational, monitoring, installation, and trace-setup support with IBM Tivoli OMEGAMON for z/OS, BMC MainView, Detector, Apptune, etc.
- Performance analysis and capacity planning on an S/390 Sysplex system using Intune, DB2 Explain output, and Strobe. The major performance determinant for DB2 lies in how programs and applications use DB2 resources; tuning information and detailed analysis were used to minimize excessive DB2 resource consumption, excessive lock/latch time, synchronous I/O suspension time, asynchronous read suspensions, etc.
- Designed new, complex systems involving DB2 buffer pool tuning backed by optimized data analysis.
- Logical design from project objective specifications, along with physical databases and in-depth application walkthroughs, with a detailed focus on testing, development, and compatibility of data with other dependent applications. Performance validation and performance investigation were mandatory in all of my projects to achieve zero post-implementation problems; the simple rule followed was "Investigate performance early, validate performance last." Metrics controlled the projects that worked; the ideology was "If you can't measure it, you can't manage it," and defects were uncovered at a faster rate than they were being corrected early in the development cycle. Database design flaws were minimized with a simple checklist covering referential integrity, indexing strategies, locking architectures, denormalizing data, and file and data partitioning. Reviews enhanced quality, and more than one hundred situations were corrected over seven years. The aim was to meet implementation deadlines and customer expectations.
- Migration of VSAM to DB2, in line with the organization's paradigm of a single data infrastructure for growing business and technical needs: faster processing of claims, customer access to data, deployment of new applications, consistency in data quality, real-time 24x7 data availability, support for running Sysplex environments, use of DB2 partitioned table space features, and concurrency of data updates between online and batch.
- DB2, IMS, Utilities (DB2 & IMS), DB CNTL, RECON, DLI, BMP, JCL, DBD Map, PSB Map, DB2 Administration Tool, Platinum Products, VSAM, COBOL, APS, CICS, SMARTTEST, QMF, DB2I, ACCESS/DB2, DB2 PM, EXPLAIN, DB2 Estimator, Intune, Strobe, DSNZPARM (Accounting, Statistics, Performance), SMP/E.
Confidential
Advanced Systems analyst/ Developer/Programmer
Responsibilities:
- Led a team of around 80 resources for projects related to the client Confidential.
- Took responsibility for a highly challenging and time-sensitive project involving the split of the Confidential system into the Confidential and Delphi Automotive systems for the Pension and Retirement System (PARS) subsystem.
- The Pension and Retirement System (PARS/CRIS) of Confidential supports pension and retirement maintenance for all Confidential employees. This system is part of the HR system that Confidential supports for Confidential across NAO; it processes retirement benefits for all retired employees and generates monthly retirement checks and reports.
- Confidential India provides 24x7 support, enhancements, and maintenance (applications, database administration, and systems administration for DB2 and IMS subsystems) to the client Confidential.
- Support (lead offshore contact) for the client CND (Chevrolet National Distribution), Australia: running the daily batch production cycle for Confidential, Australia, processing orders from dealers all over Australia, prioritizing the orders, and automatically reallocating the colors and models of cars ordered and delivered.
Confidential
Senior Software Engineer
Responsibilities:
- Managed and led teams working on Y2K projects for the client CSX Transportation.
- Was involved in maintenance and Y2K changes to complex IMS applications involving primary keys, secondary keys, subsequences, and search fields of CSX databases. Made invaluable contributions to the development of common copybooks covering Y2K impacts in very complex IMS programs with affected SSAs, primary keys, secondary indexes, subsequences, IMS databases, and DB2 databases for the client CSX at MGS (India).
Confidential
Project Engineer
Responsibilities:
- Chief project engineer; managed multi-national teams across a wide spectrum of work: project design, proposal submission, project implementation, and customer acceptance.
- Unify Eng Pte. Ltd., Singapore, is an engineering firm involved in engineering design, manufacturing, and engineering construction projects in Singapore. Clients included Singapore Cable Vision and Thomson Multimedia.
Confidential
Project Engineer
Responsibilities:
- Sapphire Construction Company, India, is an engineering firm involved in design and engineering construction projects. Clients included National Thermal Power Corporation (NTPC) and Steel Authority of India (SAIL).