Sr. Big Data Lead / DevOps Engineer Resume
New Jersey
PROFESSIONAL SUMMARY:
- 12.5+ years of strong IT experience in software platform building and successful implementation across Production, Development, Staging and QA environments.
- 3+ years of experience in administration, configuration and management of open-source technologies such as Spark, Kafka, ZooKeeper, Docker and Kubernetes on RHEL.
- Successfully implemented CI/CD (DevOps automation) using Git and Jenkins to build Docker images deployed on a Kubernetes cluster.
- Successfully set up platforms on a standalone Spark cluster, Oracle GoldenGate, Kafka and ZooKeeper clusters, and Neo4j.
- Knowledge of Cloudera and Hortonworks distributions.
- Hands-on experience with NoSQL databases such as Vertica and MongoDB.
- 6+ years of experience in ETL methodologies using IBM WebSphere DataStage for extraction, transformation, manipulation and aggregation of data; NetExpert rule writing; and Centura Team Developer.
- 7+ years of hands-on experience with various database technologies: Oracle, MySQL, Teradata, BTEQ scripting and PL/SQL.
- 6+ years of hands-on experience in data analytics for telecom networks, covering various call detail records (including E911), predictive data analysis and time-series analysis.
- Experience in database design using the E-R model, database programming in Oracle, and data modeling.
- Provide production support, development integration testing and user acceptance testing, and aid in root cause analysis for trouble tickets.
- Goal-oriented and innovative, with excellent people skills and the ability to manage change with ease across diverse situations, technologies and domains.
- Excellent problem-solving and strong analytical skills.
- As part of an Agile team, participate in scrum grooming sessions and plan and estimate capacity versus availability of resources and tasks.
TECHNICAL SKILLS:
Big Data / open-source technologies: Spark 1.6.2/2.1.0, Pig, Hive, Splunk, Neo4j 2.3.2, KNIME 3.1.x, Hue, ZooKeeper 3.4.8, R, Kafka 2.11.
CI/CD tools: Git, Jenkins, Nexus, Docker, Kubernetes.
Scripting languages: Python, Scala, Shell, Java.
Database: Oracle, PL/SQL, MySQL, Teradata.
NoSQL: MongoDB 3.2.9, Vertica.
Reports: Crystal Reports, Cognos Reports.
Other tools: TIBCO BPM, TIBCO BW, HP Quality Center, JIRA, I-TRACK, Rally Dev.
OSS COTS products: NetExpert 6.2, Clarity, MAXIMO (IBM Tivoli).
PROFESSIONAL EXPERIENCE:
Confidential, New Jersey
Sr. Big Data Lead / DevOps Engineer
Environment: Spark 1.6.2/2.1.0, Kafka 2.11, ZooKeeper 3.4.8, Oracle GoldenGate 12.2.0.1.1, Scala 2.10.4, MongoDB 3.2.9, UNIX, Git, Jenkins, Docker, Kubernetes
Responsibilities:
- Set up, configure and maintain the Development, QA and Production platforms of RAPTOR.
- Successfully implemented microservices using Git and Jenkins to build Docker images and deploy them on a Kubernetes cluster (CI/CD) via the Confidential Eco pipeline.
- Configure and administer the Spark standalone cluster.
- Configure and administer the Kafka and ZooKeeper clusters and the Oracle GoldenGate Replicats that deliver data to Kafka.
- Configure and administer Kafka topics per requirements.
- Developed and tested Spark jobs in Scala to consume streaming data from Kafka topics (see the sketch after this list).
- Install Redis, Node.js, CherryPy, Anaconda and Neo4j on RHEL.
- Tune MongoDB queries and collection design to improve performance.
- Monitor Spark jobs on the web UI and tune the Spark environment and code to improve job performance.
- Review the requirements in each iteration, define tasks, estimate effort and deliver within timelines.
- Developed shell scripts to transfer files and scheduled them via cron.
- Handle deployment and operations activities to build out the platform.
- Responsible for L1 support of the DEV, QA and Production environments.
- Lead the offshore team.
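Below is a minimal sketch of the kind of Spark-on-Kafka job described above, using the Spark 1.6 direct-stream API (spark-streaming-kafka). The broker list, topic name and per-batch processing are illustrative assumptions rather than the actual RAPTOR job:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaStreamSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-stream-sketch")
        val ssc  = new StreamingContext(conf, Seconds(10)) // 10-second micro-batches

        // Hypothetical broker list and topic name -- replace with real values.
        val kafkaParams = Map("metadata.broker.list" -> "broker1:9092,broker2:9092")
        val topics      = Set("cdr-events")

        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        // Placeholder processing: count the records in each batch.
        // A real job would parse each message and persist or enrich it.
        stream.map(_._2).foreachRDD { rdd =>
          println(s"records in batch: ${rdd.count()}")
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

Submitted via spark-submit with the spark-streaming-kafka artifact on the classpath, a job of this shape logs a record count for each 10-second batch.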
Confidential, New Jersey
Lead ETL / DevOps Engineer
Environment: DataStage 8.7, Teradata 14.10, Oracle 11g, Vertica, UNIX
Responsibilities:
- Developed DataStage jobs utilizing the Sequential File/Complex Flat File, Transformer, Filter, Modify, Join, Merge and Remove Duplicates stages.
- Develop stored procedures (PL/SQL) and scripts to dynamically monitor the data quality of processed data by comparing it with raw data, detecting issues such as data mismatches, data loss, KPI calculation errors and incorrect aggregation (see the sketch after this list).
- Create views over database links to access data from other databases.
- Analyze a variety of CDR data on Teradata/Vertica across various network elements to validate mobility call flows on 2G/3G/VoLTE technologies.
- Implemented Apache Kafka on Cambria (UEB) services.
- Troubleshoot data and report gaps in algorithms, KPI calculations and call pattern issues to stakeholders.
- Carry out analysis with BTEQ scripts using simulated lab call detail records as well as live network calls for AT&T.
- Review data mapping, data models, data flow/extraction/aggregation logic and data retention policies for new KPIs/deliverables of the E2E system with the principal architect and the development and test teams.
- Review requirements in each iteration, define tasks, estimate effort and deliver within timelines.
- Check the data quality of the E2E system, including the ETL tool, the predictive analytics layer and the various integrated systems.
- Understand and test the algorithms that identify different mobility call patterns and their complex KPI calculations.
- Conduct client demos before and after scheduled product releases to gather client feedback as part of Agile methodology.
- Coordinate production deployment activities, support the ORT phase for users, and conduct and track RCA on production tickets.
- Train resources and lead the offshore team.
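As an illustration of the raw-versus-processed comparison behind the data-quality checks above: the original implementation was a PL/SQL stored procedure, but the same idea can be sketched in Scala over JDBC. The connection string, credentials and table names below are hypothetical, and the Oracle JDBC driver is assumed to be on the classpath:

    import java.sql.DriverManager

    object DataQualitySketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical connection details -- replace with real values.
        val conn = DriverManager.getConnection(
          "jdbc:oracle:thin:@dbhost:1521/ORCL", "scott", "tiger")
        try {
          val stmt = conn.createStatement()
          // Count rows in a fixed, known table (names here are illustrative).
          def rowCount(table: String): Long = {
            val rs = stmt.executeQuery(s"SELECT COUNT(*) FROM $table")
            rs.next()
            rs.getLong(1)
          }
          val raw       = rowCount("cdr_raw")
          val processed = rowCount("cdr_processed")
          // Flag potential data loss: every raw record should survive processing.
          if (raw != processed)
            println(s"data quality alert: raw=$raw, processed=$processed")
        } finally conn.close()
      }
    }

A production check would extend this with per-KPI aggregate comparisons and alerting rather than a simple row-count match.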
Confidential
Lead Software Developer
Responsibilities:
- Delivered OSS southbound integration, including the NetExpert SA suite (Fault, Performance & SLA modules), and northbound integration with Remedy.
- Handled go-live activities and coordinated onsite and offshore activities.
- Handled the system administration part of NetExpert, including installation, configuration and integration of the different modules (FMS, PMS and DMP) in the SIT, UAT and Production environments.
- Executed the SIT & UAT plans onsite.
- Conducted NetExpert application and SOP training for the client in Fiji.
- Involved in the preparation of test cases, training documents and final handover documents delivered to the client.
- Developed southbound interfaces in NetExpert with various EMSs such as Marconi, Provision, Alcatel 1353, AXE switches and NGN for FMS and PMS.
- Developed cluster monitoring and startup scripts catering to failover scenarios.
- Designed and developed alarm correlation and escalation policies using DMP.
- Designed and developed fault & performance reports in Crystal Reports.
- Designed and developed heartbeat monitoring functionality.
- Performed requirement gathering, BRS preparation and integration with NX for the NGN implementation.
Confidential
Software Developer
Responsibilities:
- The first phase of the project involved a complete end-to-end implementation of the Clarity service assurance suite for 3G wireless services, comprising the SLA, Alarm, Fault and Network Diagrammer modules.
- Involved in the requirement gathering and analysis phase.
- Solution design, implementation, configuration and testing of the SLA, Fault, Alarm and Network Diagrammer modules.
- Designed and developed Cognos reports.
- Prepared UAT test cases.
Confidential
Software Developer
Responsibilities:
- MAXIMO is an IBM BSS product that was integrated with AXIOSS, a web portal and other systems, and used to manage WFM, trouble ticket (TT) management, SLAs and preventive maintenance.
- Customized MAXIMO product modules for WFM, TT and SLA management.
- Designed and developed database tables, stored procedures and triggers.
- Prepared UAT test cases.
Confidential
Software Developer
Responsibilities:
- The Customer Web Portal is a single interface for users to access the Fault, Performance and SLA reports of the OSS.
- Prepared the functional specification document.
- Developed the portal and resolved user queries on functional issues.
- Prepared test cases and test data.
- Carried out an internal POC on TIBCO BW and webMethods with Clarity (OSS).
- Designed and tested system integration for Clarity (OSS) using TIBCO BW and webMethods.
- Documented all APIs and their parameter details, including the state diagram of the NMS.