Lead System Analyst Resume
Virginia
SUMMARY:
- A competent programmer with deep knowledge and skills in computer programming, eager to take on challenges and to work in a competitive environment.
- Strong IT professional with 13+ years of programming experience, including several years in Big Data and Big Data analytics.
- Experienced in installing and configuring Hadoop clusters across the major Hadoop distributions.
- Hands-on experience writing MapReduce jobs.
- Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, and Python (a sample streaming job appears after this list).
- Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
- Hands-on experience installing, configuring, and using the ecosystem components Hadoop MapReduce, Confidential, HBase, ZooKeeper, Oozie, Hive, Cassandra, Sqoop, Pig, and Flume.
- Successfully loaded files into Hive and Confidential from MySQL.
- Loaded datasets into Hive for ETL operations.
- Good knowledge of Hadoop cluster architecture and cluster monitoring.
- Extensive knowledge of and experience in COBOL, CICS, JCL, SQL, DB2 v10, VSAM, IMS DB, FILE-AID, TSO/ISPF, CA7 (scheduler), ChangeMan, Endevor, REXX, Expediter, INFOMAN, SPUFI, IBM utilities, and DB2 stored procedures.
- Certified Professional Scrum Master (PSM I).
- Extensive knowledge of unemployment insurance tax systems and legacy system modernization.
- Excellent project management skills, including estimation, planning, execution, control, and delivery; responsible for all deliverables and for providing cost-effective solutions in collaboration with project managers.
- Results-oriented, strategic thinker, highly analytical, with a record of increasing application performance and reducing costs; comfortable with both technical and business matters.
- Excellent knowledge of the IT Infrastructure Library; ITIL V3 certified professional.
- Knowledge of Service-Oriented Architecture (SOA) and web services. Transferred files and reports to open systems.
- Managed production projects, enhancements, and support for RBS application management projects, applying incident management and problem management skills across all customer-facing applications.
- Project communications management experience includes determining the information and communication needs of project stakeholders, and collecting and distributing performance information, including status reporting, progress measurement, and forecasting.
- Extensive experience in the design, development, testing, production support, and implementation of IBM mainframe applications; full development life cycle experience, including analysis, design, coding, testing, and quality control of various software projects.
- Proven strength in problem solving, coordination, and analysis. Strong communication, interpersonal, learning, and organizational skills, with the ability to manage stress, time, and people effectively.
- Served as a team leader, handling work planning, allocation, tracking, reviewing, and testing.
- Hands-on experience with Hadoop, the Confidential file system, Pig, Hive, HBase, ZooKeeper, and Sqoop.
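A minimal sketch of the kind of MapReduce work listed above, written as a Hadoop Streaming job in Python; the token-counting logic and the file names mapper.py and reducer.py are illustrative assumptions, not drawn from any specific project:

    # mapper.py -- emits one (token, 1) pair per whitespace-separated token
    import sys

    for line in sys.stdin:
        for token in line.strip().split():
            print("%s\t1" % token)

    # reducer.py -- a separate script; Hadoop Streaming delivers mapper
    # output sorted by key, so counts can be summed over each run of keys
    import sys

    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if key != current_key:
            if current_key is not None:
                print("%s\t%d" % (current_key, count))
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print("%s\t%d" % (current_key, count))

Such a job would typically be submitted with the hadoop-streaming JAR, shipping both scripts via -files and naming them with -mapper and -reducer.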
TECHNICAL SKILLS:
Operating System: Z/OS, OS/390, Windows, UNIX
Languages: COBOL 390, COBOL II, JCL, JES2, C, SQL, Python
Hadoop Ecosystem: Hadoop MapReduce, Confidential, Flume, Sqoop, Hive, Pig, Oozie, Cloudera Manager, ZooKeeper
NoSQL: HBase
Database: DB2 v9/10, IMS DB
FMS: VSAM
Tools and Utilities: XPEDITOR, SPUFI, FILE-AID, FIXIT, Endevor, MS Visio, CICS, ChangeMan, TSO/ISPF, MQ Series, SOA, File Manager, HP Service Center, web services, Panvalet, OPS/MVS, REXX, CA7, CLIST, UCC7/UCC11, FOCUS, Syncsort, FTP, IBM utilities, NDM, SAR, TMON
PROFESSIONAL EXPERIENCE:
Confidential, Virginia
Lead system analyst
Responsibilities:
- The vision of On-Demand Operational Reporting (ODR) is to organize information and make it accessible and useful to the end-user community through a self-service Business Intelligence platform.
- Built the Operational Data Store model in Big Data and published ODR-related structures to MicroStrategy using Hive and Impala (a sketch follows below).
Skills used: Hadoop, Apache Pig, Hive, Oozie, Sqoop, UNIX, HBase, Confidential, DB2, COBOL, VSAM
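A minimal sketch, assuming a HiveServer2 endpoint and the third-party PyHive client, of how an ODR reporting structure might be published as a Hive table for MicroStrategy to query; the host, database, and column names below are hypothetical:

    from pyhive import hive

    # Hypothetical connection details; real values would come from the cluster.
    conn = hive.connect(host="hive-gateway.example.com", port=10000,
                        database="odr")
    cur = conn.cursor()

    # Materialize a reporting table the BI layer (e.g. MicroStrategy) can
    # read directly; table and columns are illustrative, not actual ODR names.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS odr.claim_summary
        STORED AS PARQUET AS
        SELECT claim_id, claim_status, office_code,
               SUM(paid_amount) AS total_paid
        FROM odr.claims
        GROUP BY claim_id, claim_status, office_code
    """)
    cur.close()
    conn.close()

The same statement could equally be run against Impala for lower-latency reads; Hive was chosen here only to keep the sketch to a single tool.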
Confidential, Washington, DC
Mainframe Lead Data Analyst
Responsibilities:
- Worked in the capacity of business, application, data, and infrastructure architect on multiple projects spanning multiple ministries and business groups.
- Elevated documentation quality through due diligence, attention to detail, and completeness.
- Built relationships with business units through proactive, continuous facilitation.
- Captured business rules for the mainframe application on both the Tax and Benefit sides.
- Facilitated and nurtured relationships with third-party vendors.
- Coordinated and managed the request for proposal (RFP) process for the current project and reviewed RFP responses.
Confidential, Maryland
Hadoop Programmer
Responsibilities:
- Created, validated, and maintained scripts to load data manually using Sqoop (a sketch of such a script appears after this list).
- Created Oozie workflows and coordinators to automate Sqoop jobs on weekly and monthly schedules.
- Worked on reading multiple data formats on Confidential using Scala.
- Ran reports in Pig and Hive.
- Developed, validated, and maintained HiveQL queries.
- Fetched data to and from HBase using MapReduce jobs.
- Wrote MapReduce jobs.
- Analyzed data with Hive and Pig.
- Designed Hive tables to load data to and from external tables.
- Ran executive reports using Hive and QlikView.
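A minimal sketch of the kind of Sqoop load script referred to in the first bullet above, wrapped in Python via subprocess; the connection string, table name, and target directory are hypothetical placeholders:

    import subprocess

    # Hypothetical source and target; a production script would read these
    # from configuration rather than hard-coding them.
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://dbhost.example.com/claims",
        "--username", "etl_user",
        "--password-file", "/user/etl/.sqoop.pwd",  # keeps credentials off the command line
        "--table", "claims",
        "--target-dir", "/data/raw/claims",
        "--num-mappers", "4",
    ]

    # check=True raises if Sqoop exits non-zero, so failures surface to the scheduler.
    subprocess.run(cmd, check=True)

In practice the same import would be expressed as a Sqoop action inside the Oozie workflows and coordinators mentioned above.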
Confidential
Hadoop Consultant
Responsibilities:
- Used Sqoop to transfer data between an RDBMS and Confidential.
- Involved in collecting and aggregating large amounts of streaming data into Confidential using Flume and defined channel selectors to multiplex data into different sinks.
- Implemented complex MapReduce programs to perform map-side joins using the distributed cache (see the sketch after this list).
- Designed and implemented custom writables, custom input formats, custom partitioners, and custom comparators in MapReduce.
- Responsible for troubleshooting failures in the execution of MapReduce jobs by inspecting and reviewing log files.
- Converted existing SQL queries into HiveQL queries.
- Used Oozie effectively to develop automated workflows of Sqoop, MapReduce, and Hive jobs.
- Exported the analyzed data into relational databases using Sqoop for visualization and to generate reports for the BI team.
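A minimal sketch of a map-side join in the spirit of the bullet above, written as a Hadoop Streaming mapper in Python: Streaming ships files passed via -files through the distributed cache into each task's working directory. The file name customers.txt and both record layouts are illustrative assumptions:

    # join_mapper.py -- map-side join: the small side (customers.txt) is
    # loaded fully into memory from the distributed cache, and the large
    # side streams through stdin, so no reduce phase is required.
    import sys

    # Build the lookup table once per task from the cached file.
    customers = {}
    with open("customers.txt") as f:
        for line in f:
            cust_id, name = line.rstrip("\n").split("\t", 1)
            customers[cust_id] = name

    # Each input record is assumed to be: order_id <TAB> cust_id <TAB> amount
    for line in sys.stdin:
        order_id, cust_id, amount = line.rstrip("\n").split("\t")
        print("%s\t%s\t%s" % (order_id, customers.get(cust_id, "UNKNOWN"), amount))

Submitted with -files customers.txt and zero reducers, this keeps the entire join on the map side.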
Confidential, Flowood, MS
Mainframe Lead Data Analyst
Skills used: Hadoop, Apache Pig, Hive, Oozie, Sqoop, UNIX, HBase, Confidential, Python, Flume
Responsibilities:
- Developed Hive (version 0.10) scripts to meet end-user and analyst requirements for ad hoc analysis.
- Worked with application teams to set up new Hadoop users and to set up and test Confidential, Hive, Pig, and MapReduce access for those users.
- Involved in developing Pig scripts. Solved performance issues in Hive and Pig scripts through an understanding of joins, grouping, and aggregation and of how they translate into MapReduce jobs.
- Developed Sqoop scripts to enable interaction between Pig and the MySQL database.
- Involved in resolving Hadoop-related JIRAs.
- Extracted raw HTML data from the code repository using Pig regular expressions and used it in a Hive external table with dynamic paths (a Python analogue of the extraction step is sketched after this list).
- Moved crawl-data flat files generated by various retailers to Confidential for further processing.
- Used Sqoop to export processed Hive external-table output into MySQL and to import data from the legacy system into Confidential.
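The Pig step above pulled fields out of raw HTML with regular expressions; the sketch below is a Python analogue of that extraction step, not the original Pig script. The tag pattern and field names are hypothetical:

    import re
    import sys

    # Hypothetical pattern for product name and price in crawled retailer
    # pages; real patterns would be tuned to each retailer's markup.
    ROW = re.compile(
        r'<span class="product">(?P<name>[^<]+)</span>\s*'
        r'<span class="price">(?P<price>[\d.]+)</span>'
    )

    for line in sys.stdin:
        for m in ROW.finditer(line):
            # Emit tab-separated records ready to land in the Hive
            # external table's directory.
            print("%s\t%s" % (m.group("name"), m.group("price")))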
Confidential
Senior Analyst
Responsibilities:
- Participated in the recovery of production incidents and their permanent resolution through problem management; used the HP Service Manager tool for incident and problem tickets in the live production environment and handled multiple recoveries of the banking system.
- Reviewed, analyzed, and modified programming systems, including encoding, debugging, testing, installing, and documenting programs.
- Collaborated with project managers, business partners, and developers to coordinate project schedules and timelines.
- Worked with business partners, analysts, developers, and project managers to develop test plans and produce test scenarios and repeatable test cases/scripts through all parts of the development lifecycle; executed and signed off against a high volume of data and a regular release schedule for successful project delivery.
- Worked as a cost-effective solution architect, determining modifications and impacts to the system infrastructure/framework and solutions to the business requirements.
- Worked with business users and analysts to determine business and technical requirements, translated them into high- and low-level designs, and worked with project managers on cost estimation.
- Produced technical designs (HLD and LLD) for the mainframe components of the project, especially COBOL modules, JCL, DB2, and CICS.
- Responsible for creating business process flows and data flow diagrams.
- Analyzed requirements provided by clients and the current system state to propose innovations for continuous system improvement.
- Transitioned and supported infrastructure in “Business As Usual” mode, ensuring high levels of customer satisfaction.
- Handled incident and problem management; supported the development team in code reviews and defect tracking. Implemented SDLC phases for all work efforts and documented phase-associated deliverables.
- Performed system and integration testing, and provided maintenance and support for the project. Handled incident and problem management (tool used: HP Service Manager).
- Coordinated production monitoring and fine-tuned application programs with help from the DBA.
- Remodeled legislative processes to fine-tune performance and accurately report financial data
Confidential, Rochester, NY
Mainframe Developer
Responsibilities:
- Analyzed the specifications and requirements provided by clients and developed against them.
- Responsible for production monitoring and fixing abends.
- Coded and tested modules related to the application.
- Maintenance and Support for the project.
- Designed and worked on COBOL modules
- Modified CICS online screens, then implemented the screens and new business rules on the mainframe.
- Worked on the creation of new Confidential and JCL components.
- Performed extensive system testing and migration into production
- Prepared test plans and test cases.
- Conducted walkthroughs of all proposed changes, interviews with users concerning all modifications, and impact analysis.