
Hadoop Application Architect, Enterprise Services Architecture Resume


New Jersey

SUMMARY

  • Around 13 years of IT industry experience in Insights & Data Analytics as a Big Data/Hadoop Architect and DW/BI ETL Solution Architect.
  • Strong experience implementing end-to-end Big Data projects on the Hadoop platform and ecosystem - Hortonworks Data Platform, Cloudera distribution, HDFS, MapReduce, Hive, Pig, Kafka, Spark, Scala, HBase, Falcon, Oozie, NiFi, Trifacta, Flume, Sqoop.
  • Strong technical skills in IBM InfoSphere DataStage, Informatica PowerCenter, Oracle Database, and OBIEE reporting.
  • Extensive experience in system analysis, application design, solution development, and implementation of end-to-end projects using the Hadoop ecosystem, ETL tools, and database technologies across the Telecom, Retail, Financial, and Healthcare services domains.
  • Good experience in data modeling and a solid understanding of business data flows and data relationships.
  • Proficient in analyzing and translating business requirements into functional/technical requirements and application architecture.
  • Good knowledge of reporting and visualization layers (OBIEE) and cloud technologies (IBM Bluemix).

TECHNICAL SKILLS

Operating System: AIX, Sun Solaris, Linux, Windows

BigData Platform: Hortonworks HDP 2.3, Cloudera CDH 5.8

Hadoop Ecosystem: HDFS, Hive, HBase, Pig, Sqoop, Kafka, Spark, Oozie, Falcon, NiFi, Phoenix

Data Wrangling/Profiling: Trifacta

RDBMS: Oracle 10g/11g, DB2 UDB 8.1, Teradata 12

ETL Tools: IBM InfoSphere DataStage 8.7, Informatica PowerCenter 7.1.3

Reporting Tools: Business Objects XI, OBIEE 10/11g

Programming/Scripting: Java, Scala, Python, UNIX Shell Scripting, Perl, PL/SQL

Scheduling Tools: Autosys, Control-M, Crontab, Zena

Other Tools: TOAD, HP QC, MS Team Foundation Server, JIRA, GitHub, Jenkins

Quality Processes: IBM QMS AMS, EXPRESS One, IT/UP PRISM, Agile Methodology (Scrum/Sprint)

Functional Management and Leadership Skills: Ability to lead a team and manage/deliver a project, provide cost/effort estimates and realistic schedules, perform business/data analysis of applications, and support functional groups such as Production Support, Business Analysts, Testing, and Deployment.

PROFESSIONAL EXPERIENCE

Confidential

Hadoop Application Architect, Enterprise Services Architecture

Responsibilities:

  • Prepare high-level application architecture and design documents.
  • Develop prototypes/frameworks for the ingestion and consumption modules.
  • Design and model HBase tables for performance and web service calls.
  • Prepare the datasets for consumption using Hive and HBase data storage.
  • Ingest varied source data formats - JSON, XML, Text, Avro, Sequence, ORC - into Hadoop (a minimal ingestion sketch follows the environment list below).
  • Track issues in JIRA, manage code in GitHub, and deploy through Jenkins.

Environment: HDP 2.4, Hadoop - HDFS, HBase, Pig, Java MapReduce, Python, Hive, Sqoop, Flume, Kafka, Phoenix
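
For illustration only, a minimal Spark ingestion sketch for the ingestion bullet above - reading raw JSON from HDFS and persisting it as an ORC-backed Hive table for downstream consumers. The paths, database, and table names are hypothetical placeholders, not the actual project code:

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // Minimal ingestion sketch: read raw JSON from HDFS and persist it as an
    // ORC-backed Hive table. Paths and table names are hypothetical placeholders.
    object JsonToHiveIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("json-to-hive-ingest")
          .enableHiveSupport()
          .getOrCreate()

        val raw = spark.read.json("hdfs:///data/raw/events/")   // hypothetical landing zone

        raw.write
          .mode(SaveMode.Overwrite)
          .format("orc")
          .saveAsTable("analytics.events_orc")                   // hypothetical Hive table

        spark.stop()
      }
    }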

Confidential

Hadoop Architect

Responsibilities:

  • Prepare a high-level architecture document encompassing the logical, physical, and development views.
  • Develop prototypes/frameworks for development enhancement.
  • Design NiFi flows for data acquisition and data archival.
  • Prepare the datasets for consumption using the Trifacta tool and transform data from Avro to JSON format.
  • Create NiFi custom processors for delta detection and streaming into the common data service layer (the delta-detection idea is sketched after the environment list below).

Environment: CDH 5.8, Hadoop - HDFS, HBase, Trifacta, NiFi, Pig, Java
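
The actual delta detection ran inside custom NiFi processors; purely as an illustration of the underlying idea, a hash-based comparison could look like the sketch below. The record shape and field names are hypothetical, and this is not the processor code itself:

    import java.security.MessageDigest

    // Hash-based delta detection, shown independently of NiFi: a record counts as
    // new or changed when the digest of its business columns differs from the
    // digest stored for the same key in the previous snapshot.
    case class CustomerRecord(id: String, name: String, address: String)   // hypothetical shape

    object DeltaDetect {
      private def digest(r: CustomerRecord): String =
        MessageDigest.getInstance("SHA-256")
          .digest(s"${r.name}|${r.address}".getBytes("UTF-8"))
          .map("%02x".format(_))
          .mkString

      // Keep only records that are new or changed relative to the prior snapshot.
      def deltas(incoming: Seq[CustomerRecord],
                 previousDigests: Map[String, String]): Seq[CustomerRecord] =
        incoming.filter(r => !previousDigests.get(r.id).contains(digest(r)))
    }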

Confidential

Hadoop Architect

Responsibilities:

  • Near real-time streaming of financial markets data from various trading applications such as PCT (Portfolio Control Tool), FMRAW (Cancel & Amends), and SWOT (Missed Trades Information) into the Data Lake (a streaming ingestion sketch follows the environment list below).
  • Loading of high-volume data from core banking legacy mainframe systems through a batch ingestion framework.
  • Tier-2 analysis of data for FCC Regulatory Compliance & Risk Reporting and sending data to downstream systems.

Environment: Hortonworks HDP 2.3, Hadoop - HDFS, Hive, Pig, Kafka, Spark, Scala, HBase, Phoenix SQL
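
For illustration, a minimal Kafka-to-data-lake streaming sketch in Scala using Spark Structured Streaming. The broker, topic, and path names are hypothetical placeholders, and the actual HDP 2.3 implementation may well have used the older DStream-based Spark Streaming API instead:

    import org.apache.spark.sql.SparkSession

    // Sketch: consume trade events from Kafka and land them in the data lake as
    // Parquet. Broker, topic, and paths are hypothetical placeholders.
    object TradeStreamIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("trade-stream-ingest")
          .getOrCreate()

        val trades = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")      // hypothetical broker
          .option("subscribe", "pct-trades")                       // hypothetical topic
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

        trades.writeStream
          .format("parquet")
          .option("path", "hdfs:///datalake/trades/")              // hypothetical landing path
          .option("checkpointLocation", "hdfs:///checkpoints/trades/")
          .start()
          .awaitTermination()
      }
    }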

Confidential

ETL Architect, Engineering & Design

Responsibilities:

  • Requirement impact analysis, effort estimation, and preparation of functional and technical specification documents.
  • Design, develop, and unit test ETL jobs and Oracle PL/SQL packages.
  • Support OBIEE RPD design and modeling; perform ad hoc report analysis.
  • Support SIT, UAT, and production implementation of projects.
Environment: DataStage 8.7, AIX, Oracle 11g, Control-M, OBIEE 10g/11g

Confidential

Team Lead/Design Architect

Responsibilities:

  • Analyzing the ETL specification documents and requirements provided by the business team/customer.
  • Working with solution architects to decide on the ETL DataStage flows and design.
  • Designed and developed DataStage ETL jobs and loaded data into RDM data models for reporting.
  • Worked on the data model with the DBA team and SMEs (Subject Matter Experts).
  • Involved in creating the JILs and scheduling jobs on Autosys.
  • Preparing the Unit Test Plan (UTP), Unit Test Plan Execution (UTPE), and test data for unit testing.
  • Resolving the defects assigned by the QA and business teams.
  • Involved in deploying code to IST/REG/UAT/NFT by following the Release Management Process.
Environment: IBM InfoSphere DataStage 8.5, Oracle 10g, Teradata, Linux, Autosys

Confidential, New Jersey

DataStage Application Consultant/Team Lead

Responsibilities:

  • Prepare time and cost estimates using the IBM time and cost estimation model.
  • Provide the schedule for the individual development tasks and milestones.
  • Create high-level and low-level design documents in discussion with system engineers and architects.
  • Develop, test, and deploy DataStage jobs, UNIX shell scripts, and Oracle stored procedures.
  • Performance tuning of DataStage jobs and SQL queries, and redesign of existing applications.
  • Interact with the offshore development team and provide technical guidance for solving complex requirement and technical issues.
  • Interact with users in resolving user acceptance testing (UAT) defects and code fixes.
  • Track the project plan end-to-end and report the project development status to the client users and project managers.
  • Maintaining project documents in IBM Team for GCP, following IBM/Confidential & Confidential processes and methodologies - Quality Management System (QMS) - AMS MS, OPAL, Express One, and IT/UP.
  • Help and mentor new team members to understand Confidential & Confidential's GCP application.
Environment: IBM InfoSphere DataStage 8.0, Oracle 10g, Sybase, Teradata, Solaris

Confidential

Lead Informatica Developer/Administrator

Responsibilities:

  • Requirements gathering and impact analysis of Informatica mappings, sessions, workflows, and database tables for new enhancement projects.
  • Providing level-of-effort estimates and schedules for ETL and UNIX script development.
  • Planning the ETL load strategy for performance improvement.
  • Design, development, and testing of Informatica mappings and UNIX shell scripts.
  • Providing resolutions for service calls, incidents, and problem tickets created by customers in HP Service Desk.
  • Provide time-critical production support as SME for Informatica PowerCenter, which processes a huge volume of data each day.
Environment: AIX 5L, PeopleSoft EPM 8.8, Informatica PowerCenter 6.1/7.1.3, Oracle 9i, DB2 OS/390 7.1.2
