
Architect Resume

SUMMARY

  • More than 13 years of experience in information technology
  • The Open Group Architecture Framework (TOGAF) certified
  • Played the roles of Big Data, ETL, Solution and Data Architect
  • Scored more than 90% in the MongoDB for DBAs course from MongoDB University
  • DataStage Enterprise Edition Certified
  • American Healthcare Management (AHM - 250) certified
  • Electronic Data Interchange HL7 (Health Level Seven) Certified
  • Worked extensively in Healthcare and Insurance domains
  • Designed the entire process flow for data warehouse projects
  • ETL job design, development, testing and implementation
  • Data modeling & SQL performance tuning
  • Solutions and POVs on big data
  • Worked extensively on HDFS, Hive, DataStage, Oracle SQL*Loader, SQL, PL/SQL and UNIX scripts.
  • Liaised with client teams, business analysts and architecture boards on DW, Oracle and SQL Server projects; played an active role in technical consulting, architecture, ETL design, data modeling and performance tuning.
  • In consulting assignments, analyzed client requirements and proposed solutions to support both tactical and strategic needs; analyzed the current state, documented the findings, envisioned the target state, and provided options and recommendations to achieve the goals.
  • Won first prize at the Confidential Garage (Innovation) Day challenge for an excellent working solution to a given problem.
  • Awarded a certificate of appreciation from senior management (SVPs and senior directors) for outstanding contribution to the successful delivery of an important project.
  • As a technology architect, delivered POVs and POCs on big data, wrote a whitepaper on Pentaho integration with big data, and provided solutions for customer engagements.
  • As a solution architect, analyzed the existing environment of a leading UK-based insurance company and proposed solutions to consolidate multiple databases, ETL and reporting tools that had resulted from acquisitions and individual LOBs' growth.
  • Technical manager (onsite/offshore) for implementations of various data warehouse projects for a leading US health insurance company; designed the ETL process for strategic projects with stringent service level agreements.
  • Worked onsite as a senior technical member for a leading disability management insurance company in the USA; created ETL standards and designed and developed DataStage jobs for various projects.
  • Experience working in both waterfall and agile methodologies
  • Worked as both a team player and an individual contributor

TECHNICAL SKILLS

  • Big Data
  • HDFS
  • Hive
  • Pig
  • Sqoop
  • Oozie
  • Zookeeper
  • Solr
  • MongoDB
  • Impala
  • HBase
  • Spark
  • DataStage Enterprise and Server editions
  • Connect Direct
  • Microsoft Visual Source Safe (VSS)
  • ERWIN
  • Visio
  • TOAD
  • Pentaho Kettle ETL
  • Oracle 11g
  • DB2
  • SQL Server
  • MS-Access
  • C
  • C++
  • SQL
  • PL/SQL
  • T-SQL
  • UNIX shell scripting
  • DataStage Basic

PROFESSIONAL EXPERIENCE

Confidential

Architect

Responsibilities:

  • Analyze the functional and non-functional requirements and provide overall solution
  • Prepare the high-level process flow from data ingestion into Hadoop to reporting
  • Create logical and physical data models for the application database and the analytics data mart
  • Create the overall big data strategy as part of the core team
  • Work with teams to process data via Hadoop; cleanse, validate, extract and load data into the database
  • Provide ETL load strategies
  • Use Sqoop to ingest data from relational databases into HDFS for processing (a sample import command is sketched below this entry)
  • Identify any schema changes in the source based on the meta file; store data in Hive with schema evolution
  • Provision data from Hive on demand based on criteria from the product systems
  • Prepare the SAD as part of the architecture team
  • Help the reporting team with performance tuning
  • Work on POCs using Flume, Solr, Impala and Spark
  • Started using Change Data Capture (CDC) to ingest incremental data into Hadoop and data virtualization to get a consolidated view of the data

Environment: Big data technologies (HDFS, Hive, Sqoop, HBase, Spark), ERWIN 9.5, DataStage 8.5, Visio, Linux Shell Scripting, CDC, Pivotal HD and CDH 4
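
A minimal sketch of the kind of Sqoop ingestion referenced above, assuming a hypothetical Oracle source table and HDFS landing path; the connection string, credentials, table and directory names are illustrative only:

    #!/bin/sh
    # Illustrative Sqoop import: pull one relational table into HDFS as delimited text.
    # The connection string, password file, table and target directory are placeholders.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.oracle_pwd \
      --table CLAIMS \
      --target-dir /data/raw/claims/$(date +%Y%m%d) \
      --fields-terminated-by '\t' \
      --num-mappers 4

The landed files can then be exposed to Hive as an external table over the target directory, keeping the raw landing zone and the queryable layer in step.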

Confidential

Hadoop & Data Architect

Responsibilities:

  • Installed and configured the Hadoop stack on a 7-node cluster.
  • Created curl scripts to pull data from Twitter
  • Developed Pig programs to parse the raw data, calculate the score and store it in HBase.
  • Integrated HBase and Hive, i.e. data loaded into HBase is accessed via Hive (see the sketch below this entry)
  • Analyze the existing client matching and householding process, and document the key capabilities of the new solution
  • Create an RFP for vendors, evaluate and rank the products, and provide recommendations to the client
  • Work with business owners and analysts to gather requirements for new sales and marketing data mart
  • Create logical and physical data models
  • Create architectural diagrams to depict ETL, data quality and MDM components, their interactions and data flow; create templates for ETL mapping and documentation.

Environment: CDH4 with Hadoop 1.x, HDFS, Pig, Hive, HBase, Linux, UNIX Shell Scripting, ERWIN and Informatica
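
A minimal sketch of the HBase-Hive integration mentioned above, assuming a hypothetical HBase table of scored tweets; the table, column family and column names are illustrative:

    #!/bin/sh
    # Illustrative Hive-over-HBase mapping: expose an HBase table through Hive
    # so the scored data can be queried with HiveQL. All names are placeholders.
    hive -e "
    CREATE EXTERNAL TABLE tweet_scores (
      tweet_id    STRING,
      user_handle STRING,
      score       INT
    )
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,d:user_handle,d:score')
    TBLPROPERTIES ('hbase.table.name' = 'tweet_scores');

    SELECT user_handle, AVG(score) AS avg_score
    FROM   tweet_scores
    GROUP  BY user_handle;
    "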

Confidential

Technical Architect

Responsibilities:

  • Set up the Hadoop cluster with the Cloudera distribution of Hadoop and Pentaho Kettle
  • Create a template for evaluating various ETL tools from a big data perspective.
  • Build Hadoop capabilities in the team; headed the panel to review POCs on Informatica, Pentaho and Talend for big data.
  • Provided technical help to pull data from Twitter for analysis (a sample curl call is sketched below this entry)
  • Helped the team work with Windows Azure (cloud) big data and move data to and from the cloud
  • Provided high-level solutions to a few customers

Environment: CentOS Linux, CDH 3.0 (HDFS, Pig, Hive, HBase, Mahout), MongoDB, Pentaho Kettle, Windows Azure (Cloud)
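
A minimal sketch of the kind of Twitter pull referenced above, assuming app-only bearer-token authentication against the Twitter search API; the endpoint, query, token and paths are illustrative and would depend on the API version actually in use:

    #!/bin/sh
    # Illustrative curl pull of tweets, landed into HDFS for downstream analysis.
    # The bearer token, search query and directories are placeholders.
    TOKEN="<app-only-bearer-token>"
    OUT=/tmp/tweets_$(date +%Y%m%d%H%M).json

    curl -s -H "Authorization: Bearer ${TOKEN}" \
      "https://api.twitter.com/1.1/search/tweets.json?q=insurance&count=100" \
      -o "${OUT}"

    # Push the raw JSON into HDFS so Pig/Hive jobs can pick it up
    # (assumes the landing directory already exists).
    hadoop fs -put "${OUT}" /data/raw/twitter/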

Confidential

Technical Architect

Responsibilities:

  • Analyze the existing BI environment of the client
  • Check the management, governance and quality aspects of data; come up with a strategy to enhance data quality and availability to meet the business and technical requirements.
  • Analyze the environment's non-functional aspects, such as scalability, performance and security, and suggest the required changes.
  • Come up with a strategy to change some of the weekly full-volume loads into incremental daily loads (an incremental-extract sketch follows this entry).
  • Study the security, high availability and disaster recovery of the environment and suggest necessary actions.

Environment: Sun Solaris, AIX, DataStage, SQL Server, Visio, MS Office
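
A minimal sketch of the incremental-load idea above, assuming a hypothetical source table with a LAST_UPDATED column and a simple watermark file; the client tool (sqlcmd), table, column and path names are stand-ins for whatever the environment actually provided:

    #!/bin/sh
    # Illustrative daily delta extract driven by a watermark file instead of a
    # weekly full-volume pull. POLICY and LAST_UPDATED are hypothetical names.
    WATERMARK=/etl/watermarks/policy.last_run
    LAST_RUN=$(cat "${WATERMARK}")            # e.g. 2012-06-01 00:00:00
    NOW=$(date '+%Y-%m-%d %H:%M:%S')

    # Pull only the rows changed since the previous successful run.
    sqlcmd -S dbserver -d staging \
      -Q "SELECT * FROM POLICY WHERE LAST_UPDATED > '${LAST_RUN}' AND LAST_UPDATED <= '${NOW}'" \
      -o /etl/extracts/policy_delta.txt -s '|' -W

    # Advance the watermark only if the extract succeeded.
    [ $? -eq 0 ] && echo "${NOW}" > "${WATERMARK}"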

Confidential, Bloomfield, CT

Technical Manager/ Architect

Responsibilities:

  • Requirement study (reviewing the requirement specifications to understand how they improve or impact the business users)
  • Client liaison (coordinating with the client and business analysts) and risk mitigation (assessing the risk factors and ways to overcome them)
  • Design the ETL process flow of the projects as per the requirements.
  • Develop DataStage jobs and review jobs created by other team members to make sure they follow the standards and the best possible approach. Develop Oracle packages and queries; performance-tune queries for other client teams (a tuning sketch follows this entry).

Environment: Sun Solaris, AIX, DataStage, Oracle 10g/11g, VSS, TOAD, Connect Direct, SQL, PL/SQL, UNIX Shell scripts
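
A minimal sketch of the kind of check used when performance-tuning Oracle queries for client teams, assuming SQL*Plus access; the schema, table, query and the ORA_PWD variable holding the password are hypothetical:

    #!/bin/sh
    # Illustrative tuning pass: capture the execution plan for a candidate query
    # and refresh optimizer statistics on the table it reads. ETL_OWNER, CLAIM
    # and the query itself are placeholder names.
    echo "
      EXPLAIN PLAN FOR
        SELECT member_id, SUM(paid_amount)
        FROM   claim
        WHERE  service_date >= DATE '2011-01-01'
        GROUP  BY member_id;

      SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

      EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ETL_OWNER', tabname => 'CLAIM');
    " | sqlplus -s etl_user/"${ORA_PWD}"@ORCL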

Confidential, Philadelphia, PA

Project Leader

Responsibilities:

  • Functional Requirements Study (Reviewing the functional specification document with Business Analysts)
  • System Analysis
  • Design the process flow, prepare the logical data model and send it to the data modeler to create the physical data model.
  • Design and develop the DataStage jobs and sequences. Review of deliverables (review the tasks released by individual team members) and documentation of changes (to ensure proper documentation in the project)
  • Client Communication (Communicate with client for business clarifications)

Environment: Sun Solaris, AIX, DataStage, Oracle 9i/10g, VSS, TOAD, Connect Direct, SQL, PL/SQL, VMS, UNIX Shell scripts

Confidential

Project Leader

Responsibilities:

  • Effort Estimation
  • Requirement Study (Reviewing the requirement specifications to understand the business functionalities)
  • Database design, object creation, package development and performance tuning.
  • Knowledge sharing (Sharing the business logic among the team members in seminar sessions or meetings)
  • Co-ordination of project activities and Review of Deliverables

Environment: Sun Solaris, Oracle, Visual Basic 6.0, VSS, SQL, PL/SQL

Confidential

Module Leader

Responsibilities:

  • Prepare the high-level and low-level design of the database objects.
  • Coding of Visual Basic applications and Oracle procedures for the modules.
  • Review of design (Reviewing the low level and high level design documents prepared by the team members)
  • Status Reporting

Environment: Oracle 8i, VSS, VB 6.0, SQL, PL/SQL, VB Script

Confidential

Module Leader

Responsibilities:

  • System Analysis
  • Prepare the application flow diagram
  • Interacted with heads of departments, doctors, billing clerks, etc. to create a GUI that is easy to use and still robust.
  • Involved in the DB design, object creation and performance tuning with the DBA. Developed PL/SQL packages, procedures and functions
  • Review of test plan & test script
  • Review of deliverables

Environment: Oracle 8i, VSS, VB 6.0, SQL, PL/SQL, VB Script

Confidential

Team Member

Responsibilities:

  • Requirement Study (Reviewing the requirement specifications with team leader to understand the business functionalities)
  • Prepared the process flow diagram of the modules assigned to me.
  • Development of application and PL/SQL packages.
  • Create unit test cases and do unit testing. Worked with QA team on system testing.
  • Code walkthrough (performing the code walkthrough with other module teams as suggested by the team leader)

Environment: SQL Server 2000, VSS, VB 6.0, T-SQL, VB Script

Confidential

Responsibilities:

  • Requirements Study
  • System Analysis
  • Coding
  • Testing and Release

Environment: Windows NT, SQL Server 7.0, VSS, VB 6.0, T-SQL, VB Script
