Data Warehouse Tech Lead Resume

SUMMARY

  • Senior Business Intelligence Architect with 15 years of IT experience working in all phases of the project life cycle, including project planning, feasibility study, analysis, design, development, testing and implementation. Successfully performed technical lead roles for various clients across industries such as Finance, Retail and Healthcare.
  • Experienced in developing and designing conceptual, logical and physical data models using normalization and object-oriented data modeling concepts, and in designing data marts using dimensional models/star schemas/snowflake schemas.
  • Hands-on experience in Big Data technologies, Hadoop and the Hadoop ecosystem, including the Hadoop Distributed File System (HDFS), YARN and the MapReduce framework, HBase, Hive, HiveQL, Sqoop, Flume, Pig, Oozie and Zookeeper.
  • Plan hardware (memory, CPU, disk), software and network specifications and capacity for Hadoop implementations; install and manage clusters; set up high availability strategy, disaster recovery, backup and restore; monitor and tune the Hadoop cluster and its components; and set up access strategy for users running analytics, batch reports and interactive queries over massive volumes of multi-structured data.
  • Deploy solutions using NoSQL databases such as Apache Cassandra and MongoDB: deploy database software, set up configuration, plan for scalability and fault tolerance, define data partitioning strategy, backup and availability, manage clusters and tune for performance.
  • Performed data mapping, data design, and ETL design and development for large enterprise data warehouse (EDW) projects and data marts in complex MPP environments using ETL tools such as Informatica, DataStage and Ab Initio.
  • Hands-on experience and expertise in designing and implementing solutions in MPP databases such as IBM DB2 LUW Enterprise, Teradata, and IBM PureData System for Analytics powered by Netezza: creating database models, designing database objects, choosing distribution and organization keys, running explain plans, query analysis, skew analysis, capacity planning for hardware and software, and health monitoring using shell scripts.
  • Experienced in designing architecture for highly available, scalable and fault-tolerant cloud infrastructure. Sound knowledge of IaaS, SaaS and PaaS infrastructure and application management techniques. Experienced in deploying infrastructure in virtual private clouds and hybrid-cloud data centers using Amazon Web Services (AWS).
  • Possess experience in vendor management, onsite-offshore delivery management, and project planning and execution.
  • Self-starter, reliable, enthusiastic and hardworking team player; successfully and diligently managed several projects, and went the extra mile providing support and crisis resolution over weekends and night shifts.
  • Strong analytical ability and problem-solving capability with excellent communication, interpersonal and presentation skills.

TECHNICAL SKILLS

Data Architecture/Modeling: The Open Group Architecture Framework (TOGAF), 3NF EDW Modeling, Dimensional Modeling, Erwin 7.0, PowerDesigner, ER/Studio, InfoSphere Data Architect.

BIG DATA and NoSQL: HDP 2.1, HDFS, MapReduce, YARN, HBase, Hive, Oozie, Zookeeper, Pig Latin, R

Master Data Management: IBM InfoSphere MDM V11.4

RDBMS/Data Warehouse Appliances: Teradata (V2R5 & R13.1), IBM PureData System for Analytics 7.0 (Netezza), MongoDB, Apache Cassandra (DataStax), DB2 UDB V10, Oracle (10g, 9i).

ETL Tools: DataStage 8.5, Informatica PowerCenter 9.6, SQL Server Integration Services (SSIS), Ab Initio.

BI Reporting/Analytics: SAP BusinessObjects, MicroStrategy, Cognos, Hyperion, Tableau, SAS

Programming Languages: Java, Python, UNIX Shell Scripting (ksh/bash), Perl Scripting, C++, PL/SQL, SQL, VB, VB++

Operating Systems: UNIX, Red Hat Linux (RHEL), Ubuntu, CentOS, AIX 5.x/4.x, OS/390, MVS XA/ESA, z/OS, Windows NT Server, Windows 2000, Solaris 2.7/2.6, Linux 2.6

Hardware: Hortonworks Hadoop Platform (HDP 2.1), Amazon cloud-based web services on EC2, Teradata 5625, IBM RISC System/6000, Netezza TwinFin, Netezza Striper, IBM PC, eServer, pSeries and POWER3 processors, Sun E15K and HP servers, p690, RS/6000, IBM Regatta, AS/400, IBM

PROFESSIONAL EXPERIENCE

Data Warehouse Tech Lead

Confidential

Responsibilities:

  • Act as a technical domain expert. Identify architectural risks and plan risk mitigation, and ensure adherence to standards and best practices. Demonstrate strong problem-solving ability and analytical skills. Influence and communicate effectively with non-technical audiences, including senior product and business management. Work with vendors on cost, quality and deliverable negotiation.
  • Perform the technology lead and Data Warehouse Design Architect role for the Enterprise Data Liquidity platform, providing leadership and tactical and strategic direction for data analytics and data management.
  • Design the overall data and application architecture for multiple books of record, including Provider, Member, Claim and clinical data, involving internal and external data sources with multiple structures and various complex data types.
  • Analyze different data sources, complex data types and relationships to develop the data model for implementation.
  • Use data profiling techniques to understand data and its quality across data domains, and work closely with stakeholders to develop data quality and data validation rules and data processing techniques using ETL- and ELT-style design practices.
  • Perform the subject matter expert role for the Netezza PureData platform (Netezza SME) and provide technical ownership of multiple projects for the enterprise data hub build and integrated books of record.
  • Set up and configure Netezza, create databases and database objects, and apply best practices for configuring and managing appliance hardware and software.
  • Design data mastering using match/dedup and merge techniques through an IBM InfoSphere MDM virtual (SE) implementation, creating unique enterprise IDs from data coming from various internal and external systems.
  • Provide solutions, design and guidance to implement geocoding as an extension of the organization's Geographical Information System capability.
  • Provide POCs and technical guidance to acquire and integrate unstructured data from various sources at volumes of hundreds of terabytes, with near-real-time streaming data integration alongside the existing ETL framework, applying solutions built on the Hadoop ecosystem and NoSQL data stores (an illustrative HiveQL sketch follows this list).
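
Illustrative sketch (HiveQL) of the acquisition pattern above: an external table over an HDFS landing directory so newly arriving raw files become queryable by downstream ETL/ELT jobs without an extra copy step. All database, table, column and path names are hypothetical placeholders, not actual project objects.

    -- Minimal HiveQL sketch; staging.claim_feed_raw and /data/landing/claim_feed are assumed names
    CREATE EXTERNAL TABLE IF NOT EXISTS staging.claim_feed_raw (
        raw_record STRING                      -- one raw (semi-structured) record per line
    )
    PARTITIONED BY (load_dt STRING)            -- partition by landing date for incremental processing
    STORED AS TEXTFILE
    LOCATION '/data/landing/claim_feed';

    -- Register a newly landed batch so it is immediately visible to downstream jobs
    ALTER TABLE staging.claim_feed_raw
        ADD IF NOT EXISTS PARTITION (load_dt = '2015-06-30')
        LOCATION '/data/landing/claim_feed/2015-06-30';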

Confidential

Data Warehouse Information Architect

Responsibilities:

  • Performed the solution architect role for the enterprise data integration platform (enterprise operational data store): prepared the software architecture document and data architecture document, and performed source system data analysis and data discovery to create an integrated data platform for enterprise-wide analytical reporting requirements across multiple departments and a distributed user community.
  • Performed the lead Data Architect role, conducting data analysis and logical and physical data model design for multiple projects with large database implementations for enterprise analytical information and reporting in business areas such as Member, Claim and Plan/Product.
  • Worked as Lead Architect and Administrator for Big Data platform projects, performing data integration design and cluster administration and writing scripts and applications for the Hadoop platform using MapReduce, Pig Latin and shell scripts.
  • Installed software packages, performed hardware and software scaling requirement analysis, set up configurations, deployed databases, created users and groups, and managed access to the Hadoop platform for various business users through ecosystem components such as Hive, HBase, Pig scripts and shell scripts.
  • Performed cluster management, health monitoring, and performance monitoring and tuning for a large Hadoop cluster holding petabytes of data.
  • Provided database solutions for applications running analytics on a scalable, fault-tolerant database (Apache Cassandra); performed deployment planning, installation and configuration, and implemented backup and high availability strategies.
  • Mentored the team in writing applications for data integration and access using the Cassandra database and CQL queries.
  • Performed schema design, created databases and collections, and integrated large documents into a NoSQL database used for HR applications and reporting data storage.
  • Designed for scalability, performed software and hardware capacity planning, monitored and tuned performance, and handled database health monitoring and troubleshooting.
  • Provided solutions for complex data integration across very large and heterogeneous source systems, and prepared the ETL design and development strategy for large, complex multi-source data for enterprise usage.
  • Led the team in solution design for replatforming the Human Resources DW application, replacing a traditional packaged RDBMS application with an industry-standard, vendor-offered NoSQL solution to lower cost and achieve high ROI.
  • Developed and designed dimensional models for analytical applications and BI reporting with tools such as MicroStrategy and BusinessObjects, and defined a reporting framework for optimum performance and ease of use.
  • Performed system administration for IBM PureData System for Analytics powered by Netezza.
  • Set up and configured Netezza, created databases and database objects, and applied best practices for configuring and managing appliance hardware and software.
  • Defined security and access guidelines, set up automation for health and event monitoring of Netezza hardware, disk systems and databases, and applied best practices as suggested by IBM.
  • Set up nzevent for monitoring; customized database and system health checkup and resource usage reports; performed performance tuning; and designed databases, including defining distribution keys and organization keys for faster query response and even data distribution for optimized resource usage (see the Netezza SQL sketch after this list).
  • Created automated scripts, used nzsql to write database applications, and helped the ETL team set up applications using nzload, external tables and CTAS.
  • Troubleshot access and performance issues, and helped design ETL/ELT applications for Netezza and IBM DB2 Enterprise.
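
Illustrative sketch (Netezza SQL) of the design and load work above: a table with a distribution key and organization key, an external table over a staged flat file, and a CTAS load. All object, column and file names are hypothetical placeholders; actual DDL depended on the project data model.

    -- Netezza (IBM PureData for Analytics) sketch; edw.member_claim and /stage/member_claim.csv are assumed names
    CREATE TABLE edw.member_claim (
        claim_id    BIGINT        NOT NULL,
        member_id   INTEGER       NOT NULL,
        service_dt  DATE,
        paid_amt    NUMERIC(12,2)
    )
    DISTRIBUTE ON (member_id)      -- even data distribution, avoids skew on member-level joins
    ORGANIZE ON (service_dt);      -- clustered base table: zone maps prune scans for date-range queries

    -- External table over a staged delimited file, loaded via a simple CTAS pattern
    CREATE EXTERNAL TABLE ext_member_claim
        SAMEAS edw.member_claim
        USING (DATAOBJECT ('/stage/member_claim.csv') DELIMITER ',' SKIPROWS 1);

    CREATE TABLE edw.member_claim_stg AS
        SELECT * FROM ext_member_claim
        DISTRIBUTE ON (member_id);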

Confidential

Data Warehouse Design Architect

Responsibilities:

  • Performed the Lead Data Architect role, responsible for every aspect of data warehouse requirement analysis, architecture design, ETL process design, and preparation of the data framework for reporting and analytical purposes.
  • Performed business intelligence requirement analysis, data discovery and profiling, and prepared the software architecture design document by turning business requirements into a lasting solution using industry standards and best practices.
  • Designed the conceptual model, LDM and PDM for multiple subject areas such as Guest, Presentation, Pricing, Inventory and In-stocks, applying industry-standard Kimball and Inmon methodologies.
  • Performed high-level and low-level ETL design for enterprise data warehouse and data mart extract, transform and load.
  • Mentored and provided oversight for other data architect and data modeler resources, including reviewing designs and performing code reviews.
  • Contributed to project planning discussions, provided status updates on development progress, and served as a critical resource for issue resolution.
  • Coordinated activities with vendor personnel (vendor presentations, site visits, preparation for installations, etc.).
  • Applied database architecture principles in the data warehouse, reducing redundant data and design complexity by normalizing existing structures in Teradata and resolving many data quality issues.
  • Developed and designed dimensional models using denormalization techniques for analytical applications and BI reporting with tools such as MicroStrategy and BusinessObjects (see the star schema sketch after this list).
  • Provided an alternate solution for space and time optimization, recommending ELT methodology that leverages the DBMS engine over ETL wherever applicable, and improved the process.
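
Illustrative star schema sketch (generic SQL) for the dimensional designs above: a denormalized dimension joined to an additive fact table that MicroStrategy or BusinessObjects can query directly. Table and column names are generic placeholders, not the client's actual model.

    -- Generic star schema sketch; dm.dim_item and dm.fact_daily_sales are assumed names
    CREATE TABLE dm.dim_item (
        item_key    INTEGER      NOT NULL PRIMARY KEY,   -- surrogate key
        item_id     VARCHAR(20)  NOT NULL,               -- natural/business key
        item_desc   VARCHAR(100),
        dept_name   VARCHAR(50),                         -- denormalized from the department reference
        class_name  VARCHAR(50)                          -- denormalized from the class reference
    );

    CREATE TABLE dm.fact_daily_sales (
        date_key    INTEGER       NOT NULL,
        item_key    INTEGER       NOT NULL REFERENCES dm.dim_item (item_key),
        store_key   INTEGER       NOT NULL,
        sales_qty   INTEGER,
        sales_amt   DECIMAL(12,2)
    );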

Confidential

Data Architect

Responsibilities:

  • Primary responsibility included working as a data architect for various BI projects, analyzing, designing and developing logical and physical data models for BI reporting and analytical purposes.
  • Used metadata management, maintained the metadata repository, and produced metadata reports and applications to be used by the ETL process.
  • Performed data analysis and quality analysis of source system transactional data for proper usage and reporting for analytical purposes in the decision support system.
  • Designed and developed conceptual and logical data models using modern, industry-standard data modeling techniques and procedures for data models in the Financial domain.
  • Mentored and provided oversight to other data modelers and architects, including reviewing designs and addressing concerns.
  • Developed several data designs using complex normalization techniques for BI projects, especially data to be used by BI tools such as Cognos.
  • Advocated standardization and best-practice data architecture principles promoted by Teradata, constantly utilizing techniques for the highest data quality with optimum performance.
  • Wrote concise and comprehensive documents to help business users understand data representation, rules and relationships.
  • Analyzed data requirements for building cubes, data modules, and dashboard and tactical reporting for prospective customers.
  • Contributed to project planning discussions, provided status updates on development progress, and served as a critical resource for issue resolution.
  • Contributed to overall project planning and resource management, work distribution and delivery management, including onsite and offshore teams; updated status to senior management, coordinated activities among different resources, and resolved issues.

Confidential

Data Architect

Responsibilities:

  • Worked as a senior architect in the Information Management team of information architects and database architects.
  • Designed and developed logical and physical data models (LDM/PDM) using ER/Studio with financial data.
  • Created normalized database structures for the base layer of the warehouse and star schemas for data mart applications supporting BI reporting requirements used for enterprise analytical purposes.
  • Attended meetings with business users and functional experts to understand business needs and translate them into technical requirements and design documents.
  • Understood complex business rules and explained the data and its rules for turning transactional source data into analytical data.
  • Designed databases and created database structures: decided on surrogate keys, primary keys, referential integrity (RI), table constraints and column defaults; determined table partitioning requirements and chose partition keys; implemented DDL and index designs; and altered the Teradata database environment (see the Teradata DDL sketch after this list).
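
Illustrative sketch (Teradata SQL) of the physical design decisions above: a non-unique primary index for hash distribution, month-based row partitioning, and referential integrity back to an assumed parent table. Table names, data types and date ranges are hypothetical.

    -- Teradata sketch; fin.account_txn and fin.account are assumed names
    CREATE MULTISET TABLE fin.account_txn (
        txn_id      BIGINT        NOT NULL,
        account_id  INTEGER       NOT NULL,
        txn_dt      DATE          NOT NULL,
        txn_amt     DECIMAL(15,2)
    )
    PRIMARY INDEX (account_id)                       -- hash distribution and primary access path
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2008-01-01'
                                     AND DATE '2012-12-31'
                                     EACH INTERVAL '1' MONTH);

    -- Referential integrity back to the (assumed) parent account table
    ALTER TABLE fin.account_txn
        ADD FOREIGN KEY (account_id) REFERENCES fin.account (account_id);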

Confidential

Data Architect

Responsibilities:

  • Performed the data architect role for data warehouse and data mart projects, responsible for designing conceptual, logical and physical data models using the Erwin tool.
  • Performed data analysis and business rule analysis, and converted operational data into useful data for analytical purposes.
  • Worked with business analysts to understand their requirements and designed data access processes and reporting for end users.
  • Performed database design, implemented database structures, improved performance, performed ETL design, and resolved data or process related issues and concerns.

Confidential

Database Administrator and Design Specialist

Responsibilities:

  • Primary DBA for an EDW database of 12 TB total size, spanning 2 physical and 14 logical nodes.
  • Designed and developed logical and physical data models using ER/Studio and PowerDesigner; created, modified and updated the logical data model (LDM) and physical data model (PDM).
  • Primarily designed and created the physical database structure: decided partitioning keys, primary keys and table constraints (foreign keys, check constraints, etc.); implemented DDL; and modified and designed DB2 database objects (see the DB2 DDL sketch after this list).
  • Performed tuning of complex and large MicroStrategy and Cognos SQL queries for BI reporting; tuned existing SQL queries and stored procedures and rewrote many of them.
  • Performed database health monitoring, production support, release migration, capacity planning, and migration and upgrades to newer versions.
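
Illustrative sketch (DB2 LUW SQL) of the physical design and maintenance work above: a table distributed across database partitions by a hash key, followed by a statistics refresh that typically precedes tuning of the BI reporting SQL. Object names are hypothetical.

    -- DB2 LUW (DPF) sketch; edw.sales_txn is an assumed name
    CREATE TABLE edw.sales_txn (
        txn_id       BIGINT        NOT NULL,
        customer_id  INTEGER       NOT NULL,
        txn_dt       DATE          NOT NULL,
        txn_amt      DECIMAL(12,2),
        CONSTRAINT pk_sales_txn PRIMARY KEY (txn_id, customer_id)   -- PK must include the distribution key
    )
    DISTRIBUTE BY HASH (customer_id);    -- spreads rows across the logical database partitions

    -- Refresh optimizer statistics (run from the DB2 command line) so BI-generated SQL gets good access plans
    RUNSTATS ON TABLE edw.sales_txn WITH DISTRIBUTION AND DETAILED INDEXES ALL;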

Confidential

DB2 UDB DBA

Responsibilities:

  • Performed as database administrator responsible for more than 300 databases; performed design, capacity planning, implementation and maintenance, and provided remote support for production systems.
  • Implemented databases and objects, wrote stored procedures and SQL applications, provided security and access management, and performed production on-call support.
  • Supported applications; performed database tuning, application tuning, SQL tuning and troubleshooting.
