HP Vertica Database Developer Resume

Dallas, TX

SUMMARY

  • Teradata professional (Teradata 12) with around 6 years of experience in Data Warehousing, RDBMS, mainframe application development and maintenance, source system analysis, data warehouse modeling, and application design, development, testing, and implementation for Banking & Financial and Telecom projects.
  • Expertise in design, development, and OLAP operations with the Teradata database
  • Good knowledge of the HP Vertica platform architecture
  • Hands-on experience with HP Vertica SQL analytics and loading and exporting data
  • Experience with Vertica table/projection data modeling
  • Optimized Vertica projection segmentation and table partitioning (see the sketch following this summary)
  • Expertise using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT)
  • Expertise in conceptual/logical/physical data modeling using Erwin r9
  • Experienced in Informatica with a strong business understanding of Banking and Financial services.
  • Developed complex mappings using transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, and Normalizer.
  • Expertise in writing UNIX shell and Perl scripts; developed wrapper scripts for ETL and load jobs
  • Familiar with creating unique and non-unique primary indexes, secondary indexes, and join indexes in Teradata.
  • Involved in designing logical and physical data models using Erwin and Visio.
  • Experience creating reports and dashboards using QlikView.
  • 5+ years of UNIX shell scripting experience developing wrapper scripts for ETL jobs, creating environment files, and creating jobs and job streams (schedules) for daily runs.
  • Deep SQL skills with Teradata, Oracle, DB2, and HP Vertica.
  • Implemented performance tuning on sources, mappings, sessions, and targets.
  • Strong command of SQL, joins, indexing, and data manipulation and transformation, with solid knowledge of relational databases.
  • Worked with heterogeneous data sources such as Oracle, SQL Server, Salesforce, flat files, and XML.
  • Experience setting up definitions and processes for test phases, including unit, system, integration, user acceptance, and product testing, and uploading them into HP ALM
  • Experience with Hadoop MapReduce, Hive, Pig, HBase, Sqoop, and Flume frameworks.
  • Exposure to cluster setup for Apache Hadoop distributions.
  • Exposure to sequence and custom input formats, map-side and reduce-side joins, Hive tables, partitions, UDFs, indexes, data loading, and writing Hive queries.
  • Involved in development, migration, enhancement, and maintenance of projects.
  • Proven ability to learn quickly and apply new technologies; hardworking and able to perform in a fast-paced environment.
  • Extensive working experience with Agile/Scrum methodologies.
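
A minimal sketch of the projection and partitioning work called out above (connection variables, schema, table, and column names are all hypothetical): a month-partitioned Vertica anchor table with a hash-segmented projection, created through vsql from a shell wrapper.

#!/bin/ksh
# Hedged sketch only: connection variables and all object names are invented.
vsql -h "$VERTICA_HOST" -U "$VERTICA_USER" -w "$VERTICA_PWD" <<'SQL'
-- Partition the anchor table by month so aged partitions can be dropped cheaply.
CREATE TABLE stage.txn (
    txn_id  INT  NOT NULL,
    acct_id INT  NOT NULL,
    txn_dt  DATE NOT NULL,
    txn_amt NUMERIC(18,2)
)
PARTITION BY EXTRACT(YEAR FROM txn_dt) * 100 + EXTRACT(MONTH FROM txn_dt);

-- Segment the projection by hash across all nodes to spread query load evenly.
CREATE PROJECTION stage.txn_p1
AS SELECT txn_id, acct_id, txn_dt, txn_amt
   FROM stage.txn
   ORDER BY acct_id, txn_dt
   SEGMENTED BY HASH(txn_id) ALL NODES KSAFE 1;

SELECT REFRESH('stage.txn');
SQL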

TECHNICAL SKILLS

Programming Languages: UNIX Shell Scripting, PL/SQL, Perl, Stored Procedures

Databases: Oracle, IBM DB2 UDB, Teradata, HP Vertica, Postgres

ETL Tools: Informatica PowerCenter

TD Utilities & Others: BTEQ, FastLoad, MultiLoad, TPump, FastExport, TOAD, Teradata SQL Assistant, CA ERwin Data Modeler, HP Vertica vsql, TWS Maestro, Oracle GoldenGate, Autosys, SFTP, IBM Connect:Direct (NDM)

Environments: HP-UX 11/10/9, IBM AIX 4.3/4.0/3.1, Sun Solaris 9/8/7/2.6/2.5, SCO UNIX, Linux, Windows 95/98/2000/XP, IBM Mainframe OS/390, z/OS

Mainframe Technology: COBOL, JCL, VSAM, SQL, Easytrieve, IMS DB, DB2, SORT, CA7, IDCAMS, Endevor

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

HP Vertica Database Developer

Responsibilities:

  • Responsible for developing information architectures, data models, and solution architecture.
  • Conferred with solution architects, data strategists, business analysts, and cross-functional teams to obtain information on project limitations and capabilities, performance requirements, and interfaces.
  • Ensured the solution definition covered all upstream and downstream interface impacts to satisfy business requirements
  • Created logical and physical database models and data lineage diagrams.
  • Worked with CA Erwin to model new and modified entities/tables and generated Forward Engineering DDL scripts and Complete Compare reports
  • Created Teradata and Vertica table DDL scripts using Erwin for stage and target environments
  • Designed and developed HP Vertica anchor tables and projections; analyzed query logs and corrected projections accordingly.
  • Developed HP Vertica vsql scripts for bulk and delta loading of stage and target tables (see the sketch after this list)
  • Developed scripts to copy data between various Vertica environments
  • Created end-to-end ETL data lineage documents that include source, target, interface, and transformation details at the table level
  • Developed interface design specifications for sourcing data as well as sending extracts
  • Worked with GoldenGate SDEF files, parameter files, and various Replicat commands (ggsci start, stop, altseq, info, logdump) to develop and troubleshoot data replication
  • Sourced data from Oracle and flat files and loaded it into SCD Type 1, SCD Type 2, and full-refresh target tables
  • Developed various Informatica mappings to load data into the landing zone
  • Used various Informatica transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, Update Strategy, Union, and Sequence Generator
  • Developed Teradata FastLoad, MultiLoad, BTEQ, TPT, and FastExport scripts to load landing-zone files into stage tables and then into target tables
  • Created Teradata DDL and DML scripts and executed them with BTEQ from UNIX shell scripts
  • Developed UNIX shell scripts using commands such as awk, sed, sort, join, merge, diff, grep, head, tail, and wc
  • Worked with Teradata SQL Assistant to import and export data into/out of Teradata tables
  • Worked with Teradata volatile and global temporary tables
  • Worked with various Teradata index types: primary, secondary, join, and hash
  • Analyzed and improved query performance using EXPLAIN output
  • Worked with Vertica SQL and partitioned Vertica tables
  • Fine-tuned the performance of Teradata and Vertica tables using various encoding, compression, and indexing techniques.
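
A minimal sketch of the vsql bulk/delta load pattern referenced in the list above; file paths, schemas, and column names are placeholders, not the actual project code.

#!/bin/ksh
# Hypothetical stage bulk load followed by a delta merge into the target.
vsql -h "$VERTICA_HOST" -U "$VERTICA_USER" -w "$VERTICA_PWD" <<'SQL'
TRUNCATE TABLE stage.customer;

-- Bulk load the landing-zone extract; rejected rows go to a side file for review.
COPY stage.customer
FROM LOCAL '/landing/customer.dat'
DELIMITER '|'
REJECTED DATA '/landing/reject/customer.rej'
DIRECT;

-- Delta load: upsert only new or changed rows into the target table.
MERGE INTO target.customer t
USING stage.customer s
ON (t.cust_id = s.cust_id)
WHEN MATCHED THEN
     UPDATE SET cust_nm = s.cust_nm, upd_dt = CURRENT_DATE
WHEN NOT MATCHED THEN
     INSERT (cust_id, cust_nm, upd_dt)
     VALUES (s.cust_id, s.cust_nm, CURRENT_DATE);

COMMIT;
SQL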

Environment: Informatica 8.6, Teradata, BTEQ, FastLoad, MultiLoad, FastExport, TPump, HP Vertica, vsql, Mercury Quality Center 9.0, CA Erwin r9, Shell Scripts (Bash, Korn), QWS, flat files, VSAM files, Teradata SQL Assistant 13.0, TOAD, Perl, Notepad++, WinSCP, AOTS for ticket management, IBM Tivoli (Maestro) for job scheduling

Confidential, Indianapolis, IN

Sr. Informatica Developer

Responsibilities:

  • Identified different source systems and performed data analysis to determine the best approach for procuring source data using Informatica PowerCenter 9.x.
  • Worked on issues involving migration of dev objects from the Dev repository to the Test repository.
  • Created Informatica mappings and sessions and executed them to load data from the source system using Informatica Server Manager.
  • Responsible for creating reusable sessions, reusable worklets, and workflows in Informatica for initial data loads into the development, QA, and production warehouse environments; created user-defined functions in Informatica to reduce code redundancy.
  • Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load) processes using Informatica PowerCenter 9.x.
  • Worked on Random/Hash partitioning methods while creating tables.
  • Netezza served as both source and target; loaded data into the target and tested it.
  • Tested data in the Netezza DB to match source to target and raised defect tickets in HP Quality Center.
  • Developed and documented data mappings/transformations and Informatica sessions per business requirements.
  • Performed integration testing for various mappings; tested data and data integrity across various sources and targets.
  • Developed SQL scripts for post-load data validation and other data manipulation activities in various environments (see the sketch after this list).
  • Provided first-, second-, and third-level support for existing production EDW data warehouses.
  • Worked with SSRS to generate reports
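
A hedged sketch of the source-to-target validation described in the list above (connection variables, schema, and table names are invented for illustration):

#!/bin/ksh
# Compares source and target row counts after a load; a nonzero difference
# would be written up as a defect in HP Quality Center.
nzsql -host "$NZ_HOST" -d "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PWD" <<'SQL'
SELECT s.src_cnt,
       t.tgt_cnt,
       s.src_cnt - t.tgt_cnt AS cnt_diff
FROM  (SELECT COUNT(*) AS src_cnt FROM stage.orders)  s
CROSS JOIN
      (SELECT COUNT(*) AS tgt_cnt FROM target.orders) t;
SQL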

Environment: Informatica 9.x, Netezza, Autosys, and Windows Server 2000.

Confidential, Groton, CT

Informatica Developer

Responsibilities:

  • Analyzed the business requirements and design documents across the whole project life cycle
  • Developed ETL in an Agile environment with daily scrums and two-week sprints.
  • Prepared unit test scenarios and test scripts for testing data from the source, staging, data warehouse, and data mart layers, and loaded them into Quality Center.
  • Reviewed the Level 2 architecture, high-level designs, functional specs, assembly docs, and mapping specs provided by the project team to understand the scope.
  • Responsible for maintaining status and delivering on time.
  • Designed and coded the ETL requirements for the Confidential Prism Phase 2 project.
  • Worked with the BTEQ utility for inserts, updates, and deletes (see the upsert sketch after this list).
  • Worked with FastLoad/MultiLoad to load large volumes of data.
  • Designed a star schema with fact and dimension tables and developed Informatica mappings to extract data from the data mart.
  • Used mapplets and reusable transformations to prevent redundancy and improve maintainability.
  • Worked on performance issues at various levels (target, session, mapping, and source), identifying performance bottlenecks and troubleshooting and resolving issues in the fastest timeframe possible
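
A minimal sketch of the BTEQ insert/update (upsert) pattern mentioned in the list above; the logon string and table names are placeholders.

#!/bin/ksh
# Hypothetical BTEQ upsert: update matching rows first, then insert the rest.
bteq <<'EOF'
.LOGON tdprod/etl_user,passwd

UPDATE t
FROM   target.acct t,
       stage.acct  s
SET    acct_nm = s.acct_nm
WHERE  t.acct_id = s.acct_id;

INSERT INTO target.acct (acct_id, acct_nm)
SELECT s.acct_id, s.acct_nm
FROM   stage.acct s
WHERE  NOT EXISTS
      (SELECT 1 FROM target.acct t WHERE t.acct_id = s.acct_id);

.IF ERRORCODE > 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF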

Environment: Informatica PowerCenter 8.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Teradata, Autosys.

Confidential

Teradata Developer

Responsibilities:

  • Extensively used Informatica PowerCenter 8.1.1 to extract data from various sources and load it into the staging database.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control, and Stored Procedure.
  • Developed Informatica mappings and mapplets and tuned them for optimum performance, dependencies, and batch design.
  • Extensively worked with Informatica tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, and Repository Manager
  • Designed the mappings between sources (external files and databases) and operational staging targets.
  • Worked with pre- and post-sessions and extracted data from the transaction system into the staging area; knowledgeable in identifying fact and dimension tables.
  • Tuned sources, targets, mappings, and sessions to improve data load performance.
  • Wrote Teradata SQL queries according to process needs
  • Used Teradata SQL Assistant and Teradata Manager for database work.
  • Generated reports using Teradata BTEQ.
  • Wrote BTEQ scripts for extracting data.
  • Established loading processes using MultiLoad, FastLoad, and BTEQ (see the FastLoad sketch after this list).
  • Exported data using BTEQ and FastExport.
  • Implemented Slowly Changing Dimension methodology.
  • Created several Informatica mappings to populate data into dimension and fact tables.
  • Developed Perl scripts for loading data from source to target.
  • Worked cooperatively with team members to identify and resolve various issues relating to Informatica and other database-related issues.
  • Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
  • Designed mapping templates to specify the high-level approach.
  • Involved in performance tuning of queries using statistics to achieve optimal run times.
  • Worked extensively on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Update Strategy, and Sequence Generator, and performed performance tuning of the ETL process.
  • Developed and used reusable transformations and mapplets extensively to simplify development and maintenance.
  • Wrote pre- and post-session shell scripts to extract data from files, remove duplicates, and sort data in the database to optimize performance.
  • Improved query performance by using the explain plan.
  • Worked with CA Erwin for data modeling and dimensional data modeling.
  • Created mappings for Type 2 slowly changing dimensions, updating and inserting records while maintaining history.
  • Wrote specific mappings to populate aggregate and fact tables.
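
An illustrative FastLoad script of the kind referenced in the list above, wrapped in a shell heredoc; the TDPID, credentials, file path, and table names are all made up.

#!/bin/ksh
# Hypothetical FastLoad: bulk loads a pipe-delimited file into an empty stage table.
fastload <<'EOF'
LOGON tdprod/etl_user,passwd;
DATABASE stage;

SET RECORD VARTEXT "|";
DEFINE order_id  (VARCHAR(18)),
       order_dt  (VARCHAR(10)),
       order_amt (VARCHAR(20))
FILE = /landing/orders.dat;

BEGIN LOADING stage.orders
      ERRORFILES stage.orders_e1, stage.orders_e2;

INSERT INTO stage.orders (order_id, order_dt, order_amt)
VALUES (:order_id, :order_dt, :order_amt);

END LOADING;
LOGOFF;
EOF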

Environment: BTEQ scripting, Teradata, Informatica PowerCenter, Shell Scripts (Bash, Korn), Teradata SQL Assistant, UNIX (Solaris), FastLoad, MultiLoad, FastExport, TPump, TPT, Mercury Quality Center 9.0, CA Erwin, Oracle 10g, Autosys, flat files.
