
Teradata/ETL/Hadoop Consultant Resume


SUMMARY

  • 5+ years of total IT experience and technical proficiency in data warehousing, covering business requirements analysis, application design, data modeling, development, testing, and documentation.
  • 4 years of experience in Teradata database design, implementation, and maintenance, mainly in large-scale data warehouse environments; experienced with the Teradata utilities FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter, and BTEQ, as well as Teradata SQL Assistant.
  • Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance, and support.
  • Certified Teradata consultant with experience in Teradata physical implementation and database tuning, technical and functional applications of RDBMS, data mapping, data management, data transportation, and data staging.
  • Technical expertise in ETL methodologies with Informatica 7.1/8.6: PowerCenter, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
  • Analyzed, designed, and documented requirements for data migration projects between numerous legacy/Oracle source application feeds and the new Teradata platform.
  • Extensive database experience and highly skilled in SQL on Oracle, MS SQL Server, Teradata, and MS Access, working with mainframe files and flat files as sources.
  • Expertise in OLTP/OLAP system study, analysis, and E-R modeling, developing database schemas such as star and snowflake schemas (fact and dimension tables) used in relational, dimensional, and multidimensional modeling.
  • Expert in coding Teradata SQL, Teradata stored procedures, macros, and triggers.
  • Expertise in query analysis, performance tuning, and testing.
  • Experience in writing UNIX Korn shell scripts to support and automate the ETL process (a sample wrapper follows this list).
  • Experience with different database architectures, including shared-nothing and shared-everything designs.
  • Very good understanding of MPP architectures.
  • Experience with the Tableau and MicroStrategy reporting tools.
  • Extensive knowledge of identifying user requirements, system design, writing program specifications, coding, and system implementation.
  • Excellent communication and interpersonal skills; proactive, dedicated, and eager to learn new technologies and tools.
  • Strong commitment to quality; experienced in ensuring compliance with coding standards and the review process.
  • Experience in the full software development lifecycle (SDLC), including requirement analysis, system study, design, development, unit testing, system testing, integration testing, system maintenance, production support, and documentation.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently.
  • Aptitude for analyzing, testing, and debugging code, identifying problems, creating solutions, and writing documentation.
  • Multilingual, highly organized, detail-oriented professional with strong technical skills.
  • Good aptitude for problem solving and logical analysis.
  • Highly motivated, organized team player and a quick learner.
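
A minimal sketch of the kind of Korn shell wrapper referenced above: it runs one BTEQ step and propagates the return code to the scheduler. The host, user, database, and table names are illustrative placeholders, and the password is assumed to come from an environment variable.

#!/bin/ksh
# Illustrative ETL wrapper: run a BTEQ load step and check its return code.
LOGFILE=/var/etl/logs/load_sales_$(date +%Y%m%d).log

bteq <<EOF > ${LOGFILE} 2>&1
.LOGON tdprod/etl_user,${TD_PASSWORD}
DATABASE stg_db;

/* Move the day's delta from the landing table into staging */
INSERT INTO stg_sales SELECT * FROM stg_sales_ld;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF

rc=$?
if [ ${rc} -ne 0 ]; then
    echo "BTEQ load failed, rc=${rc}; see ${LOGFILE}" >&2
    exit ${rc}
fi

A wrapper of this shape is what a scheduler (Autosys, cron, UC4) would invoke, with the non-zero exit code driving retries or alerting.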

TECHNICAL SKILLS

DATABASES: Teradata 12/13.10/14.10, DB2, Oracle 7.x/8.x/9.x, MS Access, MySQL, HBase

ETL TOOLS: DataStage 7.5.1/7.1, Ab Initio (Co>Operating System 3.0), Informatica

REPORTING TOOLS: Business Objects 6.x/5.x, Seagate Crystal Reports 6.x/5.x/4.x/3.x, SQR 6.x, MicroStrategy reports, MS Access reports, Tableau

PROGRAMMING: PL/SQL, VBA, UNIX shell scripting, Teradata SQL, ANSI SQL, Transact-SQL, C, Python (intermediate)

O/S: Sun Solaris 2.6/2.7, HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Novell NetWare 4.11/3.61, Mainframe, Linux, Macintosh

LANGUAGES: SQL, UNIX shell script, Hive (HQL), Pig, Sqoop

SCHEDULING TOOLS: Autosys, cron, UC4

PROFESSIONAL EXPERIENCE

Confidential

TERADATA/ETL/HADOOP CONSULTANT

Responsibilities:

  • Gather requirements from business users and analysts and translate them into technical requirements.
  • Create data flow/architecture documents using the Gliffy tool.
  • Create/modify Teradata tables, views, and stored procedures; create semantic-layer views for business users.
  • Create Hive tables over the imported data for validation and debugging.
  • Schedule Teradata scripts through the CA ESP workstation.
  • Performance-tune Teradata SQL and Hive/Impala queries in Hadoop.
  • Optimize code and parameterize scripts to reduce the number of scripts.
  • Test and validate data across environments using FitNesse.
  • Migrate data from Teradata to Hadoop using Sqoop/IBIS/uDeploy (see the sketch after this list).
  • Create BI/multidimensional cubes in AtScale, improving report generation time in Tableau.
  • Support and coach peers and juniors as required on specific technical competencies.
  • Act as the escalation point for production and release support issues.
  • Handle production issues and fix data issues.
  • Use Sqoop to migrate data between HDFS and MySQL or Oracle, and deploy Hive and HBase integration to perform OLAP operations on HBase data.
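
A trimmed sketch of the Teradata-to-Hadoop pull referenced above, assuming Sqoop reaches Teradata through its generic JDBC path (Teradata JDBC driver on the classpath) and that a Hive database named raw exists; connection details, paths, and the column layout are hypothetical.

# Pull one Teradata table into HDFS (names and paths are placeholders).
sqoop import \
  --connect jdbc:teradata://tdprod/DATABASE=edw \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user \
  --password-file /user/etl/.td_pass \
  --table SALES_DAILY \
  --target-dir /data/raw/sales_daily \
  --fields-terminated-by '\t' \
  --num-mappers 4

# Hive table over the imported files, for validation and debugging.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS raw.sales_daily (
  sale_id  BIGINT,
  sale_dt  STRING,
  store_id INT,
  amount   DECIMAL(12,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/sales_daily';"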

Environment: Teradata utilities (TPT), ESP, Teradata SQL Assistant, PuTTY, WinSCP, AtScale, Hadoop, Hue, Hive, Impala, Tableau, FitNesse, Podium.

Confidential

TERADATA CONSULTANT

Responsibilities:

  • Coordinated with clients, gathered requirements, and performed impact analysis.
  • Participated in business JAD sessions; created application architecture and design documents.
  • Created/modified Teradata tables, views, and stored procedures; created semantic-layer views for business users.
  • Coded and unit-tested Teradata SQL; TPT scripts using the Load, Update, Stream, SQL Selector, SQL Inserter, ODBC, Export, and Data Connector operators; and FastLoad, MultiLoad, FastExport, and BTEQ scripts.
  • Scheduled Teradata scripts through an internal framework and cron.
  • Performance-tuned ETL scripts and Teradata SQL.
  • Optimized code; parameterized scripts, reducing the number of scripts created for Phase I from 285 to 6 (a sketch of such a generic loader follows this list).
  • Created data validation controls and reports using Teradata statistical and OLAP functions.
  • Used DBC tables for performance measurement, space calculations, Teradata object analysis, and usage reports on attributes (a sample dictionary query follows the Environment line below).
  • Worked on complex ad hoc queries to support user help requests within short time frames.
  • Provided production support as the escalation point for production issues.
  • Handled production issues and fixed data issues.
  • Created Hive tables over imported data for validation and debugging.
  • Used Sqoop to migrate data between HDFS and MySQL or Oracle, and deployed Hive and HBase integration to perform OLAP operations on HBase data.
  • Worked on mainframe JCL and datasets to extract data from legacy systems into Informatica.
  • Extracted high volumes of data from mainframe data files and Oracle using Informatica ETL mappings and SQL/PL-SQL scripts, and loaded it into the data store area.
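
A hypothetical sketch of the parameterization mentioned above: one generic FastLoad wrapper driven by arguments rather than one script per table. The fixed three-column layout is purely illustrative; a real version would also drive the DEFINE list from metadata.

#!/bin/ksh
# Generic loader: stage a pipe-delimited file into a Teradata table.
SRC_FILE=$1     # e.g. /data/in/customers.txt
TGT_TABLE=$2    # e.g. edw_stg.customers

fastload <<EOF
LOGON tdprod/etl_user,${TD_PASSWORD};
ERRLIMIT 25;
SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       cust_dt   (VARCHAR(10))
FILE=${SRC_FILE};

BEGIN LOADING ${TGT_TABLE}
  ERRORFILES ${TGT_TABLE}_err1, ${TGT_TABLE}_err2;

INSERT INTO ${TGT_TABLE} VALUES (:cust_id, :cust_name, :cust_dt);

END LOADING;
LOGOFF;
EOF
exit $?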

Environment: Teradata utilities (TPT, MultiLoad, FastLoad, BTEQ, FastExport, TPump), Autosys, Teradata SQL Assistant, UNIX, cron, PuTTY, Mainframe.
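
The DBC dictionary work called out above can be illustrated with a space report of this shape, run through BTEQ; the database name and the connection details are assumptions.

#!/bin/ksh
# Report current and peak perm space per table from the data dictionary.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}

SELECT  DatabaseName,
        TableName,
        SUM(CurrentPerm) / 1024**3 AS current_gb,
        SUM(PeakPerm)    / 1024**3 AS peak_gb
FROM    DBC.TableSizeV
WHERE   DatabaseName = 'edw'
GROUP BY 1, 2
ORDER BY current_gb DESC;

.LOGOFF
.QUIT 0
EOF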

Confidential

TERADATA/ETL CONSULTANT

Responsibilities:

  • Played a major role in the selection of Teradata load utilities based on the nature of the data.
  • Responsible for coding and executing SQL scripts for Oracle data extraction.
  • Solely responsible for source data analysis and formulation of the extract strategy.
  • Contributed to the preparation of ETL specifications for partial subject areas through continual interaction with the data modeler.
  • Suggested data type and other table modifications to the physical and logical GRM model.
  • Created staging tables and test base tables for enhancements to the physical model.
  • Created BTEQ scripts with data transformations for loading the base tables (see the sketch after this list).
  • Standardized the ETL scripts for error-table handling and load-statistics collection per organizational IT standards.
  • Documented scripts, specifications, and other processes.
  • Responsible for testing and fixing all issues arising from the data validation process.
  • Loaded staging tables on Teradata and then loaded the target tables on Teradata via views.
  • Created load processes to perform aggregations and load base tables.
  • Worked on documentation for each phase of the project: created playbooks and an operational manual for production support, a System Appreciation Document, and a user guide.
  • Used various transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, Update Strategy, Expression, Sorter, Normalizer, Stored Procedure, and Union.
  • Used Informatica PowerExchange to handle change data capture (CDC) data from the source and load it into the data mart following the slowly changing dimension (SCD) Type II process.
  • Used PowerCenter Workflow Manager to create workflows and sessions, using tasks such as Command, Event Wait, Event Raise, and Email.
  • Implemented various Teradata Manager alerts; involved in setting up alerts to page the DBA for events such as node down, AMP down, too many blocked sessions, and high data skew.
  • Worked with the business team to provide accurate data in the final production target tables.
  • Followed the complete SDLC methodology from requirements, design, and coding through unit testing, providing data to the UAT team, and deploying the code to production.
  • Reviewed logs to fix failed jobs.
  • Performance-tuned long-running queries.
  • Performed data verification and validation to confirm that the retrieved data met the data requirements and specifications.
  • Worked with the production team; responsible for fixing failed production job code and raising tickets for the production team to implement the fix.
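
A minimal sketch of such a BTEQ load step: aggregate from a staging view into a base table, stop on error, and refresh statistics afterwards. All object and column names are hypothetical.

#!/bin/ksh
# Aggregate staging data into a base table via a view, then refresh stats.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}

INSERT INTO base_db.sales_daily_agg (store_id, sale_dt, total_amt)
SELECT store_id,
       sale_dt,
       SUM(amount)
FROM   stg_db.v_sales_stage        /* base tables are loaded via views */
GROUP BY store_id, sale_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8

COLLECT STATISTICS ON base_db.sales_daily_agg COLUMN (store_id, sale_dt);

.LOGOFF
.QUIT 0
EOF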

Environment: Teradata 13 and 14, Teradata SQL Assistant, Teradata Manager, BTEQ, MultiLoad, FastLoad, FastExport, Erwin, Informatica 8.1, Cognos, Tableau, UNIX, Korn shell scripts, SAP BW, ECC, Teradata Analytics for SAP.
