
Teradata Developer Resume


NJ

SUMMARY:

  • 4 years of experience in the design, development, implementation, testing, maintenance, and administration of Teradata databases, with working knowledge of big data technologies such as Hadoop.
  • Good knowledge of Teradata database architecture, database features, and Teradata tools.
  • Experience using Teradata tools and utilities such as BTEQ, SQL Assistant, Teradata Manager, Teradata Workload Analyzer, Teradata Visual Explain, and Teradata Index Wizard.
  • Designed, developed, tested, and supported the Extract, Transform, and Load (ETL) processes necessary to load and validate the data warehouse.
  • Experience with the Informatica PowerCenter ETL tool in OLAP and OLTP environments for healthcare clients.
  • Knowledge of data modeling for OLTP relational databases and data warehouse applications using Ralph Kimball and Bill Inmon design principles, including fact and dimension tables, Slowly Changing Dimensions (SCD), and dimensional modeling with star and snowflake schemas.
  • Strong hands-on experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, FastExport, and TPT (Teradata Parallel Transporter).
  • Well versed in SQL scripting across various relational databases.
  • Expertise in creating databases, users, tables, views, functions, packages, joins, and hash indexes in Teradata.
  • Strong knowledge of data warehousing concepts and of dimensional star schema and snowflake schema methodologies.
  • Good knowledge of Hadoop ecosystem components such as HDFS, MapReduce, Pig, and Hive.
  • Extensive knowledge of Teradata SQL Assistant. Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements. Tuned existing BTEQ scripts to enhance performance.
  • Strong knowledge of relational database concepts, entity-relationship diagrams, and normalization and de-normalization.
  • Experience with the software development life cycle (SDLC) and Agile project management methodologies.
  • Attend and participate in meetings that pertain to assigned responsibilities.
  • Expertise in problem solving and bug tracking during production failures.
  • Strong experience in shell scripting (Korn shell) and in running BTEQ scripts against Teradata from UNIX.
  • Interacted extensively with business users to gather requirements, design tables, standardize interfaces for receiving data from multiple operational sources, and devise load strategies for staging areas and data marts, including Type 1/Type 2/Type 3 loads.
  • Performance-tuned Teradata BTEQ scripts for long-running production queries using EXPLAIN plans.
  • Generated SQL and PL/SQL scripts to create and drop database objects including tables, views, primary keys, indexes, and constraints.
  • Tuned queries by collecting statistics (COLLECT STATISTICS) on columns used in WHERE and JOIN expressions.
  • Involved in troubleshooting production issues and providing production support.
  • Extensive ability to debug and troubleshoot complex PL/SQL issues, and to reverse-engineer existing PL/SQL code into functional requirements.
  • Extensive experience creating and executing test procedures, test cases, and test scripts using manual and automated methods.
  • Highly motivated with the ability to work effectively in teams as well as independently.
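As a brief illustration of the COLLECT STATISTICS tuning described above, a minimal sketch (the `edw.claims` and `edw.members` tables and their columns are hypothetical, not drawn from an actual engagement):

```sql
-- Hypothetical example: collect statistics on columns that appear in
-- WHERE and JOIN predicates so the optimizer can estimate row counts
-- accurately and choose a better join plan.
COLLECT STATISTICS ON edw.claims COLUMN (member_id);
COLLECT STATISTICS ON edw.claims COLUMN (claim_date);

-- EXPLAIN shows the plan (with confidence levels) before and after.
EXPLAIN
SELECT c.claim_id, m.member_name
FROM edw.claims c
JOIN edw.members m
  ON c.member_id = m.member_id
WHERE c.claim_date >= DATE '2020-01-01';
```

After statistics are collected, the EXPLAIN output typically moves from "no confidence" toward "high confidence" estimates on the affected steps.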

TECHNICAL SKILLS:

Programming Languages: C, SQL, PL/SQL.

Databases: Teradata, Oracle

Teradata Load and Unload Utilities: FLOAD, FASTEXPORT, MLOAD, TPUMP, TPT

Database Administration Tools: Index Wizard, Query scheduler, Workload analyzer, Viewpoint

Query management tools: BTEQ, SQL assistant

ETL Tools: AIX-IBM Tool, Informatica

Administrator tools: Teradata Manager, Teradata Administrator

Operating Systems: UNIX, LINUX and Windows

Office Applications: MS Word, Excel.

PROFESSIONAL EXPERIENCE:

Confidential, NJ

Teradata Developer

Responsibilities:

  • Coordinated with business analysts, data architects, DMs, and users to understand business rules.
  • Designed and developed mappings, transformation logic, and processes in Informatica to implement business rules and standardize source data from multiple systems into the data warehouse.
  • Prepared technical design documents and obtained approvals from the business team.
  • Created MultiLoad (MLOAD) scripts to load data related to Confidential orders into staging tables.
  • Developed scripts to load data into base tables in the EDW, and to move data from source to staging and from staging to target tables using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
  • Created stored procedures to load staging data into corresponding target tables.
  • Involved in error handling, performance tuning of SQL, and testing of stored procedures.
  • Strong working knowledge of primary, secondary, partitioned primary (PPI), and join indexes.
  • Reduced Teradata space usage by optimizing tables, adding compression where appropriate and ensuring optimum column definitions.
  • Scheduled Teradata and UNIX objects to run jobs on a daily/weekly basis depending on business requirements.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the enterprise data warehouse.
  • Wrote INSERT and UPDATE SQL in Teradata scripts and validated the load process into the target tables.
  • Created SQL and BTEQ scripts in Teradata to load data into the EDW per the business logic.
  • Prepared validation scripts to check data in target tables against source data.
  • Wrote BTEQ scripts and views based on user and reporting requirements.
  • Wrote BTEQ and MultiLoad scripts to load data into the target data warehouse.
  • Prepared data-cleansing scripts to clean source data per business rules.
  • Performed performance tuning through COLLECT STATISTICS and join index creation in different scenarios according to requirements.
  • Handled exception data by routing it to exception tables for further analysis.
  • Prepared scripts to load lookup tables based on new data coming from source tables.
  • Provided production support on various issues during daily loads.
  • Involved in analyzing various production issues and necessary enhancements required.
  • Participated in knowledge transfer sessions to Production support team on business rules, Teradata objects and on scheduling jobs.
  • Helped testing team to prepare test cases with various test scenarios.
  • Involved in defect analysis and fixing bugs raised by users during UAT.

Environment: Teradata 15.0, SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, Oracle 10.2, Linux
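The staging-to-target BTEQ loads described above generally follow a pattern like this sketch (the logon details, tables, and columns are hypothetical placeholders):

```sql
.LOGON tdprod/etl_user,etl_password;

-- Move validated staging rows into the target table.
INSERT INTO edw.orders_tgt (order_id, order_dt, amount)
SELECT order_id, order_dt, amount
FROM   stg.orders_stg
WHERE  order_id IS NOT NULL;

-- Abort with a non-zero return code on failure so the scheduler
-- can flag the job.
.IF ERRORCODE <> 0 THEN .QUIT 8;

-- Refresh statistics after the load for subsequent queries.
COLLECT STATISTICS ON edw.orders_tgt COLUMN (order_id);

.LOGOFF;
.QUIT 0;
```

The `.IF ERRORCODE` check is what lets a UNIX wrapper script or scheduler distinguish a clean load from one that needs exception handling.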

Confidential, WTC, NY

Teradata/ETL Developer

Responsibilities:

  • Performed analysis of complex business issues and provided recommendations for possible solutions. Wrote SQL queries and reports, and used data metrics and facts to support recommendations.
  • For the Clinical Health System 2020 project, Confidential maintains different source systems (member, provider, HF), from which data is extracted and loaded into the new system.
  • Daily transactional data and claims data from hospitals become available by 4 AM the next day for loading into the data warehouse.
  • A file-check process (shell script) waits for these files and kicks off the actual load process once they arrive.
  • As the initial load step, the file is MultiLoaded into a Teradata staging table and then archived to an archive folder once the MLOAD completes successfully.
  • The main load process consists mostly of Teradata stored procedures scheduled to run daily through a third-party scheduling tool.
  • Used Informatica to load data from data files into landing zone tables.
  • After the load process completes, a validation process checks the data loaded into target tables against the source.
  • Developed FastLoad scripts to load data from host files into landing zone tables.
  • Applied business transformations using BTEQ scripts.
  • Designed and developed complex mappings to move data from multiple sources (DB2, Oracle, PostgreSQL) into common target areas such as data marts and the data warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, and Update Strategy transformations in Informatica.
  • Reviewed mapping documents provided by the business team, implemented the business logic embedded in them as Teradata SQL, and loaded tables needed for data validation.
  • Created database tables, views, functions, procedures, and packages, as well as database sequences and database links.
  • Wrote complex SQL queries to pull required information from the database using Teradata SQL Assistant.
  • Wrote BTEQ and MultiLoad scripts to load data into the target data warehouse.
  • Performed performance tuning through COLLECT STATISTICS and join index creation in different scenarios according to requirements.
  • Scheduled backups and archives, and restored data when required.
  • Loaded business data into the data warehouse.
  • Wrote SQL queries to retrieve required data from the database.
  • Tested and debugged PL/SQL packages.
  • Created BTEQ scripts to load data from staging tables into target tables.
  • Used standard packages such as UTL_FILE and DBMS_SQL, along with PL/SQL collections and BULK binding, in writing database procedures, functions, and packages.
  • Provided required support in multiple stages of the project (SIT, UAT, and PROD).
  • Prepared BTEQ import and export scripts for tables.
  • Wrote BTEQ, FastLoad, and MLOAD scripts.
  • Validated target data against source data.

Environment: Teradata 14, Teradata SQL Assistant, Teradata Manager, PL/SQL, UNIX Scripts, MLOAD, BTEQ, FASTLOAD, Tivoli, Informatica.
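The file-to-staging MultiLoad step described above can be sketched roughly as follows (the TDPID, credentials, file path, and object names are all hypothetical):

```sql
.LOGTABLE work.claims_ml_log;
.LOGON tdprod/etl_user,etl_password;

.BEGIN IMPORT MLOAD
    TABLES stg.claims_stg
    WORKTABLES work.claims_wt
    ERRORTABLES work.claims_et work.claims_uv;

-- Pipe-delimited input record layout.
.LAYOUT claims_layout;
    .FIELD claim_id   * VARCHAR(18);
    .FIELD member_id  * VARCHAR(18);
    .FIELD claim_amt  * VARCHAR(12);

.DML LABEL ins_claims;
    INSERT INTO stg.claims_stg (claim_id, member_id, claim_amt)
    VALUES (:claim_id, :member_id, :claim_amt);

.IMPORT INFILE /data/inbound/claims.dat
    FORMAT VARTEXT '|'
    LAYOUT claims_layout
    APPLY ins_claims;

.END MLOAD;
.LOGOFF;
```

Rows that fail conversion or uniqueness checks land in the error tables for later review; the shell wrapper archives the input file only when the utility exits with return code 0.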

Confidential

Teradata Developer

Responsibilities:

  • Interacted with business users and analysts to gather, understand, analyze, and document the requirements of the data warehouse.
  • Understood the business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.
  • Used the Sequence Generator transformation to create surrogate keys in dimension tables, reducing complexity and improving performance relative to composite keys.
  • Worked in Workflow Manager to schedule workflows and run sessions according to job dependencies.
  • Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and to maintain uninterrupted data flow across process failures.
  • Uploaded data from the operational source system (Oracle 8i) to Teradata.
  • Used the FastLoad, MultiLoad, FastExport, and TPump utilities of Teradata, and created batch jobs using BTEQ.
  • Imported metadata from Teradata tables.
  • Implemented Change Data Capture (CDC) using PowerExchange.
  • Prepared BTEQ scripts to load data from the Preserve area to the Staging area.
  • Worked in Teradata SQL Assistant, querying the source/target tables to validate the BTEQ scripts.
  • Troubleshot issues by checking session and workflow logs.
  • Wrote UNIX shell scripts to automate workflows.
  • Performed unit testing, systems testing, and post-production verification.
  • Wrote test cases and documentation.
  • Performed extensive performance tuning by identifying bottlenecks at various points (targets, sources, and systems), leading to better session performance.
  • Participated in weekend reviews with team members and business stakeholders.

Environment: Informatica PowerCenter 9.1, Oracle 10g, Teradata 13.10, TPT, BTEQ, Teradata SQL Assistant, FASTLOAD, Windows 2000.
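A FastLoad script of the kind used for landing-zone and staging loads in these projects might look like this sketch (all names and paths are hypothetical; note that FastLoad requires an empty target table):

```sql
LOGON tdprod/etl_user,etl_password;

-- Clear leftover error tables from a previous run.
DROP TABLE stg.member_stg_err1;
DROP TABLE stg.member_stg_err2;

-- Pipe-delimited input file.
SET RECORD VARTEXT "|";

DEFINE member_id (VARCHAR(18)),
       member_nm (VARCHAR(60))
FILE = /data/inbound/member.dat;

BEGIN LOADING stg.member_stg
    ERRORFILES stg.member_stg_err1, stg.member_stg_err2;

INSERT INTO stg.member_stg (member_id, member_nm)
VALUES (:member_id, :member_nm);

END LOADING;
LOGOFF;
```

Because FastLoad only appends to empty tables, it suits initial landing-zone loads; incremental maintenance of populated tables falls to MultiLoad or TPump, as described in the bullets above.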
