Teradata/ETL Developer Resume
Charlotte, NC
SUMMARY
- 4+ years of technical and functional experience in data warehouse implementations and ETL methodology using DataStage, Informatica PowerCenter, and Teradata
- Strong hands-on experience with Teradata utilities such as BTEQ, MultiLoad, FastLoad, TPT, and FastExport
- Expert in Teradata RDBMS development and production DBA support; use of FastLoad, MultiLoad, Teradata SQL, and BTEQ scripts to enhance performance
- In-depth understanding of Teradata functions; proficient in Teradata SQL, primary indexes, secondary indexes, PPI, join indexes, etc.
- Proficient in Slowly Changing Dimensions (SCDs) and Change Data Capture (CDC)
- Extensively involved in identifying performance bottlenecks in targets, sources, and transformations and successfully tuned them for maximum performance using best practices
- Good understanding of Teradata MPP architecture, including partitioning, shared-nothing design, nodes, AMPs, BYNET, etc.
- Strong work experience with the Autosys scheduling tool
- Implemented process and quality improvements through Automation Scripts
- Experience in working with Mainframe files, XML, and flat files
- Experience in UNIX shell scripting to support and automate the ETL process
- Experience in query optimization, performance tuning, and Teradata functions
- Experience in documenting design specs, unit test plans, and deployment plans
- Experienced in Agile and Waterfall methodologies
- Experience in Teradata Production Support
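
The BTEQ work summarized above can be illustrated with a minimal load-script sketch; the logon string and all database, table, and column names below are hypothetical placeholders, not taken from any actual project:

```sql
.LOGON tdprod/etl_user,etl_password;    -- illustrative credentials only

-- Load staged customer rows into a warehouse dimension
-- (edw.customer_dim and stg.customer_stg are hypothetical names)
INSERT INTO edw.customer_dim (cust_id, cust_name, load_dt)
SELECT cust_id,
       cust_name,
       CURRENT_DATE
FROM   stg.customer_stg;

-- Abort with a non-zero return code so a scheduler can flag the failure
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The `.IF ERRORCODE` check after each DML step is what lets a scheduler such as Autosys distinguish a clean run from a failed load.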
TECHNICAL SKILLS
Database: Teradata, SQL, Hive
ETL: IBM DataStage Enterprise Edition 8.1 & 9.1, Informatica PowerCenter 10.2
Scheduling Tool: Autosys R11
Big Data: Hadoop, Hive, Pig
Programming Languages: UNIX Shell Scripting, SQL, Teradata SQL
Methodologies: Agile, Scrum, Waterfall
Teradata Tools & Utilities: BTEQ, FastLoad, FastExport, TPump, MultiLoad, and SQL Assistant
PROFESSIONAL EXPERIENCE
Confidential, Charlotte NC
Teradata/ETL Developer
Responsibilities:
- Experienced in data movement via ETL and Teradata using FastExport, FastLoad, MultiLoad, the Import utility, etc.
- Analyze the existing ETL process and produce an ETL design document that lists the jobs to load
- Design ETL jobs using Informatica to load data from multiple source systems to Teradata database and parallel jobs to load the data into the target schema
- Create, test, and implement Teradata FastLoad, MultiLoad, and BTEQ scripts (DML and DDL)
- Expert in writing scripts for data extraction, transformation, and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, and MultiLoad
- Developed BTEQ scripts to load data from the Teradata staging area to the data mart and created load scripts using the FastLoad and MultiLoad utilities in SQL Assistant
- Investigate and debug issues in the database and services you create, and work with QA and Data Analysts to ensure the highest quality within the system
- Fix issues and implement requested changes during the warranty support period
- Involve in ongoing production support and process improvements and run the jobs through a third-party scheduler
- Used Autosys scheduling tool to schedule the jobs as per the requirement and monitored daily data loads as per the schedule
- Performed query performance tuning with the help of collect statistics, Explain plans, and primary and secondary indexes; used volatile tables and temporary/derived tables to break complex queries into simpler queries
- Provide project-level analysis, producing required documentation of business requirements, future-state proposals, and the UAT plan
- Experienced in creating test cases, testing strategy, UAT plan, and production Validation Approach
- Reduced project execution time by implementing shell scripting and reusable jobs
- Knowledge of agile software development practices and release management
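
The tuning techniques listed above can be sketched in Teradata SQL; statistics collection plus a volatile table that breaks one large query into simpler steps. All object and column names here are hypothetical:

```sql
-- Refresh optimizer statistics on the join/filter columns (hypothetical table)
COLLECT STATISTICS COLUMN (cust_id), COLUMN (order_dt)
    ON edw.sales_fact;

-- Stage an intermediate result in a volatile table instead of one large query
CREATE VOLATILE TABLE vt_monthly_sales AS
(
    SELECT cust_id,
           SUM(sale_amt) AS month_amt
    FROM   edw.sales_fact
    WHERE  order_dt BETWEEN DATE '2020-01-01' AND DATE '2020-01-31'
    GROUP BY cust_id
) WITH DATA
PRIMARY INDEX (cust_id)
ON COMMIT PRESERVE ROWS;

-- Verify the plan before running the final join
EXPLAIN
SELECT d.cust_name, v.month_amt
FROM   vt_monthly_sales v
JOIN   edw.customer_dim d
  ON   d.cust_id = v.cust_id;
```

Matching the volatile table's primary index to the subsequent join column keeps the join AMP-local and avoids row redistribution.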
Environment: Teradata 16, TPT, SQL Assistant, Teradata loading utilities (BTEQ, FastLoad, MultiLoad, FastExport), Informatica Developer 10.1, Informatica PowerCenter 9.6.1/10.1, UNIX shell scripting, Autosys
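
The UNIX shell-script automation described above could follow a wrapper pattern like this minimal sketch; the step commands are plain placeholders standing in for real BTEQ/MultiLoad invocations, and the log path is illustrative:

```shell
#!/bin/sh
# Sketch of an ETL driver script: run each load step, log it, and stop the
# chain on the first failure so the scheduler (e.g. Autosys) can alert.
# In a real job the steps would invoke bteq/mload; here echo stands in.

LOG=/tmp/etl_demo.log
: > "$LOG"

run_step() {
    step_name=$1; shift
    echo "START $step_name" >> "$LOG"
    if "$@" >> "$LOG" 2>&1; then
        echo "OK $step_name" >> "$LOG"
    else
        echo "FAIL $step_name" >> "$LOG"
        exit 1      # non-zero exit lets the scheduler flag the failure
    fi
}

# e.g. run_step load_stage bteq < load_stage.btq
run_step extract echo "extracting source files"
run_step load    echo "loading staging tables"
echo "ALL STEPS DONE" >> "$LOG"
```

Because each step's exit status is checked, a failed utility halts the job immediately instead of letting later steps run against incomplete data.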