Teradata Developer Resume Profile
Summary:
- 7 years of extensive IT experience, including a strong understanding of the data warehouse project development life cycle, implementation of data warehousing projects with Teradata, and expertise in Teradata database design, implementation, and maintenance, mainly in data warehouse environments.
- Experience working with Teradata utilities such as BTEQ, FastLoad, MultiLoad, XML import, FastExport, Teradata SQL Assistant, Teradata Administrator, and PMON.
- Expertise in the ETL tool Informatica, including PowerCenter, PowerExchange, PowerConnect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, IDE (Informatica Data Explorer), and IDQ (Informatica Data Quality).
- Involved in developing Extraction, Transformation, and Loading (ETL) strategies using the DataStage tool.
- In-depth understanding and usage of Teradata OLAP functions. Proficient in Teradata SQL, stored procedures, macros, views, and indexes (primary, secondary, PPI, join indexes, etc.).
- Experience working with MicroStrategy, Crystal Reports, business intelligence tools such as Business Objects, and the ETL tool Informatica.
- Good experience with UNIX/Windows/Mainframe environments for running TPump batch processes for Teradata CRM.
- Good understanding of Teradata MPP architecture, including partitioning, primary indexes, shared-nothing design, nodes, AMPs, BYNET, etc.
- Experience in Teradata production support.
- Good working knowledge of Teradata V2R5/V2R6/12.0 and sound knowledge of Oracle 10g/11g, MS SQL Server 2000, and DB2 7.0.
- Experience developing Big Data projects using Hadoop, HDFS, MapReduce, Hive, Sqoop, Hue, HBase, Pig, and Oozie. Expertise in implementing Hadoop MapReduce jobs in Java.
- Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Reports.
- Strong data modeling experience in ODS and dimensional data modeling methodologies such as star schema and snowflake schema. Designed and developed OLAP models consisting of multi-dimensional cubes and drill-through functionality for data analysis.
- Very good experience writing shell scripts (ksh).
- Excellent web development skills. Experience in N-tier client-server Internet technology, intranet portal design/development, web-based data reporting systems, and framework development for Internet applications.
- Experienced in writing SQL and PL/SQL queries and stored procedures for Oracle and MySQL databases.
Technical Skills:
- Programming languages: SQL, PL/SQL, JAVA
- ETL/Reporting Tools: Informatica 8.x, DataStage, MicroStrategy
- UI Technologies: HTML, CSS, XML
- Scripts: VB script and Shell Scripting
- Databases: Teradata V2R5/V2R6/12.0, Oracle 10g/11g, SQL Server 2005/2008, DB2, Sybase, XML.
- Hadoop Ecosystem: HDFS, MapReduce, Hive, HBase, Hue, Sqoop, Pig, Oozie, Cassandra, etc.
- Operating Systems: Windows (up to 8), HP-UX, Solaris, Linux.
Professional Experience:
Confidential
Teradata/ ETL Developer
Responsibilities:
- Understood the specifications and analyzed data according to client requirements. Worked extensively on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad.
- Designed process-oriented UNIX scripts and ETL processes for loading data into the data warehouse.
- Created database automation scripts using stored procedures to create databases in different environments.
- Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse.
- Performed error handling and performance tuning of Teradata queries and utilities.
- Tested the functionality of the systems in the development phase and designed the test plans for the data warehouse projects.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Performed Teradata performance tuning via EXPLAIN plans, PPI, AJI, indexes, collecting statistics, or rewriting code.
- Developed BTEQ scripts to load data from the Teradata staging area to the Teradata data mart (a minimal sketch appears after this list).
- Designed the complete workflow, with dependency hierarchy, for all extract mappings to serve the business requirements.
- Performance tuning of Teradata SQL statements over large data volumes.
- Created FastLoad, FastExport, MultiLoad, TPump, and BTEQ scripts to load data from Oracle databases and flat files into the primary data warehouse.
- Worked as a member of the Big Data team for deliverables like design, construction, unit testing and deployment.
- Loaded data from large data files into Hive tables.
- Performed initial setup to receive data from external sources.
- Designed and developed a Hive job to merge incremental files.
- Involved in writing MapReduce jobs in Java.
- Used external loaders such as MultiLoad and FastLoad to load data into the Teradata database.
- Translated functional and technical requirements into detailed architecture and design.
- Responsible for managing data coming from different sources.
- Imported and exported data to and from HDFS and Hive using Sqoop (a Sqoop sketch appears after this list).
- Responsible for operational support of the production system.
- Tuned Informatica Mappings and Sessions for optimum performance.
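For illustration, a minimal ksh sketch of the kind of BTEQ script used to load data from the staging area into the data mart, as noted above. The logon file, database, table, and column names are hypothetical placeholders, not the actual project objects.

    #!/bin/ksh
    # Minimal BTEQ sketch: insert-select from a staging table into a data mart table.
    # The logon file and all object names below are hypothetical placeholders.
    bteq <<EOF > bteq_load.log 2>&1
    .RUN FILE = /home/etl/.tdlogon;        -- file contains the .LOGON statement
    .SET ERROROUT STDOUT;

    INSERT INTO DATAMART_DB.SALES_FACT (sale_id, sale_dt, amount)
    SELECT sale_id, sale_dt, amount
    FROM   STAGE_DB.SALES_STG
    WHERE  load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?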
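Similarly, a minimal sketch of a Sqoop import into HDFS/Hive as referenced in the Sqoop bullet above. The JDBC connection string, credentials file, source table, and Hive target are hypothetical placeholders.

    #!/bin/ksh
    # Minimal Sqoop sketch: import a source table into HDFS and a Hive table.
    # Connection details, paths, and table names are hypothetical placeholders.
    sqoop import \
      --connect jdbc:oracle:thin:@//srcdb:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.sqoop_pwd \
      --table SALES \
      --target-dir /user/etl/sales \
      --hive-import \
      --hive-table staging.sales \
      -m 4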
Environment: Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), Hadoop (Cloudera), Hive 0.9.x, MapReduce, Java, shell scripting, Red Hat Linux, HDFS, Sqoop, RESTful services, and Cassandra.
Confidential
Data Warehouse Consultant
Responsibilities:
- Tuned SQL statements and procedures to enhance load performance in various schemas across databases; tuned queries to improve report refresh time.
- Created customized Web Intelligence reports from various data sources.
- Involved in performance tuning on the source and target database for querying and data loading.
- Developed MLoad scripts and shell scripts to move data from source systems to staging, and from staging to the data warehouse, in batch processing mode.
- Developed BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
- Developed scripts to load high-volume data into empty tables using the FastLoad utility (a FastLoad sketch appears after this list).
- Created reports using BO functions such as drill-down, prompts, and dimension and measure variables to show accurate results.
- Wrote SQL queries and matched the data between the database and reports.
- Tuned and enhanced universes with SQL queries for report performance.
- Created complex reports, including sub-reports, graphical reports, and formula-based and well-formatted reports, according to user requirements.
- Developed data extraction, transformation, and loading jobs from flat files, Oracle, SAP, and Teradata sources into Teradata using BTEQ, FastLoad, MultiLoad, and stored procedures.
- Designed process-oriented UNIX scripts and ETL processes for loading data into the data warehouse.
- Created database automation scripts using stored procedures to create databases in different environments.
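For illustration, a minimal ksh sketch of a FastLoad script for bulk-loading a delimited flat file into an empty staging table, as referenced above. The host, credentials, file path, and table names are hypothetical placeholders.

    #!/bin/ksh
    # Minimal FastLoad sketch: load a pipe-delimited flat file into an empty staging table.
    # All connection details and object names are hypothetical placeholders.
    fastload <<EOF > fastload_sales.log 2>&1
    LOGON tdpid/etl_user,password;
    DATABASE STAGE_DB;
    BEGIN LOADING SALES_STG ERRORFILES SALES_ERR1, SALES_ERR2;
    SET RECORD VARTEXT "|";
    DEFINE sale_id (VARCHAR(18)),
           sale_dt (VARCHAR(10)),
           amount  (VARCHAR(20))
    FILE = /data/incoming/sales.dat;
    INSERT INTO SALES_STG (sale_id, sale_dt, amount)
    VALUES (:sale_id, :sale_dt, :amount);
    END LOADING;
    LOGOFF;
    EOF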
Confidential
Teradata Developer
Responsibilities:
- Analyzed the business requirements and system specifications to understand the application.
- Imported data from source files, such as flat files, using Teradata load utilities like FastLoad, MultiLoad, and TPump.
- Created ad hoc reports using FastExport and BTEQ.
- Designed Informatica mappings to propagate data from various legacy source systems to Oracle. The interfaces were staged in Oracle before loading into the data warehouse.
- Performed data transformations using various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator, Filter, Router, Normalizer, Update Strategy, etc.
- Responsible for tuning report queries and ad hoc queries.
- Wrote transformations to convert data into the required form, based on client requirements, using Teradata ETL processes.
- Tuned SQL statements and procedures to enhance load performance in various schemas across databases; tuned queries to improve report refresh time.
- Created customized Web Intelligence reports from various data sources.
- Involved in performance tuning on the source and target databases for querying and data loading.
- Developed MLoad scripts and shell scripts to move data from source systems to staging, and from staging to the data warehouse, in batch processing mode.
- Exported data from the Teradata database using Teradata FastExport.
- Used UNIX scripts to run Teradata DDL in BTEQ and write to a log table.
- Created, loaded, and materialized views to extend the usability of the data.
- Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, in order to check for consistency (a sketch appears after this list).
- Made modifications as required for the reporting process by understanding the existing data model, and was involved in retrieving data from relational databases.
- Involved in SSA requestor responsibilities assigned for both project and support requests.
- Managed queries by creating, deleting, modifying, viewing, enabling, and disabling rules.
- Loaded data into the warehouse from different flat files.
- Performed database testing by writing and executing SQL queries to ensure that entered data was uploaded correctly into the database.
- Transferred files across various platforms using secure FTP.
- Involved in creating unit test plans and testing the data for various applications.
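For illustration, a minimal ksh sketch of the kind of daily count-verification script referenced above. The audit table, base table, logon file, and notification address are hypothetical placeholders.

    #!/bin/ksh
    # Minimal sketch: record how many rows today's incremental load added to a base table.
    # All object names, paths, and the mail address are hypothetical placeholders.
    LOAD_DT=$(date +%Y-%m-%d)

    bteq <<EOF > count_check_${LOAD_DT}.log 2>&1
    .RUN FILE = /home/etl/.tdlogon;

    INSERT INTO AUDIT_DB.LOAD_COUNTS (table_name, load_dt, row_count)
    SELECT 'SALES_FACT', DATE '${LOAD_DT}', COUNT(*)
    FROM   DATAMART_DB.SALES_FACT
    WHERE  load_dt = DATE '${LOAD_DT}';

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    EOF

    if [ $? -ne 0 ]; then
        echo "Count verification failed for ${LOAD_DT}" | mailx -s "Load count check failed" etl_support@example.com
        exit 1
    fi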
Environment: Teradata V12.0, Informatica, Business Objects XI R3.1, Crystal Reports, Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Sybase, DB2, Oracle, FTP, CVS, Windows XP, UNIX, Pentium server.
Confidential
Teradata /ETL Developer
Responsibilities:
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Developed mappings, sessions, workflows and worklets and scheduled the workflows in Maestro.
- Used external loaders such as MultiLoad and FastLoad to load data into the Teradata database.
- Designed the complete workflow, with dependency hierarchy, for all extract mappings to serve the business requirements.
- Tuned Informatica Mappings and Sessions for optimum performance.
- Worked on and resolved production data issues arising from the migration of the data warehouse from Teradata to Netezza for NFS.
- Worked extensively with Teradata utilities such as BTEQ, FastExport, and all load utilities.
- Developed Shell Scripts for event automation and scheduling.
- Implemented duplicate-removal logic in the existing code to avoid duplicates in production loads (a sketch appears after this list).
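For illustration, a minimal ksh sketch of one common way to implement the duplicate-removal logic mentioned above, using Teradata's QUALIFY with ROW_NUMBER to keep the latest row per business key. The logon file, databases, tables, and key columns are hypothetical placeholders.

    #!/bin/ksh
    # Minimal dedup sketch: keep only the latest row per business key when loading the target.
    # The logon file and all object names below are hypothetical placeholders.
    bteq <<EOF > dedup_load.log 2>&1
    .RUN FILE = /home/etl/.tdlogon;

    INSERT INTO DATAMART_DB.CUSTOMER_DIM (cust_id, cust_name, updt_ts)
    SELECT cust_id, cust_name, updt_ts
    FROM   STAGE_DB.CUSTOMER_STG
    QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY updt_ts DESC) = 1;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    EOF
    exit $?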
Environment: Informatica PowerCenter 8.6, Oracle 10g, Teradata V2R6, Netezza, SQL Server 2005, Teradata SQL Assistant, Toad, Management Studio, Windows XP, Sun Solaris, UNIX, Maestro Scheduler.