
Teradata Developer Resume


Atlanta, GA

SUMMARY

  • Around 8 years of professional experience in Information Technology with extensive experience in development of various projects using data warehousing tools like Informatica and databases like Oracle, SQL Server, Teradata and DB2 UDB.
  • Strong experience with large and midsize data warehousing implementations using Informatica Power Center/PowerMart, Oracle, SQL Server, UNIX and Windows platforms.
  • Analysis, design, development and implementation of data warehousing and data mart projects across full software development cycles.
  • Good experience in data warehousing applications; responsible for extraction, cleansing, transformation and loading of data from various sources to the data warehouse.
  • Good experience in Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant and BTEQ utilities (a minimal BTEQ sketch follows this list).
  • Experience as an ETL developer using Informatica Power Center, including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Manager and Workflow Monitor.
  • Familiar with creating secondary indexes and join indexes.
  • Expertise in implementing complex business rules by creating robust mappings and reusable transformations, using transformations like Unconnected Lookup, Connected Lookup, Joiner, Router, Expression, Aggregator, Filter, Update Strategy, etc.
  • Expertise in Star Schema and Snowflake Schema, fact and dimension tables, and Slowly Changing Dimensions. Interacted with both clients and end users to understand their requirements.
  • Experience in creating transformations and mappings using Informatica Designer.
  • Expertise in business model development with dimensions, hierarchies, measures, partitioning, aggregation rules, time series and cache management.
  • Worked with pushdown optimization in Informatica.
  • Developed comprehensive models for managing and communicating the relationships between base objects and performed the ETL process for loading the data into the hub.
  • Implemented performance tuning techniques and error handling at various levels, such as source and target mappings.
  • Comfortable with both technical and functional applications of RDBMS, data mapping, data management, data transportation and data staging.
  • Experience in creating tables, views, indexes, stored procedures, triggers, cursors, functions and packages in SQL Server.
  • Hands-on experience in SQL Server 2005/2008 Integration Services (SSIS).
  • Sound knowledge of all phases of the Software Development Life Cycle (SDLC) methodology and Agile methods.
  • Good knowledge of generating various complex reports using the Cognos suite of reporting tools, Business Objects, MicroStrategy and Excel reports.
  • Good knowledge of full life cycle design and development for building data warehouses.
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Experience working with both 3NF and dimensional models for data warehouses and good understanding of OLAP/OLTP systems.
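As a concrete illustration of the BTEQ work referenced above, here is a minimal sketch of a shell-wrapped BTEQ load step. The logon string, database and table names (tdpid, etl_user, stg.orders_stg, dw.orders) are hypothetical placeholders, not taken from any client system:

    #!/bin/sh
    # Minimal BTEQ step: insert staged rows into a base table and fail
    # the job if BTEQ reports an error. All object names are placeholders.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_password;

    INSERT INTO dw.orders
    SELECT * FROM stg.orders_stg;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    EOF
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "BTEQ load failed with return code $rc" >&2
        exit $rc
    fi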

TECHNICAL SKILLS

Primary Tools: Informatica Power Center 9.6/9.5x/9.x/8.x/7.x, TWS, Control-M

Teradata Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, SQL Assistant, Teradata Parallel Transporter (TPT), Stored Procedures

Languages: Teradata SQL, Oracle SQL

Databases: Teradata V2R5/V2R6.x/12.0/13.x/14.10

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, AIX

Data Modeling: Erwin 4.0/3.5, Logical/Physical/Dimensional/3NF, Star Schema, ETL

Scripting Languages: UNIX Shell Scripting, BTEQ

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

Teradata Developer

Responsibilities:

  • Responsible for requirements gathering for an enhancement requested by the client. Involved in analysis and implementation.
  • Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, DDL commands and DML commands (SQL).
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and collected statistics.
  • Created a BTEQ script for pre-population of the work tables prior to the main load process (illustrated in the sketch after this list).
  • Extensively used derived tables, volatile tables and global temporary tables in many of the ETL scripts.
  • Created Primary Indexes (PI) for both planned access of data and even distribution of data across all the available AMPs. Created appropriate Teradata NUPIs for smooth (fast and easy) access of data.
  • Worked on exporting data to flat files using Teradata FastExport. Expertise in SQL and performance tuning on a large-scale Teradata database.
  • In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
  • Responsible for designing the ETL strategy for both initial and incremental loads.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables, then move data from staging to journal and from journal into base tables.
  • Interacted with the business community and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build the data mart.
  • Provided scalable, high-speed, parallel data extraction, loading and updating using TPT.
  • Developed UNIX scripts to transfer the data from operational data sources to the target warehouse.
  • Very good understanding of database skew, PPI, join methods and join strategies, and join indexes including sparse, aggregate and hash.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator transformations to pipeline data to the data mart.
  • Implemented full Pushdown Optimization (PDO) for the semantic layer implementation for some of the complex aggregate/summary tables instead of using the ELT approach.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor.
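As referenced above, a sketch of the work-table pre-population pattern, combining a volatile table with a NUPI-distributed work table; the database, table and column names here are invented for illustration:

    #!/bin/sh
    # Pre-load BTEQ step: stage active keys in a volatile table, then
    # pre-populate a work table ahead of the main load. Names are
    # illustrative placeholders only.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_password;

    CREATE VOLATILE TABLE vt_active_accts AS (
        SELECT acct_id
        FROM   dw.accounts
        WHERE  status_cd = 'A'
    ) WITH DATA
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    DELETE FROM wrk.acct_balances_wrk;

    INSERT INTO wrk.acct_balances_wrk (acct_id, bal_amt)
    SELECT b.acct_id, b.bal_amt
    FROM   dw.balances b
    JOIN   vt_active_accts v
      ON   b.acct_id = v.acct_id;

    COLLECT STATISTICS ON wrk.acct_balances_wrk COLUMN (acct_id);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    EOF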

Environment: Informatica Power Center 9.6, Teradata 14.10, Oracle 11g, SQL Server 2012/2008, UNIX, Flat Files, Toad

Confidential, Omaha, NE

Sr. Informatica / Teradata Developer

Responsibilities:

  • Understood the requirements documents and interacted with the client team to get clarifications for any issues.
  • Participated in weekly all-hands meetings with onsite teams, participated in client interactions to resolve any business logic issues/concerns, participated in knowledge transition activities, escalated any techno-functional issues to the leads, and ensured task timelines and quality.
  • Designed and developed the end-to-end ETL process.
  • Worked on loading of data from several flat file sources using Teradata MLOAD and FLOAD in Teradata 14 (a FastLoad sketch follows this list).
  • Wrote Teradata SQL, BTEQ, MultiLoad, FastLoad and FastExport for ad-hoc queries, and built UNIX shell scripts to drive the BTEQ and FastLoad ETL interfaces.
  • Created ETL scripts and procedures to extract data and populate the data warehouse using Oracle Gateway to Teradata.
  • Prepared technical specifications to develop Informatica ETL mappings to load data into various tables conforming to the business rules.
  • Created schematic diagrams depicting the pseudo mapping logic and workflow logic.
  • Created unit test cases for each component in the workflow and for other tasks as part of the ETL process.
  • Conducted/participated in peer reviews of mapping documents and got sign-off from the client to start the construction phase.
  • Unit tested the ETL workflows and mappings, fixed the defects that came out of unit testing and, where needed, modified the documents to keep them up to date with the functionality in the system.
  • Migrated the code from one environment to other environments.
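A representative sketch of one of the flat-file FastLoad jobs mentioned above; the delimiter, file path, logon and all object names are assumptions made for the example:

    #!/bin/sh
    # FastLoad job for a pipe-delimited flat file into an empty staging
    # table. Paths, names and the logon are placeholders.
    fastload <<'EOF'
    LOGON tdpid/etl_user,etl_password;

    DROP TABLE stg.customer_stg;
    DROP TABLE stg.customer_e1;
    DROP TABLE stg.customer_e2;

    CREATE TABLE stg.customer_stg (
        cust_id   VARCHAR(10),
        cust_name VARCHAR(50)
    ) PRIMARY INDEX (cust_id);

    SET RECORD VARTEXT "|";
    DEFINE in_cust_id   (VARCHAR(10)),
           in_cust_name (VARCHAR(50))
    FILE = /data/inbound/customer.dat;

    BEGIN LOADING stg.customer_stg
          ERRORFILES stg.customer_e1, stg.customer_e2
          CHECKPOINT 100000;

    INSERT INTO stg.customer_stg (cust_id, cust_name)
    VALUES (:in_cust_id, :in_cust_name);

    END LOADING;
    LOGOFF;
    EOF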

Environment: Informatica Power Center 9.6, Teradata 14.10, TPT, BTEQ, FastLoad, MultiLoad, SQL Assistant, Oracle 11g, SQL Server 2012/2008, UNIX, Flat Files

Confidential, Hoffman Estates, IL

ETL/Teradata Developer

Responsibilities:

  • Reviewed business requirements with the Business Analyst and Technical Analyst.
  • Used Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Wrote BTEQ scripts to transform data.
  • Wrote Teradata macros and used various Teradata analytic functions (a macro sketch follows this list).
  • Performance tuned and optimized various complex SQL queries.
  • Used the Informatica Power Center ETL tool to extract the data from different source systems and load it into the target systems.
  • Extracted flat file source data using the Informatica Designer tool to a staging database and then implemented business logic to load the data into the target systems.
  • Developed mappings using different transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Update Strategy, Sequence Generator and Joiner.
  • Created mapping parameters and variables.
  • Implemented Slowly Changing Dimensions to keep track of the historical data.
  • Created a mapping to dynamically generate parameter files.
  • Created UNIX scripts to execute in Command tasks and configured Email tasks to send error reports to users.
  • Used the Informatica Power Center versioning option to support team-based development.
  • Used the Debugger for debugging the mappings.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Created tasks, workflows, worklets and sessions using the Workflow Manager tool and monitored workflows using the Informatica Workflow Monitor.
  • Involved in error handling using session logs and workflow logs.
  • Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Used UNIX commands and shell scripts to schedule the jobs.
  • Performed incremental aggregation to load incremental data into aggregate tables.
  • Created effective test cases and performed unit testing and integration testing to ensure successful execution of the data loading process.
  • Responsible for unit testing of ETL mappings, bug fixing and helping the testing team execute the test cases.
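A hedged sketch of the macro and analytic-function work described above; the reporting database, sales table, columns and the top-10 rule are invented for illustration:

    #!/bin/sh
    # Teradata macro using an ordered analytic function (RANK with
    # QUALIFY). Database, table and column names are illustrative only.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_password;

    REPLACE MACRO rpt.top_customers (in_month INTEGER) AS (
        SELECT cust_id,
               SUM(sale_amt) AS total_sales,
               RANK() OVER (ORDER BY SUM(sale_amt) DESC) AS sales_rank
        FROM   dw.sales
        WHERE  sale_month = :in_month
        GROUP  BY cust_id
        QUALIFY sales_rank <= 10;
    );

    EXEC rpt.top_customers(202401);

    .LOGOFF;
    EOF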

Environment: Informatica Power Center 9.0.1, Teradata 14.10/13.x, TPT, BTEQ, FastLoad, MultiLoad, SQL Assistant, Oracle 11g, SQL Server 2008/2005, UNIX, Flat Files

Confidential, Riverwoods, IL

Informatica Developer

Responsibilities:

  • Analyzed the requirements to identify the necessary tables that need to be populated into the staging database.
  • Created mappings using transformations like Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Union, Stored Procedure and XML transformations.
  • Worked on Informatica Power Center tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.
  • Converted existing PL/SQL packages to ETL mappings using Informatica Power Center.
  • Used an error handling strategy for trapping errors in a mapping and sending errors to an error table.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.
  • Worked on Exchange Management Console (EMC) and Exchange Management Shell (EMS) related topics.
  • Extensively involved in performance tuning, recommending SQL queries for better performance.
  • Automated the process of starting and stopping the dtlcacon/pwxccl processes by creating a shell script.
  • Developed in SCRUM iterations using Agile methodology, iterative development and sprint burndowns with story boards.
  • Estimated and planned development work using Agile software development.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Used the Debugger to debug mappings and gain troubleshooting information about data and error conditions.
  • Used mapping variables for incremental extraction of operational data.
  • Wrote UNIX shell scripts to automate workflows (a pmcmd sketch follows this list).
  • Scheduled and ran extraction and load processes, and monitored tasks and workflows using the Workflow Manager and Workflow Monitor.
  • Used the Workflow Manager for creating workflows, worklets, email and command tasks.
  • Used Informatica features to implement Type I and Type II changes in slowly changing dimension tables.
  • Used FTP services to retrieve flat files from the external sources.
  • Worked on Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL and BTEQ utilities.
  • Involved in performance tuning at various levels including target, source, mapping and session for large data files.
  • Worked on migration strategies between Development, Test and Production repositories.
  • Supported the Quality Assurance team in testing and validating the Informatica workflows.
  • Did unit testing and development testing at the ETL level on my mappings.
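A sketch of the kind of workflow-automation shell script referenced above, built around Informatica's pmcmd command-line tool; the service, domain, folder, workflow and credential values are hypothetical:

    #!/bin/sh
    # Wrapper that starts an Informatica workflow via pmcmd and fails
    # loudly if the workflow does not complete successfully. All names
    # and credentials are placeholders.
    INFA_SERVICE=int_svc_dev
    INFA_DOMAIN=dom_dev
    INFA_FOLDER=SALES_DM
    WORKFLOW=wf_load_sales_dm

    pmcmd startworkflow \
        -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -u etl_user -p etl_password \
        -f "$INFA_FOLDER" -wait "$WORKFLOW"
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "Workflow $WORKFLOW failed with status $rc" >&2
        exit $rc
    fi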

Environment: Informatica Power Center 8.5, Oracle 10g, SQL Server 2005, DB2, SQL*Plus, SQL Loader, SQL Developer, Autosys, Flat Files, UNIX, Windows 2000, Teradata V14, DB2 UDB EE 8.1, AIX 4.3, Solaris

Confidential, Indianapolis

Informatica Developer/Teradata Developer

Responsibilities:

  • Communicated with business users and analysts on business requirements. Gathered and documented the technical and business metadata about the data.
  • Used the Teradata EXPLAIN facility, which describes to end users how the database system will perform any request.
  • Worked with the TS/API product, a system that allows products designed for SQL/DS to access the Teradata database machine without modification.
  • Responsible for configuring the Workflow Manager, Repository Server Administration Console, Power Center Server and database connections.
  • Analyzed the specifications provided by the clients.
  • Worked with various system interfaces to gather requirements for migration and implementation.
  • Coded using Teradata BTEQ SQL and wrote UNIX scripts to validate, format and execute the SQL in the UNIX environment.
  • Compared the actual test results with expected results using FTP with UNIX scripts.
  • Designed the database, table and view structures for the new data mart.
  • Used Teradata Manager, Index Wizard and PMON utilities to improve performance.
  • Populated or refreshed Teradata tables using FastLoad, MultiLoad and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Applied various optimization techniques in Aggregator, Lookup and Joiner transformations.
  • Involved in unit testing and integration testing.
  • Wrote functions and stored procedures in the Teradata database; good experience in developing stored procedures and views.
  • Involved in creating UNIX scripts for triggering the stored procedures and macros.
  • Performance tuned the long-running queries.
  • Worked on complex queries to map the data as per the requirements.
  • Reduced Teradata space used by optimizing tables - adding compression where appropriate and ensuring optimum column definitions (a compression sketch follows this list).
  • Prepared test cases and performed unit testing.
  • Reviewed unit and integration test cases.
  • Production implementation and post-production support.
  • Generated weekly and monthly reports for internal and external database and system audits.
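A sketch of the space-optimization work above: multi-value compression on low-cardinality columns, checked with EXPLAIN. The table layout and compressed values are invented for the example:

    #!/bin/sh
    # Rebuild a table with multi-value compression on low-cardinality
    # columns, then check a plan with EXPLAIN. Names and values are
    # illustrative placeholders.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_password;

    CREATE TABLE dw.orders_new (
        order_id  INTEGER NOT NULL,
        status_cd CHAR(1) COMPRESS ('A','C','P'),
        region_cd CHAR(2) COMPRESS ('NE','SE','MW','SW')
    ) PRIMARY INDEX (order_id);

    INSERT INTO dw.orders_new
    SELECT order_id, status_cd, region_cd FROM dw.orders;

    EXPLAIN
    SELECT region_cd, COUNT(*)
    FROM   dw.orders_new
    GROUP  BY region_cd;

    .LOGOFF;
    EOF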

Environment: Teradata V2R5, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, FTP, Windows XP, Cognos, Visual Basic 6.

Confidential

Informatica/Teradata Developer

Responsibilities:

  • Extensively used Informatica transformations like Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Sorter, etc., and all transformation properties.
  • Extensively used various data cleansing and data conversion functions in various transformations.
  • Translated business processes into Informatica mappings for building data marts using Informatica Designer, which populated the data into the target star schema on an Oracle 9i instance.
  • Followed the required client security policies and required approvals to move the code from one environment to another.
  • Developed ETL mappings and transformations using Informatica Power Center 8.6.
  • Deployed the Informatica code and worked on code merges between two different development teams.
  • Successfully implemented Slowly Changing Dimensions (SCD). Used Type 2 dimensions to keep the history of changes.
  • Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
  • Built many UNIX shell scripts to schedule ETL jobs based on trigger files (a trigger-file sketch follows this list).
  • Identified the bottlenecks in the sources, targets, mappings and sessions, and resolved the problems.
  • Created automated scripts to perform data cleansing and data loading.
  • Performed complex defect fixes in various environments like UAT, SIT, etc., to ensure the proper delivery of the developed jobs into the production environment.
  • Attended daily status calls with the internal team and weekly calls with the client, and updated the status report.
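A sketch of the trigger-file pattern above: a shell script that polls for an upstream trigger file, kicks off the load, then archives the trigger. The paths and the run_sales_load.sh wrapper are hypothetical:

    #!/bin/sh
    # Poll for a trigger file for up to an hour; when it lands, run the
    # load and archive the trigger. All paths and the load wrapper are
    # placeholders.
    TRIGGER=/data/triggers/sales_ready.trg
    ARCHIVE=/data/triggers/archive

    i=0
    while [ $i -lt 12 ]; do
        if [ -f "$TRIGGER" ]; then
            ./run_sales_load.sh     # hypothetical load wrapper
            mv "$TRIGGER" "$ARCHIVE/sales_ready.trg.$(date +%Y%m%d%H%M%S)"
            exit 0
        fi
        sleep 300                   # poll every 5 minutes
        i=$((i + 1))
    done

    echo "Trigger file $TRIGGER did not arrive" >&2
    exit 1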

Environment: Informatica Power Center 6.1, Oracle 8i, Erwin Data Modeler, BTEQ, FastLoad, MultiLoad, TOAD, Windows XP, Teradata SQL Assistant.
