
Teradata Developer Resume


SUMMARY

  • Over 7 years of experience in Information Technology, with a focus on Data Warehousing and application development.
  • Experience with Teradata 15/14/13/12 (Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, TPT (Teradata Parallel Transporter), FastExport, Visual Explain).
  • Worked with various versions of Informatica PowerCenter (10.x/9.x/8.x).
  • Worked in several areas of Data Warehouse including Analysis, Requirement Gathering, Development, Testing, and Implementation.
  • Experienced in mentoring Teradata development teams, program code development, test plan development, dataset creation, testing and result documentation, defect analysis, and bug fixing.
  • Experienced in Software Development and gained expertise in Data Warehousing.
  • Knowledge of large-scale data manipulation, management and analysis of large data sets on platforms such as Hortonworks and Cloudera.
  • Extensively used Teradata application utilities such as BTEQ, MultiLoad, FastLoad, TPump and TPT.
  • Extensively worked in Mainframe/UNIX and Informatica environments to invoke Teradata utilities and perform file handling.
  • Good exposure to Insurance and Banking Domains.
  • Good knowledge of Dimensional Data Modeling, Star Schema, Snowflake Schema, and Fact and Dimension tables.
  • Worked on a remediation (performance tuning) team, improving performance of user queries and production SQL.
  • Strong customer-facing skills and solid experience with release and change management.
  • Involved in Data Migration between Teradata and DB2 (Platinum).
  • Strong data warehousing experience specializing in Teradata, ETL concepts, development and testing; extracted, transformed and loaded (ETL) data from various sources into data warehouses using Teradata load utilities.
  • Extensive experience developing complex Informatica mappings/mapplets using various transformations for extraction, transformation and loading of data from multiple sources into data warehouses, creating workflows with worklets and tasks, and scheduling them using Workflow Manager.
  • Good exposure to Teradata Manager, FastLoad, MultiLoad, TSET, TPump and SQL for workload management.
  • Expertise in using Teradata SQL Assistant, Teradata Manager and data load/export utilities like BTEQ, FastLoad, MultiLoad and FastExport, with exposure to TPump in a UNIX environment.
  • Experience in Dimensional Modeling, ER Modeling, Star Schema/Snowflake Schema/3NF, Fact and Dimension tables, and Operational Data Stores (ODS).
  • Exposure to Data Mapping/Modeling for data warehousing and Acquisition/Conversion related tasks.
  • Strong experience in Teradata development and indexing (PI, SI, PPI and Join Index), as illustrated in the sketch following this summary.
  • Worked on Data Mining techniques for identifying claims on historical data.
  • Experienced in development/support of various data warehousing applications in communication, health services, insurance, banking and financial industries.
  • Proficient in performance tuning, from design architecture to application development.
  • Expertise in database programming like writing Stored Procedures (SQL), Functions, Triggers, Views in Teradata, DB2 & MS Access.
  • Exposure in extracting, transforming and loading (ETL) data from other databases and text files using SQL Server and ETL tools.
  • Experience in writing complex SQL to implement business rules and to extract and analyze data.
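
For illustration, a minimal sketch of these indexing options in Teradata DDL; the table, columns and value ranges are hypothetical and not taken from any of the projects below.

    -- Hypothetical fact table with a primary index (PI) and month-based partitioning (PPI)
    CREATE MULTISET TABLE sales_fact
    (
        sale_id   INTEGER       NOT NULL,
        store_id  INTEGER       NOT NULL,
        sale_dt   DATE          NOT NULL,
        sale_amt  DECIMAL(12,2)
    )
    PRIMARY INDEX (sale_id)
    PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01' AND DATE '2020-12-31'
                          EACH INTERVAL '1' MONTH);

    -- Secondary index (SI) to support lookups by store
    CREATE INDEX (store_id) ON sales_fact;

    -- Aggregate join index pre-summarizing sales by store and day for reporting queries
    CREATE JOIN INDEX sales_by_store_ji AS
    SELECT store_id, sale_dt, SUM(sale_amt) AS tot_sale_amt
    FROM   sales_fact
    GROUP BY store_id, sale_dt
    PRIMARY INDEX (store_id);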

TECHNICAL SKILLS

Databases: Teradata V2R5/V2R6, Teradata 12, 13, 14, 15, DB2, Oracle 10g/11g/12c, SQL Server 13.0.

SQL Tools: Teradata SQL Assistant, Teradata Studio, TOAD 12.8, SQL Developer, T-SQL.

Languages: SQL, PL/SQL, JavaScript, Perl, Python, Job Control Language (JCL).

Reporting Tools: Tableau 2019.2/10.5/10.4/10.3.1/10.2/10.1/10.0/9.x/8.x/7.x, Tableau Server, Tableau Online, Tableau Mobile, Tableau Public, Tableau Reader, SAP BusinessObjects BI 4.2/XI 3.1 R1/R2/6.5, SAP BusinessObjects Dashboards/Xcelsius 2008, SAP BO Web Intelligence Rich Client, MicroStrategy.

Teradata Utilities/DBA/ETL/Modeling Tools: BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, Mainframes, Teradata Viewpoint, Teradata Visual Explain, Teradata Manager, PMON, Informatica PowerCenter 8.x/9.x/10.x, Erwin r9.7, ER Studio, CI/CD, Git GUI, DevOps, AppOps, JIRA, SAS DI 4.9, Confluence, IWS.

Configuration Tools: BMC Remedy, IBM Rational Team Concert 6.02, VSS 6.0, ClearCase, Zillow.

Project Methodology/Tools: Waterfall, Agile, Hybrid, Scrum, MS Office 2016, MS Visio 2016, MS Project 2016, JIRA, SVN, GIT, CI/CD pipeline.

Operating Systems: Windows XP/NT/2000/98/95, UNIX, Linux

BigData/Hadoop Technologies: HDFS, Spark, Snowflake, Hive, Scala

Other tools: File-AID, Endevor, TSO, Peregrine Service Center, Exceed 8.0, Redbox, Expeditor, Hummingbird BI Query, Korn shell, Maestro (Job Scheduling Console), Tivoli Workload Scheduler, One Automation, HP ALM, Coin

PROFESSIONAL EXPERIENCE

Confidential

Teradata Developer

Responsibilities:

  • Develop logic to support enterprise Data Management solutions for healthcare analytics.
  • Develop and maintain complex and advanced Business Intelligence (BI) capabilities such as semantic layers, OLAP cubes, universes, predefined reports, access, capacity and performance management
  • Participate in various requirements sessions with internal and external stakeholders and convert business / functional requirements into technical solutions, along with supporting materials such as use cases, designs, flowcharts, models, specifications, and reports
  • Provide technical support to requirements, development, testing, help desk, and training teams
  • Support the project team to develop end-user training and reference materials
  • Interact with the database team to ensure data integrity and optimize end-user experience and performance as well as interpret error logs to debug and correct code.
  • Writing Teradata BTEQ scripts to load data from main tables to our Medi-Medi tables based on the source-to-target mapping documentation (a sample sketch follows this list).
  • Worked on MKS for promoting code from one environment to another.
  • Worked on the ServiceNow ticketing tool.
  • Worked on UNIX for verifying the code and logs.
  • Follow the corporate and program SDLC and quality guidelines, as well as adhere to industry regulations and compliance requirements.
  • Support performance tuning to ensure logic performs efficiently.
  • Demonstrate a high degree of adaptability, comfortable in establishing new directions, managing rapid change, and trying different approaches to deal with uncharted territories
  • Experience supporting leading commercial-off-the-shelf (COTS) BI applications such as SAS EBI, Business Objects, Cognos and MicroStrategy.
  • Familiarity with post-relational and big data technologies such as Hadoop, Spark and schema-less/NoSQL databases.
  • Ability to analyze project requirements and develop detailed specifications and designs for new BI capabilities, while balancing cost, performance, quality and schedule demands
  • Experience with KornShell programming on Unix/Linux platforms
  • Practical experience with high volume data mining or competitive intelligence, statistical analysis, predictive modeling, or forecasting.
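
For illustration, a minimal sketch of a BTEQ load script of the kind described in the Medi-Medi bullet above; the logon file path, databases and table/column names are placeholders rather than the actual project objects, and the .IF check simply aborts with a non-zero return code if the insert fails.

    .RUN FILE = /home/etl/.logon_prod
    .SET ERROROUT STDOUT

    -- Illustrative insert-select from a main claims table into a Medi-Medi staging table
    INSERT INTO medi_stg.medi_medi_claims (claim_id, member_id, claim_dt, paid_amt)
    SELECT c.claim_id, c.member_id, c.claim_dt, c.paid_amt
    FROM   edw.claims_main c
    WHERE  c.load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0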

Environment: Teradata 12, Teradata 15, UNIX, Hadoop, Korn shell, Teradata - BTEQ, FastLoad, MultiLoad, FastExport, TPump, Oracle, Mainframes, SQL Server, Aqua Data Studio, Toad, MKS, MicroStrategy, Jira.

Confidential

Teradata Developer

Responsibilities:

  • Involved with business requirements, Technical requirements, and coordinated with Data analyst team on the requirements.
  • Managing database space, allocating new space to databases, and moving space between databases on an as-needed basis.
  • Extensively worked with DBQL data to identify high-usage tables and columns (see the example query after this list).
  • Worked with Informatica PowerCenter 9.5 to create ETL mappings/workflows and to invoke BTEQ scripts via workflows.
  • Developing complex SQL and BTEQ scripts for processing the data as per coding standards.
  • Invoking shell scripts to execute BTEQ, FastLoad, MultiLoad and TPump utilities.
  • Invoking Korn shell scripts to perform reconciliation checks and pass parameter files.
  • Extensively worked in Mainframe/UNIX and Informatica environments to invoke Teradata utilities and perform file handling.
  • Expertise with Job scheduling console (Tivoli) on GUI and MVS.
  • Expertise in understanding data modeling involving logical and physical data model.
  • Worked on a POC using the TPT utility in place of the regular utilities to compare performance with the existing process.
  • Worked on the design of a process to update existing applications to integrate TPT for processing heterogeneous files, using the Load, Update, Stream and Export operators along with the Selector, Inserter, DataConnector and ODBC operators.
  • Developed new or modified SAS programs and used SQL pass-through and LIBNAME methods to extract data from the Teradata database, creating study-specific SAS datasets used as source datasets for report-generating programs.
  • Experienced in handling terabytes of records and manipulating them through SAS.
  • Used SAS DATA step logic to sort, merge, stack, update and interleave datasets to produce the required analyses, and used SAS Business Intelligence (BI) to produce the required reports.
  • Imported raw data files in Excel format into SAS, created SAS datasets from them and performed data manipulations on the datasets.
  • Cleaned existing data and converted it into useful SAS datasets, merged datasets and created reports based on ad-hoc requirements.
  • Developed and reviewed code, supported QA, and performed unit and UAT testing.
  • Coordinated with the offshore team on development activities.
  • Writing BTEQ scripts to load data from the landing zone to work tables in the staging area based on the ETL mapping documentation.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly and monthly.
  • UNIX scripting to invoke the Teradata BTEQ scripts.
  • Creation of views in Target database to calculate measures from different schemas.
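
A sketch of the kind of DBQL query used to identify high-usage tables and columns, assuming object-level query logging is enabled; the 30-day window is arbitrary and the column names follow the standard DBC.DBQLObjTbl layout.

    -- Rank tables and columns by how often DBQL recorded them being accessed in the last 30 days
    SELECT  ObjectDatabaseName
          , ObjectTableName
          , ObjectColumnName
          , COUNT(*) AS access_cnt
    FROM    DBC.DBQLObjTbl
    WHERE   CAST(CollectTimeStamp AS DATE) >= CURRENT_DATE - 30
    GROUP BY 1, 2, 3
    ORDER BY access_cnt DESC;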

Environment: Teradata 12, Informatica PowerCenter 10.x, Hadoop, UNIX, Korn shell, Teradata - BTEQ, FastLoad, MultiLoad, FastExport, TPump, Oracle, Mainframes, SQL Server, Aqua Data Studio, MicroStrategy, SAS 9.1.3, Toad.

Confidential

Teradata Developer

Responsibilities:

  • Analyzing the business requirements specified by the client
  • Preparation of technical requirements and high-level and low-level design documents based on the business process document.
  • Involved in loading data from flat files to warehouse landing zone tables using Teradata utilities.
  • Writing BTEQ scripts to load data from the landing zone to work tables in the staging area based on the ETL mapping documentation.
  • Designed complex UNIX scripts and automated them to run the workflows daily, weekly and monthly.
  • UNIX scripting to invoke the Teradata BTEQ scripts.
  • Creation of views in Target database to calculate measures from different schemas.
  • Performance tuning of BTEQ scripts.
  • Scheduling of jobs in One Automation on a daily or intra-day frequency.
  • Extensively worked on data extraction, transformation and loading from source to target systems using Informatica and Teradata utilities like FastExport, FastLoad, MultiLoad and TPT.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked with Teradata utilities like BTEQ, FastLoad and MultiLoad.
  • Experience in logical and physical data models using Data Modeling tools Erwin and ER Studio.
  • Working knowledge of data warehousing concepts like Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Established application hierarchies, databases, users, roles and profiles as per requirements.
  • Responsible for performance monitoring, resource and priority management, space management, user management, index management, access control, and executing disaster recovery procedures.
  • Created Teradata physical models using ER Studio, identifying the right PI, PPI and other indexes. Created both base and semantic layers. Worked with the enterprise Data Modeling team on creation of logical models.
  • Created scripts using FastLoad and MultiLoad to load data into the Teradata data warehouse.
  • Involved in writing SQL scripts, macros and stored procedures in Teradata to implement business rules (see the macro sketch after this list).
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit testing.
  • Worked on Teradata SQL Assistant querying the target tables to validate the BTEQ scripts.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Wrote UNIX scripts to run and schedule workflows for the daily runs.
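
A minimal sketch of a Teradata macro of the kind used to encapsulate a business rule; the databases, table, columns and the rule itself (a balance threshold) are purely illustrative.

    -- Macro applying a simple illustrative business rule: flag accounts below a balance threshold
    CREATE MACRO stg_db.flag_low_balance (run_dt DATE) AS
    (
        UPDATE tgt_db.account_dim
        SET    low_balance_flag = 'Y'
        WHERE  balance_amt < 500
        AND    as_of_dt = :run_dt;
    );

    -- Executed as part of the nightly load
    EXEC stg_db.flag_low_balance (DATE '2019-06-30');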

Environment: Teradata 15.10, UNIX (shell scripting), Teradata - BTEQ, FastLoad, MultiLoad, FastExport, Viewpoint, Informatica PowerCenter 10.1/10.2, Scheduling tool - One Automation.

Confidential

Teradata Developer

Responsibilities:

  • Extensively worked on data extraction, transformation and loading from source to target systems using Informatica and Teradata utilities like FastExport, FastLoad, MultiLoad and TPT.
  • Developed complex mappings to load data from Source System (Oracle) and flat files to Teradata.
  • Worked with Teradata utilities like BTEQ, Fast Load and Multi Load.
  • Experience in logical and physical data models using Data Modeling tools Erwin and ER Studio.
  • Working knowledge of data warehousing concepts like Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional and multidimensional data modeling.
  • Established application hierarchies, databases, users, roles and profiles as per requirements.
  • Responsible for performance monitoring, resource and priority management, space management, user management, index management, access control, and executing disaster recovery procedures.
  • Created Teradata physical models using ER Studio, identifying the right PI, PPI and other indexes. Created both base and semantic layers. Worked with the enterprise Data Modeling team on creation of logical models.
  • Created scripts using FastLoad, Multi-Load to load data into the Teradata data warehouse.
  • Involved in writing SQL scripts, macros and stored procedures in Teradata to implement business rules.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit testing.
  • Worked on Teradata SQL Assistant querying the target tables to validate the BTEQ scripts.
  • Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.
  • Wrote UNIX scripts to run and schedule workflows for the daily runs.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote a few Java routines to capture global map variables and used them in the jobs.
  • Created different transformations like Source Qualifier, Expression, Filter, Lookup transformations for loading the data into the targets.
  • Wrote complex SQL scripts to avoid Informatica Joiner and Lookup transformations and improve performance, as the data volume was heavy (see the sketch below).
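
An illustrative example of the kind of SQL override used to push the join into Teradata so that large volumes never pass through an Informatica Joiner or Lookup cache; the table and column names are hypothetical.

    -- Source Qualifier SQL override: the customer lookup is resolved in the database join
    -- instead of an Informatica Lookup/Joiner cache (all object names are illustrative)
    SELECT  t.txn_id
          , t.txn_dt
          , t.txn_amt
          , c.customer_key
          , c.customer_segment
    FROM    stg_db.transactions t
    LEFT JOIN edw.customer_dim c
           ON c.customer_id = t.customer_id
          AND c.current_flag = 'Y'
    WHERE   t.load_dt = CURRENT_DATE;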

Environment: Teradata 14, Hadoop, UNIX, Korn shell, Informatica PowerCenter 10.x, Teradata - BTEQ, MicroStrategy, FastLoad, MultiLoad, FastExport, Viewpoint.

Confidential

Teradata Developer

Responsibilities:

  • Involved with business requirements, technical requirements and design documents, and coordinated with the data analyst team on the requirements.
  • Worked in coordination with the DBA team on remediation activities; created users, databases and roles (see the DDL sketch after this list).
  • Allocated database space and monitored queries and system performance using Viewpoint and PMON.
  • Developed complex SQL and BTEQ scripts for processing the data as per coding standards.
  • Invoked shell scripts to execute BTEQ, FastLoad and MultiLoad utilities.
  • Invoked Korn shell scripts to perform reconciliation checks and pass parameter files.
  • Worked on conceptual, logical and physical data models using Erwin according to requirements.
  • Interacted with various Business users in gathering requirements to build the data models and schemas.
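
A brief sketch of the kind of Teradata DDL behind the user, database and role setup described above; the object names, space figures and password are placeholders only.

    -- Carve out a child database with perm and spool space from a parent database
    CREATE DATABASE edw_stage FROM edw_parent
    AS PERMANENT = 50000000000, SPOOL = 100000000000;

    -- Create an ETL user that defaults to the new database (password is a placeholder)
    CREATE USER etl_batch FROM edw_parent
    AS PERMANENT = 0, SPOOL = 50000000000,
       PASSWORD = "ChangeMe#123",
       DEFAULT DATABASE = edw_stage;

    -- Role-based access: grant object privileges to the role, then the role to the user
    CREATE ROLE etl_write_role;
    GRANT SELECT, INSERT, UPDATE, DELETE ON edw_stage TO etl_write_role;
    GRANT etl_write_role TO etl_batch;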

Environment: Teradata 13, Windows, Korn shell, Teradata - BTEQ, FastLoad, MultiLoad, FastExport, Oracle, Informatica PowerCenter 9.6, SQL Server, Aqua Data Studio, Toad.
