
Application Engineer Resume


Beaverton, OR

SUMMARY

  • Over 8 years of extensive experience in ETL (Extract, Transform, Load), Data Integration, and Data Warehousing using Teradata and Informatica technologies.
  • Very good understanding of Teradata’s MPP architecture, including shared-nothing design, Nodes, AMPs, BYNET, Partitioning, and Primary Indexes.
  • Extensively created and used Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temp tables.
  • Knowledge of Hadoop, HDFS, MapReduce, and the Hadoop Ecosystem (Pig & Hive).
  • Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, SQL Assistant, Viewpoint, and DDL and DML commands. Very good understanding of Teradata UPI and NUPI, secondary indexes, and join indexes.
  • Extensively created Batch jobs and real time Integration using Informatica power center.
  • Experience in Mainframe applications development. Mainframe skills include TSO, COBOL II, JCL, DB2, SQL, SPUFI, QMF, IMS, IDMS, CICS and VSAM.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Working knowledge of data warehousing concepts such as Star Schema and Snowflake Schema, Data Marts, and the Kimball methodology used in relational, dimensional, and multidimensional data modeling.
  • Extensive knowledge on Data Profiling using Informatica Developer tool.
  • Implemented Slowly Changing Dimension (Type I, II & III) methodologies for accessing the full history of accounts and transaction information. Designed and developed change data capture (CDC) solutions that capture and analyze changes from daily feeds to maintain history tables.
  • Proficient in designing and developing ETL objects using Informatica PowerCenter with transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Router, Rank, and Normalizer.
  • Involved in Data Migration projects from DB2 and Oracle to Teradata. Created automated scripts to do the migration using UNIX shell scripting, Oracle/TD SQL, TD Macros and Procedures.
  • Automated the BTEQ report generation using UNIX scheduling tools on weekly and monthly basis. Well versed with understanding of Explain Plans and confidence levels and very good understanding of Database Skew. Knowledge in Query performance tuning using Explain, Collect Statistics, Compression, NUSI and Join Indexes including Join and Sparse Indexes.
  • Extensively worked with PMON/Viewpoint for Teradata performance monitoring and tuning. Well versed with the Teradata Analyst Pack, including Statistics Wizard, Index Wizard, and Visual Explain. Experience in programming with SQL and PL/SQL (Stored Procedures, Functions, Cursors, and Database Triggers).
  • Very good experience in Oracle database application development using Oracle 10g/9i/8i/x, SQL, PL/SQL, SQL Loader.
  • Strong SQL experience in Teradata from developing the ETL with Complex tuned queries including analytical functions and BTEQ scripts.
  • Extensively used Mapping Variables, Mapping Parameters, and Dynamic Parameter Files for improved performance and increased flexibility; also worked with XML sources and targets.
  • Data Processing Experience in Designing and Implementing Data Mart applications, mainly Transformation Process using Informatica.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Worked with Informatica Data Quality toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 8.6.1.
  • Identified and eliminated duplicates in datasets through IDQ 8.6.1 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling the creation of a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
  • Developed workflows with Worklets, Event Waits, Assignments, Conditional Flows, Email and Command Tasks using Workflow Manager.
  • Knowledge of push down optimization concepts and tuning Informatica objects for optimum execution timelines.
  • Experienced with identifying Performance bottlenecks and fixing code for Optimization in Informatica and Oracle.
  • Created UNIX shell scripts for Informatica pre- and post-session operations, database administration, and day-to-day activities such as monitoring network connections and database ping utilities.
  • Extensive experience in implementation of Data Cleanup Procedures, Transformation, Scripts, Stored Procedures and execution of Test plans for loading the data successfully into Targets.
  • Maintaining the Visual Source Safe (VSS), Quality matrices, Knowledge management and Defect prevention & analysis.
  • Created checklists for coding, testing, and release for a smooth, better, and error-free project flow.
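
The Slowly Changing Dimension Type II approach mentioned above can be sketched as follows. This is a minimal in-memory illustration with hypothetical column names (`key`, `attrs`, `eff_start`, `eff_end`, `current`), not the actual Informatica/Teradata implementation: an attribute change expires the current row and inserts a new version, preserving full history.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective end date

def apply_scd2(dim, incoming, load_date):
    """Apply a Type II slowly-changing-dimension update.

    dim      -- list of dicts with 'key', 'attrs', 'eff_start', 'eff_end', 'current'
    incoming -- dict mapping key -> attribute dict from the daily feed
    """
    for key, attrs in incoming.items():
        current = next((r for r in dim if r["key"] == key and r["current"]), None)
        if current is None:
            # brand-new key: insert the first version
            dim.append({"key": key, "attrs": attrs,
                        "eff_start": load_date, "eff_end": HIGH_DATE,
                        "current": True})
        elif current["attrs"] != attrs:
            # attribute change: expire the old row, insert a new version
            current["eff_end"] = load_date
            current["current"] = False
            dim.append({"key": key, "attrs": attrs,
                        "eff_start": load_date, "eff_end": HIGH_DATE,
                        "current": True})
        # unchanged rows are left untouched, so history accumulates over time
    return dim
```

In a warehouse this same logic is typically expressed as an UPDATE plus INSERT (or MERGE) against the dimension table.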

Areas of Expertise

  • Data Warehousing and ETL Tools
  • Data Transfer and Data Migration
  • Mainframe Development
  • Query Optimization
  • Technical and User Documentation

TECHNICAL SKILLS

Primary Tools: Teradata SQL, Teradata

Tools and Utilities: Informatica PowerCenter 9.0.1/8.6/8.1, Ab Initio (Co>Op 3.0.3.9/2.15/2.14, GDE 3.0.4/1.15/1.14), IBM Information Server 9.1/8.5/8.0.1, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000

Languages: Teradata SQL, COBOL, JCL, REXX, SQL

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPump, TPT, SQL Assistant, Teradata Manager

Databases: Teradata 13/12/V2R6.2, Oracle 10g/9i, DB2/UDB, SQL Server

Operating Systems: Windows 95/98/NT/2000/XP, UNIX, Linux, NCR MP-RAS UNIX

Data Modeling: Erwin, ER Studio

Scheduling tools: Control M, Autosys

PROFESSIONAL EXPERIENCE

Confidential, Beaverton, OR

Application Engineer

Environment: Teradata 14.10 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Informatica PowerCenter 9, Alteryx Designer 10, Microsoft SQL Server Management Studio 2014, Oracle SQL Developer 4.1.1, SQL, PL/SQL, Workload Manager, MS Access, UNIX.

Responsibilities:

  • Involved in Product Engine Solution (PES), Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
  • Involved in the Emerging Markets project: set up a new database and roles specific to Emerging Markets; analyzed business requirements, SQL Server queries, and Cognos reports; performed a lift-and-shift from SQL Server to Teradata; worked on Point of Sale (POS) and Direct to Customer (DTC) data; prepared Technical Design documents; and performed data analysis, logical and physical database design, coding, testing, implementation, and deployment to business users.
  • Created and implemented User Managed Data (UMD) Process for Logistic Team and Key Business Segment (KBS) Team by using Microsoft Access and Teradata.
  • Provide Teradata production Support for sales and DTC team in Emerging Markets.
  • Involved in adding new plant codes in Teradata for FRS team, analyzed the Source of the plant codes and Informatica work flows. Impact analysis of adding new plants in Teradata. Deployed plant codes as part of ERS Build plan spring 2016.
  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
  • Created Batch jobs and Automated them according to business requirement.
  • Worked on Teradata and its utilities (TPump, FastLoad) through Informatica; also created complex Teradata Macros.
  • Worked on Alteryx Designer 10: creating, mapping, analyzing, monitoring, and automating new workflows.
  • Used Oracle SQL Developer 4.1.1: defined the schema, staging tables, and landing zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views.
  • Analyzing, designing and developing ETL strategies and processes, writing ETL specifications, Informatica development, and administration and mentoring other team members.
  • Used Microsoft SQL Server Management Studio and SSIS packages: analyzed Technical Design documents, performed data analysis, analyzed complex logic in SQL Server, and performed a lift-and-shift from SQL Server to Teradata.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica and Alteryx client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.

Confidential, San Jose, CA

Teradata Consultant

Environment: Teradata 14.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Hadoop HDFS, Apache Pig, Sqoop, Flume, Hive, MapReduce, Informatica PowerCenter 9, SQL, PL/SQL, Workload Manager, MS Access, UNIX.

Responsibilities:

  • Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
  • Provided technical support and guidance to the offshore team to address complex business problems.
  • Defined the schema, staging tables, and landing zone tables; configured base objects, foreign-key relationships, and complex joins; and built efficient views.
  • Wrote scripts for data extraction, transformation, and loading from legacy systems to the target data warehouse using BTEQ, FastLoad, MultiLoad, and TPump.
  • Developed multiple MapReduce Jobs in java for data cleaning and pre-processing.
  • Analyzed large data sets by running Hive queries and Pig scripts
  • Performed query optimization with the help of explain plans, collect statistics, and primary and secondary indexes. Used volatile tables and derived queries to break complex queries into simpler ones. Streamlined the Teradata script and shell script migration process on the UNIX box.
  • Dealt with initial, delta, and incremental data, as well as migration data, loaded into Teradata.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Using various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created Batch jobs and real time Integration using Informatica power center.
  • Developing as well as modifying existing mappings for enhancements of new business requirements mappings to load into staging tables and then to target tables in EDW. Also created mapplets to use them in different mappings.
  • Created data models for information systems by applying formal data modeling techniques.
  • Strong expertise in physical modeling with knowledge to use Primary, Secondary, PPI, and Join Indexes.
  • Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Working on different tasks in Workflows like sessions, events raise, event wait, e-mail, command, worklets and scheduling of the workflow.
  • Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into enterprise data warehouse.
  • Investigating failed jobs and writing SQL to debug data load issues in Production.
  • Writing SQL Scripts to extract the data from Database and for Testing Purposes.
  • Interacting with the Source Team and Business to get the Validation of the data.
  • Involved in Transferring the Processed files from mainframe to target system.
  • Supported the code after postproduction deployment.
  • Familiar with Agile software methodologies (scrum).
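
As a minimal sketch of the initial/delta/incremental load handling described above (hypothetical column names; in practice this split is done in SQL or an Informatica mapping, not application code): rows whose key already exists in the target become updates, while new keys become inserts.

```python
def classify_delta(target_keys, feed_rows, key_col):
    """Split an incoming feed into inserts vs. updates against the target table.

    target_keys -- set of primary-key values already present in the target
    feed_rows   -- list of dicts representing rows from the daily feed
    key_col     -- name of the primary-key column
    """
    inserts, updates = [], []
    for row in feed_rows:
        # existing key -> candidate update; unseen key -> insert
        (updates if row[key_col] in target_keys else inserts).append(row)
    return inserts, updates
```

On an initial load `target_keys` is empty, so every feed row lands in `inserts`; subsequent delta loads route only new keys there.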

Confidential, Lansing, MI

Informatica/Teradata Consultant

Environment: Teradata 14.0 (FastLoad, MultiLoad, FastExport, BTEQ), Teradata SQL Assistant, Informatica PowerCenter 9, SQL, PL/SQL, Workload Manager, MS Access, UNIX.

Responsibilities:

  • Analyzing, designing and developing ETL strategies and processes, writing ETL specifications, Informatica development, and administration and mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
  • Worked on Teradata and its utilities (TPump, FastLoad) through Informatica; also created complex Teradata Macros.
  • Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to mark a changed row.
  • Worked on reusable code known as Tie outs to maintain the data consistency. It compared the source and target after the ETL loading is complete to validate no loss of data during the ETL process.
  • Worked on Ab Initio in order to replicate the existing code to Informatica.
  • Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
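
The CHKSUM-based CDC described above can be sketched as a row-hash comparison. This is a hypothetical illustration (MD5 in Python standing in for a database checksum function; column and key names are invented): tracked columns are concatenated and hashed, and a row is flagged as changed when its hash differs from the stored target hash.

```python
import hashlib

def row_checksum(row, cols):
    """Concatenate the tracked columns and hash them, mimicking a CHKSUM-style
    change-detection column when no flag or date column exists."""
    joined = "|".join(str(row.get(c, "")) for c in cols)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def changed_rows(source_rows, target_hashes, key_col, cols):
    """Return source rows whose checksum differs from the stored target hash.

    target_hashes -- dict mapping key value -> previously stored checksum
    """
    out = []
    for row in source_rows:
        if target_hashes.get(row[key_col]) != row_checksum(row, cols):
            out.append(row)  # new or modified row
    return out
```

Only the rows returned here need to flow into the update/history branch of the load, which is the point of the technique when source systems provide no reliable change flag.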

Confidential, Hoffman Estates, IL

Sr. Teradata Developer

Environment: Teradata 14.0/13.0, BTEQ, FastLoad, MultiLoad, Fast Export, Teradata SQL Assistant, OBIEE 11g/10g, DB2, ERwin R7.3, IBM Mainframes MVS/OS, JCL, TSO/ISPF, Changeman, SPUFI, FileAid, COBOL, ZEKE, DB2, UNIX, FTP.

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
  • Extracted data from a DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Architected and developed FastLoad and MultiLoad scripts; developed Macros and Stored Procedures to extract data, and BTEQ scripts that take a date range from the database to extract data.
  • Created JCL scripts for calling and executing BTEQ, FastExport, Fload, and Mload scripts.
  • Provided maintenance and support of Online and Batch Programs using COBOL, DB2, CICS, and JCL.
  • Wrote queries using SPUFI to extract data from various DB2 views for reporting purposes.
  • Converting the Table data from DB2 region to TERADATA region using FASTLOAD and MULTILOAD Utilities.
  • Responsible for Coding, Unit Test Plans, Unit Test Results, Functional Testing and Regression Testing.
  • Migrated mainframe DB2 data to Teradata for one of their critical applications.
  • Synchronized all regions (PCR, Unit, and System) while migrating changes from lower regions to acceptance.
  • Wrote several DB2 Stored Procedure scripts to implement the business logic.
  • Handling Ad-Hoc Report Requests.
  • Reviewing programs for QA and Testing.
  • Developed Teradata BTEQ scripts to implement the business logic and work on exporting data using Teradata FastExport.
  • Wrote highly complex SQL to pull data from the Teradata EDW and create AdHoc reports for key business personnel within the organization.
  • Created data models for information systems by applying formal data modeling techniques.
  • Strong expertise in physical modeling with knowledge to use Primary, Secondary, PPI, and Join Indexes.
  • Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Provided database implementation and database administrative support for custom application development efforts.
  • Performance tuning and optimization of database configuration and application SQL by using Explain plans and Statistics collection based on UPI, NUPI, USI, and NUSI.
  • Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE.
  • Involved in comprehensive end-to-end testing- Unit Testing, System Integration Testing, User Acceptance Testing and Regression.
  • Provided 24/7 on-call production support for various applications, resolved night-time production job abends, and attended conference calls with business operations and system managers to resolve issues.

Confidential, Deerfield, IL

Informatica/Teradata Developer

Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica 8x, Oracle 10G, Teradata, UNIX, Citrix, Toad, Putty, PL/SQL Developer

Responsibilities:

  • Worked on Teradata and its utilities (TPump, FastLoad) through Informatica; also created complex Teradata Macros.
  • Responsible for requirements gathering for an enhancement requested by client. Involved in analysis and implementation for an Intranet Based Information Management Information System.
  • Analyzing, designing and developing ETL strategies and processes, writing ETL specifications, Informatica development, and administration and mentoring other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Transformation Developer, WorkFlow Manager, Workflow Monitor.
  • Worked on reusable code known as Tie outs to maintain the data consistency. It compared the source and target after the ETL loading is complete to validate no loss of data during the ETL process.
  • Worked on Ab Initio in order to replicate the existing code to Informatica.
  • Implemented Teradata MERGE statements in order to update huge tables thereby improving the performance of the application.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
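
The "tie out" validation mentioned above compares source and target after a load completes to confirm no data was lost. A minimal sketch, with hypothetical key and amount column names (real tie-outs would also compare hash totals and run as SQL against both systems):

```python
def tie_out(source_rows, target_rows, key_col, amount_col):
    """Compare row counts, amount totals, and key coverage between source
    and target after an ETL load; an empty dict means the load ties out."""
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_total = sum(r[amount_col] for r in source_rows)
    tgt_total = sum(r[amount_col] for r in target_rows)
    if src_total != tgt_total:
        issues["amount_total"] = (src_total, tgt_total)
    src_keys = {r[key_col] for r in source_rows}
    tgt_keys = {r[key_col] for r in target_rows}
    if src_keys != tgt_keys:
        issues["missing_keys"] = sorted(src_keys - tgt_keys)
    return issues
```

Running this after every load and failing the batch on a non-empty result is what makes the check reusable across mappings.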
