Teradata Informatica Developer Resume
SUMMARY
- 7 years of experience in the Analysis, Design, Development, Testing and Implementation of application systems across diverse domains.
- Expertise in Data Warehouse/Data Mart and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
- Proficient in all stages of the software development life cycle (SDLC), including requirements analysis, system design, data modeling, database design, code development, code review, integration and testing, code migration, system documentation, transitioning to support teams, and production support.
- Experienced in Dimensional Modeling using Star and Snowflake schemas, identifying Facts and Dimensions, and physical and logical data modeling using IBM Infosphere Data Architect.
- Worked with business managers and Subject Matter Experts (SMEs) to identify, prioritize and implement solutions to improve efficiency, reduce risk and support new business.
- Extensive experience in software development using ETL technologies such as Informatica PowerCenter, Syncsort DMX-h and SAS Data Integration.
- Proficient in the Integration of various data sources with multiple relational databases like Teradata, MS SQL Server, Oracle and Flat Files into the Staging Layer.
- Strong Data Warehousing ETL experience of using Informatica PowerCenter Client tools - Mapping Designer, Workflow Manager/Monitor and Repository Manager.
- Worked extensively with slowly changing dimensions.
- Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad and TPT to export and load data to/from different source systems, including flat files.
- Experienced in creating Tables and Views, defining and using Indexes, and writing complex queries in Teradata using various Joins, Functions and set operators.
- Proficient in performance analysis, data profiling, monitoring and SQL query tuning using Explain Plan and Collect Statistics in Teradata (a brief sketch follows this summary).
- Developed Informatica workflows to facilitate ETL process in project as per requirement.
- Used Informatica PowerExchange to source data from legacy Mainframe systems.
- Experienced in developing Informatica Workflows, Worklets, Sessions and other Tasks, Mappings, Mapplets and various Transformations.
- Experience in resolving maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Exposure to file processing in HDFS using Syncsort DMX-h Jobs and Tasks.
- Extensive experience in writing UNIX shell scripts.
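Illustrative sketch only (not a project artifact): a minimal example of the Collect Statistics / Explain Plan tuning workflow mentioned above, run through BTEQ from a UNIX shell. The TDPID, credentials, databases, tables and columns are invented placeholders.

#!/bin/ksh
# Hypothetical tuning sketch: refresh optimizer statistics on frequently joined
# columns, then review the EXPLAIN plan of a candidate query.
# TDPID, credentials, database, table and column names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd

COLLECT STATISTICS ON stg_db.customer_txn COLUMN (customer_id);
COLLECT STATISTICS ON stg_db.customer_txn COLUMN (txn_dt);

EXPLAIN
SELECT c.customer_id, SUM(t.txn_amt)
FROM dim_db.customer c
JOIN stg_db.customer_txn t
  ON c.customer_id = t.customer_id
GROUP BY 1;

.LOGOFF
.QUIT
EOF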
TECHNICAL SKILLS
Database: Teradata, Oracle, MS SQL Server
ETL Tools: Informatica PowerCenter, Informatica PowerExchange, Syncsort DMX-h, SAS Data Integration Studio
Job Scheduler: Autosys, SAS Management Studio
Operating Systems: Unix
Analytics: Base SAS
Tools/IDE: Teradata SQL Assistant, Eclipse, TOAD, PuTTY
Version Control: Tortoise SVN
Data Modeling Tool: IBM Infosphere Data Architect
Big Data: HDFS, MapReduce, Syncsort DMX-h
PROFESSIONAL EXPERIENCE
Confidential
Teradata Informatica Developer
Environment: Teradata, Informatica PowerCenter, UNIX, Autosys, Tortoise SVN
Responsibilities:
- Gathered requirements from users on Anti-Money Laundering (AML) customer profiling data.
- Performed data analysis, prototyped the business logic, documented the design approach and developed the approved prototype.
- Created logical and physical data models and implemented the models in database.
- Created Tables, Indexes, Macros and Views, and worked extensively with TPT utilities.
- Used Teradata utilities to source data from different source systems including flat files and Teradata tables.
- Developed BTEQs to transform data received from source to populate the various profiles in this application.
- Developed BTEQs to maintain history of slowly changing dimensions in various profiles (a simplified sketch follows this list).
- Captured Teradata Metrics to analyze query performance.
- Tuned Teradata queries to improve performance.
- Developed Informatica Workflows to source data from flat files and tables to stage tables in the application.
- Improved performance of Informatica workflows by implementing pipeline partitioning for parallel data processing.
- Exported Informatica workflows to XML files to facilitate migration to multiple environments.
- Developed Shell scripts for implementing controls and other file processing.
- Developed shell scripts for invoking BTEQs and for triggering Informatica Workflows.
- Scheduled the various processes in the application using Autosys scheduling tool by determining the dependencies and SLAs.
- Prepared Low Level Design (LLD) documents.
- Prepared Unit test cases and test scripts for code validation.
- Prepared the Detailed Task Schedule (DTS) document, which lays out the deployment activities and the schedule to be followed by the various teams involved in production deployment.
- Performed Unit testing and supported User Acceptance Testing.
- Created SVN Tags to facilitate code migration to other environments.
- Involved in Deployment and Release management Activities.
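A simplified, hypothetical sketch of the BTEQ pattern for maintaining slowly changing dimension history referenced above (expire the changed open row, then insert the new version). The databases, tables, keys and attributes below are invented and do not reflect the actual profile structures.

#!/bin/ksh
# Hypothetical SCD-history sketch (Type 2 style): expire profile rows whose
# attributes changed in today's feed, then insert the new versions.
# Database, table and column names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd

/* Expire the currently open row for customers whose attributes changed */
UPDATE tgt
FROM prof_db.customer_profile_hist AS tgt,
     stg_db.customer_profile_stg   AS stg
SET eff_end_dt = CURRENT_DATE - 1
WHERE tgt.customer_id = stg.customer_id
  AND tgt.eff_end_dt  = DATE '9999-12-31'
  AND (tgt.risk_rating <> stg.risk_rating
       OR tgt.segment_cd <> stg.segment_cd);

/* Insert a new open-ended version for new or changed customers */
INSERT INTO prof_db.customer_profile_hist
  (customer_id, risk_rating, segment_cd, eff_start_dt, eff_end_dt)
SELECT stg.customer_id, stg.risk_rating, stg.segment_cd,
       CURRENT_DATE, DATE '9999-12-31'
FROM stg_db.customer_profile_stg stg
LEFT JOIN prof_db.customer_profile_hist tgt
  ON  tgt.customer_id = stg.customer_id
  AND tgt.eff_end_dt  = DATE '9999-12-31'
WHERE tgt.customer_id IS NULL;

.LOGOFF
.QUIT
EOF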
Confidential
Teradata Informatica Developer
Environment: Teradata, Informatica PowerCenter, Informatica PowerExchange, UNIX, Autosys, Syncsort DMX-h, HDFS, MapReduce, Oracle
Responsibilities:
- Prepared High Level (HLD) and Low Level Design (LLD) documents based on Business Requirement Document.
- Developed Informatica Workflows for sourcing data from various sources to stage tables.
- Processed data using various Informatica Transformations (Source Qualifier, Joiner, Lookup, Filter, Router, Aggregator, Expression, Union, Update Strategy, Sorter, Rank etc.) and loaded target tables as well as created files.
- Transmitted files to other environments using NDM and SFTP.
- Developed Syncsort DMX-h Jobs to process and store files in HDFS.
- Developed BTEQs to transform data as per requirement.
- Developed Shell Scripts to facilitate file processing, implement controls, invoke BTEQs, and trigger Informatica Workflows and DMX Jobs.
- Scheduled jobs in Autosys using JIL scripts (a sample JIL sketch follows this list).
- Captured Teradata Metrics and tuned queries that were non-compliant as per platform standards.
- Prepared Unit Test Plan and Unit Test Cases for validating the code modules.
- Validated migrated components, provided deployment support and monitored jobs during production deployment.
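A minimal, hypothetical illustration of the Autosys scheduling referenced above: a JIL definition loaded through the jil command from a UNIX shell. The job names, machine, owner, paths and upstream dependency are all placeholders.

#!/bin/ksh
# Hypothetical JIL sketch: a command job that runs a load wrapper script once
# its upstream file-ingest job succeeds. All names and paths are placeholders.
jil <<'EOF'
insert_job: APP_STG_CUST_LOAD   job_type: CMD
command: /app/etl/scripts/run_cust_stage_load.sh
machine: etlhost01
owner: etlbatch
condition: success(APP_STG_CUST_FILE_INGEST)
std_out_file: /app/etl/logs/APP_STG_CUST_LOAD.out
std_err_file: /app/etl/logs/APP_STG_CUST_LOAD.err
alarm_if_fail: 1
EOF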
Confidential
Teradata Informatica Developer
Environment: Teradata, Informatica PowerCenter, Informatica PowerExchange, UNIX, Autosys
Responsibilities:
- Developed Informatica Workflows for sourcing data from various sources to stage tables using Teradata API and Utilities.
- Used Informatica PowerExchange to pull data from Mainframe Legacy systems.
- Used various transformations like Source Qualifier, Joiner, Lookup, Filter, Router, Aggregator, Expression, Union, Update Strategy, Sorter, Rank to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created mapplets and used them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Worked on different tasks in Workflows like sessions, e-mail, command, worklets.
- Transmitted files to other environments using NDM and SFTP.
- Developed BTEQs to transform data as per requirement.
- Developed Shell Scripts to facilitate file processing, implement controls, invoke BTEQs and trigger Informatica Workflows (a sample wrapper sketch follows this list).
- Scheduled jobs in Autosys using JIL scripts.
- Captured Teradata Metrics and tuned queries that were non-compliant as per platform standards.
- Prepared Unit Test Plan and Unit Test Cases for validating the code modules.
- Validated migrated components, provided deployment support and monitored jobs during production deployment.
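A rough sketch of the wrapper-script pattern mentioned above, assuming a BTEQ transform followed by a pmcmd call to trigger the downstream Informatica workflow. The integration service, domain, folder, workflow, paths and credential variables are hypothetical.

#!/bin/ksh
# Hypothetical wrapper sketch: run a BTEQ transform, check its return code,
# then trigger the downstream Informatica workflow with pmcmd.
# Paths, service/domain/folder/workflow names and credentials are placeholders.
BTEQ_SCRIPT=/app/etl/bteq/transform_customer.btq
LOG_DIR=/app/etl/logs

bteq < "$BTEQ_SCRIPT" > "$LOG_DIR/transform_customer.log" 2>&1
if [ $? -ne 0 ]; then
    echo "BTEQ transform failed, aborting" >&2
    exit 1
fi

pmcmd startworkflow \
    -sv INT_SVC_DEV -d Domain_Dev -u "$INFA_USER" -p "$INFA_PWD" \
    -f CUST_FOLDER -wait wf_load_customer_stage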
Confidential
Data Modeler
Environment: UNIX, IBM Infosphere Data Architect, Oracle
Responsibilities:
- Identified the key fields and entities used in the legacy reports for a particular subject area using shell scripts.
- Developed Conceptual, Logical and Physical Data Models after analyzing the relationships between these entities.
- Developed mapping spreadsheets with source-to-target data mappings, physical naming standards, data types and metadata definitions.
- Implemented the data model in the Oracle database after removing all the redundancies identified.
- Developed end-user views for these entities on the new platform.
Confidential
SAS ETL Developer
Environment: MS SQL Server, Base SAS, SAS DI Studio, SAS Management Studio, UNIX
Responsibilities:
- Developed SAS Data Integration jobs to source data from MS SQL Server source systems to SAS Datasets.
- Developed SAS Data Integration jobs to process and transform data from SAS Datasets and create target files, which were transmitted downstream using NDM.
- Scheduled jobs using SAS Management Studio.
- Prepared Unit Test Plan and Unit Test Cases and tested the code modules developed.
Confidential
Environment: MATLAB
Responsibilities:
- Designing and developing the application with a team consisting of 4 undergraduate students.
- Reviewing the design with professors and validating the design approach.
- Developing code modules in MATLAB for image noise reduction, number plate extraction and character extraction.
- Training the neural network.
- Testing the trained network using various image inputs.
- Developing the user interface.
- Preparing detailed documentation of the project.