
Sr. Informatica/Teradata Developer Resume


Atlanta, GA

PROFESSIONAL SUMMARY:

  • 7+ years of extensive ETL (Extract, Transform, Load) experience in Data Integration, Data Warehousing, Data Migration, Data Management, and Data Cleansing using Informatica and SSIS.
  • Expertise in Data Warehousing concepts such as Star Schema, Snowflake Schema, Fact Tables, Dimension Tables, OLAP/OLTP, logical data modeling, physical modeling, dimensional and multidimensional modeling, data profiling, and data cleansing.
  • Solid experience in designing and developing complex mappings to extract data from diverse legacy sources including CSV files, flat files, fixed-width files, delimited files, XML files, web services, Teradata, Oracle, MS SQL Server, DB2, Netezza, Sybase, Salesforce, SAP, and FTP into a common reporting and analytical data model using Informatica PowerCenter and SSIS.
  • Strong knowledge of database management concepts like conceptual, logical and physical data modeling and data definition, population and manipulation.
  • Proficiency in designing and developing ETL objects using Informatica PowerCenter with various transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Sorter, Sequence Generator, Router, Rank, Normalizer, and B2B transformations.
  • Widely used different features of Teradata such as BTEQ, FastLoad, MultiLoad, SQL Assistant, and DDL and DML commands, with a very good understanding of Teradata UPI and NUPI, secondary indexes, and join indexes.
  • Very good understanding of Teradata's MPP architecture: shared-nothing design, Nodes, AMPs, BYNET, partitioning, primary indexes, etc.
  • Experience working on the Apache Hadoop ecosystem with Informatica BDE; very good knowledge of manipulating and handling HDFS files from Linux via PuTTY and querying Hive databases using Hue and Teradata SQL Assistant.
  • Extensively created and used various Teradata SET tables, MULTISET tables, global temporary tables, volatile tables, and temporary tables.
  • Experience in Informatica B2B Data Exchange using Unstructured, Structured Data sets.
  • Widely used the unstructured data option of Informatica B2B Data Exchange (PDF files, spreadsheets, Word documents, legacy formats, and print streams) to produce normalized data.
  • Experience in mainframe application development with TSO, COBOL II, JCL, DB2, SQL, SPUFI, QMF, IMS, IDMS, CICS, and VSAM.
  • Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC).
  • Broad knowledge of data analysis and data profiling using SQL.
  • Experience working with data anonymization, healthcare claims, healthcare enrollments, electronic health records, pharmacy data and other data sets involving HIPAA and high compliance standards.
  • Implemented Slowly Changing Dimension (Type 1, 2, and 3) methodologies for accessing the full history of accounts and transaction information (see the SQL sketch at the end of this summary).
  • Designed and developed Change Data Capture (CDC) solutions for the project, which capture and analyze changes from daily feeds to maintain history tables.
  • Good experience working with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Experience working with Informatica Data Quality Developer/Analyst tools to remove noise from data using transformations such as Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup, and Decision.
  • Knowledge of query performance tuning using Explain, Collect Statistics, compression, NUSI, and join indexes, including sparse join indexes.
  • Extensively worked with Teradata Viewpoint for performance monitoring and tuning.
  • Experience in programming with SQL, PL/SQL (stored procedures, functions, cursors, and database triggers), BTEQ, and T-SQL.
  • Excellent knowledge of and experience in creating and performance-tuning high-volume databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, cursors, and indexes.
  • Strong Teradata SQL experience developing ETL with complex, tuned queries, including analytical functions and BTEQ scripts.
  • Extensively used mapping variables, mapping parameters, and dynamic parameter files for improved performance and increased flexibility, and worked with XML sources and targets.
  • Knowledge of pushdown optimization concepts and tuning Informatica objects for optimum execution times.
  • Experienced with identifying Performance bottlenecks and fixing code for Optimization in Informatica and Oracle.
  • Extensive experience in implementation of Data Clean-up Procedures, Transformation, Scripts, Stored Procedures and execution of Test plans for loading the data successfully into Targets.
  • Experience in Normalization and De-Normalization techniques for both OLTP and OLAP systems in creating Database Objects like tables, Constraints (Primary key, Foreign Key, Unique, Default), Indexes.
  • Good working experience and knowledge on Cloud computing with AWS tools and Hadoop Big Data Concepts with emphasis on ETL solutions using Hadoop ecosystem technologies.
  • Involved in requirements gathering from users through meetings, understanding the business from both end-user and technical perspectives.
  • Experience with industry-standard methodologies such as Agile and Scrum within the Software Development Life Cycle (SDLC).
  • Results-oriented self-starter seeking challenges, with the ability to rapidly learn and apply new technologies, and good interpersonal skills.
  • Excellent written and verbal communication skills and experience in interacting with business owners both formally and informally.
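
For illustration, a minimal Teradata-style SQL sketch of the Type 2 pattern referenced above (expire the current row, then insert the new version). The table and column names (DIM_ACCOUNT, STG_ACCOUNT, eff_end_dt, current_flag) are assumptions for illustration only, not actual project objects:

    -- Assumed objects: STG_ACCOUNT (daily feed) and DIM_ACCOUNT (Type 2 dimension).
    -- Step 1: expire the open dimension row when a tracked attribute has changed.
    UPDATE dim
    FROM DIM_ACCOUNT dim, STG_ACCOUNT stg
    SET eff_end_dt   = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE dim.account_id = stg.account_id
      AND dim.current_flag = 'Y'
      AND (dim.account_status <> stg.account_status
           OR dim.account_type <> stg.account_type);

    -- Step 2: insert a new current version for changed or brand-new accounts.
    INSERT INTO DIM_ACCOUNT
        (account_id, account_status, account_type, eff_start_dt, eff_end_dt, current_flag)
    SELECT stg.account_id, stg.account_status, stg.account_type,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM STG_ACCOUNT stg
    LEFT JOIN DIM_ACCOUNT dim
      ON dim.account_id = stg.account_id
     AND dim.current_flag = 'Y'
    WHERE dim.account_id IS NULL;   -- no open row: either new, or just expired in Step 1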

TECHNICAL SKILLS:

ETL Tools: Informatica 9.6.1/9.6/9.1/9/8/7, Informatica Cloud, Informatica B2B, Informatica PowerExchange, Informatica Big Data Edition 9.6, Informatica IDQ, MS SSIS 2012/2008

Relational Databases: Teradata 14.10/14/13.10/13, Oracle 11g/10g/9i, MS SQL Server 2014/2012/2008, Netezza, DB2, MS Access

NoSQL Databases: MongoDB, Cassandra

Teradata Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPump, TPT, SQL Assistant, Teradata Manager, Teradata Viewpoint

BI Tools: Tableau, Microstrategy, SSRS, Business Objects.

Big Data Technologies: Hadoop, Spark, HDFS, Map Reduce, Hive, Pig, HBase, Sqoop, Oozie

Data Modeling: Erwin, MS Visio

Scripting Languages: BTEQ, PL/SQL, T-SQL, Shell Scripts

OLAP Tools: SSAS

Programming Languages: SQL, HTML, XML, C, Java, Python, Ruby, Scala

Scheduling Tools: Tidal, Autosys

DB Query Tools: Toad, SQL Developer, SQL Assistant

Business Management Tools: MS Office, MS Excel, MS Visio

Environments: Windows 7/2003/XP/2000/NT/98/95, UNIX (Sun Solaris, HP-UX, AIX), Linux

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Sr. Informatica/Teradata Developer

Responsibilities:

  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing, data transformation, data loading, SQL, and performance tuning.
  • Extracted data from DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database by using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and T-Pump.
  • Designed and developed a number of complex mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup, Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.
  • Created reusable Sessions, Workflows, Worklets, and Assignment, Decision, Event Wait, Event Raise, and Email tasks, and scheduled tasks based on client requirements.
  • Used Informatica B2B Data Exchange to handle EDI (Electronic Data Interchange) for handling payments on the scheduled dates.
  • Worked with Informatica Cloud to create source and target connections and objects, develop source-to-target mappings, and monitor and synchronize data with Salesforce.
  • Experience with Informatica BDE work on HDFS, Hive, Oozie, Spark, and Sqoop.
  • Used Linux HDFS commands to manipulate and validate ETL output, and extracted Hadoop HDFS data via PuTTY for unit testing.
  • Used Informatica debugger to test the data flow and fix the mappings.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Worked on profiling the source data to understand it and perform data quality checks using Informatica Data Quality, and loaded the cleansed data into landing tables.
  • Worked with Informatica Data Quality Developer/Analyst tools to remove noise from data using transformations such as Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup, and Decision.
  • Involved in creating unit test plans and testing the data for various applications.
  • Loaded data between traditional RDBMS and Hive/HDFS sources and targets, and pushed the ETL logic down to the Hadoop cluster.
  • Developed new mappings and modified existing mappings for new business requirements, loading data into staging tables and then into target tables in the EDW; also created mapplets for reuse across different mappings.
  • Maintained Development, Test, and Production mappings and migrations using Repository Manager, which was also used to maintain metadata, security, and reporting.
  • Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key relationships, complex joins, and building efficient views.
  • Conducted business requirement review sessions with IT and business teams to understand the expected outcome based on data elements.
  • Developed scripts for loading data into the base tables in the EDW, and for loading data from source to staging and from the staging area to target tables, using the FastLoad, MultiLoad, and BTEQ utilities of Teradata (see the SQL sketch after this list).
  • Writing scripts for data cleansing, data validation, data transformation for the data coming from different source systems.
  • Tuned Teradata SQL statements using Explain, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; expertise in SQL queries for cross-verification of data.
  • Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
  • Extensively worked on performance tuning to increase the throughput of the data load (e.g., reading data from flat files and writing data into target flat files to identify bottlenecks).
  • Worked on Error handling and performance tuning in Teradata queries and utilities.
  • Writing SQL scripts to extract data from the database for testing purposes.
  • Interacting with the source team and the business to validate the data.
  • Involved in resolving issue tickets related to data issues, ETL issues, performance issues, etc.
  • Used UNIX shell scripting to automate several ETL processes.
  • Created new UNIX scripts to automate and handle different file processing, editing, and execution sequences, using basic UNIX commands and the 'awk' and 'sed' editing languages.
  • Created UNIX shell scripts for Informatica post- and pre-session operations, database administration, and day-to-day activities such as monitoring network connections and database ping utilities.
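
As an illustration of the staging-to-base loads and statistics collection described above, a simplified Teradata SQL fragment of the kind embedded in the BTEQ load scripts; the schema, table, and column names (STG.STG_CLAIMS, EDW.EDW_CLAIMS, claim_id) are placeholders, not actual project objects:

    -- Incremental insert from staging to the EDW base table (placeholder names).
    INSERT INTO EDW.EDW_CLAIMS (claim_id, member_id, claim_amt, service_dt, load_dt)
    SELECT stg.claim_id,
           stg.member_id,
           CAST(stg.claim_amt AS DECIMAL(18,2)),
           stg.service_dt,
           CURRENT_DATE
    FROM STG.STG_CLAIMS stg
    LEFT JOIN EDW.EDW_CLAIMS tgt
      ON tgt.claim_id = stg.claim_id
    WHERE tgt.claim_id IS NULL;          -- load only claims not already in the base table

    -- Refresh optimizer statistics on the join/index columns after the load.
    COLLECT STATISTICS COLUMN (claim_id), COLUMN (member_id) ON EDW.EDW_CLAIMS;

    -- Check AMP data distribution and the join plan before promoting the query.
    EXPLAIN SELECT member_id, SUM(claim_amt) FROM EDW.EDW_CLAIMS GROUP BY member_id;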

Environment: Informatica Power Center 9.6.1/9.6, Informatica Cloud, Linux, Teradata 14.10/14, Hive, Informatica B2B Exchange, Informatica Big Data Edition, Oracle EBS, Control-M, Oracle 11g, Restful API, uDeploy, fixed width files, TOAD, SQL Assistant, Git, Jenkins, Unix, Korn Shell Scripting, Autosys r11.1, SAS, Tableau 9/10, Business Objects, Flat Files.

Confidential,Dayton, OH

Sr. ETL Developer

Responsibilities:

  • Worked on Teradata and its utilities (TPump, FastLoad) through Informatica; also created complex Teradata macros.
  • Responsible for requirements gathering for an enhancement requested by the client; involved in analysis and implementation of an intranet-based information management system.
  • Analyzed, designed, and developed ETL strategies and processes, wrote ETL specifications, performed Informatica development and administration, and mentored other team members.
  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (file and database) to develop robust mappings in the Informatica Designer.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked with Informatica BDE using Blaze to convert mapping logic to MapReduce and run it on Hive.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used DT Studio for B2B exchange of HL7 messages into our source landing area.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked on reusable code known as tie-outs to maintain data consistency; it compared the source and target after ETL loading completed to validate that no data was lost during the ETL process.
  • Implemented Teradata MERGE statements to update huge tables, thereby improving the performance of the application (see the SQL sketch after this list).
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Migrated the code into QA (testing) and supported the QA team and UAT (user acceptance testing).
  • Worked with PowerCenter versioning (check-in, check-out), querying to retrieve specific objects, and maintaining the history of objects.
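
A minimal sketch of the Teradata MERGE upsert pattern mentioned above; the target and staging names (EDW.CUSTOMER_BAL, STG.CUSTOMER_BAL_DELTA) and the assumption that customer_id is the target's primary index are illustrative only:

    -- Upsert the delta feed into the large target table (placeholder names).
    MERGE INTO EDW.CUSTOMER_BAL tgt
    USING STG.CUSTOMER_BAL_DELTA src
       ON tgt.customer_id = src.customer_id      -- must cover the target's primary index
    WHEN MATCHED THEN UPDATE SET
           balance_amt = src.balance_amt,
           updt_ts     = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
           (customer_id, balance_amt, updt_ts)
    VALUES (src.customer_id, src.balance_amt, CURRENT_TIMESTAMP);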

Environment: Informatica 9.6/9.5.1/9.5 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Oracle 11G, Teradata 14, Informatica B2B Exchange, Informatica Big Data Edition 9.6, Tableau 9, UNIX, Citrix, Toad, Putty, PL/SQL Developer, Microstrategy.

Confidential, Indianapolis, IN

ETL Developer

Responsibilities:

  • Moved data from source systems to different schemas based on the dimension and fact tables, using Slowly Changing Dimension Type 1 and Type 2 logic.
  • Mostly worked on dimensional data modeling, Star Schema, and Snowflake Schema modeling.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to identify changed rows (see the SQL sketch after this list).
  • Worked on reusable code known as tie-outs to maintain data consistency; it compared the source and target after ETL loading completed to validate that no data was lost during the ETL process.
  • Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator, and PowerExchange bulk data movement; PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
  • Worked on Informatica B2B Data Exchange to parse HL7 files.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote Unix Shell Scripts to process the data received from source system on daily basis.
  • Involved in the continuous enhancements and fixing of production problems.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Created PL/SQL scripts to extract the data from the operational database into simple flat text files using the UTL_FILE package.
  • Supported the development and production support group in identifying and resolving production issues.
  • Was instrumental in understanding the functional specifications and helped the rest of the team to understand the same.
  • Involved in all phases of development: Analysis, Design, Coding, Unit Testing, System Testing, and UAT.
  • Implemented the DBMOVER configuration file (dbmover.cfg) to configure the operation of various PowerExchange tasks as well as their communication with other PowerExchange tasks.
  • Applied different types of monitoring for PowerExchange processes, such as dtlcacon, dtllst, Oracle SCN, lag scripts, and a heartbeat table to monitor lag.
  • Created and managed different PowerExchange directories, such as the condense files directory and the checkpoint directory.
  • Developed wrapper shell scripts for calling Informatica workflows using the pmcmd command, and created shell scripts to fine-tune the ETL flow of the Informatica workflows.
  • Implemented Teradata MERGE statements to update huge tables, thereby improving the performance of the application.
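
A simplified sketch of the checksum-style change detection referenced above, written here with Teradata's HASHROW function; the actual CHKSUM mechanism used on the project, and the staging/dimension tables and columns shown, are assumptions for illustration:

    -- Flag inserts and updates when the feed carries no change flag or date column.
    SELECT src.member_id,
           CASE WHEN tgt.member_id IS NULL THEN 'INSERT' ELSE 'UPDATE' END AS cdc_action
    FROM STG.MEMBER_FEED src
    LEFT JOIN EDW.MEMBER_DIM tgt
      ON tgt.member_id    = src.member_id
     AND tgt.current_flag = 'Y'
    WHERE tgt.member_id IS NULL                                           -- new member
       OR HASHROW(src.first_nm, src.last_nm, src.addr_line1, src.zip_cd)
       <> HASHROW(tgt.first_nm, tgt.last_nm, tgt.addr_line1, tgt.zip_cd); -- attributes changed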

Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Windows, Oracle 11G, Teradata 14/13.10, Tableau, UNIX, Putty, PL/SQL Developer, Power Exchange.

Confidential

ETL/Tableau Developer

Responsibilities:

  • Worked as an ETL and Tableau developer, widely involved in designing, developing, and debugging ETL mappings using the Informatica Designer tool, and created advanced chart types, visualizations, and complex calculations to manipulate data using Tableau Desktop.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Worked closely with Business Analyst in understanding the requirements.
  • Extracted data from DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database by using various Teradata load utilities.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and T-Pump.
  • Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport (see the SQL sketch after this list).
  • Extensively used data blending to join multiple data sources.
  • Performed unit, integration, and system-level performance testing; worked with the production support team on various performance-related issues.
  • Created mappings using Aggregator, Expression, Joiner, Filter, Sequence Generator, Stored Procedure, Connected and Unconnected Lookup, and Update Strategy transformations in Informatica PowerCenter Designer.
  • Extensively used ETL to load data from different sources such as flat files, XML to Oracle.
  • Implemented advanced geographic mapping techniques and used custom images and geocoding to build spatial visualizations of non-geographic data.
  • Worked with all levels of development from analysis through implementation and support.
  • Resolved end user reporting problems through collaboration with IT and Operations.
  • Developed reports that deliver data from cubes.
  • Responsible for ongoing maintenance and change management to existing reports and optimize report performance.
  • Provided production support by monitoring the processes running daily.
  • Provided 24/7 on-call production support for various applications, provided resolution for night-time production job abends, and attended conference calls with business operations and system managers to resolve issues.
  • Worked on mapping parameters and variables for the calculations done in aggregator transformation.
  • Tuned and monitored Informatica workflows using the Informatica Workflow Manager and Workflow Monitor tools.
  • Created, scheduled, and configured workflows, worklets, and sessions using Informatica Workflow Manager.
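
An illustrative Teradata SQL fragment of the kind carried in the BTEQ business-logic scripts noted above, using ordered analytical functions; the EDW.ACCOUNT_TXN table and its columns are placeholders rather than actual project objects:

    -- Running balance and recency ranking per account over the last 30 days.
    SELECT account_id,
           txn_dt,
           txn_amt,
           SUM(txn_amt) OVER (PARTITION BY account_id
                              ORDER BY txn_dt
                              ROWS UNBOUNDED PRECEDING) AS running_balance,
           ROW_NUMBER() OVER (PARTITION BY account_id
                              ORDER BY txn_dt DESC)     AS recency_rank
    FROM EDW.ACCOUNT_TXN
    WHERE txn_dt BETWEEN CURRENT_DATE - 30 AND CURRENT_DATE;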

Environment: Informatica Power Center 8.1, Linux, Tableau 6/7, Teradata 13, Oracle 10g, Toad, Erwin, UNIX and Windows.

Confidential

ETL/Teradata Developer

Responsibilities:

  • Developed mapping parameters and variables to support SQL override.
  • Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (file and database) to develop robust mappings in the Informatica Designer.
  • Worked on Teradata and its utilities - TPump and FastLoad - through Informatica.
  • Implemented Pushdown Optimization (PDO) to address performance issues in complex mappings whose numerous transformations degraded session performance.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Exhaustive testing of developed components.
  • Worked on the various enhancements activities, involved in process improvement.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer, Workflow Manager, and Workflow Monitor.
  • Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to identify changed rows.
  • Compared the source and target after ETL loading completed to validate that no data was lost during the ETL process (see the SQL sketch after this list).
  • Worked on Ab Initio to replicate the existing code in Informatica.
  • Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Migrated the code into QA (testing) and supported the QA team and UAT (user acceptance testing).
  • Worked with PowerCenter versioning (check-in, check-out), querying to retrieve specific objects, and maintaining the history of objects.
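
A minimal sketch of the post-load source-to-target comparison described above; the schemas, tables, and measure (STG.DAILY_TXN, EDW.FACT_TXN, txn_amt) are assumptions used only to illustrate the reconciliation idea:

    -- Row counts and amount totals should match between source and target after the load.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
    FROM STG.DAILY_TXN
    UNION ALL
    SELECT 'TARGET' AS side, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
    FROM EDW.FACT_TXN
    WHERE load_dt = CURRENT_DATE;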

Environment: Informatica 7.1, Teradata 12, Windows, BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, Oracle 9i, DB2, UNIX, FTP.
