
Senior Teradata / Etl Informatica Power Center Developer Resume


West Chester, PA

SUMMARY

  • Over 8 years of experience in Business Intelligence as a senior Teradata, ETL Informatica PowerCenter and database developer.
  • Extensively used Teradata versions 14/13/12, Informatica PowerCenter 9.6/9.5/9.1/8.6/8.1 and Informatica Data Quality (IDQ) 9.5/9.1 as ETL tools for extracting, transforming and loading data from various source inputs to various targets.
  • Extensive knowledge of data warehouse approaches - top-down (Inmon) and bottom-up (Kimball) - and modeling methodologies - star schema and snowflake.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump and Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database.
  • Extensively used Fastexport to export data out of Teradata tables.
  • Created BTEQ scripts to invoke various load utilities, transform data and query against the Teradata database (a sketch of such a script follows this list).
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Worked with various user groups and developers to define TASM Workloads, developed TASM exceptions, implemented filters and throttles as needed.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes and hash indexes in the Teradata database.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions
  • Excellent experience scheduling and running Tivoli Workload Scheduler (TWS v8.4) job streams and jobs requested by application support; created streams and jobs for day and night batch runs.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace both in Teradata as well as Oracle.
  • Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Good knowledge on Teradata Manager, TDWM, PMON, DBQL.
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
  • Expertise in transforming data imported from disparate data sources into analysis data structures using SAS functions, options, ODS, array processing and the macro facility, and in storing and managing data in SAS data files. Experience working with the Teradata PDCR utility.
  • Extensive experience in integrating data from flat files - fixed width, delimited, XML, WSDL, Web Services by using various transformations available in Informatica such as - Source qualifier, XML parser, and Web services consumer transformation.
  • Extensively used various Informatica Powercenter and Data quality transformations such as - source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, web services consumer transformation, XML Parser, address validator, comparison, consolidation, decision, parser, standardizer, match, merge to perform various data loading and cleansing activities.
  • Extensively used data masking transformation for masking / scrubbing various sensitive fields such as social security number, credit card number, agreement numbers etc.
  • Integrated Informatica data quality mappings with Informatica powercenter.
  • Extensively used various performance techniques in Informatica such as - partitioning, tuning at source/target/transformation, usage of persistent cache, replacing transformations that use cache wherever possible.
  • Created complex mapplets to be shared among team members.
  • Extensive experience in developing various tasks, workflows, worklets, mapplets and mappings.
  • Extensive experience in invoking 3rd party web services using Web Service consumer transformation, enabled Informatica workflows as web services to be invoked by different client programs.
  • Experience in testing 3rd party web services using the Try-It option as well as with Java programs.
  • Extensive knowledge on WSDL, XML, SOAP messages and web services.
  • Generated explain plans to identify bottlenecks, path of the query, cost of the query, broadcasting in partitioned database, indexes that are getting picked.
  • Extensive knowledge of different types of dimension tables - type 1, type 2, junk, conformed, degenerate, role-playing and static dimensions.
  • Created complex PL/SQL programs, stored procedures, functions and triggers.
  • Declared cursors in stored procedures to move data between tables and to stage data temporarily for various DML operations.
  • Extensive experience in writing UNIX scripts to invoke Informatica workflows using the pmcmd command and to validate source-to-target record counts (see the sketch after this list).
  • Created complex unix scripts to perform cleanse and purging of source staging data due to data anomalies.
  • Created complex UNIX scripts using awk, sed and arrays to update Informatica parameter files with the correct process dates and connection string details in parallel (a sed sketch follows this list).
  • Extensive knowledge of Business Objects; updated the existing universe with new data objects and classes using Universe Designer, built joins between new tables and used contexts to avoid loops and Cartesian products.
  • Worked extensively on QlikView (QlikView Desktop / extract layer only); created complex queries and wrapped them in QVW documents to load data into QlikView in-memory and create QVD (QlikView data) files for further use by the join and QlikView user-interface layers.
  • Knowledge on Qlikview publisher.
  • Extensive knowledge on scheduling tools - Control-M, Autosys, Tivoli (TWS) and CRON.
  • Extensively used Control-M enterprise manager to schedule jobs, perform initial data loads, data copy from one environment to another when the environment is initially setup.
  • Extensive experience in writing Autosys JIL scripts.
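For illustration, here is a minimal sketch of the kind of BTEQ wrapper script referenced in the BTEQ bullet above; the TDPID, logon, database, table and file names are placeholders rather than details from any specific project:

    #!/bin/ksh
    # Run a BTEQ step that transforms staged data and loads a target table.
    # Logon string and object names below are placeholders.
    bteq <<EOF > bteq_load_orders.log 2>&1
    .LOGON tdprod/etl_user,etl_password;
    .SET ERROROUT STDOUT;

    INSERT INTO dw_db.dw_orders (order_id, order_dt, order_amt)
    SELECT order_id,
           CAST(order_dt AS DATE FORMAT 'YYYY-MM-DD'),
           SUM(order_amt)
    FROM   stg_db.stg_orders
    GROUP  BY 1, 2;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?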
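Similarly, a hedged sketch of a wrapper that starts an Informatica workflow with pmcmd and then performs a simple source-versus-target count check, as described in the UNIX scripting bullet above; the domain, service, folder, workflow and table names are hypothetical:

    #!/bin/ksh
    # Start the workflow and wait for it to finish.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
          -f DWH_FOLDER -wait wf_load_orders || exit 1

    # Post-load validation: fail the job when source and target counts differ.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password;
    SELECT 'COUNT MISMATCH'
    FROM  (SELECT COUNT(*) AS cnt FROM stg_db.stg_orders) src,
          (SELECT COUNT(*) AS cnt FROM dw_db.dw_orders)  tgt
    WHERE src.cnt <> tgt.cnt;
    .IF ACTIVITYCOUNT > 0 THEN .QUIT 12;
    .LOGOFF;
    EOF
    exit $?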
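And a small sketch of the sed-based parameter-file refresh mentioned above; the parameter file path and the $$PROCESS_DATE parameter name are made-up examples:

    #!/bin/ksh
    # Stamp today's run date into an Informatica parameter file before the workflow starts.
    PARAM_FILE=/infa/params/wf_load_orders.parm      # hypothetical path
    PROCESS_DT=$(date +%Y-%m-%d)

    # Rewrite the $$PROCESS_DATE entry, keeping a backup of the previous file.
    cp "$PARAM_FILE" "${PARAM_FILE}.bak"
    sed 's|^\$\$PROCESS_DATE=.*|$$PROCESS_DATE='"${PROCESS_DT}"'|' \
        "${PARAM_FILE}.bak" > "$PARAM_FILE"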

PROFESSIONAL EXPERIENCE

Confidential, West Chester, PA

Senior Teradata / ETL Informatica Power center Developer

Responsibilities:

  • Analyzed requirements and prepared functional specifications through discussions with business user groups; translated business requirements into documented source-to-target mappings and ETL specifications.
  • Solid experience in performance tuning of Teradata SQL queries and Informatica mappings.
  • Worked on Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint and BTEQ scripts.
  • Experience in migrating the project's application interfaces.
  • Implemented PL/SQL queries, triggers and Stored Procedures as per the design and development related requirements of the project.
  • Handled file transfers through SFTP scripts run from Informatica, with UNIX shell scripts that email clients on success or failure, per the business requirements (a sketch follows this list).
  • Extensively used Teradata utilities such as BTEQ, MultiLoad and FastLoad scripts to load large volumes of data (see the FastLoad sketch after this list).
  • Designed ETL processes for optimal performance.
  • Worked on moving the code from DEV to QA to PROD and performed complete Unit Testing, Integration Testing, user Acceptance Testing for the three environments while moving the code.
  • Worked with Source Analyzer, Warehouse Designer, Transformation designer, mapping designer and Workflow Manager to develop new Mappings & implement data warehouse.
  • Other Teradata duties included managing workloads and performance using Teradata TASM, Teradata Dynamic Workload Manager and Viewpoint, and managing Viewpoint (defining portlets, managing access, providing Viewpoint training to users and creating alerts).
  • Scheduled Informatica jobs to trigger BTEQ scripts using the UC4 job scheduler.
  • Automated SAS and Teradata jobs using Korn shell scripting and staged flat-file data in UNIX environments to support ETL processes.
  • Performed system level and application level tuning and supported the application development teams for database needs and guidance using tools and utilities like explain, visual explain, PMON, DBC views.
  • Developed, tested and reviewed complex Ab Initio graphs, sub-graphs, DML, psets, XFRs, deployed scripts and DBC files for connectivity, and created packages and exports.
  • Extensively worked on reading of data from DB2, Netezza and loading of data into Teradata datamart.
  • Created Parameter files to pass database connections, parameter entries for source and target.
  • Implemented Target Load Plan and Pre Session and Post Session Scripts.
  • Created test cases and test scripts to validate data.
  • Created complex Informatica mappings and re-usable transformations, and prepared various mappings to load data into different stages such as landing, staging and target tables.
  • Worked with cross-functional teams to resolve the issues.
  • Used the debugger in Informatica to test and fix the mappings.
  • Used various transformations including Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, Update Strategy for designing and Optimizing.
  • Involved in scheduling, creating snapshots and subscriptions for the reports using SSRS
  • Created various tasks like Session, Command, Timer and Event wait.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to improve performance by decreasing workflow run times.
  • Involved in the defect analysis for UAT environment along with users to understand the data and to make any modifications to code.
  • Prepared the recovery process in case of workflow failure due to database issues or network issues.
  • Conducted thorough code reviews to ensure that the outcome was in line with the objective and that all processes and standards were followed.
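As a small illustration of the MLOAD/FLOAD work above, here is a hedged sketch of a FastLoad control script; the logon, table, columns and file path are placeholders, not actual project objects:

    #!/bin/ksh
    # Bulk-load a pipe-delimited flat file into an empty staging table with FastLoad.
    fastload <<EOF > fload_stg_claims.log 2>&1
    LOGON tdprod/etl_user,etl_password;
    DATABASE stg_db;
    SET RECORD VARTEXT "|";
    DEFINE claim_id   (VARCHAR(20)),
           member_id  (VARCHAR(20)),
           claim_amt  (VARCHAR(20))
    FILE = /data/incoming/claims.dat;
    BEGIN LOADING stg_claims ERRORFILES stg_claims_err1, stg_claims_err2;
    INSERT INTO stg_claims (claim_id, member_id, claim_amt)
    VALUES (:claim_id, :member_id, :claim_amt);
    END LOADING;
    LOGOFF;
    EOF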
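And a sketch of the SFTP-with-notification pattern described above; the host, remote directory, file path and distribution list are hypothetical:

    #!/bin/ksh
    # Deliver the daily extract over SFTP and notify the client team of the outcome.
    EXTRACT=/data/outbound/daily_claims_extract.dat      # placeholder path
    MAILTO="client-notify@example.com"                   # placeholder distribution list

    sftp -b - partner_user@sftp.partner.example.com <<EOF
    cd /inbound
    put $EXTRACT
    bye
    EOF

    if [ $? -eq 0 ]; then
        echo "Daily claims extract delivered successfully." |
            mailx -s "SFTP transfer SUCCESS" "$MAILTO"
    else
        echo "Daily claims extract delivery failed; please investigate." |
            mailx -s "SFTP transfer FAILURE" "$MAILTO"
        exit 1
    fi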

Environment: Teradata 14/13.10, Teradata SQL Assistant, Teradata utilities (TASM, MultiLoad, FastLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), BTEQ), SAS 9.3/9.4, Oracle 11g, Netezza, Informatica PowerCenter 9.6/9.1, UC4, UNIX, Autosys, Business Objects XI R2, Linux, Korn shell.

Confidential, Whitehouse Station, NJ

Senior Teradata / ETL Informatica Powercenter Developer

Responsibilities:

  • Worked as a senior Teradata and Informatica PowerCenter developer on a data warehouse initiative, responsible for requirements gathering, preparing mapping documents, designing ETL flows, building complex ETL procedures and developing the strategy to move existing data feeds into the Data Warehouse (DW) and additional target data warehouses.
  • Collaborated with data architects, BI architects and data modeling teams during data modeling sessions.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump, Teradata Parallel Transporter (TPT) to load huge amounts of data into Teradata database.
  • Extensively used FastExport to export data out of Teradata tables (a FastExport sketch follows this list).
  • Created BTEQ scripts to invoke various load utilities, transform the data and query against Teradata database.
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes and hash indexes in the Teradata database.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Performance reporting for capabilities using the Teradata PDCR utility.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace both in Teradata as well as Oracle.
  • Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Good knowledge on Teradata Manager, TDWM, PMON, DBQL.
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
  • Extensively used Informatica transformations - Source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator and normalizer transformations to extract, transform and load the data from different sources into DB2 and Oracle targets.
  • Extensively used ETL Informatica to integrate data feed from different 3rd party source systems - Claims, billing, payments and load into Teradata database.
  • Extensively worked on performance tuning of Informatica mappings.
  • Experience with Crystal Reports.
  • Extensively used XML, XSD / schema files as source files, parsed incoming SOAP messages using XML parser transformation, created XML files using XML generator transformation.
  • Used cursors to copy data between tables whenever new environments were created in the formal environment.
  • Generated explain plans for performance tuning and to identify bottlenecks in long-running queries (see the EXPLAIN sketch after this list).
  • Extensively used PowerExchange for Mainframe to read data from mainframe / VSAM/ COBOL files and load into Teradata tables.
  • Constructed highly optimized SQL queries and Informatica mappings to transform data as per business Rules and load it into target databases.
  • Extensively used Control-M and UC4 scheduling tool to load the charts and run the jobs for initial load of the tables whenever a new environment is created.
  • Owned the defects from production as well as from system testing and worked on the solutions.
  • Extensive knowledge on defect tracking tools - TRAC.
  • Extensively used RMS as version control management tool.
  • Involved in unit testing and prepared documents with job flow information for QA team for their testing purpose and coordinated with QA team for defect fix and regression testing.
  • Prepared implementation documents for every release, worked on initial loads and data catchup process during implementations and provided on-call support for first few days of execution.
  • Extensive experience in PL/SQL programming, stored procedures, functions and triggers.
  • Built UNIX Linux shell scripts for running Informatica workflows, data cleansing, purge, delete, and data loading and for ELT process.
  • Extensively used sed and awk commands for various string-replacement tasks.
  • Worked extensively on QlikView (QlikView Desktop / extract layer only); created complex queries and wrapped them in QVW documents to load data into QlikView in-memory and create QVD (QlikView data) files for further use by the join and QlikView user-interface layers.
  • Knowledge on Business objects, updated existing universe with the new data objects and classes using universe designer, built joins between new tables, used contexts to avoid loops and Cartesians.
  • Created Procedure Document for Implementation procedures for every release for all the UNIX, Informatica Objects and if any catch-up process needed to be done.
  • Provided on-call support for the newly implemented components.
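To illustrate the FastExport usage above, a minimal sketch of an export script; the log table, source table and output file are placeholders:

    #!/bin/ksh
    # Export a pipe-delimited result set from Teradata to a flat file with FastExport.
    fexp <<EOF > fexp_dw_orders.log 2>&1
    .LOGTABLE utl_db.fexp_orders_log;
    .LOGON tdprod/etl_user,etl_password;
    .BEGIN EXPORT SESSIONS 8;
    .EXPORT OUTFILE /data/outbound/dw_orders.dat MODE RECORD FORMAT TEXT;

    SELECT TRIM(order_id) || '|' ||
           TRIM(CAST(order_amt AS VARCHAR(20)))
    FROM   dw_db.dw_orders;

    .END EXPORT;
    .LOGOFF;
    EOF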
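And a sketch of the EXPLAIN / COLLECT STATISTICS pattern used for the tuning work mentioned above; the tables and columns are illustrative only:

    #!/bin/ksh
    # Capture the optimizer plan for a long-running query, then refresh statistics
    # on the join and filter columns it depends on.
    bteq <<EOF > explain_dw_orders.log 2>&1
    .LOGON tdprod/etl_user,etl_password;

    EXPLAIN
    SELECT o.order_id, c.customer_nm
    FROM   dw_db.dw_orders   o
    JOIN   dw_db.dw_customer c
           ON o.customer_id = c.customer_id
    WHERE  o.order_dt >= DATE '2014-01-01';

    COLLECT STATISTICS ON dw_db.dw_orders COLUMN (customer_id);
    COLLECT STATISTICS ON dw_db.dw_orders COLUMN (order_dt);

    .LOGOFF;
    EOF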

Environment: Informatica PowerCenter 9.6/9.5, Teradata 14/13, Oracle 11i, DB2 10.1, LOAD, Ingest, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), Teradata SQL Assistant, TASM, BTEQ, SQL Developer, Erwin, PL/SQL, RMS, Linux, AIX, Netezza, TOAD, WinSQL, PuTTY, UltraEdit, UC4, PowerExchange for Mainframes, XML, Control-M, Enterprise Manager, JIRA, Lotus Notes, UNIX shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3, QlikView 11.2, QlikView Desktop.

Confidential, San Francisco, CA

Teradata Developer / ETL Informatica Powercenter Developer

Responsibilities:

  • Created the design for the extraction process from legacy systems using combined techniques of data replication and Change Data Capture.
  • Completed the gap analysis, which included identifying gaps between the downstream partners' requests and the source system files, and filling those gaps either by rejecting the downstream partners' requests or by requesting additional files from the source system.
  • Worked extensively with Teradata utilities - Fastload, Multiload, Tpump, Teradata Parallel Transporter (TPT) to load huge amounts of data from flat files into Teradata database.
  • Extensively used Fastexport to export data out of Teradata tables.
  • Created BTEQ scripts to invoke various load utilities, transform the data and query against Teradata database.
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, join indexes and hash indexes in the Teradata database.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.
  • Involved in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace both in Teradata as well as Oracle.
  • Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Wrote Teradata macros and used various Teradata analytic functions (a macro sketch follows this list).
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries and correlated subqueries.
  • Worked extensively with Informatica transformations to source, cleanse and parse the data, load into Teradata database.
  • Extensive experience with DB2 load and Ingest external loaders.
  • Created explain plans for long-running queries and worked with DBAs to identify and resolve bottlenecks by adding appropriate indexes.
  • Extensively used Informatica web services Consumer transformation to invoke 3rd party web services on Billing, Payments.
  • Enabled Informatica workflows as web service to be invoked and used by different client systems.
  • Created test Java programs to test the Informatica web services, as well as using the Try-It option on the Web Services Hub.
  • Extensively used XML parser transformations to parse results from Web services output i.e. SOAP messages.
  • Created highly optimized ETL processes to move the data from legacy systems, DB2 and flat files into Oracle database.
  • Extensively used Informatica DVO (Data Validation Option) for data comparison between source and target tables, data validation after workflow completion.
  • Used Informatica DVO to create table pairs, test rules and SQL views for various data validation scenarios, and used the RunTests command to execute the table pairs and send out emails with the status of the data validation.
  • Extensively used Informatica DVO for unit testing in Dev environment.
  • Involved in performance tuning of existing SQL queries.
  • Involved in performance tuning of Informatica mappings.
  • Extensive knowledge on banking products, equities and bonds.
  • Created complex ETL Informatica procedures to load data from legacy systems.
  • Prepared implementation document for moving the code from development to QA to production environments.
  • Worked with QA team and implementation team during different phases of the project.
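As a brief illustration of the Teradata macro work noted above, a hedged sketch; the database, macro and table names are made up for the example:

    #!/bin/ksh
    # Create and exercise a simple parameterized Teradata macro via BTEQ.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password;

    REPLACE MACRO dw_db.get_daily_payments (in_run_dt DATE) AS (
        SELECT payment_id, account_id, payment_amt
        FROM   dw_db.dw_payments
        WHERE  payment_dt = :in_run_dt;
    );

    EXEC dw_db.get_daily_payments (DATE '2012-06-30');

    .LOGOFF;
    EOF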

Environment: Informatica PowerCenter 9.1, Teradata 12, Oracle 11i, DB2 9.1, Ingest, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), BTEQ, Teradata SQL Assistant, Web services, Java, Business Objects XI R2, Linux, Windows XP, SQL, PL/SQL, XML, SQL*Loader, TOAD 9.5, Tivoli scheduler, Control-M, UC4, Korn shell, Erwin.

Confidential - Bloomington, IL

Senior ETL Developer

Responsibilities:

  • Conducted source system analysis and developed the ETL design document to meet business requirements.
  • Developed Informatica Mappings and Workflows to extract data from PeopleSoft, Oracle, CSV files to load into Teradata staging area using FastLoad/Tpump utilities.
  • Developed ETLs to load data from source to 3NF, stage to 3NF, and stage area to work to 3NF, using the Informatica pushdown optimization technique to utilize database processing power.
  • Designed and developed custom Data Quality audits to identify and report the data mismatch between source and target systems and alert Operations Team.
  • Customized existing programs using SAS macros per the statisticians' requirements.
  • Tuned Teradata SQL queries and resolved performance issues caused by data skew and spool space limits (a skew-check sketch follows this list).
  • Created Uprocs, sessions and management units to schedule jobs using $U.
  • Developed flat files from Teradata using FastExport and BTEQ to disseminate data to downstream dependent systems.
  • Coordinated with the offshore project team members on a daily basis on the continuation of tasks and resolution of any issues.
  • Supported System Integration and User acceptance tests to obtain sign off.
  • Provided post-go-live production support and knowledge transfer to the production support team; recoded existing SAS code from the member level to the group/section level to meet specific business needs.
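A short sketch of the kind of data-skew check behind the tuning bullet above; the table and primary index column are placeholders:

    #!/bin/ksh
    # Show how evenly a table's rows hash across AMPs; a single AMP holding a
    # disproportionate share of rows usually points to a poor primary index choice.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password;

    SELECT HASHAMP(HASHBUCKET(HASHROW(member_id))) AS amp_no,
           COUNT(*)                                AS row_cnt
    FROM   stg_db.stg_membership
    GROUP  BY 1
    ORDER  BY 2 DESC;

    .LOGOFF;
    EOF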

Environment: Teradata V2R6/V2R5, SAS/BASE 9.2, Teradata SQL Assistant, BTEQ, MLOAD, ARCMAIN, BO XI R3.1, Teradata Manager, SAS/MACROS, Mainframe DB2, Erwin Designer, UNIX, Windows 2000, Control-M, ClearCase, Shell scripts.

Confidential

ETL Developer/Business Objects Developer

Responsibilities:

  • Involved in documenting functional specifications and other artifacts used for the development of ETL mappings.
  • Worked with the Business Analysts for requirements gathering, business analysis, testing, and project coordination
  • Documented user requirements, translating requirements into system solutions and developing implementation plan
  • Developed core components of the project, including XML handling and XSD validation, and created well-defined views in the pre-staging area and loaded them.
  • Developed a number of complex mappings, mapplets and reusable transformations using Informatica Designer to facilitate daily and monthly data loads.
  • Optimized Performance of existing Informatica workflows.
  • Scheduled Informatica Workflows using workflow manager.

Environment: Oracle 9i, SQL Server 2000, DB2, Informatica Power Center 7.1, Erwin, Cognos, XML, Windows NT/2000, Unix
