
Technical Lead/Informatica Resume


Rhode Island

SUMMARY:

  • 16 years of IT experience in the design, development, and implementation of database-centric intelligence solutions, across all phases of the Software Development Life Cycle.
  • PMP certified; member of the Project Management Institute.
  • Served in Developer, Lead, Data Architect, Manager, and Technical Architect roles. Strong data modeling skills using Erwin Data Modeler; experienced in relational and dimensional modeling, and well versed in dimensional modeling techniques (star and snowflake schemas).
  • Experience managing data warehousing teams.
  • Very strong experience with Informatica PowerCenter. Experienced in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), and Data Marts.
  • Experience setting up an Integration Competence Center (ICC) and writing technical documents.
  • Experience with MS Project and VersionOne software.

TECHNICAL SKILLS:

Data Modeling Tools: Erwin Data Modeler 9.6/9.5/8/7.x

ETL Tools: Informatica PowerCenter 9.6/9.5/8.6/8.5/7.1, ESP

Databases: IBM Netezza, Teradata 12/13/14, Oracle 9i/10g/11g, DB2 UDB 7.1 EEE/EE, MS SQL Server 2000/2005/2008/2012

DB Tools: SQL*Plus, SQL*Loader, TOAD 7.6/9.0.1, Excel Macro, Teradata SQL Assistant

Development Languages: SQL, PL/SQL, Unix Shell Scripting, C/C++, HTML

Operating Systems: Sun Solaris 8.0/9.0, AIX, HP UNIX, SCO UNIX, WINDOWS NT/2000/XP/7

Management software: MS Project 2010 / VersionOne

PROFESSIONAL EXPERIENCE:

Confidential, Rhode Island

Technical Lead/Informatica

Responsibilities:

  • Played the lead role in all Informatica development work. Managed the off-shore development team and conducted meetings covering development, documentation, and issue tracking.
  • Participated in integration meetings with the on-site team regarding Web Services and all Informatica development work. Coordinated requirements with business team members, was responsible for all Informatica deliverables, assigned work to team members, and monitored progress.
  • Worked on Sales (COPA) data related to Actuals and SPU/NSPU breakdowns, and then on breakdowns by GL accounts.
  • Developed maps/sessions/workflows for other subject areas such as NON-ERP, Consensus data, Atlas field forecast, Product attributes, Hierarchy automation Control, and bckflp data.
  • Wrote a Windows script to archive data using 7-Zip. Created complex workflows to email, archive, and generate exceptions, back up exception files, and email them to the various groups handling them.
  • Wrote SQL queries to verify data (see the sketch after this list) and wrote Informatica maps to read/write data from the Longview system using the provided reusable transformations.
  • Wrote ETL specification documents for all development work and reviewed the specs developed by others. Documented all development work and migrated it to the TEST/PROD environments.
  • Completed “Advanced Negotiation Skills” training sessions.
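
A minimal sketch of the kind of data-verification query referenced above, shown here as a shell wrapper around sqlcmd against SQL Server 2012; the server, database, and table names (SALES_STG, SALES_FACT) are hypothetical placeholders, not actual project objects.

    #!/bin/sh
    # Hypothetical sketch: reconcile row counts and summed amounts between a
    # staging table and its loaded target. All names are placeholders.
    SQLSERVER="sqlhost"
    DATABASE="copa_dw"

    sqlcmd -S "$SQLSERVER" -d "$DATABASE" -Q "
      SELECT 'SALES_STG'  AS tbl, COUNT(*) AS row_cnt, SUM(amount) AS total_amt FROM dbo.SALES_STG
      UNION ALL
      SELECT 'SALES_FACT' AS tbl, COUNT(*) AS row_cnt, SUM(amount) AS total_amt FROM dbo.SALES_FACT;"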

Environment: Windows 7, Informatica 9.6, SQL Server 2012, SAP, Longview, Business Objects, Tidal

Confidential, Connecticut

Data Architect

Responsibilities:

  • Confidential wanted to document the CMDB environment (ServiceNow), including FocalPoint, to understand the existing process, the various input sources to those applications, and the information flow between them.
  • Analyzed ServiceNow application modules (CMDB, Incident, Change, Problem, Asset, etc.) to create a data model. Identified relationships between the various modules and their table structures. Conducted SME sessions and documented the findings.
  • Analyzed FocalPoint data and incorporated it into the data model.
  • Utilized Erwin Data Modeler to build a high-level data model, capturing the major relationships between CIs, Assets, Incidents, Problems, Change Requests, etc. Added definitions and descriptions to the model and created subject areas.
  • Reviewed the model with senior managers/technical leads.
  • Created a source to target map to identify original sources loading the ServiceNow and FocalPoint applications.

Environment: Erwin 9.6, Windows 7, ServiceNow, FocalPoint

Confidential, Massachusetts

Technical Lead

Responsibilities:

  • Played the lead role in all Operational and Integration layer development, extracts, and job scheduling/execution. Conducted daily team update meetings to track progress.
  • Interacted with the Architecture team to finalize details of source extracts and to identify tables from the AS400 and Oracle sources related to Retail, Order, Shipments, Inventory, Contract, Dotcom, etc.
  • Maintained a master Excel workbook to track all source systems/tables/jobs related to the GDW project.
  • Led and oversaw all Informatica, Teradata, and Hadoop development. Assigned tasks to team members and provided development specs and assistance for all development. Maintained a detailed job-run tracking spreadsheet for all job runs. Provided daily updates to management regarding development and job runs, as well as a weekly team performance report.
  • Wrote technical documents for: Requirements, Deployment, Test Plan, Test Case, ETL design, Deployment checklist, Informatica Mapping, Informatica Naming Conventions.
  • Exposure to JSON and MongoDB.

Environment: Erwin, Visio, Teradata 14, Informatica 9.5, Toad, UNIX, Oracle 11g, BTEQ, SQL, Pentaho, Tidal

Confidential, Massachusetts

Data Architect

Responsibilities:

  • Reviewed and analyzed the eHealth Data Vault model. Developed the Erwin data model for the eHealth reporting mart and conducted meetings to discuss the reporting requirements and knowledge transfer. This mart contains statistics for elements (CPU/DISK/SERVER, LAN/WAN) of Confidential servers and is utilized to report on the performance of the Confidential IT infrastructure.
  • Wrote the ETL logic document for ETL developers and conducted knowledge transfer sessions. Managed the ETL development work for the eHealth mart, conducted meetings with the India team on the ETL development, and managed development efforts and issue logs for the project.
  • Created user stories and sprint plans for all ETL and design efforts. Conducted daily project standup meetings with the scrum team.
  • Reverse engineered the Benefit Workstation model and created subject areas. Added naming standards and definitions, and performed analysis on the model.
  • Reverse engineered the Evaluation Engine application. Added naming standards and definitions/comments to complete the documentation, performed analysis on the model, and created subject areas.
  • Reviewed Client Billing model for GCP project.
  • Participated in setting up Erwin modeling standards for the organization.
  • Created a conceptual model for HRI (HR Import process). Conducted meetings to analyze the process.
  • Reverse engineered GMD (Global Management Desktop) model.

Environment: Erwin 9.5, Visio, SharePoint, HP Quality Center, Toad, Oracle 11g, PL/SQL, VersionOne, Data Vault Modeling

Confidential, Massachusetts

Data Architect /Manager

Responsibilities:

  • Worked with the business analyst team to convert business requirements into data mart design requirements to be used in the ESP tool. Identified all elements (columns) and categories (tables) for building new marts or updating existing marts.
  • Defined scope from the requirements and created the WBS and WBS dictionary. Performed decomposition to create work packages.
  • Estimated costs for activities and developed the schedule.
  • Managed developer teams in India and China, assigned them tasks, and coordinated activities between teams, including on-site developers.
  • Mentored and trained team members on the ESP software, modeling, and design. Managed resource allocation and scheduling.
  • Developed and designed Data Marts (DM), elements, categories (tables), and other units using the state-of-the-art ESP tool. Managed change sets, updates, and various changes to new and existing marts.
  • Handled development and issues related to the Lookthrough process and the GMAA Valuations data mart, including requirement changes and client requests for day-to-day operations.
  • Implemented approved change requests according to the change management plan.
  • Conducted status meetings and development sessions with the India/China teams. Solved design issues and technical problems related to issue logs and new mart designs.
  • Reviewed report specs for the Valuations and Transactions reports.
  • Started design work on the AB (Alliance Bernstein) project (new client) to create data marts. Reviewed major reporting requirements for a selected set of reports with the business.
  • Loaded all source file data for the AB client into the DEV/BUAT environments.
  • Updated/reviewed status reports every week regarding RPMS and weekly tasks for team members.

Environment: ESP, MS Project 2010, Visio, SharePoint, HP Quality center, Toad, Oracle 11g, PL/SQL

Confidential, Massachusetts

Data Architect/Manager

Responsibilities:

  • Reviewed the requirements document with the project manager and participated in writing the scope document. Identified high-level risks and documented the assumptions and constraints of the project.
  • Played the team lead role and performed analysis on the Confidential (PO) mainframe system data.
  • Created the WBS and WBS dictionary, estimated costs, and developed the schedule for the development efforts.
  • Assisted the Project Manager with resource and quality planning.
  • Improved team performance by leading and mentoring junior members.
  • Led SME sessions related to the PO system and compiled detailed documents identifying tables and business logic. PO contains 6 divisions of Confidential business units.
  • Identified the relationships and entities that represent the core business process, documented them, and presented them to the whole team.
  • Reverse engineered the DB2 databases to create ODS models for each division using Erwin. The models include column and entity definitions gathered during the SME sessions.
  • Designed source-to-target maps for all ODS models with ETL logic.
  • Identified the grain of the data and converted the identified relationships and entities into facts and dimensions to build the data model. Designed the dimensional model for the PO subject area using Erwin Data Modeler and created logical/physical models. Utilized domains, logical and physical column definitions, entity definitions, and relationship definitions to complete the model.
  • Built source-to-target maps with detailed ETL logic and led review sessions for team members' deliverables. Ensured all designs conformed to the quality standards established in the project management plan.
  • Conducted knowledge transfer sessions on the PO subject area.
  • Assisted and led the QA team on all issues related to the PO subject area.
  • Maintained issue logs.
  • Participated in scope verification.
  • Performed analysis on the current Logistics and Distribution reporting environment (Cognos Impromptu jobs, MS Access databases, macros, Excel reports) and documented the whole business process.
  • Designed process diagrams for all subject areas, such as the Domestic process (Inbound, Outbound, Crossdock, Store delivery), Imports, International logistics, Domestic logistics, and HomeGoods.
  • Identified all drawbacks and issues currently existing in the process and proposed a solution.
  • Led a team of offshore developers coding the business logic in the SQL Server environment.
  • Oversaw the application migration to MS SQL Server 2008 environment after complete redesign and coding.
  • Managed development and unit testing process to complete the project successfully.
  • Reverse engineered the DSDM (Distribution Services Data Mart) data mart.
  • Created logical/physical models which include the SCD, ASN, and Imports subject areas.
  • Created source-to-target mappings for the whole data mart.
  • Reverse engineered the BW client/server repositories, which were hosted in a SQL Server 2008 environment.
  • Created a source-to-target mapping document for all repositories.
  • Worked on the inbound feed for Confidential. Worked on the CODS model and created the source-to-target maps.
  • Wrote the ETL logic for complex measure calculations (see the sketch after this list).
  • Created an ODS model based on COBOL file structures.
  • Built source-to-target maps with ETL logic.
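
A hedged illustration of the kind of measure calculation noted in the ETL logic above, expressed as SQL run through SQL*Plus; the table and column names (shipment_fact, qty_shipped, qty_ordered) are hypothetical, and the actual logic lived in the source-to-target maps.

    #!/bin/sh
    # Illustrative only: a fill-rate style measure with divide-by-zero protection.
    # Table and column names are hypothetical placeholders.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    SET PAGESIZE 100 LINESIZE 200
    SELECT dc_id,
           SUM(qty_shipped) AS qty_shipped,
           SUM(qty_ordered) AS qty_ordered,
           ROUND(SUM(qty_shipped) / NULLIF(SUM(qty_ordered), 0) * 100, 2) AS fill_rate_pct
    FROM   shipment_fact
    GROUP BY dc_id;
    EOF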

Environment: Erwin Data Modeler, MS Project, DB2 Mainframe, SQL Server 2008, Visio, SharePoint, HP Quality Center, IBM Netezza, Toad, Oracle, AIX, Windows 7

Confidential, Oregon

Team Lead/Sr. Informatica Developer

Responsibilities:

  • Helped set up an Integration Competence Center (ICC) to implement standards and procedures for Informatica development and design. Wrote documents elaborating standards and procedures such as Informatica best practices, folder strategies, and parameter handling.
  • Managed the on-site/off-shore development team. Coordinated development activities between teams, and motivated and mentored team members to ensure project efficiency and boost morale.
  • Conducted meetings related to design reviews, technical specs, and ETL specs.
  • Actively played the lead role by discussing design issues related to maps, sessions, and workflows, as well as troubleshooting, performance tuning, and database issues related to tables/views and PL/SQL.
  • Wrote the Planning-team-specific Informatica label creation process document.
  • Designed maps (about 45) to deliver data to testing teams. This included loading relational table data and flat files (about 7 million rows) into the Teradata database.
  • Designed maps to read/write/delete SAP SCM data using BAPI structures.
  • Designed about 85 maps for the Plan2PO project. Utilized parameter files, reusable/non-reusable sessions, and concurrent sessions. Designs included table flips through view switching and workflows tuned with pushdown optimization.
  • Used pushdown optimization to performance-tune many workflows. Utilized FastLoad, MultiLoad, FastExport, and TPump to load data into the Teradata database.
  • Wrote technical specifications for various links (processes) and reviewed technical specifications written by team members for various development links. Designed maps (80) for those links. Coordinated testing with the off-shore team in India.
  • Designed detailed Visio diagrams to describe the process logic.
  • Used HP Quality Center software to create/update issue tickets and monitor the status of testing-related issues.
  • Used AutoSys to run workflows. Ran workflows in command mode with a UNIX script as needed during testing (see the sketch after this list). Configured workflows to run concurrently with multiple parameter files.
  • Wrote a UNIX shell script to extract Informatica connection string details in order to verify connection string information across environments, easing the administrator's workload during DEV/QA/TEST/PROD environment setups.
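
A sketch of the command-mode workflow run referenced above, using pmcmd with a parameter file; the domain, service, folder, workflow, and file names are placeholders, and the option names should be verified against the local PowerCenter installation.

    #!/bin/sh
    # Hypothetical sketch: start a PowerCenter workflow from the command line
    # with a specific parameter file and wait for completion. All names below
    # are placeholders; INFA_PASS is assumed to be exported in the environment.
    : "${INFA_PASS:?INFA_PASS must be exported}"
    INFA_DOMAIN="Domain_Dev"
    INFA_SERVICE="IS_Dev"
    INFA_USER="etl_user"
    INFA_FOLDER="PLAN2PO"
    WORKFLOW="wf_load_plan2po"
    PARAMFILE="/infa/params/plan2po_dev.prm"

    pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f "$INFA_FOLDER" -paramfile "$PARAMFILE" -wait "$WORKFLOW"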

Environment: Informatica Power Center 8.5.1/8.6, MS Project, Teradata 12/13, Oracle 10g, HP UNIX, Windows XP, SQL Navigator, Visio, SharePoint, HP Quality center, SQL assistant 12.0, Toad 9.0.1

Confidential, CA

Sr. Informatica Developer

Responsibilities:

  • SAND project - built to analyze the performance of advertising broken out by demographic and geographic attributes. Source files were generated from the existing DW or acquired from our vendors (DoubleClick): Impressions, Clicks, Activities, Inquiries, and Cost. These files were staged, then processed, and finally used to generate target flat files for SAND processing. Built a scheme for error handling and for reprocessing error files.
  • Created/changed Informatica mappings - worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Router, Rank, Update Strategy, and Sequence Generator.
  • Wrote shell scripts to back up files with timestamps (with certain file names generated from the data itself), back up files into dated directories, and delete data older than three weeks (see the sketch after this list).
  • Production support - day-to-day monitoring of the nightly and daily workflow loads. This included fixing problems in the loads, rescheduling, rerunning, and fixing database-related issues. Some of the loads included over 100 million rows.
  • Worked on issues related to the Data Warehouse - enhancements, data quality issues, fixes, missing data, missing inquiries, missing revenue, adding new columns to tables, de-duping tables, and Informatica tuning. Developed new Informatica maps, changed existing maps and workflows, wrote SQL code, tested maps and results, and documented the changes and new features introduced.
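
A minimal sketch of the timestamped backup and three-week purge described above; the directory paths and file pattern are assumptions, while the 21-day retention mirrors the bullet.

    #!/bin/sh
    # Back up incoming files with a timestamp suffix into a dated directory,
    # then purge backups older than three weeks. Paths are hypothetical.
    SRC_DIR="/data/sand/incoming"
    BACKUP_ROOT="/data/sand/backup"
    STAMP=$(date +%Y%m%d_%H%M%S)
    DAY_DIR="$BACKUP_ROOT/$(date +%Y%m%d)"

    mkdir -p "$DAY_DIR"
    for f in "$SRC_DIR"/*.dat; do
        [ -f "$f" ] || continue
        cp "$f" "$DAY_DIR/$(basename "$f").$STAMP"
    done

    # Delete backups older than three weeks (21 days).
    find "$BACKUP_ROOT" -type f -mtime +21 -exec rm -f {} +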

Environment: Informatica Power Center 7.1, SQL Server, Oracle 9i, AIX, Windows XP, TOAD, Visio, Windows 2003

Confidential, California

Sr. Informatica Developer

Responsibilities:

  • Analyzed various banking units such as DDA, FID, TCD, MID, loan systems, courier services, and wire transactions, and identified their relationships to the CAST reports. Validated the CAST report specs against the actual systems.
  • Wrote a design document for each report - field definitions against the actual table columns, data type definitions, and source-to-target mapping for the development data warehouse.
  • Developed SQL code: used standard SQL data types and functions to generate reports to be examined by Federal agents. Elements of SQL used: LIKE, NOT, IN, EXISTS, ABS, ROUND, CONCAT, RPAD, LPAD, LTRIM, RTRIM, SUBSTR, TRUNC, LENGTH, MONTHS_BETWEEN, TO_CHAR, DECODE, NVL, AVG, MAX, UNION, IF-THEN-ELSE statements, CASE statements, outer joins, subqueries, etc. (see the sketch after this list).
  • Generated exception reports whenever necessary.
  • Generated report output in various file formats such as Excel, CSV, and text. Larger reports were delivered as Access databases.
  • Designed Visio diagrams for each report.
  • Validated SQL code to verify results.
  • Wrote Excel macros to create reports automatically - extracting data from Oracle database tables into Excel worksheets, then manipulating and formatting the data to suit the report specs.
  • Maintained the development data warehouse: created tables, views, and indexes, and analyzed tables.
  • Documented SQL code, input data loading procedures using SQL*Loader, and macro code.
  • Finally used Informatica to load the data - Lookup, Aggregator, Expression, Joiner, Filter, Router, Union, Rank, Update Strategy, and Sequence Generator transformations.
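
A hedged sample of the SQL style listed above (NVL, DECODE, SUBSTR, TO_CHAR, outer join), wrapped in a SQL*Plus call; the account and customer tables and their columns are invented for illustration and are not the bank's actual schema.

    #!/bin/sh
    # Illustrative report query only; table and column names are placeholders.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    SET PAGESIZE 0 LINESIZE 200 COLSEP ','
    SPOOL cast_report.csv
    SELECT a.acct_no,
           NVL(c.cust_name, 'UNKNOWN')                      AS cust_name,
           DECODE(a.acct_type, 'DDA', 'Demand Deposit',
                               'TCD', 'Time Certificate',
                               a.acct_type)                 AS acct_type_desc,
           SUBSTR(a.branch_code, 1, 3)                      AS region,
           TO_CHAR(a.open_dt, 'YYYY-MM-DD')                 AS open_date,
           ROUND(ABS(NVL(a.balance, 0)), 2)                 AS balance
    FROM   accounts a, customers c
    WHERE  a.cust_id = c.cust_id (+)
    AND    a.open_dt >= TRUNC(SYSDATE) - 30;
    SPOOL OFF
    EOF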

Environment: Informatica Power Center 7.1, SQL *PLUS, PL/SQL, Oracle 9i, TOAD, Excel, Windows 2000

Confidential, Edison, CA

Informatica Developer/Data modeler

Responsibilities:

  • Documented User requirements, analyzed the system, translated the requirements into system solutions, and developed the implementation plan and schedule.
  • Member of the two-person team that developed the Erwin data model.
  • Developed ETL specification based on business requirements/Mapping Document. Formulated and documented physical process design.
  • Responsible for data control and cleansing through use of Active and Passive transformations.
  • As part of the ETL process, extracted a large number of flat files from the Oracle database using Informatica, and configured those mappings to run 3 times a day to extract only new rows from the database (see the sketch after this list).
  • Designed and applied Mapplets, Transformations, and Workflows using Informatica Power Center.
  • Worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Router, Union, Rank, Update Strategy and Sequence Generator.
  • Designed and developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
  • Conducted migration of mappings from development to QA staging environment and from QA staging to production environment and back.
  • Created server sessions for the transformations and mappings. Scheduled server sessions for the mappings at predetermined time intervals.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and for scheduling them to run at specified times.
  • Conducted system and acceptance testing. ETL process system testing was done through simple record-count checks, isolating sources when more than one source was used and conducting tests separately for each source.
  • Documented test cases and test plans data for the mappings developed.
  • Wrote a complete manual which includes Informatica mappings, sessions, and workflows as well as their settings.
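
An illustrative sketch of the "new rows only" extraction described above; in practice the filter sat in the Informatica mapping, but the equivalent SQL, driven by a last-run watermark file, looks roughly like this (table, column, and path names are hypothetical).

    #!/bin/sh
    # Extract only rows added since the last run, three times a day via cron.
    # Table, column, and path names are hypothetical placeholders.
    WATERMARK_FILE="/etl/watermarks/orders.last_run"
    LAST_RUN=$(cat "$WATERMARK_FILE" 2>/dev/null || echo "1900-01-01 00:00:00")
    NOW=$(date '+%Y-%m-%d %H:%M:%S')

    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<EOF
    SET PAGESIZE 0 LINESIZE 500 FEEDBACK OFF
    SPOOL /etl/extracts/orders_delta.dat
    SELECT order_id || '|' || customer_id || '|' || TO_CHAR(order_amt)
    FROM   orders
    WHERE  last_update_ts > TO_DATE('$LAST_RUN', 'YYYY-MM-DD HH24:MI:SS');
    SPOOL OFF
    EOF

    # Advance the watermark only after a successful extract.
    echo "$NOW" > "$WATERMARK_FILE"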

Environment: Informatica Power Center 7.1, Oracle 9i, DB2 UDB 7.2 EEE, SQL *PLUS, PL/SQL, Sun Solaris 8/9, Erwin Data Modeler

Confidential, Phoenix, AZ

Informatica Developer

Responsibilities:

  • Responsible for data mapping, mapping design, consolidation and validation of the source system, defining error and restart logic, and the production environment.
  • Loaded data coming from the source (delimited Flat files) into target Oracle database and applied business logic on transformation mapping for inserting and updating records
  • Designed and developed aggregates, joins, and look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica tool.
  • Imported and exported multiple Objects in Repository Manager; with and without their dependencies.
  • Created shell scripts for better handling of incoming source files, such as moving files from one directory to another and extracting information (for example, dates) from file names for continuously incoming sources (see the sketch after this list).
  • Documented technical specification, business requirements, functional specifications for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables and defined ETL standards. Documented test cases and test plans data for the mappings developed.
  • Conducted SQL testing for DB sources for Insert and Update timestamps, Counts, Data definitions, Control, and Error logic for Facts and Reference tables.
  • Used various active and passive transformations for Data Control and Cleansing.
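
A minimal sketch of the incoming-file handling described above: move files from a landing directory to a processed area and pull the date out of each file name; the naming pattern (SRC_YYYYMMDD.dat) and paths are assumptions.

    #!/bin/sh
    # Move incoming source files to a processed directory and extract the date
    # embedded in each file name. Pattern and paths are hypothetical.
    LANDING="/data/landing"
    PROCESSED="/data/processed"

    for f in "$LANDING"/SRC_*.dat; do
        [ -f "$f" ] || continue
        fname=$(basename "$f")
        # SRC_<yyyymmdd>.dat -> <yyyymmdd>
        file_date=$(echo "$fname" | sed 's/^SRC_\([0-9]\{8\}\)\.dat$/\1/')
        mkdir -p "$PROCESSED/$file_date"
        mv "$f" "$PROCESSED/$file_date/$fname"
    done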

Environment: Informatica Power Center 6.1, Oracle 8i, SQL *PLUS, PL/SQL, Sun Solaris 8/9

Confidential, St. Petersburg, FL

Informatica Developer

Responsibilities:

  • Designed and developed the Data Warehouse system with emphasis on ETL cycle in a team environment.
  • Identified data ownership/responsibility, performed data sizing, conducted ETL logic design, read the structure of multiple databases, and queried the data to ensure uniformity and accuracy.
  • Worked with Data Quality group to identify and research data quality issues.
  • Responsible for development of ETL processes to load data from different sources into the target Oracle data warehouse by applying applicable business logic on Transformation mappings.
  • Assisted in maintaining ETL standards, Naming Conventions and writing ETL flow documentation.
  • Conducted consolidation and revalidation of the source systems as a part of secondary validation check before loading into the Informatica server area.
  • Involved in collecting and reporting ETL process statistics, and in the design and implementation of ETL alert severities and the groups responsible for handling the errors.

Environment: Informatica 5.1, Oracle 7, SQL *PLUS, PL/SQL.

Confidential

Team Leader

Responsibilities:

  • Wrote a socket server and tested it using COSES software. Wrote a C routine that directly wrote up to 3 records, improving server response time and accelerating data processing speeds in database routines.
  • Designed IP scheme for entire network. Each branch was considered as a separate LAN.
  • Involved in customizing ATMs, especially the TCP/IP configuration, network card installation, and OS/2 WARP installations. Responsible for testing and troubleshooting ATMs using COSES software. Led a team of five consultants in a COSES technical and operator training program.
  • Worked with a Tandem engineer in configuring the Tandem system. Gained a working knowledge of Tandem concepts and facilities.
  • Responsible for UNIX installation and issues pertaining to configuration of the OS.
  • Wrote a shell script, including an FTP session, to download files to the required branch, resulting in significant cost savings to the Bank (see the sketch after this list).
  • Wrote a backup shell script used to handle backups on servers with 2 hard disks. The same script was used with hot-swap servers with 3 identically configured hard disks, making it possible to boot the server from one of two alternate disks.
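
A hedged sketch of the FTP-based branch download script mentioned above; the host, credentials, and directory names are placeholders.

    #!/bin/sh
    # Non-interactive FTP session to pull the day's files for a branch.
    # Host, credentials, and paths are hypothetical placeholders.
    FTP_HOST="headoffice-ftp"
    FTP_USER="branch01"
    FTP_PASS="change_me"

    ftp -inv "$FTP_HOST" <<EOF
    user $FTP_USER $FTP_PASS
    binary
    cd /outgoing/branch01
    lcd /usr/local/branch/incoming
    mget *.dat
    bye
    EOF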
