Data Integration Architect Resume
Jersey City, NJ
TECHNOLOGY:
Data warehouse / Database concepts: OLTP, OLAP, Data Lake, Data Marts (facts and dimensions), Data Warehouse, Slowly Changing Dimensions, Normal Forms (1NF, 2NF, 3NF), MPP databases.
Data Warehouse (Extract, Transform and Load) and BI: Informatica PowerCenter 10.x/9.x/8.x/7.x (Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, PDO, session-level partitioning), Informatica Cloud (Mapping Designer, ICRT, IICS, Data Synchronization, Replication, PDO), Informatica PowerExchange 9.x, IBM DataStage/QualityStage, Informatica B2B Data Transformation, Pentaho, Cognos on Cloud.
Databases: DB2, Netezza, Oracle, Microsoft SQL Server, Db2 Warehouse on Cloud (formerly dashDB).
Other Data Sources: XML, Salesforce, Message Queue, SAP ECC, SAP BW, Mainframe, flat files, web services, APIs.
Data Modeling: Microsoft Visio, Erwin, physical and logical modeling, designing dimension & fact tables, normalization, denormalization, Entity Relationship Model, star & snowflake schemas.
Programming/Scripting: PL/SQL, SQL scripting, Unix shell scripting, JavaScript, Python, Mainframe.
Job Scheduling Tools: AutoSys, Tivoli Workload Scheduler (TWS), ICRT, Tidal.
Environment: Windows XP/2000/NT/98, UNIX/Linux, MS-DOS.
Cloud Platform: Azure, Cloud integration and automation.
Project Methodologies: Waterfall, Agile.
PROFESSIONAL EXPERIENCE:
Confidential, Jersey City, NJ
Data Integration Architect
Technology: DWBI (ETL), Informatica PowerCenter 10.x, Informatica Cloud (ICRT, IICS, Mapping Designer, Data Synchronization, Replication), SQL Server, Oracle, Salesforce, REST API, scheduling in Tidal and AutoSys.
Responsibilities:
- Address issues reported by business users in the global CRM system: analyze them with the business users, propose a solution, and implement the fix as part of the current or a future sprint.
- As an integration architect on the COE team, provide consultation to different applications within Confidential on integrating with Informatica to sync data with Salesforce CRM.
- Integrate IICS with the Salesforce application and publish web services / REST APIs to be called by Salesforce. Create IICS data replication jobs for Salesforce objects and extract deltas into a data hub (a staging data layer, similar to a data lake); see the illustrative sketch after this list.
- Integrate IICS with multiple financial applications such as MSB/ESB, FileNet, nCino, and Concur, and sync data between Salesforce CRM and the downstream applications.
- Perform source data analysis, then develop and unit test ETL code built in Informatica PowerCenter, IICS, and SQL scripts that reads data from source applications, stages it in the data hub, and loads it into Salesforce.
- Develop data replication jobs to send data out of Salesforce to the downstream applications.
- Design the high-level and low-level ETL architecture for the project.
- Schedule, monitor, and debug Informatica Cloud jobs and set up email alerts, with scheduling handled in Tidal and AutoSys.
- Generate all technical documents required to support the project.
- Provide support during implementation and post-implementation warranty support.
- Overall, work independently as a fully hands-on individual contributor, handling all activities within the COE team.
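The delta-extraction pattern in the bullets above can be sketched in a few lines of Python: query the Salesforce REST API for records changed since the last run and land them in a staging table of the data hub. This is a minimal, hypothetical sketch for illustration; the instance URL, token handling, object, and table names are assumptions, and the production jobs were built in IICS rather than hand-written Python.

# Minimal sketch (assumed names): pull changed Salesforce Account records
# since the last run and stage them in a data-hub table.
import requests
import sqlite3  # stand-in for the actual staging database

SF_BASE = "https://example.my.salesforce.com"   # hypothetical instance URL
TOKEN = "<oauth-access-token>"                  # obtained out of band

def extract_account_deltas(last_run_iso: str) -> list[dict]:
    # Salesforce REST query endpoint with a SOQL delta filter on SystemModstamp
    soql = ("SELECT Id, Name, SystemModstamp FROM Account "
            f"WHERE SystemModstamp > {last_run_iso}")
    resp = requests.get(
        f"{SF_BASE}/services/data/v57.0/query",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"q": soql},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["records"]

def stage_records(records: list[dict]) -> None:
    # Land the delta in a staging table of the data hub (SQLite as a stand-in)
    con = sqlite3.connect("data_hub.db")
    con.execute("CREATE TABLE IF NOT EXISTS stg_account (id TEXT, name TEXT, modstamp TEXT)")
    con.executemany(
        "INSERT INTO stg_account VALUES (?, ?, ?)",
        [(r["Id"], r["Name"], r["SystemModstamp"]) for r in records],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    stage_records(extract_account_deltas("2024-01-01T00:00:00Z"))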
Confidential, Newark, NJ
DWBI Solutions Architect (Lead Developer)
Technology: DWBI (ETL), Informatica Cloud (ICRT, IICS, Mapping Designer, Data Synchronization), Db2 Warehouse on Cloud, Stored Procedures (SQL), SAP ECC, Erwin, Visio, Windows, Python, REST API/Web services.
Responsibilities:
- Architect the end-to-end DWBI solution for the new SAP BI system that supports the client's Sales & Distribution and Finance applications (AP, AR, GL & COPA).
- Develop and unit test ETL code built in Informatica and SQL scripts that read data from SAP ECC, Salesforce, flat files, and XML.
- Perform business analysis (preparing BRD/FRD), logical/physical data modeling in MS Visio and/or Erwin (star/snowflake schemas), and source-to-target data mapping for the data flow from SAP ECC to Db2 on Cloud.
- Gather business requirements in meetings with users, understand the data in the highly complex SAP ECC system, and replicate it in the new BI system.
- Analyze data in the SAP system using SAP transactions such as SE16 and SE11, plus complex SQL queries, to resolve design issues.
- Design the high-level and low-level ETL architecture for the project.
- Schedule, monitor, and debug jobs and set up email alerts in Informatica Cloud using ICRT and IICS.
- Write complex SQL queries to read data from ECC tables and write stored procedures in Db2 Warehouse on Cloud for data processing.
- Apply pushdown optimization (PDO) to handle large volumes of financial data efficiently (see the illustrative sketch after this list).
- Generate all technical documents required to support the project.
- Delegate tasks to the offshore team and ensure work is completed on time.
- Provide support during implementation and post-implementation warranty support.
- Handle change requests and enhancements to the existing system.
- Perform effort estimation and prepare and execute the project plan.
- In summary, lead the project end to end while remaining hands-on with coding.
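The pushdown optimization mentioned above boils down to letting the database engine do the set-based work instead of streaming detail rows through the ETL tier. The Python sketch below shows that pattern over a generic DB-API connection; the schema, table, and column names are hypothetical, and in the project PDO was configured inside Informatica rather than coded by hand.

# Minimal sketch of the pushdown idea: one set-based statement executed in the
# database instead of row-by-row processing on the ETL server. Names are assumed.
PUSHED_DOWN_LOAD = """
    INSERT INTO dw.fact_gl_balance (company_code, gl_account, fiscal_period, amount)
    SELECT company_code, gl_account, fiscal_period, SUM(amount)
    FROM   stg.ecc_gl_lines          -- staged SAP ECC line items
    GROUP  BY company_code, gl_account, fiscal_period
"""

def load_gl_balances(conn) -> int:
    """Run the aggregation inside the database engine (any DB-API connection)."""
    cur = conn.cursor()
    cur.execute(PUSHED_DOWN_LOAD)   # the database does the heavy lifting
    conn.commit()
    return cur.rowcount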
Confidential, Rochester, NY
Senior Data Analyst
Technology: DWBI (ETL), Informatica PowerCenter 10.x, MS SQL Server, Stored Procedures, SAP BW, SAP HANA Studio, Unix shell scripting.
Responsibilities:
- Design the DWBI architecture for the enterprise data warehouse of the client's Sales and Distribution application.
- Participate in the data modeling exercise with the data modeler and BA, and prepare the source-to-target data mapping document from SAP BW on HANA to the data warehouse in the SQL Server database.
- Gather business requirements and document them in high-level and low-level technical design documents.
- Build the ETL code in Informatica PowerCenter 10.x and unit test it to ensure it meets the business requirements.
- Write complex stored procedures in MS SQL Server to implement some of the more complex business rules (see the illustrative sketch after this list).
- Review code written by other team members to ensure it adheres to the requirements and coding standards.
- Prepare the project plan and effort estimation and adhere to those timelines.
- Delegate tasks to the offshore team and ensure work is completed on time.
- Handle change requests and enhancements to the existing system.
- In summary, own the complete SDLC, ensure the project deliverables are met, and remain hands-on with coding.
- Provide the weekly status report to the client manager.
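A minimal sketch of how a business-rule stored procedure like those described above might be invoked from Python via pyodbc; the connection string, procedure name, and parameter are assumptions for illustration, and in the project the procedures were actually called from the ETL workflow.

# Minimal sketch (assumed names): call a SQL Server stored procedure that
# applies business rules to staged sales data.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dw-sql.example.com;DATABASE=SalesDW;Trusted_Connection=yes;"
)

def apply_sales_rules(load_date: str) -> None:
    conn = pyodbc.connect(CONN_STR)
    cur = conn.cursor()
    # ODBC call syntax; the procedure encapsulates the complex business rules
    cur.execute("{CALL dbo.usp_apply_sales_rules (?)}", load_date)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    apply_sales_rules("2023-06-30")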
Confidential, Cleveland, OH
Senior ETL Developer / IT Analyst
Technology: DWBI (ETL), Informatica PowerCenter, IBM DataStage/QualityStage, SQL scripting, Web services (SOAP API), MS SQL Server.
Responsibilities:
- Implement name and address standardization across 10 different data sources for the auto-insurance client's customer base, applying the business rules in IBM DataStage/QualityStage (see the illustrative sketch after this list). Integrate the standardization module with IBM MDM via web services/API.
- Work in Agile mode, attending the daily scrum calls and updating the project team on sprint progress.
- Work closely with the data modeler/architect to understand the system architecture.
- Gather requirements and design the architecture for the data-standardization ETL process.
- Build the IBM QualityStage code using its standard data-standardization modules, write custom business logic in DataStage to meet the specific business rules, and unit test the standardization program.
- Write complex stored procedures to load data into master data tables.
- Review code, unit test cases, and test evidence produced by the client partner.
- Train the client partner on IBM QualityStage.
- Prepare the effort estimate and project plan and adhere to them.
- In summary, own the complete SDLC and keep the project on track.
- Provide the weekly status report to the project manager.
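As a simplified illustration of the standardization rules described above, the Python sketch below normalizes street-suffix abbreviations in an address string. The real implementation used QualityStage rule sets rather than Python, and the abbreviation map and sample input are assumptions.

# Simplified illustration (assumed rules): uppercase, strip punctuation,
# collapse whitespace, and normalize common street-suffix abbreviations.
import re

SUFFIX_MAP = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "BOULEVARD": "BLVD"}

def standardize_address(raw: str) -> str:
    addr = re.sub(r"[^\w\s]", "", raw.upper())      # remove punctuation
    addr = re.sub(r"\s+", " ", addr).strip()        # collapse whitespace
    tokens = [SUFFIX_MAP.get(tok, tok) for tok in addr.split(" ")]
    return " ".join(tokens)

print(standardize_address("123  Main street."))      # -> "123 MAIN ST"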
Confidential, Raleigh, NC
Senior Software Engineer
Technology: ETL, Informatica PowerCenter, Oracle, PL/SQL, Unix shell scripting, Agile.
Responsibilities:
- Reverse-engineer the business logic from the existing code, understand the business rules, and document them.
- Work in an Agile environment with a release every six weeks, reporting to the scrum master on the daily scrum call.
- Prepare the data model and source-to-target data mapping from the legacy Oracle data warehouse to the new Oracle data warehouse.
- Gather requirements and analyze the data using standard SQL queries.
- Create high-level and low-level designs; code and unit test.
- Review code developed by team members, including unit test cases and peer reviews.
- Work with the senior business analyst to prepare the data mapping document for each story.
- Prepare component development timelines (project planning) and effort estimation.
- In summary, own the complete SDLC and keep the project on track.
- Provide the daily status report on progress to the scrum master and project manager.
- Work closely with the data modeler/architect to understand the database design.
Confidential, Minneapolis, MN
ETL Lead (Senior Developer)
Technology: DWBI (ETL), Informatica PowerCenter, SQL Server, Message Queue, XML, PL/SQL, mainframe, Unix scripting, Salesforce source.
Responsibilities:
- Architect the data warehouse that supports the client's financial tool SAMs, and understand the SAMs data model in order to design the data warehouse that feeds data into the SAMs tool.
- Create high-level and low-level design documents to be used for the ETL build.
- Build and unit test the ETL components that read data from multiple sources such as mainframe, RightNow forum, message queues, Salesforce, XML, flat files, and SQL Server databases.
- Use XSDs to read the source XML data, XPath to read only specific nodes in the XML files, and some XQuery (see the illustrative sketch after this list).
- Gather requirements and analyze the data sources using SQL queries, and use that knowledge to complete the data mapping sheet.
- Perform performance tuning by implementing pushdown optimization (PDO) and session-level partitioning.
- Work with the senior business analyst to prepare the data mapping document for some of the source systems.
- Prepare component development timelines (project planning) and effort estimation.
- Generate the HLD, LLD, and other technical documents to be used by the ETL technical team.
- Delegate tasks to the offshore team and ensure work is completed on time.
- In summary, own the complete SDLC and keep the project on track.
- Provide the weekly status report to the client manager.
- Additionally, work closely with the project BSA to understand the business requirements and take an active part in generating the data mapping document.
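The XPath-style node selection mentioned above can be illustrated with a short Python sketch using the standard library's ElementTree; the element and attribute names are hypothetical, and in the project the XML was parsed inside Informatica against the XSD rather than in Python.

# Minimal sketch (assumed structure): pull only the nodes needed for the
# mapping from a source XML feed instead of the whole document.
import xml.etree.ElementTree as ET

SAMPLE = """
<orders>
  <order id="1001"><customer>ACME</customer><amount>250.00</amount></order>
  <order id="1002"><customer>GLOBEX</customer><amount>975.50</amount></order>
</orders>
"""

root = ET.fromstring(SAMPLE)
for order in root.findall("./order"):                       # XPath-style path
    print(order.get("id"), order.findtext("customer"), order.findtext("amount"))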
Confidential, Schaumburg, IL
ETL Technical Lead (Senior Developer)
Technology: ETL, Informatica PowerCenter, XML, Netezza, DB2.
Responsibilities:
- Create high-level and low-level designs for the short releases that enhance the legacy DB2 system.
- Gather and analyze requirements with the customer's BA during data modeling sessions for the new Netezza database that is gradually replacing the DB2 database.
- Redesign the existing data model to accommodate Netezza's restrictions and limitations (see the illustrative sketch after this list).
- Create the technical documents that the technical team needs for the actual build work.
- Code and unit test the ETL programs in Informatica PowerCenter, reading data from multiple sources such as XML files, the Netezza database, and mainframe.
- Perform performance tuning on the existing DW system by implementing pushdown optimization (PDO) and source data partitioning.
- Prepare component development timelines (project planning) and effort estimation.
- Delegate tasks to the offshore team and ensure work is completed on time.
- Overall, own the end-to-end project development life cycle.
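One concrete example of the Netezza-specific considerations behind that redesign is choosing an explicit distribution key so rows spread evenly across data slices and avoid skew. The sketch below shows the idea over a generic DB-API connection; the schema, table, and columns are assumptions, not the client's actual objects.

# Minimal sketch (assumed names): Netezza DDL with an explicit distribution key.
NETEZZA_DDL = """
    CREATE TABLE dw.fact_txn_detail (
        txn_id      BIGINT NOT NULL,
        txn_date    DATE,
        txn_amount  NUMERIC(18,2)
    )
    DISTRIBUTE ON (txn_id)   -- even distribution across data slices avoids skew
"""

def create_fact_table(conn) -> None:
    """Execute the DDL over any DB-API connection to Netezza (e.g. nzpy/ODBC)."""
    cur = conn.cursor()
    cur.execute(NETEZZA_DDL)
    conn.commit()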
Confidential
ETL Senior Developer & Lead
Technology: ETL, Informatica PowerCenter 8.6.1, SQL Server 2008, and Oracle 10g.
Responsibilities:
- Gather requirements and analyze data for the client's agent quote system.
- Build and unit test the ETL code in Informatica PowerCenter and SQL Server.
- Allocate tasks among team members and myself.
- Attend client and onsite-coordinator calls.
- Create mappings per the requirements and deliver them on time.
- Review code developed by other team members.
- Prepare test cases and test the components.