ETL and Lead Informatica Developer Resume
California
PROFESSIONAL SUMMARY:
- 2 years of experience in Dell Boomi, integrating Salesforce, Oracle DB Cloud, Workday, SFTP, HTTP, etc.
- 8 years of data warehousing and BI experience.
- Worked with ETL tools: Informatica PowerCenter 9.x/10.x, PowerExchange, Informatica B2B, EDI, MDM, IDQ, and Dell Boomi.
- Involved in all phases of end-to-end implementations: requirements analysis, technical specification preparation, coding, testing, quality assurance, user acceptance testing, performance tuning, and production support.
- Created configuration scenarios for B2B integration for various formats; involved in the development-level phases of implementation for various projects.
- Experience with flat files, XML, JSON, and user-defined mapping functions.
- Worked on REST/SOAP APIs.
- Good knowledge of standard iFlows in Dell Boomi.
- Experienced in data conversions and data uploads from legacy systems to target systems.
- Writing Oracle and MySQL queries, procedures, functions, triggers, snapshots, and views.
- Worked with JavaScript, Unix shell scripting, and Python.
- Knowledge of big data integration.
- Reporting tools: Tableau, OBIEE, BOXI.
- Worked on MDM, data quality, and profiling.
- Highly motivated, with excellent communication and interpersonal skills; served as team lead for both development and production support teams.
- Installed and maintained Boomi Atoms, Molecules, and cloud environments.
- Created new users, granted appropriate access, and created custom user roles to restrict access.
- Worked on point-to-point (source-to-target) processes and subscriber-based architectures, which reduce development time and the cost of development and maintenance.
- Built a new integration process that takes payment data from EBS and profile data from the MDM system, validates the data, and sends it to FTP folders and the Gforce Portal.
- Ran data-sync integration jobs between systems.
- Built B2B data integration processes: inbound, outbound, and event-based integrations, along with deployment, scheduling, and document tracking.
- Deployed, scheduled, and maintained processes across the different environments and supported them on a daily basis.
PROFESSIONAL EXPERIENCE:
Confidential, California
ETL and Lead Informatica Developer
Responsibilities:
- Manufacturing forecast accuracy and supply/demand: this project reports the accuracy of forecast demand data versus supply data over the next five years. Based on the demand quantities over the period and the purchase orders, the drug maker determines the production material quantities needed for manufacturing. The source systems, Demantra and Rapid-Response, provide the input and feed the OLAP manufacturing warehouse: data warehouse tables are built from the file extracts into dimension and fact table loads, and a reporting summary aggregate table is created for reports. Tableau reports created for end users include the Schedule Attainment, Forecast Attainment, and API Demand reports. Created standard, cross-tab, and sub-reports, and generated Tableau dashboards with quick/context/global filters, parameters, and calculated fields.
- Reports: the Schedule Attainment reports, part of the ISCP process, show how each forecast cycle compares against actuals; the Forecast Accuracy report identifies outlier items so they can be addressed in a timely manner; and the Chem Manufacturing reports identify stock movements from warehouse to warehouse.
- Created shell scripts to automate file archiving and to validate the RR source files against the expected file count for the latest set of files to be consumed by the MFG warehouse; these send notification emails when a process fails and success emails after the files are validated.
- Provided knowledge transfer (KT) to users.
Confidential, California
Boomi Consultant
Responsibilities:
- Redesigned the KYC process and implemented ingestion of the KYC source data from flat-file forms and the Force application, sending it to database tables.
- Developed do-not-bank lookup processes.
- Performed validation processes and sub-processes, with event-based triggers for CDD (Customer Due Diligence) and Enhanced Due Diligence process flow validations and approvals.
- Built the Employee High Grounds integrations from SuccessFactors to SFTP and from SFTP to SuccessFactors; enhanced the SuccessFactors employee termination interface.
- Generated and transmitted payment files from SuccessFactors to multiple banks.
- Implemented a document tracking process.
- Coordinated with the client and functional consultants to resolve tickets; coordinated with consultants for the various systems to design the enterprise bank architecture and the interface models.
- Developed bank-related Boomi processes for inbound and outbound flows.
- Worked on user management (adding users, assigning roles and privileges, and creating custom roles for the account).
- Worked on point-to-point and subscriber-based architectures.
- Extensively used Boomi execute shapes such as Map (mapping source and target objects), connection and operation configuration, Set Properties, Message, Notify, Process Call to sub-processes, Return Documents, and Data Process for splitting documents. Used logical Boomi shapes such as Branch, Route, and Cleanse before the mapping, Decision shapes for upsert logic, and Try/Catch shapes.
Confidential, California
Team Lead Boomi Consultant
Responsibilities:
- Captured, created, and reviewed the Business Requirement document for the Product Quality project supporting HP's acquisition of the H3C and COM3 product lines.
- Developed the high-level architecture flow diagram of the data warehouse implementation for the functional requirements with the client.
- Prepared the TSD and implementation logic documents as deliverables for the PQ and Activate projects; designed the Get and Send processes and implemented web API automation.
- Implemented exception-handling flows and custom notifications.
- Deployed processes in multiple environments.
- Performed database performance tuning on high-volume queries with the DBA, adding hints to the source queries and indexes for faster data retrieval.
- Cloud environment management and planning.
Confidential, California
ETL Consultant
Responsibilities:
- Designed, developed, and implemented ETL processes and MDM handling for AHM and Complete Spend healthcare providers.
- Data integration for GCH-Veeva (Veeva is a Salesforce-based SaaS CRM, a cloud application).
- Worked on the GCH-to-Veeva interface: daily full load with truncate of the GCH stage, daily incremental extracts and loads to the target Account tables, and staging and updating of Addresses based on external Gilead IDs.
- Integrated Veeva Salesforce data using Informatica.
- The error-handling strategy and data reconciliation for the Veeva-GCH integration use the common Informatica framework, including error handling in the ODS tables. Extracted customer expense data from EBS Oracle and Concur to the ODS via Informatica.
- Built the project team member assignment interface from SuccessFactors to Oracle.
- The PSR Reporting system is a custom in-house system built with Informatica and OBIEE. It comprises two main components: the ETL and the metadata repository. The ETL layer loads data from the PSR data sources into a database schema residing in the Argus Insight database.
- Created Informatica jobs, workflows, and worklets; designed the workflow job sequence to load the base data, staging data, and the final fact data.
- Designed the base tables, loading strategies, and dimensional loads; determined the business key columns for primary-key comparison.
- Designed the loading strategy for the PSR reporting tables.
- Implemented the ETL for historical file information for the pre-approval file sources in the FILE DETAIL metadata table: the current/latest file is indicated by an ACTIVE flag, older files are marked INACTIVE with the end date set to the last date loaded, and the table also maintains the file sequence number and tracks the source file record counts and source file names.
- Created shell File Mover scripts that handle the source and target paths for file copies dynamically per environment (DEV/TST/PRD).
- Carried out validation per the test case document; prepared for the UAT and PROD deployments by creating a deployment checklist, labeling the ETL objects, and creating dynamic queries for the ETL objects.
- Created deployment groups and promoted code to the UAT and Prod environments.
- Helped the reporting team test the standard monthly reports in Tibco Spotfire by validating at the backend; supported end-user UAT.
Confidential, California
Boomi developer
Responsibilities:
- Designed, developed, and implemented the ETL process for the Prod Prof Mart in the TCCI Canada project, tracking total expenses, subvention/dealer discount, balance due amortization, and term for active accounts with payments due.
- Key tables: Loan Master History, Retail Monthly Fact, Summary Fact, Contract Dim, and Application Dim.
- The transaction source and loan master files arrived in COBOL mainframe format and were handled and loaded to the stage tables by Informatica. Some of the logic for the business-calculated fields is handled with Netezza stored procedures. Automated the stored procedures for the several month-end loads with an Informatica workflow and set up the Calendar dimension for each month-end load run.
- BI analytics: prepared the dataset for the Customer Value Model, showing expense amounts from the Accounting Mart broken out by vendor under each channel, by date, and by region. Involved in analysis of the loyalty, early termination, and new contract groups from the customer data in the SERVICE REQUEST table.
- Computed the cost of services, such as telephone call expenses, counts of AutoPay enrollments, and pay-by-phone counts, from the services data.
Confidential, California
Lead Informatica Developer
Responsibilities:
- As a senior ETL developer, worked on SALES and INVENTORY data, populating the dimension and fact tables using Informatica.
- Gathered requirements from the business.
- Prepared the source-to-target mapping document and the process matrix document.
- Received source data for Specialty Health, McKesson Pharma, and ABC as delimited flat files, daily and weekly.
- Responsible for the ETL handling the frequency of loads to the data warehouse Oracle tables.
- Upgraded Informatica MDM from 9.1 to 9.6.1.
- Configured Informatica Data Director (IDD) for data governance by users, IT managers, and data stewards.
- Utilized Informatica IDQ (Analyst and Developer) for data profiling, matching and removing duplicate data, fixing bad data, and fixing NULL values.
- Cleansed and scrubbed the data into uniform data types and formats using the Informatica MDM and IDQ tools; loaded it to the STAGE and HUB tables, then to the EDW, and finally to the dimensions, rolling up/aggregating the data by business grain into the FACT tables.
- Developed business rules for cleansing/validating/standardization of data.
- Configured Address Doctor to validate, parse, cleanse, and standardize North American address data. Created and deployed new applications in Informatica Data Director, binding each application to a specific ORS.
- Developed and reviewed configuration documentation and design documentation as well as analyzed functional specifications and user requirement documents.
- Prepared master data to populate dimension table attributes using Lookup transformations.
- Prepared and validated Unix scripts that validate and FTP files from the remote server source directory, and called them from Informatica.
- Deployment activity from Development to UAT and PROD.
- Participated in feasibility study of Dell boomi as ETL tool.
- Reviewed code and incorporated further changes as needed by the business.
- Automated workflow runs and alert email notifications using Informatica.
- Provided technical decision support for the system and business analysts.
- Effectively coordinated between onsite and offshore teams for optimum utilization of the resources and time.
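The IDQ-style profiling and duplicate-removal work described above can be illustrated with a minimal Python sketch. The match key here (normalized name plus postal code) is an assumption, a simplified stand-in for a real IDQ match rule:

```python
def normalize(value):
    """Uppercase, trim, and collapse whitespace so near-duplicates compare equal."""
    return " ".join(str(value).upper().split())

def dedupe_records(records):
    """Keep the first record for each match key; later duplicates are dropped.

    Records are dicts; the match key is the normalized name plus postal code,
    a simplified illustration of a match/merge rule.
    """
    seen = set()
    unique = []
    for rec in records:
        key = (normalize(rec.get("name", "")), normalize(rec.get("zip", "")))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

A production match rule would typically add fuzzy matching and survivorship logic; this sketch only shows the exact-match-after-standardization core.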
Confidential, California
MDM project
Responsibilities:
- Identified business requirements through discussions with the various stakeholders; prioritized, baselined, and tracked the requirements in Requirements Centre and created wireframe diagrams using MS Visio.
- Drove the requirements gathering process through approval of the documents that convey their needs to management, developers, and quality assurance team.
- Developed and updated functional use cases and conducted business process modeling to explain business requirements to QA teams.
- Prepared the system appreciation and test strategy documents and the QC execution plan.
- Collaboratively developed ATDDs with examples with engineering and business users, and conducted business process modeling to identify acceptance criteria for the requirements.
- Ensured the requirements were baselined and signed off by the business.
- Review Test Scenarios, acceptance criteria documents and test cases through detailed steps developed by Offshore Testing teams.
- Facilitated daily stand-up calls during the implementation phases.
- Reported the status of the project daily, weekly and on cycle basis as technical lead of the project during test execution phases.
- Coordinated the UAT and ensured sign off.
- Led the production release and the transition to support teams.
Confidential, California
ETL Informatica Team Lead, Data Architect
Responsibilities:
- Responsible for supporting business teams in addressing operational system issues as well as requests for new capabilities.
- Integration manager for different modules in Confidential for the Cassiopae data warehouse, managing the Hyundai leasing and financing/Daybreak application.
- Developed the design to incorporate the new ODS and CRM data warehouses to report leasing and financing information to credit bureau companies.
- ETL integration for interfaces with near-real-time data warehouses such as the CRM and EDW (Enterprise Data Warehouse), and the source and target stage data warehouses.
- Provided the ETL high-level and solution designs for the various inbound and outbound interfaces; designed and implemented complex ETL processes.
- Implemented best practices, ETL coding standards and guidelines for the development team, and standard database naming conventions.
- Gathered requirements with the functional team on Metro 2 format reporting of consumer data to the various credit bureau companies.
- Responsible for the operational support of the Informatica environment, including but not limited to automation, job scheduling, dependencies, monitoring, maintenance, security, and administration.
- Implemented ongoing optimization and tuning of Informatica mappings, using database partitioning and partition points within Informatica to run parallel threads for loading source data.
- Designed and managed the file validation scripts specific to the inbound interfaces (the credit bureau files received from external vendors).
- Integrated Tibco JMS with Informatica for real-time data warehouse updates.
- Presented the solution design and code walkthroughs to the client with the architecture review board.
Confidential, California
Informatica Developer
Responsibilities:
- Coordinated and interacted with key business users and project stakeholders to gather requirements and design the interface architecture, naming conventions, and processes; analyzed all business requirements and mapped them to Informatica interfaces according to the business process requirements.
- Coordinate Offshore team towards developing the Interfaces.
- Responsible for developing integration processes in Informatica for Medicare, Medicaid, Medgate, CCure, and SuccessFactors.
- Coordinate with client, Functional Consultant to solve the issue of tickets.
- Coordinate with various systems Consultants to design the architecture of the Interfaces.
- Developed interfaces for various systems passing through Informatica.
- Applied SQL queries on the database.
- Developed an outbound interface to send employees' emergency contact data from SuccessFactors to the Medgate and Medicare systems.
Confidential
Module Lead for ETL Team (Lead Informatica Developer)
Responsibilities:
- This project extracts data from EDI 837 transactions to meet the HIPAA requirement for submission of healthcare claims.
- This data warehouse also maintains historical data, capturing all incoming 837 data, such as the encounter between the Confidential and the provider, the description, the cost of treatment, etc. This information is fed through the Pipeline gateway into the canonical warehouse model.
- The first step is to capture the 837s from Pipeline into a Data Acquisition (DA) layer.
- The second step is to extract the 837 data elements from this DA layer and store them in the Claims Canonical.
- Sent the information back to providers via the EDI 835 transaction set.
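At the data-acquisition layer, pulling the raw segments out of an X12 837 payload can be sketched as follows. This uses the common default delimiters (`~` for segments, `*` for elements); a real interchange declares its delimiters in the ISA segment, and the sample string below is purely illustrative:

```python
def parse_x12_segments(raw, seg_term="~", elem_sep="*"):
    """Split a raw X12 payload into segments, each a list of elements.

    The first element of each segment is its segment ID (e.g. ISA, CLM, SE).
    """
    segments = []
    for seg in raw.split(seg_term):
        seg = seg.strip()
        if seg:
            segments.append(seg.split(elem_sep))
    return segments

def find_segments(segments, seg_id):
    """Return all segments with the given segment ID."""
    return [s for s in segments if s and s[0] == seg_id]
```

A full 837 parser would also walk the hierarchical loops (HL segments); this sketch only shows the segment-level capture into the DA layer.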
Confidential
ETL Developer
Responsibilities:
- Fine-tuned SQL queries for maximum efficiency and performance using explain plans, indexes, and database partitioning.
Confidential
ETL lead developer
Responsibilities:
- Carried out requirement analysis and feasibility studies.
- Created and validated the Informatica mapping design document.
- Developed mappings per the transformation rules.
- Defined source and target object definitions.
- Implemented all loading strategies: full load, bulk mode, incremental logic, insert-else-update logic, and Type 2 dimensions.
- Monitor jobs in Prod.
- Code fix and troubleshooting.
- Performance management.
- Build data model.
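The insert-else-update and Type 2 dimension logic listed above can be illustrated with a small Python sketch. The column names and the open-row convention (`end_date` of `None` marks the current version) are assumptions for illustration:

```python
from datetime import date

def scd2_apply(dim_rows, incoming, today=None):
    """Apply one incoming record to a Type 2 dimension (a list of dict rows).

    Insert-else-update logic: a new business key gets a fresh open row; an
    existing key with changed attributes has its current row closed out and a
    new version inserted. An unchanged record is a no-op.
    """
    today = today or date.today()
    current = next(
        (r for r in dim_rows if r["key"] == incoming["key"] and r["end_date"] is None),
        None,
    )
    if current is not None and current["attrs"] == incoming["attrs"]:
        return dim_rows  # no change detected: keep history as-is
    if current is not None:
        current["end_date"] = today  # close out the old version
    dim_rows.append({
        "key": incoming["key"],
        "attrs": incoming["attrs"],
        "start_date": today,
        "end_date": None,
    })
    return dim_rows
```

In PowerCenter the same comparison is typically done with a Lookup on the business key feeding an Update Strategy transformation; the sketch just makes the version-chaining explicit.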