Data Warehousing/MDM Resume

SUMMARY

  • Eighteen years of extensive experience in business analysis, application design, development, customization, migration, and implementation of several software applications involving EPIC, SAP, DataMart/Center, Hyperion, Informatica Power Connect, Erwin Platinum, Cognos Power Play, Cognos Impromptu, Impromptu Web Reports, Business Objects, Hyperion Essbase, Architect, Portfolio, Visualizer, Developer Studio, Access Manager, Upfront, First Logic, COBOL, Java, JSP, HTML, Informix, VB, IMS, SQL, SQL*Plus, PL/SQL, SQL*Loader, Crystal Reports, Power Play Enterprise Server, SQL Server, IWR Server, MS Access, Sybase, DB2, and Oracle under UNIX, Solaris, AIX, MVS, and Windows NT environments.
  • 5 years of experience working with EPIC, SAP, Siebel, and Telematics
  • 12 years of experience working on MDM/Data Governance/Data Security
  • 18 years of experience working as a Database/Data Warehouse Architect and with Big Data
  • 18 years of experience working on Data Analysis, Data Profiling, and Data Governance/Data Security
  • 18 years of experience working on Informatica, Business Objects, Tableau, Cognos, Essbase, and Hyperion
  • 18 years of experience working on Oracle, Sybase, DB2, SQL Server, Teradata, PL/SQL, and Stored Procedures

TECHNICAL SKILLS

Applications: EPIC, Telematics, SAP R/3 - 3.1H

ETL Tools: Informatica, SQL*Loader, Export/Import, Oracle Warehouse Builder 2.1, Informatica Power Mart/Center, Informatica Power Connect, Informatica Power Bridge, DataStage, QualityStage, Ab Initio, Hyperion Essbase.

Data Modeling: Erwin, Designer

OLAP: Business Objects Designer, Hyperion Essbase, Cognos Transformer, Brio, Hyperion, Microstrategy

Reporting/DSS: Cognos, Business Objects, Tableau, Hyperion, Microstrategy

Analysis: Business Miner, Power Prompts, Developer Studio, First Logic

Languages: C, C++, VB, JAVA, Transact-SQL, SQL*Plus, PL/SQL, SQL*Loader

RDBMS: DB2 UDB (Universal Database), SQL Server, DB2, Sybase, Oracle

Scripts: Shell Scripts, Perl Scripts, VB Script

Scheduling: AutoSys, JIL

Internet Technologies: HTML, DHTML, ASP, Cold Fusion, XML, JavaScript, Applets, JAVA, JDBC, Servlets, JSP, Rational Rose

Operating Systems: UNIX, Solaris, AIX, MVS, Windows

PROFESSIONAL EXPERIENCE

Confidential

Data Warehousing/MDM

Responsibilities:

  • Worked with the client to understand the existing system and the requirements around MDM for IT transformation.
  • Involved in JAD sessions and prepared a road map for implementing the MDM solution in a phased approach using IBM InfoSphere MDM.
  • Customized SDP rules for matching, collapse and survivorship.
  • Worked on the implementation of the Probabilistic Matching Engine (PME); initially, SDP was implemented on the classic matching engine.
  • Designed several search combinations that involved coding to pull out the right person/organization.
  • Performed entity/data extensions by extending the existing data model and adding the required new attributes.
  • Designed InfoSphere QualityStage standardization and probabilistic matching rules; the solutions were aimed at comparing QualityStage matching results against the current cleansing and matching results.
  • Developed PL/SQL code to gather data for the initial load of the MDM database, which involved complex business logic to integrate data from multiple sources (a sketch follows this list).
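
For illustration, a minimal PL/SQL sketch of the kind of initial-load routine described above; the table and column names (src_crm_customer, src_erp_customer, mdm_party_stg) are hypothetical placeholders, not the actual MDM schema.

    -- Sketch: consolidate party data from two hypothetical source tables
    -- into an MDM staging table for the initial load.
    CREATE OR REPLACE PROCEDURE load_mdm_initial AS
    BEGIN
      INSERT INTO mdm_party_stg (source_system, source_key, full_name, birth_date)
        SELECT 'CRM', c.customer_id, c.full_name, c.dob
          FROM src_crm_customer c
        UNION ALL
        SELECT 'ERP', e.cust_no, e.cust_name, e.birth_dt
          FROM src_erp_customer e;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_mdm_initial;
    /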

Environment: PeopleSoft, Oracle, Cognos, SQL Server, PL/SQL, SQL*Loader, IBM InfoSphere MDM, Windows, Korn Shell, Toad, Power Designer, Microsoft Access, AS400, TSQL, Business Objects, Siebel, QualityStage, DataStage, Crystal Reports, Tivoli, J2EE, JAVA & Teradata.

Confidential

Data Warehousing

Responsibilities:

  • Redesigned the Data Architecture to support Big Data and new Change Data Capture Process.
  • Implemented several new techniques to improve overall ETL performance.
  • Designed customized processes to support various business requirements.
  • Created a Data Governance framework that met the data objectives of the organization.
  • Performed Master Data additions, changes, and deletions in accordance with established procedures and business rules and responded to Master Data information requests.
  • In-depth understanding of Hadoop architecture and various components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and MapReduce concepts.
  • Experience in working with MapReduce programs using Apache Hadoop to analyze large data sets efficiently.
  • Hands-on experience with ecosystem tools such as Hive, Pig, Sqoop, MapReduce, Flume, and Oozie. Strong knowledge of Pig and Hive analytical functions, and of extending Hive and Pig core functionality by writing custom UDFs (an analytic-function sketch follows this list).
  • Experience in importing and exporting terabytes of data using Sqoop between HDFS and relational database systems.
  • Experience in job workflow scheduling and monitoring tools such as Oozie and ZooKeeper, in NoSQL databases such as HBase and Cassandra, and in administrative tasks such as installing Hadoop, commissioning and decommissioning nodes, and managing ecosystem components such as Flume, Oozie, Hive, and Pig.
  • Worked on Apache Spark for data lake creation for building the RWI (Real World Intelligence) application.
  • Created Spark Scala applications for data transformation using Spark RDDs, and used Spark SQL for SQL queries and to connect to the Hive metastore.
  • Implemented Data Governance and Data Security by developing a data dictionary and implementing a change-control framework to establish common data usage. The framework ensures taxonomy standards, enterprise data validation, and clear accountability for metadata in the data-owner community.
  • Created, amended, suspended, or deactivated client accounts to support Governance as a result of the KYCS (Know Your Client Sustainability) review.
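
For illustration, a minimal SQL sketch of the analytic-function pattern referenced above, keeping only the most recent change record per key, as used in change data capture and Hive-style deduplication; the table and columns (customer_change_log, customer_id, change_ts) are hypothetical.

    -- Sketch: retain the latest change record per customer key.
    SELECT customer_id, attribute_value, change_ts
      FROM (SELECT customer_id,
                   attribute_value,
                   change_ts,
                   ROW_NUMBER() OVER (PARTITION BY customer_id
                                      ORDER BY change_ts DESC) AS rn
              FROM customer_change_log) latest
     WHERE rn = 1;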

Environment: PeopleSoft, Oracle, Cognos, SQL Server, PL/SQL, SQL*Loader, Microstrategy, Windows, Korn Shell, Toad, Erwin, Microsoft Access, AS400, TSQL, Siebel, Informatica, Crystal Reports & Teradata, MapReduce, HDFS, Spark, Scala, Kafka, Hive, Pig, Spark Streaming, MongoDB, Maven, Jenkins, UNIX, Python, MRUnit, Git.

Confidential

Data Warehousing

Responsibilities:

  • Deployed as lead Business Objects developer to the first disaster handled by the new environment, providing on-site expertise, training, and custom report development for users in the field.
  • Worked with MDM team to achieve the MDM Vision by defining the tasks, specific Master data attributes and authoritative systems.
  • Integrated and implemented MDM, Business Intelligence and Data Warehousing Solutions.
  • Used Erwin for logical and physical design of the Star Schema.
  • Designed a custom reporting solution in Hyperion so that the Back Office team could specifically analyze and validate a particular dataset of interest.
  • Created Process Flow Diagrams, Data Flow Diagrams, and step-by-step user manual documents for the Back Office Business Process team.
  • Extracted data from various source systems like Azero, Verizon into DSS System for Data Analysis and Integration.
  • Performed Data Governance and monitoring in accordance with the application/environment standards and methodology.
  • Provided strategic planning and process re-engineering efforts for Governance improvement, performance management, and the development of operational KPIs as well as operational and tactical dashboards.
  • Created Informatica ETL processes to look up master data, such as the most recent VIN transaction, and load detail tables such as Retail VIN.
  • Worked with various Subscription data like Entune, Enform, Safety Connect and Connected Services to find overlap and gaps between Subscriptions.
  • Designed reports that validate the correct Wholesale record for a particular VIN and verify that every Retail VIN record has a valid Wholesale record (an anti-join sketch follows this list).
  • Worked with the major functionalities of Business Objects such as alerts, drill filters, breaks, sorts, ranking, and calculations using the slice and dice panel.
  • Created a process that would track the order of the Subscriptions of an account and load the transactions into a custom Table.
  • Performed statistical data modeling and worked with statistical programming languages for data analysis and data mining.
  • Involved in troubleshooting Microstrategy Prompt, filter, template, consolidation and custom group objects in Data Warehouse.
  • Created several stored procedures, custom tables, and parameter tables for migrating several business processes, such as the F&A process, from Microsoft Access to the Decision Support System in Hyperion.
  • Created a Recycle and Reject VIN reconciliation process that identifies the reason a VIN is in the Recycle or Reject group.
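
For illustration, a minimal SQL sketch of the wholesale/retail validation described above; the tables retail_vin and wholesale_vin are hypothetical stand-ins for the actual detail tables.

    -- Sketch: list retail VIN records that have no matching wholesale record.
    SELECT r.vin, r.retail_sale_date
      FROM retail_vin r
     WHERE NOT EXISTS (SELECT 1
                         FROM wholesale_vin w
                        WHERE w.vin = r.vin);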

Environment: PeopleSoft, Oracle, Cognos, SQL Server, PL/SQL, SQL*Loader, Microstrategy, Windows, Korn Shell, Toad, Power Designer, Microsoft Access, AS400, TSQL, Business Objects, Siebel, Informatica, Crystal Reports, Tivoli, J2EE, JAVA & Teradata.

Confidential

Data Warehousing

Responsibilities:

  • Visited several hospitals and interacted with business and technical people to gather requirements.
  • Developed Process Flow Diagrams, which show the relationship between business processes and data.
  • Performed universe design with best practices for design and training, scalability, and optimization of universes through modification of contexts and creation of variables and formulas; supported existing universes and fixed deviations.
  • Used ER-Diagrams and Documentation to gather requirements completely.
  • Worked with several Payer Companies to resolve and balance payments.
  • Interacted with various teams and Business Analysts to understand the other applications which are feeding the Data Warehouse.
  • Held meetings with end user and Business Analysts on regular basis.
  • Performed logical and physical design of the Data Warehouse using Erwin.
  • Designed, coded, tested, and implemented the Medicaid Fee Schedule process for NYU and Beth Israel Cancer Centers to capture the true cost of drugs after applying discounts and rebates.
  • Created PL/SQL stored procedures using cursor data structures for building the ETL process (a cursor-based sketch follows this list).
  • Designed and developed universes and reports in Business Objects (IDT); created hierarchies, complex objects using various @Functions, and aggregates.
  • Wrote several PL/SQL stored procedures to satisfy the business requirements.
  • Designed and built Informatica Mappings for Extract Transform and Load Process.
  • Worked with hospitals and users of various healthcare application products on installation, implementation, and testing, and resolved any issues they encountered while using the products.
  • Created reports that track the inventory of drugs available in the hospital after drug purchases and usage. These reports are used by the hospitals to make sure they are following the 340B program rules of that state.
  • Created an application that tracks the reimbursement rates for Confidential t charges for various insurers such as BlueCross, Aetna, Cigna, Medicaid, and Medicare at the unit and account level.
  • Created a data feed for the EPIC project that provides Charges and Encounters at the unit level.
  • Created a cube that provides the outcomes of various treatments for Confidential ts, thereby capturing trend analysis for each type of treatment and drug.
  • Analyzed the data and created complex reports that provide the Confidential ts referred by a particular referring MD by location and the charges associated with the Confidential ts.
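
For illustration, a minimal cursor-based PL/SQL sketch in the spirit of the fee schedule and ETL procedures above; the tables and columns (drug_purchase, drug_contract, drug_net_cost) are hypothetical.

    -- Sketch: compute the net cost of each drug purchase after discounts and rebates.
    CREATE OR REPLACE PROCEDURE load_drug_net_cost AS
      CURSOR c_purchases IS
        SELECT p.purchase_id, p.drug_id, p.gross_cost,
               NVL(d.discount_pct, 0) AS discount_pct,
               NVL(d.rebate_pct, 0)   AS rebate_pct
          FROM drug_purchase p
          LEFT JOIN drug_contract d ON d.drug_id = p.drug_id;
    BEGIN
      FOR rec IN c_purchases LOOP
        INSERT INTO drug_net_cost (purchase_id, drug_id, net_cost)
        VALUES (rec.purchase_id,
                rec.drug_id,
                rec.gross_cost * (1 - rec.discount_pct / 100)
                               * (1 - rec.rebate_pct / 100));
      END LOOP;
      COMMIT;
    END load_drug_net_cost;
    /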

Environment: EPIC, SAP, Oracle 9i/10g, Cognos Series 7 - Cognos Impromptu, SQL Server 2008, PL/SQL, SQL*Loader, PowerDesigner, Windows 2000/XP, Korn Shell, S, Siemens Invision, Microstrategy, AS400, Siemens DSS, Oracle, Healthcare Query, TSQL, Business Objects, Access, Informatica, Crystal Reports & Microsoft Access.

Confidential

Data Warehousing

Responsibilities:

  • Performed data modeling by gathering specifications and requirements, interacting with end users, and performing extensive business analysis.
  • Worked on multiple Data warehousing projects involving separation and integration of Boston Scientific and Confidential data.
  • Performed analysis, design, coding, testing, implementation, and production support for the Company Code Duplication project.
  • Wrote multiple Oracle stored procedures and PL/SQL Code to enhance and implement new ETL Processes.
  • Worked extensively with Oracle 9i and Informatica 7.1.
  • Designed, coded and implemented Informatica Mappings as a part of solution implementation for NAM Reporting.
  • Worked with Cognos ReportNet for OLAP reporting.
  • Worked extensively with the Control-M scheduling tool to schedule jobs and understand the process flow.
  • Worked with UNIX Korn shell scripts to FTP and copy files between systems.

Environment: Informatica Power Center 7.1/8.1.1, Oracle 9i/10g, Cognos Series 7 - Cognos Impromptu, PL/SQL, SQL*Loader, Erwin 3.5.5, Siebel, Vertica, Windows 2000/XP, Korn Shell & MongoDB.

Confidential

Data Warehousing

Responsibilities:

  • Designed and Developed EDW for sales, service, marketing and finance using Star Schema.
  • Gathered requirements from end users and business and converted them to business rules.
  • Created Process Flow Diagrams and ER diagrams using Erwin.
  • Designed the ETL process for various extracts and scheduled batch process to trigger the jobs.
  • Wrote stored procedures to implement the business logic and load the data into the Dimension and Fact tables (a MERGE-based sketch follows this list).
  • Worked with Siebel Data Model and Extracted data from Siebel Source System and integrated to the Enterprise Data Warehouse.
  • Performed OLAP Data Warehouse reporting using Cognos ReportNet and Business Objects.
  • Worked extensively with Framework Manager, Query Studio and Report Studio.
  • Performed Cognos ReportNet configuration.
  • Used Informatica and DecisionStream to implement the ETL process.
  • Performed unit, system, and parallel testing before successfully implementing in production.
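
For illustration, a minimal SQL sketch of a dimension-load step of the kind implemented in the stored procedures above; dim_customer and stg_customer are hypothetical names.

    -- Sketch: upsert a customer dimension from a staging table before the fact load.
    MERGE INTO dim_customer d
    USING stg_customer s
       ON (d.customer_key = s.customer_key)
    WHEN MATCHED THEN
      UPDATE SET d.customer_name = s.customer_name,
                 d.region        = s.region
    WHEN NOT MATCHED THEN
      INSERT (customer_key, customer_name, region)
      VALUES (s.customer_key, s.customer_name, s.region);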

Environment: Informatica Power Center 7.1/6.2/5.1, Sun Solaris 2.7, Oracle 10g/9i, SQL Server 2000, IMS, DB2, UDB, VSAM, Cognos Series 7 - Cognos Impromptu, Agent, Siebel, Architect, Intelligence Server, OWB, Teradata, Narrowcast Server, PL/SQL, SQL*Loader, Erwin 3.5.5, Windows 2000/XP, DecisionStream, Vertica.

Confidential

Data Integration

Responsibilities:

  • Interacted with various teams and Business Analysts to understand the other applications which are feeding the Data Warehouse.
  • Gathered requirements for ongoing building of Data Mart.
  • Trained and assisted end users on reports.
  • Provided technical support for the existing data warehouse.
  • Interacted with business and customers to understand their needs and provided support.
  • Performed data integration between DB2 and Oracle 9i using ETL processes.
  • Developed various ETL processes for complete end-to-end data integration.
  • Used Cognos for OLAP reporting.
  • Used Informatica and DataStage for the ETL process.
  • Developed Shell Scripts for automation of various processes.
  • Performed system, integration, and parallel testing for various ETL processes.

Environment: Informatica Power Center 7.1/6.2/5.1, Sun Solaris 2.7, Oracle 9i, SQL Server 2000, IMS, DB2, UDB, VSAM, Cognos Series 7 - Cognos Impromptu, Agent, Architect, Intelligence Server, OWB, Teradata, Narrowcast Server, PL/SQL, SQL*Loader, Erwin 3.5.5, Windows 2000/XP, DataStage & Vertica.

Confidential

Data Warehousing

Responsibilities:

  • Performed data modeling by gathering user requirements and specifications.
  • Interacted with end users and the business to gather new requirements for the existing data warehouse.
  • Gathered business requirements and converted those requirements into a model.
  • Developed Process Flow Diagrams, which show the relationship between business processes and data.
  • Used ER-Diagrams and Documentation to gather requirements completely.
  • Interacted with various teams and Business Analysts to understand the other applications which are feeding the Data Warehouse.
  • Held meetings with end user and Business Analysts on regular basis.
  • Performed logical and physical design of the Data Warehouse using Erwin 3.5.2.
  • Designed Star schema to implement Data warehouse.
  • Made suggestions to the end client on selecting the software and tools needed to implement the Data Warehouse.
  • Interacted and worked with vendors such as Informatica to obtain the needed specifications and install the tool.
  • Upgraded Informatica tool to the current version.
  • Used Cognos Reportnet for OLAP Reporting.
  • Interacted with end users and gathered requirements for Impromptu Reports.
  • Wrote functional and technical specifications for Impromptu Reports.
  • Helped end users solve their technical issues in generating reports.
  • Worked on Informatica Power Center 6.2/7.1 tools - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
  • Identified business rules for data migration, parsing high-level design spec to simple ETL coding and mapping standards.
  • Used database objects such as materialized views, sequence generators, parallel partitioning, and stored procedures to handle complex logic and memory management (a sketch follows this list).
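
For illustration, a minimal Oracle sketch of the database objects mentioned above; the object names (dim_product_seq, mv_sales_by_month, fact_sales) are hypothetical.

    -- Sketch: a sequence for surrogate keys and a materialized view that
    -- pre-aggregates a fact table for reporting.
    CREATE SEQUENCE dim_product_seq START WITH 1 INCREMENT BY 1 CACHE 100;

    CREATE MATERIALIZED VIEW mv_sales_by_month
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT product_key,
             TRUNC(sale_date, 'MM') AS sale_month,
             SUM(sale_amt)          AS total_amt
        FROM fact_sales
       GROUP BY product_key, TRUNC(sale_date, 'MM');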

Environment: Informatica Power Center 7.1/6.2/5.1, Sun Solaris 2.7, Oracle 9i, SQL Server 2000, IMS, DB2, UDB, VSAM, Microstrategy 6.0 Administrator, Agent, Architect, Intelligence Server, OWB, Teradata, Narrowcast Server, PL/SQL, SQL*Loader, Erwin 3.5.5, Windows 2000/XP, Cognos Series 7 - Cognos Impromptu, Impromptu Web Reports, Power Play, Power Play Transformer, Power Play Enterprise Server (PPES), Upfront, ReportNet Architect, Power Prompts, Cognos Query, Scheduler, Access Manager & Oracle 9i.

Confidential, CHICAGO

Data Warehousing

Responsibilities:

  • Performed data modeling by gathering user requirements and specifications.
  • Visited several hospitals and interacted with business and technical people to gather requirements.
  • Developed Process Flow Diagrams, which show the relationship between business processes and data.
  • Held meetings with end users and the business on a daily basis and gathered requirements.
  • Made suggestions to the end client on selecting the software and tools needed to implement the Data Warehouse.
  • Interacted with various teams and Business Analysts to understand the other applications which are feeding the Data Warehouse.
  • Interacted and worked with vendors such as Informatica to obtain the needed specifications and install the tool.
  • Used Oracle warehouse builder for ETL process.
  • Performed logical and physical design of the warehouse using Erwin 3.5.2.
  • Developed a hybrid model of Star and Snow Flake schema that satisfied all the user requirements.
  • Performed the ETL process with DTS and Informatica.
  • Helped end users solve their technical issues in generating reports.
  • Worked extensively with Informatica Power Center, Informatica Power Connect, Informatica Power Analyzer.
  • Performed OLAP reporting using Informatica Power Analyzer.
  • Worked extensively with Business Objects for OLAP reporting.
  • Worked with Microsoft Analysis Services for OLAP cube generation.
  • Performed database builds for the Centricity 7.7 application in SQL Server 2000.
  • Performed data migration from the Ellipse application to SQL Server 2000.
  • Managed AUTOSYS scheduling in Distributed Systems/Environments by creating Job definition, Advanced Scheduling including complex job dependencies and conditional Job Flow.
  • Performed extensive VB scripting.
  • Worked extensively with ClearCase for version control.
  • Performed labeling, branching, and merging in ClearCase.
  • Responsible for designing the universe by creating the Business Objects data model: selecting/joining tables, indicating cardinalities, creating aliases to resolve the loops, subdividing into contexts, and creating the objects, which are grouped into classes (an alias-join sketch follows this list).
  • Designed Cubes with Slice & Dice and Drill Down operations.
  • Created the reports (master/detail, cross-tab, chart templates, and outline reports) in Business Objects and Web Intelligence using the universes as the main data providers and writing complex queries, including sub-queries.
  • Generated the reports for web users using Web Intelligence 2.x InfoView. Reports were created using the Web Intelligence Explorer and Reporter modules.
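
For illustration, a minimal SQL sketch of how an alias resolves a join loop of the kind handled in the universe design above; the tables (orders, returns, calendar) are hypothetical.

    -- Sketch: ORDERS and RETURNS both reference CALENDAR, so each join uses its
    -- own alias (oc, rc) instead of forming a loop through a single CALENDAR join.
    SELECT o.order_id,
           oc.calendar_date AS order_date,
           rc.calendar_date AS return_date
      FROM orders o
      JOIN calendar oc ON oc.date_key = o.order_date_key
      JOIN returns  r  ON r.order_id  = o.order_id
      JOIN calendar rc ON rc.date_key = r.return_date_key;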

Environment: Informatica Power Mart 4.7/5.0/5.1/6.1, Informatica Power Center 1.7/5.0/5.1/6.1/6.2, Informatica Power Connect, Informatica Power Analyzer, Erwin 3.5.2, DB2, Teradata, COBOL files, ClearCase, VB Script, IMS, Flat files, Oracle 8i/9i, SQL Server 7.0/2000, Sybase, ER/Studio 3.5/4.0, SQL, PL/SQL, Transact-SQL, Autosys, ASP 2.0, Business Objects 5.1/6.1, Sun Solaris 2.6, HP-UX, Windows NT 4.0.

Confidential, CHICAGO

Data Warehousing

Responsibilities:

  • Performed data modeling by gathering specifications and requirements, interacting with end users, and performing extensive business analysis.
  • Designed star schema.
  • Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations. Involved in the development of Informatica mappings and also tuned them for better performance.
  • Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data (a sketch follows this list).
  • Performed OLAP reporting using Informatica Power Analyzer.
  • Validated Informatica mappings.
  • Performed unit testing on Informatica mappings.
  • Managed metadata using repository.
  • Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, and Sequence Generator.
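
For illustration, a minimal PL/SQL sketch of a business-rule function of the kind called from an Informatica mapping via a Stored Procedure transformation; the function name and rules are hypothetical.

    -- Sketch: derive an account status code from balance and days past due.
    CREATE OR REPLACE FUNCTION derive_account_status (
      p_balance   IN NUMBER,
      p_days_late IN NUMBER
    ) RETURN VARCHAR2 IS
    BEGIN
      IF p_days_late > 90 THEN
        RETURN 'DELINQUENT';
      ELSIF p_balance <= 0 THEN
        RETURN 'CLOSED';
      ELSE
        RETURN 'ACTIVE';
      END IF;
    END derive_account_status;
    /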

Environment: Informatica Power Mart 4.7/5.0/5.1/6.1, Informatica Power Center 1.7/5.0/5.1/6.1, Informatica Power Connect, Informatica Power Analyzer, Erwin 3.5.2, DB2, Teradata, COBOL files, VB Script, IMS, Flat files, SQL Server 7.0/2000, SQL, PL/SQL, Transact-SQL, ASP 2.0, Sun Solaris 2.6, HP-UX & Windows NT 4.0.
