Lead Data Architect Resume Profile
Professional Summary
- Over 12 years of IT experience in analysis, design, development, data conversion, Data Warehousing, Business Intelligence, testing, documentation, and implementation of client/server applications.
- 11 years of strong experience in ETL and Business Intelligence using Informatica PowerMart 4.7/5.1, Informatica PowerCenter 1.7/5.1/6.1/6.2.2/7.1.4/8.1.1/8.5.1/8.6.1/9.0.1, DataStage 7.5.1, Business Objects 5.0/5.1, Cognos 6.0, OLAP, and Data Transformation Services, along with 4 years of experience as a Sr. Data Modeler using data modeling tools Erwin 3.5/4.1/7.x/9.x. Data Warehouse professional specializing in the development of Data Warehouse and Business Intelligence architectures involving data integration and the conversion of data from multiple sources and platforms.
- Strong working experience in the data analysis, design, development, implementation, and testing of Data Warehouses, Data Marts, and IODS using data conversion, data extraction, data transformation, and data loading (ETL).
- Good experience with DataStage.
- Experience with BI reporting tools including Cognos Impromptu, Transformer, PowerPlay, Impromptu Web Reports, Business Objects 5.0, and MicroStrategy.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using data modeling tools such as Erwin 3.5/4.1/7.3, ER/Studio, and HPDM.
- Extensively used Informatica Repository Manager, PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Admin Console.
- Expert in design and development of complex ETL mappings and scripts using Informatica PowerCenter.
- Hands-on experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Extensively worked with IBM Information Server/DataStage client components such as DataStage Designer, Director, Manager, and Administrator.
- Working experience with dimensional modeling techniques such as Star Schema and Snowflake Schema.
- Experienced in the analysis, design, development, implementation, and testing of client/server applications using SQL, PL/SQL, SQL*Plus, procedures, functions, triggers, and packages in Oracle and SQL Server.
- Experience in interfacing data from legacy systems to Oracle applications using SQL*Loader; experienced in writing SQL statements and PL/SQL code in Oracle (a minimal SQL*Loader wrapper sketch follows this summary).
- Experience in building Business Objects Universes and using various data providers such as Universes, Stored Procedures, Free-Hand SQL, and Personal Data Files for retrieving data and creating simple, complex, and ad hoc reports.
- Strong logical, analytical, and communication skills with solid business acumen; able to effectively gather and refine report requirements from users and management.
- Strong Team Management skills and excellent team player.
- Led development teams of 5 to 8; good at effort estimation, task distribution among team members, and overall project coordination.
- Experience in a wide range of business domains: Government, Telecom, Healthcare, Financial, Banking, and Insurance.
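The SQL*Loader experience summarized above usually comes down to a control file plus a small wrapper script. Below is a minimal sketch, assuming a hypothetical comma-delimited legacy extract and a hypothetical Oracle staging table STG_CUSTOMER; the connect string and file names are placeholders, not objects from any project listed here.

```sh
#!/bin/sh
# Minimal SQL*Loader wrapper: load a delimited legacy extract into an Oracle staging table.
# STG_CUSTOMER, customers.dat and the connect string are illustrative placeholders.

CONNECT="scott/tiger@ORCL"      # replace with real credentials / TNS alias
CTL_FILE=stg_customer.ctl

# Control file describing the layout of the incoming flat file.
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE STG_CUSTOMER
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  CUSTOMER_ID,
  CUSTOMER_NAME,
  STATE_CD,
  LOAD_DT    "SYSDATE"
)
EOF

# Run the load; the log and bad files are reviewed whenever the return code is non-zero.
sqlldr userid="$CONNECT" control="$CTL_FILE" \
       log=stg_customer.log bad=stg_customer.bad errors=50
RC=$?
if [ $RC -ne 0 ]; then
  echo "SQL*Loader finished with return code $RC - check stg_customer.log" >&2
fi
exit $RC
```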
TECHNICAL SKILLS
Data Warehousing | Informatica PowerMart/PowerCenter 5.1.2/6.x/7.1.4/8.1.1/8.5.1/8.6.1/9.0.1 (Source Analyzer, Designer, Mapping Designer, Mapplet, Transformations), B2B Data Transformation 9.0.1, Power Exchange 9.0.1, Data Quality, Data Profiling, DataStage 7.5.1, Data Junction, Metadata Manager, Data Analyzer, Data Integrator 9.2.3 (Pervasive) |
Reporting, Bug Reporting & Version Control Tools | Business Objects 5.0/5.1 (Business Designer, Reports, Supervisor, Business Query, Web Intelligence and Broadcast Agent Server), SuperGlue, Cognos Impromptu, Transformer, PowerPlay, MicroStrategy 7.5, ER/Studio, Erwin, SQL*Loader, Data Transformation Services, SQR, BRIO, Crystal Reports 8.5, Informatica Metadata Bridge, SQL Navigator 4.5.1.538, TOAD, STRIVA, Control-M, ClearCase, ClearQuest, Test Director, CVS, Mercury Quality Center 9.0, Aginity 4.1.1 |
O/S | Windows 95/98/NT/2000/XP, MS-DOS, Unix, Linux, Sun Solaris, IBM-AIX 5 |
Languages | PL/SQL, SQL, Transact-SQL, UNIX shell scripting, Java, ASP, C |
Databases | Oracle 7.0/8.0/8i/9i/10g, Teradata 12.0, DB2, SQL Server 6.5/7.0/2000, Sybase 11.5, MS Access 97/2000, Netezza 7.2 |
GUI | Visual Basic 5.0/6.0, Forms 4.5/5, Reports 2.5/3, Visual InterDev 6.0, ASP 2/3, VBScript, JavaScript, IIS 4.0 |
Web Skills | IIS, ASP 2/3, VBScript, JavaScript, XML, HTML, DHTML, Photoshop 6.0, Flash 5.0/MX, Dreamweaver 4.0/MX |
PROJECTS
Lead Data Architect
Confidential
Magellan Health Services is a leading, diversified specialty health care management organization.
PDW is the Pharmacy Data Warehouse. The goal of Magellan's Pharmacy Data Warehouse (PDW) is to provide a centralized, high-performance repository that meets information needs across the organization. When paired with an Enterprise Business Intelligence suite, the PDW provides a consistent version of the truth to the business, so reports and analytics can be consistent and their results trusted.
Responsibilities:
- Working closely with ETL team members to resolve ETL load performance issues and business-related questions.
- Extensively using HPDM and Erwin to model both the atomic and dimensional warehouse models.
- Defined modeling naming standards and best practices for the modeling team to use in the data models as well as in the DDLs/DMLs when creating new data elements and adding attributes.
- Identifying ETL specifications based on Business Requirements and creating ETL Mapping Documents.
- Developed Conceptual, Logical, and Physical database models for PDW Phase-2 assurance applications and forward-engineered the DDLs.
- Deriving the required Fact and Dimension tables to bring additional data elements from the First RX source system into the PDW as per user requirements.
- Interacting with business users to identify process metrics and key dimensions and measures; involved in the full life cycle of the project.
- Identifying ETL specifications based on Business Requirements/Mapping Documents; formulating and documenting the ETL process design. Participated in and led research and development efforts (proofs of concept, prototypes) with new technologies as a subject matter expert.
- Reviewed new and existing systems design projects and procurement plans for compliance with standards and architectural plans. Ensured that solutions across platforms adhere to the enterprise architecture road map and support both short-term tactical plans and long-term strategic goals.
- Ensured technology solutions in production are monitored and assessed for performance and sustainability through periodic health checks such as capacity planning and incident and problem reviews.
- Responsible for planning, designing, and developing an Informatica ETL solution using Informatica objects such as Mappings and Mapplets.
- Worked with business owners, analysts, solution engineers, development teams and infrastructure services to communicate application and data architectures.
- Provided expert advice, counsel, and technical expertise to the project team to help ensure that Informatica solutions are designed and developed optimally and in accordance with industry and Informatica best practices.
- Produced detailed design and low-level design documents and was involved in code/document reviews. Participated in the maintenance and enhancement of the application and performed bug fixes.
- Used XMLSpy to edit, view, and create XML schema definitions.
- Designed and developed ETL mappings to extract, transform, and load data from source to target using Informatica PowerCenter 9.5.1.
- Worked extensively with different types of transformations, including Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Joiner, and Sequence Generator.
- Tested all business application rules with live test data; automated and monitored the sessions using Workflow Manager and Workflow Monitor.
- Created, launched, and scheduled workflows/sessions. Involved in performance tuning of mappings and sessions. Automated the Informatica load process using UNIX shell scripts (a minimal wrapper sketch follows this project entry).
- Involved in Creating Control-M jobs to automate and schedule the Informatica workflows.
- Managed timely written and verbal communication on project progress, load status, issue tracking, and concerns for management attention and/or escalation. Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
Environment: Netezza 7.2, Oracle 10g, SQL Server 2008, Erwin 7.3.3, Informatica PowerCenter 9.5.1, Aginity Workbench 4.1.1, Toad 11.6.0.43, SVN 1.8
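As noted in the bullet on automating the Informatica process with UNIX shell scripts, the daily kick-off typically reduces to a pmcmd wrapper that the scheduler (Control-M here) calls. A minimal sketch, assuming hypothetical domain, Integration Service, folder, and workflow names; the exact pmcmd flags should be verified against the installed PowerCenter release.

```sh
#!/bin/sh
# Minimal pmcmd wrapper: start an Informatica workflow, wait for it, and propagate failure
# to the scheduler. All domain/service/folder/workflow names are illustrative placeholders.

INFA_DOMAIN=Domain_PDW
INFA_IS=IS_PDW                  # Integration Service name
INFA_USER=etl_batch
INFA_PWD="$INFA_BATCH_PWD"      # assumed to be exported securely by the scheduler
FOLDER=PDW_CLAIMS
WORKFLOW=wf_load_claims_daily

# -wait blocks until the workflow finishes so Control-M sees the true exit status.
pmcmd startworkflow -sv "$INFA_IS" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
RC=$?

if [ $RC -ne 0 ]; then
  echo "`date` $WORKFLOW failed with pmcmd return code $RC" >&2
else
  echo "`date` $WORKFLOW completed successfully"
fi
exit $RC
```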
Lead ETL - Informatica Developer
Confidential
CareFirst, Inc. is the not-for-profit, non-stock, parent company of CareFirst of Maryland, Inc., and Group Hospitalization and Medical Services, Inc., affiliates that do business as CareFirst BlueCross BlueShield.
LDS: The purpose of the LDS project is to load active members as of the system date 30th day on a daily basis from Facets tables. This serves as a single source for the entire letter generation process, reducing the complexity of fetching data from multiple tables to create letters to subscribers, user reports, and quarterly reports.
Silver Link Application: The purpose is to automatically insert or update Coordination of Benefits (COB) fields when subscribers indicate they have no medical insurance. Silver Link is the vendor contracted to make automated calls to select members to determine whether other medical insurance applies, in lieu of mailing out COB questionnaires. CareFirst sends an annual report that includes active membership to Silver Link to make random COB calls. Silver Link captures COB information via its Voice Response Unit and provides CareFirst two weekly files, one with the Yes responses and one with the No responses. The file with No responses shall automatically insert or update specified Facets tables/columns.
Responsibilities:
- Participating in the entire requirements engineering process, from requirements elicitation through requirements documentation.
- Involved in designing and developing the architecture for all data warehousing components, e.g., tool integration strategy, source system data ETL strategy, data staging, movement and aggregation, information and analytics delivery, and data quality strategy.
- Involved in driving the solution approach to data schema development, business rules, data design, and migration processes for the team. Also provided standards, guidelines, processes, and expertise to consistently address recurring data issues, including data convergence, data standards, and data synchronization.
- Implementing ETL standards and Best Practices while naming Maps and processes.
- Handled a team of 5; performed effort estimation and task distribution for the team members.
- Coordinated and communicated the overall project execution status with Project Manager and Technical Delivery Manager
- Developing complex maps to extract data according to the Transformation Business Rules and User requirements.
- Modified several of the existing maps and created several new maps based on the user requirement.
- Creating Workflows and using various tasks such as Email, Control, Decision, and Session in the Workflow Manager.
- Worked with Informatica PowerCenter client tools like Source Analyzer, Warehouse Designer and Mapping designer.
- Developed new ETL mappings according to the requirements using Informatica PowerCenter 8.6.1.
- Involved in creating Workflows and Worklets and configuring Sessions; used the Informatica pmcmd command in UNIX scripts.
- Debugged code and tested and validated data after processes ran in development, according to business rules (a minimal validation sketch follows this project entry).
- Assisted QA Team to fix and find solutions for the production issues.
- Coding, debugging, and resolving day-to-day technical problems; extensive experience in performance tuning, identifying and fixing bottlenecks and tuning complex mappings for better performance.
- Working closely with all team members to resolve technical issues and business-related questions.
- Involved in automating the Informatica process by using Unix Shell Script.
- Managed timely written and verbal communication on project progress, load status, issue tracking, and concerns for management attention and/or escalation. Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
Environment: Oracle 11g, Flat Files, SQL Server 2008, Informatica PowerCenter 8.6.1, Data Profiling, Toad 9.7.2, SQL, PL/SQL, SecureCRT, Mainframe JCLs, Unix Shell Scripts, Quality Center 9.0, Sun Solaris 10.
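The post-run data validation mentioned above was usually a quick source-to-target reconciliation. Below is a minimal sketch using a SQL*Plus here-document; the table names, columns, and connect string are hypothetical placeholders, not actual Facets or LDS objects.

```sh
#!/bin/sh
# Minimal post-load reconciliation: compare staging vs. target counts and spot-check a rule.
# Connection and table names are illustrative placeholders only.

CONNECT="lds_user/secret@LDSDB"   # replace with real credentials / TNS alias

sqlplus -s "$CONNECT" <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF VERIFY OFF HEADING OFF
-- Row counts should match once the daily load has finished.
SELECT 'STG rows: ' || COUNT(*) FROM stg_active_member;
SELECT 'TGT rows: ' || COUNT(*) FROM lds_active_member;
-- Spot-check one business rule: no active member should be missing a subscriber id.
SELECT 'TGT null subscriber ids: ' || COUNT(*)
  FROM lds_active_member
 WHERE subscriber_id IS NULL;
EXIT
EOF
```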
Lead Data/ETL Architect
Confidential
CMS is the Centers for Medicare & Medicaid Services, the federal agency responsible for administering the Medicare, Medicaid, CHIP (Children's Health Insurance Program), HIPAA, and several other health-related programs. The NDW is responsible for providing user-friendly access to CMS 1-800-MEDICARE information. NDW users are able to leverage a flexible, user-friendly, web-based environment to produce reports for data analysis. The NDW collects all data from the CMS help line operational system into a single data warehouse and then further organizes it into smaller data marts based on different subject areas, as per business requirements.
Responsibilities:
- Interacted with business analysts and developers to analyze the user requirements, functional specifications and system specifications.
- Proficiently worked on Conceptual, Physical, and Logical data models (3NF) using data modeling tools such as Erwin and MS Visio, with a strong understanding of data warehousing principles: fact tables, dimension tables, star and snowflake schema modeling, foreign key concepts, and referential integrity.
- Identified and tracked slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions (a Type 2 handling sketch follows this project entry).
- Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit
- Worked on query optimization and performance tuning using Execution Plan and Performance Monitor.
- Identified ETL specification based on Business Requirements/Mapping Document and formulate process design.
- Implemented ETL standards and Best Practices while naming Maps and processes.
- Worked closely with all team members to resolve technical issues and business-related questions.
Environment: Oracle 11g, Flat Files, Erwin r7.3.3, Informatica PowerCenter 8.6.1, Metadata Manager, Data Profiling, Toad 9.7.2, SQL, PL/SQL, WinSCP 4.3.4, Unix Shell Scripts, Quality Center 9.0, Sun Solaris 10.
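The slowly changing dimension handling called out above was implemented in Informatica mappings; the script below is only an equivalent sketch of the Type 2 expire-and-insert pattern, run through a SQL*Plus here-document. The dimension, staging table, and column names are hypothetical.

```sh
#!/bin/sh
# Sketch of SCD Type 2 handling: expire the changed dimension row, then insert a new version.
# The real loads ran as Informatica mappings; all object names here are hypothetical.

CONNECT="ndw_user/secret@NDWDB"   # replace with real credentials / TNS alias

sqlplus -s "$CONNECT" <<'EOF'
-- 1. Expire current rows whose tracked attributes changed in staging.
UPDATE dim_provider d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_provider s
                WHERE s.provider_id = d.provider_id
                  AND (s.provider_name <> d.provider_name
                       OR s.specialty_cd <> d.specialty_cd));

-- 2. Insert a new current version for new or changed providers.
INSERT INTO dim_provider
       (provider_key, provider_id, provider_name, specialty_cd,
        eff_start_dt, eff_end_dt, current_flag)
SELECT dim_provider_seq.NEXTVAL, s.provider_id, s.provider_name, s.specialty_cd,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_provider s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_provider d
                    WHERE d.provider_id  = s.provider_id
                      AND d.current_flag = 'Y');

COMMIT;
EXIT
EOF
```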
Lead Data Architect
Confidential
The Department of Education (DOE) exemplifies energetic leadership and innovative products and services to improve public education, library services, and rehabilitation services. The objective of the Debt Management and Collections System (DMCS2) replacement project is to improve the quality, accuracy, and precision of data contained in and shared between Federal Student Aid's systems.
Responsibilities:
- Interacted with business users to identify process metrics and key dimensions and measures; involved in the full life cycle of the project.
- Developed the FRD (Functional Requirements Document) and data architecture document and communicated them to the concerned stakeholders. Conducted impact and feasibility analysis.
- Derived and designed the required Fact and Dimension tables to accommodate the data as per user requirements (a minimal fact/dimension DDL sketch follows this project entry).
- Developed Logical and Physical database models to design OLTP system for insurance applications and forward engineered the DDLs
- Identified ETL specifications based on Business Requirements/Mapping Document.
- Defined ETL standards and Best Practices for the ETL team to use in ETL maps and processes
- Extensively used Data Integrator 9.2.3 (Pervasive) tools (Map Designer, Process Designer, Schema Designer, Repository Explorer, XML Design Repository) to extract data from sources such as Flat Files and SQL Server into targets (SQL Server and XML files).
- Conducted team meetings to gather status from each team member, prepared meeting minutes, and ensured that everyone on the team remained effective in meeting project deadlines.
Environment: Oracle 10g, SQL Server 2008, Erwin 7.3.3, Informatica PowerCenter 8.6.1, Data Integrator 9.2.3 (Pervasive), Toad 9.6.1, SQL, PL/SQL, Unix Shell Scripts, Control-M, ClearCase, Sun Solaris 9.0.
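The fact and dimension tables referenced above were forward-engineered from the Erwin model into DDL. Here is a minimal sketch of that pattern (one dimension plus one fact joined by a surrogate foreign key), wrapped in a SQL*Plus call; every object name is a hypothetical placeholder rather than an actual DMCS2 table.

```sh
#!/bin/sh
# Sketch of the forward-engineered star-schema DDL pattern: one dimension and one fact.
# All table and column names are hypothetical placeholders.

CONNECT="dmcs_user/secret@DMCSDB"   # replace with real credentials / TNS alias

sqlplus -s "$CONNECT" <<'EOF'
CREATE TABLE dim_borrower (
    borrower_key   NUMBER        NOT NULL,
    borrower_id    VARCHAR2(20)  NOT NULL,
    borrower_name  VARCHAR2(100),
    state_cd       VARCHAR2(2),
    CONSTRAINT pk_dim_borrower PRIMARY KEY (borrower_key)
);

-- date_key would reference a date dimension, omitted here to keep the sketch short.
CREATE TABLE fact_collection (
    collection_key NUMBER        NOT NULL,
    borrower_key   NUMBER        NOT NULL,
    date_key       NUMBER        NOT NULL,
    collected_amt  NUMBER(12,2),
    CONSTRAINT pk_fact_collection PRIMARY KEY (collection_key),
    CONSTRAINT fk_fact_borrower
        FOREIGN KEY (borrower_key) REFERENCES dim_borrower (borrower_key)
);
EXIT
EOF
```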
Lead ETL - Informatica Developer
Confidential
CMS is the Centers for Medicare & Medicaid Services, the federal agency responsible for administering the Medicare, Medicaid, CHIP (Children's Health Insurance Program), HIPAA, and several other health-related programs. The main aim of the Beneficiary (BENE) Consolidation Process is to migrate all beneficiary-related data into the Integrated Data Repository (IDR). The objective of the consolidated BENE project is to combine beneficiary Medicare and Medicaid information. This combined Medicare and Medicaid data will be the primary beneficiary data source for various applications.
Responsibilities:
- Involved in understanding requirements and analyzing new and current systems to quickly identify required sources and targets.
- Responsible for requirement gathering, data analysis, design, development, testing and systems implementation
- Identifying ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
- Implemented ETL standards and Best Practices while naming Mappings, Transformations, links, Sessions, Workflow names, file names etc.
- Extensively using Informatica client tools to extract data from different sources such as Flat Files, Teradata, and Oracle into targets (Oracle and Teradata).
- Removed the repository server on Windows and installed Informatica 8 on UNIX.
- Imported all the mappings and workflows to the new repository server on UNIX.
- Involved in administration tasks including upgrading Informatica and importing/exporting mappings.
- Created and managed multiple environments and user groups.
- Performed admin activities such as backup and restore of the repositories.
- Created data manipulation and definition scripts using the Teradata BTEQ utility.
- Used BTEQ scripts to load data from staging tables into target tables (a minimal BTEQ load sketch follows this project entry).
- Developed complex Mappings using Transformations such as Filter, Router, Expression, Joiner, Lookup, Stored Procedure, Union, and Update Strategy on the extracted data, according to the Transformation Business Rules and user requirements.
- Developing new ETL mappings according to the requirements using Informatica Power Center 8.6.1
- Mentored peers and junior staff in both design and development of Data Warehouse and ETL technologies.
- Worked closely with the QA team and Integration group in moving the ETL code, shell scripts and database code to the higher environments.
- Involved in production support activities including monitoring loads, resolving exceptions and correcting data problems.
Environment: Oracle 10g, Informatica PowerCenter 8.6.1, Teradata 12.0, Toad 9.6.1, SQL, PL/SQL, Unix Shell Scripts, SQL Navigator 4.5, BTEQ, SQL Assistant, F-Secure SSH 5.3, and Sun Solaris 9.0
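The staging-to-target loads noted above ran as BTEQ scripts. Below is a minimal sketch of a BTEQ session driven from a shell wrapper; the TDPID, credentials, database, and table names are hypothetical placeholders, and the dot commands should be checked against the installed Teradata utilities.

```sh
#!/bin/sh
# Minimal BTEQ wrapper: insert new rows from a staging table into a target table on Teradata.
# TDPID, credentials and table names are illustrative placeholders only.

bteq <<'EOF'
.LOGON tdprod/etl_batch,secret;

INSERT INTO bene_db.bene_target
SELECT *
FROM   bene_db.bene_staging stg
WHERE  NOT EXISTS (SELECT 1
                   FROM   bene_db.bene_target tgt
                   WHERE  tgt.bene_id = stg.bene_id);

.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF
RC=$?
if [ $RC -ne 0 ]; then
  echo "BTEQ load failed with return code $RC" >&2
fi
exit $RC
```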
Sr. Data Warehouse Consultant
Confidential
Barclays is a financial institution of Dutch origin offering banking, insurance, and asset management. Barclays is in the process of designing and implementing a replacement for the PWARE data warehouse environment, as the existing warehouse is not serving the business to the maximum extent. The main aim of the project is to create a new warehouse from the existing PWARE data warehouse as well as from other divisions such as banking and insurance.
Responsibilities:
- Identified ETL specification based on Business Requirements/Mapping Document. Formulating and documenting ETL process design.
- Involved in data mapping activities from source systems to target with Business Analysts and Data modelers.
- Developed various jobs to read Complex Flat Files from TSYS, flat files from Scorex and TransUnion, and databases, then transform and load the data into target databases.
- Performed impact analysis of recent UDAP regulations on Data mart areas.
- Extracted data from various source systems like Oracle, SQL Server, and Flat Files.
- Developing new ETL mappings according to the requirements using Informatica Power Center 8.5.1
- Responsible for Production support and monthly releases.
- Involved in creating Workflows and Worklets and configuring Sessions; used the Informatica pmcmd command in UNIX scripts.
- Extensively used DataStage Designer to design, develop ETL jobs for extracting, transforming and loading the data into different Data Marts.
- Used various Parallel Job stages such as Aggregator, Sort, Transformer, Sequential File, and Lookup File.
- Performed Import and Export of DataStage components and table definitions using DataStage Manager for versioning into CVS.
- Created Job Sequences to execute the designed jobs with better control, and deployed UNIX scripts in command-line stages to wrap the files.
- Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
- Wrote various UNIX shell scripts for scheduling and formatting the files (a minimal formatting sketch follows this project entry).
- Worked with QC and Prod Support teams on bug fixes and followed up on QC tickets.
- Performed Unit Testing, Integration Testing, and User Acceptance Testing (UAT) for every code change and enhancement.
- Created process/support documentation for supporting the application.
- Created deployment documents and release notes for the releases. Developed template mappings for future use.
Environment: Oracle 10g, PowerCenter 8.5.1, IBM Information Server/DataStage 8.1/7.5.3, Toad 8.0, SQL, PL/SQL, Unix Shell Scripts, SQL Navigator 4.5, IBM-AIX 5, Windows XP, MicroStrategy 8.0.2, Control-M 6.2, WinCVS 2.0.2.4, Mercury Quality Center 9.0
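The file-formatting scripts mentioned above were plain UNIX utilities strung together. A minimal sketch that normalizes an incoming delimited feed before the load jobs pick it up; the directory names and the pipe-to-comma convention are hypothetical.

```sh
#!/bin/sh
# Minimal feed-formatting sketch: normalize incoming delimited files before the ETL picks them up.
# Directory names and the pipe-to-comma convention are illustrative placeholders.

IN_DIR=/data/inbound
OUT_DIR=/data/ready
ARCH_DIR=/data/archive
STAMP=`date +%Y%m%d%H%M%S`

for f in "$IN_DIR"/*.dat; do
  [ -f "$f" ] || continue                    # skip the loop if no files arrived
  base=`basename "$f" .dat`

  # Strip DOS line endings, drop the header row, and convert pipes to commas.
  tr -d '\r' < "$f" | tail -n +2 | sed 's/|/,/g' > "$OUT_DIR/${base}_${STAMP}.csv"

  # Keep the original feed for audit and restart purposes.
  mv "$f" "$ARCH_DIR/${base}_${STAMP}.dat"
done
```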
ETL Architect
Confidential
The Ontario Cancer Registry (OCR) is being re-engineered into the Enterprise Data Warehouse (EDW) as part of the EDW Phase 4 project. The cancer surveillance system known as the Ontario Cancer Registry has many of the desirable features of a modern Data Warehouse. The OCR also relies almost wholly on sophisticated decision support systems (DSS) to summarize or consolidate core information about the patient and the tumor.
Responsibilities:
- Involved in the meetings with users and business analysts in order to define the requirements
- Played a major role in understanding the business requirements and assisted the Data Architect in designing the star schema and loading the data warehouse (ETL).
- Developing new ETL mappings according to the requirements using Informatica Power Center 7.1.2/8.1
- Creating Mapplets with the help of Mapplet Designer and using these Mapplets in various mappings
- Prepared the impact analysis document and high-level design for the requirements.
- Developed functional and technical specification documents, mappings and exception handler for ETL processes.
- Performed Informatica administration duties including creating users, user groups, folders, and database connections, and promoting Informatica folders, mappings, sessions, and workflows from development to production.
- Coordinating with the Quality Assurance team for component and integration test environments.
- Attended a few MicroStrategy training sessions.
- Provided mentoring to junior team members.
Environment: Oracle 10g, Informatica PowerCenter 7.1.2/8.1, Toad, Solaris 2.7, Unix Shell Scripts, Windows XP, MicroStrategy 7.5