Information Architect Resume
SUMMARY
- Over 14 years of experience in Information Architecture, Data Modelling, Data Conversion, Data Warehousing, Data Mining, and Data Analysis
- BPMN modelling in ARIS and creation of Business Support Maps using Applications and Business Objects in the Alfabet tool
- Created data flow diagrams capturing the current state of the business domain to support future-state solutioning
- Systems analyst on Agile Scrum project teams
- Self-taught in Hadoop fundamentals, MapReduce programming, and data analysis using R through Big Data University
- Used MicroStrategy Desktop to build a traceability matrix for dependent/component objects from reports
- Worked on DBAmp to read and write data into Salesforce
- Accessed data from REST APIs exposed by different cloud solutions
- Worked on Cognos 8 Data Manager for ETL development
- Used Canvas forms to collect data from mobile and web forms
- Used Cransoft reports to validate the data migration to SAP
- Worked on Talend JasperETL for data extraction and transformation
- Extensively applied ETL methodology to support data extraction, transformation, and loading in an enterprise-wide ETL solution
- Experience in Informatica design, development (transformations, mappings, mapplets, workflows, scheduling), implementation, and performance tuning of ETL processes
- Worked extensively with the Pushdown Optimization feature of Informatica 8.1 to handle huge data sets and enhance performance
- Experience in UNIX shell scripting
- Experience in Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, FastExport, and Queryman, and in performance tuning of ETL processes
- Wrote Teradata SQL and Teradata macros to transform data into the data warehouse
- Tuned Oracle and Teradata SQL for data movement jobs and queries to improve data load times and end-user data access response times
- Experience in data analysis and data integration using various data sources: Oracle, Teradata, SQL Server, MS Access, SAP BW, and flat files
- Handled release tickets for defects, enhancements, maintenance, and process optimization activities, and documented issues found
- Knowledge of logical and physical data model design in Star and Snowflake schemas and 3NF; prepared Erwin ER diagrams and designed dimension, fact, and aggregate tables
- Domain experience in Telecom, Banking, and Manufacturing LDMs
TECHNICAL SKILLS
Databases: Teradata V6R6 & R12, Oracle 9i/8i, MS-Access, SQL Server 7/2000/2008
Data warehousing/ETL tools: Talend JasperETL, Informatica Power Center 8.5.1/7.1, Teradata BTEQ, MultiLoad, FastExport, FastLoad, TPump
Reporting Tools: BOXI R2, COGNOS, MSTR
GUI/Client/Server Tools: Erwin, TOAD, MS-Office, MS-Project, Visio
BPMN tools: ARIS and Alfabet for Business and Information Architecture
PROFESSIONAL EXPERIENCE:
Confidential
Information Architect
Responsibilities:
- BPMN modelling in ARIS and creation of Business Support Maps using Applications and Business Objects in Alfabet
- Understanding different value streams viz., Product Idea to Offering, Market Quote to Request, Quote to Cash, Prevention to Wellness, Request to Resolution and Financial Forecast to Report
- Working on the ELDM mapping for the request and response parameters for the ISL services
- Maintaining the information assets in Alfabet and supporting business with the reports on the Application Architecture, Transition Architecture, Business Function and Process alignment
- Supporting Polaris Process Integration on the Data Requirement
- Creating data flow diagrams for the current Oxford Renewal process across current-state business domains to support target solutioning
Confidential
Contractor: Sr Systems Analyst
Responsibilities:
- Understanding the business process, data model, and design documents, and converting the specifications into user stories for the development and QA teams
- Gathering the reference data from the business SMEs
- Helping the QA team with the acceptance criteria
- Conducting scrum calls as needed
- Helping the product owner set up capabilities and features to track user stories
- BPMN modelling in ARIS and creation of Business Support Maps using Applications and Business Objects in Alfabet
- Understanding different value streams viz., Product Idea to Offering, Market Quote to Request, Quote to Cash, Prevention to Wellness, Request to Resolution and Financial Forecast to Report
Confidential, MN
Sr Principal Consultant
Responsibilities:
- Researching the SOR by profiling data, conducting KDD sessions with the SOR SMEs, documenting notes, and seeking clarification
- Researching and documenting issues related to data quality in SOR
- Analyzing SOR for data consistency
- Raising issues and providing data summaries and analysis to support them
- Suggesting LDM model changes based on profiling
- Setting up topic calls to provide information and seek consensus on issues
- Data analysis and mapping:
- Profiling DDR elements and suggesting SOR
- Proposing mapping for DDR elements
- SOR tables required for semantic views
- Data load (delta/full) into required SOR tables
- Validating semantic view against SOR and legacy reporting systems
- Suggesting mapping changes based on validation: referential integrity, mapping updates, join changes, and data granularity changes
- Coordinating with Program Team, DQ team, MicroStrategy team and UAT team
- Coordinating across different teams and providing responses to queries regarding the MIDE semantic model, data issues, and DQ issues
- Assigning defects to the development team and assisting with retesting and closure
- Assisting the MicroStrategy (MSTR) team by providing clarifications regarding the semantic model
- Preparing a traceability matrix linking data elements required in reports to the semantic views, SOR, and legacy systems
Confidential, MN, USA
Principal Programmer Analyst
Responsibilities:
- Designed, developed, and maintained the ETL in Cognos 8 Data Manager and SQL Server 2008
- Read and wrote into Salesforce using the DBAmp app in SQL Server; the Salesforce cloud is accessed using the REST API that the application provides
- Designed the database to pull and push data from the Canvas cloud for mobile and web forms; the Canvas web services are used to read and write data from MS SQL Server, and the REST APIs are used to pull data from the cloud
- Modelled the LDM and forward-engineered the PDM using Erwin to support the DBA in creating the tables
- Drafted the Data management strategy and data conversion strategy
- Performed gap analysis of health exchange enrollment system and Federal SERFF templates
- Participated in design discussions on using Informatica to access the web services exposed by different state and federal systems for MNHIX and to write into the enterprise integration bus
Confidential, MN
Data Business Process Consultant
Responsibilities:
- Working closely with Target's Finance Integration Systems team; interacting with the business to check that all migration rules meet the business requirements and to ensure validation of loads to SAP using the Cransoft tool
- Collecting the Configuration values from functional and business teams
- Worked on the Procure-to-Pay module, covering Vendor, Article Master, Bill of Materials, Purchase Orders, Store Groupings, etc.
- Leading teams resolving service requests on Medica's existing data warehouse in the Enterprise Data Management group
- Leading the team working on the APT sunset project, a UHG application
- Working closely with data architects, systems analysts, the ETL team, and the production support team to plan and implement fixes
Confidential
ETL Team Lead
Responsibilities:
- Worked with business users and other data source owners to identify requirements, source file structures, loading frequency, and implications, and to resolve data quality issues
- Prepared PowerPoint presentations, ETL design documents, and data flow diagrams
- Built data movement processes that load data from sources such as SAP BW, SQL Server, Oracle, MS Access, and Lotus Notes into Teradata by developing Korn shell scripts and using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad, and Queryman
- Improved Informatica performance by identifying bottlenecks at source, target, mappings and session level.
- Provided expertise and trained less experienced developers, creating the necessary training materials.
- Analyzed the business and reporting requirements of the client and agreed on the refined requirements for analysis and reporting
- Understood the Nokia logical data model (LDM) for Sales Planning and Simulation, Nokia Pilots phone testing analysis, Internet and Music Radio Phones, and Supplier Data Management, and designed and implemented the Teradata physical data model
- Worked with Project manager, Business owners, UNIX administrators, Nokia Teradata DBA and Nokia Platform team (Build, Production, Maintenance teams).
- Led daily follow-up calls with the Nokia project manager and weekly meetings with the other stakeholders to ensure project timelines were met
- Designed and implemented the dimensional model for performing OLAP and calculating different measures using median, percentages and averages based on scores derived from data
- Implemented a small prototype application for data quality for the customer data integration initiative at the Nokia Ovi.com
- Designed the BO universe for the webi reports for customer data integration initiative at the Nokia Ovi.com and Internet and music radio phones analysis
- Led the ETL work from preparation of design documents through development of the ETL process, which also involved working with the Teradata utilities and UNIX scripting
- Led teams completing a number of modules in the second and third releases of the DW project
- Built a file archiving solution using UNIX scripts
- Built a data quality solution for customer data analysis
Environment: Informatica Power Center 8.1 (Designer, Repository Manager, Workflow Manager), TOAD, SAP BW, Oracle 9i, SQL Server, Teradata V2R5, BOXI R2, PowerPoint, UNIX, Windows.
Confidential
Data Analyst
Responsibilities:
- Developed a prototype using Pushdown Optimization, then a new feature in Informatica 8.1, at Nokia
- Built data movement processes that load data from sources such as SAP BW, SQL Server, Oracle, MS Access, and Lotus Notes into Teradata by developing Korn shell scripts and using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad, and Queryman
- Led the ETL work from preparation of design documents and developing the ETL process which also involved working with the Teradata utilities and Unix scripting
- Improved Informatica performance by identifying bottlenecks at the source, target, mapping, and session levels
- Designed and implemented the dimensional model for performing OLAP and calculating different measures using median, percentages and averages based on scores derived from data
- Worked on Cognos framework manager to design the dimensions and measures
- Worked with the Business Owners to update status of project daily / weekly.
Environment: Informatica Power Center 8.1 (Designer, Repository Manager, Workflow Manager), TOAD, SAP BW, Oracle 9i, SQL Server, Teradata V2R5, COGNOS 8, PowerPoint, UNIX, Windows.
Confidential
Research Officer
Responsibilities:
- Database on Indian Economy: enterprise-wide data warehouse
- Rationalization of information sources
- Prototype data mart on Macro Economic Indicators
- Prototype metadata application
- Key member of the enterprise-wide Data Warehouse Project team.
- Worked on various research issues related to Data Warehousing and Data mining
- Developed prototype data marts and a browser-based metadata application. Development of a data mart (DM) on Macro Economic Indicators: involved in ETL and data model design; a GUI was built using Oracle Express Objects
- Developed a web-enabled DM on the National Accounts Statistics of India: involved in ETL and data model design; it was published to the web using Oracle Express Web Publisher
- Prototype metadata application: developed a metadata application in ASP using HTML, VBScript, JavaScript, and Oracle 8i. The application provides seamless navigation across departments of RBI, areas of operation of RBI, returns received at RBI, variables, and data
- Promoted the use of the data warehouse among users in the Bank
Environment: Oracle Express Objects, Oracle Express Web Publisher, Oracle 9i, Microsoft DTS, SQL Server, ASP, VBScript and JavaScript, PowerPoint, Windows
Institute for Development
Confidential
Responsibilities:
- Conducted programs for banks on data warehousing and data mining
- Worked as an SME (Subject Matter Expert) in development of e-learning course 'Financial Information System'
- Taught a course on data warehousing and data mining to M.Tech students
- Guided M.Tech students with their research projects
- Conducted research and published papers; developed a data mart on the credit data of a nationalized bank with a focus on non-performing assets, defaulter analysis, asset-liability management, and risk management
- Suggested a CRM strategy for a Bank
- Delivered the enterprise-wide data model for a bank to build a data warehouse
- Developed a toolkit for data warehousing and data mining, which includes templates, a data model for the credit data mart, and data mining algorithms such as classification rules, association rules, CART, decision trees, and a neural network application
- Prepared course material for the 'Financial Information System' course for the Post Graduate Programme in Banking Technology Management
Environment: Microsoft DTS, SQL Server, C, Erwin, PowerPoint, Windows
Confidential
Research Associate
Responsibilities:
- A leading Market Research Agency
- Suggested marketing strategies to clients using sample surveys and qualitative and quantitative data analysis techniques
- Conducted a study to aid development and shortlisting of concepts for a multimedia course for a leading software company
- Analyzed data for a quantitative study concerning energy consumption and concerns
- Conducted qualitative research for a preliminary assessment of the viability of a new product