Sr. Software Engineer Resume
Charleston, WV
SUMMARY
- Expertise in Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
- Solid IT experience in Business Intelligence systems spanning analysis, design, development, deployment, testing, dimensional modeling, and enterprise planning.
- Experience in creating landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies, and foreign-key relationships.
- Expertise in Informatica MDM Hub match and merge rules, batch jobs, and batch groups.
- Designed and developed multiple cleanse functions, graph functions, and standardization scenarios using Address Doctor.
- Configured JMS message queues in the MDM Hub Console.
- Configured SOAP-based SIF API calls using SoapUI to perform multiple tasks in the Informatica MDM Hub.
- Experience in using multiple application servers such as JBoss, WebLogic, and WebSphere.
- Good development experience with the Informatica PowerCenter (9.1/9.5) ETL tool and the Informatica Data Quality (IDQ) tool.
- Worked on different types of transformations, including Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Union, Stored Procedure, and Sequence Generator.
- Created, launched, and scheduled workflows/sessions, and was extensively involved in performance tuning of mappings and sessions.
- Experience in databases such as Oracle 10g/9i/8.x, SQL Server 2000/2005, DB2, and Teradata.
- Experience in SQL queries, creating views, and PL/SQL stored procedures, functions, triggers, and cursors as per business requirements (a small sketch follows this list).
- Experience in Cognos TM1 (10.1.1/9.5.2/9.5/9.4).
- Experience in Advanced Turbo Integrator scripting.
- Worked on TM1 Rules to apply business logic to cubes.
- Experience in scheduling Turbo Integrator processes using Chores.
- Implemented dynamic security at all levels of TM1 objects
- Experience in COGNOS 10 (10.2/10.1) (Framework Manager, Report Studio, Analysis Studio, Query Studio, Business Insight, Business Insight Advanced, Cognos Connection and Transformer).
- Proficient in metadata modeling with Erwin and Toad Data Modeler.
- Experience in DMR modeling with Framework Manager
- Outstanding experience in creating report layouts such as list, crosstab, chart, and repeater table.
- Created complex reports with advanced features such as master-detail sections, conditional formatting, render variables, drill-through, page sets, JavaScript, and bursting using Report Studio.
- Used Analysis Studio for multi-dimensional analysis and created context filters for better performance.
- Implemented security on cubes using Apex, Cloak, Suppress, and Exclude.
- Experience in Cognos BI server maintenance and Administration.
- Experience in Production/Customer Support, Deployment, Development, and Integration.
- Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities, flexible in work schedules
- Good communication skills and strong problem-solving skills.
- Involved in configuring Dev and UAT servers.
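The following is a minimal, illustrative PL/SQL sketch of the view and trigger work described above; CASE_MASTER and CASE_AUDIT are hypothetical table names, not objects from an actual engagement.

```sql
-- Illustrative only: a reporting view plus an audit trigger.
CREATE OR REPLACE VIEW v_active_cases AS
  SELECT case_id, county_code, program_code, open_date
  FROM   case_master
  WHERE  close_date IS NULL;

-- Record every program change on a case for audit purposes.
CREATE OR REPLACE TRIGGER trg_case_audit
  AFTER UPDATE OF program_code ON case_master
  FOR EACH ROW
BEGIN
  INSERT INTO case_audit (case_id, old_program, new_program, changed_on)
  VALUES (:OLD.case_id, :OLD.program_code, :NEW.program_code, SYSDATE);
END;
/
```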
TECHNICAL SKILLS
BI Tools: Cognos 10 (10.2/10.2.1) (Framework Manager, Report Studio, Event Studio), Tableau 9.2, Jasper Reports
Master Data Management: Informatica Multidomain MDM 9.7.0
Enterprise Planning Software: Cognos TM1 10.2.1/10.1.1/9.5.1/9.4
ETL/MDM Tools: Informatica PowerCenter 9.6/9.5/9.1, Informatica MDM 9.7, Informatica Data Director (IDD) 9.7, Informatica Data Quality (IDQ) 9.6
Databases: Oracle 10g/9i/8.x, SQL Server 2000/2005, DB2, MS Access.
Modeling: Erwin, Oracle Data Modeler, TOAD, SQL*PLUS, DB2 Visualizer.
Languages: PL/SQL, VB, C#
OS: Windows 95/98/2000/XP/NT
PROFESSIONAL EXPERIENCE
Confidential, Charleston, WV
Sr. Software Engineer
Responsibilities:
- The current project maintains West Virginia state government data warehouse subsystems covering programs such as child welfare support, food supplements (SNAP), TANF, benefits eligibility (WV Works), schooling programs, and Medicaid (MA).
- Master Data Management Skills (MDM):
- Performed auto match/merge and manual match/merge, and ran match rules to verify the cleansing of data in the MDM process (a sample verification query follows this list).
- Executed ETL workflows to integrate data from varied sources such as Oracle, DB2, and flat files, and loaded it into the landing tables of the Informatica MDM Hub.
- Developing mappings using various cleanse functions and Address Doctor functions to move data into stage tables.
- Defining trust scores, validation rules, match rules, and merge settings.
- Working closely with state supervisor data stewards.
- Maintenance of the IDD application, building Hierarchies as per the business needs.
- Working with PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Worked on real-time integration between the MDM Hub and external applications using PowerCenter and the SIF API over JMS.
- Created ETL jobs and mappings with Informatica PowerCenter to integrate data from heterogeneous sources such as flat files, .csv files, SQL Server databases, and Oracle databases.
- Set up SIF for Java application communication and interfaced with IDD as required.
- Worked with the data profiling team in analyzing source-system data for duplicate and quality issues.
- Performed data profiling of reference data for large data sets using the Informatica Data Analyst tool.
- Working on configuring the landing tables' structure according to the client's standardization.
- Configured base object tables in the schema and relationship tables in hierarchies according to the data model.
- Configured landing tables, staging tables, lookups, cleanse lists, cleanse functions, mappings, and audit trail/delta detection.
- Configured Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, query groups, packages, and custom cleanse functions.
- Created Queries, Query Groups, and packages in MDM Hub Console.
- Configured Informatica Data Director (IDD) per requirements from county data stewards.
- Performed testing of data validation, batch jobs, and batch groups.
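A minimal sketch of the kind of SQL used to spot-check match/merge results, as mentioned above. C_PARTY and C_PARTY_XREF are hypothetical names following the MDM Hub's usual C_ prefix convention; ROWID_OBJECT and CONSOLIDATION_IND are standard Hub system columns.

```sql
-- List golden records that absorbed more than one source record,
-- i.e., records actually consolidated by the match/merge run.
SELECT bo.rowid_object,
       COUNT(x.rowid_xref) AS source_record_count
FROM   c_party bo
JOIN   c_party_xref x ON x.rowid_object = bo.rowid_object
WHERE  bo.consolidation_ind = 1   -- 1 = consolidated (golden) record
GROUP  BY bo.rowid_object
HAVING COUNT(x.rowid_xref) > 1
ORDER  BY source_record_count DESC;
```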
- ETL (PL/SQL / Informatica / Oracle Warehouse Builder):
- Moving mappings, transformations, and stored procedures from OWB to Informatica, both manually and via a migration tool.
- Wrote stored procedures for small-scale transformations (see the sketch after this list).
- Loading data from VSAM files and .txt files using Informatica.
- Transformations used in Informatica: Update Strategy, Filter, Lookup, Stored Procedure, Joiner, Router, XML Parser, and XML Generator.
- Uploading data from mainframe files to Oracle source tables using Oracle Warehouse Builder.
- Converting transactional DB2 tables to Oracle tables using OWB.
- Creating mappings and transformations for moving data from DB2 to Oracle target tables.
- Scheduling jobs that automatically refresh the weekly data.
- Deploying releases from lower environments to production environments.
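A minimal sketch of one such small transformation procedure; STG_CLIENT and TGT_CLIENT are hypothetical staging and target tables, not actual project objects.

```sql
-- Standardize client names while moving them from staging to target.
CREATE OR REPLACE PROCEDURE transform_client_names AS
BEGIN
  FOR rec IN (SELECT client_id, client_name FROM stg_client) LOOP
    UPDATE tgt_client
    SET    client_name = INITCAP(TRIM(rec.client_name))
    WHERE  client_id   = rec.client_id;
  END LOOP;
  COMMIT;
END transform_client_names;
/
```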
- Mainframes:
- Analysis of COBOL programs and automation of reports by extracting their logic into SQL for Cognos reporting (an illustrative query follows this list).
- DB2 SQL creation using the mainframe Platinum DB2 utility tool.
- Mainframe data warehouse job monitoring using the Control-M tool for weekly/monthly/quarterly jobs.
- VSAM and .txt files created for data warehouse purposes.
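A minimal sketch of COBOL report logic re-expressed as DB2 SQL, as described above; BENEFIT_PAYMENT and its columns are hypothetical placeholders for the actual warehouse tables.

```sql
-- Monthly benefit totals per program, replacing a COBOL
-- read-and-accumulate loop with a single aggregate query.
SELECT program_code,
       YEAR(payment_date)  AS pay_year,
       MONTH(payment_date) AS pay_month,
       SUM(payment_amount) AS total_paid
FROM   benefit_payment
WHERE  payment_status = 'ISSUED'
GROUP  BY program_code, YEAR(payment_date), MONTH(payment_date)
ORDER  BY pay_year, pay_month, program_code;
```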
- Reporting and Visualizations (COGNOS / Tableau / Jasper):
- Gathering requirements from end users for upcoming releases, creating new reports and enhancements to existing reports, and creating business and technical documents for migrated and newly created reports.
- Improving the performance of existing reports that were slow to populate.
- Migrating ad-hoc script reports and MOBIUS reports coming from mainframes to Cognos reports.
- Upgraded the complete Cognos environment from version 10 to version 11 (multi-dispatcher environment).
- Designed the complete Cognos framework model for 4 benefit programs.
- Created visualization reports in Tableau for federal officers and county supervisors in the finance department.
- Tools mainly used in COGNOS: Report Studio, Event Studio, COGNOS Workspace and COGNOS Workspace advanced.
- Creating new advanced dashboards that give users more flexibility in data analysis.
- Creating new jobs and scheduling the reports.
- Implementing data security privileges so that county supervisors have rights to modify reports while workers have read-only access.
- Support and Testing:
- Writing SQL to verify that new report outputs are populated correctly (a sample check follows this list).
- Providing Level III technical and on-call support for critical data-fix issues and incorrect data reports.
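A minimal sketch of such a verification query, recomputing a report total from the source table; RPT_SNAP_SUMMARY and SNAP_ISSUANCE are hypothetical names.

```sql
-- Any row returned is a month where the report total disagrees
-- with a fresh aggregation of the source data.
SELECT r.report_month,
       r.case_count   AS report_count,
       src.case_count AS source_count
FROM   rpt_snap_summary r
JOIN  (SELECT TRUNC(issue_date, 'MM')  AS report_month,
              COUNT(DISTINCT case_id)  AS case_count
       FROM   snap_issuance
       GROUP  BY TRUNC(issue_date, 'MM')) src
       ON src.report_month = r.report_month
WHERE  r.case_count <> src.case_count;
```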
Environment: Informatica PowerCenter 9.6.1, Informatica MDM 9.7, Informatica Data Director (IDD) 9.7, Informatica Data Quality (IDQ) 9.6, JBoss 5.1, Oracle PL/SQL, Erwin, Oracle 11g, DB2, Cognos 10, Cognos Framework Manager, Tableau 9.4, Oracle Warehouse Builder (OWB) Design Center, OWB Control Center, Mainframes, Unix.
Confidential, Chicago, IL
Software Engineer
Responsibilities:
- ETL (PL/SQL / Informatica):
- Developed Type I and Type II slowly changing dimensions.
- Developed mapplets, parameters, and variables for code reusability.
- Data profiling and code migrations from lower environments (DEV, UAT, INT) to PROD repositories, and monitoring of all environments.
- Informatica administration: managing user groups, roles, and user accounts.
- Code migration from Development to Production
- Created mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
- Used the Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
- Creating indexes on tables to improve performance by eliminating full table scans (see the sketch after this list).
- Wrote stored procedures for small ETL functionalities.
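A minimal sketch of the indexing approach; SALES_FACT and its columns are hypothetical, and the EXPLAIN PLAN step shows how to confirm the full scan is gone (Oracle syntax).

```sql
-- Composite index covering the columns most report queries filter on.
CREATE INDEX idx_sales_fact_date_store
  ON sales_fact (sale_date, store_id);

-- Confirm the optimizer now uses the index instead of a full scan.
EXPLAIN PLAN FOR
  SELECT SUM(sale_amount)
  FROM   sales_fact
  WHERE  sale_date >= DATE '2015-01-01'
  AND    store_id  = 42;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```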
- Cube Designing (Dynamic Cubes):
- The project mainly involved migrating Cognos Transformer cubes to dynamic cubes because of growth in fact table size.
- Installed Cube Designer and set up a new dynamic cube environment.
- Developed a dynamic cube mirroring an existing Transformer cube for POC purposes.
- Created database aggregate tables to enhance the performance of dynamic cubes (a sketch follows this list).
- Implemented the dynamic cubes for multiple departments (Sales, Event Management Systems, Financials).
- Created various in-memory aggregates by analyzing the cube model and workload logs using the Aggregate Advisor.
- Administered the various dynamic cubes in a multi-dispatcher environment.
- Created various jobs to refresh the member cache and data cache at regular intervals for the cubes.
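A minimal sketch of such a database aggregate table, pre-summarizing the fact table to a coarser grain for the dynamic cube's in-database aggregates; all names are hypothetical.

```sql
-- Pre-aggregate to month x product line so high-level cube queries
-- avoid scanning detail-level fact rows.
CREATE TABLE agg_sales_month_prodline AS
SELECT d.year_month,
       p.product_line_id,
       SUM(f.sale_amount) AS sale_amount,
       SUM(f.quantity)    AS quantity
FROM   sales_fact  f
JOIN   date_dim    d ON d.date_key    = f.date_key
JOIN   product_dim p ON p.product_key = f.product_key
GROUP  BY d.year_month, p.product_line_id;
```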
- Reporting (COGNOS):
- Created extensive graphical dashboards using Cognos RAVE, with data from dynamic cubes as the source.
- Built a DQM model with relational data using Framework Manager to support multiple languages for report globalization, and deployed packages to the report servers.
- Performance tuning by analyzing and comparing native database SQL with Cognos-generated queries.
- Set up security at the package level, query subject level, and query item level.
- Involved in customer UAT in the Dev, QA, and Prod environments.
- Managed scheduling of reports, public folders and My Folders, distribution lists, users, groups, and roles using Cognos Connection in Cognos 10.2.1.
- Developed complex List Reports, Cross tab Reports, Chart Reports, and Drill-Through Reports.
- Customized data by adding prompts, Filters, Calculations, Summaries, and Functions.
- Customized reports by writing custom SQL (an example follows this list).
- Delivered advanced/complex reporting solutions such as dashboards and standardized reporting (including bursting of reports globally) using Cognos 10.2.1.
- Created Transformer models and PowerCubes for multidimensional analysis.
- Addressed performance and other production issues with Cognos 10.2.1 reports.
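A minimal sketch of custom SQL behind a report query subject; EVENT_SALES_V is a hypothetical view, and #prompt(...)# is Cognos's standard macro for binding a report prompt inside custom SQL.

```sql
-- Event revenue by region, restricted to the fiscal year the user
-- selects in a report prompt.
SELECT region_name,
       SUM(event_revenue) AS total_revenue
FROM   event_sales_v
WHERE  fiscal_year = #prompt('p_fiscal_year', 'integer')#
GROUP  BY region_name
ORDER  BY total_revenue DESC;
```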
Environment: Informatica PowerCenter 9.6, Teradata, Oracle 11g, Cognos 10.2.1 Suite, Cognos Cube Designer, Cognos Dynamic Query Analyzer, SQL Server Studio, Cognos Studios (Cognos Connection, Framework Manager, Report Studio, Query Studio, Analysis Studio, Metric Studio), Informix.
Confidential, Lincolnshire, IL
Software Engineer
Responsibilities:
- Interacted with business users to gather requirements.
- Created new mapping dimensions using Turbo Integrator (TI) processes, which serve as sources for the dimensions that need to be converted.
- Updated the dimensions in scope for the change using the new mapping dimensions, without altering data for dimension elements that were not changing.
- Merged two different data sources to one common source for TM1 cubes.
- Created a separate dimension using TI processes for validating that data remained unchanged throughout the conversion process.
- Assigned new attributes with preferred names for all associates, and added aliases showing the legacy codes after the conversion.
- Created TI processes to separate currently active associates from those who are terminated, retired, or missing from the organization.
- Created separate subsets based on user requirements.
- Implemented a new methodology for adding new pay levels based on job codes for all associates from the AON group.
- Involved in configuring the Dev, QA, and UAT servers.
- Performed regression, conversion, and conversion-cleanup testing to prove data remained unchanged based on control totals.
- New pay levels were assigned to all employees based on their new job codes.
- Created TI scripts in a modular fashion, built around parameterized loading of historic data into the cubes.
- Made logging of issues during dimension modifications easier by implementing e-mail notification functionality.
- Loaded actuals results on a monthly basis.
- Created new views and increased view sizes to validate large data volumes against the business users' current reports.
- Added TM1 rules for newly introduced measures.
- Created TM1 Perspectives access for BI users' convenience via Citrix Gateway.
- Used the Advanced Query Tool (AQT) to validate mapping tables from the data warehouse team (a sample check follows this list).
- Added a separate group for data warehouse users to show how the data looks after the conversion process, with security limited so they cannot modify the data.
- Created separate chores for updating the mapping tables from time to time.
- Performed cell-to-cell data validation testing using MS Access.
- Implemented Cognos Insight on request.
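A minimal sketch of the kind of validation query run in AQT against the mapping tables; ASSOCIATE and JOB_CODE_MAP are hypothetical names.

```sql
-- Legacy job codes with no mapped new code: each row returned is a
-- gap the conversion would otherwise drop silently.
SELECT a.legacy_job_code,
       COUNT(*) AS affected_associates
FROM   associate a
LEFT JOIN job_code_map m
       ON m.legacy_job_code = a.legacy_job_code
WHERE  m.new_job_code IS NULL
GROUP  BY a.legacy_job_code
ORDER  BY affected_associates DESC;
```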
Environment: Cognos TM1 10.2.2/10.2.1, Citrix Access Gateway, Advanced Query Tool (AQT), MS Access, Cognos Insight.
Confidential, Omaha, NE
Software Engineer
Responsibilities:
- Worked closely with the business to understand and model the business processes; created models of the 'As-Is' and 'To-Be' business processes.
- Involved in defining the scope and the business rules of the project.
- Interacted with Business Users and created the process requirements. Also, discussed/explained the flow of future data modules with design and development teams.
- Experience in creating dynamic Excel templates by creating a temporary view, exporting data to Excel, and destroying the view.
- Created user input templates using Active Forms with conditional formatting and security.
- Implemented security at Application level, Cube level, Dimension level, Element level and Data level.
- Created a drill-through process from a TM1 cube to SQL Server for querying transaction-level data for each cost center (an illustrative query follows this list).
- Using global variables, performed error handling/TI process monitoring in chores and loaded the error status into a logging cube.
- Proactively monitored the environments for capacity and performance issues and took the steps necessary to address them.
- Migrated TM1 objects from Prod to Dev and installed TM1 test servers for users.
- Worked as BI Report Author on Cognos 10.2 toolset including Report Studio, Query Studio, Analysis Studio, Framework Manager, Active Reports, and Cognos Workspace.
- Interacted with Data Modeling team and ETL team to make available certain table and data modifications required to successfully execute report requirements
- Used Cognos Framework Manager for metadata modeling and followed a layered approach: database layer, business layer, and presentation layer.
- Worked on report layouts such as chart, list, crosstab, and repeater to present data efficiently in Report Studio.
- Created several complicated reports involving multiple query joins, cascading prompts, conditional variables, and report filters.
- Created master-detail reports presenting information for a given identifier or master field based on user-entered report filters (e.g., drop-down lists, from/to date ranges, sorting options).
- Scheduled reports requiring distribution or bursting to stakeholders
- Defined various Drill through definitions for navigating from Analysis studio to Report studio and Report Studio to Query Studio
- Worked on Active Reports to create self-contained reports for offline users.
- Administered the Cognos BI environment, including adding users and groups, applying fix packs, and installation.
- Demonstrated excellent business writing skills: business process flow diagrams, comparison of current and projected processes and their impact, and efficiency gains from implementing new technologies and processes.
- Conducted user acceptance testing (UAT) along with the business team and made modifications to the current process as per recommendations.
- Created test plans, managed user acceptance testing, documented and tracked issues, and ensured their resolution.
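A minimal sketch of the SQL Server side of the TM1 drill-through described above; GL_TRANSACTION, the procedure name, and the parameters are hypothetical, with the TM1 drill process supplying the cost center and period from the drilled cell.

```sql
-- Transaction-level detail behind a single TM1 cell:
-- one cost center, one fiscal period.
CREATE PROCEDURE dbo.usp_drill_cost_center_detail
    @CostCenter   VARCHAR(10),
    @FiscalPeriod VARCHAR(7)
AS
BEGIN
    SELECT t.transaction_id,
           t.transaction_date,
           t.account_code,
           t.description,
           t.amount
    FROM   dbo.gl_transaction t
    WHERE  t.cost_center   = @CostCenter
      AND  t.fiscal_period = @FiscalPeriod
    ORDER  BY t.transaction_date;
END;
```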
Environment: Cognos TM1 10.1.1, TM1 Contributor, Cognos 10.2 (Framework Manager, Report Studio), SQL Server 2005, Microsoft Office 2003, Windows 7.
Confidential, San Diego, CA
Software Engineer
Responsibilities:
- Interacted with business users to gather requirements and designed the TM1 architecture so that the previous architecture was not disturbed.
- Created dimensions using Turbo Integrator (TI) processes that replaced the old dimensions.
- Assigned several aliases and attributes to different elements and applied various formats to measures in the dimension.
- Worked with subsets and cube views
- Worked on Turbo Integrator to perform ETL process efficiently
- Wrote TI scripts for parameterized loading of historic data into the cubes.
- Wrote TM1 rules with well-tuned feeders, applied to the cube as per business needs.
- Declared SKIPCHECK and FEEDERS in the Rules editor for faster performance of complex calculations.
- Used TM1 Perspectives for Excel to slice the cubes, similar to the earlier perspectives.
- Worked with spreadsheets to create TM1 applications and made them available through TM1 Web.
- Used several TM1 functions to interact with the data from TM1 cubes
- Used VBA/macros to build a better user interface in Excel.
- Added new client groups, assigned users to groups and implemented security at Application level, Cube level, Cell level, Dimension level, Element level, Process level and Chores level
- Created Approval hierarchy as a subset in dimensions and created cube views for use in contributor applications
- Designed Contributor applications, deployed the applications, and assigned security rights at each level of the approval hierarchy.
- Interacted with historic cube data using earlier TM1 cubes
Environment: Cognos TM1 9.5.1, TM1 Contributor, Oracle 11g.
Confidential, Moline, IL
Software Engineer
Responsibilities:
- Involved in designing TM1 application architecture and its workflow
- Experience in creating a snapshot process in the forecasting cube that populates all forecasted values in the version dimension for a given period, involving rule changes, flag changes in the month dimension, etc.
- Involved in TM1 10.1.1 installation and configured Perspectives and TM1 Web as remote servers.
- Experience in creating dynamic Excel templates by creating a temporary view, exporting data to Excel, and destroying the view.
- Wrote TM1 rules with feeders.
- Worked on Active forms to create user input templates and reports with conditional formatting and security
- Designed workflow in TM1 application with manager dimension allowing managers to approve or reject forecasted values using picklists.
- Experience in creating dynamic subsets using MDX, both in the dimension editor and in Turbo Integrator scripting.
- Implemented security at Application level, Cube level, Dimension level, Element level and Data level.
- Experience in scheduling Turbo Integrator processes using chores.
Environment: Cognos TM1 (10.1.1), Cognos 10 (10.1/10.1.1)
Confidential, Warren, NJ
Software Engineer
Responsibilities:
- Installed and maintained TM1 application
- Gathering business requirements and implementing best possible technical solutions
- Worked on integrating TM1 and Cognos BI following best practices.
- Created several cubes, dimensions, and TI processes implementing best practices.
- Created dynamic Excel templates by creating a temporary view, exporting data to Excel, and destroying the view, writing supporting TM1 rules.
- Designed Active Forms and worksheet applications for user interaction with TM1 and prompt selections.
- Implemented an e-mail feature in TM1 to notify users about the status of TI processes.
- Implemented dynamic security by adding users from LDAP, creating user groups according to country, and assigning appropriate privileges to each user.
- Developed complex reports in report studio using prompts, parameters, filters, and conditional formatting
- Customized the reporting environment by adding new portal skin including corporate logos and approved colors and new links
- Creating and managing jobs and schedules.
- Scheduled and managed events using Event studio
- Involved in Report testing and UAT
Environment: Cognos TM1 (9.5.1), TM1 Contributor, Cognos 10 (10.1/10.1.1) (Framework Manager, Report Studio, Analysis Studio, Query Studio, Business Insight, Business Insight Advanced, Cognos Connection, and Transformer), Cognos 8.4, Cognos Transformer, Oracle 10g, Windows 2000/XP
Confidential, Jersey City, NJ
Software Engineer
Responsibilities:
- Involved in Requirements collection with Business users and prepared Functional Requirement Specifications for report development and designed TM1 architecture
- Worked with subsets and cube views; used Turbo Integrator to perform ETL processes efficiently.
- Experience in using TI scripts for parameterized loading of data into the cubes.
- Created dimensions manually and via Turbo Integrator (TI) processes.
- Used TM1 Perspectives for Excel to slice the cubes, and worked with spreadsheets to create TM1 applications made available through TM1 Web.
- Declared SKIPCHECK and FEEDERS in the Rules editor for faster performance of complex calculations.
- Used VBA/macros to build a better user interface in Excel.
- Experience in creating approval hierarchies as subsets in dimensions and creating cube views for use in Contributor applications.
- Involved in creating new client groups, assigning users to groups, and implementing security at the application, cube, cell, dimension, element, process, and chore levels.
- Developed the Framework Manager model by pulling data from multiple data sources such as Oracle and SQL Server 2005, and published packages to Cognos Connection.
- Involved in creating Active Reports and working in Business Insight Advanced.
- Created Dimensions, Levels, Calculated Measures, Special Categories with relative time period and Cube groups
- Applied various kinds of prompts, such as value prompts, date prompts, textbox prompts, and select & search prompts, for better report performance.
- Created reports using advanced features like Master-Detail, Drill-through, Conditional Formatting and Page set
Environment: Cognos TM1 (9.5.1), TM1 Contributor, Cognos 10 (10.1/10.1.1) (Framework Manager, Report Studio, Analysis Studio, Query Studio, Business Insight, Business Insight Advanced, Cognos Connection, and Transformer), Cognos 8.4, Cognos Transformer, Oracle 10g, Windows 2000/XP.