
Informatica and Cognos Developer Resume


Harrisburg, PA

  • Over 6 years of experience in the IT industry, with expertise in Informatica and testing.
  • 5+ years of experience in the Development and Implementation of Data Warehousing with Informatica, OLTP and OLAP, using Data Extraction, Data Transformation, Data Loading and Data Analysis.
  • 3 years of experience in Business Intelligence with Cognos 10.1.1 and the Cognos 8.x series.
  • Proficient in Manual and Automated Testing, with experience on the automation testing tool Quick Test Pro, the bug tracking tool Test Director and the configuration management tool VSS.
  • Experienced in Data Warehouse reporting with the Cognos 10.1 and Cognos 8.0/8.2 BI Series.
  • Experience in all phases of the Data warehouse life cycle involving Analysis, Design, Development and Testing of Data warehouses using ETL.
  • Strong Data Warehousing experience using Informatica Power Center / Power Exchange / Power Mart 8.x/7.x/6.x/5.x.
  • Extensively used Informatica tools (such as Informatica Server and Client tools like Designer, Server Manager/Workflow manager, Workflow Monitor, Repository Manager).
  • Experience in implementing complex business rules by creating reusable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookup, Router, Rank, Joiner, Update Strategy), developing complex Mapplets and Mappings, and writing SQL Stored Procedures and Triggers.
  • Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Mart and Power Center.
  • Developed Slowly Changing Dimension Mappings of Type I, Type II and Type III (version, flag and timestamp).
  • Worked with Oracle Stored Procedures, Table Partitions and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL*Loader, Export/Import utilities.
  • Developed mappings using Parameters and Variables. Extensively used parameter files to pass mapping and session variables.
  • Worked with PMCMD to schedule workflows (a sketch follows this list).
  • Involved in writing stored procedures, views, cursors, functions to load data from different sources to staging area.
  • Skilled in troubleshooting with the help of the error logs generated by the Informatica server.
  • Experience with UNIX shell and Python scripting for File validation and SQL programming.
  • Practical understanding of Star Schema and Snowflake Schema Methodology, using Data Modeling tool Erwin 4.0/4.2.
  • Hands-on experience in writing, testing and implementing Triggers, Stored Procedures, Functions and Packages at the database level and form level using SQL.
  • Highly adaptive team player with a proven ability to work in a fast-paced environment and excellent communication skills.
  • Involved in project documentation.
  • Extensive experience in SQL Server, Oracle, Teradata and DB2 for building and implementing data warehouses.
  • Expertise in creating complex reports in Cognos Report Studio such as List reports, Cross-tab reports, Drill through reports, Master-Detail Reports and Cascading Reports and involved in reports performance tuning.
  • Experience in Metadata Modeling for both Relational and Dimensional models.
  • Created Projects and Models using Cognos Framework Manager and published packages to the Cognos Server for report authoring.
  • Experience in creating different layers in Framework Manager Models based on the business requirements.
  • Created Dimensionally Modeled Relational (DMR) models in Framework Manager and created analytical reports using Analysis Studio.
  • Experience in report migration, such as migrating from Cognos 8.4 to Cognos 10.1.1.
  • Experience in developing dynamic Dashboards and Scorecards and various chart and map reports depending on the business requirements.
  • Involved in the Design, Development and Implementation of the Cognos multidimensional Power Cubes using Transformer.
  • Developed several reports using Cubes as data source.
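
The parameter-file and PMCMD bullets above follow a common pattern; below is a minimal bash sketch of it. All names are hypothetical placeholders (the integration service, domain, folder, workflow and variable names are illustrative, not values from the projects listed here).

    #!/usr/bin/env bash
    # Minimal sketch: launch an Informatica workflow with a parameter file via pmcmd.
    # All names below are placeholders.

    PARAM_FILE=/opt/infa/param/wf_LOAD_SALES.par

    # Example parameter file contents (mapping/session variables resolved at run time):
    # [FOLDER_DW.WF:wf_LOAD_SALES.ST:s_m_LOAD_SALES]
    # $$LOAD_DATE=2010-11-01
    # $DBConnection_TGT=DW_ORACLE

    pmcmd startworkflow \
      -sv INT_SVC_DEV -d DOMAIN_DEV \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FOLDER_DW \
      -paramfile "$PARAM_FILE" \
      -wait wf_LOAD_SALES

In practice the parameter file would typically be regenerated per run (for example, to set the load date) before pmcmd is called.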

Technical Skills

Operating Systems: Windows 95/98/2000/2003/NT/XP, UNIX
ETL Tools: Informatica Power Mart 5.x, Power Center 5.x/6.x/7.x/8.x, Power Exchange, Business Objects Data Services 4.0
Scripting Languages: Bash, Python, Perl
Data Modeling Tool: Erwin 4.0/4.2
Databases: Oracle 8i/9i/10g, SQL Server 2000/2005, Teradata, MySQL, MS Access
Languages: C, C++, SQL, PL/SQL, HTML, VB
BI Tools: Cognos 8.1/8.2/8.4/10.1.0, ReportNet, Report Studio, Impromptu
Testing Tools: Quick Test Pro, Test Director, VSS
Database Utilities: SQL Navigator, TOAD 7.6/8.0, SQL*Loader

Confidential, Harrisburg, PA
Role: Informatica and Cognos Developer (Nov 2010 to Present)
The Health Department of Pennsylvania is currently implementing a major initiative, Pennsylvania's National Electronic Disease Surveillance System (PA-NEDSS). It is an online, public health disease reporting and case management system for the Pennsylvania Department of Health (DOH). PA-NEDSS seeks to provide a single, integrated Web-based application. The Enterprise Data Warehouse is built for PA to generate reports for prior approval of the budget.
Responsibilities:

  • Involved in Dimensional Modeling (Star Schema) of the data warehouse and used Erwin to design the business process, grain, dimensions and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated it into the Data Warehouse.
  • Developed a number of complex Informatica mappings, Mapplets and reusable transformations to implement the business logic and load the data incrementally.
  • Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in the Source Qualifier, and Router transformations to manage data flow into multiple targets.
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations such as Filter, Expression, Sequence Generator and Update Strategy.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load the data warehouse.
  • Used the Power Center Server Manager/Workflow Manager for session management, database connection management and scheduling of jobs to run in the batch process using the Control-M scheduling tool.
  • Migrated mappings, sessions, and workflows from Development to testing and then to Production environments.
  • Created multiple Type 2 mappings in the Customer mart for both Dimension and Fact tables, implementing both date-based and flag-based versioning logic (see the Type 2 sketch after this list).
  • Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Provided production support to resolve issues.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Tested the data and data integrity among various sources and targets. Associated with Production support team in various performance related issues.
  • Developed UNIX shell scripts to move source files to archive directory.
  • Used Informatica PowerConnect to connect to external databases.
  • Involved in unit, integration, system and performance testing.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables (a loader sketch follows this list).
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Created various data marts from data warehouse and generated reports using Cognos
  • Developed Standard Reports, List Reports, Cross-tab Reports, Charts, Drill through Reports and Master Detail Reports Using Report Studio.
  • Created Query prompts, Calculations, Conditions, Filters, Multilingual Reports Using Report Studio.
  • Good knowledge in Framework Manager, Report Studio, Query Studio, Cognos Connection, Analysis Studio.
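
A minimal sketch of the date- and flag-based Type 2 versioning logic referenced above, expressed as plain SQL run through SQL*Plus; the table, column and sequence names are hypothetical.

    #!/usr/bin/env bash
    # Sketch of Type 2 (flag + effective date) versioning in SQL; names are placeholders.
    sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'SQL'
    -- Expire the current version when a tracked attribute has changed
    UPDATE dim_customer d
       SET d.current_flag = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.customer_name <> d.customer_name);

    -- Insert the new version as the current row
    INSERT INTO dim_customer
          (customer_key, customer_id, customer_name,
           current_flag, effective_start_dt, effective_end_dt)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name,
           'Y', TRUNC(SYSDATE), DATE '9999-12-31'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.customer_name = s.customer_name);

    COMMIT;
    EXIT
    SQL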
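
Similarly, a minimal sketch of a SQL*Loader flat-file load of the kind referenced above, followed by archiving the source file on success; the paths, connect string and control-file contents are placeholders.

    #!/usr/bin/env bash
    # Sketch: load a delimited flat file with SQL*Loader, then archive it. Names are placeholders.

    SRC=/data/inbound/customers_20101101.dat
    CTL=/data/ctl/customers.ctl
    ARCHIVE=/data/archive

    # Example control file (customers.ctl):
    # LOAD DATA
    # INFILE '/data/inbound/customers_20101101.dat'
    # APPEND INTO TABLE stg_customer
    # FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    # (customer_id, customer_name, city, state)

    sqlldr userid="$DB_USER/$DB_PWD@$DB_SID" control="$CTL" \
           log=/data/log/customers.log bad=/data/log/customers.bad

    # Archive the source file only if the load completed cleanly
    if [ $? -eq 0 ]; then
        mv "$SRC" "$ARCHIVE/"
    fi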

Environment: Informatica Power Center 8.1, Informatica PowerConnect, Oracle 10g, DB2, SQL, PL/SQL, RUP, SQL*Loader, Erwin, TOAD 9.5, Star Schema, UNIX shell scripts, Cognos 8.x, flat files, Windows XP, and MS Office tools.

Confidential (OAK, CA), Informatica and Cognos Developer (August 2009 – Nov 2010)

The Office of the President is the systemwide headquarters of the University of California, managing its fiscal and business operations and supporting the academic and research missions across its campuses, labs and medical centers. The 10 campuses of the University of California open their doors to all who work hard and dream big. Through its teaching, research and public service, UC drives California's economy and leads the world in new directions. The UC family includes more than 220,000 students, more than 170,000 faculty and staff, 37,000 retirees and more than 1.5 million living alumni.

Responsibilities:

  • Involved in the Design, Analysis and Development of the data warehouse using Informatica/PowerAnalyzer tools. The project involved creating a data warehouse/data mart which is further used for reporting purposes.
  • Extensively used Informatica Power Center 7.1.1 as the ETL tool to load tab-delimited and XML files.
  • Responsible for sending Invoice, Support Customer Master, Support and Miscellaneous subject area data into CRM Data Warehouse.
  • Analyzed Mapping, Session, Source, Target and System bottlenecks to improve and tune the performance of various ETL jobs.
  • Worked extensively on different types of transformations like Source Qualifier, Expression, Lookup (Connected & Unconnected), Stored Procedure, Joiner and Union Transformation.
  • Made extensive use of Shared folders, project folders (non-shared folders) and shortcuts to maintain integrity between the Development, Test, Data Validation, Performance and Production repositories.
  • Worked with Slowly Changing Dimensions of Type I and Type II.
  • Responsible for Informatica code migration from Development to the Test, QA, Performance and Data Validation environments.
  • Frequently used the import and export utility to migrate sessions from developers' folders to subject folders.
  • Extensively used parameter file to pass Mapping and Session Variables.
  • Wrote UNIX Shell Scripts for file validation and scheduling Informatica jobs (see the validation sketch after this list).
  • Involved in many technical decisions and prepared the technical design document along with Visio diagrams for the ETL tech lead.
  • Created list Reports, Cross Tab Reports and Drill Through Reports.
  • Created Prompt pages according to the requirement of business user and added Conditions and Filters
  • Worked as an individual contributor to develop most of the processes, as well as working in a team to coordinate with other team members.
  • Performed Unit Testing, Functional Testing, Integration Testing and System Testing, depending on the environment, along with the testing team.
  • Involved in the LLD document and design.
  • Updated mappings and transformations with the comments necessary to track revision history and the logic being used.
  • Interacted with business analyst to gather business requirements and designed functional and technical requirements documentation.
  • Developed models in Framework Manager and deployed packages to the Cognos Connection.
  • Customized data by adding filters at both the Framework Level and Report Level.
  • Published different packages from Framework Manager to Cognos Connection.
  • Implemented all business logic in Framework Manager by creating prompts, folders, calculated columns, joins and naming convention.
  • Worked on creating and analyzing the complex reports like Cross tab, drill-thru reports.
  • Developed active reports using Cognos 10.1.1
  • Page explorer, query explorer and variable explorer were used to manage the content of the reports.
  • Burst reports in various formats, such as HTML, PDF and Excel, on the server based on the sales area and different levels of administration.
  • Created Dashboards comprising of list reports, cross-tab reports and chart reports using underlying multiple query structure by using filters, calculations, complex expressions, and conditional formatting.
  • Assisted in creating executive dashboard for management level reports using Report Studio.
  • Worked on complex filter condition, query join logic, conditional formatting, various type of prompts like value prompt, search & select prompt, Text Box prompt and Date & Time prompt.
  • Used layout component reference in page header, page footer and prompt page for uniformity across the reports.
  • Changed the appearance of the reports in Query Studio by reordering the report items, swapping rows and columns and by limiting the number of rows that should appear on a page.
  • Created Ad-hoc Reports using Query Studio.
  • Used Cognos Connection to administer and schedule reports to run at various intervals.
  • Scheduled and Distributed reports using Schedule Management in Cognos Connection.
  • Created Power Play cubes with data level security based on the user profile
  • Created various IQDs in Framework Manager and, using these IQDs as sources, integrated them with Power Play Transformer to create various models and build cubes.
  • Created the models and published the cubes to Cognos Connection to be used for multidimensional analysis.
  • Involved in performance tuning of the application using governor settings in framework manager and transformer models.
  • Used Transformer with a variety of sources and combined them into a multidimensional cube.
  • Generated all types of complex Work Order, Purchase Order, Account Balance Reports and Supervisor Reports using Report Studio and Query Studio. Analyzed cubes from cube packages using Analysis Studio
  • Designed, developed, and tested Power Cubes and deployed them into production.
  • Created Virtual cubes using multiple Analysis Services Cubes.
  • Creation of Model definition, calculated measures, multiple cubes and cube groups in Power Play Transformer
  • Worked with user profiles to ensure security by restricting user access to data based on the individual user.
  • Set security for individual reports in Cognos Connection.
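
A minimal sketch of the file validation referenced above, run before kicking off an Informatica job; the file name, delimiter, expected column count and workflow names are hypothetical.

    #!/usr/bin/env bash
    # Sketch: validate an inbound tab-delimited file, then start the workflow. Names are placeholders.

    FILE=/data/inbound/invoices_$(date +%Y%m%d).txt
    EXPECTED_COLS=12

    # Fail fast if the file is missing or empty
    if [ ! -s "$FILE" ]; then
        echo "ERROR: $FILE missing or empty" >&2
        exit 1
    fi

    # Reject the file if any record has the wrong number of tab-delimited columns
    BAD_ROWS=$(awk -F'\t' -v c="$EXPECTED_COLS" 'NF != c' "$FILE" | wc -l)
    if [ "$BAD_ROWS" -gt 0 ]; then
        echo "ERROR: $BAD_ROWS malformed record(s) in $FILE" >&2
        exit 1
    fi

    echo "Validation passed; starting workflow"
    pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$INFA_USER" -p "$INFA_PWD" \
          -f FOLDER_CRM -wait wf_LOAD_INVOICES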

Environment: Informatica Power Center 7.1.1, Cognos 8.3, Oracle 9i, Teradata, SQL, SQL Navigator, UNIX, Windows 2003/XP.

Confidential, FL, ETL Developer (July 2007 – June 2009)

The Web Software Developer will collaborate with the technical team to develop software-as-a-service products. This position will collaborate with internal technical teams to design, develop, and implement databases for our software applications. This position is responsible for understanding end-to-end requirements and providing technical expertise in designing, developing, and maintaining our database architecture. Neighborhood America is seeking talented, hands-on software engineers to join our team building enterprise social networking solutions.

Responsibilities:

  • Analyzed the business requirements and framed the Business Logic for the ETL Process.
  • Extensively used ETL to Load Data from fixed width as well as Delimited Flat files.
  • Worked extensively on different types of transformations like Normalizer, Expression, Union, Filter, Aggregator, Update Strategy, Lookup, Stored Procedure, Sequence Generator and Joiner.
  • Designed and developed complex mappings and reusable transformations for ETL using Informatica Power Center 6.2.2.
  • Designed mappings from sources to operational staging targets using a Star Schema, and implemented logic for Slowly Changing Dimensions.
  • Developed and tested all the Informatica mappings involving complex Router, Lookup and Update Strategy transformations.
  • Extensively wrote custom SQL to override the generated SQL queries in Informatica.
  • Created workflows and worklets for the designed mappings.
  • Implemented variables and parameters in the mappings.
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager, PMCMD and external scheduling tools (see the cron sketch after this list).
  • Generated completion messages and status reports using workflow manager.
  • Worked with Workflow Manager to import/export metadata, jobs, and routines from repository, and also created data elements
  • Worked on Dimension as well as Fact tables; developed mappings and loaded data into the relational database.
  • Extensively worked in Performance Tuning of programs, ETL procedures and processes. Also used debugger to Troubleshoot Logical Errors.
  • Involved in writing ETL specifications and unit test plans for the mappings.
  • Performed Developer testing, Functional testing, Unit testing for the Informatica mappings.
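
A minimal sketch of the external scheduling referenced above: crontab entries that launch the load scripts at the required frequency. The script paths and run times are hypothetical placeholders.

    # Daily incremental load at 02:00
    0 2 * * *  /opt/etl/bin/run_daily_load.sh  >> /opt/etl/log/daily_load.log  2>&1

    # Weekly full extract every Sunday at 01:30
    30 1 * * 0 /opt/etl/bin/run_weekly_load.sh >> /opt/etl/log/weekly_load.log 2>&1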

Environment: Informatica Power Center 6.2.2, SQL, SQL*Loader, TOAD 8.0, Windows NT/2000.

Confidential, Louisville, KY, ETL Developer (Dec 2006 – June 2007)

Confidential is a leading provider of Health Insurance

Responsibilities:
  • Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Used Informatica Power Center 8.1.3 for upgrading the existing system.
  • Developed ETL Informatica mappings to load data into the staging area; the sources were flat files and the targets were Oracle 10g/9i tables.
  • Used Informatica Power Exchange to connect to Mainframe systems and access data in various formats such as VSAM, IMS, IDMS, COBOL and Microsoft Visio files through data maps that act like SQL views.
  • Designed the Logical and Physical Database (Star Schema) using reverse engineering with the Erwin 4.0 tool.
  • Involved in writing SQL, Stored procedures and debugging them.
  • Designed Audit Strategy to audit the data between Source System, Target System and Parallel Reporting Systems and used power exchange for source data validation.
  • Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Look Up, Router, Filter, Update Strategy, Joiner, Sequence Generators and Stored Procedure
  • Worked with various Informatica Power Center tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Used Slowly Changing Dimension Mappings of type II.
  • Created reusable transformations and Mapplets and used them in complex mappings.
  • Wrote stored programs (Procedures & Functions) to perform data transformations and integrated them with Informatica programs and the existing application.
  • Used Workflow Manager for Creating, Validating, Testing and Running the sequential and concurrent Batches and Sessions, scheduled them to run at a specified time.
  • Extensively used PL/SQL for the creation of stored procedures (a sketch follows this list).
  • Parameterized all variables and connections at all levels on Windows NT.
  • Created Sessions, Worklets and Workflows for carrying out test loads.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica Mappings and tuned them for better performance.
  • Involved in creating test plans, test cases to unit test the mappings.
  • Involved in migrating the ETL application from development environment to testing environment.
  • Documented and presented the production/support documents for the components developed, when handing-over the application to the production support team.
  • Developed advanced reports using Report Studio, using Tabular Objects, Sub-reports, Cascading/customized prompts, cross-tabs, lists, charts and drill-through reports.
  • Customized data by adding Calculations, Summaries and Functions.
  • Created User Classes and Catalog Prompt Conditions.
  • Performed Unit Testing for the mappings using SQL Scripts.
  • Designed Excel sheets of test scenarios for each mapping.
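
A minimal sketch of the stored-procedure work referenced above: a small, hypothetical PL/SQL staging-load procedure compiled through SQL*Plus, of the kind called from an Informatica Stored Procedure transformation. All object names are placeholders.

    #!/usr/bin/env bash
    # Sketch: compile a staging-load procedure via SQL*Plus. Names are placeholders.
    sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'SQL'
    CREATE OR REPLACE PROCEDURE load_claims_stg (p_batch_id IN NUMBER) AS
    BEGIN
      -- Simple transform-and-load from a raw landing table into staging
      INSERT INTO stg_claims (claim_id, member_id, claim_amt, batch_id)
      SELECT claim_id,
             member_id,
             ROUND(claim_amt, 2),
             p_batch_id
        FROM raw_claims
       WHERE batch_id = p_batch_id;

      COMMIT;
    END load_claims_stg;
    /
    SHOW ERRORS
    EXIT
    SQL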

Environment: Informatica Power Center 8.1.3/7.1.3, Workflow Manager, Workflow Monitor, Erwin 4.2, SQL, Power Exchange, Flat Files, XML, Oracle 10g/9i, Teradata, MySQL 5.1, Cognos 8.2 BI Series, MS Excel, Windows NT/XP.
