
Informatica And Cognos Developer Resume


Los Angeles, CA

SUMMARY:

  • Over 8 years of progressively responsible IT experience with in-depth knowledge of system management, analysis, design, development, testing, production implementation, and maintenance. Extensive experience in report development/administration, along with advanced SQL coding.
  • Progressive hands-on experience in data warehousing and ETL processes using Informatica.
  • Strong knowledge and experience in full Cognos life cycle implementations and business workflows
  • Well versed with ETL procedures to load data from different sources like Oracle, flat files, XML files into DWH using Informatica Power Center.
  • Extensively used various Performance Tuning techniques to improve ETL performance.
  • Expertise in DWH technical architecture, design, business requirement definition and Data Modeling. Responsible for designing, coding, testing, integrating the ETL processes and implementing the reporting requirements.
  • Experienced in leading a team for the entire lifecycle of the project.
  • Solid understanding and sound knowledge of Data warehouse & DW Modeling concepts using Dimensional Data modeling, Star Schema modeling, FACT and Dimensions tables.
  • Experienced in working with Cognos 10.X, 8.X (Report Studio, Query Studio, Framework Manager), Cognos Series 7 (Impromptu, PowerPlay Web, Transformer, Upfront, Visualizer, NoticeCast, Decision Stream, Metrics Manager) and ReportNet (Query Studio, Report Studio, Framework Manager, Cognos Configuration)
  • Extensive experience in requirements gathering and end user training
  • Experienced in handling multiple projects simultaneously
  • Experienced in Teradata, Oracle, MS SQL Server, SQL Navigator, PL/SQL, MS Access, TOAD
  • Proficient in design, analysis, development, coding, testing, modification, and optimization of reports
  • Around three years of experience in design, coding, and development in C, C++, and OpenGL
  • Excellent verbal and written communication skills

TECHNICAL SKILLS

Operating Systems: UNIX, Windows NT/2000/XP/7

ETL Tools: Informatica Power Center 9.x/8.x/7.x, IBM Datastage 7.x

RDBMS: DB2, Teradata, Oracle 10.x/11g, SQL Server 2005/2000, Netezza, MS Access.

Languages: Java, JavaScript, HTML, XML, SQL, PL/SQL, C, C++, OpenGL, CLIPS

Modeling Tools: Erwin

OLAP /Reporting: Cognos 10.X, 8.X(Framework Manager, Report Studio, Analysis Studio, Event Studio, Metric Studio), Cognos 8.4 Go Office, Cognos ReportNet 1.1, Cognos 7.x(Impromptu, PowerPlay, Upfront Access Manager, IWR, PowerPlay Enterprise Server, Visualizer, NoticeCast, Metrics Manager, Decision Stream).

Tools: SQL Navigator, TOAD, MS Office 2000, MS Project

Web Servers: Apache/Tomcat, IIS, iPlanet Application Server, and Netscape Directory Server.

Scripting: JavaScript, PHP

Other Technologies/Utilities: XML, HTML, ASP, Java Swing

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA

Informatica and Cognos Developer

Responsibilities:

  • Led a team of 4 members; worked extensively on production load management, support, and maintenance.
  • Analyzed source data coming from different sources such as Oracle, DB2, and flat files.
  • Developed different Cognos reports using Report Studio and Query Studio, and modeled the data for the reports using Framework Manager.
  • Worked with business users and developers to develop the model and documentation for the different projects.
  • Data Modeling Tasks:
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Played active role in business requirements gathering and ambiguity reviews.
  • Re-designed and re-developed conceptual, logical, and physical data models using CA Erwin
  • ETL Development Tasks:
  • Involved in developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected Lookup, Router, Filter, Aggregator, Expression, SQL, Stored procedure, Normalizer and Update strategy transformations.
  • Extensively worked on Informatica Designer and Workflow Manager to design and develop various mapping and workflows.
  • Responsible for creating Workflows and sessions using Informatica workflow manager. Monitor the workflow run and statistic properties on Informatica Workflow Monitor.
  • Involved in the development of workflows and mappings for the movement of the data from Oracle to Netezza.
  • Worked on Informatica web services to move data from Oracle to Salesforce using cloud-based interaction, with the Netezza DB as source.
  • Implemented complex ETL logic using SQL overrides in the Source Qualifier.
  • Used SQL Assistant to query Teradata tables.
  • Strong experience with data masking methodologies.
  • Used dynamic data masking to decrease the risk of data breaches.
  • Performed ETL testing development work, tested the data and data integrity among various sources and targets by writing the SQL Scripts and validated the results with Business Analyst.
  • Developed Unix Scripts for updating the control table based on the environments.
  • Developed Informatica Mappings, Re-usable Transformations, re-usable mappings, Mapplets and re-usable sessions.
  • Cognos Tasks:
  • Designed and developed different health care reports using Report Studio.
  • Developed reports in Report Studio with drill through capabilities.
  • Involved in the complete project life cycle, including gathering requirements for report development and drafting Functional Requirement Specifications.
  • Created dynamic dashboards with multi-page portlets for business users for comparison analysis.
  • Worked in Analysis Studio to develop multidimensional reports.
  • Modeled metadata from the MS SQL Server data source for use in Report Studio and Query Studio using Framework Manager.
  • Worked with QA team to close the bugs identified on the Cognos reports-Used Quality Center for bug tracking.
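The data masking work above can be illustrated with a small sketch. This is not the actual Informatica ILM/dynamic-masking configuration (which is tool-driven, not hand-coded); it is a hypothetical Python example of the underlying idea: sensitive columns are replaced with consistent, irreversible surrogates so joins and counts still work while real values stay hidden.

```python
import hashlib

# Hypothetical masking helpers -- illustrative only, not the project's code.

def mask_ssn(ssn: str) -> str:
    """Format-preserving mask: keep only the last 4 digits visible."""
    digits = ssn.replace("-", "")
    return "XXX-XX-" + digits[-4:]

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Deterministically hash the local part; keep the domain for analytics.

    Deterministic masking means the same input always yields the same
    surrogate, so masked tables can still be joined on the masked column.
    """
    local, _, domain = email.partition("@")
    token = hashlib.sha256((salt + local).encode()).hexdigest()[:10]
    return f"{token}@{domain}"

row = {"ssn": "123-45-6789", "email": "jane.doe@example.com"}
masked = {"ssn": mask_ssn(row["ssn"]), "email": mask_email(row["email"])}
```
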

Environment: Informatica Power Center 9.1.1/9.5, DB2, Oracle 11g, Teradata, TOAD, Shell Scripts, CA Erwin, Windows XP, Cognos 10.1 (Framework Manager, Report Studio, Business Insights Advanced, Cognos Connection)

Confidential, Houston, TX

Informatica and Cognos Consultant

Responsibilities:

  • Responsible for creating deployment groups to migrate the folders, workflows and their dependencies between different environments.
  • Performed migrations of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
  • Worked on admin-related tasks in Informatica: creating users, managing users, and granting proper privileges.
  • Worked extensively on performance tuning of Informatica mappings, sessions, PL/SQL packages, and other processes, improving efficiency using caching and Informatica best practices.
  • Develop, test and maintain ETL procedures employing both ETL tools and custom PL/SQL.
  • Participated in the development and maintenance of a Data Warehouse routine load schedule.
  • Extensively used Netezza bulk loaders through Informatica PowerExchange connections to move data from Oracle to Netezza.
  • Good knowledge of Netezza DB architecture; involved in creating databases, tables, and indexes across the DEV, QA, and PROD environments.
  • Wrote SQL statements, created views, and coded on Netezza.
  • Implemented logic to control job dependencies between the workflows solely through the use of event-raise and event-wait tasks and entries made by ETLs in pilot database tables.
  • Used most of the transformations such as the Connected & Unconnected Lookups, Filters, Routers, Joiners, Stored Procedure transformations, Sequence Generators & Mapplets.
  • Worked on making session runs more flexible through the use of mapping parameters and variables by using parameter files and variable functions to manipulate them.
  • Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
  • Used Teradata Data Mover to copy data and objects such as tables, indexes, global temporary tables, and statistics from one system to another.
  • Implemented Slowly Changing Dimensions (Type-2).
  • Identified and resolved the bottlenecks in source, target, transformations, mappings and sessions to improve performance.
  • Designed the data warehouse and developed complex FM models, reports, dashboards, and OLAP solutions.
  • Followed the agile methodology and conducted scrum meetings every day with team members
  • Conducted technical grooming, backlog grooming and sprint planning sessions with team members.
  • Customized the BIA tool to allow users to create ad hoc reports
  • Customized the Cognos connection for security reasons
  • Designed the custom filter creation page for the users
  • Worked with QA team to help build the test plan and to test the various aspects of the application.
  • Designed and developed the pipeline functions to implement the user created filter scenarios.
  • Implemented the custom authentication provider security in Cognos.
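The Type-2 slowly changing dimension pattern mentioned above can be sketched in a few lines of SQL. This is a minimal, hypothetical illustration using Python's sqlite3 (table and column names are invented, not taken from the project): when a tracked attribute changes, the current dimension row is expired and a new version is inserted, preserving history.

```python
import sqlite3
from datetime import date

# In-memory stand-in for the dimension table; illustrative schema only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    cust_key   INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id    TEXT,      -- natural key from the source system
    city       TEXT,      -- Type-2 tracked attribute
    eff_date   TEXT,
    end_date   TEXT,      -- NULL means this is the current version
    is_current INTEGER
);
INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current)
VALUES ('C100', 'Houston', '2020-01-01', NULL, 1);
""")

def apply_scd2(conn, cust_id, new_city, load_date):
    row = conn.execute(
        "SELECT cust_key, city FROM dim_customer "
        "WHERE cust_id = ? AND is_current = 1", (cust_id,)).fetchone()
    if row and row[1] != new_city:
        # Expire the current version...
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE cust_key = ?", (load_date, row[0]))
        # ...and insert the new version with an open end date.
        conn.execute(
            "INSERT INTO dim_customer (cust_id, city, eff_date, end_date, is_current) "
            "VALUES (?, ?, ?, NULL, 1)", (cust_id, new_city, load_date))

apply_scd2(conn, "C100", "Dallas", str(date(2021, 6, 1)))
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY cust_key").fetchall()
# Two versions now exist: the expired Houston row and the current Dallas row.
```

In Informatica the same expire/insert split is typically done with a Lookup on the current dimension row feeding an Update Strategy transformation.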

Environment: Informatica Power Center 9.1.1/9.5, DB2, Teradata, Cognos 10.1 (Framework Manager, Report Studio, Business Insights Advanced, Cognos Connection, Cognos SDK), Oracle 11g, Windows XP/7, ESRI

Confidential, San Diego, CA

ETL Consultant

Responsibilities:

  • Coordinated source data feeds to Data Warehouse; planning and implementing data integration processes for multiple source systems
  • Studied the existing source databases and interacted with decision makers to analyze their business plans.
  • Conducted user interviews to understand and document the current "as-is" systems flows and support processes, in an effort to design the future "to-be" processes.
  • Conducted review sessions and code walkthroughs.
  • Worked on creating staging Tables, Constraints, Indexes, and Views.
  • Worked on creation of stored procedures, functions and packages.
  • Created SQL scripts for data validation.
  • Worked with heterogeneous sources including relational sources and flat files.
  • Developed ETL sessions for initial full loading and incremental loading.
  • Used various Transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedures, and Router while developing the ETL mappings.
  • Optimized/Tuned mappings for better performance and efficiency, identifying the bottlenecks.
  • Worked on different tasks in workflows, such as sessions, event-raise, event-wait, decision, e-mail, command, worklets, and scheduling of the workflow.
  • Used Informatica debugger to identify the data issues and fix the mappings.
  • Extensively used the ILM tool for data masking.
  • Responsible for retrofitting code to the QA environment and extending support to QA and UAT for bug fixes.
  • Used the pmcmd command to start and run workflows from the UNIX environment.
  • Proactively identified changes required within the production environment and worked on enhancements/change requests.
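The full-versus-incremental load split described above usually hinges on a high-water mark kept in a control table. Here is a hypothetical sketch of that mechanism using Python's sqlite3 (all table and job names are invented for illustration): each incremental run extracts only rows changed since the last successful run, then advances the mark.

```python
import sqlite3

# Illustrative source, target, and ETL control tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE etl_control (job_name TEXT, last_run TEXT);

INSERT INTO src_orders VALUES (1, 10.0, '2021-01-01'), (2, 20.0, '2021-02-01');
INSERT INTO etl_control VALUES ('orders_load', '1900-01-01');
""")

def run_incremental(conn, job, run_date):
    (last_run,) = conn.execute(
        "SELECT last_run FROM etl_control WHERE job_name = ?", (job,)).fetchone()
    # Extract only rows changed since the last successful run.
    conn.execute(
        "INSERT INTO tgt_orders "
        "SELECT order_id, amount, updated_at FROM src_orders "
        "WHERE updated_at > ?", (last_run,))
    # Advance the high-water mark for the next run.
    conn.execute(
        "UPDATE etl_control SET last_run = ? WHERE job_name = ?", (run_date, job))

run_incremental(conn, "orders_load", "2021-02-02")   # initial run picks up both rows
conn.execute("INSERT INTO src_orders VALUES (3, 30.0, '2021-03-01')")
run_incremental(conn, "orders_load", "2021-03-02")   # incremental run picks up only order 3
count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
```
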

Environment: Informatica Power Center 8.1, Oracle 9i, SQL Plus, Erwin, Windows XP, Visio.

Confidential, San Francisco, CA

ETL Consultant

Responsibilities:

  • Liaised with users and the technical team to convert business requirements into technical specifications.
  • Analyzed the requirements and created the high-level and detailed design documents.
  • Participated in creating the application architecture document.
  • Created design documents, mapping specifications, and unit test cases per the requirements.
  • Worked with the business to gather requirements and prepared requirements documents per Confidential standards.
  • Developed and implemented data migration and upgrade procedures.
  • Optimized physical data model performance.
  • Worked closely with the Insight Analytics team to provide strategic data through Informatica for business reports.
  • Worked with delimited and fixed-width flat files and loaded data into Salesforce.com using direct and indirect methods.
  • Extensively used the Salesforce Bulk API to load 27 million customer records in less than 6 hours.
  • Used pass through partitioning to improve the job runtime during data normalization.
  • Developed Unix scripts for splitting/sorting files, masking data, and running workflows from the command line.
  • Extensively worked with different transformations such as Aggregator, Expression, Router, Filter, Lookup, and Sorter.
  • Extensively used Normalizer transformation with 140 occurrences.
  • Created reusable mapplets and transformations.
  • Created user-defined functions and called them in multiple mappings.
  • Used mapping variables to remove the duplicates based on certain fields.
  • Created command tasks to invoke Unix scripts from Informatica.
  • Worked with workflow variables to pass values to the command task rather than hard-coding them.
  • Used the Upsert operation in Informatica to load data into Salesforce.com using a unique external ID.
  • Created Unix scripts to invoke the jobs using pmcmd and automated them with scheduling tool AUTOSYS.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Maintained all Informatica servers and performed all Informatica administration tasks, such as backing up the repository and domain and managing user privileges.
  • Deployed code from DEV to UAT and then to PROD using Informatica deployment groups.
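The duplicate-removal technique noted above (done in the project with Informatica mapping variables comparing key fields across sorted rows) can be sketched generically. This hypothetical Python example keeps one row per key, preferring the most recently modified record; field names are invented for illustration.

```python
# Hypothetical key-based deduplication, keeping the latest row per key.

def dedupe_latest(rows, key_fields, order_field):
    """Keep one row per key tuple, preferring the highest order_field value.

    Mirrors the mapping-variable pattern: remember the best row seen so far
    for each key and only replace it when a newer row arrives.
    """
    best = {}
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in best or row[order_field] > best[key][order_field]:
            best[key] = row
    return list(best.values())

rows = [
    {"acct": "A1", "email": "old@x.com", "modified": "2021-01-01"},
    {"acct": "A1", "email": "new@x.com", "modified": "2021-05-01"},
    {"acct": "B2", "email": "b@x.com",   "modified": "2021-03-01"},
]
deduped = dedupe_latest(rows, key_fields=["acct"], order_field="modified")
```
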

Environment: Informatica Power Center, SUSE Linux, Power Exchange for Salesforce.com with Bulk API, MS SQL Server

Confidential

ETL Consultant

Responsibilities:

  • Extensively involved in gathering requirements by holding meetings with users.
  • Constructed context diagrams and data-flow diagrams based on a description of a business process; analyzed the data model and identified heterogeneous data sources.
  • Constructed an extended entity-relationship diagram based on a narrative description of a business scenario.
  • Used Informatica Power center to load the data into data warehouse.
  • Defined Source and Targets in Informatica for ETL and created source to target mappings.
  • Developed Informatica mappings and mapplets and tuned them for optimum performance and dependencies.
  • Created reusable Transformations for modifying data before loading into target tables.
  • Created mapplets in the Informatica Designer which are generalized and useful to any number of mappings for ETL jobs.
  • Created Transformations using the SQL script to modify the data before loading into tables.
  • Created and used mapping parameters, mapping variables using Informatica mapping designer to simplify the mappings.
  • Used the Business Objects features Slice and Dice and Drill Down for multidimensional analysis.
  • Inserted objects, conditions, classes, subclasses, and user objects according to client requirements.
  • Prepared SQL Queries to validate the data in both source and target databases.
  • Managed the database objects Indexes, Triggers, procedures, functions, packages, cursors.
  • Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Worked in Monitoring & Production Support Activities.
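The source-versus-target validation queries mentioned above typically boil down to count comparisons and anti-joins. Here is a small hypothetical sketch using Python's sqlite3 (table names are invented): row counts are compared, and a LEFT JOIN anti-join surfaces keys loaded to the source but missing from the target.

```python
import sqlite3

# Toy source and target tables standing in for the real databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, val TEXT);
CREATE TABLE tgt (id INTEGER, val TEXT);
INSERT INTO src VALUES (1, 'a'), (2, 'b'), (3, 'c');
INSERT INTO tgt VALUES (1, 'a'), (2, 'b');
""")

# Check 1: row counts should match after a successful load.
src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]

# Check 2: anti-join to find source keys that never reached the target.
missing = conn.execute("""
    SELECT s.id FROM src s
    LEFT JOIN tgt t ON t.id = s.id
    WHERE t.id IS NULL
""").fetchall()
```
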

Environment: Informatica Power Center 8.1/8.6, Oracle 10g, SQL Server, TOAD, Windows NT and UNIX.
