Informatica Developer Resume
SUMMARY
- 7+ years of experience in Business Analysis, Business Intelligence, and CRM applications using Informatica PowerCenter 6.x, 7.x, and 8.x, BODS 3.2/4.0, Service-Oriented Architecture (SOA), database development, quality assurance and testing, and software design and development.
- Experience working with BODS 4.0, BODI 3.x, and BO XI 3.1, including architecture, CMC, clustering of BO servers, Universe Designer, InfoView, Web Intelligence Rich Client, QaaWS, Live Office, and Import Wizard.
- Expertise in preparing source-to-target (S2T) mapping documents, report specifications, and database designs to support ETL and reporting requirements.
- Experienced in developing data warehouse processes based on Informatica PowerCenter 7.1.2, 7.1.4, 8.1.1, 8.5.1, and 8.6, Oracle, SQL Server 2005, and SQL and PL/SQL.
- Expertise in star and snowflake schema designs for Financial, Sales, and Supply Chain Analytics.
- Expertise in performance tuning of Informatica ETL jobs, Cognos dashboards/reports, and Oracle SQL.
- Worked extensively on data migration.
- Expertise in working with Informatica client tools and fine-tuning Informatica mapping logic.
- Created numerous executive-level reports and dashboards providing insight into sales, marketing, and financial data.
- Worked extensively on ETL customizations using Informatica PowerCenter.
- Extensive experience developing complex Informatica mappings.
- Expertise in Informatica repository upgrades (7.x to 8.1, 8.5.1, and 8.6).
- Implemented SCD Type 2 logic.
- Worked extensively with Lookup, Aggregator, Joiner, Update Strategy, and Expression transformations.
- Expertise in Informatica data migration, Informatica PowerExchange, and PowerCenter administration (installation, upgrade, security, grid, migration).
- Worked extensively on UNIX scripting.
- Expertise in Oracle and SQL Server databases.
- Experienced in transforming business requirements into technical and functional specifications.
- Managed and motivated others in a cross-cultural/functional environment to achieve business and project objectives.
- Achieved consistently high performance in an area of rapid change while delivering high-quality business solutions.
- Delivered customer presentations to various business groups and areas.
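The SCD Type 2 logic noted above follows the standard expire-and-insert pattern: when a tracked attribute changes, close out the current dimension row and insert a new current version. A minimal sketch in portable SQL, run here through Python's sqlite3; the table, columns, and dates are illustrative, not taken from any project on this resume:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table with SCD Type 2 housekeeping columns (illustrative schema).
cur.execute("""CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id INTEGER,
    city TEXT,
    eff_date TEXT,
    end_date TEXT,
    is_current INTEGER)""")

# Existing current version of the record.
cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
            "VALUES (101, 'Jersey City', '2010-01-01', '9999-12-31', 1)")

# Incoming staged change: customer 101 moved.
staged = (101, 'Cary')

# Step 1: expire the current version if a tracked attribute changed.
cur.execute("""UPDATE dim_customer
               SET end_date = '2011-06-30', is_current = 0
               WHERE customer_id = ? AND is_current = 1 AND city <> ?""",
            (staged[0], staged[1]))

# Step 2: insert the new version as the current row.
if cur.rowcount:
    cur.execute("""INSERT INTO dim_customer
                   (customer_id, city, eff_date, end_date, is_current)
                   VALUES (?, ?, '2011-07-01', '9999-12-31', 1)""", staged)

rows = cur.execute("SELECT customer_id, city, is_current FROM dim_customer "
                   "ORDER BY sk").fetchall()
print(rows)  # old version expired, new version flagged current
```

In Informatica this same pattern is typically built with a Lookup on the dimension, an Expression to detect changes, and an Update Strategy to route the expire and insert paths.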
EDUCATION
- Bachelor of Mechanical Engineering
SKILLS
- ETL Tools: BODS, Informatica PowerCenter 8.x/7.x/6.x
- BI Tools: Business Objects 6.1/5.1/4.x, OBIEE
- Data Modeling: ERwin 4.0/3.x, Visio 2000
- RDBMS: Oracle 10g/9i/8i, SQL Server, MS Access
- Operating Systems: UNIX (Sun Solaris, HP-UX, Linux), Windows 2000/XP
- GUI Tools: Visual Basic, SQL Developer
- Tools/Utilities: Teradata load utilities (BTEQ, FastLoad, MultiLoad, TPump, FastExport), Queryman, TOAD
- Languages: PL/SQL, SQL, C, C++
- Data Cleansing: Informatica Data Quality 8.6.1/8.5, Informatica Data Explorer 8.6.1/8.5
EXPERIENCE
Sep ’11 – Present BODS Developer, Confidential, Jersey City, NJ
UBS presented one unified group to handle all Fixed Income cash and derivative products, assisting clients in managing their assets and liabilities. The Fixed Income division offers many investment options, including Global Bonds, US Bonds, Emerging Markets Debt, US High Yield, and Municipal Fixed Income. An intranet web application was used to handle customer profile data, projected positions, REPO deal capture, and investment performance tracking.
- Analyzed the business requirements with the business users.
- Participated in requirement-gathering sessions and developed Business Objects Data Integrator XI job specifications in Excel sheets based on analysis of source data, user requirements, and business rules.
- Designed jobs, complex workflows, and data flows.
- Created scripts, including a starting and ending script for each job, and declared local and global variables.
- Defined a separate datastore for each database to allow Data Integrator XI to connect to the source or target database.
- Tuned simple and complex transformations in data flows.
- Designed and developed Data Integrator scripts for data transformation.
- Created, validated, and executed jobs to transfer data from source to target.
- Debugged execution errors using the Data Integrator logs (trace, statistics, and error) and by examining the target data.
- Used scripts to declare variables, create tables, and insert data into them.
- Performed unit testing and documented the results.
- Documented the ETL process.
- Integrated Data Quality jobs with DI jobs for data cleansing and standardization.
- Performed data profiling using the DI/DQ and Data Insight tools to assess the quality of the data.
- Created DQ workflows to cleanse and standardize investigator names and addresses using the USA and multi-country address transforms.
- Integrated DQ workflows with DI ETL jobs, applying cleansing and matching logic.
- Used Data Insight to profile the source data and report on its quality.
- Planned, designed, and developed various reports based on the integrated assets reporting system.
- Documented technical and functional specifications for reporting requirements.
- Created universes, classes, objects, measure objects, prompts, conditions, and joins.
- Worked extensively on creating, updating, and maintaining universes using Designer.
- Resolved loops, fan traps, and chasm traps using contexts and aliases, and checked the cardinalities and integrity of the universes.
- Performed administrative tasks including repository configuration, Job Server configuration, central repository configuration, and job scheduling.
- Performed user, group, and folder administration.
- Developed complex reports using @Functions, including @Prompt (for user-defined queries), @Where (for conditional filters), and @Select, as well as cross-tab tables.
- Analyzed system functionality requirements, documentation, and scope of work.
- Created various scorecards and dashboards using Application Foundation for use by decision makers.
- Installed and configured BO and Oracle software on client machines.
- Created derived tables and aggregate tables using Aggregate Awareness, and created indexes and views to improve universe performance (Install Base.unv and Renewals.unv).
- Migrated reports from 6.5 to XI R2 using Import Wizard.
- Configured Crystal Reports 2008 to develop reports using custom-coded functions.
- Developed dashboards and scorecards using Performance Management.
Environment: SAP BODS 4.0, Business Objects XI 3.1/R2, Data Integrator XI R2, Data Insight XI R2, Data Quality XI R2, Oracle 10g/9i, Windows 2003 Server.
Apr ’10 – Aug ’11 BODS Developer, Confidential, OH
Cintas Corporation, based in Ohio, is a publicly traded company that operates more than 400 facilities throughout North America. The company provides highly specialized services to approximately 800,000 businesses, including the design and manufacture of corporate identity uniform programs, entrance mats, restroom supplies, promotional products, first aid and safety products, fire protection services, and document management services.
- Used Business Objects Data Services for ETL extraction, transformation, and loading of data from heterogeneous source systems.
- Gathered requirements and developed Business Objects Data Services job specifications in Excel sheets based on analysis of source data, user requirements, business rules, and enterprise standards.
- Designed technical specification documents from functional specs and analyzed the S2T mapping documents.
- Created batch jobs, annotations, conditional objects, workflows, data flows, template tables, and mappings, using various transformations in the staging area to transform, validate, and conditionally split source data into target tables.
- Worked with the Query, Case, Map Operation, Table Comparison, History Preserving, Key Generation, Merge, Row Generation, Validation, SQL, Data Cleanse, and Address Cleanse transforms to meet different client requirements.
- Used the interactive debugger to test jobs and fix bugs; scheduled and monitored batch jobs.
- Applied performance-tuning techniques to real-time jobs, batch jobs, and complex transformations in data flows.
- Administered and maintained the Data Services Management Console: scheduled batch and real-time jobs, registered repositories, and assigned users.
- Developed workflows and data flows that extract data from sources such as Oracle, SQL Server, and SAP ECC, load it into staging tables, then into relational tables, and finally into hubs.
- Performed unit testing and documented the results and the ETL process.
- Modified existing Business Objects Data Services jobs to fix defects at the root cause as part of project enhancements.
- Designed the universe by creating the BOBJ data model: selected and joined tables, indicated cardinalities, created aliases to resolve loops, subdivided the model into contexts, created objects (dimensions, details, and measures) grouped into classes, and checked the integrity of the universe.
Environment: SAP BODS XI 3.0, SAP BI 7.0, SAP BOBJ 3.x, Data Integration, Data Services 3.1, SAP BW, SQL Server 2005/2008, Oracle 11g, Windows XP.
Mar ’09 – Feb ’10 ETL Developer, Confidential, Marietta, GA
Reporting and GL Integration
ThyssenKrupp is a multinational conglomerate corporation consisting of 670 companies worldwide. While ThyssenKrupp is one of the world's largest steel producers, the company also provides components and systems for the automotive industry, elevators, escalators, material trading, and industrial services. ThyssenKrupp is structured into eight business areas that fall under two major divisions, Materials and Technologies. A system to track all orders, manufacturing costs, and contract margins, with an interface to the General Ledger for financial accounting, will ensure reconciliation of data from GOS (Global Order System). The reporting layer on this system will help product line and contract managers achieve optimal profitability.
- Interacted with the functional leads and business analysts to understand the requirements.
- Worked closely with the modeling team as technical SME to ensure reliable system design.
- Created mappings, sessions, worklets, and workflows to load the data into the ODS.
- Wrote and tuned SQL queries.
- Improved mapping performance by moving filter transformations early into the transformation pipeline, performing filtering at the Source Qualifier for relational sources, and selecting the table with fewer rows as the master table in Joiner transformations.
- Identified and tracked the slowly changing dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
- Implemented integrity constraints for data integrity, such as referential integrity using primary key and foreign key relationships.
- Used the Task Developer in Workflow Manager to define sessions.
- Created reusable worklets, mapplets, and transformations.
- Monitored all scheduled, running, completed, and failed sessions.
- Debugged the mappings that failed.
- Developed an ETL interface between the warehouse and the GL interface to book sales transactions to the General Ledger weekly for financial accounting.
- Created UNIX scripts to handle flat-file loads and send automated reconciliation reports to the functional owners.
- Developed an automated ETL interface for contract margin review.
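The filter-pushdown tuning described above rests on a simple idea: applying the predicate in the source query (the Source Qualifier override equivalent) means only qualifying rows ever enter the pipeline, instead of extracting everything and discarding most of it downstream. A small sketch using Python's sqlite3; the table and column names are hypothetical, not from the project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE gos_orders (order_id INTEGER, status TEXT)")
cur.executemany("INSERT INTO gos_orders VALUES (?, ?)",
                [(i, "OPEN" if i % 10 == 0 else "CLOSED") for i in range(1000)])

# Unpushed filter: extract every row, then filter late in the pipeline.
all_rows = cur.execute(
    "SELECT order_id, status FROM gos_orders ORDER BY order_id").fetchall()
open_rows = [r for r in all_rows if r[1] == "OPEN"]

# Pushed-down filter: only qualifying rows are ever read from the source.
pushed = cur.execute(
    "SELECT order_id, status FROM gos_orders "
    "WHERE status = 'OPEN' ORDER BY order_id").fetchall()

print(len(all_rows), len(open_rows), len(pushed))  # 1000 100 100
```

The result sets are identical, but the pushed-down version moves 10x fewer rows through the pipeline; the same reasoning motivates choosing the smaller table as the Joiner master, since the master is cached.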
Environment: Informatica PowerCenter 8.6.1, Teradata V12, Oracle 10g, DB2, WLM, UNIX, Windows XP.
Jan ’08 – Jan ’09 Informatica Developer, Confidential, Minneapolis, MN
Communication Satellite Mobile Messaging
The Data Architect focuses on both tactical (short-term) and strategic (long-term) uses of data within the application architecture and in the context of the project or program delivery objectives. Data Architects have responsibilities in areas such as data modeling, data structure design, data distribution design, data classification, and metadata management. Data architects are responsible for project level compliance in areas around data management, data standards, data governance, data ownership, and security according to the standards set by Enterprise Data Architecture. Data Architects support Integration Solution Architects and Designers on projects where data needs to be transported, consolidated, integrated, and / or cleansed. They are also responsible for defining logical and physical data structures that support effective reporting.
- Understand requirements associated with the application & Data Model
- Understand changes to the Enterprise Information Model & the impact to the existing Data Model
- Determine Data Model changes based on requirements & use cases for the Application
- Ensure alignment between the Data Model & the Enterprise Information Model as well as the Enterprise Information Architecture
- Identify and address application data issues that affect application integrity
- Work with the customer, users, technical architect, and application designers to define the data requirements & structure for the application
- Work with the integration solution architects & designers to design the integration solution
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth
- Review the database deliverables throughout development to ensure quality and traceability to requirements and adherence to all quality management plans and standards
Environment: Informatica PowerCenter 8.1/7.1, Teradata SQL Assistant, RMS, Windows XP, UNIX (Solaris)
Sep ’06 – Dec ’07 Informatica Developer, Confidential, Cary, NC
Dex One Corporation is an advertising company providing businesses with affordable and effective advertising. The project was to design, develop, implement, and maintain ETL components for the sales data mart to provide analytical information. The data mart was built with Oracle and flat-file sources and an Oracle target database.
- Interacted with Business Analyst to understand the business requirements.
- Upgraded Informatica PowerCenter 7.1.2 repository to 8.1.1
- Created users/groups and folders using Repository Manager.
- Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
- Extracted source data from source systems using Informatica tools and stored procedures.
- Developed transformation logic and designed various Complex Mappings and Mapplets Using the Designer.
- Designed various mappings using transformations such as Lookup, Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and Expression.
- Used mapplets, parameters, and variables to apply object-oriented techniques and facilitate code reusability.
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mappings.
- Extensively involved in Fine-tuning the Informatica Code (mapping and sessions), Stored Procedures, SQL to obtain optimal performance and throughput.
Environment: Informatica PowerCenter 7.1.2, ERwin, Oracle 9i, Shell Scripting, Windows 2000, HP-UX.