ETL Informatica Developer Resume
PROFESSIONAL SUMMARY:
- 6+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter.
- 4+ years of experience using Informatica PowerCenter (7.1.3/8.6.1).
- 1+ years of experience with the reporting tool Cognos (8.4).
- Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
- Knowledge in Full Life Cycle development of Data Warehousing.
- Extensively worked on developing ETL programs to support data extraction, transformation and loading using Informatica PowerCenter.
- Experience with dimensional modeling using star schema and snowflake models.
- Understand business rules completely from high-level document specifications and implement the corresponding data transformation methodologies.
- Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
- Developed OLAP applications using Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio, and Analysis Studio) and extracted data from the enterprise data warehouse to support analytics and reporting for corporate business units.
- Strong grasp of relational database design concepts.
- Extensively worked on Informatica performance tuning, resolving source-level, target-level and mapping-level bottlenecks.
- Strong business understanding of verticals like Banking, Brokerage, Insurance, Mutual funds and Pharmaceuticals.
- Independently perform complex troubleshooting, root-cause analysis and solution development.
- Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, flexibility in work schedules and good communication skills.
- Team player; motivated, quick to grasp new concepts, with analytical and problem-solving skills.
- Comprehensive technical, oral and written communication skills.
SOFTWARE KNOWLEDGE:
Operating Systems: Windows, Linux, HP-UX
Software / Applications: MS XP, MS 2000, MS Word, MS Excel, MS Access, Outlook, PowerPoint
Database: SQL Server 2008/2005/2000, Oracle 11g/10g/9i
ETL: Informatica PowerCenter 7.1.3/8.6.1, Informatica PowerExchange 8.6.1
Modeling: Framework Manager, PowerPlay Transformer
OLAP/BI Tools: Cognos 8 Series
Languages: Java, HTML, XML, SQL, PL/SQL
Web/App Servers: IBM WebSphere 4.x, Sun iPlanet Server 6.0, IIS, Tomcat
Tools: TOAD, Visio, Eclipse
Client: Confidential, Louisville, KY
Duration: October 2011 to date
Role: ETL Informatica Developer
Description:
Humana Inc., headquartered in Louisville, Kentucky, is a leading health care company that offers a wide range of insurance products and health and wellness services. Humana provides Medicare Advantage plans and prescription drug coverage to more than 3.5 million members throughout the US.
The main objective of this project, a shared data repository, is to capture data for new Vitality program customers, policies, group policies, and HumanaOne and non-HumanaOne Medicare plans.
Data comes from various sources such as SQL Server and mainframe files and is loaded into the EDW at different frequencies as per requirements. The entire ETL process consists of source systems, a staging area, the data warehouse and data marts.
Responsibilities:
- Developed ETL programs using Informatica to implement the business requirements.
- Communicated with business customers to discuss the issues and requirements.
- Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
- Used Informatica file watch events to poll the FTP sites for the external mainframe files.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Performed performance tuning at the functional and mapping levels; used relational SQL wherever possible to minimize data transfer over the network (see the sketch following this list).
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Worked effectively in a versioned Informatica environment and used deployment groups to migrate objects.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Worked in an onsite/offshore delivery model.
- Used pre- and post-session variable assignments to pass variable values from one session to another.
- Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
- Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Identified problems in existing production data and developed one time scripts to correct them.
- Fixed invalid mappings and troubleshot technical problems with the database.
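The relational SQL pushdown noted above can be illustrated with a Source Qualifier SQL override; this is a minimal sketch using hypothetical member/policy tables and a hypothetical $$LOAD_START_DATE mapping parameter, not the actual project schema:

    -- Hypothetical Source Qualifier override: the join and filter execute in the
    -- source database, so only qualifying rows cross the network to Informatica.
    -- Table, column and parameter names are illustrative only.
    SELECT m.member_id,
           m.first_name,
           m.last_name,
           p.policy_id,
           p.plan_code,
           p.effective_date
    FROM   member m
           JOIN policy p ON p.member_id = m.member_id
    WHERE  p.effective_date >= CAST('$$LOAD_START_DATE' AS DATE)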
Environment: Informatica 8.6.1, SQL Server 2008 R2, HP-UX
Client: Confidential, Chicago, IL
Duration: August 2010 to September 2011
Role: ETL Informatica Developer
Description:
Allstate is one of the fastest-growing auto, property and life insurance companies. It serves its customers by offering a range of innovative products to individual and group customers at more than 600 locations through its company-owned offices.
The primary objective of this project is to capture customer, policy, claims, agent, product and financial data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded into the data warehouse using Informatica PowerCenter, and various reports were generated on a daily, weekly, monthly and yearly basis. These reports detail the various Allstate insurance products sold and are used to identify agents for rewards, awards and performance evaluation, and to produce risk analysis reports for business development managers.
Responsibilities:
- Involved in all phases of the SDLC, from requirement gathering, design, development and testing through production, user training and support of the production environment.
- Created new mapping designs using various tools in Informatica Designer such as Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
- Developed mappings using the needed transformations in Informatica according to technical specifications.
- Created complex mappings that involved implementation of Business Logic to load data in to staging area.
- Used Informatica reusability at various levels of development.
- Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
- Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
- Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
- Built reports according to user requirements.
- Extracted data from Oracle and SQL Server, and used Teradata for data warehousing.
- Implemented slowly changing dimension methodology to preserve the full history of accounts (see the SCD Type 2 sketch following this list).
- Wrote shell scripts to run workflows in the UNIX environment.
- Optimized performance through tuning at the source, target, mapping and session levels.
- Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
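One common way to realize the slowly changing dimension logic mentioned above is an SCD Type 2 expire-and-insert pattern; the sketch below assumes hypothetical dim_account/stg_account tables and an Oracle target, and is illustrative rather than the actual project code:

    -- Illustrative SCD Type 2 pattern: expire the current dimension row when a
    -- tracked attribute changes, then insert a new current row.
    -- All table, column and sequence names are hypothetical.
    UPDATE dim_account d
    SET    d.current_flag = 'N',
           d.effective_end_date = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_account s
                   WHERE  s.account_id = d.account_id
                     AND  (s.account_status <> d.account_status
                           OR s.agent_id <> d.agent_id));

    INSERT INTO dim_account
           (account_key, account_id, account_status, agent_id,
            effective_start_date, effective_end_date, current_flag)
    SELECT dim_account_seq.NEXTVAL, s.account_id, s.account_status, s.agent_id,
           TRUNC(SYSDATE), TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
    FROM   stg_account s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_account d
                       WHERE  d.account_id = s.account_id
                         AND  d.current_flag = 'Y');

In Informatica itself the same pattern is typically built with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy to route expire versus insert rows.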
Environment: Informatica 8.6.1, Oracle 11g, SQL Server 2005, HP-UX.
Client: Confidential, San Jose, CA
Duration: April 2009 to July 2010
Role: ETL Developer
Description:
This position required implementing a data warehouse for forecasting, marketing and sales performance reports. The data is obtained from relational tables and flat files. I was involved in cleansing and transforming the data in the staging area and then loading it into Oracle data marts. These data marts and the data warehouse form an integrated data store that provides the feed for extensive reporting.
Responsibilities:
- Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
- Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
- Involved in extracting the data from the Flat Files and Relational databases into staging area.
- Migrated mappings, sessions and workflows from the Development environment to Test and then to UAT.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets (see the lookup override sketch following this list).
- Created sessions, extracted data from various sources, transformed data according to the requirements and loaded it into the data warehouse.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
- Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
- Developed several reusable transformations and mapplets that were used in other mappings.
- Prepared Technical Design documents and Test cases.
- Involved in unit testing and resolution of the various bottlenecks encountered.
- Implemented various Performance Tuning techniques.
- Used Teradata as a source system.
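As an example of the lookup SQL overrides mentioned above, a lookup on a large dimension can be restricted so the cache holds only the rows the mapping actually needs; the table and column names below are hypothetical:

    -- Illustrative Lookup SQL override: cache only current product rows so the
    -- lookup returns a single match per key and the cache stays small.
    -- dim_product and its columns are hypothetical names.
    SELECT prod.product_key,
           prod.product_id,
           prod.product_name,
           prod.product_category
    FROM   dim_product prod
    WHERE  prod.current_flag = 'Y'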
Environment: Informatica PowerCenter 8.1.1, Teradata, Oracle 11g, Windows NT.
Client: Confidential, Newark, NJ
Duration: July 2008 to March 2009
Role: ETL Informatica Developer
Description:
Prudential Financial companies serve individual and institutional customers worldwide and include The Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These companies offer a variety of products and services, including mutual funds, annuities, real estate brokerage franchises, relocation services, and more. The project involved the development and implementation of goals, policies, priorities and procedures relating to financial management, budgeting and accounting, as well as the analysis of monthly actual results versus plan and forecast.
Responsibilities:
- Involved in design, development and maintenance of database for Data warehouse project.
- Involved in Business Users Meetings to understand their requirements.
- Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 7.X.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created complex mappings that involved slowly changing dimensions, implementation of business logic and capturing deleted records in the source systems (see the sketch following this list).
- Worked extensively with the connected lookup Transformations using dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed unit testing, integration testing and system testing of Informatica mappings.
- Coded PL/SQL scripts.
- Wrote UNIX and Perl scripts for business needs.
- Coded UNIX scripts to capture data from different relational systems into flat files for use as source files for the ETL process.
- Created Universes and generated reports using the star schema.
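Capturing records deleted in the source systems, as mentioned above, can be sketched as a simple anti-join between the warehouse and the latest extract; the table names are hypothetical and the approach assumes the staging table holds a complete snapshot of the source:

    -- Illustrative soft-delete detection: flag warehouse rows whose keys no
    -- longer appear in the full source extract. Names are hypothetical.
    UPDATE dim_customer d
    SET    d.active_flag  = 'N',
           d.deleted_date = TRUNC(SYSDATE)
    WHERE  d.active_flag = 'Y'
      AND  NOT EXISTS (SELECT 1
                       FROM   stg_customer s
                       WHERE  s.customer_id = d.customer_id);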
Environment: Informatica PowerCenter 7.1.3, Oracle 11g, UNIX
Client: Confidential, Dearborn, MI
Duration: April 2007 to July 2008
Role: Cognos Developer
Description:
The Oakwood Healthcare System serves 35 different communities in southeastern Michigan with over 40 primary and secondary care locations. Responsibilities include working with the clinical analytics team on the measurement of provider performance, quality improvement initiatives, and various ad-hoc requests. The reports are created, distributed and published using various Cognos BI tools like ReportNet, Impromptu, Power Play, IWR, and UpFront to the end-users. The application had OLAP features like Drill Down analysis, Multidimensional analysis, Prompts, Exception Highlighting and User Privileges.
Responsibilities:
- Developed models in Framework Manager.
- Published packages and managed the distribution / setup of the environment.
- Used Query Studio for creating Ad-hoc Reports.
- Created complex and multi-page reports using Report Studio
- Performed migration from Impromptu to ReportNet.
- Used Schedule Management in Cognos Connection.
- Created burst reports and multilingual reports using Report Studio.
- Developed Layout, Pages, Object Containers and Packages using Report Studio.
- Created reports using ReportNet with multiple Charts and Reports.
- Responsible for assigning user Sign-Ons for the new users.
- Provided guidance to report creators for enhancement opportunities.
- Created Multidimensional Cubes using PowerPlay and published on the UpFront Portal using PowerPlay Enterprise Server.
- Developed PowerPlay Cubes, used multiple queries, calculated measures, customized cube content and optimized cube creation time.
- Fine-tuned the Cubes and checked the database space issue and cube growth periodically.
- Responsible for the creation of new User Groups and User Classes using Access Manager.
Environment: Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio), Oracle 11g, SQL Server 2005.
Client: Confidential, Bridgewater, NJ
Duration: November 2006 to April 2007
Role: ETL Analyst
Description:
Aventis is a pharmaceutical company that provides new and improved biotech drugs for various diseases and their symptoms. The objective of the project is to extract data stored in different databases and load it into an Oracle system, which is the staging area where business logic is applied to transform the tables as required. The data warehouse is fed by marketing data, sample data, market (competitor) data, prescription data and others.
Responsibilities:
- Extensively used ETL to load data from flat files, XML and Oracle sources into Oracle 8i.
- Involved in Designing of Data Modeling for the Data warehouse
- Involved in Requirement Gathering and Business Analysis
- Developed data Mappings between source systems and warehouse components using Mapping Designer
- Worked extensively on different types of transformations such as Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup, Stored Procedure, Sequence Generator, Joiner and XML.
- Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.
- Involved in performance tuning of the Informatica mappings, stored procedures and the SQL queries inside the Source Qualifier.
- Created, launched & scheduled sessions.
- Involved in the Performance Tuning of Database and Informatica. Improved performance by identifying and rectifying the performance bottle necks.
- Used Server Manager to schedule sessions and batches.
- Involved in creating Business Objects Universe and appropriate reports
- Wrote PL/SQL packages and stored procedures to implement business rules and validations (see the sketch following this list).
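A minimal sketch of the kind of PL/SQL validation procedure referred to above, assuming hypothetical staging and error tables rather than the actual project objects:

    -- Illustrative business-rule validation: prescription rows with a missing or
    -- future date are moved to an error table before the warehouse load.
    -- stg_prescription and stg_prescription_err are hypothetical names.
    CREATE OR REPLACE PROCEDURE validate_rx_stage IS
    BEGIN
      -- Copy the rows that violate the rule into the error table for review.
      INSERT INTO stg_prescription_err (rx_id, error_reason, load_date)
      SELECT rx_id,
             'Missing or future prescription date',
             SYSDATE
      FROM   stg_prescription
      WHERE  rx_date IS NULL
         OR  rx_date > SYSDATE;

      -- Remove the rejected rows from the clean staging table.
      DELETE FROM stg_prescription
      WHERE  rx_date IS NULL
         OR  rx_date > SYSDATE;

      COMMIT;
    END validate_rx_stage;
    /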
Environment: Informatica 7.1.3, Oracle 10g, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL, TOAD (Quest Software)
Client: Confidential, Bangalore, India
Duration: May 2005 to October 2006
Role: Oracle Developer
Description:
Core project focused on designing and implementing scalable solutions to support the company's continued dramatic growth, undergirded by the corporate data warehouse.
Responsibilities:
- Developed Oracle PL/SQL packages, procedures and functions
- Coded Oracle SQL to create ad-hoc reports on an as-needed basis (see the sketch following this list).
- Used Oracle Warehouse Builder to implement changes to the operational data store, as well as create data marts
- Involved in the data analysis for source and target systems. Good understanding of Data warehousing concepts, Star schema and Snow-flake schema.
- Involved in supporting and maintaining Oracle Import, Export and SQL*Loader jobs
- Involved in supporting and maintaining Unix shell script jobs
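A typical ad-hoc report query of the kind mentioned above, written against a hypothetical star schema (fact_sales with dim_date and dim_product) purely for illustration:

    -- Illustrative ad-hoc report: monthly sales by product category, joining the
    -- fact table to its dimensions. All object names are hypothetical.
    SELECT d.calendar_month,
           p.product_category,
           SUM(f.sales_amount) AS total_sales,
           COUNT(*)            AS order_lines
    FROM   fact_sales f
           JOIN dim_date    d ON d.date_key    = f.date_key
           JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.calendar_month, p.product_category
    ORDER BY d.calendar_month, p.product_category;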
Technical Details: Oracle 9, PL/SQL, Windows 98