ETL Developer Resume
OH
SUMMARY
- Around 6 years of experience in data warehousing with Informatica 9.5.1/9.1/8.x/7.x environments, specializing in analysis, debugging with real data, application support, implementations, and cleanup maintenance using the full Informatica tool set, including Data Quality, Data Analyst, and Power Center (Source Analyzer, Repository Manager, Mapping Designer, Mapplet Designer, Transformation Developer), as the ETL tool against Oracle, SQL Server, DB2, and MySQL databases, XML files, and flat files.
- Experience in Business Intelligence solutions, creating simple and complex reports using Report Studio and Query Studio in Cognos. Applied conditional formatting, conditional blocks, and different kinds of prompts to chart, graph, list, and crosstab reports using the Cognos BI tool.
- Expertise in gathering business requirements for Data Profiling and Data Quality Analysis/Solutions using Informatica tools.
- Strong knowledge of Informatica Data Quality 9.5.
- Expertise in software engineering project life cycles, including SDLC, RUP, and Agile methodologies, along with various levels of documentation.
- Excellent skills in application performance tuning, application testing, and code reviews and walkthroughs. Strong fundamental understanding of a variety of development languages and their capabilities.
- Involved in different phases of the Data Warehouse Life Cycle, including business reporting requirements gathering, source system analysis, logical/physical data modeling, ETL design/development, project deployment, and production support.
- Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Strong experience with Workflow Manager tools - Task Developer, Workflow Designer, and Worklet Designer.
- Debugged mappings by creating breakpoints to gain troubleshooting information about data and error conditions.
- Experienced in developing and maintaining batch programs as Control-M jobs and in exception handling for batch job scheduling.
- Good knowledge of handling file transfer issues with Secure FTP and FTP processes.
- Strong knowledge of Data Warehouse architecture and of designing Star Schema, Snowflake Schema, fact and dimension tables, and physical and logical data models using Erwin.
- Good understanding of relational database management systems; experience integrating data from various data sources like Oracle 8i/9i/10g/11g, MySQL, SQL Server, MS Access, DB2, XML files, and flat files into the staging area.
- Created dashboards and scorecards using Report Studio in the Cognos BI tool.
- Experience in dimensional modeling (Star Schema and Snowflake Schema) and knowledge of SOA.
- Experience working with UNIX and handling shell scripts.
- Good understanding of version control tools like VSS, CVS and GIT.
- Knowledge of installing and configuring the Informatica server with SQL Server and Oracle; able to handle Informatica Administrator tasks like configuring DSNs, creating connection strings, creating folders, defining users and groups, and assigning permissions.
- Working domain knowledge of the Financial, Banking, Insurance, Telecom, Health Care, and Retail domains.
- Team player and self-starter with excellent communication and interpersonal skills.
TECHNICAL SKILLS
ETL Tools: Informatica Data Quality / Data Analyst, Power Center 9.x/8.x/7.x (Data Quality, Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Mapplet Designer, Mapping Designer), Power Mart 5.1/6.2
BI Tools: Cognos 10/8.X, Query Studio, Report Studio.
Databases: Oracle 8i/9i/10g/11g, SQL Server 2008 R2/2005, MySQL and DB2
Tools: Oracle SQL Developer, TOAD, Putty and SQL Plus
Languages: C, C++, Java, VB, HTML, XML and SQL
Software Packages: MS Office 2003 (Word, Excel, PowerPoint), MS Visio 2010
Operating Systems: Windows 95/98/2000/XP/Vista, UNIX
Web servers: IIS Web Server and Tomcat
Version Control Tool: VSS, CVS and GIT
PROFESSIONAL EXPERIENCE:
Confidential, OH
ETL Developer
Responsibilities:
- Involved in analyzing stats requirements by participating in user meetings, and translated user inputs into design documents.
- Involved in creating Cognos reports from the Data Mart populated by the ETL process.
- Involved in infrastructure design discussions and participated in the Informatica installation on a Linux server in the development environment.
- Identified fact and dimension tables by analyzing existing SQL queries and flash reports.
- Created target structures per the definition for each identified Clarity source table.
- Followed an incremental load approach for facts and a full load approach for dimensions.
- Designed and implemented the incremental approach for loading the ODS layer.
- Extensively used MD5 functionality to identify true changes from the source to be applied to the ODS layer, which reduced run time by approximately 80% and enabled efficient processing of true incremental changes (see the SQL sketch after this list).
- Performed Source (Clarity, MKPharmacy, PSFin) to ODS mapping by developing complex mappings in Informatica to load data extracted from various sources using different transformations like Source Qualifier, Lookup (Connected and Unconnected), Expression, Aggregator, Update Strategy, Joiner, Filter, and Router.
- Developed mapplets to reuse transformation logic in multiple mappings throughout the project.
- Used Informatica Debugger to debug a valid mapping to gain troubleshooting information about data and error conditions.
- Developed ETL mappings to load data from ODS to Data Mart.
- Created and configured Workflows and Sessions to load data from the source (SQL Server) to the ODS (Oracle) database, and from the ODS to the MART database tables, using Informatica Workflow Manager.
- Worked on rationalizing and reconciling data issues for each statistic.
- Performed unit testing (creating test scripts) by verifying and validating each stat's data at the month, day, and ChartField grain.
- Worked on system testing against the TST and PROD environments and on validation of the data.
- Involved in post-go-live support for data mismatches and other issues.
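The MD5 change-detection pattern referenced above can be sketched in SQL. This is a minimal illustration only: the staging and ODS table and column names (stg_orders, ods_orders, amount, status) are hypothetical, and MySQL's MD5() function stands in for the hash that the actual mappings computed with Informatica's MD5() expression function.

    -- Pass along only rows that are new or whose tracked columns changed.
    -- (NULL-safe comparisons omitted for brevity.)
    SELECT s.order_id, s.amount, s.status
    FROM   stg_orders s                                -- hypothetical staging table
    LEFT JOIN ods_orders o                             -- hypothetical ODS table
           ON o.order_id = s.order_id
    WHERE  o.order_id IS NULL                          -- new row: no ODS match yet
       OR  MD5(CONCAT_WS('|', s.amount, s.status))     -- changed row: hash of the
           <> MD5(CONCAT_WS('|', o.amount, o.status)); -- tracked columns differs

Because unchanged rows are filtered out before any downstream logic runs, only true incremental changes reach the ODS load, which is where the run-time reduction comes from.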
Environment: Informatica Power Center 9.5.1, Oracle 11g, SQL Server, PeopleSoft, Skybot, SQL Developer, Shell Script, Putty and Linux.
Confidential, Chicago, IL
Informatica Developer
Responsibilities:
- Involved in setting up SDLC and best practices for Data Quality.
- As part of the Data Quality Management, responsible for gathering the requirements, designing rules, testing, deploying, and documenting the metrics.
- Maintained business Data Quality rules, taking soft backups on a periodic basis.
- Played a key role in creating and managing the Data Quality process, and in the development of DQ Rules, Profiles, Profile Models, and Scorecards for various business requirements.
- Created many Column Profiles and Profile Models with heterogeneous source systems.
- Developed a UNIX shell script that automates profiling and scorecard executions; the script also takes care of load status notifications and an exception report with the list of failed DQ rules.
- Provided assistance in creating Control-M scheduler jobs to automate the Data Quality process and reduce manual intervention.
- Performed Source to Target mapping by developing complex mappings in Informatica to load data extracted from various sources using different transformations like Union, Source Qualifier, Lookup (Connected and Unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router. Used the Debugger to test the mappings and fix bugs.
- Created and configured Workflows and Sessions to load data from Staging to Target database tables using Informatica Workflow Manager.
- Acted as a liaison between the Cognos reporting team and the ETL team for the reports developed using Cognos.
- Supported an onshore/offshore model to establish a data quality methodology: documenting a reusable set of processes; determining, investigating, and resolving data quality issues; establishing an ongoing process for maintaining quality data; and defining data quality audit procedures.
- Collaborated directly with the different LOB data owners to establish the quality business rules that would provide the foundation of the organization's data quality improvement plan.
- Assisted the business analysts/project manager/data stewards in defining or modifying the project Data Quality Rules based on business requirements.
- Actively involved in designing, developing, and testing Data Quality Rules, Profile Models, and Column and Primary Key Profiling for various data sources to determine root causes and ensure correction of data quality issues caused by technical or business processes.
- Imported profiling results into the data model using Cognos Framework Manager.
- Involved in creating Cognos reports from the Data Mart populated by the ETL process.
- Produced different measures and reports (in summary and in detail) for management on the progress of data quality improvement.
- Implemented and governed the best practices relating to enterprise metadata management standards.
- Conducted walkthrough reviews and profiling reviews, and guided users in understanding and using Informatica Analyst.
- Continuously worked with the Informatica vendor team in troubleshooting and solving issues with the tool.
Environment: Informatica Data Analyst 9.1, Informatica Data Quality 9.1, Informatica 9.1, Cognos 10, Cognos Framework Manager, Oracle 10g/11g, TOAD, UNIX, Korn shell scripts, Control-M.
Confidential, Columbus, OH
ETL / BI Developer
Responsibilities:
- Participated in user meetings and translated user inputs into ETL design documents.
- Performed Source to Target mapping by developing complex mappings in Informatica to load data extracted from various sources using different transformations like Union, Source Qualifier, Lookup (Connected and Unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router. Used the Debugger to test the mappings and fix bugs.
- Created and configured Workflows and Sessions to load data from Staging to Target database tables using Informatica Workflow Manager.
- Used Control-M for scheduling of data extraction, transformation and loading jobs related to different SOR systems.
- Tuned sources, targets, mappings, transformations, and sessions to optimize session performance.
- Worked with Informatica Version Controlling for Team Based Development to check-in and check-out mappings, objects, sources, targets, workflows etc.
- Implemented labels and exported/imported workflows, mappings, sessions between different repositories using Informatica Repository Manager.
- Acted as a liaison between the Cognos reporting team and the ETL team for the reports developed using Cognos 8.
- Developed reports using conditional blocks, conditional layouts, conditional formats, and Master-Detail (drill-through definition) reports in Report Studio, and simple and ad-hoc reports in Query Studio, using Cognos 8.
- Performed unit testing at different levels of defects to ensure that the analysis for the given data was correct.
- Involved in data analysis; used SQL Developer to write queries to help test the reports, and documented the defects identified in Quality Center.
- Performed Unit Testing, Integration Testing, and User Acceptance Testing to proactively identify data discrepancies and inaccuracies.
- Attended day-to-day meetings to achieve specific outcomes.
Software Environment: Informatica 8.6, Cognos 10/8, Oracle 10g, SQL, PL/SQL, DB2, SQL Server, Control-M, Shell/Perl Script, SQL Developer, Quality Center and UNIX.
Confidential, Charlotte, NC
ETL Developer
Responsibilities:
- Involved in analysis of source systems, data profiling and identification of business rules in building a Data Warehouse.
- Expertise in data warehousing solutions, data analysis, data mapping, and developing transformation logic using Informatica Power Center.
- Extensively used ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica Power Center 8.6.
- Performed Source to Target mapping by developing complex mappings in Informatica to load data extracted from various sources using different transformations like Union, Source Qualifier, Lookup (Connected and Unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router. Used the Debugger to test the mappings and fix bugs.
- Developed new and modified existing Informatica mappings and workflows based on specification.
- Developed slowly changing dimensions for SCD Type 1 and SCD Type 2 (a SQL sketch of both patterns follows this list).
- Involved in identifying bugs in existing mappings by analyzing dataflow and evaluating transformations using debugger.
- Used mapping parameters in mappings for incremental loads (see the source-filter sketch after this list).
- Automated the jobs through scheduling using the built-in Informatica scheduler.
- Demonstrated comprehensive understanding of Dimensional Data Modeling, E-R modeling, Relational modeling, Multi-Dimensional Database Schemas like Star schema and Snowflake schema.
- Debugged program logic in a structured and organized manner.
- Prepared test cases and conducted peer reviews to maintain proper logic.
- Performed unit testing at different levels of defects to ensure that the analysis for the given data was correct.
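The two SCD patterns above can be sketched in SQL. This is a minimal illustration under stated assumptions: dim_customer, stg_customer, and the effective_date/end_date/current_flag columns are hypothetical names (MySQL-style syntax), whereas the actual dimensions were loaded through Informatica mappings with Update Strategy transformations.

    -- SCD Type 1: overwrite the attribute in place; no history is kept.
    UPDATE dim_customer d
    JOIN   stg_customer s ON s.customer_id = d.customer_id
    SET    d.phone = s.phone
    WHERE  d.phone <> s.phone;

    -- SCD Type 2, step 1: expire the current version of each changed row.
    UPDATE dim_customer d
    JOIN   stg_customer s ON s.customer_id = d.customer_id
    SET    d.end_date     = CURRENT_DATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
      AND  d.address <> s.address;

    -- SCD Type 2, step 2: insert a fresh current version. Step 1 already
    -- expired the changed rows, so this one insert covers both brand-new
    -- customers and new versions of changed customers.
    INSERT INTO dim_customer (customer_id, address, phone,
                              effective_date, end_date, current_flag)
    SELECT s.customer_id, s.address, s.phone,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON  d.customer_id = s.customer_id
           AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;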
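The mapping-parameter approach to incremental loads works by filtering the source extract on a parameter that records the previous run's high-water mark. Below is a sketch of a Source Qualifier SQL override in that style; the parameter name $$LAST_EXTRACT_DATE, the table src_transactions, and its columns are assumptions for illustration. In practice the value is supplied through a parameter file and advanced after each successful run.

    -- Hypothetical Source Qualifier override (Oracle-style SQL): Informatica
    -- substitutes the $$LAST_EXTRACT_DATE mapping parameter before execution,
    -- so each run extracts only rows changed since the last run.
    SELECT t.txn_id,
           t.txn_amount,
           t.updated_at
    FROM   src_transactions t
    WHERE  t.updated_at > TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS');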
Software Environment: Informatica 8.6, Oracle 10g, SQL, PL/SQL, SQL Server, Shell/Perl Script and UNIX
Confidential, Roseland, NJ
ETL Developer / Application Support Delivery (ASD)
Responsibilities:
- Understood the existing system and requirements.
- Monitored Informatica daily refresh and weekly full load workflows, re-scheduled them per client requirements, and ensured the presence of flat files required by the workflows.
- Analyzed Informatica log files upon workflow failures and documented preliminary findings before escalating to the onsite team.
- Attended all business meetings (stabilization, escalation, etc.) and provided in-depth technical explanations of any issues raised by the business.
- Fully involved in daily/weekly/monthly ETL processes since joining the Investment Management (IM) team, including manual data loads, troubleshooting, analyzing the data loaded by running processes, and helping the business wherever possible.
- Responsible for submitting DBA requests and following up on them, identifying job failures, and creating Remedy tickets.
- Extensively used ETL methodologies to support data extraction, transformation, and loading in a corporate-wide ETL solution using tools such as Informatica Power Center.
- Designed and developed complex Update Strategy, Aggregator, Joiner, and Lookup transformations used in mappings.
- Used the Debugger while developing sessions and workflows that load data from source to target tables.
- Integrated all jobs using complex mappings, including mapplets and workflows, with Informatica Power Center Designer and Workflow Manager.
- Performed performance tuning to increase throughput at both the mapping and session levels, along with SQL query optimization.
- Automated jobs using the built-in Informatica scheduler, which runs every day while maintaining the data validations.
- Worked with different sources such as Oracle, MySQL, and flat files.
- Performed Informatica code migrations; tested, debugged, and documented programs to maintain them; and deployed the results.
- Prepared test cases and conducted peer reviews to maintain proper logic.
- Implemented quality standards in the mappings.
Software Environment: Informatica 7.1/8.1, Oracle, MySQL, SQL, PL/SQL, Sybase, Shell/Perl Scripting, Quality Center and UNIX.