
Sr. Informatica Data Quality Developer/analyst Resume


Chicago, IL

SUMMARY

  • Around 9 years of strong data warehousing experience using Informatica PowerCenter, with extensive experience designing tasks, workflows, mappings and mapplets and scheduling workflows/sessions using Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1/6.2/5.1, PowerMart 6.2/5.1/5.0 and PowerExchange/PowerConnect.
  • Around 4 years of strong experience designing, testing, deploying and documenting data quality procedures and their outputs for data quality and data analysis with Informatica IDQ/IDE 8.x/9.1/9.5.
  • Strong business domain knowledge of financial, telecommunications, pharmaceutical and inventory management projects.
  • 5 years of experience in Cognos, Business Objects 6.1a/5.1.2/4.1, Web Intelligence v6/2.6/2.5, Broadcast Agent, Power Builder and Crystal Reports.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • 5 years' experience in UNIX shell scripting and PERL scripting.
  • 5 years' experience with SQL editors such as TOAD, PL/SQL Developer, BTEQ and Query Analyzer, and with utilities such as SQL*Plus and SQL*Loader.
  • 6 years' experience in OLTP/OLAP system study, analysis and E-R modeling, developing database schemas such as star schema and snowflake schema (fact tables, dimension tables) used in relational, dimensional and multidimensional modeling.
  • Experience in loading data into Data marts using Informatica PowerCenter ETL.
  • Extensive experience creating ETL mappings and transformations using Informatica PowerCenter to move data from multiple sources into the target area, using complex transformations such as Expression, Router, Lookup, Source Qualifier, Aggregator, Filter and Joiner.
  • Experience in integration of data sources such as Oracle, DB2, Teradata, COBOL, MQ, SQL Server and flat files.
  • Expertise in data cleansing, stored procedures, triggers and the test plans necessary to ensure successful execution of the data loading processes.
  • Hands-on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
  • Experience in MOLAP/ROLAP implementations.
  • Experience with AutoSys and Control-M.
  • Experience in working with Business Objects Universes & Query development.
  • Knowledge in E-commerce and Web related tools such as HTML, JavaScript, Applets, Servlets, Java Server Pages, EJB and XML.
  • Experience performing all functions of the System Development Life Cycle, including writing design documents and test plans and conducting unit, system and quality assurance testing and analysis for data warehousing projects.
  • Good team player with leadership abilities and excellent communication skills.

TECHNICAL SKILLS

Languages: C, C++, Java, COBOL, PERL, PL/SQL.

Databases: Oracle 11g/10g/9i/8i/8/7.x, Teradata, SQL Server, DB2, Netezza, MS Access, Sybase.

ETL: Informatica PowerCenter 9.5/9.1/8.6/7.1/6.2/5.1, PowerExchange 8.0, Informatica Data Quality (IDQ)/IDE 8.x/9.1/9.5, Informatica MDM Hub 9.1, Informatica Metadata Manager, MetaCenter, Oracle Warehouse Builder & SQL*Loader, Toad 7.6.

Reporting Tools: BusinessObjects 6.5/5.1, OBIEE 10.1.3.4, Cognos 8/7/6.0/5.0, Web Intelligence 6/2.6/2.5, Crystal Reports, Oracle Reports 2.5, PowerBuilder, DataFlux

Design Tools: Erwin 4.0/3.5, Oracle Designer / 2000, ER Studio

GUI: Developer/2000, Visual Basic

Internet Tools: HTML, JavaScript, VBScript, JSP, XML, ASP

Scheduling Tools: Autosys, Control-M

Other Tools: Quality Center, VSS, Test Director, JIRA

OS: Windows (NT, 2000, 2003, XP, Vista, 7), Linux (Red Hat, Fedora), UNIX (Sun Solaris)

PROFESSIONAL EXPERIENCE

Confidential, WI

Sr. Informatica Data Quality Developer/Analyst

Responsibilities:

  • Created profile reports in IDE to identify the strengths and weaknesses of source data, and built scorecards from the results.
  • Applied Informatica Data Quality standards, guidelines and best practices to development (mappings, workflows, logical data objects, profiling, scorecards) and migration, and automated the entire process.
  • Analyzed the existing data quality landscape to provide key insights into the scope and magnitude of DQ issues and the supporting processes and procedures.
  • Involved in conceptual, logical and physical data modeling and used a star schema in designing the data warehouse.
  • Identified key in-scope data sources and validated/defined critical data elements and business rules.
  • Optimized the performance of Informatica PowerCenter mappings through various tests on sources, targets and transformations; identified and removed bottlenecks and implemented performance tuning logic on targets, sources, mappings and sessions for maximum efficiency.
  • Exported valid and invalid records to data stewards and business analysts so they could understand the data.
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Created conceptual and logical models for the data warehouse using ER Studio.
  • Applied the completeness, conformity, integrity, timeliness and synchronization/consistency dimensions to create scorecards, trend charts, dashboards and ad hoc reports.
  • Identified data quality issues, anomalies and patterns based on defined business rules, and created metrics and scorecards through IDQ profiling.
  • Applied data analysis, data cleansing, data matching, exception handling, and reporting and monitoring capabilities in IDQ (Informatica Data Quality).
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica Data quality.
  • Created logical data objects in IDQ for profiling, joining multiple tables as required by the business.
  • Exported IDQ mappings to PowerCenter to maximize performance for high-volume loads.
  • Responsible for identifying reusable logic and building several rules/mapplets in Informatica Data Quality to speed up the profiling process.
  • Created various rules in IDQ to satisfy completeness, conformity, integrity, timeliness and synchronization/consistency requirements.
  • Extensively used IDE profiling on database and flat file data, applying different strategies to gain a detailed understanding of the data.
  • Applied the created rules within IDE profiles to produce scorecards for analysis by business users and data stewards.
  • Cleansed, standardized, labeled and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
  • Imported data services into MDM for data quality checks before the matching task was processed in Informatica MDM.
  • Created various financial checks in IDQ to match and validate amounts per business requirements, with scorecards to analyze the current process.
  • Tuned the IDQ development process to push as much processing as possible down to the database, and removed unwanted ports and transformations.
  • Worked extensively on address validation and name matching (standardization) tasks in IDQ.
  • Deployed workflows and data services as applications to automate the process, and used the monitoring tool to track execution and route the required human tasks.
  • Monitored landing, staging, base, match and Xref table changes to update or insert new data in Informatica MDM.
  • Profiled business data in IDE, applying column, primary key, functional dependency, foreign key and join profiles.
  • Built exception handling in IDD to highlight accepted and rejected records so data stewards could understand the exception data process.
  • Created XML data services and ran them as web services after deploying them in applications through IDS (Informatica Data Services).
  • Imported IDQ mapplets into Informatica PowerCenter and integrated them into the regular mappings for cleansing, standardization and address validation.
  • Analyzed high-level data profile results in IDE to determine properties such as name, uniqueness, nulls and data types.
  • Tuned the performance of Informatica PowerCenter sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Extensive Oracle SQL and PL/SQL experience and good knowledge of RDBMS concepts.
  • Worked with various stored procedures to populate summary tables.
  • Used MetaCenter for data lineage and the business glossary for the enterprise insurance data.
  • Extensively used the AutoSys scheduler to schedule Informatica workflows.
  • Developed Test Plans and written Test Cases to cover overall quality assurance testing.

Environment: Informatica PowerCenter 9.1/9.5, Informatica Data Quality (IDQ/IDE) 9.5, Informatica MDM Hub 9.1, IDD, IDS, SQL Developer for Oracle, Oracle 11g, SQL Server 2005/2008, Business Objects, ER Studio, DB2, PL/SQL, Core FTP, Sun Solaris 8.0, Shell Scripting, AutoSys, MetaCenter

Confidential, Chicago, IL

Sr. Informatica Developer

Responsibilities:

  • Replicated operational tables into staging tables, then transformed and loaded data from legacy systems into enterprise data warehouse targets using Informatica, scheduling the ETL workflows to load the targets.
  • Worked with the transformations introduced in IDQ 9.1 over IDQ 8.6, such as Address Validator, Match, Merge, Association, Decision, Labeler, Parser, Standardizer, Key Generator, Exception, Consolidation, Comparison and Case Converter, alongside the other PowerCenter transformations.
  • Created profile reports in IDE to identify the strengths and weaknesses of source data, and built scorecards from the results.
  • Involved in conceptual, logical and physical data modeling and used a star schema in designing the data warehouse.
  • Optimized the performance of Informatica PowerCenter mappings through various tests on sources, targets and transformations; identified and removed bottlenecks and implemented performance tuning logic on targets, sources, mappings and sessions for maximum efficiency.
  • Created conceptual and logical models for the data warehouse using Erwin.
  • Debugged Informatica PowerCenter mappings by creating breakpoints to gather troubleshooting information about data and error conditions.
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica PowerCenter.
  • Responsible for identifying reusable logic to build several Mapplets in Informatica PowerCenter which would be used in several mappings.
  • Created mappings to extract and de-normalize (flatten) data from XML files using multiple joiners with Informatica PowerCenter.
  • Extensively used IDE profiling on database and flat file data, applying different strategies to gain a detailed understanding of the data.
  • Escalated untrusted or probable-match records in Informatica MDM to data stewards for a final decision.
  • Cleansed, standardized, labeled and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
  • Upgraded from IDQ 8.6 to 9.1, replacing dictionaries with reference tables and applying the enhanced transformations so the output matched the 8.6 data.
  • Created match columns and rules applied to base tables in Informatica MDM.
  • Worked extensively on address validation and name matching (standardization) tasks in IDQ and IDE (9.1).
  • Profiled business data in IDE, applying column, primary key, functional dependency, foreign key and join profiles.
  • Created XML data services and ran them as web services after deploying them in applications.
  • Imported IDQ mapplets into Informatica PowerCenter and integrated them into the regular mappings for cleansing, standardization and address validation.
  • Cleansed landing data with Informatica IDQ before loading the staging area in Informatica MDM.
  • Reviewed properties such as name, uniqueness, nulls and data types in IDE profiles.
  • Worked with various stored procedures to populate summary tables.
  • Generated the WSDL URL for the deployed XML Informatica process.
  • Imported profiles and rules from IDE into IDQ 9.1 to fix issues at the mapping level.
  • Worked extensively on the IDQ/IDE 8.6 to 9.1 data upgrade and on the Informatica PowerCenter upgrade from 8.6 to 9.1.
  • Loaded bad records into separate tables in Informatica PowerCenter and IDQ and sent the data to SMEs for further analysis.
  • Extensively used TOAD to create target tables and accessed data.
  • Created and applied rules to profile flat file and relational data in IDE, then cleansed, parsed and standardized the data through IDQ mappings generated as mapplets in PowerCenter.
  • Worked in Informatica MDM (Siperian) to derive the trusted master record where conflicting duplicate data existed.
  • Validated the web services with SoapUI, from source to target XML files.
  • Wrote Oracle PL/SQL procedures to process business logic in the database and tuned database queries for better performance.
  • Created reference tables with the help of IDE and used them in IDQ to standardize IDs, names, addresses and other source data.
  • Monitored landing, staging, base, match and Xref table changes to update or insert new data in Informatica MDM.
  • Tuned the performance of Informatica PowerCenter sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Worked with business SMEs to develop the cleansing rules and applied them using the Informatica Data Quality (IDQ) tool to cleanse data.
  • Designed and documented validation rules, error handling routines and testing strategies for the Informatica PowerCenter mappings.
  • Modified several Oracle stored procedures that perform income calculations.
  • Wrote PL/SQL queries and stored procedures to fetch data from the OLTP system and executed them at regular intervals (a brief scheduling sketch follows this list).
  • Built various queries and packages in Informatica MDM hub to reuse the existing result set.
  • Used Informatica Data Explorer (IDE) for data profiling over metadata and Informatica Data Quality (IDQ 9.1) for data quality measurement.
  • Extensive Oracle SQL and PL/SQL experience and good knowledge of RDBMS concepts.
  • Automated box jobs in the Informatica MDM Hub using batch groups, and created multiple parallel staging loads along with a single base object load.
  • Extensively used Autosys scheduler to schedule Informatica workflows.
  • Responsible for preparing test plans, test procedures and test cases.
  • Wrote UNIX Korn shell scripts and used Control-M to schedule the sessions and workflows.
  • Developed Test Plans and written Test Cases to cover overall quality assurance testing.
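
The recurring PL/SQL runs mentioned above were typically driven from a scheduled shell script. Below is a minimal sketch, assuming a hypothetical summary-refresh procedure and connect string; the password is expected to come from the scheduler's environment rather than the script.

#!/usr/bin/ksh
# refresh_summary.ksh -- calls a (hypothetical) PL/SQL procedure that refreshes summary tables.
# DWH_PWD is assumed to be exported by the scheduler's environment; all names are placeholders.
ORA_CONN="dwh_user/${DWH_PWD}@DWHPROD"

sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
   -- hypothetical package procedure that aggregates OLTP data into summary tables
   dwh_pkg.refresh_income_summary(p_run_date => TRUNC(SYSDATE));
   COMMIT;
END;
/
EOF
if [ $? -ne 0 ]; then
    echo "`date` summary refresh failed" >&2
    exit 1
fi
echo "`date` summary refresh completed"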

Environment: Informatica PowerCenter 8.6/9.1, Informatica Data Quality (IDQ/IDE) 8.6/9.1, Informatica MDM Hub 9.1, SQL Developer for Oracle, Oracle 10g/11g, SQL Server 2005/2008, OBIEE, Erwin 4.0, DB2, PL/SQL, Core FTP, Power Builder, Sun Solaris 8.0, Shell Scripting, Control-M

Confidential, Los Angeles, CA

Sr. Informatica Power Center Developer /IDQ/IDE Developer

Responsibilities:

  • Teamed together with client resources to design the to-be processes, applying standards and facilitating the industry best practices.
  • Implementation of best practice integration methods based on Informatica PowerCenter framework.
  • Developed Extract, Transform and Load (ETL) routines using Informatica Power Center and Oracle as the backend databases.
  • Worked with the transformations introduced in IDQ 9.1 over IDQ 8.6, such as Address Validator, Match, Merge, Association, Decision, Labeler, Parser, Standardizer, Key Generator, Exception, Consolidation, Comparison and Case Converter, alongside the other PowerCenter transformations.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookup (connected and unconnected), Expression, Normalizer, Update Strategy and Stored Procedure transformations.
  • Profiled business data in IDE, applying column, primary key, functional dependency, foreign key and join profiles.
  • Created profile reports in IDE to identify the strengths and weaknesses of source data, and built scorecards from the results.
  • Generated the WSDL URL for the deployed XML Informatica process.
  • Worked with business SMEs to develop the cleansing rules and applied them using the Informatica Data Quality (IDQ) tool to cleanse data.
  • Involved in Creation of validation logic and reject reports at various stages during processing.
  • Worked in Informatica MDM (Siperian) to derive the trusted master record where conflicting duplicate data existed.
  • Involved in logical and physical data modeling with star and snowflake schema techniques using Erwin for the data warehouse as well as the data marts.
  • Understand the existing subject areas, source systems, target system, operational data, jobs, deployment processes and Production Support activities.
  • Responsible for identifying reusable logic to build several mapplets, utilizing shared folders to incorporate them as shortcuts that could be used in several mappings.
  • Involved in generating and applying rules to profile flat file and relational data by creating rules in IDE, then cleansing, parsing and standardizing the data through IDQ mappings generated as mapplets in Informatica PowerCenter.
  • Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Involved in updating OBIEE code according to requirements.
  • Responsible for developing several complex mappings for processing the feeds, redefined the existing process to accommodate the batch schedule using worklets, and developed Email tasks in the Workflow Manager for sending success or failure messages.
  • Created match columns and rules applied to base tables in Informatica MDM.
  • Identified and documented the strengths and weaknesses of source data using Informatica IDE.
  • Extensively used TOAD and SQL DEVELOPER to create target tables and accessed data.
  • Translated business requirements to Informatica PowerCenter Mappings.
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica PowerCenter.
  • Updated and modified existing code according to current business rules.
  • Used Informatica Data Explorer (IDE) for data profiling over metadata and Informatica Data Quality (IDQ 9.1) for data quality measurement.
  • Extensive Oracle SQL, PL/SQL experience and good knowledge in RDBMS concepts.
  • Involved in developing several UNIX Korn shell scripts for performing validations.
  • Imported Profiles and Rules from IDE to IDQ 9.1 to fix the issues at mapping level.
  • Designed relational data model for facilitating data loads to custom table in RDR involving target load order and constraint based loading, created several views for discoverer reports.
  • Automated the above process for loading data into several different instances.
  • Created Test cases by taking input from Business Analysts and Technical Manager.
  • Automated box jobs in the Informatica MDM Hub using batch groups, and created multiple parallel staging loads along with a single base object load.
  • Validated the web services with SoapUI, from source to target XML files.
  • Manually tested the feeds for functionality and performance.
  • Created parameter files in Informatica PowerCenter and passed them to Informatica PowerCenter tasks (a sample parameter file follows this list).
  • Executed Informatica PowerCenter Workflows individually and validated the data.
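
The snippet below is a minimal sketch of that pattern: a workflow-level parameter file is generated and then passed to pmcmd with -paramfile. The folder, workflow, session, connection and parameter names are illustrative assumptions, not the project's actual values.

#!/usr/bin/ksh
# build_and_run_with_params.ksh -- generates a parameter file and passes it to a workflow.
# All folder/workflow/connection names below are placeholders.
PARAM_FILE=/infa/params/wf_sales_load.par

cat > "$PARAM_FILE" <<EOF
[FIN_DW.WF:wf_sales_load.ST:s_m_load_sales]
\$DBConnection_Source=ORA_SRC_FIN
\$DBConnection_Target=ORA_DWH
\$\$LoadDate=$(date +%Y-%m-%d)
\$InputFile_Sales=/data/inbound/sales_daily.dat
EOF

# Run the workflow with the generated parameter file (placeholder service/domain/credentials).
pmcmd startworkflow -sv IS_DEV -d Domain_DEV -uv INFA_USER -pv INFA_PASSWD \
     -f FIN_DW -paramfile "$PARAM_FILE" -wait wf_sales_load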

Environment: Informatica PowerCenter 9.1/8.6/8.1, IDQ/IDE 9.1, Informatica MDM Hub 9.1, TOAD for Oracle, Oracle 10g/11g, UNIX, SQL Server 2005/2008, PL/SQL, Shell Scripting, Core FTP, OBIEE, AutoSys, Quality Center.

Confidential, NH

Sr. Informatica PowerCenter Developer/IDQ/IDE Developer

Responsibilities:

  • Developed mappings in Informatica PowerCenter to load the data from various sources using transformations like Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router etc.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so they can be corrected and reloaded into the target system.
  • Translated business requirements to Informatica PowerCenter Mappings.
  • Involved in database validation using complex SQL and PL/SQL.
  • Identified and documented the strengths and weaknesses of source data using Informatica IDE.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and lookup transformation to identify slowly changing dimensions.
  • Imported IDQ mapplets into PowerCenter and integrated them into the regular mappings for cleansing, standardization and address validation.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ/IDE 8.6.1 applications for the matching and merging process.
  • Worked with Informatica and other consultants to develop IDE/IDQ plans to identify possible data issues.
  • Developed the Informatica transformations to source various XML files.
  • Involved in data quality for source, target and metadata in Oracle.
  • Assessed and documented existing business processes, identifying where improvements could be made or new processes installed.
  • Involved in Informatica MDM (Siperian) to derive the trusted master record where conflicting duplicate data existed.
  • Extensively used TOAD and SQL DEVELOPER to create target tables and accessed data.
  • Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
  • Automated the XML archival process using XML transformations in Informatica PowerCenter.
  • Used Informatica Data Quality (IDQ/IDE 8.6.1) for data quality and analysis measurement.
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica PowerCenter.
  • Extensive Oracle SQL, PL/SQL experience and good knowledge in RDBMS concepts.
  • De-normalized the XML files and loaded them into the database using the Normalizer transformation in Informatica.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Validated the feed processing in different stages like File Watcher conditions, PRE ETL, ETL, POST ETL and Oracle Apps Import/Post processes.
  • Created parameter files in Informatica PowerCenter and passed them to Informatica PowerCenter Tasks.
  • Scheduled several jobs using AutoSys, creating AutoSys JILs, changing job definitions and handling scheduling issues (a sample JIL sketch follows this list).
  • Execution of Test plan, Implementation Plans & identified areas for process improvement of workflows, mappings in Informatica PowerCenter.
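
As a rough illustration of the JIL work referenced above, the sketch below defines one command job and loads it through the jil utility; the job, machine, script and dependency names are placeholders, not the actual production definitions.

#!/usr/bin/ksh
# define_etl_job.ksh -- loads a sample AutoSys job definition (all names are placeholders).
jil <<'EOF'
insert_job: DWH_WF_CLAIMS_LOAD   job_type: c
command: /infa/scripts/run_wf_claims_load.ksh
machine: etl_host_dev
owner: etluser
start_times: "02:00"
condition: s(DWH_CLAIMS_FILE_WATCHER)
std_out_file: /infa/logs/DWH_WF_CLAIMS_LOAD.out
std_err_file: /infa/logs/DWH_WF_CLAIMS_LOAD.err
alarm_if_fail: 1
EOF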

Environment: Informatica PowerCenter 9.1/8.6/8.1, IDQ/IDE 8.6, TOAD for Oracle, Oracle 10g/11g, UNIX, SQL Server 2005/2008, DB2, PL/SQL, Shell Scripting, Core FTP, Business Objects XI, AutoSys, Quality Center.

Confidential, MA

Informatica PowerCenter Developer

Responsibilities:

  • Teamed together with client resources to design the to-be processes, applying standards and facilitating industry best practices.
  • Implementation of best practice integration methods based on Informatica PowerCenter framework.
  • Responsible for automation and administration of OMB XML report generation, submission, and results and analysis processing (i.e., administration of federal reporting XML processing).
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Normalizer, Update strategy & stored procedure transformation
  • Identified and documented the strengths and weaknesses of source data using Informatica IDE.
  • Automated the ARRA ODS to data warehouse migration along with its parameters.
  • Automated reusable logic by building several mapplets, utilizing shared folders to incorporate them as shortcuts that could be used in several mappings.
  • Responsible for developing several complex mappings for processing the feeds, redefined the existing process to accommodate the batch schedule using worklets, and developed Email tasks in the Workflow Manager for sending success or failure messages.
  • Scheduled several jobs using AutoSys, creating AutoSys JILs, changing job definitions and handling scheduling issues.
  • Involved in updating code at OBIEE according to requirements.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit and Informatica Data Explorer (IDE) for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Rewrote PL/SQL routines using the Netezza nzsql and nzload utilities (see the sketch after this list).
  • Involved in writing several stored procedures and PL/SQL programs for the GL interfaces built.
  • Automated the above process for loading data into several different instances.
  • Created Test cases by taking input from Business Analysts and Technical Manager.
  • Manually tested the feeds for functionality and performance.
  • Validated the feed processing in different stages like File Watcher conditions, PRE ETL, ETL, POST ETL and Oracle Apps Import/Post processes.
  • Created parameter files in Informatica PowerCenter and passed them to Informatica PowerCenter Tasks.
  • Executed Informatica PowerCenter Workflows individually and validated the data.
  • Execution of Test plan, Implementation Plans & identified areas for process improvement of workflows, mappings in Informatica PowerCenter.
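
A minimal sketch of the nzload/nzsql pattern used when moving such a routine to Netezza: bulk-load the delimited file into a staging table, then replace the row-by-row PL/SQL logic with a single set-based statement. The database, table and file names are placeholders, and the Netezza user/password are assumed to come from the environment.

#!/usr/bin/ksh
# load_gl_to_netezza.ksh -- placeholder names throughout; NZ_USER/NZ_PASSWORD exported elsewhere.
DB=DWHPROD
TABLE=STG_GL_TXN
DATAFILE=/data/inbound/gl_txn.dat

# Bulk load the pipe-delimited file into the staging table.
nzload -db $DB -t $TABLE -df $DATAFILE -delim '|' || exit 1

# One set-based statement replaces the old row-by-row PL/SQL loop.
nzsql -d $DB -c "INSERT INTO GL_SUMMARY (ACCT_ID, PERIOD, AMT)
                 SELECT ACCT_ID, PERIOD, SUM(AMT) FROM $TABLE GROUP BY ACCT_ID, PERIOD;"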

Environment: UNIX, Informatica PowerCenter 8.6/8.1, IDQ/IDE, TOAD for Oracle, Oracle 10g/11g, Netezza, SQL Server 2005/2008, Erwin 4.0, DataFlux, PL/SQL, OBIEE, Power Builder, Core FTP, Sun Solaris 8.0, Shell Scripting.

Confidential, CA

Informatica PowerCenter Developer

Responsibilities:

  • Teamed together with client resources to design the to-be processes, applying standards and facilitating the industry best practices.
  • Implementation of best practice integration methods based on Informatica PowerCenter framework.
  • Responsible for Extract, Transform and Load Customer and Contacts for Data migration from ERP source systems into E1/Z tables.
  • Developed Extract, Transform and Load (ETL) routines using Informatica Power Center and Oracle/Netezza as the backend databases.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Normalizer, Update strategy & stored procedure transformation.
  • Involved in Creation of validation logic and reject reports at various stages during processing.
  • Responsible for identifying reusable logic to build several mapplets, utilizing shared folders to incorporate them as shortcuts that could be used in several mappings.
  • Rewrote PL/SQL routines using the Netezza nzsql and nzload utilities.
  • Scheduled several jobs using AutoSys, creating AutoSys JILs, changing job definitions and handling scheduling issues.
  • Developed processes to automate the loading of data into multiple instances using worklets and parameter-driven sessions on a batch schedule, with verification and reconciliation of data stored in several different source systems (a reconciliation sketch follows this list).
  • Executed Informatica PowerCenter Workflows individually and validated the data.
  • Execution of Test plan, Implementation Plans & identified areas for process improvement of workflows, mappings in Informatica PowerCenter.
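
Below is a minimal sketch of the kind of post-load reconciliation check used, comparing a source row count against the target through SQL*Plus; the connect strings and table names are placeholders.

#!/usr/bin/ksh
# reconcile_counts.ksh -- compares source vs. target row counts after a load (placeholder names).
SRC_CONN="erp_user/${ERP_PWD}@ERPPROD"
TGT_CONN="dwh_user/${DWH_PWD}@DWHPROD"

get_count () {
    # $1 = connect string, $2 = table name; prints the row count
    sqlplus -s "$1" <<EOF | tr -d '[:space:]'
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM $2;
EXIT
EOF
}

SRC_CNT=$(get_count "$SRC_CONN" CUSTOMER_MASTER)
TGT_CNT=$(get_count "$TGT_CONN" DW_CUSTOMER)

if [ "$SRC_CNT" != "$TGT_CNT" ]; then
    echo "`date` count mismatch: source=$SRC_CNT target=$TGT_CNT" >&2
    exit 1
fi
echo "`date` counts match ($SRC_CNT rows)"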

Environment: Informatica PowerCenter 8.6/8.1, Informatica IDQ 8.x, Informatica PowerExchange 8.0, SQL Developer for Oracle, Oracle 10g/11g, Netezza, SQL Server 2005/2008, Erwin 4.0, DB2, JD Edwards, Business Objects XI, Power Builder, Sun Solaris 8.0, Shell Scripting.

Confidential

Database Programmer

Responsibilities:

  • Extracted Data from Different Sources by using Informatica.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping designer and Mapplet Designer.
  • Created set of reusable transformations and Mapplets.
  • Extracted data from different sources of databases. Created staging area to cleanse the data and validated the data.
  • Designed and developed complex Aggregate, expression, filter, join, Router, Lookup and Update transformation rules.
  • Developed schedules to automate the update processes and Informatica sessions and batches.
  • Analyzed, designed, constructed and implemented ETL jobs using Informatica.
  • Developed mappings/Transformations/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 6.1.
  • Developed Shell scripts to setup runtime environment, and to run stored procedures, packages to populate the data in staging tables.
  • Used pmcmd commands to start, stop and ping the server from UNIX, and created UNIX shell scripts to automate these activities (a brief command sketch follows this list).
  • Created Users, user groups, database connections and managed user privileges using supervisor.
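
A few illustrative pmcmd calls of the kind wrapped in those shell scripts. The syntax shown is the command-mode form from later PowerCenter releases (the options differ in older 5.x/6.x versions), and the service, domain, folder and credential names are placeholders.

#!/usr/bin/ksh
# pmcmd_examples.ksh -- illustrative only; service, domain and credential names are placeholders.

# Check that the Integration Service is up.
pmcmd pingservice -sv IS_DEV -d Domain_DEV

# Start a workflow and wait for it to complete.
pmcmd startworkflow -sv IS_DEV -d Domain_DEV -uv INFA_USER -pv INFA_PASSWD \
     -f SALES_DM -wait wf_load_sales

# Stop a running workflow.
pmcmd stopworkflow -sv IS_DEV -d Domain_DEV -uv INFA_USER -pv INFA_PASSWD \
     -f SALES_DM wf_load_sales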

Environment: Informatica PowerCenter 5.1.1, Oracle 8i, MS SQL SERVER 2000, SQL, PL/SQL, SQL*Loader, UNIX Shell Script, TOAD 7.6 for Oracle.
