
Sr. Informatica / ODI Developer Resume


Bothell, WA

SUMMARY

  • 8 years of experience in the analysis, design, development, implementation and troubleshooting of data warehouse applications.
  • Data integration experience in developing ETL mappings and scripts using Informatica Power Center 9.x/8.x/7.x and Power Exchange 8.x/7.x.
  • Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Managed ETL and data quality processes and integrated data using the Talend ETL tool.
  • Experience in SAS to Informatica migration.
  • Experience in Informatica to ODI migration.
  • Experience in Sybase to Oracle migration.
  • Experience in Java and JavaScript programming.
  • Experience in Insurance, Retail, Banking, Financial and Healthcare Domains.
  • Proficient in gathering business requirements, establishing functional specifications and translating them to Design Specifications.
  • Strong technical leadership and communication skills. Ability to lead and direct work of other developers.
  • Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels of data flow from source to target.
  • Worked on Informatica Cloud and developed jobs such as Data Synchronization Tasks, Data Replication Tasks, Mappings and Task Flows.
  • Worked with various sources and targets such as Oracle, SQL Server, Teradata, Salesforce (SFDC), Sybase, Netezza, DB2, XML, flat files, COBOL files, etc.
  • Expertise in replication technologies including GoldenGate, Streams, AQ and Materialized Views
  • Experience in creating PL/SQL Blocks, Stored Procedures, Functions, Triggers, Views, Materialized Views and Packages.
  • Used Unix Shell Scripts to automate day - to-day operations.
  • Strong working experience with PowerCenter Administration, Designer, the Informatica Repository Administrator console, Repository Manager and Workflow Manager.
  • Experience in SQL Server integration with SAP HANA.
  • Designed and developed mapplets in the Informatica Data Quality (IDQ) tool.
  • Experienced in UNIX shell scripting for file manipulation, scheduling and text processing.
  • Expertise in performance tuning of Informatica ETL mappings and SQL.
  • Involved in implementing Slowly Changing Dimensions (SCD) and Change Data Capture (CDC).
  • Experience in Informatica Administration on Windows - Creating Domains, Repositories, Users, Folders
  • Conversant with all phases of the project Software development life cycle. (Estimation, Analysis, Technical Design, Coding, Unit testing, Implementation and Production Support).
  • Worked in both Agile and Waterfall methodologies.
  • Ability to multitask with multiple deadlines or milestone requirements.
  • Ability to independently develop and lead complex projects.
  • Extensively worked with Netezza database to implement data cleanup, performance tuning techniques.
  • Good understanding about Netezza architecture.
  • Coordinated with the client and offshore team on various project activities including requirements, design, development and implementation.
  • Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading into targets.
  • Ability to work in a fast paced environment focusing on multiple projects at the same time
  • Good knowledge in developing and designing reports through SSRS and Excel.
  • Experience in Oracle Data Integrator (ODI).
  • Experience in creating High Level Design and Detailed Design in the Design phase.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements.
  • Experience in 2NF, 3NF and object models.
  • Well versed in OLTP Data Modeling, Data warehousing concepts.
  • Worked on the Amazon Redshift database hosted as a data warehouse product.
  • Worked on Java classes conforming to J2EE design patterns such as Business Delegate, Service Locator, Session Façade and Value Object, packaged to the J2EE specification and deployed on a BEA WebLogic 6.x application server.
  • Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
  • Experience in Designing Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions.
  • Experience in B2B DT performance optimization (Memory, Grid, Partitioning).
  • Experience in loading data from Oracle database and MS SQL Server database to Teradata database.
  • Experience in Compliance Management system.
  • Experience in data mining techniques.
  • Designed Mappings using B2B Data Transformation Studio.
  • Experience in Hyperion Oracle Data Relationship Management (DRM)
  • Experience in active data warehousing on the Teradata platform, extended use of Teradata Parallel Transporter, TPump, FastLoad and BTEQ, and migrating legacy systems into Teradata.
  • Experience in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and Surrogate Keys.
  • Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions and packages.
  • Experience in AWS (Amazon Web Services) and Redshift.
  • Experience in Agile project management, sprint planning and execution.
  • Worked on Transact-SQL and .NET services.
  • Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
  • Experience working with relational databases like Teradata, Vertica, Oracle and SQL Server.
  • Experience in the Data Validation Option (DVO).
  • Worked on the Unstructured Data Option (UDO), partitioning, etc.
  • Worked on the OBIEE Data Warehouse Administration Console (DAC).
  • Worked on creating Unit Test Cases for mapping and code reviews.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Created data summaries and dashboards using statistical and visualization tools to present results to management.
  • Worked with DBAs and systems support personnel in elevating and automating successful code to production. Used UC4 for job scheduling, workload automation and generating reports.
  • Strong experience in writing UNIX shell scripts and SQL scripts for development, automation of ETL processes, error handling and auditing purposes. Experience in using the UC4, Autosys and Control-M scheduling tools to organize and schedule jobs.
  • Experience in logical/physical data models and Forward/Reverse Engineering using ERwin.
  • Experience in Business Intelligence tools like Cognos 10.x/8.x/7.x, Business Objects XI R2.
  • Experienced in data warehouse reporting with Cognos 10.1/8.0/8.2 BI Series.
  • Created projects and models using Cognos Framework Manager and published packages to Cognos Server for report authoring.
  • Experience in report migration, such as migrating from Cognos 8.0 to Cognos 10.2.2.
  • Have knowledge of coordinating all of the entities involved in a supply chain.
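Several of the points above mention Change Data Capture (CDC). As a minimal illustration of the idea only (not code from any project listed here, and with hypothetical field names), incoming rows can be classified as inserts or updates by hashing the tracked columns and comparing against the current target:

```python
import hashlib

def row_hash(row, columns):
    """Concatenate the tracked columns and hash them to detect changes."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def detect_changes(source_rows, target_rows, key, tracked_columns):
    """Classify source rows as inserts or updates against the current target."""
    target_by_key = {r[key]: r for r in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        current = target_by_key.get(row[key])
        if current is None:
            inserts.append(row)            # key not yet in target
        elif row_hash(row, tracked_columns) != row_hash(current, tracked_columns):
            updates.append(row)            # tracked attribute changed
    return inserts, updates
```

In a PowerCenter mapping this comparison would typically live in a Lookup plus Expression transformation; the sketch just shows the comparison logic itself.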

TECHNICAL SKILLS

ETL: Informatica PowerCenter 9.x/8.x/7.x/6.x, B2B Data Transformation, PowerMart 6.1/5.1, PowerConnect 8.5.1/8.1.1, Informatica Data Quality (IDQ).

OLAP/DSS Tools: Hyperion Essbase 7.2, Business Objects 6.5.1, Supervisor, Designer, COGNOS (Impromptu 6.1, Transformer and PowerPlay 6.1), Crystal Reports 8.1, Crystal Enterprise, OBIEE 10.1.3.4.1

Database Modeling: ERwin 4.5/4.0/3.5

Databases: Oracle 12c/11g/10g/9i/8i, Teradata, SQL Server 2008, BIDS (SSIS, SSAS, SSRS)

GUI: Visual Basic 6.0, Developer/2000 (Forms 6i/6.0/5.0/4.5 and Reports 6i/6.0/3.0/2.5)

Languages: SQL, PL/SQL, C, COBOL, JAVA, C++

DB Tools: TOAD, SQL Developer, SQL*Plus, SQL*Loader, Import, Export

Scripting: HTML, XML, JavaScript, VBScript, ASP

OS: Windows 7/Vista/XP/2003/2000/NT/98/95, Sun Solaris 2.x, HP-UX, Red.

PROFESSIONAL EXPERIENCE

Confidential, Bothell,WA

Sr. Informatica / ODI Developer

Responsibilities:

  • Developed complex Informatica mappings to load the data from various sources using different transformations like Source qualifier, Connected and Unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Rank and Router Transformations.
  • Created the Test Plans, Test Cases for the project.
  • Worked with Informatica PowerCenter tools such as Source Analyzer, Mapping Designer, Mapplet Designer and Transformations.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.
  • Experienced in Data and statistical analysis.
  • Developed, Supported and Maintained ETL (Extract, Transform and Load) processes using Oracle Data Integrator (ODI).
  • Developed numerous ODI Interfaces, packages, scenarios to extract data from different source systems such as Oracle EBS (AR, AP, GL, OM Modules), Legacy Systems such as Mainframe, Flat files and Load into ODS/EDW.
  • Responsible for setting up ODI in all the environments (DEV, UAT and PROD) which includes all the tasks from Installing and configuring ODI to releasing the environment to users (assigning security, client installations and other related tasks).
  • Develop the migration process required to migrate the ODI configuration through various test environments including the production and contingency environments.
  • Export/Import data from DRM using the DRM migration modules.
  • Migrated all Informatica PowerCenter mappings to Oracle Warehouse Builder 12c and deployed them successfully.
  • Migrated the packages, scenarios from one repository to another repository.
  • Experienced in Installing and Configuring Oracle Data Integrator (ODI) software tool in a three-tier environment and performing periodic upgrades, performing source-to-target mappings, storage capacityplanning and developing ETL.
  • Created numerous Views, Materialized Views, Stored Procedures and Functions which were used in ODI and OBIEE.
  • Designed and developed UNIX shell scripts for the automation of ETL jobs. Designed mappings using B2B Data Transformation Studio.
  • Developed Informatica mappings and also tuned for better performance.
  • Responsible for optimizing and reengineering ETL jobs by improving SQL query performance and table projections in a multi-TB Vertica database.
  • Responsible for developing ETL programs to apply transformations to the staged data and load it into the target schema in Vertica. Utilized Python, SQL, analytical functions and UNIX shell to load data.
  • Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C, data structures in C, UNIX scripting, Python scripting and Perl scripting.
  • Experience with UNIX shell and Python scripting for file validation and SQL programming.
  • Responsible for loading data files from various external sources like Oracle and MySQL into the staging area in MySQL and Vertica databases. Utilized Python, SQL and UNIX shell to load data.
  • Extracted data from Flat files and Oracle and loaded them into Teradata.
  • Developed complex mappings, mapplets using Informatica workflow designer to integrate data from varied sources like Teradata, Oracle, Flat files and loaded into target.
  • Load operational data from Oracle, SQL Server, flat files, Excel Worksheets into various data marts like PMS and DEA.
  • Developed, tested and implemented UNIX shell scripts, FTP/SFTP file transfers, Teradata BTEQ, FastLoad, MultiLoad, TPump and FastExport, Oracle/DB2 stored procedures and bulk loads.
  • Generated periodic reports based on statistical analysis of the data from various time frames and divisions using SQL Server Reporting Services (SSRS).
  • Developed various operational drill-through and drill-down reports using SSRS.
  • Worked with the B2B Operation Console to create partners and configure partner management, event monitors and events.
  • Experience in Informatica B2B Data Transformation that supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards which govern the data formats.
  • Designed and developed data warehouse solutions, integrated new files and data from other applications into the enterprise data warehouse, and utilized Teradata for reporting, OLAP and history.
  • Experienced in data warehouse reporting with Cognos 10.2.2/8.0/8.2 BI Series.
  • Expertise in creating complex reports in Cognos Report Studio, such as list reports, cross-tab reports, drill-through reports, master-detail reports and cascading reports, and involved in report performance tuning.
  • Experience in Business Process Outsourcing (BPO) with Informatica.
  • Worked closely with the OBIEE team in building the RPD and dashboard.
  • Customized the OBIEE dashboard.
  • Experience with the Data Validation Option (DVO), which reads table definitions from PowerCenter metadata repositories.
  • Developed Java classes conforming to J2EE design patterns such as Business Delegate, Service Locator, Session Façade and Value Object, packaged to the J2EE specification and deployed on a BEA WebLogic 6.x application server.
  • Performed cleansing/standardization, address validation, de-duplication and cleaning of the legacy source data using Trillium data quality tools.
  • Exposure to Informatica Cloud Services.
  • Developed Procedures and Functions in PL/SQL.
  • Used Stored Procedure transformations to invoke Oracle PL/SQL procedures.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Used Data Transformation Studio to transform unstructured to structured forms
  • Experience in validating and creating complex XML and JSON documents, including XML extensions.
  • Extensively worked with XML files as the Source and Target, used transformations like XML Generator and XML Parser to transform XML files, used Oracle XMLTYPE data type to store XML files.
  • Used major components like Serializers, Parsers, Mappers and Streamers in Data Transformation Studio for conversion of XML files to other formats.
  • Designed and Developed ETL logic for implementing CDC and Replication Technologies by tracking the changes in critical fields required by the user.
  • Extensively used Informatica to load data from Flat files, Oracle database.
  • Extensively performed Data Masking for preserving the referential integrity of the user data.
  • Worked closely with architects, leads and the project manager on application assessments for the Data Masking team on the proxy server, and provided support on the databases and applications.
  • Performed Data Encryption on user data and client data for maintaining consistency and security.
  • Responsible for Performance Tuning at the Mapping Level and Session level.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Extensively worked with both Connected and Unconnected Lookup Transformations.
  • Load balancing of ETL processes, database performance tuning and capacity monitoring.
  • Used Informatica PowerCenter 9.x to extract, transform and load data into the Netezza data warehouse from various sources like Oracle and flat files.
  • Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Involved in unit testing and system testing of the individual mappings.
  • Analyzed existing system and developed business documentation on changes required.
  • Performed detailed analysis of the data elements requested by the business and of the data sources, documenting data source definitions, source-to-target mappings and logical structures for the data warehouse/data mart.
  • Was involved in design, development, and the administration of Oracle data mart, the distribution of global data from the Data Warehouse.
  • Used UNIX to create Parameter files and for real time applications.
  • Developed shell scripts.
  • Extensively involved in testing the system end to end to ensure the quality of the adjustments made to accommodate the source system upgrade.
  • Worked with many existing Informatica mappings to produce correct output.
  • Prepared detailed design documentation for the production support department to use as a guide for future production runs before code migration.
  • Prepared Unit Test plan and efficient unit test documentation was created along with Unit test cases for the developed code.
  • Prepared SQL scripts for validation of the ETL business logics involved.
  • Prepared the SQL scripts and validated the data shown in the reports.
  • Responsible for scheduling the Test status calls with users and the project management team.
  • Prepared the test sign off documents.
  • Created detailed system defect reports to keep the project team informed of status throughout the process.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
  • Experience in replication using GoldenGate and materialized views.
  • Replicated and extracted data and applied it in production using GoldenGate.
  • Used Update Strategy expressions (DD_INSERT, DD_UPDATE) to insert and update data when implementing Slowly Changing Dimension logic.
  • Developed Re-Usable Transformations and Re-Usable Mapplets.
  • Developed Slowly Changing Dimensions Mapping for Type 1 SCD and Type 2 SCD.
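The Type 2 SCD pattern described in the bullets above (expire the current version via DD_UPDATE, insert a new version via DD_INSERT) can be sketched in plain Python. This is an illustrative sketch only, with hypothetical column names, not code from the engagement:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Expire the current dimension row and insert a new version when a
    tracked attribute changes (the DD_UPDATE / DD_INSERT split in a mapping)."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # brand-new key: plain insert
            dimension.append({**row, "effective_from": today,
                              "effective_to": None, "is_current": True})
        elif any(row[c] != existing[c] for c in tracked):
            existing["effective_to"] = today          # expire old version (DD_UPDATE)
            existing["is_current"] = False
            dimension.append({**row, "effective_from": today,
                              "effective_to": None, "is_current": True})  # DD_INSERT
    return dimension
```

A Type 1 SCD would instead overwrite the tracked columns in place, with no effective-date bookkeeping.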

Environment: Informatica PowerCenter 9.1/8.6, Power Exchange, B2B, Informatica Analyst 9.6.1, Informatica Developer 9.1, Talend RTX 4.x, Windows NT, Oracle 12c/10g/9i, ODI 12.1.1.1/11.1.1.7, Teradata R12/R13, Teradata SQL Assistant, SQL/PL-SQL, T-SQL, Netezza 4.x, TOAD, XML, Cognos 10.x, Reporting Services (SSRS), Informatica Cloud Services

Confidential, Columbia, MD

Sr. Informatica Developer

Responsibilities:

  • Responsible for converting Functional Requirements into Technical Specifications.
  • Worked with the team to Integrate data from multiple source systems like Health Answers, Power system, Health Plex into Single Data Warehouse system.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, EDI files processing and Mapping Designer.
  • Created DSN connection to a Redshift database.
  • Strong Working Experience of PowerCenter Administration, Designer, Informatica Repository Administrator console, Repository Manager and Workflow Manager.
  • Worked on Administration of Informatica PowerCenter and Ascential DataStage.
  • Creating mappings with different transformations like Parser transformation, Standardizer transformation, Address Validator in Informatica developer (IDQ) tool.
  • Used Address validator transformation in IDQ and passed the partial address and populated the full address in the target table.
  • Used major components like Serializers, Parsers, Mappers and Streamers in Data Transformation Studio for conversion of XML files to other formats.
  • Created various data marts from data warehouse and generated reports using Cognos
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator.
  • Experience in healthcare standards such as HL7, FHIR and FHIM.
  • Experience in healthcare messaging standards including HL7, CCD and X12.
  • Extracted data from Flat files and Oracle and loaded them into Teradata.
  • Designed the ETL processes using Informatica to load data from Oracle to target Teradata Database.
  • Loaded the data into the Teradata database using Load utilities like FastExport, FastLoad, MultiLoad, and Tpump.
  • Developed complex mappings, mapplets using Informatica workflow designer to integrate data from varied sources like Teradata, Oracle, Flat files and loaded into target.
  • Worked with shortcuts across shared and non-shared folders.
  • Created pre-SQL and post-SQL scripts to be run at the Informatica level.
  • Developed Oracle views to identify incremental changes for full extract data sources.
  • Developed Stored Procedures which truncates the Tables and reinserts all/partial records depending on the “Load Flag” Parameter passed to the procedure. All messages/errors/load times are logged into the log table.
  • Developed the automated and scheduled load processes using Fact scheduler.
  • Involved in migration of mappings and sessions from development repository to production repository.
  • Responsible for Unit testing and Integration testing of mappings and workflows.
  • Worked with NZ Load to load flat file data into Netezza tables
  • Assist DBA and Architect to identify proper distribution keys for Netezza tables.
  • Created mappings using pushdown optimization to achieve good performance in loading data into Netezza.
  • Created and configured workflows, worklets and sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
  • Worked on production tickets to resolve the issues in a timely manner.
  • Populated and updated data dictionary, standardized naming convention.
  • Familiar with the OBIEE Data Warehouse Administration Console (DAC).
  • Created OBIEE source system containers.
  • Acquired knowledge of integrating the Informatica workflows into OBIEE.
  • Able to manage Queries in Teradata and connect to it through ODBC for pulling out data.
  • Restored the relational connections for the Informatica workflows in OBIEE DAC.
  • Experienced in configuring incremental loading in DAC.
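Incremental loading of the kind configured in DAC (and implemented via the Oracle views mentioned above) usually relies on a high-water-mark timestamp. A minimal sketch of that pattern, with hypothetical column names, purely for illustration:

```python
def incremental_extract(rows, last_run_ts, ts_column="last_updated"):
    """Return only rows modified since the previous load, plus the new
    high-water mark to persist for the next run."""
    changed = [r for r in rows if r[ts_column] > last_run_ts]
    new_mark = max((r[ts_column] for r in changed), default=last_run_ts)
    return changed, new_mark
```

The persisted mark is typically stored in a control table and passed to the session as a mapping parameter, so each run only touches rows changed since the last successful load.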

Environment: Informatica PowerCenter 9.1, Informatica Data Quality, Teradata, Oracle 11i, Netezza 4.x, XML, DB2 - Surveyor 400, Solaris UNIX, WinSCP, TOAD, Fact, Cognos 8.x/10.x, ODI

Confidential, Grand Rapids, MI

Informatica Developer / Technical Specialist

Responsibilities:

  • Review and test Informatica ETL code and execute the workflows for business validations.
  • Migrated Teradata and ETL object code from the Dev environment to QA using the version control software AccuRev 5.
  • Created complex mappings using Unconnected Lookup, Sorter, and Aggregator, Dynamic Lookup and Router transformations for populating target table in efficient manner.
  • Building the ETL architecture for Source to Target mapping to load data into Data warehouse.
  • Loaded filtered data into Teradata as target using fastload connection and multiload connection using Informatica.
  • Experience in Financial (Investment Management) Domain.
  • Worked On PowerExchange for Web Services Connections.
  • Worked on ODI migration.
  • Created numerous executive level reports. ‘Dashboards’ were generated to provide insight into the sales/marketing/financial data.
  • Wrote scripts to massage data and feed it to Sybase IQ/Oracle databases for alert generation, automated in the batch cycle using Autosys.
  • Worked on the Talend RTX ETL tool, developed jobs and scheduled jobs in Talend Integration Suite. Extensively used ETL concepts to load data from AS/400 and flat files to Salesforce.
  • Modified reports and Talend ETL jobs based on feedback from QA testers and users in the development and staging environments.
  • Involved in the development of Informatica mappings and performed tuning for better performance; scheduled Informatica B2B Data Transformation jobs.
  • Designed and developed sophisticated workflows and data transformations involving the Informatica B2B components DX (Data Exchange) and DT (Data Transformation).
  • Created and executed jobs on the Tidal Enterprise Scheduler.
  • Experience in creating Web Services Consumer application connection.
  • Unix scripting as required for configuration of environments for different jobs and dependencies of workflows.
  • Created and executed Teradata Parallel Transport (TPT) scripts.
  • Wrote test cases and validated data in various scenarios by adding dummy data.
  • Coordinated with various teams (DBAs, architects and business analysts) to resolve configuration and test data issues.
  • Worked on Teradata utilities like BTEQ, FastLoad, FastExport, MultiLoad and Teradata Parallel Transporter.
  • Manage Oracle RAC database configurations utilizing Veritas Cluster Server (VCS), GoldenGate and Data Guard to ensure maximum availability and data protection.
  • Install, configure and support uni-directional and bi-directional Oracle-to-Oracle replicated environments utilizing GoldenGate v7
  • Procured test data from production.
  • Provided 24x7 on-call support for production deployments.
  • Maintained and managed Teradata test databases and the Informatica repository for different projects with admin privileges.
  • Managed Informatica services and user accounts.
  • Involved in generating test scripts for unit testing, system testing and integration testing.
  • Continuously improved test models, test data and test processes.
  • Performed data validations as per the documents provided by the business analysts and architects of the project(s).
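The data validation work listed above typically boils down to reconciling source against target: row counts, key sets, and measure totals. A minimal sketch of such checks (illustrative only; the field names are hypothetical):

```python
def validate_load(source_rows, target_rows, key, measure):
    """Compare row counts, distinct keys and a measure total between
    source and target, returning a dict of pass/fail check results."""
    def total(rows):
        return sum(r[measure] for r in rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "key_match": {r[key] for r in source_rows} == {r[key] for r in target_rows},
        "measure_match": total(source_rows) == total(target_rows),
    }
```

In practice the same checks are often expressed as paired COUNT(*)/SUM() SQL queries against the source and target databases, with the results compared by a test harness or a tool like DVO.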

Environment: Informatica 8.6.1, Power Exchange, B2B, Teradata 13, UNIX (HP-UX), AccuRev 4.7.2/5, TSET, Quality Center 9.2, Oracle 10g, GoldenGate 10.x, TOAD 10.1, Teradata SQL Assistant 13.0, Oracle SQL Developer 2.1, TIDAL 5.1.x, Talend 4.1, Sybase 15.x

Confidential, Princeton, NJ

Informatica Developer / Data Analyst / Sybase Developer

Responsibilities:

  • Chose critical tables and files for profiling activity with the help of business users.
  • Created equivalent oracle tables in Staging area.
  • Created ETL mappings to pull data from various databases to Oracle staging area.
  • Designed dashboards and reports for tracking service request weekly status and overall status percentage.
  • Created users, groups and assigned different dashboards as per the data visibility defined by the client.
  • Created data maps to pull legacy data using Informatica Power Exchange.
  • Interfaced with the Portfolio Management and Global Asset Management groups to define reporting requirements.
  • Experienced in the Global Asset Management groups' project plan for intranet applications for Fixed Income and Equities.
  • Specialized in all aspects of database design and programming with Sybase Adaptive Server Enterprise, Replication Server, Sybase IQ Server and Oracle.
  • Captured various profiling results (column profiling, table profiling, cross-table profiling and cross-application profiling) into Excel sheets so that business users could review them.
  • Involved in capturing data that required data cleansing, also a step in the migration activity.
  • Participated in documentation and tracked the progress of the data profiling project.
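The column profiling mentioned above gathers basic statistics per column — counts, null rates, distinct values, min/max — before deciding what needs cleansing. A minimal sketch of the idea (illustrative only, not code from the engagement):

```python
def profile_column(rows, column):
    """Basic column profile: total count, nulls, distinct values, min/max."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }
```

Running this per column (or issuing the equivalent COUNT/COUNT DISTINCT/MIN/MAX SQL) produces the profiling spreadsheets that business users review.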

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server, TOAD, Informatica Power Exchange, UNIX, SQL*Plus, Windows 2000/XP, Sybase 12.x.

Confidential

ETL Developer/Sybase Developer

Responsibilities:

  • Analyzed source systems, business requirements and identified business rules for building the data warehouse.
  • Wrote technical specification documents for the ETL process.
  • Designed and developed Informatica mappings and workflows to load the data into Oracle DB and SQL Server.
  • Used the Informatica Workflow Manager, Monitor and Repository Manager to execute and monitor workflows and assign user privileges.
  • Performed database administration activities on Sybase databases.
  • Extensively worked with Aggregator, Sorter, Router, Filter, Join, Expression, Lookup and Update Strategy, Sequence generator transformations.
  • Used Debugger to test the mapping and fixed the bugs.
  • Involved in Performance Tuning of Sessions and Mappings.
  • Specialized in all aspects of database design and programming with Sybase Adaptive Server Enterprise, Replication Server, Sybase IQ Server and Oracle.
  • Experience in Sybase IQ Server (data warehouse).
  • Experience in Sybase SQL Anywhere and Sybase mobile technology.
  • Used the Workflow manager to create workflows and tasks, and also created Worklets.
  • Involved in Production Support in resolving issues and bugs.
  • Worked on SQL stored procedures, functions and packages in Oracle.
  • Created and maintained UNIX Shell Scripts for pre/post session operations and various day-to-day operations.
  • Developed Unit and System testing test cases, using System Procedures to check data consistency with adherence to the data model defined.

Environment: Informatica PowerCenter 7.1, Sybase 12.x, Oracle 9i, SQL Server 2005, UNIX, SQL*Plus, TOAD, MS Access, Windows XP.
