Sr. DW Informatica Developer Resume
NY
Sr. Certified DW Informatica Developer
SUMMARY
- Eight plus (8+) years of IT experience in Data Warehousing and Business Intelligence, with emphasis on Business Requirements Analysis, Application Design, Development, Testing, Implementation and Maintenance of client/server Data Warehouse and Data Mart systems.
- Six plus (6+) years of development and design of ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.0/8.6/8.5/8.1.1/7.1.3/7.1.1/7.0/6.2/6.1 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, PowerConnect, PowerPlug, PowerAnalyzer, PowerExchange, Datamart, ETL, OLAP, ROLAP, MOLAP, OLTP, Autosys, Control-M and Maestro.
- Five plus (5+) years of Data Modeling experience using tools like Erwin 7.2/4.5/4.0/3.5/3.x, Visio, Power Designer and Oracle Designer. Knowledge in designing and developing Data Marts and Data Warehouses using multidimensional models such as Snowflake Schema and Star Schema while implementing Decision Support Systems. Experience in FACT & Dimension tables and Physical & Logical data modeling.
- Five plus (5+) years of strong Data Cleansing experience using FirstLogic (ACE, DataRight, Merge/Purge) and Trillium 7.6/6.0 (Converter, Parser, Geocoder, Matcher), plus SQL coding and UNIX shell scripting for address cleansing, profiling and segmentation of customer data.
- Five plus (5+) years of Business Intelligence experience using Business Objects XI R3/R2/R1/6.5/6.0/5.1/5.0, Business Objects SDK, Web Intelligence 2.5/2.6, Cognos Impromptu 8.0/7.0/6.6/6.x, PowerPlay, Transformer, Impromptu Web Reports (IWR), PowerPlay Enterprise Server and MicroStrategy. Created Universes and designed and maintained various reports using Business Objects.
- Eight (8) years of database experience using Oracle 11g/10g/9i/8i, MS SQL Server 2008/2000/7.0/6.5, Teradata V2R6/V2R5, DB2 8.0/7.0/6.0, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing Stored Procedures, Packages, Functions and Triggers. Adept at working with distributed databases and SQL*Loader.
- Extensively worked with ETL tools to extract data from various sources including Oracle, flat files and Teradata. Experience in performance tuning of sources, targets, mappings and sessions. Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouses/Data Marts using Informatica and SQL*Loader. Extensive expertise in error handling.
- Strong knowledge of the Software Development Life Cycle (SDLC) including Requirement Analysis, Design, Development, Testing, Support and Implementation. Provided End User Training and Support.
- Experience with TOAD to test, modify and analyze data, create indexes, and compare data from different schemas. Experienced in Shell scripting and PL/SQL procedures.
EDUCATION & CERTIFICATIONS
- Informatica Certified
- Oracle Certified
- Bachelor of Engineering (Information Technology), India
- Master of Science (Major: Computer Science)
TECHNICAL SUMMARY
- RDBMS: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2000/7.0/6.5, Teradata V2R6/V2R5, DB2 8.0/7.0/6.0, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000
- DW/ETL Tools: Informatica PowerCenter 9.0/8.6/8.5/8.1.1/7.1.3/7.1.1/7.0/6.2/6.1 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, PowerConnect, PowerPlug, PowerAnalyzer, PowerExchange, Datamart, ETL, OLAP, ROLAP, MOLAP, OLTP, Autosys, Control-M, Maestro.
- Reporting Tools: Business Objects XI R3/R2/R1/6.5/6.0/5.1/5.0, Business Objects SDK, Web Intelligence 2.5/2.6, Cognos Impromptu 8.0/7.0/6.6/6.x, PowerPlay, Transformer, Impromptu Web Reports (IWR), PowerPlay Enterprise Server, MicroStrategy
- Languages: SQL, PL/SQL, SQL *Plus, VB 6.0, PERL, Shell Scripting
- Database Tools: SQL Navigator, DB Artisan, TOAD
- Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snowflake Modeling, FACT and Dimension tables, Physical and Logical Data Modeling, Erwin 7.2/4.5/4.0/3.5/3.x, Visio, Power Designer and Oracle Designer.
PROFESSIONAL SUMMARY
Confidential, NY MAR 2011 - CURRENT
SR. DW INFORMATICA DEVELOPER
This DW application was built for the Loans Division and was aimed at building a data mart for loan processing. The project involves providing online credit and liability information to process loans, facilitating faster loan processing for the loan officers and aiding risk management for the bank. The project interfaces with a number of subsystems, which provide real-time information to give a complete picture of a customer.
Responsibilities:
- Gathered requirements from the end users and attended JAD sessions. Analyzed the functional specs provided and created technical spec documents for all the ETL processes.
- Worked closely with the BA and Data Warehouse Architect to understand the source data and the needs of the Warehouse.
- Analyzed Change Requests (CRs) related to ETL and BI applications, as per requests from various teams, and approved/implemented the requests.
- Responsible for Data Analysis of Source Systems (legacy systems).
- Performed Source to Target Mapping.
- Defined Data Mapping rules and Data Transformation rules. Performed all Data Validation and Data Reconciliation tasks.
- Analyzed the source data coming from Oracle, legacy systems, XML, flat files, Excel files, DB2 and SQL Server, and worked with business users and developers to develop the model.
- Performed Data Profiling to enhance consistency of Member Data.
- Performed Data Warehouse Data Modeling based on client requirements using Erwin (Conceptual, Logical and Physical Data Modeling).
- Performed Data Modeling using the Ralph Kimball approach and populated the business rules into the target tables. Designed the database tables and constraints using data modeling techniques in Erwin.
- Used Informatica as the ETL tool. Worked in all phases of Data Integration from heterogeneous sources and legacy systems to the target database. Used IDQ and IDE for Data Analysis.
- Worked on Informatica PowerCenter tools - Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
- Extensively used PowerCenter/PowerMart to design multiple mappings with embedded business logic.
- Created Transformations like Lookup, Joiner, Rank and Source Qualifier transformations in the Informatica Designer.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables in an efficient manner.
- Fixed spec defects and technical coding defects so that the data loads properly per business needs.
- Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements (see the SCD Type 2 sketch after this list).
- Strong experience in using Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
- Extensively used various Active and Passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank and Aggregator transformations.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Extensively worked with join types like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
- Used Update Strategy flags DD_INSERT, DD_UPDATE, DD_DELETE and DD_REJECT to insert, update, delete and reject rows based on the requirement.
- Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
- Used the Debugger wizard to troubleshoot data and error conditions.
- Responsible for Best Practices implementation, performance tuning at all levels and error handling methods.
- Developed Reusable Transformations and Reusable Mapplets.
- Worked extensively with Mapping Parameters, Mapping Variables and Parameter Files for Incremental Loading (see the incremental-extraction sketch after this list).
- Worked with workflow system variables like SYSDATE and WORKFLOWSTARTTIME.
- Extensively used various Data Cleansing and Data Conversion functions like RTRIM, LTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR and IIF in Expression transformations (see the cleansing sketch after this list).
- Worked with Shortcuts across shared and non-shared folders.
- Responsible for migrating the code using Export and Import utilities across various instances.
- Created and optimized SQL queries for better performance.
- Created pre-SQL and post-SQL scripts to be run at the Informatica session level (see the sketch after this list).
- Responsible for Unit Testing of Mappings and Workflows.
- Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
- Responsible for determining bottlenecks and fixing them through performance tuning.
- Implemented various loads like Daily, Weekly and Quarterly loads using an Incremental Loading strategy.
- Used Autosys for scheduling. Created various UNIX shell scripts for scheduling data cleansing scripts and load processes, and maintained the batch processes using UNIX shell scripts.
- Extensively coded PL/SQL and SQL and tuned them for better performance (see the bulk-processing sketch after this list).
- Extensive SQL querying for Data Analysis.
- Performed Data Analysis & Source-to-Target Mapping.
- Worked on Data Extraction, Data Transformations, Data Profiling, Data Loading, Data Conversions and Data Analysis.
- Wrote, executed and performance-tuned SQL queries for Data Analysis & Profiling.
- Designed and developed standard and ad-hoc reports using SQL.
- Identified and documented business functions, activities and processes, data attributes and table metadata, and general and detailed design specifications.
- Analyzed data and worked with source systems to determine and create standardized formats.
- Created a Data Dictionary giving information on all the data elements, their business descriptions and source-to-target transformation information.
- Wrote and executed DML and DDL scripts to incorporate database changes on Oracle 10g using TOAD.
- Created specifications for ETL Reject Reports.
- Extracted data from flat files, Oracle and SQL Server and loaded it into Oracle.
- Responsible for creating system design and detailed design documents based on the requirement document provided by the business users.
- Provided strategic thinking and leadership pertaining to new ways of leveraging information to improve business processes.
- Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL processes and data conversions.
- Maintained effective communication with non-technical client personnel and handled change requests.
- Developed the Test plans for quality assurance based on functional requirements.
- Responsible for Validation of Data as per the Business rules.
- Created Reports using BO XI and QlikView.
- Provided Production Support.
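As a minimal sketch of the SCD Type 2 pattern referenced above, the set-based SQL below shows the expire-and-insert logic that such Type 2 mappings implement via Lookup and Update Strategy transformations. All table, column and sequence names (customer_dim, stg_customer, customer_dim_seq) are illustrative assumptions, not the client's schema.

```sql
-- Hypothetical names throughout.

-- Step 1: expire the current version of any dimension row whose tracked
-- attributes changed in the staged source data.
UPDATE customer_dim d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current version for changed and brand-new customers;
-- rows expired in Step 1 no longer have a current version, so they qualify too.
INSERT INTO customer_dim
       (customer_key, customer_id, address, status,
        effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```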
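The incremental loads above follow the standard mapping-variable pattern: a persisted variable marks the high-water mark of the last extract and filters the next run's source read. The sketch below is a hypothetical Source Qualifier SQL override; the table name, variable name and date format are assumptions. SETMAXVARIABLE (in an Expression transformation) is the usual way to advance the variable to the newest timestamp seen in a run.

```sql
-- Hypothetical SQL override; $$LAST_EXTRACT_DATE is a mapping variable
-- persisted in the repository or supplied through a parameter file.
SELECT t.txn_id,
       t.customer_id,
       t.txn_amount,
       t.update_ts
  FROM loan_txn t
 WHERE t.update_ts > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
```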
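The cleansing and conversion functions listed above behave much like their Oracle SQL counterparts; the query below is a hypothetical illustration of the same logic written as SQL (IIF and ISNULL are the Informatica analogues of the CASE/DECODE and NVL used here). Table and column names are assumptions.

```sql
SELECT RTRIM(LTRIM(cust_name))                     AS cust_name,
       DECODE(gender_cd, 'M', 'Male',
                         'F', 'Female', 'Unknown') AS gender,
       TO_DATE(NVL(birth_dt_str, '01/01/1900'),
               'MM/DD/YYYY')                       AS birth_dt,
       -- pull the area code out of '(212) 555-0100' style phone numbers
       SUBSTR(phone, INSTR(phone, '(') + 1, 3)     AS area_code
  FROM stg_customer;
```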
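A hedged example of the pre-SQL/post-SQL pairing mentioned above: statements attached to an Informatica session that prepare the target before a bulk load and restore it afterwards. Table and index names are illustrative, not the client's.

```sql
-- Pre-SQL: empty the staging table and mark the index unusable so the
-- bulk load does not pay index-maintenance cost row by row.
TRUNCATE TABLE stg_loan_daily;
ALTER INDEX idx_loan_fact_dt UNUSABLE;

-- Post-SQL: rebuild the index and refresh optimizer statistics.
ALTER INDEX idx_loan_fact_dt REBUILD;
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'LOAN_FACT');
END;
```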
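One PL/SQL tuning idiom for this kind of work is batch processing with BULK COLLECT and FORALL, which cuts per-row context switches between the PL/SQL and SQL engines. The block below is a minimal sketch with hypothetical object names; it assumes loan_hist has the same column layout as stg_loan.

```sql
DECLARE
  TYPE t_loan_tab IS TABLE OF stg_loan%ROWTYPE;
  v_loans t_loan_tab;
  CURSOR c_loans IS SELECT * FROM stg_loan;
BEGIN
  OPEN c_loans;
  LOOP
    FETCH c_loans BULK COLLECT INTO v_loans LIMIT 1000;  -- fetch in batches
    EXIT WHEN v_loans.COUNT = 0;

    FORALL i IN 1 .. v_loans.COUNT                       -- one bind per batch
      INSERT INTO loan_hist VALUES v_loans(i);

    COMMIT;
  END LOOP;
  CLOSE c_loans;
END;
/
```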
Environment: Informatica PowerCenter 9.0.1/8.6, IDQ, IDE, Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Mapplets, Mappings, Workflows, Sessions, Re-usable Transformations, Shortcuts, Informatica PowerConnect, Oracle 11g/10g, SQL Server 2008/2005, DB2, XML files, Flat files, CSV files, PL/SQL (Stored Procedures, Triggers, Packages), BO XI, QlikView 9.0/8.5, Autosys, Erwin 7.2, MS Visio, SQL Developer, iSQL*Plus, TOAD, Windows 2003/2007, UNIX AIX 5.3
Confidential, NYC, NY JAN 2010 - MAR 2011
SR. DW INFORMATICA DEVELOPER
The scope of the project was to create an Enterprise Master Person Index (EMPI) and a Policy & Claims Datamart. The EMPI was created by integrating various external data, internal data and third-party software provided by a vendor. The EMPI tracks patients' current information; an EMPI application creates a master index for each person or patient and returns the most current data. The EMPI data is also integrated with the Policy and Claims Datamart to effectively determine policies and premiums and to process claims.
Responsibilities:
- Requirement Gathering and Business Analysis. Performed Business Discovery to define functional specifications.
- Performed Data Modeling using the Ralph Kimball approach and populated the business rules into the target tables using mappings.
- Worked on Data Extraction, Data Transformations, Data Loading, Data Conversions and Data Analysis.
- Extracted the data from various heterogeneous sources and performed ETL using Informatica
- Created complex ETL mappings using Informatica transformations like Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence Generator, Rank, Union, Joiner and Source Qualifier.
- Implemented complex Incremental mappings and Type 1 & Type 2 mappings to update slowly changing dimensions and maintain full historical Claims/Policy data.
- Used Informatica Velocity for Business Requirements Specifications, Change Requests, critical test parameters, mapping specifications and documentation.
- Updated the metadata repository with source, target, mapping and session details after every project in the test and production environments.
- Worked on adding extraction groups and maps to get the change data.
- Used Mapplets and Reusable Transformations to prevent redundancy of transformation usage and improve maintainability.
- Used session partitions, dynamic cache memory, and index cache to improve the performance of Informatica server.
- Configured the sessions using workflow manager to have multiple partitions on source data and to improve performance.
- Translated the PL/SQL logic into Informatica mappings.
- Extensively worked on the Database Triggers, Functions and Database Constraints.
- Involved in repository management using the Admin Console and Repository Manager, creating repositories for development and testing environments. Responsible for taking repository backups before migrations for major projects.
- Involved in addition and configuration of repositories, adding user groups and users to the repository, assigning privileges to the users, and creating project-specific and shared folders.
- Configured database and ODBC connectivity to various source/target databases.
- Independently performed complex troubleshooting, root-cause analysis and solution development.
- Managed and administered system performance, including ongoing optimization and tuning of the Informatica environment.
- Tuned Informatica mappings and sessions for optimum performance using multithreading, session partitions and parallel processing. Used various partitioning schemes to improve overall session performance. Implemented Informatica Best Practices.
- Worked on the performance tuning of databases (dropping & re-building indexes, partitioning on tables)
- Wrote UNIX scripts for creating, dropping and analyzing tables, and pre/post-session shell scripts to download/upload files from a shared server, generate control files, check status (failure/success) after completion of all batches, and handle ESP job scheduling.
- Experience with utilities Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE).
- Involved in end to end system testing, performance and regression testing and data validations.
- Responsible for updating the metadata repository with the changes in the test environment of PowerCenter.
- Used Maestro and Informatica scheduler for scheduling the jobs.
- Used style sheets (XSLT) to read the XML data, query the database and convert the result back into XML form.
- Developed and enhanced web services, XSLT data transformation for B2B data exchange applications
- Coordinated with the offshore team for development, testing and migration.
- Defects were logged and change requests submitted using the defects module of Test Director.
- Proven ability to create complex reports by linking data from multiple data providers, using free-hand SQL, stored procedures and functionalities like Combined Queries.
- Implemented Slowly Changing Dimensions Type 1, Type 2 and Type 3.
- Extensively used Business Objects reporting functionalities such as Master/Detail, Slice and Dice, Drilling methodology, Filters, Ranking, Sections, Graphs and Breaks.
- Provided data to BO and Dashboard users for generating complex reports by linking data from multiple sources, using functionalities like Slice and Dice, Drill Down and Master/Detail.
- QlikView was used to generate internal reports for claims officers to quickly review the data. Used OBIEE & BO XI for complex business report generation.
- Worked extensively with HL7 Data, EDI X12 Messages, HIPAA Transactions, ICD Codes.
- Worked with HIPAA Transactions and EDI transaction sets (834, 835, 837, 824, 820, 270, 276, 271, 278)
- HL7 data was used in creating EMPI and also Patients Datamart.
- Parsed HL7 messages and worked with HL7 delimiter definitions (Segment Terminator, Field Separator, Component Separator, Subcomponent Separator, Repetition Separator, Escape Character) for identifying and separating HL7 data (see the parsing sketch after this list).
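As a sketch of the delimiter-driven parsing described above: in HL7 v2.x, segments end with a carriage return, '|' separates fields, '^' components, '&' subcomponents, '~' repetitions, and '\' escapes. The PL/SQL below walks one hypothetical PID segment with INSTR/SUBSTR; the segment content and variable names are illustrative.

```sql
DECLARE
  v_segment  VARCHAR2(4000) := 'PID|1||12345^^^MRN||DOE^JOHN||19650412|M';
  v_start    PLS_INTEGER := 1;
  v_end      PLS_INTEGER;
  v_field_no PLS_INTEGER := 0;  -- field 0 is the segment name itself
BEGIN
  LOOP
    v_end := INSTR(v_segment, '|', v_start);   -- next field separator
    EXIT WHEN v_end = 0;
    DBMS_OUTPUT.PUT_LINE('Field ' || v_field_no || ': ' ||
                         SUBSTR(v_segment, v_start, v_end - v_start));
    v_start    := v_end + 1;
    v_field_no := v_field_no + 1;
  END LOOP;
  DBMS_OUTPUT.PUT_LINE('Field ' || v_field_no || ': ' ||
                       SUBSTR(v_segment, v_start));  -- trailing field
END;
/
```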
Environment: Informatica PowerCenter 8.6/8.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), HL7 3.x/2.4, HIPAA, Epic Systems, Oracle 11g/10g, AS400, DB2 8.0, SQL Server 2008/2005 (Enterprise Manager, Query Analyzer), Maestro, Reflection FTP, OBIEE, BO XI, T-SQL, UNIX AIX 5, Visio 2003, Erwin, Siebel
Confidential, RIDGEBURY, CT JUN 2008 - DEC 2009
SR. DW INFORMATICA DEVELOPER
The main purpose of the project was to build a Datamart on Sales and Marketing to enhance business strategy and operations for the Sales and Marketing Department in the Consumer Care Division. The Datamart compiled all the data fed in from various source departments, external data and various company platforms, and was used to generate reports for Sales Growth, Sales Forecasts, Market Share, Compensation Packages, Sales Force Alignment & Deployment, Sales Force Size Analysis, Exploratory Analysis, Compensation Analysis, Product/Competitor Analysis, Cluster & Segmentation Analysis, and Physician & Account Targeting.
Sales managers could get reports to help them make decisions and set strategic targets for the sales department. They also wanted to measure performance indices and create regional forecasts for product sales. To accomplish this, data was collected from all these departments and compared with sales data. IMS data was used in conjunction with internal sales measures to capture market share.
Responsibilities:
- Requirements Gathering and Business Analysis for creating Sales & Marketing Datamart
- Participated in JAD Sessions along with Project Manager and End users.
- Conducted End User interviews to gather reporting and analysis requirements.
- Assisted in Logical Modeling. Forward-engineered the Physical Data Model to the development and production databases using Erwin, following a Star Schema, to build the Sales & Marketing Datamart.
- Responsible for design, development, administration and automation of ETL processes.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Coordinated with source system owners and monitored day-to-day ETL progress.
- Handled project coordination and End User meetings.
- Resolved performance issues for huge volumes of IMS data and increased performance significantly.
- Analyzed IMS Rx data using IMS tools such as Xponent and Xponent PlanTrak for segmentation & profiling.
- Used DataFlux for data de-duplication.
- Responsible for converting the IMS data to the required structure using Informatica.
- Developed and tuned all the Affiliations received from IMS and other data sources using Informatica and tested with high volumes of data.
- Extracted huge volumes of IMS data and loaded them into Oracle using UNIX scripts and SQL*Loader (see the control-file sketch after this list).
- Involved extensively in Data Analysis, Data Mapping, Data modeling, Data Transformation, Data Loading and Testing. Created Project and corresponding Source and Target modules in Informatica
- Performed Source to Target Mapping.
- Installed, Maintained and Documented the Informatica setup on multiple environments.
- Created, executed and managed ETL processes using Informatica.
- Coordinated with source system owners, performed data migration, monitored day-to-day ETL progress, and handled Data Warehouse target schema design (Star Schema) and maintenance.
- Extraction, Transformation and Loading of the data using Informatica
- Data Cleansing using FirstLogic.
- Created ETL mappings with PL/SQL Procedures/Functions to build business rules to load data.
- Developed complex mappings in Informatica to load data from various sources using transformations like Custom, Union, Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Used the Debugger to test the mappings and fix bugs.
- Developed Mapplets using corresponding Source, Targets and Transformations.
- Tuned Informatica Mappings and Sessions for optimum performance.
- Used session partitions, dynamic cache memory, and index cache to improve the performance of Informatica server.
- Migrated development mappings to production environment.
- Wrote Unix Scripts for managing Oracle Tables.
- Made changes to tables to meet business requirements.
- Extensively used TOAD for performing different operations on the Oracle Tables.
- Designed and developed code for Procedures, Functions and Packages.
- Used DECODE, NVL, RANK, FIRST, LAST and other functions.
- Wrote test cases for unit testing.
- Performance tuning of databases and mappings.
- Responsible for Error Handling and bug fixing.
- Implemented Slowly Changing Dimensions Type 1, Type 2.
- Used Explain Plan and TKPROF for performance tuning (see the Explain Plan sketch after this list).
- Developed Views and Materialized Views (see the materialized-view example after this list).
- Used Oracle partitioning concepts.
- Created different types of reports like Slice and Dice, Drill Down, Master Detail using Business Objects.
- Performed Unit & Integration Testing of the modules
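A hedged sketch of the SQL*Loader control file used for the IMS bulk loads described above; the file, table and column names are assumptions. A shell wrapper would invoke it with something like `sqlldr control=load_ims_rx.ctl direct=true`.

```
-- Hypothetical control file: pipe-delimited IMS extract into a staging table.
LOAD DATA
INFILE 'ims_xponent_weekly.dat'
APPEND
INTO TABLE stg_ims_rx
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( prescriber_id,
  product_ndc,
  rx_count  INTEGER EXTERNAL,
  rx_week   DATE 'YYYYMMDD'
)
```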
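A minimal sketch of the Explain Plan workflow referenced above, with an illustrative query; TKPROF itself is run from the shell against a SQL trace file.

```sql
EXPLAIN PLAN FOR
SELECT p.product_id, SUM(s.units)
  FROM sales_fact s
  JOIN product_dim p ON p.product_key = s.product_key
 GROUP BY p.product_id;

-- show the optimizer's plan for the statement above
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- TKPROF (shell): tkprof ora_1234.trc ora_1234.txt sort=exeela
```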
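An example of the materialized-view technique mentioned above, pre-aggregating sales for reporting; all object names are hypothetical.

```sql
CREATE MATERIALIZED VIEW mv_monthly_sales
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
ENABLE QUERY REWRITE  -- lets the optimizer answer matching rollup queries from the MV
AS
SELECT t.fiscal_month,
       p.product_id,
       SUM(s.units)      AS total_units,
       SUM(s.net_amount) AS total_sales
  FROM sales_fact  s
  JOIN time_dim    t ON t.time_key    = s.time_key
  JOIN product_dim p ON p.product_key = s.product_key
 GROUP BY t.fiscal_month, p.product_id;
```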
Environment: Informatica PowerCenter 7.1/8.1, Oracle 10g/9i, PL/SQL, SQL*Loader, TOAD, Oracle Developer 2000 (Forms 6i & Reports 6i), IMS Data, DataFlux 8.0, ETL, Erwin 4.2, Flat files, FirstLogic, MS SQL Server 2005/2000, Business Objects XI/6.5, Teradata V2R6, Autosys, Shell Programming, IBM DB2 8.0, Excel, UNIX scripting, Sun Solaris, Windows NT
Confidential, USA AUG 2005 - JUN 2008
SR. INFORMATICA ETL DEVELOPER
Worked on Mutual Funds projects (onsite-offshore model).
This Datamart was built for processing Mutual Funds.
- Analyzed the systems, met with end users and business units in order to define the requirements.
- For Customer system, developed logical & physical data model by defining strategy for Star schema with Fact & Dimension tables with Detail, Summary, Look-up tables, Indexes and views.
- Worked with the DBA on the upgrade of Informatica from PowerCenter 6.2 to PowerCenter 7.1.
- Used Informatica as the ETL tool for extracting data from flat files on the UNIX platform, transforming the data according to the target database and performing the load.
- Performed data analysis on the source data coming from legacy systems.
- Worked extensively with complex mappings using expressions, aggregators, filters, lookups and stored procedures.
- Created source, target, lookups, transformation, session, batches and defined schedules for those batches and sessions.
- Worked on Dimensional Modeling to design and develop Star Schemas by identifying the facts and dimensions.
- Designed logical models (relationship, cardinality, attributes, candidate keys) as per business requirements using ERwin Data Modeler.
- Designed and customized data models in ERwin for Data warehouse supporting data from multiple sources on real time.
- Extensively used Incremental aggregation for CDC (Change Data Capture).
- Worked on PowerCenter Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
- Worked with the Informatica PowerCenter Designer tool in developing complex mappings using Unconnected Lookup, Aggregator and Router transformations to populate target tables in an efficient manner.
- Gathered business scope and technical requirements.
- Improved development time by leveraging metadata, and reduced IT coordination costs by providing a common working business model across the enterprise architecture.
- Used transformations like Source Qualifier, Joiner, Update, Lookup, Stored procedure, Expression and Sequence Generator for loading the data into target Data Mart.
- Created events and tasks in the workflows using Workflow Manager.
- Designed testing procedures and test plans.
- Extensively used UNIX shell scripts for cleansing.
- Created various UNIX shell scripts for scheduling data cleansing scripts and load processes. Maintained the batch processes using UNIX shell scripts.
- Experienced in backup and restoration of the Repository.
- Prepared the schedules for the historical load run and regular load runs.
- Designed and developed Oracle PL/SQL scripts for Data Import/Export, Data Conversions and Data Cleansing (see the conversion-helper sketch after this list).
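A minimal sketch of the kind of defensive conversion helper used in such import/cleansing scripts; the function name and default format are assumptions.

```sql
CREATE OR REPLACE FUNCTION safe_to_date (
  p_value  IN VARCHAR2,
  p_format IN VARCHAR2 DEFAULT 'DD-MON-YYYY'
) RETURN DATE
IS
BEGIN
  RETURN TO_DATE(TRIM(p_value), p_format);
EXCEPTION
  WHEN OTHERS THEN
    RETURN NULL;  -- unparseable values become NULL and flow to reject handling
END safe_to_date;
/
```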
Environment: Informatica PowerCenter 7.1/8.1, PowerConnect, Erwin 4.0, Business Objects XI/6.5, Control-M, SQL Server 2000, SQL*Loader, Oracle 9i, DB2 8.0, Sun Solaris 5.8, Windows 2000.
Confidential, HYDERABAD, INDIA MAY 2003 - AUG 2005
SENIOR SYSTEMS ENGINEER
Division: Finacle Core banking and Wealth Management Solution
Projects:
Confidential, Singapore
- Developed backend Procedures and Functions using C for the Mutual Funds SOA module
- Developed the front-end UI of the product using J2EE and JavaScript with a CORBA ORB middleware interface
- Enhanced the existing libraries of the Core Banking product using SQL/PL/SQL for executing the localized batch jobs
Confidential, China
- Achieved zero regressions across the entire product development life cycle
- Developed backend Procedures and Functions using Oracle for the Mutual Funds module
- Enhanced the existing libraries of the Core Banking product using PL/SQL for executing the localized batch jobs
Confidential, Egypt
- Fixed defects/issues and implemented enhancements in collateral applications within the SLA period. Worked as a Technical Anchor for the team, handled back-end production problems, performed code reviews, participated in module design and configuration management activities, and reconciled sources across different versions of the product.
- Performed configuration management to ensure different versions of the sources were integrated with the baseline in CVS.
- Developed Jasper reports for Mutual Funds batch jobs and certain menus using the iReport tool and JRXML.
- Unit tested the patches.
Environment: Oracle 9i/10g, SQL*Loader, SQL, PL/SQL, Java, J2EE, Jasper Reports, Sun Solaris 5.8, Windows 2000.