
Architect Resume


Cincinnati, OH

SUMMARY:

  • 16 years of experience, including extensive Ab Initio, DataStage, Informatica, Erwin, and NCR Teradata (V2R4/V2R5/V2R5.1/V2R6/V2R6.2/12/13/13.10) data warehousing and data modeling experience in the areas of data administration, physical database design, system architecture, database performance tuning, and ETL design and development. Also skilled in the implementation of end-user data access tools and client-server reporting applications.
  • 12 years' experience with NCR Teradata, Teradata CRM, Teradata RLDM/MLDM, Teradata Solution Methodology (TSM), Teradata SQL, and Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, Teradata Parallel Transporter, SQL Assistant, Viewpoint, Teradata Manager, Teradata Administrator, Teradata Query Manager, Teradata Performance Monitor, and Priority Scheduler).
  • Around four years' experience in Ab Initio ETL and Oracle PL/SQL, covering data warehousing, data cleansing, transformation, and loading in complex, high-volume environments. Extensive programming skills in the Ab Initio ETL tool, DB2, PL/SQL, Oracle, and UNIX shell scripting.
  • 7+ years of strong ETL data integration experience using Informatica PowerCenter 8.6.1/8.0/7.1/7.0/6.2/5.1.2/1.7/1.6/8.x and Informatica PowerMart 8.6.1/8.0/7.0/6.2/5.1.2/5.0/4.7 (Source Analyzer, Repository Manager, Warehouse Designer, Mapping Designer, Mapplets, Transformations), PowerConnect for DB2/SAP/PeopleSoft/Siebel, PowerPlug, NCR Teradata V2R6/V2R5/V2R4, Oracle 9i/8i/7.3, SQL*Loader, OLAP, SQL Server 2000/7.0/6.5/6.0, and MS Access 2000/7.0.
  • Over six years on Oracle (10g, 9i, 8i, 8.x, 7.x), in both OLTP and OLAP environments, for high-volume database instances.
  • 6+ years of experience creating logical and physical data models, converting logical data models into Teradata physical database designs using Erwin, and handling system implementation, performance tuning, and support.
  • Skilled in large-scale, multi-terabyte initial database loads and ongoing update techniques, backup and recovery requirements, and SQL.
  • Hands-on experience with the IBM MVS mainframe environment using JCL, TSO, ISPF, IBM utilities, and DB2.
  • Hands-on experience with Ab Initio, Informatica, Business Objects, SAS, Cognos, and MicroStrategy, plus knowledge of C++, Rational Rose 98, and .NET.
  • Proven record of success in the design, development, and implementation of software applications using object-oriented technology.
  • Excellent communication and interpersonal skills; adept at working with offsite teams.

TECHNICAL SKILLS:

  • Hardware: Pentium V workstations, NCR UNIX servers 4400, NCR Windows 2000 servers 4850 & 447
  • Operating Systems: UNIX, Windows XP, Windows 2000, Windows NT 4.0, and MVS
  • Application Servers: IIS, WebLogic, WebSphere, ATG Dynamo
  • Databases: Teradata V2R2/3/4/5/5.1/6.x/12/13/13.10, Oracle, SQL Server
  • Teradata Tools & Utilities: QueryMan/SQL Assistant, WinDDI, Teradata Query Manager, BTEQ, BTEQWin, FastLoad, MultiLoad, ARC, ASF2, Teradata Administrator, Teradata Performance Monitor, NetBackup, and NetVault
  • Reporting Tools: Crystal Reports
  • Business Intelligence: MicroStrategy, Business Objects, Cognos, Informatica (5.0/6.0/7.1/8.0/8.5), DataStage (7.x/8.x), and Ab Initio

EDUCATION AND TECHNICAL CERTIFICATION:

  • Sun Certified Java Programmer.
  • Master of Computer Applications.

PROFESSIONAL EXPERIENCE:

Confidential, Cincinnati, OH Feb 2012 - Present
Sr. Teradata Architect/Data Modeler

  • Involved in meetings to gather requirements from the Business Analysts.
  • Analyzed the existing data and designed logical and physical data models.
  • Created logical and physical data models using the Teradata mLDM.
  • Provided naming standards and conventions and ensured that data dictionaries are maintained across multiple database environments.
  • Created data models for the following subject areas: Party, Organization, Geographic Location, Item, Sales Order/Return, Contract, and Invoice (a physical-design sketch follows this list).
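
For illustration, a minimal sketch of the kind of physical object such a model gets translated into. The database, table, column, and index names here are hypothetical and not taken from the actual engagement; logon details come from environment-variable placeholders:

#!/bin/ksh
# Illustration only: applies one hypothetical physical table derived from a
# Party logical entity via BTEQ. All names and the logon variables are placeholders.

bteq <<EOF
.LOGON ${TDPID}/${TD_USER},${TD_PASS}

CREATE MULTISET TABLE EDW_TGT.PARTY
(
    PARTY_ID       DECIMAL(18,0) NOT NULL,   /* surrogate key             */
    PARTY_TYPE_CD  CHAR(2)       NOT NULL,   /* person vs. organization   */
    PARTY_NM       VARCHAR(200),
    SRC_SYS_CD     CHAR(4),                  /* originating source system */
    LOAD_DT        DATE FORMAT 'YYYY-MM-DD'
)
UNIQUE PRIMARY INDEX ( PARTY_ID );           /* PI chosen for even data distribution */

.LOGOFF
.QUIT
EOF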

Environment: Teradata 13.10, Teradata mLDM, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, Teradata SQL Assistant, Teradata Manager, Teradata Performance Monitor, Teradata Administrator), Oracle 11g, Erwin 7.3.11.

Confidential, Milwaukee, WI Nov 2011 – Jan 2012
Sr. Teradata Architect/Data Modeler

  • Involved in meetings to gather requirements from the Business Analysts.
  • Analyzed the existing data and designed logical and physical data models.
  • Created logical and physical data models using the Teradata mLDM.
  • Provided naming standards and conventions and ensured that data dictionaries are maintained across multiple database environments.
  • Mapped logical and technical requirements to the ODS model.
  • Created data models for the following subject areas: Party, Organization, Geographic Location, Item, Sales Order/Return, Contract, and Adjustment.

Environment: Teradata 13.10, Teradata mLDM, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, Teradata SQL Assistant, Teradata Manager, Teradata Performance Monitor, Teradata Administrator), Oracle 11g, Erwin, Windows 2000/UNIX.

Confidential, Lisle, IL Jan 2011 – Nov 2011
Sr. Teradata Architect/Data Modeler

  • Involved in meetings to gather requirements from the Business Analysts.
  • Created logical and physical data models using Teradata mLDM.
  • Created conceptual, logical, and physical data models and assisted the DBA in developing the physical data model (PDM).
  • Analyzed the existing data and designed logical and physical data models.
  • Provided naming standards and conventions and ensured that data dictionaries are maintained across multiple database environments.
  • Created and maintained metadata related to the model.
  • Created data models for the following subject areas: Party (Dealer, Customer, Supplier, and Organization), General Ledger, Sub-Ledger, Journal Entry, Chart of Accounts Balance, Geographic Location, Invoice, Item, Cost, Contract, and Adjustment.

Environment: Teradata 12, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, Teradata SQL Assistant, Teradata Manager, Teradata Performance Monitor, Teradata Administrator), Teradata mLDM, Oracle 10g, Erwin, DataStage 8.x, Windows 2000/UNIX.

Confidential, Salem, NC Nov 2007 – Dec 2010
Sr. Teradata Architect/Data Modeler

  • Involved in meetings to gather information and requirements from the clients.
  • Created conceptual, logical, and physical data models and assisted the DBA in developing the physical data model (PDM).
  • Analyzed the existing data and designed logical and physical data models.
  • Provided naming standards and conventions and ensured that data dictionaries are maintained across multiple database environments.
  • Created and maintained metadata related to the model.
  • Designed data structures, created ERDs, and worked with standards across the full system development life cycle.
  • Created data models for the following subject areas: Customer, Job & Call, PRP Payments, and Contracts. Designed new database structures for Customer, Job & Call, and Contracts.
  • Created logical and physical data models using Teradata mLDM.

Environment: Teradata V2R6.2, Teradata CRM, Teradata mLDM, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, Teradata Parallel Transporter, Teradata SQL Assistant, Teradata Viewpoint, Teradata Manager, Teradata Query Manager, Teradata Performance Monitor, Teradata Administrator), Oracle 10g, Erwin, Informatica, Windows 2000/UNIX.

Confidential, San Bruno, CA Jun 2007 – Oct 2007
Sr. Teradata DBA/ Architect

  • Involved in meetings to gather information and requirements from the clients.
  • Monitor and maintain a production Teradata database environment, including runtime optimization, capacity management and planning, security, configuration, scheduling, and execution of maintenance utilities.
  • Technical expert in the areas of relational database logical design, physical design, and performance tuning of the RDBMS.
  • Worked as a Data Architect with the client team implementing the Teradata Retail Logical Data Model, contributing significant retail experience to the data model.
  • Maintained and tuned Teradata production and development systems. Supported application development timelines by implementing designs and incremental changes to database definitions in a timely manner across production and non-production Teradata systems.
  • Actively involved in the database discussions related to Teradata’s Retail Logical Data Model (rLDM), which was used for implementing the PLMS.
  • Performed tuning and optimization of database configuration and application SQL. Defined, implemented, and administered database backup and recovery strategies. Created and maintained users for production/development Teradata systems.
  • Created FastLoad, FastExport, MultiLoad, TPump, and BTEQ scripts.
  • Dynamically set query priorities using the Teradata Priority Scheduler. Capacity planning: proactively resized storage and spool space to accommodate growing data content (a maintenance-script sketch follows this list).
  • Involved in data encryption (Protegrity) installation.
  • Completed the physical database design and administered the production objects in a change control environment.
  • Expanded the data warehouse to meet new business initiatives, including logical data model design, physical data model design, performance enhancement, and construction of a data mart.
  • Worked on all Teradata Solution Methodology (TSM 4.0) phases related to ETL.
  • Involved in the database upgrade from V2R5.1 to V2R6.2.
  • Involved in converting existing SQL Server data into Teradata.
  • Set up the Teradata Priority Scheduler to control the production workload and reconcile ad-hoc users with management reports and required ETL jobs. Optimized performance on the Oracle server and Oracle RAC.
  • Implemented unit, functionality, performance, and stress testing on mappings and created testing documents.
  • Optimized the performance of mappings and sessions by identifying and eliminating bottlenecks.
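
A minimal sketch, with invented user names, profile names, and space figures, of the kind of BTEQ wrapper behind the user-maintenance and spool-resizing work described above:

#!/bin/ksh
# Illustration only: routine Teradata administration via BTEQ. Object names and
# space figures are hypothetical; logon details come from environment variables.

bteq <<EOF
.LOGON ${TDPID}/${DBA_USER},${DBA_PASS}

/* Proactively raise spool for a growing ad-hoc reporting account */
MODIFY USER adhoc_rpt AS SPOOL = 50E9 BYTES;

/* Create a new developer account under a standard profile */
CREATE USER dev_jsmith FROM dev_db AS
    PERM     = 0,
    PASSWORD = temp123,
    PROFILE  = DEV_P;

.LOGOFF
.QUIT
EOF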

Environment: Teradata V2R5/V2R6.2, Teradata CRM, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, Teradata Parallel Transporter, Teradata SQL Assistant, Teradata Manager, Teradata Query Manager, Teradata Performance Monitor, Teradata Administrator), Erwin, DataStage, Informatica, Ab Initio, Windows 2000/UNIX.

Confidential, KS Jul 2004 – May 2007
Sr. Teradata DBA - One Sprint Financial Information System
The approach for this project initiative is based largely on the planning-level target-state approaches for the Business Intelligence domain. This initiative is part of an overall program for the business that includes Detailed Receivables, Cost, and Reporting applications, and it integrates multiple financial information warehouses and reporting systems into a single One Sprint application. The application helps report actual results and provides input into budgets, forecasts, and business cases. The scope of the project is to build the One Sprint Finance Data Store to consolidate core customer and revenue information from Sprint's legacy customer and billing systems.

  • Involved in meetings to gather information and requirements from the clients.
  • Involved in designing the ETL process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Monitor and maintain a production Teradata database environment, including runtime optimization, capacity management and planning, security, configuration, scheduling, and execution of maintenance utilities.
  • Handled extraction from heterogeneous source systems such as Oracle, Teradata, and internal and external flat files, and built the transformations that loaded formatted data into multifiles and serial files during the intermediate and final stages of the ETL process using Ab Initio/DataStage/Informatica.
  • Technical expert in the areas of relational database logical design, physical design, and performance tuning of the RDBMS.
  • Extensive knowledge of writing stored procedures, functions, packages, and triggers, along with UNIX shell scripts, and experience with data warehousing concepts and data modeling.
  • Maintained and tuned Teradata production and development systems. Supported application development timelines by implementing designs and incremental changes to database definitions in a timely manner across production and non-production Teradata systems.
  • Extensively worked with MicroStrategy Intelligence Server, MicroStrategy Web, and MicroStrategy Administrator, covering reporting basics using MicroStrategy Desktop and MicroStrategy Web. Involved in troubleshooting MicroStrategy prompt, filter, template, consolidation, and custom group objects in an enterprise data warehouse team environment.
  • Completed the physical database design and administered the production objects in a change control environment.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio/DataStage/Informatica components such as partition, reformat, rollup, dedup sort, join, scan, normalize, gather, merge, filter by expression, replicate, redefine, generate records, input table, output table, and continuous update table, along with string, date, miscellaneous, inquiry, and lookup functions.
  • Created mainframe datasets, submitted JCL, and checked dataset files using File-AID on a 3278 terminal.
  • Performed tuning and optimization of database configuration and application SQL. Defined, implemented, and administered database backup and recovery strategies. Created and maintained users for production/development Teradata systems.
  • Extensively used ETL to load data from Oracle databases, XML files, and flat files; also used PowerConnect to import data from IBM mainframes.
  • Responsible for troubleshooting, identifying, and resolving data problems; worked with analysts to determine data requirements, identify data sources, and provide estimates for task duration.
  • Created FastLoad, FastExport, MultiLoad, TPump, and BTEQ scripts for the One Sprint Financial Information System (a FastLoad wrapper sketch follows this list).
  • Converting existing Oracle/SQL Server data into Teradata.
  • Dynamically set query priorities using the Teradata Priority Scheduler. Capacity planning: proactively resized storage and spool space to accommodate growing data content.
  • Gathered information from different data warehouse systems and loaded it into the One Sprint Financial Information System consolidated model using FastLoad, FastExport, MultiLoad, BTEQ, and UNIX shell scripts.
  • Worked on creating and integrating MicroStrategy reports and objects (attributes, filters, metrics, facts, and prompts) with the data warehouse.
  • Involved in unit testing of mappings as well as integration testing and user acceptance testing.
  • Set up the Teradata Priority Scheduler to control the production workload and reconcile ad-hoc users with management reports and required ETL jobs. Optimized performance on the Oracle server and Oracle RAC.
  • Created pre- and post-session scripts for checking source file existence and archiving source files and bad files. Deployed metadata into the production repository from the development repository.
  • Implemented unit, functionality, performance, and stress testing on mappings and created testing documents.
  • Optimized the performance of mappings and sessions by identifying and eliminating bottlenecks.
  • Involved in unit, system, integration, and user acceptance testing.
  • Prepared job docs and the Module Inventory List (MIL) for code migration to production.
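
A minimal sketch, with invented file, table, and column names, of the kind of FastLoad wrapper used to land a delimited extract into an empty staging table (with VARTEXT input every field is defined as VARCHAR):

#!/bin/ksh
# Illustration only: FastLoad of a pipe-delimited revenue extract into an empty
# staging table. Names, the input path, and logon variables are placeholders.

fastload <<EOF
SESSIONS 4;
ERRLIMIT 25;
LOGON ${TDPID}/${ETL_USER},${ETL_PASS};

SET RECORD VARTEXT "|";

DEFINE
    acct_no  (VARCHAR(18)),
    rev_dt   (VARCHAR(10)),
    rev_amt  (VARCHAR(20))
FILE = /data/in/revenue_extract.dat;

BEGIN LOADING stg_db.rev_stg
    ERRORFILES stg_db.rev_stg_err1, stg_db.rev_stg_err2
    CHECKPOINT 100000;

INSERT INTO stg_db.rev_stg
VALUES ( :acct_no, :rev_dt, :rev_amt );

END LOADING;
LOGOFF;
EOF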

Environment: Teradata V2R5/V2R6, Teradata CRM, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, Teradata SQL Assistant, Teradata Manager, Teradata Query Manager, Teradata Performance Monitor, Teradata Administrator), Ab Initio, MicroStrategy, Erwin, Windows 2000/UNIX, MVS, JCL, ISPF 5.6, 3278 Terminal.

Confidential, KS Mar 2004 - Jun 2004
Teradata DBA - Par Pinnacle (Little Billers)
This project is based on Sprint Long Distance revenue (charges, taxes, discounts, and customer) information.

  • Exported data from UNIX to the mainframe for backup.
  • Created mainframe datasets, JCL, procedures, and libraries.
  • Worked on the database team supporting both database design and data architecture activities using the Teradata RDBMS (V2R4/5/6).
  • Developed Ab Initio graphs based on detailed design documents and reviewed the high-level design document for data movement.
  • Involved in creating Flat files using dataset components like Input file, Output file, Intermediate file in Ab Initio graphs.
  • Extensively used transform components such as Aggregate, Join (match sorted), Denormalize Sorted, Reformat, Rollup, and Scan.
  • Implemented component-level, pipeline, and data parallelism in Ab Initio for the data warehouse ETL process.
  • Extensively used partitioning components (Broadcast, Partition by Key, Partition by Range, Partition by Round Robin) and departition components (Concatenate, Gather, Merge) in Ab Initio.
  • Used various Ab Initio graph components such as Partition by Key, Sort, Reformat, Join, and Dedup to impose business logic on the incoming data when loading and maintaining dimension tables (insert/update), so that record history could be retained across recurring loads.
  • Exported Amdocs/LittleBillers files to Teradata using FastLoad, FastExport, MultiLoad, and TPump.
  • Converted text files from ASCII to EBCDIC using Ab Initio.
  • Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
  • Developed Mapplets using corresponding Source, Targets and Transformations.
  • Created Sessions and Batches through the Informatica Server Manager.
  • Designed and documented the validation rules, error handling and test strategy of the mapping.
  • Tuned the sessions for better performance by using Server Manager Statistics.
  • Developed UNIX shell scripts that invoke pmcmd to start and stop sessions and batches (a wrapper sketch follows this list).
  • Created and supported Development and Testing Teradata databases and proposed backup and recovery strategies to the client.
  • Set up the Teradata Priority Scheduler to control the production workload and reconcile ad-hoc users with management reports and required ETL jobs. Optimized performance on the Oracle server and Oracle RAC.
  • Implemented lookups, lookup_local, in-memory joins, and rollups to speed up various Ab Initio graphs.
  • Extensively used Ab Initio Co>Operating System commands like m_ls, m_wc, m_dump, m_copy, m_mkfs, etc.
  • Developed performance utilization charts, optimized and tuned SQL, and designed physical databases. Assisted developers with MVS COBOL, Teradata load utilities, and SQL.
  • Gathered revenue information for long distance, prepaid calling card, Sprint Video, and Sprint Conference from Amdocs and LittleBillers and exported files to the mainframe using FTP.
  • Involved in query analysis, performance tuning, and testing.
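
A minimal sketch of the pmcmd wrapper pattern referenced above. The original scripts drove sessions and batches on the Informatica release of the time; this sketch uses the later workflow-oriented pmcmd syntax, and the service, domain, folder, and workflow names are placeholders:

#!/bin/ksh
# Illustration only: start or stop a load workflow via pmcmd. Flag names vary by
# PowerCenter release; credentials come from environment variables.

ACTION=${1:?usage: $0 start|stop}

case "$ACTION" in
  start)
    pmcmd startworkflow -sv IS_ETL -d Domain_ETL \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f FIN_LOADS -wait wf_revenue_load
    ;;
  stop)
    pmcmd stopworkflow -sv IS_ETL -d Domain_ETL \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f FIN_LOADS wf_revenue_load
    ;;
  *)
    echo "usage: $0 start|stop" >&2
    exit 1
    ;;
esac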

Environment: Teradata V2R5/V2R6, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, Teradata SQL Assistant, Teradata Manager, Teradata Query Manager, Teradata Priority Scheduler, Teradata Administrator), Ab Initio 2.12.2, DataStage, Informatica, SAS, Windows 2000/UNIX, MVS, JCL, ISPF 5.6, 3278 Terminal.

Confidential, OH May 2002 – Feb 2004
Sr. Teradata Developer
RMDB (Relationship Marketing Database) is a sales-friendly marketing automation application tailored for Teradata, built on Teradata with a web interface and full Unicode compliance. It empowers sales to manage communications into their accounts, deploy better organized and targeted marketing programs, increase automation of lead management, and report on campaign effectiveness. It provides access to the history of NCR communications, supports CRM from initial contact through loyalty/retention, and delivers external evidence of effective internal use of CRM, including Teradata and other NCR components. It also offers various reports that help marketing and sales staff analyze and introduce better programs for better sales.

  • Designed and developed Teradata load scripts on a Teradata V2R5/5.1 platform in a UNIX environment. Performed DBA functions and data validation.
  • Created the table views, stored procedures and macros for CPP.
  • Created tables, views, macros, stored procedures for ERR, Event Plus page.
  • Monitored and tuned the Teradata database and the ETL scripts written in Teradata SQL for performance. Developed downloads of the reports in Excel format.
  • Served as the ETL specialist responsible for data mart consolidation from an Oracle to a Teradata V2R5 platform in a UNIX environment. Also served as DBA and mentor to the customer and fellow Teradata Professional Services consultants in the areas of database administration, performance tuning, and problem diagnosis and resolution.
  • Imported sources and targets to create mappings based on business logic and developed transformations using PowerCenter Designer. Used Informatica Workflow Manager and Workflow Monitor to create sessions and batches.
  • Extensively used Transformation Language functions in the mappings to produce the desired results.
  • Partitioned the Sessions for better performance.
  • Worked on all the transformations like Lookup, Aggregator, Expression, Filter, Router, Update Strategy, Stored Procedure and Sequence Generator.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal run in Workflow Manager and Workflow Monitor.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components like Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Replicate, Merge, etc.
  • Used workflow manager and workflow monitor to schedule and monitor the workflows.
  • Involved in designing the ETL process to extract, transform, and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Interacted with a mainframe server running MVS to extract DB2 data available in datasets, using JCL that includes Teradata SQL statements and calls to Teradata utilities like MultiLoad, FastLoad, and TPump.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as partition, reformat, rollup, dedup sort, join, scan, normalize, gather, merge, filter by expression, replicate, redefine, generate records, input table, output table, and continuous update table, along with string, date, miscellaneous, inquiry, and lookup functions.
  • Extensively involved in the design of data extraction, transformation, and migration.
  • Designed and implemented automated, ongoing Teradata production monitoring of hourly user access and user resource utilization, drawing data from DBC.AMPUsage (a simplified sketch follows this list).
  • Designed and implemented an automated system for ongoing ETL job performance and Report creation, logging start and end time of each step in each ETL Job and each Report.
  • Involved in query analysis, performance tuning, and testing.
  • Created the Account Name Change report using Teradata Tools & Utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ) to export data.
  • Created FastExport, MultiLoad, and FastLoad UNIX script files for batch processing.
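
A simplified sketch of the kind of resource report drawn from DBC.AMPUsage, as referenced above; the output path, report shape, and logon variables are invented for illustration:

#!/bin/ksh
# Illustration only: summarize accumulated CPU and I/O per user/account from
# DBC.AMPUsage and write the result to a timestamped report file.

RPT=/tmp/ampusage_$(date +%Y%m%d%H).rpt

bteq <<EOF > "$RPT"
.LOGON ${TDPID}/${DBA_USER},${DBA_PASS}
.SET WIDTH 200

SELECT  UserName
       ,AccountName
       ,SUM(CpuTime) AS TotalCpu
       ,SUM(DiskIO)  AS TotalIO
FROM    DBC.AMPUsage
GROUP BY 1, 2
ORDER BY TotalCpu DESC;

.LOGOFF
.QUIT
EOF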

Environment: Teradata V2R5, OLAP tools, Teradata SQL, Tools & Utilities (BTEQ, FastExport, MultiLoad, FastLoad, TPump, QueryMan, Teradata WinDDI, Teradata Manager, Teradata Query Manager, Teradata Priority Scheduler), Ab Initio 2.12.2, Informatica, Windows 2000/UNIX, MVS, JCL.

Confidential, CA Sep 2001 – Apr 2002

Sleuth9 Network Security automatically controls DDoS attacks, reduces security breaches, and protects the network around the clock.

  • Designed the database schema for storing defects.
  • Developed a defect-reporting system to enter defects into the database using JFC/Swing.
  • Developed functionality to track information such as IP address, hostname, and login during choking and tracking.
  • Developed the SMI Filter monitoring system for online tracking.
  • Developed IP Filter monitoring to display online tracking and choking information while Sleuth9 is running.
  • Developed a website for Deep Nines.
  • Developed a mail notification system, using sockets, triggered when DDoS attacks occur.
Environment: Java, JFC/Swing, networking, Solaris, Windows 2000, JBuilder, Oracle 9i, Apache Server, ColdFusion, JavaScript, Net.Data, HTML.

Confidential, CA Mar 2000 - Aug 2001
Teradata Developer
Retail Data Mining (RDM) Tool is a vertical application of data mining technology. It provides unrivalled integration with Teradata and can analyze higher volumes of data at higher speed. The aim of this project is to support business users, who can apply retail knowledge against real-life customer segments discovered automatically across millions of customer transactions. To achieve this, analysis, segmentation, scoring, and reporting are performed; the segmentation depends on the segments in the basket.
Worked as a team member developing a web-based application for printing barcode labels from the web, mainly for the automotive industry across different hubs. A hub sends ship schedules to its suppliers through EDI, and the supplier receives the order as an XML document. The XML document was parsed using the Sun XML parser to update the backend and generate the barcode labels based on the data sent.

  • Developed the GUI for two different versions and had sole responsibility for developing the 2.0 GUI and integrating it with the Teradata database. In the GUI, used many Swing features such as JTable for reporting, JProgressBar for scoring and analysis, JSlider for selecting clusters, and JPanel for variable registration. Developed server-side programs to retrieve data from the database and SQL programs to connect to the database and retrieve data based on the variable selection criteria. Used FastExport to transfer data from a table to a local text file, created batch files, and ran them at runtime. Developed several servlets that interact with the Oracle database.
  • Built data providers on different Teradata tables, such as queries on universes, free-hand SQL, and stored procedures.
  • Developed JSPs for user interaction.
  • Client-side validations were done using JavaScript.
  • Java Web Server 2.0 was used as the web server.

Environment: JDK 1.2.2, JFC/Swing, Teradata, JBuilder, EJB, WebSphere 3.5, XML, XML DOM, Java Servlets, JavaServer Pages, JavaScript, Net.Data, ColdFusion.

Confidential, AZ Sep 1999 – Feb 2000
Customer Problem Reporting System
The Customer Problem Reporting (CPR) system is a hardware and software solution aimed at replacing the current Technical Action Request (TAR) system used by the Motorola Computer Group (MCG) for reporting and tracking vendor, customer, and internally found problems against MCG hardware and software products. The project is divided into four modules: the general user can search and view CPR reports; the engineer can create, edit, request close, and search and view CPR reports; the manager can assign an owner, create, edit, request close, and view reports; and the administrator can do all of the above.

  • Involved in developing search forms using the Apache Element Construction Set and HTML.
  • Developed modifications to web configuration files.
  • Developed SQL programs to retrieve data from the database based on search criteria.
  • Developed server-side programs to retrieve data from the database.

Environment: JDK 1.2.2, J2EE, Java Servlets, JFC/Swing, JavaServer Pages, EJB, WebLogic 5.0, HTML, Oracle, Apache JServ, Java Applets, XML, VisualAge for Java.

Worked as a Programmer/Analyst in India from Aug 1996 to Aug 1999.
