
ETL Designer/ETL Engineer Resume


CA

Summary

  • 12+ years of IT experience in Requirements Gathering/Analysis, Data Analysis, Application Design, Application Development, Implementation, and Testing of applications in Data Warehouse and Client/Server projects.
  • 9 years of strong data warehousing experience using Informatica PowerMart 6.1/5.1/4.7 and PowerCenter 8.x/7.x/6.x/5.1/1.7 as ETL tools
  • Good exposure to the TriZetto Facets healthcare system
  • Good knowledge of and hands-on experience with the hpXr healthcare system
  • Hands-on experience with Star Schema modeling, Snowflake modeling, Fact and Dimension tables, and Physical and Logical data modeling using Erwin.
  • Hands-on experience with Dimensional and Relational data modeling.
  • Experience in Administration activities like Creating and Managing Repositories, Users, User Groups, Folders, Working with Administrator functions of Repository Manager.
  • Extensive experience in Data Warehouse, Data Mart and Business Intelligence using OLAP/DSS tools
  • Implemented Ralph Kimball design strategies on metadata
  • Involved in developing Data modeling, Logical/Physical model & ER-Diagrams using ERWIN
  • Extensive experience in the Client/Server technology area with Oracle Database, SQL Server, and PL/SQL for back-end development of Packages, Stored Procedures, Functions, and Triggers.
  • Involved in the complete Software Development Life Cycle (SDLC) of Data Warehousing and Decision Support Systems.
  • Hands on experience on Integrating Informatica with Salesforce data
  • Expertise in UML, Rational Unified Process (RUP) and Rational Rose.
  • Good knowledge and experience in Postgres DB and Salesforce data.
  • Knowledge of Teradata utilities (SQL, BTEQ/ BTEQWin, FastLoad, MultiLoad, FastExport, Tpump, Queryman, etc)
  • Expertise in implementing complex Business rules by creating complex mappings/mapplets, shortcuts, reusable transformations and Partitioning Sessions.
  • Worked with the onsite and offshore model and led the team
  • Worked on Agile Methodology
  • Extensive experience in data analysis to improve overall efficiency to support decision making
  • Effective problem-solver. Organized. Team player
  • 3+ years of Tech Lead and Architect experience.
  • Good knowledge of Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), PowerExchange, and DAC (Data Warehouse Administration Console) tools.

TECHNICAL SKILLS

Hardware: UNIX Servers, Microprocessors – 8085, 8086.

Database: Oracle 10g/9i/8i/7.x, SQL Server 2005, Teradata, DB2 UDB 7.2

OS: UNIX, IBM Mainframes, Windows 2000/NT/XP/98/95, Sun Solaris, DOS

Languages: C, C++, C#, Java, Transact SQL (T-SQL), Unix Shell scripts, HTML, XML, XSLT, COBOL, FORTRAN.

ETL Tools: Informatica PowerCenter /PowerMart 8.x/7.X/6.1.0/5.1.0/4.7.2/4.6, PowerConnect

BI Tools: Cognos 6.x/7.0, Business Objects

Documentum Tools: EDMS 98, e-Content Server 4.1, Desktop Client 4.1, DDS, DA, IC, DQL, Docbasic, InputAccel, WDK 4.2, Server API, WF Manager, Right Site Server

Servers: Oracle9iAS, Weblogic 6.x/5.1, Websphere 4.x/3.x, LDAP (iPlanet / Netscape Directory Server 4.12), JWS, IIS, Tomcat 4.x/3.x

Tools/Technologies: Microsoft .NET, Microsoft Mobile Internet Toolkit, MSMQ, COM/COM+, ODBC, ADO, JDBC, Visual SourceSafe, IIS, MTS, SOAP, XML, Bugzilla

Web: ASP, JSP, XML, XSL, XSLT, VBScript, JavaScript, WML, .NET, ASP.NET, VB.NET, C#.NET

Methodology: UML – Object-Oriented Analysis & Design (OOAD)

Design: Rational Rose, Erwin Platinum 4.0/3.5, and Visio

ERM/CRM: Siebel 99/2000

EDUCATION

Bachelor of Engineering in Electronics

CERTIFICATIONS

Sun Certified Java Professional

Informatica Brainbench Certification

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA Dec’11 – Present

ETL Designer/ETL Engineer

Proj: Liferay, Canvas, Salesforce, FDS

The aim of this project is to replicate the database structures and data for Liferay, Canvas, and Salesforce into a staging area on a daily incremental basis to feed the FDS (Foundational Data Source) data mart.

Role:

  • Created DDL scripts to create Postgres and Salesforce objects in the Oracle DB
  • Developed a POC for the Liferay, Canvas, and Salesforce ETL move
  • Prepared high-level and low-level design documents
  • Designed Informatica interfaces
  • Primarily responsible for ETL design, coding, and testing strategy
  • Suggested design changes
  • Analyzing the data for defects and fixing the defects
  • Created mappings using Mapping Designer to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, Router, and Union
  • Designed mapplets using Mapplet Designer and used those Mapplets for reusable business logic
  • Implemented Audit approach
  • Created re-usable transformations and Worklets
  • Involved in data analysis
  • Worked on Postgres and Salesforce connection issues
  • Worked on extracting Salesforce data to Stage on incremental basis.
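
The audit approach mentioned above can be sketched as a simple source-to-target row-count reconciliation. This is a hypothetical illustration, not the project's actual implementation; the job name, log format, and the idea of passing counts as arguments are all assumptions (real counts would come from session statistics or control queries):

```shell
#!/bin/sh
# Hypothetical audit sketch: compare rows extracted from the source with
# rows loaded to the target, and append the result to an audit log.
audit_check() {
  job="$1"; src_cnt="$2"; tgt_cnt="$3"; log="$4"
  if [ "$src_cnt" -eq "$tgt_cnt" ]; then
    status=BALANCED
  else
    status=OUT_OF_BALANCE
  fi
  # One audit line per run: date, job name, both counts, and the verdict.
  echo "$(date +%Y-%m-%d) $job src=$src_cnt tgt=$tgt_cnt $status" >> "$log"
  [ "$status" = BALANCED ]
}

# Demo with a throwaway log file and made-up counts.
log=$(mktemp)
audit_check SF_ACCOUNT_STG 1200 1200 "$log" && echo "audit passed"
cat "$log"
rm -f "$log"
```

A real version would feed the counts from the extract query and the target load statistics, and a scheduler would alert on any OUT_OF_BALANCE line.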

Environment: Informatica PowerCenter 9.x, Workflow Designer, PowerCenter Designer, Repository Manager, Oracle 10g, Windows Scripting, Windows NT/2000/XP, Postgres, Salesforce, Liferay, Canvas

Confidential, Newport Beach, CA May’11 – Oct’11

ETL Designer

Proj: DI Hub, Conversion, HR

DI Hub Project:

DI Hub is a file validation framework that ensures the right set of data is processed, applying file validation rules before generating the load-ready file that feeds PeopleSoft and the data mart.
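
A file-level validation of this kind can be sketched in shell. The control-record layout below (a header line, pipe-delimited data rows, and a trailer carrying the expected row count) is an assumption for illustration, not the actual DI Hub file format:

```shell
#!/bin/sh
# Hypothetical file validation: check that the row count claimed by the
# trailer record ("TRL|<count>") matches the actual number of data rows
# before the file is marked load-ready.
validate_file() {
  f="$1"
  expected=$(tail -1 "$f" | cut -d'|' -f2)   # count claimed by the trailer
  actual=$(( $(wc -l < "$f") - 2 ))          # total lines minus header and trailer
  if [ "$expected" -eq "$actual" ]; then
    echo "PASS $f"
  else
    echo "FAIL $f expected=$expected actual=$actual"
    return 1
  fi
}

# Demo with a temporary file: header, two data rows, trailer claiming 2.
tmp=$(mktemp)
printf 'HDR|20110601\nA|1\nB|2\nTRL|2\n' > "$tmp"
validate_file "$tmp"
rm -f "$tmp"
```

In a framework like this, only files that pass would be handed to the ETL session; failures would be routed to error handling instead of producing a load-ready file.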

Conversion Project:

Convert the legacy codes to PeopleSoft codes and apply the transaction rules to load the DW transaction entry tables prior to loading the Data Mart.

HR:

The HR set of source files must pass the DI Hub validation framework, with error handling applied, before the PeopleSoft load-ready files are generated.

Role:

  • Converting business rules into ETL technical specifications
  • Design Informatica Interfaces
  • Primarily responsible for Code Reviews
  • Suggested design changes for DI Hub processes
  • Coming up with Design plan and Preparing the ETL Design document
  • Creating Unit test cases and plan for Unit testing approach
  • Analyzing the data for defects and fixing the defects
  • Created mappings using Mapping Designer to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, Router, and Union
  • Designed mapplets using Mapplet Designer and used those Mapplets for reusable business logic
  • Implemented Error Handling Strategy
  • Created re-usable transformations and Worklets

Environment: Informatica PowerCenter 9.x, Workflow Designer, PowerCenter Designer, Repository Manager, Microsoft SQL Server Management Studio 8.0, SSIS, SSRS, Windows Scripting, Windows NT/2000/XP.

Confidential, San Francisco, CA Feb’09 – Apr’11

ETL Architect/ ETL Designer Lead

hpXr (Health Plan Extended Reporting)

The main theme of the hpXr project is to support flexible, high-performance reporting across all business cycles of a health plan. hpXr is a high-performance dimensional design model that supports a standard healthcare claim processing system. Claim data is extracted from Facets into hpXr (a dimensional data model), applying business rules to support reporting and analysis.

Role:

  • Involved in Requirements review sessions and working with data modeler for design changes
  • Converting business rules into ETL technical specifications
  • Design Informatica Interfaces
  • Providing the requirements to the team, clarifying them, and managing the team
  • Primarily responsible for Design and Code Reviews
  • Performance tuning and design suggestion changes for existing hpXr processes
  • Coming up with Design plan and Preparing the ETL Design document
  • Creating Process and Data flow diagrams
  • Creating Unit test cases and plan for Unit testing approach
  • Analyzing the data for defects and fixing the defects
  • Created mappings using Mapping Designer to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, Router, and Union
  • Designed mapplets using Mapplet Designer and used those Mapplets for reusable business logic
  • Implemented Error Handling Strategy
  • Created re-usable transformations and Worklets
  • Wrote shell scripts to perform pre-session and post-session operations
  • Co-ordinating with middleware team for migrating Informatica objects into test environment
  • Impact analysis for hpXr upgrade and testing
  • Impact analysis on Facets upgrade and testing
  • Participated in Tidal scheduling sessions and came up with various test cases
  • Automating the ETL applications using Tidal tool
  • Co-ordinating/Managing ETL Offshore team
  • Creating SCRs and CRs for ETL artifacts
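
Pre-session and post-session operations like those above are typically small shell wrappers around the Informatica session. The sketch below is a hypothetical minimal version (the file names, archive layout, and checks are assumptions, not the project's actual scripts):

```shell
#!/bin/sh
# Hypothetical pre/post-session helpers around an ETL session.

pre_session() {
  # Fail fast if the expected source file is missing or empty,
  # so the session never runs against bad input.
  src="$1"
  [ -s "$src" ] || { echo "pre-session: $src missing or empty"; return 1; }
  echo "pre-session: $src OK"
}

post_session() {
  # Archive the processed file with a timestamp suffix so reruns
  # never pick up the same input twice.
  src="$1"; archive_dir="$2"
  mkdir -p "$archive_dir"
  mv "$src" "$archive_dir/$(basename "$src").$(date +%Y%m%d%H%M%S)"
  echo "post-session: archived $(basename "$src")"
}

# Demo with throwaway files.
tmp=$(mktemp); echo data > "$tmp"
pre_session "$tmp"
dir=$(mktemp -d)
post_session "$tmp" "$dir"
rm -rf "$dir"
```

In practice these would be wired into the workflow as pre-session and post-session command tasks, with the scheduler reacting to their exit codes.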

Environment: Informatica PowerCenter 8.x, Workflow Designer, PowerCenter Designer, Repository Manager, Oracle 10g, SQL, PL/SQL, SQL Loader, UNIX Shell Scripting, Windows NT/2000/XP, Visio, Business Objects, PVCS, TIDAL Enterprise Scheduler, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), Toad, SQL Developer

Confidential, Waukegan, IL Aug’06 – Jan’09

ETL Architect (Informatica and PL/SQL)/SME

Commercial Data Warehouse (CDW)

The Commercial Data Warehouse project meets the analytical needs of the sales and marketing teams. The main theme of the project is the design and development of a new analytics-driven data model. The project delivers a data warehouse including third-party Rx data and call data, current and historical sales territory information, and a product hierarchy of brand, sub-brand, and product strength, along with the capability to analyze and report on this data using Business Objects.

The warehouse is populated from ODS, MAT, and external sources via Kalido and queried using Business Objects. The primary user groups for this system are Regional Sales Analytics, Market Analytics, and Business Insight.

Responsibilities:

  • Analyzing business processes, functions, existing transactional database schemas and designing star schema models to support the users reporting needs and requirements
  • Converting the Business rules into Technical Specifications for ETL process
  • Created a conceptual data model by gathering business requirements from various sources, such as discussions with functional teams and business analysts, and from business documents
  • Designed logical and physical database designing using ERWIN Tool.
  • Created high-level and low-level design documentation for the National, Sub National, Comm Dashboard, and RSA systems
  • Conducted code review sessions with BSAs, Project Managers, and the technical team.
  • Used SQL * Loader to load the data from Flat files to Oracle for Comm Dashboard system.
  • Created a number of PL/SQL procedures for many other updates.
  • Developed complex aggregate, join, look up, filter, router transformation logic to generate consolidated data needed.
  • Worked extensively on Flat Files, as the data from various Legacy Systems is flat files.
  • Created mappings using Mapping Designer to load data from various sources, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Normalizer, Filter, and Router
  • Designed and optimized moderate-to-complex initial loads to populate SCD Type-2 dimensions using date-based versioning.
  • Created mapplets using Mapplet Designer and used those Mapplets for reusable business process in development.
  • Implemented Error Handling Strategy
  • Created re-usable transformations and Worklets
  • Improved the session performance by pipeline partitioning.
  • Upgraded PowerCenter 7.x to 8.x.
  • Generated Unit and integration Test Plans and related documents
  • Created workflows in the Workflow Designer, executed tasks such as Sessions and Commands, and monitored transformation processes using Workflow Monitor
  • Involved in performance tuning of the Informatica mapping and identifying the bottlenecks.
  • Wrote shell scripts to perform pre-session and post-session operations
  • Used shell and Perl scripts to perform audit checks on the flat files before loading the data into the National data mart
  • Primarily responsible for production support of the National, Comm Dashboard, and RSA Dashboard systems.
  • Co-ordinating with the middleware team for migrating Informatica objects into test and production environments
  • Troubleshooting the loading failure cases, including database problems.
  • Wrote JIL scripts to automate the data load into datawarehouse
  • Used Autosys scheduling tool to schedule and monitoring the jobs
  • Involved in developing the reports using Business Objects
  • Led the team in the absence of the project leader.

Environment: Informatica PowerCenter 7.x/8.x, Workflow Designer, PowerCenter Designer, Repository Manager, Oracle 9i, SQL, PL/SQL, SQL Loader, UNIX Shell Scripting, Windows NT/2000/XP, Visio, Kalido DIW, Business Objects

Confidential, San Jose, CA Nov’05 – July’06

Sr. Informatica Developer/Data Modeler

The First Franklin Business Intelligence solution aims to identify and retain the most profitable portion of the customer base, attract new customers with similar profiles, and increase the overall effectiveness of its marketing programs and overall profitability. The Callidus project’s main objective is to create a central database for data and to provide a set of standard reports that give better analysis to the financial group. The data warehouse project was developed for reporting and high-level business needs of the financial sector, using historical and online data. Sources for data extraction are Oracle, flat files, and XML files, with Oracle as the target database.

Responsibilities:

  • Involved in Analysis and detailed level of Mavent mapping documentation
  • Interacted with the end users, Project Manager and other resources in understanding the requirements.
  • Primarily responsible to convert business requirements into system design
  • Designed logical and physical database designing using ERWIN Tool.
  • Scheduled the ETL jobs daily, weekly, and monthly based on the business requirements
  • Worked on Informatica Power Center tool–Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager
  • Created Mappings using Mapping Designer to load data from various sources, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, and Update Strategy, Joiner, Normalizer, Filter and Router transformations
  • Worked extensively on Flat Files, as the data from various Legacy Systems is flat files.
  • Involved in Error Handling Strategy.
  • Implemented slowly changing dimensions (Type 2) methodology to keep track of historical data
  • Created unique primary key values to replace missing primary keys using the Sequence Generator transformation, made reusable so the same Sequence Generator could be used in multiple mappings
  • Extensively worked in the performance tuning of programs, ETL procedures and processes.
  • Created workflows in the Workflow Designer, executed tasks such as Sessions and Commands, and monitored transformation processes using Workflow Monitor
  • Created various tasks like Assignment, E-mail and Decision etc
  • Wrote shell scripts to perform pre-session and post-session operations

Environment: Informatica PowerCenter 6.x/7.x, Workflow Designer, PowerCenter Designer, Repository Manager, XML, Oracle 9i, SQL, PL/SQL, SQL Loader, UNIX Shell Scripting, Windows NT/2000/XP, Visio, Erwin 3.5.2, Bugzilla

Confidential, MI June’04 – Oct’05

Informatica Developer

The aim of this project is to support business executives to take quick decisions for Standard Federal Bank by means of integrating data from disparate feed systems, situated across different locations, into a repository of meaningful business information. The Banking Data Warehouse (BDW) system for Standard Federal Bank is designed to handle the increasing data volumes generated by the bank's growing, active client base while improving customer service levels. The BDW Solution provides quick access to data to enable a more informative decision making process. BDW helps people do their jobs more effectively and efficiently. Data is accessible and flexible and presented in a usable format, allowing more time for analysis.

Responsibilities:

  • Converted the business rules into technical specifications for ETL process.
  • Created Process Flow Diagram using MS Visio
  • Involved in Designing Logical and Physical Database Designing using ERWIN Tool.
  • Used the Administration Console to perform repository functions, such as creating, copying, backing up, and restoring repositories in Informatica Power center 7.x
  • Created staging tables to do validations against data before loading data into the target tables
  • Tuned the Performance for ETL jobs by tuning the SQL used in Transformations and fine tuning the database.
  • Scheduled the ETL jobs daily, weekly, and monthly based on the business requirements
  • Extensively used ETL to load data from different sources
  • Created Mappings using Mapping Designer to load data from various sources, using different transformations like Source Qualifier, Expression, Lookup, Aggregator, and Update Strategy, Joiner, Filter and Router transformations
  • Developed and Monitored Tasks and Workflows using Informatica Workflow Manager
  • Created workflows in the Workflow Designer, executed tasks such as Sessions and Commands, and monitored transformation processes using Workflow Monitor
  • Involved in Fine-tuning of sources, targets, mappings and sessions for Performance Optimization
  • Used Pipeline partitioning to improve Session performance
  • Created various tasks like E-mail and command etc
  • Used Informatica’s features to implement slowly changing dimension tables
  • Created unique primary key values to replace missing primary keys using the Sequence Generator transformation, made reusable so the same Sequence Generator could be used in multiple mappings
  • Deployed the cubes on the PowerPlay Enterprise Server in order to view them on PowerPlay Web.
  • Implemented drill-through from Cognos PowerPlay 6.61 cubes to Impromptu reports
  • Used Cognos Transformer 6.61 to build multidimensional cubes
  • Extensively used Impromptu 6.0 to access source data and build reports

Environment: Informatica Power Center 6.x/7.x, WorkFlow Designer, Powercenter Designer, Repository Manager, COGNOS Powerplay 6.6, Cognos Impromptu 6.0, XML, DB2, Oracle 9i, SQL, PL/SQL, SQL Loader, UNIX Shell Scripting, Windows NT/2000/XP, Visio, Erwin 3.5.2

Confidential, NJ Apr’03 – May’04

Informatica Developer

Developed a business intelligence system to quickly identify customer needs and better target services using Informatica PowerCenter software. With PowerCenter, a large amount of customer-related data from diverse sources was consolidated, including customer billing, ordering, support, and service usage. This project dealt with building various types of reports using PowerPlay and Impromptu, and with providing security across all Cognos applications using Access Manager.

Responsibilities:

  • Analysis, requirements gathering, functional/technical specification, development, deploying and testing.
  • Significant Multi-dimensional and Relational data modeling experience with modeling tools like ERWIN & VISIO with a strong emphasis on data Warehouse/data Marts analysis, design, and implementation.
  • Extensive use of Informatica Tools like Designer, Workflow Manger and Workflow Monitor.
  • Extensively used Informatica client tools like Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, and Informatica Server Manager.
  • Extracted source data from Oracle, SQL Server, Flat files and XML using Informatica, and loaded into Oracle or flat file targets.
  • Used transformations like Aggregator, Sorter, Dynamic Lookup, Connected and Unconnected Lookup, Filter, Expression, Router, Joiner, Source Qualifier, Update Strategy, and Sequence Generator.
  • Developed, tested stored procedures, functions in PL/SQL
  • Created reusable transformations and mapplets to use in multiple mappings
  • Used Workflow Manager/Monitor for creating and monitoring workflows and worklets.
  • Improved session performance through pipeline partitioning
  • Extensively used mapping parameters, mapping variables and parameter files
  • Analyzed the query performance and designed the loading process schedules
  • Created sessions and batches for various mappings and automated using UNIX shell scripts.
  • Wrote the shell scripts to process the PL/SQL procedures and wrote the PL/SQL program units for data extracting, transforming and loading
  • Involved in troubleshooting the loading failure cases, including database problems.
  • Extensively used Cognos Impromptu & Cognos PowerPlay in accessing the data from data marts

Environment: Informatica 6.x (PowerMart/PowerCenter), Workflow Designer, PowerCenter Designer, Repository Manager, Cognos, Oracle 9i/8i, PL/SQL, Teradata, SQL*Plus, TOAD, SQL Server 2000, Windows NT 4.0, UNIX, Erwin

Confidential, Indianapolis, IN July’02 - Mar ‘03

Informatica Developer

ELI LILLY is a pharmaceutical company which produces and distributes pharma products across the globe. This project built a pharmaceutical Financial Data Warehouse to present an integrated, consistent, real-time view of enterprise-wide data. Data from different sources was brought into Oracle using Informatica ETL and sent for reporting using Cognos. The Data Warehouse enhances Sales and Order Management reporting for the pharmaceutical research group, delivering reports and information to sales and marketing management.

Responsibilities:

  • Extensively involved in System Study & Business Requirement Analysis.
  • Created Process Flow Diagram using MS Visio
  • Defined the entity-relations – ER Diagrams and designed the physical databases for OLTP and OLAP (data warehouse).
  • Involved in the design and development of Star schema using dimensional modeling.
  • Created and maintained the centralized repository using Informatica Power center 6.0
  • Informatica Designer tools were used to design the source definition, target definition and transformations to build mappings.
  • Maintained and improved integrity by identifying and creating relationships between tables
  • Developed and implemented Error Handling Strategies
  • Worked with mapping wizards for slowly changing dimensions and slowly growing dimensions by using Informatica mapping designer 6.0.
  • Used Shell Script to execute post-session and pre-session commands
  • Designed and developed complex aggregate, join, lookup transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica Power center 6.0
  • Designed and developed mappings using Source qualifier, Aggregator, Joiner, Lookup, Sequence generator, stored procedure, Expression, Filter and Rank transformations.
  • Created stored procedures to validate the data before loading data into data marts.
  • Generated the List Reports, Drill Through Reports, Cross Tab Reports using cognos Impromptu

Environment: Informatica PowerCenter 6.0, PowerMart, Workflow Designer, PowerCenter Designer, Repository Manager, Cognos, Oracle 8i, XML, Erwin, Visio, Windows and UNIX

Confidential, Jersey City, NJ Jan’02 – June’02

Informatica Developer

Project: Anti Money Laundering System (AML)

This assignment involves alert management for the Brokerage Fraud and Risk Management System (BFRM). The name of the project is AML (Anti Money Laundering). The group involved in this project is CTG (Corporate Technology Group), which is responsible for creating a system that will help the MLCO in tracking Accounts, Account Addresses, Customers, Employees, and Organizations involved in money laundering. A watch list is created for all the above groups and supplied to MANTAS Corp., which calculates the risk scenarios and generates alerts. This is an ongoing process of development, enhancement, and support for all the data-transmitting groups that send data to CTG for loading the appropriate data into the BFRM data warehouse.

Responsibilities:

  • Prepared ETL specifications and created mappings in Informatica PowerCenter/PowerMart 5.1.0 for the incoming data.
  • Working with Terabytes (TB) of data supplied by different groups involved in the project.
  • Extensively involved in performance tuning of Informatica mappings and sessions.
  • Extensively using PL/SQL for validating the target data.
  • Using shell scripts for file transfer protocols/management, data transfer and querying the Oracle 8i database.
  • Using XML sources as source files for loading data in the BFRM data warehouse.
  • Using IBM mainframes data for loading the Accounts Payable, Mutual Funds, Investment Banking and Global Market Investments (GMI) data in the BFRM datawarehouse.
  • Using slowly changing dimension concept in the mappings and sessions.
  • Using parallel processing capabilities of Informatica PowerCenter 6.0/5.1.0.
  • Coordinating with different groups in Merrill Lynch, such as the Employee, Accounts, Network, and Transmission Control groups, for proper data transmission.
  • Using Autosys to schedule UNIX shell scripts, PL/SQL scripts and Informatica jobs.
  • Enhancing the mapping and sessions to reflect changes from different groups.

Environment: Informatica PowerMart 5.1/PowerCenter 1.5, Oracle 7/8.0, DB2, COBOL, UNIX Shell Scripting, XML, PL/SQL, ANSI SQL, Sun Solaris 2.x, Windows 2000

Confidential, Boston, MA June’01-Dec’01

Project: Real Estate Information System

Real Estate Information system is a web-based system, developed for Pikes Peak Association of Realtors, Colorado. The main theme of the project is to have information regarding various real estates. Various Realtors can register and provide their real estate information. It provides links for various other Real Estate information sites. This site is used to buy or sell homes.

Responsibilities:

  • Requirements Analysis through User Meetings, use case documentation and through functional specifications
  • Analyzed the business requirements using RUP and Rational Rose UML approach
  • Wrote Use Cases and identified alternate flows, exception flows and business rules.
  • Involved in developing the Realtor registration form and search pages for Realtors and homes with different search criteria.
  • Developed business logic using Java RMI and Servlets.
  • Designed, developed, and deployed Enterprise Java Beans (EJB) for server side components
  • Coded classes to interact with database using JDBC.
  • Coded JSP and HTML pages for displaying various information.
  • Development of JavaScript for client end data entry validations.
  • Involved in installation of the web server and deployment of the application.

Environment: Java, J2EE, Servlets, EJB, WebLogic 5.0, Java RMI, JSP, XML, HTML, JDBC, JavaScript, Oracle 8, Windows NT, RUP, UML.

Confidential, Hartford, CT Nov’00 - May’01

Documentum Developer

Project: Middle Market Industry Program (MMIP)

The main objective of the Middle Market Industry Program (MMIP) project is to define and implement a reusable business redesign process that can be followed from the creation of structured/unstructured business information through delivery. This business information will be held in a repository and become available for multi-channel distribution. The focus of this project is to implement the framework that will enable us to use Documentum and the BEA WebLogic Personalization Server for the purpose of automating the communication and management of business information.

Responsibilities:

  • Involved in developing applications on the e-Content Server, Desktop Client, Right Site, Intranet Client, and Documentum Administrator, including creating and authenticating users and groups using ACLs
  • Created custom object types and DocApps in Documentum Developer Studio; created custom attributes, permission sets, and aliases; and developed lifecycles and workflows
  • Definition of user content requirements
  • Implementation of content management architecture for all content
  • Implementation of a content management system for middle market target programs
  • Streamline and automation of content management workflow
  • Customized delivery of information based on user groups
  • Creation of shared ownership and responsibilities for content
  • Built customized life cycles, work flows and server side methods using Documentum Server API, Web publisher, DQL, Docbasic, DFC and Java
  • Developed and implemented custom functionality for Desktop Client
  • Involved in creating custom object types and properties
  • Developed custom menu options
  • Developed reports using Seagate Crystal Reports 8.0

Environment: Windows NT 4.0 and Sun Solaris 2.7, VB 6.0, e-Content Server 4.1, Right Site Server 4.1, Web Publisher, Web Cache, LDAP, Oracle 8.1.6, IIS 4.0, Developer Studio 4.1, Desktop Client 4.1, Documentum Administrator, Intranet Client, Workflow Manager, BEA Personalization Server, Java 2, JDK 1.3, JSP, DFC, DQL, Docbasic, Documentum API, Crystal Reports, Reporting Gateway, Parallel Crystal Server

Confidential, India Sep’99 -Oct’00

Senior Java Programmer

Cipla Pharmacy, a retailer of drugs and related products, required an e-commerce package for online search and purchase of non-prescription drugs.

The scope as defined included: The Client-Side Module handled client-based transactions. It included display of items available in inventory with their price and a small description, and then allowed the customer to select an item and add that to a session-specific cart and finally place and confirm an order. The Shop-Store Module handled administration tasks. It included addition, removal and the dynamic revision of products in the shop store’s repository. Suppliers Module was responsible for handling of restocking requests originating from the store.

Responsibilities:

  • Conducted a detailed analysis of the system requirements based on input from the client. Worked in a team of six developers.
  • The team was entrusted with coding the Client-Side Module and the Shop-Store Module.
  • Dynamic Web Page content was generated using Servlets and JSP, with various Java API elements included in the code.
  • EJB entity beans and session beans were used to interact with the database.

Environment: Java 2.0, EJB 1.1, WebLogic 5.0, JSP, Servlets, XML 1.0, Oracle, UNIX, JDBC

Confidential, India Jan’99 – Aug’99

Java Programmer

The Employee Management System maintains the daily activities of employees in the organization. Each day, employees can key in information regarding their work progress. EMS is also useful in providing information regarding responsibilities, job descriptions, and goals versus achievements.

Responsibilities:

  • Involved in design and development of web pages.
  • Involved in the Analysis, Design and development of the application.
  • Developed server-side programming using servlets and applets.
  • Performed unit Testing and Integration Testing.

Environment: Java, Servlets, ASP, Java web server, IIS, Oracle 7.3, Windows NT
