
Application Support Admin/Developer Resume


Experience Summary:

  • 7 years of IT experience as a Data Warehouse Analyst in the design, development and administration of information systems, including application software optimized for client/server environments and large transaction volumes.
  • Installation and configuration of Information Server 8.7.
  • Configuration of high availability and clustering for the Engine and Services tiers.
  • Installed and configured SAP BW and R/3 connectivity to DataStage.
  • Performed administration tasks such as role setup, connectivity, security, project configuration, user registries, creating and updating users and groups, mapping credentials, configuring DSN and TNS entries, and setting parallel, compiler, reporting, operator and user-defined environment variables.
  • Configuration of the XMETA repository, Node Agents, DSEngine and WebSphere Application Server.
  • Expert in backing up and restoring InfoSphere/WebSphere Information Server both manually and with the isrecovery automated tool.
  • Expertise in the administration of IBM InfoSphere/WebSphere Information Server tools such as DataStage, QualityStage, Balanced Optimization and Information Server Manager.
  • Experience in source systems analysis and data extraction from various sources (Flat files, XML files, Oracle 8i/9i/10g/11g, DB2, SQL Server, Teradata, MQ and SAP) into staging area and target Data warehouse.
  • Experience in converting the Business Logic into Technical Specifications.
  • Extensive professional expertise in Extraction, Transformation and Loading (ETL).
  • Extensively worked with processing stages such as Join, Merge, Lookup, Transformer, Change Capture, Remove Duplicates, Funnel, Filter and Pivot.
  • Designed Documents for performing unit testing and integration testing of developed code.
  • Fixed errors, issues and bottlenecks in design, compilation and execution of server jobs, parallel jobs and complex sequence jobs.
  • Extensively worked on Server Edition, Enterprise Edition (Parallel Extender) and development of Data Warehouse/Data mart Applications.
  • Strong understanding of the principles of Data warehouse using Fact Tables, Dimension Tables, star and snowflake schema modeling.
  • Experience in Data modeling, Data warehouse using modeling tools like Visio and ERWIN.
  • Designed many Batch Processing jobs within DataStage and created many sequences in DataStage using Job Sequencer.
  • Experienced in Scheduling and running Jobs using DataStage Director, External tools like Control-M, Autosys and Unix Crontab.
  • Involved in performance tuning of various DataStage jobs through the use of hashed file stages, SQL tuning, job tuning and IPCs.
  • Responsible for all activities related to the development, implementation, administration and support of ETL processes for large-scale data warehouses using DataStage.
  • Extensive experience with Microsoft SQL Server, Oracle 11g/10g/9i/8i, IBM DB2, SQL, PL/SQL, SQL*Loader and TOAD.
  • Strong in UNIX. Extensively used UNIX scripts for scheduling the jobs.
  • Strong Work experience in implementing slowly changing dimensions (Type I, Type II and Type III).
  • Exposure to data warehouses in various industry segments such as telecom, finance, banking and insurance.
  • Experience in Installation, development and administration of Data Stage.
  • Excellent analytical, problem-solving, interpersonal, communication and presentation skills; a team player with the ability to work under tight schedules.

Technical Skills

Operating System

Windows XP/2003/NT/2000, AIX, Solaris, Linux, HP-Unix

Automation Tools

Control-M, Autosys, DataStage Director

ETL Tools

DataStage and QualityStage 5.0/7.0/7.5/8.0/8.7/8.7.0.1

Web servers

IBM WebSphere Application Server 6.0/7.0/8.0

Languages

C, SQL, PL/SQL, Shell Script, JavaScript

Database

Teradata V2R5, Oracle 9i/10g, DB2, MS Access

Modeling tool

Erwin 3.5/4.2, MS Visio

Other tools

TOAD 8.0, MS Office, SQL*Plus, SQL*Loader, FTP

Educational Details:
Bachelor's in Electronics and Communication Engineering.

Professional Experience

Confidential, Des Peres, MO October 2011 – Present
Application Support Admin/Developer

Confidential is the largest telecom operator in the U.S. Confidential designed a Data Management Tool (DMT), which assists AT&T with engineering record quality checks, error identification and data cleanup efforts. The tool's objective is to perform both initial cleanups and sustaining data maintenance in the future. Once the data is cleansed, it is expected to contribute directly to improved functionality and automation, resulting in cost savings. As part of the DMT process, the ETL application DataStage was used to pull data from the GCOMM application. As part of the ETL process:

  • The wire centers in the current batch cycle for DMT processing are updated by region and scheduled for the GCOMM tools.
  • The data is then staged into DMT for individual feature types. Data staged from GCOMM into DMT is held in temporary tables, which are truncated at the beginning of each data load before the GCOMM key attribute data is copied in.
  • After the initial staging process, the current row is expired and a new row is created whenever key value changes are detected by comparing the data in the stage tables with the data in the history tables. The history tables track a feature's attribute changes over time (see the sketch after this list).
  • For each feature, the key attribute values are compared against business rules; if a discrepancy is detected, an error is created or updated with an appropriate reason and status code.
  • Detected errors are auto-corrected wherever possible and posted back to GCOMM.
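
A minimal sketch of the history-tracking step described above, shown as a shell script with embedded SQL. The table and column names (dmt_stage_feature, dmt_hist_feature, key_attr) and the DB_USER/DB_PASS/DB_TNS connection variables are hypothetical placeholders; in the actual application this logic runs inside DataStage jobs rather than a standalone script.

    #!/bin/ksh
    # Sketch: expire the current history row and insert a new one when a key
    # attribute changes. All object names below are illustrative only.
    # Assumes an Oracle client and DB_USER/DB_PASS/DB_TNS set in the environment.

    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    -- Expire the current row when the staged key attribute differs
    UPDATE dmt_hist_feature h
       SET h.curr_flag = 'N',
           h.expiry_dt = SYSDATE
     WHERE h.curr_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM dmt_stage_feature s
                    WHERE s.feature_id = h.feature_id
                      AND s.key_attr  <> h.key_attr);

    -- Insert a new current row for any feature that no longer has one
    INSERT INTO dmt_hist_feature
           (feature_id, key_attr, effective_dt, expiry_dt, curr_flag)
    SELECT s.feature_id, s.key_attr, SYSDATE, NULL, 'Y'
      FROM dmt_stage_feature s
     WHERE NOT EXISTS (SELECT 1
                         FROM dmt_hist_feature h
                        WHERE h.feature_id = s.feature_id
                          AND h.curr_flag  = 'Y');

    COMMIT;
    EXIT;
    EOF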

Responsibilities:

  • Upgraded from WebSphere Information Server 8.0.1 to InfoSphere Information Server 8.1 FP1 on Dev, Test and Prod servers.
  • Installation and configuration of Information Server 8.7 and Fix Pack 1 on Dev, Test and Prod servers.
  • Defined strategy for the Installation and configuration of the new environment.
  • Installation of Information Server Patches for Service, Engine and Client Tiers.
  • Configured IBM Infosphere Datastage and Qualitystage, created users, groups, credential mappings and assigned administrator and user roles, privileges in different environments.
  • Configured Environment variables, security, and connectivity. Tuned the deployment for performance, and backed up the installation.
  • Added, deleted and set up DataStage projects from DataStage Administrator.
  • Managed space allocation and usage within the DataStage file systems.
  • Ensured that the DataStage server was backed up appropriately and recovered DataStage server data when necessary.
  • Used the Export/Import wizard to move DataStage jobs between servers (Dev, QA and Prod).
  • Used Technical transformation document to design and build the extraction, transformation, and loading (ETL) modules.
  • Designed the mappings between sources (external files and databases) to operational staging targets.
  • Extensively worked with the IBM Web Console for WISD jobs and for unlocking DataStage jobs.
  • Configured the server files uvodbc.config and .odbc.ini and set up TNS entries.
  • Created Configuration files to increase parallelism of DataStage jobs.
  • Developed custom Routines and Transforms.
  • Developed parallel jobs using parallel stages such as FTP, Data Set, Sequential File, Complex Flat File, Funnel, Filter, Modify, CDC, Merge, Join, Lookup, Transformer, Oracle Enterprise and Oracle Connector.
  • Used DataStage Director for scheduling sequences and jobs.
  • Implemented reuse through multiple-instance jobs, parameter sets and shared containers, and used configuration files to increase the parallelism of DataStage jobs.
  • Created and used DataStage shared and local containers for DS jobs and for retrieving error log information.
  • Used IBM DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements.
  • Designed jobs in DataStage Designer to extract data from the OLTP systems to the staging area and target.
  • Responsible for monitoring and troubleshooting DataStage jobs during the production data load processes.
  • Developed solutions for complex logic by writing Before/After Routines and contributed in Parameterization of Data Stage jobs through parameter file.
  • Fine-tuned the environment to avoid bottlenecks and performance issues.
  • Subject matter expert for any critical issues related to DataStage.
  • Extensively used multiple configuration files (environment variable) to increase the nodes according to the varying processing needs.
  • Experienced in enhancing the DataStage jobs, batches and job sequencer in the production environment.
  • Experienced in implementing slowly changing dimensions Type 1 and Type 2 for the dimension tables.
  • Expert in using the IBM DataStage Connector Migration Tool.
  • Extensively used QualityStage to standardize and integrate the data from different sources (address matching).
  • Resolved errors in compilation and execution of different jobs.
  • Extensively used UNIX scripts scheduled through crontab for running the jobs (see the scheduling sketch after this list).
  • Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT) for every code change and enhancement.
  • Responsible for starting and stopping the DataStage server during maintenance windows.
  • Extensively wrote user-defined SQL queries.
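
A minimal sketch of the crontab-driven scheduling mentioned above, using the dsjob command-line interface. The install path, project name (DMT_PROD), job name (seq_daily_load), log directory and e-mail address are hypothetical, and the return-code handling is simplified.

    #!/bin/ksh
    # run_ds_job.sh - wrapper invoked from cron to run a DataStage sequence.
    # Example crontab entry (02:00 daily):
    # 0 2 * * * /home/dsadm/scripts/run_ds_job.sh

    # Load the DataStage environment (path is a typical default, adjust per install)
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv

    PROJECT=DMT_PROD
    JOB=seq_daily_load
    LOG=/home/dsadm/logs/${JOB}_$(date +%Y%m%d).log

    # -run starts the job; -jobstatus waits and returns the job's finish status
    /opt/IBM/InformationServer/Server/DSEngine/bin/dsjob -run -jobstatus "$PROJECT" "$JOB" > "$LOG" 2>&1
    RC=$?

    # dsjob with -jobstatus typically returns 1 (finished OK) or 2 (finished with warnings)
    if [ $RC -ne 1 ] && [ $RC -ne 2 ]; then
        echo "DataStage job $JOB failed with status $RC" | mailx -s "ETL failure: $JOB" dsadmin@example.com
    fi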

Environment: IBM Information Server V8.7.0.1/8.7/8.1.0.1/8.0.1 EE, IBM Data Stage 8.X EE (Designer, Administrator, Director), SQL, PL/SQL, MS Visio, Oracle 11g/10gR2, IBM DB2 9.1/9.7, TOAD 7.3, Windows XP/7, AIX 5.3/6.1.

Confidential, Washington D.C. March 2009 to September 2011 
Datastage Admin/ Developer

Confidential is an international financial institution which provides loans to developing countries. Shared Object is the system which loads all the information into the shared database; it is the heart of the Business Intelligence application in the World Bank data warehouse. The application extracts data from different source systems and loads it into the Business Intelligence system, which is in turn referenced by finance applications. The Vendor Data Warehouse is the application which holds information about economic data, development reports, social indicators, debt tables and so on. It helps the Bank analyze economic and social trends in developing countries, with an emphasis on Bank borrowers.

Responsibilities:

  • Analyzed the requirements and functional specifications and identified the source data to be moved to the warehouse.
  • Extracted data from various source systems.
  • Set up the projects, roles, users, privileges in different environments.
  • Set up job parameter defaults and environment variables.
  • Used the Datastage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse.
  • Validated, scheduled, ran and monitored developed jobs using DataStage Director.
  • Exported and imported jobs between development and production environments using DataStage Manager.
  • Extensively used built-in Processing stages, which includes Aggregator, Funnel, Remove Duplicates, Join, Transformer, Sort and Merge in most of the jobs.
  • Scheduled jobs for daily loads.
  • Involved in planning for building a new DataStage Environment.
  • Involved in unit testing of the system.
  • Defined the migration approach for the new version.
  • Developed PX jobs that use both pipeline parallelism and partition parallelism techniques (see the configuration sketch after this list).
  • Extensively worked on Slowly Changing Dimensions techniques to maintain the historical details of the data. SCD Type 1 and SCD Type 2 were used as a response to the change in the source data.
  • Worked on Quality Stage to cleanse and standardize the data received from different sources.
  • Integrated stored procedures and user defined SQL into Data Stage.
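
A minimal sketch of the kind of multi-node parallel configuration behind the partition parallelism mentioned above. The host name, directories and node count are hypothetical; in practice the configuration file is usually maintained directly rather than generated from a script.

    #!/bin/ksh
    # Sketch: write a 2-node parallel configuration file and point DataStage at it.
    # Host name and directories are placeholders.

    CFG=/opt/IBM/InformationServer/Server/Configurations/two_node.apt

    cat > "$CFG" <<'EOF'
    {
      node "node1"
      {
        fastname "etlhost"
        pools ""
        resource disk "/data/ds/datasets" {pools ""}
        resource scratchdisk "/data/ds/scratch" {pools ""}
      }
      node "node2"
      {
        fastname "etlhost"
        pools ""
        resource disk "/data/ds/datasets" {pools ""}
        resource scratchdisk "/data/ds/scratch" {pools ""}
      }
    }
    EOF

    # Parallel jobs pick up the file through the APT_CONFIG_FILE environment variable
    export APT_CONFIG_FILE="$CFG"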

Environment: IBM Information Server 8.1/8.0.1 EE, IBM Data Stage 8.X EE (Designer, Administrator, Director), DB2, Oracle, SQL server, Control-M, RHEL 4/5, Windows XP.

Confidential, Toledo, OH July 2007 – February 2009
Datastage Developer

Developed a central data warehouse for the Debit division of Bank One. The warehouse is intended to help the Sales and Marketing department categorize customers based on their portfolio of services, including checking accounts, savings accounts, personal loans and geographical area. Using different ad-hoc analyses, the warehouse supports defining a strategy for each customer category. In the Debit Card Division of Bank One, we built a customer-centric data warehouse to analyze transactions across finance, marketing, risk, collections and consumer relations.

Responsibilities:

  • Involved with the end users / business analysts to collect the requirements of the project.
  • Experienced with DataStage Parallel Extender for partitioning the data into subsets and load balancing across all available processors to improve job performance.
  • Preparation of technical specification for the development of Extraction, Transformation and Loading (ETL) mappings to load data into various tables in Data Marts and defining ETL standards.
  • Extensively used Data Set, File Set, Lookup File Set, Transformer, Sort, Join, Merge, Lookup, Funnel, Copy, and Modify stages.
  • Used DataStage Manager to import the metadata / schema definitions into various jobs.
  • Worked on the clean up, fixes and loading of metadata into the Warehouse.
  • Created job sequences to handle the load dependency.
  • Used Director to run, monitor and schedule the jobs.
  • Extensively wrote user-defined SQL to override the generated SQL queries in DataStage.
  • Extensively involved in QualityStage (Integrity Stage) for the quality checking and error detection process.
  • Performed detailed analysis of the DB2 data warehouse for creating the database and data marts.
  • Created UNIX shell scripts to extract specific columns needed for business from the raw files (see the sketch after this list).
  • Created and documented various Data Quality Scripts to perform system integrated testing and user acceptance testing.
  • Created numerous PL/SQL scripts to tune and optimize database performance.
  • Used database triggers, functions and stored procedures to enforce constraints on the database.
  • Created and maintained ETL design document and integration test plans.
  • Extensively used Erwin to import and export the data model definition of business.
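
A minimal sketch of the column-extraction scripting described above, assuming a pipe-delimited raw file; the file names, delimiter and column positions are hypothetical.

    #!/bin/sh
    # extract_cols.sh - pull selected columns out of a pipe-delimited raw file
    # before it is fed to the DataStage load. Names and positions are placeholders.

    RAW_FILE=/data/raw/accounts_$(date +%Y%m%d).dat
    OUT_FILE=/data/stage/accounts_extract.dat

    # Keep only columns 2, 7 and 11; skip the header record
    awk -F'|' 'NR > 1 { print $2 "|" $7 "|" $11 }' "$RAW_FILE" > "$OUT_FILE"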

Environment: DataStage Enterprise Edition 7.5(Designer, Manager, Director, Administrator), Quality Stage, DB2, Sun Solaris, Windows NT 4.0.

Confidential, Chicago July 2005 – June 2007
Data Warehouse Developer

Confidential offers different kinds of insurance, such as life, auto and business. BCS Insurance serves customers through independent agents, brokers, financial institutions and affinity groups. The project involved extracting data from different sources and loading it into data marts. The major work involved transforming the data in the staging area and then loading it into the data mart.

Responsibilities:

  • Involved in analyzing existing schema to identify suitable dimensions and facts for schema and Implemented logic for Slowly Changing Dimensions.
  • Interacted with the Data Modelers while designing data mart for warehouse and reporting layers.
  • Designed and developed mappings between source systems (external files and databases) and operational staging targets.
  • Used MetaBrokers to import metadata from the ERwin tool.
  • Used IBM Websphere Data Stage Manager for importing metadata from repository, new job categories and creating new data elements.
  • Used the IBM Websphere DataStage Designer to develop processes for extracting, transforming, integrating, and loading data into Datamart.
  • Developed user defined Routines and Transformations for implementing Complex business logic.
  • Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Cursors and Packages.
  • Created database objects such as tables, views, materialized views, procedures and packages using Oracle tools such as SQL*Plus.
  • Involved in Unit and System Testing for various jobs.
  • Performed job tuning for better performance.
  • Developed shell scripts to automate file manipulation and data loading procedures (see the sketch after this list).
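
A minimal sketch of the kind of file manipulation and load automation script described above; the directories, file pattern, SQL*Loader control file and connection variables are hypothetical.

    #!/bin/sh
    # load_datamart.sh - move incoming files into a work area, load them with
    # SQL*Loader, then archive them. All paths and names are placeholders.
    # Assumes DB_USER/DB_PASS/DB_TNS are set in the environment.

    INBOX=/data/inbox
    WORK=/data/work
    ARCHIVE=/data/archive

    for f in "$INBOX"/policy_*.dat; do
        [ -f "$f" ] || continue            # nothing to do if no files arrived
        mv "$f" "$WORK/"
        base=$(basename "$f")

        # Load the file into the staging table defined in the control file
        sqlldr userid="$DB_USER/$DB_PASS@$DB_TNS" \
               control=/data/ctl/policy_stage.ctl \
               data="$WORK/$base" \
               log="$WORK/$base.log"

        # Keep the processed file for audit purposes
        mv "$WORK/$base" "$ARCHIVE/$base.$(date +%Y%m%d%H%M%S)"
    done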

Environment: IBM WebSphere DataStage 5.0 (Designer, Manager, Administrator, Director), DataStage Enterprise Edition 7.0, MetaStage, Business Objects 5.1, Erwin 3.5, Oracle 8i, SQL*Plus, PL/SQL, HP-UX 10.2.
