
Sr. ETL Architect Resume


CT

SUMMARY

  • Over 10 years of experience in data analysis, modeling, design, and development of databases.
  • Highly experienced in scalable parallel processing infrastructure; worked with IBM InfoSphere DataStage 11.5/9.1/8.5/8.0/7.5.2/7.5/7.1 Parallel Extender (Enterprise Edition) to distribute workloads dynamically across all available processors for better job performance
  • Extensively worked on procedures for data movement in data warehousing using DataStage, including specification of data cleansing and design of the staging area.
  • Worked on staging, transformation, pre-loading and loading of data from multiple sources into the data warehouse
  • Excellent knowledge and experience in data warehouse development life cycle, dimensional modeling, repository management and administration, implementation of STAR and Snowflake schemas
  • Identified and tracked slowly changing dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
  • Analyzed the quality of data to determine its consistency and reliability, and documented detailed findings together with recommendations for standardization and reconstruction using the IBM Ascential QualityStage (Integrity) system
  • Worked on stages such as Transformer, Copy, Modify, Surrogate Key Generator, Aggregator, Filter, FTP Enterprise, XML Metadata Importer, Join, Merge, Lookup, Funnel and Remote Deployment.
  • Experienced with developing ETL design documentation, performing impact analysis, data profiling and data analysis
  • Extensively worked on Parallel Extender in the Orchestrate/Enterprise Edition environment to split bulk data into subsets and distribute them dynamically to all available processors for better job performance.
  • Expert in creating reports using IBM Cognos and SAP BusinessObjects for reporting needs.
  • Fine tuned the performance of the various ETL jobs & processes
  • Experienced in integration of various data sources (DB2 - UDB, SQL Server, Teradata, Sybase, and Oracle) into data staging area.
  • Extensive working experience in maintenance and Production support of different Applications
  • Experienced in shell scripting to automate the jobs, file manipulation and text processing
  • Expertise in the Oracle development tool set, including PL/SQL, SQL*Plus, SQL*Loader and TOAD
  • Experience in writing, testing and implementing triggers, procedures and functions using PL/SQL
  • Excellent team player with good communication, interpersonal and analytical skills; strong problem-solving and troubleshooting capabilities; a quick learner, highly motivated, result-oriented and enthusiastic
  • Performed Unit testing, Volume testing and performance testing

TECHNICAL SKILLS

ETL Tools: IBM InfoSphere DataStage 11.5/9.1/8.5/8.1/8.0/7.5/7.1, IBM WebSphere Information Analyzer, ProfileStage, AuditStage, IBM InfoSphere QualityStage

Reporting Tools: Business Objects XI r2/6.5, Crystal Reports XI/10/8.5

Data Warehousing: Data Marts, OLTP, OLAP, Normalization, Dimensional Modeling, Facts, Dimensions, Star Schema and Snowflake Schema

Data Modeling: Erwin 4.1/3.5, ER studio 3.5, MS Visio 2003

Languages: MS Visual C, MS Visual C++, UNIX Shell Scripting, SQL, PL/SQL, Visual Basic

Operating System: UNIX, AIX, Windows NT/95/98/2000/XP, IBM Information Server 8.5/8.1

Database Tools: PL/SQL, TOAD, SQL*Loader

Databases: Oracle 10g/9i/8i/7.x, SQL Server 2008/2005/2000/7.0, Teradata 15.0, UDB, DB2 9.7/7.1/6.1

PROFESSIONAL EXPERIENCE

Confidential, CT

Sr. ETL Architect

Responsibilities:

  • Developed design documents, process flow documents and quality testing documents for building various processes
  • Worked on various kinds of sources, such as Excel, flat files, FoxPro, Oracle, SQL Server and DB2 databases.
  • Created Shared Container to simplify the Datastage design, and used it as a common job component throughout the projects.
  • Created custom routines to validate the source file names with effective dates and extract dates
  • Created template jobs for working with SAP source systems like IDoc files.
  • Worked on InfoSphere QualityStage with the ZipCodeDownload database to maintain data quality and to perform data profiling of the existing mortgage loan securities
  • Developed several advanced queries for reconciliation process in the development and test environment for performing the Maintenance releases and Defect fixes.
  • Created production support documentation for overnight batch support purposes.
  • Set up 24×7 proactive batch monitoring, which resulted in a 60% decrease in downtime
  • Worked on developing Teradata FastLoad and MultiLoad utilities.
  • Designed, developed and maintained Cognos Framework Manager security and created dashboards for trend analysis
  • Implemented Change Data Capture for capturing the Incremental Changes in the Source Systems using Change Data capture stage.
  • Worked on creating the offshore module while maintaining consistent performance and quality
  • Developed template sequencer jobs with easy restart abilities and maintenance.
  • Translated business requirements into data mart design and developed ETL logic based on the requirements using DataStage
  • Partner with Cognos power users in report training and mentoring to deliver solution
  • Worked on building automated flows for the nightly batch, which involved building the enterprise data warehouse from various dependent source systems
  • Worked on Database Stages like Oracle Connector Stage, ODBC Connector stages for SQL Server, DB2 EE, Aggregator, Transformer, Merge, Sort for developing jobs
  • Developed advanced queries and stored procedures on MS SQL Server database and DB2 databases.
  • Worked on QualityStage to identify, separate and standardize addresses using the US, SERP, CASS & DPID databases.
  • Developed various reports, such as list reports, crosstab reports, maps, dashboards and scorecards, using BI tools such as Cognos and QlikView
  • Built several TPT job scripts to extract and load to the landing area; developed TPT scripts to load data from Oracle to Teradata
  • Worked on Data conditioning on the source feed using Investigate stage, Standardization, matching stages in Quality Stage.
  • Developed various scripts for automating the Process flow on Autosys as part of batch Cycles.
  • Worked on Comprehensive Capital Analysis and Review (CCAR) activities, building advanced calculation engines to load projection data and analyzing market values and book values as part of the Federal audit process.
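For illustration, the custom file-name validation described in this role (checking source file names against effective dates) could be sketched in shell as follows; the `loans_` feed name and `YYYYMMDD` format are hypothetical stand-ins:

```shell
#!/bin/sh
# Sketch of a source-file-name validation routine: the file must match
# <feed>_<YYYYMMDD>.csv and the embedded effective date must be plausible.
# The feed name "loans" and the date layout are hypothetical examples.
valid_source_file() {
    name=$1
    case "$name" in
        loans_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].csv) ;;
        *) return 1 ;;
    esac
    d=${name#loans_}; d=${d%.csv}
    mm=${d#????}; mm=${mm%??}    # month portion of YYYYMMDD
    dd=${d#??????}               # day portion of YYYYMMDD
    [ "$mm" -ge 1 ] && [ "$mm" -le 12 ] && [ "$dd" -ge 1 ] && [ "$dd" -le 31 ]
}

valid_source_file "loans_20240131.csv" && echo "accepted"
valid_source_file "loans_20241301.csv" || echo "rejected: bad month"
```

A real routine would additionally compare the embedded effective date with the batch's extract date before accepting the file.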

Environment: IBM InfoSphere DataStage 11.5/9.1, IBM QualityStage 11.5, IBM Cognos 10.0, Oracle 12c/11g, Teradata 15.0, MS Office 2013, CA Autosys 2015, Zena, MS SQL Server 2012, Red Hat Linux 6.5, QlikView 6.0

Confidential, IL

Sr. Datastage Developer

Responsibilities:

  • Worked on creating the design documents and Datastage job template for creating the ETL process for Anti Money Laundering Project.
  • Migrated large volumes of ETL data from several source systems of Northern Trust to Oracle Staging table in the Target databases.
  • Understanding the technical process involved in populating the target database from several source systems.
  • Developed SQL queries to extract data from different tables and different databases.
  • Created various complex datastage jobs to load the data from the source table to the staging database table.
  • Optimized the performance of the existing Datastage jobs and improved the ETL scheduling process.
  • Translated business requirements into data mart design and developed ETL logic based on the requirements using DataStage
  • Worked on Configuring the IBM Information Analyzer, importing the metadata, integrating tables and analyzing the raw metadata.
  • Involved in Analysis, requirements gathering, functional/technical specification, and development, deploying and testing.
  • Implemented Change Data Capture for capturing the Incremental Changes in the Source Systems using Change Data capture stage.
  • Developed several advanced queries for reconciliation process in the development and test environment for performing the Maintenance releases and Defect fixes.
  • Created Shared Container to simplify the Datastage design, and used it as a common job component throughout the project.
  • Used Team Foundation Server (TFS) version control to manage different versions of the database scripts, shell scripts, parameter files and DataStage packages.
  • Used Control-M to automate scheduling of UNIX shell script jobs on a daily basis with proper dependencies and tasks.
  • Responsible for unit, system and integration testing. Developed test scripts, test plans and test data. Participated in UAT (User Acceptance Testing).
  • Developed advanced queries and advanced stored procedures with SQL Server web clusters to populate the data from multiple databases.
  • Used stages like Transformer, sequential, Oracle, Aggregator, Data Set, File Set, CFF, Remove Duplicates, Sort, Join, Merge, Lookup, Funnel, Copy, Modify, Filter, Change Data Capture, Change Apply.
  • Provided Defect resolution process documentation for resolving the defects in the developed jobs and reconciliation jobs to validate the rejected records in the process.
  • Worked on providing the test files for the Application Database vendor as per business requirements after Data Validation, Reconciliation and extraction process.
  • Defined production support document with recovery process.
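As an illustration of the reconciliation queries described in this role, here is a stripped-down shell sketch that compares record counts between a source extract and its loaded target; in practice the counts would come from SQL against the source and target tables, and the file names here are hypothetical:

```shell
#!/bin/sh
# Reconciliation sketch: compare record counts between a source extract and
# the corresponding target extract, flagging any mismatch for follow-up.
reconcile_counts() {
    src_cnt=$(wc -l < "$1" | tr -d ' \t')
    tgt_cnt=$(wc -l < "$2" | tr -d ' \t')
    if [ "$src_cnt" -eq "$tgt_cnt" ]; then
        echo "MATCH: $src_cnt records"
    else
        echo "MISMATCH: source=$src_cnt target=$tgt_cnt"
        return 1
    fi
}

printf 'r1\nr2\nr3\n' > source_extract.dat
printf 'r1\nr2\nr3\n' > target_extract.dat
reconcile_counts source_extract.dat target_extract.dat
```

A mismatch would typically route the rejected records to a reconciliation job for defect analysis.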

Environment: IBM Infosphere DataStage 8.5, Information Server Manager 8.5, Oracle 10g, MS Office, HP Quality Center, Control-M, MS SQL Server 2008/2005, IBM Db2 9.7, Ultra Compare, MS Office suite 2008

Confidential, IL

Sr. DataStage Developer

Responsibilities:

  • Prepared technical design/specifications for Data Extraction, Data Transformation, Data Cleansing and Data Loading
  • Worked on data traceability, data models, design documents, technical specification documents and analysis related to the STAR Project.
  • Understood the technical specifications and developed Datastage Parallel jobs for Extraction, Transformation and Loading process.
  • Involved in evaluating the scope of application database, defining data relationship within and between different groups of data.
  • Developed SQL queries to extract data from different tables and different databases.
  • Created various complex DataStage jobs to load data from the staging area to the base database and from there to the data mart.
  • Improved the performance of the IBM Datastage jobs and improved the ETL scheduling process.
  • Worked on Database Stages like ODBC Connector stages for SQL Server, DB2 EE, Aggregator, Transformer, Merge, Sort for developing jobs
  • Developed advanced queries and stored procedures on MS SQL Server database and DB2 databases.
  • Administered all the Database duties in the development and test environment for performing the Database changes for Maintenance releases and Defect fixes.
  • Extensively worked on IBM Information Server to import/export packages and to manage table data definitions.
  • Created jobs with Shared Container to simplify the Datastage design, and used it as a common job component throughout the project.
  • Maintaining system interfaces, extracting, translating and loading data between internal applications and externally to the Claim and benefit information.
  • Used Serena version control and managing the different versions of the Database scripts, Shell Scripts, parameter files and Datastage packages.
  • Used ASG Zena to automate scheduling of UNIX shell script jobs on a daily, weekly and monthly basis with proper dependencies and tasks.
  • Worked on unit, system and integration testing. Developed test scripts, test plans and test data. Participated in UAT (User Acceptance Testing).
  • Developed advanced stored procedures with SQL Server web clusters to populate the data from multiple databases.
  • Provided level 2 production support for the stored procedures and ETL related process involved as part of STAR Project and others.
  • Defined production support document with recovery process.
  • Involved in the preparation of ETL documentation by following the business rule, procedures and naming conventions. Also, created major development, deployment and support documentation.

Environment: IBM InfoSphere DataStage 8.5/8.1, Information Server 8.5, Secure CRT 6.7, MS SQL Server 2008/2005, IBM Db2 9.7, ASG Zena, Serena Dimensions, Ultra Compare, MS Office suite 2010

Confidential, IL

Sr. Application Developer

Responsibilities:

  • Actively involved in the PaymentNet4 project database migration, ETL reporting, implementation and production support.
  • Developed the design documentation needed for developing jobs based on the business requirements and existing database models for migrating the database.
  • Worked mainly on importing metadata from the MS SQL Server database and migrating it to Oracle 10g, implementing various complex business logic to satisfy customer requirements.
  • Participated in JAD sessions, requirements gathering, design meetings and code reviews.
  • Mapped data items from source systems to the target system. Tuned DataStage jobs for better performance by creating DataStage hashed files for staging the data and lookups.
  • Involved in creating repositories for the metadata using IBM information server on AIX.
  • Also worked on ETL reporting using IBM WebSphere DataStage to generate output files in text formats with different kinds of delimiters and constraints per business requirements.
  • Configured the Oracle Enterprise stage from the DataStage server to connect to the source and Target Oracle databases.
  • Extracting, cleansing, transforming, integrating and loading data into data warehouse using Data stage Designer
  • Worked on Teradata as a referential database to load into the Oracle data warehouse for the scheduling process.
  • Created shared containers to use in multiple jobs and generic jobs in sequencers.
  • Extensively worked on IBM information Server to synchronize, integrate and restructure information for consistency.
  • Worked with the production support team to debug and modify the jobs related to audit report tables and shadow tables.
  • Worked extensively with IBM Rational ClearCase, Rational ClearQuest and HP Quality Center as part of the waterfall methodology model.
  • Worked with DataStage director to schedule, monitor and analyze performance of individual stages and run DataStage jobs.
  • Worked on IBM Infosphere Datastage 8.5 in the latter stages of the project and maintaining the application.
  • Designed and developed PL/SQL procedures, functions and packages to create custom tables.
  • Worked on audit report tables and shadow tables to generate ad hoc output files based on customer needs.
  • Developed job sequences to run jobs in parallel and in series taking into consideration the interdependencies and business requirements.
  • Extensively used Control-M to automate scheduling of UNIX shell script jobs on a daily, weekly and monthly basis with proper dependencies
  • Wrote shell scripts in UNIX to invoke DataStage jobs and to schedule them.
  • Unit tested DataStage jobs in development, including creating the appropriate test data.
  • Proven track record in troubleshooting of DataStage jobs and addressing production issues like performance tuning and enhancement.
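Shell scripts that invoke DataStage jobs, as described in this role, typically wrap the `dsjob` command-line interface. A minimal sketch, with hypothetical project/job names and the runner command made overridable so the sketch does not assume a live DataStage installation:

```shell
#!/bin/sh
# Sketch of a wrapper that runs a DataStage job via the dsjob CLI and maps
# its exit status to success/failure. Per IBM's dsjob documentation, with
# -jobstatus the exit code reflects the final job status (1 = finished OK,
# 2 = finished with warnings). DSJOB_CMD defaults to the real dsjob binary
# and is overridable for testing; project/job names are hypothetical.
run_ds_job() {
    project=$1
    job=$2
    rc=0
    ${DSJOB_CMD:-dsjob} -run -jobstatus "$project" "$job" || rc=$?
    # treat 0/1/2 as successful completion, anything else as a failure
    if [ "$rc" -le 2 ]; then
        echo "Job $job completed (status $rc)"
        return 0
    fi
    echo "Job $job failed (status $rc)" >&2
    return 1
}

# Demo with a stand-in for dsjob (the real binary is not assumed here):
fake_dsjob_ok() { return 1; }    # mimics "finished OK"
DSJOB_CMD=fake_dsjob_ok
run_ds_job DEV_PROJECT daily_load_job
```

A scheduler such as Control-M or Autosys would then call this wrapper and act on its exit code.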

Environment: IBM Infosphere DataStage 8.5/8.1, Putty, Oracle 10g, MS SQL Server 2005, Teradata V2R6/V2R5, SQL Developer, Quality Center, UNIX, Rational Clear Quest, Rational Clear Case, Erwin 4.0,Control-M

Confidential, MN

Sr. ETL Developer

Responsibilities:

  • Involved in designing, developing and testing of the strategy to populate the banking data from various source systems (Flat files, DB2, COBOL copybooks and Oracle) feeds using Datastage, PL/SQL and Unix Shell scripts
  • Extracting, cleansing, transforming, integrating and loading data into data warehouse using Data stage Designer
  • Improved the Data consistency by using IBM Information Analyzer.
  • Imported various application sources (database tables, flat files, COBOL files, XML files, etc.)
  • Worked on Configuring the IBM Information Analyzer, importing the metadata, integrating tables and analyzing the raw metadata.
  • Worked with Metadata Definitions, Import and Export of Datastage jobs using Data stage Manager
  • Involved in transforming various operational sources like Oracle, Teradata, DB2, SQL Server 2000, Flat files into a staging area.
  • Analyzed the quality of the data to determine its consistency and reliability, and documented detailed findings using the IBM Information Analyzer GUI.
  • Developed the jobs on Datastage to extract the data from flat files and DB2 tables
  • Transformed inconsistent records into consistent formats to improve performance.
  • Extensively worked on database stages such as Oracle, ODBC, DB2 EE and DB2 API.
  • Designed and developed server and parallel jobs using DataStage Designer to migrate data from flat files, DB2 and Oracle into the data warehouse
  • Worked in a multi-data-warehouse environment with repositories residing on UDB and DB2 RDBMSs running on AIX and z/OS operating systems.
  • Configured the Oracle Enterprise stage from the Data Stage server to connect to the source and Target Oracle databases.
  • Used stages like Transformer, sequential, Oracle, Aggregator, Data Set, File Set, CFF, Remove Duplicates, Sort, Join, Merge, Lookup, Funnel, Copy, Modify, Filter, Change Data Capture, Change Apply, Head, Tail, Sample, Surrogate Key, External Source, Target, Compare.
  • Used SQL Loader to load the data into Oracle tables.
  • Created UNIX shell scripts using K-shell for calling the stored procedures for truncating the tables and calling the SQL Loader for loading the data into the target staging area and loading the stats into the stats tables.
  • Involved in the testing of the various jobs developed and maintaining the test log.
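The K-shell/SQL*Loader pattern described in this role (truncate via stored procedure, then load the staging area with `sqlldr`) can be sketched as below. The table, column and file names are hypothetical, and the `sqlldr` invocation is skipped when the binary is not on PATH:

```shell
#!/bin/sh
# Sketch of a load wrapper: generate a SQL*Loader control file for a staging
# table, then invoke sqlldr if it is available. STG_ACCOUNTS and its columns
# are hypothetical stand-ins for a real staging table.
write_control_file() {
    cat > stage_load.ctl <<'EOF'
LOAD DATA
INFILE 'stage_feed.dat'
TRUNCATE
INTO TABLE STG_ACCOUNTS
FIELDS TERMINATED BY '|'
(ACCOUNT_ID, ACCOUNT_NAME, EFFECTIVE_DT DATE 'YYYYMMDD')
EOF
}

write_control_file
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid="$ORA_CONNECT" control=stage_load.ctl log=stage_load.log
else
    echo "sqlldr not on PATH; control file written to stage_load.ctl"
fi
```

In the process described above, a companion step would call the truncate stored procedure before the load and update the stats tables afterwards.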

Environment: IBM WebSphere DataStage 8.0, Ascential DataStage 7.5.2 (Parallel Extender), IBM Information Analyzer, IBM Information Server 8.0, UNIX, COBOL, IBM AIX 5.1, Oracle 10g, SQL, PL/SQL, SQL Server 2005, Erwin 4.0, DB2 9.1, Sun Solaris, Toad 9.1

Confidential, CT

Sr. Data warehouse Developer

Responsibilities:

  • Involved in requirement gathering, analysis and study of existing systems.
  • Developed jobs in Parallel Extender using different stages like Transformer, Aggregator, lookup, Source dataset, external filter, Row generator, Column Generator, Peek Stages
  • Used Data stage Administrator to assign privileges to user groups.
  • Developed UNIX shell scripts for FTP source file maintenance
  • Wrote backend stored procedures, functions and triggers in SQL and PL/SQL.
  • Extensively used Datastage Designer for Developing Server jobs, Parallel jobs and performed complex mappings based on user specifications.
  • Used Datastage Designer for Extracting Data, Performing Data Transformations and Aggregate Data.
  • Maintaining system interfaces, extracting, translating and loading data between internal applications and externally to the shareholding and banking industry.
  • Used Quality Stage Standardize Stage Wizard and Quality Stage Match Stage Wizard for standardization and match capabilities. Also worked on other modules like Investigating.
  • Excellent experience working in DB2, Teradata and SQL Server databases.
  • Improved performance by using the Modify stage before the transformation process.
  • Used Datastage Manager to store, manage reusable Metadata and created custom routines and transforms for the jobs.
  • Developed various SQL scripts using Teradata SQL Assistant and used some of them in Teradata Stages.
  • Experienced in transforming complex data files binding to the business requirements.
  • Worked extensively on all the Server stages such as Sequential, Aggregator, Hashed Files, Sort, Link Partitioner, Link Collector and ODBC.
  • Distributed load among different processors by implementing the partitioning of data in parallel extender.
  • Used Partition methods and collecting methods for implementing parallel processing.
  • Extracted data from Oracle and DB2 databases, transformed the data and loaded it into the Oracle data warehouse.
  • Used lookup stage with reference to Oracle table for insert/update strategy and for updating slowly changing dimensions
  • Created DataStage jobs and job sequences and tuned them for better performance.
  • Worked on Automating and scheduling tasks and report distribution
  • Developed UNIX shell scripts to automate the Data Load processes to the target Data warehouse.
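The source-file maintenance scripts mentioned in this role usually follow a simple pattern: archive processed files after a load and purge old archives. A minimal sketch, where the directory names and the 30-day retention window are hypothetical:

```shell
#!/bin/sh
# Sketch of post-load source-file maintenance: move processed files from the
# inbox to an archive directory, then purge archives older than a retention
# window. Directory names and the 30-day retention are hypothetical.
archive_processed() {
    inbox=$1
    archive=$2
    mkdir -p "$archive"
    for f in "$inbox"/*.dat; do
        [ -e "$f" ] || continue      # glob matched nothing
        mv "$f" "$archive/"
    done
    find "$archive" -type f -mtime +30 -exec rm -f {} +
}

mkdir -p inbox archive
printf 'record\n' > inbox/feed_a.dat
archive_processed inbox archive
ls archive
```

A scheduler would run this after each successful data-load cycle.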

Environment: IBM WebSphere DataStage 7.5.1, Parallel Extender, IBM WebSphere QualityStage 7.5, DB2 UDB 8.0, Oracle 8i/9i, PL/SQL, Teradata, Business Objects XI R2, OLAP, Erwin 5.5, SQL*Plus, SQL*Loader, TOAD, Sun Solaris 2.8.

Confidential, CT

Data warehouse Consultant

Responsibilities:

  • Extensively involved in determining the data needed to address business users' analytical requirements and designing data models to support these analyses.
  • Used DataStage Manager to define Table definitions, Custom Routines and Custom Transformations and Export and import the Datastage jobs
  • Designing and developing web user interfaces in Visual Basic.NET.
  • Extensively worked with Parallel Extender for parallel processing to improve job performance while working with bulk data sources
  • Worked extensively on different types of stages like Sequential file, ODBC, Hashed File, Aggregator, ORABulk, BCPLOAD, Transformer, Merge, Sort for developing jobs
  • Extensively used Link Partitioner and Link collector in improving the performance.
  • Extensively worked on the source data from Flat file source systems, Oracle source systems, and DB2 source systems.
  • Designed jobs using the stages like Sequential file, Oracle enterprise, Teradata Enterprise, Transformer, Lookup, Join and Modify.
  • Developed the data warehouse repository using DataStage Manager by importing the source and target database schemas
  • Extensively used Teradata Load and Unload utilities such as Multi Load, Fast Export and Bulk Load stages in Jobs for loading/extracting huge data volumes
  • Used the DataStage Director extensively to run, schedule, monitor, debug, Trouble shoot and test the application on development, and to obtain the performance statistics
  • Involved in performance tuning of the ETL process and performed the data warehouse testing
  • Designed the complete Decision Support System using Business objects by creating different types of reports for trend analysis using filters, conditions and calculations
  • Worked extensively on the Siebel CRM application to integrate telesales, account and sales management by optimizing information shared by multiple employees and streamlining existing processes such as retailing.
  • Used Unix shell scripts to run Datastage jobs and load data in to Database
  • Extensively worked with QualityStage to standardize data by trimming spaces, removing non-printing characters and correcting misspellings, using the Transfer, Parse, Collapse, Select, Unijoin and Build stages.
  • Worked extensively on testing the modules and rectifying missing links in the different modules.
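As a lightweight stand-in for the kind of standardization described above (QualityStage itself is a GUI-driven ETL tool), the trim-and-cleanse step can be sketched with standard UNIX text tools:

```shell
#!/bin/sh
# Sketch of field standardization: drop non-printing characters, collapse
# repeated blanks, and trim leading/trailing spaces. This approximates only
# the cleansing step, not QualityStage's matching or survivorship logic.
standardize() {
    printf '%s' "$1" \
        | tr -cd '[:print:]' \
        | sed -e 's/  */ /g' -e 's/^ //' -e 's/ $//'
}

standardize "  123   MAIN  ST  "
echo
```

Note that `tr -cd '[:print:]'` deletes tabs and control characters outright rather than converting them to spaces.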

Environment: IBM Web Sphere Datastage 7.0, Erwin 4.0, Oracle 9i, PL/SQL, OLAP, MS SQL-Server 2000, SQL*Loader, Oracle Siebel CRM 7.7, Teradata, Sun Solaris 8

Confidential, PA

ETL Developer

Responsibilities:

  • Actively involved in gathering requirements from the end users.
  • Developed several Parallel jobs to improve performance by reducing the runtime for several jobs.
  • Designed, developed and tested the Data stage jobs using Designer and director based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Analyzed COBOL copybooks for the source dictionary.
  • The data model was a star schema; extensively used Erwin for data modeling.
  • Expert in dimensional modeling, Star Schema modeling, Snowflake modeling, fact and dimension table design, physical and logical Data modeling using Erwin.
  • Extensively worked with Datastage Parallel Extender for Parallel Processing to improve job performance while working with bulk data sources.
  • Implemented round robin, Sort & Merge, Hash partitions for parallel processing.
  • Edited Configuration file for configuring multi node for data stage Parallel edition implementation.
  • Used stages like Transformer, sequential, Oracle, Aggregator, Remove Duplicates, Sort, Join, Merge, Lookup, Data Set, Funnel, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key.
  • Scheduled jobs and viewed and edited logs using Director; also monitored the results in Director.
  • Did performance tuning at source, transformations, target and administrator.
  • Used command line prompts and UNIX scripts for automation.
  • Implemented data stage for integrating real time middleware messages with XML files
  • Involved in creating different projects using administrator.
  • Experienced in writing utilities (which call jobs from the Transformer stage) and custom programs (used for complex logic and called from the Transformer).
  • Imported/exported DataStage projects and took backups.
  • Wrote user-defined SQL queries and ran stored procedures in the after-SQL of source and target stages.
  • Wrote PL/SQL stored procedures using Toad.
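To illustrate the hash partitioning mentioned in this role: a hash partitioner routes every record with the same key to the same node, so key-based joins and aggregations can run in parallel. In this sketch, key modulo node count stands in for DataStage's real hash function, and the numeric keys are hypothetical:

```shell
#!/bin/sh
# Sketch of hash-partition routing: records sharing a key always land on the
# same node. Key % node-count stands in for the real hash function, so this
# works only for numeric keys.
hash_partition() {
    nodes=$1
    awk -F'|' -v n="$nodes" '{ print "node " ($1 % n) ": " $0 }'
}

printf '101|alice\n102|bob\n101|carol\n' | hash_partition 2
```

Round-robin partitioning, by contrast, balances record counts evenly across nodes but gives no key locality.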

Environment: Ascential Datastage 7.0/PX/EE, TOAD 7.1, IIS Web Server, COBOL, SQL, PL/SQL, Oracle 9i, ERwin, MQ Series, Business Objects

Confidential

Sr. Software Engineer

Responsibilities:

  • Planned the application schema, designed database objects like Tables, Indexes, views, sequences and integrity constraints using technical documents.
  • Worked closely with system administration during software installation and upgrades
  • Involved in analysis, design, coding, testing, data conversion and implementation
  • Used complex SQL queries to query the database
  • Used bulk collect, collection objects and ref cursor
  • Used hints for query optimization
  • Generated summary and detailed reports
  • Rebuilt indexes for better performance
  • Resolved performance bottlenecks and helped tune the application
  • Pinned the frequently used packages and procedures in the memory

Environment: Oracle 8i, Reports 2.x/3.0, PL/SQL, SQL, MS Office, Windows 95/98
