
Integration Architect/ETL Developer Resume


Arden Hills, Minnesota

TECHNICAL SKILLS:

Operating Systems: Sun Solaris, SCO UNIX, OS/390, OS/400, IBM, OPEN VMS, Windows NT & OS/2

Languages & Packages: SQL*Plus, PL/SQL, T-SQL, JCL, SPUFI, QMF, RPG400, CL400, COBOL400, CoSort, Erwin, Unix Shell Scripting

RDBMS: Oracle, Teradata & DB2 - UDB

ETL Tools and Utilities: SSIS, Informatica PowerCenter 7.1.1/8.x, DMExpress (SyncSort)

WORK EXPERIENCE:

Integration Architect/ETL Developer

Confidential, Arden Hills, Minnesota

Responsibilities:

  • Designing Integration solutions in accordance with architectural standards.
  • Creating Informatica mapping for system Integration with Web Services.
  • Creating Informatica mapping using XML sources and XML Targets.
  • Tuning existing Informatica mappings, reducing run time from 3 hours to 10 minutes via pipeline partitioning, pushdown optimization, and dynamic session partitioning
  • Mentoring junior ETL developers in standards and best practices and leading code reviews.

Environment: Informatica 9.5.1, Informatica ILM, DB2, Oracle, SoapUI, Web Services, XML

Solution Architect/ETL Developer

Confidential, Bloomington, Minnesota

Responsibilities:

  • Provide end to end solution and design details
  • Performed overall project management, analysis, architecture, design, development and implementation of data warehousing / business intelligence initiatives
  • Developing a Customer Rules Engine designed to keep track of unique visitors, website referral information, and their purchases.
  • Creating Microsoft SSIS Extract, Transform, and Load (ETL) packages and PL/SQL to load data from multiple source systems into a single Data Warehouse Analytics Repository, from which all reports and dashboards collect data.
  • Interfacing with 3rd party web statistics generators to pull in valuable customer and supplier data.

Environment: SSIS 2012, Informatica, PL/SQL, Oracle 11g, Webtrends, Spotfire, SAS

Data Warehouse Lead ETL Developer/Solution Architect

Confidential, Bloomington, Minnesota

Responsibilities:

  • Designed and architected the migration of Informatica mappings from an Oracle to a SQL Server database
  • Built new and modified existing Informatica mappings, converting scripts from PL/SQL to T-SQL
  • Develop SQL Server Integration Services (SSIS) Packages.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Re-design Informatica Workflow Scheduling for efficiency
  • Informatica Workflow Testing and Remediation
  • Production On Call Support
  • Managed onsite and offshore developers
  • Developed long-term and short-term objectives for the integration program
  • Developed best-practice guiding principles for integration projects
  • Provided an inventory of all integration tools and the role of each tool across the enterprise
  • Created architectural solutions that articulate the business context, conceptual design, and component-level logical design

Environment: Informatica PowerCenter 8.0.1, Shell Scripting, Oracle 10g, ERWIN 6, CosBatch Scheduler, UNIX -AIX

Senior ETL Developer

Confidential, Minnetonka, MN

Responsibilities:
  • Data Analysis on the source data and providing data pattern inputs to the data modeling team. Created Design documents for Informatica mapping and low level ETL design specification with use cases and Visio diagrams.
  • Create DDL and Database objects in Development database using data model prepared by Data Modeler in ERWIN. Developed and maintained ETL(Data Extraction, Transformation and Loading) mappings using Informatica to extract data from Multiple source systems.
  • Worked as a production Support Developer.
  • Designed and developed Informatica mappings to load VLDBs, built for high-performance ETL loads. Conducted code reviews for all junior developers against the business requirements and validated their unit test results to ensure the code was ready for migration to the SIT environment.
  • Used a delta load strategy to process, transform, and load incremental claims data, consisting of claims made, claims settlements, and the GL transactions with medical providers.
  • Created wrapper scripts using UNIX shell scripting for pre process validation and starting the Informatica Workflow load process.
  • Providing QA support by analyzing and fixing defects with quick turnaround time.
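The pre-process validation and workflow-start wrapper pattern described above can be sketched roughly as follows. The file names, row-count check, and the commented-out pmcmd arguments are illustrative, not the actual production script:

```shell
#!/bin/sh
# Hypothetical pre-process validation wrapper (names and threshold invented).
# Checks that a feed file exists and is non-empty before the workflow start.

validate_feed() {
    feed="$1"
    # The feed must exist and be readable.
    [ -r "$feed" ] || { echo "ERROR: $feed missing or unreadable" >&2; return 1; }
    # The feed must contain at least one data row.
    rows=$(wc -l < "$feed")
    [ "$rows" -ge 1 ] || { echo "ERROR: $feed is empty" >&2; return 2; }
    echo "Validation passed: $rows rows in $feed"
    return 0
}

# Demo with a temporary feed file.
printf 'row1\nrow2\n' > /tmp/claims_feed.dat
if validate_feed /tmp/claims_feed.dat; then
    # The real wrapper would start the Informatica workflow here, e.g.:
    # pmcmd startworkflow -sv IntSvc -d Domain -f ClaimsFolder wf_claims_load
    echo "Starting workflow (placeholder)"
fi
```

Failing fast in the wrapper keeps a bad or missing feed from ever reaching the workflow, so the load either runs against validated input or not at all.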

Data Warehouse ETL Lead Developer

Confidential, Eagan, MN

Responsibilities:

  • Helped create a temporary data store (TDS) for the Medicare Analytics Reports.
  • Created COBOL, REXX, and JCL scripts to reformat, validate, and FTP files from the mainframe to the AIX machine.
  • Worked with the Kimball Group on the ETL architectural design.
  • Wrote shell scripts for initial and incremental loads of Member, Provider, Claims, and Revenue data, with delta detection and type 2 dimension processing using SyncSort and DB2 SQL PL.
  • Ensured all scripts followed BCBSMN naming/coding standards. Recommended Informatica for the ETL load and was involved in the evaluation of the tool.
  • Worked with Data Stewards to analyze business requirements for Medicare Analytics data store.
  • Coordinated with QA team to build and deploy baselines using Rational Clear Case.
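As a rough illustration of the delta-detection step in the incremental-load bullet above, standard sort and comm can stand in for SyncSort; the file names and the key|attribute record layout are invented for the sketch:

```shell
#!/bin/sh
# Conceptual delta-detection sketch using sort/comm in place of SyncSort.
# Records are key|attribute pairs; files and data are illustrative only.

# Yesterday's extract vs. today's extract, both sorted for comm.
printf 'M001|GOLD\nM002|SILVER\n' | sort > /tmp/members_prev.dat
printf 'M001|GOLD\nM002|PLATINUM\nM003|BRONZE\n' | sort > /tmp/members_curr.dat

# Lines present only in today's file: new members or changed attributes.
# These are the rows a type 2 dimension load would insert as new versions.
comm -13 /tmp/members_prev.dat /tmp/members_curr.dat > /tmp/members_delta.dat

cat /tmp/members_delta.dat
```

Here M002 surfaces because its tier changed and M003 because it is new, while the unchanged M001 drops out, which is exactly the row set a type 2 load needs.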

Senior ETL Developer/Analyst

Confidential, Eden Prairie, MN

Responsibilities:

  • Involved in analysis of sources, business requirements, and identification of business rules.
  • Responsible for developing pre-work for the Data Acquisition and Disposition Document (DADD) for each target.
  • Used Evoke Axio for profiling source data helping assessment and identifying the quality and consistency of the source data.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 7.1.1.
  • Responsible for converting DataStage code to Informatica.
  • Documented Pre-code, Technical Design and Post Code for Internal review of the code.
  • Extracted data from flat files, staged it in Oracle, and created load files in fixed-width format; DB2 LOAD scripts were used to insert/update the DB2 target database.
  • Created complex mappings and reusable transformations used across all subject areas.
  • Used CoSort scripting to sort and merge data, simplifying the source feed for ETL.
  • Analyzed data issues for mapping to existing database structures, OLAP processing, developing and applying business rules.
  • Developed Test Cases according to Business and Technical Requirements and prepared SQL scripts to test data.
  • Involved in managing objects moved to RCS and promoting final code to the production environment. Active member of the remote and onsite ETL support team for Confidential of California.
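The fixed-width load-file step above can be sketched with awk standing in for the sort utility; the column widths and sample records are invented for illustration:

```shell
#!/bin/sh
# Sketch: render a delimited staging extract as a fixed-width load file of the
# kind a DB2 LOAD script would consume. Layout and data are illustrative.

printf '1001|SMITH|MN\n1002|LEE|OH\n' > /tmp/claims_stage.dat

# 8-char id, 12-char name, 2-char state, each left-justified and space-padded,
# giving every record a fixed length of 22 characters.
awk -F'|' '{ printf "%-8s%-12s%-2s\n", $1, $2, $3 }' \
    /tmp/claims_stage.dat > /tmp/claims_fixed.dat

cat /tmp/claims_fixed.dat
```

Fixed record lengths let the load utility address each column by byte position rather than by delimiter, which is why the padding must be exact.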

Senior ETL Developer

Confidential, Dublin, OH

Responsibilities:

  • Involved in analysis of source systems, business requirements and identification of business rules.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center.
  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Extracted data from flat files and Oracle and loaded it into Teradata.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, and Stored Procedure.
  • Worked with Informatica PowerCenter 6.2/6.1 tools: Source Analyzer, Warehouse Designer, Mapping Designer and Mapplets, Transformations, Workflow Manager (Task Developer, Worklets, Workflow Designer), and Workflow Monitor.
  • Extensively used Informatica to load data from DB2, Flat Files to TeraData / Oracle.
  • Converted COBOL programs with embedded SQL to Informatica logic.
  • Created reusable transformations and mapplets and used them in mappings.
  • Used Informatica Power Center Work Flow Manager to create sessions, work flows and work-lets to run with the logic embedded in the mappings.
  • Fine tuned Transformations and mappings for better performance.
  • Involved in documenting the source-to-target design for the data warehouse dimensional upgrades.
  • Extensively used Informatica for loading the historical data from various tables for different departments
  • Created complex mappings and transformations.
  • Did Unit Testing and tuned for better performance.
  • Created various Documents such as Source-To-Target Data mapping Document, Backroom Services Document, Unit Test Cases and Data Migration Document.
  • Created dimensions and facts in the physical data model.

Senior ETL Developer

Confidential, Maple Grove, MN

Responsibilities:

  • Wrote COBOL programs using Micro Focus Net Express 3.1. Analyzed, designed, and coded complex programs, including high-level presentation reports controlling fonts and spacing with Xerox Dynamic Job Descriptor Entries (DJDE). Created "OMRGEN" form controls for the imaging system.
  • Developing Unit test plans.
  • Analyzing and documenting the current process flow.
  • Convert NetExpress programs from 16-bit code to 32-bit code.

Senior Consultant/AS400

Confidential, Minneapolis, MN

Responsibilities:

  • Created test cases for unit testing of converted and synchronized programs on AS/400.
  • Converted and migrated Wang COBOL data and source programs to AS/400 COBOL.
  • Ran parallel processing against the Wang and compared results to validate the conversion process.
  • Created AS400/COBOL programs for horizontal validation and data integrity checks.
  • Developed File Manager CL programs on AS/400 for file comparison and QA Validations.
  • Resolved defects by working with QA and Developer teams to ensure testing issues are resolved on basis of defect reports.
  • Performed validation testing of data on Wang Mainframe to ensure Integrity of files to be migrated to AS/400.
  • Performed data Integrity validation in converted data by creating Scripts using IBM AS/400 query utility.
  • Created data cleansing and validation processes using Micro Focus COBOL (Net Express).
  • Converted files from ASCII to EBCDIC using the Net Express built-in conversion utility.
  • Created DDS source files and create files on AS400.
  • Created scripts to FTP files from PC Network to IBM AS400 using Binary and ASCII formats.
  • Interfaced Wang /Cobol programs into AS/400.
  • Wrote Script for running difference code reports using UltraEdit for code Synchronization.
  • Worked with PVCS Version Management tool for checking program synchronization.
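The ASCII-to-EBCDIC conversion mentioned above can be approximated with dd's built-in translation tables standing in for the Net Express utility; file names are illustrative:

```shell
#!/bin/sh
# Sketch of ASCII <-> EBCDIC conversion using dd's translation tables, in
# place of the Net Express utility mentioned above. File names are invented.

printf 'HELLO AS400' > /tmp/feed_ascii.dat

# ASCII -> EBCDIC for the AS/400 side, then back again to verify round trip.
dd conv=ebcdic if=/tmp/feed_ascii.dat of=/tmp/feed_ebcdic.dat 2>/dev/null
dd conv=ascii  if=/tmp/feed_ebcdic.dat of=/tmp/feed_roundtrip.dat 2>/dev/null

cat /tmp/feed_roundtrip.dat
```

A round-trip check like this is a cheap integrity test: if the converted file translates back byte-for-byte, the code pages were applied consistently. Note this is also why such files must be FTPed in binary mode, as a text-mode transfer would re-translate the bytes.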

Senior Consultant

Confidential, Saint Paul, MN

Responsibilities:

  • Analyzed/modeled current information flow and recommended solutions.
  • Monitored data quality from internal and external data sources.
  • Interviewed business clients and documented project requirements.
  • Developed Informatica ETL standards document and templates.
  • Maintained existing mapping by resolving performance issues.
  • Facilitated the design of corporate database structures by assisting data modelers to produce data warehouse and data mart logical model
  • Implemented Informatica PowerMart transformations/mappings.
  • Performed loading and testing of the Oracle data warehouse using PowerMart and SQL.

Senior Consultant

Confidential, Saint Paul, MN

Responsibilities:

  • Performed data source analysis and assisted in developing business requirements.
  • Defined and documented all data definitions, data sources, and their business meaning and usage.
  • Implemented Informatica PowerMart transformations/mappings.
  • Re-engineered the COBOL, SAS programs and JCL.
  • Performed loading and testing of the Oracle data warehouse using PowerMart and SQL.
  • Facilitated the transformation of corporate data into useful information for business decisions.

Consultant

Confidential, MN

Responsibilities:

  • Analyzed the COBOL programs and the current process of reports generation.
  • Developed project specs.
  • Identified the different sources where the data could be extracted and ported to the data warehouse for the recreation of the reports.
  • Reverse-engineered the existing COBOL print programs and process to identify what business logic is done on the raw data. The work involved documenting the process and the programs.
  • Wrote COBOL programs, copybooks and JCL to extract the information to be loaded into the data warehouse.
  • Trained to use Informatica's PowerMart products for extract/transformation/loading of warehouse data.
  • Developed data requirements, source-to-target data mappings, transformation rules, and summarization strategies. Defined strategies for data acquisition, extraction, cleansing, and loading.
  • Defined and implemented data load validation processes, updated and maintained metadata, and ensured overall data integrity.
  • Performed loading and testing of the Oracle data warehouse using Informatica.
  • Created/modified/maintained Informatica ETL mappings that map source data from Mainframe system to Target Oracle database and data warehouse based on requirements using Informatica.

Consultant

Confidential, Plymouth, MN

Responsibilities:

  • Responsible for interface process for the acquisition of new business by defining user requirements, developing technical design and writing of detail program specification. This included FTP, and coding/testing JCL. Additional programs coded using EasyTrieve Plus.
  • Responsible for program conversions from COBOL II to EasyTrieve Plus. Coded new EasyTrieve programs to produce the initial reports, which had been previously coded in COBOL II. This included JCL and putting new programs into production.
  • Responsible for design and coding of cost management reconciliation system. Initially the system was semi-manual using spreadsheets and rapid file to reconcile. All programs coded in EasyTrieve Plus.
  • Responsible for restructuring the Travel System DB2 database, consisting of 30 tables, by coding DB2 COBOL programs that insert new rows and update existing rows.
  • Performed system tests and developed the user acceptance test plan.

Computer Instructor

Confidential

Responsibilities:

  • Taught programming concepts, which included records, data structures, fields, different types of files, structured programming, top-down approach, program preparation and debugging, and planning tools such as flow charts, pseudocode and hierarchy charts.
  • Taught Introduction to COBOL Programming which included COBOL divisions, basic COBOL operation, printing reports and conditional statements.
  • Taught Intermediate COBOL Programming that included data validation techniques, control break processing, and special features printing outputs.
  • Taught Advanced COBOL Programming which included advanced logical control features, array processing and table handling, file maintenance, additional COBOL options and programming project.

Programmer/Analyst

Confidential

Responsibilities:

  • Maintained three systems, including billing, payroll and general accounting, which were coded under COBOL programming language in VS operating system using WANG VS80 mainframe.
  • Specified, designed, and coded a productivity incentive pay system, in which each worker's monthly pay depended on an individual performance indicator; payments were made mid-month. Coded in COBOL.
  • Specified, designed, and coded a cash billing system that enabled operators to accept cash transactions (electrical bills) and process bills online; the system had previously run as a mainframe batch process.
  • Migrated from the old VS system to a new system in NCR server on UNIX system. Coded programs for data cleanup (purging programs), responsible for production of reports or data to be deleted for management approval, and coded sub-programs that accept mainframe VS data format and convert to PC UNIX format.

Programmer

Confidential

Responsibilities:

  • Responsible for maintaining a government payroll system coded entirely in COBOL under the TME operating system on an ME29 mainframe.
  • Responsible for specifying, designing, and coding a foreign debt management system in COBOL: a database of all foreign aid reimbursed to the organization for donor-aided projects, producing several reports on foreign-assistance budget performance.
