Looking for a position as BODS Analyst / Consultant or ETL Analyst / Developer in a SAP BODS environment where I can use my past and existing skills in ETL tools, software development and support, Unix shell scripting, SQL, data warehousing, data integration / data quality, and business intelligence.
Industries: | Information Technology, Financial, Insurance, eCommerce, Data warehouse, Consumer |
Application Package: | Lotus Notes, Scheduler, Zena, Syncsort, Netron Fusion, ClearCase, Endevor, Infoman, Hummingbird Suite, Microsoft Office, Microsoft Project, Microsoft IIS, Visio |
Databases / Tools: | DB2, UDB, Netezza, VSAM, Adabas, Oracle, SQL Server, MS Access, PostgreSQL (open source) |
Hardware: | Mainframes: IBM System-4331, System-3090, System-390; Servers: SUN Enterprise 3500, 450 and workstations, SPARC 2000, 1000, 20, 10, SUN Ultra 2, HP3000, NCR Tower, IBM RS-6000, ICL. OS used: DOS/VSE, MVS/TSO, AIX, Solaris 2.5.1, 2.6, 2.7, HP-UX v10.2, AT&T System V 7.2.4 |
Operating Systems: | UNIX, AIX, Linux, Windows, Mainframe DOS/VSE, MVS |
Programming Languages: | COBOL, MicroFocus COBOL, JCL, SQL, NATURAL, Visual Basic, Shell Scripts |
Business Intelligence Tools / ETL / Data Cleansing: | Business Objects XI, SAP BODS, Informatica PowerCenter, SAS Enterprise Guide, Harte-Hanks Trillium Software and TS Discovery |
Confidential |
SAP BODS Consultant, 3 years full time
- Worked extensively with the data profiling tool SAP Information Steward 4.0 on an ongoing project to understand the quality of the data before loading it into the database.
- Data quality management and data profiling.
- Worked extensively with business analysts on SAP Information Steward.
- Had extensive experience in troubleshooting and resolving defects, and also in loading the data in the DEV, QA and PROD environments.
- Involved in the complete life cycle of a data migration from DEV to PROD.
- Created data flows for dimension and fact tables and loaded the data into targets in the data warehouse.
- Created Data Integrator mappings to load the data warehouse; the mappings involved extensive use of simple and complex transforms such as Key Generator, SQL Query transform, Table Comparison, Case, Validation, Merge and Lookup in data flows.
- Involved in data profiling, including source column data profiling and detailed data profiling, and used the Validation and address cleansing transforms via Data Quality in SAP BODS 3.2 and 4.1.
- Involved in meetings with the process team in order to implement business rules and to scope the objects.
- Worked extensively with DB2 Control Center in order to do the backend testing.
- Had extensive involvement in meetings as part of defect resolution, and also in live troubleshooting with the data team lead, SMEs and the rest of the team.
- Had extensive team meetings with business people in order to assimilate the business requirements, deadlines and requirement estimates.
- All jobs included extensive work on extracting the data from different sources; built extract jobs in order to extract the data into staging.
- Worked extensively with workflows, data flows, try/catch blocks, lookups and also validations.
- Substantial development experience in creating Stored Procedures, PL/SQL, SQL and Query optimization.
- Tuned Data Integrator mappings and transformations for better job performance in different ways, such as indexing the source tables and using the Data Transfer transform.
- Scheduled jobs using the web administration tool.
- Provided production support and fixed issues raised by jobs running in production.
- Environment: SAP BusinessObjects Data Services (BODS) XI 4.0, Information Steward, Unix, DB2, MS Access, Oracle, Windows 2000
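As an illustrative sketch of the kind of backend testing described above (all file names and counts here are hypothetical, not from the actual project; in a real setup the counts would come from `db2 "SELECT COUNT(*) ..."` rather than flat files):

```shell
#!/bin/sh
# Hypothetical row-count reconciliation check of the kind run after a
# BODS load, before signing off a DEV/QA/PROD promotion. The counts are
# read from flat files so the sketch is self-contained.

# reconcile <source_count_file> <target_count_file>
# Prints OK when the counts match, MISMATCH otherwise; exit status follows.
reconcile() {
    src_count=$(cat "$1")
    tgt_count=$(cat "$2")
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "OK: source=$src_count target=$tgt_count"
        return 0
    else
        echo "MISMATCH: source=$src_count target=$tgt_count"
        return 1
    fi
}
```

Usage: `echo 1000 > src.cnt; echo 1000 > tgt.cnt; reconcile src.cnt tgt.cnt`.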
|
Confidential |
SAP BODS Developer, 6 years full time - Aviva Canada Inc. is one of the leading property and casualty insurance groups in Canada, providing home, automobile, recreational vehicle, group and business insurance to more than three million customers. The group has more than 3,000 employees, 30 locations and 1,700 independent broker partners, and is a wholly-owned subsidiary of UK-based Aviva plc, the world's sixth-largest insurance group.
- In my role as System Analyst I gained experience in business intelligence tools such as Business Objects, SAP Data Integrator (BODS), Informatica PowerCenter, Sybase PowerDesigner, SAS Enterprise Guide, mainframe, UNIX, SQL and DB2.
|
5 years of experience in analysis, design and development of various data warehousing projects using SAP BusinessObjects Data Services 4.0/XI 3.2/3.1, Data Integrator XI R2/11.7/11.5, and Business Objects XI 3.1/XI R2/6.5/6.0/5.1.
- Production support of monthly ETL jobs and troubleshooting.
- Promoted mapping changes into the production environment.
- Maintained existing mappings to take advantage of new features that improve performance.
- Implemented 5 ETL projects from design to go-live using the SAP BODS/BODI tool, and was involved in all phases of the SDLC process.
- Wrote SQL scripts to support mappings in the SQL transform.
- Analyzed source data in various data marts feeding certain internal and external clients, viz. PCFi and IMS.
- Interacted with business analysts and architects to understand the business requirements.
- Designed complex SQL queries to extract data to be fed as reports to business analysts.
- Designed day-to-day reports in Business Objects XI and supported fine-tuning of existing reports.
- Attended 4 days of vendor training in Trillium Software.
- Experience in data cleansing using the Trillium Software tool, working on a 6-month project related to a marketing campaign. Performed installation of postal code directories on a Unix server, plus support-related activities such as creating users, assigning roles to users, controlling user privileges, killing idle processes on the server, and moving the data cleansing process tar file from the dev environment to the server.
- Analyzed broker portal data and participated in the hierarchy maintenance system and the automation of the underwriting rules system.
- Designed and implemented ETL process of loading data for automation of underwriting rules system.
- Participated in DB2 physical and logical database design and implementation, in addition to systems architecture design.
- Wrote Unix shell scripts to support the existing data load processes by analyzing the flow of data and the process, which helped significantly in automating the process.
- Involved primarily in analyzing the MINDS data mart; extracted claims and premium related data for PCFi by writing SQL scripts and Unix scripts to validate and automate, and designed a scheduler to schedule the process in Zena.
- Performed testing using Mercury Quality Center.
- Experience in writing technical and functional specifications for various ETL processes designed in Data Integrator or Informatica.
- Performed pre-migration and post-migration testing of processes when servers were physically moved to vendors or other locations.
- Designed the ETL process for the HST project and assisted in data modeling using the Sybase PowerDesigner tool, which is similar to Erwin.
- Designed data load jobs in SAP BODS.
- Designed job scheduling in the ASG-Zena tool for scheduling and automation.
- Provided primary and on-call production support on a rotational basis, monitoring file feeds, job triggers, process dependencies, etc., and providing resolution or escalation.
- Analyzed the source data and designed the data warehouse and ETL processes.
- Converted and migrated ETL functionality from PostgreSQL to DB2 UDB EEE databases.
|
Environment: SAP Data Integrator XI 3.2 and 4.1 (BODS), SQL, IBM DB2 8.0, Sybase PowerDesigner, Unix, AIX, Windows 2000 |
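The file-feed monitoring around scheduled jobs mentioned above can be sketched as a small shell guard (a hypothetical example; the feed name and retry count are illustrative, not from the Aviva system):

```shell
#!/bin/sh
# Hypothetical file-feed guard of the kind wrapped around scheduled
# data-load jobs: wait for a trigger file before launching the load,
# and fail fast (for escalation) if the feed never arrives.

# wait_for_feed <trigger_file> <retries>
wait_for_feed() {
    feed="$1"; retries="$2"
    i=0
    while [ "$i" -lt "$retries" ]; do
        if [ -f "$feed" ]; then
            echo "feed present: $feed"
            return 0
        fi
        i=$((i + 1))
        sleep 1
    done
    echo "feed missing after $retries checks: $feed" >&2
    return 1
}
```

A scheduler such as ASG-Zena would call this guard as the job's precondition step and escalate on a non-zero exit status.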
ETL Developer / Consultant |
Worked on a migration project for an F100 investment bank, NY, a co-sourcing project of Keane Canada and the F100 investment bank. The application system, named 'WIRE', had been designed using Natural/Adabas/DB2 on z/OS; was involved in the migration and conversion of all the components of the application system to Linux. |
- Converted JCLs to Linux shell scripts using Korn shell.
- Wrote 65 shell scripts to run the application system, which previously ran on mainframe/JCL.
- Accomplished unit testing for all shell scripts.
- Used Informatica as the ETL tool to load data into DB2 tables.
- Successfully ran the scripts for functional testing.
- Debugged the Natural programs migrated from mainframe to Linux.
- Converted data files from binary to ASCII format using the 'fileport' tool.
- Reported bugs and errors to the vendor and the client.
|
Environment: | SAG Natural/Adabas, DB2/UDB, JCL, Linux, IBM z/OS |
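A JCL-to-Korn-shell conversion of the kind done on this migration can be sketched as follows (the step name, datasets and sort fields are hypothetical, not from the WIRE system): a mainframe SORT step becomes a Unix sort(1) call, with the JCL condition-code check mapped to an exit-status check.

```shell
#!/bin/sh
# Hypothetical conversion of a JCL sort step:
#
#   //SORT01  EXEC PGM=SORT
#   //SORTIN  DD DSN=WIRE.TRANS.IN,...
#   //SORTOUT DD DSN=WIRE.TRANS.OUT,...
#   //SYSIN   DD *
#     SORT FIELDS=(1,8,CH,A)

run_sort_step() {
    infile="$1"; outfile="$2"
    # SORT FIELDS=(1,8,CH,A): ascending character sort on columns 1-8
    sort -k 1.1,1.8 "$infile" > "$outfile"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "SORT01 abended, rc=$rc" >&2
        return "$rc"
    fi
    echo "SORT01 ended, rc=0"
}
```

The exit-status check plays the role of the JCL COND test, so downstream steps can be skipped on failure just as on the mainframe.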
Confidential |
Analyst/ Programmer |
- Worked on migration and day-to-day development of a system for a client of Moore. Worked in a VSE environment and with UNIX / MicroFocus Express Server on conversion and development projects.
|
- Converted Assembler programs to MicroFocus COBOL under Unix.
- Converted other VSE COBOL batch programs to MicroFocus COBOL / Unix.
- Wrote new shell scripts for maintaining the production environment; performed development and maintenance of TD Bank's application system.
- Wrote programs in shell script to replace JCL.
|
Environment: | IBM mainframe System VSE, MicroFocus COBOL/COBOL-II, JCL, QMF, Lotus Notes, COBOL Express Server, Hummingbird suite, Shell Scripting and C |
Confidential insurance |
Analyst Programmer |
- Worked as a team member of the Claims System Maintenance team, involved in the design of a new Oracle database.
- Worked on the development of the check reversal system from scratch and the enhancement of life pension applications.
- Performed data analysis for the Marketing Group; compiled marketing information from the Oracle data warehouse for marketing clients using the SCIF Oracle Marketing Information Warehouse. Tools used included FastLoad, FastExport, MultiLoad, BTEQ and customized UNIX scripts.
- Took full responsibility for the research, design, testing and implementation of the insurance marketing system. This involved gathering requirements, creating functional specifications, transforming these into detailed specifications, and finally modeling a database and developing a complex, interactive system to allow over 200 users to submit requests and have those requests assigned and completed.
- Participated in the requirements analysis and design for several enhancements in the claims management system and the call center sub-system.
- Worked on projects for the Claims sub-system; developed COBOL programs and participated in unit testing, system testing and implementation.
- Developed and re-engineered modules in the compensation claims systems using COBOL and embedded SQL, and performed unit testing, system testing and regression testing.
- Maintained ongoing systems designed in MicroFocus COBOL / ACUCOBOL using embedded SQL.
- Designed an automation system to let the call center directly open new claims on a remote Unix server using an Oracle database.
- Completed a conversion assignment of application systems from COBOL-370 to MicroFocus COBOL with embedded SQL, including all programs (400) and data files (150) in a mainframe environment.
- Designed and implemented a check-printing system converted from COBOL into Oracle 8 at the San Francisco head office, and trained all users in all district offices across the state.
|
Environment: | IBM MVS System, MicroFocus COBOL, Oracle 8, COBOL-II, DB2/UDB, Infoman, FileAid, QMF, SPUFI, Xpediter, Syncsort, Netron, ClearCase, COBOL Express Server |