Project/Technical ETL Lead/BI Developer Resume
Sunnyvale
SUMMARY
- Technically sophisticated engineering professional with 13+ years of experience designing, developing, and implementing data warehousing solutions using IBM InfoSphere DataStage V11.7, Pentaho PDI, Teradata, SAP HANA, and Tableau.
- Broad knowledge and experience in analyzing and documenting the enterprise structure including logical organization of business strategies, metrics, business capabilities, processes, information resources, business systems and infrastructure.
- Success developing and delivering large scale projects within agile and waterfall project development methodologies.
- Expertise in data modeling, data integration, data management, and business intelligence solutions.
- Hands-on experience with Erwin, creating schemas and building logical and physical models.
- Worked on upgrade projects involving migration from Pentaho PDI to DataStage 11.7 and from Oracle 11g to SAP HANA and Teradata.
- Extensively involved in developing DataStage parallel ETL jobs to extract data from different sources, transform it, and load it into the data warehouse for data mart operations.
- 12 years of in-depth experience in Extraction, Transformation, and Loading (ETL) processes using IBM InfoSphere Information Server DataStage 11.3, 8.5, and 8.0.x (Parallel Extender), Oracle, Teradata, AutoSys, TWS (Tivoli), and Control-M.
- Skilled team leader and trainer with a track record of directing multiple tasks effectively to ensure on-target completion of all deliverables.
- Outstanding interpersonal and communication strengths leveraged to train users, troubleshoot system issues, and ensure total client satisfaction.
- Deep expertise in Data Modeling, Data Integration, Metadata Management and BI.
- Ability to use business knowledge to proactively identify, evaluate, and recommend appropriate and relevant solutions, actions, and outcomes.
- Extensive experience as a BI and report developer using tools such as Cognos 10 BI and Tableau.
TECHNICAL SKILLS
ETL Tools: DataStage 7.5/8.1/8.7/11.7, Pentaho Data Integration
Reporting Tools: Tableau V8.2, Cognos 10 BI
Data Modeling Tools: IBM InfoSphere Data Architect V9.1, Erwin Data Modeler 7.3
Databases: SAP HANA, Teradata, Oracle 10g.
Operating Systems: Windows, Unix.
Scheduling Tools: Autosys, Zeke Scheduler
Others: Mainframe ISPF 5.7, JCL, CoSort, Unix shell scripts, SQL scripts, C#, ADO.NET, ASP.NET, XML, WPF, Python, SAP Basis R/3, SAP HANA. Knowledge of IoT (Internet of Things): Arduino, Raspberry Pi 3, GPRS module SIM900
PROFESSIONAL EXPERIENCE
Project/Technical ETL Lead/BI Developer
Confidential, Sunnyvale
Responsibilities:
- Engaged with Business Readiness functional leads to understand DQ requirements and raise any issues or misunderstandings about the requirements.
- Developed Pentaho ETL code based on the business mapping and performed unit testing for LSG modules.
- Re-loaded Tier 2 migration for Phase II testing (converted Oracle SQL scripts into Pentaho ETL transformations).
- Migrated PUAT tables to the LSG environment using Pentaho Data Integration.
- Involved in migration of Cola reporting to LSG tables (Pentaho ETL).
- LSG DB and files available on the server.
- SSP and Support Edge - Shelf serial.
- E1000 Account Roll-ups- ONTAP column; NAGP to GEM mapping.
- OCI Phone-Home Data.
- Software usage on/off table and discovery Tool parsing output.
- Developed attribute views, analytic views, calculation views, and procedures, and created and modified analytic privileges.
- Developed calculation views with cube star join for performance tuning.
- Hands-on experience creating variables, input parameters, and calculated columns in HANA Studio.
- Worked on creating Explorer information spaces in SAP BusinessObjects Explorer.
- Worked on creating universes on analytic and calculation views.
- Created delivery units and imported them into target systems.
- Used union, projection, and aggregation nodes in graphical calculation views to combine, filter, and aggregate columns.
- Worked on import and export functionality to bring models and CSV files into HANA Studio for the views.
- Created measures using SUM, MIN, MAX, and COUNT (see the SQLScript sketch after this list).
- Expert in writing stored procedures.
- Created datastores between the source and target systems; handled database creation, repository creation, and job server setup, and created projects, workflows, dataflows, and jobs.
- Created transformations as per the requirements.
- Coordinated loading data into SAP HANA from different source systems using SLT for real-time replication.
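A minimal SQLScript sketch of the kind of measure and procedure logic described above; the object names (GET_SALES_MEASURES, SALES_ORDERS, NET_AMOUNT, IP_REGION) are hypothetical placeholders rather than the actual project objects.

```sql
-- Hypothetical HANA SQLScript procedure aggregating measures the way a CUBE
-- calculation view would; all object names are illustrative placeholders.
CREATE PROCEDURE GET_SALES_MEASURES (IN IP_REGION NVARCHAR(20))
    LANGUAGE SQLSCRIPT SQL SECURITY INVOKER READS SQL DATA
AS
BEGIN
    SELECT
        REGION,
        PRODUCT_ID,
        SUM(NET_AMOUNT)  AS TOTAL_SALES,    -- SUM measure
        MIN(NET_AMOUNT)  AS MIN_ORDER_AMT,  -- MIN measure
        MAX(NET_AMOUNT)  AS MAX_ORDER_AMT,  -- MAX measure
        COUNT(ORDER_ID)  AS ORDER_COUNT     -- COUNT measure
    FROM SALES_ORDERS
    WHERE REGION = :IP_REGION               -- filter driven by an input parameter
    GROUP BY REGION, PRODUCT_ID;
END;
-- Example call: CALL GET_SALES_MEASURES('AMER');
```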
Environment: IBM InfoSphere DataStage V11.7, Pentaho PDI, Oracle 11g, SAP HANA
Data Specialist/ ETL Lead
Confidential, NC
Responsibilities:
- Engaged with Business Readiness functional leads to understand DQ requirements and raise any issues or misunderstandings about the requirements.
- Provided technical IA guidance and support to Business Readiness functional leads when required.
- Translated global and regional DQIs into technical specifications (an illustrative check is sketched after this list).
- Developed assigned IA deliverables according to the defined architecture and design.
- Ensured assigned IA deliverables were completed according to plan and adhered to acceptable levels of quality.
- Performed unit testing and peer reviews to ensure the quality of deliverables.
- Resolved IA incidents and implemented IA change requests logged by the Business Readiness functional leads.
- Transferred knowledge of IA deliverables to the IA support team.
- Engaged with the data migration resources responsible for loading the data to ensure that reference and test data were loaded.
- Engaged with data migration resources to understand the staging tables and their migration to the test environment.
- Attended progress and team meetings to report on progress, issues, and risks.
- Identified common objectives and linked them to clear measurement metrics in order to evaluate supplier performance and set improvement targets.
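A minimal SQL sketch of how a DQI could be expressed as a technical check of the kind described above; the rule name, table, and column (STG_CUSTOMER, COUNTRY_CODE) are hypothetical placeholders, not the actual project rules.

```sql
-- Hypothetical completeness check: customer records missing a mandatory country code.
SELECT
    'CUSTOMER_COUNTRY_COMPLETENESS' AS DQ_RULE,
    COUNT(*)                        AS TOTAL_RECORDS,
    SUM(CASE WHEN TRIM(COUNTRY_CODE) IS NULL
             THEN 1 ELSE 0 END)     AS FAILED_RECORDS  -- NULL or blank (Oracle treats '' as NULL)
FROM STG_CUSTOMER;
```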
Environment: IBM InfoSphere Information Analyzer, Oracle 11g
Data Specialist / ETL Lead
Confidential, Bentonville, AR
Responsibilities:
- Developed ETL DataStage jobs as per the technical mapping document received from DICOE Walmart.
- Communicated with the IBM GEO team regarding development deliverables and maintained issue logs as per the GD & GEO process.
- Performed unit testing of ETL DataStage jobs as per the technical mapping document.
- Involved in an ETL migration project: conversion of ETL Online (a Walmart tool) to IBM InfoSphere DataStage.
- Created logical and physical models using IBM InfoSphere Data Architect.
- Created data modeling objects from existing DDLs of SAP tables and vice versa (see the DDL sketch after this list).
- Involved in end-to-end system testing of ETL projects.
- Worked with the Platform Solutions Manager and Project Managers to evaluate product-enabling technologies.
- Identified, assessed, and managed risks to the success of the project.
- Ensured quality of dependencies (within the program/critical path).
- Utilized IBM metrics and data so that continuous improvement reviews occurred within given timeframes and according to procedure.
- Established and managed temporary Program Management Offices (PMOs) and the associated processes and procedures for risk, issue, change, quality, and cost management.
- Ensured dependencies included interdependencies (other programs) and intra-program dependencies.
- Established and maintained a fully auditable communications log during the various project phases.
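A hypothetical DDL fragment of the kind reverse-engineered into, or generated from, the logical and physical models in IBM InfoSphere Data Architect; the table and column names are illustrative placeholders, not the actual project tables.

```sql
-- Illustrative physical-model DDL; names and data types are placeholders.
CREATE TABLE STORE_SALES (
    STORE_NBR   NUMBER(6)     NOT NULL,
    ITEM_NBR    NUMBER(9)     NOT NULL,
    SALES_DT    DATE          NOT NULL,
    SALES_QTY   NUMBER(9,2),
    SALES_AMT   NUMBER(13,2),
    CONSTRAINT PK_STORE_SALES PRIMARY KEY (STORE_NBR, ITEM_NBR, SALES_DT)
);
```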
Environment: InfoSphere DataStage 8.5, Oracle 11g, Teradata
Data Specialist / ETL Lead
Confidential
Responsibilities:
- Module lead for the development team; also involved in testing activities along with the testing team.
- Developed ETL DataStage jobs as per the technical mapping document received from the GEO - Confidential team.
- Developed and delivered ETL jobs on time, as per the schedule provided by the GEO development work request document.
- Communicated with the Confidential GEO team regarding development deliverables and maintained issue logs as per the GD & GEO process.
- Performed unit testing of ETL DataStage jobs as per the technical mapping document.
- Involved in manual testing by preparing test cases, expected results, and sample data as per the testing work request document sent by the GEO team.
- Performed test execution for the Connector IN project by executing the Confidential Run Connector shell scripts and validating the output results.
- Validated expected versus actual test results as per the Confidential standards.
Environment: InfoSphere DataStage 7.5, Oracle 10g
Data Specialist / ETL Lead
Confidential
Responsibilities:
- Modified mainframe JCL as per the project specification during migration activities.
- Captured discrepancies between DataStage 7.5 and DataStage 8.1 by validating the data using CDC DataStage jobs (see the comparison sketch after this list).
- Performed unit testing of ETL DataStage jobs and mainframe JCLs using the RBCI standard template.
- Created technical design documents for RBCI enhancement projects.
- Completed migration of DataStage jobs from DS 7.5 to DS 8.1.
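A minimal sketch of the kind of comparison used to capture discrepancies between the DataStage 7.5 and 8.1 job outputs; the table names are hypothetical placeholders.

```sql
-- Rows produced by the 7.5 job but not by the 8.1 job, and vice versa.
SELECT * FROM TARGET_TABLE_DS75
MINUS
SELECT * FROM TARGET_TABLE_DS81;

SELECT * FROM TARGET_TABLE_DS81
MINUS
SELECT * FROM TARGET_TABLE_DS75;
```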
Environment: InfoSphere DataStage 8.1, Oracle 10g
Data Specialist / ETL Designer
Confidential
Responsibilities:
- Used the ETL tool IBM WebSphere DataStage V8.0 for extraction, transformation, and loading of data into BIH (the landing area).
- Involved in creating DataStage parallel jobs and Unix shell scripts based on the mapping document.
- Used DataStage 8.1, Oracle 10g, and SunOS.
- Reviewed jobs and performed unit testing.
- Analyzed log files and verified bugs.
Environment: InfoSphere DataStage 8.1, Oracle 10g
Data Stage / ETL Developer
Confidential
Responsibilities:
- Used the ETL tool IBM WebSphere DataStage V8.0 for extraction, transformation, and loading of data into the data warehouse.
- Involved in creating DataStage mainframe jobs and Unix shell scripts based on the change request document.
- Used DataStage 8, Cognos 8, Margin Minder, Eureka Strategy, and BusinessObjects XI.
- Used DB2 as the database, with sources coming from the mainframe and targets in Teradata and SQL Server.
- Received an E-Award from HP for creating an automation script and for team management skills.
- Involved in monitoring activities such as DataStage mainframe jobs and server jobs running on the Zeke scheduler, and resolved severity tickets (Sev 1, Sev 2, and Sev 3) within the set timelines.
- Conducted technical training for team members as the DataStage track lead.
Environment: InfoSphere DataStage 8, Cognos 8, Teradata, DB2 8
Data Stage / ETL Developer
Confidential
Responsibilities:
- Involved in moving data from source to staging and from staging to target.
- Involved in creating jobs based on the mapping document.
- Involved in loading dimensions and facts based on the mapping design document (see the load sketch after this list).
- Used the ETL tool IBM WebSphere DataStage V7.5 for extraction, transformation, and loading of data into the data warehouse.
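A minimal SQL sketch of the dimension-to-fact load pattern described above; all table and column names (STG_SALES, DIM_CUSTOMER, FACT_SALES) are hypothetical placeholders, not the actual project model.

```sql
-- Staging rows joined to dimensions to resolve surrogate keys before the fact load.
INSERT INTO FACT_SALES (CUSTOMER_KEY, PRODUCT_KEY, SALES_DT, SALES_AMT)
SELECT
    dc.CUSTOMER_KEY,                 -- surrogate key from the customer dimension
    dp.PRODUCT_KEY,                  -- surrogate key from the product dimension
    s.SALES_DT,
    s.SALES_AMT
FROM STG_SALES s
JOIN DIM_CUSTOMER dc ON dc.CUSTOMER_ID = s.CUSTOMER_ID
JOIN DIM_PRODUCT  dp ON dp.PRODUCT_ID  = s.PRODUCT_ID;
```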
Environment: InfoSphere DataStage 7.5, Oracle 10g, Cognos 8
Data Stage / ETL Developer
Confidential
Responsibilities:
- Used the ETL tool Ascential DataStage PX 7.5 for extraction, transformation, and loading of data into the data warehouse.
- Wrote technical specifications and estimations.
- Developed objects using appropriate DataStage parallel job stages.
- Reviewed jobs, performed unit testing, analyzed log files, and verified bugs.
Environment: InfoSphere DataStage 7.5, Oracle 10g, Cognos 8
Software Engineer
Confidential
Responsibilities:
- Used the ETL tool Ascential DataStage for extraction, transformation, and loading of data into the data warehouse.
- Developed objects using appropriate DataStage server job stages.
- Reviewed jobs and performed unit testing.
- Analyzed log files and verified bugs.
- Wrote technical specifications and estimations.
- Tested DataStage jobs.
- Developed objects using appropriate DataStage parallel job stages.
- Checked log files for bugs.
Environment: InfoSphere DataStage 7.1, SAP BI, Oracle 10g