
Datawarehouse Analyst Resume


Palo Alto, CA

SUMMARY

  • Around 6 years of experience in Information Technology, with a strong background in Data Warehousing and database development projects.
  • Strong background in Data Integration and ETL processes, working as a Data Integration Specialist with Oracle, Greenplum and DataStage.
  • Hands-on experience in performance tuning of ETL jobs and sessions.
  • Hands-on working experience with DataStage 9.1/8.5/8.1 Parallel jobs and 7.5.1/7.5x2 PX and Server jobs.
  • Hands-on working experience in data quality using QualityStage 8.7.
  • Used various QualityStage 9.1/8.7 stages such as Investigate, Standardize, Match Frequency, Unduplicate Match, Reference Match and Survive.
  • Hands-on work experience with the Greenplum database, including PostgreSQL SQL and PL/pgSQL, external tables, distribution keys, vacuuming and table partitioning.
  • Used various DataStage 9.1/8.7 parallel job stages such as Sequential File, Transformer, Data Set, Aggregator, Merge, File Set, Lookup (including range lookups), Join, Copy, Modify, Filter, Funnel, Oracle Connector, Peek and Remove Duplicates, along with Job Sequences.
  • Used various partitioning techniques such as Round Robin, Same, Entire and Modulus.
  • Sound knowledge of DataStage 7.5 and 8.1 architecture and internal processes.
  • Used various DataStage 7.5 PX and server job stages such as Sequential File, Transformer, Aggregator, Hashed File, ODBC, Link Partitioner, Link Collector, IPC, Merge, Oracle OCI, Peek, Lookup, Join, Remove Duplicates and Stored Procedure, along with Job Sequences.
  • Hands-on performance tuning of ETL jobs, sessions and SQL.
  • Wrote PL/SQL stored procedures, functions, packages, database triggers, Oracle collections, cursors and dynamic SQL to enhance application functionality.
  • Configured DataStage Director and scheduled data integration jobs and sequencers.
  • Used DataStage Director to run, monitor and schedule jobs.
  • Created DataStage job sequences to handle load dependencies.
  • Good exposure to UNIX commands and shell scripting for executing DataStage jobs.
  • Also involved in preparing documents such as application design documents, unit test cases and deployment plans.
  • Good client-interfacing skills, demonstrated by interacting with clients from offshore.
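The Greenplum work summarized above (distribution keys, range partitioning, vacuuming) could be sketched as follows; the table and column names are hypothetical examples, not taken from any of the projects below.

```sql
-- Hypothetical fact table illustrating a Greenplum distribution key
-- and monthly range partitioning; all names are examples only.
CREATE TABLE sales_fact (
    sale_id     bigint,
    customer_id int,
    sale_date   date,
    amount      numeric(12,2)
)
DISTRIBUTED BY (customer_id)           -- rows hashed across segments
PARTITION BY RANGE (sale_date) (
    START (date '2013-01-01') INCLUSIVE
    END   (date '2014-01-01') EXCLUSIVE
    EVERY (INTERVAL '1 month')
);

-- Reclaim space and refresh planner statistics after heavy DML.
VACUUM ANALYZE sales_fact;
```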

TECHNICAL SKILLS

ETL: DataStage 9.1/8.7/8.1/7.5 Parallel/Server jobs, QualityStage 9.1/8.7

Case Tool: SCME

Scripts: Shell scripts, SQL scripts

RDBMS: Oracle 9i/10g, PL/SQL, GreenPlum

Operating Systems: AIX 5.3, Solaris

Job Schedulers: crontab, Zena

Tools: SQL*Loader, Toad VII, Aginity Workbench, pgAdmin

Remedy Tools: AOTS (Confidential &T proprietary tool)

PROFESSIONAL EXPERIENCE

Confidential, Palo Alto, CA

Datawarehouse Analyst

Responsibilities:

  • Created source-target mapping spreadsheet using MS Excel to facilitate better understanding of the mapping and transformations.
  • Experience working with Greenplum/Pivotal databases as a developer.
  • Converted Oracle procedures and functions into Greenplum.
  • Proficient in performance tuning of DBMS configurations, SQL tuning and capacity planning.
  • Proven expertise in backup and disaster recovery across multiple data centers.
  • Designed and developed ETL jobs using Informatica mappings that distributed the incoming data concurrently across all processors to achieve the best performance.
  • Used Informatica Power Center 8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.
  • Designed mappings and command tasks for the extraction, transformation and loading of the source data to the target database.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Extensively used the Informatica Debugger to diagnose problems in mappings; also involved in troubleshooting existing ETL bugs.
  • Performed import and export of mappings and various components.
  • Reviewed all project-specific documents before taking them forward.
  • Led the team, initiated review meetings and obtained regular updates on tasks to be completed.
  • Designed and wrote mappings, performed unit tests, set up system test environments, generated test cases, ran regression tests where needed and supported user acceptance testing.
  • Implemented procedures for development of detailed specifications along with interfaces and platforms.
  • Integrated programs with production systems and applications by designing of system testing requirements.
  • Debugged programs for development of code and custom systems as per IT guidelines.
  • Coordinated with other client teams for organizing and conducting systems and integration testing services.
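The Oracle-to-Greenplum conversion work mentioned above might look like the following minimal sketch; the function, parameter and table names are hypothetical, not from the actual project.

```sql
-- Oracle PL/SQL original (hypothetical):
-- CREATE OR REPLACE FUNCTION get_order_count(p_cust NUMBER) RETURN NUMBER IS
--   v_cnt NUMBER;
-- BEGIN
--   SELECT COUNT(*) INTO v_cnt FROM orders WHERE customer_id = p_cust;
--   RETURN v_cnt;
-- END;

-- Greenplum (PL/pgSQL) equivalent: NUMBER maps to integer/bigint,
-- and the body moves into a $$-quoted block with LANGUAGE plpgsql.
CREATE OR REPLACE FUNCTION get_order_count(p_cust integer)
RETURNS bigint AS $$
DECLARE
    v_cnt bigint;
BEGIN
    SELECT COUNT(*) INTO v_cnt FROM orders WHERE customer_id = p_cust;
    RETURN v_cnt;
END;
$$ LANGUAGE plpgsql;
```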

Confidential, Pleasanton, CA

Datawarehouse Analyst

Responsibilities:

  • Created source-target mapping spreadsheet using MS Excel to facilitate better understanding of the mapping and transformations.
  • Prepared technical and functional designs in collaboration with teams and business users.
  • Developed data warehouse solutions and application by translation of business and functional needs.
  • Designed and developed ETL jobs using Parallel Extender, which distributed the incoming data concurrently across all processors to achieve the best performance.
  • Performed performance tuning in Greenplum for long-running PostgreSQL queries.
  • Developed functions, external tables, views, schemas, users, databases etc. in the Greenplum database.
  • Enhanced the Greenplum database data load/export subsystem to handle XML and other forms of non-tabular data.
  • Developed custom feed parsers, db schema, ETL, monitoring and management infrastructure for PostgreSQL.
  • Provided technical guidance for re-engineering functions of Greenplum warehouse operations.
  • Performed requirement gathering, designing and development of data models for operational control and monitoring.
  • Designed parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Copy, Filter, Funnel, Data Set, Sort, Surrogate Key Generator, Change Data Capture and Aggregator.
  • Created master controlling sequencer jobs using the DataStage Job Sequencer.
  • Involved in the extraction, transformation and loading of the source data to the target database.
  • Performed import and export of DataStage jobs and various components.
  • Reviewed all project-specific documents before taking them forward.
  • Led the team, initiated review meetings and obtained regular updates on tasks to be completed.
  • Designed and wrote mappings, performed unit tests, set up system test environments, generated test cases, ran regression tests where needed and supported user acceptance testing.
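The external-table work described above typically uses Greenplum's gpfdist-backed readable external tables for parallel loads; the host, port, file path and columns below are hypothetical examples.

```sql
-- Readable external table pointing at a gpfdist-served flat file;
-- host, port, path and columns are assumptions for illustration.
CREATE EXTERNAL TABLE ext_customer_feed (
    customer_id int,
    name        text,
    signup_date date
)
LOCATION ('gpfdist://etlhost:8081/customer_feed.txt')
FORMAT 'TEXT' (DELIMITER '|');

-- All segments read the file in parallel during the load.
INSERT INTO customer_stage SELECT * FROM ext_customer_feed;
```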

Confidential, NJ

Datastage ETL Developer

Responsibilities:

  • Developed jobs for loading data from IDIS to our target (INR) schema.
  • Created and debugged jobs using Joins, database-level operations, Transformers, Sequential Files, Lookups and other utilities.
  • Created and altered tables, materialized views, views, sequences, stored procedures, functions and complex queries, and performed performance tuning.
  • Owned DataStage configuration and deployment.
  • Owned all deployment-related ETL tracks across all environments, including setting up DataStage parameters.

Confidential

Datastage ETL Developer

Responsibilities:

  • Developed jobs for loading data from IDIS to our target (ARS) schema.
  • Created and debugged jobs using Joins, database-level operations, Transformers, Sequential Files, Lookups and other utilities.
  • Created and altered tables, materialized views, views, sequences, stored procedures, functions and complex queries, and performed performance tuning.
  • Owned DataStage configuration and deployment.
  • Owned all deployment-related ETL tracks across all environments, including setting up DataStage parameters.

Confidential

Datastage ETL Developer

Responsibilities:

  • Developed jobs for loading data from ABS-DW to our target (INR) schema.
  • Created one shell script through which the SQL*Loader control files are invoked.
  • Four SQL*Loader control files are used, as the data is loaded partition-wise based on BILLER CD.
  • Three tables are used: one staging, one exchange and one target table.
  • First, the staging table is loaded with one partition of data. Then, after a table analyze, the exchange table is exchanged with that partition. Finally, the target table is loaded partition-wise via exchange with the exchange table.
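One plausible reading of the staging/exchange/target flow above, sketched in Oracle SQL; the table, partition, control-file and credential names are hypothetical, and the exact exchange choreography on the real project may have differed.

```sql
-- 1. Load one partition's worth of data into the staging table
--    (run from the shell; one control file per BILLER_CD value):
--    sqlldr userid=etl_user/etl_pwd control=biller_p1.ctl data=biller_p1.dat

-- 2. Analyze the staged rows, then move them into the exchange table.
ANALYZE TABLE STG_BILLING COMPUTE STATISTICS;
INSERT INTO EXCH_BILLING SELECT * FROM STG_BILLING;

-- 3. Swap the exchange table into the matching target partition;
--    partition exchange is a metadata-only operation, so it is fast.
ALTER TABLE TGT_BILLING
  EXCHANGE PARTITION P_BILLER_1 WITH TABLE EXCH_BILLING
  INCLUDING INDEXES WITHOUT VALIDATION;
```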

Confidential

Datastage ETL Developer

Responsibilities:

  • Developed jobs for loading data from CTDI to our NAI schema.
  • Developed jobs for IP addresses in the feed from CTDI, populating them in the NAI schema with two new tables (Bibbase and Addon).
  • Created and debugged jobs using Joins, database-level operations, Transformers, Sequential Files, Lookups and other utilities.
  • Created and altered tables, materialized views, views, sequences, stored procedures, functions and complex queries, and performed performance tuning.
  • Owned DataStage configuration and deployment.
  • Owned all deployment-related ETL tracks across all environments.

Confidential

Datastage ETL Developer

Responsibilities:

  • Developed jobs for loading data from BEDW to our MCDB, REMEDY, OVPI and BEDW schemas.
  • Developed jobs for 4 interfaces (MCDB to GCP, REMEDY to GCP, OVPI to GCP, BEDW to GCP).
  • Built GCP staging tables to support AERS SLA reporting.
  • Created and debugged jobs using Joins, database-level operations, Transformers, Sequential Files, Lookups and other utilities.
  • Created and altered tables, materialized views, views, sequences, stored procedures, functions and complex queries, and performed performance tuning.
  • Owned DataStage configuration and deployment.
