Associate Team Lead Resume
SUMMARY
- DataStage developer with 6 years of experience and a broad skill set in ETL programming using IBM WebSphere DataStage (parallel edition) 8.1/11.5, with a strong background in data warehousing, star schema, snowflake schema, and ETL jobs.
- Good logical ability and a systematic approach to problem analysis, integration management, solution design, and debugging.
- Able to quickly learn and understand a business domain and consistently deliver business value.
- Involved in all phases of data warehouse projects, mainly ETL, with expertise in data analysis and data warehousing technologies.
- Strong knowledge of extracting data from legacy systems to staging, then cleansing, processing, and loading it into data marts and the data warehouse.
- Designed and implemented data warehousing ETL (Extract, Transform, and Load) processes against Oracle databases, including in-house SQL programming.
- Experienced in both development and operations.
- Strong experience in data analysis and troubleshooting data issues.
- Hands-on experience tuning DataStage parallel jobs, identifying bugs, and resolving performance issues.
- Strong knowledge of Linux and good knowledge of shell scripting.
- Hands-on experience with Greenplum.
- Hands-on experience installing Greenplum on a production server with an 8-node cluster.
- Maintaining and running regular database backups.
- Troubleshooting DataStage (ETL) jobs.
- Led data architecture for an enterprise DW implementation; hands-on in designing and validating data models and ETL architecture.
- Hands-on experience with the Looker reporting tool.
- Responsible for data migration from Greenplum to Snowflake.
- Responsible for data migration from Greenplum to Amazon S3 using tools such as the s3cmd command-line tool and the AWS CLI (see the sketch after this list).
- Involved in data processing in Amazon Redshift.
- Developed data models in Snowflake.
- Excellent team player with strong written, communication, and analytical skills; capable of working in high-stress environments with resource constraints.
- Working experience in various domains, including retail sales and insurance.
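A minimal sketch of the Greenplum-to-S3 migration step mentioned above, assuming a hypothetical sales_orders table and placeholder bucket, host, and user names; the actual jobs used either s3cmd or the AWS CLI depending on the environment.

    #!/bin/bash
    # Sketch: unload a Greenplum table to CSV, then push the file to Amazon S3.
    # Table, bucket, host, and user names below are placeholders.
    set -euo pipefail

    EXPORT_FILE=/data/export/sales_orders_$(date +%Y%m%d).csv

    # Unload the table from Greenplum with a client-side \copy.
    psql -h gp-master -d warehouse -U etl_user \
         -c "\copy sales_orders TO '${EXPORT_FILE}' CSV HEADER"

    # Upload with s3cmd ...
    s3cmd put "${EXPORT_FILE}" s3://dw-migration-bucket/sales_orders/

    # ... or, equivalently, with the AWS CLI.
    aws s3 cp "${EXPORT_FILE}" s3://dw-migration-bucket/sales_orders/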
TECHNICAL SKILLS
Languages: C, C#
Database Servers: Oracle, Greenplum, SQL Server, PostgreSQL
Operating Systems: Linux, macOS, Windows
ETL Tools: DataStage 8.1/8.5/11.3, Pentaho
Cloud Technologies: Amazon EC2, S3, Snowflake, Redshift
Code Repository: GitHub
PROFESSIONAL EXPERIENCE
Confidential
Associate Team Lead
Environment: DataStage 11.3, SQL Server
Responsibilities:
- Worked with various sources and designed and developed jobs to facilitate the integration of expenditure data.
- Analyzed reports created by business users to extract the data needed to effectively integrate expenditures from various sources.
- Evaluated data extracted from the sources to check for irregularities, classify corrupt data, and apply suitable transformations in the job mappings.
- Understood the business rules completely and implemented the data transformation methodology.
- Involved in preparing the technical design document and source-to-target mappings.
- Extensively used DataStage Designer to develop parallel jobs and performed complex mappings based on business specifications.
- Involved in the design and development of the ETL process for the data warehouse.
- Implemented SCD Type 1 and SCD Type 2 jobs (a Type 2 sketch follows this list).
- Performance-tuned long-running source SQL queries used in the jobs.
- Checked DataStage code into SVN (TortoiseSVN) to maintain versions across the stages of the project life cycle.
- Built SQL queries to fetch data from the database per the business requirements and used them in DataStage for further processing.
- Developed a few server jobs as part of the project.
- Developed sequencers to automate the parallel jobs and created looping jobs using sequencers.
- Responsible for tuning jobs for better performance.
- Involved in data analysis and QA support.
- Created build and release documents.
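A minimal sketch of the SCD Type 2 expire-then-insert pattern referenced above, run against SQL Server from a shell wrapper; the dim_expenditure and stg_expenditure tables, their columns, and the server and database names are hypothetical, and the production logic was implemented inside DataStage jobs.

    #!/bin/bash
    # Sketch: SCD Type 2 maintenance against SQL Server via sqlcmd.
    # Server, database, table, and column names are placeholders.
    set -euo pipefail

    sqlcmd -S dw-sql-server -d expenditure_dw -E -b <<'SQL'
    -- Expire the current version of rows whose tracked attributes changed.
    UPDATE d
    SET    d.effective_end_date = GETDATE(),
           d.is_current         = 0
    FROM   dim_expenditure d
    JOIN   stg_expenditure s ON s.expenditure_id = d.expenditure_id
    WHERE  d.is_current = 1
      AND (s.amount <> d.amount OR s.cost_center <> d.cost_center);

    -- Insert a fresh current version for changed rows and brand-new rows.
    INSERT INTO dim_expenditure
           (expenditure_id, amount, cost_center,
            effective_start_date, effective_end_date, is_current)
    SELECT s.expenditure_id, s.amount, s.cost_center, GETDATE(), NULL, 1
    FROM   stg_expenditure s
    LEFT JOIN dim_expenditure d
           ON d.expenditure_id = s.expenditure_id AND d.is_current = 1
    WHERE  d.expenditure_id IS NULL;
    GO
    SQL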
Confidential
Sr Datastage Developer
Environment: DataStage, Greenplum, Linux (cluster)
Responsibilities:
- Responsible for the end-to-end operation of the data warehouse.
- Ran daily batch jobs moving data from OLTP to OLAP systems.
- Involved in and responsible for various data integration processes and applications.
- Handled errors and debugged batch jobs.
- Delivered enhancements based on client requirements.
- Responsible for keeping data processing running smoothly.
- Developed and modified SCDs through various scripts.
- Collaborated with other function leaders in evaluating and correcting DW strategies and goals, and implemented changes as needed.
- Provided 24x7x365 operational management of the data warehousing group, covering business analysis, ETL processing, reporting, and technical support for the environment.
- Also responsible for reporting and dashboards using Looker.
- Migrated the data warehouse from Greenplum to Snowflake (a sketch of the per-table unload/load flow follows below).
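A minimal sketch of that per-table unload/load flow, assuming a hypothetical daily_sales table, a Snowflake user stage, and placeholder account and credential values; the production migration covered many tables with similar scripts.

    #!/bin/bash
    # Sketch: unload one table from Greenplum and load it into Snowflake.
    # Table, host, account, and user names are placeholders.
    set -euo pipefail

    TABLE=daily_sales
    CSV=/data/migration/${TABLE}.csv

    # 1. Unload the table from Greenplum to a local CSV file.
    psql -h gp-master -d warehouse -U etl_user \
         -c "\copy ${TABLE} TO '${CSV}' CSV HEADER"

    # 2. Stage the file in Snowflake, then copy it into the target table.
    snowsql -a my_account -u etl_user -d analytics -s public -q "
      PUT file://${CSV} @~/migration/ AUTO_COMPRESS=TRUE;
      COPY INTO ${TABLE}
        FROM @~/migration/${TABLE}.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    "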
Confidential
Sr ETL Developer
Environment: Pentaho, PostgreSQL, Linux (cluster)
Responsibilities:
- Created the data model and data warehouse for the bank.
- Created Amazon EC2 instances.
- Designed Pentaho jobs to extract and transform result sets and integrate data from Salesforce into the data sink.
- Created shell scripts to fetch data from the Oracle source (see the extraction sketch after this list).
- Created SCD Type 2 logic using SQL scripts to capture changes and load them into the DWH.
- Installed and implemented Tableau Server.
- Generated Tableau reports such as “tracking pipeline tendencies of the loan applications” and “customer ownerships”.
- Responsible for migrating Tableau reports from development to production systems.
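A minimal sketch of the kind of shell extraction script used against the Oracle source, with a hypothetical loan_applications table and a connection string assumed to be exported as ORA_CONNECT; the real scripts fed these extracts into the Pentaho jobs.

    #!/bin/bash
    # Sketch: extract rows from an Oracle source table into a CSV file.
    # ORA_CONNECT (user/password@tns), the table, and its columns are placeholders.
    set -euo pipefail

    OUT=/data/staging/loan_applications.csv

    sqlplus -s "${ORA_CONNECT}" <<'SQL' > "${OUT}"
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF ECHO OFF LINESIZE 4000
    SELECT loan_id || ',' || status || ',' || amount
    FROM   loan_applications;
    EXIT
    SQL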
Confidential
Sr ETL Developer
Environment: Pentaho, MySQL and Linux
Responsibilities:
- Gathered client requirements to develop the jobs.
- Troubleshot Pentaho jobs and added error handling.
- Responsible for rectifying and modifying previously developed scripts so that jobs ran seamlessly.
- Developed and configured a new database for the client.
Confidential
Datastage Developer
Environment: DataStage 8.5, XP, Oracle 10g, Unix, Teradata
Responsibilities:
- Understood the business rules completely and implemented the data transformation methodology.
- Understood the technical design document and source-to-target mappings.
- Designed DataStage jobs for validating and loading data.
- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading into Data Warehouse.
- Extracted data from sources such as Oracle and flat files to transform and load into target databases.
- Used DataStage Designer to develop mappings with stages including Transformer, Join, Lookup, Merge, Funnel, Filter, Oracle Enterprise, Sequential File, Data Set, File Set, Copy, and Modify.
- Implemented SCD Type 2 techniques.
- Designed sequencers to automate the whole data loading process.
- Worked extensively with DataStage shared containers to reuse business functionality.
- Experienced in running jobs using job sequencers and job batches (a command-line sketch follows this list).
- Involved in performance tuning of jobs, identifying and resolving performance issues.
- Involved in unit testing of DataStage jobs.
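A minimal sketch of kicking off a parallel job from the command line with the dsjob utility, using a hypothetical project, job, and parameter name; the scheduled batches wrapped calls like this in shell scripts.

    #!/bin/bash
    # Sketch: run a DataStage job via dsjob and summarize its log.
    # Project, job, and parameter names are placeholders.
    set -euo pipefail

    PROJECT=DW_PROJECT
    JOB=jb_load_customer_dim

    # Run the job with a runtime parameter and wait for its final status.
    dsjob -run -mode NORMAL -param LOAD_DATE=$(date +%Y-%m-%d) \
          -jobstatus "${PROJECT}" "${JOB}"

    # Summarize the job log after the run.
    dsjob -logsum "${PROJECT}" "${JOB}"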
Confidential
Datastage Developer
Environment: WebSphere DataStage, Windows XP, Oracle 10g
Responsibilities:
- Extensively used DataStage Designer to develop parallel jobs and performed complex mappings based on business specifications.
- Involved in the design and development of the ETL process for the data warehouse.
- Worked extensively with stages such as Sequential File, Aggregator, Funnel, Change Capture, Transformer, Merge, Join, and Lookup while developing jobs.
- Extensively used processing stages such as the Lookup stage to perform lookups against various target tables, the Modify stage to alter the record schema of the input data set, the Funnel stage to combine several datasets into a single large dataset, and the Switch stage to route records to the required output based on a specific condition.
- Constructed containers for performance analysis.
- Involved in the unit testing.
Confidential, San Antonio, Texas
ETL Developer
Environment: WebSphere DataStage, Windows XP, Oracle 10g
Responsibilities:
- Understanding the Requirement Specifications.
- Extensively involved in ETL development.
- Developed parallel jobs using source stages, target stages, and processing stages such as Aggregator and Transformer.
- Used job parameters for usernames, passwords, schemas, etc.
- Extracted data from sequential files and loaded it into the staging area.
- Created job sequencers for dependent jobs and batches.
- Tuned jobs across sources and targets for better performance.
- Maintained standards for naming conventions and quality.
- Involved in the unit testing.