ETL Developer Resume Profile
Summary
- 9 years of experience in the IT industry in UNIX shell scripting, Autosys and Tivoli development and migration.
- Designing complex scheduling frameworks and processes for large applications with more than 20K jobs using the Autosys scheduler.
- Creating processes and frameworks to manage UNIX and Autosys development projects.
- Managing and implementing Autosys in various large-scale ETL projects.
- Coding, implementation, performance tuning and support of Autosys-based projects.
- Establishing processes for the production support team to monitor the production environment using Autosys.
- Automating the creation of various reports using shell and Perl scripts.
- Creating wrapper, archival, NDM and FTP scripts using UNIX shell scripting and Perl.
- Preparing the test environment ahead of various release activities.
- Analyzing new projects and helping them implement Autosys for scheduling, monitoring and reporting purposes.
- End-to-end analysis and migration of Autosys jobs from version 4.0 to 4.5 and from 4.5 to R11 for more than 70 applications with a combined job count of more than 150K.
- Migration from Tivoli to Autosys R11 for more than 30 applications with a job count of more than 70K.
- Analyzing new projects that have implemented or are planning to implement the Autosys scheduler and recommending a scheduling model for Autosys job scheduling. Developed several Autosys scheduling models, such as hub-and-spoke, super box and drop-delete-trigger, for various projects.
- Involved in building a high-speed testing environment in which more than 250 applications run batch cycles for integrated and regression testing.
- Worked on projects that migrated from mainframes and CA7 to DataStage and Autosys.
- Over two years of experience in system analysis, design, development and implementation of data warehousing systems using IBM DataStage 8.5/9.0.
- Proficient in developing strategies for the Extraction, Transformation and Loading (ETL) mechanism.
- Expert in designing parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify and Aggregator.
- Experienced in integrating various data sources (DB2 UDB, PL/SQL, Oracle) into the data staging area.
- Expert in working with DataStage Designer and Director.
- Experience in analyzing the data generated by the business process, defining the granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development.
- Experience in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancement.
- Expert in working on various operating systems such as AIX 5.1/5.2, Sun Solaris 8.0 and Windows 2000/NT.
- Proficient in writing, implementing and testing triggers, procedures and functions in Oracle PL/SQL.
- Experience using software configuration management tools such as TortoiseSVN for version control.
- Expert in unit testing, system integration testing, implementation and maintenance of DataStage jobs.
- Effective in cross-functional and global environments, managing multiple concurrent tasks and assignments with strong communication skills.
- Self-starter, adaptive to new technologies.
Technical Skills:
- Operating Systems : Windows 98/NT/2000/XP/Vista/Windows7
- Databases : Oracle, DB2
- Database Tools : Toad, DB2 Connector
- ETL Tools : IBM WebSphere DataStage and QualityStage 8.5 and 9.0
- Scheduler Tools : Autosys, Tivoli, Crontab
- Tools And Languages : UNIX Shell script, PERL, Oracle SQL, PL/SQL, AWK, SED, DB2
- Application Packages : Microsoft Office Suite
Professional Experience:
Confidential
Description: As part of the Batch Support Services (BSS) group, I develop, support and manage various ETL DataStage and Autosys/Tivoli applications within Bank of America. I am responsible for developing, managing and supporting production installs, preparing Test and UAT environments, executing test cycles and monitoring the testing environment for various applications.
As an Autosys/Tivoli analyst, my roles and responsibilities are:
- Creating Autosys and Tivoli schedules as per the ETL DataStage requirements.
- Migrating Autosys jobs from 4.5 to R11 for various applications within the bank.
- Converting Tivoli schedules to Autosys R11 jobs.
- Managing the Autosys and UNIX development and testing process for ETL projects.
- Developing tools in Perl and shell script for automating Autosys job creation, migrating jobs across environments and maintaining version control for Autosys jobs.
- Creating frameworks and processes for Autosys and UNIX development projects within the bank.
- Estimating effort and delivery timelines for various release activities.
- Managing the execution and monitoring of testing batch cycles for various applications.
- Handling abends and performing restarts, force-completes and cancellations of jobs in Autosys.
- Supporting various applications in the USA, Canada and Europe in production and all test environments, and providing support during batch cycle runs.
- Attending initiative-, release- and application-related meetings and sharing the information with offshore team members.
- Allocating work to team members and reviewing deliverables before they are implemented in the Test and UAT environments.
- Developing shell and Perl scripts to automate manual work in day-to-day support activities.
- Analyzing new projects that have implemented or are planning to implement the Autosys scheduler and recommending a scheduling model for Autosys job scheduling. Developed several Autosys scheduling models, such as hub-and-spoke, super box and drop-delete-trigger, for various projects.
- Analyzing the Autosys jobs of existing projects and suggesting scheduling improvements.
- Worked on various projects migrating from Autosys 4.0 to 4.5 within the bank.
- Worked on projects migrating from mainframes and CA7 to DataStage and Autosys.
- Worked with development teams to understand various projects and helped them implement Autosys to reduce manual effort in testing batch cycles in lower environments.
- Involved in building a high-speed testing environment in which more than 250 applications run batch cycles for integrated and regression testing.
- Onboarding various applications to the high-speed change environment to perform integrated testing; converted Tivoli-, Quartz- and crontab-scheduled jobs to Autosys jobs.
- Coordinating with various teams to perform data conditioning, environment setup and batch cycle execution in the high-speed test environment.
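Much of the Autosys automation described above revolves around generating JIL (Job Information Language) definitions. As an illustrative sketch only (the box, job, path and machine names below are hypothetical, not taken from any actual project), a shell function that emits the JIL for a command job inside a box might look like:

```shell
# Sketch of a JIL generator: emits an Autosys command-job definition
# that runs inside a given box. All names and paths are hypothetical.
gen_jil() {
  box="$1"; job="$2"; cmd="$3"; machine="$4"
  cat <<EOF
insert_job: $job   job_type: c
box_name: $box
command: $cmd
machine: $machine
std_out_file: /var/logs/$job.out
std_err_file: /var/logs/$job.err
alarm_if_fail: 1
EOF
}

# Example: generate JIL for one child job of a nightly box; the output
# could then be loaded into Autosys with the jil utility.
gen_jil NIGHTLY_BOX LOAD_CUSTOMER /apps/etl/bin/load_customer.ksh etlhost01
```

In practice such a generator is driven by a job inventory file, so that hundreds of definitions can be produced and version-controlled consistently rather than hand-edited.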
As an ETL developer, my roles and responsibilities are:
- Involved as the primary on-site ETL developer during the analysis, planning, design, development and implementation stages of projects using IBM DataStage 8.5/9.1.
- Prepared data mapping documents (DMDs) and designed the ETL jobs based on the DMD with the required tables in the development environment.
- Actively participated in decision-making and QA meetings and regularly interacted with business analysts and the development team to better understand the business process, requirements and design.
- Used DataStage as an ETL tool to extract data from source systems and load it into an Oracle database.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
- Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
- Extensively worked with Join, Lookup (normal and sparse) and Merge stages.
- Extensively worked with Sequential File, Data Set, File Set and Lookup File Set stages.
- Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging purposes.
- Used DataStage Director and its run-time engine to schedule and run jobs, test and debug their components, and monitor the resulting executables on an ad hoc or scheduled basis.
- Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
- Split complex job designs into separate job segments executed through a job sequencer for better performance and easier maintenance.
- Created job sequences.
- Created shell scripts to run DataStage jobs from UNIX and scheduled these scripts through scheduling tools such as Autosys and Tivoli.
- Coordinated with and administered team members onsite and offshore.
- Analyzed performance and monitored workloads in support of capacity planning.
- Performed performance tuning of jobs by interpreting the performance statistics of the jobs developed.
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
- Participated in weekly status meetings.
- Developed a test plan that included the scope of the release, entrance and exit criteria and the overall test strategy. Created detailed test cases and test sets and executed them manually.
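Wrapper scripts like those described above typically call the DataStage `dsjob` command-line interface and translate its result into an exit status a scheduler such as Autosys can act on. A minimal sketch follows; the project and job names are hypothetical, and the exact `dsjob` return codes (commonly 1 for "finished OK" and 2 for "finished with warnings") should be confirmed against the DataStage version in use.

```shell
# Sketch of a wrapper that runs a DataStage job via dsjob and maps the
# result to an exit code a scheduler (e.g. Autosys) can interpret.
# Project/job names are hypothetical; return codes may vary by version.
run_ds_job() {
  project="$1"; job="$2"
  # -jobstatus makes dsjob wait for the job and report its finishing status
  dsjob -run -jobstatus "$project" "$job"
  rc=$?
  case $rc in
    1|2) echo "OK: $job finished (rc=$rc)"; return 0 ;;   # OK or warnings
    *)   echo "FAIL: $job returned rc=$rc" >&2; return 1 ;;
  esac
}

# Usage (hypothetical names):
# run_ds_job DW_PROJECT LoadCustomerDim
```

Returning a clean 0/1 exit status is what lets Autosys drive success/failure conditions and restarts for the job.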
Confidential
Description: Development and maintenance of various Autosys jobs, and migration of code across environments.
As a migration analyst, my roles and responsibilities were:
- Impact analysis of various Autosys jobs.
- Estimation of effort and delivery timelines.
- Development of UNIX scripts to generate Autosys JILs.
- Development of UNIX scripts to decommission old Autosys jobs.
- Updating ksh scripts as per the new server.
- Migration of Autosys jobs from the current system to the new system.
- Catering to client service requests, such as changing the max_run_alarm values of Autosys jobs and rearranging Autosys dependencies.
- Impact analysis on upstream/downstream systems during the migration to the new environment, and making the required script changes.
- Involved in unit testing and provided the required support to the business during UAT and bug fixing.
- Proactively involved in resolving post-production rollout issues.
As a production support executive, my roles and responsibilities were:
- Monitoring various Autosys jobs.
- Analyzing various Autosys failures and suggesting solutions so developers can permanently fix the production failures.
- Developing UNIX scripts to automate manual activities within the project.
- Developing Autosys jobs as per client requirements.
- Generating Autosys calendars for new Autosys jobs.
- Following up with upstream and downstream teams on various Autosys failures.
- Contacting different vendors to obtain files in cases of late receipt.
- Reviewing solutions for critical Autosys functionality.
- Fixing issues in the Production, UAT and Test environments.
- Providing proper handover to production support teams.
- Providing daily and weekly reports to clients.
Confidential
Description: I was involved in the migration of the Ranger fraud management system.
Roles and responsibilities:
- Data gathering and hardware resizing as per business projections.
- Development of new shell scripts and migration of existing shell scripts to the new server.
- Migration of data from the current system to the new system.
- Catering to client service requests such as data spooling and analysis.
- Impact analysis on upstream/downstream systems during the migration to the new environment, and making the required script changes.
- Involved in unit testing and provided the required support to the business during the UAT and soft launch phases of the project.
- Coordinated with the vendor in resolving major issues such as subscriber precheck and auto-threshold.
- Bug fixing.
- Proactively involved in resolving post-production rollout issues.
Confidential
Roles and responsibilities:
- Restructured the flow of CDRs in the system to increase processing performance and meet business SLAs.
- Changed the logic for spooling CDRs from the source system to improve performance and reduce file duplication during spooling.
- Developed scripts to populate the CDR processing speed on the dashboard.
- Automated the calling level and credit limit reconciliation processes.
- Received appreciation for improving the performance of business-critical scripts.
- Applied performance-engineering techniques to increase the throughput of procedures.
- Advised the business by providing reports for enabling events such as call velocity and call collision, which help catch cloning fraud.
- Identified problems in data coming from the source systems and coordinated their resolution.
- Tested releases supplied by the vendor before applying them to production.
- Developed the logic to separate PRI/BRI subscribers from normal subscribers.
- Involved in end-to-end patch release development and testing, including release engineering, internal testing of releases, coordinating with the business during UAT, vendor management, feasibility analysis, data gathering, design, development, preparation of test cases and release of change requests.
- Helped the client manage fraud in the telecom network; activities included maintaining the fraud management system server and software and providing back-end support for the team at the client side.
- Data gathering, testing and implementation of key new features such as cell phone cloning checks, invalid subscriber fraud management and international roaming fraud management.
- My prime concerns were releasing bug-free patches and preparing test scenarios.
- Developed performance-enhanced shell scripts to automate various tasks.
- Received client appreciation for automating most of the manual activities.
Major Achievements in Confidential Project
Confidential
Responsibilities:
- Data gathering and hardware resizing as per business projections.
- Impact analysis on upstream/downstream systems during the launch of Virgin Mobile on the TTSL system.
- Development of new scripts and changes to existing scripts as per system requirements.
- Catering to client service requests such as data spooling and analysis.
- Involved in unit testing, UAT and the soft launch phase of the project.
- Proactively involved in resolving post-production rollout issues.
Confidential
Responsibilities:
- Analyzing business projections and estimating hardware requirements.
- Data gathering for requirement analysis and implementation.
- Impact analysis on upstream/downstream systems and making the required script changes.
- Development of new scripts and procedures for launching the new circles.
- Involved in unit testing, UAT and the soft launch phase of the project.
- Bug fixing during the soft launch phase.
- Proactively involved in resolving post-production issues.
Confidential
Responsibilities:
- Analyzing the business requirement and checking its feasibility in the FMS system.
- Developed a data manipulator that transforms CDR fields into a system-readable format to meet the business requirements.
- This piece of code is a reusable component, which eliminated vendor expenses.