ETL Lead Resume


NC

Summary:

  • Over 7.5 years of IT experience at a highly reputed IT firm, with rich knowledge in estimation, analysis, design, improvement, implementation, control and testing of enterprise applications in the data warehousing environment using ETL tools.
  • Solid experience as an ETL developer and lead, loading data from various source systems into a single target delivery model using ETL utilities (Teradata and Informatica utilities), automating batch jobs and managing multi-level change releases in parallel.
  • Worked as an application DBA developing logical and physical data models, with knowledge of various data models and of designing ETL workflows for enterprise applications.
  • Worked as an Informatica administrator, helping with server maintenance, upgrades, version control, change control and migration of Informatica components across servers. Extensive knowledge of PowerCenter and PowerExchange, along with Designer, Workflow Manager and Monitor.
  • Sound knowledge in using batch scheduling tools such as Maestro and Autosys schedulers.
  • Experience in relational DBMSs such as Oracle, Sybase and SQL Server, and in the connection modes and drivers used to connect Informatica and other ETL tools to them.
  • Solid back end knowledge on Informatica admin setup and backup & restore activities.
  • Hands-on experience working with source control tools such as VSS, CVS and Rational ClearCase.
  • Strong knowledge of UNIX shell scripting and VI editor utilities.
  • Basic working knowledge of mainframe MVS and VM systems and File-AID commands.
  • Work experience in gathering requirements, documenting functional and non-functional requirements, effort estimation, design and design review, testing and preparing test cases, solving performance and technical issues, server migration activities, production implementation and post-production support.
  • Sound knowledge of different areas of banking such as investment, securities, consumer and retail banking. Have worked extensively on merger and acquisition tasks.
  • Strong database design skills. Experienced in logical database modeling using the Erwin data modeler. Experience in writing SQL queries, stored procedures and triggers.
  • Implemented automated testing and syntax checking tool using Visual Basic for ETL projects to reduce manual effort and error.
  • Motivated team player with the ability to work in challenging situations and thrive as a team leader, possessing good communication, interpersonal, analytical and problem-solving skills.
  • Highly adept at promptly and thoroughly mastering new technologies with a keen awareness of new industry developments and the evolution of next generation programming solutions.
  • Certified as a Teradata Master in the V2R5 series, with a few basic banking certifications.

Technical Skills:


Languages

C, UNIX Shell scripting, PL/SQL, SQL, VB

Data Warehousing/ETL Tools

Informatica Designer, Workflow Manager and Monitor, Teradata utilities (BTEQ, MultiLoad, FastLoad and FastExport), Teradata Index Wizard, Teradata Statistics Wizard, Teradata SQL Assistant, Informatica Data Profiler and PowerExchange.

RDBMS

Oracle, Teradata, SQL Server and NOMAD

Schedulers

Autosys and Maestro schedulers.

Design Tools

Erwin data modeler

Operating systems

Windows Vista/XP/2000, UNIX AIX, UNIX Solaris, MVS

Other Tools and Software Used

SASplex, star client, VM

Version control tools

VSS, ClearCase and Subversion.

Professional Experience:

Project: BOFA AML (Anti-Money Laundering) EPSS Nov 2011 – Present
Confidential, NC
Role: ETL Lead

Description:
The project involves processing system-generated and manually generated ALERTS (any suspicious transaction at the bank) from different detection channel sources and provides further details for investigators to analyze the ALERTS and file SARs with the US government as part of AML compliance. The job cycle runs on a daily basis to generate cases with collective details, de-duplicating, formatting and loading data into the target model for end users.
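The daily de-duplication step described above can be sketched in the UNIX shell scripting this role relies on. The file layout, field order and field names below are illustrative assumptions, not the project's actual formats:

```shell
#!/bin/sh
# Hypothetical sketch of the daily alert de-duplication step.
# Assumed layout: pipe-delimited records of alert_id|channel|date|detail.

# Sample input standing in for alerts landed from the detection channels.
cat > alerts_raw.dat <<'EOF'
A100|CHN1|20111105|wire transfer
A101|CHN2|20111105|cash deposit
A100|CHN3|20111106|wire transfer
EOF

# Keep only the latest record per alert_id: sort by id ascending and
# date descending, then let awk emit the first row seen for each id.
sort -t'|' -k1,1 -k3,3r alerts_raw.dat |
awk -F'|' '!seen[$1]++' > alerts_dedup.dat

cat alerts_dedup.dat
# A100|CHN3|20111106|wire transfer
# A101|CHN2|20111105|cash deposit
```

In practice the de-duplicated file would then be loaded into the target model with a Teradata utility such as MultiLoad or a BTEQ import.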

Responsibilities:

  • Daily meetings with the business team to understand the customer requirements in detail
  • Convert the requirements into documents with the help of business analysts and get sign-off from the business team on the requirements captured.
  • Provide collective estimates to the TDM and TDL using the standards of the ETL tools used.
  • Develop a detailed design document for the requirements after working with the ETL developers and architects.
  • Help the architect in developing the logical and physical data models.
  • Work closely with the end-user team to understand their reporting needs and ensure compatibility with the target model.
  • Work with ETL developers to check for the compliance of scripts with standards and other performance impacting factors.
  • Work with the source systems and downstream users on install dates and shared responsibilities.
  • Work with the change advisory team/administrators to install the code components, DDL, automation scripts and other profile setups in the production environment.
  • Validate the data loaded in production, provide warranty support, respond to user/business queries and fix production issues if any.
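The production validation in the last bullet can be sketched as a simple record-count reconciliation. File names are hypothetical, and the target count would in practice come from a BTEQ `SELECT COUNT(*)` against the loaded table:

```shell
#!/bin/sh
# Illustrative post-load validation: compare the source extract's record
# count against the count loaded into the target. The extract file and
# the hard-coded target count stand in for real production values.

printf 'r1\nr2\nr3\n' > source_extract.dat   # stand-in source extract
target_count=3                               # stand-in for the BTEQ result

src_count=$(wc -l < source_extract.dat)

if [ "$src_count" -eq "$target_count" ]; then
    echo "LOAD OK: $src_count records"
else
    echo "LOAD MISMATCH: source=$src_count target=$target_count" >&2
    exit 1
fi
```

A non-zero exit status lets a scheduler such as Autosys flag the load for warranty support.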

Environment: SASplex, Teradata, Shell Scripts, BTEQ queries, Autosys, Oracle 9i, PL/SQL, Subversion.

Project: GCIB (Global corporate and investment banking) Nov’09 – Oct’11
Confidential, NC
Role: Technology Lead

Description:
This is an Enterprise Information Management (EIM) project for the GCIB Resource Pool supporting a crucial arm of the bank's data warehouse (The W). GCIB on ‘The W’ supports the GCIB profitability engine, sales, revenues, financial reporting and commercial accounts. It is the lone decision support information provider for the majority of Global, Corporate and Commercial applications. This system operates on monthly, weekly and daily processing cycles. The GCIB on ‘The W’ data warehouse is primarily a feeder data store that performs transformation routines and then provides other systems with data extracts, database links and source data pulls.

Responsibilities:

  • Requirements gathering for various business partners' ordering processes
  • Daily meetings with all the business partners to understand their current business processes
  • Acquire technical knowledge about their current system
  • Suggest modifications to improve their current business processes based on the project technology
  • Document the as-is and to-be processes and get confirmation from the concerned stakeholders
  • Analyze the requirements along with technical aspect and prepare detailed user story
  • Provide gap analysis between the existing requirements and the to-be requirements
  • Work closely with the functional analyst and the customers to implement the requirements
  • Educate the customers on the system, help them with testing and work with the team to successfully move the changes into production
  • Provide UAT test scripts and conduct UAT sessions and help the business users test every scenario
  • Provide support for moving the code into production and also involve in post production support
  • Analyze any existing production issues that come up by following the entire life cycle process for the same.

Environment: Teradata, Informatica, UNIX Solaris, ETL utilities (MLoad, TPump, BTEQ scripts), SQL Assistant, Mainframes (JCL), CA7, Maestro, Autosys Scheduler.

Project: ETL COE May’07 – Nov’09
Confidential, NC
Role: Technology Lead

Description:
This was an Enterprise Information & Analytics (EIA) project supporting and maintaining midrange ETL servers and migration activities. In the bank, most EIM and non-EIM projects use Informatica and UNIX components for sourcing, transforming and loading tasks. Migration activities involve change control, access control and process control activities. The project also involved administrative activities such as repository backup and restoration, decommissioning the AIX and Wintel environments and setting up the SUN environment.

Responsibilities:
  • Worked on migration of applications from Informatica 7.1 to Informatica 8.1, and from the AIX server to the SUN Solaris server.
  • Helped the change control and release management team in migrating Informatica components and UNIX scripts.
  • Worked on script automation that helped migrate UNIX scripts from the AIX UNIX server to SUN Solaris.
  • Involved in performance tuning and support of jobs and scripts consuming high CPU volume.

  • Conversion of BTEQ scripts from the Maestro scheduler to the Autosys scheduler tool.

Environment: Informatica, UNIX AIX, Oracle/Teradata, UNIX Solaris, SQL Loader, Maestro, Autosys, Mainframes (JCL), CA7
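The Maestro-to-Autosys conversion above amounts to re-expressing each scheduled BTEQ job as an Autosys JIL definition. A minimal, hypothetical example (the job name, machine and script path are illustrative, not from the project):

```
/* Hypothetical Autosys JIL for a converted daily BTEQ load job */
insert_job: aml_bteq_load   job_type: c
machine: etl_sun01
command: /opt/app/scripts/run_bteq_load.sh
owner: etlbatch
date_conditions: 1
days_of_week: all
start_times: "02:00"
alarm_if_fail: 1
```

Dependencies that Maestro expressed as schedule links would become `condition: success(...)` clauses on the downstream jobs.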

Project: Wholesale Credit Risk Model May’05 – Apr’07
Confidential, India
Role: Module Lead

Description:
It was a Global Compliance Operational Risk Technology (GCORT) project to form a resource pool for performing SQA-related activities, application delivery activities and system testing activities. It included software engineering processes, assistance in CMM, SSP & Six Sigma processes, metrics collection & gap analysis, performing SQA activities & reviews, defining, modifying and tailoring processes, system design & reviews, creation/automation of test scripts, documenting & analyzing test results, and stress/volume and usability testing.

Responsibilities:

  • Preparation of a system appreciation document with an analysis of the existing mapping code, and obtaining confirmation from the onsite team
  • Identifying the performance bottlenecks based on the table used and the volume of data
  • Preparation of a complete test plan covering all the business scenarios
  • Coding and Unit testing of the application
  • Integration testing of the module before delivery to onsite
  • Preparation of test scripts to perform load testing on the application with various user loads
  • Recorded the application's throughput for varying loads and performed performance tuning
  • The entire process was documented and shared with the team

Environment: Teradata V2R5, Teradata utilities, Sybase, Oracle, Maestro.