Development Lead Resume

Lebanon, New Jersey

SUMMARY

  • 11 years of Overall IT Experience in Data Profiling, Analyzing, Designing, Developing, Testing, Maintaining and Supporting applications in Data Warehousing, Data Integration and Data Migration for Insurance and Banking Industries
  • Strong understanding of Data Warehousing concepts, including Star/Snowflake Dimensional Modeling, Fact tables and Dimension tables
  • Skilled in the Design and Development of Data Lakes using Confidential Big Integrate 11.7/11.5 and Hortonworks Hadoop with Hive and HDFS
  • Worked extensively on various versions of Confidential Infosphere Datastage 11.7/11.5/9.1/8.5/8.1
  • Developed Parallel Jobs using various stages like Confidential connector, ODBC Connector, Stored Procedure, Sequential File, Dataset, External source, SAP ABAP Extract, SAP BAPI Load, Hive Connector, File Connector, Excel, Transformer, Join, Lookup, Merge, Sort, Remove duplicates, Copy, Filter, Funnel, Modify, Aggregator, Change Data Capture and SCD Stages
  • Developed Sequence Jobs using various stages like Job Activity, Execute command, Nested condition, User variables, Exception Handler, Terminator, Start Loop, End Loop and Email notification activities
  • Experienced in using Datastage Server Routines & Shared containers
  • Competent in Data Integration from disparate Data sources like Hive tables, HDFS files, Confidential, SQL Server, Flat files, CSV files, XML files and Mainframe Binary files
  • Experienced in migrating jobs from Datastage 11.5 to 11.7, 9.1 to 11.5 and 8.1 to 8.5
  • Expertise in Troubleshooting Datastage jobs and Performance Tuning
  • Proficient in writing HQL Queries, SQL Queries, PL/SQL Stored Procedures, Functions, Cursors, Views, Triggers, Materialized Views and SQL Performance Tuning
  • Experienced in creating Shell scripts for Preprocessing, File manipulation, Master scripts for calling Sequence jobs, and automating Export and Import functionalities (a minimal master-script sketch follows this summary)
  • Used Datastage Director, Crontab and ESP Scheduler for scheduling the Batch processing loads
  • Used SVN and TFS for maintaining the different versions of Datastage jobs
  • Experienced in creating and scheduling Parameterized Reports, Tablix Reports, Matrix Reports, Linked Reports, Charts and Ad hoc Reports in SSRS
  • Experienced in projects adopting Waterfall and Agile - Scrum methodologies of SDLC
  • Expertise in using tools such as Visio, Rally, JIRA and Confidential Quality Center for Project Management and Defect Management
  • Designed and Implemented generic Audit/Reconciliation and Error Handling Frameworks
  • Established Standards and Best Practices for Continuous Process Improvement
  • Able to work both independently and as part of a team; Highly Organized, Collaborative, Reliable and Transparent in Communication, with strong Problem-solving skills
  • Experienced in Onsite-Offshore Coordination, with strong Client-facing and Team-leading skills
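
As an illustration of the master scripts and batch scheduling noted above (not tied to any specific engagement), the following is a minimal sketch that runs a DataStage Sequence job through the dsjob command-line interface. The project name, Sequence job name, parameter and install path are placeholders, and option spellings and status codes can vary by Datastage version.

#!/bin/sh
# Minimal master-script sketch (hypothetical names and paths).
# Sources the DataStage environment, runs a Sequence job with dsjob,
# and maps the job status code to a shell exit code.

. /opt/IBM/InformationServer/Server/DSEngine/dsenv   # install path varies

PROJECT="DW_PROJECT"           # placeholder project
SEQUENCE="Seq_Master_Load"     # placeholder Sequence job
RUN_DATE=$(date +%Y%m%d)

# -run starts the job; -jobstatus waits and sets the exit code to the job status
dsjob -run -jobstatus -param pRunDate="$RUN_DATE" "$PROJECT" "$SEQUENCE"
STATUS=$?

# By DataStage convention, 1 = ran OK and 2 = finished with warnings
if [ "$STATUS" -eq 1 ] || [ "$STATUS" -eq 2 ]; then
    echo "$SEQUENCE completed with status $STATUS"
else
    echo "$SEQUENCE failed with status $STATUS" >&2
    exit 1
fi

Using -jobstatus keeps scheduler integration simple, since Crontab or the ESP Scheduler only needs to inspect the script's exit code.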

TECHNICAL SKILLS

Concepts: Data warehousing, SDLC - Waterfall & Agile - Scrum

BI Tools: Confidential Infosphere Datastage 11.7/11.5/9.1/8.5/8.1, SSRS

Big Data: Confidential Big Integrate, Hive, HDFS

Relational Databases: Confidential 12c/11g/10g, SQL Server 2016/2012

Languages: SQL, PL/SQL, Shell Scripting

Schedulers: ESP Scheduler, Crontab

Versioning Tools: Tortoise SVN, Team Foundation Server (TFS)

Other Tools: Visio, Rally, JIRA, Confidential Quality Center

PROFESSIONAL EXPERIENCE

Confidential, Lebanon, New Jersey

Development Lead

Responsibilities:

  • Works directly with the Business Users/Business SMEs/Client IT Personnel to gather the Requirements
  • Translates Business Requirements into Technical flows for development by offshore teams
  • Provides L0 Estimations and Project Estimations to the client
  • Plans the Sprint activities, Grooms the ETL User stories and other dependent teams' User stories & coordinates the work with offshore teams
  • Collaborates closely with multiple business stakeholders and teams for meeting the deliverables
  • Establishes Standards and Best Practices for Continuous Process Improvement
  • Performs Code Reviews & provides Review comments to the team based on the Standards & Best Practices
  • Helps the team to tune the long running Datastage jobs
  • Provides daily scrum updates to Client Technical Leads & Biweekly status updates to Client Managers
  • Creates Implementation-Verification-Backout plans & presents them to Enterprise Change Advisory Boards for Production migrations
  • Coordinates the Production Deployments with the Implementation teams
  • Provides KT to Application Support and Maintenance teams, along with Shadow support through the warranty period

Confidential, Lebanon, New Jersey

Senior ETL Developer

Responsibilities:

  • Worked directly with the Business Users/Business SMEs/Client IT Personnel to gather the Requirements
  • Created Technical specification and Flow documents & walked the clients through them for approval before implementation
  • Developed Datastage Parallel & Sequence jobs based on the Technical specifications and Mapping documents
  • Used Shared Containers for reusable logics & Server Routines for Audit functionalities
  • Created Unit Test Cases & documented the Unit Test Results
  • Worked with QA & End-to-End Testing Teams in Model & Preprod Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
  • Provided daily status updates to Technical & Business stakeholders
  • Created Release documents & worked with the Client Release Management Teams for Production migrations

Technical Lead

Confidential

Responsibilities:

  • Worked with Onshore Business Analysts & Architects to understand the Business requirements
  • Worked closely with Architects to convert Business requirements into ETL frameworks & wrote detailed Technical specification of all modules
  • Created High-Level and Low-level ETL Design/Flow Diagrams in Visio
  • Developed Datastage jobs for Control Architecture, Framework Modules & Business Critical Modules
  • Performed Code Reviews & provided Review comments to the team based on the Standards & Best Practices
  • Helped the team to tune the long running Datastage jobs
  • Resolved the technical challenges of the team members & provided feedback
  • Reviewed the Unit Test Cases & performed Technical/Functional Testing
  • Provided daily status updates to Onshore Internal Teams
  • Conducted Code Walkthroughs with Onshore Leads & Architects
  • Created Release documents & reviewed with Onshore Teams for Production migrations
  • Provided KT to External Vendor Support teams, along with Shadow support through the warranty period

Associate Consultant

Confidential

Responsibilities:

  • Analyzed the existing Data model supporting the Brokerage requirements identified for Reverse engineering
  • Created Parameterized Reports, Tablix Reports, Matrix Reports, Linked Reports, Charts, Ad hoc Reports & Stored Procedures in SQL Server to simulate the existing Hyperion Reports
  • Created Shared Schedule Subscriptions & Report Specific Subscriptions to schedule the SSRS Reports
  • Extensively tuned Queries & Stored Procedures in SQL Server so the SSRS Reports generate faster
  • Compared the Existing Hyperion Reports with the New SSRS Reports and reported the gaps in the Requirements/Data model promptly
  • Worked directly with Business SMEs to match the data from New SSRS Reports with the Existing Hyperion Reports
  • Provided daily status updates to Onshore Internal Teams
  • Provided KT to External Vendor Support teams, along with Shadow support through the warranty period

Associate Consultant

Confidential

Responsibilities:

  • Worked directly with the Business Users/Business SMEs/Client IT Personnel to gather the Requirements
  • Created Technical specification documents & walked the clients through them for approval before implementation
  • Developed Datastage Parallel & Sequence jobs based on the Technical specifications
  • Used SVN for maintaining the Datastage job versions
  • Walked the Clients through the Datastage jobs once SIT Testing was completed
  • Used Crontab for scheduling the jobs in Lower Environments (a sample crontab entry follows this list)
  • Worked with QA & Business Teams in SIT & UAT Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
  • Provided daily & weekly status updates to Technical & Business stakeholders
  • Created Release documents & worked with the Client Release Management Teams for Production migrations
  • Provided KT to Client Production support teams, along with Shadow support through the warranty period
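
For illustration only, a sample crontab entry of the kind used for lower-environment scheduling; the script path, log path and schedule are placeholders.

# Added via: crontab -e
# Run the (hypothetical) master load script at 02:30 on weekdays,
# appending output to a dated log (% must be escaped inside crontab)
30 2 * * 1-5 /opt/etl/scripts/run_master_load.sh >> /opt/etl/logs/master_load_$(date +\%Y\%m\%d).log 2>&1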

Senior Software Engineer

Confidential

Responsibilities:

  • Involved in gathering the Requirements with Onshore Teams
  • Created High-Level & Low-Level Design documents based on the requirements
  • Created a generic Purge framework covering all required tables, with Purge durations configurable per table
  • Developed Datastage Parallel & Sequence jobs based on the Low-Level Design document of this module
  • Did Peer reviews of the Datastage jobs
  • Used SVN for maintaining the Datastage job versions
  • Used Datastage Director for scheduling the jobs in Lower Environments
  • Maintained the Target tables in DEV, SIT & UAT Environments
  • Worked with QA & Business Teams in SIT & UAT Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
  • Created Shell scripts for automating the Export & Import functionalities and a Master script for calling the Datastage sequences (an export-automation sketch follows this list)
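
A minimal sketch of the export half of that automation, assuming the istool command-line client shipped with Information Server; the host, credentials, project and paths are placeholders, and flag names can differ between versions.

#!/bin/sh
# Hypothetical export-automation sketch: archives all jobs in one DataStage
# project into a dated .isx file using istool (flags vary by version).

ISTOOL=/opt/IBM/InformationServer/Clients/istools/cli/istool   # typical client path
DOMAIN="services-host:9443"     # placeholder services tier host:port
ENGINE="engine-host"            # placeholder engine tier host
PROJECT="DW_PROJECT"            # placeholder project
ARCHIVE="/opt/etl/exports/${PROJECT}_$(date +%Y%m%d).isx"

# Credentials would normally come from a protected file, not the script itself
"$ISTOOL" export \
    -domain "$DOMAIN" -username dsadm -password '******' \
    -archive "$ARCHIVE" \
    -datastage "\"$ENGINE/$PROJECT/*/*.*\"" \
  && echo "Exported $PROJECT to $ARCHIVE" \
  || { echo "Export of $PROJECT failed" >&2; exit 1; }

An equivalent istool import call against the target project would complete the round trip.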

Senior Software Engineer

Confidential

Responsibilities:

  • Worked with Offshore/Onshore leads to understand the Reporting requirements & helped in building the Dimensional model
  • Created High-Level & Low-Level Design documents based on the Dimensional model
  • Developed Datastage Parallel & Sequence jobs based on the Low-Level Design document of this module
  • Did Peer reviews of the Datastage jobs
  • Automated Unit Testing using PL/SQL Cursors & Procedures to perform data validations based on the requirements (a minimal validation sketch follows this list)
  • Used DB Links to access tables in other schemas for Validation purposes
  • Used DB Triggers for recording the Operational/Audit Information
  • Created and Maintained Views & Materialized Views for the Reporting teams to extract the data from the Dimensional model
  • Maintained the Target tables in DEV, SIT & UAT Environments
  • Worked with QA & Business Teams in SIT & UAT Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
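
A minimal sketch of that validation style, assuming SQL*Plus, a hypothetical dim_policy target table and a policy_src table reached over a DB link; every object name and the connect string are placeholders.

#!/bin/sh
# Hypothetical unit-test sketch: a PL/SQL block run through SQL*Plus uses a
# cursor loop to flag target rows whose attributes differ from the source
# table reached over a DB link. All names are placeholders.

DB_CONN="dw_user/secret@DWDB"    # placeholder connect string

sqlplus -s "$DB_CONN" <<'EOF'
SET SERVEROUTPUT ON
DECLARE
  v_mismatches NUMBER := 0;
BEGIN
  FOR r IN (SELECT d.policy_id
              FROM dim_policy d
              JOIN policy_src@src_link s ON s.policy_id = d.policy_id
             WHERE NVL(d.policy_status, 'X') <> NVL(s.policy_status, 'X'))
  LOOP
    v_mismatches := v_mismatches + 1;
    DBMS_OUTPUT.PUT_LINE('Mismatch for policy_id=' || r.policy_id);
  END LOOP;

  IF v_mismatches = 0 THEN
    DBMS_OUTPUT.PUT_LINE('PASS: checked attributes match the source');
  ELSE
    DBMS_OUTPUT.PUT_LINE('FAIL: ' || v_mismatches || ' mismatched rows');
  END IF;
END;
/
EXIT
EOF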

Engineer

Confidential

Responsibilities:

  • Developed Datastage Parallel & Sequence jobs based on the Low-Level Design document of this module
  • Used different Parallel job stages such as Confidential connector, ODBC Connector, Transformer, Join, Lookup, Merge, Sort, Remove duplicates, Copy, Filter, Funnel, Modify, Aggregator, Change Data Capture & SCD Stages
  • Used different Sequence job stages such as User Variables, Job Activity, Execute Command, Start Loop, End Loop, Nested condition, Email notification activity, Terminator
  • Created SQL Scripts for data validation
  • Created Unit Test Cases based on the requirements & documented the Test Results
  • Maintained the Target tables in DEV & SIT Environments
  • Migrated the Datastage jobs from DEV to SIT and to UAT Environments as DSX files
  • Worked with QA Teams in SIT & UAT Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
  • Monitored the Daily batch loads in Production

Associate Engineer

Confidential

Responsibilities:

  • Developed Datastage Parallel & Sequence jobs based on the Low-Level Design document of this module
  • Used Datastage Director for Running & Monitoring the jobs including logs
  • Created SQL Scripts for Count & Data validation of all tables moved from the Application Server to the Data Server (a count-validation sketch follows this list)
  • Migrated the Datastage jobs from DEV to SIT and to UAT Environments as DSX files
  • Worked with QA Teams in SIT & UAT Environments on Testing support activities such as performing Data loads, fixing issues & reprocessing the data loads
  • Monitored the Daily batch loads in Production
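
A minimal count-validation sketch in that spirit, assuming SQL*Plus and a DB link (here called appsrv_link) from the data-server schema back to the application-server schema; the connect string and table names are placeholders.

#!/bin/sh
# Hypothetical count-validation sketch: compares row counts for a list of
# tables between the application-server schema (reached over a DB link)
# and the local data-server schema. All names are placeholders.

DB_CONN="dw_user/secret@DATASRV"    # placeholder connect string
TABLES="CUSTOMER POLICY CLAIM"      # placeholder table list

for TBL in $TABLES; do
  sqlplus -s "$DB_CONN" <<EOF
SET HEADING OFF
SET FEEDBACK OFF
SET PAGESIZE 0
SELECT '$TBL: source=' || src.cnt || ' target=' || tgt.cnt ||
       CASE WHEN src.cnt = tgt.cnt THEN ' MATCH' ELSE ' MISMATCH' END
  FROM (SELECT COUNT(*) AS cnt FROM $TBL@appsrv_link) src,
       (SELECT COUNT(*) AS cnt FROM $TBL) tgt;
EXIT
EOF
done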
