
Application Support Lead Resume


SUMMARY

  • A dynamic, skilled professional with over 9 years of experience in the data warehousing domain.
  • 5 years of hands-on experience with Teradata and Ab Initio.
  • 2 years of comprehensive experience in Big Data analytics.
  • Good knowledge of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce.
  • Experience in using Hive, Pig, Sqoop and Manager.
  • Experience in importing and exporting data between HDFS and relational database systems using Sqoop (see the sketch after this list).
  • Extended Hive functionality by writing custom UDFs.
  • Experience in analyzing data using HiveQL, Pig Latin, and MapReduce.
  • Knowledge of job workflow scheduling and monitoring tools like Oozie.
  • Expertise in writing UNIX shell scripts from scratch and troubleshooting them.
  • Expertise in writing SQL queries using Teradata.
  • Worked extensively on Ab Initio graphs and Teradata utilities such as MLOAD, TPUMP, FASTLOAD, and FASTEXPORT, writing SQL queries and loading data into data warehouses/data marts.
  • Secondary skill set includes GoldenGate ETL and Netezza.
  • Extensive experience in ETL analysis, design, development, testing, implementation, maintaining standards, quality audits, performance tuning, job automation, and maintenance and support of various applications.
  • Prepared neat, detailed HLD, LLD, and run book documents.
  • Excellent skills in a wide variety of technologies and a proven ability to quickly learn new programs and tools.
  • Very good communication skills and quick adaptability to new technologies and working environments.
  • Excellent organizational skills and ability to prioritize workload.
  • Delivered training sessions on UNIX, GoldenGate ETL, Teradata, and Ab Initio to entry-level trainees.
  • 8+ months of client-facing experience in the UK and 11+ months in Riyadh, Saudi Arabia.
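A minimal sketch of the kind of Sqoop transfer mentioned above, assuming a generic JDBC source; the connection string, credentials, table names, and HDFS paths are illustrative placeholders, not values from any actual engagement.

    #!/bin/sh
    # Hypothetical connection details -- placeholders only.
    DB_URL="jdbc:mysql://dbhost:3306/sales"
    DB_USER="etl_user"

    # Import a relational table into HDFS as tab-delimited text.
    sqoop import \
      --connect "$DB_URL" \
      --username "$DB_USER" -P \
      --table orders \
      --target-dir /data/staging/orders \
      --fields-terminated-by '\t' \
      --num-mappers 4

    # Export processed results from HDFS back to a relational table.
    sqoop export \
      --connect "$DB_URL" \
      --username "$DB_USER" -P \
      --table orders_summary \
      --export-dir /data/refined/orders_summary \
      --input-fields-terminated-by '\t'

Splitting the import across four mappers parallelizes the JDBC reads, and -P prompts for the password instead of embedding it in the script.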

TECHNICAL SKILLS

Business Areas: Banking and Financial services, Telecom services

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume

ETL Tools: Ab Initio and GoldenGate

Data warehousing: Teradata and Netezza

Operating System: Linux, UNIX, Windows

Databases: Teradata, Netezza, Oracle

Scheduling Tools: Control-M, TWS, Autosys

Languages: UNIX shell scripting, PL/SQL, and the Teradata FLOAD and MLOAD utilities

PROFESSIONAL EXPERIENCE

Senior Developer

Confidential

Responsibilities:

  • Prepared LLD documents for the MTP project with help from the design team.
  • Developed GoldenGate code and created configuration scripts to interact with the target server.
  • Applied table mappings and filter conditions per business requirements in the GoldenGate environment.
  • Developed Ab Initio graphs to load source feeds into the Teradata database.
  • Developed shell scripts to handle ad-hoc requests and automate processes.
  • Developed Ab Initio graphs to transfer flat files to the Hadoop file system.
  • Created Control-M jobs to execute scripts/graphs at defined times.
  • Developed MapReduce programs using combiners and custom partitioners to parse raw data, populate staging tables, and load refined data into partitioned tables for all domains (see the sketch after this list).
  • Involved in creating Hive tables and in loading and analyzing data using Hive queries.
  • Tested raw data and executed performance scripts.
  • Wrote Hive queries for analysis and reporting across different streams in the company.
  • Supported code/design analysis, strategy development and project planning.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
  • Developed SQL statements to improve back-end communication.
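The MapReduce programs above would typically be Java, but the combiner and custom-partition idea can be sketched from the shell with Hadoop Streaming; the mapper/reducer script names, paths, and key layout below are hypothetical.

    #!/bin/sh
    # Streaming job: parse raw feed records and pre-aggregate with a
    # combiner. KeyFieldBasedPartitioner routes records to reducers by
    # the first of two key fields, standing in for a custom partitioner.
    hadoop jar "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-streaming-*.jar \
      -D stream.num.map.output.key.fields=2 \
      -D mapreduce.partition.keypartitioner.options=-k1,1 \
      -D mapreduce.job.reduces=4 \
      -input /data/raw/feeds \
      -output /data/staging/feeds_parsed \
      -mapper parse_raw.sh \
      -combiner aggregate.sh \
      -reducer aggregate.sh \
      -partitioner org.apache.hadoop.mapred.lib.KeyFieldBasedPartitioner \
      -file parse_raw.sh -file aggregate.sh

Because the combiner runs on map output before the shuffle, it cuts the data sent to reducers, while the partitioner keeps all records for a given key on the same reducer.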

Environment: GoldenGate, Ab Initio ETL, Netezza, Hadoop, MapReduce, Hive, Sqoop, HBase (NoSQL database), Java 1.6, and UNIX shell scripting.

Senior Developer

Confidential

Responsibilities:

  • Prepared LLD documents for the CLDM12 project with help from the design team.
  • Developed GoldenGate code and created configuration scripts to interact with the target server.
  • Applied table mappings and filter conditions per business requirements in the GoldenGate environment.
  • Developed Ab Initio graphs to load source feeds into the Teradata database.
  • Developed shell scripts to handle ad-hoc requests and automate processes.
  • Developed Ab Initio graphs to transfer flat files to the Hadoop file system.
  • Created TWS (Tivoli Workload Scheduler) jobs to execute scripts/graphs at defined times.
  • Developed MapReduce programs using combiners and custom partitioners to parse raw data, populate staging tables, and load refined data into partitioned tables for all domains.
  • Involved in creating Hive tables and in loading and analyzing data using Hive queries (see the sketch after this list).
  • Tested raw data and executed performance scripts.
  • Wrote Hive queries for analysis and reporting across different streams in the company.
  • Supported code/design analysis, strategy development and project planning.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
  • Developed SQL statements to improve back-end communication.
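A minimal sketch of the Hive table creation, loading, and analysis described above; the table, columns, and dates are invented for illustration.

    #!/bin/sh
    # Create a partitioned Hive table, load one day's staged file,
    # and run a simple per-stream reporting query.
    hive -e "
    CREATE TABLE IF NOT EXISTS txn_staging (
      txn_id  STRING,
      stream  STRING,
      amount  DOUBLE
    )
    PARTITIONED BY (business_date STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

    LOAD DATA INPATH '/data/staging/txn/2014-06-01'
      INTO TABLE txn_staging PARTITION (business_date = '2014-06-01');

    -- Totals per stream for the day, for reporting.
    SELECT stream, COUNT(*) AS txn_count, SUM(amount) AS total_amount
    FROM txn_staging
    WHERE business_date = '2014-06-01'
    GROUP BY stream;
    "

Partitioning by business date lets the reporting query prune to a single day's data instead of scanning the whole table.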

Environment: Hadoop, MapReduce, Hive, Sqoop, HBase (NoSQL database), GoldenGate, Teradata, Ab Initio ETL, and UNIX shell scripting.

Hadoop/Teradata Developer

Confidential

Responsibilities:

  • Gathered business requirements from business partners.
  • Loaded files to HDFS and wrote Hive queries to process the required data.
  • Worked on setting up Hadoop across multiple nodes and on designing and developing MapReduce jobs.
  • Involved in installing Hadoop Ecosystem components.
  • Supported MapReduce programs running on the cluster.
  • Involved in HDFS maintenance and loading structured and unstructured data.
  • Worked as a team member on the Statement module.
  • Involved in integration and unit testing.
  • Interacted with clients to gather and clarify requirements.
  • Imported data from Teradata to HDFS using Sqoop on a regular basis (see the sketch after this list).
  • Developed scripts and batch jobs to schedule various Hadoop programs.
  • Wrote Hive queries for data analysis to meet the business requirements.
  • Created Hive tables and worked with HiveQL.
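A sketch of the regular Teradata-to-HDFS pull described above, assuming the Teradata JDBC driver is available on Sqoop's classpath; the host, database, table, and path names are placeholders.

    #!/bin/sh
    # Nightly load: stage incoming flat files into HDFS, then import a
    # Teradata table with Sqoop via the generic JDBC driver.
    DAY=$(date +%Y-%m-%d)

    # Stage the day's flat files into HDFS.
    hadoop fs -mkdir -p "/data/incoming/$DAY"
    hadoop fs -put /landing/files/*.dat "/data/incoming/$DAY/"

    # Import a Teradata table into HDFS.
    sqoop import \
      --driver com.teradata.jdbc.TeraDriver \
      --connect "jdbc:teradata://td-host/DATABASE=edw" \
      --username etl_user -P \
      --table CUSTOMER_TXN \
      --target-dir "/data/teradata/customer_txn/$DAY" \
      --num-mappers 4

In practice such a script would be invoked by the scheduler (TWS here) rather than run by hand, so failures surface through the batch monitoring.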

Environment: Ab Initio ETL, Teradata, Oracle PL/SQL, TWS, and UNIX shell scripting.

Application Support Lead

Confidential

Responsibilities:

  • Automated summarized TWS job-detail reports on daily, weekly, monthly, quarterly, and yearly schedules, with statuses delivered by email and through Cognos reporting for wider use.
  • Developed TWS menu-option scripts that handle the TWS activities required by the L2 team; these are especially useful for new TWS users.
  • Developed code to report server space utilization.
  • Developed code for daily and weekly report generation, late-running jobs, frequently abending jobs, etc. (see the sketch after this list).
  • Developed a Western Europe application status report covering the business date and the numbers of files received, missing, bad, and correct.
  • This report helps analyze issues before the batch cycle starts and reduces incidents.
  • Provided complete information in a short time frame.
  • Saved business hours and human effort.
  • Reduced human error.
  • Reduced process complexity.
  • Freed team resources for other and new requests.
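A rough sketch of the kind of TWS summary automation described above, using the conman command-line interface; conman's output layout varies across TWS versions, so the grep-based counting, the wildcard job selector, and the mail address are illustrative only.

    #!/bin/sh
    # Summarize today's TWS job states and list abended jobs, then
    # mail the report to the support team (placeholder address).
    REPORT=/tmp/tws_daily_summary.$$

    {
      echo "TWS job summary for $(date +%Y-%m-%d)"
      for state in SUCC EXEC HOLD ABEND; do
        n=$(conman "sj @#@.@" 2>/dev/null | grep -c " $state")
        echo "$state: $n"
      done
      echo
      echo "Abended jobs:"
      conman "sj @#@.@" 2>/dev/null | grep " ABEND"
    } > "$REPORT"

    mailx -s "TWS daily job summary" l2-team@example.com < "$REPORT"
    rm -f "$REPORT"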

Environment: Ab Initio ETL, Teradata, Oracle PL/SQL, TWS, and UNIX shell scripting.

Teradata Developer

Confidential

Responsibilities:

  • Gathered business requirements from business partners.
  • Developed code using UNIX shell scripts and Teradata utilities (see the sketch after this list).
  • Developed code for client ad-hoc requests.
  • Tested the code in all environments (Dev, SIT, UAT, OAT, and PROD) and promoted it to production.
  • Worked on code maintenance and performance tuning.
  • Created Autosys jobs to execute the code at defined times.
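A minimal sketch of a shell wrapper around one of the Teradata utilities named above (FastLoad); the TDPID, credentials, table, and input file are placeholders, and a real job would read the logon from a protected file rather than inline.

    #!/bin/sh
    # Load a pipe-delimited flat file into a staging table with FastLoad.
    fastload <<'EOF'
    LOGON tdprod/etl_user,secret;
    DATABASE stg;
    BEGIN LOADING customer_txn_stage
      ERRORFILES customer_txn_err1, customer_txn_err2;
    SET RECORD VARTEXT "|";
    DEFINE txn_id (VARCHAR(20)),
           amount (VARCHAR(20))
    FILE=/landing/customer_txn.dat;
    INSERT INTO customer_txn_stage (txn_id, amount)
      VALUES (:txn_id, :amount);
    END LOADING;
    LOGOFF;
    EOF

    # FastLoad returns non-zero on failure; surface it to the scheduler.
    exit $?

With VARTEXT input all fields are defined as VARCHAR, and the two error tables capture rejected rows for later analysis.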

Environment: Teradata, Oracle PL/SQL, TWS, and UNIX shell scripting.
