Teradata Data Analyst Resume
PROFESSIONAL SUMMARY:
- Overall 5 Years of Experience in IT Design, Development, Testing and Production Support.
- 4 Years of Experience in Data Analysis, Report Generation, Maintenance of Business Report Processes and Data Verifications and Validations.
- Experienced with different Relational databases like Teradata, Oracle and SQL Server.
- Experienced in Interacting with Users, Analyzing client business processes, Documenting business requirements, Performing Design Analysis and Developing Design Specifications.
- Experienced in Financial (Credit card and Banking) and Health Insurance Domains.
- Extensive SQL experience in querying, data extraction and data transformations.
- Extensive experience working as a Dual Controller on various business projects that require dual data validation and data consistency.
- Experienced in writing numerous SQL queries and performing ad hoc analysis.
- Experienced in developing business reports by writing complex SQL queries using views, macros, volatile and global temporary tables (see the macro sketch at the end of this summary).
- Experienced in creating UNIX Korn shell scripts for batch jobs.
- Experienced in Automating and Scheduling the Teradata SQL Scripts in UNIX using Korn Shell scripting.
- Experienced in transferring files from Windows servers to UNIX and mainframe servers using UNIX FTP commands.
- Experienced in working on large volumes of data using Teradata SQL and Base SAS programming.
- Experienced in extracting data from mainframe flat files and converting them into Teradata tables using SAS PROC IMPORT, PROC SQL, etc.
- Strong experience in working with MS Access, MS Excel and MS PowerPoint.
- Good experience in production support, identifying root causes, troubleshooting and submitting change controls.
- Experienced in handling all the domain and technical interaction with application users, analyzing client business processes, documenting business requirements.
- Possess strong analytical and problem-solving skills and learn quickly. Committed team player, capable of working on tight project delivery schedules and deadlines.
- Experienced in writing Design Documents, System Administration Documents, Test Plans & Test Scenarios/Test Cases and documentation of test results.
- Experienced in designing and preparing training material and conducting training sessions for users.
- Competitive spirit in finding multiple approaches to problem solving; skillful, creative and innovative in developing efficient logic and code.
- Proficient in prioritizing and multitasking to ensure that tasks are completed on time.
- Demonstrate a willingness, interest, and aptitude to learn new technologies and skills.
- Good Communication and interpersonal skills.
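As a brief illustration of the report-building pattern referenced above, a Teradata macro can wrap a parameterized report query behind a single EXEC call. This is an illustrative sketch only; the names (wkly_sales_rpt, sales_fact, region_cd, sale_amt) are hypothetical placeholders, not objects from any of the projects below.

    /* Parameterized reporting macro; callers run it with: EXEC wkly_sales_rpt('NE'); */
    CREATE MACRO wkly_sales_rpt (rpt_region VARCHAR(10)) AS (
        SELECT  region_cd,
                SUM(sale_amt) AS tot_sales
        FROM    sales_fact               -- hypothetical source table
        WHERE   region_cd = :rpt_region
        GROUP BY region_cd;
    );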
TECHNICAL ENVIRONMENT:
RDBMS: Teradata V2R5/V2R6, MS Access, Oracle 8i/9i/10g, SQL Server 2000/2005.
Query Tools: TOAD, Teradata SQL Assistant.
Utilities: BTEQ, FastLoad, MultiLoad, FastExport, SQL*Loader.
Scripting: UNIX Korn shell scripting
Languages: C, C++, Java, SQL, PL/SQL, COBOL, MATLAB, SAS.
GUI: MS Office Suite, Visual Basic 6.0
Servers: Microsoft IIS 4.0/5.0
OS: Windows XP, Windows NT/98/2000, IBM AIX 5L, Sun Solaris 5.x, Red Hat Linux.
Web: HTML, DHTML, JavaScript, Dreamweaver, Adobe Photoshop, PHP
PROFESSIONAL EXPERIENCE:
Confidential, New Jersey (Oct ’09 till date)
Role: Teradata Data Analyst
Responsibilities:-
- Worked with various Business users to gather reporting requirements and understand the intent of reports and attended meetings to provide updates on status of the projects.
- Responsible for analyzing report requirements and developing the reports by writing Teradata SQL queries and using MS Excel, PowerPoint and UNIX.
- Created weekly, monthly and quarterly business monitoring reports on different business programs, which helped managers make decisions on those programs.
- Wrote several Teradata SQL queries using Teradata SQL Assistant for ad hoc data pull requests.
- Created Teradata objects like Tables and Views.
- Extensively worked on converting Oracle scripts into Teradata scripts.
- Used inner and outer joins while creating tables from multiple source tables.
- Created multiset, temporary, derived and volatile tables in the Teradata database.
- Implemented indexes, collected statistics and applied constraints while creating tables (see the sketch after this list).
- Utilized ODBC connectivity from MS Excel to retrieve data automatically from the Teradata database.
- Designed & developed various Ad hoc reports for different teams in Business (Teradata and Oracle SQL, MS ACCESS, MS EXCEL)
- Unix Shell Scripting was used for automating logs, for user created table backups and for checking daily log.
- Created pivot tables in Excel by getting data from Teradata and Oracle.
- Helped users by extracting mainframe flat files (fixed-width or CSV) onto a UNIX server and then converting them into Teradata tables using Base SAS programs.
- Used SAS PROC IMPORT, DATA steps and PROC DOWNLOAD to read fixed-format flat files and convert them into Teradata tables for business analysis.
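A minimal sketch of the table-creation and statistics pattern noted above, using a MULTISET table built from a summary query. The object names (cust_txn_smry, txn_fact, cust_id, txn_amt) are hypothetical placeholders, not actual project objects.

    /* Build a summary table, collect statistics on the primary index column,
       then query it for reporting (illustrative names only). */
    CREATE MULTISET TABLE cust_txn_smry AS (
        SELECT  cust_id,
                SUM(txn_amt) AS total_txn_amt
        FROM    txn_fact                 -- hypothetical source table
        GROUP BY cust_id
    ) WITH DATA
    PRIMARY INDEX (cust_id);

    COLLECT STATISTICS ON cust_txn_smry COLUMN (cust_id);

    SELECT * FROM cust_txn_smry WHERE total_txn_amt > 10000;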
Environment:- Teradata SQL Assistant 7.0, BTEQ, Oracle 9i, Teradata V2R6, Windows NT, UNIX Shell scripts, SQL, PowerPoint, Excel.
Confidential, Atlanta, Georgia (Mar ’08 to Aug ’09)
Role: Teradata Developer
Responsibilities:-
- Worked on numerous ad hoc data pull requests from Business Analysts, BSAs and Financial Analysts.
- Performed in-depth analysis of the data pulled for ad hoc requests and prepared graphs using MS Excel and MS PowerPoint.
- Worked with the Process Manager to review existing business processes and helped enhance them.
- Developed numerous Teradata SQL queries creating SET or MULTISET tables, views and volatile tables, using inner and outer joins, date and string functions, and advanced techniques such as the RANK and ROW_NUMBER functions (see the segmentation sketch after this list).
- Automated SQL jobs in UNIX using Korn shell scripting.
- Built tools in MS Excel using VBA for the Process Manager to extract data automatically from the Teradata database, which helped non-technical users and saved a lot of time.
- Wrote SAS scripts to read tab-delimited flat files and convert them into Teradata tables using FastLoad techniques.
- Tuned SQL to optimize performance, spool space usage and CPU usage.
- Worked on Business Segmentation for Direct Mail and Email by writing SQL queries in Teradata SQL Assistant.
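A minimal sketch of the kind of windowed segmentation query described above, assuming hypothetical table and column names (contact_hist, cust_id, channel_cd, contact_dt); it is illustrative only, not the actual campaign logic.

    /* Keep each customer's most recent contact row for a direct mail / email segment. */
    SELECT  cust_id,
            channel_cd,
            contact_dt
    FROM    contact_hist
    QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY contact_dt DESC) = 1;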
Environment: - Teradata SQL Assistant 7.2, BTEQ, SAS, UNIX Shell scripts, Teradata V2R5, Oracle 9i, FLOAD, MLOAD.
Confidential, Chicago, IL (Aug ’07 to Jan ’08)
Role: Teradata Developer
Responsibilities:-
- Gathered requirements from business users of various systems.
- Communicated with business users to understand business requirements clearly.
- Conducted meetings with data architects to understand source system elements.
- Provided support to other developers in accurately mapping source attributes into the Teradata Financial Services Logical Data Model (FSLDM) and in interpreting business requirements.
- Worked closely with the Data Modeler and Data Architect in enhancing the FSLDM
- Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
- Implemented logical and physical data modeling techniques using Erwin for the data mart.
- Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
- Tested primary index choices and skew ratios with sampling techniques before populating tables, and reviewed the explain plan in Teradata before querying large tables with several joins (see the skew-check sketch after this list).
- Performed tuning and optimization of complex SQL queries using Teradata Explain.
- Created a BTEQ script to pre-populate the work tables prior to the main load process.
- Used various Teradata Index techniques to improve the query performance
- Wrote UNIX shell scripts for processing and cleansing incoming text files.
- Involved in loading of data into Teradata from legacy systems and flat files using complex MultiLoad scripts and FastLoad scripts.
- Involved heavily in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
- Created a shell script that checks data files for corruption prior to the load.
- Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs
- Created a cleanup process for removing all the Intermediate temp files that were used prior to the loading process
- Involved in troubleshooting the production issues and providing production support.
- Developed unit test plans and involved in system testing.
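A rough sketch of the skew check mentioned above, combining a row sample with Teradata's hashing functions to estimate how a candidate primary index would distribute rows across AMPs. The names (stg_acct, cand_pi_col) and the sample size are hypothetical.

    /* Estimate per-AMP row counts for a candidate primary index column. */
    SELECT  HASHAMP(HASHBUCKET(HASHROW(cand_pi_col))) AS amp_no,
            COUNT(*)                                  AS row_cnt
    FROM    (SELECT cand_pi_col FROM stg_acct SAMPLE 100000) AS s   -- sample keeps the check cheap
    GROUP BY 1
    ORDER BY row_cnt DESC;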
Environment: - Teradata V2R5, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Teradata FSLDM, Quality Center, UNIX, Windows 2000, Shell scripts.
Confidential, NY (Dec ’06 to Jul ’07)
Role: Data Analyst
Responsibilities:-
- Extensively involved in requirements gathering and data gathering to support developers in handling the design specification
- Extracted data from existing data source and performed ad-hoc queries by using SQL, PL/SQL, MS Access and UNIX.
- Migrated data from Flat Files to Oracle database using SQL*Loader
- Designed & developed various departmental reports by using SAS, SQL, PL/SQL, and MS Excel
- Created global and volatile temporary tables to load large volumes of data into the Teradata database (see the sketch after this list).
- Created UNIX scripts that use BTEQ to access the Teradata database.
- Created tables, indexes and views in Teradata, and used string functions and joins in queries.
- Designed/Developed scripts to move data from the staging tables to the target tables.
- Designed the order of execution of the jobs and scheduled them.
- Fine tuning of SQL to optimize the performance, spool space usage and CPU usage.
- Converted Scripts from Oracle to Teradata.
- Maintained scripts under version control using Visual SourceSafe (VSS).
- Developed and executed business reports using advanced Teradata SQL techniques like RANK and ROW_NUMBER.
- Involved in writing stored procedures and packages using PL/SQL.
- Used UNIX shell scripting to calculate date values, assign them to UNIX variables and use those variables in BTEQ SQL WHERE conditions.
- Extensively developed UNIX shell scripts to run batch jobs and communicate messages to the users.
- Tuned the queries to improve the Performance of the existing reports for various functional areas.
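A minimal sketch of the global temporary table pattern noted above: the definition persists in the data dictionary while the rows remain session-local. The names (rpt_stage_gt, acct_fact, acct_id, bal_amt) are placeholders, not actual project objects.

    /* Define a reusable staging structure once, then populate it per session. */
    CREATE GLOBAL TEMPORARY TABLE rpt_stage_gt (
        acct_id  INTEGER,
        bal_amt  DECIMAL(18,2)
    )
    PRIMARY INDEX (acct_id)
    ON COMMIT PRESERVE ROWS;

    INSERT INTO rpt_stage_gt
    SELECT  acct_id,
            bal_amt
    FROM    acct_fact        -- hypothetical source table
    WHERE   bal_amt > 0;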
Environment: - UNIX, Teradata, SQL, PL/SQL, SAS, Oracle, Excel Macros- VBA, Toad, MS Access, SQL*Loader
Confidential, Pune, India (Jul ’05 to Nov ’06)
Role: Teradata/ETL Developer
Project scope: - Bharti Airtel Limited is one of Asia’s leading integrated telecom services providers with operations in India, Sri Lanka and Bangladesh. Bharti Airtel since its inception has been at the forefront of technology and has pioneered several innovations in the telecom sector. The company is structured into four strategic business units - Mobile, Telemedia, Enterprise and Digital TV
Responsibilities:-
- Developed the Fabrication & Manufacturing Module using Business Objects Data Integrator.
- Scheduling and Monitoring of Jobs under each project using BO Web Administrator.
- Developed Korn Shell scripts based on Load and Extract Specifications.
- Built a data movement process that loads data from DB2 into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport and MultiLoad, and improved the design of the extract and load specifications (see the staging-load sketch after this list).
- Developed daily, monthly and summary reports using Web Intelligence and Crystal Reports.
- Designed and developed a Generic Billing system for the company
- Responsible for all pre-ETL tasks on which the warehouse depends, including managing and collecting various existing data sources.
- Worked on multiple tasks in parallel for different branches and different customers.
- Documented the mappings used in ETL processes.
- Extensively worked on data stores, Work Flows and Data Flows.
- Prepared Technical Design Document and Installation Document
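A simplified sketch of the staging-to-base step in that DB2-to-Teradata movement: FastLoad lands the extract in an empty staging table, and a BTEQ-style INSERT...SELECT applies it to the base table. The FastLoad control script is omitted, and the names (stg_call_detail, call_detail) are hypothetical.

    /* Apply FastLoad-staged rows to the base table with light cleansing. */
    INSERT INTO call_detail (call_id, subscriber_id, call_dt, duration_sec)
    SELECT  call_id,
            subscriber_id,
            CAST(call_dt AS DATE FORMAT 'YYYY-MM-DD'),   -- staged as character text
            duration_sec
    FROM    stg_call_detail
    WHERE   duration_sec >= 0;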
Environment: - Teradata, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, Oracle SQL-PL/SQL 9.x, SQL*Loader and UNIX Shell Scripts.
Education:
Bachelor of Technology.
Master's in Software Engineering.