Teradata Developer Resume
CA
SUMMARY
- Over 10 years of IT experience in the development of Enterprise Data Warehouse applications using Informatica, Oracle, and Teradata.
- Experience in all phases of data warehouse development: requirements, analysis, design, development, testing, and post-production support.
- Strong in-depth knowledge of data analysis, data quality, and source system analysis.
- Independent self-starter and enthusiastic team player with strong adaptability to new technologies.
- Experience using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems.
- Knowledge of the DVO automated ETL testing tool.
- Experience in Big Data Technologies using Hadoop, Sqoop, Pig and Hive.
- Experience in writing Hive and Unix shell scripts.
- Excellent track record in delivering quality software on time to meet the business priorities.
- Developed Data Warehouse/Data Mart systems, using various RDBMS (Oracle, MS-SQL Server, Mainframes, Teradata and DB2).
- Highly proficient in Informatica PowerCenter and PowerExchange, with exposure to Informatica Data Services.
TECHNICAL SKILLS
ETL Tools: Informatica, Data Stage, SSIS
Databases: Teradata 12/13/14, Oracle 9i/10g/11g/12c, MySQL, SQL Server 2000/2005, MS Access, DB2, Hadoop (HDFS)
GUI: .Net Custom development, Business Objects, Micro Strategy
Operating Systems: Windows, Unix, Linux
Languages: C#, VB Script, HTML, DHTML, Java Script, SQL, PL/SQL, Unix Shell, Python, Hive, Pig
Web Related: ASP.NET, VB Script, HTML, DHTML, JAVA, Java Script
Tools & Utilities: Teradata Parallel Transporter, Aprimo 6.1/8.X, BTEQ, SQL Assistant, Toad, SQL Navigator, SQL*Loader, $U, HP Quality Center, DataFlux, UC4, Control-M, crontab
Domain Knowledge: Banking, Finance, Insurance, Health Care, Energy, Retail
PROFESSIONAL EXPERIENCE
Teradata Developer
Confidential, CA
Responsibilities:
- Developed design documentation and an implementation proposal for extracting data from the W data warehouse.
- Developed modules to extract, process, and transfer customer data using Teradata utilities.
- Created FastExport scripts to extract and format customer data from the data warehouse into a mainframe file (see the sketch after this list).
- Created an FTP ID to set up the transmission path between the mainframe and the UNIX machine.
- Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Used MultiLoad, FastLoad, and BTEQ; created and modified databases; performed capacity planning.
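A minimal FastExport sketch of the kind of customer extract described above; the logon values, database, table, column, and file names are hypothetical placeholders, not the actual production script.

```sh
#!/bin/sh
# Hypothetical FastExport job: unload active customer rows to a fixed-width
# flat file for transmission to the mainframe. Credentials are placeholders.
fexp <<'EOF'
.LOGTABLE edw_work.cust_fexp_log;
.LOGON tdprod/etl_user,password;
.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE /data/extracts/customer_extract.dat MODE RECORD FORMAT TEXT;

SELECT CAST(cust_id   AS CHAR(12))
     , CAST(cust_name AS CHAR(40))
     , CAST(status_cd AS CHAR(2))
FROM   edw.customer
WHERE  status_cd = 'A';

.END EXPORT;
.LOGOFF;
EOF
```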
Environment: Teradata, Teradata Studio, Unix, SQL Workbench, Tableau, Jenkins, JIRA
Data Engineer /ETL Developer
Confidential, CA
Responsibilities:
- Involved in requirements gathering, design, and development.
- Transformed data from staging (STG) tables into final tables using Hive scripts.
- Wrote Python scripts to gather usage statistics for all edge nodes.
- Used Hadoop to refine and analyze clickstream data.
- Developed Sqoop jobs to integrate data from Oracle and SQL Server for application migration (see the sketch after this list).
- Imported data from the existing SQL Server into Hive and HBase using Sqoop.
- Exported and imported data between HDFS, HBase, and Hive using Sqoop.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Worked closely with the business and analytics teams to gather system requirements.
- Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Wrote a script to predict run times using historical data.
- Provided operational support for current and ongoing efforts.
- Wrote a program to send alerts for jobs consuming excessive resources.
- Loaded and transformed large sets of structured and semi-structured data.
- Involved in the Hadoop upgrade.
- Maintained real-time usage data at detail and aggregate levels.
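A minimal sketch of the Sqoop-to-Hive flow described above (import from SQL Server into a Hive staging table, then transform into a final table); the connection string, credentials, and table names are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical Sqoop import from SQL Server into a Hive staging table,
# followed by a Hive insert that reshapes staging rows into the final table.

sqoop import \
  --connect "jdbc:sqlserver://sqlhost:1433;databaseName=sales" \
  --username etl_user \
  --password-file /user/etl/.sqlserver.pwd \
  --table ORDERS \
  --hive-import --hive-database stg --hive-table orders_stg \
  --num-mappers 4

# The Hive query runs as MapReduce under the hood; assumes dw.orders exists.
hive -e "
INSERT OVERWRITE TABLE dw.orders
SELECT order_id, cust_id, to_date(order_dt) AS order_dt, order_amt
FROM   stg.orders_stg
WHERE  order_amt IS NOT NULL;
"
```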
Environment: Hive, HBase, MapReduce, Sqoop, Oozie, Hue, LINUX, Python, and Tableau.
Teradata Developer / ETL Developer
Confidential, CA
Responsibilities:
- Worked with Safeway and United business users to capture data requirements and transformation rules between source systems and the Albertsons EDW.
- Developed Mapping and design documents for Sales, Promotional and Marketing data.
- Performed data profiling, source systems analysis to understand data and quality issues.
- Developed BTEQ scripts to transform data from staging to third normal form (3NF) and then into aggregates (see the sketch after this list).
- Tuned complex Teradata queries to meet performance-level agreements using statistics, indexes, and partitioning techniques.
- Used MultiLoad, FastLoad, and BTEQ; created and modified databases; performed capacity planning.
- Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Developed Informatica mappings for source-to-target loading.
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
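A minimal BTEQ sketch of the staging-to-3NF-to-aggregate pattern described above; database, table, and column names are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical BTEQ job: load the normalized layer from staging, then refresh
# a daily aggregate. Credentials and object names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

INSERT INTO edw_3nf.sales_txn (txn_id, store_id, item_id, txn_dt, sale_amt)
SELECT s.txn_id
     , s.store_id
     , s.item_id
     , CAST(s.txn_dt AS DATE FORMAT 'YYYY-MM-DD')
     , s.sale_amt
FROM   edw_stg.sales_txn_stg s
WHERE  s.sale_amt IS NOT NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;

DELETE FROM edw_agg.sales_daily WHERE txn_dt = CURRENT_DATE - 1;
INSERT INTO edw_agg.sales_daily (store_id, txn_dt, total_amt)
SELECT store_id, txn_dt, SUM(sale_amt)
FROM   edw_3nf.sales_txn
WHERE  txn_dt = CURRENT_DATE - 1
GROUP BY store_id, txn_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
```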
Environment: Teradata, Teradata Viewpoint, Teradata Studio, Unix, Informatica
Teradata Performance Engineer / ETL Developer
Confidential, San Ramon, CA
Responsibilities:
- Performed performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed statistics; improved performance by 50-75% in some cases (see the sketch after this list).
- Developed complex SQL queries to identify performance bottlenecks in processing.
- Profound understanding of banking campaign management.
- Worked with business analysts to develop models and advanced SQL statements.
- Used MultiLoad and BTEQ; created and modified databases; performed capacity planning.
- Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Refreshed data using the FastExport and FastLoad utilities.
- Developed Informatica mappings for source-to-target loading from BODI to TP.
- Worked on Aprimo integration, customization, and configuration.
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
- Worked on IDQ parsing, standardization, matching, and IDQ web services.
- Used IDQ for data quality work: cleansing data, removing unwanted data, and verifying data correctness.
- Imported the mappings developed in Informatica Data Quality (IDQ) into Informatica Designer.
- Involved in Data Profiling using Informatica Data Quality (IDQ).
- Supported all ETL inbound and outbound processes of TDM in the production environment.
- Developed Sqoop jobs to integrate data from Oracle and SQL Server for application migration.
- Imported data from the existing SQL Server into Hive and HBase using Sqoop.
- Exported and imported data between HDFS, HBase, and Hive using Sqoop.
- Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
- Worked closely with the business and analytics teams to gather system requirements.
- Delivered new, complex, high-quality solutions to clients in response to varying business requirements; created and managed user accounts.
- Wrote a script to predict run times using historical data.
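A minimal sketch of the statistics-collection step in the tuning workflow noted above; the table and column names are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical tuning pass: inspect the optimizer's plan, then collect stats
# on the join and filter columns it reports with low confidence.
bteq <<'EOF'
.LOGON tdprod/perf_user,password;

EXPLAIN
SELECT a.acct_id, SUM(t.txn_amt)
FROM   edw.account a
JOIN   edw.transactions t ON t.acct_id = a.acct_id
WHERE  t.txn_dt BETWEEN DATE '2015-01-01' AND DATE '2015-01-31'
GROUP BY a.acct_id;

COLLECT STATISTICS ON edw.transactions COLUMN (acct_id);
COLLECT STATISTICS ON edw.transactions COLUMN (txn_dt);
COLLECT STATISTICS ON edw.account      COLUMN (acct_id);

.LOGOFF;
EOF
```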
Environment: Teradata, Teradata Viewpoint, Aprimo, Informatica Power Center, IDQ, Oracle, PL/SQL, Windows, HP Quality center, Unix, HIVE.
ETL and Teradata Developer
Confidential, CA
Responsibilities:
- Analysis, Design, Development, Testing and Deployment of Informatica workflows, BTEQ scripts, Python and shell scripts.
- Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
- Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.
- Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).
- Performed data profiling using IDQ as input into ETL design and data modeling.
- Extensively used ETL to transfer data from different source systems and load it into the target database.
- Developed Informatica mappings with the collection of all sources, targets, and transformations using Informatica Designer.
- Extracted data from various sources across the organization (Oracle, MySQL, SQL Server, and flat files) and loaded it into the staging area.
- Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in IDQ and imported them into PowerCenter as mappings and mapplets.
Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica Power Center, Power Exchange, IDQ, OCL Tool, UC4, Control-M, ER Viewer, Business Intelligence, Micro Strategy, Windows, HP Quality center, Unix, Linux.
ETL Developer
Confidential, Annapolis, MD
Responsibilities:
- Developed Low level mappings for Tables and columns from source to target systems.
- Wrote and optimized initial data load scripts using Informatica and database utilities.
- Used partitions to extract data from the source and loaded it into Teradata via TPT load, with proper load balancing on the Teradata server (see the sketch after this list).
- Wrote complex BTEQ scripts to incorporate business functionality while transforming data from staging into third normal form.
- Participated in the Teradata upgrade project from TD12 to TD13.10, conducting regression testing.
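A minimal TPT sketch of the kind of file-to-Teradata load described above, using the generic $FILE_READER and $LOAD template operators; the file, table, and logon values are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical TPT load: read a pipe-delimited extract and bulk-load it into
# a Teradata staging table via the template operators.

cat > /tmp/load_orders.tpt <<'EOF'
DEFINE JOB load_orders_stg
(
  APPLY $INSERT TO OPERATOR ($LOAD)
  SELECT * FROM OPERATOR ($FILE_READER);
);
EOF

cat > /tmp/load_orders.vars <<'EOF'
TargetTdpId          = 'tdprod'
,TargetUserName      = 'etl_user'
,TargetUserPassword  = 'password'
,TargetTable         = 'edw_stg.orders_stg'
,SourceFileName      = '/data/extracts/orders.dat'
,SourceFormat        = 'Delimited'
,SourceTextDelimiter = '|'
EOF

tbuild -f /tmp/load_orders.tpt -v /tmp/load_orders.vars -j load_orders_stg
```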
Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica Power Center, SSIS, SSRS, ER Viewer, Windows, HP Quality center, UNIX.
Senior ETL Developer
Confidential, MD
Responsibilities:
- Created Uprocs, sessions, and management units to schedule jobs using $U.
- Conducted source system analysis and developed ETL design documents to meet business requirements.
- Tuned Teradata SQL queries and resolved performance issues caused by data skew and spool space (see the sketch after this list).
- Developed flat files from Teradata using FastExport and BTEQ to disseminate to downstream dependent systems.
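A minimal sketch of the kind of skew check used when chasing the spool and skew issues mentioned above; the database, table, and column names are hypothetical placeholders.

```sh
#!/bin/sh
# Hypothetical skew diagnostics: check how a table's rows spread across AMPs
# for a candidate primary index column, and rank tables by disk-space skew.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

-- Row distribution per AMP for a candidate PI column
SELECT HASHAMP(HASHBUCKET(HASHROW(acct_id))) AS amp_no
     , COUNT(*)                              AS row_cnt
FROM   edw.transactions
GROUP BY 1
ORDER BY 2 DESC;

-- Tables with the most uneven permanent-space usage
SELECT TableName
     , MAX(CurrentPerm) * 1.0 / NULLIFZERO(AVG(CurrentPerm)) AS skew_ratio
FROM   DBC.TableSize
WHERE  DatabaseName = 'edw'
GROUP BY 1
ORDER BY 2 DESC;

.LOGOFF;
EOF
```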
Environment: Teradata, Oracle, PL/SQL, Informatica Power Center, $U, Business Objects, SSIS, Windows XP, UNIX Shell scripting.
ETL Developer
Confidential, Temple, TX
Responsibilities:
- Documented functional specifications and other artifacts used for the development of ETL mappings.
- Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.
- Optimized Performance of existing Informatica workflows.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data.
Environment: Oracle, SQL Server, DB2, Informatica Power Center, Erwin, Cognos, XML, Windows, Unix
ETL Developer
Confidential, Minnesota, MN
Responsibilities:
- Developed various mappings with the collection of all sources, targets, and transformations using Informatica Designer.
- Extracted data from various sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area.
- Created and scheduled sessions and batch processes to run on demand, on time, or only once using Informatica Workflow Manager, and monitored the data loads using the Workflow Monitor.
Environment: Oracle, SQL Server, PL/SQL, Informatica Power Center, Erwin, Cognos, Windows, UNIX