Sr. ETL/Talend Developer Resume
Knoxville, TN
SUMMARY:
- 9+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
- Highly skilled ETL engineer with 9+ years of software development using tools such as Informatica, SSIS and Talend.
- Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.
- 3+ years of experience with Talend Enterprise Edition for Big Data, Data Integration and Data Quality.
- Experience in implementing Microsoft Business Intelligence (BI) platforms, including SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) on SQL Server 2005.
- Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
- Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation and loading using Talend; designed data conversions from a large variety of source systems including Oracle … DB2, Netezza, SQL Server, Teradata, Hive, Hana and non-relational sources like flat files, XML and mainframe files.
- Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.
- Involved in code migrations from Dev to QA and Production, providing operational instructions for deployments.
- Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio.
- Strong data warehousing ETL experience using Informatica 9.x/8.x/7.x PowerCenter client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and server tools - Informatica Server, Repository Server Manager.
- Experience in SQL*Plus and TOAD as interfaces to databases to analyze, view and alter data.
- Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
- Hands-on experience with Pentaho Business Intelligence Server and Studio.
- Expertise in using transformations like Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
- Experienced in writing Hive queries to load data into HDFS (see the Hive sketch after this list).
- Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
- Experienced in designing ETL processes using Informatica to load data from sources to targets through transformations.
- Hands-on experience in developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.
- Hands-on experience in deploying DTS and SSIS packages.
- Designed SSIS packages for the extraction, transformation and loading of data.
- Configured SSRS and generated reports using SSRS.
- Excellent experience in designing and developing multi-layer web-based information systems using Web Services, Java and JSP.
- Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
- Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Experienced with Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant.
- Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Extensive experience in writing UNIX shell scripts and automating ETL processes with shell scripting; also used Netezza utilities to load data and execute SQL scripts from UNIX.
- Proficient in integrating various data sources with multiple relational databases like … MS SQL Server … 8.0/7.0, UDB, MS Access, Netezza, Teradata, VSAM files and flat files into the staging area, ODS, data warehouse and data marts.
- Experienced in using Automation Scheduling tools like Autosys and Control-M.
- Experience in migrating data from multiple applications into a single application.
- Responsible for data migration from MySQL Server to Oracle databases.
- Experienced in batch scripting on Windows and worked extensively with slowly changing dimensions.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
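As a hedged illustration of the Hive work noted above, here is a minimal sketch of staging a delimited HDFS file as an external table and loading it into a managed Hive table; all table, column and path names are hypothetical:

```sql
-- HiveQL: expose a delimited file in HDFS as an external table,
-- then load it into a managed ORC table.
-- All names and paths are hypothetical examples.
CREATE EXTERNAL TABLE stg_orders (
  order_id BIGINT,
  cust_id  BIGINT,
  order_dt STRING,
  amount   DECIMAL(12,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/staging/orders';

CREATE TABLE IF NOT EXISTS dw_orders STORED AS ORC AS
SELECT order_id, cust_id, to_date(order_dt) AS order_dt, amount
FROM   stg_orders;
```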
WORK EXPERIENCE:
Sr. ETL/Talend Developer
Confidential, Knoxville, TN
Responsibilities:
- Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
- Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput and many more.
- Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
- Designed ETL processes using Talend to load data from sources to targets through transformations.
- Worked with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
- Developed advanced Oracle stored procedures and handled SQL performance tuning.
- Involved in creating mapping documents with the transformation logic for implementing a few enhancements to the existing system.
- Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
- Developed Talend mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files and Teradata tables.
- Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad and FastExport), and queried the target database using Teradata SQL and BTEQ for validation (see the BTEQ sketch after this list).
- Used Talend to extract, transform and load data into the Netezza data warehouse from various sources like Oracle and flat files.
- Created connections to databases like SQL Server, Oracle and Netezza, as well as application connections.
- Created mapping documents to outline data flow from sources to targets.
- Prepared Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structures and physical design.
- Involved in dimensional modeling (Star Schema) of the data warehouse and used ERwin to design the business process, dimensions and measured facts.
- Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Talend.
- Created SSIS packages to back up databases and to compress and archive the backups.
- Handled errors at the control flow and data flow levels in SSIS packages.
- Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
- Developed mapping parameters and variables to support SQL override.
- Developed Talend ESB services and deployed them on ESB servers on different instances.
- Created mapplets & reusable transformations to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Developed Talend jobs to load data into Hive tables and HDFS files, and to integrate data from Hive tables into the Teradata system.
- Worked on different tasks in workflows, like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer and scheduling of the workflow.
- Performed unit testing and code reviews, and moved code into UAT and PROD.
- Designed the Talend ETL flow to load data into Hive tables and created Talend jobs to load data into Oracle and Hive tables.
- Migrated the code into QA (testing), and supported the QA team and UAT (user acceptance testing).
- Created detailed Unit Test Document with all possible Test cases/Scripts.
- Worked with high volumes of data, tracking performance of Talend job runs and sessions.
- Conducted code reviews of code developed by my teammates before moving it into QA.
- Experience in batch scripting on Windows, including 32-bit Windows commands, quoting and escaping.
- Used Talend reusable components like routines, context variables and globalMap variables.
- Provided support to develop the entire warehouse architecture and plan the ETL process.
- Knowledge of Teradata utility scripts like FastLoad and MultiLoad to load data from various source systems into Teradata.
- Modified existing mappings for enhancements of new business requirements.
- Worked on migration projects to move data warehouses from Oracle/DB2 to Netezza.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Configured the Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
- Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
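A minimal sketch of the BTEQ-style validation referenced in the load bullet above, reconciling staging and target row counts after a FastLoad/MultiLoad run; the logon details and all object names are hypothetical:

```sql
-- Teradata BTEQ: reconcile row counts between staging and target.
-- All names and credentials are hypothetical examples.
.LOGON tdprod/etl_user,etl_password;

SELECT 'STG' AS src, COUNT(*) AS row_cnt FROM stg_db.customer_stg
UNION ALL
SELECT 'TGT' AS src, COUNT(*) AS row_cnt FROM edw_db.customer_dim
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```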
Environment: Talend, TOS, TIS, Hive, Pig, Hadoop 2.2, Sqoop, PL/SQL, Oracle 12c/11g, ERwin, Autosys, SQL Server 2012, Teradata, Netezza, Sybase, SSIS, UNIX.
Talend Developer
Confidential - Raleigh, NC
Responsibilities:
- Interacted with Solution Architects and Business Analysts to gather requirements and update the Solution Architect Document.
- Engaged in various data warehouse and migration projects: understood the business requirements, analyzed the design documents created by Data Architects, and produced a technical low-level design document with the approach for the project.
- Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio.
- Performed analysis, design, development, testing and deployment for ingestion, integration and provisioning using Agile methodology.
- Attended daily Scrum meetings to update the Scrum Master on the progress of user stories in Rally, and to flag any blockers and dependencies.
- Experienced in creating generic schemas and creating context groups and variables to run jobs against different environments like Dev, Test and Prod.
- Created Talend mappings to populate dimension and fact tables.
- Broad design, development and testing experience with Talend Integration Suite, and knowledge in performance tuning of mappings.
- Experienced in Talend Data Integration and Talend platform setup on Windows and UNIX systems.
- Created complex mappings in Talend 6.0.1/5.5 using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tJavaFlex, tAggregateRow, tDie, tWarn, tLogCatcher, etc.
- Created joblets in Talend for processes reused across most jobs in a project, such as job start and job commit steps.
- Experience in using Repository Manager for migration of source code from lower to higher environments.
- Developed jobs to move inbound files to vendor server location based on monthly, weekly and daily frequency.
- Implemented Change Data Capture (CDC) in Talend to load deltas into the data warehouse (see the delta-load sketch after this list).
- Created jobs to perform record count validation and schema validation.
- Created contexts to pass values throughout the process, from parent to child jobs and from child to parent jobs.
- Developed joblets that are reused in different processes in the flow.
- Developed an error-logging module that captures both system errors and logical errors, sends email notifications and moves files to error directories.
- Provided production support by running jobs and fixing bugs.
- Experienced in using Talend database components, file components and processing components based on requirements.
- Responsible for developing, supporting and maintaining the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
- Performed unit testing and integration testing after development, and got the code reviewed.
- Responsible for code migrations from Dev to QA and Production, and for providing operational instructions for deployments.
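A minimal sketch of the watermark-based delta extraction behind the CDC bullet above; the watermark table and all object names are hypothetical:

```sql
-- Pull only rows changed since the last successful extract,
-- then advance the watermark. All names are hypothetical examples.
SELECT s.*
FROM   src_orders s
WHERE  s.updated_at > (SELECT last_extract_ts
                       FROM   etl_watermark
                       WHERE  table_name = 'orders');

UPDATE etl_watermark
SET    last_extract_ts = CURRENT_TIMESTAMP
WHERE  table_name = 'orders';
```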
Environment: Talend Studio 6.0.1/5.5, Oracle 11i, XML files, Flat files, HL7 files, JSON, TWS, Hadoop 2.4.1, HDFS, Hive 0.13, HBase 0.94.21, Talend Administrator Console, IMS, Agile Methodology, HPSM
ETL Developer
Confidential - Birmingham, AL
Responsibilities:
- Designed, developed, tested, implemented and supported data warehousing ETL using Informatica.
- Used JIRA to create, implement and deploy ETL-related stories.
- Participated in daily Scrum and bi-weekly iteration planning as part of an Agile environment.
- Researched, analyzed and prepared logical and physical data models for new applications, and optimized the data structures to enhance data load times and end-user data access response times.
- Used SQL tools like TOAD to run SQL queries and validate the data.
- Created Java programs to read data from web services and load it into Teradata.
- Created unit test environments for ETL processes using SQL scripts, JUnit, HSQLDB (an in-memory database) and Liquibase.
- Developed stored procedures and views in Teradata and used them in Informatica for loading scoring tables.
- Performed ETL from different sources like MySQL tables, CSV and fixed-length files, and loaded the data into Teradata, HDFS and Hive targets.
- Managed and scheduled ETL jobs using Informatica Workflow Manager in development and production environments.
- Prepared high-level design documents, detailed design documents, system requirement documents, technical specifications, table-level specs and test plan documents.
- Extracted data from legacy systems into the staging area, then cleansed, homogenized, processed and loaded it into the data warehouse.
- Developed MERGE scripts to UPSERT data into Teradata for an ETL source (see the MERGE sketch after this list).
- Wrote Pig/Hive/HCatalog scripts to process large data files, such as web click data.
- Used Git as version control for the code and implemented branching for different environments.
- Provided 24x7 production support for the ETL processes.
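A minimal sketch of the Teradata MERGE/UPSERT pattern mentioned above; all table and column names are hypothetical:

```sql
-- Teradata SQL: update matching rows, insert new ones.
-- All names are hypothetical examples.
MERGE INTO edw_customer AS tgt
USING stg_customer_delta AS src
  ON tgt.cust_id = src.cust_id
WHEN MATCHED THEN UPDATE SET
     cust_name  = src.cust_name,
     cust_city  = src.cust_city,
     updated_dt = CURRENT_DATE
WHEN NOT MATCHED THEN INSERT
     (cust_id, cust_name, cust_city, created_dt, updated_dt)
VALUES (src.cust_id, src.cust_name, src.cust_city, CURRENT_DATE, CURRENT_DATE);
```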
Environment: Informatica 8.6/IDQ, JIRA, Java, Maven, Git, Flux, Teradata, MySQL, PuTTY, XML, JUnit, Hadoop, Apache Pig, Hive, Web Services, OBIEE, Microsoft Office, Oracle 10g/11g.
ETL Developer
Confidential
Responsibilities:
- Requirement Analysis and Effort Estimation.
- Coordinated with the onsite team for requirements and project deliveries for ETL.
- Designed processes to extract, transform and load data from different file source systems via Informatica mappings into the data warehouse.
- Developed complex mappings to implement the specified business logic.
- Populated the data into the Staging tables prior to loading the Target tables.
- Implemented Type 2 Slowly Changing Dimensions (see the SCD sketch after this list).
- Developed and used reusable transformations and mapplets to improve productivity and the efficiency of the development process.
- Extensively used Informatica client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Workflow Manager and Workflow Monitor.
- Responsible for creating and generating tabular reports, matrix reports and charts using SQL Server Reporting Services.
- Developed complex SQL Queries, Stored Procedures, Functions and Triggers.
- Designed schedule of Informatica Mappings for daily/weekly load.
- Performed Data Analysis and Reconciliation.
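A minimal sketch of the Type 2 SCD pattern referenced above, expressed in plain SQL (the actual work was done in Informatica mappings with Update Strategy); all names are hypothetical, with city as the tracked attribute:

```sql
-- Type 2 SCD: expire the current row when a tracked attribute changes,
-- then insert a new current row. All names are hypothetical examples.
UPDATE dim_customer d
SET    eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1
            FROM   stg_customer s
            WHERE  s.cust_id   = d.cust_id
            AND    s.cust_city <> d.cust_city);

INSERT INTO dim_customer
      (cust_id, cust_city, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_city, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.cust_id = s.cust_id AND d.current_flag = 'Y'
WHERE  d.cust_id IS NULL;
```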
Environment: Oracle 9i/8.x, PL/SQL, Windows, Autosys, Informatica 7.1, ERwin, SQL Server 2008, SQL*Plus
Oracle Developer / Programmer
Confidential
Responsibilities:
- Responsible for system analysis, design, testing and documentation.
- Responsible for developing PL/SQL procedures, packages, triggers and other database objects (see the sketch after this list).
- Responsible for developing user interfaces using Visual Basic 6.0.
- Implemented integrity constraints on database tables.
- Responsible for performance tuning activities such as optimizing SQL queries, using EXPLAIN PLAN and creating indexes.
- Worked on database structure changes, table/index sizing, transaction monitoring, data conversion, and loading data into Oracle tables using SQL*Loader.
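A minimal sketch of the kind of PL/SQL procedure described above; the tables, columns and procedure name are hypothetical:

```sql
-- Oracle PL/SQL: hypothetical procedure that applies a price change
-- and writes an audit row, with basic error handling.
CREATE OR REPLACE PROCEDURE apply_price_change (
    p_item_id   IN items.item_id%TYPE,
    p_new_price IN items.price%TYPE
) AS
BEGIN
    UPDATE items
    SET    price      = p_new_price,
           updated_on = SYSDATE
    WHERE  item_id = p_item_id;

    IF SQL%ROWCOUNT = 0 THEN
        RAISE_APPLICATION_ERROR(-20001, 'Item not found: ' || p_item_id);
    END IF;

    INSERT INTO price_audit (item_id, new_price, changed_on)
    VALUES (p_item_id, p_new_price, SYSDATE);

    COMMIT;
END apply_price_change;
/
```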
Environment: Oracle 8.x, Visual Basic, Windows, Windows 2005 Server, SQL*Plus, SQL, PL/SQL, SQL*Loader.