
Sr. ETL/Talend Developer Resume


Union Beach, NJ

SUMMARY:

  • 8+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
  • Highly skilled ETL engineer with 9+ years of software development experience in tools such as Informatica, SSIS and Talend.
  • Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
  • 3+ years of experience with Talend Enterprise Edition for Big Data, Data Integration and Data Quality.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
  • Experienced in ETL methodology for data migration, data profiling, extraction, transformation and loading using Talend; designed data conversions from a wide variety of source systems including Oracle …, DB2, Netezza, SQL Server, Teradata, Hive and HANA, as well as non-relational sources such as flat files, XML and mainframe files.
  • Involved in code migrations from Dev to QA and production, and in providing operational instructions for deployments.
  • Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateRow, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tWarn, tMysqlSCD, tFilterRow, tDie, etc.
  • Strong Data Warehousing ETL experience of using Informatica 9.x/8.x/7.x Power Center Client tools - Mapping Designer, Repository manager, Workflow Manager/Monitor and Server tools - Informatica Server, Repository Server manager.
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, coupled with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
  • Hands-on experience with Pentaho Business Intelligence Server and Studio.
  • Expertise in using transformations like Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS.
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands-on experience in developing and monitoring SSIS/SSRS packages, and outstanding knowledge of high-availability SQL Server solutions, including replication.
  • Excellent experience in designing and developing multi-layer web-based information systems using web services, including Java and JSP.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Expertise in working with relational databases such as Oracle …, SQL Server …, DB2 8.0/7.0, UDB, MS Access, Teradata and Netezza.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant.
  • Experienced in integrating various data sources like Oracle SQL, PL/SQL, Netezza, SQL Server and MS Access into the staging area.
  • Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with them, including using Netezza utilities to load data and execute SQL scripts from UNIX (a minimal shell sketch follows this summary).
  • Proficient in the integration of various data sources with multiple relational databases like Oracle 12c/11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files and flat files into the staging area, ODS, Data Warehouse and Data Mart.
  • Extensively worked with the Netezza database to implement data cleanup and performance tuning techniques.
  • Experienced in using automation scheduling tools like Autosys and Control-M.
  • Experience in data migration, consolidating data from different applications into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on Windows and worked extensively with slowly changing dimensions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
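Below is a minimal sketch of the UNIX shell automation described in the summary: a wrapper that bulk-loads a flat file into Netezza with nzload and then runs a post-load SQL script with nzsql. All host names, credentials, paths and table names are illustrative assumptions, not details from any specific engagement.

    #!/bin/sh
    # Illustrative nightly ETL wrapper (hypothetical names throughout).
    DATA_FILE=/data/incoming/orders_$(date +%Y%m%d).dat
    LOG=/var/log/etl/orders_load.log

    # Bulk-load the pipe-delimited file into a Netezza staging table.
    nzload -host nzhost -db EDW -u etl_user -pw "$NZ_PW" \
           -t STG_ORDERS -df "$DATA_FILE" -delim '|' >> "$LOG" 2>&1 \
        || { echo "nzload failed for $DATA_FILE" >> "$LOG"; exit 1; }

    # Run post-load cleanup/aggregation SQL via nzsql.
    nzsql -host nzhost -d EDW -u etl_user -pw "$NZ_PW" \
          -f /etl/sql/orders_post_load.sql >> "$LOG" 2>&1 || exit 1

    echo "Load complete: $(date)" >> "$LOG"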

WORK EXPERIENCE:

Sr. ETL/Talend Developer

Confidential, Union Beach, NJ

Responsibilities:

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.
  • Created projects in Talend Administration Center (TAC) and assigned roles to the users.
  • Worked on end-to-end Talend development, including all Talend admin tasks: installation, configuration, Job Server configuration, command line, project creation, user creation and access assignment, job scheduling and execution plans.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used a variety of Talend components in many job designs, including tJava, tFTP, tSalesforceInput, tSalesforceOutput, tFileInputDelimited, tLogRow, tLogCatcher and tUniqRow.
  • Worked on joblets (reusable code) and Java routines in Talend.
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
  • Integrated Maven with Git to manage and deploy project-related tags.
  • Installed and configured Git and communicated with repositories in GitHub.
  • Performed necessary day-to-day Subversion/Git support for different projects.
  • Created and maintained Subversion/Git repositories, branches and tags.
  • Coordinated with the business to gather requirements and prepare the functional specification document.
  • Created Talend development standards: a document describing general guidelines for Talend developers, naming conventions for transformations, and development and production environment structures.
  • Worked on Talend ETL and used features such as context variables.
  • Worked with a data fabric, which unifies data management across distributed resources to allow consistency and control of data mobility, security, visibility, protection and access.
  • Involved in automating the FTP process in Talend and loading the data into Salesforce (a shell sketch of this style of FTP automation appears at the end of this section).
  • Optimized the performance of the mappings through various tests on sources, targets and the data fabric.
  • Used the Talend Administration Center Job Conductor to schedule ETL jobs on a daily, weekly, monthly and yearly basis (Cron triggers).
  • Involved in end-to-end testing of jobs.
  • Wrote Java conditions to take data from various sources and integrate it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type-1 and Type-2 mappings for current and historical data.
  • Incorporated business logic for Incremental data loads on a daily basis.
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Performed an administrator role in migrating objects from one environment to another (DEV/QA/PROD).
  • Provided on-call support for production maintenance.
  • Platform: Talend 6.4, DB2 UDB, Salesforce, Autosys, FTP server.
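As a rough illustration of the FTP automation above (implemented in the project with Talend FTP components such as tFTPGet), a shell equivalent is sketched below; the host, user and directory names are hypothetical.

    #!/bin/sh
    # Hypothetical batch FTP pull; in the actual jobs this logic lived
    # in Talend FTP components rather than a script.
    HOST=ftp.example.com
    USER=etl_user

    ftp -inv "$HOST" <<EOF > /tmp/ftp_pull.log 2>&1
    user $USER $FTP_PW
    binary
    cd /outbound/sales
    lcd /data/incoming
    mget *.csv
    bye
    EOF

    # Fail loudly if the transfer did not complete.
    grep -q "Transfer complete" /tmp/ftp_pull.log \
        || { echo "FTP pull failed" >&2; exit 1; }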

Sr. ETL/Talend Developer

Confidential, Irvine, CA

Responsibilities:

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.
  • Created projects in Talend Administration Center (TAC) and assigned roles to the users.
  • Worked on end-to-end Talend development, including all Talend admin tasks: installation, configuration, Job Server configuration, command line, project creation, user creation and access assignment, job scheduling and execution plans.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used a variety of Talend components in many job designs, including tJava, tOracleInput, tOracleOutput, tXMLMap, tFileInputDelimited, tLogRow and tLogCatcher.
  • Worked on joblets (reusable code) and Java routines in Talend.
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
  • Integrated Maven with Git to manage and deploy project-related tags.
  • Installed and configured Git and communicated with repositories in GitHub.
  • Performed necessary day-to-day Subversion/Git support for different projects.
  • Created and maintained Subversion/Git repositories, branches and tags.
  • Coordinated with the business to gather requirements and prepare the functional specification document.
  • Created Talend development standards: a document describing general guidelines for Talend developers, naming conventions for transformations, and development and production environment structures.
  • Worked on Talend ETL and used features such as context variables, database components like tMSSQLInput and tOracleOutput, file components and ELT components.
  • Worked with a data fabric, which unifies data management across distributed resources to allow consistency and control of data mobility, security, visibility, protection and access.
  • The environment followed NetApp's data fabric vision of seamlessly connecting different clouds, whether private, public or hybrid.
  • Involved in automating the FTP process in Talend and transferring the files via FTP on UNIX.
  • Optimized the performance of the mappings through various tests on sources, targets and the data fabric.
  • Used the Talend Administration Center Job Conductor to schedule ETL jobs on a daily, weekly, monthly and yearly basis (Cron triggers).
  • Involved in end-to-end testing of jobs.
  • Wrote complex SQL queries to take data from various sources and integrated it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type-1 and Type-2 mappings for current and historical data.
  • Incorporated business logic for Incremental data loads on a daily basis.
  • Wrote complex PL/SQL procedures for specific requirements.
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Scheduled the Informatica jobs using the third-party scheduler Autosys (see the pmcmd wrapper sketch at the end of this section).
  • Migrated Informatica from version 8.6 to 9.6.
  • Performed an administrator role in migrating objects from one environment to another (DEV/QA/PROD).
  • Provided on-call support for production maintenance.
  • Platform: Informatica 9.6, DB2 UDB, UNIX, Autosys, SQL Server 2008.

Environment: Informatica PowerCenter 8.6.1/9.6.1, Oracle 11g, SQL, PL/SQL, TOAD, MySQL, UNIX, Autosys, OBIEE, XML, flat files.
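A minimal sketch of the kind of wrapper script an Autosys job would call to start an Informatica workflow via pmcmd; the service, domain, folder and workflow names are assumptions for illustration.

    #!/bin/sh
    # Hypothetical Autosys-invoked wrapper around pmcmd.
    INFA_SVC=IS_EDW          # integration service name (illustrative)
    INFA_DOM=Dom_EDW         # Informatica domain (illustrative)
    FOLDER=EDW_LOADS
    WORKFLOW=wf_daily_sales_load

    pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOM" \
          -u "$INFA_USER" -p "$INFA_PW" \
          -f "$FOLDER" -wait "$WORKFLOW"
    RC=$?

    # A non-zero exit status drives the Autosys job to FAILURE,
    # which triggers the production alerting described above.
    [ "$RC" -eq 0 ] || echo "Workflow $WORKFLOW failed, rc=$RC" >&2
    exit "$RC"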

ETL Talend Developer

Confidential, New York, NY

Responsibilities:

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared the weekly status report and coordinated weekly status calls with the technology lead/business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources during mapping development to analyze the content, quality and structure of the source data.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, Flat Files etc.
  • Used all the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java, SQL Transformation, etc.) to interpret the business logic into the ETL mappings.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined a target load order plan for loading data into target tables.
  • Used mapplets and reusable transformations to prevent redundancy of transformation usage and to improve maintainability.
  • Created complex Informatica mappings, as well as simple mappings with complex SQL in them, based on business user requirements.
  • Used Informatica features to implement Type 1, 2 and 3 changes in slowly changing dimensions and Change Data Capture (CDC).
  • Created different database triggers; created and configured workflows, worklets and sessions to transport the data to target systems using Informatica Workflow Manager.
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Automated workflows using UNIX scripts with pmcmd and pmserver commands.
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Created tables, views, primary keys, indexes, constraints, sequences, grants and synonyms.
  • Developed optimized PL/SQL packages to centralize application logic; procedures were created, stored in the database and fired when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Conducted Design and Code reviews and extensive documentation of standards, best practices and ETL Procedures.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled performance tuning of Informatica mappings at various levels to accomplish the established standard throughput.
  • Analyzed the target data mart for accuracy of data against the pre-defined reporting needs.
  • Wrote complex SQL to achieve and interpret the reporting needs in the ETL process; also worked on SQL tuning to achieve maximum throughput.
  • Assisted in all aspects of the project to meet the scheduled delivery time.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Conducted unit testing of all ETL mappings as well as helped QA team in conducting their testing.
  • Wrote UNIX shell scripts to work with flat files, define parameter files and create pre- and post-session commands (see the parameter-file sketch at the end of this section).
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed unit and grid integration testing and validated results with end users.
  • Worked as part of a team and provided 24x7 production support.

Environment: Informatica PowerCenter 9.5, Erwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle SQL Developer, SQL Server 2008, flat files, XML, mainframe COBOL files, Autosys, UNIX shell scripting, Subversion.
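The parameter-file bullet above can be illustrated with a small pre-session shell script; the folder, workflow, session and variable names below are hypothetical.

    #!/bin/sh
    # Hypothetical pre-session script: regenerate the Informatica
    # parameter file with the current run date before the session starts.
    PARAM_FILE=/etl/params/wf_daily_sales_load.param
    RUN_DATE=$(date +%Y-%m-%d)

    cat > "$PARAM_FILE" <<EOF
    [EDW_LOADS.WF:wf_daily_sales_load.ST:s_m_load_sales]
    \$\$RUN_DATE=$RUN_DATE
    \$\$SRC_FILE=/data/incoming/sales_$(date +%Y%m%d).dat
    EOF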

Sr. ETL/Talend Developer

Confidential, Santa Monica, CA

Responsibilities:

  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tjava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput and many more.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
  • Worked extensively with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
  • Developed advanced Oracle stored procedures and handled SQL performance tuning.
  • Involved in creating the mapping documents with the transformation logic for implementing few enhancements to the existing system.
  • Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC)
  • Developed the Talend mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files and a Teradata database.
  • Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad and FastExport) and queried the target database using Teradata SQL and BTEQ for validation (see the FastLoad/BTEQ sketch at the end of this section).
  • Used Talend to Extract, Transform and Load data into Netezza Data Warehouse from various sources like Oracle and flat files.
  • Created connections to databases like SQL Server, Oracle and Netezza, as well as application connections.
  • Created mapping documents to outline data flow from sources to targets.
  • Prepared the Talend job-level LLD documents and worked with the modeling team to understand the Big Data Hive table structure and physical design.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend.
  • Maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed Talend jobs to load the data into Hive tables and HDFS files, and developed Talend jobs to integrate the Teradata system with Hive tables.
  • Worked on different tasks in workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer and scheduling of the workflow.
  • Performed unit testing and code reviews, and moved code into UAT and PROD.
  • Designed the Talend ETL flow to load the data into Hive tables and created the Talend jobs to load the data into Oracle and Hive tables.
  • Migrated the code into QA (Testing) and supported QA team and UAT (User).
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Worked with high volumes of data and tracked performance analysis on Talend job runs and sessions.
  • Conducted code reviews of code developed by teammates before moving it into QA.
  • Experience in batch scripting on Windows, including Windows 32-bit commands, quoting and escaping.
  • Used Talend reusable components like routines, context variable and globalMap variables.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Knowledge of Teradata utility scripts like FastLoad and MultiLoad to load data from various source systems into Teradata.
  • Modified existing mappings for enhancements of new business requirements.
  • Worked on Migration projects to migrate data from data warehouses on Oracle/DB2 and migrated those to Netezza.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured the Hive tables to load the profitability system in the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.

Environment: Talend, TOS, TIS, Hive, Pig, Hadoop 2.2, Sqoop, PL/SQL, Oracle 12c/11g, Erwin, Autosys, SQL Server 2012, Teradata, Netezza, Sybase, SSIS, UNIX, Profiles, Role Hierarchy, Workflow & Approval Processes, Data Loader, Reports, Custom Objects, Custom Tabs, Data Management, Lead Processes, Record Types.
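A condensed sketch of the FastLoad-then-validate pattern referenced above; the TDPID, credentials, table and file names are illustrative, and the exact FastLoad script layout can vary by site standard.

    #!/bin/sh
    # Hypothetical bulk load of a pipe-delimited file into a Teradata
    # stage table, followed by a BTEQ row-count validation query.
    fastload <<EOF
    LOGON tdprod/etl_user,$TD_PW;
    DATABASE EDW_STG;
    SET RECORD VARTEXT "|";
    DEFINE sale_id (VARCHAR(18)), sale_dt (VARCHAR(10)),
           amount (VARCHAR(18))
    FILE=/data/incoming/sales.dat;
    BEGIN LOADING STG_SALES ERRORFILES STG_SALES_E1, STG_SALES_E2;
    INSERT INTO STG_SALES VALUES (:sale_id, :sale_dt, :amount);
    END LOADING;
    LOGOFF;
    EOF

    bteq <<EOF
    .LOGON tdprod/etl_user,$TD_PW
    SELECT COUNT(*) FROM EDW_STG.STG_SALES;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF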

Informatica Developer

Confidential

Responsibilities:

  • Performed extraction, transformation and loading into the database using Informatica; involved in logical and physical modeling of the drugs database.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse database.
  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Created tables, views, indexes, sequences and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
  • Transferred data to the database using SQL*Loader (see the sketch at the end of this section).
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Extracted and loaded data from legacy systems, Oracle and SQL Server sources.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created Cron jobs to automate scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.

Environment: Informatica PowerCenter 7.1, Oracle 9, SQL Server 2005, XML, SQL, PL/SQL, UNIX shell scripting.
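The SQL*Loader transfer mentioned above would look roughly like the sketch below; the control file layout, table and paths are assumptions for illustration.

    #!/bin/sh
    # Hypothetical SQL*Loader load of a comma-delimited drug file
    # into an Oracle staging table.
    cat > /tmp/drugs.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/incoming/drugs.csv'
    APPEND INTO TABLE STG_DRUGS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (drug_id, drug_name, qty, unit_price)
    EOF

    sqlldr userid=etl_user/"$ORA_PW"@ORCL control=/tmp/drugs.ctl \
           log=/tmp/drugs.log bad=/tmp/drugs.bad errors=50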

Oracle Developer

Confidential

Responsibilities:

  • Involved in creating database objects like tables, stored procedures, views, triggers and user-defined functions for the project I was working on.
  • Analyze the client requirements and translate them into technical requirements.
  • Gathered requirements from the end user and involved in developing logical model and implementing requirements in SQL server 2000.
  • Performed data migration (import and export via BCP) from text files to SQL Server (see the BCP sketch at the end of this section).
  • Responsible for creating reports based on the requirements using Reporting Services 2000.
  • Identified the database tables for defining the queries for the reports.
  • Worked on SQL server queries, stored procedures, triggers and joins.
  • Defined report layouts for formatting the report design as per the need.
  • Identified and defined the datasets for report generation.
  • Formatted the reports using global variables and expressions.
  • Deployed generated reports onto the report server so they could be accessed through a browser.
  • Maintained data integrity by performing validation checks.

Environment: MS SQL 2000, Windows server 2000, SQL Query Analyzer and Enterprise Manager, MS Access 2000 & Windows NT platform.
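A minimal sketch of the BCP-based text/SQL Server migration described above; the server, database, table and file names are hypothetical. On the Windows NT platform the same two commands would run from a batch file rather than a shell script.

    #!/bin/sh
    # Hypothetical BCP round trip: export a table to a pipe-delimited
    # text file, then bulk-import it into the target database.
    bcp SalesDB.dbo.Customers out /tmp/customers.txt \
        -c -t '|' -S SQLPROD -U etl_user -P "$MSSQL_PW"

    bcp StagingDB.dbo.Customers in /tmp/customers.txt \
        -c -t '|' -S SQLPROD -U etl_user -P "$MSSQL_PW" \
        -e /tmp/customers.err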
