
Sr. Etl/talend Developer Resume


Union Beach, NJ

SUMMARY

  • 8+ years of IT experience in the analysis, design, development, testing and implementation of business application systems.
  • Highly skilled ETL engineer with 9+ years of software development using tools such as Informatica, SSIS and Talend.
  • Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and client/server applications.
  • 3+ years of experience on Talend ETL Enterprise Edition for Big Data, Data Integration and Data Quality.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch and Spark SQL.
  • Experienced in ETL methodology for performing Data Migration, Data Profiling, Extraction, Transformation and Loading using Talend; designed data conversions from a wide variety of source systems, including Oracle 10g/9i/8i/7.x, DB2, Netezza, SQL Server, Teradata, Hive and HANA, and non-relational sources such as flat files, XML and mainframe files.
  • Involved in code migrations from Dev to QA and production, and provided operational instructions for deployments.
  • Worked hands-on on migrating DataStage 8.7 ETL processes to Talend Studio ETL processes.
  • Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
  • Strong Data Warehousing ETL experience using Informatica 9.x/8.x/7.x Power Center client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, together with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
  • Hands-on experience with Pentaho Business Intelligence Server and Studio.
  • Expertise in using transformations such as Joiner, Expression, connected and unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS (a brief sketch follows this summary list).
  • Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server and BIRT Report Designer.
  • Experienced in designing ETL processes using Informatica to load data from sources to targets through data transformations.
  • Hands-on experience in developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.
  • Excellent experience in designing and developing multi-layer web-based information systems using Web Services, including Java and JSP.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using ERwin and ER-Studio.
  • Expertise in working with relational databases such as Oracle 12c/11g/10g/9i/8.x, SQL Server 2012/2008/2005, DB2 8.0/7.0, UDB, MS Access, Teradata and Netezza.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experienced with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant.
  • Experienced in integration of various data sources like Oracle SQL, PL/SQL, Netezza, SQL server and MS access into staging area.
  • Experienced in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experienced in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with them; also used Netezza utilities to load data and execute SQL scripts from UNIX.
  • Proficient in the Integration of various data sources with multiple relational databases like Oracle12c/11g /Oracle10g/9i, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
  • Extensively worked with the Netezza database to implement data cleanup and performance-tuning techniques.
  • Experienced in using Automation Scheduling tools like Autosys and Control-M.
  • Experience in data migration, consolidating data from different applications into a single application.
  • Responsible for data migration from MySQL Server to Oracle databases.
  • Experienced in batch scripting on windows and worked extensively with slowly changing dimensions.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
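
As a brief illustration of the Hive work mentioned above (table names, columns and the HDFS path are hypothetical, not taken from any actual project), a typical staged load looks like this:

    -- Stage raw HDFS files behind an external table, then load a managed,
    -- partitioned warehouse table (dw_orders is assumed to exist and to be
    -- partitioned by load_dt).
    SET hive.exec.dynamic.partition.mode=nonstrict;

    CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
      order_id BIGINT,
      cust_id  BIGINT,
      amount   DECIMAL(12,2),
      order_dt STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/raw/orders/';

    INSERT OVERWRITE TABLE dw_orders PARTITION (load_dt)
    SELECT order_id, cust_id, amount, order_dt, order_dt AS load_dt
    FROM stg_orders
    WHERE order_dt IS NOT NULL;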

TECHNICAL SKILLS

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Talend (TOS, TIS), Informatica Power Center 9.x/8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), SSIS, Ab Initio.

Databases: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2005, DB2 v8.1, Netezza, Teradata, HBase.

Methodologies: Data Modeling - Logical/Physical, Dimensional Modeling - Star/Snowflake

Languages: SQL, PL/SQL, UNIX shell scripting, C++, SOAP UI, JSP, Web Services, JavaScript, HTML, Eclipse

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, ClearTest, ClearCase.

PROFESSIONAL EXPERIENCE

Confidential, Dover, NH

ETL/ Talend Developer

Responsibilities:

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy files from one database to another, utilizing standard Talend components.
  • Used custom components in Talend.
  • Developed Talend ETL jobs to push data into Talend MDM and developed jobs to extract data from MDM.
  • Performed Spring Boot (STS) deployment, configuration and maintenance across a variety of UNIX and Windows platforms.
  • Developed data validation rules in Talend MDM to confirm the golden record.
  • Performed data governance tasks and cleaned up old database-generated reports every day.
  • Performed impact analysis of new requirements on the current system.
  • Supported production implementation and created SQL queries using Teradata.
  • Monitored jobs and checked job status using Bamboo.
  • Used numerous Talend components, including tELTJDBCInput, tELTJDBCMap, tELTJDBCOutput, file components and tLogRow, across many job designs.
  • Worked on Joblets (reusable code) & Java routines in Talend
  • Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File.
  • Coordinated with the business to gather requirements and prepared the Functional Specification document.
  • Created Talend Development Standards. This document describes the general guidelines for Talend developers, the naming conventions to be used in the Transformations and development and production environment structures.
  • Worked on Talend ETL and used features such as context variables, database components like tDBInput and tDBOutput, file components, ELT components, etc.
  • Involved in automating the Spring Boot process in Talend and the file handling in UNIX.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Used ESP as a Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (Cron Trigger)
  • Involved in end-to-end testing of jobs.
  • Wrote complex SQL queries to extract data from various sources and integrated it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Developed Type-1 and Type-2 mappings for current and historical data (a SQL sketch of the Type-2 pattern follows this section's environment line).
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Scheduled the Informatica jobs from third party scheduling tool ESP Scheduler.
  • Involved in migrating Informatica from version 8.6 to 9.6.
  • Performed an administrator role in migrating objects from one environment to another (DEV/QA/PROD).
  • Provided on-call support for production maintenance.

Environment: Talend Open Studio 7.0, Informatica Power Center 8.6.1/9.6.1, Teradata database, SQL, PL/SQL, Teradata Studio, DbVisualizer, UNIX, Spring Boot (STS), ESP, Bamboo.
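
The Type-2 logic mentioned in this section can be sketched as an expire-then-insert pattern in ANSI-style SQL. This is a simplified illustration only, with hypothetical CUST_DIM and CUST_STG tables, not the project's actual code:

    -- Step 1: expire current dimension rows whose tracked attributes changed.
    UPDATE cust_dim
    SET eff_end_dt = CURRENT_DATE, current_flag = 'N'
    WHERE current_flag = 'Y'
      AND cust_id IN (
            SELECT s.cust_id
            FROM cust_stg s
            JOIN cust_dim d
              ON d.cust_id = s.cust_id AND d.current_flag = 'Y'
            WHERE COALESCE(s.addr, '') <> COALESCE(d.addr, ''));

    -- Step 2: insert a fresh current version for changed and brand-new
    -- customers (changed rows no longer have a current version after step 1).
    INSERT INTO cust_dim (cust_id, addr, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.cust_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM cust_stg s
    LEFT JOIN cust_dim d
      ON d.cust_id = s.cust_id AND d.current_flag = 'Y'
    WHERE d.cust_id IS NULL;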

Confidential, Union Beach, NJ

Sr. ETL/Talend Developer

Responsibilities:

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Created projects in Talend Administration Center (TAC) and assigned roles to the users.
  • Worked on end-to-end Talend development, which included all Talend admin tasks.
  • Handled Talend admin tasks such as installation, configuration, Job Server configuration, CommandLine setup, project creation, user creation and access assignment, job scheduling and execution plans.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used numerous Talend components, including tJava, tFTP, tSalesforceInput, tSalesforceOutput, delimited file components, tLogRow, tLogCatcher and tUniqRow, across many job designs.
  • Worked on Joblets (reusable code) & Java routines in Talend
  • Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File.
  • Integrated Maven with Git to manage and deploy project-related tags.
  • Installed and configured Git and communicated with the repositories in GitHub.
  • Performed necessary day-to-day Subversion/Git support for different projects.
  • Created and maintained Subversion/Git repositories, branches and tags.
  • Coordinated with the business to gather requirements and prepared the Functional Specification document.
  • Created Talend Development Standards; this document describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Worked on Talend ETL and used features such as context variables, database components, file components and ELT components.
  • Involved in automating the FTP process in Talend and loading the data into Salesforce.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (Cron Trigger)
  • Involved in end-to-end testing of jobs.
  • Wrote Java conditions to extract data from various sources and integrated it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type-1 and Type-2 mappings for current and historical data.
  • Incorporated business logic for incremental data loads on a daily basis (the watermark pattern is sketched after this list).
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Performed administrator role in migrating the objects from one environment to the other DEV/QA/PROD.
  • Provided on-call support for production maintenance.
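
A minimal sketch of the daily incremental-load logic referenced above, assuming a hypothetical ETL_CONTROL watermark table (all names are illustrative, not from the actual project):

    -- Pull only the rows changed since the last successful run.
    SELECT t.*
    FROM src_transactions t
    WHERE t.updated_ts > (SELECT last_run_ts
                          FROM etl_control
                          WHERE job_name = 'DAILY_TXN_LOAD');

    -- After the load succeeds, advance the watermark for the next run.
    UPDATE etl_control
    SET last_run_ts = CURRENT_TIMESTAMP
    WHERE job_name = 'DAILY_TXN_LOAD';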

Confidential, Irvine, CA

Sr. ETL/Talend Developer

Responsibilities:

  • Involved in end-to-end development of the implementation and rollout.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Created projects in Talend Administration Center (TAC) and assigned roles to the users.
  • Worked on end-to-end Talend development, which included all Talend admin tasks.
  • Handled Talend admin tasks such as installation, configuration, Job Server configuration, CommandLine setup, project creation, user creation and access assignment, job scheduling and execution plans.
  • Worked on data migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Used numerous Talend components, including tJava, tOracleInput/tOracleOutput, tXMLMap, delimited file components, tLogRow and tLogCatcher, across many job designs.
  • Worked on Joblets (reusable code) & Java routines in Talend
  • Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File.
  • Integrated Maven with Git to manage and deploy project-related tags.
  • Installed and configured Git and communicated with the repositories in GitHub.
  • Performed necessary day-to-day Subversion/Git support for different projects.
  • Created and maintained Subversion/Git repositories, branches and tags.
  • Coordinated with the business to gather requirements and prepared the Functional Specification document.
  • Created Talend Development Standards; this document describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
  • A data fabric unifies data management across distributed resources to allow consistency and control of data mobility, security, visibility, protection, and access.
  • NetApp's vision for data management is a data fabric that seamlessly connects different clouds, whether they are private, public, or hybrid environments.
  • Involved in automation of FTP process in Talend and FTP the Files in UNIX.
  • Optimized the performance of the mappings by various tests on sources, targets and the data fabric.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (Cron Trigger)
  • Involved in end-to-end testing of jobs.
  • Wrote complex SQL queries to extract data from various sources and integrated it with Talend.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed over 90 mappings to support the business logic including the historical data for reporting needs.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
  • Developed Type-1 and Type-2 mappings for current and historical data.
  • Incorporated business logic for Incremental data loads on a daily basis.
  • Wrote complex PL/SQL procedures for specific requirements (a simplified procedure sketch follows this section's environment line).
  • Used Parameter Variables and Mapping variables for incremental data feeds.
  • Used Shared folders for Source, Targets and Lookups for reusability of the objects.
  • Scheduled the Informatica jobs with the third-party scheduling tool Autosys.
  • Involved in migrating Informatica from version 8.6 to 9.6.
  • Performed an administrator role in migrating objects from one environment to another (DEV/QA/PROD).
  • Provided on-call support for production maintenance.
  • Platform: Informatica 9.6, DB2 UDB, UNIX, Autosys, SQL Server 2008.

Environment: Informatica Power Center 8.6.1/9.6.1, Oracle 11g, SQL, PL/SQL, TOAD, MySQL, UNIX, Autosys, OBIEE, XML, flat files.
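
The PL/SQL procedures mentioned above might take roughly the following shape. This is a hedged sketch with hypothetical table and procedure names (SALES_STG, SALES_FACT, ETL_ERROR_LOG), not code from the engagement:

    -- Set-based upsert from a staging table into a fact table,
    -- with simple error logging on failure.
    CREATE OR REPLACE PROCEDURE load_daily_sales (p_load_dt IN DATE) AS
    BEGIN
      MERGE INTO sales_fact f
      USING (SELECT sale_id, store_id, amount
             FROM sales_stg
             WHERE load_dt = p_load_dt) s
      ON (f.sale_id = s.sale_id)
      WHEN MATCHED THEN
        UPDATE SET f.amount = s.amount
      WHEN NOT MATCHED THEN
        INSERT (sale_id, store_id, amount)
        VALUES (s.sale_id, s.store_id, s.amount);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        INSERT INTO etl_error_log (proc_name, err_msg, logged_at)
        VALUES ('LOAD_DAILY_SALES', SQLERRM, SYSTIMESTAMP);
        COMMIT;
        RAISE;
    END load_daily_sales;
    /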

Confidential, New York, NY

ETL Talend Developer

Responsibilities:

  • As a consultant, studied the existing data marts to understand and integrate the new source of data.
  • Managed the offshore support group in India for support issues as well as small enhancements to the data warehouse.
  • Prepared the weekly status report and coordinated weekly status calls with the technology lead/business.
  • Designed and created new Informatica jobs to implement new business logic into the existing process.
  • Used Informatica modules (Repository Manager, Designer, Workflow Manager and Workflow Monitor) to accomplish the end-to-end ETL process.
  • Performed data profiling on sources to analyze the content, quality and structure of source data during mapping development.
  • Created required scripts/transformations to extract the source data from various sources such as Oracle, flat files, etc.
  • Used the complex functionality of Informatica (Mapplets, Stored Procedures, Normalizer, Update Strategy, Router, Joiner, Java and SQL transformations, etc.) to interpret the business logic into the ETL mappings.
  • Designed and developed complex aggregate, joiner, lookup transformations to implement the business rules in the ETL mappings to load the target Facts and Dimensions.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Used Mapplets and reusable transformations to prevent redundancy of transformation usage and to aid maintainability.
  • Created complex Informatica mappings, as well as simple mappings with complex SQL in them, based on the needs and requirements of business users.
  • Used Informatica features to implement Type 1, 2 and 3 changes in slowly changing dimensions and Change Data Capture (CDC).
  • Created different database triggers; created and configured workflows, worklets and sessions to transport the data to target systems using Informatica Workflow Manager.
  • Fine-tuned the session performance using Session partitioning for long running sessions.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Used Versioning, Labels and Deployment group in the production move process.
  • Automated workflows using UNIX scripts with pmcmd and pmserver commands.
  • Setup Permissions for Groups and Users in all Environments (Dev, UAT and Prod).
  • Created tables, views, primary keys, indexes, constraints, sequences, grants and synonyms (illustrative DDL follows this section's environment line).
  • Involved in developing optimized PL/SQL code to centralize the application: packages and procedures containing PL/SQL were created, stored in the database, and fired off when the contents of the database changed.
  • Used debugger to test the mapping and fixed the bugs.
  • Conducted Design and Code reviews and extensive documentation of standards, best practices and ETL Procedures.
  • Used Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Sessions and scheduling them to run at specified time.
  • Developed Oracle Stored Procedures, Packages and Functions and utilized in ETL Process.
  • Handled the performance tuning of Informatica mappings at various levels to accomplish the established standard throughput.
  • Analyzed the target data mart for accuracy of data for the pre-defined reporting needs.
  • Wrote complex SQL queries to achieve and interpret the reporting needs in the ETL process; also worked on SQL tuning to achieve maximum throughput.
  • Assisted in all aspects of the project to meet the scheduled delivery time.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Conducted unit testing of all ETL mappings as well as helped QA team in conducting their testing.
  • Wrote UNIX shell scripts to work with flat files, to define parameter files and to create pre and post session commands.
  • Used Autosys Tool to schedule shell scripts and Informatica jobs.
  • Performed unit and grid integration testing and validated results with end users.
  • Worked as part of a team and provided 24x7 production support.

Environment: Informatica Power Center 9.5, Erwin, MS Visio, Oracle 11g, SQL, PL/SQL, Oracle SQL Developer, SQL Server 2008, flat files, XML, mainframe COBOL files, Autosys, UNIX shell scripting, Subversion.
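
Illustrative Oracle DDL for the object types listed above (all names are hypothetical; the real objects belonged to the client's schema):

    -- Table with a primary-key constraint and a default value.
    CREATE TABLE ods_customer (
      cust_id    NUMBER        CONSTRAINT pk_ods_customer PRIMARY KEY,
      cust_name  VARCHAR2(100) NOT NULL,
      created_dt DATE          DEFAULT SYSDATE
    );

    -- Supporting index, sequence, grant and synonym.
    CREATE INDEX ix_ods_customer_name ON ods_customer (cust_name);
    CREATE SEQUENCE seq_cust_id START WITH 1 INCREMENT BY 1;
    GRANT SELECT ON ods_customer TO reporting_role;
    CREATE SYNONYM customer FOR ods_customer;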

Confidential

Informatica Developer

Responsibilities:

  • Extraction, transformation and data loading were performed using Informatica into the database. Involved in logical and physical modeling of the drugs database.
  • Designed the ETL processes using Informatica to load data from Oracle and flat files to the target Oracle data warehouse database.
  • Based on the requirements created Functional design documents and Technical design specification documents for ETL.
  • Created tables, views, indexes, sequences and constraints.
  • Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic (a minimal trigger sketch follows this section's environment line).
  • Transferred data to the database using SQL*Loader.
  • Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Implemented SCD methodology including Type 1 and Type 2 changes.
  • Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.
  • Involved in design and development of data validation, load process and error control routines.
  • Used pmcmd to run workflows and created Cron jobs to automate scheduling of sessions.
  • Involved in ETL process from development to testing and production environments.
  • Analyzed the database for performance issues and conducted detailed tuning activities for improvement
  • Generated monthly and quarterly drugs inventory/purchase reports.
  • Coordinated database requirements with Oracle programmers and wrote reports for sales data.

Environment: Informatica Power Center 7.1, Oracle 9i, SQL Server 2005, XML, SQL, PL/SQL, UNIX shell scripting.
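
A minimal sketch of the kind of PL/SQL trigger referenced above, using hypothetical drug-inventory tables in keeping with this project's domain:

    -- Audit price changes on a master table into an audit table.
    CREATE OR REPLACE TRIGGER trg_drug_price_audit
    AFTER UPDATE OF price ON drug_master
    FOR EACH ROW
    BEGIN
      INSERT INTO drug_price_audit (drug_id, old_price, new_price, changed_at)
      VALUES (:OLD.drug_id, :OLD.price, :NEW.price, SYSTIMESTAMP);
    END;
    /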

Confidential

Oracle Developer

Responsibilities:

  • Involved in creating database objects such as tables, stored procedures, views, triggers and user-defined functions for the project I was working on.
  • Analyzed the client requirements and translated them into technical requirements.
  • Gathered requirements from the end user and was involved in developing the logical model and implementing requirements in SQL Server 2000.
  • Performed data migration (import and export via BCP) from text files to SQL Server.
  • Responsible for creating reports based on the requirements using reporting services 2000.
  • Identified the database tables for defining the queries for the reports.
  • Worked on SQL Server queries, stored procedures, triggers and joins (a minimal procedure sketch follows this section's environment line).
  • Defined report layouts for formatting the report design as per the need.
  • Identified and defined the datasets for report generation.
  • Formatted the reports using global variables and expressions.
  • Deployed generated reports onto the report server so they could be accessed through a browser.
  • Maintained data integrity by performing validation checks.

Environment: MS SQL Server 2000, Windows Server 2000, SQL Query Analyzer and Enterprise Manager, MS Access 2000 and Windows NT platform.
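
A minimal sketch of the SQL Server 2000-era stored-procedure-with-join work referenced above (the procedure, tables and columns are hypothetical):

    -- Parameterized report query joining a fact table to a lookup table.
    CREATE PROCEDURE dbo.usp_sales_by_region
      @start_dt DATETIME,
      @end_dt   DATETIME
    AS
    BEGIN
      SELECT r.region_name,
             SUM(s.amount) AS total_sales
      FROM dbo.sales s
      INNER JOIN dbo.region r
        ON r.region_id = s.region_id
      WHERE s.sale_dt BETWEEN @start_dt AND @end_dt
      GROUP BY r.region_name
    END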
