
Talend Developer Resume


Charlotte, NC

SUMMARY:

  • Seasoned ETL Developer with around 8 years of experience in industries such as Banking, Retail and E-Commerce, with expertise in Data Warehousing, ETL maintenance, Master Data Management (MDM) strategy, Data Quality and Big Data ecosystems.
  • Expertise in the Talend Data Integration and Big Data Integration suites for the design and development of ETL/Big Data code and mappings for enterprise DWH Talend projects.
  • Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Experienced in working with the Hortonworks distribution of Hadoop: HDFS, MapReduce, Hive, Sqoop, Flume, Pig, HBase and MongoDB.
  • Experience in dealing with structured and semi-structured data in HDFS.
  • Experience in integration of various data sources like Teradata, SQL Server, Oracle, DB2, Netezza and Flat Files.
  • Extracted data from multiple operational sources and loaded staging areas, Data Warehouses and Data Marts using SCD (Type 1/Type 2/Type 3) loads.
  • Expert in using Talend debugging and DataStage to diagnose errors in jobs; used the tMap expression editor to evaluate complex expressions and inspect transformed data to resolve mapping issues.
  • Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tWarn, tBuffer and tContextLoad.
  • Strong Oracle/SQL Server database programming - development of stored procedures, triggers and views.
  • Extensive experience in using Talend features such as context variables, triggers, connectors for Database and flat files.
  • Hands-on experience with many palette components for designing jobs; used context variables to parameterize Talend jobs.
  • Experience in Debugging, Error Handling and Performance Tuning of sources, targets, Jobs etc.
  • Extensive experience in Relational and Dimensional Data modelling for creating Logical and Physical Design of Database and ER Diagrams using data modelling tools like ERWIN and ER Studio.
  • Worked extensively on schema design for Data Warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, Embarcadero ER/Studio and Microsoft Visio.
  • Good knowledge of normalization and de-normalization techniques for optimal performance, as well as XML data and XSD schema design.
  • Experience in development methodologies such as Agile and RUP.
  • Strong written and verbal communication skills, self-motivation, creativity and the ability to adapt to new technologies and applications.
  • Fast learner with excellent analytical, interpersonal and leadership skills; works effectively both independently and as part of a team.
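
As an illustration of the Talend routine and tMap expression work described above, the sketch below shows the kind of reusable Java routine Talend projects define (routines are plain static Java methods callable from tMap expressions). The routine names and behavior here are hypothetical examples, not code from an actual project.

```java
// Illustrative sketch only: Talend user routines are plain static Java
// methods that tMap expressions can call. Names here are hypothetical.
public class StringRoutines {

    // Null-safe trim, commonly used inside tMap expressions to avoid
    // NullPointerExceptions on optional source columns.
    public static String safeTrim(String value) {
        return value == null ? "" : value.trim();
    }

    // Normalizes a code column to upper case, with a default for
    // null or blank input.
    public static String normalizeCode(String code, String defaultCode) {
        return (code == null || code.trim().isEmpty())
                ? defaultCode
                : code.trim().toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(safeTrim("  abc  "));
        System.out.println(normalizeCode(null, "NA"));
    }
}
```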

TECHNICAL SKILLS:

ETL Tools: Talend Enterprise Edition for Data Integration, Big Data, MDM, Informatica Power center

Data Modeling: Dimensional Data Modeling, Erwin (4.0/3.5.5/3.5.2), Oracle Designer, Star/Snowflake schema modeling, Fact & Dimension tables

Programming Languages & Frameworks: Java, Spring, Struts, Apache, PL/SQL, Unix Shell scripting

RDBMS: Oracle 10g/9i/8.x, MySQL, MS SQL Server, MS Access

Languages: SQL, PL/SQL, C, C++, VB, Shell Scripting, Java and XML

Operating Systems: Microsoft Windows, MS-DOS, Linux (Red Hat), Windows Server 2003, 2008

Big Data Environment: Hadoop, HDFS, Pig, Hive, HCatalog, HBase, Flume, Sqoop

Reporting Tools: JasperSoft BI, SQL BI

Other Tools: Eclipse, SOAP UI, Oracle SQL Developer, TOAD, Microsoft Visio

WORK EXPERIENCE:

Talend Developer

Confidential, Charlotte, NC

Responsibilities:

  • Worked with Data Mapping Team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic for the ETL process using Talend.
  • Involved in the ETL design and its documentation.
  • Developed jobs in Talend Enterprise Edition across stage, intermediate, conversion and target layers.
  • Worked on Talend ETL to load data from various sources to Oracle DB. Used tMap, tReplicate, tFilterRow, tSort and various other Talend features.
  • Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
  • Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput and tOracleOutput.
  • Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including Fact and Slowly Changing Dimension (SCD Type 1 and Type 2) loads to capture changes.
  • Responsible for development, support and maintenance of ETL (Extract, Transform, Load) processes using Talend Integration Suite.
  • Automated the FTP process in Talend and transferred files on UNIX.
  • Created the Talend Development Standards document, describing general guidelines for Talend developers, naming conventions for transformations, and the development and production environment structures.
  • Extracted data from Oracle as one of the source databases.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations
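
The SCD Type 2 change capture described above can be sketched in plain Java (the language of Talend routines). This is a minimal in-memory illustration of the Type 2 rule (expire the current row, insert a new version); the class and field names are hypothetical, not project code.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal SCD Type 2 sketch: when an incoming value differs from the
// current dimension row, expire that row and append a new version.
// All names (DimRow, applyScd2) are illustrative.
public class Scd2Demo {
    static class DimRow {
        final String key;   // natural key of the dimension member
        final String value; // tracked attribute
        boolean current;    // true for the active version

        DimRow(String key, String value, boolean current) {
            this.key = key;
            this.value = value;
            this.current = current;
        }
    }

    // Applies one incoming record to the dimension with Type 2 semantics.
    static void applyScd2(List<DimRow> dim, String key, String newValue) {
        for (DimRow row : dim) {
            if (row.current && row.key.equals(key)) {
                if (row.value.equals(newValue)) {
                    return;          // unchanged: nothing to do
                }
                row.current = false; // expire the old version
            }
        }
        dim.add(new DimRow(key, newValue, true)); // insert the new version
    }
}
```

A real job would also stamp effective/expiry dates on each version; they are omitted here to keep the sketch short.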

Environment: Talend Data integration 5.6.1, Oracle 11g, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, T-SQL, SSIS, TOAD, AIX, Shell Scripts, Autosys.

Talend Developer

Confidential, San Francisco, CA

Responsibilities:

  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Developed complex ETL mappings for Stage, Dimensions, Facts and Data marts load
  • Worked on Data Migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Loaded and transformed large sets of structured data from Oracle/SQL Server into HDFS using Talend Big Data studio.
  • Used Big Data components (Hive components) for extracting data from hive sources.
  • Wrote HiveQL queries using joins and implemented in tHiveInput component.
  • Utilized Big Data components like tHiveInput, tHiveOutput, tHDFSOutput, tHiveRow, tHiveLoad, tHiveConnection, tOracleInput, tOracleOutput, tPreJob, tPostJob, tLogRow.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Performance tuning: used tMap cache properties, multi-threading and tParallelize components for better performance with large source data; tuned SQL source queries to restrict unwanted data in the ETL process.
  • Used many other Talend components across job designs, including tJava, tOracle components, tXMLMap, delimited-file components, tLogRow and logging components.
  • Worked on Joblets (reusable code) & Java routines in Talend
  • Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load it into a SQL Server database.
  • Experienced in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Implemented Error Logging, Error Recovery, and Performance Enhancement’s & created Audit Process (generic) for various Application teams.
  • Experience in using Repository Manager for Migration of Source code from Lower to higher environments.
  • Created projects in TAC, assigned appropriate roles to developers and integrated SVN (Subversion).
  • Used the Talend Administration Center Job Conductor to schedule ETL jobs on daily, weekly, monthly and yearly bases (Cron trigger).

Environment: Talend Platform for Big Data 5.6.2, Enterprise Platform for Data Integration and MDM (v6.1.1/5.5.1/5.6.1), UNIX, Oracle 11g, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP

ETL/Talend Developer

Confidential, San Jose, CA

Responsibilities:

  • Developed data architecture and ETL batch processes in Talend, SQL Server and PostgreSQL for call-center systems, including the architecture for parsing documents related to call-center notes.
  • Designed and implemented new environments and ETL frameworks for conformed data delivery and call-center analytics in Tableau and a custom analytics platform.
  • Worked on dimensional models, SCDs, error-event schemas and audit dimensions to satisfy business requirements, and worked with development teams to implement the data models, mapping docs and ETL to satisfy technical requirements.
  • Developed jobs, components and Joblets in Talend
  • Created complex mappings in Talend using tHash, tDenormalize, tMap, tUniqueRow and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
  • Used tStatsCatcher, tDie and tLogRow to create a generic joblet that stores processing stats into a database table to record job history.
  • Created Talend Mappings to populate the data into dimensions and fact tables.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and also worked on different methods of logging.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in jobs.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
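
The global map usage mentioned above can be sketched as follows. In a real Talend job the runtime supplies a job-level `Map<String, Object>` named globalMap, which a tJava step writes and later components read; here a plain HashMap stands in for Talend's runtime map, and the key name is illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Standalone sketch of Talend's globalMap pattern: components share
// state through a job-level Map<String, Object>. A HashMap stands in
// for the runtime map; "loadedRowCount" is a hypothetical key.
public class GlobalMapDemo {
    static final Map<String, Object> globalMap = new HashMap<>();

    // tJava step after a load: record the row count for later use.
    static void captureRowCount(int rows) {
        globalMap.put("loadedRowCount", rows);
    }

    // Later component (e.g. an audit insert) reads the value back.
    static int readRowCount() {
        return (Integer) globalMap.get("loadedRowCount");
    }

    public static void main(String[] args) {
        captureRowCount(1250);
        System.out.println("Rows loaded: " + readRowCount());
    }
}
```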

Environment: Talend Open Studio 5.0.1, Informatica Power center, UNIX, Oracle, SQL Server, TOAD, AutoSys.

ETL Data Modeler

Confidential

Responsibilities:

  • Interacted with the business users on regular basis to consolidate and analyze the requirements.
  • Identified the entities and the relationships between them to develop a logical model, later translated into a physical model.
  • Developed Facts & Dimensions using ERWIN.
  • Used Normalization up to 3NF and De-normalization for effective performance.
  • Developed Star and Snowflake schemas using dimensional data models.
  • Involved in designing OLAP data models and extensively used slowly changing dimensions (SCD).
  • Performed technical impact analysis: identified new/modified upstream interfaces to the DW, listed the changes to ETL/Informatica jobs needed to load the new/modified data, identified new entities in the DW, and assessed the impact on the business views used for reporting.
  • Extracted data from databases like Oracle, SQL server and DB2 using Informatica to load it into a single repository for data analysis.
  • Worked on multiple data marts in Enterprise Data Warehouse (EDW).
  • Developed ETL mappings
  • Worked on Informatica Power Center Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Worked on Informatica Power Center Workflow Manager tools like Task Designer, Workflow Designer, and Worklet Designer.
  • Designed and developed Informatica power center medium to complex mappings using transformations such as the Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure and Update Strategy
  • Worked as a key project resource taking day-to-day work direction and accepting accountability for technical aspects of development.
  • Developed the business rules for cleansing/validating/standardization of data using Informatica Data Quality.
  • Designed and developed multiple reusable cleanse components.
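
The cleansing/standardization rules above were built in Informatica Data Quality; the standalone Java sketch below only illustrates the kind of standardization such a rule performs. The method name and format are hypothetical examples.

```java
// Illustrative only: the real rules lived in Informatica Data Quality.
// This sketch shows one typical standardization, normalizing a US phone
// number to 999-999-9999 (a hypothetical target format).
public class CleanseDemo {
    // Returns the standardized number, or null when fewer than
    // 10 digits are present (treated as invalid input).
    public static String standardizePhone(String raw) {
        if (raw == null) return null;
        String digits = raw.replaceAll("\\D", ""); // keep digits only
        if (digits.length() < 10) return null;
        digits = digits.substring(digits.length() - 10); // last 10 digits
        return digits.substring(0, 3) + "-" + digits.substring(3, 6)
                + "-" + digits.substring(6);
    }

    public static void main(String[] args) {
        System.out.println(standardizePhone("(704) 555-1234"));
    }
}
```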

Environment: Erwin r7.3, SQL/MS SQL Server, MS Analysis Services, Windows NT, MS Visio, XML, Informatica.

SQL Developer

Confidential

Responsibilities:

  • Migrated Access objects to SQL Server 7.0.
  • Implemented the database for production and marketing purpose.
  • Created tables, indexes and constraints.
  • Loaded and maintained Data Warehouses and Data Marts using Informatica.
  • Created triggers and stored procedures to implement business logic. Administered all SQL Server database objects, logins, users and permissions on each registered server.
  • Managed Database files, transaction log and estimated space requirements. Managed ongoing activities like importing and exporting, backup and recovery.
  • Configured and monitored database application.
  • Designed and implemented several complex reports and administration modules.
  • Defined stored procedures for module integrity and application.
  • Wrote database triggers in T-SQL to check the referential integrity of the database.

Environment: MS SQL Server 2000/2005, Windows Server 2003, T-SQL, MS Excel, MS Access, Crystal Reports, Windows XP.
