
Talend Developer Resume


Dallas, TX

SUMMARY

  • 7+ years of experience in the IT industry spanning software analysis, design, implementation, coding, development, testing, and maintenance, with a focus on data warehousing applications using ETL tools such as Talend and Informatica.
  • Extensive knowledge of data modeling/architecture and database administration, with specialization in ETL platforms (Talend, Informatica).
  • Expertise in the Talend Data Integration and Big Data Integration suites for designing and developing ETL/big data jobs and mappings for enterprise DWH ETL projects.
  • Developed efficient mappings for data extraction/transformation/loading (ETL) from different sources to a target data warehouse.
  • Experience in integration of various data sources like Teradata, SQL Server, Oracle, DB2, Netezza and Flat Files.
  • Expertise in dimensional data modeling techniques such as star schema and snowflake modeling, and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3); a minimal SCD Type 2 sketch follows this summary.
  • Extensive experience in designing and developing complex mappings with varied transformation logic, using more than 100 components when designing jobs in Talend.
  • Excellent understanding of data warehousing concepts and best practices; involved in the full development life cycle of data warehouse projects.
  • Expertise in enhancements/bug fixes, performance tuning, troubleshooting, impact analysis and research skills.
  • Created complex mappings in Talend using components such as tMap, tJoin, tReplicate, tParallelize, tAggregateRow, tDie, tUnique, tFlowToIterate, tSort, tFilterRow, tWarn, tBufferInput/tBufferOutput, and tContextLoad.
  • Extensive experience in relational and dimensional data modeling, creating logical and physical database designs and ER diagrams using data modeling tools.
  • Worked extensively on schema design for data warehouse and ODS architectures using tools such as Erwin Data Modeler, PowerDesigner, Embarcadero ER/Studio, and Microsoft Visio.
  • Good knowledge of normalization and de-normalization techniques, along with XML data and XSD schema design.
  • Experience in development methodologies such as Agile and RUP.
  • Strong written and oral communication skills, self-motivation, creativity and ability to adapt to new technologies and applications.
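
As a minimal illustration of the SCD Type 2 pattern referenced above, the sketch below expires the current dimension row and inserts a new version over JDBC. The table and column names (DIM_CUSTOMER, CUSTOMER_ID, ADDRESS, etc.) and the connection details are hypothetical placeholders, not taken from any project described in this resume.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;

/**
 * Minimal SCD Type 2 sketch: expire the current dimension row when a tracked
 * attribute changes, then insert a new current row. Handling of brand-new
 * business keys is omitted for brevity.
 */
public class Scd2LoadSketch {

    public static void applyChange(Connection con, long customerId,
                                   String newAddress, Timestamp loadTs) throws Exception {
        // 1. Close out the existing "current" row if the tracked attribute changed.
        String expireSql =
            "UPDATE DIM_CUSTOMER SET EFF_END = ?, CURRENT_FLAG = 'N' " +
            "WHERE CUSTOMER_ID = ? AND CURRENT_FLAG = 'Y' AND ADDRESS <> ?";
        try (PreparedStatement expire = con.prepareStatement(expireSql)) {
            expire.setTimestamp(1, loadTs);
            expire.setLong(2, customerId);
            expire.setString(3, newAddress);
            int expired = expire.executeUpdate();

            // 2. Insert the new current version only when an old row was expired.
            if (expired > 0) {
                String insertSql =
                    "INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, ADDRESS, EFF_START, EFF_END, CURRENT_FLAG) " +
                    "VALUES (?, ?, ?, NULL, 'Y')";
                try (PreparedStatement insert = con.prepareStatement(insertSql)) {
                    insert.setLong(1, customerId);
                    insert.setString(2, newAddress);
                    insert.setTimestamp(3, loadTs);
                    insert.executeUpdate();
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Placeholder connection details for an Oracle or SQL Server target.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/DWH", "etl_user", "etl_pwd")) {
            applyChange(con, 1001L, "500 Main St, Dallas TX",
                        new Timestamp(System.currentTimeMillis()));
        }
    }
}
```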

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Talend Developer

Responsibilities:

  • Involved in the ETL design and its documentation.
  • Developed jobs in Talend Enterprise Edition spanning the stage, source, intermediate, conversion, and target layers.
  • Worked with Data Mapping Team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic for the ETL process using Talend.
  • Designed, developed, and deployed end-to-end data integration solutions.
  • Designed and implemented the ETL process using Talend Enterprise Big Data Edition to load data from the source to the target database.
  • Involved in data extraction from Oracle, flat files, and XML files using Talend, with Java as the backend language.
  • Worked on Talend ETL and used features such as context variables, database components like tMSSqlInput and tOracleOutput, file components, and ELT components.
  • Followed the organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute them.
  • Worked on context variables and defined contexts for database connections and file paths so jobs could be migrated easily between environments (see the sketch after this list).
  • Implemented error handling in Talend to validate data integrity and completeness for data coming from flat files.
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput, and tOracleOutput.
  • Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including fact tables and Slowly Changing Dimensions (SCD Type 1 and Type 2) to capture changes.
  • Conducted introductory and hands-on sessions on Hadoop HDFS architecture, Hive, Talend, and Pig for other teams.
  • Responsible for development, support, and maintenance of the ETL (Extract, Transform, and Load) processes using Talend Integration Suite.
  • Automated the FTP process in Talend and transferred the files via FTP on UNIX.
  • Created the Talend development standards document, describing general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Extracted data from Oracle as one of the source databases.
  • Optimized mapping performance through various tests on sources, targets, and transformations.
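
The sketch below illustrates the context-variable approach referenced above, outside the Studio: one properties file per environment supplies connection details and file paths, so the job logic itself never changes between development, test, and production. The file names, property keys, and connection details are assumptions for illustration only.

```java
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

/**
 * Context-driven connection sketch: the environment name chooses a properties
 * file, analogous to selecting a Talend context at run time.
 */
public class ContextDrivenConnection {

    public static void main(String[] args) throws Exception {
        // e.g. "dev", "test" or "prod" passed on the command line.
        String env = args.length > 0 ? args[0] : "dev";

        Properties context = new Properties();
        try (FileInputStream in = new FileInputStream("context_" + env + ".properties")) {
            context.load(in);
        }

        String url  = context.getProperty("db.url");   // e.g. jdbc:sqlserver://host:1433;databaseName=STAGE
        String user = context.getProperty("db.user");
        String pwd  = context.getProperty("db.password");
        String landingDir = context.getProperty("file.landing.dir"); // flat-file drop location

        try (Connection con = DriverManager.getConnection(url, user, pwd)) {
            System.out.println("Connected to " + env + " target; reading files from " + landingDir);
            // ... extraction / load steps would follow here ...
        }
    }
}
```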

Environment: Talend Data Integration 5.6.1, Oracle 11g, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, T-SQL, SSIS, TOAD, AIX, Shell Scripts, Autosys.

Confidential, IL

Talend Developer

Responsibilities:

  • Responsible for building jobs based on the ETL specification documents.
  • Good experience with ETL concepts and analytics.
  • Responsible for migrating PeopleSoft data into the Oracle system using Talend DI jobs.
  • Good experience developing master/child jobs using tRunJob components.
  • Good experience working with Data Integration components such as tMap, tFileInputDelimited, tFileOutputDelimited, tFlowToIterate, tUnique, and tJava.
  • Responsible for developing DI jobs to implement address validation, cleansing, and standardization in Talend ETL with different components, using features such as context variables, parameter files, and database components.
  • Worked with mainframe technology to process the files from egateway.
  • Experience with Talend for Big Data on the Cloudera and Hortonworks distributions.
  • Responsible for understanding & deriving the new requirements from Business Analysts/Stakeholders.
  • Responsible for data ingestion into the data lake from multiple source systems using Talend Big Data Hive and HBase components (see the sketch after this list).
  • Responsible for deliverables as committed in the sprint planning.
  • Experience with performance tuning techniques.
  • Experience working with Oracle BI tools and GitHub.
  • Good experience troubleshooting issues with Talend components.
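
A minimal sketch of the Hive-based ingestion step referenced above, expressed as the plain Hive JDBC calls that a tHiveConnection/tHiveRow pair would roughly perform; the host, database, table, and HDFS path names are placeholders, not details from the project.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

/**
 * Data-lake ingestion sketch over Hive JDBC: land a raw extract already
 * copied to HDFS into a staging table, then conform it into a curated table.
 */
public class HiveIngestionSketch {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(
                 "jdbc:hive2://hadoop-edge:10000/datalake", "etl_user", "");
             Statement stmt = con.createStatement()) {

            // Land the raw extract into the staging table.
            stmt.execute(
                "LOAD DATA INPATH '/landing/orders/2016-05-01' " +
                "INTO TABLE stg_orders");

            // Conform into the curated zone with a join to a reference table,
            // similar to what a tHiveRow component would run.
            stmt.execute(
                "INSERT INTO TABLE curated_orders " +
                "SELECT o.order_id, o.order_dt, c.customer_key, o.amount " +
                "FROM stg_orders o JOIN dim_customer c ON o.customer_id = c.customer_id");
        }
    }
}
```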

Confidential

ETL/Talend developer

Responsibilities:

  • Worked closely with data architects on table design and was involved in modifying technical specifications.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders.
  • Developed complex ETL mappings for stage, dimension, fact, and data mart loads.
  • Good experience in Data Migration using export/import.
  • Created Talend jobs using the dynamic schema feature.
  • Experienced with AWS components, storing data in S3 buckets using Talend.
  • Worked with different Sources such as Oracle, SQL Server and Flat files
  • Worked on project documentation, prepared source-to-target mapping specs with the business logic, and was involved in data modeling.
  • Created multiple big data integration projects using Kafka, Sqoop, HBase, Pig, and Hive.
  • Loaded and transformed large sets of structured data from Oracle and SQL Server into HDFS using Talend Big Data Studio.
  • Used Big Data (Hive) components for extracting data from Hive sources.
  • Wrote HiveQL queries using joins and implemented them in the tHiveInput component.
  • Utilized Big Data components like tHiveInput, tHiveOutput, tHDFSOutput, tHiveRow, tHiveLoad, tHiveConnection, tOracleInput, tOracleOutput, tPreJob, tPostJob, tLogRow.
  • Experienced in writing SQL queries, using joins to access data from multiple databases, including MySQL.
  • Implemented error logging, error recovery, and performance enhancements, and created a generic audit process for various application teams.
  • Experience using Repository Manager for migrating source code from lower to higher environments.
  • Created projects in TAC, assigned appropriate roles to developers, and integrated SVN (Subversion).
  • Used the debugger and breakpoints to view transformation output and debug mappings.
  • Implemented performance tuning in mappings and sessions by identifying bottlenecks and implementing effective transformation logic.
  • Experienced in working with a Snowflake data warehouse alongside AWS components.
  • Worked with cAWSConnection, cAWSS3, and cAWSSES (a minimal S3 upload sketch follows this list).
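
A hedged sketch of the kind of S3 upload step performed by cAWSS3, written directly against the AWS SDK for Java (v1); the bucket name, object key, file path, and region are hypothetical placeholders.

```java
import java.io.File;

import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

/**
 * Minimal S3 upload sketch: push a daily extract file into a landing bucket.
 * Credentials are resolved through the SDK's default provider chain.
 */
public class S3UploadSketch {

    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion(Regions.US_EAST_1)
                .build();

        // Upload the extract produced by the Talend job.
        s3.putObject("dwh-landing-bucket",
                     "extracts/orders_20160501.csv",
                     new File("/data/out/orders_20160501.csv"));

        System.out.println("Extract uploaded to the S3 landing bucket");
    }
}
```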

Environment: Talend Platform for Big Data 5.6.2, Enterprise Platform for Data Integration and MDM (v6.1.1, 5.5.1, 5.6.1), UNIX, Oracle 11g, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP

Confidential

ETL developer

Responsibilities:

  • Developed an ETL process to load Oracle data into the SQL Server system using the following Talend components:
  • Oracle Components - tOracleConnection, tOracleInput, tOracleBulkExec
  • MSSql Server Components - tMSSqlConnection, tMSSqlInput, tMSSqlOutput, tMSSqlRow, tMSSqlCommit
  • Job/Context related - tRunJob, tJava, tJavaRow, tFileInputDelimited, tFileOutputDelimited, tBufferInput, tBufferOutput, tRESTClient, tREST, tWriteJSONField, tExtractJSONFields, tFlowToIterate, tFilterRow, tSendEmail
  • Worked on Linux systems (Red Hat) to deploy the Talend code.
  • Deployed the code to other machines using shell scripts.
  • Worked extensively on SQL queries for validating the records.
  • Paginated the SQL statements in the ETL flow to handle memory issues and improve performance (see the sketch after this list).
  • Handled deadlock errors while updating the SQL Server tables in the ETL flow.
  • Parameterized the overall workflow to execute the code in different environments.
  • Parallelized the workflows to reduce execution time.
  • Developed ETL mappings
  • Worked on Informatica Power Center Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Worked on Informatica Power Center Workflow Manager tools like Task Designer, Workflow Designer, and Worklet Designer.
  • Designed and developed medium to complex Informatica PowerCenter mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure, and Update Strategy.
  • Worked as a key project resource taking day-to-day work direction and accepting accountability for technical aspects of development.
  • Developed the business rules for cleansing/validating/standardization of data using Informatica Data Quality.
  • Designed and developed multiple reusable cleanse components.
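
As an illustration of the pagination approach mentioned above, the sketch below reads the Oracle source in fixed-size, key-ordered pages over JDBC instead of pulling one large result set, so memory usage stays flat; the table, columns, connection details, and page size are assumptions for illustration only.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

/**
 * Paginated extract sketch: keyset pagination over an Oracle source, reading
 * PAGE_SIZE rows at a time so the ETL flow never holds the full result set.
 */
public class PaginatedExtractSketch {

    private static final int PAGE_SIZE = 50000;

    public static void main(String[] args) throws Exception {
        // ROWNUM is applied to the already-ordered inner query, giving a stable page.
        String pageSql =
            "SELECT * FROM (" +
            "  SELECT order_id, order_dt, amount FROM orders" +
            "  WHERE order_id > ? ORDER BY order_id" +
            ") WHERE ROWNUM <= " + PAGE_SIZE;

        try (Connection src = DriverManager.getConnection(
                "jdbc:oracle:thin:@//srchost:1521/OLTP", "etl_user", "etl_pwd")) {

            long lastKey = 0L;
            boolean more = true;
            while (more) {
                more = false;
                try (PreparedStatement ps = src.prepareStatement(pageSql)) {
                    ps.setLong(1, lastKey);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            lastKey = rs.getLong("order_id");
                            more = true;
                            // ... write the row to the SQL Server target here ...
                        }
                    }
                }
            }
        }
    }
}
```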

Environment: Talend Open Studio 5.0.1, Informatica PowerCenter, UNIX, SQL Server, TOAD, AutoSys.
