Sr Talend Developer Resume

Phoenix, AZ

SUMMARY

  • 7+ years of experience in the IT industry, with wide-ranging, progressive experience in product specification, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.
  • Extensive experience in the development and maintenance of a corporate-wide ETL solution using SQL, PL/SQL, and Talend 4.x/5.x/6.x on UNIX and Windows platforms.
  • Strong experience with Talend Data Integration and Big Data tools, including Data Mapper, Joblets, metadata, and Talend components and jobs.
  • Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files and loading the data into data warehouses and data marts using Talend Studio.
  • Experience with databases such as MySQL and Oracle on AWS RDS.
  • Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.
  • Experience with the scheduling tools AutoSys, Control-M, and Job Conductor (Talend Administration Center).
  • Good experience with Big Data, Hadoop, HDFS, MapReduce, and Hadoop ecosystem (Pig & Hive) technologies.
  • Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc. (a sample routine callable from tMap follows this summary).
  • Excellent experience with NoSQL databases such as HBase and Cassandra.
  • Excellent understanding of Hadoop architecture, the Hadoop Distributed File System, and its APIs.
  • Extensive knowledge of business processes and the functioning of the Healthcare, Manufacturing, Mortgage, Financial, Retail, and Insurance sectors.
  • Strong skills in SQL and PL/SQL, backend programming, creating database objects like Stored Procedures, Functions, Cursors, Triggers, and Packages.
  • Experience in AWS S3, EC2, SNS, SQS, and Lambda setup, RDS (MySQL), and Redshift cluster configuration.
  • Experienced in Waterfall, Agile/Scrum Development.
  • Good knowledge in implementing various data processing techniques using Pig and MapReduce for handling the data and formatting it as required.
  • Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading with Talend, and designed data conversions from a wide variety of source systems including SQL Server, Oracle, DB2, and non-relational sources such as XML, flat files, and mainframe files.
  • Well versed in developing various database objects such as packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g.
  • Hands-on experience running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.
  • Hands-on proficiency in one or more scripting languages (e.g., Java, Python, Scala, R, shell scripting).
  • Experience developing data ingestion jobs in tools such as Talend to acquire, stage, and aggregate data in technologies such as Hive, Spark, HDFS, and Greenplum.
  • Worked on designing and developing logical and physical models using the ERwin data modeling tool.
  • Experienced in code migration, version control, scheduling tools, auditing, shared folders, and data cleansing in various ETL tools.
  • Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
  • Strong Team working spirit, relationship management and presentation skills.
  • Expertise in client-server application development using MS SQL Server …, Oracle …, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, and SAP sources/targets.
  • Work hands-on with integration processes for the Enterprise Data Warehouse (EDW).
  • Knowledge of writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and T-SQL, and of Teradata data warehousing using BTEQ, compression techniques, and FastExport, MultiLoad, TPump, and FastLoad scripts.
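
The tMap mappings listed above evaluate plain Java expressions, and reusable logic is typically packaged as a custom routine under Code > Routines in the Studio repository. A minimal sketch of such a routine; the class name, method, and cleanup rules are hypothetical, not taken from any specific project:

    // Hypothetical Talend custom routine; once saved under Code > Routines it
    // can be called from a tMap expression, e.g. StringSanitizer.normalize(row1.name).
    public class StringSanitizer {

        // Trims, collapses internal whitespace, and upper-cases a value.
        // Null is passed through so downstream null checks keep working.
        public static String normalize(String value) {
            if (value == null) {
                return null;
            }
            return value.trim().replaceAll("\\s+", " ").toUpperCase();
        }
    }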

TECHNICAL SKILLS

ETL/Middleware Tools: Talend 5.5/5.6/6.2/7.1, Informatica PowerCenter 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools: Business Objects 6.0, Cognos 8 BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS: Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net, Netezza.

Modeling Tool: ERwin 4.1/5.0, MS Visio.

Tools: TOAD, SQL*Plus, SQL*Loader, Quality Assurance, SoapUI, FishEye, Subversion, SharePoint, Ipswitch, Teradata SQL Assistant.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

PROFESSIONAL EXPERIENCE

Sr Talend Developer

Confidential, Phoenix, AZ

Responsibilities:

  • Worked in the Data Integration Team to perform data and application integration, with the goal of moving high-volume data more effectively, efficiently, and with high performance to support business-critical projects.
  • Developed custom components and multi-threaded flat-file processing by writing Java code in Talend (see the sketch after this list).
  • Interacted with solution architects and business analysts to gather requirements and update the Solution Architect Document; created mappings and sessions to implement technical enhancements.
  • Deployed and scheduled Talend jobs in the Administration Center and monitored their execution.
  • Created separate branches within the Talend repository for development, production, and deployment.
  • Excellent knowledge of the Talend Administration Center, Talend installation, and the use of context and globalMap variables in Talend.
  • Reviewed requirements to help build valid and appropriate DQ rules and implemented them using Talend DI jobs.
  • Created cross-platform Talend DI jobs to read data from multiple sources such as Hive, HANA, Teradata, DB2, Oracle, and ActiveMQ.
  • Created Talend jobs to compare data between tables across different databases and to identify and report discrepancies to the respective teams (a comparison sketch follows this list).
  • Performed Talend administrative tasks such as upgrades, creating and managing user profiles and projects, managing access, monitoring, and setting up TAC notifications.
  • Monitored Talend job statistics in the Activity Monitoring Console (AMC) to improve performance and identify the scenarios in which errors occurred.
  • Created Generic and Repository schemas.
  • Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.
  • Implemented complex business rules by creating reusable transformations and robust mappings with Talend components such as tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.
  • Created standards and best practices for Talend ETL components and jobs.
  • Extracted, transformed, and loaded data from file formats such as .csv, .xls, .txt, and other delimited formats using Talend Open Studio.
  • Worked with HiveQL to retrieve data from the Hive database.
  • Responsible for developing a data pipeline on AWS to extract data from weblogs and store it in HDFS.
  • Executed Hive queries on Parquet tables stored in Hive to perform data analysis that met the business requirements (a JDBC sketch follows this list).
  • Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
  • Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.
  • Configured the Talend Administration Center (TAC) for scheduling and deployment.
  • Created and scheduled execution plans in TAC to build job flows.
  • Worked with production support to finalize the scheduling of workflows and database scripts using AutoSys.
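
The multi-threaded flat-file work mentioned above is ordinarily plain Java placed in a tJava step or a custom component. A stand-alone sketch of the pattern, assuming a hypothetical input.csv, a pool of four workers, and a placeholder transformation:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class FlatFileLoader {
        public static void main(String[] args) throws IOException, InterruptedException {
            // input.csv is a placeholder; in a Talend job the path would
            // normally come from a context variable.
            List<String> lines = Files.readAllLines(Paths.get("input.csv"));

            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (String line : lines) {
                pool.submit(() -> process(line)); // one task per record
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.MINUTES);
        }

        private static void process(String line) {
            // Placeholder transformation; a real job would parse, validate,
            // and write the record to its target.
            System.out.println(line.toUpperCase());
        }
    }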
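The cross-database comparison jobs above reduce to pulling the same aggregate from two connections and flagging any difference. A minimal JDBC sketch; both connection strings, the credentials, and the CUSTOMER table are assumptions, and the matching drivers must be on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class TableCompare {
        public static void main(String[] args) throws Exception {
            try (Connection src = DriverManager.getConnection(
                     "jdbc:oracle:thin:@src-host:1521:ORCL", "user", "pass");
                 Connection tgt = DriverManager.getConnection(
                     "jdbc:db2://tgt-host:50000/DWH", "user", "pass")) {
                long srcCount = countRows(src, "CUSTOMER");
                long tgtCount = countRows(tgt, "CUSTOMER");
                if (srcCount != tgtCount) {
                    // A real job would report this to the owning team.
                    System.out.printf("Discrepancy: source=%d target=%d%n", srcCount, tgtCount);
                }
            }
        }

        private static long countRows(Connection conn, String table) throws Exception {
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
                rs.next();
                return rs.getLong(1);
            }
        }
    }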
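Hive queries like those run against the Parquet tables above can also be issued over JDBC through HiveServer2 (the hive-jdbc driver must be on the classpath). The host, database, and clicks table below are assumptions:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQuery {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hive-host:10000/weblogs", "etl_user", "");
                 Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                     "SELECT page, COUNT(*) AS hits FROM clicks GROUP BY page")) {
                while (rs.next()) {
                    System.out.println(rs.getString("page") + " -> " + rs.getLong("hits"));
                }
            }
        }
    }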

Environment: Talend 7.1/6.2.1/6.0.1, Talend Open Studio for Big Data/DQ/DI, Talend Administration Center, Oracle 11g, Teradata V14.0, Hive, HANA, PL/SQL, DB2, XML, Java, ERwin 7, UNIX shell scripting.

Talend Developer

Confidential, Houston, TX

Responsibilities:

  • Worked on SSAS, creating data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and deploying Analysis Services projects.
  • Performed SSAS cube analysis using MS Excel and PowerPivot.
  • Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modeling using star and snowflake schemas.
  • Responsible for MDM of customer data using Talend MDM, covering customers, suppliers, products, assets, agencies, stores, address standardization, reference data, and employees; MDM is about creating and managing the golden records of the business.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.
  • Responsible for creating fact, lookup, dimension, and staging tables, as well as other database objects such as views, stored procedures, functions, indexes, and constraints.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Followed organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
  • Configured the match rule set property by enabling search by rules in MDM according to business rules.
  • Responsible for developing the model in Talend MDM as well as the DI jobs that populate the data in REF/XREF tables and create the data stewardship tasks.
  • Applied ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Talend Open Studio for Data Integration 5.6; worked on real-time Big Data integration projects leveraging Talend Data Integration components.
  • Analyzed and performed data integration using the Talend open integration suite.
  • Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked with context variables, defining contexts for database connections and file paths to ease migration between environments in a project (see the sketch after this list).
  • Developed mappings to extract data from sources such as DB2 and XML files and load it into the data mart.
  • Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduled and automated ETL processes with the AutoSys scheduling tool and TAC.
  • Scheduled the workflows using shell scripts.
  • Used Talend's most common components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Developed a stored procedure to automate the testing process, easing QA efforts and reducing the test timelines for data comparison on tables (an invocation sketch follows this list).
  • Automated the SFTP process by exchanging SSH keys between UNIX servers (see the JSch sketch after this list).
  • Worked extensively in the Talend Administration Center and scheduled jobs in Job Conductor.
  • Involved in production deployment activities, created the deployment guide for migrating code to production, and prepared production run books.
  • Created the Talend Development Standards document, which describes general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
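
Context variables like those described above surface in Talend-generated Java as fields of the context object, so a connection can be assembled per environment (DEV/QA/PROD) without code changes. A sketch as it might appear inside a tJava component; the variable names are hypothetical context entries:

    // Inside a tJava component; the context.* values come from the
    // context groups defined in the repository (names here are assumed).
    String url = "jdbc:oracle:thin:@" + context.db_host + ":"
               + context.db_port + ":" + context.db_name;
    java.sql.Connection conn = java.sql.DriverManager.getConnection(
        url, context.db_user, context.db_pass);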
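A test-automation procedure like the one above would be invoked from JDBC roughly as follows; COMPARE_TABLES and its parameter list are a hypothetical signature, not the actual procedure:

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Types;

    public class CompareTablesCall {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@db-host:1521:ORCL", "qa_user", "pass");
                 // Assumed signature: COMPARE_TABLES(src IN, tgt IN, mismatches OUT).
                 CallableStatement cs = conn.prepareCall("{call COMPARE_TABLES(?, ?, ?)}")) {
                cs.setString(1, "STG_ORDERS");
                cs.setString(2, "DWH_ORDERS");
                cs.registerOutParameter(3, Types.INTEGER);
                cs.execute();
                System.out.println("Mismatched rows: " + cs.getInt(3));
            }
        }
    }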
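Key-based SFTP of the kind automated above can be driven from Java with the JSch library; the host, user, key path, and file paths are placeholders:

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class SftpPush {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            // Private key already exchanged between the UNIX servers,
            // so no password prompt is needed.
            jsch.addIdentity("/home/etl/.ssh/id_rsa");

            Session session = jsch.getSession("etl", "target-host", 22);
            session.setConfig("StrictHostKeyChecking", "no"); // demo only
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            sftp.put("/data/out/extract.csv", "/data/in/extract.csv");
            sftp.exit();
            session.disconnect();
        }
    }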

Environment: Talend 5.x/5.6, XML files, DB2, Talend MDM, Oracle 11g, Snowflake, SQL Server 2008, SQL, MS Excel, MS Access, UNIX shell scripts, TOAD, AutoSys.

Talend / ETL Developer

Confidential, Houston, TX

Responsibilities:

  • Implemented File Transfer Protocol operations using Talend Studio to transfer files between network folders; gained experience with data warehouse solutions implementing star and snowflake schemas.
  • Experienced in fixing errors using Talend's debug mode.
  • Created complex mappings using tHashOutput, tMap, tHashInput, tDenormalize, tUniqRow, tPivotToColumnsDelimited, tNormalize, etc.
  • Scheduled Talend jobs with the Talend Administration Center, setting up best practices and a migration strategy.
  • Used components such as tJoin, tMap, tFilterRow, tAggregateRow, and tSortRow, along with target and source connections.
  • Mapped source files and generated target files in multiple formats such as XML, Excel, and CSV.
  • Transformed data and reports retrieved from various sources and generated derived fields.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Created WSDL data services using Talend ESB.
  • Created REST services using the tRESTRequest and tRESTResponse components (a JAX-RS sketch follows this list).
  • Used the tESBConsumer component to call a method on an invoked web service.
  • Implemented Java functionality using the tJava and tJavaFlex components (a tJavaFlex sketch follows this list).
  • Developed shell scripts and PL/SQL procedures for creating and dropping tables and indexes for performance.
  • Attended technical review meetings.
  • Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
  • Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Responsible for monitoring all scheduled, running, completed, and failed jobs; debugged failed jobs using the debugger to validate them and gather troubleshooting information about data and error conditions.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
  • Developed various reusable jobs and used as sub-jobs in other jobs.
  • Used context variables to increase the efficiency of jobs.
  • Made extensive use of SQL commands in the TOAD environment to create target tables.
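
A tRESTRequest/tRESTResponse job exposes a CXF-based endpoint whose behavior is roughly that of the minimal JAX-RS resource below; the /orders path and the echoed JSON payload are assumptions for illustration:

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    // Rough JAX-RS equivalent of a tRESTRequest -> processing -> tRESTResponse flow.
    @Path("/orders")
    public class OrderResource {

        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public Response getOrder(@PathParam("id") String id) {
            // A real job would look the order up; this simply echoes the id.
            return Response.ok("{\"orderId\": \"" + id + "\"}").build();
        }
    }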
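tJavaFlex splits custom Java into start/main/end sections that wrap the component's row loop. A sketch of the pattern; the incoming row1 flow and its amount column are assumed:

    // tJavaFlex "Start code" -- runs once, before the first row.
    int rejected = 0;

    // tJavaFlex "Main code" -- runs once per row of the incoming flow.
    if (row1.amount == null) {
        rejected++;
    }

    // tJavaFlex "End code" -- runs once, after the last row.
    System.out.println("Rejected rows: " + rejected);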

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, Snowflake, MS Excel, MS Access, TOAD, SQL, UNIX.

SQL/BI Developer

Confidential

Responsibilities:

  • Responsible for designing and developing mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica PowerCenter, and tuned mappings to improve performance.
  • Created database objects like views, indexes, user defined functions, triggers and stored procedures.
  • Involved in ETL process from development to testing and production environments.
  • Extracted data from various sources such as flat files and Oracle and loaded it into target systems using Informatica 7.x.
  • Developed mappings using various transformations such as Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator, and Expression.
  • Developed PL/SQL triggers and master tables for the automatic creation of primary keys (see the sketch after this list).
  • Used Informatica Power Center Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.
  • Tuned mappings and SQL queries for better performance and efficiency.
  • Automated existing ETL operations using Autosys.
  • Created and ran shell scripts in the UNIX environment.
  • Created and ran workflows using Workflow Manager in Informatica; maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Created tables and partitions in the Oracle database.
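
Auto-generated primary keys of the kind described above typically pair a sequence with a BEFORE INSERT trigger. A sketch that issues the DDL over JDBC; the table, sequence, trigger, and column names are hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreatePkTrigger {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@db-host:1521:ORCL", "etl_user", "pass");
                 Statement st = conn.createStatement()) {
                st.execute("CREATE SEQUENCE orders_seq START WITH 1 INCREMENT BY 1");
                // BEFORE INSERT trigger fills the primary key from the sequence.
                st.execute(
                    "CREATE OR REPLACE TRIGGER orders_pk_trg " +
                    "BEFORE INSERT ON orders FOR EACH ROW " +
                    "BEGIN " +
                    "  :new.order_id := orders_seq.NEXTVAL; " +
                    "END;");
            }
        }
    }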

Environment: Informatica Power Center 8.x, Oracle, SQL developer, MS Access, PL/SQL, UNIX Shell Scripting, SQL Server 2005, Windows XP.
