
Sr. Etl/talend Developer Resume


Chicago, IL

SUMMARY:

  • Over 6 years of experience in the design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehouse applications.
  • Experience in the development and maintenance of ETL solutions using SQL, PL/SQL, and Talend 4.x/5.x/6.x on UNIX and Windows platforms.
  • Expertise in Data Warehouse, ETL Maintenance, Master Data Management (MDM) strategy, Data Quality and Big Data Eco Systems.
  • Talend tools - Data Integration and Big Data; experience with Data Mapper, Joblets, metadata, and Talend components and jobs.
  • Experience with Talend Data Fabric ETL components; used features such as context variables and the MySQL, Oracle, and Hive database components.
  • Created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalmap, tDie, etc.
  • ETL methodology for performing Data Migration, Data Profiling, Extraction, Transformation, and Loading using Talend; designed data conversions from a wide variety of source systems such as SQL Server, Oracle, and DB2, and non-relational sources such as XML, flat files, and mainframe files.
  • Scheduling tools: Autosys, Control-M, and Job Conductor (Talend Admin Console).
  • Good experience with Big Data, Hadoop, HDFS, Map Reduce and Hadoop Ecosystem (Pig & Hive) technologies.
  • NoSQL databases such as HBase and Cassandra.
  • Understanding of the Hadoop architecture, the Hadoop Distributed File System, and its APIs.
  • Knowledge of business processes and the functioning of the Healthcare, Manufacturing, Mortgage, Financial, Retail, and Insurance sectors.
  • Strong skills in SQL and PL/SQL, backend programming, creating database objects like Stored Procedures, Functions, Cursors, Triggers, and Packages.
  • Experience in AWS S3, EC2, SNS, SQS setup, Lambda, RDS (MySQL) and Redshift cluster configuration.
  • Experience in Waterfall and Agile/Scrum development; strong knowledge of Teradata and Oracle.
  • Experience in software testing as a QA/ETL/Data Warehousing/BI/DB2 tester; experienced in testing Web-based, Data Warehousing, OLAP, and OLTP applications.
  • Good knowledge in implementing various data processing techniques using Pig and MapReduce for handling the data and formatting it as required.
  • Well versed in developing various database objects such as packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g.
  • Hands-on experience in running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.
  • Worked on designing and developing logical and physical models using a data modeling tool (Erwin).
  • Experienced in Code Migration, Version control, scheduling tools, Auditing, shared folders and Data Cleansing in various ETL tools.
  • Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
  • Strong Team working spirit, relationship management and presentation skills.
  • Expertise in client-server application development using MS SQL Server …, Oracle …, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, and SAP sources/targets.
  • Work hands-on with integration processes for the Enterprise Data Warehouse (EDW).
  • Knowledge in writing, testing, and implementing stored procedures, functions, and triggers using Oracle PL/SQL and T-SQL, and in loading a Teradata data warehouse using BTEQ, compression techniques, FastExport, MultiLoad, TPump, and FastLoad scripts.
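The slowly-changing-dimension work referenced above (e.g. the tMysqlScd component) follows the standard SCD Type 2 pattern: close out the current dimension row when an attribute changes, then insert a new current version. A minimal sketch in Python using sqlite3; the table and column names are hypothetical, chosen only to illustrate the pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical SCD Type 2 dimension: current rows carry is_current = 1.
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, name TEXT, city TEXT,
    is_current INTEGER, valid_from TEXT, valid_to TEXT)""")

def scd2_upsert(cur, customer_id, name, city, load_date):
    """Close the current row if attributes changed, then insert a new version."""
    row = cur.execute(
        "SELECT name, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row == (name, city):
        return  # no attribute change: nothing to do
    if row is not None:
        # Expire the old version as of the load date.
        cur.execute(
            "UPDATE dim_customer SET is_current = 0, valid_to = ? "
            "WHERE customer_id = ? AND is_current = 1", (load_date, customer_id))
    # Insert the new current version with an open-ended validity window.
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, 1, ?, '9999-12-31')",
        (customer_id, name, city, load_date))

scd2_upsert(cur, 1, "Acme", "Chicago", "2017-01-01")
scd2_upsert(cur, 1, "Acme", "Phoenix", "2017-06-01")  # city change -> new version
versions = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 "
    "ORDER BY valid_from").fetchall()
print(versions)  # [('Chicago', 0), ('Phoenix', 1)]
```

The same change-detect/expire/insert sequence is what the Talend SCD components generate internally.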

TECHNICAL SKILLS:

ETL/Middleware Tools: Talend 5.5/5.6/6.2, Informatica Power Center 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.

Business Intelligence Tools: Business Objects 6.0, Cognos 8 BI/7.0, Sybase, OBIEE 11g/10.1.3.x

RDBMS: Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills: SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Java, .Net, Netezza.

Modeling Tool: Erwin 4.1/5.0, MS Visio.

Tools: TOAD, SQL*Plus, SQL*Loader, Quality Assurance, SoapUI, FishEye, Subversion, SharePoint, IP switch user, Teradata SQL Assistant.

Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

Testing: DB2/ETL/Java/GUI/Mainframes Testing

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Sr. ETL/Talend Developer

Responsibilities:

  • Participated in Requirement gathering, Business Analysis, User meetings and translating user inputs into ETL mapping documents.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Involved in building the data ingestion architecture and the source-to-target mappings to load data into the data warehouse.
  • Extensively leveraged Talend Big Data components (tHDFSOutput, tPigMap, tHive, tHDFSCon) for data ingestion and data curation from several heterogeneous data sources.
  • Worked with Data mapping team to understand the source to target mapping rules.
  • Prepared both high-level and low-level mapping documents.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Created ETL test data for all ETL mapping rules to test the functionality of Informatica mappings.
  • Wrote Initial and Incremental load ETL test cases to validate the data flow between each layer in data warehouse system (i.e. source to Ingest to ECS to DCS layers).
  • Developed Talend jobs from the mapping documents and loaded the data into the warehouse.
  • Involved in end-to-end Testing of Talend jobs.
  • Analyzed and performed data integration using Talend open integration suite.
  • Experience in loading data into a Netezza database using the NZLOAD utility.
  • Experience in working with large data warehouses, mapping and extracting data from legacy systems, and Redshift/SQL Server … UDB databases.
  • Worked on the design, development and testing of Talend mappings.
  • Wrote complex SQL queries to take data from various sources and integrated it with Talend.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Involved in loading the data into Netezza from legacy and flat files using Unix scripts. Worked on Performance Tuning of Netezza queries with proper understanding of joins and Distribution
  • Created ETL job infrastructure using Talend Open Studio.
  • Have programming skills in SQL and PL/SQL and experience in Oracle databases on UNIX and Windows platforms.
  • Responsible for MDM of customer data using Talend MDM, covering customers, suppliers, products, assets, agencies, stores, address standardization, reference data, and employees; MDM is about creating and managing the golden records of the business.
  • Developed the business rules for cleansing/validating/standardization of data using Informatica Data Quality.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Responsible for developing the model using Talend MDM, as well as the DI jobs that populate the data in the REF/XREF tables and create the data stewardship tasks.
  • Experience in using Talend MDM components such as tMDMBulkLoad, tMDMClose, tMDMCommit, tMDMConnection, tMDMDelete, tMDMInput, tMDMOutput, tMDMReceive, and tMDMRollback.
  • Worked on Talend components such as tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, etc.
  • Used Database components like tMSSQLInput, tOracleOutput etc.
  • Worked with various File components like tFileCopy, tFileCompare, tFileExist.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Analyzed requirements, create design and deliver documented solutions that adhere to prescribed Agile development methodology and tools
  • Developed mappings to extract data from different sources like DB2, XML files are loaded into Data Mart.
  • Created complex mappings by using different transformations like Filter, Router, lookups, Stored procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.
  • Responsible for develop the jobs using ESB components like tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, tRESTResponse to get the service calls for customers DUNS numbers.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Scheduled and automated ETL processes with the scheduling tools Autosys and TAC.
  • Scheduled the workflows using Shell script.
  • Created Talend development standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Troubleshot databases, Joblets, mappings, sources, and targets to find bottlenecks and improve performance.
  • Involved rigorously in data cleansing and data validation to identify and correct corrupted data.
  • Migrated Talend mappings/jobs/Joblets from the development environment to the test and production environments.
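The context-variable approach above (one fixed set of variable names with per-environment values, so a job migrates from development to test to production without code changes) can be sketched as follows; the environment names, keys, and values here are illustrative only, not taken from the project:

```python
# Hypothetical stand-in for Talend context groups: identical keys in every
# environment, with only the values differing.
CONTEXTS = {
    "dev":  {"db_host": "dev-db.local",  "input_dir": "/data/dev/in"},
    "test": {"db_host": "test-db.local", "input_dir": "/data/test/in"},
    "prod": {"db_host": "prod-db.local", "input_dir": "/data/prod/in"},
}

def load_context(env):
    """Return the context for one environment. Because the variable names are
    the same everywhere, promoting a job only means selecting a different env."""
    ctx = CONTEXTS[env]
    assert set(ctx) == set(CONTEXTS["dev"])  # same variable names in every env
    return ctx

ctx = load_context("test")
print(ctx["db_host"])  # test-db.local
```

In Talend itself the same effect is achieved by attaching a context group to the job and choosing the active context at run time (e.g. via the Job Conductor).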

Environment: Talend 6.x, XML files, DB2, Oracle 11g, Netezza 4.2, SQL Server 2008, SQL, MS Excel, MS Access, UNIX Shell Scripts, Talend Administrator Console, Cassandra, Jira, SVN, Quality Center, Agile, TOAD, Autosys

Confidential, Phoenix, AZ

Sr. Talend / ETL Developer

Responsibilities:

  • Worked on SSAS in creating data sources, data source views, named queries, calculated columns, cubes, dimensions, roles and deploying of analysis services projects.
  • SSAS Cube Analysis using MS-Excel and PowerPivot.
  • Implemented SQL Server Analysis Services (SSAS) OLAP Cubes with Dimensional Data Modeling Star and Snow Flakes Schema.
  • Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings and the business systems.
  • Developed standards for ETL framework for the ease of reusing similar logic across the board.
  • Analyze requirements, create design and deliver documented solutions that adhere to prescribed Agile development methodology and tools.
  • Extensively tested and supported data extraction, transformation, and loading processes in a corporate-wide ETL solution using Informatica.
  • Tested reports built on Business Objects Universes for both ad hoc and canned reporting users of Business Objects XI.
  • Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes and constraints.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production process success/failure rates for causal analysis as part of maintenance, and enhanced existing production ETL processes.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and also worked on different methods of logging.
  • Followed the organization-defined naming conventions for naming the flat file structures, Talend jobs, and daily batches for executing the Talend jobs.
  • Responsible for developing jobs using ESB components such as tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse to make service calls for customers' DUNS numbers.
  • Exposure to ETL methodology for supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using Talend Open Source for Data Integration 5.6. Worked on real-time Big Data integration projects leveraging Talend Data Integration components.
  • Analyzed and performed data integration using Talend open integration suite.
  • Wrote complex SQL queries to inject data from various sources and integrated it with Talend.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.
  • Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.
  • Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Created WSDL data services using Talend ESB.
  • Created Rest Services using tRESTRequest and tRESTResponse components.
  • Used tESBConsumer component to call a method from invoked Web Service.
  • Scheduling and Automation of ETL processes with scheduling tool in Autosys and TAC.
  • Scheduled the workflows using Shell script.
  • Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput, and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Developed stored procedures to automate the testing process, easing QA efforts and reducing the test timelines for data comparison on tables.
  • Automated SFTP process by exchanging SSH keys between UNIX servers.
  • Worked Extensively on Talend Admin Console and Schedule Jobs in Job Conductor.
  • Involved in production and deployment activities; created the deployment guide for migrating the code to production and prepared production run books.
  • Created Talend development standards: a document describing the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
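The custom error handling and logging mentioned above follows Talend's tLogCatcher/tDie pattern: each step's failure is caught, logged centrally, and the job is aborted. A minimal Python sketch under that assumption; `JobError`, `run_step`, and the step names are hypothetical illustrations, not project code:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

class JobError(Exception):
    """Analog of tDie: aborts the job, with the failure routed to the log."""

def run_step(name, fn, *args):
    """Run one job step; log success or failure (tLogCatcher-style) and wrap
    any exception in JobError so the caller can stop the whole job."""
    try:
        result = fn(*args)
        log.info("step %s: OK", name)
        return result
    except Exception as exc:
        log.error("step %s: FAILED (%s)", name, exc)
        raise JobError(name) from exc

print(run_step("double", lambda x: x * 2, 21))  # 42
```

A failing step (e.g. `run_step("boom", lambda: 1 / 0)`) logs the error and raises `JobError("boom")`, mirroring how a tDie terminates a Talend job after the log catcher records the event.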

Environment: Talend 5.x/5.6, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MS Excel, MS Access, UNIX Shell Scripts, TOAD, Autosys.

Confidential, Philadelphia, PA

Talend / ETL Developer

Responsibilities:

  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Experienced in fixing errors by using debug mode of Talend.
  • Created complex mappings using tHashOutput, tMap, tHashInput, tDenormalize, tUniqueRow, tPivotToColumnsDelimited, tNormalize, etc.
  • Scheduled the Talend jobs with the Talend Admin Console, setting up best practices and a migration strategy.
  • Used components like tJoin, tMap, tFilterRow, tAggregateRow, tSortRow, Target Connections and Source Connections.
  • Mapped source files and generated target files in multiple formats such as XML, Excel, and CSV.
  • Transformed the data and reports retrieved from various sources and generated derived fields.
  • Reviewed the design and requirements documents with architects and business analysts to finalize the design.
  • Created WSDL data services using Talend ESB.
  • Created Rest Services using tRESTRequest and tRESTResponse components.
  • Used tESBConsumer component to call a method from invoked Web Service.
  • Implemented few java functionalities using tJava and tJavaFlex components.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance.
  • Attended the technical review meetings.
  • Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
  • Involved in unit testing and system testing and preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.
  • Responsible for monitoring all scheduled, running, completed, and failed jobs. Involved in debugging the failed jobs using the debugger to validate the jobs and gather troubleshooting information about data and error conditions.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
  • Developed various reusable jobs and used as sub-jobs in other jobs.
  • Used context variables to increase the efficiency of the jobs.
  • Extensive use of SQL commands with TOAD environment to create Target tables.
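The metadata validation and reconciliation work above amounts to comparing source and target extracts on a key column and reporting rows that failed to load (or appeared unexpectedly). A minimal illustrative sketch; the function name, key, and sample rows are hypothetical:

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target extracts on a key column: return keys
    missing from the target and keys present only in the target."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return sorted(src_keys - tgt_keys), sorted(tgt_keys - src_keys)

# Hypothetical extracts: id 2 failed to load, id 4 is unexpected in the target.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}, {"id": 4}]
missing, extra = reconcile(source, target, "id")
print(missing, extra)  # [2] [4]
```

In practice the same set-difference check is run as SQL (e.g. a MINUS/EXCEPT query between staging and target tables), with row counts logged for the nightly load report.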

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, MS Excel, MS Access, TOAD, SQL, UNIX.

Confidential

MS SQL Server Developer (SSIS/SSRS/DBA)

Responsibilities:

  • Involved in the installation and configuration of SQL Server 2005.
  • Involved in developing logical and physical modeling of the database using Erwin.
  • Made extensive use of T-SQL to develop complex stored procedures, triggers, user-defined functions, views, and indexes.
  • Used SQL Profiler, System Monitor, Query Analyzer for Troubleshooting, Monitoring, Optimization and Tuning of SQL Server.
  • Created and rebuilt indexes for better query performance.
  • Migrated DTS 2000 packages to SQL Server Integration Services (SSIS) packages and modified the packages to take advantage of the new SSIS features.
  • Created SSIS packages using various transformations to export data from different data sources, transform it, and load it into SQL Server 2005.
  • Implemented error handling and roll back process in SSIS Packages.
  • Used SSIS package Configuration, changing the variables dynamically and SSIS logging.
  • Provided security for SSIS packages and used protection levels in SSIS packages.
  • Responsible for Deploying, Scheduling jobs, Alerting and Maintaining SSIS packages.
  • Created and Configured OLAP Cubes (Star Schema and Snowflake Schema) Using SQL Server Analysis Services.
  • Migrated and recreated existing dimensions and cubes using a star schema on SQL Server to achieve the efficiency of SQL Server Analysis Services (SSAS).
  • Built MDX queries for Analysis Services & Reporting Services.
  • Responsible for gathering/refining requirements from the customer for developing reports.
  • Responsible for Full Report cycle including Authoring, Managing, Security and Generation of reports.
  • Developed Query for generating drill down, drill through, parameterized, cascaded reports in SSRS 2005.
  • Backed up and restored system and other databases as per requirements, and scheduled those backups.
  • Managed server security, created new logins and users, and changed user roles.
  • Worked on database security, Replication and Log shipping activities.

Environment: MS SQL 2000/2005, Windows NT/2000/XP, SSIS, SSRS, SSAS, MS Access, MS Excel, VB.Net, Visio
