
Talend/Data Integration Consultant Resume


Dallas, TX

SUMMARY

  • 5+ years of experience in the IT industry spanning software analysis, design, implementation, coding, development, testing, and maintenance, with a focus on data warehousing applications using ETL tools like Talend and Informatica.
  • 2+ years of experience using Talend Integration Suite / Talend Open Studio and the Talend Admin Console (TAC).
  • Experience working with data warehousing concepts like Kimball/Inmon methodologies, OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional data modeling.
  • Highly proficient in Agile, Test-Driven, Iterative, Scrum, and Waterfall software development life cycles.
  • Extensively used ETL methodology for data profiling, data migration, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems including Netezza, Oracle, DB2, SQL Server, Teradata, Hive, and HANA, and non-relational sources like flat files, XML, and mainframe files.
  • Experience in analyzing data using HiveQL and Pig Latin in HDFS (Hadoop Distributed File System).
  • Extracted data from multiple operational sources and loaded staging areas, data warehouses, and data marts using SCD (Type 1/Type 2/Type 3) loads; a minimal Type 2 sketch appears after this list.
  • Experience in using cloud components and connectors to make API calls for accessing data from cloud storage (Google Drive, Salesforce, Amazon S3, Dropbox) in Talend Open Studio.
  • Experience in creating Joblets in Talend for processes reused across most jobs in a project, such as job start and commit steps.
  • Experience in monitoring and scheduling using AutoSys, Control-M, and Job Conductor (Talend Admin Console), and in UNIX (Korn and Bourne shell) scripting.
  • Expertise in running subjobs in parallel to maximize performance and reduce overall job execution time, using Talend's parallelization components in TIS and multithreaded executions in TOS.
  • Strong experience in the design, development, and deployment of SSIS packages and SSRS reports using SQL Server Data Tools.
  • Experienced in creating triggers on the TAC server to schedule Talend jobs.
  • Worked extensively on error handling, performance analysis, and performance tuning of Informatica ETL components, Teradata utilities, UNIX scripts, SQL scripts, etc.
  • Strong decision-making and interpersonal skills with a results-oriented dedication to goals.
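
A minimal sketch of the Type 2 slowly changing dimension load referenced above, assuming hypothetical CUSTOMER_DIM and STG_CUSTOMER tables with tracked attributes address and segment (NULL comparisons omitted for brevity):

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE customer_dim d
       SET d.effective_end_date = CURRENT_DATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed or brand-new customers
    -- (changed customers no longer have a current row after Step 1).
    INSERT INTO customer_dim
        (customer_id, address, segment, effective_start_date,
         effective_end_date, current_flag)
    SELECT s.customer_id, s.address, s.segment, CURRENT_DATE,
           DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');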

TECHNICAL SKILLS

Operating Systems: Windows, UNIX (Sun Solaris 10, HP, AIX) & Linux

Big Data Hadoop/ETL Tools: Talend (TOS, TIS), Workflow Manager, Workflow Monitor, Spark, Hive, HDFS, MapReduce, Pig, Scala, Data Lake

Databases: Oracle 12c, MS SQL Server, DB2 v8.1, Netezza, Teradata, HBase

Modeling: Data Modeling - Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes

Programming Languages: T-SQL, PL/SQL, HTML, XML, JavaScript, Korn shell scripting & Windows batch scripting

Scheduling Tools: Autosys, Control-M

Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director, Clear test

Other tools: SQL Developer, Aginity Workbench, Teradata SQL Assistant, Toad, PuTTY, MS Office, VMware Workstation

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

Talend/Data Integration Consultant

Responsibilities:

  • Acquired and interpreted business requirements, created technical artifacts, and determined the most efficient and appropriate solution design from an enterprise-wide view.
  • Worked in the Data Integration Team to perform data and application integration, with the goal of moving data more effectively, efficiently, and with high performance to support business-critical projects involving huge data extractions.
  • Extensively used the tSAPBapi component to load and read data from the SAP system.
  • Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by the business or IT.
  • Participated in designing the overall logical and physical data warehouse/data mart data models and data architectures to support business requirements.
  • Explored prebuilt ETL metadata, mappings, and DAC metadata, and developed and maintained SQL code as needed for the SQL Server database.
  • Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSqlInput, and many more.
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
  • Worked on migration projects to move data from data warehouses on Oracle/DB2 to Netezza.
  • Used SQL queries and other data analysis methods, as well as the Talend Enterprise Data Quality Platform, to profile and compare data, informing decisions on how to measure business rules and data quality.
  • Worked with the Talend RTX ETL tool, developing and scheduling jobs in Talend Integration Suite.
  • Wrote Netezza SQL queries for table joins and modifications; see the sketch after this list.
  • Used Talend reusable components like routines, context variables, and globalMap variables.
  • Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.
  • Processed daily expenses through SSIS jobs by collecting data from Concur FTP servers.
  • Implemented fast and efficient data acquisition using Big Data processing techniques and tools.
  • Monitored and supported the Talend jobs scheduled through the Talend Admin Center (TAC).
  • Developed Oracle PL/SQL DDLs and stored procedures and worked on SQL performance tuning.
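
A minimal sketch of the Netezza join and table-modification queries referenced above; the ORDERS and CUSTOMERS tables and their columns are hypothetical:

    -- Join across hypothetical ORDERS and CUSTOMERS tables.
    SELECT c.customer_name, o.order_id, o.order_amount
      FROM orders o
      JOIN customers c
        ON c.customer_id = o.customer_id
     WHERE o.order_date >= DATE '2017-01-01';

    -- Netezza supports correlated updates with a FROM clause.
    UPDATE orders
       SET region = c.region
      FROM customers c
     WHERE c.customer_id = orders.customer_id;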

Environment: Talend 6.4, Netezza, Oracle 12c, IBM DB2, Hadoop, TOAD, Aginity, BusinessObjects 4.1, MLOAD, SQL Server 2012, XML, SQL, PL/SQL, Hive, Pig, MapReduce, HP ALM, JIRA.

Confidential - Pittsburgh, PA

ETL Developer/Talend Developer

Responsibilities:

  • Worked with Data mapping team to understand the source to target mapping rules.
  • Analyzed the requirements and framed the business logic and implemented it using Talend.
  • Involved in ETL design and documentation.
  • Analyzed and performed data integration using Talend open integration suite.
  • Worked on the design, development and testing of Talend mappings.
  • Created ETL job infrastructure using Talend Open Studio.
  • Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJava, tJavaRow, tConvertType, etc.
  • Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.
  • Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, and tFileRename.
  • Worked on improving the performance of Talend jobs.
  • Created triggers for a Talend job to run automatically on server.
  • Worked on exporting and importing of Talend jobs.
  • Created jobs to pass parameters from child job to parent job.
  • Exported jobs to Nexus and SVN repository.
  • Developed a project-specific 'Deployment' job responsible for deploying Talend jar files to the Windows environment as a zip file; the zip file is later unzipped and the files deployed to the UNIX box.
  • This deployment job also maintains versioning of the Talend jobs deployed in the UNIX environment.
  • Developed shell scripts in the UNIX environment to support scheduling of the Talend jobs.
  • Monitored the daily, weekly, and ad hoc runs that load data into the target systems.
  • Worked on workflows configured to run concurrently with different source system names across different regions.
  • Worked on a layered ETL approach of stage/landing tables, core tables, and reporting tables; see the sketch after this list.
  • Created various ETL components like Change Sets, Data Feeds, Data Elements, Categories, Data Marts and Extracts.
  • Involved in coding UNIX scripts to automate multiple ETL Jobs.
  • Performed tuning by identifying and eliminating bottlenecks to increase performance.
  • Extensively used almost all Informatica transformations, including Lookup, Stored Procedure, Update Strategy, and others.
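
A minimal sketch of the stage-to-core-to-reporting layering referenced above, assuming hypothetical STG_ORDERS, CORE_ORDERS, and RPT_DAILY_SALES tables:

    -- Stage -> core: cast to proper types and de-duplicate on the business key.
    INSERT INTO core_orders (order_id, customer_id, order_date, order_amount)
    SELECT order_id,
           customer_id,
           CAST(order_date AS DATE),
           CAST(order_amount AS DECIMAL(12,2))
      FROM (SELECT s.*,
                   ROW_NUMBER() OVER (PARTITION BY s.order_id
                                      ORDER BY s.load_ts DESC) AS rn
              FROM stg_orders s) t
     WHERE rn = 1;

    -- Core -> reporting: pre-aggregated table consumed by the reporting layer.
    INSERT INTO rpt_daily_sales (order_date, total_amount, order_count)
    SELECT order_date, SUM(order_amount), COUNT(*)
      FROM core_orders
     GROUP BY order_date;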

Environment: Talend 5.6, UNIX, shell scripting, SQL Server, Hadoop, Hive, Pig, Oracle, Business Objects, ERwin, SVN, Redgate, Capterra, Informatica Power Center 9.1/Metadata Manager, MapReduce, Power Exchange 8.6, Data Analyzer

Confidential

ETL Informatica Developer

Responsibilities:

  • Designed and developed mappings, mapplets, sessions, and workflows to load data from source to target databases using Informatica Power Center, and tuned mappings to improve performance.
  • Created database objects like views, indexes, user defined functions, triggers and stored procedures.
  • Involved in ETL process from development to testing and production environments.
  • Extracted data from various sources like flat files and Oracle and loaded it into target systems using Informatica 7.x.
  • Developed mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.
  • Developed PL/SQL triggers and master tables for the automatic creation of primary keys; see the sketch after this list.
  • Used Informatica Power Center Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.
  • Tuned mappings and SQL queries for better performance and efficiency.
  • Automated existing ETL operations using Autosys.
  • Created & Ran shell scripts in UNIX environment.
  • Created and ran workflows using Workflow Manager in Informatica; maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Created tables and partitions in the Oracle database.
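
A minimal sketch of the trigger-based primary key generation referenced above, assuming a hypothetical EMPLOYEES table and the classic Oracle sequence-plus-trigger pattern:

    -- Sequence that supplies surrogate key values.
    CREATE SEQUENCE employees_seq START WITH 1 INCREMENT BY 1;

    -- Trigger that fills the primary key when an insert omits it.
    CREATE OR REPLACE TRIGGER employees_pk_trg
    BEFORE INSERT ON employees
    FOR EACH ROW
    WHEN (NEW.employee_id IS NULL)
    BEGIN
      SELECT employees_seq.NEXTVAL
        INTO :NEW.employee_id
        FROM dual;
    END;
    /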

Environment: Informatica Power Center 8.x, Oracle, SQL developer, MS Access, PL/SQL, UNIX Shell Scripting, SQL Server 2005, Windows.
