ETL/Talend Developer Resume

Plano, TX

SUMMARY

  • Around 7 years of hands-on Data Warehouse/Analytics product development experience covering all phases of the SDLC, including requirements, design, development, testing, and final release, using Agile methodology.
  • Good knowledge of and experience with credit risk and collateral management in the banking domain.
  • Good understanding of data modeling and design for large banking applications, aimed at fast, effective, and optimal application performance.
  • Excellent knowledge of the Talend platform for design and development of ETL code and mappings for enterprise DWH ETL projects.
  • Hands-on experience in performance tuning of EOD Talend batch jobs.
  • Highly capable in data migration activities for large applications spanning multiple hubs (source systems).
  • Hands-on experience with Talend components for transformation, file processing, Java integration, Oracle DB access, and logging.
  • Experienced in working with the Hortonworks distribution of Hadoop (HDFS, MapReduce, Hive, Sqoop, Flume, Pig, HBase) and with MongoDB.
  • Experience in Talend administration: creating projects and users, project authorization, code deployment, and the AMC (Activity Monitoring Console).
  • Proficient with Talend Open Studio and the Talend Enterprise platform for data management.
  • Utilized tStatsCatcher, tDie, and tLogRow to create a generic joblet for storing processing stats.
  • Familiar with using cloud components and connectors in Talend Open Studio to make API calls for accessing data in cloud storage (Google Drive, Salesforce, Amazon S3, Dropbox).
  • Experienced in creating generic schemas and context groups/variables to run jobs against different environments such as Dev, Test, and Prod.
  • Can write custom/user-defined Java routines in Talend using core Java (see the sketch after this list).
  • Sound knowledge of UNIX commands and shell scripting.
  • Created subjobs that run in parallel to maximize performance and reduce overall job execution time, using the tParallelize component in Talend Integration Suite and multithreaded executions in Talend Open Studio.
  • Created execution tasks in Talend Administration Center for jobs saved either in SVN or as pre-generated Studio zip files.
  • Good knowledge of normalization and denormalization techniques for optimal XML data and XSD schema design.
  • Involved in preparing test plans and cases for unit testing based on requirements.
  • Skilled in debugging, logging, and testing methods for finding problems within Talend code.
  • Experienced in versioning, importing, and exporting Talend jobs; set up triggers for Talend jobs in Job Conductor.
  • Strong Oracle skills, including SQL and PL/SQL.
  • Experience integrating various data sources such as Teradata, SQL Server, Oracle, DB2, Netezza, and flat files.
  • Extracted data from multiple operational sources to load the staging area, data warehouse, and data marts using SCD (Type 1/Type 2/Type 3) loads.
  • Good knowledge of core Java concepts.
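
As a small illustration of the custom routine work mentioned above, here is a minimal sketch of a user-defined Talend routine in core Java; the class name, method name, and date formats are hypothetical and only stand in for the kind of helper called from tMap expressions.

```java
package routines;

import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateUtil {

    /**
     * Converts a compact yyyyMMdd string (common in flat-file feeds) to the
     * yyyy-MM-dd form expected by staging tables. Returns null for blank or
     * unparseable input so a tMap expression can route such rows to a reject
     * flow instead of failing the job.
     */
    public static String toIsoDate(String raw) {
        if (raw == null || raw.trim().isEmpty()) {
            return null;
        }
        // Local instances: SimpleDateFormat is not thread-safe, and jobs
        // here may run with multithreaded/parallel execution enabled.
        SimpleDateFormat source = new SimpleDateFormat("yyyyMMdd");
        SimpleDateFormat target = new SimpleDateFormat("yyyy-MM-dd");
        try {
            return target.format(source.parse(raw.trim()));
        } catch (ParseException e) {
            return null; // hypothetical policy: treat bad dates as rejects
        }
    }
}
```

Talend exposes public static methods of classes in the routines package to component expressions, so a tMap output column could call this as DateUtil.toIsoDate(row1.trade_date), where row1.trade_date is a placeholder column name.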

TECHNICAL SKILLS

Languages: PL/SQL, Java, JavaScript, HTML, CSS

Operating Systems: UNIX, Linux, Windows

Databases: Oracle, MySQL, MongoDB, Sybase, DB2, MS SQL Server, HBase, Couchbase, Teradata, Netezza

ETL Tools: Talend Data Management 6.x, Talend Big Data 6.x, Talend Big Data Realtime 6.x

Version Control: Gitblit 1.7, GitHub

Big Data Environment: HBase, Flume, Sqoop, HDFS, Pig, Hive, HCatalog

Others: AWS, Microsoft Office, Agile development, JIRA, PuTTY

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

ETL/Talend Developer

Responsibilities:

  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing (a simplified mapper sketch follows this list).
  • Loaded customer profile, customer spending, and credit data from legacy warehouses onto HDFS using Sqoop.
  • Built data pipelines using Pig and Java MapReduce to store data on HDFS.
  • Used Oozie to orchestrate the MapReduce jobs that extract the data on a schedule.
  • Used Talend reusable components like routines, context variables and globalMap variables.
  • Used Pattern matching algorithms to recognize the customer across different sources, built risk profiles for each customer using Hive, and stored the results in HBase.
  • Performed data manipulations using various Talend components such as tMap, tOracleRow, tJava, and many more.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Installed and configured Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
  • Set up and benchmarked Hadoop/HBase clusters for internal use.
  • Created joblets in Talend for processes that can be reused across most jobs in a project, such as job start and job commit.
  • Developed jobs to move inbound files to vendor server locations on monthly, weekly, and daily frequencies.
  • Implemented Change Data Capture (CDC) in Talend to load deltas into the data warehouse.
  • Created context variables and groups to run Talend jobs against different environments.
  • Used the tParallelize component and the multithreaded execution option to run subjobs in parallel, which improves job performance.
  • Broad design, development, and testing experience with Talend Integration Suite, plus knowledge of performance tuning of mappings.
  • Implemented FTP operations using Talend Studio to transfer files between network folders.
  • Developed and optimized simple to complex MapReduce jobs using Hive and Pig.
  • Handled data imports from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
  • Analyzed the data by running Hive queries and Pig scripts to study customer behavior.
  • Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Responsible for writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).
  • Debugged numerous issues in Talend.
  • Imported and exported data between MySQL/Oracle and Hive using Sqoop.
  • Responsible for defining the data flow within the Hadoop ecosystem and directing the team in implementing it.
  • Exported the result set from Hive to MySQL using shell scripts.
  • Developed Hive queries for the analysts.
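
The first bullet above mentions Java MapReduce jobs for data cleaning; the following is a simplified sketch of such a mapper. The pipe-delimited, 12-field record layout and the counter names are assumptions made for illustration, not the actual feed format.

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/** Drops malformed records and passes clean ones through unchanged. */
public class CleanRecordsMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

    private static final int EXPECTED_FIELDS = 12; // assumed feed layout

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split on "|" and keep trailing empty fields (-1 limit).
        String[] fields = value.toString().split("\\|", -1);
        if (fields.length != EXPECTED_FIELDS || fields[0].trim().isEmpty()) {
            // Count rejects so the job's counters reflect data quality.
            context.getCounter("cleaning", "rejected").increment(1);
            return;
        }
        context.write(NullWritable.get(), value);
    }
}
```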

Environment: Talend Platform for Big Data 5.6.2, Talend Enterprise Platform for Data Integration and MDM (v6.1.1, 5.6.1, 5.5.1), UNIX, Oracle 11g, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP

Confidential, Charlotte, NC

ETL Developer

Responsibilities:

  • Created the ETL job infrastructure using Talend Open Studio.
  • Investigated, analyzed, and fixed reported defects.
  • Worked closely with the data modeler and business analysts to understand business requirements, providing data warehousing knowledge and solutions and ensuring delivery of business needs in a timely, cost-effective manner.
  • Performed maintenance programming and correction of identified defects.
  • Designed jobs to capture statistics as well as execution errors in log files.
  • Debugged numerous jobs in Talend.
  • Identified inefficiencies and, where necessary, implemented changes to fix the end-to-end development process.
  • Performed high-level ETL testing for bulk and incremental loads.
  • Exported scripts from Windows and ran the jobs in UNIX.
  • Checked code in and out through Git.
  • Performed unit testing and moved jobs to SIT, QAT, and UAT.
  • Supported SIT, QAT, and UAT issue resolution.
  • Followed the organization-defined naming conventions for the flat file structure, the Talend jobs, and the daily batches that execute the Talend jobs.
  • Worked on context variables and defined contexts for database connections and file paths to ease migration between environments in a project.
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
  • Used Talend components such as tMap, tFileExist, tFileCompare, tELTAggregate, tOracleInput, and tOracleOutput.
  • Designed and implemented ETL for data loads from heterogeneous sources into SQL Server and Oracle target databases, including facts and slowly changing dimensions (SCD Type 1 and SCD Type 2) to capture changes.
  • Responsible for development, support, and maintenance of the ETL (extract, transform, load) processes using Talend Integration Suite.
  • Involved in automating the FTP process in Talend and FTPing files in UNIX.
  • Created the Talend Development Standards document, which describes general guidelines for Talend developers.
  • Coordinated with clients in QCP calls.
  • Worked closely with product management teams to strategize design solutions, produced navigation flows and prototypes.
  • Practiced Agile methodology throughout the project.
  • Worked on MongoDB to fetch and store the credit reports of all customers from Vantage on a monthly basis (sketched below).
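
As a rough illustration of the MongoDB work in the last bullet, here is a hypothetical sketch using the official MongoDB Java driver (the modern MongoClients API); the connection string, database, collection, and field names are invented for the example.

```java
import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;

public class CreditReportFetcher {
    public static void main(String[] args) {
        // Placeholder connection string; a real deployment would use a replica-set URI.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> reports =
                    client.getDatabase("credit").getCollection("reports");

            // Fetch all customer credit reports filed for one month.
            for (Document report : reports.find(Filters.eq("reportMonth", "2015-06"))) {
                System.out.println(report.getString("customerId")
                        + " -> score " + report.get("score"));
            }
        }
    }
}
```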

Environment: Talend Data Integration 5.6.1, Oracle 11g, MS SQL Server 2012/2008, PL/SQL, Agile methodology, T-SQL, SSIS, TOAD, AIX, shell scripts, Autosys

Confidential, Dallas/Fort Worth Area, TX

Associate System Engineer

Responsibilities:

  • Optimized the previously existing jobs, stored procedures and views to improve readability and speed.
  • Created SSIS script tasks, lookup transformations, and data flow tasks using T-SQL and Visual Basic (VB) scripts.
  • Managed and monitored system utilization using the query optimizer and execution plans.
  • Designed packages utilizing tasks and transformations such as Execute SQL Task, Data Flow Task, Sequence Container, For Each Loop Container, Lookup, Aggregate, Expression, OLE DB Command, and Derived Column.
  • Worked extensively on developing metrics, derived metrics, compound metrics, and filters for complex reports.
  • Created consolidations, custom groups, and prompts for the MicroStrategy reports.
  • Analyzed and developed new objects in the test environment and, after testing the objects and reports, migrated them to production.
  • Unit tested all reports by running SQL queries against the warehouse.
  • Initiated and developed a project to check the data integrity of our databases on a weekly basis (a minimal sketch follows this list).
  • Conducted data research/debugging and assigned tasks among team members to ensure timely delivery.
  • Maintained and updated legacy data importing and exporting applications. Streamlined existing ETL applications.
  • Assisted the quality assurance department with the testing of various portions of the InterAction product.
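
The weekly data-integrity project mentioned above lends itself to a short sketch: compare a source row count against its target after a load and flag mismatches. The JDBC URL and table names below are placeholders, not the actual environment.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class WeeklyIntegrityCheck {

    public static void main(String[] args) throws SQLException {
        // Placeholder URL for the Microsoft JDBC driver.
        String url = "jdbc:sqlserver://localhost;databaseName=dw;integratedSecurity=true";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            long staged = countRows(stmt, "staging.orders"); // hypothetical source
            long loaded = countRows(stmt, "dw.fact_orders"); // hypothetical target
            if (staged != loaded) {
                System.err.printf("Row-count mismatch: staged=%d, loaded=%d%n",
                        staged, loaded);
            }
        }
    }

    private static long countRows(Statement stmt, String table) throws SQLException {
        try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```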

Environment: MS SQL Server 2008/2005, SQL Server Integration Services, Java, Windows 7

Confidential

System Engineer

Responsibilities:

  • Exported scripts from Windows and ran the jobs in UNIX.
  • Designed physical and logical database structures/diagrams.
  • Created tables, relationships, triggers and indexes for enforcing business rules.
  • Used SQL Profiler to identify slow-running queries and for performance tuning.
  • Wrote various complex SQL queries, including inner joins, outer joins, and update queries.
  • Developed reports for payment and BI counts to show organizational and seasonal comparisons.
  • Performed incremental and full database recovery, with experience in complex recovery scenarios.
  • Worked on DTS packages and DTS Import/Export for transferring data from various heterogeneous sources to SQL Server.
  • Created SSIS packages for data extraction from flat files, Excel files, and OLE DB sources to SQL Server.
  • Checked code in and out through Git.
  • Performed unit testing and moved jobs to SIT, QAT, and UAT.
  • Supported SIT, QAT, and UAT issue resolution.

Environment: Windows NT, Windows 2007, SQL Server 2005, DTS, SSIS, T-SQL, XML, MS SQL Reporting Services

Confidential 

SQL Developer

Responsibilities:

  • Involved in creating database objects such as tables, stored procedures, views, triggers, and user-defined functions for the project I was working on.
  • Analyzed client requirements and translated them into technical requirements.
  • Gathered requirements from end users and was involved in developing the logical model and implementing requirements in SQL Server 2000.
  • Performed data migration (import & export via BCP) from text files to SQL Server.
  • Responsible for creating reports based on the requirements using Reporting Services 2000.
  • Identified the database tables for defining the queries for the reports.
  • Worked on SQL Server queries, stored procedures, triggers, and joins (a JDBC invocation sketch follows this list).
  • Defined report layouts for formatting the report design as per the need.
  • Identified and defined the datasets for report generation.
  • Formatted the reports using global variables and expressions.
  • Deployed generated reports onto the report server for access through a browser.
  • Maintained data integrity by performing validation checks.
  • Performed unit testing and moved jobs to SIT, QAT, and UAT.
  • Supported SIT, QAT, and UAT issue resolution.
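
As a small illustration of working with the stored procedures noted in this list, one way to invoke such a procedure from Java over JDBC is sketched below; the procedure name, parameter, and result columns are invented for the example.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ReportRunner {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL; the reporting database name is hypothetical.
        String url = "jdbc:sqlserver://localhost;databaseName=reports;integratedSecurity=true";
        try (Connection conn = DriverManager.getConnection(url);
             // Hypothetical procedure backing one of the report datasets.
             CallableStatement call = conn.prepareCall("{call usp_SeasonalPayments(?)}")) {
            call.setInt(1, 2004); // report year parameter
            try (ResultSet rs = call.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("org_unit")
                            + ": " + rs.getBigDecimal("total"));
                }
            }
        }
    }
}
```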

Environment: MS SQL Server 2000, Windows Server 2000, SQL Query Analyzer, Enterprise Manager, MS Access 2000, Windows NT
