
Sr. ETL/Talend Developer Resume


TX

SUMMARY

  • 7+ years of experience across the full life cycle of software project development, including design and application development of Enterprise Data Warehouses on large-scale development efforts, leveraging industry standards with Talend and Informatica.
  • 3+ years of experience using Talend Data Integration/Big Data Integration (6.1/5.x) and Talend Data Quality.
  • Extensive knowledge of the business processes and operations of the Health Care, Manufacturing, Mortgage, Financial, Retail and Insurance sectors.
  • Extensive experience in ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation and Loading using Talend; designed data conversions from a wide variety of source systems including Netezza, Oracle, DB2, SQL Server, Teradata, Hive, Hana and non-relational sources like flat files, XML and Mainframe files.
  • Created Talend ETL jobs to receive attachment files from POP e-mail using tPop, tFileList and tFileInputMail, then loaded data from the attachments into a database and archived the files.
  • Strong understanding of NoSQL databases like HBase, MongoDB.
  • Experience with databases like MySQL and Oracle using AWS RDS.
  • Expertise in data modeling techniques like Dimensional/Star Schema and Snowflake modeling, and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3); see the sketch after this list.
  • Used AWS services (S3, EC2, EMR, RDS, Amazon Redshift) in projects.
  • Excellent working experience in Waterfall and Agile methodologies. Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace in both Teradata and Oracle.
  • Well versed with Talend Big Data, Hadoop, Hive and used Talend Big data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Experience in development and design of ETL (Extract, Transform and Load) methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter and the IDQ tool.
  • Created mappings using Lookup, Aggregator, Joiner, Expression, Filter, Router, Update strategy and Normalizer Transformations. Developed reusable Transformation and Mapplets.
  • Strong experience with shell scripting and a solid understanding of business intelligence and data warehousing approaches.
  • Self-Starter and Team Player with excellent communication, organizational and interpersonal skills with the ability to grasp things quickly.
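For illustration, a minimal sketch of the SCD Type 2 pattern noted above, written as plain JDBC. The table and column names (DIM_CUSTOMER, CUST_KEY, ADDRESS, EFF_DATE, END_DATE, CURRENT_FLAG) are hypothetical and not taken from any of the projects below:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.time.Instant;

/** Minimal SCD Type 2 load step: expire the current row, insert the new version. */
public class ScdType2Sketch {

    public static void applyChange(Connection conn, long custKey,
                                   String newAddress) throws SQLException {
        Timestamp now = Timestamp.from(Instant.now());

        // 1. Close out the current version of the dimension row, if any.
        String expire = "UPDATE DIM_CUSTOMER SET END_DATE = ?, CURRENT_FLAG = 'N' "
                      + "WHERE CUST_KEY = ? AND CURRENT_FLAG = 'Y'";
        try (PreparedStatement ps = conn.prepareStatement(expire)) {
            ps.setTimestamp(1, now);
            ps.setLong(2, custKey);
            ps.executeUpdate();
        }

        // 2. Insert the new version as the open, current row.
        String insert = "INSERT INTO DIM_CUSTOMER "
                      + "(CUST_KEY, ADDRESS, EFF_DATE, END_DATE, CURRENT_FLAG) "
                      + "VALUES (?, ?, ?, NULL, 'Y')";
        try (PreparedStatement ps = conn.prepareStatement(insert)) {
            ps.setLong(1, custKey);
            ps.setString(2, newAddress);
            ps.setTimestamp(3, now);
            ps.executeUpdate();
        }
    }
}
```

In a Talend job the same effect is typically achieved with a tMap lookup against the dimension plus separate update and insert outputs, rather than hand-written JDBC.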

TECHNICAL SKILLS

Java/J2EE Technologies: Servlets, JSP, JSTL, JDBC, JMS, JNDI, RMI, EJB, JFC/Swing, AWT, Applets, Multi-threading, Java Networking.

Programming Languages: Java JDK 1.4/1.5/1.6/1.7/1.8, SQL, PL/SQL, Python, Scala

Application/Web Servers: Oracle/BEA WebLogic 8.1/9.1/10.3, IBM WebSphere 5.1/6.0/6.1/7.0, JBoss, Tomcat 5.0.28/6.0, Jenkins, Cucumber.

IDEs: Eclipse, Spring STS, IntelliJ, NetBeans.

Web technologies: HTML5, CSS3, XHTML, JavaScript(+ES6), TypeScript, Ajax, jQuery, AngularJS, Angular 2, Angular 4, React JS, NodeJS, Socket.io, Express JS, JSON, Swagger UI, Bootstrap.

AWS Skills: EC2, S3, RDS, EBS, IAM, AMI, ELB, CLI, SNS, VPN, NAT, GIT, DNS, Route53, DBA, DynamoDB.

Web Services: JAX-WS, JAX-RPC, JAX-RS, SOAP, REST, SOAP UI, Microservices.

Methodologies: Agile, Scrum, RUP, TDD, OOAD, SDLC, Waterfall model.

Modeling Tools: UML, Rational Rose

Message Brokers: Kafka, RabbitMQ, ActiveMQ, IBM MQ, TIBCO

Testing technologies/tools: JUnit, JMeter, Mockito.

Database: Oracle 8i/9i/10g/11g, DB2, SQL Server 2008, MySQL, MongoDB, Apache Cassandra.

Big Data Tools: Hadoop, Spark, MapReduce, HDFS, HBase, Zookeeper, Hive, Cassandra, Flume, Couchbase, MongoDB, Neo4J.

Version Control: SVN, Git, Stash.

Build Tools: ANT, Maven, Gradle.

Spring Tools: Spring MVC, IoC, AOP, JDBC, JTA, IO, Spring Boot, Spring Microservices, Spring REST, Spring Eureka, Swagger UI, and Spring Zuul.

Platforms: Windows 10/7/2000/98/95/NT 4.0, LINUX, UNIX.

PROFESSIONAL EXPERIENCE

Confidential, TX

Sr. ETL/ Talend Developer

Responsibilities:

  • Used Java 1.8 features like streams, lambda expressions, functional interfaces, collections, the Date/Time API changes and type annotations (illustrated in the sketch after this list).
  • Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components
  • Created and managed Source to Target mapping documents for all Facts and Dimension tables
  • Utilized AWS Services (S3, EC2, EMR, RDS, Amazon RedShift) in project scope.
  • Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.
  • Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.
  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, including Fact and Slowly Changing Dimension loads (SCD Type 1 and SCD Type 2).
  • Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Extensively used the tMap component, which performs lookup and joiner functions, along with tJava, tOracle, tXML, tDelimitedFiles, tLogRow, tLogBack and other components in many jobs. Created and worked with over 100 components across jobs.
  • Used Talend most used components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Created Implicit, local and global Context variables in the job. Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
  • Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison on tables.
  • Automated SFTP process by exchanging SSH keys between UNIX servers. Worked Extensively on Talend Admin Console and Schedule Jobs in Job Conductor.
  • Involved in production deployment activities, created the deployment guide for migration of the code to production, and prepared production run books.
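A small, self-contained illustration of the Java 1.8 features mentioned in the first bullet above (streams, lambda expressions, a functional interface, and the java.time API); the row data and filter are invented for the example:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Java8FeaturesDemo {

    // A functional interface: one abstract method, implementable by a lambda.
    @FunctionalInterface
    interface RowFilter {
        boolean accept(String row);
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList("OH,100", "TX,200", "TX,50");

        RowFilter texasOnly = row -> row.startsWith("TX");  // lambda expression

        // Stream pipeline over the collection: filter, transform, collect.
        List<String> amounts = rows.stream()
                .filter(texasOnly::accept)
                .map(row -> row.split(",")[1])
                .collect(Collectors.toList());
        System.out.println(amounts);              // [200, 50]

        // The java.time API replaces the old java.util.Date idioms.
        String today = LocalDate.now().format(DateTimeFormatter.ISO_DATE);
        System.out.println(today);                // e.g. 2016-03-01
    }
}
```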

Environment: Talend Data Integration 6.1/5.5.1, Talend Enterprise Big Data Edition 5.5.1, Talend Administrator Console, Oracle 11g, Hive, AWS, HDFS, Sqoop, Netezza, SQL Navigator, Toad, Control M, Putty, Winscp.

Confidential, St. Petersburg, FL

Sr. ETL/ Talend Developer

Responsibilities:

  • Participated in JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
  • Designed and developed the end-to-end ETL process from various source systems to the Staging area, and from Staging to the Data Marts.
  • Analyzed the source data to assess data quality using Talend Data Quality.
  • Broad design, development and testing experience with Talend Integration Suite, and knowledge of performance tuning of mappings.
  • Developed jobs in Talend Enterprise edition from stage to source, intermediate, conversion and target.
  • Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Solid experience in implementing complex business rules by creating re-usable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite etc.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Used tStatsCatcher, tDie, tLogRow to create a generic joblet to store processing stats into a Database table to record job history.
  • Integrated Java code inside Talend Studio using components like tJavaRow, tJava, tJavaFlex and Routines (see the routine sketch after this list).
  • Experienced in using Talend's debug mode to step through a job and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqueRow, tPivotToColumnsDelimited, etc.
  • Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.
  • Created Context Variables and Groups to run Talend jobs against different environments.
  • Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which increases the performance of a job.
  • Implemented FTP operations using Talend Studio to transfer files between network folders as well as to an FTP server, using components like tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet etc.
  • Experienced in building a Talend job outside of Talend Studio as well as on the TAC server.
  • Experienced in writing expressions within tMap as per the business need. Handled insert and update strategy using tMap. Used ETL methodologies and best practices to create Talend ETL jobs.
  • Extracted data from flat files and databases, applied business logic, and loaded it into the staging database as well as into flat files.
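Talend Routines such as those mentioned above are plain Java classes of static helpers, placed in the routines package, that tMap and other component expressions can call. A minimal sketch; the routine name and logic here are hypothetical:

```java
package routines;

public class StringNormalizer {

    /**
     * Trim a value and map empty strings to null. In a tMap output
     * expression this would be called as, e.g.:
     *   StringNormalizer.trimToNull(row1.customerName)
     */
    public static String trimToNull(String value) {
        if (value == null) {
            return null;
        }
        String trimmed = value.trim();
        return trimmed.isEmpty() ? null : trimmed;
    }
}
```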

Environment: Talend 5.5/5.0, Oracle 11g, Teradata SQL Assistant, HDFS, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, Informatica, TOAD, ERwin, AIX, Shell Scripts, AutoSys, SVN.

Confidential, Columbus, OH

Sr. ETL/ Talend Developer

Responsibilities:

  • Worked closely with Business Analysts to review the business specifications of the project and to gather the ETL requirements.
  • Developed jobs, components and Joblets in Talend. Designed ETL Jobs/Packages using Talend Integration Suite (TIS)
  • Created complex mappings in Talend using tHash, tDenormalize, tMap, tUniqueRow and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
  • Used tStatsCatcher, tDie and tLogRow to create a generic joblet to store processing stats into a database table to record job history.
  • Created Talend mappings to populate the data into dimension and fact tables. Frequently used the Talend Administration Console (TAC).
  • Set up new users, projects and tasks within multiple TAC environments (Dev, Test, Prod, and DR).
  • Developed complex Talend ETL jobs to migrate data from flat files to databases. Implemented custom error handling in Talend jobs and worked on different methods of logging. Created ETL/Talend jobs, both design and code, to process data to target databases.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote some Java code to capture global map variables and use them in the job (sketched after this list).
  • Successfully loaded data from various source systems like Oracle Database, DB2, flat files and XML files into the staging table and then into the target database.
  • Troubleshot long-running jobs and fixed the issues.
  • Prepared an ETL mapping document for every mapping and a Data Migration document for smooth transfer of the project from the development environment to testing and then to production. Performed unit testing and system testing to validate data loads in the target.
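Inside a Talend job, Java components share state through the built-in globalMap. The standalone sketch below stubs globalMap with a HashMap so it stays runnable outside Talend; the connection URL, query and the "MAX_LOAD_DATE" key are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashMap;
import java.util.Map;

public class GlobalMapSketch {

    // Talend provides globalMap inside a job; stubbed here so the sketch compiles.
    static Map<String, Object> globalMap = new HashMap<>();

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; requires the Oracle JDBC driver on the classpath.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT MAX(LOAD_DATE) FROM STG_CLAIMS")) {
            if (rs.next()) {
                // The tJava step: stash a value for downstream components.
                globalMap.put("MAX_LOAD_DATE", rs.getTimestamp(1));
            }
        }
        // A later component reads it back in an expression, e.g.:
        //   (java.sql.Timestamp) globalMap.get("MAX_LOAD_DATE")
        System.out.println(globalMap.get("MAX_LOAD_DATE"));
    }
}
```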

Environment: Talend Open Studio 5.0.1, Informatica Power center, UNIX, Oracle, SQL Server, TOAD, AutoSys.

Confidential

Informatica / Talend Developer

Responsibilities:

  • Developed high-level technical design specifications and low-level specifications based on the business requirements.
  • Extensively used Informatica client tools (Source Analyzer, Warehouse Designer, Mapping Designer and Workflow Manager).
  • Used Informatica Designer for developing mappings, using transformations including Aggregator, Update Strategy, Lookup, Expression, Filter, Sequence Generator, Router and Joiner.
  • Created reusable transformations and mapplets and used them in mappings to reduce redundancy in coding.
  • Extensively used Informatica Power Exchange Change Data Capture (CDC) for creation of Data Maps using Mainframe Tables.
  • Coded a number of batch and online programs using COBOL-DB2-JCL. Designed complex mappings using Lookup (connected and unconnected), Update Strategy and Filter transformations for loading historical data.
  • Extensively used SQL commands in workflows prior to extracting the data in the ETL tool.
  • Implemented different tasks in workflows, including Session, Command, Decision, Timer, Assignment, Event-Wait, Event-Raise, Control, E-Mail etc.
  • Used Debugger to test the data flow and fix the mappings. Involved in Performance tuning of the mappings to improve the performance.
  • Performed Unit Testing and prepared unit testing documentation. Developed the Test Cases and Test Procedures. Extensive use of IDQ for data profiling and quality.
  • Built a Unix script that identifies the PowerCenter folders, checks the mapping, session and workflow names, builds an XML, then zips (tars) all documents based on the given names and deploys them across the environments.
  • Scheduled jobs and box jobs in AutoSys and analyzed the run status of both jobs and box jobs in the DB2 environment.

Environment: Informatica Power Center 9.0/8.6, Informatica Power Exchange CDC, DB2 Mainframe, AutoSys, Toad, Windows XP, UNIX.
