
Sr. Informatica/Talend Developer Resume


NY

SUMMARY

  • Over 8 years of experience in the design, development, and implementation of data warehouses and data marts with ETL and OLAP tools, using Informatica PowerCenter 9.x/8.x/7.x, Informatica MDM, Oracle 10g/9i/8i, Talend 7.x/6.x/5.x, Talend MDM, SQL, PL/SQL, and SQL Server on both Windows and UNIX.
  • Involved in every phase (analysis, design, testing, implementation) of database projects using every common type of data store - transactional (OLTP), data warehouse (Kimball/Inmon), and Data Mart (OLAP).
  • Experience in full life-cycle development of data warehouses for large technology projects, including analysis, design, development, testing, maintenance, and documentation.
  • Strong experience in designing and developing Business Intelligence solutions in Data Warehousing/Decision Support Systems using Informatica PowerCenter.
  • Excellent experience working with Talend ETL, using features such as context variables, database components (tMSSqlInput, tOracleOutput), tMap, file components (tFileCopy, tFileCompare, tFileExist), and ELT components.
  • Involved in data analysis for source and target systems. Good understanding of data warehousing concepts, star schema, and snowflake schema.
  • Extensively executed SQL queries against Oracle (using Toad) and SQL Server tables to verify successful data loads and to validate data (a representative validation query appears after this summary).
  • Solid understanding of implicit context load and global context variables in Talend jobs.
  • Experience in design and implementation using ETL tools such as Informatica PowerCenter Designer, Repository Manager, and Workflow/Server Manager.
  • Experience in repository configuration, creating transformations and mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in integration of various data sources from databases such as MS Access, Oracle, and SQL Server, and formats such as flat files, CSV files, COBOL files, and XML files.
  • Experienced in writing database objects such as stored procedures and triggers.
  • Experience in implementing hybrid connectivity between Azure and on-premises environments using virtual networks, VPN, and ExpressRoute.
  • Planned and developed roadmaps and deliverables to advance the migration of existing on-premises systems/applications to the Azure cloud.
  • Architected and implemented ETL and data movement solutions using Azure Data Factory (ADF), Informatica PowerCenter, and Talend Enterprise Edition.
  • Recreated existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database, and SQL Data Warehouse environment.
  • Designed and implemented migration strategies for traditional systems to Azure (lift-and-shift, Azure Migrate, and other third-party tools).
  • Worked on the Azure suite: Azure SQL Database, Azure Data Lake Store (ADLS), Azure Data Factory (ADF) V2, Azure SQL Data Warehouse, Azure Service Bus, Azure Key Vault, Azure Analysis Services (AAS), Azure Blob Storage, Azure Search, Azure App Service, and Azure data platform services.
  • Experience managing Azure Data Lake Store (ADLS) and Data Lake Analytics, with an understanding of how to integrate them with other Azure services. Knowledge of U-SQL and how it can be used for data transformation as part of a cloud data integration strategy.
  • Created Azure SQL databases and performed monitoring and restores of Azure SQL databases. Performed migration of Microsoft SQL Server to Azure SQL Database.
  • Migrated on-premises data (Oracle, SQL Server, DB2, MongoDB) to Azure Data Lake Store (ADLS) using Azure Data Factory (ADF V1/V2).
  • Experienced with databases such as MySQL and Oracle on AWS RDS.
  • Experienced in AWS S3, EC2, SNS, SQS setup, Lambda, RDS (MySQL), and Redshift cluster configuration.
  • Experienced in AWS (Amazon Web Services), S3 buckets, and Redshift (the AWS data warehouse service).
  • Experienced in implementing error-handling processes.
  • Experienced in using parameter files and mapping parameters in mappings.
  • Experience working with UNIX shell scripts and Perl for automatically running sessions, aborting sessions, and creating parameter files.
  • Extensive experience working with the Quality Center tool for generating reports on a daily and weekly basis.
  • Strong Interpersonal and communication skills, ability to work in a team as well as independently with minimal supervision.
  • A capable and resourceful team member who also possesses excellent written and verbal communication skills.
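For illustration, a minimal sketch of the kind of source-to-target validation query referenced above (Oracle syntax; all table and column names are hypothetical):

    -- Compare staging row counts against the rows loaded today in the target
    SELECT s.src_count,
           t.tgt_count,
           s.src_count - t.tgt_count AS missing_rows
    FROM  (SELECT COUNT(*) AS src_count FROM stg_member) s
    CROSS JOIN
          (SELECT COUNT(*) AS tgt_count
           FROM   dw_member_dim
           WHERE  load_date >= TRUNC(SYSDATE)) t;

A non-zero missing_rows value flags an incomplete load for investigation.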

TECHNICAL SKILLS

Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting, Perl, BTEQ

Methodologies: PMI/PMP, Agile, Scrum, Waterfall

Operating Systems: Windows 98/2000/XP/7, UNIX

Databases & Tools: Oracle 11g/10g/9i, MS SQL Server 2014/2008/2005/2000, DB2, Teradata R14/R13/R12/V6, MS Access, Quality Center, HP Service Center

Cloud DBs: Microsoft Azure SQL DB Gen 5, AWS Redshift, AWS S3, AWS RDS

ETL Tools: Informatica PowerCenter 7.1/8.6.1/9.1/9.5, Informatica MDM 9.x, Talend Open Studio, Talend Enterprise Edition 4.2/5.1/5.5/6.3.1/7.0.1, Talend DQ, Talend MDM

Reporting Tools: Jaspersoft, Tableau 10/9.x

PROFESSIONAL EXPERIENCE

Confidential, NY

Sr. Informatica/Talend Developer

Responsibilities:

  • Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and functional and technical walkthroughs.
  • Worked with ETL developers to create external batches executing mappings and mapplets in the Informatica Workflow Designer, integrating participant data (Member DataMart) from sources such as Oracle, flat files, XML files, and SQL databases and loading it into the landing tables of the Informatica MDM Hub.
  • Developed complex ETL mappings and worked on transformations such as Source Qualifier, Joiner, Expression, Sorter, Aggregator, Sequence Generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy, and Stored Procedure.
  • Worked with the Data Steward team to design, document, and configure Informatica Data Director for managing MDM data; defined trust and validation rules and set up match/merge rule sets to produce the right master records.
  • Identified the Golden Record (BVT) for Echamp member data by analyzing the data and duplicate records coming from different source systems.
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Involved in all cycles of the Informatica MDM to Talend MDM data migration.
  • Developed Talend ETL jobs to push Providers, Organizations, and Claims data into Talend MDM, and developed jobs to extract the data from MDM.
  • Developed data validation rules in Talend MDM to confirm the golden record from the QNXT and Plexis source systems.
  • Developed data matching/linking rules to standardize records in Talend MDM.
  • Developed and implemented software release management strategies for various applications per the Agile methodology.
  • Developed complex ETL mappings for Stage, Dimensions, Facts and Data marts load
  • Extensively used components such as tWaitForFile, tIterateToFlow, tFlowToIterate, tHashOutput, tHashInput, tMap, tRunJob, tJava, tNormalize, and the tFile components to create Talend jobs.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Implemented Talend to extract data from Uniform Assessment System (UAS) XMLs and flat files and load it into SQL Server/Oracle databases for downstream processing.
  • Worked in the Talend Administration Center (TAC) as an administrator, scheduling job deployments using the Job Conductor, execution plans, and triggers.
  • Designed and implemented ETL to load data from source to target databases, including fact loads and Slowly Changing Dimension (SCD) Type 1 and Type 2 processing to capture changes (a simplified SCD Type 2 sketch appears after this list).
  • Wrote shell scripts for scheduling jobs, moving files between servers, and file modifications, used as additional procedures alongside Talend jobs.
  • Implemented Talend automation to reduce time and manpower; automated ETL jobs and unit testing, reducing build time by 30%.
  • Incorporated error handling at the project level to validate data integrity and completeness in the database.
  • Single-handedly completed migration projects moving SSIS, Informatica, and SAP BusinessObjects ETL projects into Talend.
  • Worked on a Claims project migrating traditional data sources from on-premises Microsoft SQL databases to Azure SQL (cloud migration).
  • Worked with project management, business teams and departments to assess and refine requirements to design/develop BI solutions using Azure.
  • Implemented ETL and data movement solutions using Azure Data Factory (ADF), Informatica PowerCenter, and Talend Enterprise Edition.
  • Planned and developed roadmaps and deliverables to advance the migration of existing on-premises systems/applications to the Azure cloud.
  • Created template Informatica workflows and Talend ETL jobs that replicate about 200 processes to load data into Azure SQL.
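For illustration, a simplified sketch of the SCD Type 2 pattern mentioned above, expressed as plain Oracle SQL rather than the actual Informatica/Talend job logic (all table, column, and sequence names are hypothetical):

    -- Step 1: expire the current row for members whose attributes changed
    UPDATE member_dim d
       SET d.effective_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_member s
                    WHERE s.member_id = d.member_id
                      AND (s.name <> d.name OR s.address <> d.address));

    -- Step 2: insert new versions (and brand-new members) as current rows
    INSERT INTO member_dim (member_key, member_id, name, address,
                            effective_start_date, effective_end_date, current_flag)
    SELECT member_dim_seq.NEXTVAL, s.member_id, s.name, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_member s
     WHERE NOT EXISTS (SELECT 1
                         FROM member_dim d
                        WHERE d.member_id = s.member_id
                          AND d.current_flag = 'Y');

Type 1 attributes would instead be updated in place, overwriting history.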

Environment: Informatica 9.1, Informatica MDM 9.x, Talend Enterprise Platform for Data Management (5.x/6.3.1/7.0.1), Talend MDM 5.6.1, UNIX, JIRA, Oracle, Microsoft SQL Server Management Studio, Microsoft Azure SQL DB Gen 5, Windows XP, Tableau 10/9.x.

Confidential, Boulder, CO

Sr. ETL/Talend Developer

Responsibilities:

  • Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
  • Converted functional specifications into technical specifications.
  • Developed complex jobs to load data from multiple source systems, such as flat files and XML files, into a data mart in an Oracle database through AWS.
  • Created Design Documents for source to target mappings. Developed mappings to send files daily to AWS EFS.
  • Used UNIX scripting to apply rules on the raw data within AWS.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Involved in Unix Shell Scripts for automation of ETL process.
  • Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Responsible for creating fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes, constraints
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files in the AWS Simple Storage Service (S3).
  • Worked on Slowly Changing Dimensions Type 1 and Type 2 for populating dimension tables per the specified mapping rules.
  • Worked with Google Analytics, populating data using the tGoogleAnalytics component to implement dashboards for business needs.
  • Extracted data from Informix database using Talend and loaded into SQL Server Database tables.
  • Developed and executed a migration strategy to move Data Warehouse from an Informix platform to AWS Oracle.
  • Worked with Talend Business Intelligence components such as tJasperOutput and tJasperOutputExec to pass data for building reports through Jaspersoft.
  • Used Jaspersoft iReport Designer to generate weekly order summary reports.
  • Scheduled the reports using JasperReports Server.
  • Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, and new requirements.
  • Involved in migrating objects from DEV to QA and testing them and then promoting to Production.
  • Provided production Support by running the jobs and fixing the bugs.
  • Monitored and troubleshot batches and jobs for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Fine-tuned existing Informatica mappings for performance optimization.
  • Troubleshot load failures, including database problems.
  • Involved in Code migration from development to QA and production environments.
  • Involved in database testing, writing complex SQL queries to verify the transactions and business logic.
  • Worked on data warehousing projects using Informatica 9.x/8.x/7.x, Informatica Data Quality (IDQ) 9.5.1 Data Analyst, Oracle 10g/9i, Teradata, and SQL and PL/SQL.
  • Worked on performance tuning of Teradata database & Informatica mappings.
  • Involved in various phases of the software development life cycle right from Requirements gathering, Analysis, Design, Development, and Testing to Production.
  • Worked on loading the data from different sources like Oracle, DB2, EBCDIC files (Created Copy book layouts for the source files), ASCII delimited flat files to Oracle targets and flat files.
  • Extracted data from various source systems like Oracle and flat files as per the requirements and loaded it to Teradata using FASTLOAD, TPUMP and MLOAD
  • Wrote complex SQL queries on Teradata and used them in lookup SQL overrides and Source Qualifier overrides.
  • Identified and eliminated duplicate data using IDQ 9.5.1, applying built-in data profiling rules/transformations/algorithms such as Bigram, Edit Distance, and Jaro Distance to profile data (a simplified SQL analogue of de-duplication appears after this list).
  • Involved in migration of data from Oracle to Teradata.
  • Experience working with mapping variables, mapping parameters, and workflow variables, and implementing SQL scripts and shell scripts in pre-session and post-session commands.
  • Extracted data from various source systems like Oracle and flat files as per the requirements and loaded it to Oracle 11g
  • Developed advanced mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL transformations. Created complex mapplets for reuse.
  • Wrote complex SQL queries on Oracle and used them in lookup SQL overrides and Source Qualifier overrides.
  • Involved in Unit, Functional, Integration and System testing and preparation review documents for the same.
  • Extensively used mapping parameters and mapping variables to provide flexibility, and parameterized the workflows for different system loads.
  • Created sessions and workflows according to the data loads into the different systems.
  • Involved in performance tuning at the mapping and session levels.
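For illustration, a simplified SQL analogue of the de-duplication referenced above (Teradata syntax; names are hypothetical). IDQ's matching is fuzzy (Bigram, Edit Distance, Jaro), whereas this sketch keeps only the most recent row per exact business key:

    -- Keep the latest row per member_id, discarding older duplicates
    SELECT member_id, first_name, last_name, updated_ts
    FROM   stg_member
    QUALIFY ROW_NUMBER() OVER (PARTITION BY member_id
                               ORDER BY updated_ts DESC) = 1;

QUALIFY is Teradata's filter on window-function results, so no wrapping subquery is needed.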

Environment: Talend Open Studio 5.5, Talend Enterprise Edition Integration Suite 5.3, Informatica PowerCenter 9.1, Oracle, Informix, SQL Server, Teradata R13/R14, XML files, CSV files, Tivoli, Windows XP (client), LINUX, Toad, Oracle SQL Developer, AWS (Redshift, S3, RDS), SSH (Secure Shell), Jaspersoft iReport Designer, Talend Administration Center (TAC)

Confidential, Gardner, KS

Talend Developer

Responsibilities:

  • As a member of the ETL team, involved in gathering information, determining the overall ETL architecture, researching the affected data structures, determining data quality, establishing metrics, and developing a full-lifecycle ETL plan.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked on different logging methods.
  • Created ETL/Talend jobs both design and code to process data to target databases.
  • Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote small pieces of Java code to capture globalMap variables and use them in the jobs.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Experienced in Perl, mod_perl, Perl regular expressions, and object-oriented Perl.
  • Loaded data from SQL Server tables to the mainframe using PowerExchange.
  • Extracted data from mainframe databases using PowerExchange and loaded it into SQL Server database tables.
  • Created implicit, local, and global context variables in jobs.
  • Responsible for creating fact, lookup, dimension, and staging tables, and other database objects such as views, stored procedures, functions, indexes, and constraints (a minimal DDL sketch appears after this list).
  • Followed the organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
  • Wrote complex SQL queries to pull data from various sources and integrated it with Talend.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Implemented error handling in Talend to validate data integrity and completeness for data from flat files.
  • Created UNIX scripts and ran them using tSSH and tSystem to read data from flat files and archive the flat files on the specified server.
  • Tuned sources, targets and jobs to improve the performance.
  • Monitored and troubleshot batches and jobs for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Provided production support by running jobs and fixing bugs.
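For illustration, a minimal sketch of the kind of dimension table and supporting objects referenced above (Oracle syntax; all names are hypothetical):

    CREATE TABLE customer_dim (
        customer_key   NUMBER         PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR2(20)   NOT NULL,      -- natural/business key
        customer_name  VARCHAR2(100),
        load_date      DATE           DEFAULT SYSDATE
    );

    -- Surrogate-key generator and an index to speed natural-key lookups
    CREATE SEQUENCE customer_dim_seq START WITH 1 INCREMENT BY 1;
    CREATE INDEX customer_dim_id_ix ON customer_dim (customer_id);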

Environment: Talend Integration Suite 4.x/5.x, Oracle 10g/11g, flat files, PL/SQL, UNIX, Perl, mod_perl, Windows XP, and SVN

Confidential, Pittsburgh, PA

ETL Informatica Developer

Responsibilities:

  • Involved in gathering business requirements, interacting with business users, and translating the requirements into ETL high-level and low-level designs.
  • Produced both high-level and low-level design documents; involved in the ETL design and development of the data model.
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005.
  • Developed complex ETL mappings and worked on the transformations like Source qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and Stored Procedure transformation.
  • Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.
  • Worked on BTEQ scripts, MLOAD, FASTLOAD, TPUMP, TPT utilities of Teradata.
  • Worked on performance tuning of Teradata database & Informatica mappings.
  • Involved in various phases of the software development life cycle right from Requirements gathering, Analysis, Design, Development, and Testing to Production.
  • Worked on loading the data from different sources like Oracle, DB2, EBCDIC files (Created Copy book layouts for the source files), ASCII delimited flat files to Oracle targets and flat files.
  • Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity DB to the Teradata warehouse (a simplified SQL analogue of the incremental extract appears after this list).
  • Extracted data from various source systems like Oracle and flat files as per the requirements and loaded it to Teradata using FASTLOAD, TPUMP and MLOAD
  • Wrote complex SQL queries on Teradata and used them in lookup SQL overrides and Source Qualifier overrides.
  • Involved in migration of data from Oracle to Teradata.
  • Experience in working with Mapping variables, Mapping parameters, Workflow variables, implementing SQL scripts and Shell scripts in Post-Session, Pre-Session commands in sessions.
  • Experience in writing SQL*Loader scripts for preparing test data in the development and test environments and while fixing production bugs.
  • Experience in using the debugger to identify the processing bottlenecks, and performance tuning of Informatica to increase the performance of the workflows.
  • Experience in creating ETL deployment groups and ETL Packages for promoting up to higher environments.
  • Performed and documented the unit testing for validation of the mappings against the mapping specifications documents.
  • Performed production support activities in the data warehouse (Informatica), including monitoring and resolving production issues, researching issues, applying bug fixes, and supporting end users.
  • Experience in writing and implementing FTP, archive, and purge scripts in UNIX.
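For illustration, a simplified SQL analogue of the incremental CDC extract referenced above. PowerExchange captures changes from database logs; this sketch shows the equivalent timestamp-based pull against a control table (all names are hypothetical):

    -- Pull only rows changed since the last successful load
    SELECT c.claim_id, c.claim_status, c.updated_ts
    FROM   clarity_claims c
    WHERE  c.updated_ts > (SELECT last_load_ts
                           FROM   etl_control
                           WHERE  job_name = 'CLAIMS_CDC');

After a successful load, last_load_ts would be advanced to the maximum updated_ts processed.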

Environment: Informatica PowerCenter 9.5/9.1, Oracle 11g, DB2, TOAD 9.0, UNIX, Teradata R13, Perl, and mod_perl
