
ETL/Informatica Developer Resume


Milwaukee, WI

SUMMARY

  • 7+ years of IT experience with expertise in analysis, design, development, testing, and implementation of data warehouses and data marts with ETL/OLAP tools using Informatica Power Center v10.x/9.x/8.x, Power Exchange, Data Quality, Oracle 11g/10g/9i, and SQL Server.
  • Experience in Information Technology with emphasis on Data Warehouse/Data Mart development, including developing strategies for extraction, transformation, and loading (ETL) in Informatica Power Center from various database sources.
  • Strong work experience in Data Mart life-cycle development; performed ETL procedures to load data from different sources such as SQL Server, Oracle, DB2, COBOL files, XML files, and flat files into data marts and data warehouses using Power Center Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor.
  • Involved in understanding business processes, grain identification, and identification of dimensions and measures for OLAP applications.
  • Extensive DW/ETL experience using Oracle Data Integrator (ODI); configured and set up the ODI Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Interfaces, Scenarios, and Filters.
  • Experienced in profiling, analyzing, standardizing, cleansing, integrating, scorecarding, and managing reference data from various source systems using Informatica Data Quality (IDQ).
  • Experience in creating and deploying SSIS packages on high-activity, normalized transaction systems (OLTP), highly usable denormalized reporting systems (OLAP), as well as staging areas and data marts.
  • Developed consumer-based features and applications using Python.
  • Designed and developed referential-integrity, technical, and business data quality rules using IDQ, and cleansed data in the Informatica Data Quality environment.
  • Highly motivated and goal-oriented individual with a strong background in SDLC project management and resource planning using AGILE methodologies.
  • Strong knowledge of Entity-Relationship concepts and fact and dimension tables, and of developing database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional data modeling.
  • Experience in AWS (Amazon Web Services), including S3 buckets and Redshift (the AWS data warehouse service).
  • Comprehensive experience working with Type 1 and Type 2 methodologies for Slowly Changing Dimension (SCD) management (see the sketch after this list).
  • Used Oracle Data Integrator (ODI) Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Proficient in the integration of various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, DB2, and flat files into the staging area, ODS, data warehouse, and data mart.
  • Extensive experience in developing stored procedures, functions, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Experience using the Debugger to validate mappings and obtain troubleshooting information about data and error conditions.
  • Strong experience in MS SQL Server with SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS) and SQL Server Analysis Services (SSAS).
  • Good working knowledge of various Informatica Designer transformations like Source Qualifier, dynamic and static Lookups, connected and unconnected Lookups, Expression, Filter, Router, Joiner, Normalizer, and Update Strategy.
  • Extensively used Teradata utilities like TPump, FastLoad, MultiLoad, BTEQ, and FastExport.
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Experience in task automation using UNIX scripts, job scheduling, and communicating with the server using pmcmd.
  • Extensively used Control-M for Job monitoring and scheduling.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
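
As an illustration of the Type 2 SCD work noted above, below is a minimal expire-and-insert sketch run from a UNIX shell through SQL*Plus. All connection, table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) are hypothetical placeholders, not actual project objects.

    #!/bin/ksh
    # Minimal SCD Type 2 sketch: expire changed rows, then insert new current versions.
    # Connection variables and all object names are hypothetical placeholders.
    sqlplus -s "${DB_USER}/${DB_PASS}@${DB_TNS}" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE

    -- Step 1: close out the current version of any customer whose tracked attributes changed.
    UPDATE dim_customer d
       SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.cust_id = d.cust_id
                      AND (s.cust_name <> d.cust_name OR s.city <> d.city));

    -- Step 2: insert a fresh current version for new and changed customers
    -- (any staged customer with no remaining current row).
    INSERT INTO dim_customer
           (cust_key, cust_id, cust_name, city, eff_start_dt, eff_end_dt, current_flg)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.city,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.cust_id = s.cust_id
                          AND d.current_flg = 'Y');
    COMMIT;
    EOF

In Power Center the same semantics are typically expressed with Lookup and Update Strategy transformations; the SQL form is shown only to make the expire/insert pattern of a Type 2 load concrete.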

TECHNICAL SKILLS

ETL Tools: Informatica Power Center, Informatica B2B-DT, Informatica Power Exchange, Informatica Data Quality, and Informatica Power Connect.

RDBMS: Oracle …, MS-SQL Server …, MS-Access, Teradata, Sybase, IBM DB2.

Tools: Informatica, Eclipse, PuTTY, FileZilla, SQL Developer, Toad, Teradata SQL Assistant, Quality Center, Harvest, SQL Server Management Studio.

Programming Languages: UNIX shell scripting, SQL, PL/SQL, T-SQL, XML, VB, HTML, CSS, XSD, JavaScript, JSON, and ASP.

PROFESSIONAL EXPERIENCE

Confidential, Milwaukee, WI

ETL/Informatica Developer

Responsibilities:

  • Involved in requirements gathering, analysis, functional/technical specifications, development, deployment, and testing.
  • Prepare/maintain documentation on all aspects of ETL processes to support knowledge transfer to other team members.
  • Used Informatica Power Center for migrating data from various OLTP databases to the data mart.
  • Worked with different sources like Oracle, flat files, XML files, MS SQL Server.
  • Developed SSAS cubes with multiple fact measures and multiple dimension hierarchies based on the OLAP reporting needs.
  • Created mappings using Mapping Designer to load data from various sources, making use of Designer transformations such as Source Qualifier, connected and unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence Generator, Union, and Update Strategy.
  • Created mapplets using Mapplet Designer and used them to reuse business logic during development.
  • Built Reports in SSRS for errors generated in SSIS Packages.
  • Used SQL Server Agent to automate SSIS 2012/2008 R2 package execution.
  • Worked on different tasks in workflows, including Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer tasks, as well as workflow scheduling.
  • Performance tuning of Informatica Designer and workflow objects.
  • Used Teradata Utilities (SQL Assistant, BTEQ, MultiLoad, and FastLoad) to maintain the database
  • Created pre/post session commands and pre/post session SQLs to perform tasks before and after the sessions.
  • Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Implemented slowly changing dimensions (Type I and Type II) for customer Dimension table loading.
  • Created UNIX KSH shell scripts to kick off Informatica workflows in batch mode.
  • Invoked Informatica using the "pmcmd" utility and Teradata using "BTEQ" from UNIX scripts (see the sketch after this list).
  • Performed unit and iterative testing to check that data extracted from different source systems was loaded accurately into the targets according to user requirements.
  • Provided support for the applications after production deployment to take care of any post-deployment issues.
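
A minimal KSH sketch of the batch kickoff described in the bullets above, invoking Informatica through pmcmd and Teradata through BTEQ. The service, domain, folder, workflow, logon, and table names are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical names throughout; real values would come from environment-specific config.
    INFA_SV=int_svc_dev        # Integration Service name
    INFA_DOM=dom_dev           # Informatica domain name
    INFA_FOLDER=DW_LOADS       # repository folder
    WF_NAME=wf_stg_customer    # workflow to run

    # Start the workflow and wait for it to finish; pmcmd returns non-zero on failure.
    pmcmd startworkflow -sv "$INFA_SV" -d "$INFA_DOM" \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f "$INFA_FOLDER" -wait "$WF_NAME" || exit 1

    # Post-load row-count check in Teradata via BTEQ.
    bteq <<EOF
    .LOGON tdprod/${TD_USER},${TD_PASS}
    SELECT COUNT(*) FROM stg_db.stg_customer;
    .LOGOFF
    .QUIT
    EOF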

Environment: Informatica PowerCenter 9.5.1, SSIS, Informatica IDQ, SQL, SSRS, UNIX shell scripting.

Confidential, Alpharetta, GA

Sr. ETL/BI Developer

Responsibilities:

  • Involved in analysis of source systems, business requirements, and identification of business rules; responsible for development, support, and maintenance of the ETL process using Informatica.
  • Created/updated ETL design documents for all the Informatica components changed.
  • Extracted data from heterogeneous sources like Oracle, XML, DB2, and flat files, performed data validation and cleansing in the staging area, and then loaded the data into the data warehouse in Oracle 11g.
  • Made use of various Informatica source definitions, viz. flat files and relational sources.
  • Made use of various Informatica target definitions, viz. relational database targets.
  • Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center to load the data from source to stage, stage to persistent, stage to reject, and stage to core.
  • Used Informatica B2B Data Transformation to read unstructured and semi-structured data and load it to the target.
  • Created views, functions, procedures, and packages in PL/SQL for ODI processes.
  • Created web services in the IDQ Developer tool and generated WSDL files for real-time services.
  • Made use of various Power Center Designer transformations like Source Qualifier, connected and unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Sequence Generator, Union, and Update Strategy while creating mapplets/mappings.
  • Involved in extracting the data from OLTP to OLAP using SSIS.
  • Used UNIX scripting to apply rules on the raw data within AWS.
  • Made use of reusable Informatica transformations and shared sources and targets.
  • Created different parameter files and changed session parameters, mapping parameters, and variables at run time (see the parameter-file sketch after this list).
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Broad design, development, and testing experience with Talend Integration Suite, and knowledge of performance tuning of mappings.
  • Developed jobs in Talend Enterprise Edition across the stage, source, intermediate, conversion, and target layers.
  • Created ETL/Talend jobs both design and code to process data to target databases.
  • Created Talend jobs to load data into various Oracle tables and utilized Oracle stored procedures.
  • Designed and implemented custom scripts.
  • Created measure groups, calculations, user-defined hierarchies, Key Performance Indicators (KPIs), and metrics depending on the business requirements using SSAS.
  • Identified and fixed bottlenecks and tuned complex ODI interfaces for optimized performance.
  • Implemented various loads (daily, weekly, quarterly, and on-demand) using an incremental loading strategy and Change Data Capture (CDC) concepts.
  • As part of Business Intelligence and Reporting team, played a key role in Development and Maintenance of SSRS Reporting Environment.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Profiled source data using the IDQ tool to understand source system data representation, formats, and data gaps.
  • Created mappings for Type 1 and Type 2 slowly changing dimensions (SCD) and complete-refresh mappings.
  • Extensively used various data cleansing and data conversion functions like LTRIM, RTRIM, TO_DATE, DECODE, and IIF in Expression transformations.
  • Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
  • Configured and maintained Report Manager and Report Server for SSRS.
  • Designed ETL packages dealing with different data sources (SQL Server, CSV, flat files, etc.) and loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services (SSIS).
  • Configured the testing environment and the process to export and import ODI objects into the Testing and Production (execution) environments.
  • Worked with the "pmcmd" command-line program to communicate with the Informatica server to start, stop, and schedule workflows.
  • Created job streams, added job definitions in Control-M, and executed them.
  • Involved in developing test automation scripts using Perl/Python.
  • Participated in User Acceptance Testing (UAT): wrote and executed test cases, documented and resolved defects, and signed off on the application.
  • During the course of the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
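
As a sketch of the run-time parameter handling mentioned above, the script below writes a per-run Informatica parameter file and passes it to the workflow through pmcmd's -paramfile option. The folder, workflow, session, and parameter names (DW_LOADS, wf_sales_load, s_m_sales_stage, $$LOAD_DT, $$SRC_FILE_DIR) are illustrative, not the project's actual objects.

    #!/bin/ksh
    # Build a per-run parameter file, then hand it to the workflow (names are hypothetical).
    RUN_DT=$(date +%Y%m%d)
    PARAMFILE=/app/infa/parms/wf_sales_load_${RUN_DT}.parm

    # In the unquoted heredoc, \$\$ writes a literal $$ (the Informatica user-defined
    # parameter prefix) while $RUN_DT expands to the current run date.
    cat > "$PARAMFILE" <<EOF
    [DW_LOADS.WF:wf_sales_load]
    \$\$LOAD_DT=$RUN_DT
    [DW_LOADS.WF:wf_sales_load.ST:s_m_sales_stage]
    \$\$SRC_FILE_DIR=/data/inbound/sales
    EOF

    pmcmd startworkflow -sv "$INFA_SV" -d "$INFA_DOM" \
          -u "$INFA_USER" -p "$INFA_PASS" \
          -f DW_LOADS -paramfile "$PARAMFILE" -wait wf_sales_load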

Environment: Informatica Power Center 9.5, Informatica B2B, Python, Oracle 11g, SSIS, DB2, XML, flat files, Win7, SQL*Plus, SSRS, Control-M, Toad, and UNIX.

Confidential - Columbus, OH

Sr. ETL/BI/Talend Developer

Responsibilities:

  • Worked with business analysts on requirement gathering, business analysis, testing, and project coordination.
  • Created the detail design documents containing the ETL technical specifications for the given functionality, overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, shell scripts, etc., and conducted meetings with the clients for approval of the process.
  • Analyzed the existing mapping logic to determine the reusability of the code.
  • Handled versioning and dependencies in Informatica.
  • Profiled source data using the IDQ tool to understand source system data representation, formats, and data gaps.
  • Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, connected Lookup, unconnected Lookup, Update Strategy, Router, Aggregator, and reusable Sequence Generator transformations.
  • Worked with Oracle extract, transform, and load (ETL) methods and tools (Oracle Data Integrator, ODI) to extract data from outside sources, transform it to fit the organization's business needs, and load it into targets such as the organization's data warehouse or application databases.
  • Created region level security in reports to prevent users of one region from seeing data of a different region and implemented subscriptions for Reports using SSRS.
  • Extensively used SCDs (Slowly Changing Dimensions) to handle the incremental loading of dimension tables and fact tables.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Created scorecards in IDQ, with appropriate business rules to gauge the status and progress of data through multiple iterations of MDM data population.
  • Worked extensively on the Talend Admin Console and scheduled jobs in Job Conductor.
  • Used ETL methodologies and best practices to create Talend ETL jobs.
  • Created Talend ETL jobs to receive attachment files from POP e-mail using tPOP, tFileList, and tFileInputMail, then loaded data from the attachments into the database and archived the files.
  • Configured Informatica Power Exchange connections and the Navigator.
  • Created registrations and data maps, and configured real-time mappings and workflows for real-time data processing using the CDC option of Informatica Power Exchange.
  • Developed and maintained ETL objects such as packages, procedures, variables, mappings, load plans, and schedulers in Oracle Fusion Middleware tools such as ODI.
  • Created a SSIS Project using SQL Server Business Intelligence Development Studio.
  • Imported and exported databases using SQL Server Integration Services (SSIS) and Data Transformation Services (DTS) packages.
  • Wrote scripts to load multiple tables using the MLOAD utility.
  • Utilized Informatica IDQ to complete initial data profiling and matching/removing duplicate data.
  • Used PL/SQL to create packages, functions, and procedures.
  • Used PL/SQL and SQL*Loader to create ETL packages for flat-file loading and error capturing into log tables (see the SQL*Loader sketch after this list).
  • Wrote complex SQL queries and PL/SQL procedures to extract data from various source tables of the data warehouse.
  • Installed Power Exchange adapters for AWS (S3 and Redshift) to load the transactional data into the Redshift database.
  • Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results.
  • Performed ETL and database code migrations across environments.
  • Created and configured OLAP cubes using SQL Server Analysis Services (SSAS) and assisted in cube creation for analysis.
  • Created process flows using ODI Packages for loading data from source systems to the staging area, and from staging to dimension and fact tables.
  • Created automated reports using SSRS (SQL Server Reporting Services).
  • Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Developed automated retrieval and insertion of NoSQL data using PyMongo and MongoDB with Python.
  • Created and maintained shell scripts and parameter files in UNIX for the proper execution of Informatica workflows in different environments.
  • Created ODI variables and used them in ODI interfaces, procedures, and packages.
  • Created Unit test plans and did unit testing using different scenarios separately for every process.
  • Involved in system and regression testing, and supported UAT for the client.
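
A minimal sketch of the SQL*Loader flat-file loading referenced above: a generated control file plus the sqlldr call, with rejected rows landing in a bad file for follow-up loading into a log table. The file, table, and connection names are placeholders.

    #!/bin/ksh
    # Hypothetical control file for a delimited customer extract (all names are placeholders).
    cat > customers.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/inbound/customers.dat'
    APPEND INTO TABLE stg_customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (cust_id, cust_name, city, load_dt "SYSDATE")
    EOF

    # Run the load; rejected records go to the .bad file, run details to the .log file.
    sqlldr userid="${DB_USER}/${DB_PASS}@${DB_TNS}" \
           control=customers.ctl log=customers.log bad=customers.bad errors=100

    # A follow-up PL/SQL step could parse customers.bad into an error log table.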

Environment: Informatica Power Center 10/9.5, Informatica Power Exchange 9.5, Talend Data Integration 6.1/5.5.1, Talend Big Data Platform 6.0.1/5.5, Talend Administrator Console, Toad, SSRS, UNIX shell scripting, Oracle Exadata, SQL Server, SSIS, Python, Teradata 14, Autosys, Linux.

Confidential, Oklahoma City, OK

ETL/Informatica Developer

Responsibilities:

  • Involved in business analysis and technical design with business and technical staff to develop requirement document and ETL specifications.
  • Developed the detailed database design document, which included the data dictionary, data estimates, entity-relationship (ER) diagrams, and dimensional model (snowflake schema), using Erwin, MS Office, and MS Visio.
  • Extensively used Informatica Designer as an ETL tool to extract data from legacy (mainframe) source systems to the target system.
  • Extracted data from multiple operational sources for loading into the staging area, data warehouse, and data marts using SCD (Type 1 and Type 2) loads; in this process, Informatica was used to implement business rules, transform, and load.
  • Created ETL mappings, sessions, and workflows using Informatica Power Center to move data from multiple sources like XML, DB2, SQL Server, and Oracle into common targets such as data marts and enterprise data warehouses, and used direct and indirect flat files (delimited and fixed-width) as data sources.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, connected and unconnected Lookup, Stored Procedure, Joiner (all join types), Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart and Enterprise Data Warehouse.
  • Extensively used parameter file to pass mapping and session variables, and parameters.
  • Tuned the performance of sessions and mappings; identified source and target bottlenecks and transformation errors, and resolved the problems.
  • Involved in Unit testing. Created the test case documents for individual mappings and tasks.
  • Automated and scheduled UNIX shell scripts and Informatica sessions and batches using Control-M.
  • Wrote complex SQL queries, functions, stored procedures using cursors and record data types, and triggers in the Oracle database (see the PL/SQL sketch after this list).
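
To illustrate the cursor-based PL/SQL work in the last bullet, here is a minimal sketch of a stored procedure using an explicit cursor and a %ROWTYPE record, compiled through SQL*Plus from a shell script. The procedure, table, and column names are hypothetical.

    #!/bin/ksh
    # Compile a hypothetical cursor-based procedure (all object names are placeholders).
    sqlplus -s "${DB_USER}/${DB_PASS}@${DB_TNS}" <<'EOF'
    CREATE OR REPLACE PROCEDURE prc_flag_stale_orders (p_days IN NUMBER) AS
      CURSOR c_orders IS
        SELECT order_id, order_dt
          FROM stg_orders
         WHERE order_dt < TRUNC(SYSDATE) - p_days;
      r_order c_orders%ROWTYPE;   -- record type derived from the cursor
    BEGIN
      OPEN c_orders;
      LOOP
        FETCH c_orders INTO r_order;
        EXIT WHEN c_orders%NOTFOUND;
        -- Mark each stale row; an audit trigger on stg_orders would fire here.
        UPDATE stg_orders
           SET status = 'STALE'
         WHERE order_id = r_order.order_id;
      END LOOP;
      CLOSE c_orders;
      COMMIT;
    END;
    /
    EOF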

Environment: Informatica Power Center 9.1, Solaris Server 10, Oracle 10g/11g, DB2, Control-M, Toad, Erwin, MS Office, and MS Visio.
