
Informatica Developer Resume


Scotts Valley, CA

SUMMARY

  • About 7 years of professional experience in the IT industry, with a wide range of progressive experience in design, analysis, development, documentation, coding and implementation, including but not limited to databases, reporting, data warehousing, ETL design and BI applications across a wide range of industry verticals.
  • Good working experience in Software Development Life Cycle (SDLC) methodologies like Waterfall and Agile.
  • Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Store (ODS), Enterprise Data Warehouse (EDW).
  • Experienced in using ETL tools including Informatica PowerCenter 10.x/9.x/8.x, PowerMart, PowerExchange, Workflow Manager, Repository Manager, Administration Console and Informatica Data Quality (IDQ).
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Configured and maintained various components of the MDM Hub including the schema, staging and landing tables, base objects, lookups, hierarchies, display queries, put queries and query groups.
  • Experienced in using ETL tools to design SQL Server Integration Services (SSIS) packages in Business Intelligence Development Studio (BIDS) for Data Management applications.
  • Excellent experience in migrating SQL Server 2008 DTS packages to SQL Server 2012 and SQL Server 2016 SSIS packages.
  • Used various Informatica PowerCenter and Data Quality transformations (Aggregator, Source Qualifier, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, XML Parser, Labeler, Parser, Address Validator, Match, Merge, Comparison and Standardizer) to perform various data loading and cleansing activities.
  • Created processes and/or tools that enhance operational workflow and provide positive customer impact.
  • Worked on various applications using Python in integrated IDEs such as Eclipse.
  • Extensively used the Informatica data masking transformation to mask NPI data (SSN, birth date, account number, etc.) in Dev and QA environments; see the masking sketch after this list.
  • Strong experience with Informatica tools using real-time Change Data Capture (CDC) and MD5; see the MD5 change-detection sketch after this list.
  • Experienced in multithreaded upload, leveraging Microsoft’s PolyBase technology to seamlessly copy data from Azure Blob Storage to Azure SQL Data Warehouse at high throughput.
  • Experienced in using advanced concepts of Informatica like Pushdown Optimization (PDO).
  • Healthcare integration experience with HL7 and HIPAA.
  • Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions (SCD); see the SCD sketch after this list.
  • Extensively worked on Azure Functions like HTTP triggers, Blob triggers, Event Hub triggers and Queue triggers to access the blobs uploaded in the containers.
  • Consults with clients and teammates to identify all facets of an issue and generate a solution.
  • Understands potential impacts to processes and systems across the organization and factors these into solutions. Excellent conceptualization, analytical and logical skills.
  • Created mappings using different IDQ transformations like Parser, Standardizer, Match, Labeler and Address Validator.
  • Created reusable transformations to load data from operational data source (ODS) to Data Warehouse and involved in capacity planning and storage of data.
  • Used the Address Validator transformation to validate source data against reference data and standardize addresses.
  • Experienced in Teradata SQL Programming.
  • Expertise in installing, managing and configuring Informatica MDM Hub Server, Informatica MDM Cleanse Server, Hub Resource Kit, ActiveVOS Console and IDD.
  • Worked with Teradata utilities like FastLoad, MultiLoad, TPump and Teradata Parallel Transporter.
  • Experience in using transformations and creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows and processing tasks using Informatica Designer / Workflow Manager.
  • Experienced in scheduling Informatica jobs using scheduling tools like Tidal, Autosys and Control-M.
  • Experience in JAVA, J2EE, Web Services, SOAP, HTML and XML related technologies demonstrating strong analytical and problem-solving skills, computer proficiency and ability to follow through with projects from inception to completion.
  • Identified entities, attributes and their relationships to create logical data models, and converted logical to physical data models using the Erwin tool for MDM and data warehouse projects.
  • Extensive experience in Netezza database design and workload management.
  • Experienced in writing SQL, PL/SQL programming, Stored Procedures, Functions, Triggers, Views and Materialized Views.
  • Good command of databases such as Oracle …, Teradata 13, SQL Server …, Netezza and MS Access 2003.
  • Data Modeling: Data modeling knowledge in Dimensional Data modeling, Star Schema, Snow-Flake Schema, FACT and Dimensions tables.
  • Extensive experience in writing UNIX shell scripts and automating the ETL processes using UNIX shell scripting.
  • Worked on the data governance application for Informatica MDM Hub that enables business users to effectively create, manage, consume and monitor master data using IDD.
  • Experienced with data profiling/data quality using the Informatica Developer, BDM and MDM toolset.
  • Experience in using Golden Gate to support data replication in the Oracle database environment.
  • Worked with other developers to meet company deadlines, using Java and Ruby on Rails, MySQL, MongoDB, Redis, jQuery, Agile, Git, Ajax, etc.
  • Data processing experience in designing and implementing Data Mart applications, mainly transformation processes, using Informatica PowerCenter, SSIS, SSRS, Oracle, PL/SQL, Teradata, SQL Server, Web APIs, Azure Functions in C# and DB2.
  • Excellent communication and presentation skills; works well as an integral part of a team, as well as independently; intellectually flexible and adaptive to change.
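
The MD5-based change detection mentioned in the CDC bullet above can be sketched in a few lines of Python: hash the tracked columns of each incoming row and compare the result against the hash stored on the target, so changed rows are flagged without comparing every column. PowerCenter exposes the same idea through its MD5() expression function; the column names and stored hash below are hypothetical.

    import hashlib

    def row_hash(row, tracked_cols):
        # Concatenate the tracked column values with a delimiter and hash
        # them; a change in any tracked column changes the digest.
        payload = "|".join(str(row.get(c, "")) for c in tracked_cols)
        return hashlib.md5(payload.encode("utf-8")).hexdigest()

    tracked = ["customer_name", "address", "phone"]   # hypothetical columns
    incoming = {"customer_id": 101, "customer_name": "Ann",
                "address": "12 Elm St", "phone": "555-0100"}
    stored_hash = "0" * 32                            # hash kept on the target row

    if row_hash(incoming, tracked) != stored_hash:
        print("changed -> route to the UPDATE flow")
    else:
        print("unchanged -> skip")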
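
The NPI masking bullet can likewise be illustrated with a minimal sketch. The format-preserving SSN mask and the deterministic account surrogate below are assumptions about one reasonable approach, not the exact data masking transformation configuration used on the project.

    import hashlib

    def mask_ssn(ssn: str) -> str:
        # Keep the last four digits, mask the rest, preserve the format.
        digits = [c for c in ssn if c.isdigit()]
        masked = ["X"] * (len(digits) - 4) + digits[-4:]
        it = iter(masked)
        return "".join(next(it) if c.isdigit() else c for c in ssn)

    def mask_account(account: str, salt: str = "dev-salt") -> str:
        # Replace the account number with a repeatable surrogate so joins
        # across Dev/QA tables still line up.
        return hashlib.sha256((salt + account).encode()).hexdigest()[:12]

    print(mask_ssn("123-45-6789"))     # XXX-XX-6789
    print(mask_account("9988776655"))  # deterministic surrogate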
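
As a rough illustration of the Type-II SCD pattern referenced above, the sketch below closes out the current dimension row and appends a new version when a tracked attribute changes. The in-memory list stands in for the dimension table, and all names are made up.

    from datetime import date

    def apply_scd2(dimension, incoming, key, tracked_cols):
        # Find the current version of this business key, if any.
        current = next((r for r in dimension
                        if r[key] == incoming[key] and r["is_current"]), None)
        if current and all(current[c] == incoming[c] for c in tracked_cols):
            return  # nothing changed, keep the current version
        if current:
            current["is_current"] = False      # close out the old version
            current["end_date"] = date.today()
        dimension.append({**incoming, "is_current": True,
                          "start_date": date.today(), "end_date": None})

    dim = [{"cust_id": 1, "city": "Austin", "is_current": True,
            "start_date": date(2020, 1, 1), "end_date": None}]
    apply_scd2(dim, {"cust_id": 1, "city": "Dallas"}, "cust_id", ["city"])
    print(dim)  # old row closed out, new current row appended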

TECHNICAL SKILLS

Data Warehousing/ETL: Informatica PowerCenter 10.x/9.x/8.x, Informatica Data Quality 10.x/9.x, SSIS, PowerExchange, Metadata Manager 9.6.1, MDM, Facets, Data Mart, OLAP, OLTP and Erwin 4.x/3.x.

Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Facets, Physical and Logical Data Modeling, Erwin and Oracle Designer.

Databases & Tools: Teradata, DB2 UDB 8.5, SQL Server … Netezza, Golden Gate, SQL*Plus, SQL*Loader and TOAD.

Scheduling Tools: Autosys, Tidal and Control-M.

Reporting Tools: OBIEE, Tableau, Hadoop, Spark, Business Objects XI/6.5/6.0 and Cognos Series 7.0.

Programming Languages: Unix Shell Scripting, SQL, PL/SQL, Java, XML, XSD, Python Scripting and C#.

Cloud Platforms: Microsoft Azure and Google Web Services

Methodology: Agile, Ruby, SCRUM and Waterfall.

Environment: Windows, UNIX and LINUX.

PROFESSIONAL EXPERIENCE

Confidential, Scotts Valley, CA

Informatica Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Designed table structure in Netezza.
  • Designed various mappings and Mapplets using different Transformations Techniques such as Key Generator, Match, Labeler, Case Converter, Standardizer and Address Validator.
  • Responsible for creating the Product data model in Erwin and importing it into the Informatica MDM Hub Console.
  • Developed new mapping designs using various tools in Informatica like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
  • Used SSIS to create ETL packages (.dtsx files) to validate, extract, transform and load data to data warehouse databases, data mart databases and process SSAS cubes to store data to OLAP databases.
  • Extracted data from the Facets application into the staging area.
  • Updated data in the Facets application.
  • Implemented Informatica BDM mappings for extracting data from the DWH to the Data Lake.
  • Involved in setting up the Hadoop configuration (Hadoop cluster, Hive connection) using Informatica BDM.
  • Analyzed Facets data such as claims and billing to resolve issues in related subject areas.
  • Used the Facets application to open, add generations, enter and save information.
  • Worked with other teams on Facets data model design and Facets batch processing.
  • Developed the mappings using transformations in Informatica according to technical specifications.
  • Created complex mappings that involved implementation of business logic to load data into the staging area.
  • Wrote Python scripts to parse XML documents and load the data into the database; see the parsing sketch after this list.
  • Implemented data quality rules using Informatica Data Quality (IDQ) to check the correctness of the source files and perform data cleansing/enrichment.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created profiles and scorecards for the users using IDQ.
  • Configured Smart Search and Entity 360 in IDD for Product MDM.
  • Built several reusable components on IDQ using Parsers, Standardizers and reference tables.
  • Responsible for deploying, scheduling, alerting on and maintaining SSIS packages using the Integration Services catalog.
  • Developed merge jobs in Python to extract and load data into the MySQL database, and used a test-driven approach for developing applications; see the merge sketch after this list.
  • Followed HIPAA security guidelines.
  • Participated in software and system performance analysis and tuning, service capacity planning and demand forecasting.
  • Created processes and/or tools that enhance operational workflow and provide positive customer impact.
  • Used Informatica reusability at various levels of development.
  • Involved in database migrations from legacy systems, SQL Server to Oracle and Netezza.
  • Developed mappings/sessions using Informatica PowerCenter 9.6.1 for data loading.
  • Hands-on experience with Informatica Cloud services.
  • Tested the data ingested into blobs after it was stored in the Blob containers, and then loaded the data from blobs to Azure Data Warehouse.
  • Performed data manipulations using various Informatica transformations like Filter, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Created mappings and Mapplets according to business requirements using the Informatica Big Data edition, deployed them as applications and exported them to PowerCenter for scheduling.
  • Created the NZLOAD process to load data into the data mart in Netezza.
  • Developed workflows using Task Developer, Worklet Designer and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.
  • Built reports according to user requirements.
  • Used Informatica Intelligent Cloud Services to load data into Azure SQL Data Warehouse.
  • Experienced in loading data between Netezza tables using the NZSQL utility.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Developed applications as proof of concept using Ruby on Rails, JavaScript, HTML5, CSS3, SQL, Node, Backbone, etc.
  • Developed integration services using SOA, Web Services, SOAP, and WSDL.
  • Handled and resolved Golden Gate unique constraint collisions.
  • Wrote UNIX shell scripts to load data from flat files into the Netezza database; see the nzload sketch after this list.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Autosys.
  • Worked on the data governance application for Informatica MDM Hub that enables business users to effectively create, manage, consume and monitor master data using IDD.
  • Performed performance tuning at the source, target, mapping and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
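
A minimal sketch of the XML parsing and database load referenced above; sqlite3 stands in for the project database, and the document structure and table are hypothetical.

    import sqlite3
    import xml.etree.ElementTree as ET

    XML_DOC = """<claims>
      <claim id="C100"><member>M1</member><amount>250.00</amount></claim>
      <claim id="C101"><member>M2</member><amount>99.50</amount></claim>
    </claims>"""

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE claim (claim_id TEXT, member_id TEXT, amount REAL)")

    # Pull one row per <claim> element and bulk-insert the batch.
    root = ET.fromstring(XML_DOC)
    rows = [(c.get("id"), c.findtext("member"), float(c.findtext("amount")))
            for c in root.iter("claim")]
    conn.executemany("INSERT INTO claim VALUES (?, ?, ?)", rows)
    conn.commit()
    print(conn.execute("SELECT * FROM claim").fetchall())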
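
The Python merge jobs above presumably followed an upsert pattern along these lines. sqlite3 is used so the sketch runs anywhere; on MySQL the same statement would be INSERT ... ON DUPLICATE KEY UPDATE issued through a driver such as mysql.connector or PyMySQL.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, val TEXT)")
    conn.execute("INSERT INTO tgt VALUES (1, 'old')")

    extracted = [(1, "new"), (2, "fresh")]  # rows pulled from the source
    conn.executemany(
        "INSERT INTO tgt (id, val) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET val = excluded.val",  # the merge step
        extracted,
    )
    conn.commit()
    print(conn.execute("SELECT * FROM tgt ORDER BY id").fetchall())
    # [(1, 'new'), (2, 'fresh')]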
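
The flat-file loads into Netezza mentioned above were driven from UNIX shell scripts; the Python rendering below shows the same nzload call. The database, table and file names are placeholders, and the options shown (-db, -t, -df, -delim) should be checked against the NPS documentation for the version in use.

    import subprocess

    # Hypothetical database, table and file names.
    cmd = ["nzload", "-db", "EDW", "-t", "STG_ORDERS",
           "-df", "/data/incoming/orders.dat", "-delim", "|"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("nzload failed: " + result.stderr)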

Environment: Informatica PowerCenter 10.1, IDQ 9.6.1, Informatica MDM Hub, Oracle, Teradata V2R5, Vertica, TOAD for Oracle, SQL Server 2016, PL/SQL, DB2, Netezza, Golden Gate, Agile, Big Data, Ruby, Python, SQL, Erwin 4.5, Business Objects, Unix Shell Scripting (PERL), UNIX (AIX), Windows XP and Autosys.

Confidential, Ridgeland, MS

Informatica Developer

Responsibilities:

  • Gathered user Requirements and designed Source to Target data load specifications based on Business rules.
  • Used Informatica PowerCenter 10.1.1 for extraction, transformation and loading (ETL) of data in the data mart.
  • Designed and developed ETL mappings to extract data from flat files, MS Excel and Oracle to load the data into the target database.
  • Extensively used SQL Server Integration Services (SSIS) to produce a data warehouse for reporting.
  • Developed complex T-SQL queries and designed SSIS packages to load the data into the warehouse.
  • Designed SSIS packages using several transformations to perform data profiling, data cleansing and data transformation.
  • Configured message triggers and queues to notify external applications about data changes in MDM.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, Mapplets and parameter files in Mapping Designer.
  • Experienced in using Informatica Intelligent Cloud Services (IICS).
  • Designed and developed BDM mappings in Hive mode for large INSERT/UPDATE volumes.
  • Managed and set up large Big Data clusters of more than 500 nodes.
  • Built transformation mappings to insert and update records during loads.
  • Worked on Extraction, Transforming and Loading (ETL) data flows using SSIS; creating mappings/workflows to extract data from SQL Server and Flat File sources and load into various Business Entities.
  • Expertise in conversions from SQL Server to Teradata.
  • Built Azure Data Warehouse table datasets for Power BI reports.
  • Created Azure Data Factories for data acquisition.
  • Created complex mappings to load the data mart and monitored them; the mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence Generator transformations.
  • Extensively used ETL processes to load data from various source systems such as DB2, SQL Server, flat files and XML files into the target Teradata system by applying business logic.
  • Created queries, procedures and packages in MDM Hub for displaying and updating the data.
  • Ran the workflows on a daily and weekly basis using the ActiveBatch scheduling tool.
  • Examined the workflow log files and assigned tickets to Informatica support based on the error.
  • Experienced in developing Unix shell scripts for automation of the ETL process.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Multicast, Merge Join, Conditional Split, SQL Task, Script Task and Send Mail Task.
  • Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Performed operational support and maintenance of ETL bug fixes and defects.
  • Maintained the target database in the production and testing environments.
  • Created Mapping Parameters and Variables.
  • Worked with Teradata utilities such as MultiLoad, FastLoad, FastExport, etc. to load or extract tables.
  • Created, optimized, reviewed and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views, and to verify data in target tables; see the reconciliation sketch after this list.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Worked on Master Data Management (MDM), Hub configurations (SIF) and Data Director, including extracting, transforming, cleansing and loading the data onto the tables.
  • Performed QA testing of other developers' ETL jobs.
  • Designed, developed, tested, reviewed and optimized Informatica MDM and Informatica IDD applications.
  • Supported migration of ETL code from development to QA and QA to production environments.
  • Migration of code between the Environments and maintaining the code backups.
  • Set up the Informatica BDM tool and Hadoop cluster environment from inception to production.
  • Worked on designs to migrate from traditional RDBMS to Big Data platforms.
  • Designed and developed Unix shell scripts for FTP, sending files to the source directory and managing session files.
  • Performed extensive testing and wrote SQL queries to verify the loading of the data.
  • Developed PL/SQL code at the database level for the new objects.
  • Built reports according to user requirements.
  • Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
  • Developed Tableau data visualizations using Pareto charts, combo charts, heat maps, box-and-whisker charts, scatter plots, geographic maps, crosstabs, histograms, etc.
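
The Teradata SQL test queries above typically reduce to source-to-target reconciliation checks like the sketch below. It assumes the teradatasql driver is available; the host, credentials and table names are placeholders.

    import teradatasql

    # Pairs of (source query, target query) whose results must match.
    CHECKS = [
        ("SELECT COUNT(*) FROM STG.ORDERS",
         "SELECT COUNT(*) FROM EDW.FACT_ORDERS"),
        ("SELECT COALESCE(SUM(order_amt), 0) FROM STG.ORDERS",
         "SELECT COALESCE(SUM(order_amt), 0) FROM EDW.FACT_ORDERS"),
    ]

    with teradatasql.connect(host="tdhost", user="etl_qa", password="***") as con:
        cur = con.cursor()
        for src_sql, tgt_sql in CHECKS:
            cur.execute(src_sql)
            src = cur.fetchone()[0]
            cur.execute(tgt_sql)
            tgt = cur.fetchone()[0]
            print("OK" if src == tgt else "MISMATCH", src, "vs", tgt)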

Environment: Informatica PowerCenter 10.1.1, SQL Server Integration Services (SSIS) 2005, Informatica MDM, Oracle 11g, PL/SQL, Teradata, Flat files, SQL Server 2016, Erwin, UNIX, Toad 9.0, Big Data, Hadoop, Hive, Oracle SQL Developer, Tableau 10.2, Microsoft Azure.

Confidential, VA

Informatica Developer

Responsibilities:

  • Interacted with Business Analyst to understand the business requirements and implement the same into a functional Data warehouse design.
  • Responsible for Developing Informatica development life cycle process documents.
  • Created various PL/SQL stored procedures, functions, views, cursors and indexes on target tables.
  • Extracted data from various sources like Oracle, flat files and XML.
  • Development of mappings as per the technical specifications approved by the client.
  • Developed mappings using Informatica PowerCenter Designer to transform and load the data from source systems to target database.
  • Worked on various types of transformations like Expression, Joiner, update strategy, Aggregator, Filter and Lookup.
  • Experience in Error Handling using TRY and CATCH blocks and performance-tuning using counters in SSIS.
  • Analyzed Facets data such as claims and billing to resolve issues in related subject areas.
  • Used Facets application to open, add generations, enter and save information.
  • Worked with other teams on Facets Data model design and Facets batch processing.
  • Evaluated data file submissions and developed/maintained SSIS packages for the ETL process.
  • Created Mapplets to reduce the development time and complexity of mappings and improve maintenance, and worked on different sources like Oracle and flat files.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Involved in enhancements and maintenance activities of the data warehouse including performance tuning.
  • Used mapping variables for flexible workflow runs based on changing values; see the watermark sketch after this list.
  • Developed SSIS packages using a Foreach Loop in Control Flow to process all Excel files within a folder, a File System Task to move files into an archive after processing, and an Execute SQL Task to insert transaction log data into the SQL table.
  • Involved in debugging the mappings by creating breakpoints to gain troubleshooting information about data and error conditions.
  • Responsible for testing and validating the Informatica mappings against the pre-defined ETL design standards.
  • Created sessions and workflows to run with the logic embedded in the mappings using PowerCenter Designer.
  • Created SSIS packages for File Transfer from one location to the other using FTP task.
  • Played a key role in designing the application and migrating the existing data from relational sources to the corporate warehouse effectively using Informatica PowerCenter.
  • Created various tasks like sessions, decision, timer and control to design the workflows based on dependencies.
  • Coordinated with users and established good relations between users and the development team.
  • Scheduled session tasks comprising different mappings for data conversion and extraction in order to load data into the Oracle database.
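
The mapping-variable bullet above is the classic incremental-extract pattern: persist a high-water mark between runs and extract only rows newer than it. The sketch below mimics that with a small state file; the file name, table and column are hypothetical.

    import json
    import pathlib
    from datetime import datetime, timezone

    STATE = pathlib.Path("last_run.json")

    def get_watermark():
        # The persisted value plays the role of an Informatica mapping
        # variable carried over from the previous run.
        if STATE.exists():
            return json.loads(STATE.read_text())["last_run"]
        return "1900-01-01T00:00:00+00:00"

    def set_watermark(ts):
        STATE.write_text(json.dumps({"last_run": ts}))

    wm = get_watermark()
    query = f"SELECT * FROM src_orders WHERE updated_at > '{wm}'"  # illustrative
    print(query)
    set_watermark(datetime.now(timezone.utc).isoformat())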

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, TOAD, SQL Server, SSIS, Facets, UNIX, AutoSys.

Confidential, Dublin, OH

Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Responsible for Impact Analysis, upstream/downstream impacts. Analyze business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
  • Created new mappings and enhancements to the old mappings according to changes or additions to the Business logic.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet, and Transformation Developer.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Worked on the PROVIDER, CLAIMS, MEMBER, EDI, EITTR and WEBPROVIDERS subject areas.
  • Developed complex mappings using different transformations like Source Qualifier, connected Lookup, unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Sequence Generator and Router transformations.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Used Update Strategy (DD_INSERT, DD_UPDATE) to insert and update data for implementing the Slowly Changing Dimension logic.
  • Developed SCD 1 and SCD 2 to capture new changes while maintaining the historic information.
  • Prepared detailed documentation for the developed code, for QA to use as a guide for future migration work.
  • Extensively worked in the performance tuning of SQL, ETL and other processes to optimize session performance.
  • Worked extensively with different caches such as the index cache, data cache, Lookup cache (static, dynamic and persistent) and Joiner cache while developing the mappings; see the lookup-cache sketch after this list.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Worked with pre- and post-session SQL commands to drop and recreate the indexes on the data warehouse using the Source Qualifier transformation of Informatica PowerCenter.
  • Extensively worked on data profiling task in SSIS.
  • Developed Unit test cases and Unit test plans to check if the data is correctly loading.
  • Developed control files and stored procedures to manipulate and load the data into the Oracle database.
  • Scheduled Informatica jobs and implemented dependencies where necessary.
  • Managed post-production issues and delivered all assignments/projects within specified timelines.
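
The cache bullet above is worth unpacking: a static lookup cache amounts to reading the lookup source once into memory and probing it per row, instead of issuing a query for every row. A made-up sketch of that trade-off:

    # Build the cache in one pass over the lookup source.
    def build_lookup_cache(rows):
        return {r["cust_id"]: r["cust_key"] for r in rows}

    dim_rows = [{"cust_id": "A1", "cust_key": 1001},
                {"cust_id": "B2", "cust_key": 1002}]
    cache = build_lookup_cache(dim_rows)

    for src in [{"cust_id": "A1", "amt": 10}, {"cust_id": "C3", "amt": 5}]:
        key = cache.get(src["cust_id"])  # O(1) probe, no per-row round trip
        print(src["cust_id"], "->", key if key is not None else "no match")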

Environment: Informatica 9.5, SQL Server 2012, Oracle 10g, TOAD, Autosys, MS Office Tools.

Confidential

ETL Informatica Developer

Responsibilities:

  • Worked in all phases of SDLC from requirement gathering, design, development, testing, training and rollout to the field user and support for production environment.
  • Worked in an environment following the Agile methodology.
  • Prepared the required application design documents based on the functionality required.
  • Designed the ETL processes using Informatica to load data from Oracle, flat files (fixed width) and Excel files to the staging database, and from staging to the target Oracle Data Warehouse database; see the fixed-width parsing sketch after this list.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using Update Strategy.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Tuned Informatica session performance for large data files by increasing the buffer block size, data cache size and sequence buffer length.
  • Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Migrated mappings and sessions from the development environment to the testing environment by exporting/importing XML files.
  • Created stored procedures to transform the data and worked extensively in PL/SQL for various transformation needs while loading the data.
  • Worked with the Oracle DAC team to schedule one-time/daily jobs for different Informatica workflows/tasks.
  • Performed unit testing and user acceptance testing to check whether the data extracted from different source systems was loading into the target according to user requirements.
  • Involved in production support, working on various mitigation tickets created while users were retrieving data from the database.
  • Worked with BI team to generate reports using OBIEE.
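
As a small illustration of the fixed-width flat-file handling mentioned above, the sketch below slices each record by a column layout, which is essentially what a fixed-width source definition encodes. The layout and sample record are hypothetical.

    # (name, start, end) offsets for each field in the record.
    LAYOUT = [("cust_id", 0, 8), ("name", 8, 28), ("amount", 28, 38)]

    def parse_line(line):
        rec = {name: line[a:b].strip() for name, a, b in LAYOUT}
        rec["amount"] = float(rec["amount"] or 0)
        return rec

    sample = "CUST0001John Doe            0000150.75"
    print(parse_line(sample))
    # {'cust_id': 'CUST0001', 'name': 'John Doe', 'amount': 150.75}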

Environment: Informatica Power Center 9.0.1, Oracle 9i, TOAD, UNIX Shell Scripting, OBIEE, Oracle DAC
