
Sr. Informatica Developer Resume


Seymour, IN

SUMMARY:

  • Accomplished Senior Informatica Developer with over 9 years of experience in the analysis, development, and implementation of business applications using Informatica PowerExchange and Informatica PowerCenter.
  • Designed and developed complex mappings and mapplets using a wide range of transformations, including reusable transformations, connected/unconnected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Normalizer, XML, and External Procedure.
  • Involved in designing and implementing Data Mart / Data Warehouse applications using Informatica 9.5.1/9.x, Informatica Cloud Express, and PowerCenter 8.x/7.x/6.x (Designer, Workflow Manager, Workflow Monitor, and Repository Manager).
  • Experience in performance tuning of targets, sources, mappings, and sessions.
  • Experience using Informatica 9.0.1 Advanced Edition with add-ons for data profiling (Informatica Data Explorer) and metadata exchange. Collected metadata in the Informatica Metadata Manager repository. Set up and led usage of new Informatica features in the new Analyst and Developer client tools.
  • Good knowledge of the SDLC (software development life cycle) and good experience with unit testing and integration testing.
  • Prepared high-level design specifications for ETL coding and mapping standards.
  • Extensive experience with the OBIEE repository (Physical, Business Model and Mapping, and Presentation layers) for both stand-alone and integrated Siebel Analytics implementations.
  • Experience with Business Objects data integration and with integrating various data sources such as Oracle, SQL Server, MS Access, flat files, Teradata, XML, EBCDIC, and Epic files. Experience with DataFlux and Erwin/ER Studio.
  • Strong expertise in relational database systems such as Oracle 8i/9i/10g, SQL Server 2000/2005, and MS Access, and in design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader.
  • Two years' experience installing, configuring, and testing Hadoop ecosystem components.
  • Experience with UNIX commands, the vi editor, and writing shell scripts.
  • Expertise in Informatica B2B Data Transformation (DT and DX), including pre-built transformations for most versions of industry Confidential messages, including EDI standards and derivatives.
  • Experience in Ralph Kimball methodology, logical modeling, physical modeling, dimensional data modeling, star schemas, snowflake schemas, fact tables, and dimension tables; transformed unstructured data into well-formed XML.
  • Expertise in OLTP/OLAP system study and in developing database schemas such as star and snowflake schemas used in relational and dimensional modeling, including slowly changing dimensions (SCDs).
  • Worked in an Agile team setting to develop features for e-commerce sites on a new technology platform.
  • Performed ETL procedures to load data from different sources into data marts and the data warehouse using PowerCenter.
  • Experience working with scheduling tools such as Autosys and Control-M. Experience with Informatica web services / message queues.
  • Used Informatica Data Quality (IDQ) for data quality analysis; it profiles a subset of data and attribute details to show what is wrong and what is in use, and can generate reports and graphs at various levels based on the data.
  • Experience with high-volume datasets from various sources such as Oracle, text files, and Netezza relational tables, and with XML targets.
  • Knowledge of reporting tools such as OBIEE, Cognos, MicroStrategy, and Business Objects.
  • Experience with the Informatica DAC tool for customizing data warehouses and creating tasks and task groups.
  • Experience with the ILM data archiving tool for retrieving legacy application data.
  • Used the Data Discovery tool to view reports in ILM.
  • Good experience with web services and with mapplets in DT.
  • ETL experience developing mappings and tuning existing mappings for better performance using Informatica PowerExchange and Informatica PowerCenter, per the business rules.
  • HIPAA certified through HP.
  • Expertise in writing Oracle PL/SQL stored procedures, functions, packages, cursors, and triggers (a minimal sketch follows this list).
  • SQL tuning and creation of indexes for faster database access and better query performance in Informatica, using explain plans, SQL hints, and indexes on the required columns (a tuning sketch also follows this list).
  • Developed and executed ETL test plans, documented ETL processing, and generated required metadata.
  • Developed transformations by marking the relevant data directly on a sample of the data source and mapping that data to a chosen XML schema.
  • Worked with Informatica B2B Data Transformation, which maintains architectural separation of protocols, formats, and interfaces, and used its universal deployment capability.
  • Experience leading teams that designed and developed ETL processes to load and maintain the EDW for Confidential, Confidential, Confidential, and other clients, using UNIX shell scripting, Informatica PowerCenter, CDC, Autosys, and stored procedures.
  • Ability to achieve project goals within project constraints such as scope, timing, and budget; allocated work, managed resource planning, monitored deliverables day to day, and met the business clients' expectations.
  • Expertise in writing DDL, DML, DCL, and TCL commands.
  • Experience creating reports using Oracle Business Intelligence 10.x and Siebel Analytics.
  • Outstanding communication and interpersonal skills, the ability to learn quickly, good analytical reasoning, and a fast ramp-up on new technologies and tools.
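
As a concrete, purely illustrative companion to the PL/SQL bullet above, a minimal sketch of a cursor-driven load procedure; the object names (load_order_summary, stg_orders, dw_order_summary) are hypothetical, not from any client system:

```sql
-- Minimal PL/SQL sketch: cursor-driven summary load.
-- All object names are hypothetical.
CREATE OR REPLACE PROCEDURE load_order_summary AS
  CURSOR c_orders IS
    SELECT customer_id, SUM(order_amount) AS total_amount
    FROM   stg_orders
    GROUP BY customer_id;
BEGIN
  FOR rec IN c_orders LOOP
    MERGE INTO dw_order_summary t
    USING (SELECT rec.customer_id  AS customer_id,
                  rec.total_amount AS total_amount
           FROM   dual) s
    ON (t.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET t.total_amount = s.total_amount
    WHEN NOT MATCHED THEN
      INSERT (customer_id, total_amount)
      VALUES (s.customer_id, s.total_amount);
  END LOOP;
  COMMIT;
END load_order_summary;
/
```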
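
The SQL tuning bullet refers to the usual explain-plan / index / hint cycle. A minimal Oracle sketch, again with hypothetical object names (sales_fact, idx_sales_date):

```sql
-- 1. Inspect the optimizer's plan for a slow query.
EXPLAIN PLAN FOR
  SELECT customer_id, SUM(amount)
  FROM   sales_fact
  WHERE  sale_date >= TO_DATE('2014-01-01', 'YYYY-MM-DD')
  GROUP BY customer_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- 2. Index the filter column so the optimizer can avoid a full scan.
CREATE INDEX idx_sales_date ON sales_fact (sale_date);

-- 3. If needed, force the access path with a hint
--    (e.g., in a Source Qualifier SQL override).
SELECT /*+ INDEX(sales_fact idx_sales_date) */
       customer_id, SUM(amount)
FROM   sales_fact
WHERE  sale_date >= TO_DATE('2014-01-01', 'YYYY-MM-DD')
GROUP BY customer_id;
```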

PROFESSIONAL EXPERIENCE:

Confidential, Seymour, IN

Sr. Informatica Developer

Responsibilities:

  • Drove the project from capturing business requirements through project development, acceptance testing, and final implementation.
  • Gathered and reviewed business requirements; designed specifications, design documents, data models, and the data warehouse.
  • Migrated mappings and transformations from one environment to another; worked with the dev team and business partners to clarify ETL requirements and business rules.
  • Wrote Informatica mappings using PowerCenter and PowerExchange 8.x/9.1/10.
  • Designed, developed, implemented, and assisted in validating ETL processes.
  • Created and executed unit test plans based on system and validation requirements.
  • Troubleshot, optimized, and tuned ETL processes.
  • Effectively led teams of 8-20 resources to achieve project goals within constraints such as scope, timing, and budget; allocated work, managed resource planning, monitored deliverables day to day, and met the business clients' expectations.
  • Understood the project requirements given by the client and coordinated with the offshore team to translate technical specs into Informatica mappings with business rules.
  • Interacted with business users effectively to understand precise and accurate requirements.
  • Reviewed the requirements with the business, did regular follow-ups, and obtained sign-offs.
  • Finalized resource requirements and built offshore ETL and reporting teams based on signed-off business requirements.
  • Provided clarifications to the offshore team on technical specification documents, including data transformations and business rules.
  • Used a single solution (Informatica MFT) to transfer any file or message via any protocol.
  • Boosted productivity and decreased operational costs with preconfigured connections, wizards, customizable processes, and an intuitive user interface.
  • Minimized business and regulatory risk for sensitive data with Drummond-certified secure communications and an accelerated encryption option.
  • Effectively coordinated with offshore teams, the client BI team, and business users to deliver the project on time.
  • Worked closely with the client BI team to resolve technical issues, get data models, ETL architectures, and high-level designs reviewed, and provide any technical support required.
  • Effectively coordinated with business users to complete acceptance testing within timelines and obtain sign-offs.
  • Worked on data warehousing concepts/design with a good understanding of the ETL and reporting processes.
  • Participated in cross-application integration testing and system testing and worked with team members in the defect resolution process.
  • Ensured that all data loading/validation timelines were met by comparing against host (mainframe) files.
  • Ensured smooth functioning of the development and QA environments and worked with team members in the defect resolution process.
  • Implemented methods to validate that data supplied by external sources was loaded correctly into the awards database.
  • To validate that data was loaded correctly, examined external data for signals indicating that the source might be incorrect.
  • Installed and configured MapReduce, Scala, Hive, and HDFS; implemented a CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Supported code/design analysis, strategy development and project planning.
  • Created reports for the BI team, using Sqoop to move data into HDFS and Hive (see the HiveQL sketch after this list).
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Assisted with data capacity planning and node forecasting.
  • Collaborate with team members to resolve issues to assure the successful delivery of high quality ETL and BI Code.
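
A hedged illustration of the Hive reporting work above: a query of roughly this shape, run over tables Sqoop-loaded into Hive. The table and column names (web_events, dim_portfolio) are hypothetical:

```sql
-- HiveQL sketch of a BI report over Sqoop-loaded data.
SELECT p.portfolio_name,
       COUNT(*)                  AS event_cnt,
       COUNT(DISTINCT e.user_id) AS distinct_users
FROM   web_events e
JOIN   dim_portfolio p
  ON   e.portfolio_id = p.portfolio_id
WHERE  e.event_dt >= '2015-01-01'
GROUP BY p.portfolio_name
ORDER BY event_cnt DESC;
```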

Environment: Informatica 10/9.x, Informatica MFT, Informatica Cloud Services, PowerCenter 9.5.1/8.6, PowerExchange, UNIX, UltraEdit, WinSQL, WinSCP, MS Access, Windows NT, Oracle 9i/10g, DB2, mainframes, Erwin 4.0, ILM Data Archiving 6.1, SQL, PL/SQL, Scala, T-SQL, TOAD, Hadoop (BDM), Talend, CARS, XML, TestLink, HP Service Manager, Lotus Notes.

Confidential, IL

Sr Informatica Developer

Responsibilities:

  • Drove the project from capturing business requirements through project development, acceptance testing, and final implementation.
  • Effectively led teams of 8-20 resources to achieve project goals within constraints such as scope, timing, and budget; allocated work, managed resource planning, monitored deliverables day to day, and met the business clients' expectations.
  • Understood the project requirements given by the client and coordinated with the offshore team to translate technical specs into Informatica mappings with business rules.
  • Applied rules and policies using the ILM (Information Lifecycle Management) Workbench for the Data Masking transformation and loaded data into targets.
  • Troubleshot errors in ILM jobs with the Informatica offshore team; followed and maintained policies and guidelines for data movement, adhering to client standards, using the ILM tool.
  • Implemented methods to validate that data supplied by external sources was loaded correctly into the awards database.
  • To validate that data was loaded correctly, examined external data for signals indicating that the source might be incorrect.
  • Rewrote the database that maintains information about agency awards programs to support calculations using the new book-of-business concept; this also simplifies the process of adding new metrics and targets.
  • Interacted with business users effectively to understand precise and accurate requirements.
  • Reviewed the requirements with the business, did regular follow-ups, and obtained sign-offs.
  • Finalized resource requirements and built offshore ETL and reporting teams based on signed-off business requirements.
  • Provided clarifications to the offshore team on technical specification documents, including data transformations and business rules.
  • Effectively coordinated with offshore teams, the client BI team, and business users to deliver the project on time.
  • Worked closely with the client BI team to resolve technical issues, get data models, ETL architectures, and high-level designs reviewed, and provide any technical support required.
  • Effectively coordinated with business users to complete acceptance testing within timelines and obtain sign-offs.
  • Worked on data warehousing concepts/design with a good understanding of the ETL and reporting processes.
  • Participated in cross-application integration testing and system testing and worked with team members in the defect resolution process.
  • Ensured that all data loading/validation timelines were met by comparing against host (mainframe) files.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW (see the partitioned-load sketch after this list).
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System (HDFS) and Pig to pre-process the data.
  • This project involved the migration of legacy Oracle applications to an SAP R/3 implementation.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Ensured smooth functioning of the development and QA environments and worked with team members in the defect resolution process.
  • Collaborate with team members to resolve issues to assure the successful delivery of high quality ETL and BI Code.
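
As a sketch of the staging-to-EDW step described above (MapReduce-parsed staging tables flowing into partitioned EDW tables), a HiveQL dynamic-partition load; the names (stg_claims_raw, edw_claims, load_dt) are hypothetical:

```sql
-- Allow dynamic partitioning for this session.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Move refined staging rows into the partitioned EDW table.
INSERT OVERWRITE TABLE edw_claims PARTITION (load_dt)
SELECT claim_id,
       member_id,
       claim_amount,
       load_dt            -- partition column must come last
FROM   stg_claims_raw
WHERE  claim_amount IS NOT NULL;
```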

Environment: Informatica 10/9.x, Informatica Cloud Services, PowerCenter 9.5.1/8.6, PowerExchange, UNIX, UltraEdit, WinSQL, WinSCP, MS Access, Windows NT, Oracle 9i/10g, DB2, mainframes, Erwin 4.0, ILM Data Archiving 6.1, SQL, PL/SQL, T-SQL, TOAD, Hadoop, Talend, CARS, XML, TestLink, HP Service Manager, Lotus Notes.

Confidential, MI

Sr. Informatica Developer/project coordinator

Responsibilities:

  • Gathered and reviewed business requirements; designed specifications, design documents, data models, and the data warehouse.
  • Migrated mappings and transformations from one environment to another.
  • Worked with the dev team lead and business partners to clarify ETL requirements and business rules.
  • Wrote Informatica mappings using PowerCenter and PowerExchange 8.x/9.1.
  • Designed, developed, implemented, and assisted in validating ETL processes.
  • Created and executed unit test plans based on system and validation requirements.
  • Troubleshot, optimized, and tuned ETL processes.
  • Documented all ETL-related work per Confidential DF methodology.
  • Maintained existing code and fixed bugs whenever needed.
  • Ensured that all data loading/validation timelines were met.
  • Ensured smooth functioning of our development, QA, production, and staging environments.
  • Used PuTTY on UNIX to transfer, copy, and move files and to assign read/write/execute privileges; used TOAD to develop Oracle PL/SQL, DDLs, and stored procedures and for performance work.
  • Scheduled and ran extraction and load processes and monitored tasks and workflows using Workflow Manager and Workflow Monitor.
  • Used an error-handling strategy to trap errors in a mapping and send them to an error table (the database side of this pattern is sketched after this list).
  • Scheduled Informatica workflows using the Informatica scheduler.
  • Wrote UNIX shell script jobs to automate the workflows for batch processing.
  • Used the Informatica Target Load Plan option to maintain the order of target loading.
  • Performed unit testing and development testing of my mappings at the ETL level.
  • Created Informatica mapping templates with the Mapping Architect for Visio tool.
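
The error-handling bullet above refers to trapping bad rows in a mapping and routing them to an error table. A sketch of what the database side of that pattern can look like; every object name here is hypothetical:

```sql
-- Hypothetical error table the mapping's error flow writes to.
CREATE SEQUENCE etl_error_seq;

CREATE TABLE etl_error_log (
  error_id   NUMBER         NOT NULL,
  mapping_nm VARCHAR2(100),
  row_key    VARCHAR2(200),
  error_msg  VARCHAR2(4000),
  error_ts   TIMESTAMP DEFAULT SYSTIMESTAMP
);

-- Logging procedure; the autonomous transaction keeps the log row
-- even if the surrounding load rolls back.
CREATE OR REPLACE PROCEDURE log_etl_error (
  p_mapping_nm IN VARCHAR2,
  p_row_key    IN VARCHAR2,
  p_error_msg  IN VARCHAR2
) AS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO etl_error_log (error_id, mapping_nm, row_key, error_msg)
  VALUES (etl_error_seq.NEXTVAL, p_mapping_nm, p_row_key, p_error_msg);
  COMMIT;
END log_etl_error;
/
```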

Environment: Informatica PowerCenter 10/9.1/8.6.1, Informatica B2B, DTD files, IDCOS, Oracle 10g/11g, WinSQL, PostgreSQL, PL/SQL, SQL*Plus, SQL*Loader, HTML 4.0, MS SQL Server 2000, MS Excel, XML, flat files, Windows NT, HP-UX, shell scripting, WinSCP, PuTTY, SmartFTP.

Confidential, CT

Lead/Sr. Informatica Developer

Responsibilities:

  • Gathered and reviewed business requirements; designed specifications, design documents, data models, and the data warehouse.
  • Responsible for definition, development, and testing of the processes/programs necessary to extract data from operational databases and syndicated file systems: IMS, SWIFT, HL7, and EDI X12 flat files.
  • Transformed and cleansed data and loaded it into the data warehouse using Informatica PowerCenter.
  • Created users, user groups, and their access profiles in Repository Manager.
  • Created Logical & Physical models and used ERwin for data modeling and Dimensional Data Modeling.
  • Created complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner and Stored procedure transformations.
  • Created connected and unconnected Lookup transformations to look up the data from the source and target tables.
  • Modeling and populating the business rules using mappings into the Repository for Meta Data management.
  • Worked extensively with HL7 data, EDI X12 messages, HIPAA transactions, and ICD codes.
  • Worked with HIPAA transactions and EDI transaction sets (834, 835, 837, 824, 820, 270, 276, 271, 278).
  • Used HL7 data in creating the EMPI and a patients data mart.
  • Parsed HL7 messages and worked with HL7 delimiter definitions (segment terminator, field separator, component separator, subcomponent separator, repetition separator, escape character) for identifying and separating HL7 data.
  • Demonstrated expertise utilizing ETL tools, including SQL Server Integration Services (SSIS), and Informatica and ETL package design, and RDBMS systems like SQL Server, Oracle.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings, known as ‘Mapplets’ using Informatica Designer.
  • Involved in the development of Informatica mappings and also performed tuning for better performance.
  • Designed and developed sophisticated workflows and mappings using B2B Data Transformation and Data Studio with the Informatica B2B components DX (Data Exchange) and DT (Data Transformation).
  • Extensively worked on tuning (Both Database and Informatica side) and thereby improving the load time.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Setting up sessions to schedule the loads at required frequency using Power Center, Workflow manager, PMCMD and also using scheduling tools.
  • Converted unstructured data into XML format; used DT to build transformations and scheduled the workflows with the DT tool.
  • Used parallel processing capabilities, session partitioning, and target table partitioning utilities (a partitioned target table is sketched after this list).
  • Automated the entire processes using UNIX shell scripts.
  • Implemented deployment procedures, started the Informatica services through UNIX/PuTTY, performed the migration from the old version to the new version, and scheduled the jobs through Informatica.
  • Developed production support technical documents. Worked on UDB/DB2 and tested the target data against source system tables by writing QA procedures.
  • Conducted status meetings with project managers, escalated issues when necessary, and conducted meetings for issue resolution.
  • Interacted with Lead Developers, System Analysts, Business Users, Architects, Test Analysts, Project Managers and peer developers to analyze system requirements, design and develop software applications.
  • Created new extracts for external vendors and used Informatica ETLs in new workflows to move data out of multiple data sources.
  • Unit tested coding changes, filled out required documentation such as installation instructions, and followed standards and procedures.
  • Provided Tier 1 support duties for Informatica production support, including off hours and weekends; kept production support team management informed of any issues or concerns; took on new development and support task responsibilities.
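
As a hedged illustration of the target-table-partitioning bullet above, the kind of range-partitioned Oracle target that partitioned sessions load into; the table (claims_fact) and its columns are hypothetical:

```sql
-- Range-partitioned target so partitioned sessions can load
-- (and queries can prune) by service date.
CREATE TABLE claims_fact (
  claim_id     NUMBER        NOT NULL,
  member_id    NUMBER        NOT NULL,
  claim_amount NUMBER(12,2),
  service_dt   DATE          NOT NULL
)
PARTITION BY RANGE (service_dt) (
  PARTITION p2012 VALUES LESS THAN (TO_DATE('2013-01-01', 'YYYY-MM-DD')),
  PARTITION p2013 VALUES LESS THAN (TO_DATE('2014-01-01', 'YYYY-MM-DD')),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);
```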

Environment: Informatica 9.x, PowerCenter 9.0/8.6, PowerExchange, Informatica MFT, HL7 3.x/2.4, HIPAA, Epic Systems, UNIX, Windows NT, Oracle 9i/10g, DB2, Talend, Erwin 4.0, Scala, SQL, PL/SQL, T-SQL, TOAD, CARS, XML, MDM.

Confidential, NYC

Sr. Informatica Developer

Responsibilities:

  • Developed complex Informatica mappings to load data from various sources using different transformations: Source Qualifier, connected and unconnected Lookup, Expression, Aggregator, Joiner, Filter, Normalizer, Rank, and Router.
  • Worked with Informatica PowerCenter tools: Source Analyzer, Mapping Designer, mapplets, and transformations.
  • Developed Informatica mappings and tuned them for better performance.
  • Worked with the MLOAD, FASTLOAD, TPUMP, and BTEQ utilities of Teradata for faster loading and better performance; loaded data from flat files, Oracle, and Teradata databases.
  • Developed Teradata views against the departmental database and the claims engine database to get the required data.
  • Extensively used functions such as LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR, and IIF.
  • Responsible for Performance Tuning at the Mapping Level and Session level.
  • Worked with SQL Override in the Source Qualifier and Lookup transformation.
  • Extensively worked with both Connected and Unconnected Lookup Transformations.
  • These views were built through the Harvest Change Manager tool in the desired schema in the Teradata warehouse and used as one of the sources for Informatica.
  • Load balancing of ETL processes, database performance tuning and capacity monitoring.
  • Involved in unit testing and system testing of individual components and in the source system upgrade.
  • Analyzed existing system and developed business documentation on changes required.
  • Used UNIX to create Parameter files and for real time applications.
  • Developed shell scripts.
  • Prepared detailed design documentation for the production support department to use as a hand guide for future production runs before the code was migrated.
  • Prepared a unit test plan and created efficient unit test documentation, along with unit test cases, for the developed code.
  • Created detailed system defect records to keep the project team informed of status throughout the process.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
  • Used Update Strategy expressions (DD_INSERT, DD_UPDATE) to insert and update data when implementing the slowly changing dimension logic.
  • Developed reusable transformations and reusable mapplets.
  • Developed slowly changing dimension mappings for Type 1 and Type 2 SCDs (see the SQL sketch after this list).
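
The DD_UPDATE/DD_INSERT bullets above implement Type 2 slowly changing dimensions. Expressed as plain SQL rather than Informatica transformations, the logic is roughly the following two steps; the tables (customer_dim, stg_customer) and the tracked attribute (address) are hypothetical:

```sql
-- Step 1 (the DD_UPDATE path): expire the current version of rows
-- whose tracked attribute changed in staging.
UPDATE customer_dim
SET    eff_end_dt  = CURRENT_DATE,
       current_flg = 'N'
WHERE  current_flg = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = customer_dim.customer_id
                 AND  s.address    <> customer_dim.address);

-- Step 2 (the DD_INSERT path): insert the new current version
-- (covers both changed customers and brand-new ones).
INSERT INTO customer_dim
       (customer_id, address, eff_start_dt, eff_end_dt, current_flg)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim t
                   WHERE  t.customer_id = s.customer_id
                     AND  t.current_flg = 'Y'
                     AND  t.address     = s.address);
```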

Environment: Informatica PowerCenter 9.x/8.6.1/8.1.3, Informatica Identity Resolution (IIR), Teradata V2R12/V2R6, Oracle 10g, Teradata SQL Assistant and Administrator, CARS, IMS data, XML, Linux, UNIX shell scripting, Rational ClearQuest, Agile methodology, Windows, Autosys.
