
Sr. Informatica Developer Resume


Tampa, FL

PROFESSIONAL SUMMARY:

  • Over 8 years of progressive hands-on experience in analysis, design, and development of enterprise-level data warehouse architectures, including designing, coding, testing, and integrating ETL processes.
  • Proficient in understanding business requirements and experienced in interacting with business users to clarify requirements and translate them into technical specifications.
  • Experience in dimensional data modeling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (requirement analysis, design, development, and testing), and data warehouse concepts: Star Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling.
  • Experienced in integrating various data sources such as Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce into staging areas and different target databases.
  • Expertise in developing standard and reusable mappings using transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, Router, and Filter.
  • Designed complex mappings and have expertise in performance tuning of sources, targets, mappings, sessions, and workflows.
  • Experience in designing and developing SQL and PL/SQL code; comfortable developing UNIX shell scripts to run SQL scripts and Informatica workflows from the UNIX server, as well as Perl scripting and automation of ETL processes.
  • Extensively worked on Informatica Data Quality (IDQ) and Informatica PowerCenter throughout complete data quality projects.
  • Extensively worked on Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer.
  • Strong experience in various MDM implementations and architectures covering territory, customer, employee, and product domains.
  • As a data quality developer, initiated data profiling by profiling different formats of data from different sources.
  • Expertise in Informatica Data Services (IDS), which makes data available through a virtual database providing access to heterogeneous sources without loading data to a physical target, and through web services that make data available over the internet so a client can connect to access, transform, and deliver data.
  • Expertise in IDS services such as defining business logic, analyzing structure and data quality, and creating a single view of data.
  • Involved in production support: resolving production job failures and interacting with the operations support group to resume failed jobs.
  • Expertise in data staging, which supports the enrichment stages data goes through in order to populate an ODS and/or data warehouse and is essential for creating a comprehensive data-centric solution for any data warehousing project.
  • Experience in creating SSIS packages using ActiveX scripts and with error handling.
  • Expertise in enhancing and deploying SSIS packages from the development server to the production server.
  • With both on-site and offshore experience, developed skills in system analysis, troubleshooting, debugging, deployment, team management, task prioritization, and customer handling.
  • Excellent working experience in the insurance industry with strong business knowledge of the Auto, Life, and Health Care lines of business.
  • Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
  • Hands-on experience working in LINUX, UNIX, and Windows environments.
  • Experience working in production support and migrating code from DEV to QA to Production.
  • Excellent verbal and written communication skills; proven highly effective in interfacing across business and technical groups.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.5/9.1/8.6, Informatica Cloud, Informatica Power Exchange 5.1/4.7/1.7, Power Analyzer 3.5, Informatica Power Connect and Metadata Manager (MDM), Data Quality Tool (IDQ), Informatica Data Services (IDS) 9.6.1, DataStage

Databases: Oracle 11g/10g/9i/8i/8.0/7.x, Teradata 14.1, DB2 UDB 8.1, MS SQL Server 2008/2005, SQL Server Management Studio (2008), Netezza 4.0, DB Artisan (Sybase ASE 12.5.3/15)

Operating Systems: UNIX (Sun Solaris, HP-UX), LINUX, Windows NT/XP/Vista, MS-DOS

Programming: SQL, SQL-Plus, PL/SQL, Perl Scripting, UNIX Shell Scripting

Reporting Tools: Business Objects XI R2/6.5/5.0/5.1, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy.

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, and Snowflake Modeling.

Tools: Erwin 3/4/4.1, Tortoise SVN 1.6.15, CA Scheduling Tool, ESP, CICS, CA WA Workstation, CA Software Change Management Workbench.

Other Tools: SQL Navigator, Rapid SQL for DB2, Quest Toad for Oracle, Toad for Data Analyst 3.0, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality center, Tivoli Job Scheduling Console, JIRA, SSIS

Domain Expertise: Healthcare, Banking, Finance.

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

Sr. Informatica Developer

Responsibilities:

  • Interacted with users and business analysts to collect and understand business requirements.
  • Understood the full software development life cycle (SDLC) for ETL processes.
  • Scheduled and monitored sessions and workflows on a run-on-demand and run-on-time basis using Informatica PowerCenter Workflow Manager.
  • Analyzed session log files to resolve errors in mappings and managed session configuration.
  • Performed performance tuning at various levels, including target, source, mapping, and session, for large data files.
  • Worked with different kinds of code migration, including folder migration, deployment groups, and manual migration.
  • Performed Informatica administration functions including server upgrades, mapping and session migration, user administration, and performance tuning.
  • Involved in production support by performing normal loads, bulk loads, initial loads, incremental loads, daily loads, and monthly loads.
  • Developed reports based on issues related to the data warehouse.
  • Developed monitoring scripts in UNIX.
  • Validated the required data at the database level using Toad.
  • Moved data files to another server using SCP on the UNIX platform (see the sketch after this list).
  • Used Toad for DDLs and to validate the data.
  • Used the Debugger to test mappings and fix bugs, and identified bottlenecks at all levels to tune performance.
  • Resolved production support tickets using Remedy.
  • Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.
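
A minimal sketch of the kind of UNIX file-move and monitoring script described above (host name, paths, file name, and notification address are illustrative placeholders, not project values):

#!/bin/ksh
# Copy the daily extract file to the downstream server with SCP and log the result.
SRC_DIR=/data/etl/outbound                      # placeholder path
FILE=daily_extract_$(date +%Y%m%d).dat          # placeholder file name
TARGET_HOST=downstream-server                   # placeholder host
TARGET_DIR=/data/etl/inbound                    # placeholder path

if scp "$SRC_DIR/$FILE" "$TARGET_HOST:$TARGET_DIR/"; then
    echo "$(date): $FILE transferred to $TARGET_HOST" >> "$SRC_DIR/transfer.log"
else
    echo "SCP of $FILE to $TARGET_HOST failed" | mailx -s "ETL file move failure" etl_support@example.com
    exit 1
fi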

Environment: Informatica Power Center 9.6.1, Oracle 11g, IDQ 9.6.1, UNIX, Teradata 14.0, SQL Server, SSIS, Teradata Data Mover, Autosys Scheduler Tool, Teradata SQL Assistant 13.0, Power Connect, DB2, Business Objects XI 3.5, IDS 9.6.1

Confidential, Eagan, MN

Informatica Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirement gathering, design, development, and testing through production deployment and support of the production environment.
  • Strong expertise in installing and configuring the core Informatica MDM components (Informatica MDM Hub Server, Hub Cleanse, Resource Kit, and Cleanse Adaptors like Address Doctor)
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
  • Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in dimensional data modeling and in populating business rules into the repository using mappings for data management.
  • Worked on Informatica PowerCenter 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
  • Used various transformations to implement simple and complex business logic.
  • Created numerous mappings and mapplets using transformations such as Filter, Aggregator, Lookup, Expression, Sequence Generator, Sorter, Joiner, and Update Strategy.
  • Created and configured workflows, worklets & Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Worked on building data integration, workflow, and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS).
  • Used mapplets and reusable transformations to prevent redundant transformation logic and improve maintainability.
  • Used Teradata utilities (BTEQ, MultiLoad, and FastLoad) to maintain the database.
  • Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Actively involved in data validations and unit testing to make sure data is clean and standardized before loading into the MDM landing tables.
  • Troubleshoot problems by checking sessions and error logs.
  • Configured sessions with multiple partitions of source data using Server Manager to improve performance.
  • Actively involved in the exception handling process using the IDQ Exception transformation after loading data into MDM, and notified the data stewards of all exceptions.
  • Worked on UC4 as the job scheduler, running the created applications and their respective workflows at selected recurring times.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Extensively worked on the Triggers, Functions, and Database Constraints.
  • Tuned Informatica Mappings and Sessions for optimum performance.
  • Designed and developed UNIX scripts for creating and dropping tables used in job scheduling.
  • Invoked Informatica workflows using the "pmcmd" utility from UNIX scripts (see the sketch after this list).
  • Wrote pre-session shell scripts to check session mode (enable/disable) before running/scheduling batches.
  • Performed process analysis to provide detailed documentation and recommendations to the load forecasting team for future improvements.
  • Involved in a 24x7 support rotation, with strong command of the Maestro scheduling tool.
  • Involved in Production support activities like batch monitoring process in UNIX
  • Involved in Production Support by performing Normal Loads, Bulk Loads, Initial Loads, Incremental Loads, Daily loads and Monthly loads.
  • Transitioned support of the load forecasting applications/tools to an IT support organization and implemented standard IT support processes and procedures.
  • Prepared unit test case documents.
  • Performed peer reviews within the project.
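
A minimal sketch of the pmcmd-based workflow invocation referenced above (domain, integration service, folder, and workflow names are illustrative placeholders; credentials would normally come from a secured environment rather than the script itself):

#!/bin/ksh
# Start an Informatica workflow from a UNIX script and wait for it to complete.
# INFA_USER and INFA_PWD are assumed to be exported by the batch environment.
DOMAIN=Domain_Dev              # placeholder domain name
INT_SVC=IS_Dev                 # placeholder integration service
FOLDER=DW_LOADS                # placeholder repository folder
WORKFLOW=wf_daily_load         # placeholder workflow name

pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
RC=$?
if [ $RC -ne 0 ]; then
    echo "$(date): $WORKFLOW failed with return code $RC" >> /tmp/pmcmd_runs.log
    exit $RC
fi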

Environment: Informatica Power Center 9.6.1, UNIX, Linux, Perl, Shell, MDM, IDQ, IDS, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0, Maestro.

Confidential, Cleveland, OH

Sr. Informatica Developer

Responsibilities:

  • Worked on the complete SDLC for the extraction, transformation, and loading of data using Informatica.
  • Involved in the analysis, design, and development of all the interfaces using Informatica PowerCenter tools on the interface team, and interfaced with all the other tracks on business-related issues.
  • High proficiency with Power Center and Power Connect; involved in utilizing Informatica products to move DB2 and flat-file data to Oracle.
  • Created ETL processes to extract data from mainframe DB2 tables for loading into various Oracle staging tables.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Involved with Informatica team members in designing, documenting, and configuring the Informatica MDM Hub to support loading, cleansing, matching, merging, and publishing of MDM data.
  • Defined the Target Load Order plan to load data correctly into the different target tables.
  • Worked Extensively on Informatica Developer (IDQ) with transformations like Lookup, Update Strategy, Router, Standardizer, Aggregator, Filter, Expression, Union, Labeler, Address Validator, Java and so on.
  • Created complex Data Quality (IDQ) mappings depending on the requirement.
  • Created and ran the unit test scripts for base data mart, remediate unit test defects and modified code accordingly.
  • Used relational sources and flat files to populate the data mart.
  • Translated the business processes into Informatica mappings for building the data mart.
  • Worked on SQL tools like TOAD to run SQL Queries to validate the data.
  • Used Debugger to test the mappings and fix the bugs.
  • Involved in the disaster recovery exercise program for the data warehouse.
  • Served as the main resource for upgrading from Informatica 8.6 to 9.1.0.
  • Served as the main resource in the migration from our input vendor WKH to IMS.
  • Wrote ad-hoc query batches and stored procedures to assist in analyzing, cleaning, checking, and processing data to ensure the maximum possible integrity and quality (see the sketch after this list).
  • Worked with different Sources such as Oracle, SQL Server and Flat file.
  • Used Informatica to extract data into Data Warehouse.
  • Extensively used several transformations such as Source Qualifier, Router, Lookup (connected &unconnected), Update Strategy, Joiner, Expression, Aggregator and Sequence generator transformations.
  • Created reusable transformations and Mapplets and used them in mappings.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for the Claim Profitability systems to facilitate daily, monthly, and yearly loading of data.
  • Involved in fixing invalid mappings and in testing stored procedures and functions, Informatica sessions, and the target data.
  • Worked on SQL tools like SQL developer to run SQL queries to validate the data.
  • Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using the Informatica server.
  • Created and scheduled sessions and worklets using Workflow Manager to load the data into the target database.
  • Involved in migrating the code to the production environment.
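
A minimal sketch of the style of ad-hoc data check mentioned above, run through SQL*Plus from a UNIX script (the connection variables, table, and column names are illustrative placeholders, not objects from the actual project):

#!/bin/ksh
# Ad-hoc data-quality check: count suspect rows in a staging table before further processing.
# DB_USER, DB_PWD, and DB_SID are assumed to be exported by the calling environment.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<EOF >> /tmp/dq_checks.log
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT 'stg_claims suspect rows: ' || COUNT(*)
FROM   stg_claims                -- placeholder staging table
WHERE  paid_amt < 0              -- negative payments
   OR  member_key IS NULL;
EXIT
EOF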

Environment: Informatica Power Center 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad and UNIX Shell Scripts.

Confidential, Chicago, IL

Informatica Developer

Responsibilities:

  • Designed the dimensional model and data load process using SCD Type 2 for quarterly membership reporting.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Generated the data feeds from the analytical warehouse using the required ETL logic to handle data transformations and business constraints while loading from source to target layouts.
  • Worked on Master Data Management (MDM) Hub development: extracting, transforming, cleansing, and loading the data onto the staging and base object tables.
  • Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
  • Wrote Shell Scripts for Data loading and DDL Scripts.
  • Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
  • Implemented the automated balancing and control process that enables audit, balance, and control for the ETL code.
  • Improved database access performance by tuning DB access methods such as creating partitions, using SQL hints, and using proper indexes (a sketch follows this list).
  • Integrated all the jobs using complex mappings, including mapplets and workflows, built with Informatica PowerCenter Designer and Workflow Manager.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
  • Analyzed the impact and the changes required to incorporate the standards into the existing data warehousing design.
  • Followed the PDLC process to move the code across environments through proper approvals and source-controlled environments.
  • Managed source control using SCM.
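
A minimal sketch of a DDL/tuning script of the kind described above, run through SQL*Plus from a UNIX script (the index, table, column, and connection names are illustrative placeholders, and it assumes the fact table is already partitioned):

#!/bin/ksh
# Add a local index used by the balancing queries and refresh optimizer statistics.
# DB_USER, DB_PWD, and DB_SID are assumed to be exported by the calling environment.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<EOF
CREATE INDEX idx_fact_mbr_month                    -- placeholder index name
    ON fact_membership (member_key, report_month)  -- placeholder table and columns
    LOCAL;
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'FACT_MEMBERSHIP')
EXIT
EOF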

Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0, Sun Solaris 2.6.

Confidential

Informatica Developer

Responsibilities:

  • Performed business analysis, requirements gathering and converted them into technical specifications.
  • Architected all the ETL data loads coming in from the source system and loading into the MIS data mart.
  • Developed Informatica Sessions & Workflows using Informatica Workflow manager.
  • Optimized the performance of the Informatica mappings by analyzing the session logs and understanding various bottlenecks (Source/target/transformations).
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occur while loading.
  • Involved in Oracle PL/SQL query optimization to reduce the overall run time of stored procedures.
  • Created UNIX shell scripts to invoke the Informatica Workflows & Oracle stored procedures.
  • Created UNIX shell scripts to move files, archive files, and FTP data files to other downstream applications (see the sketch after this list).
  • Wrote ad-hoc SQL queries for reporting as per the requirement.
  • Involved in unit testing of various objects (Informatica Workflow/Oracle stored procedures/UNIX scripts).
  • Supported various testing cycles during the SIT & UAT phases.
  • Involved in creation of initial data set up in the production environment and involved in code migration activities to production.
  • Supported the daily/weekly ETL batches in the Production environment.
  • Responded promptly to business user queries and change requests.
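
A minimal sketch of the archive-and-FTP style of script referenced above (host, directories, file name, and login are illustrative placeholders; a real script would not hard-code credentials):

#!/bin/ksh
# Compress the day's feed into the archive area, then push the original file
# to a downstream application over FTP.
OUT_DIR=/data/mis/outbound                 # placeholder path
ARCH_DIR=/data/mis/archive                 # placeholder path
FILE=mis_feed_$(date +%Y%m%d).csv          # placeholder file name

gzip -c "$OUT_DIR/$FILE" > "$ARCH_DIR/$FILE.gz"

ftp -inv downstream-host <<EOF             # placeholder host
user ftp_user ftp_password
lcd $OUT_DIR
cd /incoming/mis
put $FILE
bye
EOF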

Environment: Informatica Power Center 8.5, Teradata 11.X, Oracle 10g, COGNOS 8, UNIX, Version One, MS SharePoint.
