
Informatica/Tableau Reports Developer Resume


IL

SUMMARY

  • 8+ years of overall experience in the Information Technology industry.
  • Extensively worked with large EDW systems across functional areas including automobile, manufacturing, consumer products, order management, and health insurance.
  • Experience in BI / data warehouse projects involving Informatica Power Center, Informatica Data Quality, and Tableau data visualization products.
  • Strong experience in all phases of the Software Development Life Cycle (SDLC) using Waterfall, Agile/Scrum, and RUP (Rational Unified Process), and in the Software Testing Life Cycle (STLC).
  • Strong experience with the Scrum methodology and well versed in writing user stories. Excellent understanding of gathering business requirements, translating requirements into specifications, and application design.
  • Proficient in Agile Scrum methodologies; performed the Scrum Master role, facilitating sprint and stand-up sessions, and used Excel extensively to write user stories, analyze iteration burn-down charts, and review defects.
  • Experience in preparing and documenting the User Acceptance test (UAT) plan and obtaining the necessary signoffs from the concerned business units.
  • Worked on various projects through all phases of Data Warehouse Development Life Cycle, including, requirement gathering and analysis, design, development, testing, performance tuning, and production support.
  • Good experience in managing, scheduling, and monitoring Informatica workflows.
  • Experience in data warehousing using various ETL tools (Informatica 6.1/7.1/8.x/9.x).
  • Vast experience designing and developing complex mappings with varied transformation logic such as Unconnected and Connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy, as well as Data Transformation Services.
  • Experience in working with data warehouses and data marts using Informatica Power center (Designer, Repository Manager, Workflow Manager, and Workflow Monitor).
  • Designed and maintained Informatica Power Center mappings for extraction, transformation, and loading between Oracle, Teradata, and Sybase databases.
  • Extensively followed Ralph Kimball and Bill Inmon methodologies. Designed data mart models with Erwin using the star schema methodology. Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin. Designed and customized data models for data warehouses supporting data from multiple sources in real time.
  • Understanding & Working knowledge of Informatica CDC (Change Data Capture).
  • Developed strategies for the Extraction, Transformation, and Loading (ETL) mechanism.
  • Extensive knowledge of dimensional data modeling, star and snowflake schemas, and fact and dimension tables.
  • 2 years of experience working with Informatica Big Data Edition on Hadoop (Hortonworks).
  • Experience delivering real-world big data solutions using the Hortonworks Hadoop distribution.
  • Extensive work with PL/SQL and Oracle performance tuning.
  • Built cubes using Microsoft Analysis Services.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Hands-on experience with the Informatica Data Quality (8.x/9.x/10.0) toolset, and proficient in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching, and data quality exception monitoring and handling.
  • Vast experience designing and developing complex mappings with varied transformation logic such as Address Validator, Case Converter, Standardizer, Labeler, Parser, Match, Classifier, and Comparison transformations, as well as Address Doctor.
  • Experienced with Informatica PowerExchange (5.x/8.x/9.x) for Loading/Retrieving data from mainframe systems.
  • Extensive experience with data validation tools (DataFlux, Informatica DVO, Spoon).
  • Extensive Unix Shell Scripting knowledge.
  • Extensive experience executing jobs in BMC Control-M Desktop and Control-M Enterprise (7.0.0/8.0.0).
  • Extensive experience with relational databases (Oracle 8i/9i/10g, Teradata V2R6, SQL Server 2000/2005) and with SSIS.
  • Extensive experience in Tableau Desktop, Tableau Server, and Tableau Reader across Tableau versions 7.x/8.x/9.x.
  • Extensive knowledge of various reporting objects in Tableau, such as facts, attributes, hierarchies, transformations, filters, prompts, calculated fields, sets, groups, and parameters.
  • Worked on the development of dashboard reports on key performance indicators (KPIs) for top management.
  • Experience creating different visualizations using bars, lines, pies, maps, scatter plots, Gantt charts, bubbles, histograms, bullets, heat maps, and highlight tables.
  • Experience in Load balancing /optimizing resource usage and defining Environment Variables for performance tuning.
  • Proficient in network installations and troubleshooting.
  • Reliable, responsible, hardworking, and a good team player.

PROFESSIONAL EXPERIENCE

Confidential, IL

Informatica/Tableau Reports Developer

Responsibilities:

  • Successfully used the Agile/Scrum method for gathering requirements and facilitated user stories.
  • Documented user stories and facilitated story point discussions to analyze the level of effort on project specifications.
  • Worked as Scrum Master, leading daily stand-ups and scrum ceremonies for two scrum teams.
  • Worked with product owners to groom the backlog and plan sprints; tracked, escalated, and removed impediments.
  • Reported at daily Scrum of Scrums meetings; tracked burn-down, issues, and progress.
  • Worked with component teams to resolve issues.
  • Worked on Informatica Power Center tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Worked on the Informatica Developer tool (IDQ) and Tableau reporting.
  • Parsed high-level design specifications into simple ETL coding and mapping standards using Informatica. Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of repositories and folders.
  • Created mapping documents and Visio diagrams using the Power Center architecture Visio template to outline data flow from sources to targets.
  • Extensively used all the features of Informatica versions 9.6.x and 9.1.x, including Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Extracted data from flat files and DB2 and loaded it into Oracle. Extensively used transformations such as Filter, Router, Sequence Generator, Lookup, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
  • Improved performance using Oracle partitions, indexes, and other performance tuning techniques. Developed reusable components in Informatica, Oracle, and UNIX.
  • Created flexible mappings/sessions using parameters and variables, making heavy use of parameter files.
  • Improved session run times by partitioning the sessions; was also involved in database fine-tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
  • Set up batches and sessions to schedule the loads at the required frequency using the Power Center Server Manager.
  • Developed slowly changing dimension (SCD) Type 2 mappings for loading data into dimensions and facts (the close-out/insert pattern is sketched after this list).
  • Worked on several data mart projects as a senior data warehouse architect; was involved in the complete system development life cycle, from requirement specifications to delivering the tested product.
  • Extensively worked with SQL queries. Created cursors, functions, stored procedures, packages, triggers, views, and materialized views using PL/SQL programming.
  • Extensively worked on performance tuning of Oracle.
  • Extensively used Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.), and various join methods (hash join, sort-merge, nested loops) to improve performance.
  • Sourced data from DB2 into Oracle using Fast Export and Oracle SQL*Loader.
  • Worked on projects using Informatica with Hadoop (Hive, Sqoop, Oozie, Pig, and Python).
  • Worked with large data volumes (billions of rows) using Netezza features: distribution, organize, nz_migrate, groom, and statistics.
  • Involved in performance tuning of multiple Informatica mappings to improve performance as the daily volume of internet orders entering the system grew. The techniques involved were pushdown optimization (PDO), bulk loading, and adding indexes on the staging tables.
  • Responsible for creating, testing, and deploying multiple ESP (External Scheduling Processor) schedules on development, testing, and production mainframe boxes.
  • Profiled the data using Informatica Data Quality (IDQ) for data quality measurement.
  • Worked with the product owner to gather requirements; analyzed profiling results and made recommendations for improvements; built data profiling, data cleansing, and data validation plans.
  • Worked on creating complex mappings with varied transformation logic such as Address Validator, Case Converter, Standardizer, Labeler, Parser, Match, Classifier, and Comparison transformations.
  • Created ad-hoc reports and migrated existing reports in Tableau.
  • Worked closely with business power users to create reports/dashboards using Tableau Desktop and Tableau Server.
  • Created a heat map showing current service subscribers by color, broken into regions, allowing business users to understand where the most and fewest customers are located for a dealer in his geographic area using closest-dealer logic.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
  • Utilized advanced features of Tableau to link data from different connections together on one dashboard and to filter data in multiple views at once.
  • Worked extensively on creating visualization dashboards using bars, lines, pies, maps, scatter plots, Gantt charts, bubbles, histograms, bullets, heat maps, and highlight tables; worked with off-shore teams during development.
  • Conducted UAT sessions with stakeholders and business users to validate the system and obtain approvals.
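
A minimal sketch of the Type 2 close-out/insert pattern referenced above, in Oracle SQL. The STG_CUSTOMER and DIM_CUSTOMER tables, their columns, and the DIM_CUSTOMER_SEQ sequence are illustrative assumptions, not objects from the actual project; in Power Center the same logic is typically split across Lookup, Expression, and Update Strategy transformations.

    -- Step 1: close out current dimension rows whose tracked attributes
    -- changed in the latest staging extract (all names assumed).
    UPDATE dim_customer d
       SET d.end_date  = TRUNC(SYSDATE) - 1,
           d.curr_flag = 'N'
     WHERE d.curr_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.cust_id = d.cust_id
                      AND (s.cust_name <> d.cust_name OR s.region <> d.region));

    -- Step 2: insert a new current version for new and changed customers.
    INSERT INTO dim_customer
           (cust_key, cust_id, cust_name, region, eff_date, end_date, curr_flag)
    SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.region,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.cust_id = s.cust_id
                          AND d.curr_flag = 'Y');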

Environment: Informatica Power Center 9.1.0/9.6.1, Informatica IDQ Developer Tool 9.6.1, Tableau 8.1/9.1/9.2, DB2, Oracle 11i, PL/SQL, Toad, Erwin 3.5.2, ReportNet 1.1, UNIX, Shell Scripting, Windows XP, MS Access, Microsoft Visio.

Confidential, Bloomington, IL

Sr. Informatica Developer / Data Validation Analyst

Responsibilities:

  • Understood and prepared high- and low-level designs based on the functional and business requirements of Horizontal Data Movement (HDM).
  • Interacted with the requirements team and designers to gain knowledge of the business logic; reviewed Master Data Workbooks for a better understanding of the business logic.
  • Performed a study to select an industry-standard tool for data validation.
  • Involved in data validation, data integrity, database performance, field-size validation, check constraints, and data manipulation and updates using SQL (a source-to-target reconciliation query of the kind used is sketched after this list).
  • Extracted data from flat files and DB2 and loaded the data into Oracle and Salesforce databases. Extensively used transformations such as Filter, Router, Sequence Generator, Lookup, Update Strategy, Joiner, Source Qualifier, Expression, Sorter, and Aggregator.
  • Improved performance using Oracle partitions, indexes, and other performance tuning techniques. Developed reusable components in Informatica, Oracle, and UNIX.
  • Worked on web services lists, the xsd.exe tool, the Web Services Consumer transformation, and the XML Parser transformation.
  • Validated the ongoing data synchronization process using validation tools to ensure the data in the source and target systems stayed synchronized; connected to HP Service Manager (via its exposed API) through DVO, reviewed the results, and automatically opened an HP Service Manager ticket for any deviation (through DVO).
  • Used DataFlux Data Management Studio to identify and resolve problematic data; standardized, normalized, and transformed data; and validated data to improve overall accuracy.
  • Used DataFlux MDM to integrate data from multiple sources into a single, reliable, and accurate master record and to extend data management routines throughout the information infrastructure.
  • Built data profiling, data cleansing, and data validation plans using Informatica IDQ.
  • Developed many custom rules for parsing and for other anomalies caused by source mainframe limitations.
  • Built a reusable staging area in Oracle for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Provided one-off profiling, validation, and duplicate-suspect reports to various business areas such as Warranty and Global Supplier Management.
  • Built an Access database to consolidate engine error data from spreadsheets and text-file logs for analysis.
  • Worked on Address Validator, Case Converter, Standardizer, Labeler, Parser, Match, Classifier, and Comparison transformations.
  • Attended Informatica Data Quality version 9 training with Informatica.
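
A minimal sketch of the kind of source-to-target reconciliation query used for the synchronization checks above; DVO automates the same idea with table-pair tests. SRC_POLICY (a staging copy of the source extract), TGT_POLICY, and the PREMIUM_AMT measure are illustrative assumptions.

    -- Compare row counts and a money-column checksum between source and
    -- target; any difference is a deviation to investigate (names assumed).
    SELECT s.row_cnt AS src_rows,
           t.row_cnt AS tgt_rows,
           s.amt_sum AS src_amt,
           t.amt_sum AS tgt_amt,
           CASE WHEN s.row_cnt = t.row_cnt AND s.amt_sum = t.amt_sum
                THEN 'IN SYNC' ELSE 'DEVIATION' END AS status
      FROM (SELECT COUNT(*) AS row_cnt, NVL(SUM(premium_amt), 0) AS amt_sum
              FROM src_policy) s,
           (SELECT COUNT(*) AS row_cnt, NVL(SUM(premium_amt), 0) AS amt_sum
              FROM tgt_policy) t;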

Environment: Informatica Power Center 8.6.1/8.1.1/9.0.0, Informatica Developer (IDQ 9.1), DataFlux 7.0/8.0, Control-M 7/8, Talend Open Studio for Data Integration 5.1, SQL, PL/SQL, SQL Server 2008, UNIX Shell Scripting, Autosys Workload Scheduler, TOAD 9.7.2, Oracle 10g/11g, Windows NT Server, HP-UX 10.2, TKPROF, RMAN, STATSPACK, Server Manager, SQL*Loader, Salesforce, SOQL, Web services, xsd.exe.

Confidential, San Antonio, TX

Sr. Informatica Developer

Responsibilities:

  • Involved in requirements gathering and analysis: understanding the business requirements, identifying the flow of information, and analyzing the existing systems.
  • Designed a distributed architecture for new data marts for HR, Sales, 401K, etc.
  • Analyzed the source data coming from Oracle, flat files, and DB2; coordinated with the data warehouse team in developing the dimensional model.
  • Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts: staging tables, dimensions, facts, and star schemas.
  • Integrated various data sources such as Oracle, SQL Server, and fixed-width and delimited flat files.
  • Extensively worked on Informatica Power Center tools: Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Developed slowly changing dimension (SCD) Type 2 mappings for loading data into dimensions and facts.
  • Worked on Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Stored Procedure, Sequence Generator, Filter, Sorter, and Source Qualifier.
  • Migration of New and Changed Informatica objects across the environments using Folder to Folder and Deployment Group methods.
  • Created shell scripts to run daily jobs and extract the files from remote location for data loads.
  • Extensively used mapping variables, mapping parameters, and parameter files for capturing delta loads (a sample Source Qualifier override is sketched after this list).
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
  • Used web services, CSV files, and tables as sources and loaded the data into relational tables.
  • Used the xsd.exe tool to generate XSDs from the XML produced by SharePoint.
  • Also used the XML Parser transformation to parse the generated XSDs and load the data accordingly.
  • Created reusable mappings, mapplets, transformations, and parameter files for the data quality plan.
  • Improved Informatica processes by creating indexes and by using cache files and session-level partitions in Informatica.
  • Created, updated and maintained ETL technical documentation for Production Migration.
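
A sketch of the kind of Source Qualifier SQL override behind the delta loads above. The ORDERS table and its LAST_UPD_TS audit column are illustrative assumptions; $$LAST_EXTRACT_DATE is a mapping variable resolved from the parameter file and advanced (e.g., via SETMAXVARIABLE) as rows flow through the mapping.

    -- Pull only rows changed since the last successful run; Power Center
    -- substitutes $$LAST_EXTRACT_DATE from the parameter file at run time.
    SELECT ord.order_id,
           ord.customer_id,
           ord.order_amt,
           ord.last_upd_ts
      FROM orders ord
     WHERE ord.last_upd_ts >
           TO_DATE('$$LAST_EXTRACT_DATE', 'YYYY-MM-DD HH24:MI:SS')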

Environment: Informatica Power Center 9.x/8.x, Oracle 10g, PL/SQL, PL/SQL Developer, DataFlux, BPM (Business Process Management), WinCVS, Windows XP, DB2, UNIX

Confidential, Lyndhurst, NJ

Informatica Developer

Responsibilities:

  • Gathered requirements, performed analysis, and designed technical specifications for the data migration according to the business requirements.
  • Developed logical and physical dimensional data models using ERWIN 7.1
  • Developed test cases and test plans to complete unit testing; supported system testing.
  • Developed complex mappings, including SCD Type 1, Type 2, and Type 3 mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Normalizer, Filter, Rank, Router, Stored Procedure, XML, and SQL.
  • Responsible for normalizing COBOL files using the Normalizer transformation.
  • Responsible for testing, modifying, debugging, documenting and implementation of Informatica mappings.
  • Performed metadata validation, reconciliation and appropriate error handling in ETL processes
  • Troubleshoot data issues, validated result sets, recommended and implemented process improvements.
  • Used ANALYZE, DBMS_STATS, EXPLAIN PLAN, SQL trace, SQL hints, and TKPROF to tune SQL queries (an illustrative tuning pass is sketched after this list).
  • Extensively used Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.), and various join methods (hash join, sort-merge, nested loops) to improve performance.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Hands on experience with Informatica Metadata Manager
  • Designed, developed, and managed Power Center upgrades from v7.x to v8.6; migrated ETL code from Informatica v7.x to v8.6; integrated and managed the Power Exchange CDC workload.
  • Extensively worked with incremental loading using Parameter Files, Mapping Variables and Mapping Parameters.
  • Developed user-defined functions (UDFs) to extract data from flat files.
  • Developed and modified UNIX Korn shell scripts to meet the requirements after system modifications, and was also involved in monitoring and maintaining the batch jobs.
  • Worked with Autosys for job scheduling.
  • Managed work assignments and coordinated within the development team
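
An illustrative tuning pass of the kind described above, using assumed FACT_SALES and DIM_STORE tables: capture the optimizer plan, index a low-cardinality dimension column, and refresh statistics so the optimizer sees the change.

    -- 1. Capture the execution plan for the slow join (names assumed).
    EXPLAIN PLAN FOR
    SELECT /*+ USE_HASH(f d) */ d.region, SUM(f.sales_amt)
      FROM fact_sales f
      JOIN dim_store d ON d.store_key = f.store_key
     GROUP BY d.region;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- 2. Low-cardinality column in a read-mostly dimension: bitmap index.
    CREATE BITMAP INDEX dim_store_region_bix ON dim_store (region);

    -- 3. Refresh statistics so the optimizer can cost the new index
    --    (EXEC is the SQL*Plus shorthand for a PL/SQL call).
    EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'DIM_STORE', cascade => TRUE);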

Environment: Informatica Power Center 7.1.x/8.6, Oracle 10g & 11g, PL/SQL, Cognos 8.3, Toad, UNIX, Erwin 7.1, Windows XP, Autosys.

Confidential, Mason, OH

Informatica Developer

Responsibilities:

  • Evaluated source systems, standardized the received data format, and understood business/data transformation rules, business structure and hierarchy, and relationships; transformed data through mapping development, and validated and tested the mappings.
  • Developed logical and physical models per business requirements using ERWIN 4.0.
  • Performed requirements analysis and developed technical design specifications for the data migration according to the business requirements.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Developed complex ETL structures to extract, transform and load data from multiple data sources into data warehouse based on business requirements.
  • Extensively worked with SCD Type-I, Type-II and Type-III dimensions and data warehousing Change Data Capture (CDC).
  • Created mapplets for repeatedly used logic, such as date formatting and data-type conversion.
  • Ensured accurate, appropriate and effective use of data; including data definition, structure, documentation, long-range requirements, and operational guidelines.
  • Monitored ETL process activity and utilization, with particular strength in performance tuning highly transactional data integration packages in both development and production environments.
  • Responsible for troubleshooting and resolving issues related to system performance, Informatica applications, and data integrity.
  • Extensively worked with SQL queries; created stored procedures, packages, triggers, and views using PL/SQL programming (a representative procedure is sketched after this list).
  • Involved in optimization and performance tuning of Informatica objects and database objects to achieve better performance.
  • Wrote UNIX Korn shell scripts to back up the log files in QA and production.
  • Performed unit and integration testing and wrote test cases.
  • Worked extensively in defect remediation and supported the QA testing
  • Involved in periodic maintenance of environments by applying patches and content upgrades
  • Involved in code migration for versioned repositories
  • Involved in taking and restoring repository backups, starting and stopping Informatica services, and working with pmcmd commands.
  • Created drill-through reports and master-detail reports for the Sales department.
  • Extensively used prompts, filters, cascading prompts, calculations, conditional variables, multiple queries for data extraction in reports.
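
A representative sketch of the PL/SQL work noted above: a small audit-logging procedure for ETL runs. The ETL_RUN_LOG table and ETL_RUN_SEQ sequence are illustrative assumptions, not objects from the actual engagement.

    -- Logs one row per session run; callers pass the workflow name,
    -- row counts, and a status string (all object names assumed).
    CREATE OR REPLACE PROCEDURE log_etl_run (
        p_workflow    IN VARCHAR2,
        p_rows_read   IN NUMBER,
        p_rows_loaded IN NUMBER,
        p_status      IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO etl_run_log
               (run_id, workflow_nm, rows_read, rows_loaded, status, run_ts)
        VALUES (etl_run_seq.NEXTVAL, p_workflow, p_rows_read, p_rows_loaded,
                p_status, SYSTIMESTAMP);
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END log_etl_run;
    /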

Environment: Informatica Power Center 8.1, Informatica Power Exchange, Teradata, PL/SQL, SQL*Plus, SQL*Loader, Toad, UNIX, Windows XP, Erwin 4.0, Cognos 8.3.

Confidential

Jr. Informatica Developer

Responsibilities:

  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Involved in Database design, entity relationship modeling and dimensional modeling using Star schema.
  • Extensively used all the features of Informatica version 7.x, including Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Developed and modified UNIX shell scripts to reset and run Informatica workflows using pmcmd in the UNIX environment. Conversant with the Informatica API calls.
  • Developed ETL mappings to perform tasks such as validating file formats, business rules, database rules, and statistical operations (a sample staging rule check is sketched after this list).
  • Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Update Strategy, and Stored Procedure transformations.
  • Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
  • The project loaded and extracted procurement, order management, and accounting data from an Oracle database.
  • Performed tuning of Informatica sessions using optimization techniques such as database partitioning, increasing block size, data cache size, sequence buffer length, and target-based commit interval, as well as SQL overrides; identified the bottlenecks in mappings.
  • Created test cases for the above projects to provide an error-free solution. Monitored workflows and sessions using the Workflow Monitor.
  • Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.
  • Prepared mapping documents, data migration documents, and other project-related documents such as mapping templates and Visio diagrams.
  • Developed technical documents to ease improvements to the existing code.
  • Involved in data conversion, migration, integration, and quality profiling tasks.
  • Implemented filters, calculations, conditions, graphs, and prompts in Impromptu reports.
  • Developed reports in ReportNet 1.1 using Report Studio and Query Studio.
  • Created ad-hoc reports and migrated reports from Cognos Impromptu.
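
A sketch of the kind of staging-table rule check behind the validation mappings above, written against Oracle 9i (hence TRANSLATE rather than REGEXP_LIKE). STG_ORDERS and its columns are illustrative assumptions.

    -- Flag staging rows that fail format or business rules before the
    -- warehouse load; each row gets the first reason that applies.
    SELECT stg_row_id,
           CASE
               WHEN order_dt IS NULL
                   THEN 'MISSING ORDER DATE'
               WHEN LTRIM(TRANSLATE(order_amt_txt, '0123456789.', ' ')) IS NOT NULL
                   THEN 'NON-NUMERIC AMOUNT'  -- all-numeric strings reduce to NULL
               WHEN status_cd NOT IN ('OPEN', 'SHIPPED', 'CLOSED')
                   THEN 'UNKNOWN STATUS'
           END AS reject_reason
      FROM stg_orders
     WHERE order_dt IS NULL
        OR LTRIM(TRANSLATE(order_amt_txt, '0123456789.', ' ')) IS NOT NULL
        OR status_cd NOT IN ('OPEN', 'SHIPPED', 'CLOSED');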

Environment: Informatica Power Center 7.1.x, Oracle 9i, DB2, MS SQL Server 2005, PL/SQL, SQL*Plus, SQL*Loader, Cognos 7 Series, Toad, UNIX, Windows XP.
