
Informatica Developer Resume


CA

SUMMARY

  • Over 8 years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica Power Center 9.x/8.x/7.1.3/7.1.1/6.2, including Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
  • Experience working with Power Center Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
  • Knowledge in Full Life Cycle development of Data Warehousing.
  • Strong understanding of OLAP and OLTP Concepts.
  • Understand business rules completely based on high-level design specifications and implement the corresponding data transformation methodologies.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow (see the sketch after this list).
  • Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Experience in SQL performance tuning.
  • Experience in maintaining Data Concurrency, Replication of data.
  • Thorough Knowledge in creating DDL, DML and Transaction queries in SQL for Oracle database.
  • Experience in using SQL developer, SQL*Plus, Procedures/Functions and Performance tuning.
  • Extensive experience with Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Normalizer, Union and XML Source Qualifier.
  • Responsible for the Data Cleansing of Source Data using LTRIM and RTRIM operations of the Expression Transformation.
  • Database development and ETL processes in Oracle 10g and Greenplum using PSQL, SQL, Perl and UNIX scripting.
  • Expertise in RDBMS concepts, with hands on exposure in the development of relational database environment using SQL, PL/SQL, Cursors, Stored Procedures, Functions and Triggers.
  • Experience in working with Perl Scripts for handling data coming in Flat files.
  • Experience working with UNIX Shell Scripts for automatically running sessions, aborting sessions and creating parameter files.
  • Strong grasp of relational database design concepts.
  • Used Teradata utilities such as FastLoad, MultiLoad and FastExport.
  • Performed data validation by Unit testing, integration testing and System Testing.
  • Extensive experience in managing teams/On Shore - Off Shore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
  • Experience with managing ODBC connections.
  • Good experience with Informatica Performance Tuning.
  • Good knowledge of Data modeling techniques like Dimensional/ Star Schema, Snowflake modeling, slowly changing Dimensions using Erwin.
  • Flexible, enthusiastic and project oriented team player with solid communication and leadership skills to develop creative solution for challenging client needs.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people and developers across multiple disciplines.
  • Able to work independently and collaborate proactively & cross functionally within a team.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
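
As a rough illustration of the workflow-control scripting mentioned in the summary, the sketch below starts an Informatica workflow with the standard pmcmd utility and checks its return code. The service, domain, folder, workflow, and credential names are placeholders, not values from any specific project.

```bash
#!/bin/bash
# Minimal sketch: start an Informatica workflow with pmcmd and check the result.
# All names below are hypothetical placeholders.

INFA_USER="etl_user"          # hypothetical repository user
INFA_PWD="$PM_PASSWORD"       # password expected in the environment
DOMAIN="Domain_ETL"           # hypothetical Informatica domain
INT_SVC="IS_ETL"              # hypothetical Integration Service
FOLDER="SALES_DW"             # hypothetical repository folder
WORKFLOW="wf_load_sales"      # hypothetical workflow name

pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"

RC=$?
if [ "$RC" -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $RC" >&2
    exit "$RC"
fi
echo "Workflow $WORKFLOW completed successfully."
```

In practice a wrapper like this would be kicked off by the job scheduler (Autosys, Tidal, LSF) rather than run by hand.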

TECHNICAL SKILLS

Tools: Informatica Power Center 9.5/9.1/8.6.1/8.5/8.1.1, Power Exchange 9.5/9.1/8.6, Informatica IDQ, Informatica Data Replication Console, Talend.

Databases: Oracle 11g/10g/9i, MS SQL Server 2008/2012, MS Access, DB2, Teradata 14/13/12/V2R6/V2R5, Sybase, Greenplum 4.0/4.1.

Reporting Tools: Business Objects XI R2, SAP BI 7.0, OBIEE 11g/10g, Cognos, SharePoint

Languages: C, C++, SQL, PL/SQL, HTML, PHP, Java, UNIX Scripting

Other Tools: TOAD, Harvest SCM, PuTTY, Tidal, Autosys, ESP, Control-M, Informatica MDM Hub Console, Informatica Hierarchy Manager (HM).

Operating Systems: Linux, Sun Solaris, Windows 7/XP/2000/98

PROFESSIONAL EXPERIENCE

Confidential, CA

Informatica Developer

Responsibilities:

  • Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.
  • Used Informatica Power Center 9.5 for extracting, transforming and loading data from heterogeneous source systems into the target database.
  • Worked with business users and architects to understand requirements and accelerate progress toward project milestones.
  • Performed impact analysis of the existing RRP-Verint process and designed the solution for automating input and output processes to/from RRP.
  • Designed the above projects and frameworks and led the development teams to meet project timelines within budget.
  • Designed and developed the Informatica processes to send data to retail web services (RWS) and capture the response.
  • Worked with the global support technology relationship team to set up VDIs with the required software packages for offshore ETL development; onboarded the offshore team with training.
  • Effectively communicated with other technology and product team members.
  • Worked on performance improvement areas; debugged issues and delivered solutions that reduced processing times.
  • Created mappings to cleanse the data and populate it into Staging tables, moved data from Staging to Archive and then to the Enterprise Data Warehouse by transforming it per business needs, and populated the Data Mart with only the required information.
  • Extensively used Informatica Power Center tools and transformations such as Lookups, Aggregator, Joiner, Rank, Update Strategy, and Mapplets, along with connected and unconnected Stored Procedures/functions, SQL overrides in Lookups, and source filters in Source Qualifiers.
  • Created pre/post-session commands and pre/post SQL in sessions and mappings on the target instance.
  • Responsible for performance tuning at various levels during development.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extracted/loaded data from/into diverse source/target systems like SQL server, XML and Flat Files.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets for better performance.
  • Developed processes for Teradata using shell scripting and RDBMS utilities such as MLoad and FastLoad (Teradata).
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session performance management.
  • Responsible for migrating the workflows from development to production environment.
  • Worked with parameter files, mapping variables, and mapping parameters for incremental loading (a parameter-file sketch follows this list).
  • Managed post-production issues and delivered all assignments/projects within specified timelines.
  • Extensive use of Persistent cache to reduce session processing time.
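
A minimal sketch, under assumed names, of the parameter-file approach to incremental loading referenced above: a wrapper writes the parameter file with a $$LAST_RUN_DATE mapping parameter before the workflow runs. Folder, workflow, session, parameter, and connection names are hypothetical.

```bash
#!/bin/bash
# Minimal sketch: generate an Informatica parameter file that drives an
# incremental load via a $$LAST_RUN_DATE mapping parameter.
# Paths and repository object names are placeholders; GNU date is assumed.

PARAM_DIR="/infa/params"                      # hypothetical parameter-file location
PARAM_FILE="$PARAM_DIR/wf_incr_load.param"
LAST_RUN_FILE="$PARAM_DIR/last_run_date.txt"  # persisted by the previous run

# Default to yesterday if no previous run date has been recorded.
LAST_RUN=$(cat "$LAST_RUN_FILE" 2>/dev/null)
[ -z "$LAST_RUN" ] && LAST_RUN=$(date -d "yesterday" '+%Y-%m-%d')

cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_incr_load.ST:s_m_load_orders]
\$\$LAST_RUN_DATE=$LAST_RUN
\$DBConnection_Src=Rel_Oracle_Src
\$DBConnection_Tgt=Rel_Oracle_Tgt
EOF

# Record the current date so the next run only picks up newer rows.
date '+%Y-%m-%d' > "$LAST_RUN_FILE"
```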

Environment: Informatica Power Center 9.5/8.6.1, Oracle 11g/10g, SQL Server 2008, Linux, LSF (job scheduling), UNIX shell scripting, SAS Enterprise Guide, and MS Visual Studio (TFS) for version control.

Confidential, San Jose, CA

ETL Developer/ Informatica Developer

Responsibilities:

  • Coordinated with business users for requirement gathering and business analysis to understand the business requirements and prepare Technical Specification Documents (TSD) for coding ETL mappings for new requirement changes.
  • Involved in Data Analysis.
  • Performed estimation, requirements analysis, mapping document design, and planning for Informatica ETL.
  • Involved in designing STAR Schema for the business processes.
  • Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, the Mapping Designer to map the sources to the target.
  • Responsible for the development, support and maintenance of ETL (Extract, Transform and Load) processes using Informatica Power Center 8.6.1.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.
  • Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
  • Developed the transformation logic for both reference as well as transactional data flow from Legacy to New, and New to Legacy systems.
  • Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager.
  • Extensively worked on Autosys to schedule the jobs for loading data.
  • Worked on Power Exchange for change data capture (CDC).
  • Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager, accessing mainframe DB2 and AS400 systems.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for pre- and post-session performance management (see the sketch after this list).
  • Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
  • Involved in Informatica Data Masking & Data Subset Data Mapping.
  • Executed conversion maintenance on existing legacy system.
  • Identified and fixed bottlenecks and tuned mappings and sessions to improve performance; tuned both the ETL processes and the databases.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
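
A hedged sketch of the pre-/post-session index management described above: a shell wrapper around SQL*Plus that an Informatica pre/post-session command could call. The connect string, schema, table, and index names are placeholders.

```bash
#!/bin/bash
# Minimal sketch: drop indexes before a bulk load ("pre") and rebuild them
# afterwards ("post") via SQL*Plus. Object names are placeholders.

ORA_CONN="etl_user/${ORA_PWD}@DWPROD"   # password expected in the environment
ACTION="$1"                             # "pre" drops indexes, "post" rebuilds them

case "$ACTION" in
  pre)
    sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
DROP INDEX dw.idx_fact_sales_dt;
DROP INDEX dw.idx_fact_sales_cust;
EXIT
EOF
    ;;
  post)
    sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
CREATE INDEX dw.idx_fact_sales_dt   ON dw.fact_sales (sale_date);
CREATE INDEX dw.idx_fact_sales_cust ON dw.fact_sales (customer_id);
EXIT
EOF
    ;;
  *)
    echo "Usage: $0 {pre|post}" >&2
    exit 1
    ;;
esac
```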

Environment: Informatica Power Center 8.6.1, Oracle 10g, Power Exchange 8.6, MS SQL Server 2008, UNIX (Sun Solaris 5.8/AIX), SSRS, UltraEdit-32, Data Marts, Erwin Data Modeler 4.1, FTP, MS Excel, MS Access, DB2, UNIX Shell Scripting, Data Modeling, PL/SQL, Autosys.

Confidential, Norwood, MA

Informatica Developer

Responsibilities:

  • Extracted data from various heterogeneous sources like Oracle, SQL Server, MS Access and Flat files.
  • Experience working with the complete Software Development Life Cycle of the application.
  • Involved in monitoring and maintaining UNIX server performance.
  • Involved in creating database tables, views, triggers.
  • Worked with PL/SQL to create new stored procedures and modify existing procedures per change requests from users.
  • Created many PL/SQL batches and executed them through UNIX shell scripts.
  • Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
  • Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
  • Optimized SQL queries for better performance.
  • Wrote Teradata scripts and handed them to DBAs for implementation against production database tables.
  • Created pre-SQL and post-SQL scripts to be run at the Informatica level.
  • Implemented various loads such as daily, weekly, and quarterly loads using an incremental loading strategy (see the sketch after this list).
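
An illustrative sketch, under assumed workflow and service names, of how the daily/weekly/quarterly incremental loads above could be selected and launched from a shell wrapper; in practice the calendar logic typically lives in the scheduler.

```bash
#!/bin/bash
# Minimal sketch: choose the daily, weekly, or quarterly incremental load
# based on the calendar and start the matching workflow with pmcmd.
# Workflow, service, domain, and folder names are hypothetical.

DOW=$(date +%u)      # 1 = Monday ... 7 = Sunday (GNU date assumed)
DOM=$(date +%d)      # day of month
MONTH=$(date +%m)

WORKFLOW="wf_incr_daily"
[ "$DOW" -eq 7 ] && WORKFLOW="wf_incr_weekly"
# First day of a new quarter (Jan/Apr/Jul/Oct): run the quarterly load instead.
if [ "$DOM" = "01" ] && [ $((10#$MONTH % 3)) -eq 1 ]; then
    WORKFLOW="wf_incr_quarterly"
fi

pmcmd startworkflow -sv IS_ETL -d Domain_ETL \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FIN_DW -wait "$WORKFLOW" || exit 1
```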

Environment: Informatica Power Center 8.6, Oracle 10g, PL/SQL, PL/SQL Developer, MS Access, DataFlux, BPM (Business Process Management), WinCVS, Windows XP, DB2, UNIX

Confidential

Application ETL Developer

Responsibilities:

  • Responsible for design and development of Sales Data Warehouse.
  • Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
  • Extracted data from heterogeneous sources like Oracle and SQL Server.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
  • Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Created Mapplets and used them in different Mappings.
  • Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
  • Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.
  • Worked with SAP and Oracle sources to process the data.
  • Worked on SAP data migration for Human Resources and Finance, converting various objects for Organizational Structure, Addresses, Time, Basic Pay, Bank Details, Recurring Payments, Tax Assignment, Insurance Plans, Payroll, etc., to generate reports from the SAP BI system.
  • Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.
  • Created Unix Shell Scripts to automate sessions and cleansing the source data.
  • Implemented pipeline partitioning concepts like Round-Robin, Key-Range and Pass Through techniques in mapping transformations.
  • Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
  • Worked on data masking activities for cloning the GDW for several buyers.
  • ETL and data integration experience developing mappings and scripts using Informatica and Teradata V2R6/V2R5 (a FastLoad sketch follows this list).
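
A minimal FastLoad sketch for the kind of Teradata bulk load referenced above, driven from a shell wrapper. The TDPID, credentials, database, table, and column names are placeholders.

```bash
#!/bin/bash
# Minimal sketch: bulk-load a pipe-delimited flat file into a Teradata staging
# table with FastLoad. All object names and paths are placeholders.

DATA_FILE="/data/incoming/sales.dat"    # hypothetical input file
LOG_FILE="/logs/fload_stg_sales.log"

fastload > "$LOG_FILE" 2>&1 <<EOF
SESSIONS 4;
LOGON tdprod/etl_user,${TD_PWD};
SET RECORD VARTEXT "|";
DEFINE sale_id  (VARCHAR(20)),
       cust_id  (VARCHAR(20)),
       sale_amt (VARCHAR(20))
FILE = ${DATA_FILE};
BEGIN LOADING dw_stg.stg_sales
    ERRORFILES dw_stg.stg_sales_err1, dw_stg.stg_sales_err2;
INSERT INTO dw_stg.stg_sales (sale_id, cust_id, sale_amt)
VALUES (:sale_id, :cust_id, :sale_amt);
END LOADING;
LOGOFF;
EOF

# Teradata utilities return 0 on success, 4 on warnings, 8/12 on errors.
RC=$?
[ "$RC" -gt 4 ] && { echo "FastLoad failed, see $LOG_FILE" >&2; exit "$RC"; }
exit 0
```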

Environment: Informatica Power Center 8.6, Oracle 9i, Teradata V2R6, SAP, SAP BI 7.0, SQL Server, Sun Solaris.

Confidential

Data warehouse Developer

Responsibilities:

  • Worked with advanced PL/SQL concepts like collections, PL/SQL objects and records, and used them in building procedures and functions.
  • Worked with the data export and import wizard to transfer data between different schemas in Oracle using TOAD.
  • Worked with various complex queries, sub queries and joins to check the validity of loaded and imported data.
  • Worked on data transformation and retrieval from other databases into Oracle using SQL*Loader and control files (see the sketch after this list).
  • Developed many procedures and functions according to requirements, and built new packages with cursors.
  • Good knowledge in understanding existing packages and creating updated versions of them to resolve issues.
  • Involved in creating stored procedures, functions, triggers and developed monthly and weekly reports.
  • Good knowledge of database links; worked on tables and various types of views, such as inline and materialized views, across different databases by linking them.
  • Worked with Export and TEXT_IO packages and wrote many stored procedures in Forms and Reports. Worked with UNIX shell and Perl scripts to generate daily reports.
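
A small sketch of the SQL*Loader approach referenced above: a shell script writes a control file and invokes sqlldr against a staging table. The file paths, connect string, and table/column definitions are hypothetical.

```bash
#!/bin/bash
# Minimal sketch: load a delimited extract into an Oracle staging table with
# SQL*Loader. Control-file contents and connection details are placeholders.

CTL_FILE="/tmp/load_stg_report.ctl"

cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE '/data/incoming/report_extract.csv'
APPEND
INTO TABLE stg_report
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( report_id,
  report_dt  DATE "YYYY-MM-DD",
  amount )
EOF

sqlldr userid="rpt_user/${ORA_PWD}@REPDB" \
       control="$CTL_FILE" \
       log=/logs/load_stg_report.log \
       bad=/logs/load_stg_report.bad \
       errors=50

# SQL*Loader exit codes: 0 = success, 2 = warnings, 1/3 = errors.
[ $? -le 2 ] || exit 1
```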

Environment: Oracle 9i, Shell Scripting, TOAD, UNIX, Windows NT, Windows 2000.
