
Informatica Developer Resume

SUMMARY

  • Over 10 years of hands-on experience in data warehousing with the ETL tools Informatica 9.6/9.1/8.6 and DataStage 8.7/7.5.2/7.1, working against Oracle 11g/10g and DB2.
  • Experienced in every phase of the software development life cycle (business requirements gathering, analysis, design, development, testing and management) for medium and large data warehouse projects in the Insurance, Retail and Manufacturing domains.
  • Expertise in data warehousing concepts such as the Bill Inmon and Ralph Kimball methodologies, fact tables and dimension tables.
  • Expertise in Informatica, with direct responsibility for the extraction, transformation and loading of data from multiple sources into the data warehouse.
  • Expertise in data modeling using ER diagrams, dimensional/hierarchical modeling, and Star Schema and Snowflake Schema design.
  • Expertise in application operations support for corporate data warehouse projects.
  • Excellent experience with Hadoop cluster setup.
  • Big Data Hadoop Administrator certified at Confidential Global Service Ltd.
  • ITIL Foundation V3 certified at Confidential.
  • Excellent team player and quick learner with a positive attitude and self-motivation.
  • Good analytical, programming, problem-solving and troubleshooting skills.
  • Excellent experience working with international clients in the United States, Germany and South Africa.

TECHNICAL SKILLS

ETL Tools: Informatica 9.6, 9.1 & 8.6; DataStage 8.7, 7.5.2 & 7.1

Database: DB2, Oracle 11g/10g/9.2

GUI & Tools: Visual Basic, Crystal Reports, TOAD, SQL Developer

Languages: UNIX shell script, VBScript, JavaScript, PL/SQL

Big Data Ecosystem Tools: Hive, Sqoop, Storm, Cassandra, Elasticsearch, Kibana visualization

Operating System: LINUX, UNIX, Windows XP/NT/2000

PROFESSIONAL EXPERIENCE

Confidential

Informatica Developer

Responsibilities:

  • Involved in implementing the end-to-end ETL life cycle.
  • Gathered requirements from the customer and prepared design documents such as the HLD and LLD.
  • Involved in designing the dimensional model (star schema, snowflake schema).
  • Developed mappings, sessions and workflows as per the requirements.
  • Monitored the daily jobs through the TWS tool and handled aborted jobs while loading data into the data warehouse; a restart sketch follows this list.
  • Planned, monitored and tracked development; discovered application bugs, identified root causes and fixed them.
  • Implemented performance-tuning logic on sources, targets, mappings and sessions to provide maximum efficiency and performance.
  • Unit tested all ETL components and submitted the results.
  • Coordinated with ETL testing to fix defects; performed sanity checks after QA migration.
  • Participated in peer code reviews.
  • Coordinated with business analysts and developers to discuss issues in interpreting the requirements.
  • Assisted new team members in ramping up their Informatica skills.
  • Ensured on-time, defect-free delivery and achieved a high level of client satisfaction.
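
A minimal sketch of the kind of shell wrapper used around Informatica's pmcmd utility for the restart step; the service, domain, folder and workflow names below are hypothetical placeholders, not the project's actual values:

#!/bin/sh
# Hypothetical names; real values came from the project's TWS job definitions.
SVC=INT_SVC; DOM=DOM_DW; FOLDER=DWH_FOLDER; WF=wf_daily_load

# Query the last run of the workflow; pmcmd exits non-zero on failure.
pmcmd getworkflowdetails -sv "$SVC" -d "$DOM" -u "$PM_USER" -p "$PM_PASS" \
    -f "$FOLDER" "$WF"

# If the workflow aborted, restart it from the beginning.
if [ $? -ne 0 ]; then
    pmcmd startworkflow -sv "$SVC" -d "$DOM" -u "$PM_USER" -p "$PM_PASS" \
        -f "$FOLDER" "$WF"
fi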

Environment: Informatica 9.6, Oracle 10g, PL/SQL, TWS Job Scheduler, UNIX and Linux.

Confidential

Informatica Developer

Responsibilities:

  • Involved in implementing the end-to-end ETL life cycle.
  • Gathered requirements from the customer and prepared design documents such as the HLD and LLD.
  • Identified the significant attributes to extract through an understanding of the client database.
  • Designed mappings in Informatica to load data from the source system into the data warehouse.
  • Extensively involved in extracting from flat files and an Oracle OLTP system; a pre-load file check is sketched after this list.
  • Created complex mappings/mapplets using Expression, Aggregator, Joiner, Rank, Filter and Lookup transformations in Informatica PowerMart.
  • Implemented performance-tuning logic on sources, targets, mappings and sessions to provide maximum efficiency and performance.
  • Developed and tested ETL processes and components using Informatica.
  • Performed maintenance, troubleshooting and fine-tuning of existing ETL processes.
  • Designed, developed and tested mappings that migrated data from the legacy sources to the ADS (warehouse).
  • Coordinated with ETL testing to fix defects; performed sanity checks after QA migration.
  • Participated in peer code reviews.
  • Assisted new team members in ramping up their Informatica skills.
  • Ensured on-time, defect-free delivery and achieved a high level of client satisfaction.
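
Flat-file feeds like these were typically sanity-checked at the shell level before the Informatica session ran. A minimal sketch, assuming a hypothetical comma-delimited feed, path and agreed field count (all placeholders):

#!/bin/sh
# Hypothetical file name and layout; real feeds had agreed delimiters/counts.
SRC=/data/inbound/policy_extract.csv
EXPECTED_COLS=12

# Count records whose field count differs from the agreed layout, so a
# malformed file is quarantined instead of failing the session mid-load.
BAD=`awk -F',' -v n="$EXPECTED_COLS" 'NF != n { c++ } END { print c+0 }' "$SRC"`
if [ "$BAD" -gt 0 ]; then
    echo "$SRC: $BAD malformed records, quarantining file" >&2
    mv "$SRC" /data/reject/
    exit 1
fi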

Environment: Informatica 9.1, DB2, QMF tool, PL/SQL, TWS Job Scheduler, UNIX and Linux.

Confidential

Hadoop/Unix Administrator

Responsibilities:

  • Set up Hadoop and installed all required software for the log systems.
  • Gathered the complete business requirements for the Hadoop cluster setup.
  • Imported and exported data between HDFS and relational stores using Sqoop; a sketch follows this list.
  • Performed performance tuning of the Hadoop clusters.
  • Managed the file system and monitored and reviewed the log files.
  • Provided HDFS support and maintenance.
  • Handled Hadoop cluster job performance and capacity planning, and monitored Hadoop cluster connectivity and security.
  • Managed and coordinated the team to develop as per the business requirements.
  • Prepared documentation as required by the process.
  • Executed and advised on the optimal solution implementation.
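
A minimal Sqoop sketch of the import/export pattern; the JDBC URL, credentials, table names and HDFS paths are hypothetical placeholders:

# Pull a source table into HDFS (hypothetical connection details).
sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user -P \
    --table SALES_FACT \
    --target-dir /user/etl/sales_fact \
    --num-mappers 4

# Push aggregated results back to the relational store.
sqoop export \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user -P \
    --table SALES_SUMMARY \
    --export-dir /user/etl/sales_summary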

Environment: Hadoop, UNIX, Linux, Logstash (shipper), Redis/RabbitMQ server, Storm (real-time processing), Sqoop, Pig, Hive, HBase, Cassandra/Elasticsearch, Kibana visualization.

Confidential

Informatica Developer

Responsibilities:

  • Involved in implementing the end-to-end ETL life cycle.
  • Gathered requirements from the customer and prepared design documents such as the HLD and LLD.
  • Involved in designing the dimensional model (star schema, snowflake schema).
  • Developed mappings, sessions and workflows as per the requirements.
  • Monitored the daily jobs through the TWS tool and handled aborted jobs while loading data into the data warehouse; a status-check sketch follows this list.
  • Planned, monitored and tracked development; discovered application bugs, identified root causes and fixed them.
  • Worked on performance tuning and enhancements.
  • Unit tested all ETL components and submitted the results.
  • Coordinated with ETL testing to fix defects; performed sanity checks after QA migration.
  • Participated in peer code reviews.
  • Assisted new team members in ramping up their Informatica skills.
  • Ensured on-time, defect-free delivery and achieved a high level of client satisfaction.
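
A minimal sketch of checking for abended jobs from the TWS command line with conman; the workstation and job stream names are hypothetical, and the exact selector syntax can vary by TWS version:

#!/bin/sh
# Hypothetical workstation#jobstream selector; conman's showjobs (sj)
# lists job states, and ABEND marks an aborted job.
conman "sj DWHMSTR#DAILY_LOAD.@" | grep ABEND && \
    echo "Aborted jobs found, investigate before rerun" >&2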

Environment: Informatica 8.6, Oracle 10g, SQL Navigator, TWS Job Scheduler, UNIX, Linux, Windows.

Confidential

Datastage Consultant

Responsibilities:

  • Prepared the daily data loading status report for the customer.
  • Used FTP to transfer files between different UNIX boxes; a minimal transfer script is sketched after this list.
  • Manually spooled out data files from the MQ Series queue and transferred the source data files from the SAP system to the UNIX system.
  • Solved data source issues for CSV and flat files; to load the data, manually corrected the source data and loaded it into the DWH system.
  • Provided weekly status reporting on all sub-projects, infrastructure issues and data file delivery on UNIX.
  • Created the Correction Characteristics report for new/change requests.
  • Assisted new team members in building their DataStage and Oracle skills.
  • Prepared documents for new requirements developed with the offshore team.
  • Provided data extracts on user request for analysis in weekly and monthly reports.
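
A minimal sketch of a non-interactive FTP transfer between UNIX boxes; the host, paths and account shown are hypothetical (in practice credentials belong in a protected .netrc, not in the script):

#!/bin/sh
# -n suppresses auto-login so the user command below supplies credentials.
ftp -n target-host <<'EOF'
user etluser etlpass
binary
cd /data/inbound
put /staging/outbound/sap_extract.dat sap_extract.dat
bye
EOF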

Environment: DataStage 8.7, Oracle 10g, UC4 Job Scheduler and Windows XP.

Confidential

Datastage Consultant

Responsibilities:

  • Prepared the daily data loading status report for the onsite team and the customer.
  • Manually extracted data from the SAP source system in order to load it into the DWH.
  • Maintained the weekly logbook covering all issues.
  • Involved in the Trafo business logic and the flow of data loading into the data warehouse.
  • Solved and debugged blocked jobs in UC4 and DataStage, then reported the Error Characteristics (analysis results) to the customer.
  • Involved in production support issues.
  • Involved in manual data loading into the Integration environment during environment (I/P switch) changes before backup days.
  • Involved in manually spooling out data files from the MQ Series queue and transferring the source data files from the SAP system to the UNIX system.
  • On the database side, created and replaced views and rebuilt indexes and partitions; a sketch follows this list.
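
A minimal sketch of those database-side steps driven from the shell via sqlplus; the view, index and partition names are hypothetical placeholders:

#!/bin/sh
# Hypothetical object names; run as the schema owner.
sqlplus -s "$DB_USER/$DB_PASS@$DB_ALIAS" <<'EOF'
-- Recreate the reporting view after a source change.
CREATE OR REPLACE VIEW v_policy_summary AS
  SELECT policy_id, SUM(premium_amt) AS total_premium
  FROM policy_fact
  GROUP BY policy_id;

-- Rebuild the index, then refresh a stale partition.
ALTER INDEX idx_policy_fact_dt REBUILD;
ALTER TABLE policy_fact TRUNCATE PARTITION p_2011_q4;
EXIT
EOF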

Environment: DataStage 7.5.2, Oracle 9.2/10g, UC4 Job Scheduler and Windows XP.

Confidential

Datastage Consultant

Responsibilities:

  • Involved in production server monitoring for daily data loading status and jobs in a blocked state; analyzed and debugged blocked jobs in the scheduler and sent the analysis results to the customer.
  • Used the DataStage Director and its run-time engine to schedule the solution, test and debug its components, and monitor the resulting executable versions.
  • Involved in data source issues for CSV and flat files; to load the data, manually corrected the source data and loaded it into the DWH system.
  • Involved in customer requirements; prepared documents to complete the requests and sent the Correction Characteristics to the customer.
  • Involved in environment (I/P switch) changes: manually loaded data into the Integration (test) environment so that data loading in both environments was equal and the switch could transfer to a single environment.
  • On the database side, created and replaced views and rebuilt indexes and partitions.
  • Involved in validating the data in both P/I environments; for any discrepancies, deleted the bad data and reloaded it into the DWH system.

Environment: DataStage 7.1, Oracle 9i, UC4 Job Scheduler and Windows NT.
