
Data Engineer Resume


Palo Alto, CA

PROFESSIONAL SUMMARY:

  • 10+ years of experience in analysis, design, development, and production support of data warehouse applications.
  • Experience working in the Finance and Retail domains.
  • Extensive data warehousing experience in designing and creating mappings using Informatica PowerCenter (9.x/8.x/7.x), PowerExchange, and Informatica Data Quality, including Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Main areas of experience include Teradata, Oracle, Informatica, and UNIX, with strong knowledge of data warehousing concepts.
  • Expertise in implementing business rules by developing mappings with transformations such as Expression, Joiner, Source Qualifier, Filter, Aggregator, Lookup, Router, Update Strategy, and SQL transformations.
  • Solid experience in all phases of the data warehouse life cycle, involving design, development, testing, and production support of data warehouses using Informatica.
  • Extensively worked on relational database systems such as Oracle 11g/10g/9i/8i, MS SQL Server, Teradata, and DB2, and on source files such as flat files and XML files.
  • Worked with cross-functional teams such as QA and DBA to deploy code from development to QA, QA to UAT, and UAT to Production, and was involved in production support to meet SLAs.
  • Highly competent in quantitative and qualitative analysis and critical thinking.
  • Self-reliant and quick to identify and understand business problems to resolve production issues.
  • Good understanding of Data Warehouse Life Cycle.
  • Good team player; hardworking, fast learner, and able to multitask.
  • Worked in end-to-end data warehousing projects.
  • Experienced in dealing with outsourced technical resources and coordinating global development efforts.
  • Excellent communication and interpersonal skills; committed and motivated team player with good analytical skills.

TECHNICAL SKILLS:

Big Data & ETL: Hadoop, Hive, MapReduce, Spark, Informatica PowerCenter 9.x/8.x/7.x, PowerExchange 8.6.1, Informatica Data Quality

Databases: Oracle 8i/9i/10g, Teradata, DB2, MongoDB

Reporting Tools: BRIO, Cognos, Tableau

Operating Systems: Sun Solaris 5.8/5.9/5.10, Windows, Linux

Languages: SQL, PL/SQL, C, C++, Java, Python, Node.js, HTML, CSS, JavaScript

Scheduling Tools: Control-M, AutoSys

Other Tools: UNIX shell scripting, Perl, MySQL, SQL*Plus, Toad

PROFESSIONAL EXPERIENCE:

Confidential, Palo Alto, CA

Data Engineer

Environment: Teradata, Hadoop, Hive, Python, GitHub, Spark, MapReduce

Responsibilities:

  • Created and managed ETL data pipelines (sourcing data from databases, streaming data, various web APIs, etc.)
  • Integrated data from the data warehouse into 3rd party tools to make data actionable to end users
  • Provided support for ongoing maintenance and enhancements to existing data warehouse solutions
  • Created data quality checks to verify that data loaded correctly and completely (see the sketch after this list).
  • Performed strategic and operational data analysis for all internal teams (marketing, product, engineering, customer service)
  • Monitored all KPIs and presented emerging trends and actionable insights to the leadership team on the RTB.
  • Developed dashboards and performed ad-hoc analysis on the data
  • Ensured data quality by testing tables, dashboards, models, and analysis
  • Involved in the data backfills for the new pipeline.
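
A minimal sketch of the kind of post-load data quality check described above. The table name, key column, and DB-API connection (for example, one obtained from pyhive or teradatasql) are assumptions for illustration, not the actual project code.

```python
# Post-load data quality checks (illustrative; table/column names are hypothetical).
# Works against any DB-API 2.0 cursor, e.g. one from pyhive.hive.connect(...).

def run_quality_checks(cursor, table: str, key_col: str, min_rows: int = 1) -> list:
    """Return a list of failed check descriptions for the given table."""
    failures = []

    # 1. Row-count check: the load should produce at least min_rows rows.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cursor.fetchone()[0]
    if row_count < min_rows:
        failures.append(f"{table}: expected >= {min_rows} rows, found {row_count}")

    # 2. Null-key check: the business key must never be NULL.
    cursor.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_col} IS NULL")
    null_keys = cursor.fetchone()[0]
    if null_keys:
        failures.append(f"{table}: {null_keys} rows with NULL {key_col}")

    # 3. Duplicate-key check: the business key must be unique.
    cursor.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1) dup"
    )
    dup_keys = cursor.fetchone()[0]
    if dup_keys:
        failures.append(f"{table}: {dup_keys} duplicate values of {key_col}")

    return failures
```

A call such as run_quality_checks(cursor, "stg_orders", "order_id") can then gate the downstream load or raise an alert whenever the list of failures is non-empty.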

Confidential, San Jose, CA

SITE ETL Developer and Production Support Lead

Environment: Informatica PowerCenter 9.6.1, Linux grid, Oracle GoldenGate, Control-M, Hadoop cluster

Responsibilities:

  • 24/7 Operational support of mission-critical applications and merchant reporting, involving various systems and other areas of Confidential infrastructure.
  • Solved complex problems using reverse engineering. Worked effectively in a multi-tasking environment and provided level 3 support including root cause analysis on issues, problem-solving, application and system monitoring, research, and project development.
  • Created RCA reports and provided them to the Problem Management team.
  • Worked with and drove Tier II groups (ETL Ops, Data Integration, DBAs, SAs, Network Security, etc.) to diagnose complex problems and drive resolution as quickly as possible to meet SLAs.
  • Performed root cause analysis and documented to prevent production issues from reoccurring.
  • Participated in multiple critical projects (Bill Me Later integration, Disaster Recovery project.)
  • Mentored junior team members; assisted other teams with automation projects and troubleshooting of development and production issues.
  • Monitored and troubleshot high-performance server clusters designated for the global merchant reporting process, interacting with worldwide multifunctional teams and vendors (a workflow-monitoring sketch follows this list).
  • Coordinated with customers and vendors on system upgrades and provided the exact procedures to follow.
  • Effectively coordinated with the development team for closing the defects.
  • Conducted reviews, identified gaps, and recommended and implemented changes to release and change management policies on a regular basis.
  • Performed monthly scheduled maintenance activities.
  • Documented all tickets with root cause analysis and review.
  • Participated in ETL code reviews and enhancements before Prod deployment
  • Coordinated with the NOC team to troubleshoot environment issues.
  • Involved in cutover activities, reviewed code going into production for accuracy, and maintained a multi-node Informatica installation.
  • Worked with the Job Scheduling team to design and set up ETL flows in Workload Automation (Control-M).
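
A sketch of the kind of workflow monitoring used in this support role, wrapping the Informatica pmcmd client from Python. The service, domain, folder, and workflow names are placeholders, credentials are assumed to come from the environment, and the getworkflowdetails output format should be checked against the installed PowerCenter version.

```python
# Level-3 support helper sketch: check an Informatica workflow with pmcmd and
# restart it if it is not running. Names below are placeholders, not project values.
import os
import subprocess

def pmcmd(command: str, workflow: str) -> subprocess.CompletedProcess:
    """Run one pmcmd command against one workflow and capture its output."""
    args = [
        "pmcmd", command,
        "-sv", "INT_SVC_PROD",        # integration service (placeholder)
        "-d", "Domain_Prod",          # domain (placeholder)
        "-u", os.environ["INFA_USER"],
        "-p", os.environ["INFA_PASS"],
        "-f", "MERCHANT_RPT",         # repository folder (placeholder)
        workflow,
    ]
    return subprocess.run(args, capture_output=True, text=True)

def restart_if_stopped(workflow: str) -> None:
    details = pmcmd("getworkflowdetails", workflow)
    # The status line format varies by version; adjust the match as needed.
    if "[Running]" not in details.stdout:
        print(f"{workflow} is not running; restarting")
        pmcmd("startworkflow", workflow)

restart_if_stopped("wf_daily_merchant_load")  # hypothetical workflow name
```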

Confidential, San Ramon, CA

Sr. Informatica Developer

Responsibilities:

  • Worked on building a new data warehouse for customer information.
  • Worked with team leads and interfaced with business analysts and end users.
  • Interacted with Business Analyst to gather requirements and translated them into ETL technical specifications.
  • Worked with data modelers to understand financial data model and provide suggestions to the logical and physical data model.
  • Involved in data quality profiling, design, development, unit testing and deployments of ETL Informatica components.
  • Documented unit test cases and provided QA support to help testers understand the business rules and code implemented in Informatica.
  • Extracted data from various sources such as XML files, flat files, and RDBMS tables, and loaded the data into the target Oracle data warehouse.
  • Implemented error handling for invalid and rejected rows by loading them into error tables (see the sketch after this list).
  • Involved in Performance Tuning of ETL code by addressing various issues during extraction, transformation and loading of data.
  • Involved in writing SQL queries to validate the data in the target database against the source-to-target mapping document.
  • Handled Production issues and monitored Informatica workflows in production.
  • Extensively worked on the batch framework used to schedule all Informatica jobs.
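
An analogous sketch in Python of the reject-handling pattern above; in the project itself this logic was implemented inside Informatica mappings, and the table and column names below are hypothetical.

```python
# Reject-handling sketch: rows that fail validation go to an error table with a
# reason code instead of the target. Table/column names are hypothetical; the
# cursor is any DB-API cursor that supports named binds (e.g., python-oracledb).
from datetime import datetime, timezone

def split_valid_and_rejected(rows):
    """rows: iterable of dicts with keys cust_id, amount, currency."""
    valid, rejected = [], []
    for row in rows:
        if row.get("cust_id") is None:
            rejected.append({**row, "error_reason": "MISSING_CUST_ID"})
        elif row.get("amount") is None or row["amount"] < 0:
            rejected.append({**row, "error_reason": "INVALID_AMOUNT"})
        else:
            valid.append(row)
    return valid, rejected

def load(cursor, rows):
    valid, rejected = split_valid_and_rejected(rows)
    cursor.executemany(
        "INSERT INTO fact_payments (cust_id, amount, currency) "
        "VALUES (:cust_id, :amount, :currency)",
        valid,
    )
    load_ts = datetime.now(timezone.utc)
    cursor.executemany(
        "INSERT INTO err_payments (cust_id, amount, currency, error_reason, load_ts) "
        "VALUES (:cust_id, :amount, :currency, :error_reason, :load_ts)",
        [{**r, "load_ts": load_ts} for r in rejected],
    )
```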

Confidential

Sr. ETL Developer

Responsibilities:

  • Involved in gathering the business requirements from the team and from the end users.
  • Involved with data model team to design and build relationships between different entities.
  • Involved in coding, testing, implementing, debugging and documenting the complex programs.
  • Involved in documenting High-Level Design, Low-Level Design, STMs, unit test plans, unit test cases, and deployment documents.
  • Ensured that all standard requirements were met and was involved in performing the technical analysis.
  • Debugged sessions by analyzing the session logs (a log-scanning sketch follows this list).
  • Resolving issues and providing on-call support regarding production SLAs.
  • Prepared code for all modules according to the required specifications and client requirements, and prepared all required test plans.
  • Involved in all production issues and inquiries and provided efficient resolutions for the same.
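
A small sketch of scanning an Informatica session log for error messages while debugging, as mentioned above; the log path is hypothetical, and the patterns to match depend on the PowerCenter version.

```python
# Session-log debugging sketch: pull out the lines that look like errors.
# The log path is a placeholder; extend the pattern with specific message codes
# (e.g. RR_4035) relevant to the environment.
import re
from pathlib import Path

ERROR_PATTERN = re.compile(r"\b(ERROR|FATAL)\b")

def error_lines(log_path: str) -> list[str]:
    """Return the session-log lines that match the error pattern."""
    text = Path(log_path).read_text(errors="replace")
    return [line for line in text.splitlines() if ERROR_PATTERN.search(line)]

for line in error_lines("/infa/logs/s_m_load_customer.log"):  # hypothetical path
    print(line)
```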

Confidential, Wilmington, DE

Onsite Operations Lead (TCS)

Responsibilities:

  • Making a note of the Impediments faced by the team and addressing the same with the help of the project management team.
  • Developing technical specification documents and unit test cases.
  • Developing Informatica mappings, transformations, and mapplets using Source Qualifier, Filter, Lookup, Aggregator, Union, Router, Update Strategy, etc.
  • Reviewing the code prepared by other team members with the client and taking signoff on the same.
  • Getting the code deployed to the QA environment as per the implementation plan provided by the team members, with the help of the DBA and Admin teams.
  • Creating and deploying the Informatica label and Control-M XMLs in the test and Production environment.
  • Verifying the Informatica Mappings, Mapplets, Transformations and Workflows.
  • Verifying the Control-M XMLs and ordering the jobs to run the workflows.
  • Running SQL queries on the database and verifying that the data is loaded into the respective target tables with the expected transformations (a verification sketch follows this list).
  • Making sure that all the failed test cases are linked to the respective defects.
  • Reviewing the defects raised by the UAT team and following up on critical defects with the team to ensure they are fixed.
  • Providing daily status to the management team and weekly status to the client for the project.
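
A sketch of the kind of source-to-target verification described above, comparing row counts for each table pair in the mapping document. The table names and the two DB-API cursors are placeholders for illustration.

```python
# Source-to-target verification sketch: compare row counts for each table pair
# listed in the mapping document. Table names and cursors are placeholders.

CHECKS = [
    ("stg.customer_stg", "dw.dim_customer"),
    ("stg.txn_stg", "dw.fact_transaction"),
]

def count_rows(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def reconcile(src_cursor, tgt_cursor) -> bool:
    """Print one line per table pair and return True only if every pair matches."""
    all_ok = True
    for src_table, tgt_table in CHECKS:
        src_count = count_rows(src_cursor, src_table)
        tgt_count = count_rows(tgt_cursor, tgt_table)
        status = "OK" if src_count == tgt_count else "MISMATCH"
        all_ok = all_ok and status == "OK"
        print(f"{status}: {src_table}={src_count} vs {tgt_table}={tgt_count}")
    return all_ok
```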

Confidential

Onsite Lead/Production Support

Responsibilities:

  • Interacted with clients on a daily basis for status and requirements.
  • Met the SLAs provided by the client for supporting their customers and sent regular checkpoints to management and business users about the status of batch and report execution.
  • Worked as an Operations Engineer in a 24x7 (onsite/offshore) production support environment and escalated batch issues/failures to on-call staff and SMEs for immediate resolution.
  • Involved in scheduling jobs using Control-M (batch processing); monitored production support jobs and provided quick solutions to failures to meet SLAs on a daily basis.
  • Worked in the production team to resolve issues while Informatica sessions were running.
  • Monitored the batch jobs that refresh the warehouse daily using the Control-M scheduler.
  • Documented production issues and RCAs for future reference.
  • Excellent team player with the ability to work consistently with the team towards attaining goals and targets.
  • Provided extra-hours support for critical issues and batch recovery efforts in case of database crashes, hardware failures, and other environment issues with the ETL servers.
  • Performed enhancement work and proposed innovative ideas to improve existing solutions.
  • Reduced issues regarding data integrity and billing inconsistencies.
  • Served as Technical and Quality Lead for the team.

Confidential

ETL Developer

Responsibilities:

  • Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer, and Transformation Developer.
  • Created the Source and Target Definitions in Informatica Power Center Designer.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Designed and developed Informatica Mapping for data load and data cleansing.
  • Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate one time, Weekly, Monthly and Yearly Loading of Data.
  • Involved in fixing invalid Mappings, Unit and Integration Testing of Informatica Sessions and the Target Data.
  • Used Workflow Manager/Monitor for creating and monitoring workflows.
  • Extensively used mapping parameters, mapping variables, and parameter files (a parameter-file sketch follows this list).
  • Involved in troubleshooting the loading failure cases, including database problems.
  • Created business scenario documents.
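
A sketch of generating a session parameter file of the kind referenced above. The folder, workflow, session, and parameter names are placeholders, and the section-header format should be verified against the PowerCenter version in use.

```python
# Parameter-file generation sketch; every name below is a placeholder.
from datetime import date

def write_param_file(path: str, run_date: date) -> None:
    content = "\n".join([
        # [folder.WF:workflow.ST:session] header scopes the parameters below it
        "[EDW_SALES.WF:wf_daily_sales_load.ST:s_m_load_sales]",
        f"$$RUN_DATE={run_date:%Y-%m-%d}",           # mapping parameter
        "$InputFile_Sales=/data/inbound/sales.dat",  # session parameter
        "",
    ])
    with open(path, "w") as fh:
        fh.write(content)

write_param_file("/infa/params/wf_daily_sales_load.par", date.today())
```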

Confidential

ETL Developer

Responsibilities:

  • Involved in developing mappings and reusable transformations using Informatica Designer and performed testing using SQL (a comparison-query sketch follows this list).
  • Used various source types, such as RDBMS tables and external sources like flat files and XML, to load data into targets.
  • Conducted the peer review for mappings developed and tested.
  • Identified bottlenecks in mappings and was involved in performance tuning.
  • Involved in configuring the dependency of the workflows.
  • Configured various workflow tasks to handle dependencies using Command, Event Wait, Timer, and Email tasks.
  • Documented the test cases and results for future reference.
  • Involved in unit testing and integration testing before deploying code.
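
A sketch of the SQL-based testing mentioned above: a set-difference query that flags source rows missing from the target. Table and column names are placeholders, and the MINUS operator assumes an Oracle-style database (use EXCEPT elsewhere).

```python
# SQL-based test sketch: sample rows that exist in the source but never made it
# to the target. Table names are placeholders.

DIFF_SQL = """
SELECT cust_id, amount, currency FROM src_schema.payments_src
MINUS
SELECT cust_id, amount, currency FROM dw_schema.fact_payments
"""

def missing_in_target(cursor, limit: int = 20):
    """Return up to `limit` sample rows present in the source but not the target."""
    cursor.execute(DIFF_SQL)
    return cursor.fetchmany(limit)
```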
