
Data Architect, ETL Developer and Lead ETL Sustainment Tier 3 Production Support Resume


Richardson, Texas

SUMMARY

  • Informatica Certified professional with around 14 years of experience, primarily in the data warehouse and business intelligence domain.
  • Main area of experience is technical delivery of IT projects of various sizes.
  • Worked in the telecom business vertical with Confidential for the last 7 years; previously worked in the media & entertainment, manufacturing, financial services, and transportation verticals. Technological forte is the ETL data integration tool Informatica PowerCenter.
  • Forward-thinking, resourceful IT technical leader with in-depth experience providing solutions, vision and leadership for the architecture, development, integration, execution and support of enterprise business systems within high-growth environments. Proven success in delivering breakthrough technical solutions that enable companies to transcend barriers to performance, driving technical solutions for the enterprise to meet business needs. Act as a strategist, service partner and change agent, continually seeking opportunities to innovate, re-engineer processes and re-architect solutions to maximize security, reliability, scalability and supportability. Leverage an empowering leadership style to build accountable, productive and engaged teams.
  • Strong expertise in enterprise level data architecture, analysis of requirements and design, application high level and detailed design, logical and physical data models, source to target data element mapping involving complex transformation business rules in ETL Data warehouse, Business Intelligence and Big Data Technologies.
  • All-round experience in requirement analysis, database design and implementation of data warehouses, with a strong understanding of dimensional modeling, star & snowflake schemas, data staging, and decision support systems.
  • Strong ETL design, development and support experience with Informatica Power Center and PL/SQL in RDBMS like Oracle and Teradata.
  • Experienced in software process engineering and applying six sigma processes to project modules to ensure quality deliverables.
  • Resourceful, creative problem-solver with proven aptitude to analyze and translate complex customer requirements and business problems and design/implement innovative custom solutions.
  • Motivated achiever who exceeds goals and has earned the highest customer satisfaction ratings for IT delivery solutions.
  • Articulate communicator who fluently speaks the languages of both people and technology, blending technical expertise with exceptional interpersonal skills while interacting effectively with customers, sales staff, and technical/engineering teams; adept at delivering presentations, demos and technical documentation.
  • Strong communication, interpersonal skills, mentoring and team management.

TECHNICAL SKILLS

ETL Data Integration Product: Informatica PowerCenter 8.x/9.x/10.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica Data Transformation

iPaaS/SaaS Cloud Integration Platform: Informatica Integration Cloud

Big Data Product: Hadoop Framework, Informatica Big Data Management

Big Data Technology: Data Lake implementation using Hadoop, HDFS, MapReduce, Hive, Pig, Sqoop, Spark

Big Data Analytics Product: Teradata Aster Big Analytics Appliance

Teradata Loader Utilities: BTEQ, FastLoad, FastExport, MultiLoad, TPump; use of Teradata SQL Assistant

Data Visualization (BI Reporting) Products: Tableau, OBIEE, Business Objects

Data Modeling: erwin Data Modeler r9.7

RDBMS: Teradata, Oracle, PostgreSQL, DB2, Sybase IQ/ASE, MS SQL Server

Programming Languages: Python, Java, C, C++, PL/SQL

Data warehousing Methodologies: Star Schema, Snowflake Schema, Waterfall, Agile

OS: Windows 7/8/8.1/10, UNIX, LINUX

Sort & Integration Tool: SyncSort

EAI Tools: webMethods

Scheduling Tools: Tivoli Workload Scheduler (TWS), Autosys, ESP, Maestro, Appworx

Project Management Tools: Wiki, JIRA, Confluence, Microsoft Project Plan, Microsoft Share Point

Microsoft Software Proficiency: MS Word, MS Excel, MS PowerPoint, MS Project, MS Outlook, MS Visio

Testing Tools: HP ALM Quality Center, HP LoadRunner

Production Software Defect Management: AOTS (Confidential One Ticketing System) trouble tickets, Webtrax

PROFESSIONAL EXPERIENCE

Confidential, Richardson, Texas

Data Architect, ETL Developer and Lead ETL Sustainment Tier 3 Production Support

Responsibilities:

  • Responsible for data analysis, design, code development, and production support for the eCDW application.
  • Worked on requirements gathering, creating design specifications and ETL design documents for eCDW application.
  • Code development using Informatica, Oracle, SQL, Teradata, UNIX/LINUX script, TWS scheduler.
  • Provide Tier 3 sustainment support to assist Tier 1/Tier 2 production support teams when they are unable to resolve production bugs/issues at their level and need assistance from the next technical support level, Tier 3.
  • Research issues/defects/AOTS Tickets on user/business request through AOTS Trouble Ticketing tool.
  • Create application design, high level technical design documents based on customer business requirements for new projects/production bug tickets.
  • Work closely with business/requirements and testing teams to assist them through the project development life cycle.
  • Designed and developed Mappings using Informatica Mapping Designer to load the data from various sources using different transformations like SQL transformation, Aggregator, Stored Procedure, Filter, Joiner, Lookup, Router, Sequence Generator, Source Qualifier, Update Strategy, Custom and Union transformations.
  • Designed and developed Informatica workflows and scheduled them in Informatica and in UNIX/LINUX shell scripts using TWS.
  • Created UNIX/LINUX shell script code to drop and re-create indexes to load data efficiently and decrease load time for eCDW data warehouse.
  • Provided project/production bug fix effort estimates, coordinated the development efforts and discussions with the business clients, updated status reports and handled logistics to enable smooth functioning of the project/task and meet all the deadlines.
  • Tested workflows for Informatica version upgrades, worked with Informatica production support for issues.
  • Actively involved in the data modeling and design of the eCDW data warehouse with Data Strategist/Architect.
  • Designed and implemented data warehouse star schema models; identified and built fact, dimension, and aggregate tables to support users' reporting needs and requirements in the Business Objects & OBIEE reporting tools.
  • Investigated data quality issues by profiling data on the source systems; extracted data from various sources such as Oracle and fixed-width and delimited flat files, then cleansed and loaded data into Teradata and flat-file targets using Informatica.
  • Implemented Slowly Changing Dimensions (1 and 2) using Informatica in eCDW data warehouse.
  • Created indexes and worked on database partitioning to obtain optimal performance for the eCDW data warehouse.
  • Involved in design reviews, code reviews, performance analysis and performance testing; performed performance tuning (both database and Informatica), thereby decreasing job load times for the eCDW data warehouse.
  • Thoroughly unit test the code; also assist with QA, system, and UAT testing as and when required. Work in the HP ALM Quality Center tool for testing, defect delivery and resolution.
  • Follow code version control in SVN (Subversion) to install and promote Informatica, LINUX and Teradata code into the various eCDW environments for unit test, integration test and production.
  • Complete the development package to hand-off code to QA team for system test. Prepare developer-related documents as required by eCDW.
  • Represent the developed design/code/solution at the CCRB & SIS review meetings for Confidential code release management, as required to get code deployed into the production environment.
  • Prepare/maintain production bug tickets in AOTS (Confidential One Ticketing System) and work with the BIDW/eCDW deployment team to have the code installed in eCDW environments.
  • Provide support to the deployment team when application code is deployed during Confidential scheduled releases.
  • Provide post-deployment ("infant care") support for the code, handling any changes required after the production deployment.
  • Attend Requirements meeting, Design meetings and Development Team meetings as required.
  • Comply with all Confidential BIDW, Deployment and Development Team policies and procedures.
  • Continuous client interaction & communication for the feedback on the progress of work and quality of deliverables.
  • Guided the new team members to observe the development methodologies and architecture.
  • Performance tuning of LINUX & Informatica code.
  • Job scheduling design using TWS scheduler tool.
  • Manage offshore development team for assistance in development & support work.
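The Slowly Changing Dimension loads described above (Types 1 and 2) were built in Informatica against Teradata targets; purely as an illustration of the Type 2 pattern, a minimal Python sketch (the record layout, field names and function names here are hypothetical, not taken from the eCDW project):

```python
from dataclasses import dataclass
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective-to date marks the current version


@dataclass
class DimRow:
    natural_key: str   # business key (hypothetical example: customer id)
    attributes: dict   # the Type 2 tracked attributes
    eff_from: date
    eff_to: date = HIGH_DATE


def apply_scd2(dim_rows, incoming, load_date):
    """Expire changed current rows and append new versions (SCD Type 2)."""
    current = {r.natural_key: r for r in dim_rows if r.eff_to == HIGH_DATE}
    out = list(dim_rows)
    for key, attrs in incoming.items():
        cur = current.get(key)
        if cur is None:
            out.append(DimRow(key, attrs, load_date))   # brand-new key: insert
        elif cur.attributes != attrs:
            cur.eff_to = load_date                      # expire the old version
            out.append(DimRow(key, attrs, load_date))   # insert the new version
        # unchanged rows pass through untouched
    return out
```

In an Informatica mapping the same branching is typically done with a Lookup on the current dimension rows feeding an Update Strategy transformation that marks each row for insert or update.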
Confidential, Richardson, Texas

Big Data Architect

Responsibilities:

  • Architect and develop ETL solutions focused on moving data from a highly diverse data landscape into a centralized data lake; also architect solutions to acquire semi-structured and unstructured data sources, such as streaming website data, machine logs, clickstreams, JSON files, etc.
  • Manage all activities centered on obtaining data and loading into an Enterprise Data Lake.
  • Maintain a customer-focused attitude.
  • Define and document architecture roadmaps and standards.
  • Design and implement data ingestion techniques for real time and batch processes for a variety of sources into Hadoop ecosystems and HDFS clusters.
  • Visualize and report data findings creatively in a variety of visual formats that provide insights to the organization.
  • Apply knowledge of data, master data and metadata-related standards, processes and technology.
  • Drive use case analysis and architectural design around activities focused on determining how to best meet customer requirements within the tools of the ecosystem.
  • Ensure scalability and high availability, fault tolerance, and elasticity within big data ecosystem.
  • Provide technical leadership and coaching to development, testing, production support team members.
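One common building block of the batch ingestion described above is routing each raw event to a date-partitioned lake path so that Hive/Spark queries can prune partitions. As a rough sketch only (the base path, field names and function name are hypothetical assumptions, not from the project):

```python
import json
from datetime import datetime


def lake_partition_path(raw_event: str,
                        base: str = "/data/lake/raw/machine_logs") -> str:
    """Route a JSON machine-log event to a date-partitioned lake path.

    Partitioning by source and event date keeps Hive/Spark scans pruned
    to only the partitions a query actually needs.
    """
    event = json.loads(raw_event)
    ts = datetime.fromisoformat(event["timestamp"])
    return (f"{base}/source={event['source']}"
            f"/year={ts.year:04d}/month={ts.month:02d}/day={ts.day:02d}")
```

For example, an event tagged `"source": "web"` with timestamp `2021-03-05T12:00:00` would land under `.../source=web/year=2021/month=03/day=05`, matching the `key=value` directory convention Hive uses for partitioned external tables.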
