Data/ETL Cloud Architect Resume

SUMMARY:

  • Established data architecture and BI best practices, standards and data models; recommended data warehouse and reporting performance tuning techniques. Mobilized and directed BI professionals and offshore development team of 10+.
  • 2+ years of AWS PaaS and SNOWFLAKE SaaS experience.
  • Designed complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools.
  • Experienced in hot and cold data storage strategies.
  • Excellent documentation and communication skills, with the ability to clearly articulate complex IaaS/PaaS/SaaS concepts to audiences new to cloud development.
  • Strong understanding of ER modeling, dimensional modeling, and star and snowflake schemas; created conceptual, logical and physical data models using Erwin/PowerDesigner.
  • High proficiency in requirements gathering and data modeling, including design and support of various applications in Online Transactional Processing (OLTP), Online Analytical Processing (OLAP), data warehousing and ETL environments.
  • Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Contributed to roadmaps for enterprise cloud data lake architecture; worked with data architects of various departments to finalize the data lake roadmap.
  • Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
  • Delivered pre-sales presentations and demonstrations of solutions/POCs to customers and partners.
  • Supported sales team by providing technical assistance, product education, and post-sale on-boarding support.
  • Worked with the sales team to respond to prospect and partner inquiries, including RFIs and RFPs.
  • Created and conducted product demonstrations on-site, remotely and at trade shows to highlight the products’ strengths.
  • Served as technical SME, answering clients’ technical questions and concerns.
  • Hands-on experience with the Hadoop big data technologies Pig, Hive, Impala and HBase; complete understanding of the Hadoop big data ecosystem.
  • Created metrics, attributes, filters, reports and dashboards, including advanced chart types, visualizations and complex calculations to manipulate the data. Acted as the point of contact for data interoperability, analytics/BI and production-support issue resolution.
  • In-depth knowledge of the SDLC and of both Agile and Waterfall project management methodologies.

TECHNICAL SKILLS:

Programming: SQL (Analytical/JSON/XML), PL/SQL, Hive, Pig, Impala, UNIX

Database: Oracle 12c, 11g, 10g, SQL server, SAILFISH (Netezza), SNOWFLAKE

Cloud computing: Amazon Web Services (AWS)

NoSQL: MarkLogic (basic understanding)

ETL Tools: IBM DataStage 11.3, 8.5, Alteryx, SSIS

Modeling Tools: Erwin, PowerDesigner

Visualization Tools: Alteryx, IBM Cognos, Tableau 10.2/10.3

Agile tools: JIRA

Testing Tools: HP ALM, Quality Center

Scheduling Tools: Control-M

Version Control Tools: Team Foundation Server (TFS), PVCS

Presentation Skills: MS Visio, MS PowerPoint

Highest Qualification: Master of Computer Applications (MCA)

PROFESSIONAL EXPERIENCE:

Confidential

Data/ETL Cloud Architect

Responsibilities:

  • Played an active role in high-performance cloud data warehouse architecture and design. Developed complex data models in Snowflake to support analytics and self-service dashboards.
  • Designed and implemented effective Analytics solutions and models with Snowflake.
  • Examined and identified data warehouse structural requirements by evaluating business requirements.
  • Developed ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake’s SnowSQL.
  • Participated in a collaborative team designing software and developing a Snowflake data warehouse within AWS.
  • Tuned and troubleshot Snowflake for performance and optimized utilization.
  • Wrote SQL queries against Snowflake.
  • Designed and developed the batch and real-time data processing layers.
  • Developed Python scripts to extract, load and transform data.
  • Oversaw the migration of data from legacy systems to new solutions.
  • Guided developers in preparing functional/technical specs to define reporting requirements and ETL processes.
  • Provided support by responding to system problems in a timely manner.
  • Designed the data archival strategy.
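
The ETL pattern described in the bullets above (extract from a source, load into a staging table, then transform with SQL inside the warehouse) can be sketched in Python. This is a minimal illustration that uses sqlite3 as a local stand-in for Snowflake; the feed, table and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would be extracted from a source system.
RAW_FEED = """order_id,amount,region
1,120.50,EAST
2,80.00,WEST
3,45.25,EAST
"""

def extract(feed):
    """Extract: parse the raw feed into rows."""
    return list(csv.DictReader(io.StringIO(feed)))

def load(conn, rows):
    """Load: land the rows untransformed in a staging table
    (the COPY INTO step in Snowflake)."""
    conn.execute("CREATE TABLE stg_orders (order_id INT, amount REAL, region TEXT)")
    conn.executemany(
        "INSERT INTO stg_orders VALUES (?, ?, ?)",
        [(r["order_id"], r["amount"], r["region"]) for r in rows],
    )

def transform(conn):
    """Transform: aggregate inside the warehouse with plain SQL
    (run as SnowSQL in Snowflake)."""
    conn.execute(
        """CREATE TABLE sales_by_region AS
           SELECT region, SUM(amount) AS total
           FROM stg_orders GROUP BY region"""
    )
    return dict(conn.execute("SELECT region, total FROM sales_by_region ORDER BY region"))

conn = sqlite3.connect(":memory:")
load(conn, extract(RAW_FEED))
print(transform(conn))  # → {'EAST': 165.75, 'WEST': 80.0}
```

In a real Snowflake deployment the load step would be a staged COPY INTO and the transform would run server-side, but the extract/load/transform split is the same.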

Confidential

Data and Solution architect

Responsibilities:

  • Provided technical support in building POCs within 3-4 weeks to demonstrate BI analytics/product capabilities, to address specific service-level agreements whose terms the department wanted to improve, or to benchmark critical business SQL queries.
  • Presented the POCs to stakeholders.

Confidential

Data Warehouse Architect/Data Modeler

Responsibilities:

  • Gathering of requirements for generation of detailed specifications to determine project scope, materials, timelines, and resources.
  • Design, development, and automation of the entire data-warehouse life-cycle from conception onto full-scale production.
  • Collaborating with other stakeholders to ensure that the design is aligned with business requirements.
  • Worked as OLTP Data Architect & Data Modeler to develop the Logical and Physical 'Entity Relational Data Model' for trade system with entities & attributes and normalized them up to 3rd Normal Form using ER/Studio.
  • Enforced referential integrity in the OLTP data model for consistent relationship between tables and efficient database design.
  • Gathered business requirements, working closely with business users, project leaders and developers. Analyzed the business requirements and designed conceptual and logical data models.
  • Identified strategic data requirements and designed models to ensure those requirements align with the enterprise architecture.
  • Enforced naming standards and data dictionary for data models.
  • Implemented End-to-End traceability and process run control.
  • Design and development of data-warehouse databases, tables, views, indexes, constraints, stored-procedures and functions.
  • Design of automated ETL processes for import and cleansing of transactional database records used to refresh the Data-Warehouse.
  • Design of automated ETL process for Import-and-Export of feed-files to-and-from Data-Warehouse to upstream-and-downstream systems.
  • Design of OLAP objects including dimensions and fact tables.
  • Design of automated generation and delivery of business intelligence dashboards and reports using Cognos 11.X.
  • Effectively resolving issues and roadblocks as they occur.
  • Maintenance of version control processes for merging, branching, and automating deployment between Development, Test, and Production systems (TFS).
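
The star-schema and OLAP design work described above can be illustrated with a minimal dimensional model: a fact table carrying foreign keys into dimension tables, queried with a join-and-aggregate pattern. This is a sketch only; sqlite3 stands in for the warehouse and every table and column name is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# One fact table surrounded by dimensions: the classic star shape.
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT, year INT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INT,
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware');
INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97);
""")

# A typical dimensional query: join the fact to its dimensions and aggregate.
row = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchone()
print(row)  # → (2024, 'Hardware', 29.97)
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what lets BI tools slice the same facts by any dimension without restructuring.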

Confidential

Data lead/Hadoop Developer

Environment: Hadoop ecosystem, Pig, Hive, Impala

Responsibilities:

  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig and Sqoop.
  • Imported and exported data between HDFS and Hive using Sqoop.
  • Wrote Hive UDFs to extract data from staging tables.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.
  • Managed Hadoop log files.
  • Analyzed web log data using HiveQL.
  • Involved in defining job flows.
  • Involved in loading and transforming large sets of structured and semi-structured data.
  • Involved in loading data from UNIX file system to HDFS.
  • Good understanding of and experience with Hadoop stack internals, Hive, Pig and MapReduce.
  • Developed Hive queries for the analysts.
  • Implemented partitioning, dynamic partitions and bucketing in Hive.
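
Hive’s bucketing, mentioned in the last bullet, assigns each row of a table declared CLUSTERED BY (col) INTO N BUCKETS to bucket hash(col) % N; for integer columns the hash is the value itself. A minimal Python sketch of that assignment, with made-up user IDs:

```python
def bucket_for(key, num_buckets):
    """Mimic Hive's bucket assignment for integer keys: hash(key) % N,
    where the hash of an integer is the integer itself."""
    return key % num_buckets

# Made-up user IDs distributed over 4 buckets, as a table declared
# CLUSTERED BY (user_id) INTO 4 BUCKETS would do.
buckets = {}
for user_id in [101, 102, 103, 104, 105]:
    buckets.setdefault(bucket_for(user_id, 4), []).append(user_id)

print(buckets)  # → {1: [101, 105], 2: [102], 3: [103], 0: [104]}
```

Because equal keys always land in the same bucket, bucketed tables support efficient sampling and bucket-wise map-side joins.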

Confidential

ETL Tech Lead/Data Modeler/Data Analyst

Responsibilities:

  • Built the ETL architecture to simplify the design, improve performance and ease future enhancements.
  • Performed high-level analysis of business requirements and provided cost/effort estimations.
  • Created the high-level ETL design and source-to-target mappings, with detailed business rules for loading each table.
  • Built the star schema to support reporting and analytics needs.
  • Contributed to the data architecture: analyzed and profiled the data and built the gold copy of records.
  • Implemented the data warehouse solution for descriptive, diagnostic and predictive analytics.
  • Studied and redesigned existing ETL applications to improve performance and reduce the cost of future enhancements and maintenance.
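
A source-to-target mapping with per-column business rules, as described in the bullets above, can be sketched as a small rule-driven transform. The mapping, field names and rules below are hypothetical, not taken from any real specification.

```python
# Each target column maps to a source field plus a business rule (a callable).
# All names and rules here are illustrative.
MAPPING = {
    "customer_id": ("cust_no", int),        # cast to integer
    "full_name":   ("name", str.strip),     # trim surrounding whitespace
    "status":      ("status_cd",            # decode the status code
                    lambda v: {"A": "ACTIVE", "I": "INACTIVE"}.get(v, "UNKNOWN")),
}

def apply_mapping(source_row):
    """Apply the source-to-target mapping, running each rule on its field."""
    return {target: rule(source_row[src]) for target, (src, rule) in MAPPING.items()}

src_row = {"cust_no": "042", "name": "  Jane Doe ", "status_cd": "A"}
print(apply_mapping(src_row))
# → {'customer_id': 42, 'full_name': 'Jane Doe', 'status': 'ACTIVE'}
```

Keeping the rules in a declarative mapping table, rather than scattered through job code, is what makes the ETL design easy to review and extend.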

Confidential

ETL Lead

Responsibilities:

  • Identified end-to-end scenarios and their impact on sub-systems, interfaces and modules across design, code and test, and provided expertise on critical solutions.
  • Implemented new processes and system improvements to improve project execution, mitigate risks and ensure compliance with client standards.
  • Interacted with business users and data analysts to finalize the design.
  • Developed complex ETL jobs involving large volumes of data, complex calculations and validations, error handling and report generation.
  • Monitored jobs for production support at different layers, analyzed aborted jobs, provided quick resolutions per SLA and maintained the ETLs.
  • Tracked defects and ensured all were resolved before handing the module over to the QA team.
  • Participated in project audit and health-check activities.
