
Data Architect - Azure Cloud Resume


Troy, MI

SUMMARY:

  • 13+ years of IT experience in development, data solution architecture, and implementation of OLTP and data warehousing/big data analytics solutions for top enterprises, with emphasis on enterprise data architecture, data modeling, cloud data solutions, cloud migration, and data governance.
  • Expert-level experience in planning and managing all aspects of enterprise-scale projects, including architecture, implementation, and solution delivery.
  • Sound hands-on experience building data solutions on the Azure cloud platform using HDInsight, Azure SQL Server, Azure SQL Data Warehouse, Azure Data Lake Store, PolyBase, and Power BI.
  • Hands-on experience in design and implementation of data lake/big data solutions on the Hadoop platform (HDFS, Hive, HBase, Oozie, Apache Sentry, Hue, Ambari).
  • Sound experience in cloud data migration with Azure, AWS Redshift, TERADATA, and MS SQL Server.
  • Work deeply in technology strategy: evaluate current-state architecture and pain points, provide future-state data architecture and data strategy, and conduct POCs on emerging tools/technology.
  • Collaborate with clients' technical architects and business leads to gain a thorough understanding of the client's business and technical requirements and propose optimal technology solution options.
  • Expert-level proficiency in OLTP & dimensional data modeling using ERwin and ER/Studio: DDL generation, reverse engineering, Complete Compare, reusable template creation, and data model management.
  • Deep familiarity with industry business process models and data models (GuideWire, IBM IIW, Teradata FSDM) in the Healthcare, Telecom, Insurance P&C, and Finance domains.
  • Hands-on experience developing large-scale data models of more than 25 subject areas and 500+ tables.
  • Experience setting data governance policies and data quality standards in DW/Data Lake/MDM.
  • Extensive experience working as Data Architect with one of the world's largest MPP TERADATA data warehouse systems in the EDW & BI-Analytics space for a top telecom company.
  • Worked extensively on TERADATA DW physical implementation and data modeling; determined best uses of TD utilities: MultiLoad, FastLoad, FastExport, TPump, secondary indexes, AJI, PPI, partitioning, compression, and statistics collection.
  • Sound modeling experience with NoSQL (MongoDB, Cassandra) and AWS Redshift.
  • Well versed in NoSQL architecture: data models, schema design, indexing, sharding, replication, CRUD query mechanisms, and data access optimization.
  • Hands-on in writing complex SQL queries and performing data analysis; build standard/ad-hoc BI reports with Cognos, Tableau, and Power BI.
  • Experienced in AWS cloud managed DB services (Redshift, SQL Server, Aurora, AWS DMS/SCT).
  • Good experience with S3, EC2, Data Pipeline, IAM, CloudFormation, and Relational Database Service (RDS).
  • Experienced in database migration strategy from on-premises to cloud, including risk assessment, usage of storage & compute services, schema conversion, data migration, and sync-up.
  • Hands-on experience in implementation of data governance, metadata management, setting up data quality rules, and enterprise data management frameworks.
  • Sound knowledge of data modeling inside API/SOA web services involving JSON and XML/XSD-based common data services.
  • Expert in data modeling tools (ERwin, ER/Studio, IBM Rational UML, IBM InfoSphere Data Architect), ERwin Model Mart & Metadata Explorer, and metadata repositories (EMR, IBM InfoSphere IGC).
  • Proven experience in business development, day-to-day client interfacing, technology consulting, and deep technology strategy in the big data, cloud, and analytics solution space.
  • Experience in project management: project planning, demand forecasting, resource management, and project coordination with key stakeholders.
  • Excellent communication and organizational skills; good team player and self-starter.
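
As an illustration of the Teradata physical-design techniques named above (PPI, multi-value compression, statistics collection), a minimal DDL sketch; the table and column names are hypothetical, not from any client engagement:

```sql
-- Hypothetical telecom fact table, date-partitioned with multi-value compression.
CREATE TABLE edw.call_detail_fact (
    call_id       BIGINT NOT NULL,
    subscriber_id INTEGER NOT NULL,
    call_date     DATE NOT NULL,
    call_type     CHAR(2) COMPRESS ('VO', 'SM', 'DA'),  -- multi-value compression on frequent codes
    duration_sec  INTEGER
)
PRIMARY INDEX (call_id)
PARTITION BY RANGE_N (call_date BETWEEN DATE '2015-01-01'
                      AND DATE '2017-12-31' EACH INTERVAL '1' DAY);

-- Refresh optimizer statistics on the partitioning column after a large load.
COLLECT STATISTICS ON edw.call_detail_fact COLUMN (call_date);
```

Partitioning by day lets the optimizer eliminate partitions for date-bounded queries, while compression on low-cardinality codes reduces I/O.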

TECHNICAL SKILLS:

Operating System: Linux, UNIX, Mac, Windows XP, Windows 7,10

Programming Languages/Industry Business Models: JavaScript, Python, R, PL/SQL, JSON, XML/XSD, OOAD, UML, Insurance Industry ACORD model, IBM IAA/IIW common industry model, Teradata Healthcare & Telecom data models, PeopleSoft Finance/HR, GuideWire ClaimCenter/PolicyCenter.

Databases/NoSQL/Cloud: TERADATA 13/14/15, Oracle 11g/12c, Exadata, SQL Server 2017, DB2, Vertica Enterprise, Postgres; NoSQL: MongoDB, Cassandra, AWS DynamoDB; AWS: Aurora, DMS (Data Migration Service), S3, EC2, EMR, Redshift, Redshift Spectrum, Glue, Kinesis; Azure: HDInsight, Azure SQL Server, Azure SQL DW, Azure Data Lake Store, Azure Blob Storage, Azure Data Catalog, Azure Active Directory & IAM; SSMS.

Software Tools/BI/Analytics: CA ERwin r7.1/7.3/8.2/9.7, CA Model Manager, ER/Studio, Watson Life Science/Health, IBM InfoSphere IGC, IBM MDM AE, Information Analyzer, InfoSphere Data Architect, PowerDesigner 12x/16x, Informatica, Informatica BDE, IDQ, SAP BO, Qlik Sense, Tableau, MS Power BI, SSAS, SSRS, Altova XMLSpy, Toad, EMC Documentum, Oracle GoldenGate, Teradata Aster, Salesforce Dataloader.io

Big Data/Hadoop stack: HDFS, HBase, Apache Sqoop, Hive, Cassandra, Apache Spark, Oozie, Hue.

Other Utilities: TERADATA SQL Assistant 15, TD DBA, TOAD 10.6, HP QC, Composite (data virtualization tool), Citrix, Trillium (profiling tool), AnalytiX (data mapping), JIRA, MS Visio, MS Office.

PROFESSIONAL EXPERIENCE:

Confidential

Data Architect - Azure Cloud, Troy, MI

Technology stack: Azure Cloud (Azure SQL Server, Azure SQL Data Warehouse, ADLS, Analysis Services, Power BI, HDInsight, HCatalog), Informatica Cloud, Teradata, Cognos, ERwin r9.7

Responsibilities:

  • Evaluate newer tools and technology, build data architecture strategy & technical implementation details.
  • Work with business/BAs to capture data requirement, prepare data catalog by performing data analysis/profiling in source systems to validate data fields, data structure, data quality, key fields.
  • Develop enterprise data strategy: data integration, data quality, optimized store & access, data retention & archival strategy, and metadata management governance policies and procedures.
  • Develop data models: conceptual/logical/physical DM for the ODS & dimensional delivery layers in Azure SQL Data Warehouse (columnar store).
  • Implement optimized data store, compute, access, and security services across Azure cloud components.
  • Conduct data model (LDM/PDM) reviews/approvals for consistency, reusability, scale, and performance.
  • Utilize ERwin advanced features & custom templates for data model design, model management, Azure-compliant DDL generation, reverse engineering, Complete Compare, and development of the data dictionary and DM diagrams.
  • Architect and design data integration interfaces into Azure Data Lake and SQL DW from various data sources in SQL Server, Teradata, DB2, and PeopleSoft.
  • Involved in data conversion/data migration activities (analysis, data cleansing, source-target mapping) from Oracle & TERADATA DW environments into Azure Data Lake and Azure SQL Data Warehouse.
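
The Data Lake-to-SQL DW loading pattern described above is commonly implemented with PolyBase external tables and CTAS; a hedged sketch, with illustrative object names and a pre-created data source/file format assumed:

```sql
-- External table over files landed in Azure Data Lake (names are illustrative).
CREATE EXTERNAL TABLE stg.ext_claims (
    claim_id     BIGINT,
    policy_id    BIGINT,
    claim_amount DECIMAL(12,2),
    claim_date   DATE
)
WITH (
    LOCATION    = '/landing/claims/',
    DATA_SOURCE = adls_source,       -- assumed EXTERNAL DATA SOURCE pointing at ADLS
    FILE_FORMAT = parquet_format     -- assumed EXTERNAL FILE FORMAT
);

-- Load into a hash-distributed columnstore table via CREATE TABLE AS SELECT.
CREATE TABLE dw.claims_fact
WITH (DISTRIBUTION = HASH(claim_id), CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM stg.ext_claims;
```

CTAS over an external table keeps the load parallel and avoids a separate staging ETL hop.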

Confidential, New Brunswick, NJ

Data Architect

Technology Stack: Hadoop, Hive, Sqoop, Hue, Qlik Sense, TERADATA v15, Informatica BDE, IDQ, ERwin r9.7, ERwin Metadata Explorer, ERwin Model Manager, Teradata DBA Tool, JIRA.

Responsibilities:

  • Responsible for Data Architecture, Data Modeling, Data Integration, Data quality & Metadata management solution design and delivery for Enterprise EDW and Cloudera Hadoop environment.
  • Coordinate solution strategy and design discussion workshop with enterprise architects, Technical Leads, on core design and implementation roadmap to acquire, store & deliver data.
  • Gather system data requirements, perform deep data analysis, analyze flat files, and work with business to clarify reporting metrics, KPIs, user stories, and analytical use cases in order to build scalable data models in the EDL data lake & TERADATA EDW.
  • Deliver high performing Logical & Physical Data Model, DDL for enterprise EDW & Hive Data Layer.
  • Implement J&J Data Governance, Data Quality & metadata framework in EDW & Data Lake.
  • Recommend and coordinate with Dev team for Hive Tables creation & loading in Confidential as per analytical requirement & source data feed.
  • Lead and mentor technical team (onsite/offshore) on detailed technical specs and implementation in Data ingestion, Loading Hive Tables, ETL Integration from Data Lake to EDW.
  • Communicate project timeline, milestones, critical path, risks, and overall progress to the key stakeholders (scrum master, project managers, senior management, and program sponsors).

Confidential, Hartford, CT

Sr. Data Modeler/Data Architect

Responsibilities:

  • Collaborate and work with business & technical leads/architects to develop data requirements, data bus matrix, data flow diagrams, conceptual data model, ETL mapping specs, and data integration & data migration interfaces.
  • Develop enterprise logical & physical data models (3NF & dimensional) for various enterprise systems across data domains for the Finance & Wealth Management, Policy, Claims, Customer Care, Account, Billing & Audit, Sales & Distribution, Legal, and Risk Management business functions.
  • Construct data integration strategy: ETL architecture, ETL mapping, extract & delta logic, SCD approach, data retention/archival, and data store & data delivery strategy.
  • Define and enforce data modeling/architecture standards, procedures, and best practices to ensure consistency and adherence to HIG's data governance & information management processes.
  • Recommend modeling best practices, data modeling approaches, and design standards for optimized & reusable data models. Document data dictionary, data artifacts, & architecture design documents.
  • Worked on DW augmentation & new data integration for PLDW/CLDW in Personal Lines/Commercial Lines; designed data strategy and ETL architectures/mappings to acquire, stage, and store data in the DW dimensional area.
  • Worked extensively in digital initiative programs for system re-platform/migration from legacy Billing & Finance, Claims systems, and Policy administration (PLA) to the modern platform.
  • Worked on Hartford's Enterprise Hadoop Platform (HDP); deeply hands-on with messaging & batch ingestion, optimized store and enrichment in HDFS/Hive, and data delivery for analytical use cases.
  • Work deeply in the data architecture & data governance area to establish and evolve the DG framework in data strategy, data quality, and metadata management using EMR and IBM IGC.
  • Participate in ARB sessions to deeply understand business use cases, reference architecture, data views, integration/data services needs, & end-to-end solution architecture blueprints.
  • Performed detailed POCs and modeling in NoSQL (MongoDB) to create JSON objects, collections, capped collections, aggregation pipelines, geospatial and text search, and TTL indexes in a key-value document data model.
  • Led data modeling for enterprise common messaging data services in PI/PC ACORD schema enhancement to support API/SOA messaging services.
  • Key member of MDM implementation; responsible for detailed data analysis, data profiling, and data standardization; define master data domains; build conceptual/logical/physical data models for the Customer, Contact, Account, Policy, and Claims subject areas.
  • Used the MS Azure cloud platform (Azure Data Lake, SQL Server, Azure SQL DW, Data Catalog, Azure Analysis Services, Power BI) to build a claims analytics solution.
  • Resolve complex data issues, provide ad-hoc critical data to business on demand, and construct and rewrite complex PL/SQL queries for ETL & BI reporting for optimized data access.
  • Performed POCs and worked on AWS Redshift, S3 buckets, and RDS; utilized Data Migration Service and SCT for schema conversion and migration.
  • Collaborate with the BI-Analytics team and data scientists to prepare data, guide leveraging of common data services, map data fields, and support data access and analytical reporting using MS BI, SAS, & R.
  • Write complex PL/SQL queries, determine indexing strategy, and create views and materialized views to implement business logic and improve database performance.
  • Regularly present knowledge sessions & webinars among architects & tech groups; publish white papers on current data topics in architecture, modeling, NoSQL, big data, cloud, & analytics trends.
  • Actively participate in agile iteration planning, scrum meetings, and sprint reviews in an iterative & incremental agile delivery framework.
  • Manage and ensure project success, milestones, and critical path; ensure product deliverables such as requirement specifications and detailed design/dev/deployment from Confidential-managed projects.
  • Work in the areas of new business development, account growth, project management, resource demand forecasting, and customer relationship management.
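
The view/materialized-view approach to optimized BI data access mentioned above can be sketched as follows (Oracle syntax, hypothetical schema; not a specific client artifact):

```sql
-- Pre-aggregated claim totals for BI dashboards (hypothetical names).
-- ENABLE QUERY REWRITE lets the optimizer answer matching GROUP BY
-- queries against the detail table from this summary instead.
CREATE MATERIALIZED VIEW bi.mv_claim_totals
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
ENABLE QUERY REWRITE
AS
SELECT policy_id,
       COUNT(*)          AS claim_count,
       SUM(claim_amount) AS total_paid
FROM   claims.claim_fact
GROUP  BY policy_id;
```

Fast incremental refresh is also possible, but it additionally requires a materialized view log on the base table.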

Languages/Database: Oracle 11g/12c, Exadata, Vertica, SQL Server 2014, AWS Redshift, JSON, XML/XSD, UML, Python, R, ACORD insurance schema, IBM IAA/IIW, GuideWire ClaimCenter & PolicyCenter.

Special Software/Tools: Tableau, IBM InfoSphere MDM, InfoSphere IGC, Information Analyzer, CA ERwin r7.3/8.2/9.6, CA Model Mart, Informatica PowerCenter 9.6, GoldenGate, MS BI (SSAS, SSRS, Power BI), Trillium, Subversion (SVN), Citrix, Altova XMLSpy, Toad

Big Data& Analytics stack: Hadoop, Hive, HiveQL, Sqoop, Apache Spark, R, MongoDB v3.4

Cloud: Azure SQL Server, Azure SQL DW, Blob Storage, PolyBase, HDInsight, MS Power BI.

Confidential, TN

Sr. BI/Data Warehouse Consultant

Responsibilities:

  • Led development of dimensional/semantic data models on top of the enterprise 3NF Confidential.
  • Define & establish data management and data governance standards, processes and best practices with focus on creating Information assets, lineage, and data quality framework.
  • Identify critical Data elements and implement Data Quality, data storage, movement, Data access.
  • Provide recommendation to optimize/fine tune TERADATA Data warehouse for data delivery/data consumption for complex BI Analytics use cases.
  • Recommend changes in the data warehouse on data model structure, partitioning & indexing strategy, creation of snapshot facts, and aggregated views for analytical reports.
  • Create join indexes, AJI, and partitioned primary indexes (PPI) to optimize performance and data access.
  • Evaluate reusability of DB objects: base views, custom views, and conformed dimensions.
  • Guide and mentor development teams on methodologies, processes and best practices.
  • Participate in scrum meeting on sprint demo, discuss backlog issues & resolution.
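
An aggregate join index (AJI) of the kind listed above can be sketched as follows; Teradata syntax, with hypothetical table and column names:

```sql
-- Aggregate join index: the optimizer can transparently rewrite
-- monthly summary queries on the detail fact to read this instead.
CREATE JOIN INDEX edw.sales_monthly_aji AS
SELECT store_id,
       EXTRACT(YEAR FROM sale_date)  AS sale_year,
       EXTRACT(MONTH FROM sale_date) AS sale_month,
       SUM(sale_amount)              AS total_sales
FROM   edw.sales_fact
GROUP  BY store_id,
          EXTRACT(YEAR FROM sale_date),
          EXTRACT(MONTH FROM sale_date)
PRIMARY INDEX (store_id);
```

Unlike a hand-maintained summary table, an AJI is kept current by the database as the base fact is loaded.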

Software/Tools Used: Teradata 14.0, DB2, Oracle 11g, SAS, CA ERwin v8.1, Teradata SQL Assistant, TD DBA Tool, Linux, MySQL, Informatica PowerCenter, SQL Server Analysis & Reporting Services.

Confidential, Atlanta, GA

Data Architect

Responsibilities:

  • Interact directly with business users and gather/clarify business and system requirements.
  • Perform data analysis to validate system data requirements by confirming source systems, data availability, and data quality, and identify integration issues.
  • Responsible for design and delivery of data solution architecture artifacts such as HLD, data integration interfaces, logical/physical DM, and ETL mapping specifications.
  • Design large-scale logical/physical data models using the ERwin tool, implement enterprise standards and best practices, provide DDL scripts, perform reverse engineering to derive DMs, and construct & publish DMs, data dictionary, and data catalog into the enterprise metadata repository.
  • Design and build ETL table-level specifications: source-target mapping, data sizing, change data capture, transformation rules, data quality rules, and initial & ongoing delta load rules.
  • Coordinate and work with technical leads and principal architects for review and input on architectural and technical design on large IT projects.
  • Build dimensional data models, summary tables, Semantic views for various enterprise BI & analytical applications.
  • Lead data quality initiatives, data management and data governance activities, perform data profiling in source systems, resolve complex data quality issues, instruct and extend support to ETL and BI-Analytics team on data structure, data consumption and query building.
  • Performed advanced performance tuning via partitioning, PDM extension, & DM simplification.
  • Perform data exploration and data mining on large historical and current data sets to satisfy ad-hoc business data requirements and provide key business insights.
  • Design BI reporting technical specifications; build complex reports using Report Studio and Query Studio.
  • Perform data discovery and large data set preparation by sourcing data from multiple systems; design standard/ad-hoc BI reports using Cognos and SAS.
  • Conduct and moderate DRB (Data Review Board) meetings on data model & architectural review of technical solution strategy and data architecture.
  • Work with enterprise architects, DBAs, & business; drive design reviews on data integration strategy, integration issues, solution architecture/data model, data quality, and other design aspects.
  • Work on TERADATA design best practices, data sourcing and integration strategies, indexing and partitioning strategies, and use of TERADATA utilities: MultiLoad, TPump, BTEQ, FastExport/FastLoad.
  • Worked on complex data issues and system performance optimization in TERADATA using primary and secondary indexes, statistics collection, multi-value compression, join indexes, and AJI creation.
  • Guide & mentor developer/QA team on functional and technical issues and provide SME support.
  • Interface with senior TCS management and business functions on business development, business strategy and resourcing needs that affect client relationship and account growth.
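
The initial & ongoing delta-load rules described above often come down to an upsert of changed source rows into the target; a sketch in ANSI MERGE syntax, with illustrative names:

```sql
-- Ongoing delta load: apply only rows whose source change timestamp
-- is newer than what the target dimension already holds.
MERGE INTO dw.customer_dim AS tgt
USING stg.customer_delta AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.change_ts > tgt.last_update_ts THEN
    UPDATE SET customer_name  = src.customer_name,
               last_update_ts = src.change_ts
WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name, last_update_ts)
    VALUES (src.customer_id, src.customer_name, src.change_ts);
```

The timestamp guard in the MATCHED branch makes the load idempotent: re-running the same delta batch changes nothing.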

Operating System: UNIX, Windows XP Professional

Languages/Database: Java script, PL/SQL, Oracle 10g, TERADATA, SQL Server, MySQL

Special Software/Tools: CA ERwin r7.5, Teradata SQL Assistant 14.0, Informatica PowerCenter 9.1, SAS, OBIEE, Cognos, GoldenGate, Telegence (telecom billing), HP QC, Composite (data virtualization tool), MS Visio, MS Office

Confidential, Denver, CO

Technical Lead/Data Architect

Responsibilities:

  • Interact directly with business, finance user and functional expert to clarify business/ system data requirements and prepare data requirement specifications and Data matrix.
  • Responsible for data architecture solution such as reference architecture, integration interface detailed design, Table level ETL specs, Data Model & metadata artifacts.
  • Plan, manage, & communicate project timeline, milestones, critical path, risks, delays, and overall progress to the key stakeholders.
  • Build Conceptual, Logical and Physical Data Model, DB Script and perform query performance tuning.
  • Lead and facilitate data architecture design discussions with technical leads, architects, business customers, and SMEs on external systems in order to derive data requirements.
  • Perform data profiling and data analysis; provide instruction and support to the BI & ETL team on data integration issues, data transformations, extract/load logic, and the CDC approach.
  • Review and provide input /support to QA test plans, provide subject matter expertise to resolve technical and functional complexity.

Operating System: UNIX, Windows 7

Languages/Database: Java Script, Exadata/Oracle 11g,MySQL, SQL Server.

Special Software/Tools: CA ERwin r7.3/7.5, Toad, Informatica PowerCenter 9.1, SAP BO, SAP Confidential Tool, Control-M, MS Visio

Confidential, Madison, WI

Data Architect/Data Modeler

Responsibilities:

  • Lead technical team, communicate project status including timeline, risks & overall project progress
  • Responsible for system data requirement gathering, functional analysis, data catalog creation.
  • Create data flow diagram, conceptual data model, ETL integration and data migration strategy.
  • Design logical & physical data model for operational and Dimensional DM for Analytical system.
  • Develop enterprise data strategy, Data Quality Rules, data modeling processes, standard template and best practices document.
  • Performed requirements gathering for BI reports, performed deep data analysis, prepared BI tech specs, & assisted BI development in data access, user views, and aggregated/summary data creation.
  • Do data discovery, investigate data quality issues, design standard/ad-hoc BI reports in BO XI.
  • Initiated & enforced data quality rules, data governance policies-procedures in end to end data design process.
  • Design and build of high precision analytical BI-report used for rigorous experimentation and iterative analytics on large datasets.
  • Build transaction and snapshot facts, dimension views, and factless facts for simplified data access.
  • Performed data profiling, large data set preparation, data mining and exploration with statistical and machine learning algorithms.
  • Database performance tuning, query optimization and optimization of load process
  • Reviewed test plans and critical test cases, assisted in system, load and performance testing.
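
The snapshot-fact and factless-fact patterns listed above can be sketched in generic SQL DDL; all names are hypothetical:

```sql
-- Periodic snapshot fact: one row per account per month,
-- capturing state (balance) rather than a transaction event.
CREATE TABLE dw.account_balance_snapshot (
    snapshot_month_key INTEGER NOT NULL,  -- FK to the date dimension
    account_key        INTEGER NOT NULL,  -- FK to the account dimension
    balance_amount     DECIMAL(14,2),
    PRIMARY KEY (snapshot_month_key, account_key)
);

-- Factless fact: records only that a coverage was in force for a
-- policy in a month; no numeric measures, analysis is by COUNT(*).
CREATE TABLE dw.policy_coverage_factless (
    policy_key   INTEGER NOT NULL,
    coverage_key INTEGER NOT NULL,
    month_key    INTEGER NOT NULL
);
```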

Operating System: UNIX, Windows XP Professional

Languages/Database: Watson Health, R, .NET, C#, Oracle 10g, SQL Server, Toad.

Special Software: PeopleSoft, CA ERwin r7.3, Informatica PowerCenter, Visio, IBM Rational tools, SAP BO XI.
