
Sr Data Architect Resume


Bellevue, WA

SUMMARY:

  • Seasoned Data Warehouse & BI Consultant with 13+ years of IT experience across the software development life cycle.
  • Expertise in data warehouse and business intelligence analysis, design and development.
  • Worked on several projects spanning solution architecture, system analysis, data model design, data analysis, data profiling and source-to-target mapping design.
  • Substantial experience in insurance, telecom and finance industries.
  • Worked as a data modeler, delivered successful data models and supported each phase of SDLC through final implementation.
  • Experienced in creating business solutions based on business requirements.
  • Strong ability to work with functional analysts and SMEs to formulate complex business process requirements and convert them into technical specifications.
  • Expertise in conceptual, logical and physical data modeling, data analysis and data governance.
  • Designed and developed data models utilizing concepts such as star and snowflake schemas and slowly changing dimensions.
  • Designed and implemented a Big Data infrastructure based on Hadoop ecosystem technologies.
  • 2 years of experience with Big Data and Hadoop Ecosystems (HDFS, HBase, Hive, Pig, Sqoop).
  • Excellent knowledge of the NoSQL database DynamoDB and of analyzing data using HiveQL.
  • Configuration and deployment experience with AWS database technologies such as RDS and DynamoDB, and AWS services such as EC2, S3, EBS, ELB, VPC, and Route 53.
  • Developed ETL solutions; very strong knowledge of the Informatica and DataStage ETL tools.
  • Excellent interpersonal skills and ability to work in a dynamic, team-oriented environment.
  • Experienced in project quotes, quality processing, release processes and release documentation.
  • Excellent team player with good analytical, troubleshooting and communication skills.
  • Extensive working experience in Agile & DevOps models.
  • Communicate regularly with the business team, project manager, solution owner & SMEs.
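
The slowly-changing-dimension handling mentioned above can be sketched as a Type 2 update: expire the current dimension row and version in the new one. This is an illustrative sketch only; the table and column names (`customer_id`, `state`, `is_current`) are hypothetical, not from any project listed here.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked_cols, today=None):
    """Type 2 SCD: when a tracked attribute changes, expire the
    current row and append a new current version."""
    today = today or date.today()
    for row in dim_rows:
        if row[key] == incoming[key] and row["is_current"]:
            if all(row[c] == incoming.get(c) for c in tracked_cols):
                return dim_rows  # nothing changed, keep as-is
            row["is_current"] = False   # expire the old version
            row["valid_to"] = today
            break
    new_row = {key: incoming[key],
               **{c: incoming.get(c) for c in tracked_cols},
               "valid_from": today, "valid_to": None, "is_current": True}
    dim_rows.append(new_row)
    return dim_rows

# Example: a customer moves from WA to TX, producing two versions
dim = [{"customer_id": 1, "state": "WA",
        "valid_from": date(2015, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, {"customer_id": 1, "state": "TX"},
                 "customer_id", ["state"])
```

The same logic is usually expressed as a MERGE/upsert in the warehouse; the in-memory form above just makes the versioning rule explicit.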

TECHNICAL SKILLS:

Databases: Oracle 11g, Teradata 7.0, SQL Server 2012

ETL Tools: Informatica PowerCenter 9.1.0, DataStage 7.5, SSIS.

Reporting Tools: Cognos, BusinessObjects, SSRS, Tableau.

Database/Modeling Tools: Toad, Teradata SQL Assistant, Visio 2013, Erwin 7.x

Languages: PL/SQL, UNIX scripting.

Hadoop Ecosystem: HDFS, Hive, HBase, Sqoop, & Pig

Amazon (AWS) Services: EC2, S3, EBS, ELB, VPC, Route 53, RDS, DynamoDB, Redshift.

Operating Systems: Windows 2008, UNIX

PROFESSIONAL EXPERIENCE:

Sr Data Architect

Confidential, Bellevue, WA

Responsibilities:

  • Actively involved in IDW requirement gathering, data governance meetings, and planning and estimating tasks for the BI business capability model.
  • Prioritized business-critical requirements by interacting with the solution owner.
  • Collaborate with business analyst, business consultants and other subject matter experts to design, develop and enhance data models that meet desired business functions.
  • Prepared & reviewed the HLSD solution with the solution owner, program manager & business SMEs.
  • Implement end to end Hadoop solutions with a deep understanding of the Hadoop Ecosystem.
  • Develop, maintain and effectively communicate source to target mappings and data models design concepts for use as data integration (ETL) specifications.
  • Develop and maintain metadata and metadata management strategies for the data model.
  • Work with various groups (DBA, BI architects, system analyst & data analyst) to identify and resolve design or performance issues with the enterprise data warehouse.
  • Collaborate with management and other data architects on development of data strategies, best practices, defining data dictionary and standards.
  • Maintains knowledge of developments & research related to current technologies and IT trends.
  • Excellent understanding and knowledge of HDFS and NoSQL databases such as HBase.
  • Complete assigned deliverables with quality, efficiency and within established timelines.
  • Designed and developed data flow diagrams and data flow details for each source system, including Hadoop ecosystem components.
  • Expertise in the Informatica ETL tool; developed several projects using Informatica.
  • Very strong experience in AWS Architecture implementations and recommending AWS best practices/ standards/ tools.
  • Experience importing/exporting data using Sqoop between HDFS and RDBMS.
  • Regularly updated the solution owner and program manager on the development cycle.
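
A transfer like the Sqoop import/export above is typically validated by reconciling row counts and key sets between source and target. A minimal, cluster-free sketch of that reconciliation check (the data and the `id` key are hypothetical):

```python
def reconcile(source_rows, target_rows, key):
    """Compare row counts and key sets between source and target
    after a data transfer (e.g. a Sqoop import/export)."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),      # dropped rows
        "unexpected_in_target": sorted(tgt_keys - src_keys),   # phantom rows
    }

# Example: one row went missing during the transfer
src = [{"id": 1}, {"id": 2}, {"id": 3}]
tgt = [{"id": 1}, {"id": 3}]
report = reconcile(src, tgt, "id")
```

In practice the same counts would come from `SELECT COUNT(*)` on the RDBMS side and a Hive query on the HDFS side; the comparison logic is the same.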

Environment: Teradata, Oracle, HDFS, HBase, Pig, Sqoop, Hive, Big Data Hadoop, UNIX, Amdocs billing system (SAMSON at Confidential) and Ericsson billing system (BSCS & CBiO), Tableau, Amazon (AWS) services: EC2, S3, ELB, VPC, Route 53, RDS, DynamoDB and Redshift.

Sr. Data Architect

Confidential, Seattle, WA

Responsibilities:

  • Conducted requirements gathering meetings with the business team, subject matter experts and database administrators to ascertain business requirements.
  • Governed database structural requirements by analyzing client operations, business processes and applications, and evaluating current systems and programming.
  • Designed conceptual, logical (LDM) and physical data models (PDM) in Erwin.
  • Hands-on experience in source-to-target mapping (STTM) creation.
  • Documented Visio diagrams of business processes and data flow diagrams (DFD) and delivered them to the business owner.
  • Communicated with the San Antonio team on a regular basis for data analysis.
  • Interact with offshore development team & reviewed ETL mappings.
  • Performed data analysis and data profiling for the collector assignment tool, a UI-based application for iHeartMedia.
  • The collector assignment tool maintains data about all vendors who run advertisements on Clear Channel.
  • Performed data analysis of the source system, landing area, integration area and several sources from the master data management system.
  • Work closely with Technology Operations & Infrastructure team in building out AWS environment for development of various AWS Services including EC2, S3, RDS, and EMR.
  • Prepared detailed analysis reports before development started.
  • After data analysis and STTM creation, reviewed SSIS packages.
  • Provided test case scenarios to the quality assurance team and supported testing.
  • Actively involved in deployment of collector assignment tool.
  • Make strategic recommendations on data analytics, data quality, data modeling, collection, integration and retention requirements incorporating business requirements and best practices.
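
The data profiling mentioned above usually starts with per-column summaries: row counts, null counts and distinct counts. A minimal sketch with hypothetical vendor data (the column names are illustrative, not from the actual tool):

```python
def profile(rows):
    """Basic column profiling: null and distinct counts per column,
    the kind of summary used during source-system data analysis."""
    cols = rows[0].keys() if rows else []
    stats = {}
    for c in cols:
        values = [r.get(c) for r in rows]
        non_null = [v for v in values if v is not None]
        stats[c] = {"rows": len(values),
                    "nulls": len(values) - len(non_null),
                    "distinct": len(set(non_null))}
    return stats

# Example: profile a small vendor extract
rows = [{"vendor": "A", "channel": "radio"},
        {"vendor": "B", "channel": None},
        {"vendor": "A", "channel": "radio"}]
stats = profile(rows)
```

Numbers like these feed directly into the detailed analysis reports: a high null count or an unexpectedly low distinct count flags a column for follow-up before ETL development starts.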

Environment: SQL Server 2012, Hadoop (Hive), SSIS (SQL Server Integration Services), SSRS, Erwin, Tableau, UNIX, Amazon Web Services (AWS): EC2, EBS, S3, RRS, CloudWatch

Sr Data Architect

Confidential, Bellevue, WA

Responsibilities:

  • Worked closely with Confidential's commissions business team to finalize commissions business requirements and SCMS Adapter reporting needs.
  • Gained an understanding of the new Ericsson billing system specific to the commissions subject area.
  • Conducted requirements gathering workshops with Ericsson and Confidential subject matter experts and data architects to ascertain business requirements.
  • Studied, analyzed and presented existing business processes & requirements to the Ericsson team members.
  • Define and implement standard data governance processes that drive quality data, reports, and performance indicators to locate and correct issues.
  • Prepared and presented the business high-level solution design (HLSD) for the SCMS Adapter.
  • Prepared the functional design & source system analysis requirement documents.
  • Analyzed the existing AS-IS architecture and data flows from different source systems.
  • Communicated with the Amdocs team for Confidential billing business logic.
  • Acted as a prime contact between the business analyst and application development teams.
  • Documented the data authority process, data profiles & data scrutiny.
  • Documented business requirements and classified them into high level and low level use cases, and activity diagrams using Visio.
  • Designed and developed conceptual, logical & physical data models for the new billing system.
  • Key responsible person on the Confidential commissions software quality assurance (QA) team.
  • Created source-to-target mappings (STTM) and reviewed development team code.
  • Review & modified test case scenarios prepared by the testing team.
  • Maintained business requirement status (open/closed/approved) in Quality Control (QC) on a weekly basis.
  • Created a POC to import and export data between HDFS and the EDW (RDBMS) using Sqoop.
  • Architected Big Data software systems for complex data analytics requirements, including data processing, analytics and reporting.
  • Experience in analyzing data using HiveQL.
  • Knowledge of the Amdocs (SAMSON & RPX) Confidential billing system.
  • Excellent knowledge of and experience with the Informatica ETL tool.
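
The source-to-target mappings (STTM) above pair each target column with a source column and a transformation rule. A minimal sketch of applying such a mapping to one row; the column names and rules are hypothetical, not the actual commissions feed:

```python
def apply_sttm(row, mapping):
    """Apply a source-to-target mapping: each target column is
    derived from a source column plus an optional transform."""
    target = {}
    for tgt_col, (src_col, transform) in mapping.items():
        value = row.get(src_col)
        target[tgt_col] = transform(value) if transform else value
    return target

# Hypothetical STTM for a commissions record
sttm = {
    "SUBSCRIBER_ID": ("sub_id", None),                      # straight move
    "COMMISSION_AMT": ("amt", lambda v: round(float(v), 2)),  # cast + round
    "STATUS_CD": ("status", lambda v: v.upper()),           # standardize case
}
out = apply_sttm({"sub_id": "1001", "amt": "12.50", "status": "paid"}, sttm)
```

In an ETL tool the same rules live in transformations and expressions; the value of the STTM document is that each target column's derivation is stated once, unambiguously, for both developers and reviewers.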

Environment: Informatica 8.6, Oracle, Teradata, SQL Server, Visio, MS PowerPoint, PL/SQL, Erwin, UNIX, Amdocs billing system and Ericsson billing system, Big Data and Hadoop ecosystems (Hive, HBase, Sqoop, HDFS), Amazon (AWS) services: EC2, S3, ELB, VPC, Route 53, RDS, DynamoDB and Redshift.

Principal Data Analyst

Confidential

Responsibilities:

  • Worked as data analyst team lead with primary responsibility for analyzing the enterprise data warehouse.
  • Worked with multiple stakeholders, both business and IT, to build a holistic view of the organization's data model.
  • Developed enterprise-wide guidelines/standards for database design & data standards.
  • Extensively worked on data profiling, data quality, data volume & load frequency.
  • Work with solution architects to develop ETL / data integration / data processing flows.
  • Worked closely with finance, marketing and the compensation teams to understand business needs and prepared T-shirt-size estimates for each project.
  • Translated business and user requirements into accurate source system requirements specification documents through systems analysis and data profiling.
  • Prepared use case diagrams to present to the business, architect and DBA.
  • Designed logical and physical data models per system requirements using a data modeling tool.
  • Designed data models based on CDC requirements and high-level design documents.
  • Defined strategies to retain data for long periods of time.
  • Prepared data analyst task plans and delivered all milestones on time and with quality.
  • Defined tasks in Caliber and tracked the entire analysis phase with other stakeholders.
  • Developed source-to-target (STTM) mappings for all tables per business requirements.
  • Worked in a high performance and team environment with rapidly changing priorities.
  • Assisted the development team to understand STTM business logic and performance tuning.
  • Subject matter expert on the Confidential CRIS-CTL billing system.
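
Snapshot-based change data capture, as referenced in the CDC design work above, classifies rows as inserts, updates or deletes by comparing two extracts on a key. An illustrative sketch (the `plan` data is hypothetical):

```python
def detect_changes(previous, current, key):
    """Snapshot-comparison CDC: diff two extracts on a key and
    classify each row as an insert, update or delete."""
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys()
               if curr[k] != prev[k]]
    return inserts, updates, deletes

# Example: id 2 changed plan, id 1 disappeared, id 3 is new
prev = [{"id": 1, "plan": "basic"}, {"id": 2, "plan": "pro"}]
curr = [{"id": 2, "plan": "enterprise"}, {"id": 3, "plan": "basic"}]
inserts, updates, deletes = detect_changes(prev, curr, "id")
```

Log-based CDC avoids the full-extract comparison, but the snapshot diff is the fallback when the source system exposes no change log.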

Environment: Erwin Data Modeler, Visio, Oracle, Informatica, Confidential and LQ data source systems, CRIS-CTL billing system.

Sr System Analyst

Confidential, Bellevue, WA

Responsibilities:

  • Worked closely with different business groups to determine their business, systems, data and reporting needs for various levels of management across the enterprise.
  • Conducted requirements gathering workshops with the key clients and subject matter experts and data architecture to ascertain business requirements.
  • Analyzed and documented existing business processes using MS Visio and recommended process improvements to streamline and improve efficiency.
  • Documented data transformations business logic for ETL-Informatica.
  • Analyzed the source system SAMSON Oracle database and delivered the source system analysis (SSA) documents.
  • Communicated with the Amdocs team for Confidential billing business logic.
  • Acted as a prime contact between the business analyst and application development teams.
  • Documented business requirements and classified them into high level and low level use cases, and activity diagrams using Visio.
  • Expert in database design and data modeling principles.
  • Used CA Erwin to develop enterprise data models.
  • Analyzed and documented complex business processes for both existing and new systems and provided source to target (STTM) data mapping documents to the developers.
  • Developed, organized and conducted necessary training programs for the staff.

Environment: Informatica 8.6, Oracle, Teradata, Visio, MS PowerPoint, UNIX and Amdocs billing system SAMSON (Confidential)

Data Architect

Confidential, Nashville, TN

Responsibilities:

  • Developed an excellent understanding of the migration purposes with respect to the business, and analyzed ADS business logic and existing data models.
  • Proficient in understanding the business issues and data challenges of the client's organization.
  • Presented high level business and technical analysis of new and existing applications and database.
  • Documented ETL programs, business processes and made recommendations for enhancements and improvements.
  • Interviewed subject matter experts, data architects and DBA, asked detailed questions regarding the functionality of the new applications and documented the requirements in a format that could be reviewed and understood by both the businesses and technical employees.
  • Conducted JAD workshops with the development and reporting teams, covering use cases and data profiles.
  • Developed several ETL jobs business logic as per the business requirement.
  • Design, development, coding and testing were part of the analysis role.
  • Prepared current versions of high-level functional specification documents for the data mart project, and ran queries to pull business reports from the Oracle database.
  • Created and delivered technical design documentation, which included source system data and process flows.
  • Communicated with the client and business analyst on all phases of the new developments.
  • Guided and mentored onsite and offshore team members through all phases of the projects.
  • Assisted the team members in applying business rules and development.
  • Created complex source-to-target mapping rules using Informatica transformations, mapplets and reusable transformations.
  • Prepared requirements, specifications, business processes and recommendations.
  • Developed data cleansing ETL mapping documentation and dealt with ETL performance tuning issues in the production environment.
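
The data cleansing mappings documented above typically apply a small set of standard rules: trim whitespace, null out empty strings, and pass other values through. An illustrative sketch (the record is hypothetical):

```python
def cleanse(row):
    """Typical data-cleansing rules from an ETL mapping:
    trim whitespace and convert empty strings to nulls."""
    out = {}
    for col, val in row.items():
        if isinstance(val, str):
            val = val.strip()          # remove padding from the source
            if val == "":
                val = None             # empty string becomes a real null
        out[col] = val
    return out

# Example: a padded name and an empty region
dirty = {"name": "  Acme Corp ", "region": "", "balance": 42}
clean = cleanse(dirty)
```

In Informatica the same rules would sit in an expression transformation (LTRIM/RTRIM plus a null-if-empty check); documenting them once keeps every mapping consistent.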

Environment: DM Express, Informatica 8.6, Oracle, Teradata, Mainframe and UNIX scripting.

Data Modeler

Confidential, San Antonio, TX

Responsibilities:

  • Conducted interviews with key business users to collect requirements and business process information, forming questionnaires with a proper understanding of the company’s business model for regulatory compliance reporting.
  • Highly interactive in JAD sessions with project stakeholders, SME, end-users, DBA, developers and project management team for identifying business needs, defining software requirements and provided technical solutions.
  • Decomposed and translated business requirements into functional and non-functional requirements and created source system analysis (SSA) documents.
  • Documented business requirements and classified them into high-level and low-level use cases, activity diagrams, and data profile documentation.
  • Provided cost estimates, investment analyses and approaches for follow-on implementation and development efforts.
  • Demonstrated ability to architect complex integration scenarios using Informatica ETL.
  • Produced logical and physical data models specifications and source to target mapping documents.
  • Created requirements and functional specifications for the new ETL processes.
  • Expertise in various Informatica tasks: Session, Decision, Email, Event-Raise, and Event-Wait.
  • Documented source to target complex logic using functionality of Informatica ETL such as mappings, mapplets and reusable transformations.
  • Handled the offshore development teams.

Environment: Informatica 8.6, Informatica IDQ, Erwin, Oracle, Mainframe and UNIX.

Data Architect

Confidential, Dallas, TX

Responsibilities:

  • Attended project scoping executive meetings, defined detailed project scopes, goals and deliverables that supported the business goals.
  • Performed thorough GAP analysis, reviewing AS-IS system documents to incorporate the existing functionality into the TO-BE system.
  • Responsible for the re-architecture of an enterprise data model to support a new service oriented framework.
  • Created process flow diagrams for the TO-BE state using MS Visio.
  • Created data flow diagrams representing data flow between the various applications and portals and the enterprise data warehouse (EDW).
  • Provided data modeling consulting services to the application development teams.
  • Reviewed application data models to ensure compliance and integration with the enterprise data model, identifying potential data redundancies.
  • Documented data warehouse life cycles for large scale mortgage data warehouses.
  • Worked for the default IT database team, data analysis and logical/physical modeling.
  • Worked as a team lead and communicated with the offshore development teams.
  • Wrote source-to-target mappings and complex SQL logic for ETL mapping development.
  • Reviewed codes and technical designs of the other ETL developers.

Environment: Informatica 8.6, Erwin, Oracle, UNIX scripting.

ETL - Informatica Team Lead

Confidential

Responsibilities:

  • Collaborated with various members of the project teams to include client representatives, business analysts and technical staff members to understand the business.
  • Responsibilities included the design and construction of logical data models, physical data models and metadata.
  • Excellent exposure to the functionality and requirements of PbR; designed and documented ETL specifications.
  • Responsible for defining the ETL standards and best practices for the project.
  • Involved in PbR (Payment by Results) development.
  • Coordinated and supported the on-site and off-shore development team.
  • Experienced with performance tuning the ETL processes and reviewed and verified the codes.
  • Mentored the development team.
  • Implemented complex tariff logic in mappings by using mapplets and reusable components.
  • Developed mappings and workflows, and scheduled workflows.
  • Fixed mapping issues in production support.
  • Developed detailed design documents and technical specifications based on requirements.
  • Performed database deployment for maintenance releases.
  • Supported the scope and estimation for all releases.
  • Prepared database deployment and PbR component deployment guides for maintenance releases.
  • Prepared subsystem release notes as well as subsystem deployment guides for all releases.
  • Resolved PbR FQT and EMT defects.

Environment: Informatica 8.1.1, Informatica IDQ, Erwin, Oracle 9, Windows XP, UNIX.

ETL Informatica Team Lead

Confidential

Responsibilities:

  • As team lead, responsible for handling the development and reporting teams and supporting them with releases and technical solutions.
  • Worked closely with business analysts, application teams, data modeler, database administrators and reporting teams to ensure the ETL solution met business requirements.
  • Acted as the communication bridge between the development and production support teams, which were handled by Tata Consultancy Services (TCS).
  • Participated in estimating the tasks and cost to meet deliveries.
  • Extensively worked with DataStage Designer to develop various jobs.
  • Used sources such as sequential files, hashed files, mainframe copybooks and Oracle.
  • Extensively worked on partitioning bulk data into subsets distributed across different processors (MPP), achieving faster data processing using Parallel Extender.
  • Worked on Parallel Extender stages including Lookup, Join, Merge, Change Apply, Change Capture, Remove Duplicates, Funnel, Filter and Pivot.
  • Implemented both pipeline and partition parallelism.
  • Used DataStage Director to run, monitor and schedule jobs.
  • Extensively worked with Parallel Extender and the DataStage clients (Designer, Manager, Director and Administrator) to develop parallel jobs and enhance the performance of source-to-target transformation and loading.
  • Performed tuning of the ETL using various components.
  • Project planning, estimation and co-ordination.
  • Experienced in project quotes, RUS, release processes and release documentation.
  • Designed (HLD and LLD) and developed the application.
  • Attended requirement calls, work stack calls and PIR calls on behalf of the CORMIS dev team.
  • Ensured error free, timely and successful delivery of releases.
  • Knowledgeable about the data sources and transformation rules required to populate and maintain the targets.
  • Involved in optimizing performance by eliminating target, source, mapping and session bottlenecks.
  • Created Informatica mappings to load data from Excel files and flat files (CSV).
  • Used the Informatica designer to design and develop mappings for extracting, cleansing, transforming, integrating, and loading data into different targets.
  • Used debugger to test the mappings and fixed the bugs.
  • Developed mappings to load various dimensions.
  • Parameter files and mapping variables were used extensively to improve performance.
  • ETL tools used to design and develop mappings and loading data into different targets.
  • Maintained warehouse metadata, naming standards and warehouse standards.
  • Wrote test cases and test scripts for testing the mappings.
  • Participated in weekly status meetings, and conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
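
The partition parallelism above rests on one idea: hash each row's key so that rows with the same key always land on the same processor. A minimal sketch of that hash partitioning (the `acct` key and partition count are hypothetical):

```python
from zlib import crc32

def hash_partition(rows, key, n_partitions):
    """Hash-partition rows into subsets, the same idea DataStage
    Parallel Extender uses to spread work across processors."""
    partitions = [[] for _ in range(n_partitions)]
    for row in rows:
        # crc32 is deterministic, so equal keys always map to
        # the same partition - required for joins and aggregations
        p = crc32(str(row[key]).encode()) % n_partitions
        partitions[p].append(row)
    return partitions

# Example: spread 100 account rows over 4 partitions
rows = [{"acct": i} for i in range(100)]
parts = hash_partition(rows, "acct", 4)
```

Keeping equal keys together is what lets stages like Join, Merge and Remove Duplicates run on each partition independently, with no cross-partition traffic.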

Environment: Informatica, SQL Server 2005, mainframe copybooks, flat files, Excel files, Windows XP.

Software Developer

Confidential

Responsibilities:

  • Participated in analyzing user requirements.
  • Designed and modified the existing database to incorporate new requirements.
  • Performed database administration duties such as creating tables, indexes, views and user roles, performing regular backups and monitoring performance.
  • Identified data needed to deliver requirements, and developed data loading automation applications.
  • Developed application and test data.
  • Implemented front-end programming in Visual Basic to invoke the database.
  • Involved in documenting the entire technical process.
  • Involved in performance tuning.
  • Developed the test plan, test case and test scenarios.
  • Created technical as well as application flow documentation for extensive use during support.

Environment: VB 6.0, Oracle, MS Access, Windows 95.
