
Data Integration Solution Architect Resume


Data Integration Solution Architect, Dublin, Ohio

SUMMARY

  • Operationalized business intelligence solutions and conducted advanced analytics to identify opportunities and strengths to meet business goals.
  • Created and maintained data visualizations that educated, informed, and engaged business partners on key metrics and performance measures.
  • Collaborated with enterprise data warehouse, data governance and business teams on data quality issues, as well as architecture or structure of data repositories.
  • Worked with cross-functional teams to structure problems, identify appropriate data sources, extract data, and develop integrated information delivery solutions.
  • Experience with Google Cloud Storage Solutions, Compute Engine, Cloud Networking solutions and Distributed Networking technology.
  • Leading the development and implementation of data ingestion and streaming technologies such as Attunity Replicate and Compose.
  • Developing dashboards using the New Relic tool for alerting mechanisms.
  • Expert in designing, developing, implementing, and maintaining a database and programs to manage data analysis factors.
  • Expert in data mining techniques including pre-processing and cleaning methods, feature extraction, data visualization, predictive model creation, validation, and reconciliation.
  • Ability to analyze large amounts of data to uncover hidden patterns and unknown correlations to support business objectives and priorities.
  • Experience in building and maintaining dashboards to track product usage and performance using Data Studio.
  • Strong business analytical skills; ability to apply business logic to design and implement data mining techniques on large data sets and define metrics/KPIs/reports.
  • Experience with tools, technologies and practices needed to perform in-depth analysis of both structured transactional data and semi-structured or unstructured data.

TECHNICAL SKILLS

  • Attunity Replicate
  • Attunity Compose
  • Qlik Enterprise Manager
  • New Relic
  • Google Cloud Platform
  • GCS
  • BigQuery
  • Cloud Functions
  • BQ Command Line Utilities
  • Stackdriver
  • SQL
  • Alteryx
  • AtScale
  • Tableau
  • Data Studio
  • HTML, CSS, JavaScript
  • JIRA
  • GitHub
  • Databases
  • MySQL
  • DB2
  • Oracle
  • MSSQL
  • Teradata
  • SAP HANA

PROFESSIONAL EXPERIENCE

Confidential, Dublin, Ohio

Data Integration Solution Architect

Responsibilities:

  • Create data integration strategies between disparate systems and the cloud data platform.
  • Working with the Enterprise Data Architect to design and maintain the overall data strategy.
  • Actively working on defining strategic technologies and proving them out in lab/sandbox environments.
  • Collaborate with product teams to understand data and integration challenges, identify solutions, and guide the teams in implementing the solution.
  • Identifying, designing, and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Working with stakeholders, including data, design, product, and executive teams, and assisting them with data-related technical issues.
  • Working on building and optimizing data sets, ‘big data’ data pipelines and architectures.
  • Assembling large, complex sets of data that meet non-functional and functional business requirements.
  • Performing root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions.
  • Building processes that support data transformation, workload management, data structures, dependencies, and metadata.

Confidential, Dublin, Ohio

Google Cloud Platform Data Engineer

Responsibilities:

  • Involved in designing the ingestion pattern for high volumes of data from legacy systems (SQL, Teradata, DB2, HANA) to Google Cloud Storage and Google BigQuery.
  • Developing Dataflow jobs in Python for migrating data between various GCP components and transforming the input into a desired output (a minimal pipeline sketch follows this list).
  • Defining the security architecture for Google BigQuery datasets using service accounts, Active Directory groups, and authorized views.
  • Performing data munging and transformations in BigQuery to help business users make better decisions.
  • Writing Cloud Functions in Node.js and Python for background triggering of events (an example function follows this list).
  • Developed tasks in Attunity Compose for merging the files ingested into GCS by Attunity Replicate.
  • Actively involved in the development of AtScale cubes for creating a semantic layer for consumption tools such as Tableau, QlikView, and SAP BO.
  • Developing cost-optimization strategies for read/write operations in Google BigQuery by partitioning and clustering high-volume tables (see the partitioning sketch after this list).
  • Developing technical specification, technical unit testing, and technical design documents for each interface and reporting object for audit purposes.
  • Scheduling jobs via Automic, using shell scripts to execute the routines present in BigQuery.
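
A minimal Apache Beam sketch of the kind of Dataflow job described above (reading delimited files from GCS, transforming rows, and appending them to BigQuery). The project, bucket, and table names are placeholders, and the parsing logic is purely illustrative rather than the actual pipeline:

    # Illustrative Apache Beam / Dataflow pipeline: read delimited lines from
    # GCS, parse them, and append rows to an existing BigQuery table.
    # Project, bucket, and table names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_line(line):
        """Turn a 'customer_id,amount' line into a BigQuery-ready dict."""
        customer_id, amount = line.split(",")
        return {"customer_id": customer_id, "amount": float(amount)}


    def run():
        options = PipelineOptions(
            runner="DataflowRunner",            # use "DirectRunner" for local testing
            project="example-project",          # placeholder project id
            region="us-central1",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
                | "ParseLines" >> beam.Map(parse_line)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:staging.transactions",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,  # table pre-exists
                )
            )


    if __name__ == "__main__":
        run()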
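
A minimal Python sketch of a background-triggered Cloud Function of the sort mentioned above, fired when an object is finalized in a GCS bucket; the function name and bucket are hypothetical:

    # Background-triggered Cloud Function (Python runtime): runs whenever an
    # object is finalized in a GCS bucket and logs its basic metadata.
    def on_gcs_object_finalize(event, context):
        """Entry point for a google.storage.object.finalize trigger."""
        bucket = event.get("bucket")
        name = event.get("name")
        size = event.get("size")
        print(f"New object gs://{bucket}/{name} ({size} bytes), eventId={context.event_id}")

Deployed, for example, with: gcloud functions deploy on_gcs_object_finalize --runtime python39 --trigger-resource example-landing-bucket --trigger-event google.storage.object.finalize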
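
A sketch of the partitioning and clustering approach referenced above, using the google-cloud-bigquery client; the project, dataset, table, and schema are assumed for illustration:

    # Create a date-partitioned, clustered BigQuery table so queries filtering
    # on event_date and customer_id scan fewer bytes (and incur less cost).
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project

    schema = [
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ]

    table = bigquery.Table("example-project.consumption.transactions", schema=schema)
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_date",                    # partition by the date column
    )
    table.clustering_fields = ["customer_id"]  # cluster rows within each partition

    table = client.create_table(table, exists_ok=True)
    print("Created", table.full_table_id)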

Confidential

Senior Software Engineer

Responsibilities:

  • Development, implementation, and testing of business intelligence solutions, including ETL packages for the data warehouse, financial reports, and dashboard reports. Supported diverse bank reporting needs such as MIS reporting, bank-centric reporting, day-to-day operational reporting, financial portfolios, and customer-centric reports.
  • Design, development, and implementation of ETL packages and cube solutions in enterprise-level data warehouse projects, covering integration of data sources, information/data transformation, and business-problem-to-solution mapping, delivering knowledge insights to support diverse decision-making processes more efficiently.
  • Involved in Database conversion and migration for multiple Healthcare clients including international as well as state healthcare.

Confidential

Software Engineer

Responsibilities:

  • Set up the environment, system workflow, database modelling, DW design, and functionalities for new data marts, as well as maintenance, performance, and functional enhancement of existing data marts (on sandbox, UAT, and production).
  • Involved in various phases of software engineering, viz. requirement gathering, requirement analysis, detailed design, development, implementation, and testing.
  • Leading team meetings and client communication events.
  • Analyzing performance improvement areas in BI & analytics projects and providing solutions to media agencies and clients to address them.
  • Collaborated with development team and clients to improve implementation of processes and managed web content on Verizon.com; increased online customer base by approximately 15%.
  • Administered implementation of promotion and discount strategies for existing customers, resulting in retention of approximately 75% of the targeted group.

Confidential

Responsibilities:

  • Requirement gathering and analysis for ingesting the data from multiple sources like SQL, Oracle, Teradata, HANA into GCS.
  • Creation of the T-specs requirement for development.
  • Using Attunity Replicate for ingesting the real time data into GCS.
  • Ingesting the Caller Information/Data into GCS using Attunity.
  • Maintaining the source-identical layer in GCS and migrating the data into BigQuery using BQ scripts and Dataflow.
  • Creating a consumption layer in Google BigQuery for the objects.
  • Creation of cubes (fact-dimension models) using AtScale, used for creating a semantic layer for consumption via Tableau and SAP BO.
  • Building strategies for the security of data using authorized views and the creation of AD groups in BQ (see the authorized-view sketch after this list).
  • Creation of data models in BigQuery that match the logical structures created using PowerDesigner.
  • Creating test cases and a test plan as per the requirements for performing technical unit testing (TUT).
  • Requirement gathering and analysis for ETL and Reporting packages. Participating in documentation for Business Scope, business case, milestone delivery plan and detailed work breakdown structure etc.
  • Designing the overall control flow of ETL packages and breaking it down into extract, transform, and load modules using SSIS and Pentaho.
  • Creating test cases and a test plan as per the requirements for extraction packages; executing unit tests and validating expected results, iterating until test conditions have passed; developing QA dashboard reports in Tableau to monitor execution of scheduled packages at the enterprise level.
  • Performing problem assessment, root cause analysis, problem resolution, and documentation in existing ETL packages; performing performance and resource-utilization analysis, benchmarking, and performance optimization in existing ETL and reporting packages (SSRS, Tableau).
  • Writing and optimizing custom script tasks in SQL to incorporate custom business logic (financial metrics, calculated columns) and validating results with SMEs and business users, accommodating iterative changes based on feedback. Scripts include custom functions, stored procedures, ad hoc scripts, and script tasks in ETL packages.
  • Collaborating with all developers and business users to gather required data for reporting & analytics needs.
  • Bank Analytics is an advanced data analytics platform designed to address anti-money-laundering (AML) compliance needs in the banking, insurance, and payments industries using advanced statistical analysis methods (artificial intelligence, machine learning, predictive modelling) in data analytics.
  • Design and development of predictive models for risk assessment of customers using their financial profiles, involving automated data collection and risk scoring based on predefined rules, plus a social network graph analysis model to keep track of complex transactions among thousands of entities.
  • Classifying customers into risk severity levels using a machine learning model; assessing the model over a partitioned dataset to measure unbiased prediction accuracy.
  • Designed and developed a machine learning predictive algorithm in Python for capturing fraudulent transactions in the OMNI Enterprise anti-money-laundering finance module.
  • Performing unsupervised learning analysis such as clustering with K-means, similarity index analysis with Jaccard's coefficient and Gower's similarity index, and dendrogram analysis for customer segmentation, to discover transactional behaviour patterns and flag outlier behaviour indicating potential laundering (a clustering sketch follows this list).
  • Involved in the core team for data extraction, cleaning, outlier analysis, transformation, and model building through to results capture, serving these results as key insights to the AML team via system-embedded popup reports, visual dashboards, and various other integrated reporting needs.
  • Knowledge of predictive models: multivariate regression models, neural networks, decision trees (CART), and A/B testing.
  • Benefited various clients by detecting AML violations on a proactive basis, thus reducing the probability of massive fines, increasing accuracy in the production of suspicious activity reports, and identifying complex money-laundering patterns and incorporating them into the expert system rules for accurate prediction.
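
A sketch of the authorized-view pattern mentioned above: a view in a consumption dataset is granted read access to a restricted source dataset so end users never query the underlying tables directly. All project, dataset, and view names are placeholders:

    # Grant a consumption-layer view authorized access to a restricted dataset.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project

    # 1. Create (or replace) the view exposing only approved columns.
    client.query(
        """
        CREATE OR REPLACE VIEW `example-project.consumption.customer_view` AS
        SELECT customer_id, risk_score
        FROM `example-project.restricted.customers`
        """
    ).result()

    # 2. Add the view to the restricted dataset's access entries.
    source_dataset = client.get_dataset("example-project.restricted")
    view_ref = bigquery.TableReference.from_string(
        "example-project.consumption.customer_view"
    )
    entries = list(source_dataset.access_entries)
    entries.append(bigquery.AccessEntry(None, "view", view_ref.to_api_repr()))
    source_dataset.access_entries = entries
    client.update_dataset(source_dataset, ["access_entries"])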
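
An illustrative sketch of the K-means segmentation and outlier flagging described above; the features, synthetic data, and two-standard-deviation threshold are assumptions for demonstration, not the production AML model:

    # Segment customers on transactional features with K-means, then flag
    # points unusually far from their cluster centre for manual AML review.
    import numpy as np
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    # Two dense customer segments plus one injected anomaly (synthetic data).
    rng = np.random.default_rng(7)
    low_activity = np.column_stack([rng.normal(12, 2, 40), rng.normal(300, 30, 40)])
    high_activity = np.column_stack([rng.normal(120, 10, 40), rng.normal(2500, 200, 40)])
    anomaly = np.array([[130.0, 9000.0]])        # extreme amounts for its segment
    df = pd.DataFrame(np.vstack([low_activity, high_activity, anomaly]),
                      columns=["monthly_txn_count", "avg_txn_amount"])

    X = StandardScaler().fit_transform(df)
    kmeans = KMeans(n_clusters=2, n_init=10, random_state=42).fit(X)

    # Distance of each customer to its own cluster centre.
    dist = np.linalg.norm(X - kmeans.cluster_centers_[kmeans.labels_], axis=1)
    df["segment"] = kmeans.labels_
    df["potential_outlier"] = dist > dist.mean() + 2 * dist.std()
    print(df[df["potential_outlier"]])           # the injected anomaly should surface here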
