Solution Architect Resume

SUMMARY

  • Insightful, results-driven IT professional with 11+ years of experience and notable success directing a broad range of corporate cloud computing, information lifecycle management, data migration, test data management, data governance, and IT security initiatives, while participating in the planning, analysis, and implementation of solutions in support of business objectives.
  • Designed and implemented an Adaptive Data Foundation (ADF) solution on the AWS cloud using services such as EC2, S3 and Glacier storage, Athena (reporting), Glue, and the AWS CLI, together with Talend for ETL.
  • Proficient in AWS services such as VPC, EC2, S3, ELB, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.
  • Experienced in designing and developing ETL and big data analytics solutions to extract logs from on-premises and AWS cloud environments using CloudTrail, CloudWatch, Informatica PowerCenter, jSonar, and MongoDB.
  • Experienced in installing the AWS CLI and controlling AWS services through Python/Shell/Bash scripting (see the scripting sketch after this list).
  • Extensive experience designing and developing automation for infrastructure management and cloud security activities using Ansible Tower as the configuration management tool.
  • Good knowledge of relational and NoSQL databases such as MySQL, SQL Server, Oracle, DynamoDB, AWS RDS, and MongoDB.
  • Built extensive analytics reports and created merchant-level dashboards in Sumo Logic and Tableau for the business development team.
  • Skilled in analyzing mainframe programs and data sources such as VSAM, PS, and PDS, and extracting the business rules for archiving and decommissioning.
  • Experienced in building triggers, functions, cursors, stored procedures, and PL/SQL exception handling using TOAD, SQL Developer, and AQT.
  • Extensively used Informatica PowerCenter (9.6.1/10.1.1) for ETL (extraction, transformation, and loading) of data from multiple source systems, such as flat files (TXT, CSV) and databases, into the data warehouse.
  • Skilled in developing mappings, mapplets, sessions, workflows, worklets, and tasks using Informatica Designer and Workflow Manager.
  • Key person in delivering weekly, monthly, and quarterly business reports on data security to enterprise leadership and presenting high-level search analytics to stakeholders.
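
Below is a minimal sketch of the AWS CLI scripting pattern mentioned above: Python shelling out to the `aws` command and parsing its JSON output. The commands shown (S3 bucket listing, CloudTrail event lookup) are illustrative only, and no specific account resources are assumed.

```python
# Minimal sketch: driving the AWS CLI from Python, as described above.
import json
import subprocess

def aws_cli(*args):
    """Run an AWS CLI command and return its parsed JSON output."""
    result = subprocess.run(
        ["aws", *args, "--output", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

# List S3 buckets, then look up recent CloudTrail events.
buckets = aws_cli("s3api", "list-buckets")
for bucket in buckets["Buckets"]:
    print(bucket["Name"])

events = aws_cli("cloudtrail", "lookup-events", "--max-results", "5")
for event in events["Events"]:
    print(event["EventName"], event["EventTime"])
```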

TECHNICAL SKILLS

Databases: Oracle, SQL Server, Hive, AWS RDS (PostgreSQL), DynamoDB

Operating Systems: Linux, Windows

BI Tools: Excel, SSRS, Tableau, JReports (ILM), Data Visualization

Cloud Technology: Amazon Web Services and IBM Cloud Object Storage

Programming: Python, Shell scripting

PROFESSIONAL EXPERIENCE

Confidential

Solution Architect

Responsibilities:

  • Built and maintained technical advisor relationships to influence key technical decision makers on platform choices for data intelligence and cloud provisioning projects.
  • Built a modernized, highly responsive data ecosystem that helps the customer source, transform, and consume data through the cloud while enabling advanced analytical techniques.
  • Designed reports, queries, S3 folder layouts, and file rationalization for faster performance and minimal retrieval cost in Athena (see the query sketch after this list).
  • Performed post-archival checks with proper data validation methods, ensuring no data loss.
  • Built a role-based access platform in AWS IAM to control user access to the data in S3 storage (see the policy sketch after this list).
  • Participated in the planning, implementation, and growth of the customer's Amazon Web Services (AWS) foundational footprint.
  • Migrated existing Informatica ILM TDM applications to AWS EC2 reserved instances.
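
The Athena work above relied on partitioned S3 layouts to keep scan costs down. A hedged sketch of that pattern follows, assuming boto3 as the SDK; the region, database, table, partition columns, and result bucket are hypothetical placeholders, not the actual project objects.

```python
# Hedged sketch: querying a partitioned S3 layout through Athena via boto3.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # region assumed

# Partition pruning (year/month here) keeps Athena from scanning the
# whole S3 prefix, which is what drives retrieval cost down.
query = """
    SELECT merchant_id, SUM(amount) AS total
    FROM adf_archive.transactions
    WHERE year = '2023' AND month = '06'
    GROUP BY merchant_id
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "adf_archive"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then fetch the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)
    status = state["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if status == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)
    for row in rows["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```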
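
For the IAM role-based access bullet, here is an illustrative read-only S3 policy created through boto3 (assumed as the SDK). The bucket, prefix, and policy name are hypothetical; the actual platform's roles and trust relationships are not reproduced here.

```python
# Illustrative sketch of a role-based S3 access policy; names are hypothetical.
import json
import boto3

iam = boto3.client("iam")

# Read-only access scoped to one archive prefix in the bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-adf-archive",
                "arn:aws:s3:::example-adf-archive/reports/*",
            ],
        }
    ],
}

iam.create_policy(
    PolicyName="adf-archive-readonly",
    PolicyDocument=json.dumps(policy_document),
)
# The policy would then be attached to a role that trusted users assume.
```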

Confidential

Technical Lead

Responsibilities:

  • Served as lead architect designing the data masking strategies that ensure data security, and the ILM framework for application decommissioning, providing cost benefits to the customer.
  • Performed effort estimation and resource forecasting based on demand for future projects.
  • Developed recommendations on how the business and IT teams can continually improve their data governance and data quality practices.
  • Designed and developed masking methods and data subsetting criteria in Informatica PowerCenter and TDM (a generic masking sketch follows this list).
  • Extracted and loaded the lower environments with right-sized, PII-protected production data.
  • Automated network and database discovery, PII classification, vulnerability assessment, and data source creation in IBM Guardium.
  • Implemented a big data analytics platform for data security and governance across AWS cloud and on-premises environments.
  • Designed and developed pipelines and BI reports on various database logs to analyze vulnerabilities in PII (personally identifiable information) and other sensitive data.
  • Designed the future integration of data into the business process model using ETL and appropriate technologies.
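
As a generic illustration of the deterministic masking idea (not the Informatica TDM implementation itself), the sketch below hashes PII columns with a salt so masked values stay consistent across tables and environments; the column names and salt are hypothetical.

```python
# Generic illustration of deterministic PII masking, not Informatica TDM.
import hashlib

PII_COLUMNS = {"ssn", "email", "phone"}  # hypothetical PII columns
SALT = b"per-environment-secret"  # kept out of source control in practice

def mask_value(value: str) -> str:
    """Replace a PII value with a stable, irreversible token."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:12]

def mask_row(row: dict) -> dict:
    """Mask only the PII columns; deterministic hashing preserves joins."""
    return {
        col: mask_value(val) if col in PII_COLUMNS else val
        for col, val in row.items()
    }

print(mask_row({"id": 42, "email": "jane@example.com", "city": "Austin"}))
```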

Confidential

Solution Architect

Responsibilities:

  • Implemented the solution per the design requirements.
  • Developed archiving business objects and built unit test cases.
  • Built the solution to store, tag, and search the archived data in ILM.
  • Mapped data retention policies and data access roles for the data to be retired.
  • Developed an automation framework for creating entities with relationships and archive requests.
  • Interacted with SMEs and third-party vendors to gather information on the enquiries and reports identified in the analysis phase.
  • Extracted business rules and logic to build enquiries and reports similar to those of the source application.
  • Designed and developed similar reports in SSRS to retrieve the archived data from the file archive store.
  • Performed system integration testing for the archived systems.
  • Analyzed issues in the source and target databases from an SQL performance perspective.
  • Provided solutions to SQL performance issues.
  • Implemented best practices for archival, such as:
      ◦ Mining data by applications/modules
      ◦ Application security: creating the application users and assigning the respective roles and privileges to perform the data archival activities
      ◦ Defining the appropriate modules/applications identified for data archival in EDM
      ◦ Identifying complete business objects to create and define business entities in EDM
  • Improved the performance of the data archival process and the retrieval of archived data.
  • Developed secured archived-data retrieval for given retrieval criteria (a parameterized-query sketch follows this list).
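
A minimal sketch of what "secured retrieval for given criteria" can mean in practice: binding the caller's criteria as query parameters so they can never inject SQL. Here sqlite3 stands in for the actual file archive store, which is not named in this resume, and the table and criteria are hypothetical.

```python
# Minimal sketch: secured retrieval by criteria via a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the archive store
conn.execute("CREATE TABLE archive (order_id INTEGER, region TEXT, status TEXT)")
conn.execute("INSERT INTO archive VALUES (1, 'EMEA', 'retired'), (2, 'APAC', 'retired')")

def retrieve_archived(region: str, status: str):
    """Fetch archived rows matching the caller's criteria via placeholders."""
    cursor = conn.execute(
        "SELECT order_id, region, status FROM archive WHERE region = ? AND status = ?",
        (region, status),  # bound parameters, never string-formatted into SQL
    )
    return cursor.fetchall()

print(retrieve_archived("EMEA", "retired"))
```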
