
Senior Big Data Architect / Manager Resume


Plano, TX

SUMMARY

  • 16+ years in the IT industry with a broad, cross-functional skill set and end-to-end execution across the modern technology stack, covering use cases in all major industry domains.
  • Big Data evangelist, outside-the-box thinker, and techno-functional data wrangler/analyst turned Big Data Architect, specializing in the Hadoop ecosystem and NoSQL technology platforms, with 16+ years of cradle-to-grave experience in the IT landscape spanning IT, Retail, Telecom, Healthcare, Insurance, and Logistics, performing predictive data analysis, data science analytics, requirements elicitation, data modeling, and use-case development from inception (POC) to production.
  • Know how to monetize data in client engagements and create win-win outcomes.
  • Big Data platform strategist with a client-centric approach, creating blueprints and roadmaps from the ground up so clients can start their Big Data journey seamlessly.
  • MACHINE LEARNING SPECIALIST: algorithms and models for use cases such as fraud detection, predictive analytics, recommendations, and predicting cyber attacks.
  • Expert in creating predictive models leveraging machine learning algorithms, both supervised and unsupervised: regression analysis, K-Means clustering, decision trees, random forests, and AWS Lambda, applied to recommendation engines, smart cars, customer churn analysis, customer segmentation, fraud detection, sentiment analysis, ad service personalization, data security, financial trading, and detecting anomalies in audit trails and accounting.
  • IoT platform strategist: IoT architecture and planning; M2M platform services (connectivity management, application enablement, device management) offering world-leading security solutions for device protection, encryption, authentication, key management, and code signing. Strong background in sensor and actuator networks (wireless and wired), control systems, embedded systems, and cloud-based platforms for IoT applications.
  • IoT Platforms: ThingWorx, AWS, Predix, Microsoft Azure, Ayla, Evrythng
  • IoT Protocols: MQTT, CoAP, HTTPS, DDS, AMQP, BLE, Wi-Fi, ZigBee mesh networks
  • Big Data practice / cloud architecture / SME for IoT (Internet of Things) strategies.
  • Hadoop security: Kerberos setup and integration with AD/LDAP, data encryption, data masking, tokenization, multi-factor authentication, and IAM security policy configuration.
  • Strong understanding of what's "under the hood" of the Big Data world, offering best practices and recommendations to clients and driving Big Data initiatives from pilot to production. Self-motivated; sell audiences on solutions through technical product presentations and demos. Possess a balance of technical and sales aptitude, including handling customer technical questions and issues, creating product positioning, and translating business needs into new product features and functionality.
  • Help companies monetize their data and create actionable insights leveraging Big Data.
  • Data Integration & Warehousing | Data Science | E-Commerce | Web Analytics
  • Hands-on, demonstrated experience in cloud integration architectures, cloud migration strategy creation, secured application deployments, and API security.
  • Hands-on experience setting up cloud platform components: APIs, CloudFront, CloudWatch, RDS, DynamoDB, Kinesis, Redshift, CloudTrail, AWS IoT, AWS Lambda, AWS Machine Learning, S3, Direct Connect, IAM, SNS, AWS Elastic MapReduce, and cluster setups.
  • MULTI-TENANCY: Responsible for overseeing cloud computing strategy and application migration from legacy applications and mainframes to public or private cloud, including cloud adoption plans, cloud application design, and cloud management and monitoring; oversee application architecture and deployment in cloud environments, including public, private, and hybrid cloud. Advise clients on best practices in multi-tenant environments.
  • Create roadmap to migrate applications from legacy systems to cloud environment seamlessly.
  • Scan applications for potential vulnerabilities and check for code errors before deploying.
  • Provide recommendations, technical direction and leadership in new technologies such as Hadoop and NoSQL technologies as part of the overall architecture.
  • Worked in a cloud security operations center (Cloud SOC) set up to monitor cloud service use within the enterprise (and keep the Shadow IT problem under control); parsed and audited IT infrastructure and application logs via SIEM technologies and machine data platforms (such as Splunk, IBM QRadar, HP ArcSight, and Elasticsearch) to provide alerts and details of suspicious activity. Predictive security leveraging machine learning algorithms and AWS Lambda functions.
  • Extensively use AWS Machine Learning and Lambda functions to build complex machine learning models, fine-tune algorithms, and create models for predictive use cases such as fraud detection, demand forecasting, predictive customer support, and predicting anomalies and vulnerabilities in network security.
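As a concrete illustration of the unsupervised-learning techniques listed above, the sketch below implements K-Means clustering for customer segmentation in plain Python. The customer data and feature names are hypothetical; production work would use scikit-learn or Spark MLlib rather than this toy loop.

```python
import random

def kmeans(points, k, iters=20, seed=42):
    """Minimal K-Means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # start from k distinct data points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Hypothetical customers as (monthly_spend, visits_per_month)
customers = [(20, 2), (22, 3), (25, 2), (200, 18), (210, 20), (190, 17)]
centroids, clusters = kmeans(customers, k=2)
```

With two well-separated segments, the loop converges in a few iterations to a low-spend and a high-spend cluster.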

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

Senior Big Data Architect/ Manager

Responsibilities:

  • End-to-end deployment of Big Data applications to cloud (AWS / Azure / Helion).
  • Security Policy and Code Implementation / IAM / Firewalls/ Encryption / Tokenization
  • Data Masking / Kerberos / Machine Learning Techniques for Predictive Models.
  • Application Security and Database Hardening Techniques, IBM AppScan / SIEM Tools.
  • Designing and deploying Big Data solutions/applications, from coding the solution to designing the network infrastructure and security, tool selection, and creating long-term strategy in cloud environments (AWS, Azure, Helion); cloud migration.
  • Used CA Privileged Access Manager and Vormetric for the PAM environment.
  • HPE security tools (ArcSight & Voltage), Vormetric, Protegrity, IBM Guardium.
  • Setting up Proof of Concepts and product Presentations to the Clients.
  • Deployment of Applications from On-premise into Cloud environment safely and securely.
  • TOOLS/ Frameworks: Spark, Kafka, Cassandra, HBase, Tableau, HDFS, Cloudera, Hortonworks, Voltage, Vormetric, Ranger, Splunk, SIEM tools, Data Encryption, Tokenization.
  • Creating Architectural Blueprints and Documentation for the Potential clients and baseline the Offerings to the HPE platform and Portfolio enhancements.
  • Provide thought leadership to the team and play a crucial role in adding value to the company.
  • Setting up proofs of concept (PoC) of various tools, such as Tableau in an AWS environment; testing and establishing benchmarks, then documenting them on technology platforms.
  • Take ownership of Big Data initiatives and drive projects end to end: provide thought leadership, offer solutions, and recommend best practices with pros and cons after understanding the client's business problem statements; build a business case, showcase ROI, make the business value proposition, and engage the client throughout the life cycle of the initiative.
  • Interact with third-party tech vendors (Hortonworks, Cloudera, MapR) on a day-to-day basis to resolve any technical issues and make sure all are in sync with the workflows.
  • Design and architect an end to end data pipeline including data ingest, data transformation, loading and extraction for various types of data sources and integration with Enterprise environment covering non-functional aspects such as scalability, high availability, security, multi-tenancy, fault tolerance, and elasticity.
  • Work closely with business users, IT team and development team to translate the business requirements into technical requirements and articulate the design sessions.
  • Participate in and set up one-on-one meetings with potential clients and stakeholders, helping take blueprints from conceptualization to production.
  • Train co-workers and empower them to transition to new technology challenges; set up presentations for various audiences and knowledge-sharing activities.
  • Achieve deliverables within budget and timelines and avoid scope creep.
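The tokenization and encryption work above relied on commercial tools (Voltage, Vormetric, Protegrity); the sketch below only illustrates the underlying idea of deterministic tokenization, using an HMAC so that equal inputs map to equal tokens and joins still work downstream. The key and card number here are illustrative, not real.

```python
import hashlib
import hmac

def tokenize(value: str, key: bytes) -> str:
    """Deterministic token: keyed HMAC of the value, truncated.
    The same input always yields the same token, but the raw value
    cannot be recovered without the key."""
    digest = hmac.new(key, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:16]

key = b"demo-key-not-for-production"  # a real token vault would manage this key
t1 = tokenize("4111-1111-1111-1111", key)
t2 = tokenize("4111-1111-1111-1111", key)
```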

Confidential

Security Architect

Responsibilities:

  • Big Data Architect & IoT solutions leveraging the Hadoop stack of technologies and machine learning.
  • Setting up Perimeter Security and Firewall configurations and Penetration testing and PAM.
  • Integration & security practices / predictive analytics using the Splunk platform.
  • Responsible for designing architecture solutions specific to the on-boarding of a set of partner applications/services within the IoT product portfolio in manufacturing floors on Global scale.
  • Participate in Presales activities and internal initiatives as required such as preparing technical collateral for partners’/customer’s proposals and company presentations.
  • Manage partner architecture solutions post-implementation by working with partners and ThingWorx platform teams to align partner feature-capability needs, creating and maintaining partner roadmaps; ensure solutions are scalable and meet overall business requirements.
  • Cloud software architecture, communication protocols, embedded systems, and low power/restricted environment systems, IoT industry bodies Thingworx IoT Products/Services, 3rd Party IoT Services/Applications, Application Development and IoT Devices support.
  • Data ingestion of Sensor data into Cloud database, big data analytics, NoSQL Databases-Cassandra
  • SPARK Streaming, KAFKA and TABLEAU for live Real-time streaming and visualizations and Real-time actions to trigger notifications and system alerts to responsible personnel.
  • Protocol Technologies: HTTP, JMS, AMQP, MQTT, Kafka, etc.
  • Web Service Technologies: SOAP, REST, WSDL, XML, etc.
  • Program/Scripting Languages: JavaScript, Groovy, Python, JSON, etc.
  • M2M Platform Services: Connectivity Management, Application Enablement, Device Management, Gateways - Arduino, Raspberry Pi, Intel Edison, ARM.
  • BIG DATA TOOLS: HDFS, SPARK, SPARK STREAMING, KAFKA, CASSANDRA, TABLEAU
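To make the MQTT/Kafka sensor ingestion above concrete, the sketch below builds the kind of JSON payload a gateway might publish; the topic scheme and field names are assumptions, and a real deployment would hand the bytes to an MQTT client library such as paho-mqtt rather than construct them standalone.

```python
import json
import time

def sensor_payload(device_id, metric, value, ts=None):
    """Build the JSON payload a gateway could publish to an MQTT topic
    such as 'plant/<device_id>/<metric>' (topic scheme is illustrative)."""
    msg = {
        "device": device_id,
        "metric": metric,
        "value": value,
        "ts": ts if ts is not None else time.time(),
    }
    return json.dumps(msg, sort_keys=True).encode()

payload = sensor_payload("press-07", "vibration_mm_s", 4.2, ts=1700000000.0)
```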

Confidential, Charlotte, NC

Big Data Manager

Responsibilities:

  • Create a Strategy based on the Use cases and take ownership of the project from End-to-End.
  • Focused on PCI-DSS and SOX federal audit compliance documentation and mitigation techniques.
  • Create a Data model and design the appropriate Technology Stack of Big data tools and the stress testing and validate the results and establish Benchmarks and showcase the ROI.
  • Used Splunk for real-time monitoring and analysis of machine data from disparate sources, triggering alerts for fraudulent patterns of user activity.
  • Providing thought leadership and helping drive Big Data within the organization.
  • Designs and develops data pipelines built on Apache Spark, and Hadoop that can perform at scale.
  • Collaborates with other Big Data Engineers, Architects, and Data Scientists to achieve goals.
  • Client facing role setting up Architecture Designs scoping and strategies / Showcasing ROI.
  • Team lead for real-time data ingestion using the Big Data stack of technologies (Spark Streaming).
  • Own and establish a reference architecture for the Big Data blueprint and create a roadmap for centralized operations in coordination with all verticals and cross-functional teams.
  • End-to-end ownership of security/firewall layers and Kerberization of all 49 clusters.
  • Prepare documentation and arrange PowerPoint presentations/whiteboarding; train and mentor developers toward solution development and POC execution.
  • Participate in Strategic discussions about Data Integration, Data Ingestion, Data ETL / ELT
  • Design Real-time processing data Pipelines and trigger Proactive System notifications in Fraud detection event processing.
  • Created personalized service offerings through data mining technology using R Algorithms.
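The real-time fraud-detection pipelines above ran on Spark Streaming; at toy scale the same idea can be sketched as a per-card sliding-window rule in plain Python. The window size, threshold, and transaction shape are illustrative assumptions, not the production logic.

```python
from collections import deque

def fraud_alerts(transactions, window=5, threshold=3.0):
    """Yield (card, amount) when an amount exceeds `threshold` times the
    mean of that card's previous `window` amounts."""
    history = {}
    for card, amount in transactions:
        past = history.setdefault(card, deque(maxlen=window))
        if len(past) == window and amount > threshold * (sum(past) / window):
            yield (card, amount)  # in production: push an alert to Splunk
        past.append(amount)

txns = [("c1", 20), ("c1", 25), ("c1", 22), ("c1", 18), ("c1", 21), ("c1", 500)]
alerts = list(fraud_alerts(txns))
```

The $500 transaction is flagged because it far exceeds three times the card's trailing average of about $21.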

Confidential - Santa Clara, CA

Big data Solutions & IoT Architect

Responsibilities:

  • Big Data POC Project Role & Responsibilities. End to End Ownership and accountability.
  • Data migration; took ownership of the project from pilot/solution to production.
  • Big Data POC Development on AWS-Amazon Web services/Cloudera.
  • Data Ingestion / On-Premise Data Integration, Tech Support and Documentation.
  • Created a “Data Lake”: migrated existing data from disparate systems and sources into the Hadoop Data Lake using tools such as Sqoop into HDFS/Hive. Tools: Syncsort.
  • Design and implement solutions to address business problems and showcase the ROI/Business value proposition and consensus with the client requirements.
  • Drive Proof of Concept (POC) and Proof of Technology (POT) evaluation on interoperable technology platforms and seamlessly migrate the Legacy Apps into Big data platform.
  • Train and mentor developers towards solution development and POC/POT execution
  • Communicating AWS concepts, business value, and ROI to C-level management.
  • Tech support and Offshore team interaction and taking ownership of the project and driving project deliverables from end to end.
  • Vendor selection process evaluation and presentations. Hortonworks / Cloudera/ MapR
  • Use cases: Internet of Things (IoT) discovery, regression analysis, and predictive models built on sensor data from production floors, leading to operational efficiency and avoided product recalls. Integrated a gateway for sensor data, ingested it into the Data Lake for real-time analysis, and created actionable insights using Spark Streaming MLlib algorithms.
  • Machine data and unstructured data used to create insights and establish benchmarks for production in real time; laid down a strategy for low-cost data storage.
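A minimal sketch of the regression analysis applied to sensor data above: ordinary least squares for a single feature, in plain Python. The temperature and defect-rate readings are hypothetical; the production models ran on Spark Streaming with MLlib.

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical sensor readings: machine temperature vs. part defect rate
temps = [60, 65, 70, 75, 80]
defects = [1.0, 1.5, 2.0, 2.5, 3.0]
a, b = linear_fit(temps, defects)  # fit defect_rate ~= a * temp + b
```

A fitted slope like this is what lets a predictive-maintenance model flag rising temperatures before defects follow.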

Confidential - Deerfield, IL

Data Integration Architect

Responsibilities:

  • DATA INTEGRATION & DATA MIGRATION Initiatives.
  • Work with key customers to develop specifications for ETL jobs and follow through from proposal to implementation.
  • Develop ETL jobs using the Talend Data Management ETL toolset.
  • Assist with definition of best practices and procedures for our ETL environment.
  • Provide technical support to new customers who want to integrate their systems using our APIs. Learn and communicate our device-management API set and related security and management components.
  • Define and document API service-orchestration scenarios for common B2B integration types.
  • Document data flows and integration models.
  • Participate in best practices discussions and serve as a contributor on designing future BI product architecture.
  • Help with operational and administrative support for developed solutions. Monitor processes to ensure job completion.
  • Design and build ETL for the Walgreens Data Warehouse as time permits.
  • Gathered requirements, built logical models, provided documentation, benchmarked systems, and analyzed system bottlenecks.
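The ETL jobs above were built with Talend; the toy pipeline below sketches the same extract-transform-load pattern at language level. The CSV schema and cleansing rules are hypothetical.

```python
import csv
import io

def run_etl(raw_csv):
    """Toy ETL: extract rows from CSV, normalize fields, drop bad rows."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    loaded = []
    for row in reader:
        try:
            loaded.append({
                "sku": row["sku"].strip().upper(),  # transform: normalize key
                "qty": int(row["qty"]),             # transform: enforce type
            })
        except (KeyError, ValueError):
            continue  # a real job would route rejects to an error table
    return loaded

raw = "sku,qty\nab-1, 3\nbad,notanum\ncd-2,5\n"
records = run_etl(raw)
```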

Confidential

Supply Chain Solutions

Responsibilities:

  • Efficient truck routing, on-time deliveries of shipments, and creating models based on sensor data from trucks and equipment: regression analysis (IoT), remote asset management.
  • Data Analysis & Optimization / Architecture /Machine Learning Algorithms.
  • (POC /ORION Data Project Implementation)
  • Project name: ORION (On-Road Integrated Optimization and Navigation). Created actionable insights using unstructured streaming data from logistics telematics, crunching big-data package information and user preferences to create efficient routing for drivers, leading to savings of roughly $50 million a year at one mile saved per day for every Confidential driver.
  • Work with Data Architects to determine file/database structure design and definition.
  • Work with Application Architects to assure design and conformance to best practices.
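ORION's optimizer is far more sophisticated than anything that fits here, but the shape of the truck-routing problem above can be sketched with a greedy nearest-neighbor heuristic; the depot and stop coordinates are hypothetical.

```python
import math

def nearest_neighbor_route(depot, stops):
    """Order stops greedily: from the current position, always drive to
    the nearest unvisited stop. A toy heuristic, not ORION's algorithm."""
    route, remaining, current = [depot], list(stops), depot
    while remaining:
        nxt = min(remaining, key=lambda s: math.dist(current, s))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

depot = (0.0, 0.0)
stops = [(5.0, 5.0), (1.0, 0.0), (2.0, 1.0)]
route = nearest_neighbor_route(depot, stops)
```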

Environment: Java, .NET, Agile, MS Office, Cloud Computing.
