Quick Analytical Resume
New York
SUMMARY:
- Expert Hadoop consultant; implemented Hadoop-based data warehouses for multiple customers
- Currently leading solution development for an analytics project on AWS at Confidential Bank, New York
- Collaborate with business, support, and local country teams to deliver a multiyear Hadoop transformation project
- Completed an end-to-end Hadoop/HIVE solution for an IP address tracker covering 100+ million customers at Reliance Telecom
- 16 years of experience in BI/DWH with customers such as GE, Eli Lilly, Equifax, Bell, Confidential, SAS, and Barclays
- Implemented 6+ full life cycle BI/Analytics software development projects covering requirements analysis, technical architecture design, development, testing, and deployment
- Implemented Hadoop projects for Aries, Disney, and Mobistar
- Hands-on with cloud-based solution deployment using Amazon EC2
- Expert-level data analysis skills in languages such as Scala, SAS, and Python, with experience in data modeling; ranked in the top 15% in a Kaggle competition on coupon purchase prediction
- Ability to work in an agile environment as a good team player
TECHNICAL SKILLS:
Big Data Platform: Hortonworks, Cloudera, Spark, Amazon Redshift
Hadoop Components: HIVE, Pig, HDFS, Sqoop, HBase, PySpark
BI Tools/Suites: SAS, Pentaho, QlikView, Platfora
Languages: Python, Scala, Java, Base SAS and Macro, PL/SQL, Java Script
Database: MySQL, Oracle, MS SQL Server, MS Access, Netezza, Redshift
SAS: SAS/Base (8.2, 9.1, 9.2), SAS/Graph, SAS/ACCESS, SAS DI Studio 4.2, SAS/OLAP, SAS Web Report Studio, SAS Information Maps, SAS SMC
Data Modeling: Erwin, Dimensional Modeling, Star Schema design, HIVE De-normalization
PROFESSIONAL EXPERIENCE:
Confidential, New York
Responsibilities:
- Design and implement business specific solutions in Hadoop Data Lake.
- Convert traditional data warehouse applications to HIVE/Redshift data store.
- Consolidation and de-normalization of financial data model subject areas for business users’ consumption.
- End-to-end data flow design in HIVE/SAS Grid/Python.
- Implement data flows in HIVE/Pentaho ETL/Amazon Redshift.
- Analyzing data sources and troubleshooting data lineage/quality and data consolidation issues across platforms.
- Enabling datasets on Hadoop/SAS Grid platforms through SAS Visual Analytics.
- Transferring Hadoop data to AWS platform for tactical discovery and analytics.
- Data masking/unmasking to protect PII information.
- Redshift integration with QlikView and Python.
- Pentaho ETL performance tuning for S3 reads/writes.
- Operationalization of solution and handover to support team.
- Migration of Teradata warehouse to Hadoop.
- Project execution in agile mode and collaboration with business and support teams across the world.
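The PII masking work listed above can be sketched in Python. This is a hypothetical, minimal example assuming a deterministic salted-hash approach; the salt, field names, and truncation length are illustrative and not taken from the actual project:

```python
import hashlib

# Hypothetical sketch of deterministic PII masking (the salt and
# field names are illustrative, not from the actual project).
def mask_pii(value: str, salt: str = "demo-salt") -> str:
    """Mask a PII value with salted SHA-256, truncated to 16 hex chars.

    Deterministic hashing hides the raw value while keeping joins
    across tables possible (the same input always masks the same way).
    """
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]

record = {"customer_id": "C1001", "ssn": "123-45-6789", "balance": 2500}
# Mask only the sensitive column; leave the rest untouched.
masked = {k: (mask_pii(v) if k == "ssn" else v) for k, v in record.items()}
```

A deterministic mask like this preserves referential integrity across tables, whereas reversible ("unmasking") schemes would instead use keyed encryption or a secure lookup vault.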
Confidential
Responsibilities:
- Technical specification documentation, design and deployment of IPDR application using Hadoop platform and Phoenix (SQL) tools.
- End-to-end ownership of Hadoop solution development and deployment to production.
- Design and implement high-performance data ingestion processes.
- Implement IPv4 NATing logic in HIVE.
- Development of HIVE/Python code that focuses on data ingestion as well as data analysis.
- Create and maintain data flow diagrams, solution architecture diagrams.
- Discuss with the enterprise architecture team to ensure that the project is implemented in line with the organization’s overall Big Data vision.
- Implementing data auditing mechanisms in HIVE/HBase for reporting purposes.
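The IPv4 NATing logic above is typically a time-windowed join between IPDR records and NAT translation logs. A simplified Python sketch of that correlation follows; the schema and field names are hypothetical, and the production version would be a HIVE join rather than a Python loop:

```python
from dataclasses import dataclass

# Hypothetical NAT-log schema; real IPDR field names will differ.
@dataclass
class NatEntry:
    private_ip: str
    public_ip: str
    public_port: int
    start_ts: int   # epoch seconds when the mapping began
    end_ts: int     # epoch seconds when the mapping ended

def find_subscriber(nat_log, public_ip, public_port, ts):
    """Return the private IP that held a public IP:port at time ts."""
    for e in nat_log:
        if (e.public_ip == public_ip and e.public_port == public_port
                and e.start_ts <= ts <= e.end_ts):
            return e.private_ip
    return None

# Two subscribers reused the same public IP:port in different windows.
log = [
    NatEntry("10.0.0.5", "203.0.113.7", 40001, 100, 200),
    NatEntry("10.0.0.9", "203.0.113.7", 40001, 201, 300),
]
```

The timestamp-range condition is what makes the lookup unambiguous when carrier-grade NAT reuses the same public IP and port for different subscribers over time.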
Confidential
Responsibilities:
- Analyze the existing application and separate data transformation logic to migrate it to Hadoop application.
- Create business and technical use cases for porting SAS to new analytical layer on Hadoop.
- Create tables in HIVE and implement initial data processing logic.
- Technical solution evaluations and developing recommendations for migrating churn models from SAS to Python or Hadoop.
- Design, develop and implement supporting operations like scheduling, log tracking.
- Assist in overall technical architecture for Mobistar digital transformation program for big data.
- Guide the team in data analysis of day-to-day issues reported by business users. Review and suggest changes to other developers’ programs and test cases.
- Guide team for Hadoop platform migration.
Confidential, UK
Responsibilities:
- Formalize requirements and expectations from global consumer healthcare BI users.
- Write and review functional specifications for SAS (ETL, Reports and Dashboards).
- Define business oriented use-cases and User Acceptance Tests (Business Acceptance Test).
Tech Mahindra Big Data
Confidential
Responsibilities:
- Hadoop competency development through PoCs, demos, training, and RFI/RFP responses related to big data projects.
- Setting up Hadoop clusters for internal projects/demos.
- CDR Processing for Aries: Installed and benchmarked a Hadoop cluster on EC2. Designed and developed a MapReduce/HIVE-based solution on EC2. Ingested device CDR records using Flume. Aggregated and loaded data to Hadoop using MapReduce/HIVE SQL. Managed the EC2 environment.
- Sentiment Analysis for Disney: Designed and built a data platform using Big Data technologies for Disney’s sentiment analytics project. Analyzed data collected from Facebook, Twitter, and YouTube for new movie releases. Processed and analyzed data using MapReduce/Pig/Hive. Implemented sentiment analysis on comment data.
- SAS Platform Support for Bell Canada: Worked as transition manager for transferring Bell Canada’s 900+ user SAS 9.2 platform from HP to Tech Mahindra for support and maintenance. Owned end-to-end execution, from RFP response and customer presentation through transition to the delivery team.
- Created a new service offering around “Big Data Analytics” and contributed to developing the TechM TAP platform, a Hortonworks-certified solution from Tech Mahindra.
- SAS Performance Tuning for Ministry of Education, Singapore: Performance tuning of SAS ETL jobs and dashboard queries. Implemented a security framework in SAS OLAP cubes for around 1,000 users at the Ministry of Education, Singapore.
- Giving functional and technical presentations, demonstrations, and training for competency development.
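The comment-level sentiment analysis described in the Disney project can be illustrated with a minimal lexicon-based scorer. This is a toy sketch, not the project's actual method; the word lists are invented for illustration, and a production pipeline would run this kind of scoring inside MapReduce/Pig/Hive over the collected social-media data:

```python
# Toy sentiment lexicons; a real project would use a curated lexicon
# or a trained model, not these invented word lists.
POSITIVE = {"great", "love", "amazing", "fun"}
NEGATIVE = {"boring", "bad", "terrible", "slow"}

def sentiment(comment: str) -> str:
    """Classify a comment by counting positive vs negative words."""
    words = comment.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

In a Hadoop setting, the same per-comment function maps naturally onto a mapper, with reducers aggregating sentiment counts per movie release.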