
Hadoop and SAS Consultant Resume


IL

SUMMARY

  • Over twelve years of experience in Big Data, Business Intelligence & Analytics projects across various technologies, specifically Hadoop clusters (Hortonworks distribution), the Cognos BI Suite, Cognos Framework Manager, Base SAS, SAS DI, SAS VA, QlikView, Teradata, DB2, and Java.
  • Experience includes development, implementation, and production support for projects in domains such as Capital Markets, Investment & Retail Banking, Insurance, and Healthcare.
  • Strong understanding of translating clients' Big Data business requirements into Hadoop-centric solutions.
  • Hands-on experience installing, configuring, and maintaining Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, Spark, Kafka, ZooKeeper, Hue, and Sqoop on Hortonworks.
  • Hands-on experience developing and deploying enterprise applications using major components of the Hadoop ecosystem: Hadoop 2.x, YARN, Hive, Pig, MapReduce, Spark, Kafka, Storm, Oozie, HBase, Flume, Sqoop, and ZooKeeper.
  • Experience in installation, configuration, management, and deployment of Big Data solutions and the underlying infrastructure of a Hadoop cluster.
  • Experience installing and configuring Hive, its services, and the Metastore.
  • Exposure to the Hive Query Language (HiveQL) and table operations such as importing data and altering and dropping tables.
  • Experience installing and running Pig, its execution types, Grunt, and Pig Latin editors. Good knowledge of loading, storing, and filtering data, as well as combining and splitting data.
  • Experience tuning and debugging running Spark applications.
  • Experience integrating Kafka with Spark for real-time data processing. Good experience working with the Hortonworks distribution.
  • Certified SAS professional with profound knowledge of SAS/BASE, SAS Macros, SAS/Access, SAS/SQL, SAS/ETL, SAS EG, and BI tools.
  • Certified Cognos professional with profound knowledge of the Cognos BI Suite, Cognos Report Studio, Framework Manager, Cognos Administration, and various Cognos and BI tools.
  • Involved in validating aggregate tables against the rollup process documented in the data mapping. Developed HiveQL and SQL, and automated the flow using shell scripting (a minimal sketch follows this list).
  • Expert in data visualization using the Cognos BI Suite, SAS VA, QlikView, and Tableau for dashboards and ad-hoc reports.
  • Expert in designing and developing dashboards as well as standalone and self-service reports.
  • Expert in Business Intelligence and Analytics, using ETL, reporting, and analytics tools to deliver BI solutions.
  • Expertise in data architecture and data modeling. Worked on multiple end-to-end DWH projects.
  • Strong experience designing and delivering solutions for ETL, reporting, analytics, and data visualization.
  • Responsible for defining complete data warehouse architectures (ETL design, reports, performance management, data quality, and data marts).
  • Extensive experience in customer interaction, architecture, and high- and low-level design.
  • Expert in ETL design, development, and implementation, as well as standard, summary, and ad-hoc reports.
  • Performed query optimization and performance tuning of complex SQL queries and stored procedures. Involved in normalization and de-normalization of existing tables for faster query retrieval.
  • Expert in creating indexes, views, complex triggers, effective functions, and appropriate stored procedures to facilitate efficient data manipulation and data consistency.
  • Performance tuning: hands-on experience analyzing query plans for stored-procedure optimization. Highly skilled and experienced in the Agile development process for diverse requirements.
  • Participated in daily scrum meetings, sprint planning, sprint reviews, and sprint retrospectives.
  • Flexible in adapting to new environments, with sound knowledge of analyzing functional issues.
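
As an illustration of the rollup-validation-and-shell-automation pattern mentioned above, the following is a minimal sketch; the table names (txn_detail, txn_rollup) and columns are hypothetical stand-ins, not drawn from any actual engagement:

    #!/bin/sh
    # Hypothetical rollup check: re-aggregate the detail table and count
    # rows that disagree with the pre-built aggregate for one load date.
    LOAD_DATE="$1"
    MISMATCHES=$(hive -S -e "
      SELECT COUNT(*)
      FROM (SELECT acct, SUM(amt) AS s
            FROM txn_detail
            WHERE load_date = '${LOAD_DATE}'
            GROUP BY acct) d
      JOIN txn_rollup r
        ON d.acct = r.acct AND r.load_date = '${LOAD_DATE}'
      WHERE d.s <> r.total_amt")
    if [ "$MISMATCHES" -eq 0 ]; then
      echo "Rollup validated for ${LOAD_DATE}"
    else
      echo "Rollup mismatch on ${MISMATCHES} accounts" >&2
      exit 1
    fi

A scheduler such as Autosys (listed under Schedulers below) could invoke a script like this after each load.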

TECHNICAL SKILLS

Big Data: Hadoop Ecosystem, Hortonworks, HDFS, Spark, Scala, Hive, HBase, Pig, Sqoop, Flume, Java, Kafka, Gobblin.

Databases & Storage: DB2, Teradata, SQL, PL/SQL, Microsoft SQL Server 7.0, Amazon S3 (AWS).

ETL Tools: SAS/BASE, SAS Macros, SAS/Access, SAS/SQL, SAS/ETL, SAS EG, SAS DI, Informatica 8.6.

Reporting Tools: Cognos 8.3, 10.1 and 10.2; Cognos TM1; SAS VA 7.2; Cognos PowerPlay Transformer & Impromptu 6.1, 6.3 & 7.3.

Analytic Tools: Tableau 8.0/8.5, SAS VA 7.2, and QlikView Desktop.

Schedulers: Redwood Cronacle, Autosys.

Languages: Java, Shell Script, JavaScript.

Tools / IDE: MS Visual Studio 2008/2012, SnagIt, JIRA, Motio CI.

Project Mgt. Tools: SharePoint 2010/2013, Microsoft Project.

Software/Utilities: ODBC, FTP, MS Office 07/10/13.

Operating Systems: UNIX, Windows 10/8/7.

SDLC: Six Sigma; extensively followed the Agile process and played the Agile Scrum Master role.

Domain: Banking and Finance: Capital Markets & Investment Banking (trading systems), Retail Banking & Wealth Management (cards, loans, mortgage); Insurance: general, re-insurance, and group insurance; Healthcare.

PROFESSIONAL EXPERIENCE

Confidential, IL

Hadoop and SAS Consultant

Responsibilities:

  • Coordinated with business customers to gather business requirements, interacted with technical peers to derive technical requirements, and delivered the BRD and TDD documents.
  • Extensively involved in the design phase and delivered design documents.
  • Worked on analyzing the Hadoop cluster and different Big Data analytic tools, including Pig, Hive, the HBase database, and Sqoop.
  • Installed Hadoop, MapReduce, and HDFS, and developed multiple MapReduce jobs in Pig and Hive for data cleaning and pre-processing.
  • Imported and exported data between HDFS/Hive and relational databases using Sqoop (see the Sqoop sketch after this list).
  • Migrated large volumes of data from different databases (e.g., Oracle, SQL Server) to Hadoop.
  • Wrote Hive jobs to parse logs and structure them in tabular format to facilitate effective querying of the log data (see the Hive sketch after this list).
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs. Experienced in defining job flows.
  • Used Hive to analyze partitioned and bucketed data and compute various metrics for reporting. Experienced in managing and reviewing Hadoop log files.
  • Used Pig as an ETL tool for transformations, including joins and some pre-aggregations, before storing the data in HDFS. Loaded and transformed large sets of structured and semi-structured data (see the Pig sketch after this list).
  • Responsible for managing data coming from different sources.
  • Utilized the Apache Hadoop environment provided by Hortonworks.
  • Created the data model for Hive tables.
  • Exported data from HDFS into an RDBMS using Sqoop for report generation and visualization; worked with the Oozie workflow engine for job scheduling.
  • Involved in unit testing and delivered unit test plans and results documents.
  • Designed new SAS programs by analyzing requirements, constructing workflow charts and diagrams, studying system capabilities, and writing specifications.
  • Transferred SAS files to the SAS LASR server to generate graphs in SAS VA. Created SAS DI jobs to load the data into LASR in-memory.
  • Profound knowledge of SAS/BASE, SAS Macros, SAS/SQL, SAS/ETL, SAS EG, and BI tools.
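
To make the Sqoop bullets concrete, here is a minimal sketch of the import and export paths; the JDBC URL, credentials, and table names are hypothetical placeholders:

    #!/bin/sh
    # Hypothetical ingest: pull an Oracle table into HDFS and register
    # it as a Hive table in one step.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table TRADES \
      --target-dir /data/raw/trades \
      --hive-import --hive-table staging.trades \
      --num-mappers 4

    # Reverse path: push curated results back to the RDBMS so the
    # reporting and visualization layers can read them.
    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table TRADES_DAILY \
      --export-dir /data/curated/trades_daily \
      --input-fields-terminated-by ','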
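
The log-parsing work described above typically maps raw files onto a partitioned external Hive table; a minimal sketch, assuming tab-delimited logs and hypothetical names and paths:

    #!/bin/sh
    # Hypothetical layout: expose raw tab-delimited logs as a partitioned
    # external table, register one day's partition, then aggregate by level.
    LOAD_DATE="$1"
    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS app_logs (
        ts STRING, level STRING, msg STRING)
      PARTITIONED BY (load_date STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      LOCATION '/data/raw/logs';

      ALTER TABLE app_logs ADD IF NOT EXISTS
        PARTITION (load_date = '${LOAD_DATE}');

      SELECT level, COUNT(*) AS events
      FROM app_logs
      WHERE load_date = '${LOAD_DATE}'
      GROUP BY level;
    "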
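
Likewise, the Pig-based pre-aggregation could look like this sketch (field names and paths are again hypothetical):

    #!/bin/sh
    # Hypothetical Pig ETL: drop bad records, pre-aggregate by day,
    # and store the result back to HDFS for downstream consumers.
    pig -x mapreduce -e "
    raw   = LOAD '/data/raw/trades' USING PigStorage(',')
            AS (id:chararray, amt:double, dt:chararray);
    clean = FILTER raw BY amt IS NOT NULL;
    byday = GROUP clean BY dt;
    agg   = FOREACH byday GENERATE group AS dt, SUM(clean.amt) AS total;
    STORE agg INTO '/data/curated/trades_daily' USING PigStorage(',');
    "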

Environment: Hadoop 2.x (Hortonworks HDP), HDFS, MapReduce, Pig 0.12.1, Hive 0.13.1, Sqoop 1.4.4, Flume 1.6.0, UNIX, SAS/Base, SAS/Macros, SAS/EG, SAS/DI, SAS/VA (Visual Analytics), UNIX shell scripting.

Confidential, NY

Cognos Solution Architect & Manager

Responsibilities:

  • Participated in business functional requirement meetings with clients and end users. Created Framework Manager packages and a star-schema model.
  • Created functional and system requirement design specifications.
  • Created Framework Manager packages and reports using Query Studio and Report Studio, developing reports based on user requirements.
  • Good experience developing new QlikView user interfaces using charts, graphs, tables, tabs, and layouts; good expertise in creating QVDs from different types of data sources.
  • Provided lead support and maintenance/stability for the Cognos and BOXI platforms.
  • Participated in Daily scrum meetings, sprint planning, sprint review, and sprint retrospective.

Environment: Cognos 10.2 Suite, Qlikview Desktop, Cognos Framework Manager Modeller, Event Studio, DB2, Cognos TM1. New Initiatives: Dynamic Cube POC, DQM Migration, RAVE Engine - Data Visualization, Active Reports.

Confidential

Cognos Lead

Responsibilities:

  • Understood client requirements by studying the functional document. Created functional and system requirement design specifications.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multidimensional reports using Transformer and PowerPlay.
  • Created Framework Manager packages and reports using Query Studio and Report Studio; attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos 10, Cognos Transformer, Cognos Impromptu, Cognos Powerplay, Cognos Migration tools, SQL database.

Confidential - WI

BI Module Lead

Responsibilities:

  • Understood client requirements by studying the functional document. Created functional and system requirement design specifications.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multidimensional reports using Transformer and PowerPlay.
  • Developed Redwood Cronacle scripts and job chains to automate data loading. Created Framework Manager packages and reports using Query Studio and Report Studio.
  • Attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos ReportNet 8.3, Cognos Transformer, Informatica, Cronacle and Teradata DB.

Confidential

BI Cognos Lead

Responsibilities:

  • Understood client requirements by studying the functional document. Played the role of report developer.
  • Monitored HR daily and weekly data loads.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multidimensional reports using Transformer and PowerPlay. Transformed and validated the data in the Teradata database (EDW).
  • Interacted with the onsite team and resolved offshore issues.
  • Attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos ReportNet 8.3, Cognos Transformer, Informatica, Cronacle and Teradata DB.

Confidential

BI Cognos Developer

Responsibilities:

  • Understood client requirements by studying the functional document. Played the role of report developer.
  • Involved in database design and created stored procedures.
  • Implemented and maintained the application with enhancements. Interacted with the onsite team and resolved offshore issues.
  • Attended weekly status meetings and provided detailed status reports to the client.
  • Created list reports, cross-tab reports, sub-reports, and template reports with drill-up and drill-down functionality.

Environment: Cognos Impromptu 6 & 7.3, Cognos PowerPlay Transformer, Oracle 9i DB.

Confidential

BI Cognos Developer

Responsibilities:

  • Developed stand-alone reports, drill-through reports, different types of prompts on a prompt page, and complex reports with various conditional formatting for users.
  • Created multidimensional reports from PowerCubes for analysis purposes, using drill-down and drill-through functionality.
  • Created dimension maps and multidimensional cubes using Impromptu query definitions in PowerPlay Transformer.
  • Created users, user classes/groups, and access privileges.
  • Created standard filters, calculations, prompts, and conditions; performed unit testing and validation of reports and cubes.
  • Understood client requirements by studying the functional document. Played the role of report developer.

Environment: Cognos Impromptu 6 & 7.3, Cognos PowerPlay Transformer, Oracle 9i DB.
