
Hadoop and SAS Consultant Resume


IL

SUMMARY

  • Over 12 years of experience in Big Data, Business Intelligence & Analytics projects across various technologies, specifically the Hadoop ecosystem, Cognos BI Suite, Cognos Framework Manager, Base SAS, SAS DI, SAS VA, Qlikview, Tableau, Teradata, DB2 and Java.
  • Experience includes development, implementation and production support for projects in domains such as capital markets, investment and retail banking, insurance and healthcare.
  • Strong understanding of Hadoop, its ecosystem and architecture, and associated sub-projects that sit on top of Hadoop such as Hive, HBase and Sqoop.
  • Experience in analyzing Hadoop clusters and working with Big Data analytic tools including Hive, HBase and Sqoop.
  • Certified SAS professional with profound knowledge of SAS/BASE, SAS Macros, SAS/Access, SAS/SQL, SAS/ETL, SAS EG and BI tools.
  • Certified Cognos professional with profound knowledge of the Cognos BI Suite, Cognos Report Studio, Framework Manager, Cognos Administration and related BI tools.
  • Experience in developing dashboard reports through SAS Visual Analytics.
  • Validated aggregate tables against the rollup process documented in the data mapping.
  • Developed HiveQL and SQL queries and automated the flow using shell scripting (see the sketch following this summary).
  • Experience in exporting and importing data between relational databases and Hadoop using Sqoop.
  • Good experience working with the Hortonworks distribution.
  • Expert in data visualization using Cognos BI Suite, SAS VA, Qlikview and Tableau for dashboards and ad-hoc reports.
  • Expert in designing and developing dashboards, standalone reports and self-service reports.
  • Expert in delivering Business Intelligence and Analytics solutions using ETL, reporting and analytics tools.
  • Expertise in data architecture and data modeling; worked on multiple end-to-end DWH projects.
  • Strong experience designing and delivering solutions for ETL, reporting, analytics and data visualization.
  • Responsible for defining the complete data warehouse architecture (ETL Design, Reports, Performance Management, Data Quality and Data Marts).
  • Extensive experience in customer interaction, architecture, and high- and low-level design.
  • Expert in ETL design, development and implementation, as well as standard, summary and ad-hoc reports.
  • Performed query optimization and performance tuning on complex SQL queries and stored procedures; normalized and de-normalized existing tables for faster query retrieval.
  • Expert in creating indexes, views, complex triggers, functions and stored procedures to support efficient data manipulation and data consistency.
  • Performance tuning: hands-on experience analyzing query plans to optimize stored procedures.
  • Highly skilled and experienced in Agile Development process for diverse requirements.
  • Participated in Daily scrum meetings, sprint planning, sprint review, and sprint retrospective.
  • Flexible in adapting to new environments, with sound knowledge of analyzing functional issues.
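
As an illustration of the Sqoop import and HiveQL/shell automation noted in this summary, a minimal sketch follows. It is a hedged example only: the DB2 connection string, the TRADES source table, the HDFS paths and the column layout are hypothetical placeholders, not details of any actual engagement.

    #!/bin/bash
    # Sketch: land a relational table in HDFS with Sqoop, then expose it to Hive.
    # All names (host, database, table, paths, columns) are hypothetical.
    set -euo pipefail

    TARGET_DIR=/data/raw/trades    # hypothetical HDFS landing directory

    # Import the source table from DB2 into HDFS as comma-delimited text files
    sqoop import \
      --connect jdbc:db2://dbhost:50000/TRADEDB \
      --username etl_user \
      --password-file /user/etl_user/.db2.password \
      --table TRADES \
      --target-dir "$TARGET_DIR" \
      --fields-terminated-by ',' \
      --num-mappers 4

    # Point an external Hive table at the imported files
    hive -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS raw_trades (
      trade_id   BIGINT,
      trade_date STRING,
      symbol     STRING,
      quantity   INT,
      price      DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '$TARGET_DIR';
    "

A script along these lines would typically be scheduled (for example through Autosys, listed under Schedulers below) to automate the end-to-end flow.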

TECHNICAL SKILLS

  • Technology: below is a list of the key hardware, software products, tools and methods I have worked with.
  • Big Data: Hadoop Ecosystem, Hortonworks, HDFS, Hive
  • Relational Databases: DB2, Teradata, SQL, PL/SQL, Microsoft SQL Server 7.0, S3 (AWS)
  • ETL Tools: SAS/BASE, SAS Macros, SAS/Access, SAS/SQL, SAS/ETL, SAS EG, SAS DI, Informatica 8.6
  • Reporting Tools: Cognos 8.3, 10.1 and 10.2, Cognos TM1, SAS VA 7.2, Cognos PowerPlay Transformer & Impromptu 6.1, 6.3 & 7.3
  • Analytic Tools: Tableau 8.0/8.5, SAS VA 7.2, and Qlikview Desktop
  • Schedulers: Redwood Cronacle, Autosys
  • Languages: Java, Shell Script, Java script
  • Tools / IDE: MS Visual Studio 2008/2012, SnagIt, JIRA, Motio CI

PROFESSIONAL EXPERIENCE

Confidential - IL

Hadoop and SAS Consultant

Responsibilities:

  • Participated in business and functional requirement meetings with clients and end users.
  • Created Hive tables to load HDFS data into Hive (see the sketch after this list).
  • Created partitioned and bucketed tables in Hive.
  • Created complex SAS queries according to the business rules and loaded the data into summary tables.
  • Designed new SAS programs by analyzing requirements, constructing workflow charts and diagrams, studying system capabilities and writing specifications.
  • Analyzed and transformed data with SAS EG.
  • Transferred SAS files to the SAS LASR server to generate graphs in SAS VA.
  • Created CPORT files and SAS datasets for data manipulation.
  • Developed dashboard reports through SAS Visual Analytics.
  • Created DI jobs to transform and load data into the LASR in-memory server.
  • Profound knowledge of SAS/BASE, SAS Macros, SAS/SQL, SAS/ETL, SAS EG and BI tools.
  • Coordinated with the offshore team.
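
As an illustration of the partitioned and bucketed Hive tables and HDFS-to-Hive loads described above, a minimal sketch follows. The table name, columns, partition key and bucket count are hypothetical placeholders rather than the actual project schema.

    # Sketch: create a partitioned, bucketed Hive table and load it from a raw table.
    # Table and column names are hypothetical; raw_trades stands in for the source table.
    hive -e "
    CREATE TABLE IF NOT EXISTS trades_summary (
      trade_id BIGINT,
      symbol   STRING,
      quantity INT,
      price    DOUBLE
    )
    PARTITIONED BY (trade_date STRING)
    CLUSTERED BY (symbol) INTO 16 BUCKETS
    STORED AS ORC;

    -- Allow dynamic partitioning so each trade_date lands in its own partition
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;

    INSERT OVERWRITE TABLE trades_summary PARTITION (trade_date)
    SELECT trade_id, symbol, quantity, price, trade_date
    FROM raw_trades;
    "

Summary tables built this way could then feed downstream SAS queries or SAS VA reports such as those mentioned above.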

Environment: SAS/Base, SAS/Macros, SAS/EG, SAS/DI, SAS/VA (Visual Analytics), Hortonworks (HDP), Hadoop Eco System, HDFS, HIVE, Unix Shell Script.

Confidential, NY

Cognos Solution Architect & Manager

Responsibilities:

  • Trade orders, executions and allocation of stock trading, covering trade volume and CnC (cancel and correction) for 17 asset classes (equity, derivatives, commodities, rates, PB, loans, secured lending).
  • Settlements, confirmations, fails, rejections and collateral metrics.
  • Market access rules and trend analysis of user trading.
  • Participated in business and functional requirement meetings with clients and end users.
  • Created Framework Manager packages and a star-schema model.
  • Created functional and system requirement design specifications.
  • Developed Framework Manager packages and reports using Query Studio and Report Studio.
  • Developed reports based on user requirements.
  • Developed new Qlikview user interfaces using charts, graphs, tables, tabs and layouts.
  • Created QVDs from different types of data sources.
  • Led support, maintenance and stability for the Cognos and BOXI platforms.
  • Participated in Daily scrum meetings, sprint planning, sprint review, and sprint retrospective.

Environment: Cognos 10.2 Suite, Qlikview Desktop, Cognos Framework Manager Modeller, Event Studio, DB2, Cognos TM1.

Confidential

Cognos Lead

Responsibilities:

  • Understood client requirements by studying the functional documents.
  • Created functional and system requirement design specifications.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multi-dimensional reports using Transformer and PowerPlay.
  • Created Framework Manager packages and reports using Query Studio and Report Studio.
  • Attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos 10, Cognos Transformer, Cognos Impromptu, Cognos PowerPlay, Cognos migration tools, SQL database.

Confidential - WI

BI Module Lead

Responsibilities:

  • Understood client requirements by studying the functional documents.
  • Created functional and system requirement design specifications.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multi-dimensional reports using Transformer and PowerPlay.
  • Developed Redwood Cronacle scripts and job chains to automate data loading.
  • Created Framework Manager packages and reports using Query Studio and Report Studio.
  • Attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos Reportnet 8.3, Cognos Transformer, Informatica, Cronacle and Teradata DB

Confidential - WI

BI Cognos Lead

Responsibilities:

  • Understood client requirements by studying the functional documents.
  • Served as a report developer.
  • Monitored HR daily and weekly data loads.
  • Coordinated with the ETL (Informatica) team to identify issues.
  • Created cubes and multi-dimensional reports using Transformer and PowerPlay.
  • Transformed and validated data in the Teradata database (EDW).
  • Interacted with the onsite team and resolved offshore issues.
  • Attended weekly status meetings and provided detailed status reports to the client.

Environment: Cognos Reportnet 8.3, Cognos Transformer, Informatica, Cronacle and Teradata DB

Confidential

BI Cognos Developer

Responsibilities:

  • Understood client requirements by studying the functional documents.
  • Served as a report developer.
  • Involved in database design and created stored procedures.
  • Implemented and maintained the application, including enhancements.
  • Interacted with the onsite team and resolved offshore issues.
  • Attended weekly status meetings and provided detailed status reports to the client.
  • Created list reports, cross-tab reports, sub-reports and template reports with drill-up and drill-down functionality.

Environment: Cognos Impromptu 6 & 7.3; Cognos PowerPlay Transformer, Oracle 9i DB
