
Stock Analyst Resume


OBJECTIVE:

  • Seeking a challenging position with a strong emphasis on the latest technologies, where I can apply my current skill set, keep acquiring new skills, and create the best solutions possible.

PROFESSIONAL SUMMARY:

  • Successfully delivered multiple initiatives (implementation and development) for large-scale data processing using the Hadoop ecosystem (MapReduce MR1, YARN, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Oozie, Flume)
  • Experience in processing data using MapReduce, Hive/Pig scripts, and HDFS, including writing custom UDFs based on requirements (a representative UDF sketch follows this summary)
  • Very good knowledge of Hive partitioning and bucketing concepts
  • Sound knowledge of HDFS architecture and the Hadoop MapReduce frameworks (MR1 and MR2/YARN)
  • Experience building Java applications using Core Java, with strong working knowledge of SQL concepts
  • Expert knowledge of OOP, multithreading, synchronization, serialization, and garbage collection
  • Edureka Certified Hadoop Developer
  • Ability to excel and keep learning in fast-paced development environments using the latest frameworks, tools, and methodologies such as Agile
  • Knowledge of data warehousing concepts and tools; involved in ETL processes
  • Knowledge of all phases of the Software Development Life Cycle (SDLC), its methodologies, and processes
  • Experience in automation testing using Selenium WebDriver/RC/IDE/Grid, JUnit, TestNG, Jenkins, Cucumber and Maven
  • Knowledge of HTML and CSS
  • Experience in multiplatform Unix and Linux environments, with hands-on expertise in networking, scripting, and systems integration
  • Capable of processing large sets of structured, semi-structured and unstructured data
  • Managed and reviewed Hadoop log files
  • Ability to dive deep into difficult technical problems and come up with solutions
  • Innovative and enthusiastic
  • Worked with end users to formulate and document business requirements
  • Strong problem solving & technical skills coupled with clear decision making
  • Received appreciation and recognition from all previous employers
  • Proven ability to handle complex issues, with experience working closely with customers
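A minimal sketch of the kind of Hive UDF referenced above, assuming a hypothetical ticker-normalization use case; the class name and logic are illustrative only, not taken from a specific project:

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical Hive UDF: trims and upper-cases a ticker/symbol column.
public class NormalizeTickerUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}

Once packaged into a JAR, a UDF like this would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and then called like any built-in function.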

TECHNICAL SKILLS:

Testing Tools: Selenium RC, WebDriver, Cucumber

Programming Languages: Java, JavaScript, HTML, CSS, Python, R

Big Data Ecosystems: Spark, Hadoop, MapReduce, YARN, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Oozie, Flume

Processes: Agile-Scrum, Waterfall

Operating System: Windows, Linux and Unix

Browsers: Mozilla Firefox, Internet Explorer, Safari, Google Chrome, Opera

Databases: SQL Server, MS Access, Oracle, MySQL, NoSQL

PROFESSIONAL EXPERIENCE:

Confidential

Stock Analyst

Responsibilities:

  • Worked with data scientists and the wider team to collect and analyze datasets from various investors such as mutual fund companies and banks
  • Cleaned and analyzed unstructured investor investment data using the Hadoop framework and stored it in HDFS
  • Performed technical analysis in R and Python to predict stock movements
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables (a representative mapper sketch follows this list)
  • Collaborated with stakeholders and performed source-to-target data mapping, design, and review
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data
  • Loaded unstructured and structured data into HDFS using Flume and Sqoop, and exported refined data to the RDBMS with Sqoop
  • Skilled in object-oriented architectures and design patterns, systems analysis, software design, effective coding practices, and databases
  • Involved in writing MapReduce jobs and staging their input in HDFS for further processing
  • Developed Hive and Pig scripts (compiled down to MapReduce) to validate and cleanse the data in HDFS, obtained from heterogeneous data sources, making it suitable for analysis
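A minimal sketch of the kind of parsing mapper mentioned above; the CSV layout, field positions, and class name are assumptions for illustration, not the production code:

import java.io.IOException;

import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: parses raw CSV trade records (symbol,date,price,...)
// and emits (symbol, price) pairs for downstream aggregation into staging tables.
public class TradeParseMapper
        extends Mapper<LongWritable, Text, Text, DoubleWritable> {

    private final Text symbol = new Text();
    private final DoubleWritable price = new DoubleWritable();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split(",");
        if (fields.length < 3) {
            return; // skip malformed records
        }
        try {
            symbol.set(fields[0].trim());
            price.set(Double.parseDouble(fields[2].trim()));
            context.write(symbol, price);
        } catch (NumberFormatException e) {
            // skip records with a non-numeric price field
        }
    }
}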

Environment: Java, Python, R, HDFS, MR1, Pig, Hive, Sqoop, Flume, HBase, ZooKeeper, Oozie, Eclipse, SQL and Linux

Confidential

Hadoop Engineer

Responsibilities:

  • Used various Big Data techniques for attribute extraction, categorization, and presenting data optimally to users
  • Dealt with huge data sets (billions of products and terabytes of data)
  • Worked with Java, distributed systems, and Big Data technologies to design and develop high-performance, scalable systems and data pipelines
  • Helped the team leverage and contribute to open-source software whenever possible
  • Loaded unstructured and structured data into HDFS using Flume and Sqoop, and exported refined data to the RDBMS with Sqoop
  • Skilled in object-oriented architectures and design patterns, systems analysis, software design, effective coding practices, and databases
  • Involved in writing MapReduce jobs and staging their input in HDFS for further processing (sketched after this list)
  • Developed Hive and Pig scripts (compiled down to MapReduce) to validate and cleanse the data in HDFS
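A minimal sketch of staging records into HDFS through the Hadoop FileSystem API, as referenced above; the helper class, path handling, and record format are assumptions for illustration:

import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper: writes cleansed records to an HDFS staging path
// so downstream Hive/Pig jobs can pick them up for further processing.
public class HdfsStagingWriter {

    public static void writeRecords(List<String> records, String hdfsPath) throws Exception {
        Configuration conf = new Configuration(); // picks up core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);
        try (BufferedWriter out = new BufferedWriter(
                new OutputStreamWriter(fs.create(new Path(hdfsPath), true),
                                       StandardCharsets.UTF_8))) {
            for (String record : records) {
                out.write(record);
                out.newLine();
            }
        }
    }
}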

Environment: Java, Python, R, HDFS, MR1, Pig, Hive, Sqoop, Flume, HBase, ZooKeeper, Oozie, Eclipse, SQL and Linux

Confidential

Core Java Developer

Responsibilities:

  • Analyzed, debugged, recorded, and executed programs.
  • Created and modified block diagrams and logic flowcharts.
  • Prepared required program-level and user-level documentation.
  • Participated in software and architectural development activities.
  • Utilized knowledge of scripting languages to develop project-specific programs.
  • Coordinated with cross-functional teams to evaluate project requirements.
  • Worked with development team and IT staff to complete scheduled Java development tasks according to project timeline.
  • Prepared specific Java programs based on client requirements and desired applications.
  • Added new programs to existing client websites to improve functionality of the sites.
  • Developed security components to deter hacking and enhance virus protection.
  • Debugged existing Java programs and applications
  • Extensively involved in identifying test cases and then automating them using Selenium WebDriver, TestNG, and Java (Eclipse IDE)
  • Maintained and executed Maven build files
  • Integrated automation scripts (Selenium WebDriver API) with the Jenkins continuous integration server for nightly batch runs (a representative test sketch follows this list)
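A minimal sketch of the kind of Selenium WebDriver test automated here, assuming a hypothetical login page; the URL and locators are placeholders, not the actual application under test:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

// Hypothetical TestNG + Selenium WebDriver test class.
public class LoginPageTest {

    private WebDriver driver;

    @BeforeClass
    public void setUp() {
        driver = new FirefoxDriver();
        driver.get("https://example.com/login"); // placeholder URL
    }

    @Test
    public void pageTitleShouldMentionLogin() {
        Assert.assertTrue(driver.getTitle().toLowerCase().contains("login"));
    }

    @Test
    public void invalidCredentialsShowError() {
        driver.findElement(By.id("username")).sendKeys("baduser");
        driver.findElement(By.id("password")).sendKeys("badpass");
        driver.findElement(By.id("loginButton")).click();
        Assert.assertTrue(driver.findElement(By.cssSelector(".error")).isDisplayed());
    }

    @AfterClass
    public void tearDown() {
        driver.quit();
    }
}

Classes like this can be picked up by the Maven test run and triggered nightly from Jenkins.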

Environment: Java, Eclipse, Selenium WebDriver, Jenkins, Cucumber, TestNG, Maven, Agile, SQL and Windows

Confidential

QA Automation Tester

Responsibilities:

  • Responsible for creating the functional testing framework using TestNG and writing automated Selenium WebDriver scripts in Java
  • Built several Java components for Flash-based games on the Android platform
  • Added tasks for an Income Tax Payment website using Core Java
  • Maintained and executed Maven build files for running automated tests
  • Integrated automation scripts (Selenium WebDriver API) with the Jenkins continuous integration server for nightly batch runs
  • Used Firebug and XPath to build element locators for Selenium commands in web application testing
  • Analyzed product use cases, requirements, technical design, and implementation artifacts to create test cases that exercise specific product functions
  • Created the entire test framework using Selenium WebDriver for further test creation and execution
  • Performed performance and load testing using Selenium WebDriver
  • Used SQL queries to retrieve data from tables and perform back-end testing
  • Participated in the Agile software testing process, with regular QA meetings to discuss major application features and the test cases to write and execute
  • Prepared reports by collecting and summarizing information to provide accurate data for project reporting
  • Worked on Cucumber test applications, prepared weekly status reports, and validated back-end data integrity using SQL queries (a step-definition sketch follows this list)
  • Worked closely with Agile Scrum teams from several domains to ensure product testability while running regression tests in Selenium WebDriver using TestNG, and maintained test suites
  • Wrote and executed automated test cases using Selenium WebDriver to replace manual testing effort
  • Developed and executed test plans and test cases from requirements and specification documents
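A minimal sketch of Cucumber step definitions backing a feature scenario, assuming a hypothetical search feature; the URL, locators, and step wording are placeholders for illustration only:

import io.cucumber.java.After;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.Assert;

// Hypothetical step definitions for a "search" feature file.
public class SearchSteps {

    private final WebDriver driver = new FirefoxDriver();

    @Given("the user is on the search page")
    public void userIsOnSearchPage() {
        driver.get("https://example.com/search"); // placeholder URL
    }

    @When("the user searches for {string}")
    public void userSearchesFor(String term) {
        driver.findElement(By.name("q")).sendKeys(term);
        driver.findElement(By.id("searchButton")).click();
    }

    @Then("results are displayed")
    public void resultsAreDisplayed() {
        Assert.assertFalse(driver.findElements(By.cssSelector(".result")).isEmpty());
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}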

Environment: Java, Selenium WebDriver, Jenkins, Cucumber, TestNG, Maven, Eclipse, Python, XPath, Firebug, Agile, SQL and Windows
