Cassandra Developer Resume
Bethesda, MD
SUMMARY
- 8 years of experience as a software developer in designing, developing, deploying and supporting large-scale distributed systems.
- 3+ years of experience as a Cassandra Developer.
- DataStax Certified Apache Cassandra Developer.
- Experience with installing, configuring and monitoring Apache Cassandra clusters.
- Excellent understanding of Cassandra read and write paths, query execution and internal architecture.
- Knowledge of implementing multi-datacenter and multi-rack Cassandra clusters.
- Excellent knowledge of Cassandra architecture, Cassandra data modelling and monitoring Cassandra using OpsCenter.
- Can handle commissioning and decommissioning nodes along with monitoring of the Cassandra cluster.
- Experience in efficiently managing backups and restoring data in a live Cassandra cluster.
- Experience in working with Hadoop distributions like MapR, Cloudera and Hortonworks.
- Excellent understanding of NoSQL data modelling techniques.
- Experience in performance tuning and maintenance of Cassandra databases.
- Strong knowledge of backup and recovery of Cassandra systems.
- Good knowledge of Spark SQL, Spark Streaming and Scala.
- Developed Spark scripts using the Scala IDE as per business requirements (see the sketch after this list).
- Experience in migrating data using Sqoop from RDBMS into a Cassandra cluster.
- Experience integrating Kafka with Storm and Spark for real-time data processing.
- Experience in using NiFi and StreamSets to automate data movement between different stages.
- Strong experience in Core Java, J2EE, SQL and RESTful web services.
- Experienced with JIRA for bug and issue tracking.
- Extensive experience in developing applications using Core Java and multi-threading.
- Excellent team player with very good analytical, interpersonal and communication skills.
- Experience in Agile methodologies (Scrum) across different project life cycles.
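A minimal sketch of the kind of Spark-with-Scala script referenced above; the HDFS paths, record layout and filter rule are hypothetical placeholders, not details from any actual engagement.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative Spark job: read raw records, drop malformed rows, aggregate counts per key.
// All paths and field positions are hypothetical.
object SampleSparkScript {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sample-spark-script").getOrCreate()

    val raw = spark.sparkContext.textFile("hdfs:///data/input/records.txt")

    val counts = raw
      .map(_.split(","))
      .filter(fields => fields.length == 3 && fields(0).nonEmpty) // discard partial records
      .map(fields => (fields(0), 1))
      .reduceByKey(_ + _)

    counts.saveAsTextFile("hdfs:///data/output/record_counts")
    spark.stop()
  }
}
```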
TECHNICAL SKILLS
Operating systems: MAC OS X, Linux, Windows, Ubuntu
Languages: C, C++, Java, JavaScript, Scala and Python
Relational Databases: Oracle 10g/11g, MySQL, SQL server
NoSQL Databases: HBase, Cassandra, MongoDB
Hadoop Ecosystem: HDFS, MapReduce, Spark, Hive, Pig
IDE tools: Eclipse, IntelliJ, NetBeans, STS, Visual Studio, Scala IDE
Big Data: Hadoop, Spark
PROFESSIONAL EXPERIENCE
Confidential, Bethesda, MD
Cassandra Developer
Responsibilities:
- Implemented and maintained a multi-datacenter Cassandra cluster.
- Involved in the process of Cassandra data modeling for efficient querying.
- Involved in requirements gathering and capacity planning for a multi-datacenter Apache Cassandra cluster.
- Installed and Configured DataStax OpsCenter for Cassandra Cluster maintenance.
- Created keyspaces and column families from the data model using DataStax DevCenter.
- Determined and set up the required replication factors for keyspaces in production, development and other environments in consultation with application teams (see the keyspace sketch after this list).
- Imported data from various resources to the Cassandra cluster using Java API.
- Worked on tuning Bloom filters and configured compaction strategy based on the use case.
- Worked with CQL to retrieve data from the Cassandra cluster.
- Tested the application and the cluster with different consistency levels to compare read and write performance at each consistency level.
- Implemented and maintained database security (created and maintained users and roles, and assigned privileges).
- Experience in working with CCM (Cassandra Cluster Manager) and Cassandra nodetool.
- Administered and maintained the cluster using OpsCenter, DevCenter, Linux, nodetool, CCM, etc.
- Committed code into Git.
- Interacted with Scrum team developers, testers, product owners and stakeholders to deliver the right project value at the end of each sprint.
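A brief, hypothetical sketch of the keyspace and consistency-level work described above, written in Scala against the DataStax Java driver; the contact point, keyspace, table, datacenter names and replication factors are illustrative assumptions only.

```scala
import com.datastax.driver.core.{Cluster, ConsistencyLevel, SimpleStatement}

object KeyspaceSetup {
  def main(args: Array[String]): Unit = {
    // Hypothetical contact point; a real cluster would use its production seed nodes
    val cluster = Cluster.builder().addContactPoint("127.0.0.1").build()
    val session = cluster.connect()

    // NetworkTopologyStrategy with per-datacenter replication factors (illustrative values)
    session.execute(
      """CREATE KEYSPACE IF NOT EXISTS demo_ks
        |WITH replication = {'class': 'NetworkTopologyStrategy', 'DC1': 3, 'DC2': 3}""".stripMargin)
    session.execute(
      "CREATE TABLE IF NOT EXISTS demo_ks.users (user_id text PRIMARY KEY, name text)")

    // Read at LOCAL_QUORUM to balance latency against consistency within the local datacenter
    val stmt = new SimpleStatement("SELECT * FROM demo_ks.users LIMIT 10")
      .setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM)
    session.execute(stmt)

    cluster.close()
  }
}
```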
Environment: Cassandra, OpsCenter, DevCenter, CQL, nodetool, UNIX, Shell Scripting, Spark, Git, JIRA
Confidential, Springfield, MO
Hadoop/ Cassandra Developer
Responsibilities:
- Involved in the process of Cassandra data modelling and building data structures for efficient querying.
- Created keyspaces and column families from the data model using DataStax DevCenter.
- Performed bulk data loading into Cassandra using sstableloader.
- Managed Cassandra clusters using DataStax OpsCenter.
- Installed and Configured DataStax OpsCenter for Cassandra Cluster maintenance and alerts.
- Involved in bootstrapping, removing, replacing, decommissioning the nodes.
- Troubleshot read/write latency and timeout issues in Cassandra.
- Involved in load testing and analyzing bottlenecks using the cassandra-stress tool.
- Developed Spark Streaming applications for real-time processing.
- Scheduled repair and cleanup processes in the production environment.
- Extracted data from RDBMS into the Cassandra cluster using Sqoop.
- Worked closely with the business and analytics teams in gathering system requirements.
- Configured compaction strategy based on the use case.
- Used the DataStax Spark-Cassandra connector to load data into Cassandra and used CQL to analyze data from Cassandra tables for quick searching, sorting and grouping (see the connector sketch after this list).
- Performed stress and performance testing to benchmark the cluster.
- Performed backup and recovery using snapshot backups.
- Used Oozie to manage Pig and Hive Jobs.
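A minimal sketch of how the Spark-Cassandra connector loading mentioned above might look in Scala; the keyspace, table and column names, connection host and predicate are hypothetical placeholders.

```scala
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

object CassandraConnectorLoad {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("cassandra-connector-load")
      .set("spark.cassandra.connection.host", "127.0.0.1") // hypothetical host
    val sc = new SparkContext(conf)

    // Read rows from a source table; assumes event_type is a clustering or indexed
    // column so the CQL predicate can be pushed down to Cassandra
    val clicks = sc.cassandraTable("demo_ks", "events")
      .where("event_type = ?", "click")

    // Aggregate per user and write the results back to a summary table
    clicks.map(row => (row.getString("user_id"), 1))
      .reduceByKey(_ + _)
      .saveToCassandra("demo_ks", "click_counts", SomeColumns("user_id", "clicks"))

    sc.stop()
  }
}
```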
Environment/Technologies: Cassandra, DataStax 4.0, HDFS, MapReduce, Hive, Pig, Oozie, Sqoop, Git, Maven
Confidential, San Antonio, TX
Spark/Hadoop Developer
Responsibilities:
- Configured Spark Streaming to consume ongoing data from Kafka and store the streamed data in HDFS (see the streaming sketch after this list).
- Used various Spark transformations and actions for cleansing the input data.
- Implemented Spark using Scala and Spark SQL for faster testing and processing of data.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs.
- Managed and scheduled jobs on a Hadoop cluster using Oozie.
- Loaded and transformed large sets of structured and semi-structured data.
- Implemented Flume, Spark and Spark Streaming frameworks for real-time data processing.
- Used the Spark-Cassandra connector to load data to and from Cassandra.
- Responsible for implementing advanced procedures like text analytics and processing using the in-memory computing capabilities of Spark.
- Experience in extracting appropriate features from datasets in order to handle bad, null and partial records using Spark SQL.
- Optimized existing Hadoop algorithms using SparkContext, Spark SQL, DataFrames and pair RDDs.
- Experienced in handling large datasets using partitions, Spark in-memory capabilities, broadcast variables, and effective and efficient joins and transformations during the ingestion process itself.
- Experience in building real-time data pipelines with Kafka Connect and Spark Streaming.
- Loaded large sets of structured, semi-structured and unstructured data with Sqoop and Flume.
- Developed Spark programs in Scala to perform data transformation, data streaming and analysis.
- Experienced in working with the Spark ecosystem, using Spark SQL and Scala queries on different formats like text and CSV files.
- Created and shared weekly/daily status reports with the team.
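A minimal sketch, in Scala, of the Kafka-to-HDFS Spark Streaming flow described above; the broker address, topic name, consumer group, batch interval and output path are all illustrative assumptions.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaToHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-to-hdfs")
    val ssc = new StreamingContext(conf, Seconds(30)) // 30-second micro-batches (illustrative)

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker:9092",            // hypothetical broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "hdfs-sink",
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("events"), kafkaParams))

    // Persist each micro-batch of message values as text files under an HDFS prefix
    stream.map(_.value).saveAsTextFiles("hdfs:///data/events/batch")

    ssc.start()
    ssc.awaitTermination()
  }
}
```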
Environment/Technologies: Spark, Spark Streaming, Spark SQL, Kafka, Oozie, Sqoop, Flume, Scala, MapReduce, Hive, Pig
Confidential, San Francisco, CA
Java/Scala Developer
Responsibilities:
- Responsible for building, developing and testing shared components that will be used across modules.
- Created user stories and resolved development issues.
- Developed the web tier using the Spring MVC framework.
- Implemented design patterns in Scala for the application.
- Unit testing, integration testing and bug fixing.
- Used Akka as a framework to create reactive, distributed, parallel and resilient concurrent applications in Scala.
- Developed Spark scripts using Scala shell commands as per the requirements.
- Implemented advanced procedures like text analytics and processing using the in-memory computing capabilities of Apache Spark, written in Scala.
- Migrated data from Spark RDDs into HDFS and NoSQL stores like Cassandra/HBase.
- Serialized and deserialized objects using the Play JSON library (see the sketch after this list).
- Wrote complex SQL queries.
- Used Apache Sqoop to dump user data into HDFS on a weekly basis.
- Actively involved in code review and bug fixing for improving the performance.
- Created production jobs using Oozie workflows that integrated different actions like MapReduce, Sqoop and Hive.
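A short sketch of the Play JSON serialization/deserialization referenced above; the User case class and its fields are hypothetical, not taken from the actual application.

```scala
import play.api.libs.json._

// Hypothetical domain object used only to illustrate the Play JSON round trip
case class User(id: Long, name: String, email: String)

object UserJson {
  // Macro-generated format provides both Reads and Writes for User
  implicit val userFormat: OFormat[User] = Json.format[User]

  def main(args: Array[String]): Unit = {
    val user = User(1L, "Jane", "jane@example.com")

    val js: JsValue = Json.toJson(user)   // serialize to JSON
    val back: User = js.as[User]          // deserialize back to a case class

    println(Json.prettyPrint(js))
    println(back)
  }
}
```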
Environment/Technologies: Spark, Java, Scala, Hadoop, Cassandra, HBase, Hive, Pig, Sqoop, MySQL, GitHub
Confidential
Java developer
Responsibilities:
- Designed and created the domain model and schema using object-oriented design/UML diagrams; built SOAP web services.
- Designed and developed configuration output in XML and PDF formats.
- Responsible for the design and development of a web application using the Struts MVC framework.
- Developed the XML schema and web services for data maintenance and structures.
- Performed data mapping from one XML schema to another for web services.
- Used SOAP web services for communication between different internal applications.
- Experience in developing and deploying web services using SOAP, RESTful web services and shell scripting.
- Implemented CVS as the version control tool.
- Used Firebug for application webpage troubleshooting and Eclipse debugger for the bug fixes.
- Created various unit test cases and integration tests for the DAO, Service, and Controller components using JUnit, DbUnit and Spring Test support.
- Experience in developing JDBC code to interact with the database.
- Implemented logging using Log4j. Responsible for coordination with team members and attending meetings.
- Used web servers like Apache Tomcat.
- Involved in the analysis, design, development and testing phases of the software development life cycle.
- Created user interface screens using HTML, DHTML, AJAX, CSS and JavaScript.
Environment/Technologies: UNIX, Java, JSP, Servlets, Oracle, Apache Tomcat, Maven, XML, Spring MVC
Confidential
Java developer
Responsibilities:
- Developed various Java business classes for handling different functions.
- Used the SVN version control system to manage system development and design.
- Used DB2 as the database and wrote SQL and PL/SQL.
- Developed schemas for XML.
- Implemented J2EE standards, MVC architecture using Spring Framework.
- Designed the front end with the JavaScript frameworks AngularJS and jQuery.
- Developed UI using HTML, JavaScript, and JSP, and developed Business Logic and Interfacing components using Business Objects, XML, and JDBC.
- Designed cascading style sheets, XSLT and XML for the Order Entry and Product Search modules, and did client-side validations with JavaScript.
- Managed connectivity using JDBC for querying/inserting and data management, including triggers and stored procedures.
- Bug fixes for the issues observed in the stage environment.
- Wrote test cases to test the entire application manually and automated the testing using Java programs.
- Use case design using UML modeling, including development of class diagrams, sequence diagrams and use case/transaction diagrams.
Environment/Technologies: Struts, Spring, HTML, CSS, Java, J2EE, JSP, XML, Eclipse, WebLogic, JavaScript, MySQL