
Hadoop/Security Systems Developer Resume

Bellevue, WA

SUMMARY

  • Master's graduate with 7+ years of hands-on experience in software development and deployment. Well-versed in all phases of the SDLC, with strong analytical and problem-solving skills and good working knowledge of algorithms and data structures.
  • Strong implementation skills using object-oriented concepts and design patterns; substantial experience delivering critical solutions using Hadoop, Java/J2EE, Android, and AWS.
  • Experience deploying scalable, highly available, fault-tolerant Storm and Spark application topologies both on-premises and in the Amazon cloud.
  • Over 3 years of experience working with Hadoop ecosystem components such as HDFS, MapReduce, HBase, Spark, YARN, Kafka, ZooKeeper, Pig, Hive, Sqoop, Storm, and Oozie.
  • Good experience creating data ingestion pipelines, data transformations, data management and governance, and real-time streaming engines at an enterprise level.
  • Expertise in Java application development and client/server applications using core Java, J2EE technologies, web services, REST services, Oracle, SQL Server, and other relational databases.
  • Very good experience in real-time data streaming solutions using Apache Storm, Spark (Spark SQL, Spark Streaming), and Kafka.
  • Very good knowledge of big data ingestion techniques using Sqoop, Kafka, the native HDFS Java API, and REST APIs.
  • Experience working with Hadoop distributions such as Cloudera, Hortonworks, and MapR.
  • Good knowledge of implementing end-to-end data security and governance within the Hadoop platform using Apache Knox, Apache Sentry, Kerberos, etc.
  • Experience with NoSQL databases such as HBase, Cassandra, and MongoDB.
  • Diverse experience working with a variety of databases, including Teradata, Oracle, MySQL, and MS SQL Server.
  • Worked with file formats such as Avro, ORC, and Parquet while moving data in and out of HDFS.
  • Experience with Apache Phoenix for accessing data stored in HBase.
  • Experience with data mining and business intelligence tools such as Tableau, QlikView, and MicroStrategy.
  • Experience automating tasks using Control-M and shell scripting.
  • Experience with XML, JSON, CSV, ASN.1, ORC, and Parquet file formats.
  • Good experience with and understanding of enterprise data warehouse (EDW) architecture, with end-to-end knowledge of EDW operations.
  • Hands-on experience with AWS cloud services (VPC, EC2, S3, RDS, Redshift, EMR, DynamoDB, Kinesis, SNS, SQS).
  • Used the Oozie and Control-M workflow engines for managing and scheduling Hadoop jobs.
  • Experienced in using IDEs and tools such as Eclipse, NetBeans, GitHub, Jenkins, Maven, and IntelliJ.
  • Experience building and deploying web applications on multiple application servers and middleware platforms, including WebLogic, WebSphere, Apache Tomcat, and JBoss.
  • Experience with software component design techniques such as UML, use case, and component diagrams.
  • Strong knowledge of system testing, user acceptance testing, and software quality assurance best practices and methodologies.
  • Experienced in Agile methodologies, including Extreme Programming, Scrum, and Test-Driven Development (TDD).
  • Excellent communication and interpersonal skills, with technical competency and the ability to quickly learn new technologies as required.
  • Ability to blend technical expertise with strong conceptual, business, and analytical skills to provide quality solutions, result-oriented problem solving, and leadership.

TECHNICAL SKILLS

Hadoop/Big Data: MapReduce, HDFS, Hive, Pig, HBase, Sqoop, Spark, Spark SQL, Spark Streaming, Kafka, Flume, Storm, ZooKeeper, Phoenix, Oozie, Impala, Hue, Cloudera Manager, Ambari

Distributed platforms: Cloudera, Hortonworks, MapR

Programming Languages: C++, Java, Scala, Python, R, Swift

Java/J2EE Technologies: Servlets, Spring, JSP, JSF, JDBC, Java Beans, RMI & Web services (SOAP, RESTful)

Development Tools: Eclipse, NetBeans, SBT, Ant, Maven, Jenkins, Selenium WebDriver, Jira, Bugzilla, SQL Developer, Talend, Informatica, Teradata Studio Express

Methodologies: Agile/Scrum, UML, and Waterfall

NoSQL Technologies: Cassandra, MongoDB, HBase, DynamoDB

Databases: Oracle 12c, MySQL, MS-SQL Server, PostgreSQL, Teradata, SQLite

Web/ Application Servers: Apache Tomcat, WebLogic, WebSphere

Version Control: Git, SVN

Visualization: Tableau, MicroStrategy, and QlikView

Web Technologies: HTML, CSS, XML, JavaScript, jQuery, AngularJS, Node.js, AJAX, SOAP, and REST

Scripting Languages: Unix Shell Scripting

Operating Systems: Windows XP/Vista/7/8/10, UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential, Bellevue WA

Hadoop/Security Systems Developer

Responsibilities:

  • Developed a guaranteed-delivery, fault-tolerant, highly available, secured streaming distribution solution, so that end users need not worry about data loss or spend time re-replicating data when loss occurs.
  • Built a lightweight filewatcher service to produce data into Kafka topics.
  • Ingested data from Kafka topics.
  • Developed distributed Storm jobs for data processing and transformation.
  • Developed custom Storm/Kafka streaming applications for ingesting data from Kafka into HBase and Hive tables.
  • Created Hive tables on HBase and exposed CDR data to end users.
  • Debugged and maintained Storm/Java code whenever issues arose.
  • Extracted data from various sources, such as APIs, using Java programs and shell scripting, and ingested the data into Hive.
  • Generated data analysis reports and handled ad hoc data requests from the business using Hive and Phoenix.
  • Implemented security services using Kerberos for Kafka connectivity for data in flight, and implemented Voltage encryption services for data at rest.
  • Deployed Control-M jobs to schedule jobs periodically and ABC jobs to compare data at source and destination.
  • Implemented Grafana views to monitor real-time data flow into Kafka topics.
  • Followed the Scrum agile process and implemented user stories to track project progress.
  • Implemented JUnit for unit testing and performed performance testing to improve project quality.
  • Used Bitbucket to check in and check out project source code, and automated project deployment with Jenkins CI/CD.
  • Used Gradle to build project deployment artifacts.

Environment: Apache Storm, Apache Hadoop, HDFS, Hive, Phoenix, Kafka, Oozie, Gradle, Eclipse, Linux, HDP 2.4, Bitbucket, Grafana
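The filewatcher idea above can be sketched in plain Java. This is a minimal polling-based illustration, not the production service: the `scan` method here only detects newly arrived files, where the real service would hand each new path to a Kafka producer.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.HashSet;
import java.util.Set;
import java.util.TreeSet;

// Minimal filewatcher sketch: report files that appeared in a directory since
// the previous scan. In the production service each fresh path would be sent
// to a Kafka topic; here detection is kept standalone and dependency-free.
public class FileWatcherSketch {
    private final Set<String> seen = new HashSet<>();

    // Returns the names of files not observed on any previous scan
    // (a TreeSet keeps the result sorted and deterministic).
    public Set<String> scan(Path dir) throws IOException {
        Set<String> fresh = new TreeSet<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path p : stream) {
                String name = p.getFileName().toString();
                if (seen.add(name)) {   // add() returns false if already known
                    fresh.add(name);
                }
            }
        }
        return fresh;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("watch");
        FileWatcherSketch watcher = new FileWatcherSketch();
        Files.createFile(dir.resolve("cdr-001.avro")); // hypothetical CDR file
        System.out.println(watcher.scan(dir)); // first scan reports the new file
        System.out.println(watcher.scan(dir)); // second scan reports nothing new
    }
}
```

A real deployment would typically run `scan` on a schedule (or use `java.nio.file.WatchService`) and publish each fresh path as a Kafka record.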

Confidential

Senior Software Engineer

Responsibilities:

  • Implemented an innovative security solution that reduced the client budget by 8% and avoided yearly maintenance costs for security certificates.
  • Developed an Android application that notifies users of priority payments and lets them pay on the go, reducing business wait times.
  • Developed RESTful web services using the JAX-RS APIs through Spring, following pair programming and TDD approaches.
  • Integrated the business layer with the DAO layer using the ORM tool Hibernate against the relational database MySQL, and implemented the logging mechanism using the Log4j API.
  • Developed optimized code in core Java, with attention to space and time complexity.
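The RESTful-service work above can be illustrated without Spring. The sketch below uses only the JDK's built-in `HttpServer`; the `/payments` path and JSON body are hypothetical, and the original project exposed its endpoints via JAX-RS through Spring rather than this class.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Dependency-free sketch of a tiny REST endpoint using the JDK's HttpServer.
// The /payments resource and its JSON payload are illustrative only.
public class RestSketch {
    public static HttpServer start() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0); // port 0 = any free port
        server.createContext("/payments", exchange -> {
            byte[] body = "{\"status\":\"due\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start();
        int port = server.getAddress().getPort();
        // Exercise the endpoint once with the JDK HTTP client.
        HttpResponse<String> resp = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("http://localhost:" + port + "/payments")).build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.statusCode() + " " + resp.body()); // 200 {"status":"due"}
        server.stop(0);
    }
}
```

With JAX-RS the same resource would instead be an annotated class (`@Path("/payments")`, `@GET`, `@Produces`), with the framework handling routing and serialization.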

Software Engineer

Confidential

Responsibilities:

  • Developed highly scalable and fault-tolerant cricket betting game APIs using Java/J2EE.
  • Interacted with business stakeholders for requirements gathering and automated project builds using Jenkins.
  • Used Maven to build the application and an SVN repository for source control.
  • Used JUnit frameworks for unit and integration testing, wrote unit test cases, and used JAXB for marshalling and unmarshalling XML.
  • Wrote efficient SQL queries, procedures, and triggers to fetch and update records.
  • Achieved concurrency using Java executor services and multithreading.
  • Worked in the Agile Scrum methodology and provided the best solutions to user stories.
  • Implemented Spring Security using OAuth 2.0 and JWT tokens.
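The executor-service concurrency mentioned above typically follows a fan-out/fan-in shape: submit independent tasks to a thread pool, then collect the results through futures. A minimal sketch, with squaring numbers standing in for the real per-record work:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Fan-out/fan-in with an ExecutorService: each input becomes one task,
// and the results are summed as the futures complete.
public class ParallelSum {
    public static long parallelSquareSum(List<Integer> inputs, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Long>> futures = new ArrayList<>();
            for (int n : inputs) {
                futures.add(pool.submit(() -> (long) n * n)); // runs on a pool thread
            }
            long total = 0;
            for (Future<Long> f : futures) {
                total += f.get(); // blocks until that task finishes
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSquareSum(List.of(1, 2, 3, 4), 4)); // prints 30
    }
}
```

The fixed pool bounds the number of concurrent threads regardless of input size, which is the usual reason to prefer an `ExecutorService` over spawning raw threads.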

Confidential

Software Engineer

Responsibilities:

  • As a developer, developed and enhanced applications using Java technologies and the Spring framework, and provided a flexible UI using Node.js.
  • Reduced latency delays in live games and provided live updates effectively.
  • Involved in designing and developing the application using JSP, HTML, CSS, and JavaScript.
  • Implemented publisher/subscriber messaging systems using the JMS/MOM APIs.
  • Integrated the Eclipse project with Sonar for code quality and efficiency.
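The publisher/subscriber pattern mentioned above can be sketched in memory, without a JMS broker. The `Broker` class below is hypothetical and only illustrates the fan-out semantics of a topic: every subscriber receives every published message.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// In-memory stand-in for a JMS topic: publish fans a message out to all
// current subscribers. A real MOM broker adds persistence, acknowledgements,
// and delivery across processes, none of which is modeled here.
public class Broker {
    private final Map<String, List<Consumer<String>>> topics = new ConcurrentHashMap<>();

    public void subscribe(String topic, Consumer<String> listener) {
        topics.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>()).add(listener);
    }

    public void publish(String topic, String message) {
        // Deliver to every subscriber of the topic (no-op if nobody listens).
        topics.getOrDefault(topic, List.of()).forEach(l -> l.accept(message));
    }

    public static void main(String[] args) {
        Broker broker = new Broker();
        broker.subscribe("scores", m -> System.out.println("ui: " + m));
        broker.subscribe("scores", m -> System.out.println("log: " + m));
        broker.publish("scores", "4 runs"); // both subscribers receive it
    }
}
```

In JMS terms, `subscribe` corresponds to creating a `MessageConsumer` on a `Topic` and `publish` to a `MessageProducer.send`; the decoupling of producers from consumers is the same.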
