Sr. Technical Lead Resume

Chicago, IL

SUMMARY

  • 15 years of extensive experience executing and leading the full project lifecycle using Agile methodology for software products and services, across multiple industries such as BFS, insurance, and manufacturing, with a blended mix of technologies.
  • Able to play multiple roles, such as Full Stack Developer/Architect, Big Data Architect, and Data Scientist, covering architectural concerns across the enterprise, data & cloud space.
  • Extensive experience in enterprise application and microservices development using the J2EE, Groovy & Grails, and Scala/Akka/Play technology stacks, together with several JS frameworks.
  • Extensive experience with big data technologies such as Hadoop 2, Spark, Kafka, Presto, HDFS, HBase, Hive, Sqoop, NiFi, and Flume on on-premise distributions (Cloudera, Hortonworks) and cloud data platforms (AWS, Azure, and Google), as well as with cloud technologies such as Amazon Web Services (EC2, EMR, S3, SQS, SNS, Lambda & others) and Azure, including containerization using Docker and Kubernetes.
  • Extensive experience building data pipelines using custom tools, AWS EMR, Azure Data Factory, and Google Cloud, as well as in data science, data analytics, engineering solutions & programming.
  • Worked with several reporting packages and open-source frameworks/technologies/products.
  • Used the Angular, ReactJS, and Sencha JS frameworks, along with Bootstrap, for front-end development in the most recent projects.

TECHNICAL SKILLS

Languages: C, C++, Java 8, JavaScript (ES6+), PHP, Groovy

Middle Tier Technologies: Java, J2EE (JSP, Struts 2, JSF, EJB 3.0, ZK, ICEfaces, Seam, JAXP, JWS, Spring Boot, Spring 5 & its modules such as Spring Security, Spring AOP, Spring JDBC, Spring WS, Spring MVC, Spring Web Flow & others), AngularJS, Angular, React JS, Servlets, jQuery, XML, XSLT, HTML5/CSS3, Groovy & Grails, Scala.

Integration Tier Technologies: Mule ESB, CloverDX, Apache Camel

Report Packages: Business Objects, BIRT/Actuate, Jasper Reports.

ORM Technologies: Hibernate, TopLink, EclipseLink, iBatis.

iPhone Technologies: Objective-C, Cocoa, iOS.

Big Data Stack: Hadoop 2, Hive, HBase, Spark, Kafka, Zeppelin, Presto, Cloudera Impala, Sqoop, Cloudera and Hortonworks distributions, Machine Learning.

Mobile Technologies: J2ME, J2ME Polish, BlackBerry, Android & iOS

Web Services: SOAP, REST

Single Sign On: SAML standards, symmetric key based proprietary techniques, OpenAM, ForgeRock

Security: PKI, SSL, encryption, hashing, RSA tokenization, PCI-DSS, Spring Security, JSON Web Token (JWT), OAuth2, Fortify

CI/CD: Kubernetes, Docker/Jenkins

Servers: JBoss, Apache Web Server, Tomcat, nginx

Cloud: AWS (EC2, EMR, S3, EBS, Lambda, CloudWatch, DynamoDB, Neptune, RDS, SNS, SQS, ELB, Kinesis and others), Heroku, Azure, Google Cloud

ALM Products: Atlassian Product Suite (Jira, Confluence, Fisheye, Bamboo, Stash)

CMS Products: Alfresco with Liferay, Adobe CQ5, Apache Sling

HTML5 & JS Frameworks: Ext JS (Sencha), jQuery, Sencha Touch, jQuery Mobile, PhoneGap, Angular 7 and Node.js

Others: OSGi, Hudson/Jenkins, Apache Felix, Apache Karaf

NoSQL DB: MongoDB, Cassandra

Typesafe/Lightbend: Scala, Akka, Play

Reporting Tools: Jasper Reports, JasperReports Server

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. Technical Lead

Responsibilities:

  • Writing Java-based REST service APIs using Spring Boot & Spring Cloud, which were invoked by the front end (Angular); a minimal controller sketch follows this list.
  • Writing Java code that consumes/integrates with AWS services.
  • Designed and deployed dynamically scalable, highly available, fault-tolerant applications on AWS infrastructure.
  • Integration of JWT and Swagger with Spring Boot.
  • Sourcing of data from various data sources (relational databases, files, NoSQL databases, streaming data sources) of existing application systems into the big data Hadoop platform using a generalized data pipeline framework.
  • Implementation of automated validation checks using a generic validation framework that runs a set of checks after the ingestion process.
  • Implementation of a centralized configuration application as a microservice to maintain and provide configuration services to the big data pipeline, used across the ingestion, validation, and distribution processes; these configurations are also used by data analysts.
  • Azure: Built data pipelines using Azure Data Factory to move data from source systems to Blob Storage and then through various layers of storage accounts.
  • Also handled deployments for the Azure-based pipelines using Azure DevOps ARM templates, Azure Release Pipelines, and Azure DevOps with Terraform scripts.
  • Implementation of a generalized distribution framework to distribute transformed/aggregated data from the big data Hadoop platform to downstream systems/applications such as FTP locations, Kafka topics & other channels (see the Kafka producer sketch after this list).
  • Implementation of a centralized microservice for tracking ingestion and distribution jobs and their statuses using a unique key.
  • Consolidation of several similar requirements and applications into the same product architecture.
  • AWS: Built pipelines using AWS EMR, S3, Kinesis, and Lambda, and used CloudFormation and Terraform for infrastructure as code.
  • Used Angular and React JS as the JavaScript frameworks for the front-end application.
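
The Angular-facing REST APIs and the centralized configuration microservice above follow a standard Spring Boot controller pattern. Below is a minimal, illustrative sketch; the class name, endpoint paths, and in-memory store are assumptions for illustration, not the actual project code, and it assumes the spring-boot-starter-web dependency:

    // Minimal Spring Boot REST controller sketch; names and endpoints are hypothetical.
    package com.example.config;

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.*;

    @SpringBootApplication
    @RestController
    @RequestMapping("/api/configs")
    public class ConfigServiceApplication {

        // Stand-in for the real persistence layer behind the configuration service.
        private final Map<String, String> store = new ConcurrentHashMap<>();

        @GetMapping("/{key}")
        public Map<String, String> get(@PathVariable String key) {
            Map<String, String> response = new HashMap<>();
            response.put("key", key);
            response.put("value", store.getOrDefault(key, ""));
            return response;
        }

        @PostMapping
        public Map<String, String> save(@RequestBody Map<String, String> entry) {
            store.put(entry.get("key"), entry.get("value"));
            return entry;
        }

        public static void main(String[] args) {
            SpringApplication.run(ConfigServiceApplication.class, args);
        }
    }

In the actual services, security would typically be layered on via Spring Security with a JWT filter, and the endpoints documented with Swagger/OpenAPI annotations.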
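
For the distribution framework's Kafka channel, a hedged sketch of publishing a record to a downstream topic is shown below; the broker address, topic name, key, and payload are placeholders, and it assumes the standard org.apache.kafka:kafka-clients library:

    // Illustrative Kafka producer; broker, topic, key, and value are placeholders.
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class DistributionProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");              // placeholder broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Each distributed record is keyed by the pipeline's unique tracking key.
                producer.send(new ProducerRecord<>("downstream-topic", "job-123",
                        "{\"status\":\"DISTRIBUTED\"}"));
                producer.flush();
            }
        }
    }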

Environment: Java/Scala, Akka, Hadoop 2, HDFS, HBase, Hive, ZooKeeper, Kafka, Cloudera Impala, Spark, Data Science, Machine Learning, J2EE & Spring Boot, JavaScript, Angular, Maven, SBT and DevOps, Kubernetes, AWS, Azure.

Confidential, Charlotte, NC

Sr. Technical Lead

Responsibilities:

  • Designed and developed REST services using Spring Boot / MVC.
  • Worked with JMS queues for sending messages in point-to-point mode (see the JMS sketch after this list).
  • Designed and deployed enterprise-wide scalable operations on AWS.
  • Worked with the JMS API for sending messages between two or more clients.
  • Developed the front end with React JS.
  • Built a Swagger-based application for mocking API implementations.
  • Migration of the T24 product to the AWS cloud.
  • Developed Java services for producing data to and consuming data from Kafka topics.
  • Migrated complex, multi-tier applications to AWS.
  • Designed & implemented the 3-tier architecture for cloud-based deployment of T24 applications (Jenkins, Bitbucket, Sonar, Nexus, Git & Bamboo).
  • Tightened the handling of local vs. global variables in React JS.
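
A minimal sketch of the point-to-point JMS messaging mentioned above, using the javax.jms API; ActiveMQ is shown purely as an example provider, and the broker URL and queue name are placeholders rather than the project's actual values:

    // Illustrative point-to-point JMS sender; broker URL and queue name are placeholders.
    import javax.jms.Connection;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class QueueSender {
        public static void main(String[] args) throws Exception {
            ActiveMQConnectionFactory factory =
                    new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            try {
                connection.start();
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                Queue queue = session.createQueue("sample.queue");       // point-to-point destination
                MessageProducer producer = session.createProducer(queue);
                TextMessage message = session.createTextMessage("{\"orderId\": 1}");
                producer.send(message);                                  // delivered to exactly one consumer
            } finally {
                connection.close();
            }
        }
    }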

Environment: Java/Scala, Akka, Maven, Actuate/BIRT, Spring Boot and libraries, Jenkins, Restlet, AWS, Hadoop 2, HDFS, HBase, Hive, ZooKeeper, Cloudera Impala, Spark & other open-source libraries, Angular, JavaScript & J2EE

Confidential, Chicago, IL

Lead

Responsibilities:

  • Requirement gathering from the client and customization of the Bluemix cloud "Logistics Wizard" app.
  • Participated in the design of the 3-tier architecture and developed the Controller, Service, and DAO layers.
  • Used MongoDB as the NoSQL backend document store (see the sketch after this list).
  • Used MySQL as the primary database.
  • Developed Mule flows for processing inbound files.
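
A hedged sketch of writing a document to MongoDB as the backend document store; the database, collection, and field names are illustrative assumptions, and it assumes the standard MongoDB Java sync driver:

    // Illustrative MongoDB document-store write; names are placeholders.
    import org.bson.Document;
    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;

    public class DocumentStoreExample {
        public static void main(String[] args) {
            try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
                MongoCollection<Document> shipments =
                        client.getDatabase("logistics").getCollection("shipments");
                shipments.insertOne(new Document("shipmentId", "S-1001")
                        .append("status", "IN_TRANSIT"));
            }
        }
    }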

Environment: Java, Spring MVC, ActiveMQ, MongoDB, NoSQL DB, SVN, IntelliJ, Spring Web, Spring JPA, HTML, jQuery, Mule ESB, MySQL

Confidential, Schaumburg, Illinois

Technical lead

Responsibilities:

  • Finalization of the Confidential Insurance Data Integration Hub architecture requirements.
  • Development of the data-intake and data-transformation modules in Spring Boot.
  • Developed the DAO layer for storing claim data in a MySQL database (see the DAO sketch after this list).
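
A minimal sketch of such a DAO layer using Spring's JdbcTemplate against MySQL; the table, columns, and connection details are assumptions for illustration, not the actual schema:

    // Illustrative claim-data DAO; table name, columns, and credentials are placeholders.
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;
    import org.springframework.stereotype.Repository;

    @Repository
    public class ClaimDao {

        private final JdbcTemplate jdbc;

        public ClaimDao(DataSource dataSource) {
            this.jdbc = new JdbcTemplate(dataSource);
        }

        public void saveClaim(String claimId, String payloadJson) {
            jdbc.update("INSERT INTO claims (claim_id, payload) VALUES (?, ?)",
                    claimId, payloadJson);
        }

        public int countClaims() {
            Integer count = jdbc.queryForObject("SELECT COUNT(*) FROM claims", Integer.class);
            return count == null ? 0 : count;
        }

        // Example standalone wiring with placeholder credentials (normally injected by Spring).
        public static ClaimDao withLocalMySql() {
            DriverManagerDataSource ds = new DriverManagerDataSource(
                    "jdbc:mysql://localhost:3306/insurance_hub", "user", "password");
            ds.setDriverClassName("com.mysql.cj.jdbc.Driver");
            return new ClaimDao(ds);
        }
    }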

Environment: Java, Spring, MySQL

Confidential, Herndon, Virginia

Developer

Responsibilities:

  • Replied to the RFP and coordinated with the Infosys recruitment team to obtain resources.
  • Active participation in client telecons for project planning & schedule preparation, which was finally sent to the client by an external project manager in MPP format.
  • Performance improvement planning (JProfiler 7, LoadRunner 8, JConsole).

Environment: Java/J2EE (Struts 2, iBatis), SLF4J, EJB 3, Oracle
