Big Data Infrastructure Engineer Resume
Washington, DC
SUMMARY:
- Hard-working and highly motivated Information Technology professional with 17+ years of experience and a record of superior performance in numerous technical and leadership roles.
- Extensive technology knowledge and capabilities that complement outstanding analytical, hardware and software troubleshooting, problem resolution, and customer service abilities.
- Proven success deploying and supporting large information systems, clients, and end-users.
- Excellent client-facing, teamwork, and interpersonal communication skills.
- Flexible, adaptable, and constantly striving to improve and expand knowledge and abilities.
- Strong troubleshooting skill set for identifying, analyzing, and debugging unknown scenarios.
- Considered an expert at global enterprise scale in the design, implementation, integration, and maintenance of DevOps cloud technologies.
- Virtualization and Cloud (Azure, SaaS, IaaS, PaaS, VMware, Hyper-V).
- Experience in IT Infrastructure security engineering areas.
- Enterprise Data Architectures and Data lifecycles.
- Ability to manage stakeholders' expectations across multiple high-priority competing tasks.
- Document the design and troubleshooting of technology platforms and procedures.
- Experience in conducting root cause analysis of IT Infrastructure and Network malfunctions and their remedial strategies.
- Maintain Government Public Trust Clearance.
TECHNICAL SKILLS:
Software: Cloudera Enterprise 5.14.2, Hortonworks HDP/HDF, Apache Hadoop, YARN, Ambari, Pig, Hive, Spark/Spark2, SOLR, Knox, Ranger, Atlas, ZooKeeper, Nagios, Sqoop, Phoenix, NiFi, Flume, Kafka, HBase, PostgreSQL, Falcon, Oozie, Puppet, GitLab v7.14x, Hyper-V, SolarWinds, Java (JVM), Splunk Enterprise, Centrify, Jira, Confluence, RabbitMQ, IBM WebSphere Application Server ND v6.0x/v6.1x, IBM MQ Series, Cincom Eloquence, Memex, JBoss, Mitchell1, Emblem, Team Connect, Remedy, DameWare, Terminal Server, pcAnywhere, VNC, Remote Desktop, NetApp, Contivity VPN Client, Trend Micro Anti-Virus, Norton Anti-Virus, Ghost, PowerQuest, Legato, Veritas NetBackup v5.1, Nextel BlackBerry desktop software, Citrix, Adobe Suite and Acrobat 5, WinZip, QVT, Visio 2000, Service Center v6.2, PatchLink, RightFax, Sybase, Oracle9i, Oracle Forms 6, Egenera, SSH, PuTTY, Partition Magic, ITIL, and Microsoft Office Suite (NT, 2000, 2003, and XP), including Word, Excel, Access, PowerPoint, and Outlook; VB scripting, WinBatch, PowerShell scripting, Linux automation (Bash, Python, cron jobs, Puppet Enterprise modules/manifests)
Operating Systems: UNIX, Linux, AIX, Ubuntu, RedHat (RHEL 6/7), MS-DOS, and Microsoft Windows 3.x, 9x, NT 4.0, 2000 Professional, 2000 Server (Standard and Enterprise), 2003/2008/2012 Server (Standard and Enterprise R2), XP Professional, Windows 7, Windows 8, Azure cloud technology + AWS EC2 instances.
Administration: Big Data Hadoop (Cloudera Enterprise 5.14.2 and Hortonworks), Apache Tomcat 8.5, Apache Spark/Spark2, Apache Solr, NiFi, Hyper-V Manager, Puppet, Jenkins, Sonatype Nexus, GitLab, Maven, Windows Active Directory, Egenera, VMware administration, Hyper-V, IBM Mainframe, F5 BIG-IP (3-DNS, GTM, TMOS, load balancing).
Hardware: IBM, HP, Compaq, Fuji, EMC, NetApp, CLARiiON, and Dell, as well as F5 load balancers and 3-DNS, Cisco, VMware ESX, Egenera, Unisys, hard drives, floppy drives, compact disc drives, memory, modems, circuit boards, power supplies, printers, and servers.
Networking: LAN/WAN domain and account administration, DHCP and Static IP addressing, OSI Model, Routing, TCP/IP, SMTP, WINS, DNS, Cisco, and Nortel VPN.
PROFESSIONAL EXPERIENCE:
Confidential, Washington, DC
BIG DATA INFRASTRUCTURE ENGINEER
Responsibilities:
- Big Data infrastructure engineering support for the OFR (Office of Financial Research), which promotes financial stability by looking across the financial system to measure and analyze risks, performing essential research, and collecting and standardizing financial data; supported large data sets via the Hadoop ecosystem running on the Cloudera platform, including its service stacks.
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities for OFR environments.
- Implementing and supporting existing ETL processes.
- Monitoring performance via SolarWinds, Nagios, Dell OpenManage Essentials, Dell OpenManage Server Administrator 9.1.0, the HUE browser, and SOLR, and advising on any necessary infrastructure changes.
- Big Data Hadoop ecosystem support/administration for Cloudera Manager, including stacks (HBase, HDFS, YARN, MapReduce, Hive, Hue, Impala, Sentry, Cloudera Director, Pig, ZooKeeper, Atlas, Log Search, Key Trustee, Spark/Spark2, Kerberos, Solr, etc.).
- Daily administration of RHEL 6 / RHEL 7 and VMware VMs, Dell R620/R720/R730 servers, Node.js, Maven, GitLab, Sonatype Nexus v2.x/3.x, Confluence wiki, Jira, Jenkins, Linux cron jobs, Python scripts, Puppet Master, Bash scripts, Ansible playbooks with AWS automations, Bamboo, and Thycotic Secret Server 8.8.
- Centrify Infrastructure services including Privileged Access Service, Authentication Service, Privilege Elevation Service, and Auditing and Monitoring Service.
- Defining data retention policies and automated backup recovery solutions.
- Proficient understanding of distributed computing principles.
- Management of the Hadoop cluster with all included services: HDP, HDFS, administration, security, auditing, governance integration, provisioning, and data lifecycles.
- Ability to solve ongoing cluster-operation issues (missing blocks, networking) with strong triaging and troubleshooting knowledge.
- Proficiency with Hadoop v2, MapReduce, HDFS.
- Experience with supporting stream-processing systems, using solutions such as Storm or Spark-Streaming, Hadoop Grid computing setup and administration.
- Good knowledge of Big Data querying tools, such as Pig, Hive, Impala, Talend.
- Experience with Spark/Spark2 customization and management integration.
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, Postgres.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Good understanding of Big Data architecture topologies and delivering business solutions.
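The data-retention bullet above can be sketched as a minimal policy helper. This is an illustrative example only, not the resume owner's actual tooling; the naming convention (backup names ending in a YYYY-MM-DD date) is an assumption.

```python
from datetime import datetime, timedelta, date

def backups_to_purge(backup_names, retention_days, today=None):
    """Return backup names older than the retention window.

    Assumes (hypothetically) that each name ends in a YYYY-MM-DD
    date stamp, e.g. 'hdfs-2024-01-15'.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=retention_days)
    purge = []
    for name in backup_names:
        # Parse the trailing 10-character date stamp from the name.
        stamp = datetime.strptime(name[-10:], "%Y-%m-%d").date()
        if stamp < cutoff:
            purge.append(name)
    return purge
```

A real retention job would feed the returned names to the backup system's delete command (e.g. an HDFS or NetBackup cleanup step) from a cron schedule.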
Confidential, Washington, DC
BIG DATA INFRASTRUCTURE ENGINEER
Responsibilities:
- Big Data infrastructure engineering support for the Dept. of Justice across 1,000+ Linux nodes.
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities for Non-Prod + Production environments.
- Implementing and supporting existing ETL processes.
- Monitoring performance via Nagios, Ambari, SOLR, and NiFi, and advising on any necessary infrastructure changes.
- Big Data Hadoop ecosystem support/administration for Hortonworks, including stacks (YARN, MapReduce, Tez, Hive, Pig, HBase, Oozie, ZooKeeper, Falcon, Ambari Infra, Ambari Metrics, Grafana, Atlas, Kafka, Knox, Log Search, Ranger, Ranger KMS, Spark/Spark2, Druid, Kerberos, Mahout, NiFi, Slider, Solr, Zeppelin).
- Daily administration of RHEL 6 / RHEL 7 and Hyper-V Manager VMs, NPM, Node.js, Maven, GitLab, Sonatype Nexus v2.x/3.x, Confluence wiki, Jira, Jenkins, Linux cron jobs, Python scripts, Puppet Master, Bash scripts, and PowerShell script automations.
- Defining data retention policies and automated backup recovery solutions.
- Proficient understanding of distributed computing principles.
- Management of the Hadoop cluster with all included services: HDP, HDFS, administration, security, auditing, governance integration, provisioning, and data lifecycles.
- Ability to solve any ongoing cluster-operation issues (missing blocks, networking) with strong triaging and troubleshooting knowledge.
- Proficiency with Hadoop v2, MapReduce, HDFS.
- Experience with building stream-processing systems, using solutions such as Storm or Spark-Streaming, SOLR Collection creations, Nifi Cluster setup and administration.
- Good knowledge of Big Data querying tools, such as Pig, Hive, Impala.
- Experience with Spark/Spark2 customization with Ambari management integration.
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, Postgres.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Experience with various messaging systems, such as Kafka.
- Good understanding of Big Data architecture topologies and delivering business solutions.
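The stream-processing bullet above (Storm / Spark-Streaming) can be illustrated with a minimal micro-batch word count in plain Python. This only sketches the stateful micro-batch idea (in the spirit of Spark Streaming's `updateStateByKey`); it is not Spark code and all names are illustrative.

```python
from collections import Counter

def micro_batch_counts(batches):
    """Fold each micro-batch (a list of text lines) into a running
    word count, yielding a snapshot of the state after every batch.
    """
    state = Counter()
    for batch in batches:
        for line in batch:
            state.update(line.split())
        # Yield a copy so later batches don't mutate earlier snapshots.
        yield dict(state)
```

In a real Spark-Streaming job the batches would arrive from a receiver (e.g. a Kafka topic) on a fixed interval instead of an in-memory list.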
Confidential, Chevy Chase, MD
DevOps - Senior Systems Engineer
Responsibilities:
- DevOps Azure Infrastructure Support (Topologies, Project Engineering, IT Architect)
- Splunk Enterprise Infrastructure setup and monitoring support for 5000+ servers.
- PowerShell Azure cloud automation (Azure ILB probing, DevOps catalog, CM process, and more).
- COTS application support (Various Confidential In-House) Legacy apps.
- Leading mainframe legacy application .NET modernization efforts for the Billing Dept.
- Manage and support TFS (Team Foundation Server), utilized by the Billing Dept. for .NET source-control development.
- Assisted with Data Center VM migration with VMware Technology.
- Administer 5000+ servers, provide on-call support, perform software troubleshooting in support of the company’s Billing Dept. business goals and technical objectives, and coordinate IT Infrastructure for any platform required.
- Strong skills in troubleshooting and identifying root cause, with resolutions delivered in a timely fashion.
- Oversee the packaging of features from the correct code branch for hand-off to engineers for deployment to the testing pipeline.
- Provide technical coordination between QA, Testing, Operations, and Development teams.
- Configure team solution properties for code check-in to comply with architectural guidance.
- Manage daily builds in a high-priority manner from initial creation to release; also responsible for maintaining build infrastructure and daily build tools, from designing, developing, and debugging to improving the system.
- Attend weekly meetings to keep informed regarding divisional plans and drive integration schedule for multiple teams.
- Maintain internal documentation and a website with integration, build, suite, and other BFD-related information.
- Excellent debugging and troubleshooting skills, as well as the ability to handle multiple time-critical issues concurrently.
- Experience in writing automated scripts (batch files, PowerShell, JScript) and using build tools (MSBuild, makefiles).
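The packaging and scripting bullets above can be sketched as a minimal shell step that archives a build output into a versioned artifact. The branch name, paths, and artifact layout here are entirely hypothetical, stand-ins for whatever the real TFS/MSBuild pipeline produced.

```shell
#!/bin/sh
# Minimal sketch: package a build output directory into a
# date-stamped artifact for hand-off to the testing pipeline.
set -eu

BRANCH="release-1.0"     # hypothetical code branch
OUT_DIR="build/output"   # hypothetical build output directory

# Stand in for a real build step producing binaries.
mkdir -p "$OUT_DIR" dist
echo "demo" > "$OUT_DIR/app.dll"

STAMP=$(date +%Y%m%d)
tar -czf "dist/${BRANCH}-${STAMP}.tar.gz" -C "$OUT_DIR" .
```

A real pipeline would replace the `mkdir`/`echo` stand-in with the actual build invocation (e.g. MSBuild) and publish `dist/` to the deployment share.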
Confidential, Rockville, MD
Senior Systems Engineer
Responsibilities:
- Administer more than 5,000 servers, provide on-call support, perform hardware and software troubleshooting in support of the company’s business goals and technical objectives, and coordinate with other IT teams to facilitate timely issue resolution.
- Perform VMware administration, Veritas NetBackup administration, load balancing using F5 equipment, and DNS changes.
- Manage the call schedule for web hosting team and present project and status updates to management.
- Move servers and related equipment into and out of the data center, replace and repair failing equipment, build and configure F5 load balancers, and deploy system patches using PatchLink.
- Communicate effectively with clients to keep them apprised on production issues affecting servers and the applications residing on them.
- Interface effectively with vendors to facilitate equipment acquisitions/purchases and resolve issues.
- Strong skills in using application/system monitoring tools such as Big Brother, OVO, Infradesk.
- Track and manage the data center’s inventory of server equipment.
- Perform company standard server and document server installations and rebuilds.
- Train new system engineers on workflow processes and procedures.
- Manage Citrix web farm utilized as jump boxes for customers to access various Geo centers.
- Helped with multiple data center consolidation/migration efforts.
Confidential, Rockville, MD
Senior Help Desk Technician
Responsibilities:
- Interfaced with clients and customers to facilitate the delivery of Help Desk support and ensure the highest level of customer service possible.
- Installed hardware and software and set up desktop computers, which included establishing network connectivity, mapping printers and network shared drives, and resolving setup-related application and hardware issues.
- Analyzed trends and researched frequent hardware and software issues to formulate effective resolutions to a variety of problems.
- Documented troubleshooting steps for company applications and hardware and related issues.
- Led a number of successful technical projects and implemented new processes and procedures.
- Updated workflow tickets, inventoried equipment, and provided technical training to new desktop technicians.
Lead Data Operator
Responsibilities:
- Provided 24x7 on-call support and data center management in support of over 400 servers, which included directing the efforts of four subordinates and overseeing tape backups, server installs, file transfers, cabling, and server monitoring.
- Coordinated work efforts with senior management, organized work and project schedules, and delegated tasks and responsibilities among team members.
- Communicated daily with team members on projects and team objectives, documented daily tasks and responsibilities, and logged daily tape backups and file transfers to clients.
- Operated Compaq tape backup equipment using Legato software.
- Analyzed and resolved data center equipment failures.
- Provided training to team members on running backups and transferring files using FTP.