
Sr. Splunk Developer Resume


Phoenix, AZ

SUMMARY

  • 8 years of extensive experience as a Splunk Developer in the Information Technology field, working on client/server applications with a strong emphasis on Business Intelligence (BI) and Data Warehousing (DW) projects.
  • Experience in Operational Intelligence using Splunk and in deploying and administering Splunk clusters.
  • Helped application teams in on-boarding Splunk and created dashboards, alerts, reports, etc.
  • Upgraded and optimized the Splunk setup with new releases and set up Splunk forwarders for new application tiers brought into the environment.
  • Experience with Splunk architecture and its components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, parsing, indexing and searching concepts, Hot/Warm/Cold/Frozen bucketing, and the license model.
  • Knowledge of the extract keyword, sed, knowledge objects and various search commands.
  • Engineered Splunk to build, configure and maintain heterogeneous environments, with in-depth knowledge of logs generated by various systems, including security products.
  • Installed, tested and deployed monitoring solutions with Splunk services.
  • Provided technical services to projects, user requests and data queries.
  • Worked on capacity planning and license evaluation for the enterprise Splunk instance.
  • Restructured the Splunk infrastructure, which was built on Windows servers, by moving it to the Linux platform, and was responsible for planning and upgrading various Linux servers.
  • Created DLP (Data Leakage Prevention) reports through Splunk.
  • Worked with timechart attributes such as span and bins, tags and event types; created dashboards and reports using XML (see the SPL sketch after this list).
  • Created dashboards from searches and scheduled searches, and compared inline vs. scheduled searches in a dashboard.
  • Exposure in configuring Network bonding, DHCP, DNS, NFS and FTP servers.
  • Worked extensively with Informatica Designer to create mappings using Expression, Router, Lookup (connected and unconnected), Source Qualifier, Aggregator and other transformations.
  • Experienced in all data processing phases, from the Enterprise Model, Data Model (Logical and Physical Model), and Data Warehousing (ETL).
  • Experience in Data Mart life cycle development, performed ETL procedure to load data from different sources into Data marts, Data warehouse using Informatica Power Center.
  • Extensive Data Warehouse experience using Informatica 7/8.x/9 PowerCenter tools (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Designer, Repository Manager and Server Manager) as an ETL tool on Oracle/DB2 databases.
  • Experienced in designing and implementing BI solutions with Microsoft SQL Server, Integration Services (SSIS), Reporting Services (SSRS), Tableau and MicroStrategy.
  • Extensive experience in writing packages, stored procedures, functions and database triggers using PL/SQL and UNIX shell scripts; also handled Oracle utilities such as SQL*Loader and Import.
  • Involved in SDLC (Software Development Life Cycle)-Analysis, Design, Development and testing.
  • Good documentation skills with proficiency in Microsoft suite of software.
  • Excellent interpersonal abilities, communication, time-management and team skills, with the intention to work hard to meet project deadlines in stressful environments.
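
A minimal SPL sketch of the kind of search that sits behind these scheduled reports and dashboard panels, assuming a hypothetical web index, sourcetype and field names; span and bins control how timechart buckets the results:

    index=web_logs sourcetype=access_combined status>=500
    | timechart span=1h count BY host

    index=web_logs sourcetype=access_combined
    | timechart bins=24 avg(response_ms) AS avg_response_time

Saved as a scheduled search, a query like this can back a dashboard panel instead of running inline on every page load.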

TECHNICAL SKILLS

Log Analysis Tools: Splunk Enterprise 4.x/5.x/6.x, Splunk Universal Forwarder, Informatica PowerCenter 9.5/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Flat Files (Fixed, CSV, Tilde Delimited, XML, COBOL, AS/400)

Operating Systems: MS-DOS, Windows 2000/98/XP/Vista/7, Windows R2, Mac, IBM AIX (5.1/6.1), Red Hat Linux, VMware

Languages: HTML, CSS, XHTML, XML, JavaScript, MySQL, Objective-C, C++, VB.NET, Java 1.7/1.6

Monitoring Tools: Wily Introscope 8.x/9.x, BSM Topaz, Tivoli Performance Viewer, Nagios, NMON, IBM Thread and Heap Analyzers

Database: Oracle 10g/9i/8i/8/7.x, MS SQL Server 2000/7.0/6.5, Teradata, Sybase ASE, PL/SQL, TOAD 8.5.1/7.5/6.2, DB2 UDB

Dimensional Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling

PROFESSIONAL EXPERIENCE

Confidential, Phoenix, AZ

Sr. Splunk Developer

Responsibilities:

  • Designed, supported and maintained the Splunk infrastructure on Windows, Linux and UNIX environments.
  • Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.
  • Created dashboards, reports, scheduled searches and alerts.
  • Installed Splunk Enterprise, Splunk Forwarder, Splunk Indexer and apps on multiple servers (Windows and Linux) with automation.
  • Troubleshot Splunk search head, indexer and forwarder issues and documented them.
  • Created Splunk app for Enterprise Security to identify and address emerging security threats through the use of continuous monitoring, alerting and analytics.
  • Created shell scripts to install Splunk forwarders on all servers and configure them with common configuration files such as bootstrap scripts, outputs.conf and inputs.conf (see the sketch after this list).
  • Upgraded Splunk to 6.2.3 with patching across multiple servers without downtime.
  • Worked on DB Connect configuration for Oracle, MySQL and MSSQL.
  • Performed field extraction using IFX, the rex command and regular expressions in configuration files.
  • Set up Splunk to capture and analyze data from various layers: load balancers, web servers and application servers.
  • Expertise in Actuate Reporting, development, deployment, management and performance tuning of Actuate reports.
  • Involved in interacting with business owners, developers and business analysts in improving the application.
  • Managed Splunk configuration files like inputs, props, transforms, and lookups.
  • Installed and upgraded Splunk software in distributed and clustered environments for numerous corporations and public entities.
  • Configured Splunk for all mission-critical applications and used Splunk effectively for application troubleshooting and monitoring post go-live.
  • Used techniques to optimize searches for better performance, including search-time vs. index-time field extraction, along with an understanding of configuration files, their precedence and how they work.
  • Developed detailed documentation for the installation and configuration of Splunk and Splunk apps.
  • Utilized Agile methodology (SDLC) to manage the development lifecycle.
  • Responsible for quality assurance of finished websites including the validation of web forms and links.
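
A minimal sketch of the kind of Universal Forwarder configuration pushed out by those install scripts; the hostnames, index and sourcetype below are placeholders rather than the actual environment:

    # outputs.conf - point the forwarder at the indexer tier
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = idx01.example.com:9997, idx02.example.com:9997

    # inputs.conf - monitor an application log directory
    [monitor:///var/log/myapp/*.log]
    index = app_logs
    sourcetype = myapp:log
    disabled = false

In practice, stanzas like these would typically be distributed as a deployment-server app so the configuration stays consistent across servers.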

Environment: Splunk Enterprise Server 6.2.0/6.2.3, Splunk Universal Forwarder, RedHat Linux, IBM HTTP Web Server 6.1/7/8, Oracle, HACMP 5.4, HTML, Perl, JavaScript, XML, IIS 7, Windows 2008 R2, Python (Jython), Regular Expressions.

Confidential - Dallas, TX

Splunk Developer

Responsibilities:

  • Worked on Splunk Search Processing Language (SPL), Splunk dashboards and the Splunk DB Connect app.
  • Expertise with Splunk UI/GUI development and operations roles. Prepared, arranged and tested Splunk search strings and operational strings.
  • Configuration management and event triggering using ServiceNow.
  • Analyzed security-based events, risks and reporting instances.
  • Using SPL, created visualizations to get value out of the data.
  • Used eval functions where necessary to create new fields at search run time.
  • Splunk Engineer/Dashboard Developer responsible for the end-to-end event monitoring infrastructure of business-aligned applications.
  • Extracted complex fields from different types of log files using regular expressions (see the SPL sketch after this list).
  • Developed, evaluated and documented specific metrics for management purposes.
  • Experience in consuming REST and SOAP Web services.
  • Experience using JSON and XML formats; wrote PL/SQL and Splunk queries.
  • Involved in helping the UNIX and Splunk administrators deploy Splunk across the UNIX and Windows environments.
  • Configured the SSO Integration add-on app for user authentication and single sign-on in Splunk Web.
  • Used the WebLogic Scripting Tool (WLST) to administer and monitor WebLogic servers and performed regular performance tuning and monitoring.
  • Worked with administrators to ensure Splunk is actively and accurately running and monitoring on the current infrastructure implementation.
  • Utilized Waterfall Methodology (SDLC) to manage development lifecycle.
  • Later participated in the testing phase alongside the QA team to help them come up with the best testing scenarios.
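
A minimal SPL sketch combining search-time field extraction with rex and a calculated field with eval, as described above; the index, sourcetype and field names are hypothetical:

    index=app_logs sourcetype=myapp:log
    | rex field=_raw "user=(?<user>\w+)\s+duration=(?<duration_ms>\d+)"
    | eval duration_s = round(duration_ms / 1000, 2)
    | stats count, avg(duration_s) AS avg_duration_s BY user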

Environment: Windows Server 2012/2008/2003 R2, Linux and UNIX servers, Splunk, SQL Server 2008, SAN, WLAN, ServiceNow, Tivoli.

Confidential, Austin, TX

Informatica/ETL/Data Warehouse Developer

Responsibilities:

  • Worked closely with data modelers on Sybase PowerDesigner for dimensional modeling (star schema).
  • Used Informatica 8.6.0 to extract, transform and load data from multiple input sources like flat files, SQL Server into Sybase ASE and IQ database.
  • Created Mappings, Mapplets and Transformations using the Designer and developed Informatica sessions as per the business requirement.
  • Worked on Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer and Transformations, Informatica Repository Manager, Workflow Manager and Monitor.
  • Worked with different types of partition techniques like key range, pass through, Round Robin and Hash partitioning.
  • Generated Reusable Transformations, Mapplets and used them extensively in many mappings.
  • Administered Workflow Manager to run workflows and worklets.
  • Involved in performance tuning on Informatica mappings.
  • Used workflow manager for creating, validating, testing and running the Sequential and Concurrent batches and sessions and scheduling them to run at specified time with required frequency.
  • Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
  • Extensively utilized the Informatica data cleansing tool IDQ 8.5 and Data explorer IDE.
  • Created database Triggers, Stored Procedures, Exceptions and used Cursors to perform calculations when retrieving data from the database.
  • Performed unit testing and system testing of Informatica mappings and workflows.
  • Implemented error-handling logic, which involved testing incorrect input values for the mappings and the means of handling those errors.
  • Responsible for folder migration, folder creation and creating user accounts in the Repository Manager, and for management of passwords and permissions.
  • Used shell scripts that compare the incoming file with the existing file and create a log file with the differences; the changed values are then loaded into the target (see the sketch after this list).
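
A minimal ksh sketch of that file-comparison step, assuming placeholder paths; it logs the differences between the previous and incoming files so only changed records are considered for the target load:

    #!/bin/ksh
    PREV_FILE=/data/archive/customers_prev.dat
    NEW_FILE=/data/incoming/customers.dat
    DIFF_LOG=/data/logs/customers_diff_$(date +%Y%m%d).log

    # Fail early if the incoming file has not arrived
    if [ ! -f "$NEW_FILE" ]; then
        echo "Incoming file not found: $NEW_FILE" >&2
        exit 1
    fi

    # Record the differences for the downstream load and audit trail
    diff "$PREV_FILE" "$NEW_FILE" > "$DIFF_LOG"

    # Keep the new file as the baseline for the next run
    cp "$NEW_FILE" "$PREV_FILE"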

Environment: Informatica Power Center 8.6.1/8.1.1, Sybase ASE, UNIX scripting (KSH), Windows 2000, Power Exchange 8.6.1, IDE 8.6.0, IDQ 8.6.0

Confidential

ETL Developer

Responsibilities:

  • Analyzed Business Requirements, Analytics and strategies to improve the business.
  • Identified the facts and dimensions using the Erwin data modeling tool to represent the star schema data marts. Coordinated offshore team planning and review of deliverables.
  • Responsible for documenting user requirements and translated requirements into system solutions.
  • Used shell scripting to check whether the flat files have arrived on the correct path and to trigger the Informatica jobs from the UNIX server (see the pmcmd sketch after this list).
  • Evaluated the consistency and integrity of the model and repository.
  • Designed and built Data Marts for home and business divisions.
  • Installed and configured Informatica Server and Power Center 7.2. Migrated the Metadata changes to the Informatica repository.
  • Responsible for Data Import/Export, Data Conversions and Data Cleansing.
  • Created Informatica mappings with SQL procedures to build business rules to load data.
  • Used Transformations for data joins, complex aggregations & external procedure calls.
  • Worked on Informatica Power Center 7.2 tool - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplets, and Transformations.
  • Various kinds of the transformations were used to implement simple and complex business logic.
  • Transformations used are: Remote Procedure, Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregators, Filters, Sequence Generator, etc.
  • Integrated sources from different databases and flat files.
  • Designed mappings with multiple sources using Informatica Designer tool.
  • Implemented various performance tuning concepts to enhance the jobs performance.
  • Logical and Physical Warehouse Designing using Erwin Tool.
  • Extensively used Star schema for designing the Data warehouse.
  • Responsible for various discussions with end user to analyze Meta data.
  • Designed and Developed the ETL mapping and load strategy documents.
  • Designed the ETL Mapping Documents for development team members for their implementations.
  • Worked on Informatica - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplet, and Transformations.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Designed and developed the Workflow and worklets for all load process.
  • Worked on enhancements to OBIEE reports.
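
A minimal ksh sketch of the file check and workflow trigger described above; the Integration Service, domain, folder and workflow names are placeholders, and the exact pmcmd flags vary by PowerCenter version:

    #!/bin/ksh
    SRC_FILE=/data/incoming/sales_extract.dat

    # Trigger the Informatica workflow only when the flat file has landed
    if [ -f "$SRC_FILE" ]; then
        pmcmd startworkflow -sv INT_SVC_DEV -d DOMAIN_DEV \
            -u "$INFA_USER" -p "$INFA_PWD" \
            -f FLDR_SALES wf_load_sales
    else
        echo "Source file missing: $SRC_FILE" >&2
        exit 1
    fi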

Environment: Informatica Power Center 7.2, Power Exchange 8.6.1, IDE 8.6.0, IDQ 8.6.0, Teradata, TOAD, XML, Windows NT, Erwin, OBIEE.

Confidential

PL/SQL Developer

Responsibilities:

  • Member of a development team tasked with creating Oracle PL/SQL stored procedures and UNIX shell scripts to extract data from very large tables for legal auditing purposes.
  • Generated DDL for new tables and researched production issues related to commission payments.
  • Supported and modified Actuate reports currently in production and responded expediently to requests to create, modify and fix reports.
  • Worked closely with upper management and consultants onshore and offshore of the team in Documenting and coding the reports.
  • Query Optimization, Tuning SQL queries.
  • Interacted with senior developers to ascertain program specifications of changes and/or upgrade using PL/SQL.
  • Developed procedures for an efficient error-handling process by capturing errors into user-managed tables (see the PL/SQL sketch after this list).
  • Developed UNIX shell scripts to automate table creation and execute procedures.
  • Involved in Report Design and Coding for Standard Tabular type reports, including Drill down and Drill through functionality and Graphical presentations such as Charts.
  • Managed change control based on customer requirements according to the project workflow.
  • Assisted in performance assessment of the project.
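
A minimal PL/SQL sketch of that error-handling pattern, with errors captured into a user-managed log table; the table, column and procedure names are placeholders:

    CREATE OR REPLACE PROCEDURE load_audit_extract (p_run_id IN NUMBER) IS
        v_code NUMBER;
        v_msg  VARCHAR2(4000);
    BEGIN
        -- Extract the records needed for the audit into a staging table
        INSERT INTO audit_extract_stg (run_id, acct_id, trans_amt)
            SELECT p_run_id, acct_id, trans_amt
            FROM   commission_payments
            WHERE  trans_dt >= TRUNC(SYSDATE) - 1;
        COMMIT;
    EXCEPTION
        WHEN OTHERS THEN
            v_code := SQLCODE;
            v_msg  := SUBSTR(SQLERRM, 1, 4000);
            ROLLBACK;
            -- Capture the Oracle error into the user-managed error table
            INSERT INTO etl_error_log (run_id, err_code, err_msg, logged_at)
            VALUES (p_run_id, v_code, v_msg, SYSDATE);
            COMMIT;
    END load_audit_extract;
    /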

Environment: PL/SQL, Forms 6i, Reports 6i, Oracle 9i, UNIX, XSLT, XML, C#, ASP.Net, DTS, MS Visio, Erwin.
