Splunk Engineer Resume
Dallas, TX
PROFESSIONAL SUMMARY:
- Over 7 years of experience in Data Visualization, Analytics, Data Management, Data Integration, and the implementation and maintenance of Business Intelligence and related database platforms.
- Experience in Operational Intelligence using Splunk.
- Proficient with search commands such as stats, chart, timechart, transaction, eval, where, xyseries, and table, and with eval functions such as strptime and strftime; experienced with keyword extraction and sed.
- Expertise with Splunk architecture and its components (indexer, forwarder, search head, deployment server), Heavy and Universal Forwarders, and the license model.
- Experience in creating different visualizations using Bar, Line and Pie chart, Background Maps, Box plots, Scatter plots, Gantt charts, Bubble charts, Histograms, Trend lines & statistics, Bullets, Heat maps and Highlight tables.
- Expertise in Actuate report development, deployment, management, and performance tuning.
- Extensive experience in developing complex mappings with various transformations, including Router, Filter, Sorter, Connected and Unconnected Lookups, Normalizer, Expression, Aggregator, Joiner, Union, Update Strategy, Stored Procedure, and Sequence Generator.
- Implemented a Log Viewer Dashboard as a replacement for an existing tool to view logs across multiple applications hosted on aPaaS setup.
- Familiar with Business Activity Monitoring (BAM) concepts and related IBM tooling.
- Solid grasp of parsing, indexing, and searching concepts, including Hot, Warm, Cold, and Frozen bucketing.
- Expertise in using commands like rex, erex, sed, and IFX to extract fields from log files.
- Extensive knowledge in creating accurate reports for business users using XML, dashboards, visualizations, and pivot tables.
- Apply techniques to optimize searches for better performance, including search-time vs. index-time field extraction, with a solid understanding of configuration files, their precedence, and how they work.
- Knowledge of Splunk configuration files (props.conf, transforms.conf, outputs.conf).
- Extensive data warehouse experience using Informatica 7/8.x/9 Power Center tools (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Server Manager) as the ETL tool on Oracle/DB2 databases.
- Extensive experience in writing packages, stored procedures, functions, and database triggers using PL/SQL and UNIX shell scripts; also handled Oracle utilities like SQL*Loader and Import.
- Working knowledge of data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, FACT and Dimension tables), OLTP, and OLAP.
- Experience in the data mart development life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica Power Center.
- Experienced in all data processing phases, from the Enterprise Model, Data Model (Logical and Physical Model), and Data Warehousing (ETL).
- Good understanding of Views, Synonyms, Indexes, Joins, and Sub-Queries.
- Team player with excellent communication, presentation and interpersonal skills.
- Highly motivated team player with zeal to learn new technologies.
- Ability to manage projects end to end: gathering requirements, selecting and ordering hardware accordingly, and building, configuring, and supporting it.
- Excellent at identifying, tracking, resolving, and closing project issues.
- Experience coordinating and working with infrastructure teams such as DBA, SAN, and network, along with vendors such as IBM, HP, and Splunk.
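As an illustration of the search-time field extraction and configuration files mentioned above, a minimal props.conf/transforms.conf pairing might look like the following sketch (the sourcetype, stanza, and field names are hypothetical):

```
# props.conf -- attach a search-time extraction to a hypothetical sourcetype
[app:payment:log]
REPORT-txn_fields = extract_txn_fields

# transforms.conf -- regex with named capture groups that become fields
[extract_txn_fields]
REGEX = txn_id=(?<txn_id>\w+)\s+status=(?<status>\w+)
```

Because this is a REPORT- (search-time) extraction rather than an indexed field, it can be changed without re-indexing data.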
TECHNICAL SKILLS:
Data Warehousing: Informatica Power Center 9.5/9.1/8.5/8.1.1/7.1.2, Informatica Designer, Workflow Manager, Workflow Monitor, Datamart, Mapplet, Transformations, Autosys, SQL*Plus
Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, ZooKeeper
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans
IDEs: Eclipse, NetBeans
Programming Languages: SQL, PL/SQL, SQL Plus, Unix Shell Scripting, C
Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP
ETL Tools: Informatica, Talend
WORK EXPERIENCE:
Confidential, Dallas, TX
Splunk Engineer
Responsibilities:
- Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.
- Created dashboards, reports, scheduled searches, and alerts.
- Involved in standardizing Splunk forwarder deployment, configuration and maintenance across UNIX and Windows platforms.
- Helped teams onboard data, create various knowledge objects, and install and maintain Splunk apps and technology add-ons (TAs).
- Provided JavaScript services for advanced UI work as well as Python for advanced backend integrations.
- Integrated ServiceNow with Splunk to generate the Incidents from Splunk.
- Extensive experience setting up Splunk to monitor customer volume and track customer activity.
- Served as a Splunk admin capturing, analyzing, and monitoring front-end and middleware applications.
- Worked on DB Connect configuration for Oracle, MySQL, and MS SQL Server.
- Knowledge about Splunk architecture and various components (indexer, forwarder, search head, deployment server), Heavy and Universal forwarder, License model.
- Worked on setting up Splunk to capture and analyze data from various layers: load balancers, web servers, and application servers.
- Captured data from various front-end and middleware applications.
- Performed field extraction using IFX, the rex command, and regular expressions in configuration files.
- Created many of the proof-of-concept dashboards for IT operations and service owners, used to monitor application and server health.
- In the production environment, triaged and resolved complex production issues by analyzing data from various monitoring tools, syslogs, and application logs, often working with multiple teams in real time on conference calls.
- Also worked on code changes for various maintenance and customer-reported bugs as part of production support.
- Used techniques to optimize searches for better performance, including search-time vs. index-time field extraction, applying an understanding of configuration files, their precedence, and how they work.
- Configured various chart types and alert settings; knowledgeable in app creation, user and role access permissions, creating and managing apps, and granting user/role permissions to knowledge objects.
- Created dashboards from searches and scheduled searches, weighing inline vs. scheduled searches within a dashboard.
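A representative SPL search of the kind scheduled behind such a dashboard panel might look like this sketch (the index, sourcetype, and field names are hypothetical):

```
index=app_logs sourcetype=access_combined status>=500
| timechart span=5m count by host
```

Scheduling a search like this and referencing its results keeps the dashboard fast, whereas an inline search re-runs on every page load.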
Environment: Splunk 6.1.3, Linux, Hadoop
Confidential, Detroit, MI
Splunk Power User
Responsibilities:
- Expertise with Splunk UI/GUI development and operations roles.
- Prepared, arranged, and tested Splunk search strings and operational strings.
- Involved in setting up alerts for different type of errors.
- Analyzed security-based events, risks, and reporting instances.
- Developed, evaluated and documented specific metrics for management purpose.
- Created visualizations using SPL to get value out of the data.
- Created dashboards for various types of business users in the organization.
- Played a major role in understanding the logs, server data and brought an insight of the data for the users.
- Provided technical services to projects, user requests and data queries.
- Involved in assisting offshore members to understand the use case of business.
- Assisted internal Splunk users in designing and maintaining production-quality dashboards.
- Used Datameer to analyze the transaction data for the client.
- Installed, configured and managed Datameer users on the Hadoop cluster.
- Involved in writing complex IFX, rex, and multikv commands to extract fields from log files.
- Helped the UNIX and Splunk administrators deploy Splunk across UNIX and Windows environments.
- Helped the client set up alerts for different types of errors.
- Worked with administrators to ensure Splunk is actively and accurately running and monitoring on the current infrastructure implementation.
- Involved in installing and using the Splunk App for Unix and Linux.
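The rex-style extractions above can be approximated outside Splunk too; this is a rough shell sketch of the same idea using sed, with a hypothetical sample log line and field name chosen only to show the pattern:

```shell
#!/bin/sh
# Pull a named field out of a log line with sed, roughly equivalent to
# the Splunk search-time extraction: rex "code=(?<code>\d+)"
line='2014-08-01 12:00:01 level=ERROR code=500 msg="timeout"'

# sed -n with s///p prints only the capture group when the pattern matches
code=$(printf '%s\n' "$line" | sed -n 's/.*code=\([0-9]*\).*/\1/p')
echo "code=$code"
```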
Environment: Splunk 6.0, Pivotal HD, Datameer, Linux, Bash, Perl, HBase, Hive, Pig, HAWQ, sed, rex, erex, Splunk Knowledge Objects
Confidential, Sacramento, CA
Informatica Developer
Responsibilities:
- Used Informatica Power Center for ETL: extracting, transforming, and loading data from heterogeneous source systems into the target database.
- Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
- Involved in extracting the data from the Flat Files and Relational databases into staging area.
- Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Developed Informatica mappings using Aggregators, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Created sessions, extracted data from various sources, transformed data according to requirements, and loaded it into the data warehouse.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
- Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
- Developed several reusable transformations and mapplets that were used in other mappings.
- Prepared Technical Design documents and Test cases.
- Optimized the performance of the mappings by various tests on sources, targets and transformations. Identified and removed the Bottlenecks, and implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Involved in monthly, weekly, and daily scheduling of documents through Control-M.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Worked with the Index Cache and Data Cache for caching transformations such as Rank, Lookup, Joiner, and Aggregator.
- Extensive experience in debugging and troubleshooting Sessions using the Debugger and Workflow Monitor.
Environment: Informatica Power Center 8.6, Informatica Workflow Manager, DB2 UDB 8.1, UNIX Shell Scripting, Control M
Confidential, Atlanta, GA
Informatica Developer
Responsibilities:
- Involved in full Software Development Life Cycle (SDLC) - Business Requirements Analysis, preparation of Technical Design documents, Data Analysis, Logical and Physical database design, Coding, Testing, Implementing, and deploying to business users.
- Prepared SQL Queries to validate the data in both source and target databases.
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
- Achievements include a complete revamp of the Santander Bank testing and production environments, which included retiring hardware and producing around $1 million in savings on hardware maintenance and cost.
- Involved in writing UNIX shell scripts (pre/post-session commands) for sessions; also wrote shell scripts to kick off workflows, unschedule workflows, and get workflow status.
- Tuned SQL statements, mappings, sources, targets, transformations, sessions, the database, and the network to remove bottlenecks, and used Informatica parallelism options to speed up data loading to targets.
- Responsible for Performance Tuning in Informatica Power Center.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the enterprise data warehouse.
- Extensively made use of sorted input option for the performance tuning of aggregator transformation.
- Created various tasks like Pre/Post Session, Command, Timer and Event wait.
- Tuned the performance of mappings by following Informatica best practices and also applied several methods to get best performance by decreasing the run time of workflows.
- Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
- Created Materialized view for summary data to improve the query performance.
- Ran and monitored daily scheduled jobs using Workload Manager to support EDW (Enterprise Data Warehouse) loads for both history and incremental data.
- Worked with the Update Strategy transformation using functions like DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
- Monitored batches and sessions for weekly and monthly extracts from various data sources to the target database.
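The workflow-kickoff shell scripts mentioned above typically wrap pmcmd, Informatica's command-line client. A minimal sketch follows; the integration service and domain names are hypothetical placeholders, and the command is echoed here rather than executed so the assembled invocation can be inspected:

```shell
#!/bin/sh
# Sketch of a pmcmd startworkflow wrapper (hypothetical service/domain).
start_workflow() {
  folder="$1"
  workflow="$2"
  # In the real script this line would run pmcmd directly; echoing it
  # shows the assembled command without needing an Informatica install.
  echo "pmcmd startworkflow -sv INT_SVC_DEV -d DOMAIN_DEV -f $folder -wait $workflow"
}

start_workflow EDW_LOADS wf_daily_load
```

The -wait flag makes the call block until the workflow finishes, which lets a scheduler like Control-M key off the script's exit status.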
Environment: Informatica Power Center 7.1.2, Oracle, Mainframe, DB2, COBOL, VSAM, SQL, PL/SQL
Confidential
JAVA/ Flex Developer
Responsibilities:
- Developed the Total module end to end.
- Involved in JSP/Flex integration to bring up the Flex application as a pop-up on clicking a button on the JSP page.
- Involved in developing Core API for the system.
- Developed this application as a Rich Internet Application (RIA) using Adobe Flex, ActionScript, Cairngorm, etc.
- Created the Java class package structure module-wise.
- Coordinated with team members on developing Flex components using ActionScript and other Java services for web services.
- Developed various Components in Flex using ActionScript to re-use on UI side.
- Created events/Commands using Cairngorm library to achieve MVC at UI side.
- Used event-driven programming to develop UI components with the help of the Flex/Cairngorm libraries.
- Assisted the System Integration Team in integrating all modules of the application.
- Assisted the Deployment Team in building and deploying project on server.
- Assisted the Testing team in testing application in various aspects.
Environment: Java, JSP, Servlet, JDBC, Adobe Flex 3.0, Action Script 3.0, Cairngorm, Adobe Flex Builder, Blaze DS, Unix, putty, TOAD, Eclipse 3.2, Oracle, SQL, Weblogic 9.2.2, SOAP Webservices, CVS, Maven, CSS, XML, Windows XP.
Confidential
Jr. Java Developer
Responsibilities:
- Designed the application based on the architecture of the MVC design pattern.
- Project was developed following Agile and Scrum methodologies.
- Requirement Analysis and Documentation as per SDLC methodologies.
- Converted requirement into flow design diagram using MS Visio.
- Performed data loading using Spring-Hibernate.
- Used WSDL to publish the services in the UDDI registry.
- Configured Hibernate's second-level cache using EHCache to reduce the number of hits to the configuration table data.
- Developed views for JSP pages using AJAX.
- Extensively used Hibernate in the data access layer to access and update information in the database.
- Developed code using Eclipse, HTML, Java, JSP, Swing, Servlets, and SQL.
- Created functional test cases and delivered bug fixes.
- Wrote SQL and PL/SQL (procedures/functions/packages/triggers) to handle business functionality.
- Used an XML SAX parser to process an XML file containing simulated test data.
- Code review and function testing for better client interface and usability.
- Participated in meetings with the team, senior management, and client stakeholders.
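The second-level cache setup referenced above can be sketched as a hibernate.cfg.xml fragment like the following (property values match the classic Hibernate 3-era EHCache provider; cache region settings would live in a separate ehcache.xml):

```xml
<!-- hibernate.cfg.xml fragment: enable the second-level cache via EHCache -->
<property name="hibernate.cache.use_second_level_cache">true</property>
<property name="hibernate.cache.provider_class">org.hibernate.cache.EhCacheProvider</property>
```

Entities then opt in per class (e.g. a read-only cache for the configuration table), which is what cuts the repeated database hits.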
Environment: Java, J2SE, JSP, Servlet, SQL, Oracle9i, JDBC, Swing, Eclipse, HTML, SDLC, MS Office, Windows, AJAX, JPA annotations, SOAP web services, WSDL, UDDI, SAX, DOM