PRADEEP REDDY AYYAPPAVANDLA

Leander, TX

Summary

  • Certified Splunk Admin, Certified Splunk Enterprise Security Admin, and Certified Cribl Admin with 8+ years of IT experience in Splunk development and administration, spanning system analysis, development, and support.
  • Broad knowledge of Splunk architecture and its components, including indexers, search heads, deployment servers, heavy and universal forwarders, and the license model.
  • Experience in operational intelligence using Splunk.
  • Hands-on experience extracting, transforming, analyzing, visualizing, and presenting data from diverse business areas.
  • Good knowledge of Splunk parsing, indexing, and searching concepts.
  • Skilled in editing Splunk configuration files such as props.conf, transforms.conf, outputs.conf, and inputs.conf as requirements dictate.
  • Hands-on experience in data onboarding using forwarders and the REST API.
  • Expertise with search commands and eval functions such as stats, chart, timechart, transaction, strptime, strftime, eval, where, xyseries, and table.
  • Good understanding of the different search types and reporting commands in Splunk.
  • Generated dashboards from searches and weighed inline versus scheduled searches when building dashboard panels.
  • Developed custom app configurations within Splunk to parse and index multiple log formats across all application environments, and set up Splunk forwarders for new applications brought into the environment.
  • Capable of administering Windows Server and Red Hat Enterprise Linux environments.
  • Good knowledge of app creation and of setting up alerts, tags, event types, pivots, and workflow actions.
  • Experience creating users and assigning role-based access permissions.
  • Familiar with field extraction using IFX and the rex and regex commands, as well as extractions defined in configuration files.
  • Hands-on experience with Cribl data onboarding, Functions, Pipelines, and Routes to ingest data from multiple sources into Splunk destinations.
  • Good knowledge of Cribl components such as the Master Node, Worker Nodes, and Worker Groups.
  • Hands-on experience with common Linux command-line tools and shell scripting.
  • Hands-on experience developing scripts for web applications on Linux platforms.
  • Highly motivated team member, frequently sought out for analytical, management, and troubleshooting skills.
  • Strong at solving diverse problems where technical analysis and evaluation are required.
  • Accustomed to meeting deadlines, setting policies, and working with all levels of management and customers.
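As a concrete illustration of the search commands listed above, a representative SPL query (the index, sourcetype, and field names are hypothetical, not taken from any actual engagement):

```spl
index=web_access sourcetype=access_combined status>=500
| eval day=strftime(_time, "%Y-%m-%d")
| stats count AS errors, avg(response_time) AS avg_resp BY day, host
| where errors > 10
| table day, host, errors, avg_resp
```

The query combines eval with strftime to bucket events by day, aggregates with stats, filters with where, and shapes the output with table.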

Overview

8 years of professional experience
5 Certifications

Work History

Splunk Operations L3 Engineer

Deutsche Bank
12.2022 - Current
  • Company Overview: Deutsche Bank is a leading provider of financial services to agencies, corporations, governments, private clients, and institutions in the Americas.
  • Install, configure, and administer Splunk Enterprise instances and Splunk forwarder agents in a large, distributed environment of Windows and Linux hosts.
  • Upgrade Splunk Enterprise components, including search heads, indexers, heavy forwarders, and management servers, in lower environments and in production.
  • Migrate Splunk Enterprise components such as management servers and search heads from virtual to physical servers.
  • Onboard data from different applications using methods such as the Splunk universal forwarder, DB Connect, syslog, and heavy forwarders (app integration).
  • Perform daily deployments of apps to end clients (universal forwarders) and to Splunk instances such as search heads, indexers, and heavy forwarders.
  • Analyze issues reported by the SOC team relating to custom use cases, notables, real-time monitoring dashboards, incident reviews, and adaptive response actions.
  • Monitor Enterprise Security threat intelligence downloads and the related KV store.
  • Create change requests in ServiceNow for weekly production deployments, upgrade tasks, and migration activities.
  • Create, install, and renew Splunk web/server certificates on all components.
  • Investigate KV store issues in the environment and take KV store backups on a regular basis.
  • Analyze and investigate environment health issues reported by Splunk Professional Services, such as ulimits, event parsing, event transformation, and event line breaking.
  • Provide regular support on data ingestion issues affecting Enterprise Security datasets and applications, and deliver root cause analysis reports on them.
  • Troubleshoot endpoint/agent issues by analyzing Splunk logs such as splunkd.log, metrics.log, and scheduler.log in the internal indexes.
  • Create and monitor Splunk DB Connect identities, database connections, database inputs, outputs, and lookups, and analyze broken connections and data indexing issues through DB Connect.
  • Refactor existing queries into well-structured queries that minimize performance impact.
  • Gather and analyze requirements, interacting with team members and users regarding data onboarding.
  • Track issues via incident and problem resolution related to the tools.
  • Document the process of data collection from various sources and design the data flow of the knowledge objects associated with each app.
  • Work closely with Splunk vendors and internal resources.
  • Environment: Splunk 8.x, Linux, shell scripting, SPL, Splunk DB Connect, JIRA, ServiceNow, Bitbucket, Git, Python, Splunk Enterprise Security
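For context on the event parsing and line-breaking issues mentioned above, a minimal props.conf sketch for a hypothetical multi-line JSON sourcetype (the stanza name and timestamp layout are illustrative, not from any actual deployment):

```ini
# props.conf -- illustrative parse-time settings for a hypothetical sourcetype
[my_app:json]
# Break on the newline before each new JSON object instead of merging lines
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\{
# Anchor timestamp extraction to the event's own timestamp field
TIME_PREFIX = "timestamp":"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%z
MAX_TIMESTAMP_LOOKAHEAD = 40
TRUNCATE = 10000
```

Note that LINE_BREAKER's first capture group is consumed as the event boundary, so the opening brace stays with the next event.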

Splunk Admin/Developer

JP Morgan & Chase
02.2018 - 12.2022
  • Company Overview: JPMorgan Chase & Co. is a financial holding company providing financial and investment banking services. The firm offers a range of investment banking products and services in all capital markets, including advising on corporate strategy and structure, capital raising in equity and debt markets, risk management, market making in cash securities and derivative instruments, and brokerage and research.
  • Developed advanced dashboards, visualizations, statistical reports, scheduled searches, and alerts per user requirements.
  • Created knowledge objects such as lookups, tags, event types, macros, alerts, data models, and summary indexing to optimize performance and functionality.
  • Created scheduled cron jobs, analyzed log files, managed user accounts, and created data models on the underlying data to enrich searches.
  • Installed, configured, and administered Splunk Enterprise instances and Splunk forwarder agents in a large, distributed environment of Windows and Linux hosts, with exposure to various Splunk apps for monitoring all firm-wide deployments on endpoints.
  • Onboarded data from different applications using methods such as the Splunk universal forwarder, DB Connect, scripts, and Kafka.
  • Developed shell and Python scripts to onboard source files from SFTP servers and ingest the data into Splunk.
  • Deployed a custom alert action app to the search head cluster via the deployer, used to send CSV files to downstream applications' destination directories on an SFTP server.
  • Developed a job health dashboard to monitor all scheduled summary searches and user search activity.
  • Added custom HTML and CSS to existing and new dashboards to incorporate the required styles and visualizations.
  • Provided regular support to other project teams on data issues and delivered root cause analysis reports on them.
  • Created enriched datasets and transferred data to downstream applications.
  • Refactored existing queries into well-structured queries that minimize performance impact.
  • Troubleshot endpoint/agent issues by analyzing Splunk logs such as splunkd.log, metrics.log, and scheduler.log in the internal indexes.
  • Troubleshot the logic behind scheduled summary jobs and KV store generators whenever data quality issues appeared in their output.
  • Uploaded data provided by the business team as lookups to enrich existing data and produce statistical occupancy analysis reports.
  • Created indexer- and search-head-level transforms and props on raw data to improve the environment's responsiveness to user searches.
  • Created custom data models and designed health monitoring dashboards around them to verify that accelerated summaries were built as expected without skips.
  • Created and monitored Splunk DB Connect identities, database connections, database inputs, outputs, and lookups, and analyzed broken connections and data indexing issues through DB Connect.
  • Created Cribl Sources, Routes, Pipelines, and Destinations to ingest data from source applications into Splunk.
  • Created masking and encryption functions in Cribl pre/post-processing pipelines to ensure Splunk received encrypted data from certain sources.
  • Gathered and analyzed requirements, interacting with team members and users during the design and development of applications and Splunk knowledge objects.
  • Tracked issues via incident and problem resolution related to the tools.
  • Documented the process of data collection from various sources and designed the data flow of the knowledge objects associated with each app.
  • Worked closely with Splunk vendors and internal resources.
  • Environment: Splunk 6.x and 7.x, Linux, shell scripting, SPL, Splunk DB Connect, JIRA, HP Center, ServiceNow, Bitbucket, Git, Python, Cribl
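The index-time masking described above can also be sketched on the Splunk side. A hedged props.conf example using SEDCMD to mask all but the last four digits of card-like numbers before indexing (the sourcetype name is hypothetical, and this is the Splunk-native analogue rather than the Cribl pipeline itself):

```ini
# props.conf -- mask card-like numbers at parse time for a
# hypothetical sourcetype, before events reach the index
[payments:raw]
SEDCMD-mask_card = s/\d{4}-\d{4}-\d{4}-(\d{4})/XXXX-XXXX-XXXX-\1/g
```

SEDCMD applies a sed-style substitution to the raw event on the parsing tier (heavy forwarder or indexer), so the unmasked value is never written to disk.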

Splunk Developer

CenturyLink
10.2016 - 02.2018
  • Company Overview: CenturyLink is a global communications and IT services company focused on connecting its customers to the power of the digital world. CenturyLink provides broadband, voice, video, advanced data, and managed network services over its U.S. fiber network.
  • Provided dashboard analytics, outreach, and awareness to the team on the project.
  • Created reports, dashboards, visualizations, and scheduled searches per the requirements of business end users and different application teams.
  • Worked on an OVS project to create reports on data that were further used by the OVS team.
  • Assisted internal Splunk users in designing and maintaining production-quality dashboards.
  • Created and managed apps; created users, roles, and permissions on knowledge objects; and granted user and role access to other teams.
  • Monitored the Splunk infrastructure and collaborated with the respective teams to design and customize complex search queries and promote advanced searching and data analytics.
  • Developed custom dashboards, data models, and reports and optimized their performance.
  • Created alerts from the developed search strings and scheduled them to monitor the performance of the Splunk environment and report immediately whenever an alert triggered.
  • Created and ran complex search queries using the Splunk REST API.
  • Guided and supported Splunk project teams on complex solutions and issue resolution related to dashboards and reports.
  • Assisted internal teams in onboarding data, creating knowledge objects, and installing and maintaining Splunk apps in the environment.
  • Worked closely with business partners and other teams to address their dashboard-related queries.
  • Created scripts to automate jobs and wired them into alert actions to run when an alert triggers.
  • Created and monitored Splunk DB Connect identities, database connections, database inputs, outputs, and lookups.
  • Managed indexer clusters, including hot, warm, and cold bucket management and retention policies.
  • Worked on log parsing and created well-structured search queries to minimize performance issues.
  • Assisted internally and monitored the infrastructure to ensure Splunk was up and running accurately.
  • Monitored the clustered indexers and reduced Splunk instance errors by modifying configuration files such as inputs.conf, outputs.conf, and props.conf.
  • Created SQL queries to perform data validations.
  • Environment: Splunk 6.5.4 and 6.4, Oracle 11g, SQL, Linux, shell scripting

Education

Master of Science - Computer And Information Systems Security

University of The Cumberlands
Williamsburg, KY
05-2021

Master of Science - Mechanical Engineering

University of Dayton
Dayton, OH
05-2016

Bachelor of Science - Mechanical Engineering

GRIET
05-2014

Skills

  • Languages: SPL, Unix Shell Scripting, Python Scripting
  • Web Technologies: HTML, XML, HTTP, JSON
  • Tools: Splunk 6.x/7.x/8.x, Splunk DB Connect, JIRA, Bitbucket, Git, Confluence, ServiceNow, HP Center, Cribl, Splunk Enterprise Security
  • Operating Systems: Linux, Unix, Windows

Certification


Splunk Enterprise Certified Admin | Credential ID Cert-278598

Splunk Core Certified Advanced Power User | Credential ID Cert-294091

Cribl Certified Observability Engineer (CCOE) | Credential ID 62852337

Cribl Certified Observability Engineer (CCOE) Stream User | Credential ID 53679741

Cribl Certified Admin - Edge | Credential ID 106843852

Languages

English
Full Professional
Hindi
Full Professional
Telugu
Native or Bilingual

Timeline

Splunk Operations L3 Engineer

Deutsche Bank
12.2022 - Current

Splunk Admin/Developer

JP Morgan & Chase
02.2018 - 12.2022

Splunk Developer

CenturyLink
10.2016 - 02.2018

Master of Science - Computer And Information Systems Security

University of The Cumberlands

Master of Science - Mechanical Engineering

University of Dayton

Bachelor of Science - Mechanical Engineering

GRIET