elk-docker - Elasticsearch, Logstash, Kibana (ELK) Docker image


This Docker image provides a convenient centralised log server and log management web interface by packaging Elasticsearch, Logstash, and Kibana, collectively known as ELK. See the ELK Docker image documentation web page for complete instructions on how to use this image.
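
For illustration, a typical invocation might look like the following (a hedged sketch: the sebp/elk image name and port list follow the project's documentation and may differ for your setup; 5601 is Kibana, 9200 is Elasticsearch, 5044 is the Logstash Beats input):

    docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -it --name elk sebp/elk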

node-logstash - Simple Logstash implementation in Node.js: file log collection, sent with ZeroMQ


node-logstash is a Node.js implementation of Logstash: a tool that collects logs on servers and ships them to a central server and to Elasticsearch for indexing.
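
A hedged sketch of what an agent invocation might look like, using the project's URL-style plugin configuration (the log path and host are placeholders, not values from the repository):

    node bin/node-logstash-agent \
      input://file:///var/log/nginx/access.log \
      output://zeromq://tcp://logserver:5555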

VulnWhisperer - Create actionable data from your Vulnerability Scans


VulnWhisperer is a vulnerability data and report aggregator. It pulls all reports and creates a file with a unique filename, which is then fed into Logstash. Logstash extracts data from the filename and tags all of the information inside the report (see the logstash_vulnwhisp.conf file); the data is then shipped to Elasticsearch to be indexed. The following instructions are intended as a sample guide in the absence of an existing ELK cluster/node; they cover an example Debian install of a stand-alone Elasticsearch and Kibana node.
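
The shape of that pipeline, as a minimal sketch (the real filter logic lives in logstash_vulnwhisp.conf; the report path and Elasticsearch host here are placeholders):

    input {
      file {
        path => "/opt/VulnWhisperer/data/**/*.json"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }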

LogTrail - Log Viewer plugin for Kibana


LogTrail is a plugin for Kibana to view, analyze, search and tail log events from multiple hosts in real time, with a DevOps-friendly interface inspired by Papertrail.

kibi - Kibi is a friendly Kibana fork, kept in sync with upstream, which adds support for joins across indexes and external sources, a tabbed navigation interface and more


Kibi extends Kibana 5.5.2 with data intelligence features; the core feature of Kibi is the capability to join and filter data from multiple Elasticsearch indexes and from SQL/NoSQL data sources ("external queries"). In addition, Kibi provides UI features and visualizations like dashboard groups, tabs, cross-entity relational navigation buttons, an enhanced search results table, analytical aggregators, HTML templates on query results, and much more.

logstash-logger - Ruby logger that writes logstash events


LogStashLogger extends Ruby's Logger class to log directly to Logstash. It supports writing to various outputs in Logstash JSON format, an improvement over writing to a file or syslog, since Logstash can receive the structured data directly. Instead of a hash, you can configure the logger with a URI; this is useful in environments such as Heroku where you may want to read configuration values from the environment. The URI scheme is type://host:port/path?key=value. Some sample URI configurations are given below.
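
A minimal sketch of both configuration styles (the UDP type and port 5228 follow the gem's README examples and assume a matching Logstash UDP input):

    require 'logstash-logger'

    # Hash-based configuration
    logger = LogStashLogger.new(type: :udp, host: 'localhost', port: 5228)

    # Equivalent URI-based configuration, e.g. read from the environment on Heroku
    logger = LogStashLogger.new(uri: 'udp://localhost:5228')

    logger.info 'test message'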

HELK - The Incredible HELK


A Hunting ELK (Elasticsearch, Logstash, Kibana) with advanced analytic capabilities. At the end of the HELK installation, you will see output with the information you need to access the primary HELK components. Remember that the default username and password for HELK are helk:hunting.
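
A hedged sketch of the install steps (the repository URL and installer script name follow the project's README; run this on a supported Linux host):

    git clone https://github.com/Cyb3rWard0g/HELK
    cd HELK && sudo ./helk_install.sh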

logstash-gelf - Graylog Extended Log Format (GELF) implementation in Java for all major logging frameworks: log4j, log4j2, java.util.logging


See also http://logging.paluch.biz/ or http://www.graylog2.org/resources/gelf/specification for further documentation. You need to install the library with its dependencies (see download above) in Glassfish: place it below the $GFHOME/glassfish/domains/$YOURDOMAIN/lib/ext/ path, then configure the Java Util Logging handler in your logging.properties file.
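
A hedged sketch of what those logging.properties entries might look like (the GelfLogHandler class name follows the project's documentation; host and port are placeholders for your GELF input):

    handlers = biz.paluch.logging.gelf.jul.GelfLogHandler
    .level = INFO

    biz.paluch.logging.gelf.jul.GelfLogHandler.host = udp:localhost
    biz.paluch.logging.gelf.jul.GelfLogHandler.port = 12201
    biz.paluch.logging.gelf.jul.GelfLogHandler.extractStackTrace = true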


silk - Silk is a port of the Kibana 4 project.


Silk is an open source (Apache licensed), browser-based analytics and search dashboard for Solr. It is a snap to set up and strives to be easy to get started with, while also being flexible and powerful.

redis-healthy - Periodically retrieves metrics from Redis (or Sentinel) and sends them to Logstash


redis-healthy periodically retrieves metrics from Redis (or Sentinel), such as latency, connected_clients and instantaneous_ops_per_sec, and then sends them to Logstash.

nessusbeat - A Beat that monitors a local Nessus reports directory and outputs scan results to Elasticsearch or Logstash


Nessusbeat provides a Beat that monitors a local Nessus installation's reports directory and exports, parses, and outputs scan results to supported Beat outputs. To build the Nessusbeat binary, run the command below; this will generate a binary named nessusbeat in the same directory.
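
Assuming the standard Beats build workflow, that command is typically just the following (an assumption; check the repository's README and Makefile):

    make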

dockelk - ELK log transport and aggregation at scale


Clone the repo and run the full stack along with an NGINX container for testing. I've created a dedicated Docker network so each container can have a fixed IP.
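
A hedged sketch of that network setup (the network name, subnet and addresses are placeholders, not the repository's actual values):

    docker network create --subnet 172.25.0.0/16 elk_net
    docker run --net elk_net --ip 172.25.0.10 -d --name nginx nginx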

logstash-output-jdbc - JDBC output for Logstash


This plugin is provided as an external plugin and is not part of the Logstash project. It allows you to output to SQL databases using JDBC adapters. See below for tested adapters and example configurations.
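
A minimal sketch of such an output block (option names follow the plugin's README; the driver path, connection string and statement are placeholder values for a hypothetical MySQL target):

    output {
      jdbc {
        driver_jar_path => "/opt/drivers/mysql-connector-java.jar"
        connection_string => "jdbc:mysql://localhost:3306/logs?user=logstash&password=secret"
        statement => [ "INSERT INTO log (message, received_at) VALUES(?, ?)", "message", "@timestamp" ]
      }
    }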

docker-gelf-logging-example - An example of an application that uses GELF Docker logging driver to communicate with an ELK instance


This is a really simple application that outputs random fake names using the console's different methods: warn, error and log. The output is then redirected to an ELK instance, where we can filter it by log level.
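
For illustration, running a container with the GELF logging driver might look like this (a hedged sketch; the image name, tag and GELF address are placeholders for your setup):

    docker run --log-driver gelf \
      --log-opt gelf-address=udp://localhost:12201 \
      --log-opt tag=fake-names-app \
      fake-names-image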

logger_logstash_backend - Logstash backend for the Elixir Logger


A backend for the Elixir Logger that sends logs to a Logstash UDP input. Add it to your mix.exs dependencies, then run mix deps.get to install it.
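
A hedged sketch of the wiring (the module name and config keys follow the project's README; the version, host and port are placeholders):

    # mix.exs
    defp deps do
      [{:logger_logstash_backend, "~> 5.0"}]
    end

    # config/config.exs; point host/port at your Logstash UDP input
    config :logger,
      backends: [{LoggerLogstashBackend, :logstash}]

    config :logger, :logstash,
      host: "127.0.0.1",
      port: 9125,
      type: "my_app"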

elk-windows-installer - Elasticsearch Logstash Kibana Windows Installer


Here you can find an installer for the ELK stack (Elasticsearch - Logstash - Kibana) for Windows. There are a few tutorials on the internet that describe how to perform this setup manually; this installer deploys the required files and registers the ELK services on the system, hopefully saving you some time in the process. You can download the installer from the releases section.

logstash-laravel-logs - Process Laravel Log files on Logstash and forward to ElasticSearch


This will parse the contents of the laravel.log sample file. Of course you can replace that file with your actual log, or specify a different filename. The result from Elasticsearch shows the tokens of the log file broken down into individual keys.
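
The core of such a pipeline is a grok filter keyed to Laravel's default log line format, e.g. [2018-01-01 12:00:00] production.ERROR: message. A hedged sketch (field names are illustrative, not necessarily those used by the repository):

    filter {
      grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{LOGLEVEL:severity}: %{GREEDYDATA:content}" }
      }
    }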