
elk-docker - Elasticsearch, Logstash, Kibana (ELK) Docker image


This Docker image provides a convenient centralised log server and log management web interface, by packaging Elasticsearch, Logstash, and Kibana, collectively known as ELK. See the ELK Docker image documentation web page for complete instructions on how to use this image.

docker-elk - The ELK stack powered by Docker and Compose.


Run the latest version of the Elastic stack with Docker and Docker Compose. It will give you the ability to analyze any data set by using the searching/aggregation capabilities of Elasticsearch and the visualization power of Kibana.

VulnWhisperer - Create actionable data from your Vulnerability Scans


VulnWhisperer is a vulnerability data and report aggregator. VulnWhisperer pulls all the reports and creates a file with a unique filename, which is then fed into Logstash. Logstash extracts data from the filename and tags all of the information inside the report (see the logstash_vulnwhisp.conf file). The data is then shipped to Elasticsearch to be indexed. The following instructions serve as a sample guide in the absence of an existing ELK cluster/node; they cover an example Debian installation of a stand-alone Elasticsearch and Kibana node.
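The exact pipeline lives in the project's logstash_vulnwhisp.conf; as a rough, hypothetical sketch (paths, field names, and the index pattern below are assumptions, not taken from the repo), a filename-driven Logstash pipeline of this shape could look like:

```
input {
  file {
    # Assumed location where VulnWhisperer drops its uniquely named reports
    path => "/opt/VulnWhisperer/data/**/*.json"
  }
}
filter {
  # Extract scan metadata from the unique filename, as the description notes
  grok {
    match => { "path" => "%{GREEDYDATA}/%{DATA:scan_name}_%{INT:scan_id}.json" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-vulnwhisperer-%{+YYYY.MM}"
  }
}
```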




LogTrail - Log Viewer plugin for Kibana


LogTrail is a plugin for Kibana to view, analyze, search, and tail log events from multiple hosts in real time, with a devops-friendly interface inspired by Papertrail.

node-logstash - Simple Logstash implementation in Node.js: file log collection, sent over ZeroMQ


node-logstash is a Node.js implementation of Logstash: a tool that collects logs on servers and sends them to a central server and to Elasticsearch for indexing.

kibi - Kibi is a friendly, kept-in-sync Kibana fork that adds support for joins across indexes and external sources, a tabbed navigation interface, and more


Kibi extends Kibana 5.5.2 with data intelligence features; the core feature of Kibi is the capability to join and filter data from multiple Elasticsearch indexes and from SQL/NoSQL data sources ("external queries"). In addition, Kibi provides UI features and visualizations like dashboard groups, tabs, cross-entity relational navigation buttons, an enhanced search results table, analytical aggregators, HTML templates on query results, and much more.

logstash-logger - Ruby logger that writes logstash events


LogStashLogger extends Ruby's Logger class to log directly to Logstash. It supports writing to various outputs in logstash JSON format. This is an improvement over writing to a file or syslog since Logstash can receive the structured data directly. You can use a URI to configure your logstash logger instead of a hash. This is useful in environments such as Heroku where you may want to read configuration values from the environment. The URI scheme is type://host:port/path?key=value. Some sample URI configurations are given below.
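As a minimal sketch of that URI form (the host, port, and scheme below are placeholders, not values from the source), Ruby's standard URI module shows how the type://host:port pieces break apart:

```ruby
require 'uri'

# Hypothetical LogStashLogger-style URI in the type://host:port/path?key=value form.
uri = URI.parse('udp://logstash.example.com:5228')

type = uri.scheme   # the output type, e.g. "udp", "tcp", or "file"
host = uri.host
port = uri.port

# With the gem installed, this URI would typically be passed along the lines of:
#   logger = LogStashLogger.new(uri: 'udp://logstash.example.com:5228')
# (call shape assumed from the description above, not verified here)
puts "#{type} #{host} #{port}"
```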


HELK - The Incredible HELK


A Hunting ELK (Elasticsearch, Logstash, Kibana) with advanced analytic capabilities. At the end of the HELK installation, you will see output with the information you need to access the primary HELK components. Remember that the default username and password for HELK are helk:hunting.

JustLog - JustLog brings logging on iOS to the next level


JustLog takes logging on iOS to the next level. It supports console, file and remote Logstash logging via TCP socket with no effort. Support for logz.io available. At Just Eat, logging and monitoring are fundamental parts of our job as engineers. Whether you are a back-end engineer or a front-end one, you'll often find yourself in the situation where understanding how your software behaves in production is important, if not critical. The ELK stack for real-time logging has gained great adoption over recent years, mainly in the back-end world where multiple microservices often interact with each other.

logstash-gelf - Graylog Extended Log Format (GELF) implementation in Java for all major logging frameworks: log4j, log4j2, java


See also http://logging.paluch.biz/ or http://www.graylog2.org/resources/gelf/specification for further documentation. You need to install the library with its dependencies (see download above) in Glassfish: place it below the $GFHOME/glassfish/domains/$YOURDOMAIN/lib/ext/ path, then add the Java Util Logging handler to your logging.properties file.
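A logging.properties fragment for the Java Util Logging handler might look like the following sketch (the handler class name is taken from the library's package naming; the host and port values are placeholders):

```
handlers = biz.paluch.logging.gelf.jul.GelfLogHandler, java.util.logging.ConsoleHandler

# GelfLogHandler settings (host/port here are placeholder values)
biz.paluch.logging.gelf.jul.GelfLogHandler.host = udp:localhost
biz.paluch.logging.gelf.jul.GelfLogHandler.port = 12201
biz.paluch.logging.gelf.jul.GelfLogHandler.level = INFO
```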

silk - Silk is a port of the Kibana 4 project.


Silk is an open-source (Apache-licensed), browser-based analytics and search dashboard for Solr. It is a snap to set up, and it strives to be easy to get started with while also being flexible and powerful.

redis-healthy - Periodically retrieves metrics from Redis (or Sentinel) and sends them to Logstash


redis-healthy periodically retrieves metrics from Redis (or Sentinel), such as latency, connected_clients, instantaneous_ops_per_sec, and others, and then sends them to Logstash.

nessusbeat - A Beat that monitors a local Nessus reports directory and outputs scan results to Elasticsearch or Logstash


Nessusbeat provides a Beat that monitors a local Nessus installation's reports directory and exports, parses, and outputs scan results to supported Beat outputs. To build the binary for Nessusbeat run the command below. This will generate a binary in the same directory with the name nessusbeat.

dockelk - ELK log transport and aggregation at scale


Clone the repo and run the full stack along with an NGINX container for testing. I've created a dedicated Docker network so each container can have a fixed IP.

logstash-output-jdbc - JDBC output for Logstash


This plugin is provided as an external plugin and is not part of the Logstash project. It allows you to output to SQL databases using JDBC adapters. See below for tested adapters and example configurations.
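An output configuration for such a JDBC plugin might be sketched as follows (MySQL is used here purely as a hypothetical example; the jar path, connection string, and table schema are assumptions, and option names should be checked against the plugin's own documentation):

```
output {
  jdbc {
    # Driver jar and class for your database (example values)
    driver_jar_path => "/opt/logstash/vendor/mysql-connector-java.jar"
    driver_class => "com.mysql.jdbc.Driver"
    connection_string => "jdbc:mysql://localhost/logs?user=logstash&password=secret"
    # Parameterized insert; the named event fields are bound to the ? placeholders
    statement => [ "INSERT INTO log (host, message) VALUES(?, ?)", "host", "message" ]
  }
}
```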

docker-gelf-logging-example - An example of an application that uses GELF Docker logging driver to communicate with an ELK instance


This is a really simple application that outputs random fake names using the different console methods: warn, error, and log. The output is then redirected to an ELK instance, where we can filter it by log level.