Log Analyzer


This project aims to help developers view live logs/traces from their applications, applying visual styles to the captured text.




Related Projects

LogAnalyzer - Log parser to extract information from Apache server logs.

loganalyzer - Hobby project for analyzing textual log files for semantic correlations


Parses a Linux log (for example /var/log/syslog) and stores it in MongoDB for further analysis.

loganalyzer - Unofficial git repository for LogAnalyzer (http://loganalyzer.adiscon.com)

limp - Loganalyzer IMProved

osm-redaction-loganalyzer - Analyzer for logs of the openstreetmap redaction bot

couchdb-lager - Mirror of Apache CouchDB

For the backend:

```erlang
{lager, [
  {handlers, [
    {lager_console_backend, [info,
      {lager_default_formatter, [time, " [", severity, "] ", message, "\n"]}]},
    {lager_file_backend, [{file, "error.log"}, {level, error},
      {formatter, lager_default_formatter},
      {formatter_config, [date, " ", time, " [", severity, "] ", pid, " ", message, "\n"]}]},
    {lager_file_backend, [{file, "console.log"}, {level, info}]}
  ]}
]}.
```

Included is lager_default_formatter, which provides a generic, default formatting for log messages.

nxlog - Multi platform Log management

nxlog is a modular, multi-threaded, high-performance log management solution with multi-platform support. In concept it is similar to syslog-ng or rsyslog, but it is not limited to Unix/syslog only. It can collect logs from files in various formats, receive logs from the network remotely over UDP, TCP or TLS/SSL, and it supports platform-specific sources such as the Windows EventLog, Linux kernel logs, Android logs, local syslog, etc.

fluent-plugin-detect-exceptions - A fluentd plugin that scans line-oriented log streams and combines exceptions stacks into a single log entry

fluent-plugin-detect-exceptions is an output plugin for fluentd that scans a stream of text messages or JSON records for multi-line exception stack traces. If a consecutive sequence of log messages forms an exception stack trace, those messages are forwarded as a single, combined log message; otherwise, the input log data is forwarded as is. Text log messages are assumed to contain single lines and are combined by concatenating them.
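The combining step can be sketched as a small buffer-and-flush loop: lines that look like part of a stack trace are buffered and emitted as one entry, everything else passes through unchanged. This is a minimal Python sketch, not the plugin's actual implementation; the continuation regex below is an assumption covering Python-style tracebacks, while the real plugin ships detection rules for several languages.

```python
import re

# Assumed pattern for lines that start or continue a Python-style traceback;
# the actual plugin uses per-language exception rules.
CONT = re.compile(r"^(Traceback \(most recent call last\):|\s+\S|\w+Error: )")

def combine_exceptions(lines):
    """Merge consecutive stack-trace lines into single combined entries."""
    buf = []
    for line in lines:
        if CONT.match(line):
            buf.append(line)           # part of an exception: keep buffering
        else:
            if buf:
                yield "\n".join(buf)   # flush the combined stack trace
                buf = []
            yield line                 # ordinary line passes through as is
    if buf:
        yield "\n".join(buf)           # flush a trailing trace at end of stream
```

Text lines are combined here by simple concatenation with newlines, mirroring the behavior described above.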


From the challenge:

> The objective of this challenge is to parse a log file and do some analysis on it.

It is clear that we would require a parser and a file reader, so I created two classes:

* `KamranAhmed\Parser`, which I use to parse the logs and extract chunks of data using regular expressions. I considered the following format for the log:

**Log format**

```
{timestamp} {source}[{process}]: at={log_level} method={http_method} path={http_path} host={http_host} fwd={client_ip} dyno={responding
```
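The key=value extraction that parser performs can be sketched in a few lines. This is a hypothetical Python analogue (the project itself is a PHP class), and the regex only covers unquoted fields of the format shown above:

```python
import re

# Assumed pattern for the at=/method=/path=/... pairs in the log format;
# quoted values would need a more careful regex than this sketch uses.
PAIR = re.compile(r"(\w+)=(\S+)")

def parse_line(line):
    """Extract the key=value fields of one log line into a dict."""
    return dict(PAIR.findall(line))
```

Running it over a line in the format above yields a dict keyed by `at`, `method`, `path`, and so on, ready for the analysis step.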

Flux-Log - Flux log storage and log readers

Graylog2 - Open Source Log Management

Graylog2 is an open source log management solution that stores your logs in ElasticSearch. It consists of a server written in Java that accepts your syslog messages via TCP, UDP or AMQP and stores them in the database. The second part is a web interface that lets you manage the log messages from your web browser. Take a look at the screenshots or the latest release info page to get a feeling for what you can do with Graylog2.
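Any app that can emit syslog can feed such a server; for instance, Python's standard library alone is enough to ship messages to a syslog input. The address below is an assumption for illustration, to be replaced with wherever your Graylog2 syslog input actually listens:

```python
import logging
import logging.handlers

# Hypothetical address: point this at the host/port of your syslog input.
handler = logging.handlers.SysLogHandler(address=("127.0.0.1", 514))
logger = logging.getLogger("myapp")
logger.addHandler(handler)

# Emits a standard syslog datagram (UDP by default) that the server can ingest.
logger.warning("disk usage high on /var")
```

The same handler can be pointed at a TCP input by passing `socktype=socket.SOCK_STREAM`.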

Clarity - Web interface for the grep

Clarity is a Splunk-like web interface for your server log files. It supports searching (using grep) as well as tailing log files in real time. It is written on an event-based architecture using EventMachine, which allows real-time search of very large log files.

funnel - A minimalistic 12 factor log router written in Go

The 12 factor rule for logging says that an app "should not attempt to write to or manage logfiles. Instead, each running process writes its event stream, unbuffered, to stdout." The execution environment should take care of capturing the logs and performing further processing on them; Funnel is this "execution environment". All you have to do from your app is print your log lines to stdout and pipe them to funnel. You can still use any logging library inside your app to handle other things like log levels and structured logging, but don't worry about the log destination: let funnel take care of whether you want to just write to files or stream your output to Kafka. Think of it as a fluentd/logstash replacement (with minimal features!) that has only stdin as an input.
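On the application side that amounts to very little code, sketched here in Python; the record fields are illustrative, since the router only cares that each line arrives complete on stdout:

```python
import json
import time

def log_line(level, message, **fields):
    """Serialize one structured log record as a single JSON line."""
    record = {"ts": time.time(), "level": level, "msg": message, **fields}
    return json.dumps(record)

# flush=True keeps stdout unbuffered, per the 12 factor rule, so the router
# downstream (e.g. `myapp | funnel`) sees each line immediately.
print(log_line("info", "request served", path="/users", status=200), flush=True)
```

Everything else, including file rotation or shipping to Kafka, is left to the environment consuming stdin.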

node-wartremover - stream transform to turn bunyan json log entries into plaintext

Bunyan writes log records as JSON and provides a command-line tool to parse the JSON and generate pretty, colored text for humans. Wartremover allows the human-readable text to be generated inside your server code: it's a stream transform that accepts bunyan JSON logging and writes human-formatted text. Unlike bunyan, it enforces a "one log entry per line" policy: every line must be a complete entry, with a timestamp and log-level prefix. (An exception is made for stack traces, which are written on multiple lines.)
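The core of such a transform is just "parse one JSON line, format one text line". This rough Python analogue of the stream transform assumes bunyan's standard record fields (`time`, `level`, `msg`) and its documented numeric levels:

```python
import json

# Bunyan's default numeric log levels.
LEVELS = {10: "TRACE", 20: "DEBUG", 30: "INFO", 40: "WARN", 50: "ERROR", 60: "FATAL"}

def prettify(json_line):
    """Turn one bunyan JSON log line into a human-readable text line."""
    rec = json.loads(json_line)
    level = LEVELS.get(rec.get("level"), str(rec.get("level")))
    # Timestamp and level prefix first, then the message, one entry per line.
    return f'{rec.get("time", "")} [{level}] {rec.get("msg", "")}'
```

A real transform would additionally pass stack traces through on multiple lines, as noted above.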

LogTracer - A log tracer for iOS that can capture not only general logs but also system logs.