Epylog - a Syslog parser

Epylog is a syslog parser which runs periodically, looks at your logs, processes some of the entries in order to present them in a more comprehensible format, and then mails you the output. It is written specifically for large network clusters where a lot of machines (around 50 and upwards) log to the same loghost using syslog or syslog-ng.

It publishes reports to a file with optional notification via email.
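
As a rough illustration of the kind of work epylog automates (a generic sketch only, not epylog's actual module API; the log path and the mail recipient are placeholders), a periodic job might scan a syslog file, tally entries by host and program, and mail the summary:

```
# Generic sketch of periodic syslog summarization -- not epylog's real API.
# The log path and the recipient address are placeholders.
import re
from collections import Counter
from email.message import EmailMessage
from smtplib import SMTP

SYSLOG_LINE = re.compile(
    r'^(?P<ts>\w{3}\s+\d+ \d\d:\d\d:\d\d) (?P<host>\S+) '
    r'(?P<prog>[\w./-]+)(\[\d+\])?: (?P<msg>.*)$')

def summarize(path='/var/log/messages'):
    """Count syslog entries per (host, program) pair."""
    counts = Counter()
    with open(path, errors='replace') as fh:
        for line in fh:
            m = SYSLOG_LINE.match(line)
            if m:
                counts[(m['host'], m['prog'])] += 1
    return '\n'.join('%s %s: %d entries' % (host, prog, n)
                     for (host, prog), n in counts.most_common())

def mail_report(body, rcpt='root@localhost'):
    """Mail the summary through the local MTA."""
    msg = EmailMessage()
    msg['Subject'], msg['From'], msg['To'] = 'Log summary', 'logs@localhost', rcpt
    msg.set_content(body)
    with SMTP('localhost') as smtp:
        smtp.send_message(msg)

if __name__ == '__main__':
    mail_report(summarize())
```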

https://fedorahosted.org/epylog/


Related Projects

nxlog - Multi-platform log management


nxlog is a modular, multi-threaded, high-performance log management solution with multi-platform support. In concept it is similar to syslog-ng or rsyslog, but it is not limited to unix and syslog only. It can collect logs from files in various formats and receive logs from the network remotely over UDP, TCP or TLS/SSL. It supports platform-specific sources such as the Windows Eventlog, Linux kernel logs, Android logs, local syslog, etc.

Graylog2 - Open Source Log Management


Graylog2 is an open source log management solution that stores your logs in ElasticSearch. It consists of a server written in Java that accepts your syslog messages via TCP, UDP or AMQP and stores them in the database. The second part is a web interface that allows you to manage the log messages from your web browser. Take a look at the screenshots or the latest release info page to get a feel for what you can do with Graylog2.
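
Because it accepts plain syslog, any syslog-capable client can feed it; as a minimal sketch, Python's standard SysLogHandler can point at a Graylog2 syslog input over UDP (the hostname, port and facility below are placeholders for your own input):

```
# Hedged sketch: ship application logs to a Graylog2 syslog input over UDP.
# 'graylog.example.com' and port 5140 are placeholders for your input's address.
import logging
import logging.handlers

handler = logging.handlers.SysLogHandler(
    address=('graylog.example.com', 5140),
    facility=logging.handlers.SysLogHandler.LOG_LOCAL0)

log = logging.getLogger('myapp')
log.setLevel(logging.INFO)
log.addHandler(handler)

log.info('user %s logged in', 'alice')  # arrives in Graylog2 as a syslog message
```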

X-Itools: Enterprise Collaboration


Enterprise Collaboration modules and strong Log Analysis modules

Logsandra - log management using Cassandra


Logsandra is a log management application written in Python that uses Cassandra as its back-end. It was written as a demo for Cassandra, but it is worth a look. It also provides support for creating your own parsers.

Flume - Log management using HDFS


Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data. It has a simple and flexible architecture based on streaming data flows. It is robust and fault tolerant with tunable reliability mechanisms and many failover and recovery mechanisms. It uses a simple extensible data model that allows for online analytic applications.



Chainsaw - log viewer and analysis tool


Chainsaw is a companion application to Log4j written by members of the Log4j development community. Chainsaw can read log files formatted in Log4j's XMLLayout, receive events from remote locations, read events from a DB, and it can even work with JDK 1.4 logging events.

Clarity - Web interface for grep


Clarity is a Splunk-like web interface for your server log files. It supports searching (using grep) as well as tailing log files in real time. It is built on an event-based architecture using EventMachine, which allows real-time search of very large log files.

Fluentd - Data collector, Log Everything in JSON


Fluentd is an event collector system. It is a generalized version of syslogd that handles JSON objects for its log messages. It collects logs from various data sources and writes them to files, databases or other types of storage.
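
For instance, assuming the third-party fluent-logger package and a Fluentd forward input listening on the default port 24224 (the tag and record fields below are made up for illustration), an application can emit structured JSON events like this:

```
# Hedged sketch using the fluent-logger package (pip install fluent-logger).
# The tag 'myapp' and the record fields are illustrative placeholders.
from fluent import sender, event

sender.setup('myapp', host='localhost', port=24224)

# Sent as a JSON record tagged 'myapp.login'; Fluentd routes it to whatever
# output (file, database, ...) is configured for that tag.
event.Event('login', {'user': 'alice', 'status': 'ok'})
```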

Octopussy - Perl/XML Logs Analyzer, Alerter & Reporter


Octopussy is a log analyzer tool. It analyzes logs, generates reports and alerts the admin. It has LDAP support to maintain the user list, and it exports reports by email, FTP & SCP. Scheduled reports can be generated, and RRDtool is used to generate graphs.

White-elephant - Hadoop log aggregator and dashboard


White Elephant is a Hadoop log aggregator and dashboard which enables visualization of Hadoop cluster utilization across users. The server is a JRuby web application. In a production environment it can be deployed to Tomcat and reads aggregated usage data directly from Hadoop. This data is stored in an in-memory database provided by HyperSQL. Charting is provided by Rickshaw. This project is developed by LinkedIn.

pocket-playlab-challenge


From the challenge: "The objective of this challenge is to parse a log file and do some analysis on it." It is clear that we would require a parser and a file reader, so I created two classes, one of them ``KamranAhmed\Parser``, which I use to parse the logs and get the chunks of data using some regular expressions. I considered the following log format: ``{timestamp} {source}[{process}]: at={log_level} method={http_method} path={http_path} host={http_host} fwd={client_ip} dyno={responding``
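
The challenge solution itself is PHP (the ``KamranAhmed\Parser`` class above); purely as an illustration of the same regular-expression idea in Python, the key=value fields of such a line can be pulled out in one pass (the sample line below is invented to match the quoted format):

```
# Illustrative only -- not the challenge's PHP parser. The sample line is
# invented to match the quoted format; real logs may differ.
import re

line = ('2014-01-09T06:16:53+00:00 heroku[router]: at=info method=GET '
        'path=/api/users/100/count_pending_messages host=example.com '
        'fwd="10.0.0.1" dyno=web.1')

fields = dict(re.findall(r'(\w+)=("[^"]*"|\S+)', line))
print(fields['method'], fields['path'])
# -> GET /api/users/100/count_pending_messages
```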

Kafka - A high-throughput distributed messaging system


Kafka provides a publish-subscribe solution that can handle all activity stream data and processing on a consumer-scale web site. This kind of activity data (page views, searches, and other user actions) is a key ingredient in many of the social features of the modern web. Such data is typically handled by "logging" and ad hoc log aggregation solutions due to the throughput requirements, and this kind of ad hoc solution is a viable way of providing logging data to Hadoop.
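
As a sketch of that publish-subscribe model (assuming the third-party kafka-python client and a broker on localhost:9092; the topic name and event fields are arbitrary), a producer publishes activity events and any number of consumers can subscribe to the topic:

```
# Hedged sketch with the kafka-python client (pip install kafka-python).
# The broker address, topic name and event fields are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers='localhost:9092',
    value_serializer=lambda v: json.dumps(v).encode('utf-8'))

# Publish a page-view event to the 'activity' topic.
producer.send('activity', {'event': 'page_view', 'user': 'alice', 'path': '/home'})
producer.flush()
```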

Sentry - Realtime Platform-Agnostic Error Logging and Aggregation platform


Sentry is a realtime event logging and aggregation platform. It specializes in monitoring errors and extracting all the information needed to do a proper post-mortem without any of the hassle of the standard user feedback loop.
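
As a minimal sketch, assuming the current sentry-sdk Python client (the DSN below is a placeholder for the one Sentry issues for your project), captured exceptions are shipped with their stack traces and context:

```
# Hedged sketch using the sentry-sdk client (pip install sentry-sdk).
# The DSN is a placeholder; use the one issued for your project.
import sentry_sdk

sentry_sdk.init(dsn='https://examplePublicKey@o0.ingest.sentry.io/0')

try:
    1 / 0
except ZeroDivisionError as exc:
    # Sends the exception, its stack trace and local context to Sentry.
    sentry_sdk.capture_exception(exc)
```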

Webalizer - fast web server log file analysis


The Webalizer is a fast web server log file analysis program. It produces highly detailed, easily configurable usage reports in HTML format, for viewing with a standard web browser. It handles standard Common logfile format (CLF) server logs, several variations of the NCSA Combined logfile format, wu-ftpd/proftpd xferlog (FTP) format logs, Squid proxy server native format, and W3C Extended log formats.
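
For reference, a Common logfile format record looks like the invented sample below; the regex is only an illustration of the fields such analyzers read, not Webalizer's own parser:

```
# Illustration of a Common logfile format (CLF) record -- not Webalizer's parser.
# The sample line is made up.
import re

CLF = re.compile(r'^(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
                 r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
                 r'(?P<status>\d{3}) (?P<size>\d+|-)$')

line = '203.0.113.7 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
m = CLF.match(line)
print(m['request'], m['status'], m['size'])  # -> GET /index.html HTTP/1.0 200 2326
```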

Awstats - Advanced web, streaming, ftp and mail server statistics


AWStats is a powerful tool that generates advanced web, streaming, FTP or mail server statistics graphically. It can analyze log files from all major server tools, such as Apache, WebStar, IIS and many other web, proxy, WAP and streaming servers, as well as mail servers and some FTP servers. This log analyzer works as a CGI or from the command line and shows you all the information your log contains in a few graphical web pages.

clash - Clojure Log Analysis Shell - interactive log analysis with clojure


Clojure Log Analysis Shell - interactive log analysis with clojure

perl-dpkg-log - Dpkg::Log - Perl library for dpkg log parsing and analysis


Dpkg::Log - Perl library for dpkg log parsing and analysis

Indihiang - IIS and Apache log analyzing tool


Indihiang Project is a web log analyzing tool. It analyzes IIS and Apache web logs and generates real-time reports. It has a web log viewer and analyzer and is capable of analyzing trends from the logs. The tool also integrates with Windows Explorer, so you can open a log file in Indihiang via the context menu.

squidreporter - Squid reporter - squid log analysis using mysql as log backend


Squid reporter - squid log analysis using mysql as log backend

SCRUBS: SQL Reporting Services audit, log, management & optimization analysis


SCRUBS scrubs your SQL Reporting Services logs to provide management, auditing & optimization reporting. SSRS provides robust logging in the Execution Log, but no management metrics out of the box. You'd have to develop your own data warehouse, SSIS procedures and metrics for reporting. Not anymore!