Crawl

This is a rewrite of Linley Henzell's game Crawl in C++. Crawl is a roguelike similar to Moria, Angband, and NetHack.

http://crawl5.sourceforge.net

Related Projects

Crawler4j


Crawler4j is an open source Java crawler which provides a simple interface for crawling the web. Using it, you can set up a multi-threaded web crawler in 5 minutes!
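
As a rough sketch of what that looks like in practice: you subclass WebCrawler, override shouldVisit and visit, and start several crawler threads through a CrawlController. Class names below follow the later crawler4j releases; constructors and method signatures vary between versions, and the seed URL is just a placeholder.

import edu.uci.ics.crawler4j.crawler.CrawlConfig;
import edu.uci.ics.crawler4j.crawler.CrawlController;
import edu.uci.ics.crawler4j.crawler.Page;
import edu.uci.ics.crawler4j.crawler.WebCrawler;
import edu.uci.ics.crawler4j.fetcher.PageFetcher;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtConfig;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtServer;
import edu.uci.ics.crawler4j.url.WebURL;

public class MyCrawler extends WebCrawler {

    // Only follow links that stay on the seed site (placeholder domain).
    @Override
    public boolean shouldVisit(Page referringPage, WebURL url) {
        return url.getURL().startsWith("https://www.example.com/");
    }

    // Called once for every downloaded page.
    @Override
    public void visit(Page page) {
        System.out.println("Visited: " + page.getWebURL().getURL());
    }

    public static void main(String[] args) throws Exception {
        CrawlConfig config = new CrawlConfig();
        config.setCrawlStorageFolder("/tmp/crawl-storage");  // intermediate crawl data
        PageFetcher fetcher = new PageFetcher(config);
        RobotstxtServer robots = new RobotstxtServer(new RobotstxtConfig(), fetcher);
        CrawlController controller = new CrawlController(config, fetcher, robots);
        controller.addSeed("https://www.example.com/");
        controller.start(MyCrawler.class, 4);                // 4 crawler threads
    }
}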

Kaskuslite - Kaskus wrapper PHP class


A PHP wrapper for fetching threads and posts from the kaskus.us forum. Next, it may be developed into a complete vBulletin-specific wrapper with various features, including authentication, a link grabber, trending-topic detection, a junk-post auto marker, and so on. Usage:
$kaskus = new Kaskus;
// Get The Lounge threads (subforum id = 21) from page 1 to 5
$lounge = $kaskus->crawl_threads(21,1,5);
// Read a thread (Ex: Thread ID = 4163603)
$content = $kaskus->read_thread(4163603);
// Read a Th

Flaxcrawler - Simple and flexible java web crawler.


Introduction: flaxcrawler is an open source web crawler written in Java. It is a very fast, lightweight, multi-threaded crawler that is easy to set up and use. You can configure its behaviour with plenty of settings, or you can even use your own implementations of the flaxcrawler components. Example:
package com.googlecode.flaxcrawler.examples;
import com.googlecode.flaxcrawler.CrawlerConfiguration;
import com.googlecode.flaxcrawler.CrawlerController;
import com.googlecode.flaxcrawler.CrawlerException;
import com

Lightcrawler - Open Source Crawler for Java


Light Crawler: an open source crawler for Java based on Crawler4j. Crawler4j is a good project that is smart and easy to use; we learned a lot from it and it helped us greatly. We added some useful tools and made changes, rebuilding it as LightCrawler to share and study. LightCrawler can control the depth of the crawl: the crawler will stop at the specified depth. LightCrawler can choose which URLs should be crawled and which should not by configuring forbidden and allowed regexes. LightCrawler is also Multi
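
The depth limit plus allowed/forbidden regex idea is easy to picture. The sketch below is a generic illustration of that filtering logic, not LightCrawler's actual classes or configuration API; all names and URLs here are made up.

import java.util.List;
import java.util.regex.Pattern;

// Generic sketch: a URL is crawled only if it is within the depth limit,
// matches no forbidden pattern, and matches at least one allowed pattern.
public class CrawlFilter {
    private final int maxDepth;
    private final List<Pattern> allowed;
    private final List<Pattern> forbidden;

    public CrawlFilter(int maxDepth, List<Pattern> allowed, List<Pattern> forbidden) {
        this.maxDepth = maxDepth;
        this.allowed = allowed;
        this.forbidden = forbidden;
    }

    public boolean shouldCrawl(String url, int depth) {
        if (depth > maxDepth) return false;
        if (forbidden.stream().anyMatch(p -> p.matcher(url).find())) return false;
        return allowed.isEmpty() || allowed.stream().anyMatch(p -> p.matcher(url).find());
    }

    public static void main(String[] args) {
        CrawlFilter filter = new CrawlFilter(
                3,
                List.of(Pattern.compile("^https?://example\\.com/")),
                List.of(Pattern.compile("\\.(jpg|png|zip)$")));
        System.out.println(filter.shouldCrawl("https://example.com/docs/", 2));  // true
        System.out.println(filter.shouldCrawl("https://example.com/a.zip", 1));  // false: forbidden regex
        System.out.println(filter.shouldCrawl("https://example.com/deep/", 5));  // false: too deep
    }
}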

Pupsniffer - Parallel URL Pattern (Pup) Sniffer


This is Pup (Parallel URL Pattern) Sniffer, an efficient multilingual web corpus tool. What is it: an implementation and enhancement based on the following paper (Kit and Ng 2007): • Chunyu Kit and Jessica Y. H. Ng. 2007. An intelligent Web agent to mine bilingual parallel pages via automatic discovery of URL pairing patterns. In 2007 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology - Workshops: Workshop on Agents and Data Mining Interaction (ADMI-07), p
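
The core idea of URL pairing is that parallel pages on a bilingual site usually differ only by a language marker in the URL. The sketch below illustrates that idea with a hand-written marker list and made-up URLs; the Kit and Ng (2007) method discovers the pairing patterns automatically, which is not shown here.

import java.util.*;

// Simplified illustration of URL pairing: normalise each URL by replacing
// known language markers with a placeholder, then group URLs by that key.
// URLs that share a key are candidate parallel (bilingual) pairs.
public class UrlPairing {
    private static final String[] LANG_MARKERS = {"en", "english", "tc", "chi", "chinese"};

    static String key(String url) {
        String k = url.toLowerCase();
        for (String m : LANG_MARKERS) {
            // Replace the marker only when it is delimited by / . _ or -
            k = k.replaceAll("(?<=[/._-])" + m + "(?=[/._-])", "<LANG>");
        }
        return k;
    }

    public static void main(String[] args) {
        List<String> urls = List.of(
                "http://example.gov/en/about.html",
                "http://example.gov/tc/about.html",
                "http://example.gov/en/news/2007.html");
        Map<String, List<String>> groups = new HashMap<>();
        for (String u : urls) {
            groups.computeIfAbsent(key(u), k -> new ArrayList<>()).add(u);
        }
        // Print the groups that contain more than one URL: candidate pairs.
        groups.values().stream().filter(g -> g.size() > 1).forEach(System.out::println);
    }
}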

Webscraping - Python library for web scraping


Overview: The webscraping library aims to make web scraping easier. All code is pure Python and has been run across multiple Linux servers, Windows machines, as well as Google App Engine. Examples (common):
>>> from webscraping import common
>>> common.remove_tags('hello <b>world</b>!')
'hello world!'
>>> common.extract_domain('http://www.google.com.au/tos.html')
'google.com.au'
>>> common.unescape('&lt;hello&nbsp;&amp;&nbsp;world&gt;')
'<hello & world>'
>>> common.extract_emails('hello richard AT sitescraper

Onestopshop - Aardwolf Helper Program


I started this just to crawl or scan the Aardwolf wiki to quickly bring out map links and keep them available. Its features kept growing. Here is a small list. 1. Reads Areas/Spells & Skills/Clans down from the wiki each time you request them, so it stays up to date with the game's wiki; if an area is added to the wiki, One Stop Shop will scan it in. 2. Links to all the various websites that can be opened in OSS's main browser. 3. A section for people new to mudding or Aardwolf. 4. The program can stay on top of o

Py-picturegallery - Create picture galleries from your flickr photos


Create offline browsable galleries from Flickr photos. This package started out as a backup script I ran regularly to recover my photos stored on Flickr. Running ./backup.py --cfg=/path/to/config will crawl Flickr for all sets, and the photos within those sets, in one account. As a first extension, I created HTML pages to browse the backup. Once the backup grew too big to fit on one CD, I had to come up with some scripts to create configurable portions of the backup. That is what this software does. ./gallery.py --cf
