requests-threads - 🎭 Twisted Deferred Thread backend for Requests.

  •    Python

This repo contains a Requests session that returns the amazing Twisted's awaitable Deferreds instead of Response objects. It works on both Python 2 and Python 3.

https://github.com/requests/requests-threads
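
For reference, a minimal sketch of how such a session is used, based on the project's README as I recall it (the AsyncSession class and its n parameter are assumptions from memory, and the URL is illustrative):

```python
from twisted.internet.defer import inlineCallbacks
from twisted.internet.task import react
from requests_threads import AsyncSession

# n is the number of worker threads backing the session (assumed parameter).
session = AsyncSession(n=10)

@inlineCallbacks
def main(reactor):
    # Each .get() returns a Deferred immediately; the request runs on a thread.
    deferreds = [session.get("https://httpbin.org/get") for _ in range(5)]
    for d in deferreds:
        response = yield d
        print(response.status_code)

react(main)
```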

Related Projects

grab - Web Scraping Framework

  •    Python

The Grab project is not abandoned, but it is not being actively developed. At the moment I am working on another crawling framework, which I want to be simple, fast, and free of memory leaks. The new project is located here: https://github.com/lorien/crawler. At first I tried a mix of asyncio (for the network) and classic threads (for parsing HTML with lxml on multiple CPU cores), but then I decided to use classic threads for everything for the sake of simplicity. Network requests are processed with pycurl because it is fast, feature-rich, and supports SOCKS5 proxies. You can try the new framework, but be aware that it does not have many features yet. In particular, its options for configuring network requests are very limited. If you need an option, feel free to open a new issue.

learnrxjava - A set of exercises designed to teach developers Rx (Reactive Streams)

  •    Java

A sequential program runs on a flat timeline. Each task is only started after the previous one completes. In concurrent programs, multiple tasks may be running during the same time period and a new task may begin at any time. In threaded programs, introducing concurrency trades space for time. Allocating memory for more threads allows application servers to make network requests concurrently instead of sequentially. Threaded network requests can dramatically reduce server response times, but like all trade-offs this approach has its limits. Unchecked thread creation can cause a server to run out of memory or to spend too much time context switching. Thread pools can help manage these problems, but under heavy load the number of threads in an application server's pool will eventually be exhausted. When this happens, network requests will be serialized, causing response times to rise. At this point the only way to bring response times back down is to add more servers, which increases costs.
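
The space-for-time trade-off described above can be sketched in Python with a bounded thread pool (the URL and pool size are illustrative; this is not part of the learnrxjava exercises themselves):

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

urls = ["https://httpbin.org/delay/1"] * 8

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.status

# Four worker threads let requests overlap instead of running back to back,
# but the fixed pool size also caps how far this approach can scale: once all
# workers are busy, additional requests queue up and are effectively serialized.
with ThreadPoolExecutor(max_workers=4) as pool:
    print(list(pool.map(fetch, urls)))
```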

droopescan - A plugin-based scanner that aids security researchers in identifying issues with several CMSs, mainly Drupal & Silverstripe

  •    Python

The master branch corresponds to the latest release (what is on PyPI). The development branch is unstable, and all pull requests must be made against it. More notes regarding installation can be found here. Droopescan aims to be the most accurate by default, while not overloading the target server with excessive concurrent requests. Because of this, by default, a large number of requests will be made with four threads; change these settings using the --number and --threads arguments respectively.

grequests - Requests + Gevent = <3

  •    Python

GRequests allows you to use Requests with Gevent to make asynchronous HTTP Requests easily. Note: You should probably use requests-threads or requests-futures instead.
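
A minimal sketch of the pattern GRequests provides (the URLs are illustrative): unsent request objects are built lazily and then sent concurrently on gevent greenlets.

```python
import grequests

urls = [
    "https://httpbin.org/get",
    "https://httpbin.org/delay/1",
    "https://httpbin.org/status/418",
]

# Build unsent requests, then send them all at once; map() returns the
# responses in order, with None for any request that failed.
pending = (grequests.get(u) for u in urls)
for response in grequests.map(pending):
    if response is not None:
        print(response.status_code, response.url)
```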

einhorn - Einhorn: the language-independent shared socket manager

  •    Ruby

Let's say you have a server process which processes one request at a time. Your site is becoming increasingly popular, and this one process is no longer able to handle all of your inbound connections. However, you notice that your box's load number is low. So you start thinking about how to handle more requests. You could rewrite your server to use threads, but threads are a pain to program against (and maybe you're writing in Python or Ruby, where you don't have true threads anyway). You could rewrite your server to be event-driven, but that would require a ton of effort, and it wouldn't help you go beyond one core. So instead, you decide to just run multiple copies of your server process.


Volley-demo - A demonstration of Volley, the HTTP library announced by Google at I/O 2013

  •    Java

A demonstration of Volley, the HTTP library announced by Google at I/O 2013. Android has provided two HTTP clients for making HTTP requests: AndroidHttpClient (extended from Apache HttpClient) and HttpUrlConnection. Both have their own pros and cons. When an application is developed, we write HTTP connection classes that handle all the HTTP requests: creating threads to run in the background, managing a thread pool, parsing responses, caching responses, handling error codes, SSL connections, running requests in parallel, and other stuff around that. Every developer has their own way of implementing these functionalities. Some might use AsyncTask to run network operations in the background, while others might pass handlers created from the UI thread into HTTP connection classes, which then execute the network operation on a worker thread and use the handler to pass the parsed HTTP response back to the main thread.

malsub - A Python RESTful API framework for online malware analysis and threat intelligence services

  •    Python

malsub is a Python 3.6.x framework that wraps several web services of online malware and URL analysis sites through their RESTful Application Programming Interfaces (APIs). It supports submitting files or URLs for analysis, retrieving reports by hash values, domains, IPv4 addresses or URLs, downloading samples and other files, making generic searches and getting API quota values. The framework is designed in a modular way so that new services can be added with ease by following the provided template module and functions to make HTTP GET and POST requests and to pretty-print results. This approach avoids having to write individual, specialized wrappers for each and every API by leveraging what they have in common in their calls and responses. The framework is also multi-threaded and dispatches service API functions across a thread pool for each input argument, meaning that it spawns a pool of threads per file provided for submission or per hash value provided for report retrieval, for example. Most of these services require API keys that are generated after registering an account on their respective websites, which need to be specified in the apikey.yaml file according to the given structure. Note that some of the already bundled services are limited in supported operations because they were developed with free API keys. API keys associated with paid subscriptions are allowed to make additional calls not open to the public and may not be restricted by a given quota. Still, malsub can process multiple input arguments and pause between requests as a workaround for cooldown periods.

Odyssey - Scalable PostgreSQL connection pooler

  •    C

Advanced multi-threaded PostgreSQL connection pooler and request router. Odyssey can significantly scale processing performance by specifying a number of additional worker threads. Each worker thread is responsible for authentication and for proxying client-to-server and server-to-client requests. All worker threads share global server connection pools. The multi-threaded design plays an important role in SSL/TLS performance.

requests-async - async-await support for `requests`. ✨ 🍰 ✨

  •    Python

We now recommend using httpx.AsyncClient() for async/await support with a requests-compatible API. Note: Use ipython to try this from the console, since it supports await.
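
A minimal sketch of the recommended replacement, using httpx's async client (the URL is illustrative):

```python
import asyncio
import httpx

async def main():
    # AsyncClient mirrors the familiar requests-style API, but calls are awaited.
    async with httpx.AsyncClient() as client:
        r = await client.get("https://httpbin.org/get")
        print(r.status_code, r.json()["url"])

asyncio.run(main())
```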

justice - Embeddable script for displaying web page performance metrics.

  •    Javascript

The FPS meter does not support budgets at this time but does color-code the spline (or optional dot-type) chart, assuming a goal of 60 FPS. The "requests" metric also supports budgets and is unique in that it polls for changes to that metric over time, since most webpages these days load a boatload of stuff asynchronously after page load. In the future it would be possible to separate this metric into "Sync Requests" and "Async Requests", but not right now.

treq - Python requests like API built on top of Twisted's HTTP client.

  •    Python

treq is an HTTP library inspired by requests but written on top of Twisted's Agents. It provides a simple, higher level API for making HTTP requests when using Twisted.
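
A minimal sketch of the style of API treq offers (the URL is illustrative): each call returns a Twisted Deferred rather than blocking.

```python
from twisted.internet.defer import inlineCallbacks
from twisted.internet.task import react
import treq

@inlineCallbacks
def main(reactor):
    # treq.get returns a Deferred that fires with a response object.
    response = yield treq.get("https://httpbin.org/get")
    body = yield treq.text_content(response)
    print(response.code, len(body))

react(main)
```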

restcommander - Fast Parallel Async HTTP client as a Service to monitor and manage 10,000 web servers

  •    Java

Formerly known as REST Superman. Fire thousands of HTTP requests and aggregate the responses in a couple of clicks, within seconds. Please check detailed instructions, screenshots, documentation, sample code, REST APIs, and demos at www.restcommander.com and its demo video. What's new? Check the related-work review on efficient HTTP clients and the concurrency and throttling model in Akka on the eBay tech blog. REST Commander is a fast parallel async HTTP/REST/SOAP client as a service to monitor and manage tens of thousands of web servers. It sends requests to 1,000 servers with response aggregation in 10 seconds, or to 10,000 servers in 50 seconds.

requests-respectful - Minimalist Requests wrapper to work within rate limits of any amount of services simultaneously

  •    Python

If you know Python, you know Requests. Requests is love. Requests is life. Depending on your use cases, you may come across scenarios where you need to use Requests a lot. Services you consume may have rate-limiting policies in place or you may just happen to be in a good mood and feel like being a good Netizen. This is where requests-respectful can come in handy. The library auto-detects the presence of a YAML file named requests-respectful.config.yml at the root of your project and will attempt to load configuration values from it.
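
A rough sketch of how the rate-limiting wrapper is used. The class, method, and parameter names below are assumptions from memory of the project's README, not a verified API reference:

```python
from requests_respectful import RespectfulRequester  # name assumed from the README

rr = RespectfulRequester()

# Declare a rate-limited "realm": at most 100 requests per 60 seconds (assumed API).
rr.register_realm("Github", max_requests=100, timespan=60)

# Requests made through the wrapper are throttled to stay within that limit.
response = rr.get("https://api.github.com/events", realms=["Github"], wait=True)
print(response.status_code)
```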

cpulimit - CPU usage limiter for Linux

  •    C

Cpulimit is a tool which limits the CPU usage of a process (expressed as a percentage, not as CPU time). It is useful for controlling batch jobs when you don't want them to eat too many CPU cycles. The goal is to prevent a process from running for more than a specified time ratio. It does not change the nice value or other scheduling priority settings, but the real CPU usage. It is also able to adapt itself to the overall system load, dynamically and quickly. CPU usage is controlled by sending SIGSTOP and SIGCONT POSIX signals to processes. All the child processes and threads of the specified process will share the same percentage of CPU. Developed by Angelo Marletta. Please send your feedback, bug reports, feature requests, or just thanks.
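
The SIGSTOP/SIGCONT mechanism described above can be illustrated with a small Python sketch. This is not cpulimit's actual implementation (which is written in C and adapts dynamically to system load); it only shows the duty-cycling idea:

```python
import os
import signal
import time

def throttle(pid, limit=0.25, period=0.1):
    """Roughly cap a process at `limit` of one CPU by alternating
    SIGCONT (run) and SIGSTOP (pause) within each short period."""
    run_time = period * limit
    stop_time = period - run_time
    while True:
        os.kill(pid, signal.SIGCONT)   # let the process run...
        time.sleep(run_time)
        os.kill(pid, signal.SIGSTOP)   # ...then pause it for the rest of the slice
        time.sleep(stop_time)
```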

gino - GINO Is Not ORM - a Python asyncio ORM on SQLAlchemy core.

  •    Python

GINO - GINO Is Not ORM - is a lightweight asynchronous ORM built on top of SQLAlchemy core for Python asyncio. For now (early 2018) GINO supports only one dialect, asyncpg. There are a few tasks in GitHub issues marked as help wanted. Please feel free to take any of them; pull requests are greatly welcome.
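
A minimal sketch of GINO's model-definition and await-based query style, based on my recollection of its README (it assumes a local PostgreSQL database; names and the connection string are illustrative):

```python
import asyncio
from gino import Gino

db = Gino()

class User(db.Model):
    __tablename__ = "users"
    id = db.Column(db.Integer(), primary_key=True)
    nickname = db.Column(db.Unicode(), default="anonymous")

async def main():
    # Bind to a running PostgreSQL instance via the asyncpg dialect.
    await db.set_bind("postgresql://localhost/gino")
    await db.gino.create_all()
    user = await User.create(nickname="fantix")
    print(user.id, user.nickname)
    await db.pop_bind().close()

asyncio.run(main())
```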

curio - Curio - The Small Coroutine Library You Were Warned About

  •    Python

Curio is a library of building blocks for performing concurrent I/O and common system programming tasks such as launching subprocesses, working with files, and farming work out to thread and process pools. It uses Python coroutines and the explicit async/await syntax introduced in Python 3.5. Its programming model is based on cooperative multitasking and existing programming abstractions such as threads, sockets, files, subprocesses, locks, and queues. You'll find it to be small, fast, and fun. Curio has no third-party dependencies and does not use the standard asyncio module. Most users will probably find it to be a bit too low-level--it's probably best to think of it as a library for building libraries. Although you might not use it directly, many of its ideas have influenced other libraries with similar functionality.
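
A minimal sketch of Curio's async/await style, in the spirit of the countdown example from its documentation:

```python
import curio

async def countdown(n):
    # A plain coroutine: curio schedules it cooperatively.
    while n > 0:
        print("T-minus", n)
        await curio.sleep(1)
        n -= 1

if __name__ == "__main__":
    curio.run(countdown, 5)
```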

Requests - Python HTTP Requests for Humans

  •    Python

Requests allows you to send organic, grass-fed HTTP/1.1 requests, without the need for manual labor. There's no need to manually add query strings to your URLs, or to form-encode your POST data. Keep-alive and HTTP connection pooling are 100% automatic, thanks to urllib3. Requests is one of the most downloaded Python packages of all time, pulling in over 11,000,000 downloads every month.
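
For example, query-string encoding, keep-alive, and connection pooling all happen automatically (the URL and parameters below are illustrative):

```python
import requests

# params are encoded into the query string for you; pooling comes from urllib3.
r = requests.get(
    "https://api.github.com/search/repositories",
    params={"q": "requests", "per_page": 5},
)
print(r.status_code)
print(r.json()["total_count"])
```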

requests-scala - A Scala port of the popular Python Requests HTTP client: flexible, intuitive, and straightforward to use

  •    Scala

Requests-Scala is a Scala port of the popular Python Requests HTTP client. Requests-Scala aims to provide the same API and user experience as the original Requests: flexible, intuitive, and straightforward to use. Making your first HTTP request is simple: just call requests.get with the URL you want, and Requests-Scala will fetch it for you.

Requests - Requests for PHP is a humble HTTP request library

  •    PHP

Requests is an HTTP library written in PHP, for human beings. It is roughly based on the API of the excellent Requests Python library. Requests is ISC licensed (similar to the new BSD license) and has no dependencies, except for PHP 5.2+. Despite PHP's use as a language for the web, its tools for sending HTTP requests are severely lacking. cURL has an interesting API, to say the least, and you can't always rely on it being available. Sockets provide only low-level access and require you to build most of the HTTP response parsing yourself.