
smart_open - Utils for streaming large files (S3, HDFS, gzip, bz2...)

  •    Python

There are a few optional keyword arguments that are useful only for S3 access; these are passed through to boto.connect_s3() as keyword arguments. The S3 reader supports gzipped content, as long as the key is recognizably a gzipped file (e.g. its name ends with ".gz").
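
A minimal usage sketch (the bucket and key below are placeholders; older smart_open releases expose a smart_open() function, which this assumes):

    # Stream a gzipped S3 object line by line; smart_open handles the
    # S3 connection and gzip decompression behind a file-like interface.
    from smart_open import smart_open

    # Decompression is transparent because the key ends with ".gz".
    for line in smart_open('s3://mybucket/logs/events.txt.gz'):
        print(line)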

node-webhdfs - A WebHDFS module for Node.js.

  •    Javascript

I am currently following and testing against the Apache WebHDFS REST API documentation for the 1.2.1 release. Make sure you enable WebHDFS in the hdfs-site configuration file (see the snippet below). I use Mocha and should.js for unit testing; they are required if you want to run the unit tests. To execute the tests, simply run npm test, but install the requirements first. You will also likely need to adjust the constants in the test file first (or have a username "ryan" set up for hosts "endpoint1" and "endpoint2").
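
For reference, enabling WebHDFS is normally a single property in hdfs-site.xml (this is the standard Hadoop property name; the file's location varies by distribution):

    <!-- hdfs-site.xml: enable the WebHDFS REST API on the cluster -->
    <property>
      <name>dfs.webhdfs.enabled</name>
      <value>true</value>
    </property>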

webhdfs - Node.js WebHDFS REST API client

  •    Javascript

Hadoop WebHDFS REST API (2.2.0) client library for Node.js with an fs module-like (asynchronous) interface.

pyhdfs - Python HDFS client

  •    Python

Because the world needs yet another way to talk to HDFS from Python. This library provides a Python client for WebHDFS. NameNode HA is supported by passing in both NameNodes. Responses are returned as convenient Python classes, and any failed operation raises a subclass of HdfsException matching the corresponding Java exception.
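
A minimal sketch of HA usage and exception handling (the hostnames, port, and user name below are placeholders):

    import pyhdfs

    # List both NameNodes so the client can fail over between them.
    client = pyhdfs.HdfsClient(
        hosts=['namenode1.example.com:50070', 'namenode2.example.com:50070'],
        user_name='hdfs')

    print(client.listdir('/'))  # directory entries as plain Python strings

    try:
        client.open('/no/such/file')
    except pyhdfs.HdfsFileNotFoundException as e:
        # Failed operations raise the HdfsException subclass that matches
        # the Java exception reported by the server.
        print(e)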

BigInsights-on-Apache-Hadoop - Example projects for 'BigInsights for Apache Hadoop' on IBM Bluemix

  •    

This repository contains example projects for BigInsights. If you follow the steps below on your client machine, it should take less than 5 minutes to run any of the example projects against a BigInsights cluster. The projects are tested against BigInsights on IBM Bluemix, but they should also work with BigInsights on-premises. Note that all of the examples are community supported.