Displaying 1 to 20 of 38 results

streamalert - StreamAlert is a serverless, real-time data analysis framework which empowers you to ingest, analyze, and alert on data from any environment, using data sources and alerting logic you define

  •    Python

StreamAlert is a serverless, real-time data analysis framework which empowers you to ingest, analyze, and alert on data from any environment, using data sources and alerting logic you define.
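
In practice the alerting logic is written as Python rule functions evaluated against parsed log records. The sketch below only illustrates that rule-as-predicate idea; the record fields, outputs, and plumbing are hypothetical and not StreamAlert's actual decorator API:

    # Hypothetical illustration of rule-style alerting logic; StreamAlert's
    # real rules use its own decorators, schemas, and output configuration.
    def console_login_without_mfa(record):
        """Alert when a console login happens without MFA (hypothetical fields)."""
        return (
            record.get('eventName') == 'ConsoleLogin'
            and record.get('additionalEventData', {}).get('MFAUsed') == 'No'
        )

    def evaluate(records, rules, send_alert):
        # A rule is just a predicate: when it returns True, an alert is emitted.
        for record in records:
            for rule in rules:
                if rule(record):
                    send_alert(rule.__name__, record)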

kinesalite - An implementation of Amazon's Kinesis built on LevelDB

  •    Javascript

An implementation of Amazon's Kinesis, focused on correctness and performance, and built on LevelDB (well, @rvagg's awesome LevelUP to be precise). The Kinesis equivalent of dynalite.
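
Because kinesalite speaks the Kinesis HTTP API, you can point any AWS SDK at it for local testing. A minimal boto3 sketch, assuming kinesalite is running locally on its default port (4567):

    import boto3

    # Talk to a locally running kinesalite instead of the real Kinesis service.
    # The port and dummy credentials are assumptions about your local setup.
    kinesis = boto3.client(
        'kinesis',
        endpoint_url='http://localhost:4567',
        region_name='us-east-1',
        aws_access_key_id='fake',
        aws_secret_access_key='fake',
    )

    kinesis.create_stream(StreamName='test-stream', ShardCount=1)
    kinesis.put_record(StreamName='test-stream', Data=b'hello', PartitionKey='pk-1')
    status = kinesis.describe_stream(StreamName='test-stream')['StreamDescription']['StreamStatus']
    print(status)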

serverless-analytics - Track website visitors with Serverless Analytics using Kinesis, Lambda, and TypeScript

  •    CSS

Example project for a personal serverless Google Analytics clone to track website visitors. After deploying the service you will have an HTTP endpoint, built on Amazon API Gateway, that accepts requests and puts them into a Kinesis Stream. A Lambda function processes the stream and writes basic metrics about how many visitors you have per absolute URL to DynamoDB.
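
The project itself is written in TypeScript; purely as a conceptual sketch, the stream-processing step amounts to a handler like the Python one below, which decodes each Kinesis record and increments a per-URL counter in DynamoDB (the table name and payload fields are assumptions, not the project's actual schema):

    import base64
    import json
    import boto3

    dynamodb = boto3.client('dynamodb')
    TABLE = 'VisitorMetrics'  # hypothetical table name

    def handler(event, context):
        # Kinesis records arrive base64-encoded inside event['Records'].
        for record in event['Records']:
            payload = json.loads(base64.b64decode(record['kinesis']['data']))
            url = payload.get('url', 'unknown')
            # Atomically increment the visit counter for this URL.
            dynamodb.update_item(
                TableName=TABLE,
                Key={'url': {'S': url}},
                UpdateExpression='ADD visits :one',
                ExpressionAttributeValues={':one': {'N': '1'}},
            )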

kinesis - A Node.js stream implementation of Amazon's Kinesis

  •    Javascript

A Node.js stream implementation of Amazon's Kinesis. Allows the consumer to pump data directly into (and out of) a Kinesis stream.
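
Underneath, that stream interface wraps the plain Kinesis data-plane calls. A boto3 sketch of the same write/read round trip (the stream name is an assumption):

    import boto3

    kinesis = boto3.client('kinesis')

    # Push a record into the stream.
    kinesis.put_record(StreamName='my-stream', Data=b'{"event": "ping"}', PartitionKey='pk-1')

    # Pull records back out of the first shard.
    shard_id = kinesis.describe_stream(StreamName='my-stream')['StreamDescription']['Shards'][0]['ShardId']
    iterator = kinesis.get_shard_iterator(
        StreamName='my-stream', ShardId=shard_id, ShardIteratorType='TRIM_HORIZON'
    )['ShardIterator']
    for rec in kinesis.get_records(ShardIterator=iterator)['Records']:
        print(rec['Data'])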

kine - Kine makes reading from an AWS Kinesis stream easy.

  •    Javascript

Kine makes reading from an AWS Kinesis stream easy. See API.md for the complete reference.

kinesis-producer - An aggregated records producer for Amazon Kinesis

  •    Go

A KPL-like batch producer for Amazon Kinesis built on top of the official Go AWS SDK, using the same aggregation format that the KPL uses.
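
The library packs many user records into each Kinesis record using the KPL's protobuf aggregation format, which is not reproduced here. As a rough Python illustration of the simpler batching idea it builds on (plain PutRecords, without KPL aggregation):

    import json
    import boto3

    kinesis = boto3.client('kinesis')

    def flush(buffer, stream='my-stream'):
        # Send buffered records in one PutRecords call (the API accepts up to 500).
        if buffer:
            kinesis.put_records(
                StreamName=stream,
                Records=[{'Data': json.dumps(r).encode(), 'PartitionKey': str(i)}
                         for i, r in enumerate(buffer)],
            )
            buffer.clear()

    buffer = []
    for event in ({'n': i} for i in range(1200)):
        buffer.append(event)
        if len(buffer) == 500:
            flush(buffer)
    flush(buffer)  # flush the remainder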

registry - Schema Registry

  •    Java

Registry is a versioned entity framework that allows you to build various registry services such as a Schema Registry, an ML Model Registry, etc.

lambda-streams-to-firehose - AWS Lambda function to forward Stream data to Kinesis Firehose

  •    Javascript

Amazon Kinesis Firehose simplifies delivery of streaming data to Amazon S3 and Amazon Redshift as a simple, automatically scaled, zero-operations service. Where customers have existing systems built on streaming interfaces, adding Firehose can enable simple archival or facilitate long-term analysis of data in Amazon Redshift. Integration can be accomplished by using the Kinesis Agent to automatically publish file data to Amazon Kinesis Streams and/or Amazon Kinesis Firehose delivery streams.

This project includes an AWS Lambda function that enables customers who are already using Amazon Kinesis Streams for real-time processing to take advantage of Amazon Kinesis Firehose. Furthermore, if you are using Amazon DynamoDB and would like to store a history of changes made to a table, this function can push those events to Amazon Kinesis Firehose.

To use this function effectively, you should already have configured an Amazon Kinesis Stream or an Amazon DynamoDB table with Update Streams, as well as an Amazon Kinesis Firehose delivery stream with the correct name. For Amazon Kinesis Streams, please ensure that producer applications can write to the Stream and that the Amazon Kinesis Firehose delivery stream is able to deliver data to Amazon S3 or Amazon Redshift. This function makes no changes to Stream or Firehose configurations. You must also have the AWS Command Line Interface (https://aws.amazon.com/cli) installed to take advantage of the supplied Stream Tagging utility.
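
The function itself is JavaScript; conceptually, the forwarding step decodes each incoming stream record and re-publishes it to the Firehose delivery stream. A minimal Python sketch of that idea (the delivery stream name is an assumption):

    import base64
    import boto3

    firehose = boto3.client('firehose')
    DELIVERY_STREAM = 'my-delivery-stream'  # hypothetical name

    def handler(event, context):
        # Decode the incoming Kinesis records and forward them as one batch.
        records = [{'Data': base64.b64decode(r['kinesis']['data'])}
                   for r in event['Records']]
        if records:
            # PutRecordBatch accepts up to 500 records per call.
            firehose.put_record_batch(DeliveryStreamName=DELIVERY_STREAM, Records=records)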

aws-lambda-fanout - A sample AWS Lambda function that accepts messages from an Amazon Kinesis Stream and transfers the messages to another data transport

  •    Javascript

This function answers a need I have had multiple times, where I want to replicate data from an Amazon Kinesis Stream to another account or another region for processing, or to another environment such as development. This AWS Lambda function can be used to propagate incoming messages from Amazon Kinesis Streams or Amazon DynamoDB Streams to other services (Amazon SNS, Amazon SQS, Amazon Elasticsearch Service, Amazon Kinesis Streams, Amazon Kinesis Firehose, AWS IoT, AWS Lambda, Amazon ElastiCache for Memcached and Redis), regions, or accounts. The function publishes metrics to Amazon CloudWatch Metrics under the Custom/FanOut namespace.
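
As a conceptual sketch of a single fan-out target (the actual function is JavaScript and supports many destinations), forwarding each Kinesis record to an SNS topic in Python might look like this (the topic ARN is an assumption):

    import base64
    import boto3

    sns = boto3.client('sns')
    TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:fanout-topic'  # hypothetical

    def handler(event, context):
        for record in event['Records']:
            payload = base64.b64decode(record['kinesis']['data']).decode('utf-8')
            # Re-publish the raw record payload to the downstream topic.
            sns.publish(TopicArn=TOPIC_ARN, Message=payload)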

amazon-kinesis-client-nodejs - Amazon Kinesis Client Library for Node.js

  •    Javascript

This package provides an interface to the Amazon Kinesis Client Library (KCL) MultiLangDaemon for Node.js. Developers can use the KCL to build distributed applications that process streaming data reliably at scale. The KCL takes care of many of the complex tasks associated with distributed computing, such as load balancing across multiple instances, responding to instance failures, checkpointing processed records, and reacting to changes in stream volume.
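
The contract a KCL record processor implements is roughly: initialize for an assigned shard, process batches of records while periodically checkpointing, and shut down cleanly. The Python skeleton below only illustrates that contract; it is not the actual API of this Node.js package:

    def handle(record):
        """Application-specific processing (hypothetical stub)."""
        print(record)

    class RecordProcessor:
        """Illustrative shape of a KCL-style record processor (hypothetical API)."""

        def initialize(self, shard_id):
            # Called once when this worker is assigned a shard.
            self.shard_id = shard_id

        def process_records(self, records, checkpointer):
            for record in records:
                handle(record)
            # Checkpoint so a restarted worker resumes after the last processed record.
            checkpointer.checkpoint(records[-1].sequence_number)

        def shutdown(self, checkpointer, reason):
            # On TERMINATE the shard has ended (split/merge); checkpoint before exiting.
            if reason == 'TERMINATE':
                checkpointer.checkpoint()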

kinesis-readable - Simple readable stream client for AWS Kinesis

  •    Javascript

Node.js stream interface for reading records from AWS Kinesis.

nodejs-kinesis-client-library - Node.js implementation of Amazon's Kinesis Client Library.

  •    TypeScript

Based on the AWS Kinesis Client Library for Java, reimplemented in Node.js. Install with npm install kinesis-client-library --save.

lambda-kinesis-bigquery - AWS Lambda function for inserting Kinesis events into Google BigQuery

  •    Javascript

An example AWS Lambda function for inserting Kinesis events into Google BigQuery. Wait until the required role is created.
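
The project is JavaScript; as a conceptual Python sketch using the google-cloud-bigquery client, the handler decodes each Kinesis record and streams it into a BigQuery table (the table id and record shape are assumptions):

    import base64
    import json
    from google.cloud import bigquery

    client = bigquery.Client()
    TABLE_ID = 'my-project.my_dataset.kinesis_events'  # hypothetical table

    def handler(event, context):
        rows = [json.loads(base64.b64decode(r['kinesis']['data']))
                for r in event['Records']]
        # Stream the decoded rows into BigQuery; returns per-row errors, if any.
        errors = client.insert_rows_json(TABLE_ID, rows)
        if errors:
            raise RuntimeError(f'BigQuery insert errors: {errors}')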

kinesis-consumer - Golang library for consuming Kinesis stream data

  •    Go

A Go library for writing Kinesis consumer applications. It is intended to be a lightweight wrapper around the Kinesis API that reads records, saves checkpoints (with swappable backends), and gracefully recovers from service timeouts and errors. Kinesis to Firehose can be used to archive data directly to S3, Redshift, or Elasticsearch without running a consumer application.
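
The library is Go; the loop it wraps (iterate a shard, process records, persist a checkpoint so a restart resumes where it left off) looks roughly like the boto3 sketch below, with an in-memory dict standing in for the swappable checkpoint backends:

    import time
    import boto3

    kinesis = boto3.client('kinesis')
    checkpoints = {}  # stand-in for a swappable checkpoint backend (DynamoDB, Redis, ...)

    def consume(stream, shard_id):
        last_seq = checkpoints.get(shard_id)
        if last_seq:
            # Resume just after the last checkpointed record.
            iterator = kinesis.get_shard_iterator(
                StreamName=stream, ShardId=shard_id,
                ShardIteratorType='AFTER_SEQUENCE_NUMBER',
                StartingSequenceNumber=last_seq,
            )['ShardIterator']
        else:
            iterator = kinesis.get_shard_iterator(
                StreamName=stream, ShardId=shard_id, ShardIteratorType='TRIM_HORIZON'
            )['ShardIterator']

        while iterator:
            resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
            for record in resp['Records']:
                print(record['Data'])                      # application-specific work
                checkpoints[shard_id] = record['SequenceNumber']
            iterator = resp.get('NextShardIterator')
            time.sleep(1)  # stay well under the 5 reads/sec per-shard limit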

data-migrator - A declarative data-migration package

  •    Python

Data-migrator (version 0.6.3.dev2) is a simple data-migration package for Python lovers. Data-migrator is a declarative DSL for table-driven data transformations, set up as an open and extensible system. Use it to create data transformations for changing databases as a result of changing code, initial loads to data lakes (it includes a Kinesis provider), and more.

kinesis-streams-fan-out-kinesis-analytics - Amazon Kinesis Streams fan-out via Kinesis Analytics (powered by the Serverless Framework)

  •    Javascript

Amazon Kinesis Analytics can fan out your Kinesis Streams and avoid read throttling. Each Kinesis Streams shard supports a maximum total read rate of 2 MB per second (at most 5 read transactions per second) and a maximum total write rate of 1 MB per second (at most 1,000 records per second). Even if you provision enough write capacity, you are not free to connect as many consumers as you'd like, especially with AWS Lambda, because you will quickly exhaust the read capacity.
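
A quick back-of-the-envelope check makes the constraint concrete: with a handful of polling consumers on one shard, the five read transactions per second are spoken for almost immediately (the consumer counts below are illustrative):

    # Illustrative per-shard read-pressure estimate, not a measurement.
    READ_LIMIT_MB_PER_S = 2      # max read throughput per shard
    READ_TX_PER_S = 5            # max GetRecords calls per shard per second

    consumers = 3                # e.g. three direct consumers polling the shard
    polls_per_consumer = 2       # each polling roughly twice per second

    total_polls = consumers * polls_per_consumer
    print(f'{total_polls} of {READ_TX_PER_S} allowed read transactions per second')
    print(f'throughput per consumer caps at {READ_LIMIT_MB_PER_S / consumers:.2f} MB/s')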

aws-lambda-go-event - Type definitions for AWS Lambda event sources.

  •    Go

Type definitions and helpers for AWS Lambda event sources. AWS Lambda lets you run code without provisioning or managing servers. With eawsy/aws-lambda-go-shim, you can author your Lambda function code in Go. This project provides type definitions and helpers to deal with AWS Lambda event source mapping.
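
For reference, the Kinesis event source mapping payload that such type definitions model looks roughly like the abridged structure below (shown as a Python literal; the field values are placeholders):

    # Abridged shape of a Kinesis event delivered to a Lambda function via an
    # event source mapping; values here are placeholders.
    kinesis_event = {
        "Records": [
            {
                "eventSource": "aws:kinesis",
                "eventName": "aws:kinesis:record",
                "eventID": "shardId-000000000000:49545115243490985018280067714973144582180062593244200961",
                "eventSourceARN": "arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",
                "awsRegion": "us-east-1",
                "kinesis": {
                    "kinesisSchemaVersion": "1.0",
                    "partitionKey": "pk-1",
                    "sequenceNumber": "49545115243490985018280067714973144582180062593244200961",
                    "data": "aGVsbG8=",  # base64-encoded payload
                    "approximateArrivalTimestamp": 1480641523.477,
                },
            }
        ]
    }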