
amazon-sagemaker-examples - Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker

  •    Jupyter

Example Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using Amazon SageMaker. Amazon SageMaker is a fully managed service for data science and machine learning (ML) workflows. You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models.

deep-learning-containers - AWS Deep Learning Containers (DLCs) are a set of Docker images for training and serving models in TensorFlow, TensorFlow 2, PyTorch, and MXNet

  •    Python

AWS Deep Learning Containers (DLCs) are a set of Docker images for training and serving models in TensorFlow, TensorFlow 2, PyTorch, and MXNet. Deep Learning Containers provide optimized environments with TensorFlow and MXNet, NVIDIA CUDA (for GPU instances), and Intel MKL (for CPU instances) libraries, and are available in the Amazon Elastic Container Registry (Amazon ECR). The AWS DLCs are used in Amazon SageMaker as the default vehicles for SageMaker jobs such as training, inference, and batch transform. They have also been tested for machine learning workloads on Amazon EC2, Amazon ECS, and Amazon EKS.
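
DLC images are addressed by the standard ECR URI pattern, <account>.dkr.ecr.<region>.amazonaws.com/<repository>:<tag>. As a minimal sketch, a helper that assembles such a URI might look like this; the account ID, repository name, and tag below are illustrative placeholders, and the repository's documentation lists the images actually published:

```python
# Sketch: assembling an ECR image URI for a Deep Learning Container.
# The account ID, repository, and tag are placeholder assumptions; consult the
# deep-learning-containers repository for the real published image URIs.

def dlc_image_uri(account_id: str, region: str, repository: str, tag: str) -> str:
    """Format the standard ECR image URI:
    <account>.dkr.ecr.<region>.amazonaws.com/<repository>:<tag>."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repository}:{tag}"

uri = dlc_image_uri("111122223333", "us-east-1", "tensorflow-training", "2.1.0-gpu-py36")
print(uri)
```

The same URI is what you would pass to `docker pull` (after authenticating to ECR) or supply as the training image for a SageMaker job.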

sagemaker-tensorflow-training-toolkit - Toolkit for running TensorFlow training scripts on SageMaker

  •    Python

SageMaker TensorFlow Training Toolkit is an open-source library for using TensorFlow to train models on Amazon SageMaker. For inference, see SageMaker TensorFlow Inference Toolkit.
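
In SageMaker script mode, the toolkit runs your training script inside the container, passing hyperparameters as command-line arguments and exposing data channels and the model output directory through SM_* environment variables (e.g. SM_CHANNEL_TRAINING, SM_MODEL_DIR). A minimal sketch of such an entry point, with illustrative hyperparameter names:

```python
# Minimal sketch of a SageMaker script-mode training entry point.
# Hyperparameters arrive as command-line arguments; data channels and the
# model directory arrive via SM_CHANNEL_* / SM_MODEL_DIR environment variables.
import argparse
import os

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    # Hyperparameter names here are illustrative, not mandated by SageMaker.
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--learning-rate", type=float, default=0.001)
    # SageMaker sets these variables inside the training container; the
    # fallback defaults let the script also run locally.
    parser.add_argument("--model-dir",
                        default=os.environ.get("SM_MODEL_DIR", "/opt/ml/model"))
    parser.add_argument("--train",
                        default=os.environ.get("SM_CHANNEL_TRAINING",
                                               "/opt/ml/input/data/training"))
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    print(f"training for {args.epochs} epochs, saving to {args.model_dir}")
    # ... build and train the TensorFlow model here, then save to args.model_dir
```

The same contract applies whether the script runs under the toolkit in a container or on a laptop, which is what makes script-mode code portable.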

amazon-sagemaker-examples - Example notebooks that show how to apply machine learning and deep learning in Amazon SageMaker

  •    Jupyter

These examples provide a gentle introduction to machine learning concepts as they are applied in practical use cases across a variety of sectors. They also provide quick walkthroughs to get you up and running with Amazon SageMaker's custom-developed algorithms. Most of these algorithms can train on distributed hardware, scale extremely well, and are faster and cheaper than popular alternatives.




sagemaker-mxnet-containers - This support code is used for making the MXNet framework run on Amazon SageMaker

  •    Python

SageMaker MXNet Containers is an open source library for making the MXNet framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, MXNet, and dependencies for building SageMaker MXNet images.
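
A Dockerfile of this kind typically starts from a base image, installs the framework, and adds the container support library. The sketch below is illustrative only; the base image, versions, and package names are assumptions, and the repository's actual Dockerfiles pin specific versions:

```dockerfile
# Illustrative sketch, not one of the repository's real Dockerfiles.
# Base image, versions, and package names are placeholder assumptions.
FROM ubuntu:18.04

RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Install the framework plus the SageMaker container support code.
RUN pip3 install mxnet sagemaker-containers

# SageMaker invokes the resulting image with "train" or "serve" as the command.
```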

sagemaker-spark - A Spark library for Amazon SageMaker.

  •    Scala

SageMaker Spark is an open source Spark library for Amazon SageMaker. With SageMaker Spark you construct Spark ML Pipelines using Amazon SageMaker stages; these pipelines interleave native Spark ML stages and stages that interact with SageMaker training and model hosting. You can train on Amazon SageMaker from Spark DataFrames using Amazon-provided ML algorithms such as K-Means clustering or XGBoost, and make predictions on DataFrames against SageMaker endpoints hosting your trained models. If you have your own ML algorithms built into SageMaker-compatible Docker containers, you can use SageMaker Spark to train and infer on DataFrames with them as well, all at Spark scale.

sagemaker-tensorflow-containers - This support code is used for making the TensorFlow framework run on Amazon SageMaker

  •    Python

SageMaker TensorFlow Containers is an open source library for making the TensorFlow framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, TensorFlow, and dependencies for building SageMaker TensorFlow images.


sagemaker-chainer-container - Docker container for running Chainer scripts to train and host Chainer models on SageMaker

  •    Python

SageMaker Chainer Containers is an open source library for making the Chainer framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, Chainer, and dependencies for building SageMaker Chainer images.

sagemaker-mxnet-container - This support code is used for making the MXNet framework run on Amazon SageMaker

  •    Python

SageMaker MXNet Containers is an open source library for making the MXNet framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, MXNet, and dependencies for building SageMaker MXNet images.

sagemaker-pytorch-container - Docker container for running PyTorch scripts to train and host PyTorch models on SageMaker

  •    Python

SageMaker PyTorch Container is an open source library for making the PyTorch framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, PyTorch, and dependencies for building SageMaker PyTorch images.

sagemaker-tensorflow-container - This support code is used for making the TensorFlow framework run on Amazon SageMaker

  •    Python

SageMaker TensorFlow Containers is an open source library for making the TensorFlow framework run on Amazon SageMaker. This repository also contains Dockerfiles which install this library, TensorFlow, and dependencies for building SageMaker TensorFlow images.

sagemaker-sparkml-serving-container - This code is used to build & run a Docker container for performing predictions against a Spark ML Pipeline

  •    Java

SageMaker SparkML Serving Container lets you deploy an Apache Spark ML Pipeline in Amazon SageMaker for real-time prediction, batch prediction, and inference-pipeline use cases; the container can also be used to deploy a Spark ML Pipeline outside of SageMaker. Apache Spark is a unified analytics engine for large-scale data processing, and its machine learning library, MLlib, lets you build ML pipelines using most of the standard feature transformers and algorithms. Spark is well suited to batch processing, however, and is not the preferred solution for low-latency online inference. To serve low-latency online predictions, the container is powered by the open-source MLeap library.
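
A request to such an endpoint pairs a column schema with a row of input data. The sketch below assembles a JSON payload in that shape; the field names ("schema", "input", "data") follow the pattern used by sagemaker-sparkml-serving, but treat them as assumptions and check the repository's README for the authoritative request format:

```python
import json

# Illustrative sketch of building a JSON request for a Spark ML serving
# endpoint. Field names are assumptions; see the repository's README for the
# authoritative format.

def build_request(columns, row):
    """Pair a minimal column schema (name/type tuples) with one row of data."""
    payload = {
        "schema": {"input": [{"name": name, "type": dtype}
                             for name, dtype in columns]},
        "data": row,
    }
    return json.dumps(payload)

req = build_request([("age", "double"), ("job", "string")], [34.0, "engineer"])
print(req)
```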

sagemaker-pipeline - Sagemaker pipeline for AWS Summit New York

  •    Python

This is a sample solution using a SageMaker pipeline, which could be useful for any organization trying to automate its use of machine learning. With an implementation like this, inference is straightforward: an endpoint can simply be queried to receive the model's output, tests can be run automatically for QA, and ML code can be updated quickly to match changing needs. The AWS CloudFormation metadata key AWS::CloudFormation::Interface sets parameter-group metadata for the template.
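
The AWS::CloudFormation::Interface key controls how template parameters are grouped, ordered, and labeled in the CloudFormation console. A minimal sketch of that metadata block follows; the parameter names are illustrative, not taken from this project's actual template:

```yaml
# Minimal illustration of AWS::CloudFormation::Interface parameter grouping.
# Parameter names are placeholders, not from the project's real template.
Metadata:
  AWS::CloudFormation::Interface:
    ParameterGroups:
      - Label:
          default: "SageMaker Configuration"
        Parameters:
          - ModelName
          - TrainingInstanceType
    ParameterLabels:
      ModelName:
        default: "Name of the model"
Parameters:
  ModelName:
    Type: String
  TrainingInstanceType:
    Type: String
    Default: ml.m5.xlarge
```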

Deep-Learning-With-Deep-Lens

  •    Jupyter

AI, machine learning, and IoT projects are becoming ever more important for enterprises and startups alike. These advanced technologies have been the key innovation engine for businesses such as Amazon Go, Alexa, and Amazon Robotics. This one-day workshop covers scenarios where AI and IoT work together, providing a hands-on learning experience: you build an end-to-end system for face detection, recognition, and verification. The workshop is designed for developers who are curious about these new technologies; no ML background is assumed.

aws-ai-ml-workshop-kr - A collection of localized (Korean) AWS AI/ML workshop materials for hands-on labs

  •    Jupyter

A collection of localized (Korean) AWS AI/ML workshop materials for hands-on labs. This repository assumes you have your own AWS account and wish to test SageMaker. If you don't have an AWS account, please follow the instructions below.

datajob - Build and deploy a serverless data pipeline on AWS with no effort.

  •    Python

You can find the full example in examples/data_pipeline_simple: a simple data pipeline composed of two AWS Glue jobs orchestrated sequentially using AWS Step Functions.
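
datajob generates the orchestration for you, but conceptually the resulting Step Functions definition for two sequential Glue jobs looks roughly like the Amazon States Language sketch below; the job names are placeholders:

```json
{
  "Comment": "Illustrative sketch: two Glue jobs run sequentially. Job names are placeholders.",
  "StartAt": "FirstGlueJob",
  "States": {
    "FirstGlueJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "first-glue-job" },
      "Next": "SecondGlueJob"
    },
    "SecondGlueJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "second-glue-job" },
      "End": true
    }
  }
}
```

The `.sync` suffix on the service integration makes each state wait for its Glue job to finish before the next one starts, which is what gives the sequential behavior.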

donkeycar-sagemaker - Build an autonomous car using Amazon SageMaker

  •    Go

This is a self-paced workshop designed for anyone who is interested in building self-driving cars using Amazon SageMaker. AWS has published several blog posts that walk through the process of building one. In the first blog post of the autonomous vehicle series, you built your Donkey vehicle and deployed your pilot server onto an Amazon EC2 instance. In the second blog post, you learned to drive the Donkey car, and the Donkey car learned to self-drive. In the third blog post, you learned about the process of streaming telemetry from the Donkey vehicle into AWS using AWS IoT. In the fourth blog post, you learned the concept of behavioral cloning with Convolutional Neural Networks (CNNs).

aws-customer-churn-pipeline - An End to End Customer Churn Prediction solution using AWS services.

  •    Jupyter

Update the .env file in the main directory. To run with Cox proportional hazards modeling instead of binary logloss, set COXPH to "positive". To configure the GitHub connection manually in the CodeDeploy console, go to Developer Tools -> Settings -> Connections. This is a one-time approval: install as an app or choose an existing one.
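
Based on the description above, the relevant .env entry toggles the modeling approach. A minimal sketch of that fragment follows; the full set of keys the file needs is documented in the repository:

```shell
# Illustrative .env fragment -- see the repository for the full set of keys.
# Set COXPH to "positive" for Cox proportional hazards modeling;
# any other value falls back to binary logloss.
COXPH="positive"
```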





