batch-shipyard - Execute batch and HPC Dockerized workloads on Azure Batch with shared file system provisioning and linking support


Additionally, Batch Shipyard provides the ability to provision and manage entire standalone remote file systems (storage clusters) in Azure, independent of any integrated Azure Batch functionality. Batch Shipyard is now integrated directly into Azure Cloud Shell, so you can execute any Batch Shipyard workload from your web browser or from the Microsoft Azure Android and iOS apps.

https://github.com/Azure/batch-shipyard
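
The CLI is configuration-driven. As a rough sketch in Python, it can be driven via subprocess as below; the command names follow the Batch Shipyard documentation, but the config directory layout and the storage cluster id are placeholder assumptions, so verify the exact syntax with shipyard --help.

import subprocess

# Directory holding the Batch Shipyard credentials/config/pool/jobs files
# (a placeholder assumption for this sketch).
CONFIGDIR = "config"

def shipyard(*args):
    # Run a Batch Shipyard subcommand against the config directory.
    subprocess.run(["shipyard", *args, "--configdir", CONFIGDIR], check=True)

shipyard("fs", "cluster", "add", "mystoragecluster")  # provision a storage cluster
shipyard("pool", "add")                               # provision the Batch pool
shipyard("jobs", "add")                               # submit the Dockerized jobs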

Related Projects

aztk - On-demand, Dockerized, Spark Jobs on Azure (powered by Azure Batch)


Azure Distributed Data Engineering Toolkit is a Python CLI application for provisioning on-demand Spark on Docker clusters in Azure. It's a cheap and easy way to get up and running with a Spark cluster, and a great tool for Spark users who want to experiment and start testing at scale. This toolkit is built on top of Azure Batch but does not require any Azure Batch knowledge to use.
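
As a sketch of the workflow (the command and flag names follow the aztk README, but the cluster id, size, and application script here are placeholder assumptions), the CLI can be driven from Python like so:

import subprocess

# Provision a 3-node Spark-on-Docker cluster, then submit a PySpark app to it.
# Verify flags with `aztk spark cluster --help`; values below are placeholders.
subprocess.run(["aztk", "spark", "cluster", "create",
                "--id", "mycluster", "--size", "3",
                "--vm-size", "standard_d2_v2"], check=True)
subprocess.run(["aztk", "spark", "cluster", "submit",
                "--id", "mycluster", "--name", "wordcount",
                "wordcount.py"], check=True)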

azure-batch-samples - Azure Batch and HPC Code Samples


This GitHub repository contains a set of HPC and Batch related samples that demonstrate the usage of Microsoft Azure Batch services along with some general purpose utilities. See http://azure.microsoft.com/services/batch/ for more information on the Azure Batch service. Before you can interact with the Batch service, you will need a Batch service account. For detailed information on creating a Batch account, see Create and manage an Azure Batch account in the Azure portal.
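
To give a flavor of what the Python samples do, here is a minimal sketch that authenticates with a shared key and creates a small pool. The account name, key, URL and pool settings are placeholders, and older SDK releases use base_url where newer ones use batch_url.

import azure.batch.batch_auth as batch_auth
import azure.batch.models as batchmodels
from azure.batch import BatchServiceClient

# Placeholder credentials; use your Batch account name, key and endpoint.
credentials = batch_auth.SharedKeyCredentials("mybatchaccount", "<account-key>")
client = BatchServiceClient(
    credentials, base_url="https://mybatchaccount.eastus.batch.azure.com")

# Create a two-node Ubuntu pool to run tasks on.
client.pool.add(batchmodels.PoolAddParameter(
    id="samplepool",
    vm_size="STANDARD_A1_V2",
    virtual_machine_configuration=batchmodels.VirtualMachineConfiguration(
        image_reference=batchmodels.ImageReference(
            publisher="Canonical", offer="UbuntuServer", sku="16.04-LTS"),
        node_agent_sku_id="batch.node.ubuntu 16.04"),
    target_dedicated_nodes=2))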

azure-batch-maya - Cloud rendering from Maya using Azure Batch


This project demonstrates cloud rendering using the Azure Batch service with integrated licensing for Maya, VRay and Arnold. Please note that the Azure Batch licensing service for Maya is currently in preview. For more information and to register your interest, please see rendering.azure.com.

BatchLabs - A client tool to help create, debug and monitor Azure Batch Applications


Note: BatchLabs is in preview. Batch Labs is a tool to manage your Azure Batch accounts. The goal is to implement a great user experience that will help you debug, monitor and manage your pools, jobs and tasks. It will also include experimental features such as Batch Templates, with the aim of improving your Batch experience. We are open to any feedback, ideas and contributions you might have.



azure-batch-apps-blender


This sample is based on the now-deprecated Azure Batch Apps service. The Blender sample is currently being re-written to work directly against Azure Batch. While it is undergoing development, the updated code can be accessed in the following fork: https://github.com/annatisch/azure-batch-apps-blender/tree/dev/. Please check the issues forum for guidance on using the in-development code and to report any bugs.

azurefile-dockervolumedriver - Docker Volume Driver for Azure File Service over SMB/CIFS :whale:


This is a Docker Volume Driver which uses Azure File Storage to mount file shares in the cloud to Docker containers as volumes. It uses the network file sharing (SMB/CIFS protocols) capabilities of Azure File Storage. Be aware of the limitations and of what kinds of applications are suitable for storing data on Azure File Service.
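
For illustration, the equivalent of docker volume create -d azurefile -o share=myshare can be sketched with the Docker SDK for Python; the volume and share names are placeholders, and the driver must already be installed on the host.

import docker

client = docker.from_env()

# Create a volume backed by the azurefile driver (share name is a placeholder).
client.volumes.create(name="myvol", driver="azurefile",
                      driver_opts={"share": "myshare"})

# Mount it into a container; files written to /data land on the Azure File share.
logs = client.containers.run(
    "alpine", "ls /data",
    volumes={"myvol": {"bind": "/data", "mode": "rw"}},
    remove=True)
print(logs.decode())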

blobxfer - Azure Storage transfer tool and data movement library


blobxfer is an advanced data movement tool and library for Azure Storage Blob and Files. With blobxfer you can copy your files into or out of Azure Storage with the CLI, or integrate the blobxfer data movement library into your own Python scripts. Please refer to the installation guide for more information on how to install blobxfer.
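
As a sketch of CLI usage from a Python script (the flag names are assumptions based on the blobxfer 1.x CLI, and the account, container and path values are placeholders; verify with blobxfer upload --help):

import subprocess

# Upload a local directory to a blob container via the blobxfer CLI.
subprocess.run([
    "blobxfer", "upload",
    "--storage-account", "mystorageaccount",
    "--storage-account-key", "<account-key>",
    "--remote-path", "mycontainer/data",
    "--local-path", "/home/user/data",
], check=True)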

acs-engine - Azure Container Service Engine - a place for community to collaborate and build the best open Docker container infrastructure for Azure


The Azure Container Service Engine (acs-engine) generates ARM (Azure Resource Manager) templates for Docker enabled clusters on Microsoft Azure with your choice of DC/OS, Kubernetes, Swarm Mode, or Swarm orchestrators. The input to the tool is a cluster definition. The cluster definition is very similar to (and in many cases the same as) the ARM template syntax used to deploy a Microsoft Azure Container Service cluster. Execute make ci to run the check-in validation tests.
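
To illustrate, a cluster definition can be generated from Python as below. The field layout mirrors the examples shipped in the acs-engine repo, but treat the exact keys as assumptions to check against those examples; the DNS prefix, SSH key and service principal values are placeholders.

import json

cluster_definition = {
    "apiVersion": "vlabs",
    "properties": {
        "orchestratorProfile": {"orchestratorType": "Kubernetes"},
        "masterProfile": {"count": 1, "dnsPrefix": "mycluster",
                          "vmSize": "Standard_D2_v2"},
        "agentPoolProfiles": [{"name": "agentpool1", "count": 3,
                               "vmSize": "Standard_D2_v2"}],
        "linuxProfile": {"adminUsername": "azureuser",
                         "ssh": {"publicKeys": [{"keyData": "ssh-rsa AAAA..."}]}},
        "servicePrincipalProfile": {"clientId": "<app-id>", "secret": "<secret>"},
    },
}

# Write the definition, then feed it to the tool: acs-engine generate kubernetes.json
with open("kubernetes.json", "w") as f:
    json.dump(cluster_definition, f, indent=2)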

azure-batch-apps-python


This package enables Azure Batch Apps customers to interact with the Management API using Python. This client module is designed to work with the applications set up within an existing Batch Apps service. You can upload your Application Image and Cloud Assembly via the Batch Apps Portal. For more information on setting this up, check out this article.

azure-sdk-for-python - Microsoft Azure SDK for Python


This project provides a set of Python packages that make it easy to access the Management (Virtual Machines, ...) or Runtime (ServiceBus using HTTP, Batch, Monitor) components of Microsoft Azure. A complete feature list for this repo, and pointers to Python packages not in this repo, can be found in the Azure SDK for Python features chapter on ReadTheDocs. The SDK supports Python 2.7, 3.3, 3.4, 3.5 and 3.6.
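
For example, a minimal management-plane sketch with the SDK of this era lists resource groups; the service principal and subscription values are placeholders.

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient

# Authenticate with a service principal (placeholder values).
credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<secret>", tenant="<tenant-id>")
client = ResourceManagementClient(credentials, "<subscription-id>")

# List all resource groups in the subscription.
for group in client.resource_groups.list():
    print(group.name, group.location)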

ACS-Deployment-Tutorial - A tutorial on how to deploy a Dockerised deep learning application on Azure Container Services


Deploying machine learning models can often be tricky due to their numerous dependencies, and deep learning models often even more so. One way to overcome this is to use Docker containers, but unfortunately doing so is rarely straightforward. In this tutorial, we will demonstrate how to deploy a pre-trained deep learning model using Azure Container Services, which allows us to orchestrate a number of containers using DC/OS. By using Azure Container Services, we can ensure that the deployment is performant, scalable and flexible enough to accommodate any deep learning framework. The Docker image we will be deploying can be found here. It contains a simple Flask web application with an Nginx web server. The deep learning framework we will use is the Microsoft Cognitive Toolkit (CNTK), with a pre-trained model: specifically, the ResNet 152 model.

Azure Container Services enables you to configure, construct and manage a cluster of virtual machines pre-configured to run containerized applications. Once the cluster is set up, you can use a number of open-source scheduling and orchestration tools, such as Kubernetes and DC/OS. This is ideal for machine learning applications, since Docker containers give us ultimate flexibility in the libraries we use and allow us to easily scale up based on demand, while always ensuring that our application remains performant. You can create an ACS cluster through the Azure portal, but in this tutorial we will construct it using the Azure CLI.
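
To give a feel for the web application layer, here is a minimal Flask scoring service of the kind the tutorial describes; the route name and the placeholder prediction are illustrative assumptions, not the tutorial's actual code (which serves a CNTK ResNet 152 model).

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/score", methods=["POST"])
def score():
    # The request body would carry an image to classify.
    image_bytes = request.get_data()
    # A real deployment would run the image through a pre-trained CNTK
    # ResNet 152 model; a placeholder result stands in for it here.
    return jsonify({"label": "placeholder", "confidence": 0.0,
                    "bytes_received": len(image_bytes)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)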

azure-cosmos-db-emulator-docker - Contains Dockerfiles for the Azure Cosmos DB Emulator: https://docs


This repository contains the scripts required to install and run the Azure Cosmos DB Emulator as a Docker container. You can fetch the image from Docker Hub by running docker pull microsoft/azure-documentdb-emulator.

azure-xplat-cli - Microsoft Azure Cross Platform Command Line


This project provides a cross-platform command line interface for developers and IT administrators to develop, deploy and manage Microsoft Azure applications. Note: the list of features may not be up-to-date. For accurate command details, type azure, azure -h, or azure --help to navigate through the help system. Also, use azure config mode asm|arm to switch between the service management (Version V1) and resource management (Version V2) flavors of the Azure REST API.

azure-docker-extension - Docker VM Extension for Microsoft Azure :whale:


This repository contains the source code for the Microsoft Azure Docker Virtual Machine Extension. The source code is meant to be used by Microsoft Azure employees publishing the extension, and is open sourced under the Apache 2.0 License for reference. You can read the User Guide below.

doAzureParallel - An R package that allows users to submit parallel workloads in Azure


The doAzureParallel package is a parallel backend for the widely popular foreach package. With doAzureParallel, each iteration of a foreach loop runs in parallel on an Azure Virtual Machine (VM), allowing users to scale up their R jobs to tens or hundreds of machines. doAzureParallel is built to support the foreach parallel computing package, which can execute multiple processes across some parallel backend. With just a few lines of code, doAzureParallel helps you create a cluster in Azure, register it as a parallel backend, and seamlessly connect it to the foreach package.

azure-acs-plugin - Azure Container Services plugin for Jenkins enables deploying containerized Java apps to Docker on Azure


Deploy Kubernetes, DC/OS, and Docker Swarm application configurations to an Azure Container Service cluster.

azure-webjobs-sdk - Azure WebJobs SDK


The Azure WebJobs SDK is a framework that simplifies the task of writing background processing code that runs in Azure. The SDK includes a declarative binding and trigger system that works with Azure Storage Blobs, Queues and Tables as well as Service Bus. The binding system makes it incredibly easy to write code that reads or writes Azure Storage objects, and the trigger system automatically invokes a function in your code whenever new data arrives in a queue or blob.

In addition to the built-in triggers and bindings, the WebJobs SDK is fully extensible, allowing new types of triggers and bindings to be created and plugged into the framework in a first-class way. See Azure WebJobs SDK Extensions for details. Many useful extensions have already been created and can be used in your applications today, including a File trigger/binder, a Timer/Cron trigger, a WebHook HTTP trigger, and a SendGrid email binding.

cortana-intelligence-personalized-offers - Generate real-time personalized offers on a retail website to engage more closely with customers


In today's highly competitive and connected environment, modern businesses can no longer survive with generic, static online content. Furthermore, marketing strategies using traditional tools are often expensive, hard to implement, and do not produce the desired return on investment. These systems often fail to take full advantage of the data collected to create a more personalized experience for the user. Surfacing offers that are customized for the user has become essential to building customer loyalty and remaining profitable. On a retail website, customers desire intelligent systems which provide offers and content based on their unique interests and preferences. Today's digital marketing teams can build this intelligence using the data generated from all types of user interactions. By analyzing massive amounts of data, marketers have the unique opportunity to deliver highly relevant and personalized offers to each user. However, building a reliable and scalable big data infrastructure, and developing sophisticated machine learning models that personalize to each user, is not trivial.

Cortana Intelligence provides advanced analytics tools through Microsoft Azure (data ingestion, data storage, data processing and advanced analytics components), all of the essential elements for building a personalized offers solution. This solution combines several Azure services to provide powerful advantages. Event Hubs collects real-time consumption data. Stream Analytics aggregates the streaming data and updates the data used in making personalized offers to the customer. Azure DocumentDB stores the customer, product and offer information. Azure Storage is used to manage the queues that simulate user interaction. Azure Functions serve as a coordinator for the user simulation and as the central portion of the solution for generating personalized offers. Azure Machine Learning implements and executes the product recommendations, and when no user history is available, Azure Redis Cache provides pre-computed product recommendations for the customer. Power BI visualizes the activity of the system with the data from DocumentDB.