NVIDIA NGC


Visually design your deep learning applications with Deep Cognition’s Deep Learning Studio (DLS) Container by taking full advantage of high-performance NVIDIA® GPUs. You can access the Deep Learning Studio Container remotely using a web browser.

Deep Cognition’s DLS Container contains a range of drag-and-drop toolboxes that are ideal for visually designing deep learning models.

This guide helps you run Deep Learning Studio (DLS) in the cloud on an Amazon EC2® P3 instance or on your own computer. The Deep Learning Studio Container, a Docker container hosted on NGC, simplifies the process of designing and deploying AI. The container is available in the NGC Container Registry.

Requirements:

  • NGC account with a valid API key (used to log in to the NGC container registry, as sketched after this list)
  • Deep Learning Studio subscription (available free of charge)
  • Amazon® Web Services account (needed only if you run DLS on an EC2 instance)
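
Before pulling the DLS image, log in to the NGC container registry with your NGC API key. This is a minimal sketch of the standard NGC login flow: the username is the literal string $oauthtoken, and the API key is entered as the password.

$ docker login nvcr.io
Username: $oauthtoken
Password: <your NGC API key>

$ docker pull nvcr.io/partners/deep-learning-studio:latest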

Costs:

You are responsible for the cost of the Amazon Web Services used when you launch resources by following this guide. Resource settings, such as instance type, affect the cost of deployment. For cost estimates, see the pricing pages for each AWS service you use. Prices are subject to change.

Prepare Your AWS Account:

If you do not have an Amazon Web Services account, create one at https://aws.amazon.com by following the on-screen instructions. Create a key pair using the Amazon EC2 Console.
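
If you prefer the command line, a key pair can also be created with the AWS CLI; the key name my-dls-key below is only an example.

$ aws ec2 create-key-pair --key-name my-dls-key \
    --query 'KeyMaterial' --output text > my-dls-key.pem
$ chmod 400 my-dls-key.pem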

Running Deep Learning Studio on NGC

Create persistent directories where your datasets and work will be stored:

$ mkdir -p $HOME/dls/database $HOME/dls/keras $HOME/dls/data

Sample nvidia-docker run:

$ nvidia-docker run \
    -d \
    --rm \
    --name deep_learning_studio \
    -p 8880:80 -p 8881:80 -p 8888:8880 -p 8886:8888 -p 8889:3000 \
    -v $HOME/dls/data:/data \
    -v $HOME/dls/database:/home/app/database \
    -v $HOME/dls/keras:/root/.keras \
    -e DLS_EULA_AGREED=y \
    nvcr.io/partners/deep-learning-studio:latest

Where:

-d: run the container in detached mode
--rm: automatically remove the container when it exits
--name deep_learning_studio: assign the name deep_learning_studio to the container
-p 8880:80 -p 8881:80 -p 8888:8880 -p 8886:8888 -p 8889:3000: expose the ports required by the application
-v $HOME/dls/data:/data: main directory where your uploaded datasets and projects are stored
-v $HOME/dls/database:/home/app/database: directory where DLS stores settings and configuration
-v $HOME/dls/keras:/root/.keras: directory where Keras settings are stored
-e DLS_EULA_AGREED=y: setting this variable indicates that you accept the DLS EULA

Once the container is launched, open http://127.0.0.1:8880/ in your browser to access DLS.
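
To confirm that the container started correctly, you can check its status and logs with standard Docker commands, and stop it when you are done. Because the container was started with --rm, stopping it also removes it; your data under $HOME/dls persists on the host.

$ docker ps --filter name=deep_learning_studio
$ docker logs deep_learning_studio
$ docker stop deep_learning_studio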
