Estimated time: 15 minutes
In this part of our introduction you will learn how to:
- upload your data to cloud storage;
- use image channels;
- use Jupyter notebooks running in the cloud from within the Neptune UI;
- compare your models in the Experiment List.
Let’s move on to a more realistic example: classifying letters from the notMNIST dataset with an artificial neural network. We will use Keras, a popular deep learning library. Since we’ll work entirely in the cloud, there is no need to install anything.
Begin by entering the example’s directory:
We now need to deliver our dataset to Neptune.

Store your datasets outside of the experiment’s directory. Every time you run an experiment, Neptune uploads your code to the cloud, and you usually don’t want the dataset to be part of that upload.

The dataset is included in the `data` directory. Upload it to your Neptune cloud storage using the following command (it assumes you’re in the experiment’s directory):

```
neptune data upload ../data/notMNIST_small.mat
```
Training the Model
Our implementation in the `main.py` file uses the Keras library and requires a GPU to run. With Neptune you can easily specify the type of machine to run the experiment on and the compute environment (Docker image) for your code:

```
neptune send \
    --environment keras-2.2-gpu \
    --worker s-k80 \
    --input notMNIST_small.mat
```
Now let’s focus on the training process. You will see a typical learning curve: accuracy and log-loss, both for training and validation.
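Those curves are numeric channels, fed one point at a time during training. As a rough sketch of the idea — the `MockContext` class and channel names below are illustrative stand-ins for the real Neptune context object used in the experiment code:

```python
# Illustrative stand-in for the Neptune context; in a real experiment the
# context object is provided by Neptune.
class MockContext:
    def __init__(self):
        self.channels = {}

    def channel_send(self, name, value):
        # Each send appends one point to the named channel.
        self.channels.setdefault(name, []).append(value)

ctx = MockContext()

# During training you would send metrics after every epoch, e.g.:
history = [(0.92, 0.61), (0.48, 0.81), (0.31, 0.89)]  # (log-loss, accuracy)
for epoch, (loss, acc) in enumerate(history):
    ctx.channel_send('Log-loss training', loss)
    ctx.channel_send('Accuracy training', acc)
```

Each named channel then shows up in the UI as a curve over the sent values.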
Sometimes a more fine-grained inspection is helpful to learn what is and what isn’t working. Neptune allows you to look at misclassified letters - just open the Channels tab in the left panel.
What you’re seeing is an image channel. In the previous examples, you sent numerical values to Neptune. You can, just as easily, send images to the Neptune UI from your Python code. Look how it is done in the code:

```python
ctx.channel_send('false_predictions', neptune.Image(
    name='my image',
    description='this image depicts a cat',
    data=pil_image  # an image in PIL format
))
```
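The `pil_image` above has to be an actual PIL image. One way to get there from a raw array, sketched under the assumption that samples are 28x28 grayscale arrays (the random array here is just a stand-in for a misclassified letter):

```python
import numpy as np
from PIL import Image

# Stand-in for a misclassified letter; real code would take it from the
# validation batch instead of generating random pixels.
sample = np.random.rand(28, 28)

# Scale to 0-255 and convert to a grayscale PIL image.
pil_image = Image.fromarray((sample * 255).astype(np.uint8))
```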
Running Experiments in a Notebook
In addition to experiments started from your console, you can experiment using interactive Jupyter notebooks hosted in the cloud.

Click Start Notebook in the upper left corner of the Neptune UI. You’ll be asked to select a worker type (use `s-k80`) and an environment (use `keras-2.2-gpu`, the same as before). You also need to add `notMNIST_small.mat` as an input - this will make it accessible from within the notebook.
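Once the input is attached, you can read the `.mat` file from notebook code. A self-contained sketch of the loading step, assuming `scipy` is available in the environment — the tiny demo file and its field names below are illustrative, so check the real file’s keys with `mat.keys()`:

```python
import numpy as np
import scipy.io

# Write a tiny stand-in .mat file so this snippet is self-contained;
# in the notebook you would load 'notMNIST_small.mat' instead.
scipy.io.savemat('demo.mat', {'images': np.zeros((28, 28, 5)),
                              'labels': np.arange(5)})

# loadmat returns a dict mapping variable names to numpy arrays.
mat = scipy.io.loadmat('demo.mat')
images = mat['images']
```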
Finally, you can upload a notebook file by clicking Browse and selecting the file. Click Start, and in a short while your notebook will be up and running.
Fine-Tuning the Algorithm
Deep learning models benefit a lot from architecture and parameter tweaking, so head to the code and uncomment a few of the commented lines. This will add much-needed complexity to our convolutional network:
Now, let’s run the experiment by pressing CTRL+Enter or clicking Cell -> Run Cells while the cell is in focus.
Did you lower the Log-loss validation? You can check in the Channels tab. And yes, channels do work with notebooks!
Compare Your Experiments
To compare multiple experiments, Neptune comes with the Experiment List view. You can go there by clicking the up arrow in the upper left corner of the Neptune UI.
It is another place where you can use channels. Just select the Log-loss validation channel from the Manage columns / Numeric Channels menu on the right.
Now you can compare your experiments based on the last value in the channel. Did you make progress?
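The comparison is essentially a ranking by the last value sent to the channel. A toy sketch of that logic — the experiment names and values here are made up:

```python
# Rank runs by the last value in a numeric channel, the way the
# Experiment List compares them.
runs = {
    'SAN-11': [0.92, 0.61, 0.48],  # Log-loss validation per epoch
    'SAN-12': [0.88, 0.55, 0.41],
}

# Lowest final log-loss wins.
best = min(runs, key=lambda name: runs[name][-1])
```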
If you’re done with improving your model, you can move on to the next example: Neural Style Transfer.