Deep Learning with SystemML
- Training LeNet on the MNIST dataset
- Prediction using a pretrained ResNet-50
There are three different ways to implement a Deep Learning model in SystemML:
- Using the DML-bodied NN library: This library allows the user to exploit the full flexibility of the DML language to implement their neural network.
- Using the experimental Caffe2DML API: This API allows a model expressed in Caffe’s proto format to be imported into SystemML. This API does not require Caffe to be installed.
- Using the experimental Keras2DML API: This API allows a model expressed in Keras’s API to be imported into SystemML. However, this API requires Keras to be installed on your driver.
| | NN library | Caffe2DML | Keras2DML |
| --- | --- | --- | --- |
| Ability to add custom layers | Yes | No | No |
| The user needs to know | DML | Caffe’s proto API | Keras’ API |
| Can be invoked using pyspark | Yes. Please see Python MLContext API | Yes. | Yes. |
| Can be invoked using spark-shell | Yes. Please see Scala MLContext API | Limited support | No |
| Can be invoked via command-line or JMLC API | Yes | No | No |
| GPU and native BLAS support | Yes | Yes | Yes |
| Part of SystemML’s mllearn API | No | Yes | Yes |
Before we go any further, let us briefly discuss the training and prediction functions in the mllearn API (i.e. Caffe2DML and Keras2DML).
Please note that when training using the mllearn API, SystemML expects the labels to have been converted to 1-based values. This avoids unnecessary decoding overhead for large datasets if the label column has already been decoded. The scikit-learn API has no such requirement.
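As a minimal sketch of the label convention above: most dataset loaders return 0-based class labels, so they need to be shifted by one before being passed to mllearn's training function (the array below is illustrative, not from MNIST):

```python
import numpy as np

# Hypothetical 0-based digit labels, as returned by typical dataset loaders
y = np.array([0, 3, 9, 1])

# mllearn expects 1-based labels, so shift every label by one
y_one_based = y + 1

print(y_one_based.tolist())  # [1, 4, 10, 2]
```

The reverse shift (subtracting one) applies when comparing mllearn predictions against 0-based ground-truth labels.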
Training LeNet on the MNIST dataset
Download the MNIST dataset using the mlxtend package.
```python
from mlxtend.data import mnist_data
import numpy as np
from sklearn.utils import shuffle

# Download the MNIST dataset
X, y = mnist_data()
X, y = shuffle(X, y)

# Split the data into training and test
n_samples = len(X)
X_train = X[:int(.9 * n_samples)]
y_train = y[:int(.9 * n_samples)]
X_test = X[int(.9 * n_samples):]
y_test = y[int(.9 * n_samples):]
```
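The 90/10 split above can be sanity-checked without downloading anything; this numpy-only sketch stands in for the mlxtend arrays (the synthetic shapes are assumptions for illustration) and verifies the resulting partition sizes:

```python
import numpy as np

# Synthetic stand-in for the MNIST arrays: 100 samples of 784 pixels each
X = np.zeros((100, 784))
y = np.arange(100)

# Same 90/10 split as in the snippet above
n_samples = len(X)
X_train = X[:int(.9 * n_samples)]
y_train = y[:int(.9 * n_samples)]
X_test = X[int(.9 * n_samples):]
y_test = y[int(.9 * n_samples):]

print(X_train.shape, X_test.shape)  # (90, 784) (10, 784)
```

Because the slices use the same `int(.9 * n_samples)` boundary, every sample lands in exactly one of the two partitions.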