Core ML and Machine Learning in iOS

Core ML enables apps to use machine learning models with low power consumption, efficient processing, and a small memory footprint. Core ML supports a variety of model types, including neural networks, tree ensembles, support vector machines, generalized linear models, feature engineering, and pipeline models. However, every model must first be converted to the .mlmodel file format.
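For a rough sense of what this looks like from the app side, here is a minimal Swift sketch of loading a bundled, compiled model through the MLModel API. The file name MyClassifier is a placeholder used only for illustration; Xcode compiles a .mlmodel added to a project into a .mlmodelc resource at build time.

import Foundation
import CoreML

// Minimal sketch: load a compiled Core ML model bundled with the app.
// "MyClassifier" is a placeholder name for illustration only.
if let url = Bundle.main.url(forResource: "MyClassifier", withExtension: "mlmodelc"),
   let model = try? MLModel(contentsOf: url) {
    // Inspect the model's declared inputs and outputs.
    print(model.modelDescription.inputDescriptionsByName)
    print(model.modelDescription.outputDescriptionsByName)
}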

Conversion workflow

A Core ML model is produced by converting a trained machine learning model into a .mlmodel file.


Apple currently supports the following formats for conversion:

  • Caffe
  • Keras
  • XGBoost
  • Scikit-learn
  • libsvm


To convert already-trained models to the .mlmodel format, Apple has introduced coremltools, an open-source Python package.

Note: Use coremltools with Python 2.7; other Python versions on macOS can run into configuration errors.

Steps

Run these commands to create and activate a Python 2.7 virtual environment:

pip install virtualenv
virtualenv --python=/usr/bin/python2.7 python27
source python27/bin/activate

Check the current version of Python installed on your system:

python --version

Install coremltools in the environment:

pip install -U coremltools

The -U flag upgrades coremltools if it is already installed. That completes the installation.

We are now ready to convert an already-trained model into a .mlmodel file.
Files required: bvlc_alexnet.caffemodel, deploy.prototxt, class_labels.txt

In a terminal, change to the directory that contains these files, start a Python interpreter, and run:

cd <directory-containing-the-model-files>
python

import coremltools
# Convert the Caffe model to a classifier in Core ML ('data' is the image input
# layer from deploy.prototxt; class_labels.txt supplies the class labels)
coreml_model = coremltools.converters.caffe.convert(('bvlc_alexnet.caffemodel', 'deploy.prototxt'),
                                                    image_input_names='data', class_labels='class_labels.txt')

Here,

deploy.prototxt – describes the structure of the neural network.
bvlc_alexnet.caffemodel – the already-trained Caffe model (its weights).
class_labels.txt – a list of the class labels the model can predict.

To start the conversion, press ENTER; the interpreter will begin converting the model.

The Python prompt (>>>) returns when the conversion is complete. Save the model with:

# Now save the model
coreml_model.save('BVLCObjectClassifier.mlmodel')

You’ll see a BVLCObjectClassifier.mlmodel file in the same folder. That’s it. Integrate the newly created model into an Xcode project and start identifying objects.

Integrating Core ML Model into Xcode Project

Drag the Core ML model into the Xcode project navigator. Xcode will recognize it as a Core ML model and display its description, its input and output parameters, and other relevant information.


It’ll show a message like ‘Model is not part of any target…’. Just add the model to a target by ticking the target under Target Membership.


Xcode will now generate a class named after your model, with its inputs and outputs exposed as automatically generated properties.

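Assuming the model was saved as BVLCObjectClassifier.mlmodel, the generated class can be called directly. This is only a sketch: the input name data and the output property classLabel are assumptions based on the conversion above, so check the interface Xcode generates for your model.

import CoreML
import CoreVideo

// Minimal sketch of calling the generated model class directly.
// The input name ("data") and output ("classLabel") are assumptions; the real
// names come from the converted model and the generated Swift interface.
func predict(pixelBuffer: CVPixelBuffer) {
    let model = BVLCObjectClassifier()
    do {
        let output = try model.prediction(data: pixelBuffer)
        // The generated output also exposes a label-to-probability dictionary;
        // its property name depends on the converted model.
        print("Predicted label: \(output.classLabel)")
    } catch {
        print("Prediction failed: \(error)")
    }
}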

Now, in the ViewController, add code to pick an image and then run a prediction on it.

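The original post shows these steps as screenshots. Below is a minimal sketch of what such a ViewController might look like: UIImagePickerController lets the user pick a photo, and the Vision framework runs the Core ML model on it. It assumes the generated class is named BVLCObjectClassifier and that the model was converted with an image input; adjust the names to match your project.

import UIKit
import CoreML
import Vision

class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var resultLabel: UILabel!

    // Present the photo library so the user can pick an image.
    @IBAction func pickImageTapped(_ sender: UIButton) {
        let picker = UIImagePickerController()
        picker.delegate = self
        picker.sourceType = .photoLibrary
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        guard let image = info[.originalImage] as? UIImage else { return }
        imageView.image = image
        classify(image)
    }

    // Run the Core ML model through Vision and show the top prediction.
    private func classify(_ image: UIImage) {
        guard let cgImage = image.cgImage,
              let visionModel = try? VNCoreMLModel(for: BVLCObjectClassifier().model) else {
            resultLabel.text = "Could not load model"
            return
        }

        let request = VNCoreMLRequest(model: visionModel) { [weak self] request, _ in
            guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
            DispatchQueue.main.async {
                self?.resultLabel.text = String(format: "%@ (%.1f%%)", best.identifier, best.confidence * 100)
            }
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }
}

Vision is used here instead of calling the generated prediction method directly because it handles resizing and converting the picked UIImage into the pixel format the model expects.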

Now run the app and pick an image from Photos. The model will classify the picked image and display the predicted label along with its probability.

Limitations of Core ML:

– At present, Apple only supports already-trained models.

– Models cannot be trained on the device, inside the app.

– Conversion of models is limited to a few formats.

– Developers cannot choose whether a model runs on the CPU or the GPU.

– Bundling a model increases the size of the app.

– Results may differ across platforms, as the .mlmodel format is not supported everywhere.