August 14, 2019 · 4 mins read

Lesson 14 - An end at the beginning

For the final lesson of the course, I redid the first project. In Swift. (Read the first post so you know what we’re building.) After using Swift for TensorFlow for the first time yesterday (post), I want to use Swift for everything. So I figured it might be fun to redo a project I am familiar with and see what the similarities and differences are.

This is the last post about my journey. Read all the posts here.

If you have never used Swift for TensorFlow (s4tf), check out yesterday’s post for a quick introduction.

import Foundation
import Python

let fastai = Python.import("fastai")
let vision = Python.import("fastai.vision")
let pathlib = Python.import("pathlib")
let np = Python.import("numpy")
let plt = Python.import("matplotlib.pyplot")

Loading the data: SwiftKaggle

Because there is no Swifty way to interact with the Kaggle API yet, I am going to write my own. Check out the GitHub project.

Because the GitHub project is not yet a full Swift package that can be used in Jupyter notebooks with a Swift kernel, copy the contents of Kaggle.swift into your notebook for now.

Authenticating and downloading a dataset from Kaggle is easy using the SwiftKaggle API:

let kaggle = Kaggle(username: "kajs21rjk", key: "110c53139229dd1fc1394d5052c44757")
kaggle.download("fruits", byUser: "moltean")
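Hard-coding credentials in a notebook is risky. A possible alternative (my own sketch, not part of SwiftKaggle) is to read the same ~/.kaggle/kaggle.json file Kaggle’s official CLI uses; the struct below mirrors its two fields, and the inline JSON string stands in for the file contents so the example is self-contained:

```swift
import Foundation

// Sketch: the two fields Kaggle's CLI stores in ~/.kaggle/kaggle.json.
struct KaggleCredentials: Decodable {
    let username: String
    let key: String
}

// Stand-in for the file contents, so this example runs on its own.
let json = #"{"username": "example-user", "key": "example-key"}"#
let creds = try! JSONDecoder().decode(
    KaggleCredentials.self, from: json.data(using: .utf8)!)
print(creds.username)  // prints "example-user"
```

In a real notebook you would read the file with `Data(contentsOf:)` and pass the decoded values into the Kaggle initializer.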

Because SwiftKaggle is far from finished, you need to unzip the files manually for now. Note that the isDir value is passed as a pointer because FileManager is an Objective-C class.

var isDir: ObjCBool = true
if !(FileManager.default.fileExists(atPath: "fruits-360", isDirectory: &isDir)) {
  let os = Python.import("os")
  os.system("unzip fruits.zip > /dev/null")
}
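To see why the pointer is needed, here is a minimal, self-contained example (independent of the Kaggle code): FileManager writes the directory flag back through the ObjCBool you pass by reference, and you read the result with .boolValue:

```swift
import Foundation

// ObjCBool bridges Objective-C's BOOL; the method writes through the
// pointer, and .boolValue converts the result back to a Swift Bool.
var isDir: ObjCBool = false
let exists = FileManager.default.fileExists(
    atPath: NSTemporaryDirectory(), isDirectory: &isDir)
print(exists && isDir.boolValue)  // prints "true": the temp dir is a directory
```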

Remember from the first post, this is what the data looks like:


Using FastAi in Swift

As you saw in the introduction, the module should be imported separately.

Everything else is quite similar to the Python version of this project I made in lesson 1.

Define the path of the data:

let path = pathlib.Path("fruits-360")

A better alternative to Python’s pathlib library is available in Swift (Path.swift). However, that API can’t be used with fastai through the Python bridge yet.

Creating a databunch is quite similar to Python as well:

let data = vision.ImageDataBunch.from_folder(path, train: ".", valid_pct: 0.2,
        ds_tfms: vision.get_transforms(), size: 224, num_workers: 4).normalize(vision.imagenet_stats)

Showing the data as I did in the first post does not work in Swift at this point: Swift can’t output HTML to Jupyter notebooks yet. I hope it will be able to in the future. Here’s the code for people from the future:

data.show_batch(rows: 3, figsize: PythonObject(tupleOf: 7, 8))
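A workaround I have not verified: since show_batch draws through matplotlib, saving the current figure to disk may sidestep the missing HTML output entirely. This continues from the notebook cells above (data and plt are the objects defined earlier):

```swift
// Untested sketch: render the batch with matplotlib and write it to a
// file instead of relying on inline notebook output.
data.show_batch(rows: 3, figsize: PythonObject(tupleOf: 7, 8))
plt.savefig("batch.png")  // open batch.png outside the notebook
```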

Training the model is, surprisingly, virtually identical to Python:

let learn = vision.cnn_learner(data, vision.models.resnet34, metrics: vision.error_rate)

Every library call has to be prefixed with vision., which is a neat way of keeping code clear, though it can be a little annoying to type.
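In the lesson 1 Python version, the next step was fit_one_cycle. Assuming the Python bridge forwards the method call unchanged (I have only inferred this from the patterns above), training should look like this:

```swift
// Continuing from the cells above: `learn` is the cnn_learner created
// earlier. Train for four epochs, exactly as in the Python lesson.
learn.fit_one_cycle(4)
```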

Unfortunately, because Swift doesn’t support HTML objects in Jupyter notebooks yet, the output of the training code is hard to read:

long output


This project concludes my journey. It has been incredible and I am super excited to learn more. I will probably write a full review of the course, so stay tuned for that. Follow me on Twitter to get notified when I release the post.

I can’t wait to start working on the full version of SwiftKaggle in two weeks. Make sure to follow the project on GitHub to stay up to date with the development.

A huge thanks to Sam Miserendino for proofreading this post!