
Zac Siegel

A Developer's Journey into Machine Learning: Installing Python, Jupyter, TensorFlow, and Keras

This article originated on zsiegel.com

Machine Learning as a developer

Machine learning is without a doubt one of the most interesting topics in computer science today. The breakneck pace of progress and the WOW factor of many machine learning demos have given the field a mystique that still has me scratching my head, wondering how any of it is even possible.

When I finally decided to jump into machine learning, I started with a list of questions. They will undoubtedly take a very long time to answer, but they at least give me an initial direction.

  • Is a strong math and statistics background necessary?
  • Is there a set of basic terminology and concepts that will serve as a good foundation?
  • Is a high end GPU really necessary to do machine learning?
  • Are there types of machine learning that do not require a GPU?
  • Given the incredible pace of change, are the books, tutorials, and examples from 1-2 years ago still relevant?

To explore these questions I set out to get up and running quickly so I could spend my time learning and experimenting. The following is what got me going with minimal fuss.

Linux, Python, CUDA, TensorFlow, and Keras

I mentioned before that machine learning moves at a breakneck pace. I spoke to a number of colleagues who had dipped their toes into the space, and one of the core challenges they faced was tooling.

Many of them struggled to keep their toolchains and dependencies in working order; even minor version updates of various frameworks would cause compatibility issues.

During my initial setup I ran into similar trouble, so I slowly and systematically worked my way back to what appears to be a stable setup, one I have since reproduced across multiple machines.

  • PopOS 19.04
  • Python 3.7
  • TensorFlow 1.13.1
  • Keras 2.2.4
  • CUDA 10.0
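
If you want to reproduce the Python side of this combination exactly, one option is to pin the versions when installing (a sketch using the PyPI package names that the setup steps later in this post install):

pip install tensorflow-gpu==1.13.1 keras==2.2.4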

Being a developer familiar with Docker, I had to resist the urge to add yet another dependency to the toolchain. Instead I looked through various Dockerfiles and found a combination that has worked across a number of projects and been rock solid on PopOS, a Linux distribution based on Ubuntu.

It all starts with some basic tools that can be installed via apt.

sudo apt install python3 python3-venv system76-cuda-10.0 system76-cudnn-10.0
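
Before going any further, it is worth a quick sanity check that everything landed correctly. The following assumes the NVIDIA display driver is already installed, since nvidia-smi ships with the driver:

# Confirm the driver can see the GPU
nvidia-smi

# Confirm the CUDA and cuDNN packages came from the System76 repository
apt policy system76-cuda-10.0 system76-cudnn-10.0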

It is worth noting that I chose PopOS for its developer repositories, which made installing CUDA very simple.

If you are running a Debian-based operating system and have the NVIDIA display drivers installed, you can get the System76 developer packages by doing the following.

echo "deb http://apt.pop-os.org/proprietary bionic main" | sudo tee -a /etc/apt/sources.list.d/pop-proprietary.list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-key 204DD8AEC33A7AFF
sudo apt update

Setting up Python and Jupyter Notebooks

With the basic frameworks installed it was time to set up my development environment. As someone who develops primarily in strongly typed, compiled languages, I was not sure how I would fare living primarily in a Python REPL, but I have found Jupyter to be incredibly flexible and enjoyable.

Jupyter notebooks are a wonderful way to explore and run examples, both locally on your machine and in the cloud. During my initial experiments I spent a ton of time browsing Jupyter notebooks on GitHub and playing around in Google Colab.

You can think of Jupyter notebooks as a combination of prose, documentation, runnable code, and console output.

At the moment I have decided to store all of my work in a single directory on my computer, but I could see creating dedicated environments as needed in the future.

I chose a single directory for everything because I can leave my Jupyter server running and access it remotely from my iPad Pro using an app called Juno. This is something I will write about in the future!
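
If you want to try the same remote setup once Jupyter is installed (see below), a minimal sketch is to bind the server to all interfaces so other devices on your network can reach it. Jupyter's token-protected URL still guards access, but be mindful of the security implications on shared networks:

# Make the server reachable from other devices on the local network
jupyter notebook --no-browser --ip=0.0.0.0 --port=8888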

Now on to setting up Python and Jupyter.

# Create the directory where we will be storing everything
mkdir ~/src/jupyter
cd ~/src/jupyter

# Create a fully isolated python environment - this is similar to gradlew or other wrapper scripts you may have encountered in other languages
python3 -m venv jupyter-env

# This activates the virtual environment - when activated you will see your prompt change to indicate you are in `jupyter-env`. Any commands run after this will be scoped strictly to this environment
source jupyter-env/bin/activate

# Install the dependencies. Again, it's important to understand these dependencies are not globally available on your system. They are only available to the virtual environment we just activated
pip install pandas numpy tensorflow-gpu keras jupyter matplotlib pillow scikit-learn

# Store all of the dependencies into a text file. Make sure you run this command if you add any new dependencies using pip. You can use this file to re-install the dependencies on a new machine using `pip install -r requirements.txt`
pip freeze > requirements.txt
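
# (Optional) Quick sanity check that the whole chain works before launching.
# This is a sketch: tf.test.is_gpu_available() is the TensorFlow 1.x API and
# should print True if TensorFlow can reach the GPU through CUDA/cuDNN
python -c "import tensorflow as tf; print(tf.__version__); print(tf.test.is_gpu_available())"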

# Launch the notebook - this will start a jupyter server and open your web browser
jupyter notebook

With this setup I have been able to work on a number of learning exercises and the toolchain has not gotten in my way... at least not yet :).
