If you’ve ever dreamed of building your own low-power animated pixel pet, like a Tamagotchi (I named mine Catmagotchi) but modern, this post is for you. In this tutorial, I’ll show you how I created a looping animation of a cat on a Waveshare 2.13” Touch E-Ink display using a Raspberry Pi Zero 2 WH.
I recently started learning Python, and I have to say, it’s been a real pleasure to work with. After years of using other languages that felt overly verbose or heavy, Python’s simplicity has made creative coding feel natural and fun again.
As a learning project, I decided to build a minimalist virtual pet: a small animated cat that lives on an e-Paper display connected to a Raspberry Pi Zero 2 WH. The goal was to create something quiet, low-power, and always on. And thanks to Python’s ecosystem, it all came together much more smoothly than I expected.
This is just the beginning: I’m planning to add basic interaction next. The Waveshare e-Paper display includes a touch layer (documentation here), so I’ll soon be wiring up “buttons” that let you wake the cat, play with it, feed it, or just watch it nap. This post walks through the current implementation and how you can build your own animated pixel cat too.
After some modifications, the Python script automatically launches on the Raspberry Pi Zero at startup, and I changed the color of the cat to only take the outer lines of the PNG images (the cat becomes white, with details) and only use the white background on the display. #Catmagotchi 😻
— Maxence Rose 🦋 (@pirmax.fr), May 21, 2025
This cat isn’t just idle: it walks, sleeps, wakes up, and even reacts with random behaviors throughout the day. Let’s dive into the hardware and code behind it.
Why a Pi Zero and E-Ink?
The Raspberry Pi Zero 2 WH is tiny, affordable, and comes with built-in Wi-Fi, perfect for always-on minimalist projects. The Waveshare 2.13” Touch e-Paper display adds a crisp black-and-white screen with almost no power draw when idle. Plus, it supports partial refresh and even basic touch input.
What You’ll Need
Before jumping into code, let’s make sure you’ve got the right hardware and assets lined up. This project is designed to be compact, quiet, and low-power, so no HDMI screens or USB mice required. All you need is a Raspberry Pi Zero, a tiny e-ink display, and a few animated frames of your pixel pet.
Below is the full list of hardware, software, and file structure required to bring your cat to life on the screen.
Materials used:
- Raspberry Pi Zero 2 WH (with Raspbian installed)
- Waveshare 2.13” Touch E-Paper (250×122 pixels)
- MicroSD card
- USB power source
- Optional: a case and a pre-soldered GPIO header
- A collection of frame-by-frame animations of a pixel cat (I used .png frames exported from GIFs)
Animations used:
- idle (5 frames)
- idle_to_sleep (8 frames)
- sleep (3 frames)
- sleep_to_idle (8 frames)
- walking_positive (8 frames)
- walking_negative (8 frames)
Each animation frame is named like: frame_0.png, frame_1.png, etc.
Directory structure:
animations/
  idle/
  walking_positive/
  sleep/
  ...
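If your source art is animated GIFs, Pillow can split them into the frame_N.png files this layout expects. Here’s a small helper sketch (the idle.gif name and output folder are placeholders; adjust them to your own assets):
# split_gif.py - export every frame of a GIF as frame_0.png, frame_1.png, ...
import os
from PIL import Image, ImageSequence

def export_frames(gif_path, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    with Image.open(gif_path) as gif:
        for i, frame in enumerate(ImageSequence.Iterator(gif)):
            frame.convert("RGB").save(os.path.join(out_dir, f"frame_{i}.png"))

export_frames("idle.gif", "animations/idle")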
Step 1: Set Up Your Pi and Display
First, clone the Waveshare driver repo:
git clone https://github.com/waveshare/e-Paper
cd e-Paper/RaspberryPi_JetsonNano/python
sudo python3 setup.py install
Then, enable SPI and I2C:
sudo raspi-config
# Interface Options → Enable SPI + I2C
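After a reboot, you can quickly confirm SPI is active by checking that its device nodes exist (a tiny sanity-check sketch):
# SPI device nodes appear under /dev once the interface is enabled
import glob
print(glob.glob("/dev/spidev*") or "SPI not enabled yet - re-run raspi-config and reboot")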
Step 2: Install Dependencies
pip3 install pillow
You’ll also need evdev later if you want to use touch support:
pip3 install evdev
Step 3: Displaying an Animated Cat
Keep in mind that e-Paper displays aren’t like traditional LCDs. They have a relatively slow refresh rate, typically taking ~500 ms to 1 s for a full refresh and slightly less for partial refreshes. This makes them ideal for static or slow-changing content, but unsuitable for fast-paced animations.
In this project, we use a FRAME_DELAY of 0.5 s, which aligns well with the display’s capabilities. Shorter delays wouldn’t be perceptible (the panel simply can’t refresh that fast) and could even cause screen artifacts or ghosting, depending on the model. The Waveshare 2.13” Touch e-Paper used here supports partial refresh, which helps reduce flicker and power consumption when only small areas of the screen change.
In short: your cat won’t purr at 60 frames per second, but it will still feel perfectly natural for a minimalist virtual pet.
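To make the timing concrete, here’s the arithmetic the script applies later: at 0.5 s per frame, an 8-frame walk cycle takes 4 s, so a 5-second duration budget fits exactly one loop:
FRAME_DELAY = 0.5               # seconds per frame
frames, duration = 8, 5         # e.g. the walking animation
cycle = FRAME_DELAY * frames    # 4.0 s for one full pass
loops = int(duration / cycle)   # -> 1 loop fits in the 5 s budget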
Here’s a simplified version of the script that handles:
- Drawing centered images on the screen
- Looping animations
- Random transitions
import sys
import time
import argparse
import random
from PIL import Image
# tkinter/ImageTk are only needed for the desktop preview; import them
# defensively so the script still starts on a headless Pi without a GUI stack.
try:
    from PIL import ImageTk
    import tkinter as tk
except ImportError:
    ImageTk = tk = None
Let's move on to configuring our app with variables:
# Screen resolution for the 2.13" Waveshare e-paper display
SCREEN_WIDTH = 250
SCREEN_HEIGHT = 122
FRAME_DELAY = 0.5 # Seconds between frames
LINE_HEIGHT = 1
Animation configuration: number of frames and total duration (in seconds):
ANIMATIONS = {
    "idle": {"frames": 5, "duration": 10},
    "idle_to_sleep": {"frames": 8, "duration": None},
    "sleep": {"frames": 3, "duration": 15},
    "sleep_to_idle": {"frames": 8, "duration": None},
    "walking_positive": {"frames": 8, "duration": 5},
    "walking_negative": {"frames": 8, "duration": 5},
}
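To catch missing or misnamed frames early, you can check the folders on disk against this table; an optional sketch, not part of the main loop:
# Optional: verify the frames on disk match the ANIMATIONS configuration
import os
for name, cfg in ANIMATIONS.items():
    found = len([f for f in os.listdir(f"animations/{name}") if f.endswith(".png")])
    assert found == cfg["frames"], f"{name}: expected {cfg['frames']} frames, found {found}"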
Here’s the function that loads and prepares each frame from the PNG files:
# --- LOAD AND PREPARE FRAME ---
def load_frame(animation_name, frame_index):
    path = f"animations/{animation_name}/frame_{frame_index}.png"
    # Load image as RGB, ignore transparency, convert to grayscale
    img = Image.open(path).convert('RGB')
    grayscale = img.convert('L')

    # Custom thresholding to extract a black & white version of the cat
    def threshold(x):
        # Background (almost black) becomes white (invisible)
        # Bright details like eyes stay white
        # Midtones (cat body) become black
        if x < 20:
            return 255  # Treat as white (background)
        elif x > 190:
            return 255  # Eyes and light highlights → white
        else:
            return 0    # Cat body and darker parts → black

    bw = grayscale.point(threshold, '1')  # Convert to 1-bit black & white image

    # Center the image on a blank white canvas matching the screen size
    centered = Image.new('1', (SCREEN_WIDTH, SCREEN_HEIGHT), 255)
    pos_x = (SCREEN_WIDTH - bw.width) // 2
    pos_y = (SCREEN_HEIGHT - bw.height) // 2
    centered.paste(bw, (pos_x, pos_y))
    return centered
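The threshold values (20 and 190) suit my artwork; if your sprites use different shades, you may need to tweak them. A quick way to check is to save one processed frame to disk and inspect it:
# One-off test: render a single thresholded frame to a file
load_frame("idle", 0).save("preview_idle_0.png")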
Displaying the frame on the e-paper screen:
# --- DISPLAY FRAME ON EPAPER ---
def display_frame_epaper(epd, animation_name, frame_index):
    frame = load_frame(animation_name, frame_index)
    epd.displayPartial(epd.getbuffer(frame))  # Partial refresh to avoid flicker
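Partial refresh is fast, but e-paper panels can accumulate ghosting after many partial updates in a row. If you notice it, one option is an occasional full clear; here’s a rough sketch of the idea (the counter and interval are my own additions, not part of the script above):
# Optional wrapper: force a full clear every N frames to limit ghosting
REFRESH_EVERY = 100
frame_counter = 0

def display_frame_epaper_fresh(epd, animation_name, frame_index):
    global frame_counter
    frame_counter += 1
    if frame_counter % REFRESH_EVERY == 0:
        epd.init()
        epd.Clear(0xFF)
        epd.displayPartBaseImage(epd.getbuffer(Image.new('1', (SCREEN_WIDTH, SCREEN_HEIGHT), 255)))
    display_frame_epaper(epd, animation_name, frame_index)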
Displaying the frame in the program window (desktop version):
# --- DISPLAY FRAME ON DESKTOP PREVIEW ---
def display_frame_desktop(canvas, photo_img, animation_name, frame_index, root):
    frame = load_frame(animation_name, frame_index)
    display = frame.convert('L')  # Convert back to grayscale for display
    tk_img = ImageTk.PhotoImage(display.resize((SCREEN_WIDTH * 2, SCREEN_HEIGHT * 2)))
    photo_img[0] = tk_img  # Keep a reference so Tkinter doesn't garbage-collect it
    canvas.create_image(0, 0, anchor=tk.NW, image=tk_img)
    root.update()
Next, the function that plays one animation cycle (based on each animation’s frame count), whether on the e-paper screen or in the desktop preview:
# --- PLAY ONE ANIMATION CYCLE ---
def play_animation(animation_name, display_fn):
    config = ANIMATIONS[animation_name]
    frame_count = config["frames"]
    total_duration = config["duration"]

    if total_duration is None:
        # Play once, frame by frame
        for i in range(frame_count):
            display_fn(animation_name, i)
            time.sleep(FRAME_DELAY)
    else:
        # Loop the animation enough times to fill total_duration (at least once)
        loops = max(1, int(total_duration / (FRAME_DELAY * frame_count)))
        for _ in range(loops):
            for i in range(frame_count):
                display_fn(animation_name, i)
                time.sleep(FRAME_DELAY)
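Because play_animation just calls whatever display_fn you hand it, you can dry-run the animation logic anywhere, even without a display. For example:
# Dry run: print which frame would be shown instead of drawing it
play_animation("idle", lambda name, i: print(f"{name} frame {i}"))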
The function that chooses what the cat does next. The transition rules keep the behavior believable: the cat never starts walking straight out of sleep, and it always returns to idle after a walk:
# --- CHOOSE RANDOMLY WHAT THE CAT DOES NEXT ---
def animation_sequence(display_fn):
    last_state = "idle"
    while True:
        # Always wake up before doing anything else
        if last_state == "sleep":
            play_animation("sleep_to_idle", display_fn)
            last_state = "idle"
            continue
        # Always return to idle after a walk
        if last_state in ["walking_positive", "walking_negative"]:
            play_animation("idle", display_fn)
            last_state = "idle"
            continue

        # Randomly pick the next action
        choice = random.choice(["walk_pos", "walk_neg", "sleep", "idle"])
        if choice == "walk_pos":
            play_animation("walking_positive", display_fn)
            last_state = "walking_positive"
        elif choice == "walk_neg":
            play_animation("walking_negative", display_fn)
            last_state = "walking_negative"
        elif choice == "sleep":
            play_animation("idle_to_sleep", display_fn)
            play_animation("sleep", display_fn)
            last_state = "sleep"
        else:
            play_animation("idle", display_fn)
            last_state = "idle"
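If the cat sleeps (or paces) too often for your taste, the uniform random.choice above is easy to swap for weighted sampling; a possible tweak (the weights here are arbitrary, pick your own):
# Optional: bias the behavior, e.g. make sleeping rarer than walking or idling
choice = random.choices(["walk_pos", "walk_neg", "sleep", "idle"], weights=[3, 3, 1, 3])[0]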
The function that starts the e-paper mode:
# --- START EPAPER MODE ---
def run_epaper():
    sys.path.append("lib")  # Add driver path
    from waveshare_epd import epd2in13_V3

    epd = epd2in13_V3.EPD()
    epd.init()
    epd.Clear(0xFF)  # Full white screen clear

    # Set a blank white base image to prepare for partial updates
    epd.displayPartBaseImage(epd.getbuffer(Image.new('1', (SCREEN_WIDTH, SCREEN_HEIGHT), 255)))

    try:
        animation_sequence(lambda a, i: display_frame_epaper(epd, a, i))
    except KeyboardInterrupt:
        epd.init()
        epd.Clear(0xFF)
        epd.sleep()
The function that starts the desktop preview mode:
# --- START DESKTOP PREVIEW MODE (FOR DEV ON MAC/LINUX) ---
def run_desktop():
    root = tk.Tk()
    root.title("Catmagotchi Desktop Preview")
    canvas = tk.Canvas(root, width=SCREEN_WIDTH * 2, height=SCREEN_HEIGHT * 2, bg='white')
    canvas.pack()
    photo_img = [None]  # Used to keep reference to the image
    try:
        animation_sequence(lambda a, i: display_frame_desktop(canvas, photo_img, a, i, root))
    except KeyboardInterrupt:
        root.destroy()
And finally, the main function that launches everything we saw previously:
# --- ENTRY POINT ---
if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--preview", action="store_true", help="Run desktop preview mode instead of e-paper mode")
    args = parser.parse_args()

    if args.preview:
        run_desktop()
    else:
        run_epaper()
Step 4: Add Desktop Preview (Optional)
You can also develop on your Mac or Linux machine, without copying everything to the Pi, by previewing in a Tkinter window. The --preview flag reuses the same load_frame() output and converts each frame with ImageTk for the Tkinter canvas:
python main.py --preview
Where to Go From Here
Here are a few ideas to extend your virtual cat:
- Add touch interaction (wake up the cat by touching the screen; see the evdev sketch after this list)
- Add a weather-based mood (via OpenWeatherMap API)
- Integrate time-based cycles (sleep more at night)
- Add Wi-Fi connectivity to fetch messages or alerts
- Draw speech bubbles or blinking eyes for more life
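For the touch idea in particular, the evdev package installed earlier is a good starting point, assuming the touch controller shows up as a standard Linux input device on your setup. Here’s a minimal sketch; the /dev/input/event0 path is an assumption, so check ls /dev/input/ or evdev.list_devices() to find the right node:
# Minimal touch listener sketch (device path is an assumption, adjust as needed)
from evdev import InputDevice, ecodes

touch = InputDevice("/dev/input/event0")
for event in touch.read_loop():
    if event.type == ecodes.EV_KEY and event.code == ecodes.BTN_TOUCH and event.value == 1:
        print("Touch detected - time to wake the cat")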
Final Thoughts
This project is a great way to give new life to a low-power device like the Pi Zero. You get to practice image rendering, animation logic, and hardware interfacing, all in a minimalist and relaxing format. Plus, your cat never runs out of batteries.
Let me know if you build your own version, or fork and extend it. I’d love to see how far this little cat can go.