
張旭豐

Choose Arduino Modules by Scene, Not by Specification


A module list tells you what to buy.

A scene tells you why the module belongs there.

That difference matters.

Most Arduino beginners ask the wrong shopping question:

Which sensor should I buy?

A better design question is:

What moment do I want the object to understand?

If the moment is a visitor entering a gallery, the device needs presence sensing. If the moment is a hand moving closer, it needs distance. If the moment is a deliberate touch, it needs capacitive sensing. If the moment is environmental change, it needs a temperature sensor.

This guide recommends common interactive-device modules by the situation they serve, not by catalog category.


The core rule: start from the human moment

Do not start with an HC-SR04, a PIR, an ESP32, a WS2812B strip, or a servo.

Start with a human sentence.

Good interaction sentences look like this:

  • When someone enters the room, the object wakes up.
  • When a hand approaches, the light becomes warmer.
  • When someone steps on the floor, a lamp turns on.
  • When the temperature rises, the sculpture spins faster.
  • When someone touches the surface, the object changes mode.

Those sentences already imply the modules.

  • Presence suggests PIR.
  • Distance suggests ultrasonic or time-of-flight.
  • Pressure suggests FSR.
  • Environmental change suggests DHT22.
  • Intentional choice suggests capacitive touch.

The module is not the starting point. The module is the answer to a scene.


Scenario 1: the object sleeps until someone enters

[Figure: PIR LED corridor interaction. A visitor approaches a museum doorway at night; an HC-SR501 PIR on the wall wakes a WS2812B strip. State diagram: IDLE (LED off) → MOTION DETECTED (LED fades in) → HOLD 8s (LED stays on) → NO MOTION (LED fades out)]

Imagine a small installation near a doorway.

Nobody is nearby. The object is dark. It should not flash randomly. It should not keep moving. It should feel like it is waiting.

A visitor enters the space.

The object wakes up.

That scene needs presence sensing, not precise distance.

The interaction flow:

  • INPUT: HC-SR501 PIR sensor — detects changes in infrared patterns from warm bodies moving in the area
  • PROCESSING: Arduino/ESP32 reads digital PIR signal, applies hold state logic to prevent flickering
  • OUTPUT: WS2812B LED strip — fades in warm light when presence is detected, fades out when visitor leaves

Use:

  • HC-SR501 PIR motion sensor
  • Arduino Uno or ESP32
  • WS2812B LED strip or a soft LED output
  • Optional OLED for debugging during setup

Why PIR fits:

A PIR sensor detects changes in infrared patterns. It is good for noticing that a warm body moved in the area. It is not good for measuring how close the visitor is.

The hold state is everything:

If you connect PIR directly to LED on/off, the object feels nervous. It blinks whenever the sensor changes. If you add a hold state, the object feels aware. It noticed someone and stayed attentive.
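
A minimal sketch of that hold-state pattern, assuming the PIR output on pin 2 and a plain LED on pin 5 for testing (both placeholder pins); a real build would fade a WS2812B strip through a library such as FastLED instead of switching a single pin:

const int pirPin = 2;                 // HC-SR501 digital output (assumed pin)
const int ledPin = 5;                 // simple test LED (assumed pin)
const unsigned long holdTime = 8000;  // stay awake 8 s after the last motion

unsigned long lastMotion = 0;

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // Remember the last moment the PIR saw movement
  if (digitalRead(pirPin) == HIGH) {
    lastMotion = millis();
  }

  // The object stays awake for holdTime after motion stops,
  // so it does not flicker every time the PIR output drops
  bool awake = (millis() - lastMotion) < holdTime;
  digitalWrite(ledPin, awake ? HIGH : LOW);
}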

Suggested parts:

  • HC-SR501 PIR Motion Sensor: presence detection for room-scale wake behavior. Amazon search
  • ESP32 Development Board: useful when the installation later needs wireless settings or logging. Amazon search
  • WS2812B LED Strip: visible wake and rest feedback. Amazon search

Scenario 2: the light reacts to how close the hand is

[Figure: ultrasonic proximity LED wall interaction. A hand at 100 cm, 50 cm, and 20 cm from an HC-SR04 on a white exhibition panel maps to dim, medium, and bright LED states; a zone map table sits at the bottom right]

Now imagine the visitor has already walked closer.

The object is awake. The visitor reaches toward it.

The light should not simply turn on. It should change as the hand approaches.

That scene needs distance sensing.

The interaction flow:

  • INPUT: HC-SR04 ultrasonic sensor — measures time-of-flight of 40kHz sound pulse to calculate distance
  • PROCESSING: Arduino maps distance reading to brightness zones (far = dim, medium = warm, close = bright)
  • OUTPUT: WS2812B LED strip — responds with corresponding brightness and color temperature

Use:

  • HC-SR04 ultrasonic sensor for visible, low-cost distance interaction
  • VL53L0X or VL53L1X time-of-flight sensor for compact hidden sensing
  • WS2812B LED strip for output
  • ESP32 or Arduino Uno as controller

Why distance fits:

Distance lets you design zones. This is more expressive than a simple on/off threshold.

A threshold says: if distance is less than 30 cm, turn on.

A zone says: as you approach, the object changes its emotional temperature.

That is interaction design.

Zone map:

Zone   | Distance | LED Behavior
Far    | > 75 cm  | Dim warm light
Medium | 25–75 cm | Medium brightness
Close  | 5–25 cm  | Bright warm glow
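
A minimal sketch of that zone mapping, assuming the HC-SR04 trigger on pin 9, echo on pin 10, and a PWM LED on pin 6 for testing (all placeholder pins); a real build would drive the WS2812B strip through FastLED or Adafruit NeoPixel instead of analogWrite:

const int trigPin = 9;    // HC-SR04 trigger (assumed pin)
const int echoPin = 10;   // HC-SR04 echo (assumed pin)
const int ledPin  = 6;    // PWM test LED (assumed pin)

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

long readDistanceCm() {
  // Send a 10 microsecond pulse and time the echo
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000);  // timeout ~30 ms
  return duration / 58;   // round-trip microseconds to centimeters
}

void loop() {
  long distance = readDistanceCm();

  // Map the three zones from the table above to LED brightness
  int brightness;
  if (distance == 0 || distance > 75) {
    brightness = 30;        // far, or no echo: dim
  } else if (distance > 25) {
    brightness = 120;       // medium: warmer
  } else {
    brightness = 255;       // close: bright
  }
  analogWrite(ledPin, brightness);
  delay(60);                // HC-SR04 needs a short gap between pings
}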

Suggested parts:

  • HC-SR04 Ultrasonic Sensor Module: cheap and easy distance sensing for prototypes. Amazon search
  • VL53L0X Time-of-Flight Sensor: smaller and cleaner for hidden proximity sensing. Amazon search
  • 5V LED Power Supply: needed when LED strips become more than a tiny test. Amazon search

Scenario 3: the floor knows someone stepped on it

[Figure: FSR floor pressure lamp interaction. Cutaway of a living room rug with an FSR 400 underneath, wired with a 10k pulldown resistor into an Arduino analog input; stepping on the rug turns on a floor lamp with a soft warm glow]

Some interactions are not about approaching. They are about physical presence on a surface.

Imagine a carpet that knows when someone is standing on it. Or a floor tile that triggers a sound when stepped on. Or a cushion that wakes up when someone sits down.

That scene needs pressure sensing.

The interaction flow:

  • INPUT: FSR 400 force sensor — resistance changes when pressure is applied, read via Arduino analog input using a 10k pulldown resistor (voltage divider circuit)
  • PROCESSING: Arduino checks if analogRead value exceeds threshold, triggers event on crossing
  • OUTPUT: Floor lamp LED strip — turns on with soft warm glow when pressure is detected

Use:

  • FSR 400 (Force Sensing Resistor) for thin, flexible pressure detection
  • FSR 402 for slightly more rigid mounting
  • 10k Ohm pulldown resistor — critical, without it the analog reading floats

The threshold pattern:

Use the FSR as a threshold trigger, not as a continuous analog input. FSRs are noisy in continuous use. When pressure exceeds the threshold, trigger an event. This is more reliable than trying to read gradual pressure changes.

Critical circuit note:

The analog signal will float without a 10k resistor between the signal pin and ground. This is the most common beginner mistake with FSRs.
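
A minimal sketch of the threshold pattern, assuming the FSR-plus-10k divider output goes to A0 and the lamp is switched from pin 7 (both placeholder pins); the two-level hysteresis keeps the lamp from chattering when the reading hovers near the threshold:

const int fsrPin  = A0;   // FSR + 10k pulldown voltage divider (assumed pin)
const int lampPin = 7;    // lamp or relay output (assumed pin)

const int pressOn  = 400;   // analogRead level that counts as "stepped on"
const int pressOff = 250;   // lower release level to avoid chattering

bool pressed = false;

void setup() {
  pinMode(lampPin, OUTPUT);
}

void loop() {
  int reading = analogRead(fsrPin);   // reads near 0 when idle, thanks to the pulldown

  // Hysteresis: turn on above pressOn, turn off only below pressOff
  if (!pressed && reading > pressOn) {
    pressed = true;
    digitalWrite(lampPin, HIGH);      // event: someone stepped on the mat
  } else if (pressed && reading < pressOff) {
    pressed = false;
    digitalWrite(lampPin, LOW);       // event: they stepped off
  }

  delay(20);
}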

Suggested parts:

  • FSR 400 Pressure Sensor: thin and flexible for carpet or floor mounting. Amazon search
  • 10k Ohm Resistor Pack: needed for pulldown circuit. Amazon search

Scenario 4: the object responds to environmental change

[Figure: DHT22 temperature-driven paper windmill. A DHT22 on a breadboard reads the room near a gallery window; warmer readings drive an SG90 servo faster, spinning the paper windmill, with real-time temperature and servo-speed graphs]

Some interactions are not about a visitor at all.

They are about the environment around the object changing.

Imagine a paper sculpture that spins faster when the room warms up. An LED strip that shifts from cool blue to warm orange as temperature changes through the day. A fan that turns on when humidity rises above a threshold.

That scene needs ambient environmental sensing.

The interaction flow:

  • INPUT: DHT22 temperature and humidity sensor — reads ambient temperature and relative humidity continuously
  • PROCESSING: Arduino maps temperature reading to servo speed, warmer temperature drives faster rotation
  • OUTPUT: Paper windmill (driven by SG90 servo) — spins at speed proportional to current temperature

Use:

  • DHT22 for temperature and humidity together
  • DS18B20 for more precise temperature-only reading
  • Light-dependent resistor (LDR) for ambient light level

Why DHT22 fits:

The DHT22 reads both temperature and relative humidity with enough accuracy for artistic purposes. Library support is mature, so you can get stable readings within an hour of opening the package.

The ambient awareness principle:

The DHT22 does not respond to a human action at a specific moment. It makes the installation continuously sensitive to its environment. An environmental sensor makes an installation feel alive in a different sense — not reactive to visitors, but continuously responding to the conditions around it.
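
A minimal sketch of the temperature-to-speed mapping, assuming the Adafruit DHT sensor library, the DHT22 data pin on 4, a continuous-rotation servo on pin 9 (where write(90) means stop and values further from 90 mean faster rotation), and a 20–30 °C working range; the pins and range are placeholders to tune for the actual room:

#include "DHT.h"
#include <Servo.h>

const int dhtPin   = 4;    // DHT22 data pin (assumed)
const int servoPin = 9;    // continuous-rotation servo signal pin (assumed)

DHT dht(dhtPin, DHT22);
Servo windmill;

void setup() {
  dht.begin();
  windmill.attach(servoPin);
}

void loop() {
  float tempC = dht.readTemperature();

  if (!isnan(tempC)) {
    // Map roughly 20–30 °C to servo speed: 90 = stopped, 180 = full speed
    int speed = map(constrain((int)tempC, 20, 30), 20, 30, 90, 180);
    windmill.write(speed);
  }

  delay(2000);   // the DHT22 can only be read about every 2 seconds
}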

Suggested parts:

  • DHT22 Temperature and Humidity Sensor: reads both ambient temperature and relative humidity. Amazon search
  • DS18B20 Waterproof Temperature Sensor: more precise temperature reading for outdoor or liquid use. Amazon search
  • SG90 Micro Servo: drives the paper windmill. Amazon search

Scenario 5: touch means permission

Presence is passive.

Distance is exploratory.

Pressure is physical.

Touch is intentional.

That is why capacitive touch modules are powerful in interactive devices. They do not only detect contact. They change the social meaning of the object.

Imagine a lamp that wakes when you enter, glows warmer as you approach, and changes mode only when you touch the side.

That touch says: I choose to interact.

The interaction flow:

  • INPUT: TTP223 capacitive touch module — detects change in capacitance when a finger approaches or touches the surface, works through wood, acrylic, or paper
  • PROCESSING: Mode toggle logic — each tap switches between two modes
  • OUTPUT: LED color changes + OLED display updates to show current mode

[Figure: capacitive touch mode change. Cutaway of a wooden desk with a TTP223 module hidden underneath; each tap toggles between Mode A (blue) and Mode B (amber) on the LED strip while an OLED shows the current mode]

Use:

  • TTP223 capacitive touch module
  • Copper tape or conductive pad behind a surface for larger touch areas
  • ESP32 or Arduino Uno
  • LED strip, OLED, or servo as feedback

Good touch interactions:

  • Tap to change mode
  • Long press to save a setting
  • Touch and hold to make the object breathe
  • Double tap to reset
  • Hidden touch point behind wood, acrylic, or paper

After a touch, the object should acknowledge immediately:

A short light ripple, a tiny servo nod, or a display change tells the user: your intention was received. Without acknowledgment, the touch feels like it disappeared into nothing.
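
A minimal sketch of that acknowledgment pattern, assuming the TTP223 output on pin 4 and a feedback LED on pin 5 (placeholder pins): a short tap toggles the mode and flashes a brief acknowledgment, while a hold longer than a second is left open for a different action.

const int touchPin = 4;   // TTP223 output (assumed pin)
const int ledPin   = 5;   // feedback LED (assumed pin)

bool lastTouch = false;
unsigned long touchStart = 0;
bool activeMode = false;

void setup() {
  pinMode(touchPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  bool touchNow = digitalRead(touchPin) == HIGH;

  if (touchNow && !lastTouch) {
    touchStart = millis();          // finger just arrived
  }

  if (!touchNow && lastTouch) {
    unsigned long held = millis() - touchStart;
    if (held < 1000) {
      // Short tap: toggle the mode and acknowledge immediately
      activeMode = !activeMode;
      digitalWrite(ledPin, HIGH);   // quick flash says "your intention was received"
      delay(80);
      digitalWrite(ledPin, LOW);
    } else {
      // Long press: reserved for a different action, e.g. saving a setting
    }
  }

  lastTouch = touchNow;
  delay(10);
}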

Suggested parts:

  • TTP223 Capacitive Touch Sensor Module: simple intentional input for surfaces. Amazon search
  • Copper Foil Tape: useful for making larger custom touch areas. Amazon search
  • 0.96 inch I2C OLED Display: shows current mode during interaction. Amazon search

A complete scene-based build

Here is a complete interaction scene that combines multiple modules.

Scene:

A small tabletop object waits in a gallery. When a visitor enters the area, it wakes. When the visitor approaches, the light becomes warmer. When the visitor touches the surface, the object changes mode and a small servo moves slowly.

Module stack:

Stage    | Module             | Role
Entry    | HC-SR501 PIR       | Wakes object when visitor enters
Approach | HC-SR04 Ultrasonic | Measures how close visitor is
Intent   | TTP223 Touch       | Confirms deliberate interaction
Output   | WS2812B LED        | Shows emotional state through light
Motion   | SG90 Servo         | Creates physical movement
Debug    | OLED Display       | Shows state during tuning

The state machine:

  1. Sleeping — LEDs off, PIR monitoring
  2. Visitor detected — LED fades in softly
  3. Approach detected — LED gets warmer as distance decreases
  4. Touch confirmed — Mode toggles, servo moves slowly
  5. Active mode — Object responds with new behavior
  6. Hold state — Object stays awake after visitor steps back
  7. Return to rest — LED fades back to sleeping state

That state list is more important than the shopping list.

The shopping list gives you parts.

The state list gives you behavior.


Minimal state machine code

The code structure should follow the scene.

This sketch is a state-machine skeleton. The actual output functions (showSleepingLight, showAwakeGlow, showActiveMode, moveServoSlowly) would call FastLED, Adafruit NeoPixel, Servo, or your chosen output library.

const int pirPin = 2;
const int touchPin = 4;

unsigned long lastPresenceTime = 0;
const unsigned long holdTime = 8000;
bool activeMode = false;
bool lastTouch = false;

void setup() {
  pinMode(pirPin, INPUT);
  pinMode(touchPin, INPUT);
}

void loop() {
  // INPUT: Detect if someone entered the space
  bool presence = digitalRead(pirPin) == HIGH;
  if (presence) {
    lastPresenceTime = millis();
  }

  // PROCESSING: Hold state keeps object awake after motion stops
  bool awake = millis() - lastPresenceTime < holdTime;

  // INPUT: Touch means the user made an intentional choice
  bool touchNow = digitalRead(touchPin) == HIGH;
  if (awake && touchNow && !lastTouch) {
    activeMode = !activeMode;
  }
  lastTouch = touchNow;

  // OUTPUT: LED/servo respond to current state
  if (!awake) {
    showSleepingLight();       // OUTPUT: dim or off
  } else if (activeMode) {
    showActiveMode();          // OUTPUT: bright, warm
    moveServoSlowly();         // OUTPUT: gentle motion
  } else {
    showAwakeGlow();          // OUTPUT: soft ambient
  }
}
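
If you want the skeleton to compile before the real output code exists, the placeholder functions can be stubbed below the loop. These stubs are not part of the original sketch; they assume a plain PWM LED on pin 5 and an SG90 on pin 9, and a real build would replace them with FastLED or Adafruit NeoPixel calls.

#include <Servo.h>

const int ledPin   = 5;   // PWM LED for early testing (assumed pin)
const int servoPin = 9;   // SG90 signal pin (assumed)

Servo servo;

void showSleepingLight() {
  analogWrite(ledPin, 0);            // OUTPUT: off while sleeping
}

void showAwakeGlow() {
  analogWrite(ledPin, 60);           // OUTPUT: soft ambient glow
}

void showActiveMode() {
  analogWrite(ledPin, 255);          // OUTPUT: bright active light
}

void moveServoSlowly() {
  if (!servo.attached()) {
    servo.attach(servoPin);          // attach lazily so setup() stays unchanged
  }
  servo.write(120);                  // OUTPUT: a real build would ease between positions
}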

This is not complete installation code. It is a pattern.

The pattern is:

  • INPUT sensors detect human actions (presence, touch, proximity)
  • Processing logic decides what the signals mean (hold states, mode toggles)
  • OUTPUT functions show the object's response (light, motion, sound)

That is the structure most beginner projects miss.


What to buy for this build

If I were building this exact interaction, I would buy a balanced kit instead of a random sensor pack.

Input modules:

  • HC-SR501 PIR Motion Sensor: wake the object when someone enters the area. Amazon search
  • HC-SR04 Ultrasonic Sensor: measure approach distance. Amazon search
  • TTP223 Capacitive Touch Sensor: confirm intentional interaction. Amazon search
  • FSR 400 Pressure Sensor: detect standing or sitting on a surface. Amazon search

Controller:

  • ESP32 Development Board: good for connected interactive objects. Amazon search
  • Arduino Uno Compatible Board: good for first tests and workshops. Amazon search

Output modules:

  • WS2812B LED Strip: the most expressive output for interactive objects. Amazon search
  • SG90 Micro Servo: simple mechanical movement for small mechanisms. Amazon search

Power and infrastructure:

  • 5V 10A LED Power Supply: reliable power for LEDs and servo. Amazon search
  • Logic Level Shifter Module: shifts the ESP32's 3.3V data signal to 5V so WS2812B strips read it reliably. Amazon search
  • 0.96 inch I2C OLED Display: state readout during development. Amazon search

The Module Selection Framework

When you are planning your next interactive project, work through this sequence:

  1. What is the primary interaction? Presence, proximity, contact, environmental change, or motion?
  2. What is the output? LED effect, motor movement, sound playback, projection mapping?
  3. What are the physical constraints? Range needed, mounting position, weather exposure?
  4. What is the single module that best serves the primary interaction?

Most beginners buy a sensor kit and then try to find an application for each module. More effective: define the interaction you want, then find the single module that makes that interaction possible. Start there. Add complexity only when you have a reason.


Start With One Module, One Interaction

If you have been stuck in tutorial paralysis — reading about modules without building anything — pick one scenario from this article and build it this weekend. Not a complex system. One module. One output. One clear interaction.

A PIR sensor triggering an LED when someone walks by. An ultrasonic sensor making a servo rotate as someone approaches. A pressure mat turning on a light when someone steps on it.

Once you have that single interaction working reliably, you will understand the module well enough to know where it fits in something more ambitious.

The gap between "I have a list of sensors" and "I know how to use these to create experiences" is not knowledge. It is practice.

If you want a blueprint for building a complete multi-sensor interactive system, I put together a detailed Interactive Project Blueprint on Fiverr that covers the module selection logic, wiring approach, and code structure for a proximity-reactive installation.

Get the Interactive Project Blueprint on Fiverr

Affiliate disclosure: As an Amazon Associate, I earn from qualifying purchases.
