My current challenge is to find ways to connect Grotto to a gallery space for my final show. I’d like to avoid falling into the same pattern of “thing projected on a wall” that you see at a lot of these shows. So far I have been concentrating on ESP32 microcontrollers and headless Raspberry Pis that can get information from Grotto’s API. I’d like to use lights, sounds, and weird tactile interfaces as much as possible.

I have a few ideas I’ve been researching:

  1. I’ve imagined a doorknob mounted on a rotary encoder as a novel way to move a character around in the game. An ESP32 gets player information from Grotto with the security token and pulls the connected exits from the current room. Rotating the encoder cycles through the doors, visualized by an RGB LED inside a keyhole (exits have RGB color values in Grotto). The doorknob is connected to a cog/ratchet/solenoid lock assembly: if the “door” is locked, the solenoid locks the cog; otherwise the knob can be turned, flipping a limit switch that sends a command to Grotto to enter the connecting room. A bright RGB light fills the gallery space with the color of the current room.
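The selection logic itself is simple; here’s a toy sketch of it in plain Python, with made-up exit data (the field names are guesses for illustration, not the real Grotto payload):

```python
# Toy model of the doorknob selection logic: the encoder position picks an
# exit, and the solenoid engages when that exit is locked.
# (made-up exit data; field names are hypothetical, not the Grotto API)

exits = [
    {"name": "north", "color_hex": "#ff0000", "locked": False},
    {"name": "cellar", "color_hex": "#00ff00", "locked": True},
    {"name": "garden", "color_hex": "#0000ff", "locked": False},
]

def select_exit(position):
    # wrap the encoder position around the exit list so the knob can spin forever
    return exits[position % len(exits)]

def solenoid_engaged(exit_):
    # lock the cog (block the knob) when the selected door is locked
    return exit_["locked"]

door = select_exit(4)  # four detents clockwise wraps around to exits[1]
print(door["name"], solenoid_engaged(door))  # -> cellar True
```

Wrapping with modulo (instead of indexing the raw position) means the encoder never runs off the end of the exit list, no matter how far the knob is turned.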

where this is at:
Surprisingly, I was able to get JSON from the Grotto API on the ESP32 pretty quickly (some platforms act sticky about the auth token). I can now cycle through exits with the rotary encoder (and change a NeoPixel LED color to match each door), and button presses with the limit switch are working. I don’t have the solenoid lock working yet (I’ve tested the part on its own, though), and I haven’t finished sending the use-exit command. I also have multiple functions that need to fire on different timers in the CircuitPython sketch (I switched from Arduino to CircuitPython because HTTPS never worked on the Arduino), and I don’t know how to get them to work properly: they currently both fire when the longest timer runs out. Ask Chandler about this.

Here’s the circuitpython code-

# SPDX-FileCopyrightText: 2019 ladyada for Adafruit Industries

# SPDX-License-Identifier: MIT

import time
import board
import busio
import neopixel
import usb_hid
import rotaryio
import digitalio

from digitalio import DigitalInOut
import adafruit_requests as requests
import adafruit_esp32spi.adafruit_esp32spi_socket as socket
from adafruit_esp32spi import adafruit_esp32spi
import json

# Get wifi details and more from a secrets.py file
try:
    from secrets import secrets
except ImportError:
    print("WiFi secrets are kept in secrets.py, please add them there!")
    raise

print("ESP32 SPI webclient test")

# Grotto player-info endpoint (URL redacted for this post)
JSON_URL = ""

headers = {"Authorization": secrets["token"]}

pixel = neopixel.NeoPixel(board.NEOPIXEL, 1)

# If you are using a board with pre-defined ESP32 Pins

esp32_cs = DigitalInOut(board.ESP_CS)
esp32_ready = DigitalInOut(board.ESP_BUSY)
esp32_reset = DigitalInOut(board.ESP_RESET)
spi = busio.SPI(board.SCK, board.MOSI, board.MISO)
esp = adafruit_esp32spi.ESP_SPIcontrol(spi, esp32_cs, esp32_ready, esp32_reset)

requests.set_socket(socket, esp)

button1 = digitalio.DigitalInOut(board.D12)
button1.direction = digitalio.Direction.INPUT
button1.pull = digitalio.Pull.UP

button2 = digitalio.DigitalInOut(board.D0)
button2.direction = digitalio.Direction.INPUT
button2.pull = digitalio.Pull.UP

encoder = rotaryio.IncrementalEncoder(board.D10, board.D9)
button1_state = None
button2_state = None

def hex2rgb(color):
    hex = color.lstrip("#")
    rgb = tuple(int(hex[i : i + 2], 16) for i in (0, 2, 4))

    return rgb

if esp.status == adafruit_esp32spi.WL_IDLE_STATUS:
    print("ESP32 found and in idle mode")
print("Firmware vers.", esp.firmware_version)
print("MAC addr:", [hex(i) for i in esp.MAC_address])

for ap in esp.scan_networks():
    print("\t%s\t\tRSSI: %d" % (str(ap["ssid"], "utf-8"), ap["rssi"]))

print("Connecting to AP...")
while not esp.is_connected:
    try:
        esp.connect_AP(secrets["ssid"], secrets["password"])
    except OSError as e:
        print("could not connect to AP, retrying: ", e)
        continue
print("Connected to", str(esp.ssid, "utf-8"), "\tRSSI:", esp.rssi)
print("My IP address is", esp.pretty_ip(esp.ip_address))

print("Fetching json from", JSON_URL)
r = requests.get(JSON_URL, headers=headers)

jdata = r.json()


roomColor = hex2rgb(jdata["room"]["color_hex"])

# print(roomColor)

exits = []
currentExit = {}

last_position = None

def updateRotary():
    global button1_state, currentExit, last_position
    # rotary encoder
    position = encoder.position
    if last_position is None or position != last_position:
        last_position = position
        try:
            currentExit = exits[position]
        except IndexError:
            print("exits error")
    if not button1.value and button1_state is None:
        button1_state = "pressed"
    if button1.value and button1_state == "pressed":
        print("Button1 pressed.")
        button1_state = None

def updateKnobSwitch():
    global button2_state
    if not button2.value and button2_state is None:
        button2_state = "turned"
    if button2.value and button2_state == "turned":
        print("knob turned.")
        button2_state = None

def updateapi():
    global exits, roomColor
    r = requests.get(JSON_URL, headers=headers)
    jdata = r.json()
    roomColor = hex2rgb(jdata["room"]["color_hex"])
    # replace rather than append, so rechecks don't stack up duplicate exits
    exits = jdata["room"]["exits"]

# main loop
API_INTERVAL = 5  # seconds between api rechecks
last_api_check = 0

while True:
    # timed recheck of the api, without blocking the encoder/button polling
    now = time.monotonic()
    if now - last_api_check > API_INTERVAL:
        updateapi()
        last_api_check = now
    updateRotary()
    updateKnobSwitch()

cogs, hardware-wise: I have laser cut some MDF cogs and a ratchet, and I’m currently gluing on a little spring piece to keep the lightweight MDF ratchet in place. I need to secure the knob axle on the side opposite the knob, but there’s no nut, metric or customary, that seems to fit it. Once I have this prototype working well, I’ll 3D model the assembly and cut aluminum parts, then a wooden housing. I imagine the housing shaped like a prism stood on one end, with the wood of the base riddled with termite-damage-like patterns.

Why any of this??

I like the idea of a doorknob as a game controller for a game that gives feedback without a screen (or maybe a minimal screen like e-ink or an LED matrix). There’s the touch sensation of the knob (I have antique knobs given to me by my mother, with family history and interesting weight and feel). There’s the familiar action of turning it. But the “door” must be imagined.

[image: the heavy crockery doorknob my uncle gave my mother at my grandmother’s funeral]

  2. Related to the doorknob (“the phantom doorknob”), it would be great to have detailed audio feedback as well as RGB lighting to give an impression of the room in the game. I’ve been experimenting with Sonic Pi, which should be able to connect to Grotto’s API, get information about the current room, and then play loops at different volumes and with varying reverb as a result.

where this is at: This is new to me, and it’s been a while since I tried writing any Ruby. The Sonic Pi sketch isn’t connecting to the API, and I can’t tell whether or not it’s the authentication token. Here’s what my program looks like now (with a redacted auth token):

# Load the library for making HTTP requests + json

require 'open-uri'
require 'json'

# Define a function to fetch the JSON data from the URL
# (Kernel#open stopped handling URLs in Ruby 3, so use URI.open explicitly)

def fetch_json
  json_data = URI.open("",
                       "Authorization" => "Token <auth token here>").read
  JSON.parse(json_data)
end

# Fetch the JSON data

data = fetch_json

# Extract the attributes from the "room" object

attributes = data["room"]["attributes"]

# Extract the number of exits from the "room" object

num_exits = data["room"]["exits"].length

# Define the reverb amount based on the number of exits

reverb = num_exits / 25.0

# Define the audio loops

loop1 = :loop_amen
loop2 = :loop_garzul
loop3 = :loop_industry

# Set the volume levels for each loop based on the brightness, cleanliness, and sanctity attributes

with_fx :reverb, mix: reverb do
  live_loop :loop1 do
    sample loop1, rate: 1, amp: attributes["brightness"] / 10.0
    sleep sample_duration(loop1)
  end

  live_loop :loop2 do
    sample loop2, rate: 1, amp: attributes["cleanliness"] / 10.0
    sleep sample_duration(loop2)
  end

  live_loop :loop3 do
    sample loop3, rate: 1, amp: attributes["sanctity"] / 10.0
    sleep sample_duration(loop3)
  end
end
When I run this I get: *Runtime Error: [buffer 0, line 17] - Errno::ENOENT Thread death! No such file or directory @ rb_sysopen -* (I tried the url with and without ?format=json at the end)

Which I guess could be the auth token bouncing, but it’s formatted just like it is in my old Unity project Mud Room, which works. Then again, rb_sysopen is a file open, so it looks like the URL is being treated as a local path and the request never goes out at all; newer Rubies dropped Kernel#open’s URL handling, which would mean it needs to be URI.open here.

  3. Working off the idea that I avoid using overkill computers with displays, I had started using Sonic Pi because it could run on my Rpi 4. I liked the idea of having multi-channel audio in the space, so I did a little investigating and found that a cheap USB audio interface could be used for surround sound. Unfortunately the surround sound speakers DMA has to test with have digital in only, and after more time experimenting than I would have liked, I found out that my USB audio device only does surround through its phone jacks; its optical output just does stereo. I could keep pursuing this with different speakers, but my concern is that when I run speaker-test a couple of times in a row on the Rpi, it sometimes gives me a playback open error: -16, Device or resource busy, which doesn’t inspire confidence.

[image: raspberry pi and cheapo usb audio]

So: the path of least resistance here is to submit to having a “real” computer as part of the piece. Since API calls work fine in Unity, and presumably Unity knows what to do with surround sound speakers, I could just build the audio part in Unity. If I end up needing some projection visuals too, I can add those to the new Unity project, which would already have a lot of the work done, since I could reuse code from Mud Room. I know it’s the best way forward, but it’s a little disappointing.

  4. I’d like to have elements of the past two shows in the gallery, and I was thinking of replacing the overhead projector from Archon with an antique opaque projector. Opaque means I don’t need to make transparency slides for it; you can just stick a book in it (I’d like to make a book for this show like I did for Archon). The bulb that’s in it is faint, though. I did a brief test with Ariel where we stuck very powerful LEDs in it, which worked well and ran cooler than the original incandescent bulb.

where I’m at with this: I have to wire up the LEDs and figure out how to mount them inside, then I’d like to cut another wood pedestal with a built-in bookshelf, like in Archon.

  5. Print book: this seems not too hard. New output from my tile paintings / wave function collapse work, plus my thesis writing, in a book, like a big version of the Archon book. It’d be nice to have some fold-out pages and other design features.


to do:

  • set up a new unity project based on Mud Room with 5.1 audio output
  • get the solenoid lock working on the esp32
  • get the use exit api post working in the circuitpython sketch for the esp32
  • figure out how to have coroutines working on different timers in CircuitPython (ask Chandler)
  • find a way to secure the doorknob
  • make a pedestal for the opaque projector
  • supercharge the opaque projector
  • start laying out the new book