Date: Sat, 3 Jan 2015 20:54:45 -0800
From: Bret Victor
Subject: laser architecture sketch
Having fun thinking about architecture.  Thinking about this as basically a room-sized operating system.

The speech server is there because I want to be able to stand in the research gallery and say "from Toby Schachman, about two months ago" (perhaps into a microphone or iPhone, or perhaps just out loud while holding down a button), and have labels matching that query highlighted.
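A minimal sketch of that label-matching query, assuming labels carry hypothetical "from" and "date" metadata fields, and assuming the spoken phrase has already been parsed into a structured query (the field names and the fuzziness window are my inventions):

```javascript
// Hypothetical label records; in practice these would come from the
// label server's database.
const labels = [
  { id: 1, from: "Toby Schachman", date: new Date("2014-11-05") },
  { id: 2, from: "Bret Victor",    date: new Date("2014-12-20") },
];

// "from Toby Schachman, about two months ago", already parsed into
// { from, aboutDaysAgo, slopDays }. "about" is modeled as a tolerance
// window around the requested age.
function matchLabels(labels, query, now) {
  const msPerDay = 24 * 60 * 60 * 1000;
  return labels.filter(function (label) {
    const ageInDays = (now - label.date) / msPerDay;
    return label.from === query.from &&
           Math.abs(ageInDays - query.aboutDaysAgo) < query.slopDays;
  });
}

// hits would be the labels to highlight in the gallery
const hits = matchLabels(
  labels,
  { from: "Toby Schachman", aboutDaysAgo: 60, slopDays: 30 },
  new Date("2015-01-03")
);
```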


------------------------------------------------------

All servers and apps are in javascript (when possible).
Communication is via websockets (when appropriate).


in the wild

mail server
database of email archive
currently, a gmail account and inboxapp

label server
database of labels
a label is a physical or virtual image that can be physically placed, and may have links or media attached
the label server stores the appearance and metadata of a label, not its physical location
currently a cgi script on worrydream.com
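A sketch of what a label record might look like, under the constraint above: the label server stores appearance and metadata only, never a physical location (that belongs to the physicality server). All field names here are assumptions.

```javascript
// Hypothetical label record as the label server might store it.
const label = {
  id: "label-42",               // hypothetical id scheme
  kind: "physical",             // or "virtual"
  appearance: {
    image: "media-12",          // id resolved by the media server
  },
  metadata: {
    from: "Toby Schachman",
    date: "2014-11-05",
    links: ["http://worrydream.com/"],   // attached links
  },
  // deliberately no "location" field -- physical placement is the
  // physicality server's job, not the label server's
};
```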

media server
database of images and other media files referenced by emails and labels
currently a cgi script on worrydream.com

label web app
tool for generating a label


in the lab

app server
knows which apps are on which surfaces
manages launching and switching apps
manages routing events to apps (such as laser events, kinect events)
implements some global UI behavior (calibration, clicking on simple labels)
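The routing part of the app server could be sketched as below, assuming each surface runs at most one app and that laser/kinect events arrive already tagged with the id of the surface they occurred on (both assumptions on my part):

```javascript
// surfaceId -> app currently running there
const appsOnSurfaces = {};

function launchApp(surfaceId, app) {
  // switching apps is just replacing the entry for that surface
  appsOnSurfaces[surfaceId] = app;
}

function routeEvent(event) {
  // laser and kinect events are assumed to carry the surface they
  // were resolved to by the laser/physicality servers
  const app = appsOnSurfaces[event.surfaceId];
  if (app) {
    app.handleEvent(event);
    return true;
  }
  return false;   // no app there; global UI behavior could take over
}

// usage
const received = [];
launchApp("north-wall", { handleEvent: (e) => received.push(e) });
routeEvent({ type: "laser", surfaceId: "north-wall", x: 0.5, y: 0.5 });
```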

physicality server
knows the 3D location of objects in the room
objects can be surfaces, labels, projectors, cameras, kinects, walls, furniture...
locations can be relative (eg, a label's location might be relative to a surface or a wall)
can serve transform matrices for converting from one viewpoint to another
can do hit testing / ray casting
can find surfaces near a location, etc
initially, it won't actually know absolute locations, just enough to answer the questions we ask of it
eventually, it might track absolute locations of all objects
eventually, it might auto-calibrate to figure out locations it doesn't know, and auto-update as objects are perturbed
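The relative-location idea can be sketched as a parent chain: an object's location is given relative to some parent object, and room coordinates are recovered by walking up the chain. (Real transforms between viewpoints would be full matrices; this sketch is translation-only, and all object names are made up.)

```javascript
// Hypothetical object table: each entry gives a parent frame and an
// offset within that frame. A null parent means room coordinates.
const objects = {
  "room":       { parent: null,         offset: { x: 0, y: 0, z: 0 } },
  "north-wall": { parent: "room",       offset: { x: 5, y: 0, z: 0 } },
  "label-42":   { parent: "north-wall", offset: { x: 1, y: 2, z: 0 } },
};

// Resolve an object's absolute (room) location by accumulating
// offsets up the parent chain.
function roomLocation(id) {
  const obj = objects[id];
  if (obj.parent === null) return { ...obj.offset };
  const p = roomLocation(obj.parent);
  return {
    x: p.x + obj.offset.x,
    y: p.y + obj.offset.y,
    z: p.z + obj.offset.z,
  };
}

roomLocation("label-42");   // → { x: 6, y: 2, z: 0 }
```

This shape also explains the "won't actually know absolute locations" caveat: the server only needs enough of the chain populated to answer the query at hand.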

laser server
does laser finding and tracking
receives candidate laser positions from nodes
determines the laser's surface and location
sends events to app server
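One way the laser server might fix a single position from the candidates that several nodes report. Picking the highest-confidence candidate is purely an assumption for the sketch; a real fuser might average nearby candidates or track over time.

```javascript
// candidates: positions reported by the nodes' cameras for one frame,
// each with a confidence score (field names are assumptions).
function fixLaser(candidates) {
  if (candidates.length === 0) return null;   // no laser seen this frame
  return candidates.reduce((best, c) =>
    c.confidence > best.confidence ? c : best);
}

const fix = fixLaser([
  { node: "node-1", surfaceId: "north-wall", x: 0.51, y: 0.48, confidence: 0.7 },
  { node: "node-2", surfaceId: "north-wall", x: 0.50, y: 0.50, confidence: 0.9 },
]);
// the resulting fix would then be sent to the app server as a laser event
```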

speech server
listens for speech commands
transcribes audio into text
sends speech events to app server

print server
prints out labels
eventually, might print out 3D objects

kiosk web app
runs in any web browser
turns the device (computer display, iPad, iPhone) into a surface which can run apps
the device's location is registered with the physicality server
for mobile devices, the location is updated as the device is moved
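A sketch of the bookkeeping behind those last two points, as the physicality server might see it: the kiosk app registers the device as a surface, and a mobile device re-sends its location as it moves. The message shapes are assumptions.

```javascript
// deviceId -> surface record, as the physicality server might keep it
const surfaces = {};

function onMessage(msg) {
  if (msg.type === "register") {
    // kiosk web app announces the device as a surface
    surfaces[msg.deviceId] = { size: msg.size, location: msg.location };
  } else if (msg.type === "move") {
    // mobile devices re-send this as they are carried around
    surfaces[msg.deviceId].location = msg.location;
  }
}

onMessage({ type: "register", deviceId: "ipad-1",
            size: { w: 0.24, h: 0.17 },          // meters, an assumption
            location: { x: 2, y: 1, z: 1 } });
onMessage({ type: "move", deviceId: "ipad-1",
            location: { x: 3, y: 1, z: 1 } });
```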


node

A node is a generic hardware/software bundle that can be "dropped in" to the room and just work (after calibration).

hardware
Mac Mini, plus any number of:
projectors
cameras
kinects
microphones

node manager
does some camera and kinect processing and reports back to the appropriate servers

chrome
runs with --remote-debugging-port, to enable developer tools on laptop
runs host in presentation mode on each display
host runs apps in iframe
host accepts events from app server
apps might communicate only with host, or might communicate directly with servers via some javascript library
("host" and "kiosk web app" might be the same thing)