Date: Mon, 6 Mar 2023 13:43:49 -0500
From: Luke Iannini
Subject: Re: knobs and sliders
(Apologies for the short gap in replies; Alex just took her 8-hour USMLE Step 1 exam on Friday, so it was all hands on deck here! She's through the gauntlet but won't know her results for two nail-biting weeks...)

My favorite line from the video was his phrasing of 
[you want to tighten the feedback loop] "...so that the accidents can happen faster"

A deep connection with physical controls is in our genes. Friend of the lab David Kanaga (of Music Cards fame) made an unbelievable game-like-thing back in 2015 that was entirely driven by a custom-built bank of MIDI knobs and sliders controlling an 18-dimensional possibility space (9 knobs, 9 faders) of sound and vision. With fingertips on the faders you could move 9 of them simultaneously, and in the hyper-immersive installations they did of it around SF, with large projections and speakers, the experience was like nothing else. (I think we were in neighboring rooms at an event back when we were showing off La Tabla!)

(Media is hard to find since most of the links are broken, but here are some decent glimpses, and I think it can still be found on Steam:)
https://www.youtube.com/watch?v=gBTTMNFXHTk
https://www.gamedeveloper.com/audio/road-to-the-igf-ramallo-and-kanaga-s-i-panoramical-i-

Definitely worth taking inspiration from for our own immersive explorations of parameter spaces!

Taken to the extreme: a quick tally of my studio reveals somewhere well over 1000 individual knobs (276 on the mixer alone!). When asked why I still use physical synthesizers, when I could simply spawn 512 virtual Minimoogs running in Ableton Live, I explain that when I'm fully immersed in constructing a song, I know exactly where in my studio each sound is coming from. I can close my eyes and know that not only can I change that melody by reaching for that synthesizer, but that within that synthesizer's panel I can change that aspect of that sound by reaching "without thinking" for the stable-in-space, single-purpose knob I need.

Someday soon we'll do this with DNA.

A Standard-Issue Dynamicland Parameter Space Explorer sounds like a wonderful idea, and I've definitely kept a background mental thread running on integrating MIDI-controller-type devices into Realtalk! Knowing there's some shared excitement for that, I'd be thrilled to get us all set up in the near term with some off-the-shelf wireless-ified control banks! It would be fun to have a tracked vertical bank of 8 knobs that automatically connects to the closest 8 parameters when placed to the right of, e.g., a vertically arranged Recognition Kit pipeline...
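
(To make that concrete: the matching itself could be almost trivial. A throwaway sketch in plain Python rather than Realtalk, with every name invented for illustration; sort knobs and parameters by vertical position and pair them off:)

    # Invented names throughout; this is not Realtalk code. "knobs" and
    # "params" stand in for whatever the tracking system actually reports.
    def bind_knobs_to_parameters(knobs, params, max_bindings=8):
        """Pair knobs with parameters top-to-bottom, so a vertical bank
        of knobs lines up with a vertically arranged pipeline."""
        knobs = sorted(knobs, key=lambda k: k["y"])
        params = sorted(params, key=lambda p: p["y"])
        return list(zip(knobs[:max_bindings], params[:max_bindings]))

    # Toy usage: two tracked knobs, two exposed parameters.
    knobs = [{"id": "knob-1", "y": 120}, {"id": "knob-2", "y": 40}]
    params = [{"name": "threshold", "y": 35}, {"name": "blur-radius", "y": 110}]
    for knob, param in bind_knobs_to_parameters(knobs, params):
        print(knob["id"], "->", param["name"])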

In the medium term I think they'd be a fabulous target for some simple custom hardware. My personal basic wishlist would include higher resolution (most MIDI controllers only have 128 steps, i.e. 7-bit CCs, though a few higher-end ones support NRPNs to give 16384 values; since MIDI conformance isn't necessary we can just use some 16-bit ADCs), sane wireless as Shawn mentions, and something like Qi inductive charging so you can just place the knobs casually on the table to charge them...
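
(Where the 16384 comes from: an NRPN value is split across two 7-bit data-entry CCs, so 2^14 = 16384 steps. A quick Python sketch of the reassembly; the CC numbers are standard MIDI, everything around them is invented:)

    # Standard MIDI CC numbers: 99/98 select the NRPN (MSB/LSB),
    # 6/38 carry the value (Data Entry MSB/LSB). Each CC payload is
    # 7 bits, so the combined value spans 0..16383.
    state = {"param": 0, "value_msb": 0}

    def on_cc(cc_number, value):
        if cc_number == 99:                    # NRPN select, high 7 bits
            state["param"] = (value << 7) | (state["param"] & 0x7F)
        elif cc_number == 98:                  # NRPN select, low 7 bits
            state["param"] = (state["param"] & ~0x7F) | value
        elif cc_number == 6:                   # Data Entry MSB
            state["value_msb"] = value
        elif cc_number == 38:                  # Data Entry LSB completes it
            combined = (state["value_msb"] << 7) | value   # 0..16383
            print(f"NRPN {state['param']} = {combined}")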

(On the topic of custom hardware: I mentioned to Shawn the mysterious magic of JLCPCB, which our other friend-of-the-lab Alex Evans (co-creator of Dreams) used to manufacture his own from-scratch touch-controlled synthesizer from his apartment without ever picking up a soldering iron. They have a fixed catalog of 350K components, and if you restrict your design to those, you can just upload your Eagle/Gerber files and get a fully assembled batch of PCBs a week later, with optional 3D-printed enclosures. Something to experiment with until we've got the fully-integrated Apparatus Lab up and running!)

On Mar 1, 2023, at 5:30 PM, Shawn Douglas <****************> wrote:

This is a great find! Also worth checking out what he says later at 28:00:

This relates a bit to what I was describing with artists' turnaround time. If I didn't have the sliders, and all I could do was change shader logic in order to see changes to the shader, then it takes about a minute to turn that around. If I have a very short shader it might compile in a span of a few seconds, but even then the number of varieties of that shader I can see is constrained by that compile time. With the slider, I can see hundreds of variations of the shader. Particularly if I change the sampling patterns, I don't have to guess what happens if I drop the number of samples to zero; I see what happens. And I can see maybe how many samples it takes before the shadow is satisfactory, and how many samples I need before the shadow is much more accurate. And so I can also extract those values and maybe use them as different graphical presets. All those things become possible, and I can explore them very quickly, by having those variables predefined and mapped into something that I can change in real time.

He continues with a point that could serve as a metaphor for the value of improving the scientific research process:

We have so much computational power now that we could put all of it into just making the game prettier. But you get diminishing returns on that. Think instead about spending 20% of the performance horsepower of the machine the game runs on to make it more real-time, with more hooks into the tools. That might be difficult to optimize, but if those hooks are in the game during its development and they shrink an artist's workflow from several minutes to several seconds, that's a couple orders of magnitude faster for the artist. So it's well worth the 20% performance hit you might take. You might sacrifice your polygon count or some of the extra complexity of the shaders, but if you put that into the tools instead, you're gonna get a drastically better game.

I've also been thinking about diversifying our input controls. A few months ago, I picked up some arduino-ready slide potentiometers (~$2.50 each) to get a feel for them. 

<slidepot.jpg>

It might be a fun exercise to build a recognizer to estimate the slider position via the camera, and compare that to the electronic output.
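
Something like this could be a starting point. An untested sketch assuming the slider cap is a saturated red blob, the track's pixel extent was measured by hand, and the board prints one raw 10-bit analogRead value (0-1023) per line over serial:

    import cv2
    import serial

    cap = cv2.VideoCapture(0)
    port = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)
    TRACK_X0, TRACK_X1 = 100, 540    # track endpoints in the camera frame

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))  # red-ish cap
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        line = port.readline().strip()
        if not (contours and line):
            continue
        # Center of the largest red blob, normalized along the track.
        x, _, w, _ = cv2.boundingRect(max(contours, key=cv2.contourArea))
        seen = (x + w / 2 - TRACK_X0) / (TRACK_X1 - TRACK_X0)    # camera, 0..1
        try:
            measured = int(line) / 1023                          # pot, 0..1
        except ValueError:                                       # partial read
            continue
        print(f"camera {seen:.3f}  vs  pot {measured:.3f}")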

I love the idea of a standard-issue midi controller to accompany the keyboard and knob-dials.
There's a whole universe of commercial controllers available, but many of them are surprisingly expensive.
For example, the Midimaker lineup has a simple, clean design but you're looking at $100–$175 each.

Several crowdfunded controller systems have nifty submodules that snap together magnetically:

To get wireless connectivity for devices with actual midi connectors, we might use midi-to-bluetooth adapters. For controllers with usb output, we might find or build little battery-powered wireless usb hubs; I haven't found any ideal product, but maybe something like this. We may want these anyway for other input devices, such as joysticks or giant knobs. (If we're building our own, I vote to use 2.4ghz rf receivers like our keyboards; pairing bluetooth devices often makes me question my sanity.) Finally, I recall Omar experimented with Raspberry Pis as dedicated USB-to-wifi adapters; maybe gum-pack-sized pi zero hubs could work.
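
For the pi idea, the bridge script could be tiny. Sketch only, assuming mido with the python-rtmidi backend on the pi and a made-up UDP listener address on the receiving side:

    import socket
    import mido

    LISTENER = ("192.168.1.50", 9000)       # hypothetical table-side machine
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    # Grab the first connected controller (a real setup would match by name),
    # then forward each message's raw midi bytes over UDP as it arrives.
    port_name = mido.get_input_names()[0]
    with mido.open_input(port_name) as port:
        for msg in port:                    # blocks until a message arrives
            sock.sendto(bytes(msg.bytes()), LISTENER)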