I seem to be working through my time at RC in reverse chronological order. This is a piece I built for the last week of my batch proper -

What is it? The most relevant part is that it's mostly shaders, creating roughly four different flavors/frequencies of noise, mostly simplex. The background is a displaced grid of rhombuses, and I was surprised to find that simplex noise has weird discontinuities if you quantize its inputs in the x and y dimensions, so I used a sum of sines instead, per Bruce Hill's hill-noise.
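For the curious, a sum-of-sines noise in the spirit of hill-noise looks roughly like the sketch below. This is a simplified illustration, not the code from the piece (and not the full distribution-corrected version from Bruce Hill's post); the octave count, frequencies, and names are all placeholders.

```typescript
// A minimal sum-of-sines ("hill noise"-style) 2D noise sketch.
type Wave = { dirX: number; dirY: number; freq: number; phase: number; amp: number };

function makeWaves(octaves: number, rand: () => number): Wave[] {
  const waves: Wave[] = [];
  for (let i = 0; i < octaves; i++) {
    const angle = rand() * Math.PI * 2;      // random orientation per wave
    waves.push({
      dirX: Math.cos(angle),
      dirY: Math.sin(angle),
      freq: 0.01 * Math.pow(2, i),           // frequency doubles each octave
      phase: rand() * Math.PI * 2,
      amp: 1 / Math.pow(2, i),               // amplitude halves each octave
    });
  }
  return waves;
}

function hillNoise(x: number, y: number, waves: Wave[]): number {
  let sum = 0;
  let ampTotal = 0;
  for (const w of waves) {
    // Each term is a smooth sine evaluated along the wave's direction,
    // so neighboring samples vary gently even when (x, y) is snapped to
    // a coarse grid of rhombus centers.
    sum += w.amp * Math.sin((x * w.dirX + y * w.dirY) * w.freq + w.phase);
    ampTotal += w.amp;
  }
  return sum / ampTotal; // roughly in [-1, 1]
}
```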

The code is really messy, but it's here. The border is composed of high-frequency noise that's also added, at a lower amplitude, to the whole image. I find that it gives the visuals a "metallic" physicality that I really like; without it they look "plasticky" - someone told me it reminded them of Austin Powers.
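The grain idea itself is simple; something like the sketch below, though in the real piece it happens in the fragment shader, and `highFreqNoise` and the 0.05 amplitude are made-up placeholders.

```typescript
// Add low-amplitude, high-frequency noise to an already-computed RGB color.
function addGrain(
  color: [number, number, number],
  x: number,
  y: number,
  highFreqNoise: (x: number, y: number) => number, // assumed to return values in [-1, 1]
): [number, number, number] {
  const grain = highFreqNoise(x * 8.0, y * 8.0) * 0.05; // high frequency, tiny amplitude
  const clamp = (v: number) => Math.min(1, Math.max(0, v));
  return [clamp(color[0] + grain), clamp(color[1] + grain), clamp(color[2] + grain)];
}
```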

The foreground hexagon is composed of canvas line drawing: I iterate over the x axis of the screen one pixel at a time, decide whether to draw a vertical line and how long it should be, and draw it to a canvas. I then take that canvas, display it with more noise applied, and use that texture to look up which of two gradients to use for the final display step.
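The line-drawing pass looks roughly like this sketch. The `lineLength` function and the centered-line layout are assumptions on my part, not the exact code; the resulting canvas is what gets handed to WebGL as a lookup texture.

```typescript
// Draw one vertical line per x pixel into an offscreen canvas.
function drawLines(canvas: HTMLCanvasElement, lineLength: (x: number) => number): void {
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.strokeStyle = "white"; // later, white vs. black picks between the two gradients
  for (let x = 0; x < canvas.width; x++) {
    const len = lineLength(x);
    if (len <= 0) continue;              // some columns get skipped entirely
    const mid = canvas.height / 2;
    ctx.beginPath();
    ctx.moveTo(x + 0.5, mid - len / 2);  // +0.5 keeps a 1px line crisp
    ctx.lineTo(x + 0.5, mid + len / 2);
    ctx.stroke();
  }
}
```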

I’ve been reading Ramsey Nasser’s post on his video synth, DiMattina, and on the Rutt-Etra video synthesizer that inspired it, and realizing that this technique of displacing and coloring line drawings with WebGL has a huge amount of depth to it.

There is a (probably very buggy/unpolished) version of this online if someone wants to play with it or try a few different color palettes: https://whaling.dev/blue_noises/

The IG version of this has music from an ORCA patch similar to lissajous_two. I’ll probably write about my ORCA technique on its own soon; I think it’s outside the scope of this post.

I’ve “performed” this piece twice live, once at RC’s Demo-palooza, and once at a livecode.nyc meetup. A few notes:

  1. This one looks really good on a big screen.
  2. When I perform this, I find myself very slowly moving through a pretty large hyperparameter space and changing colors periodically as I go.
  3. People suggested a live Twitch stream or some other way to deliver a longer-form video, which would be a new thing for me and an interesting problem to solve.
  4. Someone at the livecode meetup even suggested a “Twitch Plays Pokémon” style of interactivity, which is an even more fascinating engineering problem haha