Beats and Patterns

WEFT should treat beats as their own coordinate space. Time is already a coordinate (@t), so beat-time is just a remapping of @t through the composition's bpm. A pattern backend could expose P.step and P.phase as signals derived from @t:

#backend pattern { step, phase } as P
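The derivation from @t is straightforward. A Python sketch of what the backend might compute (the function name `beat_coords` and the `steps_per_beat` parameter are my own illustration, not part of WEFT):

```python
import math

def beat_coords(t, bpm, steps_per_beat=4):
    """Remap absolute time t (seconds) into beat-time.
    Returns (step, phase): the integer step index and the
    fractional position in [0, 1) within that step."""
    beats = t * bpm / 60.0            # elapsed beats
    steps = beats * steps_per_beat    # elapsed steps
    step = math.floor(steps)
    phase = steps - step
    return step, phase

# At 120 bpm with 4 steps per beat, one step lasts 0.125 s,
# so t = 0.3125 s is halfway through step 2.
step, phase = beat_coords(0.3125, bpm=120)
```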

Then we could write a drum pattern as a tuple indexed by step:

kick  = (1,0,0,0,1,0,0,0).(P.step % 8)
snare = (0,0,1,0,0,0,1,0).(P.step % 8)
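The semantics here is plain modular tuple indexing. A Python sketch of the intended behavior (the helper `at` is mine, standing in for WEFT's `tuple.(index)` syntax):

```python
def at(pattern, index):
    """Index a tuple by step, wrapping like tup.(P.step % n)."""
    return pattern[index % len(pattern)]

kick  = (1, 0, 0, 0, 1, 0, 0, 0)
snare = (0, 0, 1, 0, 0, 0, 1, 0)

# One bar of kick hits, one value per step.
hits = [at(kick, s) for s in range(8)]
```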

This gets a little gross for anything longer than eight steps. But we could use sub-patterns: signals used as tuple elements. The outer tuple selects which sub-pattern is active, the inner tuples define them:

kick = (kickA, kickB, kickA, kickB).(P.step / 4 % 4)
  where {
    kickA = (1,0,0,0).(P.step % 4);
    kickB = (1,0,1,0).(P.step % 4);
  }
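Simulating the substitution in Python makes the two levels of indexing explicit (the function names are mine; the division and modulo arithmetic mirrors the WEFT snippet above, assuming integer division for `/`):

```python
kickA = (1, 0, 0, 0)
kickB = (1, 0, 1, 0)

def kick(step):
    """Outer tuple picks the active sub-pattern per 4-step bar (ABAB);
    the inner tuple supplies the hit within that bar."""
    bar = (step // 4) % 4
    sub = (kickA, kickB, kickA, kickB)[bar]
    return sub[step % 4]

# Sixteen steps: A, B, A, B.
bar_hits = [kick(s) for s in range(16)]
```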

This also makes it pretty nice to see the ABAB pattern. I don’t love having to put .(P.step / 4 % 4) at the top level, but maybe the pattern backend could expose native P.fourths, P.eighths, P.sixteenths, etc.

Because where is substitution, pattern transforms are just coordinate transforms on P.step:

kickRetro   = kick where { P.step = 15 - P.step; }     -- retrograde
kickShifted = kick where { P.step = P.step + 2; }      -- rotation
kickFast    = kick where { P.step = P.step * 2; }      -- double time
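In Python terms, each `where` clause just rewrites the step coordinate before the pattern lookup happens. A sketch of that reading (function names are mine):

```python
def kick(step):
    """The base 8-step kick pattern, indexed modulo its length."""
    return (1, 0, 0, 0, 1, 0, 0, 0)[step % 8]

# where { P.step = f(P.step) } substitutes the coordinate first:
def kick_retro(step):   return kick(15 - step)   # retrograde
def kick_shifted(step): return kick(step + 2)    # rotation by two steps
def kick_fast(step):    return kick(step * 2)    # double time
```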

Same mechanism as spatial transforms on @x, which I think is neat.

Sample playback

This is initial brainstorming on how sample playback could work, specifically keeping in mind the drum/beats use case.

Sample playback could be represented as coordinate remapping into the file’s time axis. P.phase sweeps [0,1) across the current step:

play = load("kick.wav", P.phase * stepDuration)
play = load("song.wav", @t)                       -- free running
play = load("song.wav", P.step * stepDuration)    -- beat synced
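One way to read `load` is as a function from a time coordinate to a sample value: it indexes into the decoded buffer at `t * sampleRate`. A minimal Python sketch of that interpretation, using a tiny in-memory list as a stand-in for the decoded wav (the `load` signature, `sample_rate` parameter, and out-of-range behavior are all my assumptions):

```python
def load(samples, sample_rate):
    """Return a signal: time (seconds) -> sample value.
    Out-of-range times read as silence."""
    def sig(t):
        i = int(t * sample_rate)
        return samples[i] if 0 <= i < len(samples) else 0.0
    return sig

buf = [0.0, 0.5, 1.0, 0.5]           # stand-in for kick.wav
kick_sig = load(buf, sample_rate=4)  # 1 s of audio at 4 Hz, for the demo

step_duration = 0.5
# P.phase in [0, 1) sweeps the first step_duration seconds each step:
value = kick_sig(0.25 * step_duration)  # phase 0.25 -> t = 0.125 s -> buf[0]
```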

Open questions