View Issue Details
|ID|Category|Date Submitted|Last Update|
|---|---|---|---|
|0007768|features|2019-06-22 02:10|2019-11-04 07:59|

|Platform|OS|OS Version|Fixed in Version|
|---|---|---|---|
|Some Other Linux|Some Other Linux|unknown|(none)|

**Summary:** 0007768: MIDI warp: linearly stretch the timing of MIDI events in a region
**Description:**

# Linear MIDI warping
In Ableton Live you can "warp" audio -- you might take a 2 second clip of sound, and stretch it so that what used to be the first half of the clip occupies the first 90%, and what used to be the second half occupies the remaining 10%.
Doing that with audio seems hard, and I'm not asking for it. But warping MIDI in the same way is dead simple, at least mathematically (see below). Surprisingly, Ableton does not offer it.
This feature would let the user add "warp marks" at points on the MIDI timeline and then drag them around. Dragging a warp mark linearly stretches the events around it while keeping the other warp marks fixed.
## One reason you might want to do this
Suppose you record without a metronome, and your left hand always slams down a particular chord X at the start of the beat. If you stuck a warp mark at the start of each X, you could then drag those marks around to change the tempo of what you played locally, while preserving the rhythm.
Among other possibilities, this lets you add a click track after you've recorded without one. That's valuable because many players find it easier to be more creative without a click track.
## The math
Suppose the user has drawn three warp marks, at times `a`, `b0` and `c`, where `a < b0 < c`. Suppose the user then drags the middle marker from the old time `b0` to the new time `b1`.
No MIDI events before `a` or after `c` will be affected by the move. (For warp purposes, the start and end of a note are treated as two separate MIDI events.)
Consider a MIDI event which, before `b0` is warped to `b1`, occurs at time `t0`, where `a < t0 < b0`. After warping, this event will occur at time `t1`, where

```
t1 = (t0 - a) * (b1 - a) / (b0 - a) + a
```
Similarly, if `b0 < t0 < c`, then after `b0` is warped to `b1`, an event which used to occur at time `t0` will now occur at time `t1`, where

```
t1 = c - (c - t0) * (c - b1) / (c - b0)
```
Those formulas work regardless of whether the middle marker is dragged forward or backward.
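The two formulas can be combined into a single piecewise function. Here is a minimal Python sketch (the function name and signature are my own, purely for illustration; this is not Ardour code):

```python
def warp(t0, a, b0, b1, c):
    """Return the warped time of an event at t0 after the middle warp
    mark is dragged from b0 to b1, with fixed marks at a and c,
    where a < b0 < c and a < b1 < c."""
    if t0 <= a or t0 >= c:
        # Events outside [a, c] are unaffected by the move.
        return t0
    if t0 < b0:
        # Left segment: [a, b0] is stretched onto [a, b1].
        return (t0 - a) * (b1 - a) / (b0 - a) + a
    # Right segment: [b0, c] is stretched onto [b1, c].
    return c - (c - t0) * (c - b1) / (c - b0)
```

For example, with marks at `a = 0`, `b0 = 2`, `c = 4`, dragging the middle mark to `b1 = 3` maps an event at `t0 = 1` to `t1 = 1.5`, an event at exactly `b0 = 2` to `b1 = 3`, and leaves events outside `[0, 4]` untouched.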
**Additional Information:** If my description is not clear, I could try making some animations and uploading them to YouTube.
---

Thanks for the detailed suggestion. The description is clear, but it is not something that can be implemented in the near future.
The problem here is at a different level: "t" in your case is just some abstract continuous time unit.
Ardour's engine currently uses a rational value for audio time (when events are actually played to the soundcard): N / sample-rate. Music-time (bars/beats/ticks) and the tempo-map, on the other hand, currently use double-precision floating-point values (mainly for the benefit of exponential tempo-ramps).
GUI edits can happen in independent music-time units (e.g. move an event one bar forward and then back again) and should be lossless.
The problem is that there is no bijective mapping between the different time domains, mainly due to rounding issues.
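To illustrate the rounding problem (with made-up numbers and conversion functions, not Ardour's actual representation): converting a musical position to an integer sample index and back does not, in general, recover the original position.

```python
SAMPLE_RATE = 48000
BPM = 133.33  # not exactly representable in binary floating point

def beat_to_sample(beat):
    # Playback needs an integer sample index, so we must round.
    return round(beat * 60.0 / BPM * SAMPLE_RATE)

def sample_to_beat(sample):
    return sample / SAMPLE_RATE * BPM / 60.0

beat = 1.0 / 3.0
roundtrip = sample_to_beat(beat_to_sample(beat))
# roundtrip differs from beat: the conversion is not bijective.
```

Because edits may bounce between time domains many times, such errors would accumulate unless all domains share one exact, high-precision time base.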
There are plans to move to a common high-precision rational time base ("superclock") to return to some sane state of linear algebra. There are also some thoughts on how to represent music-time: https://ardour.org/timing.html but this is a deep change and will likely not happen before Ardour 7.