The Nyquist sampling theorem is a cornerstone of analog-to-digital conversion. It posits that to faithfully preserve an analog signal when converting it to digital, you have to sample at a rate at least twice the highest frequency you want to keep. This is part of why 44.1 kHz is considered high-quality audio: even though the mic capturing the sound vibrates faster than that, sampling it a bit over 40,000 times a second produces a signal that to us is indistinguishable from one with infinite resolution, since the bandwidth of our hearing peaks at about 20 kHz at best.
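A quick toy example of what that limit does (my numbers, just for illustration): a tone below half the sampling rate comes through intact, while one above it folds back down into the audible range as an alias.

```python
# Toy numbers, nothing to do with the camera: a 15 kHz tone (below half the
# 44.1 kHz sampling rate) survives sampling intact, while a 25 kHz tone folds
# back down and shows up as a 19.1 kHz alias.
import numpy as np

fs = 44_100                  # samples per second
t = np.arange(fs) / fs       # one second of sample instants

for f_in in (15_000, 25_000):
    x = np.sin(2 * np.pi * f_in * t)
    spectrum = np.abs(np.fft.rfft(x))
    f_out = np.argmax(spectrum) * fs / len(x)   # dominant frequency in the sampled signal
    print(f"{f_in} Hz tone -> sampled signal peaks at {f_out:.0f} Hz")
# 15000 Hz tone -> sampled signal peaks at 15000 Hz
# 25000 Hz tone -> sampled signal peaks at 19100 Hz  (44100 - 25000, an alias)
```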
I’m no engineer, just a partially informed enthusiast. But this picture of the water moving somehow illustrates the Nyquist theorem to me: how the perception of speed varies with distance, and how distance somehow makes things look clear. The scanner blade samples at about 30 Hz as it sweeps across the horizon.
Scanned left to right in about 20 seconds. The view from a floating pier across an undramatic patch of the Oslo fjord.
*edit: I swapped the direction of the scan in OP
In addition to scanning left to right, your scan camera must also have some readout delay within each column, i.e. the pixels at a given horizontal location are read out top to bottom (or bottom to top?) rather than all at once.
Excellent photo, as always. Water is an excellent subject for this type of camera. I wonder what a busy road at night would look like.
As for a busy road at night, I’m afraid that with this way of doing scanning photography it would be quite uninteresting. The motion is way too quick, and cars would render as either weird vertical lines or very small diagonal squiggles, depending on which direction you scan.
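To put rough numbers on why the traces come out so compressed (speeds and sizes below are invented, and this ignores the vertical dimension entirely):

```python
# Back-of-the-envelope numbers (all invented) for how wide a car's trace would
# be in the final image. The horizontal axis of a slit scan is time, so a car
# only appears in the columns recorded while it overlaps the slit, and the
# trace lasts roughly car_width / |car_speed - slit_speed| seconds of scan.
slit_speed = 1.0      # m/s the slit position sweeps across the scene (assumed)
car_width = 4.0       # m (assumed)
scan_rate = 30        # columns per second (from the post)

for car_speed in (15.0, -15.0, 1.4):   # with the sweep, against it, nearly pacing it
    seconds_on_slit = car_width / abs(car_speed - slit_speed)
    print(f"car at {car_speed:+5.1f} m/s -> about {seconds_on_slit * scan_rate:.0f} columns wide")
# +15.0 m/s -> ~9 columns, -15.0 m/s -> ~8 columns, +1.4 m/s -> ~300 columns
# (half of a 20 s, 600-column scan): fast traffic collapses into slivers, while
# a car pacing the sweep smears across half the frame.
```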
I suggest looking up some talks by the Hungarian photographer Adam Magyar, he’s done some great talks on transchronological (?) photography, and is a great artist himself.
You might be surprised, especially if you find a busy multi-lane road. LED lights on cars are generally PWM-dimmed, so your camera will pick up the strobing. Add in strobes from multiple vehicles and it might even get interesting.
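Back-of-the-envelope sketch of what I mean (the PWM frequency and duty cycle are just guesses): the scanner grabs one column every 1/30 s, so which columns catch the LED lit depends on where in the PWM cycle each grab lands.

```python
# Toy sketch, not measured on any real car: assume the tail light is PWM-dimmed
# at 200 Hz with a 25% duty cycle, and the scanner samples one column every
# 1/30 s. Whether a column catches the LED lit depends on where in the PWM
# cycle that column's sample instant lands.
import numpy as np

scan_rate = 30.0    # columns per second (from the post)
pwm_freq = 200.0    # Hz, assumed LED dimming frequency
duty = 0.25         # assumed fraction of each PWM cycle the LED is on

cols = np.arange(120)             # four seconds' worth of columns
t = cols / scan_rate              # sample instant for each column
phase = (t * pwm_freq) % 1.0      # position within the PWM cycle at that instant
lit = phase < duty                # LED caught in its "on" part of the cycle?

# One character per column: '#' where the light was caught lit, '.' where dark
print("".join("#" if on else "." for on in lit))
```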
Thanks for giving me a name to go down a rabbit hole with!
I haven’t really considered that. I’m assuming the (in this case) vertical sampling is ‘global’, as in the values at each sensor site are latched at the same time and then read out over the serial bus.
If there were a delay, stuff like fluorescent lighting would read as a moiré pattern, but I’ve only ever encountered streaking/linear distortion in those circumstances.
I think the ‘griddiness’, or general sense of direction in the water, is purely a function of how water moves and not a result of readout delay.
I’d love to be proven wrong, though, so if there’s some experiment I can do to determine it either way, I’m all ears.
I’m only guessing about sequential sensor readout. It seems like cost would incentivize sequential readout, but then again that would make it hard for the scanner to move horizontally in a continuous sweep. You could try photographing something lit by a strobe with crisp edges (e.g. not an incandescent source, which fades in and out gradually as it blinks).
And you’re totally right, the effect in the water could just come from sampling vertical slices of the scene at a fairly fixed time interval. It just seems a bit too… “nice”? It is a very cool effect, though.
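If you want to dry-run that before rigging up a strobe, here’s a toy simulation of the two readout hypotheses (the 100 Hz flicker, 0.2 ms per-row delay and frame size are all made up): with a global column latch every column comes out uniformly lit or dark, while sequential row readout lets the on/off boundary wander down each column.

```python
# Toy version of the experiment: image a light flickering at 100 Hz (e.g. a
# fluorescent tube on 50 Hz mains) with a 30 Hz column scan, once assuming the
# whole column is latched at one instant and once assuming each row in the
# column is read out slightly later than the one above it (made-up 0.2 ms per
# row). Sequential readout should make the on/off boundary wander down the
# columns (moire-like bands); a global latch only gives whole lit/dark columns.
import numpy as np

scan_rate = 30.0       # columns per second (from the post)
flicker = 100.0        # Hz, light flicker rate (assumed)
n_cols, n_rows = 60, 40
row_delay = 0.0002     # seconds between consecutive rows (assumed, rolling case)

def frame(per_row_delay):
    col_times = np.arange(n_cols) / scan_rate        # when each column starts
    row_times = np.arange(n_rows) * per_row_delay    # extra delay per row
    t = col_times[None, :] + row_times[:, None]      # (rows, cols) sample times
    return np.sin(2 * np.pi * flicker * t) > 0       # True where the light is "on"

for label, delay in [("global column latch", 0.0), ("sequential row readout", row_delay)]:
    print(label)
    img = frame(delay)
    for row in img[::8]:                             # crude ASCII render, every 8th row
        print("".join("#" if px else "." for px in row))
    print()
```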