Mike Gunter said:
The defective pixels lie in the video processing seen in the LCD.
Remove the word "processing" and we're all on the same page.
Mike Gunter said:
I don't buy the "hot pixels because it's a 'High Definition' sensor" argument at all (with the exception that overexposure will reveal noise): the video output isn't anywhere close to the sensor's full pixel count. The content is processed and resampled downward.
As others have mentioned, there is a very low chance Nikon reads every line of the sensor in video mode. The "jello effect" typical of CMOS sensors is caused by slow line-read speeds, so reading every line when only 1080 lines are needed is very unlikely. Set aside for a moment that downsampling is computationally expensive; reading every line is nearly impossible on current hardware because the buffer simply isn't large enough. This is evidenced by the fact that there is a slowdown at all when shooting high-speed continuous. Obviously the mechanical limitations of the shutter and mirror determine the maximum frame rate, but the fact that one can fill the buffer and see the frame rate drop is proof, not mere evidence, that the RAM buffer has filled. If Nikon were reading the entire sensor in video mode at 23.976 fps, the RAM buffer would need to be orders of magnitude larger than it is, and one could shoot 6 fps stills forever.
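A rough back-of-envelope calculation illustrates the point. The sensor resolution, bit depth, and frame rates below are assumed figures for illustration (the post doesn't name a specific body), not specs of any particular Nikon camera:

```python
# Back-of-envelope: data rate of a hypothetical full-sensor readout at
# video frame rates. All sensor figures here are assumptions.

SENSOR_PIXELS = 12_000_000   # assumed 12 MP sensor
BITS_PER_PIXEL = 14          # typical raw bit depth
VIDEO_FPS = 23.976           # video frame rate
BURST_FPS = 6                # high-speed continuous stills

bytes_per_frame = SENSOR_PIXELS * BITS_PER_PIXEL / 8
video_rate_mb_s = bytes_per_frame * VIDEO_FPS / 1_000_000
burst_rate_mb_s = bytes_per_frame * BURST_FPS / 1_000_000

print(f"Per-frame raw size:      {bytes_per_frame / 1_000_000:.1f} MB")
print(f"Full-sensor video rate:  {video_rate_mb_s:.0f} MB/s")
print(f"6 fps burst rate:        {burst_rate_mb_s:.0f} MB/s")
```

The burst rate is roughly a quarter of the hypothetical video readout rate, yet the RAM buffer still fills within seconds when shooting bursts; a sustained full-sensor readout at 24 fps would demand a far larger buffer or faster pipeline than that behavior suggests exists.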
The other argument is that in video mode the sensor's duty cycle is significantly higher than in any normal still-photo mode. As has been said, these are "hot" pixels, not stuck pixels. Video implies at least a 50% sensor duty cycle (24 fps at 1/48th of a second) even if Nikon is able to fully power off the sensor and amplifiers between frames. There is likely a power-up and power-down lag of roughly 2 ms per frame for such a large die, so call it a 55% duty cycle (0.02283 seconds of sensor power per frame versus 0.02083 seconds of exposure, out of a 0.04167-second frame period).
In high-speed still shooting, let's assume a worst case: you're shooting in the dark, trying to sneak in one non-blurred shot at 1/20th of a second. That's 0.050 seconds of sensor activity per shot, plus 0.002 seconds for power-up and power-down delays, so 0.052 seconds of powered sensor per frame; times 6 fps, that's 0.312 seconds of sensor power per second, a 31% duty cycle. So to get the sensor as hot as recording video does, one would need to shoot high-speed continuous at 6 fps and 1/10th of a second, forever and ever. But you can't: the buffer fills. This is why hot pixels are exacerbated in video compared to stills.
Mike Gunter said:
I had wondered if the content was 'passed through' the LCD. Can one, using a switch, record the display data?
The LCD in most consumer electronics is fed by its own framebuffer. No comprehensive teardown of Nikon cameras exists, so I can't say which chip they use as the LCD controller, but there is no reason to expect them to have reinvented the wheel.
While one could conceivably fork the data path post-framebuffer, it would take a custom piece of hardware, gaining nothing while costing a pretty penny. The way dual outputs are normally done (iPods, portable DVD players, etc.) is to fork the video data before the LCD controller (which is where the framebuffer I mentioned is physically located).
It gains nothing because the data in the LCD controller's framebuffer is (1) uncompressed, (2) aligned for the data bus to the LCD, and (3) already in the colorspace the display itself requires. Since the LCDs on consumer electronics are not high-gamut displays, the colorspace has been decimated, and there isn't enough memory in that framebuffer to hold the full colorspace anyway. The buffer expects aligned inputs and outputs because latency is key; otherwise one would get tearing and all sorts of other nasty visible artifacts on the display itself.
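A quick size comparison shows why the framebuffer holds only decimated colorspace data. The 640x480 panel resolution and the RGB565 format are assumptions typical of consumer-electronics rear LCDs, not documented Nikon specs:

```python
# Framebuffer size sketch: decimated (RGB565) vs. high-gamut color.
# Panel resolution and pixel formats are illustrative assumptions.

LCD_W, LCD_H = 640, 480

rgb565_bytes = LCD_W * LCD_H * 2   # 16 bits/pixel, bus-aligned
rgb30_bytes = LCD_W * LCD_H * 4    # 30-bit color padded to 32 bits

print(f"RGB565 framebuffer:           {rgb565_bytes // 1024} KiB")
print(f"High-gamut (padded) buffer:   {rgb30_bytes // 1024} KiB")
```

Halving the per-pixel footprint like this is exactly the kind of saving that makes a small, latency-critical framebuffer practical, at the cost of any hope of recovering full-gamut data from it.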
PS - I'll think some more about this and try to ponder a reason Nikon would have taken an unconventional approach. It clearly IS possible, but for the reasons I mentioned above I see no reason to believe they did. Everything we see is explainable pretty robustly without exotic action on Nikon's part.