[LEAPSECS] Celebrating the new year a few seconds late

Jonathan E. Hardis jhardis at tcs.wap.org
Fri Jan 4 09:14:51 EST 2019



> On Jan 1, 2019, at 1:03 PM, Brooks Harris <brooks at edlmax.com> wrote:
> 
> Back in the days of analog TV (which is still used in some parts of the world) the broadcast TV signal was one of the most stable time sources around. This was necessary because the display of the signal on a CRT TV set depended critically on the timing of the components of the signal, the horizontal and vertical scan lines of each frame (actually two interlaced 'fields').
> 
> There were experiments at NIST in the early days of TV to use the TV signal as a time dissemination source. It worked well, as coordinated with the NIST radio time signals. But it didn't turn out to be a practical solution.

More specifically, the idea was to put a character code (like ASCII) in the VIR (vertical interval reference) portion of the signal that would carry the correct time.  There turned out to be little interest in the technology for this purpose, but an alternate application made it big: closed captioning.  For this, NIST won an Emmy Award.  https://www.nist.gov/node/774286  (Link inactive during lapse in appropriations.)
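
To make the scheme concrete, here is a minimal sketch of that kind of encoding: a UTC time-of-day packed as parity-protected ASCII bytes that could ride on a spare line in the vertical interval.  The framing (a start byte, HH:MM:SS text, odd parity per character) is made up for illustration; it is not the format NIST actually broadcast.

# Minimal sketch: pack a UTC time-of-day as ASCII bytes for a
# hypothetical vertical-interval data line.  The framing here (a 0x01
# start byte, "HH:MM:SS" text, odd parity per character) is made up
# for illustration and is NOT the format NIST actually broadcast.
from datetime import datetime, timezone

def with_odd_parity(char: str) -> int:
    """Return the 7-bit ASCII code with an odd-parity bit in the MSB."""
    code = ord(char) & 0x7F
    ones = bin(code).count("1")
    return code | (0x80 if ones % 2 == 0 else 0x00)

def encode_time_line(now: datetime) -> bytes:
    """Encode the current UTC time as a parity-protected ASCII payload."""
    text = now.strftime("%H:%M:%S")
    return bytes([0x01]) + bytes(with_odd_parity(c) for c in text)

if __name__ == "__main__":
    print(encode_time_line(datetime.now(timezone.utc)).hex(" "))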

> All that real-time behavior went away with the advent of digital video and digital TV broadcast. Everything is buffered in some manner, sometimes very briefly, as with parts of the picture scan lines, and sometimes for entire frames, often many frames. Signals now propagate through many processes, including compression and decompression in various formats, like MPEG-2, which is what we've mostly been watching for many years. New formats, including hi-def, etc., are constantly being adopted. With each of these stages there is some buffering and delay. And each facility and broadcaster has different equipment and procedures, so it's unlikely any two TV signals are synchronous by the time they get to the audience's screens. And each TV set has its own internal buffering, which adds more delay, and few of these products match each other in that respect. And, as noted, live broadcasts are often intentionally delayed. "Live" is not really live.
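
To put rough numbers on that, here is a back-of-the-envelope sketch that just adds up per-stage buffering through a digital broadcast chain.  The stage names and delays are hypothetical placeholders, not measurements of any real facility, but they show how quickly the offsets grow and why no two receivers land on the same instant.

# Back-of-the-envelope sketch: accumulate per-stage buffering through a
# digital broadcast chain.  Stage names and delays are hypothetical
# placeholders, not measurements of any real facility.
STAGES_MS = [
    ("studio frame synchronizer",        33),   # roughly one frame
    ("MPEG-2 encoder look-ahead",       500),
    ("multiplex / transport buffering",  200),
    ("transmitter processing",           100),
    ("receiver demod + FEC window",     2000),
    ("decoder + display buffering",      250),
]

total = 0
for name, delay in STAGES_MS:
    total += delay
    print(f"{name:34s} +{delay:5d} ms  (cumulative {total:5d} ms)")

print(f"\n'Live' picture lags the event by roughly {total / 1000:.1f} s in this example.")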

Digital broadcasts, both radio and TV, use forward error correction.  That is, the data is encoded in such a way that it’s transmitted at least twice and spread out over a time span of several seconds.  This prevents a momentary signal interruption (a bird, a highway bridge) from killing the data stream.  Of course, the user isn’t presented with the data until after all occurrences of it in the signal have passed.  Digital broadcast radio has a feature called “ballgame mode” where the data window is minimized to reduce latency for those listening to play-by-play while watching a live event.
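
As a toy illustration of why that data window costs latency, the sketch below spreads each block of bytes across many transmission slots with a simple block interleaver.  The receiver cannot rebuild a block until the last slot carrying any piece of it has arrived, so the minimum delay equals the interleaver span; a shallower interleaver (the “ballgame mode” idea) trades robustness for delay.  The parameters are arbitrary, and this is not the actual coding used by any broadcast standard.

# Toy block interleaver: write data into a matrix by rows, transmit by
# columns, so each block is spread across the whole span.  The receiver
# must wait for the entire span before it can de-interleave, which is
# where the seconds of latency come from.  Parameters are arbitrary and
# this is not the actual scheme used by any broadcast standard.
ROWS, COLS = 8, 16          # depth x width of the interleaver
SLOT_MS = 50                # hypothetical duration of one transmitted column

def interleave(block: bytes) -> list[bytes]:
    """Split a ROWS*COLS block into COLS column-wise slots for transmission."""
    assert len(block) == ROWS * COLS
    rows = [block[r * COLS:(r + 1) * COLS] for r in range(ROWS)]
    return [bytes(rows[r][c] for r in range(ROWS)) for c in range(COLS)]

def deinterleave(slots: list[bytes]) -> bytes:
    """Reassemble the original block once all COLS slots have arrived."""
    return bytes(slots[c][r] for r in range(ROWS) for c in range(COLS))

if __name__ == "__main__":
    block = bytes(range(ROWS * COLS))
    slots = interleave(block)
    assert deinterleave(slots) == block
    print(f"Receiver must buffer {len(slots)} slots "
          f"(~{len(slots) * SLOT_MS} ms) before the block can be recovered; "
          f"a shallower interleaver trades robustness for lower delay.")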

     - Jonathan


