Originally Posted By: kc2ixe
In digital, you drop ONE bit, you have no reception. It's wonderful when it works, and you can get every bit in each digital packet - but with analog, if you lose the same amount of data, you don't lose signal, you just get a noisy signal


And this is the main flaw in the whole DTV scheme - it's not fault-tolerant like TCP/IP.

Over-the-air broadcast has no good way to deal with packet loss the way TCP/IP does. TCP can not only re-send lost packets, it can route them over multiple paths (even multiple network connections), reassemble them in order, and buffer enough of the stream for smooth playback. I've read up on the ATSC standard (and QAM and the rest of the jargon associated with DTV), and I'm astonished that a specification this poorly engineered was ever adopted as a standard. From a data-delivery standpoint, and considering real-world problems like multipath reflections and intermittent signal loss from leaf coverage, weather patterns and so forth, it's no wonder I'm at, and often off, the edge of the "digital cliff".
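The "digital cliff" behavior is easy to picture with a toy model. Below is a sketch (not ATSC math; the threshold and scaling numbers are made up for illustration) of the difference: analog picture quality slides down gradually with signal-to-noise ratio, while a digital decoder's forward error correction masks problems completely until a threshold, then delivers nothing at all.

```python
# Toy model of graceful degradation vs. the digital cliff.
# All numbers are illustrative, not real 8-VSB receiver thresholds.

def analog_quality(snr_db):
    """Analog picture: gets noisier as SNR drops, but stays watchable.
    Maps SNR linearly onto a 0..1 quality scale, clamped at the ends."""
    return max(0.0, min(1.0, snr_db / 40.0))

def digital_quality(snr_db, threshold_db=15.0):
    """Digital picture: error correction hides degradation entirely
    above the threshold; below it, the decoder produces nothing."""
    return 1.0 if snr_db >= threshold_db else 0.0

# A fading signal: analog degrades step by step, digital is all-or-nothing.
for snr in (40, 25, 16, 14, 5):
    print(f"SNR {snr:2d} dB: analog={analog_quality(snr):.2f} "
          f"digital={digital_quality(snr):.1f}")
```

At 16 dB the digital set shows a perfect picture while the analog one is snowy; at 14 dB the analog set still shows something usable and the digital set shows nothing. That discontinuity is exactly the cliff.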
This is now the second time I've watched a digital "upgrade" from analog turn out to be a downgrade for end users. The problems I face as a firefighter with our APCO-25 digital radios mirror the ATSC issues: random, unexplained outages, and failure to work where the old analog system did. At least in the fire service we've been given the ability (in fact a MANDATE) to NOT use the digital system for fireground communications, since it's deemed too unreliable for on-scene life safety. With television broadcasts, I won't have that option.

I wonder what hurricane season 2009 will be like when the cable TV wires go down, people try to get local information over the air, and they find that the old TV in the garage that worked last year doesn't work anymore.

My bet: You'll see a hasty "temporary reactivation" of the NTSC Analog facilities before 2009 is out.