We go through at least one router a year in my house. The wireless connection works reasonably well for a while, but eventually it gets finicky and we shell out some cash at the local Circuit City rather than face rebooting the router every five minutes. That’s for in-home data networking.
Now we have to solve the problem of in-home multimedia networking. I don’t mean “we” as in my household, but “we” as in the technology industry. When it comes to TV, consumers just expect it to work. Kind of like the dial tone.
Wired networking, rather than wireless, is clearly the better option for TV networking today. But even with that choice made, there are still different networking approaches to consider. For example, does it make more sense to center on one media hub in the home, or to use a distributed architecture where individual devices (in this case TVs or set-tops) each carry some of the weight and functionality of the network? If you think the answer is a single media hub, keep in mind that we're essentially talking about computers here, and even the best computers go on the fritz now and then. As we hang more and more things off the network (home networking products are a huge growth area), it only makes sense for added devices to handle some of the load rather than just act as dummy units.
Someone recently made a comment on a related topic (CableCARD PCs) that pertains quite nicely to this discussion: "If your hard drive goes down, you can't watch TV. That blows."