As CBS demonstrated with the Grammys: The Internet is not ready to compete with television for live events.

There was a significant number of complaints about the reliability and availability of the Grammys telecast last week. I dug a little and found the following section of a book I wrote back in 2013 that explains why this is the case. The full book is available for free on this blog at this link: 2020: The End of Television

The Internet, a general-purpose network
The Internet is in fact a series of interconnected networks with varying characteristics. It was not built to transport video signals but to carry general data. In fact, although it was theoretically possible to transport video over the Internet when it began to open to the public in the early 90s, the available connection speeds made it impractical.

Transporting data over the Internet is fairly unpredictable. Spikes in usage at any point between the server and the user may introduce unforeseen delays in the delivery of a data stream. This makes the transport of live video a very tricky operation.

The Internet is also capable of bidirectional communication. This makes it possible for end users to request content on demand and to be served without delay.

For the most part, video transport is done, even today, with large buffers of many seconds if not minutes. When you begin to play a video at home, the client player on your computer will try to download data faster than the playback rate and build a buffer that lets you experience the video smoothly even when delivery slows down. In some cases, if the network is too busy or the resolution too high for your connection, the buffer will empty and you will have to wait before the video resumes. While this is not a big problem when watching library content on a service like Netflix, it is more of a concern if you are watching a live event.
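
To make that concrete, here is a minimal sketch, in Python, of the buffering logic just described. All the rates, durations and thresholds are made-up numbers for illustration, not values from any real player:

```python
# Minimal sketch of client-side buffering: the player downloads faster than
# it plays back, so short network slowdowns drain the buffer instead of
# freezing the video. All numbers here are illustrative.

PLAYBACK_RATE = 1.0   # seconds of video consumed per second of wall clock

def simulate(download_rates, start_threshold=5.0):
    """download_rates: seconds of video fetched during each 1-second tick."""
    buffer_s = 0.0      # seconds of video currently buffered
    playing = False
    for tick, fetched in enumerate(download_rates):
        buffer_s += fetched
        if not playing and buffer_s >= start_threshold:
            playing = True  # enough margin built up; start playback
        if playing:
            if buffer_s >= PLAYBACK_RATE:
                buffer_s -= PLAYBACK_RATE
            else:
                playing = False  # buffer ran dry: the dreaded rebuffering pause
                print(f"t={tick}s: stalled, waiting to rebuffer")
    return buffer_s

# A connection that is mostly fast (1.5x real time) but briefly drops to 0.2x:
simulate([1.5] * 10 + [0.2] * 8 + [1.5] * 10)
```

Notice that the viewer only stalls when the slow period lasts longer than the buffer can absorb, which is exactly why a big buffer works for library content but is a poor fit for a live event, where a minute of buffer means watching a minute behind everyone else.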

For example, Apple presents some of its big announcement events live on the Internet. Those events draw a large audience, estimated at over a million, that wants to watch the announcements as they are made. But, as impressive as a million viewers might be, it is still far from the TV audience of an event like the Super Bowl, which was estimated at 108 million in 2013. Even so, the quality of the video streaming at these events is not comparable to HD broadcast television; it is far inferior, and still the experience is not a seamless one. I was watching such an event this week and not only was there a buffer of a few seconds, but the stream stopped several times.

Apart from the fact that the Internet was not built for video streaming, the reason it is still hard today to stream a massive event is that every user needs their own stream.

As opposed to a dedicated video delivery infrastructure, where a signal is sent over the whole chain regardless of whether anyone is watching, on the Internet it is sent only if someone requests it by initiating a connection. Each of those connections requires resources from the source. In a specialized delivery system, the source server carries a constant, relatively light load whether it is watched by 1 person or 100 million. But on the Internet, each viewer requires server resources. Today, it would be unthinkable for a single server to serve a million streams. Not only would that use a lot of computing power, but it would also need the connection speed. Let's imagine that 1 million connections are made simultaneously at 1 megabit per second (relatively low quality): the server would need to handle 1 million megabits, or 1,000 gigabits per second (1 terabit). Given that the most common Internet backbone connection today is what is called an OC-192, which operates at nearly 10 gigabits per second, it would take 100 of those connections to achieve that level of service.
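
The back-of-envelope arithmetic is easy to check:

```python
# Back-of-envelope check of the unicast numbers above.
viewers = 1_000_000
per_stream_mbps = 1            # a relatively low-quality stream
oc192_gbps = 9.95              # an OC-192 link runs at nearly 10 Gb/s

total_gbps = viewers * per_stream_mbps / 1000   # Mb/s -> Gb/s
print(total_gbps)                                # 1000 Gb/s, i.e. 1 Tb/s
print(total_gbps / oc192_gbps)                   # ~100 OC-192 links needed
```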

It is possible, but impractical and very expensive. For that reason, the organizers of those live events use Content Delivery Networks, or CDNs, like Akamai. These services typically receive the signal of the live event, distribute it to data centers spread around the world, and stream it to audience members from those sites. This has the advantage of cutting the long-range traffic over the Internet and spreading the load across various parts of the world. However, using a CDN for a live event is very expensive, and it would be prohibitive to use one today for an event as large as the Super Bowl.
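
The principle can be sketched in a few lines: the origin sends one copy of the live signal to each edge location, and every viewer is served from an edge near them. The regions and viewer counts below are, of course, invented:

```python
# Minimal sketch of why a CDN helps: the origin pushes ONE copy of the live
# feed to each edge location, and each edge fans it out to nearby viewers.
# Regions and viewer counts are illustrative.

edges = {"us-east": 400_000, "us-west": 250_000,
         "europe": 300_000, "asia": 50_000}

origin_streams = len(edges)          # 4 long-haul streams leave the source
viewer_streams = sum(edges.values()) # 1,000,000 streams, all served nearby

print(f"Streams leaving the origin: {origin_streams}")
print(f"Streams served from local edges: {viewer_streams}")
```

The long-haul problem is solved, but note that each edge still has to serve every one of its viewers individually, which is where the cost comes from.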

There is a solution to that problem that would bring the Internet to a level where it could compete with dedicated delivery systems. The technology is called "multicast" and it is a technique that is already available in the various pieces of equipment that carry Internet traffic, such as routers and switches. The reason multicast is not enabled on the Internet is that, if left open, it would flood the Internet with all sorts of traffic, making it less usable and more prone to cyber-attacks.

Multicast is a technique where a source sends a single signal to a specific "multicast address" and all the clients that have registered for that address receive it.

Multicast, along with network broadcast (a transmission sent to every host on the network), is widely used on local networks for applications such as automatic network configuration and discovery (DHCP and Apple's "Bonjour", for example) as well as media distribution and applications like multipoint videoconferencing.
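
For the curious, joining a multicast group on a local network is something any program can do with standard sockets. Here is a minimal Python receiver; the group address and port are arbitrary examples:

```python
import socket
import struct

# Minimal sketch of a multicast receiver using standard Python sockets.
GROUP = "239.1.1.1"   # an address in the administratively scoped multicast range
PORT = 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# "Register" interest in the group: the network, not the server, takes it
# from here. A single packet from the sender reaches every subscribed client.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(2048)
    print(f"received {len(data)} bytes of the shared stream from {sender}")
```

The sender side needs nothing more than a plain UDP sendto() aimed at the group address; it never learns, and never cares, how many receivers have joined. That is the whole point: the load on the source does not grow with the audience.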

However, there are significant challenges to expanding multicast to the Internet, especially on the configuration side, in order to allow efficient video delivery while preventing unwanted usage. A fairly new technique called "software-defined networking", or SDN, may provide the foundation of a solution to that problem. In fact, Cisco announced the availability and deployment of products with that feature in September 2013. With those types of technologies deployed throughout the Internet, the foundation would be there to support efficient, scalable live video.
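
To give a flavor of the SDN idea, here is a purely hypothetical sketch: a controller program that, when a viewer joins, tells a switch to replicate the stream onto one more port instead of having the origin serve one more unicast copy. None of these names correspond to a real controller API; this is just the shape of the idea:

```python
# Purely hypothetical sketch of the SDN idea: a central controller program
# decides where traffic goes and pushes forwarding rules into the switches.
# These classes and names are invented for illustration.

class Switch:
    """A toy switch: maps a destination address (e.g. a multicast group)
    to the list of ports that should receive a copy of each packet."""
    def __init__(self):
        self.rules = {}  # destination address -> output ports

def on_viewer_join(switch, viewer_port, group="239.1.1.1"):
    # When a viewer subscribes, the controller extends the rule so the
    # switch replicates the stream onto one more port, rather than the
    # origin server opening one more unicast connection.
    ports = switch.rules.setdefault(group, [])
    if viewer_port not in ports:
        ports.append(viewer_port)

sw = Switch()
for port in (1, 2, 3):
    on_viewer_join(sw, port)
print(sw.rules)  # {'239.1.1.1': [1, 2, 3]} -- one stream in, three copies out
```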

It would also require ISPs to collaborate with content owners, but we can well envision content providers and ISPs striking deals that are mutually beneficial. This would prevent ISPs' networks from being saturated and would lower the cost of content delivery for content providers.

While the technology does exist today to make the Internet more efficient for real-time video delivery, it will still take quite a few years before the various ISPs have acquired and deployed it. But with increasing demand, it is more than likely that those conditions will be in place by 2020.
