Dance Card is Full

It takes two to tango, and lots of people for a line dance.

Yes, as the rest of the Web has noticed, Apple has just proudly touted the fact that it’s streaming its own press event in a format only people with the latest Apple devices can actually watch. Even Mac site TUAW, gearing up for today’s press event, thinks it’s pretty odd. But let’s skip straight to the good stuff: what’s this HTTP live streaming, anyway? The short answer is, it’s something cool – but it’ll be far cooler if Apple can acquire some friends doing the same thing.

Apple PR has this to say about their stream:

Apple® will broadcast its September 1 event online using Apple’s industry-leading HTTP Live Streaming, which is based on open standards.

(Update – it may also help if you have a $1 billion server farm, as that could be the reason Apple is doing this at all. I’m, uh, still holding out for some magical nginx module, myself, but okay. How many billions would Apple have needed to reach more than Mac and iOS devices?)

Note that they never actually claim HTTP Live Streaming is a standard, because it isn’t. Apple has proposed it to the Internet Engineering Task Force, but it hasn’t been accepted yet. Meanwhile, as we’ve learned painfully in the case of ISO-certified H.264 / AVC, just having a standard accepted is far from the end of the story – standards on paper aren’t the same as standards in use. Ironically, all Apple presumably means by calling HTTP Live Streaming “industry-leading” is that they’ve done it first, and no one else has.

Apple can claim, correctly, that HTTP Live Streaming is “based on Internet standards.” In lay terms, you take a video, chop it up into bits, and re-assemble it at the other end. While common in proprietary streaming server software (think Flash), that hasn’t been something you can do simply with an encoder, a server, and a standard client. As Ars Technica explains, one key advantage of Apple’s approach is that by using larger slices or buffers – at the expense of low latency – you can count on higher reliability than real-time streams. And unlike previous approaches, the use of HTTP means you don’t have to worry about which ports are open. So you get something that’s reliable, easy to implement, and doesn’t require pricey additional software.

Other than that, it’s all basic stuff, meaning implementations should be easy to accomplish, software stays lightweight, and lots of clients could easily add support on a broad variety of desktop and mobile platforms. Here are the basic ingredients:

  • MPEG-2 Transport stream, set by the encoder.
  • Video encoding – Apple’s proposal suggests only that you use something the client can support, so while Apple requires H.264 video and HE-AAC audio for their implementation, you could also use, say, VP8 video and Ogg Vorbis audio; you just have to hope the client supports the same formats.
  • Stream segmenter – this is the part that actually chops up the video.
  • Media segment files – the output of the segmenter: video .ts files (MPEG-2 transport stream chunks), indexed by – as Apple observes in their developer documentation – a standard M3U playlist (M3U8 when UTF-8), just as you may be accustomed to using with Internet radio stations and the like.
  • The client reads the result, by reading the standard playlist file. That’s the reason multi-platform, open source player VLC can read Apple’s stream.
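As a concrete illustration, here’s roughly what one of those playlist files looks like – a short live playlist pointing at three segment files. The URLs and durations here are made up; the tags themselves (#EXTM3U, #EXT-X-TARGETDURATION, #EXT-X-MEDIA-SEQUENCE, #EXTINF) come from Apple’s draft:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
http://example.com/stream/segment0.ts
#EXTINF:10,
http://example.com/stream/segment1.ts
#EXTINF:10,
http://example.com/stream/segment2.ts
```

For a live stream, the client simply re-fetches this playlist periodically and picks up whatever new segments have appeared; a finished (on-demand) playlist ends with an #EXT-X-ENDLIST tag instead.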

It all makes perfect sense, and it’s actually a bit odd that it hasn’t been done sooner in this way. For the record, just streaming video over HTTP doesn’t cut it; you need exactly the kind of implementation Apple is proposing. The proposal is so simple, I’d be surprised if someone hadn’t implemented something similar under a different name, but then, I can’t personally find a case of that. Sometimes, technologists overlook just these kinds of simple, elegant solutions.
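To make that concrete, here’s a minimal sketch – in Python, my choice; nothing Apple-specific about it – of the core of what a client has to do: parse the playlist into an ordered list of segment URIs, which it would then fetch and hand to a decoder. The playlist text is inlined rather than fetched over HTTP, and the example URLs are hypothetical; the tag names are from Apple’s draft:

```python
# Sketch of the client side of HTTP Live Streaming: parse an extended
# M3U playlist into (duration, uri) pairs, in order. A real client
# would fetch the playlist over HTTP, download each segment, feed it
# to an MPEG-2 TS demuxer, and re-poll the playlist for live streams.

def parse_playlist(text):
    """Return a list of (duration_seconds, uri) segment entries."""
    if not text.lstrip().startswith("#EXTM3U"):
        raise ValueError("not an extended M3U playlist")
    segments = []
    duration = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # Tag format: "#EXTINF:<duration>,<optional title>"
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            # Any non-blank, non-tag line is a segment URI.
            segments.append((duration, line))
            duration = None
    return segments

EXAMPLE = """#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
http://example.com/seg0.ts
#EXTINF:10,
http://example.com/seg1.ts
#EXT-X-ENDLIST
"""

if __name__ == "__main__":
    for dur, uri in parse_playlist(EXAMPLE):
        print(dur, uri)
```

That’s essentially all the “streaming protocol” amounts to on the wire: plain HTTP GETs of small files, which is exactly why it sails through firewalls that block RTSP and friends.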

All of this raises an obvious question: why is Apple crowing about how cool it is that only they are using it? (“Look at me! I’m the only one on the dance floor!”) I suppose the message is supposed to be that other people should join, but that leads to the second question: where are the implementations?

There’s no reason HTTP Live Streaming couldn’t see free encoding tools on every platform, and even more ubiquitous client tools. John Nack of Adobe muses that it’d be nice to see it in Flash. Browsers could work as clients via the video tag, as Safari does now. VLC appears to work as a client already.
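On the browser side, the client story really is that simple – in Safari, pointing the video tag at a playlist is all it takes. Something like this, with a hypothetical URL (other browsers would need to add playlist support before this works for them):

```
<video src="http://example.com/stream/index.m3u8" controls>
  Sorry, your browser can’t play this stream.
</video>
```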

One likely missing piece there is the encoder. In their FAQ from the developer documentation, Apple lists two encoders they’ve tested:

  • Inlet Technologies Spinnaker 7000
  • Envivio 4Caster C4

Those encoders are currently used for streaming to iPhones specifically, but they’re not exactly household stuff.

Client implementations shouldn’t be that hard. But that brings us to a climate in the tech world that, for all the progress on open standards, could still use some improvement.

Making interoperable technologies work requires building partnerships. Apple hasn’t exactly been focused on building bridges lately, it seems. Nor are they alone; today’s lawsuit-heavy, savagely competitive, politically-charged tech environment seems to have everyone at each other’s throats. I’m all for competition. Friendly competition can even help standards implementation: witness the near-daily improvements to rival browsers Safari, Firefox, Chrome, and others, each made by engineers who share a common interest in getting these innovations working compatibly in the near term, within a standard framework. A little one-upmanship on getting those things done first or better is absolutely healthy.

But even as the draft HTML5 spec continues to evolve and open Web standards improve, badly-needed, genuine working partnerships seem to be fewer and further between. Posturing between competitors isn’t helping.

Nor can I find evidence that, while this remains a draft, it’s set up for people to implement. Even the draft document begins by telling you you’re not allowed to use it:

Pursuant to Section 8.f. of the Legal Provisions (see the IETF Trust’s Legal Provisions Relating to IETF Documents effective December 28, 2009), Section 4 of the Legal Provisions does not apply to this document. Thus, to the extent that this Informational Internet Draft contains one or more Code Components, no license to such Code Components is granted. Furthermore, this document may not be modified, and derivative works of it may not be created, and it may not be published except as an Internet-Draft.

So, in other words, you can read the draft, but you can’t use the code in it, and you can’t make derivative works of the draft. (As far as I know, this is standard boilerplate for IETF drafts. But then, much legal writing in general can be summed up in one word: just, “No.”)

The bottom line:

1. HTTP Live Streaming is super cool.

2. It’s based on open standards and should be easy to implement.

3. Let’s hope we get implementations.

4. This PR stunt aside, it’s unclear what efforts Apple has made to reach out to anyone else doing an implementation, though information is sketchy.

Regardless, this somewhat odd move will certainly raise visibility of the tech. Whether that lasts beyond today’s media event remains to be seen.

Here’s where to go for more information.

HTTP Streaming Architecture [iOS Developer Library]

Apple proposes HTTP streaming feature as IETF standard [Ars Technica]

Image (CC-BY-SA) Ryan Harvey.

Updated: I’m indeed remiss in not talking about the excellent open-source, Java-based Red5 media server:
http://red5.org/

And to Adobe’s credit, open standards support in Flash – along with tolerance in place of litigation – is part of why such projects can exist.

In fact, I don’t see any reason Red5 couldn’t be the basis of a solution that streams to browsers using the video tag. I’ll try to follow up on that very topic, because I’m ignorant of the details.