Miracast is essentially a beefed-up version of Intel's WiDi (Wireless Display), an earlier streaming technology that failed to gain traction – possibly because it was tough to use. Under the hood, Miracast runs over WiFi Direct, the WiFi Alliance's device-to-device networking standard.
What makes Miracast interesting is not so much what it does as who is involved. The list of device manufacturers behind it includes LG, Samsung and Sony, while Intel, NVIDIA, Broadcom, Marvell, MediaTek, Ralink and Realtek are working on compatible chipsets, cards and adapters. If you thought that list included pretty much every manufacturer of note except Apple (and possibly Motorola), you’d be right.
The technology is already starting to be deployed. Samsung’s Galaxy S III is Miracast-certified, as is its new Echo-P series TV. The Optimus G is the first LG device to be certified. It’s worth noting that Samsung already had its own Miracast-based technology, AllShare Cast, built into a range of devices including the Galaxy S III, Galaxy Note 10.1 and Galaxy Note II. The big difference between AllShare Cast and Miracast is that the WiFi Alliance specification allows the mobile device and display to come from different manufacturers.
WiFi Alliance Senior Marketing Manager Kevin Robinson told Ars Technica:
The primary use cases will be enabled at launch. This isn’t going to take years for the devices to proliferate. You can expect to see tablets, phones, laptops, televisions, and set-top boxes with Miracast.
Robinson added that Miracast supports copyright protection systems including HDCP, something the Alliance says is essential to attracting premium content providers.
This all puts Apple in a rather difficult position. AirPlay only works between Apple products: to stream to a television via AirPlay, you need an Apple TV. With a much wider range of manufacturers participating in Miracast, you’d have a broader selection of set-top boxes to choose from. But if Samsung’s initial efforts at wireless streaming are anything to go by, set-top boxes may soon prove unnecessary, as we’ll start seeing Miracast built into the next generation of televisions.