In June 2011, Apple announced AirPlay mirroring. At first glance, this is wireless display technology done right. For most people, it just plain works. Plug in an Apple TV, get it on your wireless network, and then just search for it on an Apple device capable of mirroring. It really is that simple. Well, except for when it's not.
The first problem with AirPlay is that it is exclusive to Apple devices. For many educators, that's not really a big deal; I'm sure there are countless teachers who are completely within Apple's walled garden. The second problem (and yes, regardless of how Apple wants to spin it, it is a problem) is that device discovery falls apart entirely on a network designed to support more than just a couple hundred devices (like the networks typically used in schools). You can have your iOS device and Apple TV on the same network, but no matter how hard you try, sometimes they just can't see each other. There are workarounds out there, but they depend on IT infrastructure that supports them and on configuration by IT staff. I'm sure many educators have already blamed their IT staff for not "working with Apple" to get AirPlay working. Let me be clear here: Apple is to blame. They wanted to beat the WiFi Alliance to the punch and rushed out a protocol that was not well thought out.
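AirPlay discovery rides on Bonjour (multicast DNS), and multicast is exactly what breaks on big, segmented networks: the discovery packet normally never leaves its own subnet or VLAN, so the Apple TV on the other side of the router simply never hears the question. As a rough sketch of what is happening on the wire (the `_airplay._tcp` service type is real; the hand-rolled packet builder below is my own minimal illustration, not Apple's code), here is the multicast question a sender would put on the network:

```python
# Sketch: why AirPlay discovery is fragile on segmented networks.
# An AirPlay sender asks for PTR records of "_airplay._tcp.local" by
# multicasting a DNS query to 224.0.0.251:5353. That multicast packet
# ordinarily stays within one broadcast domain, so receivers on a
# different subnet/VLAN never see it.
import struct

MDNS_GROUP = ("224.0.0.251", 5353)  # well-known mDNS multicast address/port

def encode_name(name: str) -> bytes:
    """Encode a dotted DNS name as length-prefixed labels."""
    out = b""
    for label in name.split("."):
        out += struct.pack("B", len(label)) + label.encode("ascii")
    return out + b"\x00"  # zero-length root label terminates the name

def build_ptr_query(service: str) -> bytes:
    """Build a one-question mDNS query for PTR records of `service`."""
    # Header: id=0, flags=0 (standard query), 1 question, 0 other records
    header = struct.pack(">HHHHHH", 0, 0, 1, 0, 0, 0)
    # Question: QTYPE=12 (PTR), QCLASS=1 (IN)
    question = encode_name(service) + struct.pack(">HH", 12, 1)
    return header + question

query = build_ptr_query("_airplay._tcp.local")
# To actually discover receivers you would send `query` from a UDP socket
# to MDNS_GROUP and listen for responses -- but only Apple TVs inside the
# same broadcast domain will ever answer.
```

The workarounds I mentioned boil down to making the network forward these multicast packets between segments (mDNS gateways/repeaters), which is precisely the "supported IT infrastructure" that many school networks don't have.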
In January 2011, the WiFi Alliance announced WiFi Direct Display. It surfaced in news articles for nearly two years (and was rebranded as Miracast along the way) before any devices based on the standard became available.
Miracast does not depend on your existing wireless connection. A Miracast transmitter and receiver set up a completely separate WiFi connection using an existing method known as WiFi Direct (hence the original name, WiFi Direct Display). This is exactly where Apple went wrong with AirPlay: by tying discovery and streaming to whatever network you happen to be on.
So, Miracast is wireless display done right, right? Well, maybe.
Google announced Miracast as a feature of Android 4.2. This was wonderful news, although it was quickly discovered that even one of Google's newest tablets, the Nexus 7, doesn't support Miracast despite running Android 4.2. Even now, many months after the release of Android 4.2, only a handful of Miracast-capable devices are actually available. None of the devices that I have access to fall into that category.
Intel obviously thinks Miracast is the way to go. With version 3.5 of WiDi, Intel added Miracast compatibility. I do have a laptop that supports WiDi 3.5, so I picked up the Netgear Push2TV (PTV3000), a Miracast receiver. After installing all the right drivers in the prescribed order, I had absolutely no luck getting it to work. After several hours of trying various installation and connection options and methods, I stumbled on a page that had the answer: I had to uninstall the Windows 8 Intel WiFi drivers (from my Windows 8 laptop) and install the Windows 7 drivers instead. Sure enough, this worked (even though it shows an error every time it connects). While I was happy to have it working, I doubt that most people who walk into a Best Buy to purchase the PTV3000 would go through all of this hassle.
I suppose what I find most remarkable is that there really shouldn't be anything particularly challenging about implementing an open wireless display standard that works. Device discovery and pairing protocols have been in use for decades. Encryption protocols have been around for a very long time. Audio and video encoding and decoding have been hardware accelerated for a few years now, even on handheld devices. All the pieces exist, and yet everything still seems to be a struggle, and it always feels like it's just one software update away from breaking.
How strange is it that a wireless transmitter and receiver from 15 years ago are far easier to set up, and more reliable, than anything we have now?