Making the Switch: The move from Android to iOS

This is a follow-up to my previous post about switching from Android to iOS.

It has been about a month since I made the switch, and so far there haven't been any big surprises. I have argued for years that, considering what we use smartphones for these days, the day-to-day differences between using an iPhone and using an Android phone are quite minimal. Most of what I (and pretty much everyone I know) do involves email, web browsing, and some social media (Facebook, Twitter, etc.). The experience of these activities really doesn't change much across devices.

There are things that I knew would be slightly different (and annoying). For example, I use Chrome as my browser across all of my devices. I do not want to use Safari, but Apple does not let you change the default browser in iOS. So when I tap a link, it opens in Safari; I then copy the link, close the Safari tab (so I don't end up with 500 zombie tabs), open Chrome, and paste the link in.
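For what it's worth, Chrome on iOS registers its own URL schemes (googlechrome:// and googlechromes://), so an app or automation that knows about them can hand a link straight to Chrome; the copy-and-paste dance exists because most apps don't bother. As a minimal sketch (the openInChrome name is mine, and it assumes Chrome is installed and that the calling app lists "googlechromes" under LSApplicationQueriesSchemes in its Info.plist), the hand-off looks something like this in Swift:

    import UIKit

    // Rewrites an http/https URL to Chrome's custom scheme and opens it.
    // Sketch only: assumes Chrome for iOS is installed and that the calling
    // app declares the scheme under LSApplicationQueriesSchemes.
    func openInChrome(_ url: URL) {
        guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else { return }

        // http -> googlechrome, https -> googlechromes
        switch components.scheme {
        case "http":
            components.scheme = "googlechrome"
        case "https":
            components.scheme = "googlechromes"
        default:
            return
        }

        if let chromeURL = components.url, UIApplication.shared.canOpenURL(chromeURL) {
            UIApplication.shared.open(chromeURL)
        }
    }

Of course, that only helps inside apps you control, which is why the manual dance above remains my day-to-day reality.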

Using iMessage and FaceTime with family is nice. Again, I wish iPhone users were more willing to use a messaging platform that was inclusive, but generally speaking they just aren't.

The poor resale value of Android phones, which I mentioned in the previous post, really hit home. I managed to sell my Samsung Galaxy A5 (2017) for roughly 25% of its original price. That hurt.

Overall, the transition has been relatively painless (apart from losing so much value in the resale of my Android device).

Making the switch: Moving from Android to iOS

I have used Android phones for nearly 10 years now. My first Android phone was the first Android phone: the T-Mobile G1. I moved through a few different phones over the years, and I appreciated the developer community that kept them up to date long after the manufacturers had abandoned them.

I did get tired of replacing my phone so often, but the combination of rapidly improving hardware and unstable unofficial Android ROMs made it hard to stick with a phone for more than a year or so.

Then I bought the Nexus 5 at launch. It was a phone with high-end specs and a $400 price tag. Google provided three years of updates for its Nexus line, and I kept my Nexus 5 for just over three years. It had a mediocre (at best) camera and poor battery life, but I loved that phone.

I tried switching to the Axon Pro, another Android phone with high-end specs and a reasonable price. Sadly, it only received one update despite being ZTE's flagship phone, and it was much larger than I wanted my phone to be. I switched again in less than a year, to the mid-range Samsung Galaxy A5. It's not a bad phone, but it's not a great one either, and I honestly don't know how long Samsung plans to keep the A5 updated.

So, what kept me from Apple all those years, and what has changed?

Price was a big factor. The iPhone is a very expensive phone. Unfortunately, if I wanted to repeat my "Nexus 5" experience, I would need to look at one of Google's Pixel phones, and long gone are the days of Google's flagship costing $400.

From a personal perspective, most of my family use iPhones. While I wished people would transition to a cross-platform messaging system (like WhatsApp), Apple has successfully sucked people into the iMessage/FaceTime vortex. I don't like it, but not liking something doesn't keep it from being a reality.

Phone hardware is also not changing as rapidly as it once did. Keeping a phone for over three years doesn't sound as crazy as it once did, as long as the manufacturer is still supporting it. This fall, Apple released iOS 12 for the iPhone 5S, a 5-year-old device! I have a family member with that phone. It still works great, and it has current software! I am not aware of a single Android device that has official support after 3 years, let alone 5.

One final note: every time I have changed Android devices, the depreciation has been significant. I am blown away by the resale value of used iPhones. Maybe in the long run the iPhone won't be as expensive as it first seems.

So, I'm giving the iPhone thing a whirl. I know there will be things that drive me crazy, and only time will tell whether this experiment will be a positive experience.

Misadventures with Thunderbolt

The Promising Technology

Thunderbolt is a port technology from Intel that first appeared on Macs in 2011. Originally, the primary use for Thunderbolt was as a video port (it used the Mini DisplayPort connector), though some other peripherals, such as external storage, were available. With Thunderbolt 3, things got very interesting. The connector switched to USB Type-C, and the port gained USB compatibility. Peak bandwidth also increased to 40Gbps, opening up many possibilities for extremely high-bandwidth devices such as external graphics. Even better, Thunderbolt 3 could deliver up to 100W of power, either to devices attached to the computer or to the computer from an attached device.

On paper, things sounded amazing. Reality, as is often the case, was quite a bit different.

In 2016 I purchased an Asus GL702VM gaming laptop. Asus proudly advertised "Onboard Intel® Thunderbolt™ technology" that "gives you single-cable data and signal transmission rates of up to 40Gbits/s". I had kept my previous gaming laptop for four years, and the only reason I upgraded was its aging graphics chip. I figured a Thunderbolt 3 port would allow me to upgrade the graphics down the road, extending the life of the laptop.

At work, to support our Mac users, I use a 2016 MacBook Pro with 4 Thunderbolt 3 ports.

This year, we decided that new laptops acquired for staff should also include Thunderbolt 3, and that we could start looking for a universal docking station for use with any laptop going forward. We ordered a Dell Latitude 5480, which reviews showed as having a Thunderbolt 3 port.

Things were looking promising for Thunderbolt 3: the one port to rule them all.

And then...

I recently acquired a Gigabyte Aorus GTX 1080 eGPU (external graphics) box. Graphics cards are extremely expensive due to the cryptocurrency mining craze, but somehow the Gigabyte eGPU managed to be the cheapest GTX 1080 available. While researching eGPU configurations for my Asus laptop, I discovered that the Thunderbolt 3 chip Asus used was an "LP" version that only worked at 20Gbps (half of Thunderbolt 3's advertised peak speed). Asus does not list this anywhere on the product page, and this chip is essentially Intel's dirty little Thunderbolt 3 secret.

Next, our order for the Dell Latitude 5480 came in, along with the Thunderbolt 3 docks. I connected a dock to the Dell and discovered that only one specific configuration of the 5480 (the one with a completely unrelated GeForce 930MX graphics chip) includes Thunderbolt 3. The model we received has a regular USB Type-C port. Fortunately, we were able to return the laptops and order replacements with the Thunderbolt 3 port, but Dell is needlessly creating confusion with this laptop. If you've been considering one, be careful when ordering.

Finally, I connected the Dell Thunderbolt 3 dock to the 2016 MacBook Pro. Nothing. Apple maintains a "whitelist" of supported Thunderbolt 3 devices, and unsupported devices simply won't work. There is a hack that removes Apple's arbitrary and artificial device check. Once I went through those steps, the dock functioned mostly OK: everything worked, but only one external display could be used.

Someday?

Perhaps there is some hope for Thunderbolt 3. Intel is making Thunderbolt 3 royalty-free, so it may start showing up in more devices. I just hope laptop manufacturers stop using the slower version of Intel's Thunderbolt 3 chip, or are at least clearer about which chip they are using. Other Thunderbolt 3 docks are supported by Apple (though they are significantly more expensive, of course) and offer Windows compatibility as well. I could say I hope Apple removes its ridiculous "supported" check, but Apple's history makes that scenario unlikely.

Putting ICE on IT

I have worked in what is traditionally viewed as "IT", or Information Technology, for a very long time now. However, since I began working in the Faculty of Education at Brock University, my initial IT position has evolved in wonderful and important ways. To support the Teacher Candidates and our faculty, I became increasingly involved with what has traditionally been viewed (apologies for the redundancy) as Educational Technology.

Over the last few years, I have realized that neither IT nor ET can adequately capture what is truly happening in education, from K-12 through higher education.

There are many technologies that enable teaching, learning, and research. Some are commonly used in education but can hardly be described as educational technology; presentation tools and learning management systems are examples. I am more inclined to describe these as Instructional Technologies (though the acronym IT is already taken). Similarly, technologies such as video conferencing and shared document editing are commonly used in education but are better described as Collaboration Technologies. There are genuine Educational Technologies as well, but for many tools the category depends on their specific use; tablets are a good example.

Almost hand in hand with these technologies, there tend to be associated staff members, each focused on a specific area.

For several months now I have been considering a more holistic approach: a combination of Instructional, Collaboration, Educational, and Information Technology. Although I am not a huge fan of acronyms, I feel that spelling out all of the relevant technology pieces every time would be a little too cumbersome.

Welcome to ICEIT.

This is more than just a name. It reflects the idea that these individual pieces are stronger together, and that there needs to be a collaborative approach to technology. Each letter does not represent a territory to be claimed by a particular staff member or unit. It is a whole, and all of its members need to work together for it to be effective.

I look forward to the coming months as we start to look at this approach in the Faculty of Education.