Training Challenges in the North

Iqaluit, Nunavut.

In February.

When I was first asked if I would be available to provide SMART Notebook training to teachers in Iqaluit, my main concern was that I did not have the gear to handle Canada's far north in the middle of winter. Sure, I had a parka, some gloves, and boots. That isn't uncommon for Canadians.

But there's a pretty big difference between winter in southern Ontario and northern Canada.

As it turns out, the weather wasn't nearly the challenge I thought it would be (even though my flight out did get cancelled due to a blizzard). I picked up some better boots, better mittens, a balaclava, and some snow pants, and ended up walking around quite a bit while in Iqaluit. It was a great experience, and I only fell through the snow once!

The real challenge of Iqaluit, from an educational technology training perspective, is the state of the Internet.

The Internet at the hotel was slow enough that I would click a web link, walk away to do something else, and come back to the computer a couple of minutes later. The speed at the school wasn't any better. In fact, the school Internet was further hampered by the government filters. I have to wonder how long it will take for officials to realize that the filters are increasingly ineffective, especially as students begin to bring their own data-enabled devices into the classroom. The filters also end up blocking useful teaching tools and valuable information (some of the SMART-related resources appeared to be blocked).

SMART Response worked, but not particularly well, and would not be usable for more than a handful of questions. To SMART's credit, the question web pages are actually quite small. Unfortunately, the school's Internet connection is so slow that the pages would still take up to a minute to load on student devices, and there was a further delay between a student clicking to submit a response and the response being "received" by the teacher.

Surprisingly, SMART Maestro, the iPad-enabled feature of Notebook, ran smoothly. This suggests that most or all of the network traffic required to mirror the SMART Board to the iPad stays on the local network.

On my third day of training, I asked the teachers what their strategies were for integrating Internet-based materials into the classroom. In unison, several teachers replied, "We don't". This may seem like a shocking response in the 21st century, but it isn't a surprise once you've tried using the Internet in the school for a few days.

So, the solution could be to pre-download resources from home. The teachers did comment that their Internet speed at home was quite a bit better than at the school, and some of them did this to a limited degree. But there was another problem: it seems the best Internet deal in Iqaluit includes only about 40GB of monthly data, and each additional GB costs $15! Ouch! I can barely stay below my 275GB monthly allotment and have considered paying the additional $10/month to get unlimited bandwidth. That's great for me, but there is clearly a problem with "Internet equity" in Canada.
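
To put that cap in perspective, here is a rough back-of-the-envelope sketch in Python. The 40GB cap and $15/GB overage charge come from the paragraph above; the household-usage and lesson-download figures are purely hypothetical, just to show how quickly pre-downloading classroom material could push a household over the limit.

    # Rough sketch of a monthly overage cost under the Iqaluit data cap.
    # Assumptions (hypothetical, for illustration only): ~35 GB of regular
    # household use plus ~2 GB per week of pre-downloaded lesson material.
    monthly_cap_gb = 40          # best available plan, per the paragraph above
    overage_rate = 15.00         # dollars per additional GB
    household_use_gb = 35        # hypothetical regular household usage
    lesson_downloads_gb = 2 * 4  # hypothetical: 2 GB of lessons per week

    total_gb = household_use_gb + lesson_downloads_gb
    overage_gb = max(0, total_gb - monthly_cap_gb)
    print(f"{total_gb} GB used, {overage_gb} GB over the cap, "
          f"${overage_gb * overage_rate:.2f} in overage charges")
    # 43 GB used, 3 GB over the cap, $45.00 in overage charges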

The CRTC is currently soliciting input on broadband connectivity in Canada. The completed questionnaires must be submitted by February 29, 2016, so go participate as soon as possible (but please just read a little further first).

Before you respond to that questionnaire, just take a few moments. Forget about Netflix. Forget about iTunes. Think about your own child not having access to the Internet to research a school subject. Consider that other students across the country have relatively easy access to resources like Homework Help, Khan Academy, and a variety of other online learning resources. Many school districts are moving to Google Apps or Office 365, tools that help enable collaboration and 21st century skills. From what I experienced in Iqaluit, these tools would be virtually unusable.

Apple, FBI, ISIS, and Secrets

This goes significantly off-topic from what I normally talk about, but still revolves around technology (and even touches on the potential impact on education). The news about the FBI demanding that Apple unlock the phone of one of the San Bernardino gunmen is everywhere, and the FBI using the suffering of the victims' families to get what they want is not only immoral, it is irresponsible.

One of the most common arguments that comes up regarding encryption and secrets is that if you aren't doing anything wrong, you have nothing to hide. This could not be further from the truth. Many businesses around the world depend on trade secrets, or keeping secret the development and progress of new products and technology. Law enforcement agencies may be protecting the identities of undercover agents, witnesses, or victims. You know, agencies like, say, the FBI. Can you say you have nothing to hide while still demanding answers about the breaches in security at Target, Neiman Marcus, and Michaels? More in line with education, schools and districts must also be sure they are keeping student information secure and private. This is not just something that should be done, but something that must be done. We all have "something to hide", even if we're not doing anything wrong.

The FBI claims it hopes to discover information on the phone that will help prevent other terror attacks. This is highly unlikely, and the FBI knows it. The San Bernardino attackers were a man and a woman, and Islamic extremist groups such as ISIS and the Taliban do not use women as "soldiers". This act of terror appears to have been "ISIS-inspired", but that is very different from "ISIS-plotted". The FBI can get access to phone records even without access to the phone itself; they likely already have a good idea who the attackers were in contact with, and there is little else they could discover from the device.

Asking Apple to try to create a method to circumvent security measures puts far more people at risk than any possible gain from unlocking this one phone.

There is a belief that the burden on, or cost to, Apple to circumvent the security of the phone is relatively small because they are such a large and wealthy company. Again, this could not be further from the truth. If Apple is successful in gaining access to the phone, it calls into question, at least in the eyes of the public, the actual security of Apple's products. Apple could potentially lose contracts for large-scale deployments to government agencies, businesses, and yes, even school districts. The public perception of ineffective security could also cost Apple consumer sales. The costs go far beyond the hours Apple's developers would spend gaining access to the phone's contents.

There isn't anything the FBI can do to bring back the victims of the attack, and it is disturbing that they are using the grief of the victims' families to advance some hidden and unrelated agenda.

Waiting on the next big thing

After recording the podcast following FETC this year, our group pondered why we didn't really see any major new technology.

I suggested that it might be related to the difficulties the major processor fabrication companies are having shrinking the chips used in our electronics. I quickly realized that this was a topic that my colleagues really had little knowledge of, and that most users of technology probably don't know much about the chips inside the gadgets we use every day.

This post is not intended to be an in-depth technical discussion. Hopefully I can provide a simple explanation of how our electronics have managed to get faster and do more things over the years, and give a quick overview of what is causing a slowdown in some areas of technology.

In 2006, Intel introduced the Core architecture of processors. These processors were manufactured on what Intel referred to as a 65nm process (a nanometer is one-billionth of a meter), which had also been used in the later Pentium 4 processors. The 65nm figure is a nominal measure of the process: some "features" are larger than 65nm, while others can be smaller.

Late in 2007, Intel began producing processors on a 45nm process. While 45nm is roughly 70% of 65nm, chips are two-dimensional, so the shrink applies to area. This means the 45nm process can, in principle, create an identical chip in roughly 48% of the space used by the 65nm process (45^2 / 65^2 ≈ 0.48). The scaling isn't quite perfect in practice, so chips don't shrink by quite as much as the process naming implies, but chip manufacturers can still pack a whole lot more transistors into the same amount of space. Reduced size is not the only advantage of a new, smaller process: smaller processes use less power and generate less heat, and the reduced size normally means a chip as complex as "last year's" high-end chip can be produced at a lower cost.

In early 2010, just over two years after introducing the 45nm process, Intel released chips produced on a 32nm process (roughly 50% of the area of 45nm). By mid-2012, Intel had started using a 22nm process (roughly 47% of the area of 32nm). The first sign of trouble came with Intel's 14nm process (roughly 40% of the area of 22nm): Intel released a very limited number of 14nm chips, targeted mainly at low-power laptops, and higher-powered 14nm desktop and laptop chips did not show up until 2015. Intel's roadmap now shows that products based on its next process (10nm) are not due until late 2017.
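
The area percentages quoted above all come from the same simple calculation. Here is a short Python sketch of that back-of-the-envelope math (nothing Intel publishes, just the squared ratio of the nominal node sizes):

    # Approximate area scaling between process nodes: features shrink in two
    # dimensions, so the relative area is (new_node / old_node) squared.
    # Node names are nominal, so real chips don't shrink by exactly this much.
    def area_ratio(new_nm, old_nm):
        """Fraction of the old process's area needed for an identical chip."""
        return (new_nm / old_nm) ** 2

    for old, new in [(65, 45), (45, 32), (32, 22), (22, 14)]:
        print(f"{old}nm -> {new}nm: ~{area_ratio(new, old):.0%} of the original area")

    # 65nm -> 45nm: ~48% of the original area
    # 45nm -> 32nm: ~51% of the original area
    # 32nm -> 22nm: ~47% of the original area
    # 22nm -> 14nm: ~40% of the original area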

Intel is not the only chip-making company around. Other big players include TSMC and Samsung. Despite the public disputes between Apple and Samsung, the processors in most iPhones have actually been manufactured by Samsung; the latest iPhones have started using chips manufactured by TSMC. Samsung and TSMC have also started to struggle to make chips smaller. Some rumours suggested that, with the iPhone 6 and 6 Plus, Apple was taking so much of TSMC's limited capacity that other tech companies could not get access to the latest process. AMD and Nvidia, the two major graphics chip designers, have their graphics chips manufactured primarily by TSMC, and neither company released graphics chips using TSMC's 20nm process.

Limiting the latest and greatest manufacturing technologies to a handful of companies means that only those companies have the potential to make noticeable improvements, but they may not be under pressure to do so. Apple seems to have capitalized on their nearly exclusive access to TSMC's advanced process: benchmarks for the iPhone 6, and again for the 6S models, showed significant improvements in performance. The iPhone, of course, is under competitive pressure from Android smartphones. Intel, on the other hand, faces little competition in their primary market of computer processors, and Intel not only designs its processors but also owns the facilities that manufacture them. The performance improvements in Intel's processors have been relatively small (5-10% from generation to generation).

What about technology other than smartphones and computer processors?

We are starting to hear more about VR (virtual reality) and AR (augmented reality). Oculus, probably the most recognizable name in VR, announced the system requirements for the Rift VR headset. The cost of building a system to meet those requirements is quite high. Here is a quote from that page, highlighting the importance of the GPU (Graphics Processing Unit):

"Today, that system’s specification is largely driven by the requirements of VR graphics. To start with, VR lets you see graphics like never before. Good stereo VR with positional tracking directly drives your perceptual system in a way that a flat monitor can’t. As a consequence, rendering techniques and quality matter more than ever before, as things that are imperceivable on a traditional monitor suddenly make all the difference when experienced in VR. Therefore, VR increases the value of GPU performance."

Remember that AMD and Nvidia are the major sources of graphics chips, and that they likely did not get access to 20nm? Relatively few computers meet the graphics requirements of the Rift.

Other areas of technology may also have been stalled by limited access to the newest chip manufacturing processes. Nvidia makes the chips in the tablet for Google's Project Tango, a computer-vision platform for detecting objects (think self-driving cars). This technology is relevant for robotics, a topic I discussed in the podcast.

While the slowdown in technological advancement continues, more companies are finally getting access to the latest manufacturing processes. AMD and Nvidia are planning products based on 14nm and 16nm processes for release in 2016. AMD has stated that their upcoming graphics chips will make the largest leap in performance per watt in the history of the Radeon brand (AMD's primary graphics brand, introduced in 2000).

Hopefully this means we will see some new and really interesting tech at conferences next year.