During the ASHA conference, I attended a really good presentation by a couple of excellent folks in the AAC field: Libby Rush and Celeste Helling. The topic was Making Evidence-based Decisions about Apps & SGD's, which, although a little weak in relation to the topic of actual metrics, did a great job of summarizing where the field is in relation to technology options and professional challenges.
One statement that Libby made is worth commenting on, particularly because it's not uncommon. She said that one of her younger family members was able to use technology much more "intuitively" than she could and was more of a digital native.
Maybe it's just me, but I find this notion that there is some special difference between digital natives, those who have grown up with computers and the Internet, and digital immigrants, those of us who have taken up such technology as it was invented, just a little too simplistic and odd. The assumption is that any one-year-old can pick up an iPad and be surfing the net, sending e-mails, and tweeting using the hashtag #imsosmartitscaresmyparents quicker than mom or dad can say "where's the ON button?"
But a recent piece of research by the UK's Open University found that the distinction between a native and an immigrant is slight to nonexistent. "We found no evidence for any discontinuity in technology use around the age of 30 as would be predicted by the … Digital Natives hypothesis," said the report. Rather than finding a correlation between age and effective use of technology, with "younger" being "more capable," they found a stronger correlation between attitude toward technology and techno-ability; i.e., it doesn't matter how old you are but how interested you are.
Given that the technology we are born into is always "normal," the difference between a native and an immigrant is a temporal thing, not an intellectual one. A laptop was not an item in my life until the age of 25, so it was, for me, "new technology." My daughters, by contrast, have always had laptops in the house and learned how to use them earlier than I did. In that sense, they are digital natives. But do they "know" more than I do about technology? No. Can they write better than I can? No. Do they understand the implications of posting personal information to Facebook imagining it to be private? No. And when the computer fails to boot (or gets infected by a virus), do they fix it themselves? Ha, that's what Dads are for!
Of course, you’re likely to argue that this is atypical, and that being an SLP working in Assistive Technology, I am a “special case” and that most people are not that tech savvy. But all this reflects is the difference between my attitude to technology (I am interested) versus my kids’ attitudes (they are ambivalent). Adults with little interest in technology may well believe that because someone younger can (a) turn on a computer, (b) download a song from iTunes, and (c) spend hours on Facebook, they must be computer smart and way ahead of the curve. The truth is more prosaic – anyone who can do something you can’t do has the appearance of being smart. But if you were to spend a little time doing the same things, it wouldn’t then seem so magical. 
My wife, who always claims to be a computer idiot, actually runs our entire finances via the computer. She can track payments, shift balances, transfer money, pay bills, and all from a desk in our small home office. She can, in fact, do many things I cannot. Is this because she is a digital native? No, it's because her interest in the tasks that need to be done is greater than mine; I can't handle the accounts because I don't need to! Now, if we lose the connection to the Internet, she does not troubleshoot the network connection, nor will she reconfigure the software if there is a major problem. Why? Because I can do that and she is happy to let me. Does that mean she is "impaired" in her computer use? No, once again, it's a matter of attitude and choices. Hey, my older daughter, who is a digital native, would similarly have no clue how to reconnect to the Internet – but she can spend hours on Facebook doing whatever it is she does.
This helps to explain how we have a generation of technology users who can use Facebook, Twitter, smartphones, iPads, and the like in a seemingly effortless way, but who have really little clue about how they work, what they do, what their limitations are, and what the dangers might be. Does anyone even worry just a tad when they post a message saying "I am at 555 West Avenue, Podunk, picking up my paycheck?"
None of this is intended to suggest that in order to use a piece of technology we have to know how it works. If someone backs a truck over my Droid and offers to hand me a free soldering iron and some duct tape to make amends, I’m afraid my response would simply be to make use of my monthly $6.99 insurance payment and get a new one. But there are two things that I think are critical to remember:
(a) Being able to use something does not make someone an expert, a whiz, a genius, a savant, or something particularly special. That perception is nothing more than a reworking of the familiar lecturing trick “as long as I know 5% more than the folks I’m teaching, I’ll seem intelligent.”
(b) Your interest in and attitude toward technology are what make you skilled at using it, not whether you are a native or an immigrant – which is typically assumed to mean young or old.
So when I hear seasoned clinicians and educators suggest that they are somehow less savvy than "younger people," perhaps you can understand why I want to give them a good shake and a group hug: If you're presenting on AT to large groups of people and getting folks to turn up regularly, believe me, you are a digital native!
 The distinction between digital natives and digital immigrants is purported to have been first used by the writer and educator Marc Prensky in an article entitled "Digital Natives, Digital Immigrants," published in the journal On the Horizon. The suggestion was that the Technological Era has changed how people think, and that "today's students think and process information fundamentally differently from their predecessors." At the time, this sounded radical and exciting, but since then, critics have argued that it is simply not true, and that we all continue to process information in much the same way as we have done since losing our tails and cutting back on bananas. More strident critics have suggested that the digital natives are actually worse at processing information; cf. Mark Bauerlein (2008) The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future, and Nicholas Carr (2011) The Shallows.
 A startling number of people who use Facebook have a hard time understanding that the phrase “social network” means that information they post will end up on a “network” and that it is shared “socially.” I visited the Speech and Language Department of a nearby university where I found posters for sessions teaching students the dangers of such things as posting on Facebook because people such as future employers might read them. Might? You better believe that if you apply for a job for which I am on the interview panel I’ll be typing your name into Google “search” quicker than you can say “invasion of privacy!” The point is, if college students need to be taught the dangers of loose postings, then “digital natives” are not as smart as we think they are – or that they pretend to be.
 And after the first virus hit the home computer due to some dubious downloading practices by my dear daughters, it was also Dad's job to remove the infection, beef up the firewall, install some new anti-virus stuff, and offer education on the dangers of file sharing – a practice digital natives are presumed to manage with ease but which turns out to be a dangerous affair because they don't know the potential hazards!
 A word I like to use with regard to the affect (and effect) technology can have on people is enchantment. The near-magical way in which cool devices operate can suspend our disbelief and make us imagine that a piece of machinery can do wonderful, limitless things. Psychologically, enchantment can be dangerous as it discourages us from looking beyond the mask and subdues critical thinking. When someone believes that the answer to all the world's problems can come from a piece of technology, then you are wise to be suspicious. The more sweeping the claims, the more skeptical you should become. Apparently, during the Industrial Revolution, machines were supposed to remove all need for hard labor and usher in lives of leisure. So how's that working out for ya?
At some point, I’d like to do an entire post/article on Digital Enchantment: Breaking the Spell, but for now, I just have the title 😉
 OK, I’ll fess up that I know exactly what she does because I am on her “Friend” list and she forgets that whatever she posts is open to me and not just the person to whom she is posting. If need be, I can also log on as her because I know her password. So much for the super-smart digital natives! And before anyone tries to argue that I am a wicked father invading my daughter’s privacy, I have two comments: first, if it’s on Facebook it’s fair game; and second, as far as my family goes, I’ve always said that I don’t run a Democracy but a Benign Dictatorship 😉
 And there we go again with a nod toward another myth – that of the “intuitive device,” which leads to the dangerous notion that things should always be designed so that people don’t actually have to learn anything. Making something “easier” to use is a far cry from making it “intuitive.” In fact, there is no “intuitive” anything – it’s all learned. Check out Everything is Obvious: Once You Know The Answer (2011) by Duncan Watts for a sobering look at how we make assumptions about “ease of use” and “simplicity” yet forget what we’ve learned in the past. In a similar vein, Living With Complexity (2011) by Donald Norman talks about why things can’t always be simple, and how too much “simplicity” can be more complicated!