Larry MacPhee: e-Learning



11.08.2015: The Coming Disruption of Academia

Academia, with its medieval-era academic robes and feudal power structures, is going to be disrupted by technology in ways that will make what has happened up to this point seem inconsequential. While academia has held out longer than some other powerful institutions, it is vulnerable to disruption for the same reasons, and recent trends in higher education have only exacerbated the situation.

Why does disruption occur? In every case, the digital alternative offers a product or service similar to the traditional one, but more conveniently and/or at lower cost to the customer. When that happens, the transition occurs rapidly. Let's look at some examples. Newspapers used to be big businesses, and they carried great influence and power, but the thing that drove newspapers was advertising. When Craigslist provided a convenient online alternative to searching the classifieds, newspapers began to lose readership, which started a rapid downward spiral. Apple has been the cause of several industry disruptions. With iTunes, Apple gave music lovers an easy, convenient way to buy music online, and physical sales of CDs rapidly dwindled. Apple did it again with the iPhone, resulting in the collapse of the previous phone market leaders, Nokia and Blackberry. Amazon disrupted book sales, and later the entire catalog sales industry. Netflix did it for video, and the once ubiquitous Blockbuster video chain ceased to exist within a few short years. Uber seems to be doing it for taxis, and AirBnB for accommodations. Apple has, itself, been disrupted in music by streaming services like Spotify and Pandora. There is no traditional market that is unaffected by digital transformation.

In academia, the first challengers from the forces of disruption, the online and for-profit universities, failed for several reasons. The quality of the education was not very good, the delivery system was also poor, and the cost was the same, so they produced a bad product that nobody wanted. Defaults on student loans were higher at for-profits, where the degree was of low value, and so funding organizations became more discriminating. Accreditation has also created an OPEC-like cartel but, as with OPEC, all it takes is for a major player to go its own way and the whole thing collapses. Traditional academia has some serious problems too.

Problem 1: For decades, a college degree was what the high school diploma used to be: a ticket to a good job. However, that drove everyone, even the grossly unqualified, to go and get a college degree.

Problem 2: Colleges got greedy. Willing to accept anyone who could pay, and even those who took on enormous student loans for questionable degrees, colleges saturated the market with graduates and the bachelor's degree became devalued. For a while, universities solved that problem by offering Master's degrees, but now that market is flooded too.

Problem 3: Tuition costs keep going up, because of state cuts to higher education, and because highly paid administrators have gone on building sprees to try to make their universities more appealing than those of the competition.

Problem 4: To stem the rising costs, administrators have been gradually phasing out the well paid, highly qualified, tenured professors as they retire, and replacing them with low paid, less qualified instructors on annual contracts. This has had the added effect of solidifying the administrative power base, since tenured faculty were often the most resistant to the demands of administrators to lower standards and keep paying students in the pipeline regardless of their potential.

So, in summary, the modern baccalaureate degree is devalued. It is no longer a ticket to a good job, the quality of the education itself has declined, and the cost remains exorbitantly high. These problems create a situation ripe for disruption.

All that remains is for employers to realize that they cannot effectively distinguish between job applicants based on whether, or from where, they have a bachelor's degree, and to begin prioritizing other selection criteria. When demand for the bachelor's degree dries up, there will be a lot of academic real estate coming onto the market as universities collapse. Why won't universities just tighten their standards and produce higher quality graduates? Those with strong reputations will be able to take that approach, because it's not just a degree, but a Harvard or Yale or Stanford degree. But graduates with degrees from "generic university" will find that their diploma is worthless, and that they are saddled with a huge pile of student loan debt that bought them nothing but a 4-year party. The administrators running Generic U will need to rapidly change their institution's offerings, or see enrollment plummet. Unfortunately, universities are led by cautious, change-averse people, and so most of those institutions will keep on doing what they've always done, and they will fold before leadership has any idea what happened.

The successful disruptors will be the organizations that figure out how to do two things: 1) Rapidly assess the abilities of students (grading students the way that eggs, or olive oil, or maple syrup are graded, by quality) and sell those ratings directly to employers. (Curiously, the lists of desired qualities that employers provide generally involve little knowledge of the job itself; they are heavy on personality traits and attitude indicators.) 2) Figure out how to imbue students with the abilities that employers are looking for and that they don't already possess, something that takes longer and costs more. This would lead to a program of study based on the missing pieces, much as the "personalized learning" programs are attempting to do. What are those qualities, and how do we teach them? Can they be taught, or can they only be nurtured in those with intrinsic ability? Can we teach enthusiasm, perseverance, integrity, dedication, communication skills, or the ability to get along with others on a team, some of whom may be annoying? Or should we stick to teaching the kind of material found in textbooks? Does your institution have the answers?

09.08.2015: 21st Century Learning

What is the future of learning? And how did learning in its present form take shape? Sugata Mitra says that the skills we teach our children are based on the Victorian era need for interchangeable human calculators. In an age before modern computing and telecommunications technology, the British Empire ran efficiently on columns of numbers transcribed into ledgers and transported around the globe by ships. It was necessary to have human calculators who had the same abilities scattered all over the world in order to transmit and receive the vital information of commerce, so spelling, writing, and mathematics were standardized. Students needed to read, write, and spell accurately, perform calculations correctly, and not demonstrate too much creativity. Schools of the Victorian era fulfilled their mission, but most schools are still doing it today, over 100 years later, when, perhaps, it no longer serves us so well.

It became possible to eliminate some of these drudgeries back in the 1970s when early technologies started invading the classroom. These technologies made it less necessary to memorize mundane things, but there was an inevitable backlash. Students were forbidden to use pocket calculators because they became less proficient at memorizing their times tables, or because they couldn't do a long division problem by hand, or when they trusted the output of the calculator even when it made no sense. Students were forbidden to use word processors and spell checkers because the quality of their cursive handwriting and their ability to spell was suffering, or because cut and paste was making term papers too easy to plagiarize. But, rather than ban the tools, perhaps we should change what we teach? I'll expand on this idea below.

Should we continue to teach those basic skills, even if only to give students an appreciation for our humble origins, much as we might derive an equation from first principles in a graduate seminar class? Are skills like cursive writing and knowing one's times tables important to the shaping of neural pathways, influencing our intuitive language and computation abilities as some studies suggest, or are they relics of a bygone age? I would argue that some basic skills are important and should still be taught, because I can do basic math in my head faster than my children can reach for their iPhones. However, in a knowledge economy, our ability to synthesize and evaluate information is much more important than it used to be, back when we could trust that most of the knowledge found in library books had been vetted by experts during the arduous publication process. Although we like to think that we're in the Information Age, there's an awful lot of easily accessible misinformation online too. Retrieving information is easy. Evaluating and synthesizing it is the bigger challenge in today's world.

Categories in the cognitive domain of Bloom's Taxonomy.

Does it still make sense to give students multiple choice tests with closed notes, closed books, and no electronic aids? Is asking students to memorize facts that can be easily retrieved of much value? Sure, students must understand the foundational materials. Even the die-hard constructivists admit that you can't construct your own learning without the basic building blocks. Otherwise it's just time wasted reinventing the wheel, and a pretty primitive wheel it will be. But shouldn't the emphasis be on the higher order thinking skills? Does it matter if a history student knows the date of an event, or is it more important that he/she understands the causes of the event? Does a math student need to know how to tediously calculate a square root by hand, when the calculator is so readily accessible? We need to redesign our curriculum to emphasize the kinds of assignments that require thinking rather than memorizing, and synthesis rather than fact gathering. Why assign students to do a biography of a lesser known president when that information is so easily looked up? Instead, why not assign the students to write about whether he was a good president and whether, based on the information he had at his disposal, or in retrospect, he made the best decisions? Students still need to learn what the guy did, but the tougher question, the one that makes you search your own values and understanding, is, "Was it the right course of action, and why?" You can watch the incurious students squirm when you ask questions like that!

Rather than ban a technology from the classroom, why not acknowledge that, in the modern workplace, it's not so important to know how to run a linear regression with only pencil and paper (perhaps useful if stranded on a desert island?), or how to look up some information online about what constitutes a healthy diet. What's really important is knowing whether a linear regression is the right tool for the job, or being able to assess the validity of a website that claims to have all the answers about healthy eating. Creating a generation of critical thinkers who know how to use technology tools is what we're really after, even if it means allowing open book, open note, open Internet on tests. The mathematically curious ones will still want to know how a linear regression works, but the rest can skip ahead to the more stimulating problems. Except for a handful of grand masters, most of us can't beat the computer at chess, and yet we don't lose sleep over it. Let's let computers do what they're good at: brute force searches, speedy, error-free calculations, and rapid information retrieval, and let's get humans doing more of what we're best at (when properly trained): critical thinking, synthesis, pattern searching, and creativity!

It's ironic, then, that many instructors want to use technology tools to block student access to technology so that they can continue to deliver tests designed for a pre-technology era. It's the pedagogy and the assessments that need rethinking. Yes, it's often more work to determine what students think rather than test what they have memorized. Computers can grade multiple choice tests, but not opinion papers! We still need humans for that!

02.28.2015: Why College is Like a Gym Membership

College has a somewhat unusual business model. Because it is unusual, many students are confused by it. I have frequently heard the refrain among dissatisfied students that "The only reason I'm here is that I need this degree to get a job." Another common complaint is that "I paid a lot of money, and I am not satisfied with the grade I received in this class." As consumers, we are used to paying for things with the expectation of a full refund if they do not satisfy.

This is where the analogy with a gym membership helps to make the point. Paying for an education doesn't guarantee you an education. It only provides you access to the opportunities that will help to develop you into an educated person. But, after you enter college, it's ultimately up to you whether you choose to explore those opportunities. After all, nobody can make you expand your mind. That's something you have to choose to do, and it's something you have to work at. That's why enrolling all incoming freshmen in liberal studies classes doesn't create a cohort of poets and philosophers.

With the exception of the gullible people who buy the exercise machines they see on TV with the expectation that, in one short month, they will look like the smiling, well-oiled and well-muscled supermodels who are shown using the product, most people understand that paying for a gym membership doesn't, all by itself, make you fit. They understand that some commitment is required. If you asked someone, when they were signing up for a membership, to sign a waiver saying that they understand that just paying for a membership does not guarantee that they will become fit, most people would laugh and say, "Of course. That's obvious." Why, then, is it not the same for our students who pay for a college education and then feel crestfallen when, without putting much effort into it, they find that it does not meet their expectations?
Everyone should have the opportunity to go to college, but not everybody makes the most of it. College is a place ripe with opportunities, but some of them must be sought out. Some of them require effort. Those who seek the opportunities, and work for them, are generally rewarded. But, for those who do neither, sorry folks, no refund. Enjoy those expensive textbooks and your Bowflex machine. If you don't make an effort, that's all you've got.

11.27.2014: Pick Any Two
In the world of technology, and the world in general, for that matter, there are situations where you will be asked to do a job fast, cheaply, and well. It turns out that, most of the time, you can't have it all. I'm not sure exactly why this is true but, trust me, it's true. You cannot change the laws of physics! I think the graphic says it all, and very succinctly. There are many people in the world, some of whom will be your bosses or your clients, who don't seem to understand this simple fact. My advice to you is to show them the graphic before you start any project and, without another word of explanation, tell them to pick two. If they can't live with that, point to the center and walk away.


06.24.2013: The Technology Adoption Curve

The concept of a technology adoption curve was first described in a study of the willingness of farmers to try new agricultural methods, but it applies quite well to technology in general. I like to try new things, but won't promote any new technology just because it's cool. It has to fill an unmet need and be easy enough to use that most people can manage it. Therefore I generally try to stay just on the right hand side of the "chasm." However, that's not always the case. Although I have followed their development with great interest, I only recently got a smartphone. With 56% of Americans now owning smartphones (I'm sure that number skews young), that purchase plants me squarely in the majority, and illustrates that one might be an innovator with one technology but more cautious in some other technical regard. That's ok. Until recently, I had no need to own a smartphone and, now that I own one, I'd still characterize "need" as a stretch ;-)

Are you a technology innovator, do you "go with the flow", or do you take a "wait and see" approach?

Back in March, I wrote about the hype cycle. Although I didn't think about it at the time, can you see how the chasm in the upper graph is connected to the trough in the hype cycle? Overlay these two graphs and the connection is clear. While a technology can show great promise and generate excitement among early enthusiasts, it may never catch on with the general public. Often this is because of some limitation in what it can do, or in how easy it is to use, that the enthusiasts don't mind but which the general public would not tolerate. That's the chasm. If the chasm isn't crossed, then the technology never reaches that eventual productivity plateau but instead just dies out or remains a hobby for a small group of technophiles. Linux is a great example of a promising technology that hasn't crossed the chasm. Enthusiasts are the key to the spread of new technologies, however. Seth Godin makes this point well when he talks about marketing to the people who care.

So let's think about where we are, because it's useful to know ourselves. If I say the word "Blackboard" or "iPad" or "clicker" or "3-D printer" or "Arduino" or "smartphone" or "Twitter" or "Facebook," where do you fall on the curve? Now think about your colleagues and where they are relative to you. Are you always in the same part of the curve or do you jump around? Have you learned something about yourself through this little exercise?

If you decide not to adopt some new technology that everyone is talking about, does that make you a laggard? Not necessarily. It could be that the technology in question is heavily hyped right now but will not last. There is an implied slur in calling people "laggards" that I don't like. Not every new technology is a good thing, nor will it last. If it's not, or if it doesn't--something we'll only know in retrospect--then you were right not to jump on the bandwagon. Nobody talks much about Second Life anymore and, if you missed it, you didn't miss much. If you never bought a Palm or Windows Mobile PDA, good for you! You saved a bunch of money on a near worthless device.

And what if you're an innovator? Do you stick with a technology once everyone is using it? Or does that take all the fun out of it? If you were on Facebook when nobody had heard of it, are you still there today? There's a saying that "good pioneers make bad settlers." Pioneers don't like crowds, and they are always moving to the new frontier. I'm not one, but I appreciate them. They work hard and explore a lot of places that don't lead anywhere useful. But when they make a real discovery, the rest of us get to enjoy it without all the effort ;-)

The "hype cycle." Overlay this graph on the tech adoption curve above.

06.17.2013: Time to change my password? Oh, just shoot me now!
The dreaded password change notice.

Ah, the lowly password. A simple tool from a bygone computer era. But if you've gotten a message like this one lately, I think you will agree with me that passwords are no longer either simple to manage or effective at keeping us secure. Passwords are much like our congested freeways; they don't work very well anymore, but the entire infrastructure is built around them so, while they drive us crazy, we have no alternative.

The first problem is multiple services. With a steadily growing number of web-based services, each with its own password expiration cycle and username and password creation rules, it has become increasingly difficult to remember all of your passwords and which username and password go with which service. Thankfully, many services let you use your e-mail address as your username, and most offer a "Forgot your password?" link. It's also helpful that my workplace, at least, has a "single sign-on" for all services. The downside of that, though, is that if my work password is compromised, the hacker can change everything from the grades in the class I'm teaching to the beneficiaries of my life insurance plan and the bank routing number of my paycheck's direct deposit!

The second problem is multiple devices. The password wallet or virtual keychain was a good solution for managing the multiple usernames and passwords saved on your computer. Just remember one username and password and the tool does the rest. But once you have multiple devices, you're out of luck. So if I have 10 services that require passwords (not an outrageous number when you consider multiple e-mail accounts, IM and video conferencing services, a photo sharing service, eBay, PayPal, social networking services, online banking, a web hosting service, a cloud storage service, etc.) and 5 devices (again, not all that unreasonable when you consider a personal computer, work computer, tablet, laptop, smartphone, etc.), then that's 50 passwords to change on a regular basis. If you have multiple OSes (macOS and Windows, for example) installed on a device, then that device counts as two.

But we're still not done. The third problem is multiple applications on each device. Even if we just consider my single work password, there are many applications on each device that need to use it, and that need to be updated when it changes: my e-mail program, my VPN connection, my IM program, my web browser, my web page editor, my FTP application, and so on. And I often run more than one of these programs: three web browsers, two e-mail applications, three IM programs, etc. So, to sum up, if I use 10 services on 5 devices, each of which connects to these services from 10 applications, that's, very conservatively speaking, 500 places where I need to change passwords on a regular basis! Not every app stores a password for every service, but you get the idea. Here's an oversimplified map of my online world. I bet yours looks similar. Start drawing lines from service through device to application, and you'll see how messy password management can get!
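To make the arithmetic explicit, here is that back-of-the-envelope calculation as a few lines of Python. The counts are my own rough estimates from above, not measurements:

```python
# Back-of-the-envelope count of password "touchpoints."
services = 10         # email accounts, banking, PayPal, cloud storage, ...
devices = 5           # work computer, laptop, tablet, smartphone, ...
apps_per_device = 10  # browsers, mail clients, IM, VPN, FTP, ...

print(services * devices)                    # prints 50
print(services * devices * apps_per_device)  # prints 500
```

Fifty passwords to rotate is bad enough; five hundred stored copies to hunt down is what turns a routine password change into a lost afternoon.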

Password management is no longer simple, nor does it keep us very secure.

True, I may have it a bit worse than most, but I can tell you that this is out of control, and I'm better at managing this chaos than most people. It is no wonder, then, that many of us use the same not-very-strong password, or a minor variant of it, on multiple services, and that we engage in other unsafe behaviors such as writing passwords down on a sticky note attached to our monitor, or incrementing our old password with the next number in line when it's time to change it. With the proliferation of cloud-based accounts and services, it's only a matter of time before one of them is breached. It seems that not a month goes by without some service provider announcing that its user base has been compromised. It is no wonder, then, that when one of our accounts gets hacked, sometimes through no fault of our own, it doesn't take long for a hacker to gain access to our other accounts, often by using our email system to reset our passwords in other systems.

The fourth problem is that all of the flaws above lead services to use extreme countermeasures, such as forcing you to create a password so strong that you have no hope of remembering it, locking your account after too many failed access attempts, or making you prove that you're human with one of those "captcha" tools. Often it's not a hacker or a bot but just me, trying to remember which username and which password go with this account, hoping I guess right before I get locked out. Sometimes it's a device with an old password trying to update itself automatically that locks me out. Sometimes I fail to correctly answer my own challenge questions because the system is too picky about the answer. For the "street I lived on when I was in second grade," did I spell out "Street" or did I abbreviate with "St." (with or without the ".") or did I leave out "Street" altogether?

I don't know what the solution is. Maybe a cloud-based keychain that generates ridiculously strong passwords you never have to remember, or that uses some difficult-to-impersonate biometric like a fingerprint or retina scan? All I know is that I'm in desperate need of something to fix this mess, and that a lot of other people are stuck in the same sinking boat. I will lose hours of productivity dealing with this upcoming password change, and it will be days or weeks before most of the apps on most of the devices I use regularly are updated. There has got to be a better way!
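As a sketch of what such a keychain might do under the hood, here is how a strong random password can be generated with Python's standard secrets module. The length and character set are arbitrary choices for illustration:

```python
import secrets
import string

# A "ridiculously strong password you never have to remember":
# 20 characters drawn from letters, digits, and punctuation using a
# cryptographically secure random source. A keychain app would store
# and autofill it so the human never has to type or recall it.
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(20))
print(password)
```

The point isn't this particular recipe; it's that generation and storage can both be automated, leaving the human with exactly one secret (or one fingerprint) to manage.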

Update: I have heard very good things about 1Password, a web-based password keychain, and since I have started using Apple's iCloud Keychain, which syncs across all my devices, a strong password is supplied for each web account, and yet I don't have to remember it. So far, 6 months in, it's working great!

03.27.2013: Technology is the answer. What was the question?
The Hype Cycle

The Hype Cycle: Map your favorite educational technology.

How many times have you heard that some emerging technology is going to solve all of education's woes? In my experience, a technical innovation may allow the job to be done faster, cheaper, or better than before, but rarely, if ever, all three. If you're lucky, you get to pick two! If you're thinking about implementing some new technology that everyone is talking about, it's important to step back and consider its position on the "hype cycle" graph. Google Glass, for example, is just past the trigger point, and visibility is still increasing. MOOCs are at the peak of inflated expectations right now. But does anyone remember Second Life? Once heralded as "the next big thing," it has slid into the trough of disillusionment. Take Second Life out of your resumé, people. It's not doing you any favors. Speech recognition, long ridiculed, is finally climbing out of the trough and up the slope towards a more realistic "plateau of productivity." While still not practical for most uses, it fills a niche for users with repetitive stress injuries that make using the mouse and keyboard painful.

Used as intended, with a realistic appreciation for what it can and can't do, technology can be highly effective. But misapplied, technology can make a real mess of things. As the old saying goes, "To err is human. To really screw up, you need a computer."

One of the debates that rages in my office relates to what to teach people about a new technology. We want them to get excited about new technologies, as we are, and to be adventurous in their teaching. Often, however, people with inflated expectations come to us only wanting to know how some new technology will make their job easier, and they get frustrated when we ask them why they want to use it (what problem are they trying to solve?) or try to explain that there are limitations. They don't want to hear that it won't re-energize their lectures or that it might require just as much effort as what they are doing now.
Let's look at a few examples of useful technologies misapplied, and you'll see what I mean.

Technology: Misuse vs. Proper Use

Video
Misuse: Instructor shows a full-length movie to class in order to take a day off from lecture, catch up on grading, etc.
Proper use: Instructor shows a series of relevant video clips, each followed up with insightful questions and guided discussion to engage the class in critical thinking.

PowerPoint (two ways to wreck a presentation)
Misuse 1: Presentation is viewed in the absence of the presenter, but the bullet points are vague or meaningless without the emphasis and interpretation of the speaker. (Did they think the presenter had nothing of value to add?)
Misuse 2: Speaker, facing away from the audience, reads paragraphs of text from each projected slide, adding nothing of relevance. (Did they think the audience can't read?)
Proper use: Presenter uses prompts on the slides to make key points to the audience, to jog the memory, and to engage the audience in a lively and only loosely scripted discussion.

SafeAssign or TurnItIn
Misuse: Instructor uses tool to fail students for unintentional plagiarism.
Proper use: Instructor uses tool to show students how to properly reference the source materials they cite.

Clickers
Misuse: Rather than make the teaching more engaging, instructor uses clickers to enforce a mandatory attendance policy.
Proper use: Instructor uses tool to assess comprehension, engage students, and deepen their understanding with challenging questions and analysis of why they think what they do.

Your assignment: Expand my table with more examples. Begin with the LMS, Facebook, eBooks, MOOCs, and iPads. All great tools. But are they being used as they should?

03.06.2013: After bad stuff happens :-(
02.28.2013: Latest Reports from the LMS Battleground
10.10.2012: The wave of change that's about to hit higher education
Higher Ed Goes Digital

Big changes are coming to the hallowed halls of higher education. The cost of a four-year degree continues to rise (the reasons might surprise you) and, because state funding for education continues to decline, the consumer is left paying an increasing share of the bill. Administrators, who feel pinched to keep doing more with less and to keep a lid on costs, are pushing for increased class sizes, for more classes taught by part-time instructors, for more online classes, and for the adoption of technologies that automate instruction or reduce the teaching effort per instructor, allowing each one to do more. If we step back and look at the big picture, where is all this headed?

As a result of these coming changes, the tenure-track faculty member who teaches for a living is, by my reading of the situation, an endangered species, and the state-funded, primarily undergraduate university isn't much better off. Don't believe me? Ask any department chair at any public undergraduate institution what happens when a tenured professor (one whose primary responsibility is teaching, not research) retires. While enrollment is growing like crazy (because "college is for everyone"), experienced full-time faculty are being replaced, if at all, by much cheaper and often less qualified part-time instructors. It's happening because technology has been identified as a method for regularizing and further automating undergraduate instruction. Undergraduate university teaching is the delivery of specialized, but fairly standard, information to a large market of adults, for a high price. (K-12 is safe for the moment because teachers not only impart knowledge but also serve as workday babysitters for their young charges.) Sure, experts are still necessary to develop the standardized lessons and content for higher education but, once that's done, it can all be deployed on a massive scale and managed by less qualified people. (Well, that's the argument I hear from upper administration anyway. Whether a less qualified instructor can as effectively grasp and deliver that content is another question, but it's a tradeoff administrators seem able to live with.)

Since the market is large and the price is high, there will be lots of competition for students. With instruction going online, students will no longer be placebound, and course capacities will no longer be dictated by the size of the classroom. In the very near future, students will be able to get an online degree in most subjects from anywhere they choose. Some universities are even racing to grant degrees in personalized learning programs where students can shorten their course of study by "testing out" of classes in which they have "life experience!" (I hope the testing is rigorous and occurs in a proctored environment with ID checks!) When future students are choosing where to go for their online degree, why would they choose your institution? If you don't have a good answer, you'll be in trouble. This change will be highly disruptive. Ask yourself this. What happened to the local video rental stores like Blockbuster when Netflix came along? What happened to the local music shops after iTunes? What happened to the local newspapers after Craigslist became the place for classified ads? What happened to all the independent used bookstores and even the big chain bookstores like Barnes and Noble now that Amazon sells more digital books than paper ones? All of these digital information delivery services replaced their analog counterparts in a very short period of time. With high quality content and lessons coming from the big publishers, written by pedagogical and subject area experts and tailored for the web by skilled graphic designers, the courses developed independently by most professors don't compare favorably. Brick and mortar universities teaching traditionally will be like the small quirky independent bookshops competing against Amazon's vastly greater selection of cheaper content. Most of them will fold. What will happen to all those beautiful campuses and the college towns that depended on them? 
When the undergrad degree goes digital, there will be only a few winners and they will win big. There will also be many losers, as venerable local institutions see in-person enrollment decline and poorly implemented online programs fail to attract and/or retain students. Universities that conduct research and have graduate programs will be less affected, and the private Ivy League institutions will continue to do fine by offering an expensive top-notch traditional education to a niche market, but the community colleges and primarily undergraduate institutions that compete on price and can't differentiate themselves will mostly go the way of the Blockbuster Video stores.

Which organization that you haven't heard of yet will be the Amazon or the iTunes of higher education? Will it be a big publisher like Pearson, or a for-profit online institution like University of Phoenix or Capella? Will it be a currently free option like Coursera, EdX, or Udacity or the Khan Academy? Will it be a highly regarded traditional institution like Stanford or MIT? Or will it be a small regional university like NAU, already accredited and experienced in online delivery to its rural population, that gets it right? It's too early to tell. But there are ways to prosper in this new era. Courses from the for-profits are still generally pretty bad, and the selection from the free services is limited, so there's a window of opportunity for some new leaders to emerge. And while Massive Open Online Courses (MOOCs) are currently getting a lot of attention, they require a level of self-motivation and organization rarely found in our undergraduates. Build better service, better instructors, more courses of study, better-than-standard "canned" content, and more personal touch into our online programs, and we can beat the competition, create more value for the dollar, grow enrollment, and enhance our reputation as a quality online degree granting institution. That will take time and hard work, and it will take a new kind of instructor who knows technology and pedagogy as well as the subject area. And it won't be any cheaper, to the chagrin of those who think that waving some technology pixie dust over the problem will make it all better. But change is coming and academia, steeped in tradition and rife with bureaucracy, is not very good at change, so it's going to be a shock. Are you preparing for the giant wave of change that's about to crash on traditional higher education? Because you can just sit there and get crushed by it, or you can start paddling for your life and ride it into the future!

04.03.2012 Blackboard Embraces Open Source Like a Boa Constrictor
04.01.2012 What Google and Facebook have in common
03.25.2012 Message to the eContent providers
03.20.2012 Textbooks of the Near Future.
01.06.2012 Are we putting the technology cart before the instructional horse?
01.03.2012 Unintended Consequences.
10.17.2011 Quality Matters?

Recently NAU was approached by an organization called "Quality Matters" and invited to become a member. While they are a non-profit, that does not mean they are free. Annual membership dues are required, and the implication is pretty clear. If you say you're not interested, you must not care about quality, right? People pay to be trained as reviewers. People also pay to have their courses reviewed, and they pay to receive the QM seal of approval. Based on the success of this operation, QM could easily spin off some other ventures such as, "Motherhood and Apple Pie Matter," or "Patriotism Matters." Their heart is, to be fair, in the right place. The purpose of this organization is to identify things that make for a quality online course, and use a faculty peer review process to evaluate and certify these courses. This movement wouldn't even exist if there weren't some valid questions about the quality of online courses nationally, and if schools weren't feeling a little defensive about their online programs. I do, however, have some issues with their approach. My first issue is that the focus is on courses delivered online. Their scope does not include courses taught in a traditional manner, and I think we can all agree that some of those must be equally bad or worse! While I'd like to level the playing field and look at all courses, it's maybe a bit unfair to criticize QM for what they don't review. So let's look at what they do review. We will leave aside for now whether NAU should cede its authority over the evaluation of course quality to a body outside the university, and over which we have no control, because the question of who's watching the watchers could be the subject of an entirely different discussion. My biggest remaining issue with the "QM Program" is that online courses can be, arguably, broken down into three major components, and QM deals with only one.
A better name for Quality Matters might be "Let's Focus on One of Three Things that Matter!" In case you're inclined to disagree with me, here are my three components of quality in an online course: 1) Course Design: this is the way the course is structured, how it displays to the user in the online environment, and the instructional methods used, including the identification and measurement of learning outcomes. 2) Course Content: this includes the selection of appropriate materials and the accuracy and depth of those materials. 3) Course Delivery: this includes all of the interactions between instructor and student, and among students. The QM program deals only with Course Design. I'm not saying that design doesn't matter. I'm pretty convinced that it does. Without good design, it's going to be difficult to get out of the starting blocks. But I think I'd like more than one of the three reviewers of my online course to be a "subject matter expert," and I don't think it makes much sense to slap a seal of approval on a course unless the content and delivery have also been reviewed thoroughly. I have seen the disastrous results that occur when you give great materials to a poor instructor. I have also seen the tragic consequences when you combine a dynamic and motivating instructor with materials that are inappropriate for the students, either because the materials are not challenging enough, are out of date or otherwise inaccurate, or are too challenging because the students do not have the necessary background preparation. What I'd really like to see is a peer review program that looks at all of the aspects of course quality described above, and is owned by our own faculty rather than an outside organization. But I think I see the writing on the wall. If we don't start policing ourselves, it may not be too long before someone else is doing it for us.

09.29.2011 The "do-over" mentality in undergraduate education

True story. I have a faculty colleague who had a formal complaint filed against him by one of his students for "discriminating against me on the basis of my intelligence." (The "discrimination" was giving the student a lower grade than some of his classmates, based on the student's relatively poor performance on various assessments.) When the professor agreed that this was true, the student became even more convinced that he had a case! I think this raises an interesting point because the professor in question was using an "old" way of thinking, while the student was using a more modern construction.

When I was in college back in the '80s, I'm not sure there was such a thing as dropping a class. At least, if there was, I never did, and I never knew anyone who did, so it was neither common practice nor a well-advertised option. It just never occurred to me that one could do that. The concept of re-taking a class a second or third time to replace the original bad grade was also completely foreign. When I got the occasional grade that I was unhappy with, I owned it, and there was nothing I could do about it. It was there on my transcript for all to see, like a tenacious piece of gum on the bottom of my shoe. Today, most students would just throw away the shoes and buy a new pair. In my job at the university, we care about student success and we want everyone to get a good grade. We go to greater lengths every year to accomplish this goal, giving students more choice and more flexibility, and we intervene more than ever before to work with students who are struggling. All of this is good, I think. But we rarely think about why this is the goal. Not trying to be cynical here, but let's just step back for a minute and ask ourselves: "Isn't the point of grading students, in large part, to identify (optimistically) which ones have learned or, (pragmatically) which ones have successfully completed the assignments, or (cynically) which ones have successfully jumped through the hoops?"

Question: Is our goal to get everyone over the bar, no matter what it takes, or just to provide everyone an equal opportunity to get over the bar and then report the results? The bar I refer to here, of course, is "learning," even if measuring that intangible substance requires cruder instruments like tests and other assessments. If everyone gets unlimited chances to get an A (assuming here that letter grade correlates with learning achieved, so you can substitute A with "learned a lot" and F with "didn't learn a thing") by the process of do-overs, remedial work, tutoring sessions, interventions, etc., then aren't we artificially leveling the playing field? Aren't we devaluing the A earned with hard work and without extra credit? Would you rather be seen by the doctor who got an A in Biology the first time through without any outside help, or the one who was failing the course and dropped it, took it again and got a D, found an easier instructor and took the course a third time, got a B- and, with a bunch of intervention, tutoring, and extra credit, got the B- rounded up to an A, which replaced the D on the transcript? I suppose that student has perseverance at least! Of course, there's an old joke: What do you call the medical student who graduated at the absolute bottom of his class? Doctor! Hah :-)

Why has it come to this, and how has it come to this, and is this where we want to be, and, if not, how do we get someplace else? I think part of the reason we have arrived at this point is that so many more kids are going to college. College really is the new high school. Michael Wesch, whom I admire and mostly agree with, says "College is for everyone." True. Certainly part of the problem, though, is that if everyone is being admitted, more students are arriving unprepared. More students are here not because they want to be, but because they feel compelled to be so that they will be competitive for a job at the other end. This also explains the impatience of many of our students, who don't really love to learn or want to broaden their minds. They "just want a job, ok, and could you please show me the fastest way out of here?" I'm sympathetic. Who wants to spend $40,000 (minimum) for a bachelor's degree that still can't guarantee them a job? And certainly part of the problem is that universities love all the extra money that's coming in, but feel a twinge of guilt when those students who aren't prepared don't succeed. Legislators and administrators, who hear from the howling parents who pay the bills of these mediocre students, put pressure on faculty to do better. By "better," they mean graduate more students faster with better grades and with less funding. If we rule out the easy way (just lowering standards), and take the challenge to "do better" seriously, what's left?

Solutions: 1) Placement. Students should not be admitted to the university if they are not capable of succeeding, and students should not be allowed into courses for which they have a high probability of failure. We can pretty accurately predict success with placement tests and we need to do this more. 2) Remediation. If students arrive without the skills but it is possible to teach them those skills, they need bridging courses to get them there. 3) Academic probation and dismissal. Students who are not succeeding, and who are not likely to turn it around, should not be strung along. 4) Monitoring. Technology can be used to monitor student progress so that intervention occurs quickly before students spiral downward. We do all of these things now. We just need to do them more, and better. But the following are not generally addressed at all. 5) Instruction. Most faculty arrive with good content area knowledge but limited teaching experience or know-how. This can be addressed, but it would take a mind shift for the university to accept that this is a problem. 6) Compensation. Little attention is paid to the quality of instruction. Typically, only instructors with high D/F/W (drop, fail, withdraw) rates get any attention from administration, and this negative attention can easily be avoided by lowering standards and giving lots of As. But standardized tests, with all their flaws, can measure incoming and outgoing students and be used to reward instructors who show the gains. Will this lead to "teaching to the test?" Possibly. But if the test is good, that's not the worst problem to have. 7) Peer review. Research faculty know all about peer review. It's how they get articles published in good journals. But in the classroom, instruction is siloed. Nobody watches anyone else teach or gives them any tips on how to do it better. Sure, there's muttering in the hallways about which instructors are too easy, or just plain bad, but nothing gets done about it.
This could be fixed if there were the will to do it, but again, it would require a major shift in faculty culture. 8) Reporting. Something I've never heard mentioned anywhere is that universities really ought to report not just the grade a student receives, but how long it took the student to get there, and by what path. We have this data. We could put some of the rigor back into transcripts that are packed with As by reporting the information employers want to know: How much external time and effort was expended to get this student over the bar? 9) Tracks. I know it's sacrilege but, while college is for everyone, the liberal studies degree is not. Universities need to rethink degree granting with an eye towards certificates and diplomas that lead directly to a career path. Want to be a salesman, a dental hygienist, an X-ray technician, a database programmer, a forest ranger, or a cop? Sure, a bachelor's would be helpful, but it's probably not something you "need." Want to be an astrophysicist, a historian, or a philosopher? Ok, get the bachelor's. But here's something else we should tell incoming freshmen and rarely do. If you get the bachelor's, you probably don't need to come back to school when you change careers, as most of us do these days. With the certificates, you probably do.
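The "reporting" idea in point 8 is straightforward to sketch: from raw enrollment records, a registrar could derive how many attempts each final grade took and the path of grades behind it, rather than publishing only the last (replacement) grade. Here is a minimal illustration of that derivation; the record format, course names, and field layout are hypothetical, not any university's actual data model.

```python
# Sketch: derive attempt counts and grade paths from raw enrollment
# records, instead of reporting only the final replacement grade.
# All data and field names here are illustrative.
from collections import defaultdict

records = [
    # (student, course, term, grade)
    ("s1", "BIO 181", "F2009", "D"),
    ("s1", "BIO 181", "S2010", "B-"),  # retake replaces the D
    ("s2", "BIO 181", "F2009", "A"),   # first-attempt A
]

def grade_paths(records):
    """Group records by (student, course) and report how many
    attempts were made and the sequence of grades earned."""
    paths = defaultdict(list)
    # Toy assumption: term strings happen to sort chronologically here;
    # real data would need a proper term-ordering key.
    for student, course, term, grade in sorted(records, key=lambda r: r[2]):
        paths[(student, course)].append(grade)
    return {
        key: {"attempts": len(grades), "path": " -> ".join(grades)}
        for key, grades in paths.items()
    }
```

On a transcript, "BIO 181: B- (2 attempts, D -> B-)" tells an employer far more than "BIO 181: B-" alone, which is exactly the rigor the paragraph above argues for.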

12.12.2010 Why going "TSA" on web classes just won't work
06.14.2010 What Google should do next
06.11.2010 Why NAU's Mobile Computing Policy needs rethinking

NAU has upgraded its wi-fi system. The new one works very much like the ones you've encountered in airports and hotels. That's the first problem. It's a university, not a hotel. Regular users of the wireless have to agree to terms EVERY TIME they connect. If your smartphone goes to sleep to conserve power or you close your laptop to move from one location to another, when you wake the device up you need to reconnect and agree all over again. That's just silly for a system designed primarily for regular users (not one-time guests). While this is annoying for laptop users, it's a downright nuisance for people with wi-fi capable smartphones and tablets. But there is a BETTER WAY. The MAC (media access control) address of every wired computer on campus is registered. If regular users of the wireless could register their devices too, then the agreements could be logged once and filed away. Sure, the agree screen should pop up for unregistered guests (parents, vendors, etc.) or when the policy changes. But for the regular users, most of the time, this shouldn't be necessary.
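The registration scheme described above boils down to a small lookup in the captive portal: known MAC addresses skip the agreement page, while unknown devices (and registered devices whose owners haven't accepted the latest policy) get prompted. Here is a minimal sketch of that logic; the data structures and version-bump convention are my own assumptions, not how NAU's actual portal works.

```python
# Sketch of captive-portal logic with a device registry: registered
# MACs that have accepted the current policy skip the agree screen.
# All names and addresses here are illustrative.

AGREEMENT_VERSION = 3  # bump this when the acceptable-use policy changes

# MAC address -> version of the agreement the owner last accepted
registered_devices = {
    "a4:5e:60:c1:22:10": 3,  # up to date; never prompted
    "00:1b:63:84:45:e6": 2,  # accepted an older policy; prompt once
}

def needs_agreement(mac: str) -> bool:
    """Prompt only unregistered devices, or registered ones whose
    owner hasn't accepted the current policy version."""
    accepted = registered_devices.get(mac)
    return accepted is None or accepted < AGREEMENT_VERSION

def record_acceptance(mac: str) -> None:
    """Log the acceptance once, so the device isn't prompted again."""
    registered_devices[mac] = AGREEMENT_VERSION
```

This preserves the behavior the policy actually needs (guests and policy changes still trigger the agree screen) while sparing regular users the repeated prompts.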

The second problem is security. If you try to access a web service other than a browser, you never see the agree screen so you can't connect. Even in a browser, the agreement screen doesn't always appear, and that has negative consequences downstream. If your home page is set to an NAU website, you won't be prompted to agree because the NAU domain is a "trusted site." But unless you agree, you can't join the VPN (and you're not told why; it just fails to connect) so your session is insecure and you are transmitting passwords and credit card numbers unencrypted. And even if you do agree, many people don't take that final step and join the VPN because they don't have to; the wireless works even if you don't join the VPN! Sure, there's a warning on the agreement screen, but it's buried in a page of legalese and who reads that stuff anyway? So, aside from a handful of tech people who know better, most of our wireless clients are surfing the web without encryption. Don't believe me? Ask your colleagues if they connect to the VPN while using the wireless, or if they even know what the VPN is! This makes the majority of our clients easy pickin's for any geek with a packet sniffing program and a few idle minutes in a public space! Is this bad? Think of it this way. It's the digital equivalent of walking down the street naked in the middle of winter. Normally the security-centric IT folks would be all over an issue like this, but they're not. They know about this problem, but they don't choose to fix it. Is it because there is a way to be secure and it's buyer beware? Or because they have a reactive (see problem 3 below) strategy? I don't know. But there are two proactive ways to fix this problem: technical (don't allow insecure connections) or educational (teach people about how and why to encrypt their wireless sessions).

On to the third problem. People are increasingly showing up on campus with tablets and smartphones. These devices are almost always the property of the user, not the university. But the university insists that, for the privilege of checking my work email on my personal device, a password lock with a 15-minute timeout must be installed, and that the password must be strong (hard to remember) and non-repeating. Worst of all, the university wants to be held harmless for remotely wiping my entire device without my express permission if my password is entered incorrectly too many times. This is security overreach at its worst. Restrictive policies such as these are typically written by big corporations with trade secrets to protect, which provide company-owned mobile devices to their employees for work purposes. That situation doesn't apply here. The university ought to be thrilled that faculty and staff would want to check their work e-mail on a device that cost the university nothing, and which is carried around with them during every waking moment. There is a BETTER WAY. Our policy should not discourage the use of personal mobile devices by faculty, staff and students. We need a much more flexible and less restrictive policy for personally owned devices which contain mostly non-NAU data, or users will not connect them to the services we want them to use.


Workarounds: Rather than configure IRIS in my e-mail client, I use Outlook Web Access through my mobile device's browser which, on the downside, requires a login every time but at least doesn't require me to agree to a remote wipe of my personal device. Students use Google's GMail system, which doesn't require the remote wipe. And everyone should remember to use the VPN. As for the frequent Agree prompts...don't we all agree this is just a silly waste of time?

Update: 07/11/2013 The system has changed again. The newest wireless has a more secure "NAU" network which requires a one-time login with your NAU username and password, and a less secure "Public" one for guests which works much as described above. This is a big improvement. The NAU network still doesn't require you to use a VPN for greater security, but at least it only bugs you once for a login. And the public network works as it should, prompting guests to agree to terms each time they connect. We're slowly getting there!

05.05.2010 College is for Everyone, so Attendance is Mandatory!
04.20.2010 LMS Decisions
04.12.2010 The Hacker-Hipster Manifesto
02.19.2010 What is up with Google lately?
01.22.2010 Working and Learning through Snow Days, Swine Flu and Other Disasters
01.04.2010 Clickers: Treating the symptoms or the disease?
12.20.2009 Spreading the FUD
10.14.2009 NAU adopts MS Exchange; increase in productivity negligible
10.02.2009 How to get attention in Academia
10.01.2009 Universal Design
09.30.2009 Should NAU site license the MacOS as well as Windows?
09.01.2009 Marketshare change among LMSes over time
05.26.2009 Mac Growth in Higher Ed
05.21.2009 Microsoft on the move?
04.15.2009 Free and Open Source Software in the Enterprise
Why Computing Monopolies are Bad
How fast is your network connection?
Data Visualization
Mossberg puts his finger on it, and his foot in it.
Why can't Microsoft get it right?
The truth about telecommuting

Recently our university has been encouraging employees to turn off lights, turn down thermostats, and attend conferences virtually instead of in person, ostensibly to reduce our global carbon footprint and be more green. It's hard to tell whether that appeal is of pure intent, though, because it both helps the environment and saves the university money, while placing most of the hardships on the employees. It reminds me of the hotels that plead with you to save the planet by not having your sheets changed and your towels washed each day, but then serve breakfast on disposable plastic plates, cups, and utensils. Clearly, the green they are most interested in saving is their money. If the appeal were that they could lower your bill if you were less wasteful, that would be a far more sincere and convincing argument.


What kind of green are we being asked to save?

Are we as quick to do things that are good for the planet if they don't save us money? We have an award-winning LEED-certified (platinum level) green building on campus, which is great, but why aren't all our new buildings green? At least part of the reason is that they are more expensive to build. Having one makes a statement. Having many is a bigger commitment. What about things that could help the environment, but would force us to change our time-honored management practices, like telecommuting? Telecommuting takes cars off the road, reduces parking demand on campus, saves the workplace electricity, and potentially lengthens the workday by eliminating commute time. It also increases efficiency by eliminating many of the distractions of the office. When I telecommute, I can be incredibly productive in my quiet home office. Unfortunately, management can be suspicious of telecommuting because of the assumption that an employee in his/her seat at work can't goof off, and because an employee working from home can't be as closely monitored. Here's the truth about telecommuting. If managers set clear deliverables and check progress regularly, then it's the quality and quantity of work, and not the number of hours spent warming a seat, that gets measured. Bottom line? If you have responsible employees and effective managers, telecommuting works great. But it's kind of lonely. And if you have flaky employees, there's no guarantee they're being productive even if they warm their seats at work for eight hours a day.

Blackboard's Scholar
Learning Spaces
Podcasting with iTunesU
Gaming on Campus
©2007-2016 Larry MacPhee | IM: | Skype: larryrmacphee | google: larry.macphee | 928-523-9406