Larry MacPhee: e-Learning



11.02.2016 Why the way we assess students makes no sense.

Traditional testing forces students to cram, regurgitate, and forget.

Have you ever thought about why we test students the way we do? What do I mean? Well, we generally test students in isolation from each other. We generally disallow aids like notes, calculators, textbooks, and cellphones. We ban the use of Google and Wikipedia. We set strict time limits and restrict students to their seats. We use a lot of multiple choice and fill-in-the-blank, with perhaps a smattering of short essay. Someone watches them constantly. Now, you may be thinking, "Of course. How else can we keep them from cheating? How else can we find out what they know? How else can we keep them from helping each other?" I would argue that those are the wrong questions. Sugata Mitra has an interesting TED talk in which he develops the idea that the present-day education system remains much as it was designed by the British Empire in the 18th century. At that time, what was needed were clerks and bookkeepers who could do math in their heads, and read and write without help, primarily to keep track of goods moved around the world in sailing ships. He argues convincingly that the education system isn't broken. It works remarkably well. It's just that it trains students to do things for which there is little need in the information age. Rather than testing for the ability to memorize and regurgitate without understanding, we need to redesign assessment around collaboration, persistence, synthesis, and creativity.

When we attempt to solve a problem at home or at work, what are the first things we do? Gather some background information. Consult an expert. Get some help. Brainstorm. Try more than one approach. Keep at it. None of these methods are allowed during a test, but this is the way we solve problems in the real world. Sure, we need to have the vocabulary. When I go to the hardware store, I need to be able to explain the problem so they can recommend the right tool. Yes, I need some basic understanding as a foundation. But why, in the 21st century, when all of the knowledge of humanity is a few clicks away, must I regurgitate memorized facts on an exam without any help? How often would I not have access to these resources in the real, everyday world? Perhaps if I'm lost in the woods, and my cell phone is out of juice, then I would need to solve a problem in isolation and without assistance. But that seems more like the exception than the rule. Cramming for a test, regurgitating a collection of memorized facts, and forgetting it all the next day is like being a bulimic. There is little educational value in consuming information that you can't retain, just as there is little nutritional value in eating food you don't keep down.

Most problems we face in the real world don't occur in an isolation chamber. Nobody hovers over you with a stopwatch. They don't require that all of the knowledge needed to solve the problem already be in your head. They don't require you to stay seated, or to work alone. They don't present you with five distinct choices, only one of which is correct. They don't allow you only one attempt. That would be crazy. And yet, that's exactly how we test students, from elementary school all the way through college. Think about these questions for a bit. What kinds of students are successful at that kind of testing? How well does that reflect their future performance on the job? What skills do employers regularly ask for? When hiring someone, is it more important that they already know how to do the job, or that they are creative, persistent, able to learn, and able to work well with others? How well do we prepare students for the challenges they will face?

What are the skills we need to employ in modern-day problem solving? Usually, they involve gaining an understanding of the problem, either by doing research or by getting help from someone who knows more about the topic. Once we understand the problem, we develop one or more strategies to solve it, based on cost, time, effort, and available resources. Often, the first solution is inelegant, but it might be good enough. "Fail small, fail often" is advice I've heard from many successful problem solvers. Don't be afraid to try things. Break the problem into pieces and solve each part separately. Creative solutions rarely come from aiming directly at the problem and going full speed ahead. But the key point here is that we learn to be creative by attacking problems not with a head full of facts, but with a kit full of tools that can be used again and again. You may be thinking that I've got a point, but that it's easier to grade answers right or wrong when we test facts, not opinions. However, it's actually not so hard to grade students a better way. You look at how they tackled the problem. It's the difference between awarding points only for the answer versus asking students to show their work and evaluating both the quality of the end product and the sophistication of their methods. Let them work in teams. Let them use any resources they can get their hands on. This is an approach to teaching and learning that actually prepares students for a job in the real world.

"But wait," you're saying. "If I assign group work, how can I tell who did what?" Yes, that can be tricky. We've all been assigned to a team where one person does almost nothing, and gets the same amount of credit as those who pulled most of the weight. That's a problem with the way the group members were evaluated. But guess who knows who did which parts of the job, and how well they did them? The members of the group. A very clever way to grade students is to have them evaluate their own performance and that of their fellow group members by secret ballot. Average out the peer grades and compare them to the grade each student gave themselves. You'd be surprised how accurately this will match your own observations, and how well it reveals who did the work. Of course, you also assign the work an overall grade, so that if everyone agrees to give themselves higher grades than they deserve, there is a correction factor. This method may need to be employed more than once before students realize that they are accountable for their actions, so don't give up after just one try. You will find that it becomes even more effective as time goes on.
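To make the arithmetic concrete, here is a minimal sketch of how the ballots might be tallied. The 0-100 scale, the 50/50 blend with the instructor's overall project grade, and the names are my own assumptions for illustration, not a prescribed formula.

```python
# A sketch of the secret-ballot grading scheme described above. The scale,
# the blend weight, and the names are invented for this example.

def grade_group(ballots, project_grade, peer_weight=0.5):
    """ballots: {rater: {ratee: score}}; every member rates everyone, self included."""
    members = list(ballots)
    results = {}
    for m in members:
        # The peer average excludes the member's own self-rating.
        peers = [ballots[r][m] for r in members if r != m]
        peer_avg = sum(peers) / len(peers)
        self_score = ballots[m][m]
        # The instructor's overall project grade acts as the correction factor
        # against a group that agrees to inflate each other's ratings.
        final = round((1 - peer_weight) * project_grade + peer_weight * peer_avg, 1)
        results[m] = {"peer_avg": peer_avg, "self": self_score, "final": final}
    return results

ballots = {
    "Ana": {"Ana": 95, "Ben": 70, "Cam": 90},
    "Ben": {"Ana": 90, "Ben": 90, "Cam": 85},
    "Cam": {"Ana": 92, "Ben": 65, "Cam": 88},
}
for name, r in grade_group(ballots, project_grade=85).items():
    print(name, r)  # a big gap between "self" and "peer_avg" is worth a look
```

In this toy ballot, Ben rates himself far above what his teammates gave him, which is exactly the kind of gap the self-versus-peer comparison is meant to surface.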

There is another thing you can try when assigning group work, if you're still having challenges. Identify the different kinds of work necessary to put together the final project. For example, in a lab experiment, one person is the group manager, whose job is to lead, organize, plan, make decisions and settle disputes. Another is the experimenter, the hands-on person, who must be good at understanding and following instructions. A third is the data collector, who might also be in charge of creating graphs and charts. A fourth is the analyst and writer of the report. A fifth is the presenter. These are somewhat arbitrary divisions of responsibility, but you get the idea. When you assign duties within the group, people sort themselves into the kind of work they like to do. Students who hate to get up in front of others and talk might be excellent writers. Students who like to present might not want to get their hands dirty, or be good at following detailed instructions. That's ok. Everybody can make a contribution. And, if someone really wants to work alone, let them. As long as they understand they have to do the same amount of work as a whole group would, that's fine. That's how the world works.

9.21.2016 The Classroom of the 21st Century

Panel discussion on 21st Century Classrooms

Recently, I was asked to speak about what I envisioned as the "classroom of the 21st century." Given that we're already 15 years into the new century, that might seem like an easy task. However, when asked to predict the future, I always think of the scene in the movie "Metropolis" where the biplanes are flying between the skyscrapers. Or in "2001: A Space Odyssey," where the spaceplane is owned by the now-defunct Pan Am airline. The future tends to unfold in ways we can't imagine. Some of the (at the time) amazing technologies of Star Trek already look primitive compared to today's devices. Others remain as far away as another star system.

The venue was a technology conference sponsored by technology vendors, so the expectation was that we would talk about technologies that would transform education as we know it. There were impressive demonstrations of virtual and augmented reality, and telepresence; things that require lots of bandwidth, robust connections, serious computing power and, most of all, a lot of back-end technical support. One of the presenters showcased an elementary school in Ireland that was doing cutting-edge stuff with VR, but noted with incredulity that, in one of the slides, the child was sitting in front of a CRT monitor rather than a flatscreen. Another presenter lamented that one of the challenges in K-12 is that the computers found in public schools often don't have video cards capable of keeping up with his 4K video. Unwittingly, these presenters got to the heart of the problem. Public schools, even in the 21st century, don't throw away old equipment that still works. A cathode ray tube monitor may be a throwback to the 1990s but, if it's working, it will continue to be used, and limited funds will be diverted to higher priorities, like things that don't work at all. In my experience, schools don't even throw away broken stuff, because they may need to cannibalize it for parts. Teachers are resourceful and frugal.

Now, if you've read any of my previous blogs, you already know that while I like technology, I am often skeptical about expensive technical solutions to pedagogical problems. When I taught high school in southern California, one of the teachers, a friend of mine, worked in a south-facing classroom with a big bank of windows, and actually passed out in front of her students one sunny spring day while teaching in the 90+ degree heat. The administration had a priority system, however, for installing air conditioners. If the room had a computer, it could have an air conditioner, because the administrators didn't want the computers damaged by operating in excessively warm conditions. At my urging, she requested a computer, for which there were ample technology funds, and she got an air conditioner as part of the bargain. I don't think she used the computer at all, but she let the kids who finished the lesson early play solitaire as a reward. And she effectively delivered her lessons in a classroom that was at a pleasant 70-something degrees. That's the kind of creative thinking it takes to operate in the 21st century classroom.

iPads are a recent example of a technology that was supposed to transform education. Large school districts made multi-million dollar deals with Apple to put an iPad in the hands of every student. They were supposed to replace textbooks. They were supposed to fill students with wonder and the passion to learn. That didn't work. A few years down the road, many of these districts are finding that iPad management tools are lousy, the devices are fragile, and they become obsolete far too fast for the money they cost. But the biggest problem of all might be that many students already have one at home and use it primarily for playing games, so that's what they want to do with it at school. Many districts are now dropping iPads and looking at cheaper, more rugged Chromebooks. While these may prove to be a better investment, the top-down approach to deploying technology remains a problem.

Top-down technology investments in education assume that if you make a technology available, the teachers, who are dedicated professionals, will figure out some appropriate instructional uses. This is the wrong approach. We need to design curriculum with the learning objectives in mind at the beginning, and then provide the necessary instructional support resources, including but not limited to technology, to help make the learning happen. That means we need to ask teachers what they need and, where reasonable, do what we can to meet those needs. Some teachers might make great use of iPads, while others would prefer video cameras, or art supplies, or new textbooks. When the teachers at my school were asked to help design a new science building, they asked for lab benches at the back of the classrooms, including big flat work surfaces, with clean sight lines, and lots of storage cabinets. The need was easily met, and the rooms were both popular and effective. However, the architect decided to ignore their request for windows, so the biology classrooms can't have any live plants. A classroom designed around the needs of the individual teachers and the learners is going to be far more effective than a top down directive to use this or that gadget to transform the learning process.

Sometimes a technology is offered as a solution to a problem because of cost or practicality. One example I heard recently was that field trips and labs are too expensive, so let's "virtualize" those experiences, and then the students can learn about the Grand Canyon, or a marine ecosystem, or the anatomy of the frog on the computer. The result is never as good. Intuitively, we all know this, and yet we are lured by the promise that it will be just as (almost as?) good, and will cost less, or be safer, or will not require tedious permission slips, etc. But the experience isn't the same. The canned tour of the canyon doesn't include a tough hike. It doesn't smell like hot dry air and desert flowers. It isn't nearly as fun. I can't flip over a rock in the simulation, because the designer didn't program that option. The slippery boulders, the icy cold water, and the experience of catching a fish out of a tide pool with a dip net are so much more vivid than the best sim. And it turns out that the infrastructure and technology required to design, deliver, and support a really good virtual reality experience is incredibly expensive; perhaps more expensive per student than a dozen field trips. A cadaver lab is very expensive, but I want my medical students looking at real specimens rather than photos and canned simulations. How else can they discover individual variability? Or the effects of aging or disease? Or the differences between male and female? When a doctor goes into surgery, you don't want to hear the words, "It didn't look anything like this in the simulation." Even if it costs more, the experience is worth the price if the students remember it years later, and that's much more likely with the real thing than with a simulation.

So what does a classroom of the 21st century look like? I have two kids, ages 10 and 13, in the school system right now, so I can tell you. Generally, it's a poorly designed, overcrowded room in a decades-old building in need of major maintenance, with peeling paint, lousy acoustics, a heater that clanks all day long under flickering fluorescent lights, a mix of brand new and ancient, working and broken equipment, a lot of duct tape and plastic buckets, and a whole lot of heart. The passion and drive of dedicated teachers is what keeps it all going. OK, so that's what we've got. In my dreams, what could we have?

The top priority is support. Give teachers the support they ask for. Did you know that teachers supply most of the classroom materials they need out of pocket, and rarely get reimbursed? Putting an expensive technology that nobody asked for into a classroom is not going to be effective. Raising salaries, reducing class sizes, or providing funds for a teacher's aide, or fixing the broken desks and chairs is generally what teachers are looking for. If they want to use a technology, by all means, see if it can be provided. But ask them what they need rather than tell them what they should be doing. That alone would revolutionize teaching and learning in any century.

Even if the teachers wanted a specific technology, and got it, the support for it tends to end when the technology has been set up by a technician. How many bad PowerPoint presentations have you seen? Was PowerPoint broken? Nope. It was misused. Without the kind of tech support and professional development needed to ensure that the technology is used properly, these high-tech initiatives always fail. There are some great and effective uses for technology that can facilitate teaching and learning, but the initial investment must be followed up with funds for ongoing maintenance, tech support, training, and planning.

Some people imagine a future where students are taught entirely by computers with clever algorithms that adjust the content to the pace of the learner. In my mind, this is like getting your nutrition from a pill. In the near future, I don't see a computer, no matter how cleverly programmed, inspiring students the way a good teacher can. If we invest in teachers, and let them pick the technologies they want to use, the classroom of the 21st century could really knock our socks off! It might not be cheap but, as the joke goes, "Education might seem expensive, until you consider the alternative."

9.06.2016 College is for Everyone?!

Michael Wesch is one of education's big thinkers. One of his notable sayings is that "College is for learning, and everyone can learn, so college is for everyone." That's a lovely sentiment. On the surface, it seems so obvious that it sounds like a human right. It should be in the Constitution. Life, liberty, the pursuit of happiness, and a college education. Some have even gone so far as to say that a college education is such a fundamental right that it should be free for everyone, paid for with tax dollars! But as great as it would be for everyone to be smarter, and as nice sounding as Wesch's syllogism may be, he couldn't be more wrong. The transitive property works in math class, but his logic is deeply flawed. College is not for everyone.

It's September, and I work on a college campus, and I see so many smiling, happy students walking around, riding their skateboards, playing with their smartphones, rushing for fraternities and sororities, drinking their $5 coffees in their exercise wear. And I think to myself, "Why are you here?" I have heard all of the answers many times. Senior faculty say, "you're here to learn how to think." Junior faculty say, "You're here to learn about this subject." Serious students say, "I'm here because I want this or that career and, to get there, I need this or that degree." Less serious students say, "Well, I graduated from high school, and college sounds like a good party." Parents say, "A college degree helps you get ahead." The Administration says, "We offer a number of excellent programs, and if you are admitted, (assuming you pay, and you work hard) you can get a degree in this or that." All of these statements are mostly true. And yet, college is not for everyone.

A person can learn a lot without spending any money on a college degree. Wikipedia contains all of the information I studied in college. Before the Internet, all of that information was also in the public library. What colleges provide, for a price, is a certificate that says you have progressed successfully through a course of study. The degree is to knowledge as money is to gold. The degree and the dollar are only pieces of paper that represent something of intrinsic value. The employer accepts the value of the diploma. But what happens if a nation just starts printing currency without the resources to back it? What happens if colleges start cranking out diplomas for everyone who shows up with a student loan? And even if we don't grant that standards have slipped (though it seems likely they have, since there is not an unlimited supply of deep-thinking, hard-working students, and the number of qualified, full-time faculty is actually in decline at most institutions), if there are more people with diplomas and the number of jobs remains constant, the law of supply and demand takes effect. The supply of diplomas outstrips the demand of employers. It becomes a buyers' market, and the value of your degree slides.

I would argue that a college education can be a great and wonderful thing if one takes it seriously. But the growth in enrollment, and the growth in cost, to me, looks an awful lot like the recent housing bubble. I just watched an excellent movie called "The Big Short" and one of the points it made is that housing prices kept rising through the nineteen-eighties and nineties, and houses kept selling, even while incomes remained flat. How was that possible? It was possible only because people were getting loans for houses they couldn't afford. Today, there are an awful lot of people taking out big loans for college degrees and those degrees, in many cases, are not leading to good paying jobs. Doesn't that sound like a bubble?

So why isn't college for everyone? Because as more and more people go to college, the value of the degree is diminished. Most employers look for a degree but don't pay much attention to the school you got it from, or the grades you got. Therefore, with the exception of a few elite schools, all degrees are equivalent. Most people today don't end up working in a field related to what they studied in college. Most employers are looking for things you didn't learn in college. As the diploma is devalued, employers will need to use other criteria to decide who to hire. That's already happening. Everyone who applies for a job has a degree. It remains a prerequisite. But for how long? How long before employers start saying, "anyone can get a degree, so we don't care whether or not you have one." And if the degree stops giving people an advantage, it won't be long before people stop paying for it. At some point, doesn't it seem likely that this bubble will pop?

If you're not serious about your studies (and, honestly, how many kids fresh out of high school are?), chances are pretty good that your grades are going to suffer. After all, you have shelter, money, food, you're surrounded by people your own age, and you're out from under mom and dad's rules. Party on, right? That's not to say that college is only about academics, but academics should be at least one of the reasons you're there. A gap year or two, working a job, might be a very valuable experience, even if it's not, and maybe especially if it's not, the greatest job. There are lots of things to learn outside class. Learning how to live on your own, cook and clean for yourself, and figure out who you want to be are all important. But you don't necessarily need to be in college, at least not right after high school, in order to do those things. Learning how to show up on time, get things done, deal with bad customers, bad bosses, bad colleagues, and bad roommates, live within your means, and pay your bills are all very valuable life experiences we don't learn in high school or while living at home. So why not save some money, think about what you want to do with your life, grow up a little, and bide your time? And if you decide college is not for you, there are lots of careers where you can do just fine without a college degree. In Germany, for example, only about the top 30% of high school students go to college. Many more go to two-year trade schools or apprenticeships and get jobs in areas where a four-year degree is not necessary, such as factory worker, physician's assistant, plumber, technician, carpenter, or bank clerk. When they change jobs, they may require more training, but a four-year general degree isn't what they need.

College isn't for everyone. Yes, everyone should have the opportunity to go, if and when they are ready. But you need to know what you want to do with your life, be open to new ideas that will make you rethink what you thought you knew, be passionate about learning for its own sake, and be ready to work harder, study harder, write better, and think deeper than you've ever thought before. College recruitment ads should be like those drug commercials:

"College is not for everyone. Known side effects include crippling debt, confusion, exhaustion, disillusionment, joblessness, and depression. Consult your professor to see if college is right for you."

By the way, just in case you don't want to take my word for it, the guy who made billions shorting mortgages when nobody else thought there was a housing bubble is now shorting the for-profit colleges.

8.15.2016 Adaptive Courseware: The Good, the Bad, and the Ugly

The Good:

NAU has received a grant from the APLU (the Association of Public and Land-grant Universities) to explore adaptive courseware with our faculty and students. Before we get into the details, it's a good idea to unpack those words. Courseware is content in software form, rather than a physical, bound, paper textbook. It's also a collection of assessments (tests) of the learner in various machine-gradeable (multiple choice and similar) forms. But e-books from the big publishers like Pearson, Wiley, McGraw-Hill, and Cengage have been around for a while and are in fairly general use. It's the adaptive part that's new. Both the course content and the assessments can be adaptive.

So what do we mean by adaptive? The basic idea is that the software monitors the way students progress through the course material and how they do on the assessments, records their responses and their path for analytic purposes (noting parts of the course that are causing problems for students, for whatever reason), and presents different content and test questions based on their responses. This makes perfect sense. If a student isn't getting it, or if they are bored silly, it does no good to keep throwing more of the same stuff at them. It makes much more sense to divert the struggling student into remedial or preparatory content, and to present deeper, more sophisticated content to the student who isn't being challenged.
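As a toy illustration of that branching logic, here is a minimal sketch. The thresholds, track names, and the idea of averaging the last few scores are my own invented assumptions, not how any vendor's engine actually works.

```python
# Toy model of adaptive routing (invented thresholds; not any vendor's engine).
# After each assessment, the engine looks at the student's recent scores and
# routes them to remedial, standard, or enrichment content.

def next_content(score_history, low=0.6, high=0.9):
    """Pick the next content track from a list of scores in [0, 1]."""
    recent = score_history[-3:]        # consider only the last few attempts
    avg = sum(recent) / len(recent)
    if avg < low:
        return "remedial"              # struggling: shore up prerequisites
    if avg > high:
        return "enrichment"            # not challenged: go deeper, or finish early
    return "standard"                  # on pace: continue the main path

print(next_content([0.4, 0.5, 0.55]))   # struggling student -> remedial
print(next_content([0.95, 1.0, 0.92]))  # advanced student -> enrichment
```

A real adaptive engine also tags content and logs each student's path for analytics, but the core routing decision is essentially this kind of branch.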

What are the reasons students have problems with the content? The three most obvious ones are that 1) the content is unclear, 2) the content is not engaging, or 3) the content is not appropriate to the student's level of prior knowledge. With data, we can fix these things and make the course better, with the ultimate goal of aiding the learner.

Why is this, potentially, a great idea? Retention of students is better than losing them and recruiting new ones. Making sure that students are ready for the courses they are enrolled in increases student success in a meaningful way, makes professors happier, and ensures that the reputation of the institution improves.

How does this transform education? As content is personalized, it breaks the traditional lock-step approach, where (if we're lucky) the middle 50% are getting it, the bottom 25% are failing, and the top 25% are bored. It allows struggling students to get the background information they need to succeed, and it allows the advanced student to finish early or go farther, and to have that deeper level of mastery get recognized. It allows everyone to proceed at their own pace, and to spend as much time as they need on the parts that are challenging, and to go deeper into material that interests them.

Well, that all sounds great. What's the catch? That's next...

The Bad:

For our first round of product reviews, we're looking at four vendors: Acrobatiq, Knewton, CogBooks, and Learning Objects Difference Engine (a division of Cengage). What we have quickly realized is that each of these tools has a very different user interface, which goes against the efforts we've been making to create a more consistent look and feel across all NAU courses. This makes it more difficult for students to navigate, because every course might be very different in its layout and design. It also relegates our learning management system, Blackboard, to the role of a portal. Students log into Blackboard, find the course, click on a link, and then leave the LMS and land on one of many possible courseware sites. With lots of integration work, we can get grades back into the LMS, but that's about it. Nothing lives in the LMS.

We have also realized that adopting these tools results in a considerable loss of intellectual freedom over content and delivery of a course, because the content is deeply intertwined with the adaptive engine. In order to get good analytic data on learner behavior, content needs to be thoughtfully tagged, and progress needs to be carefully tracked, and this makes authoring content more challenging than just knowing the subject matter. In most cases, authoring is a job for professionals, not content experts. From the vendor presentations, it's not clear what the role of the instructor is with these tools, and therefore it's hard to see where our faculty experts can add value or personalize the content.

Difficulty customizing content raises another question. How does one school differentiate itself from another if they are both using the same product, and the product is not very customizable? How does NAU compete against a school with lower tuition costs, for example? How does an NAU instructor integrate content on life at elevation into a biology course; something that is both relevant and interesting when you live at 7,000 feet? When students can shop around for courses, why would they pick our version?

Another common objection is that students who don't get it have more work to do, or that students who are doing well get more work piled on. This may seem reasonable, but it is a very foreign concept for most students, and the students will resist it unless the rewards are tangible. We haven't transformed the rest of the educational process, so those rewards are not clear at present.

The final issue is that, beyond the introductory level, there isn't (and may never be) a lot of adaptive courseware content, because it's a niche market. There is also, at present, little support for assessments that can't be machine graded. That means that liberal studies and the humanities are not going to be well served in the near term. Therefore, this is not a complete solution.

The Ugly:

The APLU grant is only seed money. Purchasing adaptive courseware is a new financial burden that will fall either on students or universities. It's pretty clear that this is the future and, if they get it right, the potential is great. But it's also clear that we're moving towards a world where people are taught by machines, and the human factor is being pushed to the side. Are we doing this because it's a better way, or because it reduces costs? I don't know about you, but a computer telling me, "Great Job, Larry" isn't very motivating.

11.08.2015: The Coming Disruption of Academia

Academia, with its medieval-era academic robes and feudal power structures, is going to be disrupted by technology in ways that will make what has happened up to this point seem inconsequential. While academia has held out longer than some other powerful institutions, it is vulnerable to disruption for the same reasons, and recent trends in higher education have only exacerbated the situation.

Why does disruption occur? In every case, the product or service offered is similar, but the digital alternative to the traditional one has been more convenient and/or less expensive for the customer. When that happens, the transition occurs rapidly. Let's look at some examples. Newspapers used to be big businesses, and they carried great influence and power, but the thing that drove newspapers was advertising. When Craigslist provided a convenient online alternative to searching the classifieds, newspapers began to lose readership, which started a rapid downward spiral. Apple has been the cause of several industry disruptions. With iTunes, Apple gave music lovers an easy, convenient way to buy music online, and physical sales of CDs rapidly dwindled. Apple did it again with the iPhone, resulting in the collapse of the previous phone market leaders, Nokia and BlackBerry. Amazon disrupted book sales, and later the entire catalog sales industry. Netflix did it for video, and the once ubiquitous Blockbuster video chain ceased to exist within a few short years. Uber seems to be doing it for taxis, and Airbnb for accommodations. Apple has, itself, been disrupted in music by streaming services like Spotify and Pandora. There is no traditional market that is unaffected by digital transformation.

In academia, the first challenges by the forces of disruption, the online and for-profit universities, failed for several reasons. The quality of the education was not very good, and the delivery system was also pretty poor, but the cost was the same, so they produced a bad product that nobody wanted. Defaults on student loans were higher at for-profits, where the degree was of low value, and so funding organizations became more discriminating. Accreditation has also created an OPEC-like cartel but, as with OPEC, all it takes is for a major player to go its own way and the whole thing collapses. Traditional academia has some serious problems too. Problem 1: For decades, a college degree was what the high school diploma used to be: a ticket to a good job. That drove everyone, even the grossly unqualified, to go and get a college degree. Problem 2: Colleges got greedy. Willing to accept anyone who could pay, including those who took on enormous student loans for questionable degrees, colleges saturated the market with graduates and the bachelor's degree became devalued. For a while, universities solved that problem by offering Master's degrees, but now that market is flooded too. Problem 3: Tuition costs keep going up, because of state cuts to higher education, and because highly paid administrators have gone on building sprees to try to make their universities more appealing than those of the competition. Problem 4: To stem the rising costs, administrators have been gradually phasing out the well-paid, highly qualified, tenured professors as they retire, and replacing them with low-paid, less qualified instructors on annual contracts. This has had the added effect of solidifying the administrative power base, since tenured faculty were often the most resistant to the demands of administrators to lower standards and keep paying students in the pipeline regardless of their potential.

So, in summary, the modern baccalaureate degree is devalued. It is no longer a ticket to a good job, the quality of the education itself has declined, and the cost remains exorbitantly high. These problems create a situation ripe for disruption.

All that remains is for employers to realize that they cannot effectively distinguish between job applicants based on whether, or from where, they have a bachelor's degree, and to begin prioritizing other selection criteria. When demand for the bachelor's degree dries up, there will be a lot of academic real estate coming onto the market as universities collapse. Why won't universities just tighten their standards and produce higher quality graduates? Those with strong reputations will be able to take that approach, because it's not just a degree, but a Harvard or Yale or Stanford degree. But graduates with degrees from "generic university" will find that their diploma is worthless, and that they are saddled with a huge pile of student loan debt that bought them nothing but a 4-year party. The administrators running Generic U will need to rapidly change their institution's offerings, or see enrollment plummet. Unfortunately, universities are led by cautious, change averse people, and so most of those institutions will keep on doing what they've always done and they will fold before leadership has any idea what happened.

The successful disruptors will be the organizations that figure out how to do two things: 1) Rapidly assess the abilities of students (grading students the way eggs, or olive oil, or maple syrup are graded, by quality) and sell those ratings directly to employers. Curiously, the lists of desired qualities that employers provide do not generally involve any knowledge of the job itself, but rather are heavy on personality traits and attitude indicators. 2) Figure out how to imbue students with the abilities that employers are looking for and that they don't already possess (something that takes longer and costs more). This would lead to a program of study based on the missing pieces, much as the "personalized learning" programs are attempting to do. What are those qualities, and how do we teach them? Can they be taught, or can they only be nurtured in those with intrinsic ability? Can we teach enthusiasm, perseverance, integrity, dedication, communication skills, and the ability to get along with others on a team, some of whom may be annoying? Or should we stick to teaching the kind of material found in textbooks? Does your institution have the answers?

09.08.2015: 21st Century Learning

What is the future of learning? And how did learning in its present form take shape? Sugata Mitra says that the skills we teach our children are based on the Victorian era need for interchangeable human calculators. In an age before modern computing and telecommunications technology, the British Empire ran efficiently on columns of numbers transcribed into ledgers and transported around the globe by ships. It was necessary to have human calculators who had the same abilities scattered all over the world in order to transmit and receive the vital information of commerce, so spelling, writing, and mathematics were standardized. Students needed to read, write, and spell accurately, perform calculations correctly, and not demonstrate too much creativity. Schools of the Victorian era fulfilled their mission, but most schools are still doing it today, over 100 years later, when, perhaps, it no longer serves us so well.

It became possible to eliminate some of these drudgeries back in the 1970s when early technologies started invading the classroom. These technologies made it less necessary to memorize mundane things, but there was an inevitable backlash. Students were forbidden to use pocket calculators because they became less proficient at memorizing their times tables, or because they couldn't do a long division problem by hand, or when they trusted the output of the calculator even when it made no sense. Students were forbidden to use word processors and spell checkers because the quality of their cursive handwriting and their ability to spell was suffering, or because cut and paste was making term papers too easy to plagiarize. But, rather than ban the tools, perhaps we should change what we teach? I'll expand on this idea below.

Should we continue to teach those basic skills, even if only to give students an appreciation for our humble origins, much as we might derive an equation from first principles in a graduate seminar class? Are skills like cursive writing and knowing one's times tables important to the shaping of neural pathways, influencing our intuitive language and computation abilities as some studies suggest, or are they relics of a bygone age? I would argue that some basic skills are important and should still be taught, because I can do basic math in my head faster than my children can reach for their iPhones. However, in a knowledge economy, our ability to synthesize and evaluate information is much more important than it used to be, back when we could trust that most of the knowledge found in library books had been vetted by experts during the arduous publication process. Although we like to think that we're in the Information Age, there's an awful lot of easily accessible misinformation online too. Retrieving information is easy. Evaluating and synthesizing it is the bigger challenge in today's world.

Categories in the cognitive domain of Bloom's Taxonomy.

Does it still make sense to give students multiple choice tests with closed notes, closed books, and no electronic aids? Is asking students to memorize facts that can be easily retrieved of much value? Sure, students must understand the foundational materials. Even the die-hard constructivists admit that you can't construct your own learning without the basic building blocks. Otherwise it's just time wasted reinventing the wheel, and a pretty primitive wheel it will be. But shouldn't the emphasis be on the higher order thinking skills? Does it matter if a history student knows the date of an event, or is it more important that he/she understands the causes of the event? Does a math student need to know how to tediously calculate a square root by hand, when the calculator is so readily accessible? We need to redesign our curriculum to emphasize the kinds of assignments that require thinking rather than memorizing, and synthesis rather than fact gathering. Why assign students to do a biography of a lesser known president when that information is so easily looked up? Instead, why not assign the students to write about whether he was a good president and whether, based on the information he had at his disposal, or in retrospect, he made the best decisions? Students still need to learn what the guy did, but the tougher question, the one that makes you search your own values and understanding, is, "Was it the right course of action, and why?" You can watch the incurious students squirm when you ask questions like that!

Rather than ban a technology from the classroom, why not acknowledge that, in the modern workplace, it's not so important to know how to run a linear regression with only pencil and paper (perhaps useful if stranded on a desert island?), or to memorize what constitutes a healthy diet when that information can be looked up online. What's really important is knowing whether a linear regression is the right tool for the job, or being able to assess the validity of a website that claims to have all the answers about healthy eating. Creating a generation of critical thinkers who know how to use technology tools is what we're really after, even if it means allowing open book, open note, open Internet on tests. The mathematically curious ones will still want to know how a linear regression works, but the rest can skip ahead to the more stimulating problems. Except for a handful of grand masters, most of us can't beat the computer at chess, and yet we don't lose sleep over it. Let's let computers do what they're good at: brute force searches, speedy, error-free calculations, and rapid information retrieval, and let's get humans doing more of what we're best at (when properly trained): critical thinking, synthesis, pattern searching, and creativity!
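The division of labor argued for above is easy to make concrete. In the sketch below (with made-up data, using the common numpy library), the computer does all of the least-squares arithmetic in one call; the human's remaining job is the judgment call, such as checking whether a straight line is even a sensible model for the data.

```python
import numpy as np

# Hypothetical data: hours studied vs. exam score (illustrative, not real)
hours = np.array([1, 2, 3, 4, 5, 6], dtype=float)
scores = np.array([52, 60, 57, 68, 74, 79], dtype=float)

# The computer handles the tedious least-squares arithmetic...
slope, intercept = np.polyfit(hours, scores, deg=1)

# ...but a human still has to ask whether a line is the right model.
# The correlation coefficient is one quick sanity check.
r = np.corrcoef(hours, scores)[0, 1]

print(f"fit: score = {slope:.2f} * hours + {intercept:.2f}, r = {r:.2f}")
```

The point is not the three lines of code; it is that the scarce skill is deciding when a linear model is appropriate and what the fit means, not grinding through the sums by hand.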

It's ironic, then, that many instructors want to use technology tools to block student access to technology so that they can continue to deliver tests designed for a pre-technology era. It's the pedagogy and the assessments that need rethinking. Yes, it's often more work to determine what students think rather than test what they have memorized. Computers can grade multiple choice tests, but not opinion papers! We still need humans for that!

02.28.2015: Why College is Like a Gym Membership

College has a somewhat unusual business model. Because it is unusual, many students are confused by it. I have frequently heard the refrain among dissatisfied students that "The only reason I'm here is that I need this degree to get a job." Another common complaint is that "I paid a lot of money, and I am not satisfied with the grade I received in this class." As consumers, we are used to paying for things with the expectation of a full refund if they do not satisfy. This is where the analogy with a gym membership helps to make the point. Paying for an education doesn't guarantee you an education. It only provides you access to the opportunities that will help to develop you into an educated person. But, after you enter college, it's ultimately up to you whether you choose to explore those opportunities. After all, nobody can make you expand your mind. That's something you have to choose to do, and it's something you have to work at. That's why enrolling all incoming freshmen in liberal studies classes doesn't create a cohort of poets and philosophers. With the exception of the gullible people who pay for the exercise machines they see on TV with the expectation that, in one short month, they will look like the smiling, well-oiled and well-muscled supermodels who are shown using the product, most people understand that paying for a gym membership doesn't, all by itself, make you fit. They understand that some commitment is required. If you asked someone, when they were signing up for this membership, to sign a waiver saying that they understand that just paying for a membership does not guarantee that they will become fit, most people would laugh and say, "Of course. That's obvious." Why then, is it not the same for our students who pay for a college education and then feel crestfallen when, without putting much effort into it, they find that it does not meet their expectations?
Everyone should have the opportunity to go to college, but not everybody makes the most of it. College is a place ripe with opportunities, but some of them must be sought out. Some of them require effort. Those who seek the opportunities, and work for them, are generally rewarded. But, for those who do neither, sorry folks, no refund. Enjoy those expensive textbooks and your bowflex machine. If you don't make an effort, that's all you've got.

11.27.2014: Pick Any Two
In the world of technology, and the world in general, for that matter, there are situations where you will be asked to do a job fast, cheaply, and well. It turns out that, most of the time, you can't have it all. I'm not sure exactly why this is true but, trust me, it's true. You cannot change the laws of physics! I think the graphic says it all, and very succinctly. There are many people in the world, some of whom will be your bosses or your clients, who don't seem to understand this simple fact. My advice to you is to show them the graphic before you start any project and, without another word of explanation, tell them to pick two. If they can't live with that, point to the center and walk away.


06.24.2013: The Technology Adoption Curve

The concept of a technology adoption curve was first described in a study of the willingness of farmers to try new agricultural methods, but it applies quite well to technology in general. I like to try new things, but won't promote any new technology just because it's cool. It has to fill an unmet need and be easy enough to use that most people can manage it. Therefore I generally try to stay just on the right hand side of the "chasm." However, that's not always the case. Although I have followed their development with great interest, I only recently got a smartphone. With 56% of Americans now owning smartphones (I'm sure that number skews young), that places me squarely in the majority, and illustrates that while one might be an innovator with one technology, that doesn't mean one isn't more cautious in some other technical regard. That's ok. Until recently, I had no need to own a smartphone and, now that I own one, I'd still characterize "need" as a stretch ;-)

Are you a technology innovator, do you "go with the flow", or do you take a "wait and see" approach?

Back in March, I wrote about the hype cycle. Although I didn't think about it at the time, can you see how the chasm in the upper graph is connected to the trough in the hype cycle? Overlay these two graphs and the connection is clear. While a technology can show great promise and generate excitement among early enthusiasts, it may never catch on with the general public. Often this is because of some limitation in what it can do or in how easy it is to use that the enthusiasts don't mind but which the general public would not tolerate. That's the chasm. If the chasm isn't crossed, then the technology never reaches that eventual productivity plateau but instead just dies out or remains a hobby for a small group of technophiles. Linux is a great example of a promising technology that hasn't crossed the chasm. Enthusiasts are the key to the spread of new technologies however. Seth Godin makes this point well when he talks about marketing to the people who care. So let's think about where we are, because it's useful to know ourselves. If I say the word "Blackboard" or "iPad" or "clicker" or "3-D printer" or "Arduino" or "smartphone" or "Twitter" or "Facebook," where do you fall on the curve? Now think about your colleagues and where they are relative to you. Are you always in the same part of the curve or do you jump around? Have you learned something about yourself through this little exercise? If you decide not to adopt some new technology that everyone is talking about, does that make you a laggard? Not necessarily. It could be that the technology in question is heavily hyped right now but will not last. There is an implied slur in calling people "laggard" that I don't like. Not every new technology is a good thing, nor will it last. If it's not, or if it doesn't--something we'll only know in retrospect--then you were right not to jump on the bandwagon. Nobody talks much about Second Life anymore and, if you missed it, you didn't miss much. 
If you never bought a Palm or Windows Mobile PDA, good for you! You saved a bunch of money on a near worthless device. And what if you're an innovator? Do you stick with a technology once everyone is using it? Or does that take all the fun out of it? If you were on Facebook when nobody had heard of it, are you still there today? There's a saying that "good pioneers make bad settlers." Pioneers don't like crowds, and they are always moving to the new frontier. I'm not one, but I appreciate them. They work hard and explore a lot of places that don't lead anywhere useful. But when they make a real discovery, the rest of us get to enjoy it without all the effort ;-)

The "hype cycle." Overlay this graph on the tech adoption curve above.

06.17.2013: Time to change my password? Oh, just shoot me now!
The dreaded password change notice.

Ah, the lowly password. A simple tool from a bygone computer era. But if you've gotten a message like this one lately, I think you will agree with me that passwords are no longer either simple to manage or effective at keeping us secure. Passwords are much like our congested freeways; they don't work very well anymore but the entire infrastructure is built around them so, while they drive us crazy, we have no alternative. The first problem is multiple services. With a steadily growing number of web-based services, each with its own password expiration cycle and username and password creation rules, it has become increasingly difficult to remember all of your passwords and which username and password go with which service. Thankfully, many services let you use your e-mail address as your username, and most offer a "Forgot your password?" link. It's also helpful that my workplace, at least, has a "single sign-on" for all services. The downside of that, though, is that if my work password is compromised, the hacker can change everything from the grades in the class I'm teaching to the beneficiaries of my life insurance plan and the bank routing number of my paycheck's direct deposit! The second problem is multiple devices. The password wallet or virtual keychain was a good solution for managing the multiple usernames and passwords saved on your computer. Just remember one username and password and the tool does the rest. But once you have multiple devices, you're out of luck. So if I have 10 services that require passwords (not an outrageous number when you consider multiple e-mail accounts, IM and video conferencing services, a photo sharing service, eBay, PayPal, social media services, online banking, web hosting service, cloud storage service, etc.) and 5 devices (again, not all that unreasonable when you consider personal computer, work computer, tablet, laptop, smartphone, etc.), then that's 50 passwords to change on a regular basis.
If you have multiple OSes (MacOS and Windows, for example) installed on your device, then that device counts as two. But we're still not done. The third problem is multiple applications on each device. Even if we just consider my single work password, there are many applications on each device that need to use it, and need to be updated when it changes. For example, there's my e-mail program, my VPN connection, my IM program, my web browser, my web page editor, my FTP application, and the fact that I often run more than one of these programs. For example, I use three web browsers commonly, and two e-mail applications, three IM programs, etc. So, to sum up, if I use 10 services on 5 devices, each which connects to these services from 10 applications, that's, very conservatively speaking, 500 places where I need to change passwords on a regular basis! Not every app stores a password for every service, but you get the idea. Here's an oversimplified map of my online world. I bet yours looks similar. Start drawing lines from service through device to application, and you'll see how messy password management can get!
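The multiplication above is simple enough to sketch in a few lines of Python. The counts (10 services, 5 devices, 10 applications per device) are the illustrative figures from the text, not survey data, and the worst case assumes every app on every device caches its own copy of every credential.

```python
# Illustrative counts from the text, not survey data.
services = 10          # e-mail, banking, cloud storage, PayPal, ...
devices = 5            # work computer, home computer, laptop, tablet, phone
apps_per_device = 10   # browsers, mail clients, IM programs, FTP, VPN, ...

# One password per service, repeated on every device:
passwords_to_change = services * devices
print(passwords_to_change)   # 50

# Worst case, each app on each device caches its own copy:
places_to_update = services * devices * apps_per_device
print(places_to_update)      # 500
```

The arithmetic is trivial; the point is that the product grows multiplicatively, so adding one more device or service inflates the maintenance burden across the whole grid.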

Password management is no longer simple, nor does it keep us very secure.

True, I may have it a bit worse than most, but I can tell you that this is out of control and that I'm better at managing this chaos than most people. It is no wonder, then, that many of us use the same not-very-strong-password, or a minor variant of it, on multiple services, and that we engage in other unsafe behaviors such as writing passwords down on a sticky note attached to our monitor, or incrementing our old password with the next number in line when it's time to change it. With the proliferation of cloud-based accounts and services, it's only a matter of time before one of them is breached. It seems that not a month goes by without some service provider announcing that its user base has been compromised. It is no wonder, then, that when one of our accounts gets hacked, sometimes through no fault of our own, it doesn't take long for a hacker to gain access to our other accounts, often by using our email system to reset our passwords in other systems. The fourth problem is that all of the flaws above lead services to use extreme countermeasures such as forcing you to create a password so strong that you have no hope of remembering it, or locking your account after too many failed access attempts, or making you prove that you're human with one of those "captcha" tools. Often it's not a hacker or a bot but just me, trying to remember which username and which password go with this account, hoping I guess right before I get locked out. Sometimes it's a device with an old password trying to update itself automatically that locks me out. Sometimes I fail to correctly answer my own challenge questions because the system is too picky about the answer. For the "street I lived on when I was in second grade," did I spell out "Street" or did I abbreviate with "St." (with or without the ".") or did I leave out "Street" altogether? I don't know what the solution is. 
Maybe a cloud based keychain that generates ridiculously strong passwords you never have to remember, or that uses some difficult to impersonate biometric like fingerprint or retina scan? All I know is that I'm in desperate need of something to fix this mess, and that a lot of other people are stuck in the same sinking boat. I will lose hours of productivity dealing with this upcoming password change, and it will be days or weeks before most of the apps on most of the devices I use regularly are updated. There has got to be a better way!
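Generating a ridiculously strong password is actually the easy half of that wish; Python's standard secrets module can do it in a few lines, as in the sketch below. The length and character set here are arbitrary choices for illustration, not a recommendation, and the hard half, syncing the result across all your devices, is exactly what the keychain services have to solve.

```python
import secrets
import string

def strong_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Each call produces a different, unmemorizable password.
pw = strong_password()
print(pw)
```

A password like this is only practical if some tool remembers it for you, which is the whole argument for a keychain that follows you from device to device.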

Update: I have heard very good things about 1Password, a web-based password keychain, and since I have started using Apple's iCloud Keychain, which syncs across all my devices, a strong password is supplied for each web account, and yet I don't have to remember it. So far, 6 months in, it's working great!

03.27.2013: Technology is the answer. What was the question?
The Hype Cycle: Map your favorite educational technology.

How many times have you heard that some emerging technology is going to solve all of education's woes? In my experience, a technical innovation may allow the job to be done faster, cheaper, or better than before, but rarely, if ever, all three. If you're lucky, you get to pick two! If you're thinking about implementing some new technology that everyone is talking about, it's important to step back and consider its position on the "hype cycle" graph. Google Glass, for example, is just past the trigger point, and visibility is still increasing. MOOCs are at the peak of inflated expectations right now. But does anyone remember Second Life? Once heralded as "the next big thing," it has slid into the trough of disillusionment. Take Second Life out of your résumé, people. It's not doing you any favors. Speech recognition, long ridiculed, is finally climbing out of the trough and up the slope towards a more realistic "plateau of productivity." While still not practical for most uses, it fills a niche for users with repetitive stress injuries that make using the mouse and keyboard painful. Used as intended, with a realistic appreciation for what it can and can't do, technology can be highly effective. But misapplied, technology can make a real mess of things. As the old saying goes, "To err is human. To really screw up, you need a computer." One of the debates that rages in my office relates to what to teach people about a new technology. We want them to get excited about new technologies, as we are, and to be adventurous in their teaching. Often, however, people with inflated expectations come to us only wanting to know how some new technology will make their job easier, and they get frustrated when we ask them why they want to use it (what problem are they trying to solve?) or try to explain that there are limitations. They don't want to hear that it won't re-energize their lectures or that it might require just as much effort as what they are doing now.
Let's look at a few examples of useful technologies misapplied, and you'll see what I mean.

Technology: Video
Misuse: Instructor shows a full-length movie to class in order to take a day off from lecture, catch up on grading, etc.
Proper use: Instructor shows a series of relevant video clips, each followed up with insightful questions and guided discussion to engage the class in critical thinking.

Technology: PowerPoint (two ways to wreck a presentation)
Misuse 1: The presentation is viewed in the absence of the presenter, but the bullet points are vague or meaningless without the emphasis and interpretation of the speaker. (Did they think the presenter had nothing of value to add?)
Misuse 2: The speaker, facing away from the audience, reads paragraphs of text from each projected slide, adding nothing of relevance. (Did they think the audience can't read?)
Proper use: The presenter uses prompts on the slides to make key points, jog the memory, and engage the audience in a lively and only loosely scripted discussion.

Technology: SafeAssign or TurnItIn
Misuse: Instructor uses the tool to fail students for unintentional plagiarism.
Proper use: Instructor uses the tool to show students how to properly reference the source materials they cite.

Technology: Clickers
Misuse: Rather than make the teaching more engaging, instructor uses clickers to enforce a mandatory attendance policy.
Proper use: Instructor uses the tool to assess comprehension, engage students, and deepen their understanding with challenging questions and analysis of why they think what they do.

Your assignment: Expand my table with more examples. Begin with the LMS, Facebook, eBooks, MOOCs, and iPads. All great tools. But are they being used as they should?

03.06.2013: After bad stuff happens :-(
02.28.2013: Latest Reports from the LMS Battleground
10.10.2012 The wave of change that's about to hit higher education
Higher Ed Goes Digital

Big changes are coming to the hallowed halls of higher education. The cost of a four-year degree continues to rise for reasons that might surprise you and, because state funding for education continues to decline, the consumer is left paying an increasing share of the bill. Administrators, who feel pinched to keep doing more with less and to keep a lid on costs, are pushing for increased class sizes, for more classes taught by part-time instructors, for more online classes, and for the adoption of technologies that automate instruction or reduce the teaching effort per instructor, allowing each one to do more. If we step back and look at the big picture, where is all this headed? As a result of these coming changes, the tenure track faculty member who teaches for a living is, by my reading of the situation, an endangered species, and the state funded primarily undergraduate university isn't much better off. Don't believe me? Ask any department chair at any public undergraduate institution what's happening when a tenured professor (one whose primary responsibility is teaching, not research) retires. While enrollment is growing like crazy (because "college is for everyone"), experienced full-time faculty are being replaced, if at all, by much cheaper and often less qualified part-time instructors. It's happening because technology has been identified as a method for regularizing and further automating undergraduate instruction. Undergraduate university teaching is the delivery of specialized, but fairly standard, information to a large market of adults, for a high price. (K-12 is safe for the moment because teachers not only impart knowledge but also serve as workday babysitters for their young charges.) Sure, experts are still necessary to develop the standardized lessons and content for higher education but, once that's done, it can all be deployed on a massive scale and managed by less qualified people.
(Well, that's the argument I hear from upper administration anyway. Whether a less qualified instructor can as effectively grasp and deliver that content is another question, but it's a tradeoff administrators seem able to live with.)

Since the market is large and the price is high, there will be lots of competition for students. With instruction going online, students will no longer be placebound, and course capacities will no longer be dictated by the size of the classroom. In the very near future, students will be able to get an online degree in most subjects from anywhere they choose. Some universities are even racing to grant degrees in personalized learning programs where students can shorten their course of study by "testing out" of classes in which they have "life experience!" (I hope the testing is rigorous and occurs in a proctored environment with ID checks!) When future students are choosing where to go for their online degree, why would they choose your institution? If you don't have a good answer, you'll be in trouble. This change will be highly disruptive. Ask yourself this. What happened to the local video rental stores like Blockbuster when Netflix came along? What happened to the local music shops after iTunes? What happened to the local newspapers after Craigslist became the place for classified ads? What happened to all the independent used bookstores and even the big chain bookstores like Barnes and Noble now that Amazon sells more digital books than paper ones? All of these digital information delivery services replaced their analog counterparts in a very short period of time. With high quality content and lessons coming from the big publishers, written by pedagogical and subject area experts and tailored for the web by skilled graphic designers, the courses developed independently by most professors don't compare favorably. Brick and mortar universities teaching traditionally will be like the small quirky independent bookshops competing against Amazon's vastly greater selection of cheaper content. Most of them will fold. What will happen to all those beautiful campuses and the college towns that depended on them? 
When the undergrad degree goes digital, there will be only a few winners and they will win big. There will also be many losers, as venerable local institutions see in-person enrollment decline and poorly implemented online programs fail to attract and/or retain students. Universities that conduct research and have graduate programs will be less affected, and the private ivy league institutions will continue to do fine by offering an expensive top-notch traditional education to a niche market, but the community colleges and primarily undergraduate institutions competing on price and who can't differentiate themselves will mostly go the way of the Blockbuster Video stores.

Which organization that you haven't heard of yet will be the Amazon or the iTunes of higher education? Will it be a big publisher like Pearson, or a for-profit online institution like University of Phoenix or Capella? Will it be a currently free option like Coursera, EdX, or Udacity or the Khan Academy? Will it be a highly regarded traditional institution like Stanford or MIT? Or will it be a small regional university like NAU, already accredited and experienced in online delivery to its rural population, that gets it right? It's too early to tell. But there are ways to prosper in this new era. Courses from the for-profits are still generally pretty bad, and the selection from the free services is limited, so there's a window of opportunity for some new leaders to emerge. And while Massive Open Online Courses (MOOCs) are currently getting a lot of attention, they require a level of self-motivation and organization rarely found in our undergraduates. If we build better service into our online programs, with better instructors, more courses of study, content that improves on the standard "canned" offerings, and a more personal touch, we can beat the competition, create more value for the dollar, grow enrollment, and enhance our reputation as a quality online degree-granting institution. That will take time and hard work, and it will take a new kind of instructor who knows technology and pedagogy as well as the subject area. And it won't be any cheaper, to the chagrin of those who think that waving some technology pixie dust over the problem will make it all better. But change is coming and academia, steeped in tradition and rife with bureaucracy, is not very good at change, so it's going to be a shock. Are you preparing for the giant wave of change that's about to crash on traditional higher education? Because you can just sit there and get crushed by it, or you can start paddling for your life and ride it into the future!

04.03.2012 Blackboard Embraces Open Source like a Boa Constrictor
04.01.2012 What Google and Facebook have in common
03.25.2012 Message to the eContent providers
03.20.2012 Textbooks of the Near Future.
01.06.2012 Are we putting the technology cart before the instructional horse?
01.03.2012 Unintended Consequences.
10.17.2011 Quality Matters?

Recently NAU was approached by an organization called "Quality Matters" and invited to become a member. While they are a non-profit, that does not mean they are free. Annual membership dues are required, and the implication is pretty clear. If you say you're not interested, you must not care about quality, right? People pay to be trained as reviewers. People also pay to have their courses reviewed, and they pay to receive the QM seal of approval. Based on the success of this operation, QM could easily spin off some other ventures such as, "Motherhood and Apple Pie Matter," or "Patriotism Matters." Their heart is, to be fair, in the right place. The purpose of this organization is to identify things that make for a quality online course, and use a faculty peer review process to evaluate and certify these courses. This movement wouldn't even exist if there weren't some valid questions about the quality of online courses nationally, and if schools weren't feeling a little defensive about their online programs. I do, however, have some issues with their approach. My first issue is that the focus is on courses delivered online. Their scope does not include courses taught in a traditional manner, and I think we can all agree that some of those must be equally bad or worse! While I'd like to level the playing field and look at all courses, it's maybe a bit unfair to criticize QM for what they don't review. So let's look at what they do review. We will leave aside for now whether NAU should cede its authority over the evaluation of course quality to a body outside the university, and over which we have no control, because the question of who's watching the watchers could be the subject of an entirely different discussion. My biggest remaining issue with the "QM Program" is that online courses can be, arguably, broken down into three major components, and QM deals with only one. 
A better name for Quality Matters might be "Let's Focus on One of Three Things that Matters!" In case you're inclined to disagree with me, here are my three components of quality in an online course: 1) Course Design: this is the way the course is structured, how it displays to the user in the online environment, and the instructional methods used, including the identification and measurement of learning outcomes. 2) Course Content: this includes the selection of appropriate materials and the accuracy and depth of those materials. 3) Course Delivery: this includes all of the interactions between instructor and student, and among students. The QM program deals only with Course Design. I'm not saying that design doesn't matter. I'm pretty convinced that it does. Without good design, it's going to be difficult to get out of the starting blocks. But I think I'd like more than one of the three reviewers of my online course to be a "subject matter expert," and I don't think it makes much sense to slap a seal of approval on a course unless the content and delivery have also been reviewed thoroughly. I have seen the disastrous results that occur when you give great materials to a poor instructor. I have also seen the tragic consequences when you combine a dynamic and motivating instructor with materials that are inappropriate for the students, either because the materials are not challenging enough, are out of date or otherwise inaccurate, or are too challenging because the students do not have the necessary background preparation. What I'd really like to see is a peer review program that looks at all of the aspects of course quality described above, and is owned by our own faculty rather than an outside organization. But I think I see the writing on the wall. If we don't start policing ourselves, it may not be too long before someone else is doing it for us.

09.29.2011 The "do-over" mentality in undergraduate education

True story. I have a faculty colleague who had a formal complaint filed against him by one of his students for "discriminating against me on the basis of my intelligence." (The "discrimination" was giving the student a lower grade than some of his classmates, based on the student's relatively poor performance on various assessments.) When the professor agreed that this was true, the student became even more convinced that he had a case! I think this raises an interesting point because the professor in question was using an "old" way of thinking, while the student was using a more modern construction.

When I was in college back in the '80s, I'm not sure there was such a thing as dropping a class. At least, if there was, I never did, and I never knew anyone who did, so it was neither common practice nor a well-advertised option. It just never occurred to me that one could do that. The concept of re-taking a class a second or third time to replace the original bad grade was also completely foreign. When I got the occasional grade that I was unhappy with, I owned it, and there was nothing I could do about it. It was there on my transcript for all to see, like a tenacious piece of gum on the bottom of my shoe. Today, most students would just throw away the shoes and buy a new pair. In my job at the university, we care about student success and we want everyone to get a good grade. We go to greater lengths every year to accomplish this goal, giving students more choice and more flexibility, and we intervene more than ever before to work with students who are struggling. All of this is good, I think. But we rarely think about why this is the goal. Not trying to be cynical here, but let's just step back for a minute and ask ourselves: "Isn't the point of grading students, in large part, to identify (optimistically) which ones have learned or, (pragmatically) which ones have successfully completed the assignments, or (cynically) which ones have successfully jumped through the hoops?"

Question: Is our goal to get everyone over the bar, no matter what it takes, or just to provide everyone an equal opportunity to get over the bar and then report the results? The bar I refer to here, of course, is "learning," even if measuring that intangible substance requires cruder instruments like tests and other assessments. If everyone gets unlimited chances to get an A (assuming here that letter grade correlates with learning achieved, so you can substitute A with "learned a lot" and F with "didn't learn a thing") by the process of do-overs, remedial work, tutoring sessions, interventions, etc., then aren't we artificially leveling the playing field? Aren't we devaluing the A earned with hard work and without extra credit? Would you rather be seen by the doctor who got an A in Biology the first time through without any outside help, or the one who was failing the course and dropped it, took it again and got a D, found an easier instructor and took the course a third time, got a B- and, with a bunch of intervention, tutoring, and extra credit, got the B- rounded up to an A, which replaced the D on the transcript? I suppose that student has perseverance at least! Of course, there's an old joke: What do you call the medical student who graduated at the absolute bottom of his class? Doctor! Hah :-)

Why has it come to this, and how has it come to this, and is this where we want to be, and, if not, how do we get someplace else? I think part of the reason we have arrived at this point is that so many more kids are going to college. College really is the new high school. Michael Wesch, whom I admire and mostly agree with, says "College is for everyone." True. Certainly part of the problem, though, is that if everyone is being admitted, more students are arriving unprepared. More students are here not because they want to be, but because they feel compelled to be so that they will be competitive for a job at the other end. This also explains the impatience of many of our students, who don't really love to learn or want to broaden their minds. They "just want a job, ok, and could you please show me the fastest way out of here?" I'm sympathetic. Who wants to spend $40,000 (minimum) for a bachelor's degree that still can't guarantee them a job? And certainly part of the problem is that universities love all the extra money that's coming in, but feel a twinge of guilt when those students who aren't prepared don't succeed. Legislators and administrators, who hear from the howling parents who pay the bills of these mediocre students, put pressure on faculty to do better. By "better," they mean graduate more students faster with better grades and with less funding. If we rule out the easy way (just lowering standards), and take the challenge to "do better" seriously, what's left?

Solutions: 1) Placement. Students should not be admitted to the university if they are not capable of succeeding, and students should not be allowed into courses for which they have a high probability of failure. We can pretty accurately predict success with placement tests and we need to do this more. 2) Remediation. If students arrive without the skills but it is possible to teach them those skills, they need bridging courses to get them there. 3) Academic probation and dismissal. Students who are not succeeding, and who are not likely to turn it around, should not be strung along. 4) Monitoring. Technology can be used to monitor student progress so that intervention occurs quickly before students spiral downward. We do all of these things now. We just need to do them more, and better. But the following are not generally addressed at all. 5) Instruction. Most faculty arrive with good content area knowledge but limited teaching experience or know-how. This can be addressed, but it would take a mind shift for the university to accept that this is a problem. 6) Compensation. Little attention is paid to the quality of instruction. Typically, only instructors with high D/F/W (drop, fail, withdraw) rates get any attention from administration, and this negative attention can easily be avoided by lowering standards and giving lots of As. But standardized tests, with all their flaws, can measure incoming and outgoing students and be used to reward instructors who show the gains. Will this lead to "teaching to the test?" Possibly. But if the test is good, that's not the worst problem to have. 7) Peer review. Research faculty know all about peer review. It's how they get articles published in good journals. But in the classroom, instruction is siloed. Nobody watches anyone else teach or gives them any tips on how to do it better. Sure, there's muttering in the hallways about which instructors are too easy, or just plain bad, but nothing gets done about it. 
This could be fixed if there were the will to do it, but again, it would require a major shift in faculty culture. 8) Reporting. Something I've never heard mentioned anywhere is that universities really ought to report not just the grade a student receives, but how long it took the student to get there, and by what path. We have this data. We could put some of the rigor back into transcripts that are packed with As by reporting the information employers want to know: How much external time and effort was expended to get this student over the bar? 9) Tracks. I know it's sacrilege but, while college is for everyone, the liberal studies degree is not. Universities need to rethink degree granting with an eye towards certificates and diplomas that lead directly to a career path. Want to be a salesman, a dental hygienist or an X-ray technician, or a database programmer, a forest ranger or a cop? Sure, a bachelor's would be helpful, but it's probably not something you "need." Want to be an astrophysicist, a historian, or a philosopher? Ok, get the bachelor's. But here's something else we should tell incoming freshmen and rarely do. If you get the bachelor's, you probably don't need to come back to school when you change careers, as most of us do these days. With the certificates, you probably do.
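The reporting idea (point 8) is really just a small aggregation over records the registrar already keeps. As a rough sketch, with an invented record format (not any real student-information-system schema), an "effort-aware" transcript entry might be computed like this:

```python
# Hypothetical enrollment records: (course, term, grade), in chronological order.
# The point-8 idea: report not just the final grade, but the path taken to it.

from collections import defaultdict

def effort_aware_transcript(records):
    """Collapse repeated attempts into final grade, attempt count, withdrawals."""
    history = defaultdict(list)
    for course, term, grade in records:
        history[course].append(grade)
    transcript = {}
    for course, grades in history.items():
        completed = [g for g in grades if g != "W"]  # W = withdrew/dropped
        final = completed[-1] if completed else "W"  # last grade replaces earlier ones
        transcript[course] = {
            "final_grade": final,
            "attempts": len(grades),
            "withdrawals": grades.count("W"),
        }
    return transcript

records = [
    ("BIO 181", "F09", "W"),   # dropped
    ("BIO 181", "S10", "D"),   # retook, got a D
    ("BIO 181", "F10", "A"),   # third attempt; grade replacement
    ("MAT 137", "F09", "A"),   # first try
]
t = effort_aware_transcript(records)
print(t["BIO 181"])  # {'final_grade': 'A', 'attempts': 3, 'withdrawals': 1}
print(t["MAT 137"])  # {'final_grade': 'A', 'attempts': 1, 'withdrawals': 0}
```

Both students end up with an A in BIO 181 on a conventional transcript; the extra two fields are what would let an employer tell the one-shot A from the third-attempt A.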

12.12.2010 Why going "TSA" on web classes just won't work
06.14.2010 What Google should do next
06.11.2010 Why NAU's Mobile Computing Policy needs rethinking
Wireless icon

NAU has upgraded its Wi-Fi system. The new one works very much like the ones you've encountered in airports and hotels. That's the first problem. It's a university, not a hotel. Regular users of the wireless have to agree to terms EVERY TIME they connect. If your smartphone goes to sleep to conserve power or you close your laptop to move from one location to another, when you wake the device up you need to reconnect and agree all over again. That's just silly for a system designed primarily for regular users (not one-time guests). While this is annoying for laptop users, it's a downright nuisance for people with Wi-Fi-capable smartphones and tablets. But there is a BETTER WAY. The MAC (media access control) address of every wired computer on campus is registered. If regular users of the wireless could register their devices too, then the agreements could be logged once and filed away. Sure, the agree screen should pop up for unregistered guests (parents, vendors, etc.) or when the policy changes. But for the regular users, most of the time, this shouldn't be necessary.
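The registration scheme proposed above amounts to one lookup at connect time. Here's a minimal sketch, with invented names and addresses (not NAU's actual portal logic), of how a captive portal could decide whether to show the agreement screen:

```python
# Hypothetical captive-portal check: prompt only unregistered MAC addresses.
# The registry would be populated once per device, as is already done for
# wired machines on campus.

REGISTERED_MACS = {
    "3c:07:54:12:ab:cd",  # an example registered laptop
    "a4:5e:60:99:ef:01",  # an example registered phone
}

def normalize(mac: str) -> str:
    """Canonicalize a MAC address: lowercase, colon-separated."""
    return mac.strip().lower().replace("-", ":")

def needs_agreement(mac: str, policy_changed: bool = False) -> bool:
    """Show the terms-of-use screen only for guests or after a policy update."""
    if policy_changed:
        return True  # everyone re-agrees when the terms change
    return normalize(mac) not in REGISTERED_MACS

# A registered user waking a sleeping laptop: no prompt.
print(needs_agreement("3C-07-54-12-AB-CD"))   # False
# An unregistered guest device: prompt as usual.
print(needs_agreement("de:ad:be:ef:00:01"))   # True
```

The point of the sketch is how little state is needed: one agreement record per registered device, refreshed only when the policy text changes.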

The second problem is security. If you try to access a web service other than a browser, you never see the agree screen so you can't connect. Even in a browser, the agreement screen doesn't always appear, and that has negative consequences downstream. If your home page is set to an NAU website, you won't be prompted to agree because the NAU domain is a "trusted site." But unless you agree, you can't join the VPN (and you're not told why; it just fails to connect) so your session is insecure and you are transmitting passwords and credit card numbers unencrypted. And even if you do agree, many people don't take that final step and join the VPN because they don't have to; the wireless works even if you don't join the VPN! Sure, there's a warning on the agreement screen, but it's buried in a page of legalese and who reads that stuff anyway? So, aside from a handful of tech people who know better, most of our wireless clients are surfing the web without encryption. Don't believe me? Ask your colleagues if they connect to the VPN while using the wireless, or if they even know what the VPN is! This makes the majority of our clients easy pickin's for any geek with a packet sniffing program and a few idle minutes in a public space! Is this bad? Think of it this way. It's the digital equivalent of walking down the street naked in the middle of winter. Normally the security-centric IT folks would be all over an issue like this, but they're not. They know about this problem, but they don't choose to fix it. Is it because there is a way to be secure and it's buyer beware? Or because they have a reactive (see problem 3 below) strategy? I don't know. But there are two proactive ways to fix this problem: technical (don't allow insecure connections) or educational (teach people about how and why to encrypt their wireless sessions).

On to the third problem. People are increasingly showing up on campus with tablets and smartphones. These devices are almost always the property of the user, not the university. But the university insists that, for the privilege of checking my work email on my personal device, a password lock with a 15 minute timeout must be installed, and that the password must be strong (hard to remember) and non-repeating. Worst of all, the university wants to be held harmless for remotely wiping my entire device without my express permission if my password is entered incorrectly too many times. This is security overreach at its worst. Restrictive policies such as these are typically written by big corporations with trade secrets to protect and who provide company-owned mobile devices to their employees for work purposes. That situation doesn't apply here. The university ought to be thrilled that faculty and staff would want to check their work e-mail on a device that cost the university nothing, and which is carried around with them during every waking moment. There is a BETTER WAY. Our policy should not discourage the use of personal mobile devices by faculty, staff and students. We need a much more flexible and less restrictive policy for personally owned devices which contain mostly non-NAU data, or users will not connect them to the services we want them to use.


Workarounds: Rather than configure IRIS in my e-mail client, I use Outlook Web Access through my mobile device's browser which, on the downside, requires a login every time but at least doesn't require me to agree to a remote wipe of my personal device. Students use Google's GMail system, which doesn't require the remote wipe. And everyone should remember to use the VPN. As for the frequent Agree prompts...don't we all agree this is just a silly waste of time?

Update: 07/11/2013 The system has changed again. The newest wireless has a more secure "NAU" network which requires a one time login with your NAU username and password, and a less secure "Public" one for guests which works much as described above. This is a big improvement. The NAU network still doesn't require you to use a VPN for greater security, but at least it only bugs you once for a login. And the public network works as it should, prompting guests to agree to terms each time they connect. We're slowly getting there!

05.05.2010 College is for Everyone, so Attendance is Mandatory!
04.20.2010 LMS Decisions
04.12.2010 The Hacker-Hipster Manifesto
02.19.2010 What is up with Google lately?
01.22.2010 Working and Learning through Snow Days, Swine Flu and Other Disasters
01.04.2010 Clickers: Treating the symptoms or the disease?
12.20.2009 Spreading the FUD
10.14.2009 NAU adopts MS Exchange; increase in productivity negligible
10.02.2009 How to get attention in Academia
10.01.2009 Universal Design
09.30.2009 Should NAU site license the MacOS as well as Windows?
09.01.2009 Marketshare change among LMSes over time
05.26.2009 Mac Growth in Higher Ed
05.21.2009 Microsoft on the move?
04.15.2009 Free and Open Source Software in the Enterprise
Why Computing Monopolies are Bad
How fast is your network connection?
Data Visualization
Mossberg puts his finger on it, and his foot in it.
Why can't Microsoft get it right?
The truth about telecommuting

Recently our university has been encouraging employees to turn off lights, turn down thermostats, and attend conferences virtually instead of in person, ostensibly to reduce our global carbon footprint and be more green. It's hard to tell whether that appeal is of pure intent though, because it both helps the environment and saves the university money, while placing most of the hardships on the employees. It reminds me of the hotels that plead with you to save our planet by not having your sheets changed and your towels washed each day, but then feed you breakfast on disposable plastic plates, cups and utensils. Clearly, the green they are most interested in saving is their money. If the appeal were that they could lower your bill if you were less wasteful, that would be a far more sincere and convincing argument.

Save the green!

What kind of green are we being asked to save?

Are we as quick to do things that are good for the planet if they don't save us money? We have an award-winning LEED certified (platinum level) green building on campus, which is great, but why aren't all our new buildings green? At least part of the reason is that they are more expensive to build. Having one makes a statement. Having many is a bigger commitment. What about things that could help the environment, but would force us to change our time-honored management practices, like telecommuting? Telecommuting takes cars off the road, reduces parking demand on campus, saves the workplace electricity, and potentially lengthens the workday by eliminating commute time. It also increases efficiency by eliminating many of the distractions of the office. When I telecommute, I can be incredibly productive in my quiet home office. Unfortunately, management can be suspicious of telecommuting because of the assumption that an employee in his/her seat at work can't goof off, and because an employee working from home can't be as closely monitored. Here's the truth about telecommuting. If managers set clear deliverables and check progress regularly, then it's the quality and quantity of work, and not the number of hours spent warming a seat, that gets measured. Bottom line? If you have responsible employees and effective managers, telecommuting works great. But it's kind of lonely. And if you have flaky employees, there's no guarantee they're being productive even if they warm their seats at work for eight hours a day.

Blackboard's Scholar
Learning Spaces
Podcasting with iTunesU
Gaming on Campus
©2007-2016 Larry MacPhee | IM: | Skype: larryrmacphee | google: larry.macphee | 928-523-9406