8.23.2024 AI for dummies
I just heard an ad for C3.ai, one of the many generative AI companies that have sprung up in recent years. Ads like these toss out a bunch of terminology that may sound impressive, but most people have no idea what any of it means. So here's a very basic primer on AI jargon.

LLM Agnostic: LLM stands for Large Language Model. For AI to sound human, it has to use language the way people do. By feeding enormous amounts of sample text into a system for the software to digest, the developers teach it to simulate the way people write and talk. The "agnostic" part means that the AI in question can work with more than one LLM so that, for example, it can generate convincing jargon for the biomedical field, or the legal field, or whatever specialized domain is needed.

IP Free: IP refers to Intellectual Property, specifically the copyrighted original work belonging to other people. What the AI companies are saying is that the content the AI generates is so churned up from various sources that it doesn't appear to violate any one individual's work. I say "appear" because these systems don't actually think or generate novel content; they just rehash the available source material. Catching them at it, however, is nearly impossible.

Hallucination Free: this one is kind of funny. You may have heard that while AI can often do a pretty good job, it occasionally gets things very wrong. A great example is the fingers problem: when asked to draw people, image generators often get the number of fingers wrong. Sometimes background faces in a group photo look like something out of Edvard Munch's "The Scream." That's the hallucination. In other cases, faces may look too perfect, with no blemishes, freckles, wrinkles, or features out of proportion; these are amalgams of many faces, averaging out the minor individual deviations from the norm. In any of these cases, there's a good chance the image is AI generated. That said, it is getting harder to tell AI from real. Take this quiz to see for yourself.
Here are a few great examples where the AI got it wrong:
This group photo of female F-35 pilots looks good at first, but zoom in on the ladies in the back row on the right. It gets worse...
If you're familiar with the Bible story of Jesus flipping over the sellers' tables in the temple, I don't think this is what was intended. One more, just for fun...
Thus the majestic salmon complete their long journey, swimming back to their natal stream to spawn.
10.24.2023 The COVID backlash - out with the baby and the bathwater!
When COVID hit higher education in early 2020, upper-level administrators, many of whom didn't have much experience with fully-online education, had to decide quickly how to handle the challenge. Unfortunately, the decision makers didn't take the advice of the people who were experienced in online education. This is not a criticism of just my own institution, but rather a broad observation about academia's response. I think it's worth exploring why they didn't take the advice of the experts, and looking at some of the major downstream consequences that resulted.
Our advice was to move most courses from in-person to online asynchronous, relying primarily on the Learning Management System (LMS). One reason was concern about inadequate bandwidth for some of our students, many of whom live in rural Arizona and on the Navajo and Hopi reservations, where Internet connectivity is always slow and often spotty. Moving to asynchronous assignments would also have reduced the time pressure on students dealing with non-academic challenges such as caring for sick relatives or recovering from illness themselves. The LMS functions well at lower bandwidth, and when students are not all hitting the system at the same moment. We recommended tools like Zoom and Collaborate only for lower-stakes activities such as office hours, and promoted our streaming media system, Kaltura, for recording short video lectures that could be buffered and viewed at lower bandwidth.
The second reason is that remote synchronous instruction (mediated by tools like Zoom and Collaborate) is a high-risk strategy: if the technology fails, there's no safety net. So-called "Zoom school" is nevertheless what many institutions settled on, and in the early days of COVID, those failures happened a lot. Zoom bombers disrupted classes, students dropped off calls frequently, and mics were muted while people were trying to talk, or unmuted when they were supposed to be quiet, resulting in a lot of awkward and embarrassing background noise. For those who didn't use a virtual background, the goings-on behind the camera were also sometimes very distracting. Some of these problems can be chalked up to the inexperience of instructors and support staff with this new mode of instruction. Eventually, better security measures kept uninvited guests from disrupting classes, and instructors got better at muting and un-muting participants at the right moment, and at monitoring the chat while lecturing. We also started recording lectures for those who dropped off, but the recordings were seldom viewed, and we have the analytics to prove it. Without a doubt, engagement suffered. Many students turned off their cameras, and more just stopped attending. Why? Again, the lack of preparation for this style of instruction was a factor. In a lecture hall, the captive audience has little choice but to sit and listen. Breaking content into small chunks and embedding regular engagement activities is good practice for both in-person and Zoom instruction, but on Zoom it's much easier to tune out without getting called on it.
Some institutions, ours included, went even further and adopted a variant of HyFlex instruction. Few schools invested in the staff and technical resources to support it properly, so in practice this meant simultaneously teaching an in-person and an online audience without much help. For most of our instructors, who had no experience teaching remotely, this was the most difficult assignment imaginable, and the quality of teaching slipped dramatically. It was especially painful for the remote students, who were often forgotten. It's easy to accuse the students and say they should have stuck with it and been more participatory, but the reality is that they were also unprepared for this rapid adjustment.
So why didn't administrators at more schools shift from live, in-person instruction to online asynchronous, as the experts in online teaching and learning recommended? I think one answer is that these administrators falsely believed that asynchronous instruction was "not as good" as synchronous, and feared student attrition. Others, who did have confidence in asynchronous, still feared that parents and students felt that way and wouldn't be willing to pay for it. They wanted to change things as little as possible, and made the false assumption that remote synchronous lecture via Zoom was the smallest change they could make.
Unfortunately, almost everyone looks back on Zoom school and HyFlex as disasters and, in many ways, they were. But one has to wonder about the path not taken. Yes, more change up front would have been necessary, but I truly believe that had we chosen the bolder path at the beginning, we might not now be throwing out all of online education, baby and bathwater. The big winners from COVID seem to have been the for-profit online schools like GCU and SNHU, which stuck to what they were already doing well.
If we don't learn from our mistakes, we are bound to repeat them. Another takeaway from the pandemic is that when employees were allowed to work remotely and were evaluated on performance metrics rather than time in seat, satisfaction was higher and productivity didn't suffer appreciably. So why are we forcing employees back to the office when they can do the job from anywhere? A lot of people want things to "return to normal," and by normal they mean pre-COVID, but the world has changed, and I think we need to embrace that.
10.22.2023 ChatGPT and the AI Panic
In academia and beyond, ChatGPT has been creating quite a firestorm. Although contract cheating (paying someone else to do the work for you) has been around for a while, getting a computer to write a passable essay is new. What has academia in a twist is that essay writing used to make cheating more difficult, and it was long considered the gold standard for ensuring that students "know how to think." Faculty who were disparaging of multiple-choice tests have, until recently, felt secure in the belief that requiring students to write essays allowed them to uphold academic integrity. Although this may always have been a questionable assumption, it is now a more dubious claim than ever.
I wanted to see for myself just how good the software is, so I picked a topic I know a fair bit about, set up an account in just a few seconds, and asked ChatGPT to write me an essay on Natural Selection (Biology). The essay was generated on the fly within about 10 seconds and, had a student submitted this work to me with proper references, I would have shed tears of joy and assigned an A+. Even more amazing, if I click the Regenerate button, it writes a new essay on the same topic that is just as good!
One of the criticisms I've heard of this tool is that while the AI makes a convincing argument, it does not do a good job of sticking to the facts. In my test, that was not the case. The information was accurate and written in the everyday language I would expect a good college student to produce. In other words, I would have absolutely no clue that this essay was machine generated.
Good news/bad news. The good news is that an entire cottage industry of AI detectors has sprung up to respond to this new challenge. The bad news is that the tools for detecting AI-generated content are essentially in the same boat as we are. Between false positives (looks like the student used AI, but he didn't) and false negatives (looks like the student didn't use AI, but he did), the tools are essentially worthless. OpenAI's own tool for detecting AI-generated text catches only 26% of AI-written text, while incorrectly flagging human-written text 9% of the time. It would seem that Frankenstein has lost control of his monster.
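To see just how worthless, here's some back-of-the-envelope arithmetic. The 26% and 9% figures are OpenAI's published rates; the base rates (what fraction of submitted essays are actually AI-written) are hypothetical assumptions of mine, chosen only for illustration:

```python
# Back-of-the-envelope math on OpenAI's published detector rates:
# it flags 26% of AI-written text (true positive rate) and wrongly
# flags 9% of human-written text (false positive rate).

def flag_precision(base_rate, tpr=0.26, fpr=0.09):
    """Probability that a flagged essay really was AI-written,
    given the fraction of essays that are AI-written (base_rate)."""
    true_flags = base_rate * tpr            # AI essays correctly flagged
    false_flags = (1 - base_rate) * fpr     # human essays wrongly flagged
    return true_flags / (true_flags + false_flags)

# The base rates below are hypothetical, for illustration only.
for base_rate in (0.05, 0.20, 0.50):
    print(f"If {base_rate:.0%} of essays are AI-written, "
          f"a flag is right only {flag_precision(base_rate):.0%} of the time.")
```

At a 5% cheating rate, only about one flag in eight would be a correct accusation, while nearly three quarters of the actual cheaters would sail through undetected.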
What to do? I expect that while the detection tools may get better, so will the AI's ability to evade detection. There's probably no winning this arms race. Some will pretend that nothing has changed and that everything's fine. Others will hit the panic button and come up with extreme solutions. One option would be to ban all technology and have students write their essays in controlled labs or proctored online settings where access to these tools is restricted. Another approach would be to lean into the new technology and have students fact-check and provide references for AI-generated content. But there's a middle-of-the-road alternative that may be more reasonable, although it will require some effort: the instructor can interview students one-on-one after they submit their essays, live or via Zoom, and ask them questions about their work. It won't take long to learn whether they know their stuff or not.
09.28.2022 What do floppy disks and women's colleges have in common?
What do floppy disks and women's colleges have in common? Your first response might be that both are obsolete, but no, I don't think that's right. There was a story on NPR the other day about the "Floppy Disk King." Tom Persky describes himself as the "last man standing" in the floppy disk marketplace. While sales of floppy disks have been declining for years, there is a niche market for older equipment that still uses this reliable but obsolete technology. Think of voting machines or scientific and medical instruments, for example, that serve a very specific role, are expensive to replace, and work just fine except for the fact that the world has moved on. As Tom's competitors exited the marketplace, he found himself in a niche market that, for him, was actually growing. Demand may be low, but it's solid and, without competitors, he's got a monopoly.

Then I heard another story about a women's college that was yielding to economic pressures and opening up enrollment to male students, and it occurred to me that it's a similar problem. Many colleges, not just women's colleges, are seeing declining enrollment, and the solution for administrators is to bring in more students by whatever means necessary. For administrators of women's colleges, the low-hanging fruit may appear to be admitting men. However, that change destroys the unique value proposition of a women's college. Many of the faculty, prospective and current students, and alumnae are not happy about it, and the result may be unanticipated attrition of female students, loss of good faculty, and loss of alumni dollars, which may erase any gains made by admitting men. If the president of a women's college were instead to hang on, advertise that uniqueness, appeal to alumnae for support in preserving the tradition of single-sex education for those who want it, and scoop up students from other women's colleges that are folding or going co-ed, that might be a better long-term strategy than becoming just another generic university in an already crowded, competitive market.
08.28.2022 Picking NAU's next LMS
Recently, NAU decided to begin the search for our next Learning Management System. We currently use the venerable Blackboard Learn for most of our online courses, Moodle for our Personalized Learning programs, and LifterLMS for enrollment in our Continuing Education programs. Getting to a single, modern LMS is something I've been advocating for several years, and it's finally underway. This is a large and ongoing effort involving a lot of people, both technical and academic.

We first had to identify some potential candidates, and pretty quickly settled on three alternatives: Blackboard Learn's successor, Ultra; D2L Brightspace; and Instructure's Canvas. Our sister institutions each went their own way, with Arizona State University on Canvas and the University of Arizona on Brightspace. Our local Coconino Community College (CCC) and the Maricopa community college system are also on Canvas. The for-profit, fully online University of Phoenix was an early adopter of Blackboard Ultra. We met with representatives from each company, viewed their demos, built a criteria list, offered faculty sandboxes to explore, and then put it to a campus-wide vote.

In my personal opinion, one that all of the Instructional Designers concurred with, there were two excellent options, Brightspace and Canvas, and one less good one, Blackboard Ultra. I studied all three systems in depth and developed a page contrasting their most important features. Of the three, Brightspace is my personal favorite: it has lots of customization options, it is highly intuitive, it is visually the most attractive, and the support and sales people were really great.

Interestingly, the vote came back heavily in favor of Canvas, with Blackboard Ultra a distant second and Brightspace in third place. You can view the results here. Why? Faculty and students outnumbered the other voting groups, and both groups have heard, by word of mouth from peers at other institutions, that everyone else is using Canvas. There is some truth to that, and I think it was enough to convince most people. Another faction, while unhappy with Blackboard Learn, believed that moving to Blackboard Ultra would be less work than going to a different product, though from our research that opinion appears to be incorrect. While I'm surprised that Brightspace didn't win more people over, most voters really didn't look at the choices in detail, and it was the least well known of the three.

Don't get me wrong: Canvas will be a fine LMS for NAU, and much better than what we have now. But it is far from perfect, and D2L is, in my opinion, putting the most energy into building the best LMS money can buy. Blackboard Ultra is headed in the right direction, but it still feels buggy and unfinished after years of effort and, at the rate Blackboard is losing market share, I worry about the financial stability of the company. So we will be moving to Canvas next summer, with the help of K-16 Solutions as a migration consultant. After over a decade on Blackboard, it will be interesting times ahead.
06.22.2021 Life in the time of COVID
Oh, hi, you're still here! Well, this has been quite a strange interlude, hasn't it?
When COVID-19 hit the U.S. in March of 2020, my office got busy helping to develop what became known as the NAUFlex model of instruction (a variation on Brian Beatty's HyFlex). We have for many years supported the use of online tools for face-to-face classes, a model we call "web-enhanced." We've done the same for fully online asynchronous instruction using Blackboard and its predecessors (and Moodle for our Personalized Learning program) dating back to the early 1990s. Live synchronous online instruction, however, was something we had mostly avoided for reasons I'll get into below, but demand grew rapidly, and we had no choice but to dive in. We immediately expanded support for teaching with conferencing tools like Zoom and Collaborate, with Kaltura for streaming recorded media, and with collaborative productivity tools like Google Apps for Education and (ugh) Microsoft Teams. We were able to convince a few people that uploading PowerPoints and 90-minute lectures was not the best pedagogy. One of us even dropped a best-selling book at an opportune time!
Attendance at our webinars increased by an order of magnitude (be careful what you wish for!), and we received lots of positive feedback about our support. This has been an incredibly difficult year, where our personal and professional lives have been compressed into one, with kids and pets and spouses all locked in with us, but a year we’ve come through about as well as we could by being patient, accommodating, and by working together to address the many unexpected challenges our faculty and students have been facing.
Teaching live online is a hard thing to do well; it's trapeze work without a net. If anything goes wrong in the technology layer, everything comes crashing down. And pedagogical things that are simple in the classroom ("everyone break up into groups of five") are clunky at best with our conferencing tools. We endured bad lighting, questionable attire, mouth breathers, curious views into people's homes, and silly animated backgrounds from people who barely had enough bandwidth for audio. We suffered Zoom bombing by racist idiots, people who thought they were on mute but weren't, and people talking while muted, which is the new ALL CAPS. We had difficulty engaging students, whom we had to coax to turn on their cameras and turn in their work. We radically changed the way we taught this year, all without much preparation.
As vaccines are now available, allowing people to gradually return to pre-COVID ways of doing things, it will be interesting to see which aspects of this tech-infused approach to teaching we each decide to keep, and which will go back on the shelf. I’m sure that for many, who were thrust online reluctantly and without adequate time to prepare, Zoom fatigue and the lack of social interaction with colleagues will result in an eager return to old practices. Others may have found that the necessity created by the pandemic inspired a few new and innovative practices worth holding on to.
We proved that working online from home is effective, and that the flexibility it afforded had great value.
We reduced our use of gasoline so much that oil futures briefly traded at negative prices, and the skies over polluted cities cleared, revealing views of distant mountains many people had not seen in their lifetimes.
We realized that connection with our students and colleagues was still possible in the online environment, and we stretched ourselves to learn new tools and teaching methodologies. We made sacrifices to stay safe and well, and I feel grateful to all who made difficult adjustments to keep the wheels turning. I hope we will continue to use some of the new tricks we've learned, and I also hope to see you all in person again soon.
04.15.2019: The early history of Adaptive Courseware
Back at the beginning of what would become my career in educational technology, when I was a student teacher at Serrano Middle School in 1992, I had the rare opportunity to observe a teaching technology that would still be, or is again, considered cutting edge in 2019. The PLATO learning system, developed at the University of Illinois (the same computing powerhouse that brought us Mosaic, the first widely used graphical web browser), was being piloted at a handful of public schools in California and New York. It was an early version of what we today call adaptive courseware. PLATO ran on a genuine IBM server, a PC tower with a big toggle power switch near the top, networked to a group of thin-client terminals set up in pods of four on round tables at the back of my mentor teacher's classroom. When I arrived, it was in use as a kind of drill-and-practice tool that occupied about a quarter of the students in the classroom at any given time. It was one of my duties to power up the system each day before the students arrived.
Over the course of my time there, I got a chance to observe the system and the way the students used it. Each student logged into the system individually and was led through a series of multiple-choice problems, often with a relevant graphic as a prompt. The questions were generally appropriate, relevant, and well worded. They were not the same for each student, and what I eventually gleaned was that they were presented dynamically, based on the student's answers to previous questions. If a student got a question right, the system would move on to the next topic or offer up a more challenging question in the same area, according to some algorithm. If the student got a question wrong, it would offer up an easier question, or more questions of a similar type. When students completed a set number of questions, about 20 if I recall correctly, they were done for the day, and the machine dismissed them back to the classroom and waited patiently for the next student.

The classroom teacher had no part in this process, and was neither there to explain, nor encourage, nor guide the students. One thing my mentor teacher commented on with puzzlement was that some of his students, who were doing quite well in class overall, were doing so poorly on the computer that they were in danger of failing. I was asked to observe these students and figure out what was going on. What I found was that they were not reading the questions at all, nor attempting to answer them correctly. Often they would not even be looking at the screen. They would just hit the return key repeatedly to jump to the next question until they reached the required question count, then rotate back into the regular classroom activities. Clearly, they were so unmotivated by this system that they were just trying to get through it as quickly as possible, and even the fear of a failing grade was not enough to make them try harder. Middle school students might be a particularly tough crowd.
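As best I could infer it from watching over students' shoulders, the selection logic worked something like the toy sketch below. This is my reconstruction, not PLATO's actual code, and the three-level question bank is invented for illustration:

```python
import random

def drill_session(bank, session_length=20):
    """Toy adaptive drill: a right answer bumps the difficulty level up,
    a wrong answer bumps it down. bank maps level -> [(question, answer)]."""
    level = 1
    for _ in range(session_length):
        question, answer = random.choice(bank[level])
        response = input(question + " ")
        if response.strip() == answer:
            level = min(level + 1, max(bank))   # harder next time
        else:
            level = max(level - 1, min(bank))   # easier next time
    print("Done for the day!")                  # dismiss student, await the next

# A hypothetical three-level arithmetic bank.
bank = {
    1: [("2 + 2 =", "4"), ("9 - 3 =", "6")],
    2: [("12 x 12 =", "144"), ("84 / 7 =", "12")],
    3: [("17 x 23 =", "391"), ("1024 / 32 =", "32")],
}
drill_session(bank)
```

Notice that the return-key trick my students discovered works perfectly here: an empty answer is just a wrong answer, and twenty wrong answers still end the session.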
What I learned then, and what continues to be true today despite higher-resolution graphics and better animations, is that most students need an inspiring instructor to guide them, to praise them, to motivate them to care about the material, and to challenge them to understand it. This might seem astonishing to a computer programmer, who might logically assume that students are there to learn and, if presented with clearly worded questions at the appropriate comprehension level, will rapidly progress to new levels of understanding. Evidently not. Most students just aren't motivated enough to learn entirely on their own. If they were, we wouldn't need teachers and schools, only textbooks and libraries. When the computer praises them with a "good job," it feels empty. When it pushes them harder, they just lose interest. This might help to explain why MOOCs, a more recent attempt at competency-based, self-paced learning, have also failed: the completion rate of MOOCs, according to a recent and ongoing study, is somewhere between 5 and 15 percent.
Can this type of learning work? I think we can say one thing for sure: it's not only the quality of the content, but the motivation and self-discipline of the students that determines whether independent learning can work. Maybe graduate students would make a better audience than middle schoolers. But even there, I think people respond better to an inspiring instructor than to a rich piece of content. A short video might pique your interest, but sticking with it requires some recognition that you're working hard to achieve something, and regular injections of real enthusiasm help! I'm just not sure that's something the computer can provide. Good news for teachers and professors everywhere: you've got some job security for now.
04.04.2019: AI or IA?
Artificial Intelligence is much in the news these days. On campus, we have six-wheeled autonomous robots delivering food to the dorms. At an educational technology conference I recently attended, the inflated expectations around the potential of AI and "machine learning" were alarming, particularly because I found myself in the small minority of skeptics.

As I see it, there are several problems with AI, and the first one is the biggest: it doesn't exist. What people are calling AI is just good programming. If a programmer can anticipate most of the common scenarios, it can appear as though the robot or program is intelligent. But it's not. My Nest thermostat has some clever programming: using motion sensors, it lowers the temperature of the house when it determines that nobody is home. Some people call this machine learning, but that's a stretch. The behavior was programmed by a person. The only thing the thermostat "learns," if you can call it that, is our general pattern of occupancy. While that is cool and useful, true AI, as I see it, must be able to adapt to novel situations, to anticipate based on prior experience, and to avoid past mistakes. We are so far away from true AI that it's a disservice to use the term.

Autonomous vehicles are a great example of programming based on anticipated problems. Equipped with GPS, an accurate digital map, and a collection of good sensors, a computer program can drive a car better than a distracted or drunk driver under most conditions, and maybe even better than an attentive novice driver when certain predictable conditions, like a slippery roadway, arise. But just how much less adaptable an AI is than a human being was underscored by the two recent tragic crashes of the Boeing 737 MAX 8. If accounts are true, the pilots understood the problem perfectly, but were unable to wrestle control back from a misguided computer that, relying on a faulty sensor, kept nosing the plane down to avoid what it "thought" was a stall, resulting in the deaths of almost 350 people. The best evidence that there was no intelligence in this AI is that both planes hit the ground at tremendous speed, against the express wishes of the pilots. Some might argue that machine error is less common than pilot error, but that's a tough sell, and the machines are going to make different, dumber mistakes than the pilots would, at least in some situations.

I hope we have learned from this example that an AI can't be trusted entirely, that it needs an easy-to-reach off switch, and that its judgment, especially when human lives are at stake, should be based on input from more than one sensor. In fact, decisions should be based on the inputs from a minimum of three sensors, so that the majority rules when there's a disagreement. When someone steps in front of a food delivery robot, it freezes like a deer in the headlights. We can't afford to fly planes that way.
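Here's a minimal sketch of the kind of voting I mean. The angle-of-attack readings and the tolerance are invented for illustration; no real flight computer is this simple:

```python
from statistics import median

def vote(readings, tolerance):
    """Triple-redundancy voting: take the median of three sensor readings
    as the consensus, and flag the result as untrustworthy if any sensor
    strays too far from that consensus."""
    consensus = median(readings)
    worst = max(abs(r - consensus) for r in readings)
    return consensus, worst <= tolerance

# Hypothetical angle-of-attack sensors (degrees); the third has failed high.
aoa, trustworthy = vote([4.8, 5.1, 22.7], tolerance=2.0)
if not trustworthy:
    print("Sensor disagreement: disengage the automation, alert the pilots.")
else:
    print(f"Consensus angle of attack: {aoa} degrees")
```

With a single sensor there is nothing to vote against; the bad value wins by default, which is essentially what happened on the 737 MAX.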
Rather than AI, I think what we need is IA: Intelligence Augmentation, where there is a synergy between the real intelligence and the machine helper. We need tools that allow us to visualize things we couldn't see in the raw data, to steady our hands, and to do things with more finesse than we could manage unaided. The software that keeps a drone stable in a fixed position is far faster and more deft than a human pilot could be, but left to itself, it will hold the drone in that position until it runs out of battery and crashes. A modern airplane likewise relies on a variety of computers to assist in flying the plane. When Sully safely landed a passenger jet on the Hudson River after multiple bird strikes took out both engines, he deserved a ton of credit, but he brought the plane down on a glide path guided by the computer.

I would rather live in a future where machines enhance and augment what we can do with our intelligence, creativity, and good judgment than one where the machines lock us out and make decisions on our behalf. When I first set up our Nest thermostat, I miswired it so that the heat, once called for, never shut off. I woke up in the middle of the night to a house that had been heated to over 90 degrees and rising, and joked that the AI was trying to cook me. I can think of nothing more disturbing than being in the cockpit of an airplane, losing the battle with a machine intent on crashing the plane out of a misguided sense that it is being helpful. I have no interest in machines taking away our good jobs or our freedom of choice. I never found Microsoft's Clippy particularly helpful at making a numbered list. I believe in human potential, and in technology that helps us to be more than we are, not less than we are.
02.12.2019: Disruptive Technology
The pace of technological change is accelerating, and many of these changes are disruptive to the kinds of jobs we do, and the kinds of services we enjoy. Consider how each of the following professions or industries has been disrupted:
Old Industry | Disruptor
--- | ---
Typesetting | Desktop Publishing
Secretaries | Word Processors
Bank Tellers | ATMs
Music Industry | Napster-->iTunes-->Spotify
Hotels | AirBnB/VRBO
Travel Agents | Expedia, etc.
Encyclopedias | Wikipedia
News Media | Social Media
Classified Ads | Craigslist
Big Box Stores (Sears) | Amazon
Yellow Pages | Google
Taxis | Uber/Lyft
Movie Rentals (Blockbuster) | Netflix
Landline Phones | Cell Phones-->Smartphones
Gas-Powered Cars | Electric/Self-Driving Cars
Maps | GPS/Voice Navigation
Fossil Fuels | Wind/Solar
Assembly Lines | Robots
Medicine | Genomics (PCR, CRISPR)
Academia | Online Courses
Did I miss any big ones? E-mail me your additions and I'll update the list.

Where I work, in higher education, change has been slower and less transformative so far, but I think we're just at the beginning, and there are a number of critical factors that will drive this change and disrupt academia too. The first is cost. An undergraduate degree has never cost more, but at least in the past you could be assured of a long and prosperous career if you had one. In today's economy, employers tell us that our graduates arrive unprepared for the kind of work they'll be doing. Many of them will never earn the kind of income necessary to pay off those giant student loan debts, and most cannot even count on a job that provides medical benefits when they get sick. Under this kind of pressure, something has got to give. Maybe it's already started? Enrollments are down 9% since 2011, and a number of small schools have shut their doors or engaged in mergers. These trends are even more worrisome in parts of the country that are losing population, such as New England and the Midwest. It could be that the value proposition of an undergraduate degree is fading, and that students are choosing to forgo the debt load that goes along with it. We'll be watching this trend closely in the coming years, and trying to find ways to stay relevant and affordable. Personalized Learning, Competency-Based Education programs, and online courses in general are promising possibilities, but the quality and relevance of these programs will have to increase, and there will be more schools on the losing end than the winning end. The keys will be ease of use (convenience), quality and breadth of offerings, and price, and it will be difficult to win without strong differentiation from the rest of the market.
LMS Marketshare Latest Numbers
Blackboard has a big problem. Absorbing major players like WebCT and ANGEL hasn't helped them grow, and neither their dubious patenting of the LMS nor the threat of lawsuits against competitors scared people away from the alternatives. Nobody is moving to Blackboard. They are several years into a complete LMS overhaul, and the migration path from Blackboard Learn to Ultra is anything but clear. It would be foolish to migrate to Learn at this point, because you would need to migrate again shortly, and it would be risky to commit to Ultra while it remains so unfinished, when there are solid alternatives like D2L Brightspace and Canvas to choose from. Blackboard desperately needs a win, and I suspect they made a screaming deal with the University of Phoenix, which has also struggled in recent years. The best Blackboard can hope for is that there won't be new defections, that current customers will move to Ultra when the time comes, and that Ultra will be good enough to attract new customers. That's a tall order.
- Start small and build strategically (fast/cheap/good? pick 2)
- Identify some early wins!
- Evaluate and improve existing content with potential
- Create More Differentiation
- Who’s our competition?
- Community Colleges (cheaper)
- Other State Schools
- ASU (more students, more money)
- U of A (more prestigious, research oriented)
- For Profits
- U of Phoenix (in re-organization)
- Capella
- Strayer
- Southern New Hampshire University
- GCU (gained as U of Phoenix suffered losses)
- Western Governors University (accreditation issues)
- Arizona Regents University (now defunct?)
- This could be revisited so that students can more easily build a degree with a combination of ASU, U of A, and NAU course credits.
- What are their strengths?
- The For-Profits
- Great marketing
- A clear choice of offerings
- Organized, efficient, adaptable
- Well-funded
- The Public Institutions
- Cheaper
- In-person, blended, and fully online offerings
- Full-time faculty with deep expertise
- Reputation (trust of traditional institutions)
- What are their weaknesses?
- The For-Profits
- Credibility (skepticism about online degrees)
- Lower student success rates
- Higher costs; larger student loan burden
- The Public Institutions
- Disorganized, un-strategic online programs
- Faculty who aren’t experienced online instructors
- Reliance on “canned” content available elsewhere
- Resistance to, and slow pace of change
- How we can do better:
- Better Courses
- Existing Courses (but many need improvement!)
- We already have hundreds of courses
- We already have dozens of online programs
- Standards for Design, Content, Delivery
- Go Beyond QM: Cover All Three Elements of Quality
- Play on Regional Strengths
- Partnerships with Native American Colleges
- Partnerships with Community Colleges
- Tool alignment, resource sharing
- Take advantage of our geographical location
- Rural Arizona
- Native American Reservations
- Colorado Plateau
- Grand Canyon, Sedona
- Southwest
- Deserts
- Build Specialty Programs
- Be the Best at a Few Things, not bad at many
- Build gradually, strategically, intentionally
- Avoid expensive contracts with big publishers where possible
- Canned content creates a race to the bottom
- Use open courseware where possible
- Support strategic development of original custom content
- Develop select “flagship” courses available nowhere else
- Think Netflix and Amazon original content for example
- We already have a handful of unique world class courses
- No developing a course while teaching it
- Better Instructors
- Select for Online Teaching Experience
- Provide Subject Area Mentor Instructors
- Provide Training and Support in Online Pedagogy
- Waive in-state hiring requirement
- Required New Hire Orientation Training
- Recruit Faculty with TPACK (Technological, Pedagogical, and Content Knowledge)
- Annual Peer Review of the course and the teaching
- Excellence Awards
- Incentives for larger (more profitable) classes
- Course development stipends (we used to pay $5000)
- TAs, Reader-Graders
- Adequate Compensation (stop the revolving door)
- Better Faculty Support (e-Learning has these skills)
- Experienced Pedagogy People
- Experienced Media Designers
- Established relationships with faculty
- Better Administration
- 3-6 month new course development process
- Course refresh and continuous improvement process
- Master course archive
- Better LMS and other tools
- Short term: Get PL onto Blackboard (now)
- Medium term: Integrate PL with adaptive courseware tools (1 year)
- Long term: Consider an LMS transition (2+ years)
- Faculty are ready to move, Canvas is the most popular option
- Better Credibility
- Keep ProctorU
- Keep Kaltura
- Adopt TurnItIn
- Build and Staff Testing Centers (this is controversial)
- Promote Higher Order Thinking Skills
- Allow collaboration
- Allow the open web
- Shift away from memorization
- Build the reputation (this will take time)
- We need to stick with it for at least 5 years!
- We need to market ourselves like the for-profits
- We need to improve the clarity of our offerings
- Lower Operating Costs
- Open Source Materials
- Retention of Talent
- Efficiency
- Clear Focus
- Better Student Support
- Assistance with Financial Aid
- Assistance with Course and Program Selection
- Monitoring of Student Progress
- Better use of Analytics
- Timely Intervention
- Don’t worry about local students taking hybrid and online courses
- Students will do what works for them
- Flexibility
- Personalized Learning
- Variable Course length
- Variable Start times, End Times
- Mastery is what matters most
- Great Self-Service Options
- Personal Attention
In a meta-analysis of a dozen recent studies, a non-profit education research organization has concluded that there is no significant difference in student learning outcomes when comparing performance in face-to-face courses with hybrid and fully-online courses. Mode of delivery matters less than the quality of the individual course and its instructor. No surprise to me! This finding backs earlier work.

http://nosignificantdifference.org
If you're changing LMS this year, you're likely going to Canvas
Although the title of the e-Literate article says that Blackboard may be turning things around, the graph doesn't show it. If you're a school in the U.S. or Canada and you're changing LMS, odds are high that you're going to Canvas (61%) or D2L Brightspace (38%). Note that this graph does not represent current market share, but where the switchers are going. Still, it's a strong predictor of future market share, and it shows that Blackboard is in trouble. Blackboard's current goal seems to be to hold onto existing customers and stop the bleeding until their next-gen LMS, Ultra, is ready for prime time. That date has been slipping for some time, however, and I'm pretty certain that if I had to switch LMSes today, I'd go to Canvas. I've been a fan since 2009. But the decision to switch is not an easy one. Changing LMSes is like moving to a new home in a new city in a different country: the move is difficult, time consuming, costly, and painful, even when the end result is a better system, so it's not to be undertaken lightly. So why isn't anyone moving to Blackboard? The answer is simple: you'd have to migrate twice, once to the current system, and then again to the new one. That would be nuts.
In case you missed it, Western Governors University has been audited by the U.S. Department of Education's Office of Inspector General and found not to be distinctly different from a correspondence school. If the finding is upheld, WGU could be forced to return $713 million in federal financial aid. This has serious implications for all schools involved in distance learning and personalized learning! But the Feds are right about one thing: an online course can and should be much more than a correspondence course.
10.13.2017 Update: "WGU is not off the hook."
Traditional testing forces students to cram, regurgitate, and forget.
Have you ever thought about why we test students the way we do? What do I mean? Well, we generally test students in isolation from each other. We disallow aids like notes, calculators, textbooks, and cellphones. We ban the use of Google and Wikipedia. We set strict time limits and restrict students to their seats. We use a lot of multiple choice and fill-in-the-blank, with perhaps a smattering of short essay. Someone is watching constantly. Now, you may be thinking, "Of course. How else can we keep them from cheating? How else can we find out what they know? How else can we keep them from helping each other?" I would argue that those are the wrong questions. Sugata Mitra has an interesting TED talk in which he develops the idea that the present-day education system remains much as it was designed by the British Empire in the 18th century. At that time, what was needed were clerks and bookkeepers who could do math in their heads, and read and write without help, primarily to keep track of goods moved around the world in sailing ships. He argues convincingly that the education system isn't broken; it works remarkably well. It's just that it trains students to do things there is little need for in the information age. Rather than testing for the ability to memorize and regurgitate without understanding, we need to redesign assessment around collaboration, persistence, synthesis, and creativity.
When we attempt to solve a problem at home or at work, what are the first things we do? Gather some background information. Consult an expert. Get some help. Brainstorm. Try more than one approach. Keep at it. None of these methods is allowed during a test, yet this is the way we solve problems in the real world. Sure, we need to have the vocabulary: when I go to the hardware store, I need to be able to explain the problem so they can recommend the right tool. Yes, I need some basic understanding as a foundation. But why, in the 21st century, when all of the knowledge of humanity is a few clicks away, must I regurgitate memorized facts on an exam without any help? How often would I not have access to these resources in the real, everyday world? Perhaps if I'm lost in the woods and my cell phone is out of juice, I would need to solve a problem in isolation and without assistance. But that seems more like the exception than the rule. Cramming for a test, regurgitating a collection of memorized facts, and forgetting it all the next day is the educational equivalent of bulimia. There is little educational value in consuming information that you can't retain, just as there is little nutritional value in eating food you don't keep down.
Most problems we face in the real world don't occur in an isolation chamber. They don't come with someone hovering over you with a stopwatch. They don't require that all of the knowledge needed to solve the problem already be in your head. They don't require you to stay seated, or to work alone. They don't present you with five distinct choices, only one of which is correct. They don't allow you only one attempt. That would be crazy. And yet, that's exactly how we test students, from elementary school all the way through college. Think about these questions for a bit. What kinds of students are successful at that kind of testing? How well does it predict their future performance on the job? What skills do employers regularly ask for? When hiring someone, is it more important that they already know how to do the job, or that they are creative, persistent, able to learn, and able to work well with others? How well do we prepare students for the challenges they will face?
What skills do we employ in modern-day problem solving? Usually, we start by gaining an understanding of the problem, either by doing research or by getting help from someone who knows more about the topic. Once we understand the problem, we develop one or more strategies to solve it, based on cost, time, effort, and available resources. Often the first solution is inelegant, but it might be good enough. "Fail small, fail often" is advice I've heard from many successful problem solvers. Don't be afraid to try things. Break the problem into pieces and solve each part separately. Creative solutions rarely come from aiming directly at the problem and going full speed ahead. The key point here is that we learn to be creative by attacking problems not with a head full of facts, but with a kit full of tools that can be used again and again. You may be thinking that I've got a point, but that it's easier to grade answers right or wrong when we test facts, not opinions. It's actually not so hard to grade students a better way: you look at how they tackled the problem. It's the difference between awarding points only for the answer and asking students to show their work, evaluating both the quality of the end product and the sophistication of their methods. Let them work in teams. Let them use any resources they can get their hands on. This is an approach to teaching and learning that actually prepares students for a job in the real world.
"But wait," you're saying. "If I assign group work, how can I tell who did what?" Yes, that can be tricky. We've all been assigned to a team where one person does almost nothing, and gets the same amount of credit as those who pulled most of the weight. That's a problem with the way the group members were evaluated. But guess who knows who did which parts of the job, and how well they did them? The members of the group. A very clever way to grade students is to have them evaluate their own performance and that of their fellow group members by secret ballot. Average out the peer grades and compare it to the grade they gave themselves. You'd be surprised how accurately this will match your own observations, and how well it reveals who did the work. Of course, you also assign the work an overall grade, so that if everyone agrees to give themselves higher grades than they deserve, there is a correction factor. This method may need to be employed more than once before students realize that their actions are accountable, so don't give up after just one try. You will find that it becomes even more effective as time goes on.
There is another thing you can try when assigning group work, if you're still having challenges: identify the different kinds of work necessary to put together the final project. For example, in a lab experiment, one person is the group manager, whose job is to lead, organize, plan, make decisions, and settle disputes. Another is the experimenter, the hands-on person, who must be good at understanding and following instructions. A third is the data collector, who might also be in charge of creating graphs and charts. A fourth is the analyst and writer of the report. A fifth is the presenter. These are somewhat arbitrary divisions of responsibility, but you get the idea. When you assign duties within the group, people sort themselves into the kind of work they like to do. Students who hate to get up in front of others and talk might be excellent writers. Students who like to present might not want to get their hands dirty, or be good at following detailed instructions. That's OK. Everybody can make a contribution. And if someone really wants to work alone, let them, as long as they understand they have to do the same amount of work as a whole group would. That's how the world works.
Panel discussion on 21st Century Classrooms
Recently, I was asked to speak about what I envision as the "classroom of the 21st century." Given that we're already 15 years into the new century, that might seem like an easy task. However, when asked to predict the future, I always think of the scene in the movie "Metropolis" where the biplanes are flying between the skyscrapers, or the spaceplane in "2001: A Space Odyssey" owned by the now-defunct Pan Am. The future tends to unfold in ways we can't imagine. Some of the (at the time) amazing technologies of Star Trek, like the communicator, already look primitive compared to today's devices. Others remain as far away as another star system.
The venue was a technology conference sponsored by technology vendors, so the expectation was that we would talk about technologies that would transform education as we know it. There were impressive demonstrations of virtual and augmented reality and telepresence: things that require lots of bandwidth, robust connections, serious computing power and, most of all, a lot of back-end technical support. One presenter showcased an elementary school in Ireland that was doing cutting-edge work with VR, but noted with incredulity that, in one of the slides, a child was sitting in front of a CRT monitor rather than a flatscreen. Another presenter lamented that the computers found in public schools often don't have video cards capable of keeping up with his 4K video. Unwittingly, these presenters got to the heart of the problem. Public schools, even in the 21st century, don't throw away old equipment that still works. A cathode ray tube monitor may be a throwback to the 1990s but, if it's working, it will continue to be used, and limited funds will be diverted to higher priorities, like things that don't work at all. In my experience, schools don't even throw away broken equipment, because they may need to cannibalize it for parts. Teachers are resourceful and frugal.
Now, if you've read any of my previous posts, you already know that while I like technology, I am often skeptical about expensive technical solutions to pedagogical problems. When I taught high school in southern California, one of the teachers, a friend of mine, worked in a south-facing classroom with a big bank of windows, and actually passed out in front of her students one sunny spring day while teaching in 90+ degree heat. The administration had a priority system for installing air conditioners: if a room had a computer, it could have an air conditioner, because the administrators didn't want the computers damaged by operating in excessively warm conditions. At my urging, she requested a computer, for which there were ample technology funds, and she got an air conditioner as part of the bargain. I don't think she ever used the computer, though she let the kids who finished the lesson early play solitaire as a reward. And she effectively delivered her lessons in a classroom that was at a pleasant 70-something degrees. That's the kind of creative thinking it takes to operate in the 21st century classroom.
iPads are a recent example of a technology that was supposed to transform education. Large school districts made multi-million dollar deals with Apple to put an iPad in the hands of every student. They were supposed to replace textbooks. They were supposed to fill students with wonder and the passion to learn. That didn't work. A few years down the road, many of these districts are finding that iPad management tools are lousy, the devices are fragile, and they go obsolete far too fast for what they cost. But the biggest problem of all might be that many students already have one at home and use it primarily for playing games, so that's what they want to do with it at school. Many districts are now dropping iPads and looking at cheaper, more rugged Chromebooks. While these may prove to be a better investment, the top-down approach to deploying technology remains a problem.
Top-down technology investments in education assume that if you make a technology available, the teachers, who are dedicated professionals, will figure out some appropriate instructional uses. This is the wrong approach. We need to design curriculum with the learning objectives in mind at the beginning, and then provide the necessary instructional support resources, including but not limited to technology, to help make the learning happen. That means we need to ask teachers what they need and, where reasonable, do what we can to meet those needs. Some teachers might make great use of iPads, while others would prefer video cameras, or art supplies, or new textbooks. When the teachers at my school were asked to help design a new science building, they asked for lab benches at the back of the classrooms, including big flat work surfaces, with clean sight lines, and lots of storage cabinets. The need was easily met, and the rooms were both popular and effective. However, the architect decided to ignore their request for windows, so the biology classrooms can't have any live plants. A classroom designed around the needs of the individual teachers and the learners is going to be far more effective than a top down directive to use this or that gadget to transform the learning process.
Sometimes a technology is offered as a solution to a problem because of cost or practicality. One example I heard recently was that field trips and labs are too expensive, so let's "virtualize" those experiences: then the students can learn about the Grand Canyon, or a marine ecosystem, or the anatomy of the frog on the computer. The result is never as good. Intuitively we all know this, and yet we are lured by the promise that it will be just as good (almost as good?), and will cost less, or be safer, or will not require tedious permission slips. But the experience isn't the same. The canned tour of the canyon doesn't include a tough hike. It doesn't smell like hot dry air and desert flowers. It isn't nearly as fun. I can't flip over a rock in the simulation, because the designer didn't program that option. The slippery boulders, the icy cold water, and the experience of catching a fish out of a tide pool with a dip net are so much more vivid than the best sim. And it turns out that the infrastructure and technology required to design, deliver, and support a really good virtual reality experience is incredibly expensive, perhaps more expensive per student than a dozen field trips. A cadaver lab is very expensive too, but I want my medical students looking at real specimens rather than photos and canned simulations. How else can they discover individual variability? Or the effects of aging and disease? Or the differences between male and female anatomy? When a doctor goes into surgery, you don't want to hear the words, "It didn't look anything like this in the simulation." Even if it costs more, the experience is worth the price if the students remember it years later, and that's much more likely with the real thing than with a simulation.
So what does a classroom of the 21st century look like? I have two kids, ages 10 and 13, in the school system right now, so I can tell you. Generally, it's a poorly designed, overcrowded room in a decades-old building in need of major maintenance, with peeling paint, lousy acoustics, a heater that clanks all day long under flickering fluorescent lights, a mix of brand new and ancient, working and broken equipment, a lot of duct tape and plastic buckets, and a whole lot of heart. The passion and drive of dedicated teachers is what keeps it all going. OK, so that's what we've got. In my dreams, what could we have?
The top priority is support. Give teachers the support they ask for. Did you know that teachers supply most of their classroom materials out of pocket, and rarely get reimbursed? Putting an expensive technology that nobody asked for into a classroom is not going to be effective. Raising salaries, reducing class sizes, providing funds for a teacher's aide, or fixing the broken desks and chairs is generally what teachers are looking for. If they want to use a technology, by all means, see if it can be provided. But ask them what they need rather than telling them what they should be doing. That alone would revolutionize teaching and learning in any century.
Even if the teachers wanted a specific technology, and got it, the support for it tends to end when the technology has been set up by a technician. How many bad PowerPoint presentations have you seen? Was PowerPoint broken? Nope. It was misused. Without the kind of tech support and professional development needed to ensure that the technology is used properly, these high-tech initiatives always fail. There are some great and effective uses for technology that can facilitate teaching and learning, but the initial investment must be followed up with funds for ongoing maintenance, tech support, training, and planning.
Some people imagine a future where students are taught entirely by computers with clever algorithms that adjust the content to the pace of the learner. In my mind, this is like getting your nutrition from a pill. In the near future, I don't see a computer, no matter how cleverly programmed, inspiring students the way a good teacher can. If we invest in teachers, and let them pick the technologies they want to use, the classroom of the 21st century could really knock our socks off! It might not be cheap but, as the joke goes, "Education might seem expensive, until you consider the alternative."
Michael Wesch is one of education's big thinkers. One of his notable sayings is that "College is for learning, and everyone can learn, so college is for everyone." That's a lovely sentiment. On the surface, it seems so obvious that it's like a human right. It should be in the constitution. Life, liberty, the pursuit of happiness, and a college education. Some have even gone so far as to say that a college education is such a fundamental right that it should be free for everyone, paid for with tax dollars! But as great as it would be for everyone to be smarter, and as nice-sounding as Wesch's syllogism may be, he couldn't be more wrong. The transitive property works in math class, but his logic is deeply flawed. College is not for everyone.
It's September, and I work on a college campus, and I see so many smiling, happy students walking around, riding their skateboards, playing with their smartphones, rushing for fraternities and sororities, drinking their $5 coffees in their exercise wear. And I think to myself, "Why are you here?" I have heard all of the answers many times. Senior faculty say, "You're here to learn how to think." Junior faculty say, "You're here to learn about this subject." Serious students say, "I'm here because I want this or that career and, to get there, I need this or that degree." Less serious students say, "Well, I graduated from high school, and college sounds like a good party." Parents say, "A college degree helps you get ahead." The administration says, "We offer a number of excellent programs and, if you are admitted (assuming you pay, and you work hard), you can get a degree in this or that." All of these statements are mostly true. And yet, college is not for everyone.
A person can learn a lot without spending any money on a college degree. Wikipedia contains all of the information I studied in college. Before the Internet, all of that information was also in the public library. What colleges provide, for a price, is a certificate that says you have progressed successfully through a course of study. The degree is to knowledge as money is to gold. The degree and the dollar are only pieces of paper that represent something of intrinsic value. The employer accepts the value of the diploma. But what happens if a nation just starts printing its currency without the resources to back it? What happens if colleges start cranking out diplomas for everyone who shows up with a student loan? And even if we don't allow that standards have slipped (though it seems likely they have, since there is not an unlimited supply of deep-thinking, hard-working students, and the number of qualified, full-time faculty is actually in decline at most institutions), if there are more people with diplomas and the number of jobs remains constant, the law of supply and demand takes effect. The supply of diplomas outstrips the demand from employers. It becomes a buyers' market, and the value of your degree slides.
I would argue that a college education can be a great and wonderful thing if one takes it seriously. But the growth in enrollment, and the growth in cost, to me, looks an awful lot like the recent housing bubble. I just watched an excellent movie called "The Big Short" and one of the points it made is that housing prices kept rising through the nineties and into the two-thousands, and houses kept selling, even while incomes remained flat. How was that possible? It was possible only because people were getting loans for houses they couldn't afford. Today, there are an awful lot of people taking out big loans for college degrees and those degrees, in many cases, are not leading to good-paying jobs. Doesn't that sound like a bubble?
So why isn't college for everyone? Because as more and more people go to college, the value of the degree is diminished. Most employers look for a degree but don't pay much attention to the school you got it from, or the grades you got. Therefore, with the exception of a few elite schools, all degrees are equivalent. Most people today don't end up working in a field related to what they studied in college. Most employers are looking for things you didn't learn in college. As the diploma is devalued, employers will need to use other criteria to decide whom to hire. That's already happening. Everyone who applies for a job has a degree. It remains a prerequisite. But for how long? How long before employers start saying, "Anyone can get a degree, so we don't care whether or not you have one"? And if the degree stops giving people an advantage, it won't be long before people stop paying for it. At some point, doesn't it seem likely that this bubble will pop?
If you're not serious about your studies (and, honestly, how many kids fresh out of high school are?) chances are pretty good that your grades are going to suffer. After all, you have shelter, money, food, you're surrounded by people your own age, and you're out from under mom and dad's rules. Party on, right? That's not to say that college is only about academics, but academics should be at least one of the reasons you're there. A gap year or two, working a job, might be a very valuable experience, even if it's not, and maybe especially if it's not, the greatest job. There are lots of things to learn outside class. Learning how to live on your own, cook and clean for yourself, and figure out who you want to be are all important. But you don't necessarily need to be in college, at least not right after high school, in order to do those things. Learning how to show up on time, get things done, deal with bad customers, bad bosses, bad colleagues, and bad roommates, live within your means, and pay your bills are all very valuable life experiences we don't learn in high school or while living at home. So why not save some money, think about what you want to do with your life, grow up a little, and bide your time?

And if you decide college is not for you, there are lots of careers where you can do just fine without a college degree. In Germany, for example, only about the top 30% of high school students go to college. Many more go to two-year trade schools or apprenticeships and get jobs in areas where a four-year degree is not necessary, such as factory worker, physician's assistant, plumber, technician, carpenter, bank clerk, etc. When they change jobs, they may need more training, but a four-year general degree isn't what's missing.
College isn't for everyone. Yes, everyone should have the opportunity to go, if and when they are ready. But you need to know what you want to do with your life, be open to new ideas that will make you rethink what you thought you knew, be passionate about learning for its own sake, and be ready to work harder, study harder, write better, and think deeper than you've ever thought before. College recruitment ads should be like those drug commercials:
"College is not for everyone. Known side effects include crippling debt, confusion, exhaustion, disillusionment, joblessness, and depression. Consult your professor to see if college is right for you."
By the way, just in case you don't want to take my word for it, the guy who made billions shorting mortgages when nobody else thought there was a housing bubble is now shorting the for-profit colleges.
The Good:
NAU has received a grant from APLU (the Association of Public and Land-grant Universities) to explore Adaptive Courseware with our faculty and students. Before we get into the details, it's a good idea to unpack those words. Courseware is content in software form, rather than a physical, bound, paper textbook. It's also a collection of assessments (tests) of the learner in various machine-gradable (multiple choice and similar) forms. But e-books from the big publishers like Pearson, Wiley, McGraw-Hill, and Cengage have been around for a while and are in fairly general use. It's the adaptive part that's new. Both the course content and the assessments can be adaptive.
So what do we mean by adaptive? The basic idea makes great sense: the software monitors the way students progress through the course material and how they do on the assessments, records their responses and their path for analytic purposes (noting parts of the course that are causing problems for students, for whatever reason), and presents different content and test questions based on their responses. If a student isn't getting it, or if they are bored silly, it makes no sense to keep throwing more of the same stuff at them. It makes much more sense to divert the struggling student into remedial or preparatory content, and to present deeper, more sophisticated content to the student who isn't being challenged.
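To make the branching idea concrete, here's a minimal sketch in Python. This is not any vendor's actual engine; the thresholds and module names are hypothetical, chosen only to show the shape of the routing decision a real engine would make after each assessment (a real system would also log every response for the analytics described above):

```python
# A minimal sketch of adaptive branching. Thresholds and module names are
# hypothetical, not taken from any vendor's product.
def next_content(score_percent: float, unit: str) -> str:
    if score_percent < 60:
        # Struggling: divert to remedial or preparatory material.
        return f"{unit}: review module"
    if score_percent > 90:
        # Unchallenged: offer deeper, more sophisticated material.
        return f"{unit}: enrichment module"
    # On track: proceed to the next unit as usual.
    return f"{unit}: next module"

print(next_content(45, "Photosynthesis"))  # Photosynthesis: review module
print(next_content(95, "Photosynthesis"))  # Photosynthesis: enrichment module
```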
What are the reasons students have problems with the content? The three most obvious ones are that 1) the content is unclear, 2) the content is not engaging, or 3) the content is not appropriate to the student's level of prior knowledge. With data, we can fix these things and make the course better, with the ultimate goal of aiding the learner.
Why is this, potentially, a great idea? Retaining students is cheaper than losing them and recruiting new ones. Making sure that students are ready for the courses they are enrolled in increases student success in a meaningful way, makes professors happier, and helps the reputation of the institution.
How does this transform education? As content is personalized, it breaks the traditional lock-step approach, where (if we're lucky) the middle 50% are getting it, the bottom 25% are failing, and the top 25% are bored. It allows struggling students to get the background information they need to succeed, and it allows the advanced student to finish early or go farther, and to have that deeper level of mastery get recognized. It allows everyone to proceed at their own pace, and to spend as much time as they need on the parts that are challenging, and to go deeper into material that interests them.
Well, that all sounds great. What's the catch? That's next...
The Bad:
For our first round of product reviews, we're looking at four vendors: Acrobatiq, Knewton, CogBooks, and Learning Objects' Difference Engine (a division of Cengage). What we have quickly realized is that each of these tools has a very different user interface, which goes against the efforts we've been making to create a more consistent look and feel across all NAU courses. This makes it more difficult for students to navigate, because every course might be very different in its layout and design. It also relegates our learning management system, Blackboard, to the role of a portal. Students log into Blackboard, find the course, click on a link, and then leave the LMS and land on one of many possible courseware sites. With lots of integration work, we can get grades back into the LMS, but that's about it. Nothing lives in the LMS.
We have also realized that adopting these tools results in a considerable loss of intellectual freedom over the content and delivery of a course, because the content is deeply intertwined with the adaptive engine. In order to get good analytic data on learner behavior, content needs to be thoughtfully tagged and progress needs to be carefully tracked, which makes authoring content more challenging than just knowing the subject matter. In most cases, authoring is a job for instructional design professionals, not just content experts. From the vendor presentations, it's not clear what the role of the instructor is with these tools, and therefore it's hard to see where our faculty experts can add value or personalize the content.
Difficulty customizing content raises another question. How does one school differentiate itself from another if they are both using the same product, and the product is not very customizable? How does NAU compete against a school with lower tuition costs, for example? How does an NAU instructor integrate content on life at elevation into a biology course; something that is both relevant and interesting when you live at 7,000 feet? When students can shop around for courses, why would they pick our version?
Another common objection is that students who don't get it have more work to do, or that students who are doing well get more work piled on. This may seem reasonable, but it is a very foreign concept for most students, and the students will resist it unless the rewards are tangible. We haven't transformed the rest of the educational process, so those rewards are not clear at present.
The final issue is that, beyond the introductory level, there isn't (and may never be) a lot of adaptive courseware content, because it's a niche market. There is also little support, at present, for assessments that can't be machine-graded. That means that liberal studies and the humanities are not going to be well served in the near term. Therefore, this is not a complete solution.
The Ugly:
The APLU grant is only seed money. Purchasing adaptive courseware is a new financial burden that will fall either on students or universities. It's pretty clear that this is the future and, if they get it right, the potential is great. But it's also clear that we're moving towards a world where people are taught by machines, and the human factor is being pushed to the side. Are we doing this because it's a better way, or because it reduces costs? I don't know about you, but a computer telling me, "Great Job, Larry" isn't very motivating.
Academia, with its medieval-era academic robes and feudal power structures, is going to be disrupted by technology in ways that will make what has happened up to this point seem inconsequential. While academia has held out longer than some other powerful institutions, it is vulnerable to disruption for the same reasons, and recent trends in higher education have only exacerbated the situation.
Why does disruption occur? In every case, the product or service offered is similar, but the digital alternative to the traditional one has been more convenient and/or less expensive to the customer. When that happens, the transition occurs rapidly. Let's look at some examples. Newspapers used to be big businesses, and they carried great influence and power, but the thing that drove newspapers was advertising. When Craigslist provided a convenient online alternative to searching the classifieds, newspapers began to lose readership, which started a rapid downward spiral. Apple has been the cause of several industry disruptions. With iTunes, Apple gave music lovers an easy, convenient way to buy music online, and the physical sales of CDs rapidly dwindled. Apple did it again with the iPhone, resulting in the collapse of the previous phone market leaders, Nokia and BlackBerry. Amazon disrupted book sales, and later the entire catalog sales industry. Netflix did it for video, and the once ubiquitous Blockbuster video chain ceased to exist within a few short years. Uber seems to be doing it for taxis, and AirBnB for accommodations. Apple has, itself, been disrupted in music by streaming services like Spotify and Pandora. There is no traditional market that is unaffected by digital transformation.
In academia, the first challenges by the forces of disruption, the online and for-profit universities, failed for several reasons. The quality of the education was not very good, and the delivery system was also pretty poor, but the cost was the same, so they produced a bad product that nobody wanted. Defaults on student loans were higher at for-profits, where the degree was of low value, and so funding organizations became more discriminating. Accreditation has also created an OPEC-like cartel but, as with OPEC, all it takes is for a major player to go its own way and the whole thing collapses. Traditional academia has some serious problems too.
Problem 1: For decades, a college degree was what the high school diploma used to be; a ticket to a good job. However, that drove everyone, even the grossly unqualified, to go and get a college degree.
Problem 2: Colleges got greedy. Willing to accept anyone who could pay, and even those who took on enormous student loans for questionable degrees, colleges saturated the market with graduates and the bachelor's degree became devalued. For a while, universities solved that problem by offering Master's degrees, but now that market is flooded too.
Problem 3: Tuition costs keep going up, because of state cuts to higher education, and because highly paid administrators have gone on building sprees to try to make their universities more appealing than those of the competition.
Problem 4: To stem the rising costs, administrators have been gradually phasing out the well-paid, highly qualified, tenured professors as they retire, and replacing them with low-paid, less qualified instructors on annual contracts. This has had the added effect of solidifying the administrative power base, since tenured faculty were often the most resistant to the demands of administrators to lower standards and keep paying students in the pipeline regardless of their potential.
The modern baccalaureate degree is devalued. It is no longer a ticket to a good job, the quality of the education itself has declined, and the cost remains exorbitantly high. These problems create a situation ripe for disruption.
All that remains is for employers to realize that they cannot effectively distinguish between job applicants based on whether, or from where, they have a bachelor's degree, and to begin prioritizing other selection criteria. When demand for the bachelor's degree dries up, there will be a lot of academic real estate coming onto the market as universities collapse. Why won't universities just tighten their standards and produce higher quality graduates? Those with strong reputations will be able to take that approach, because it's not just a degree, but a Harvard or Yale or Stanford degree. But graduates with degrees from "generic university" will find that their diploma is worthless, and that they are saddled with a huge pile of student loan debt that bought them nothing but a 4-year party. The administrators running Generic U will need to rapidly change their institution's offerings, or see enrollment plummet. Unfortunately, universities are led by cautious, change-averse people, and so most of those institutions will keep on doing what they've always done, and they will fold before leadership has any idea what happened.
The successful disruptors will be the organizations that figure out how to do two things:
1) Rapidly assess the abilities of students (grading students the way eggs, or olive oil, or maple syrup are graded: by quality) and sell those ratings directly to employers. Curiously, the wish lists employers provide generally don't involve much knowledge of the job itself, but rather are heavy on personality traits and attitude indicators.
2) Figure out how to imbue the students with the abilities (something that takes longer and costs more) that employers are looking for and that they don't already possess. This would lead to a program of study based on the missing pieces, much as the "personalized learning" programs are attempting to do. What are those qualities, and how do we teach them? Can they be taught, or can they only be nurtured in those with intrinsic ability? Can we teach enthusiasm, perseverance, integrity, dedication, communication skills, ability to get along with others on a team, some of whom may be annoying? Or should we stick to teaching the kind of material found in textbooks? Does your institution have the answers?
What is the future of learning? And how did learning in its present form take shape? Sugata Mitra says that the skills we teach our children are based on the Victorian era need for interchangeable human calculators. In an age before modern computing and telecommunications technology, the British Empire ran efficiently on columns of numbers transcribed into ledgers and transported around the globe by ships. It was necessary to have human calculators who had the same abilities scattered all over the world in order to transmit and receive the vital information of commerce, so spelling, writing, and mathematics were standardized. Students needed to read, write, and spell accurately, perform calculations correctly, and not demonstrate too much creativity. Schools of the Victorian era fulfilled their mission, but most schools are still doing it today, over 100 years later, when, perhaps, it no longer serves us so well.
It became possible to eliminate some of these drudgeries back in the 1970s, when early technologies started invading the classroom. These technologies made it less necessary to memorize mundane things, but there was an inevitable backlash. Students were forbidden to use pocket calculators because they became less proficient at memorizing their times tables, because they couldn't do a long division problem by hand, or because they trusted the output of the calculator even when it made no sense. Students were forbidden to use word processors and spell checkers because the quality of their cursive handwriting and their ability to spell were suffering, or because cut and paste was making term papers too easy to plagiarize. But, rather than ban the tools, perhaps we should change what we teach? I'll expand on this idea below.
Should we continue to teach those basic skills, even if only to give students an appreciation for our humble origins, much as we might derive an equation from first principles in a graduate seminar class? Are skills like cursive writing and knowing one's times tables important to the shaping of neural pathways, influencing our intuitive language and computation abilities as some studies suggest, or are they relics of a bygone age? I would argue that some basic skills are important and should still be taught, because I can do basic math in my head faster than my children can reach for their iPhones. However, in a knowledge economy, our ability to synthesize and evaluate information is much more important than it used to be, back when we could trust that most of the knowledge found in library books had been vetted by experts during the arduous publication process. Although we like to think that we're in the Information Age, there's an awful lot of easily accessible misinformation online too. Retrieving information is easy. Evaluating and synthesizing it is the bigger challenge in today's world.
Categories in the cognitive domain of Bloom's Taxonomy.
Does it still make sense to give students multiple choice tests with closed notes, closed books, and no electronic aids? Is asking students to memorize facts that can be easily retrieved of much value? Sure, students must understand the foundational materials. Even the die-hard constructivists admit that you can't construct your own learning without the basic building blocks. Otherwise it's just time wasted reinventing the wheel, and a pretty primitive wheel it will be. But shouldn't the emphasis be on the higher order thinking skills? Does it matter if a history student knows the date of an event, or is it more important that he/she understands the causes of the event? Does a math student need to know how to tediously calculate a square root by hand, when the calculator is so readily accessible? We need to redesign our curriculum to emphasize the kinds of assignments that require thinking rather than memorizing, and synthesis rather than fact gathering. Why assign students to do a biography of a lesser known president when that information is so easily looked up? Instead, why not assign the students to write about whether he was a good president and whether, based on the information he had at his disposal, or in retrospect, he made the best decisions? Students still need to learn what the guy did, but the tougher question, the one that makes you search your own values and understanding, is, "Was it the right course of action, and why?" You can watch the incurious students squirm when you ask questions like that!
Rather than ban a technology from the classroom, why not acknowledge that, in the modern workplace, it's not so important to be able to run a linear regression with only pencil and paper (perhaps useful if stranded on a desert island?), or to memorize what constitutes a healthy diet. What's really important is knowing whether a linear regression is the right tool for the job, or being able to assess the validity of a website that claims to have all the answers about healthy eating. Creating a generation of critical thinkers who know how to use technology tools is what we're really after, even if it means allowing open book, open note, open Internet on tests. The mathematically curious ones will still want to know how a linear regression works, but the rest can skip ahead to the more stimulating problems. Except for a handful of grand masters, most of us can't beat the computer at chess, and yet we don't lose sleep over it. Let's let computers do what they're good at: brute force searches, speedy, error-free calculations, and rapid information retrieval, and let's get humans doing more of what we're best at (when properly trained): critical thinking, synthesis, pattern searching, and creativity!
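To underline how little of the drudgery is left for the human once the computer does the arithmetic, here's a minimal sketch in Python with made-up data. The machine produces the fit in a millisecond; deciding whether a straight line is even the right model is the part that still takes a trained mind:

```python
# Made-up data, purely for illustration.
import numpy as np

hours_studied = np.array([1, 2, 3, 4, 5, 6], dtype=float)
exam_score = np.array([52, 60, 61, 70, 74, 81], dtype=float)

# Least-squares fit of a line: score ~ slope * hours + intercept.
slope, intercept = np.polyfit(hours_studied, exam_score, deg=1)

# Correlation coefficient: a quick sanity check that a line fits at all.
r = np.corrcoef(hours_studied, exam_score)[0, 1]

print(f"score = {slope:.1f} * hours + {intercept:.1f} (r = {r:.2f})")
```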
It's ironic, then, that many instructors want to use technology tools to block student access to technology so that they can continue to deliver tests designed for a pre-technology era. It's the pedagogy and the assessments that need rethinking. Yes, it's often more work to determine what students think rather than test what they have memorized. Computers can grade multiple choice tests, but not opinion papers! We still need humans for that!
College has a somewhat unusual business model. Because it is unusual, many students are confused by it. I have frequently heard the refrain among dissatisfied students that "The only reason I'm here is that I need this degree to get a job." Another common complaint is that "I paid a lot of money, and I am not satisfied with the grade I received in this class." As consumers, we are used to paying for things with the expectation of a full refund if they do not satisfy. This is where the analogy with a gym membership helps to make the point. Paying for an education doesn't guarantee you an education. It only provides you access to the opportunities that will help to develop you into an educated person. But, after you enter college, it's ultimately up to you whether you choose to explore those opportunities. After all, nobody can make you expand your mind. That's something you have to choose to do, and it's something you have to work at. That's why enrolling all incoming freshmen in liberal studies classes doesn't create a cohort of poets and philosophers.

With the exception of the gullible people who pay for the exercise machines they see on TV with the expectation that, in one short month, they will look like the smiling, well-oiled and well-muscled supermodels who are shown using the product, most people understand that paying for a gym membership doesn't, all by itself, make you fit. They understand that some commitment is required. If you asked someone, when they were signing up for this membership, to sign a waiver saying that they understand that just paying for a membership does not guarantee that they will become fit, most people would laugh and say, "Of course. That's obvious." Why, then, is it not the same for our students who pay for a college education and then feel crestfallen when, without putting much effort into it, they find that it does not meet their expectations? Everyone should have the opportunity to go to college, but not everybody makes the most of it. College is a place ripe with opportunities, but some of them must be sought out. Some of them require effort. Those who seek the opportunities, and work for them, are generally rewarded. But, for those who do neither, sorry folks, no refund. Enjoy those expensive textbooks and your Bowflex machine. If you don't make an effort, that's all you've got.
In the world of technology, and the world in general, for that matter, there are situations where you will be asked to do a job fast, cheaply, and well. It turns out that, most of the time, you can't have it all. I'm not sure exactly why this is true but, trust me, it's true. You cannot change the laws of physics! I think the graphic says it all, and very succinctly. There are many people in the world, some of whom will be your bosses or your clients, who don't seem to understand this simple fact. My advice to you is to show them the graphic before you start any project and, without another word of explanation, tell them to pick two. If they can't live with that, point to the center and walk away.
The concept of a technology adoption curve was first described in a study of the willingness of farmers to try new agricultural methods, but it applies quite well to technology in general. I like to try new things, but I won't promote a new technology just because it's cool. It has to fill an unmet need and be easy enough to use that most people can manage it. Therefore I generally try to stay just on the right-hand side of the "chasm." However, that's not always the case. Although I have followed their development with great interest, I only recently got a smartphone. With 56% of Americans now owning smartphones (I'm sure that number skews young), that plants me squarely in the majority, and illustrates that being an innovator with one technology doesn't mean one isn't more cautious in some other technical regard. That's ok. Until recently, I had no need to own a smartphone and, now that I own one, I'd still characterize "need" as a stretch ;-)
Are you a technology innovator, do you "go with the flow", or do you take a "wait and see" approach?
Back in March, I wrote about the hype cycle. Although I didn't think about it at the time, can you see how the chasm in the upper graph is connected to the trough in the hype cycle? Overlay these two graphs and the connection is clear. While a technology can show great promise and generate excitement among early enthusiasts, it may never catch on with the general public. Often this is because of some limitation in what it can do, or in how easy it is to use, that the enthusiasts don't mind but that the general public would not tolerate. That's the chasm. If the chasm isn't crossed, then the technology never reaches that eventual productivity plateau but instead just dies out or remains a hobby for a small group of technophiles. Linux is a great example of a promising technology that hasn't crossed the chasm. Enthusiasts are the key to the spread of new technologies, however. Seth Godin makes this point well when he talks about marketing to the people who care.

So let's think about where we are, because it's useful to know ourselves. If I say the word "Blackboard" or "iPad" or "clicker" or "3-D printer" or "Arduino" or "smartphone" or "Twitter" or "Facebook," where do you fall on the curve? Now think about your colleagues and where they are relative to you. Are you always in the same part of the curve, or do you jump around? Have you learned something about yourself through this little exercise?

If you decide not to adopt some new technology that everyone is talking about, does that make you a laggard? Not necessarily. It could be that the technology in question is heavily hyped right now but will not last. There is an implied slur in calling people "laggards" that I don't like. Not every new technology is a good thing, nor will it last. If it's not, or if it doesn't--something we'll only know in retrospect--then you were right not to jump on the bandwagon. Nobody talks much about Second Life anymore and, if you missed it, you didn't miss much. If you never bought a Palm or Windows Mobile PDA, good for you! You saved a bunch of money on a near-worthless device.

And what if you're an innovator? Do you stick with a technology once everyone is using it? Or does that take all the fun out of it? If you were on Facebook when nobody had heard of it, are you still there today? There's a saying that "good pioneers make bad settlers." Pioneers don't like crowds, and they are always moving to the new frontier. I'm not one, but I appreciate them. They work hard and explore a lot of places that don't lead anywhere useful. But when they make a real discovery, the rest of us get to enjoy it without all the effort ;-)
The "hype cycle." Overlay this graph on the tech adoption curve above.
The dreaded password change notice.
Ah, the lowly password. A simple tool from a bygone computer era. But if you've gotten a message like this one lately, I think you will agree with me that passwords are no longer either simple to manage or effective at keeping us secure. Passwords are much like our congested freeways; they don't work very well anymore, but the entire infrastructure is built around them so, while they drive us crazy, we have no alternative.

The first problem is multiple services. With a steadily growing number of web-based services, each with its own password expiration cycle and username and password creation rules, it has become increasingly difficult to remember all of your passwords and which username and password go with which service. Thankfully, many services let you use your e-mail address as your username, and most offer a "Forgot your password?" link. It's also helpful that my workplace, at least, has a "single sign-on" for all services. The downside of that, though, is that if my work password is compromised, the hacker can change everything from the grades in the class I'm teaching to the beneficiaries of my life insurance plan and the bank routing number of my paycheck's direct deposit!

The second problem is multiple devices. The password wallet or virtual keychain was a good solution for managing the multiple usernames and passwords saved on your computer. Just remember one username and password and the tool does the rest. But once you have multiple devices, you're out of luck. So if I have 10 services that require passwords (not an outrageous number when you consider multiple e-mail accounts, IM and video conferencing services, a photo sharing service, eBay, PayPal, social services, online banking, a web hosting service, a cloud storage service, etc.) and 5 devices (again, not all that unreasonable when you consider personal computer, work computer, tablet, laptop, smartphone, etc.), then that's 50 passwords to change on a regular basis. If you have multiple OSes (MacOS and Windows, for example) installed on a device, then that device counts as two.

But we're still not done. The third problem is multiple applications on each device. Even if we just consider my single work password, there are many applications on each device that need to use it, and that need to be updated when it changes: my e-mail program, my VPN connection, my IM program, my web browser, my web page editor, my FTP application. And I often run more than one of each; I commonly use three web browsers, two e-mail applications, three IM programs, etc. So, to sum up, if I use 10 services on 5 devices, each of which connects to these services from 10 applications, that's, very conservatively speaking, 500 places where I need to change passwords on a regular basis! Not every app stores a password for every service, but you get the idea. Here's an oversimplified map of my online world. I bet yours looks similar. Start drawing lines from service through device to application, and you'll see how messy password management can get!
Password management is no longer simple, nor does it keep us very secure.
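If you want to see how quickly that multiplication gets out of hand, here's a toy illustration in Python. The service, device, and app names are placeholders, not an audit of my actual accounts, and it assumes the worst case where every app on every device stores a credential for every service:

```python
# A toy illustration of password sprawl; all names are placeholders.
from itertools import product

services = [f"service_{i}" for i in range(1, 11)]  # e-mail, bank, cloud, ...
devices = [f"device_{i}" for i in range(1, 6)]     # laptop, phone, tablet, ...
apps = [f"app_{i}" for i in range(1, 11)]          # browsers, mail clients, ...

# Every (service, device, app) combination is a saved credential to update.
touch_points = list(product(services, devices, apps))
print(len(touch_points))  # 10 * 5 * 10 = 500
```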
True, I may have it a bit worse than most, but I can tell you that this is out of control and that I'm better at managing this chaos than most people. It is no wonder, then, that many of us use the same not-very-strong password, or a minor variant of it, on multiple services, and that we engage in other unsafe behaviors such as writing passwords down on a sticky note attached to our monitor, or incrementing our old password with the next number in line when it's time to change it. With the proliferation of cloud-based accounts and services, it's only a matter of time before one of them is breached. It seems that not a month goes by without some service provider announcing that its user base has been compromised. It is no wonder, then, that when one of our accounts gets hacked, sometimes through no fault of our own, it doesn't take long for a hacker to gain access to our other accounts, often by using our email system to reset our passwords in other systems.

The fourth problem is that all of the flaws above lead services to use extreme countermeasures such as forcing you to create a password so strong that you have no hope of remembering it, or locking your account after too many failed access attempts, or making you prove that you're human with one of those "captcha" tools. Often it's not a hacker or a bot but just me, trying to remember which username and which password go with this account, hoping I guess right before I get locked out. Sometimes it's a device with an old password trying to update itself automatically that locks me out. Sometimes I fail to correctly answer my own challenge questions because the system is too picky about the answer. For the "street I lived on when I was in second grade," did I spell out "Street" or did I abbreviate with "St." (with or without the ".") or did I leave out "Street" altogether?

I don't know what the solution is. Maybe a cloud-based keychain that generates ridiculously strong passwords you never have to remember, or that uses some difficult-to-impersonate biometric like a fingerprint or retina scan? All I know is that I'm in desperate need of something to fix this mess, and that a lot of other people are stuck in the same sinking boat. I will lose hours of productivity dealing with this upcoming password change, and it will be days or weeks before most of the apps on most of the devices I use regularly are updated. There has got to be a better way!
Update: I have heard very good things about 1Password, a dedicated password manager. And since I started using Apple's iCloud Keychain, which syncs across all my devices, a strong password is supplied for each web account, yet I don't have to remember any of them. So far, 6 months in, it's working great!
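The core trick these keychain tools perform is simple: generate a long, random password per account so you never have to invent or reuse one. Here's a minimal sketch of just that piece in Python (real keychains also store, sync, and autofill the result, which this toy obviously doesn't):

```python
# Generate a strong random password, roughly the way a keychain tool might.
# This sketch covers only generation; secure storage is the hard part.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run, e.g. 'mX4!vQ9_...'
```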
The Hype Cycle: Map your favorite educational technology.
How many times have you heard that some emerging technology is going to solve all of education's woes? In my experience, a technical innovation may allow the job to be done faster, cheaper, or better than before, but rarely, if ever, all three. If you're lucky, you get to pick two! If you're thinking about implementing some new technology that everyone is talking about, it's important to step back and consider its position on the "hype cycle" graph. Google Glass, for example, is just past the trigger point, and visibility is still increasing. MOOCs are at the peak of inflated expectations right now. But does anyone remember Second Life? Once heralded as "the next big thing," it has slid into the trough of disillusionment. Take Second Life out of your résumé, people. It's not doing you any favors. Speech recognition, long ridiculed, is finally climbing out of the trough and up the slope towards a more realistic "plateau of productivity." While still not practical for most uses, it fills a niche for users with repetitive stress injuries that make using the mouse and keyboard painful. Used as intended, with a realistic appreciation for what it can and can't do, technology can be highly effective. But misapplied, technology can make a real mess of things. As the old saying goes, "To err is human. To really screw up, you need a computer." One of the debates that rages in my office relates to what to teach people about a new technology. We want them to get excited about new technologies, as we are, and to be adventurous in their teaching. Often, however, people with inflated expectations come to us only wanting to know how some new technology will make their job easier, and they get frustrated when we ask them why they want to use it (what problem are they trying to solve?) or try to explain that there are limitations. They don't want to hear that it won't re-energize their lectures or that it might require just as much effort as what they are doing now. Let's look at a few examples of useful technologies misapplied, and you'll see what I mean.
Technology: Video
Misuse: Instructor shows a full-length movie to the class in order to take a day off from lecture, catch up on grading, etc.
Proper use: Instructor shows a series of relevant video clips, each followed up with insightful questions and guided discussion to engage the class in critical thinking.

Technology: PowerPoint (two ways to wreck a presentation)
Misuse: 1. The presentation is viewed in the absence of the presenter, but the bullet points are vague or meaningless without the emphasis and interpretation of the speaker. (Did they think the presenter had nothing of value to add?) 2. The speaker, facing away from the audience, reads paragraphs of text from each projected slide, adding nothing of relevance. (Did they think the audience can't read?)
Proper use: Presenter uses prompts on the slides to make key points to the audience, to jog the memory, and to engage the audience in a lively and only loosely scripted discussion.

Technology: SafeAssign or TurnItIn
Misuse: Instructor uses the tool to fail students for unintentional plagiarism.
Proper use: Instructor uses the tool to show students how to properly reference the source materials they cite.

Technology: Clickers
Misuse: Rather than making the teaching more engaging, instructor uses clickers to enforce a mandatory attendance policy.
Proper use: Instructor uses the tool to assess comprehension, engage students, and deepen their understanding with challenging questions and analysis of why they think what they do.
Your assignment: Expand my table with more examples. Begin with the LMS, Facebook, eBooks, MOOCs, and iPads. All great tools. But are they being used as they should?
Yes, it's true. We all make mistakes. And sometimes, even when we don't do anything wrong, bad stuff still happens. It might not be fair, but that's life. When a setback inevitably occurs, how do we respond? Do we cover it up, downplay its significance, get defensive, make excuses, or try to shift blame? Do we make an emotional public mea culpa, roll out the damage-control spin machine, and then get back to business as usual? Or do we make sure that our clients know something went wrong, apologize to the injured parties, remediate if possible, fix the problem, promise to learn from the incident, and genuinely try to do better in the future? That's the true test of character, isn't it? Our Learning Management System is a critical piece of the information infrastructure at the University. Over the years, we've had some outages and performance issues; several were more serious and longer in duration than we'd like. Sometimes it's been our fault; something we overlooked or should have anticipated. Sometimes it's been an unexpected hardware failure, a guy with a backhoe, or a lightning strike. And sometimes it's been a flaw in the LMS itself, or in the hosting service. Here is an interesting collection of recent readings on LMS security incidents and responses. If this were a test, I'd say, "Match the response below to the behavior described above."
LMS Market Share Over Time
The LMS market share battle remains interesting. Since dominant player Blackboard's acquisitions of WebCT in 2006, Angel in 2009, and Elluminate and Wimba in 2010, the company's share of the market has continued to shrink. Proof once more that acquiring a competitor is easier than holding onto its customers. That, more than anything else, may explain why the company went from public (BBBB on the Nasdaq) back to private hands in 2011, and why it announced in 2012 the departure of long-time, polarizing CEO Michael Chasen, who brazenly attempted to patent the LMS in 2006, sued Desire2Learn, and verbally threatened the open source alternatives Moodle and Sakai. While Sakai appears to have stagnated due to fragmenting of the developer community, Moodle, D2L, and upstart Canvas are all growing at Blackboard's expense.
Higher Ed Goes Digital
Big changes are coming to the hallowed halls of higher education. The cost of a four-year degree continues to rise (because of, well, the reasons might surprise you) and, because state funding for education continues to decline, the consumer is left paying an increasing share of the bill. Administrators, who feel pinched to keep doing more with less and to keep a lid on costs, are pushing for increased class sizes, for more classes taught by part-time instructors, for more online classes, and for the adoption of technologies that automate instruction or reduce the teaching effort per instructor, allowing each one to do more. If we step back and look at the big picture, where is all this headed? As a result of these coming changes, the tenure-track faculty member who teaches for a living is, by my reading of the situation, an endangered species, and the state-funded, primarily undergraduate university isn't much better off. Don't believe me? Ask any department chair at any public undergraduate institution what happens when a tenured professor (one whose primary responsibility is teaching, not research) retires. While enrollment is growing like crazy (because "college is for everyone"), experienced full-time faculty are being replaced, if at all, by much cheaper and often less qualified part-time instructors. It's happening because technology has been identified as a method for regularizing and further automating undergraduate instruction. Undergraduate university teaching is the delivery of specialized, but fairly standard, information to a large market of adults, for a high price. (K-12 is safe for the moment because teachers not only impart knowledge but also serve as workday babysitters for their young charges.) Sure, experts are still necessary to develop the standardized lessons and content for higher education but, once that's done, it can all be deployed on a massive scale and managed by less qualified people. (Well, that's the argument I hear from upper administration anyway. Whether a less qualified instructor can as effectively grasp and deliver that content is another question, but it's a tradeoff administrators seem able to live with.)
Since the market is large and the price is high, there will be lots of competition for students. With instruction going online, students will no longer be place-bound, and course capacities will no longer be dictated by the size of the classroom. In the very near future, students will be able to get an online degree in most subjects from anywhere they choose. Some universities are even racing to grant degrees in personalized learning programs where students can shorten their course of study by "testing out" of classes in which they have "life experience!" (I hope the testing is rigorous and occurs in a proctored environment with ID checks!) When future students are choosing where to go for their online degree, why would they choose your institution? If you don't have a good answer, you'll be in trouble.

This change will be highly disruptive. Ask yourself this. What happened to the local video rental stores like Blockbuster when Netflix came along? What happened to the local music shops after iTunes? What happened to the local newspapers after Craigslist became the place for classified ads? What happened to all the independent used bookstores, and even the big chain bookstores like Barnes and Noble, now that Amazon sells more digital books than paper ones? All of these digital information delivery services replaced their analog counterparts in a very short period of time.

With high-quality content and lessons coming from the big publishers, written by pedagogical and subject area experts and tailored for the web by skilled graphic designers, the courses developed independently by most professors don't compare favorably. Brick-and-mortar universities teaching traditionally will be like the small, quirky, independent bookshops competing against Amazon's vastly greater selection of cheaper content. Most of them will fold. What will happen to all those beautiful campuses and the college towns that depended on them? When the undergrad degree goes digital, there will be only a few winners and they will win big. There will also be many losers, as venerable local institutions see in-person enrollment decline and poorly implemented online programs fail to attract and/or retain students. Universities that conduct research and have graduate programs will be less affected, and the private Ivy League institutions will continue to do fine by offering an expensive top-notch traditional education to a niche market, but the community colleges and primarily undergraduate institutions that compete on price and can't differentiate themselves will mostly go the way of the Blockbuster Video stores.
Which organization that you haven't heard of yet will be the Amazon or the iTunes of higher education? Will it be a big publisher like Pearson, or a for-profit online institution like University of Phoenix or Capella? Will it be a currently free option like Coursera, edX, Udacity, or the Khan Academy? Will it be a highly regarded traditional institution like Stanford or MIT? Or will it be a small regional university like NAU, already accredited and experienced in online delivery to its rural population, that gets it right? It's too early to tell. But there are ways to prosper in this new era. Courses from the for-profits are still generally pretty bad, and the selection from the free services is limited, so there's a window of opportunity for some new leaders to emerge. And while Massive Open Online Courses (MOOCs) are currently getting a lot of attention, they require a level of self-motivation and organization rarely found in our undergraduates. Build better service, with better instructors, more courses of study, better than standard "canned" content, and more personal touch into our online programs and we can beat the competition, create more value for the dollar, grow enrollment, and enhance our reputation as a quality online degree granting institution. That will take time and hard work, and it will take a new kind of instructor who knows technology and pedagogy as well as the subject area. And it won't be any cheaper, to the chagrin of those who think that waving some technology pixie dust over the problem will make it all better. But change is coming and academia, steeped in tradition and rife with bureaucracy, is not very good at change, so it's going to be a shock. Are you preparing for the giant wave of change that's about to crash on traditional higher education? Because you can just sit there and get crushed by it, or you can start paddling for your life and ride it into the future!
Blackboard's latest strategy
On March 26, 2012, Blackboard's CEO, Michael Chasen, and Ray Henderson, former CEO of Angel and now CTO of Blackboard, announced an apparent shift in Blackboard's strategy. The announcement says that Blackboard now embraces open source products such as Moodle and Sakai and, curiously, there's an "oh, by the way" at the end, almost an afterthought, about the future of Angel, a recent acquisition. On the surface, it's a very strange development, one that took many people by surprise. Inside Higher Ed calls it a pivot in strategy but, as a long-time Blackboard watcher, I don't think that's quite right. It will help if you know some of the backstory. Blackboard has a long history of acquiring other companies. If you're a Star Trek fan, it's hard not to see them as the Borg of the LMS world. From Prometheus through WebCT, Wimba, Elluminate, and Angel, Blackboard has been busily buying up companies that compete in its sphere of influence. They have also produced products designed to put some of their former partners out of a job, such as the SafeAssign component that duplicated (incompletely, but well enough) the service TurnItIn provides. They famously sued Desire2Learn, an LMS competitor, and threatened the open source community not to tread on their outrageously broad patents or else face the unnamed consequences. The long-term damage Chasen did to Blackboard's reputation with these moves is still reverberating through the higher ed community. So Blackboard has a somewhat deserved reputation as a bully and an opponent of open source LMS tools. Why, then, would they do such an apparent about-face and offer to extend the life of Angel and to "embrace" (their word) their open source competitors? Have they seen the error of their ways?
Extending the life of Angel is the easy part to explain. When a company acquires another company's product, as Blackboard ought to know well, it can take a while to assimilate the product and/or its users. It takes a lot more than just rebranding the product and bringing over a few executives from the absorbed company for show. Blackboard is still suffering indigestion from its absorption of WebCT. NAU was a WebCT school, and we came up using their Standard Edition, Campus Edition, and Vista products, but the import of our HTML-rich Vista courses into Blackboard's LMS, a process they told us would be far smoother than moving to anyone else's product, turned out to be a bit of a disaster. Vista reaches end-of-life in 2013, and all of the former Vista schools we've spoken to who are now on Bb Learn are suffering with broken courses that require extensive repairs. So it's not so much that Blackboard doesn't want its more recently absorbed Angel customers moving into the Bb Learn fold. It's that they've got their hands full with the former Vista clients and are scrambling to fix Bb Learn to make it work better. Giving the Angel people a little more rope looks like a favor, but they seem to be fairly happy where they are and, really, the LMS group within Blackboard probably doesn't have time to deal with another set of dissatisfied Blackboard conscripts while they're busy trying to figure out how to globally repair imported Vista courses.
Now, on to the open source side of the question. While Blackboard has been gobbling up LMS competitors over the past ten years, their market share has not grown proportionally. Acquiring companies, it turns out, is easier than holding onto the customers of those former companies. Also, as fast as they can absorb a competitor, a new alternative pops up. Over those past ten years, Blackboard has also been moving in another direction. They have been building "platforms" above the LMS that create more complete solutions for their K-12 and higher ed customers. But as Blackboard has not been successful at creating an LMS monopoly, the market for their vertically integrated platforms has also shrunk. Therefore, their effort to embrace the open source world is an attempt at damage control for their bully reputation, while simultaneously trying to reach a broader market for their other products. Even many schools that use Blackboard's Learn LMS don't purchase their Analytics, Outcomes, Collaborate, Transact, and other tools, so attempting to broaden the market for these products makes sense. I don't think it's going to work, however, because there are only two major categories of open source users: penny pinchers who don't like Blackboard's price, and purists who don't like Blackboard's heavy-handed anti-competitive tactics.
So what does this all mean for us? As Blackboard stretches to try to sell all the non-Learn LMS users on the rest of its many platforms, their own LMS will get fewer resources and less attention. They seem to think that they have already sold their LMS to everyone who wants it. No more blood to squeeze from that stone. Time to move on and leverage the other holdings. That's really too bad because, while the LMS has promise (in many ways I prefer it to Vista), it's still got a lot of problems that wouldn't be too hard to fix. Blackboard could take a page from Apple's playbook. They could work towards creating the best LMS and the most seamless integration with their other tools, and let the users flock to their solution. But instead, they seem to have decided that they've got other stuff to sell to other LMS users, so Learn goes to the back burner. Redoubling their efforts to beat, rather than eat, the competition would be a real strategy pivot. This just looks like more of the same to me.
Despite the April Fool's Day datestamp, this is no joke. There's something very important that you need to remember about Google and Facebook. You are not their customer. You are their product. Think about that a bit. Why are these services free? Clearly they cost money to operate. What is the revenue model? More on this topic soon...but in the meantime...Google's been working on "augmented reality" glasses. Here's what using them will look like :p
Dear purveyors of eContent. There's something you really need to know. Our LMS is called Blackboard, but you seem to think it's called Springboard. We are not interested in putting a link in our LMS that bounces the users out of our well supported, familiar system to your unsupported, unfamiliar system. We are not interested in having our students take assessments in your system that don't put data back into Blackboard's gradebook. Our faculty don't want to learn a new user interface for every product they adopt. We made a conscious decision to adopt one LMS across the whole campus. If you have eContent to sell us, we want it in our LMS, not yours. Thanks!
A world of eContent at your fingertips.
When I went to college, back in the 1980s, each of my new hardcover textbooks weighed over 5 pounds and cost over $100.00. Cheaper used and softcover texts weren't yet readily available. Since that time, increasing numbers of students have been selling their textbooks back to the bookstore or other re-sellers in order to get a wad of cash to fund a keg-party or the next semester's textbook purchases. The increased availability of used textbooks has driven publishers, they argue, to raise the prices of their new texts and, at least to my skeptical eye, to make numerous small changes to each edition and release these new editions at ever shorter intervals to try to reduce the usefulness of old editions. So, for decades, students and publishers have been locked in an "arms race" that hasn't been particularly good for either side. That's about to change.
Enter Amazon.com, the model for a disruptive new relationship between students and textbook publishers. Amazon is the world's largest bookseller, and they now sell more eBooks than paper books. They deliver their eBooks via the Kindle, but also through the free Kindle reader app for Android phones, iPhones, and iPads, because Amazon only cares that you buy their content, not what you read it on. I bet not too many college students own Kindles, but they sure like their smartphones! I recently asked a fairly typical group of over 100 university students how many of them owned "smartphones." Almost every hand in the audience went up, so most students already have a mobile device capable of reading eTextbooks. I also asked them how many were currently using electronic textbooks. Not a single hand went up. In the business world, this is what people call an "opportunity."
The big academic publishers in K-12 and higher-ed, including Wiley, Pearson, Cengage, Benjamin Cummings, Houghton Mifflin, MacMillan and all the rest, are ready to get into the game. They've been watching Amazon long enough now to see that it's a winning strategy. According to the publishers, and I don't doubt their numbers, about one third of textbooks purchased annually are used, not new. Each of those re-sales is lost profit for the publishers. But because of something called DRM, or "digital rights management," students won't be able to re-sell their eTexts. While there are plenty of ways in which eBooks might be superior to paper books, the big one for the publishers is DRM. With eBooks, the used textbook market is dead. It's also possible that the publishers will profit from not having to print and distribute physical books, but at least some of those profits will be offset by the need to publish online editions, and maintain servers and a larger IT infrastructure. Publishers will tell you that students are going to love eTexts for the mobility, reduced weight, the ability to get corrections, updated content, and for the multimedia elements that make the eBook a richer learning experience. While both paper and electronic editions exist side by side, you can even expect the eText to be cheaper to drive customers into the new market. So students will like eContent, and publishers will profit from it. But convincing faculty, most of whom don't particularly like technology or change, that eTextbooks are worth the effort will be a challenge.
Faculty control the textbook adoption process, and they remain somewhat skeptical that moving to eTexts is a good idea. Publishers could try to pressure them by eliminating the paper edition, but that might drive an instructor to select a competitor's product. They could use student demand, by making the eText cheaper than, and different from, the paper edition. They might even try to convince faculty with incentives like a free iPad, or by encouraging faculty to "build your own book" by assembling chapters of pre-built content. In the days of the printed text, especially in the K-12 market where California and Texas heavily influence content decisions, the publisher sometimes faced the challenge of trying to satisfy diverse customers with the same content. Now the "controversial" chapter on Evolution or Global Warming or the Big Bang or Birth Control Methods can be easily deleted or replaced, because the customer is always right! In higher ed, the ability to easily mix and match digital content may also appeal to instructors who want to customize or who teach interdisciplinary courses. Apple hopes faculty will start writing their own eBooks for iPad using their free tools. I don't think the faculty will go willingly into this brave new world, but it's probably going to happen whether they like it or not.
Are we putting the technology cart before the instructional horse?
Recently, I was asked to develop a presentation for a group of faculty on the Teaching Uses of Social Software, and I was happy to oblige. I decided to approach the problem by giving them a broad overview of various kinds of social software, hoping that the sampler would stimulate some discussion of possible ways they could incorporate these tools into their teaching. But during the course of the presentation and the discussion that followed, it became clear that the instructors had not really considered why they wanted to use social software except that "all the kids are doing it" and perhaps because they thought it might make them look "hip" and "with it" (or whatever terms the kids use for "hip" and "with it" these days). I attempted to reframe the discussion by asking them to identify some instructional "needs" and offering to suggest some possible technology "solutions," but it was pretty clear they weren't getting it. I thought maybe some examples would make that abstract question more concrete. "Say you want instantaneous feedback from students in the classroom. If so, you could use clickers or some other polling tool such as PollEverywhere.com." Blank stares. "Ok, well do you ever have problems getting students to discuss a controversial topic in class? If so, perhaps you could use Blackboard's discussions tool to create an anonymous discussion. Students may be more likely to speak up if they know they can participate anonymously." More stares. "Well, how about group work? Does anyone assign projects to groups of students? If so, you might consider a collaborative tool like GoogleDocs, or maybe a wiki, which allows students to simultaneously work on the same document. You can also view the document at various snapshots in time to see how it developed, and to see which students contributed what." At this point, one of the faculty interrupted. "Larry, what I think we need is for you to show how to teach using Facebook." By this point, I realized that by "teach" they meant "lecture" but, being a persistent sort of person (my wife says "stubborn"), I thought, ok, I can work with that. "Oh, I see. So you're interested in a sort of 'get to know you' activity where people tell a little about themselves?" Looks of genuine confusion. "No, Larry, we just think it might be good to teach in Facebook because that's where the students are." I paused for a moment, trying to think of something I hadn't tried yet. "And maybe you could also show us how to teach from a smartphone?" But, by that point, we had run out of time. The moderator thanked me for the interesting presentation but, regrettably, we would have to follow up later on how to teach from smartphones. I haven't heard back.
The State of Kentucky, in a move that is sure to be emulated, is changing the way it funds public universities. Instead of providing funding based on the number of students enrolled, they will now fund based only on the number who graduate. The intentions are clear and good. Higher education is being asked to graduate more students and will only be rewarded for those who do. But universities are already working hard to keep students enrolled, and using intervention methods to turn failing students around. Faculty, for the most part, are already working hard to bridge the gap with students who are less prepared than they should be for university-level work. And the new policy only gets at the quantity of graduates, ignoring the quality side of the equation. So if all the low-hanging fruit has already been picked, where do schools turn? The problem is that they/we have several ways to achieve the goal of higher graduation rates, and the path of least resistance is not the one the legislature is hoping for. Can anyone predict some very likely unintended consequences? 1) Grade inflation for those in the system. 2) Stricter admission standards for those yet to enter. 3) Quicker dismissal of failing students. The legislature has identified an important issue (we don't want to give universities money for students who don't succeed), but is not applying the pressure or resources to fix the real problem. Of the three paths of least resistance, the first one, grade inflation, is clearly bad, but the second and third might have an upside. If universities don't admit students who are likely to fail, and quickly kick out students who are not serious about doing the work, they use their limited state funds on the students who want to be there and who have the potential to succeed. Students unprepared for university-level work would need to seek remedial coursework in order to pass admission requirements. This is a good thing. And the university degree would become a better indicator of ability. Also good. So if the legislature wants this new funding approach to work, they also need to penalize universities for grade inflation, let them admit fewer students, and kick out more students if they are unmotivated, unprepared, or both!
Recently NAU was approached by an organization called "Quality Matters" and invited to become a member. While they are a non-profit, that does not mean they are free. Annual membership dues are required, and the implication is pretty clear. If you say you're not interested, you must not care about quality, right? People pay to be trained as reviewers. People also pay to have their courses reviewed, and they pay to receive the QM seal of approval. Based on the success of this operation, QM could easily spin off some other ventures such as, "Motherhood and Apple Pie Matter," or "Patriotism Matters." Their heart is, to be fair, in the right place. The purpose of this organization is to identify things that make for a quality online course, and use a faculty peer review process to evaluate and certify these courses. This movement wouldn't even exist if there weren't some valid questions about the quality of online courses nationally, and if schools weren't feeling a little defensive about their online programs. I do, however, have some issues with their approach. My first issue is that the focus is on courses delivered online. Their scope does not include courses taught in a traditional manner, and I think we can all agree that some of those must be equally bad or worse! While I'd like to level the playing field and look at all courses, it's maybe a bit unfair to criticize QM for what they don't review. So let's look at what they do review. We will leave aside for now whether NAU should cede its authority over the evaluation of course quality to a body outside the university, and over which we have no control, because the question of who's watching the watchers could be the subject of an entirely different discussion. My biggest remaining issue with the "QM Program" is that online courses can be, arguably, broken down into three major components, and QM deals with only one. A better name for Quality Matters might be "Let's Focus on One of Three Things that Matters!" In case you're inclined to disagree with me, here are my three components of quality in an online course:
- Design: this is the way the course is structured, how it displays to the user in the online environment, including the identification and measurement of learning outcomes.
- Content: this includes the selection of appropriate materials and the accuracy and depth of those materials.
- Delivery: this includes the choice of instructional methods and all of the interactions between instructor and students, and among the students themselves.
The QM program deals only with Course Design. I'm not saying that design doesn't matter. I'm pretty convinced that it does. Without good design, it's going to be difficult to get out of the starting blocks. But I think I'd like more than one of the three reviewers of my online course to be a "subject matter expert" and I don't think it makes much sense to slap a seal of approval on a course unless the content and delivery have also been reviewed thoroughly. I have seen the disastrous results that occur when you give great materials to a poor instructor. I have also seen the tragic consequences when you combine a dynamic and motivating instructor with materials that are inappropriate for the students, either because the materials are not challenging enough, are out of date or otherwise inaccurate, or are too challenging because the students do not have the necessary background preparation. What I'd really like to see is a peer review program that looks at all of the aspects of course quality described above, and owned by our own faculty rather than an outside organization. But I think I see the writing on the wall. If we don't start policing ourselves, it may not be too long before someone else is doing it for us.
True story. I have a faculty colleague who had a formal complaint filed against him by one of his students for discriminating against him on the basis of his intelligence. The "discrimination" was giving the student a lower grade than some of his classmates, based on the student's relatively poor performance on various assessments. When the professor agreed that this was true, the student became even more convinced that he had a case! I think this raises an interesting point, because the professor in question was using an "old" way of thinking, while the student was using a more modern construction.
When I was in college back in the '80s, I'm not sure there was such a thing as dropping a class. At least, if there was, I never did, and I never knew anyone who had, so it was neither common practice nor a well-advertised option. It just never occurred to me that one could do that. The concept of re-taking a class a second or third time to replace the original bad grade was also completely foreign. When I got the occasional grade that I was unhappy with, I owned it, and there was nothing I could do about it. It was there on my transcript for all to see, like a tenacious piece of gum on the bottom of my shoe. Today, most students would just throw away the shoes and buy a new pair. In my job at the university, we care about student success and we want everyone to get a good grade. We go to greater lengths every year to accomplish this goal, giving students more choice and more flexibility, and we intervene more than ever before to work with students who are struggling. All of this is good, I think. But we rarely think about why this is the goal. Not trying to be cynical here, but let's just step back for a minute and ask ourselves: "Isn't the point of grading students, in large part, to identify (optimistically) which ones have learned or, (pragmatically) which ones have successfully completed the assignments, or (cynically) which ones have successfully jumped through the hoops?"
Question: Is our goal to get everyone over the bar, no matter what it takes, or just to provide everyone an equal opportunity to get over the bar and then report the results? The bar I refer to here, of course, is "learning," even if measuring that intangible substance requires cruder instruments like tests and other assessments. If everyone gets unlimited chances to get an A (assuming here that letter grade correlates with learning achieved, so you can substitute A with "learned a lot" and F with "didn't learn a thing") by the process of do-overs, remedial work, tutoring sessions, interventions, etc., then aren't we artificially leveling the playing field? Aren't we devaluing the A earned with hard work and without extra credit? Would you rather be seen by the doctor who got an A in Biology the first time through without any outside help, or the one who was failing the course and dropped it, took it again and got a D, found an easier instructor and took the course a third time, got a B- and, with a bunch of intervention, tutoring, and extra credit, got the B- rounded up to an A, which replaced the D on the transcript? I suppose that student has perseverance at least! Of course, there's an old joke: What do you call the medical student who graduated at the absolute bottom of his class? Doctor! Hah :-)
Why has it come to this, and how has it come to this, and is this where we want to be, and, if not, how do we get someplace else? I think part of the reason we have arrived at this point is that so many more kids are going to college. College really is the new high school. Michael Wesch, who I admire and mostly agree with, says "College is for everyone." True. Certainly part of the problem, though, is that if everyone is being admitted, more students are arriving unprepared. More students are here not because they want to be, but because they feel compelled to be so that they will be competitive for a job at the other end. This also explains the impatience of many of our students, who don't really love to learn or want to broaden their minds. They "just want a job, ok, and could you please show me the fastest way out of here?" I'm sympathetic. Who wants to spend $40,000 (minimum) for a bachelor's degree that still can't guarantee them a job? And certainly part of the problem is that universities love all the extra money that's coming in, but feel a twinge of guilt when those students who aren't prepared don't succeed. Legislators and administrators, who hear from the howling parents who pay the bills of these mediocre students, put pressure on faculty to do better. By "better," they mean graduate more students faster with better grades and with less funding. If we rule out the easy way (just lowering standards), and take the challenge to "do better" seriously, what's left?
Solutions: 1) Placement. Students should not be admitted to the university if they are not capable of succeeding, and students should not be allowed into courses for which they have a high probability of failure. We can pretty accurately predict success with placement tests and we need to do this more. 2) Remediation. If students arrive without the skills but it is possible to teach them those skills, they need bridging courses to get them there. 3) Academic probation and dismissal. Students who are not succeeding, and who are not likely to turn it around, should not be strung along. 4) Monitoring. Technology can be used to monitor student progress so that intervention occurs quickly, before students spiral downward. We do all of these things now. We just need to do them more, and better. But the following are not generally addressed at all. 5) Instruction. Most faculty arrive with good content-area knowledge but limited teaching experience or know-how. This can be addressed, but it would take a mind shift for the university to accept that this is a problem. 6) Compensation. Little attention is paid to the quality of instruction. Typically, only instructors with high D/F/W (drop, fail, withdraw) rates get any attention from administration, and this negative attention can easily be avoided by lowering standards and giving lots of As. But standardized tests, with all their flaws, can measure incoming and outgoing students and be used to reward instructors who show the gains. Will this lead to "teaching to the test?" Possibly. But if the test is good, that's not the worst problem to have. 7) Peer review. Research faculty know all about peer review. It's how they get articles published in good journals. But in the classroom, instruction is siloed. Nobody watches anyone else teach or gives them any tips on how to do it better. Sure, there's muttering in the hallways about which instructors are too easy, or just plain bad, but nothing gets done about it. This could be fixed if there were the will to do it, but again, it would require a major shift in faculty culture. 8) Reporting. Something I've never heard mentioned anywhere is that universities really ought to report not just the grade a student receives, but how long it took the student to get there, and by what path. We have this data. We could put some of the rigor back into transcripts that are packed with As by reporting the information employers want to know: How much external time and effort was expended to get this student over the bar? 9) Tracks. I know it's sacrilege but, while college is for everyone, the liberal studies degree is not. Universities need to rethink degree granting with an eye towards certificates and diplomas that lead directly to a career path. Want to be a salesman, a dental hygienist, an X-ray technician, a database programmer, a forest ranger, or a cop? Sure, a bachelor's would be helpful, but it's probably not something you "need." Want to be an astrophysicist, a historian, or a philosopher? Ok, get the bachelor's. But here's something else we should tell incoming freshmen and rarely do: if you get the bachelor's, you probably don't need to come back to school when you change careers, as most of us do these days. With the certificates, you probably do.
Secure Exam Remote Proctor: Expensive, Impractical and Creepy!
There is growing concern about the identity of online test takers. Is the person who logs in and completes an assignment really who he/she says? Why couldn't, say, a talented college athlete who is, um, a little academically challenged get some unauthorized help with his web assignments so he can stay on the football team? Who would be the wiser? Of course cheating is nothing new. People were buying term papers long before the Internet, but the explosive growth of web-based distance learning programs changes things. And with that growth, people are demanding that colleges address the issue. Some advocate for a technical solution to this problem, and an industry has sprung up to provide these security products. I suspect that the vendors of these products are also quietly lobbying to make this issue a controversy so that they can step in with a profitable solution. One of the first proposed solutions was the locked-down web browser. This prevents the student from using the computer for anything but the test that he/she is taking. Critics were quick to point out that this can be defeated by simply using a second computer or smartphone to google answers. And nothing stops the test taker from having a textbook or a knowledgeable friend sitting nearby. Makers of these security products quickly clarified that such tools are really meant for proctored situations, where the student is being observed, so that no unauthorized aids to test taking can be employed. However, proctoring is expensive and requires students to come to a central location, which defeats the purpose of offering distance learning classes. Anticipating this complaint, the next generation of products prompted students, at random intervals during testing, for biometrics like fingerprint scans or answers to challenge questions. But still, without a proctor, these prompts only prove that the test taker is nearby, not that he/she isn't getting help. The latest trend is to monitor the students using microphones and cameras with fisheye lenses that record the test takers and stand in for a human on-site proctor. This gets technical and expensive, not to mention the privacy concerns, and someone still has to review the recordings to verify compliance. Doesn't all of this sound very similar to the methods of the Transportation Security Administration? We can throw lots of money at technical solutions, invading privacy and creating great inconvenience along the way, and still feel pretty insecure. Perhaps we need to think more like the Israeli alternative to the TSA. They rely on psychological profiling more than full body scans. To put it simply, they ask passengers questions. If a passenger tries to board a plane for an international vacation carrying no luggage, they ask the passenger to explain. A similar common sense approach could be used in online learning. "So, Mary, I'm intrigued by what you describe as 'the hegemony of orthodoxy' and I'd like to set up a Skype call where we can discuss it further." Other techniques that discourage cheating are pedagogical rather than technical. Reducing the stakes (point value) of individual assignments has been shown to be effective at discouraging cheating. Asking a student to defend a position in a dynamic discussion rather than write a biography of a famous person also makes it harder to use someone else's work. The perceptiveness and adaptability of the teacher, rather than the complexity of the technology, remains the answer to the age-old problem of cheating on a test.
Google seems to be doing so many disconnected things these days, but I just had a great idea. What if there was a way to tie it all together? I would love to see Google enter the Learning Management Systems (LMS) market and, when I start to think about it, it's not so crazy. They already have almost all the pieces in place and Blackboard sure could use some competition. iGoogle is the portal/login page, Google Docs is the collaborative workspace, GMail and GoogleTalk become the communication system for messaging and chat. Sites is for course content, and Wave is for, well, whatever the heck Wave is supposed to do. Blogger is for journaling and discussions. They've even got the social side covered with Buzz and maybe that would give it the network effects jolt it needs to dislodge Facebook from the throne? About the only piece they don't already have is a roster/gradebook that allows the instructor to view all rows but the students to view only their own row of scores. The roster would determine all of the component access privileges. That's not much work for a company with as much money and talent as Google has. The whole thing is free, hosted, and lives in the cloud. Make it compatible only with Android mobile devices for, um, technical reasons, and lock out iPhone. Steve Jobs would have a fit. So how about it Google? All the pieces are in place. You just need to connect them. Will you answer the call? Oh, and Larry and Sergey, if you're reading this, I'll happily accept a commission for the idea. It's where School meets Google. Call it Schoogle. Ok, maybe we can come up with a better name.
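Just to show how small that missing gradebook piece really is, here's a back-of-the-napkin sketch in Python of roster-driven row access, where the roster determines which rows each user may see. Every name below (courses, users, functions) is invented for illustration; this is a thought experiment, not anyone's actual design.

# Hypothetical data: course -> student -> assignment scores
GRADEBOOK = {
    "BIO101": {
        "amy": {"quiz1": 9, "quiz2": 8},
        "ben": {"quiz1": 7, "quiz2": 10},
    },
}

# The roster drives all access privileges
ROSTER = {
    "BIO101": {"prof_smith": "instructor", "amy": "student", "ben": "student"},
}

def visible_rows(course, user):
    """Instructors see every row; students see only their own; others see nothing."""
    role = ROSTER.get(course, {}).get(user)
    book = GRADEBOOK.get(course, {})
    if role == "instructor":
        return book
    if role == "student":
        return {user: book.get(user, {})}
    return {}

print(visible_rows("BIO101", "prof_smith"))  # both rows
print(visible_rows("BIO101", "amy"))         # just Amy's row

One roster lookup before anything is displayed, and the hard part is done. The rest is user interface.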
Update: Ok, three years later, they finally did it, but it's still pretty unfinished. You get a recipe and a list of ingredients, but you still have to bake it yourself:
http://code.google.com/p/course-builder/
NAU has upgraded its wifi system. The new one works very much like the ones you've encountered in airports and hotels. That's the first problem. It's a university, not a hotel. Regular users of the wireless have to agree to terms EVERY TIME they connect. If your smartphone goes to sleep to conserve power, or you close your laptop to move from one location to another, then when you wake the device up you need to reconnect and agree all over again. That's just silly for a system designed primarily for regular users (not one-time guests). While this is annoying for laptop users, it's a downright nuisance for people with wi-fi capable smartphones and tablets. But there is a BETTER WAY. The MAC (media access control) address of every wired computer on campus is registered. If regular users of the wireless could register their devices too, then the agreements could be logged once and filed away. Sure, the agree screen should pop up for unregistered guests (parents, vendors, etc.) or when the policy changes. But for the regular users, most of the time, this shouldn't be necessary.
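To make the proposal concrete, here's a toy sketch in Python of the decision logic I have in mind. The MAC addresses and version numbers are invented for illustration, and a real captive portal involves much more plumbing, but the core decision really is this simple:

# Hypothetical registry: MAC address -> version of the terms the owner agreed to
REGISTERED = {"00:1a:2b:3c:4d:5e": 3}
POLICY_VERSION = 3  # bump this whenever the terms change

def show_agreement_screen(mac):
    """Prompt unregistered guests every time; prompt regulars only on new terms."""
    agreed_version = REGISTERED.get(mac)
    if agreed_version is None:
        return True  # guest device (parent, vendor, etc.): always show the terms
    return agreed_version < POLICY_VERSION  # regulars: only when the policy changes

print(show_agreement_screen("00:1a:2b:3c:4d:5e"))  # False: registered and current
print(show_agreement_screen("ff:ee:dd:cc:bb:aa"))  # True: unregistered guest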
The second problem is security. If you try to connect with anything other than a web browser, you never see the agree screen, so you can't connect at all. Even in a browser, the agreement screen doesn't always appear, and that has negative consequences downstream. If your home page is set to an NAU website, you won't be prompted to agree because the NAU domain is a "trusted site." But unless you agree, you can't join the VPN (and you're not told why; it just fails to connect), so your session is insecure and you are transmitting passwords and credit card numbers unencrypted. And even if you do agree, many people don't take that final step and join the VPN because they don't have to; the wireless works even if you don't join the VPN! Sure, there's a warning on the agreement screen, but it's buried in a page of legalese and who reads that stuff anyway? So, aside from a handful of tech people who know better, most of our wireless clients are surfing the web without encryption. Don't believe me? Ask your colleagues if they connect to the VPN while using the wireless, or if they even know what the VPN is! This makes the majority of our clients easy pickin's for any geek with a packet sniffing program and a few idle minutes in a public space! Is this bad? Think of it this way: it's the digital equivalent of walking down the street naked in the middle of winter. Normally the security-centric IT folks would be all over an issue like this, but they're not. They know about this problem, but they don't choose to fix it. Is it because there is a way to be secure and it's buyer beware? Or because they have a reactive (see problem 3 below) strategy? I don't know. But there are two proactive ways to fix this problem: technical (don't allow insecure connections) or educational (teach people how and why to encrypt their wireless sessions).
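If you're wondering what "without encryption" actually means, here's a small illustration (just a Python string with invented credentials): the raw bytes of a login form submitted over plain HTTP, which is exactly what a packet sniffer on the same open wireless network captures.

# A hypothetical login request as it crosses the air, unencrypted
request = (
    "POST /login HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "username=jdoe&password=hunter2"
)
print(request)  # no VPN, no HTTPS: the password is sitting right there

With the VPN (or any encrypted tunnel), the sniffer sees only scrambled bytes.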
On to the third problem. People are increasingly showing up on campus with tablets and smartphones. These devices are almost always the property of the user, not the university. But the university insists that, for the privilege of checking my work email on my personal device, a password lock with a 15-minute timeout must be installed, and that the password must be strong (hard to remember) and non-repeating. Worst of all, the university wants to be held harmless for remotely wiping my entire device, without my express permission, if my password is entered incorrectly too many times. This is security overreach at its worst. Restrictive policies like these are typically written by big corporations that have trade secrets to protect and that provide company-owned mobile devices to their employees for work purposes. That situation doesn't apply here. The university ought to be thrilled that faculty and staff would want to check their work e-mail on a device that cost the university nothing, and which is carried around with them during every waking moment. There is a BETTER WAY. Our policy should not discourage the use of personal mobile devices by faculty, staff and students. We need a much more flexible and less restrictive policy for personally owned devices which contain mostly non-NAU data, or users will not connect them to the services we want them to use.
Workarounds: Rather than configure IRIS in my e-mail client, I use Outlook Web Access through my mobile device's browser which, on the downside, requires a login every time but at least doesn't require me to agree to a remote wipe of my personal device. Students use Google's GMail system, which doesn't require the remote wipe. And everyone should remember to use the VPN. As for the frequent Agree prompts...don't we all agree this is just a silly waste of time?
Update: 07/11/2013 The system has changed again. The newest wireless has a more secure "NAU" network which requires a one time login with your NAU username and password, and a less secure "Public" one for guests which works much as described above. This is a big improvement. The NAU network still doesn't require you to use a VPN for greater security, but at least it only bugs you once for a login. And the public network works as it should, prompting guests to agree to terms each time they connect. We're slowly getting there!
Amidst the flurry of bad press over SB1070 and the resulting boycott of Arizona, you may have missed something interesting on page two. NAU made the Chronicle last week, and Slashdot just picked up the story. It has been spun as a privacy and digital rights story, but it's really something much bigger. It seems there's a plan in the works here at NAU to use student ID cards with embedded RFID (radio frequency identification) chips to record class attendance. We've been using clickers to do this for years. So why are university administrators increasingly interested in mandatory attendance? The answer is complex, but it has a lot to do with a societal shift that is having ripple effects in academia. Michael Wesch says it this way: "College is for learning, and learning is for everyone. So college is for everyone." It wasn't always this way.
A college education used to be something one aspired to, but it certainly wasn't a necessity. For many students today, going to college no longer feels like a choice. The bachelor's degree is the modern-day equivalent of the 1950s high school diploma. Students increasingly resent the liberal studies courses that teach "critical thinking" but don't give them the tangible workplace skills they think they need. Given the number of times a modern worker changes careers, critical thinking, the ability to write, and other versatile competencies are more important than ever, but we haven't done a good job selling that argument. Many students now see college simply as an expensive and time-consuming obstacle that must be overcome on the path to a good-paying job. Knowledge for its own sake is no longer the primary motivator. As Ronald Reagan once said, echoing the growing public sentiment, "Why should we subsidize intellectual curiosity?" So while the public is less interested in a classical education, demand for diplomas is at an all-time high. But universities are slow to change, and haven't really adjusted how or what is taught. As a result, universities are admitting more students who are unprepared for that classical education, and less interested in getting one, than ever before. Can you see now why mandatory attendance is becoming an issue?
To keep unmotivated and/or unprepared students in the system and on the path towards a degree, administrators want to reduce the D/F/W (drop, fail and withdraw) rates, attempting to give students more opportunities to succeed and keep the tuition dollars flowing in. Faculty push back, and are sometimes admonished for it: they refuse to lower their standards, blame K-12 for sending them unprepared students, refuse to teach remedial material, and resist efforts to change the way they teach. Conflict between college professors and administrators is very noticeably on the rise. Administrators want reduced D/F/W rates, but they need to be careful that they aren't inadvertently pressuring instructors to lower standards rather than make their courses more compelling. And faculty need to realize that while they should not lower their standards, they do need to change the way they teach to make their courses more compelling, practical and relevant. If they don't, they will be forced to deal with a lot more dissatisfied students who will, naturally, disrupt the classes they don't think are giving them what they paid for.
Think back to your own education. What was the biggest difference between high school and college? Students acted out or tuned out in high school classes because they were required to be there and didn't, for any number of reasons, want to be. Classroom management, a life and death skill for K-12 teachers, was for the most part unnecessary for higher ed instructors. In college, students who didn't want to be there quickly stopped showing up and, until recently, colleges have been mostly ok with that. The old attitude was that "college isn't for everyone" and "it's your money." We are teaching young adults to take responsibility for their choices, the argument goes. University is a place for free thinking and if a student chooses not to attend class, who are we to tell him/her otherwise? But college has become so outrageously expensive that universities are feeling more obliged to ensure that students and their parents "get their money's worth." Retention is the new mantra, and mandatory attendance is seen as one way to enforce it.
What will be the effects of mandatory attendance on college classes? On the surface, it seems like a good idea. Numerous studies show a strong positive correlation between attendance and student success. Students need to know that attendance matters and that we're serious about it. But if we dig a bit deeper, there are several problems.
In most studies, student success is only strongly correlated with voluntary attendance. If you make attendance mandatory, the effect is considerably, but not entirely, diminished.
Also, we don't achieve our goal if the students can easily defeat the mandatory attendance system; all a student has to do is give his ID card to a buddy who attends class. So will mandatory attendance actually improve student success? Yes, for a few students on the fence, attending class more often will make the difference between a pass and a fail, and some of our students do need a push in the right direction. But what worries me more about mandatory attendance is a negative unintended consequence. University instructors unaccustomed to unruly and disrespectful students are in for a shock. They will be spending more effort on classroom management, and it will negatively affect their ability to teach. Effort expended on making courses more relevant, interesting, and engaging without lowering standards is a far better return on investment. If a course is compelling, students will gladly attend and value the lessons you deliver. Isn't that better than forcing them to sit through a dull lecture?
Further reading:
A Case Against Compulsory Class Attendance Policies in Higher Education
Skipping class in college and exam performance: Evidence from a regression discontinuity classroom experiment
Does Mandatory Attendance Improve Student Performance?
Do students go to class? Should they?
Should class attendance be mandatory?
Deciding on a new LMS
NAU will be making the change to a new learning management system (LMS) in the near future. Our current tool, Vista, has been in service for about five years, which is a pretty good run in the fast changing world of technology. The first question we always get is, "Why?!" Nobody likes change. We all know the current tool and change is costly, time consuming, and disruptive. Vista is working well, and we have more people using it each semester, so why change? The main problem is that WebCT, the company that created Vista, got bought by competitor Blackboard (Bb) several years ago. We were just getting done with the previous transition, from WebCT Campus Edition to WebCT Vista, when the purchase was announced to the public. You might have heard the collective groan that echoed out of e-Learning on that fateful day! Blackboard has kept Vista going for a while, giving its Vista customers time to transition to the new product, called Learn. Eventually though, Vista will be "end-of-lifed" (beginning of Fall 2013) and we will be required, by the terms of the license, to stop using it.
Our projected transition timeline.
Why now? 2013 still sounds pretty far off, right? In fact, we must be off Vista well ahead of that 2013 date because of a variety of university business rules and transition related issues (see our transition timeline). We have not yet decided on our next LMS and that process will take another six months. We will need to run both systems in parallel while we migrate content from the old to the new. That will take at least two semesters and probably more. We also need to allow time for incompletes and grade appeals to play out after the completion of the final Vista courses, and that can take a year or more. What this all means is that if we start right about now, we'll only just be able to shut Vista down by Fall, 2013.
LMS marketshare over time.
Which one? There are plenty of good LMSes out there, and they all work in pretty much the same way. Blackboard is now by far the biggest company, they are our current vendor, and they make a good product. But Blackboard has a history of suing and acquiring its competitors (see the LMS marketshare figure). A big commercial vendor that could stand up to Bb would be an option, but going with a small commercial alternative to Bb is risky. That's what happened last time and we don't want to make that mistake twice. There are also open source products, such as Moodle, which would free us from a commercial license and which are relatively safe from Blackboard. So that's how we arrive at our two most likely choices: Blackboard, the biggest commercial product, and Moodle, the strongest open source alternative. So how do we decide?
Decision Factors: There are many factors, and weighing the importance of each is difficult. How good is the user interface? How intuitive is the product? How well do these systems integrate with our other campus tools, like PeopleSoft, the NAU Portal, and third-party commercial add-ins like TaskStream? What about compatibility with pre-built content modules from various textbook publishers? How easily will our courses move from Vista to the new system? What about cost? Blackboard has an annual license fee but Moodle is free. But Bb Learn comes with SafeAssign, whereas we might need to purchase TurnItIn if we go with Moodle, and that just about erases the savings. And while Blackboard provides tech support, with Moodle we'd be on our own. Of course, in addition to fees, which could rise, a commercial license restricts us in ways that an open source product does not. Both have all the tools we're used to, such as discussion boards, a gradebook, an assignment dropbox, and a testing module, but how well do they work? How easy is it to create and modify content? How well does each system work with the smartphones students are increasingly using for web browsing? We often get asked, "Well, which one is best?" and that's actually a difficult question to answer, because it depends on which tools you use most and how you use them. We are developing some presentations that contrast Vista with both new systems, and will be holding faculty focus groups during Summer and early Fall, where you get to try the same tasks in each system and give us your feedback. If you're feeling adventurous, you could even volunteer to participate in the upcoming Fall 2010 pilots of Moodle and Learn at NAU.
Moodle vs. Bb Learn
The Bottom Line: This transition will be bigger than the last one; we have more than four times as many users now as when we moved to Vista. But there's good news too. We learned a great deal during the previous transition, and much of that knowledge will help us going forward. Anyone who knows Vista will find either of these new systems familiar, and there's plenty of time to make the move. e-Learning will be there to help with training, tech support, and content migration. And in the coming years, free Web 2.0 tools will continue to augment or supplant the base LMS. We will support whatever system gets picked. We hope that you will get involved in the decision-making process and tell us what you want, so we can relay that to the PACAC, where the final decision will be made.
Cory Doctorow, sci-fi author and tech blogger, recently wrote a manifesto that argues nobody should buy an Apple iPad, but the reasons have very little to do with the product. Read it for yourself, but here is what I think he was really saying. And, by the way, I think he is completely off base. Except, possibly, for the last point which is sad but true.
- Everything in digital form (software, books, music, movies, etc.) should be free so it's ok for you to "acquire" a copy without paying. You aren't depriving anyone of a living because you never would have paid to own it! And it's not stealing because making a copy for yourself doesn't deprive the owner of his copy.
- If you paid for it, you should be able to give it away for free to all your friends and it's really unfair of evil companies like Apple with their DRM to make it hard for you to "share."
- You are a tinkerer and it is a travesty of justice for Apple to make their hardware so hard to take apart. Of course if you break it while trying to pry it open, you should still be entitled to a full refund because a) they should have made it easier to open and b) those corporate fat cat bastards reamed you on the purchase price and they owe you, man!
- Because you are such a hardware/software virtuoso hacker, you shouldn't be discouraged from improving on the iPad by a) jailbreaking it using a recipe you found (but do not understand) on some website, or b) by taking it apart and soldering bits onto it here and there, such as a handle maybe. How dare Apple release this amazing product and then declare themselves in exclusive control of its destiny? How dare they not fold your brilliant mods into their next distro?
- Your personal boycott will start a revolution and cause the iPad to be a flop, because everybody in the world is just like you. Also you won't buy an iPad because you're (very temporarily) just a little strapped for cash right now because your credit cards are maxed out and you are, um, between jobs and living in your parents' basement.
- Apple can't tell you what to do. Neither can your mom and dad. You are a free thinking grown-up and you can stay up as late as you want. People who tell you what to do just make you mad.
- Technology inevitably goes obsolete and will end up in a landfill.
I am still really, really impressed by GMail, GoogleEarth, Google Maps, Google Docs and of course the unbeatable Google search engine, but the company seems to be veering in a Microsoftian direction, being unethical, anti-competitive, trying to get a finger in every pie, and not always doing a very good job of it. Does Google really need to make a browser when there are so many already? An OS? A smartphone? A mobile platform? The latest thing from Google that we're all supposed to get excited about is Google Buzz. I'm still scratching my head about the purpose of Google Wave. But I digress. Anyway, Buzz is supposed to be a Facebook Killer. If anyone could do that it's Google, but even with a superior product it would take years to dislodge Facebook just as it took years for Facebook to overtake MySpace. See, there's this thing called the Network Effect that the established players have in their favor. So Google took a really, ok, I'll say it, "evil" shortcut and used their free GMail accounts to create an instant user base. And they did it without asking the users' permission. You are now friends with the people you communicate most with in GMail and it's all public. So, in essence, Google has "outed" you. Bad move. Most tech people I know who checked out Buzz turned it off within 24 hours, and now the privacy people are all over Google's case. Sorry to be a buzzkill, but I think this is just another of Google's recent duds. Of course, they can afford it. All of Google's experiments are funded by their very profitable one-trick-revenue pony...targeted ads in search results.
2.18.2011 Recent Google Fails...Google TV...Pogue calls it a huge step...in the wrong direction. Ouch. Oh, and Wave has been shut down. 'Bout time.
Despite the H1N1 scare and several ill-timed snow closures this winter, the administration still seems not to fully appreciate our telecommute and web-enhanced learning options. In the event of a snow day, epidemic, power outage, water main break, flood, or anything else that closes part or all of the physical campus, there may be no need to fully shut down university operations. If employees are able to work (and if the university would derive benefit from them doing so) then the policy should be modified to allow them to telecommute or use other technologies like our Vista online learning system to keep our operations running. During a campus closure, students in face-to-face classes could check Vista or look for email communications from their instructors regarding pending assignments or alternative assignments. An instructor of a face-to-face class could tell students to continue to work in Vista and view a podcast, take a quiz, etc. Things don't need to grind to a halt just because we can't be on campus.
However, we do have a new kind of "snow day" to contend with. Northern Arizona depends on a single data line that comes up from Phoenix, so one guy with a backhoe can take out our Internet, cellular phones, and even the ATM machines.
eInstruction clicker
I've been one of several support people for classroom response systems, aka clickers, here at NAU since we started using them in large enrollment classes several years ago. Many studies show that they can be effective at keeping a large lecture hall full of students engaged if a few conditions are satisfied.
- The instructor must be comfortable with the technology.
- The students must believe that the clickers help them to succeed.
- The technology must work.
But could the lecture format and large class size be part of the engagement problem? If so, then maybe we're using clickers to treat the symptoms (trying to keep students engaged when the class is not otherwise engaging) rather than curing the disease (making the class more engaging).
Instructure: The best little LMS I'm afraid to buy!
On Slashdot, one of my favorite "news for nerds" websites, they use the term FUD. As in, "If a dominant company can't win by building the best product, it can destroy the competition by instilling fear, uncertainty and doubt in the minds of its clients to keep them in line." There's a great little startup in the Learning Management Systems business called Instructure. It is far superior to anything else I've seen. But we won't be moving from Vista to Instructure, because of what happened to Prometheus, WebCT, Angel, and D2L. The only safe moves are open source or the biggest commercial vendor. There's just too much at stake to move to the little guy, only to see him swallowed whole six months later, or sued and shut down, forcing another painful transition. We've been down that road. That's what happened to WebCT. But if Google or Apple were to buy Instructure? Now that's a game changer.
7.8.2010 Update: Blackboard just bought up two more innovative companies, Elluminate and Wimba, in one day!
2.18.2011 Last week Instructure went open source...now the best LMS is also the free-est!
The Chief Information Technology Officer here at NAU, in the Spring 2009 newsletter, wrote, "I sense everyone can agree the new Microsoft Exchange system using Outlook is a vast improvement over our legacy email and Oracle calendar systems." I must ask, do we have any actual numbers to back up that "sense"? Microsoft's Exchange based e-mail system is a modest improvement over the decrepit legacy system it replaced but, come on, it's e-mail, not rocket science! And the new IRIS calendar, which I need to use every day, is a mess. Although I hear lots of people gripe about IRIS, particularly the calendar, it's even worse if you use a Mac or a Linux machine, because Microsoft deliberately provides an inferior experience on non-MS operating systems. Too bad we couldn't have moved the whole university to an open, stable, easy to use, and OS agnostic system like GMail. You know, the system the students got. Then we could all share calendars, it would work across platform, and my student workers wouldn't need special accounts on IRIS in order to interact with staff. I don't have much confidence left in Microsoft "solutions".
If you're looking for attention, there are really only two easy ways to get it, and one hard way. The hard way is to be the best. But if you're not Stanford or M.I.T. what do you do? The easy ways are to be the first or the last at something. And of those, being last (Are you listening, Arizona?) usually won't get you the kind of attention you want. So what sort of things could you do? It has to be a bit radical in order to get noticed. Adopting a new technology and getting lots of publicity for it can backfire, so choose carefully and then fully commit. A few schools are publishing all of their course content online for free. This is relatively safe, because they don't give you a degree unless you pay. Several schools give away iPods to all incoming freshmen. Others have instituted laptop programs which give each student a new computer. I'm waiting for some prestigious school to say no to Microsoft licensing and I think that's coming soon, given the quality of open source alternatives like Ubuntu Linux and Open Office. One fad that's on the rise is banning a technology like WiFi or Facebook from the classroom or campus, which should appeal to the Luddites. But whatever it is you try, you will need to stick with it for a while. Keep records, give it a chance, expect some bumps along the way, and adjust as necessary. If your efforts are successful, you will be able to bask in the glow of all that publicity! Of course, whether or not this attention seeking behavior does anything to improve learning is another matter entirely. Right now, I'm only talking about marketing. After all, being awesome is more fun if someone notices.
Blackboard app for iPhone
In addition to designing for accessibility, such as including text transcripts for videos and alt tags for images, we should also be designing our web courses to take full advantage of the emerging smartphone platform. Our students, who are early adopters of devices like the iPhone, iPod Touch, Blackberry, Palm Pre, Android and Zune HD, want to use these devices to access web course content. At a minimum, this means designing lean HTML pages with CSS rather than uploading Word and PowerPoint documents. If HTML is too hard, then PDF is second best. Flash and Java are also to be avoided, at least for now, but pages done with AJAX are working. I recently discovered that there's an iPhone/iPod Touch app that lets you interact with Blackboard's next-generation Learn product, a possible successor to our Bb Vista learning management system. Any LMS replacement we consider should provide a good smartphone interface.
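As a rough sanity check on "lean," you can simply measure how many kilobytes a course page pulls down, since big pages hurt on slow cellular connections. Here's a quick Python sketch along those lines; the URL is a stand-in for a real course page, and the 100 KB threshold is my arbitrary choice, not an official guideline.

import urllib.request

def page_weight_kb(url):
    """Fetch a page and report its size in kilobytes."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read()) / 1024.0

url = "http://example.com/"  # substitute your actual course page URL
kb = page_weight_kb(url)
print("%s: %.0f KB %s" % (url, kb, "(lean)" if kb < 100 else "(heavy for a phone)"))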
As long as we're into paying for software, the answer is "Yes, we should." NAU supports Macintosh computers as well as Windows PCs, but while Windows OS upgrades are free to the PC end users, Mac using departments and individuals must pay for their own OS upgrades. Apple has a very reasonable deal on educational licensing. Why aren't we using it? Central IT says it's not their responsibility to provide free OS software for Mac users even though they list the Mac as a supported platform at NAU. When I addressed the fairness issue, I was informed that Mac users, it seems, have chosen this platform and therefore must accept the "disadvantages" that come along with it. Disadvantages like getting attitude from certain IT "professionals." So changing a bad policy is apparently not an option? Mac users should just suck it up? But if the entire campus shouldn't have to subsidize the Mac users, why do we all, especially the Mac users, have to subsidize the Windows users? Why do the Mac users, who get no benefit from the Windows OS license or the anti-virus software license, have to subsidize the cost of keeping the Windows PCs secure? I was asked whether the university should, by my argument, be required to support the Linux OS as well. My answer is "Absolutely, if that's what people want to use." Especially since Ubuntu and Open Office are free of licensing charges and just as secure as a Mac. Think outside the Microsoft box, people! It is an increasingly bad deal.
Update: As of Aug. 17, 2010 or thereabouts, the MacOS, iLife and iWork suites are now available at no cost to the end users of NAU owned equipment. That's a pretty big step forward!
The following graph, while a little short on numbers, paints a pretty good picture of the Learning Management System landscape from its inception in the late 1990s to the present day. The article is a good read. Note that most of these systems were originally developed by universities and later spun off into commercial entities or open source tools. To me, one of the most interesting points on the graph is Bb's history of swallowing up and shutting down competitors, from Prometheus to WebCT to Angel. More difficult to illustrate on the graph are Bb's legal tactics to further extend that dominance by patenting basic LMS concepts, suing Desire2Learn, and intimidating the open source Moodle and Sakai communities with promises to probably not sue them. What is very clear, however, is that Blackboard's market share is not the sum of its acquisitions. They have been more successful at reducing choices than at holding onto clients, and I expect that defections will continue. Unfortunately, the open source efforts have not been as successful as initially hoped, and they do not, at present, offer a competitive product. Universities are faced with a tough choice: go with an inferior but more open product, or with a superior but monopolistic entity. What I'd be most nervous about is choosing a small commercial product that might get swallowed up in the near future, forcing a second painful migration sooner than necessary.
Change over time in market share of the big LMSes.
According to recent campus surveys at the University of Virginia and the University of California, Davis, Mac use among students has tripled since 2006, is now between 20 and 25 percent, and is still growing steadily. So is it still acceptable to tell the Mac users to just use Windows or they can't participate in central IT services? Are Mac users being discriminated against? Not specifically. Things are equally bad for Linux users and in some cases for Windows users who don't also use Internet Explorer or the other MS products. It's more that Microsoft has a well-deserved reputation for using its monopoly power in one market to extend its leverage over other markets. Windows, Office, Internet Explorer, Windows Media, Silverlight, ActiveX, ASP, Live Meeting, and Exchange all work best with each other, and often work poorly or not at all with other platforms, browsers, plug-ins or applications. But the choice is ours. Why would we knowingly and needlessly adopt products that disenfranchise a quarter or more of our students?
Growth in Mac OS Share at the University of Virginia over the past 10 years.
5.21.2009 After the pathos of the Mojave Experiment, where they tried to trick people into using Vista, and the incomprehensible Seinfeld ads, it was as if Microsoft had hit rock bottom. But Windows 7 looks promising and the new ads are better, because they attack the Mac where it's legitimately weakest: on price. Although the ads are staged (the shoppers are actors, not real people) and they mix in a lot of half-truths and carefully pick the price points for maximum advantage, it appears that they are having an effect. Or maybe it's just the economy? Sales of $300 Netbooks are booming, but desktop PCs and more expensive laptops are not moving. Except for those made by Apple, which is weathering the recession far better than other tech companies. This approach might also backfire. If you tell people PCs are cheap and Macs are expensive, people might draw the conclusion that you get what you pay for. And when you go to the store, you will also find that the cheap PCs are not the good brands, and they come with a crippled version of Vista for home users and a lot of annoying trial software that you will want to uninstall. If you price out a reputable-brand PC with decent performance and a good anti-virus application, you're back in the Mac price range. Microsoft would have you believe that Macs are the BMWs of the computer world. But studies of total cost of ownership (TCO) show that while Macs don't cost any more than a good-quality PC, they last longer and are more reliable. That makes Macs the Hondas of the computer world. So if you want cheap, do Linux. You can't beat free software on cheap hardware. If you want quality, get a Mac. So where does that leave Windows? In the middle. Neither cheap nor good. Kind of like General Motors.
People in the open source movement like to make the distinction between free (as in speech, liberty) and free (as in beer, gratis). Still, many people don't get it. Let me make it very clear. There is no free beer. If you adopt an open source software alternative to a commercial product, there may be significant hidden costs. You no longer have a warranty or a help desk to call for support when something goes wrong. If you discover a bug, or if you need the software to do something that it doesn't do, you need to fix it yourself or convince the community to do so. If the community decides to move the code in a direction you don't like, you must choose between going along for the ride or getting orphaned. If you make any customizations to the code to make it work better for you, the community will need to be convinced to accept your modifications and fold them into the codebase, or again, you are orphaned. You will probably need to hire more people locally to support the product. With complex server-based software applications, you would need to integrate the software on your own or pay expensive consultants for help. As any manager knows, people (salaries and benefits) are the most expensive part of the operation. An IT professional costs upwards of $60,000 per year in Flagstaff (and that's a bargain compared to most places), and add to that another 30% or more for benefits. It doesn't take long for the higher-maintenance "free" product to catch up to the cost of the fully supported commercial product. So what you have to weigh against the cost of free software is the cost of supporting and maintaining it, usually measured in additional staffing. Open source might free you from a commercial software license, but it isn't free of cost. And you aren't even free of the open source community unless you choose to branch out on your own. That's not to say that open source is necessarily bad. We just need to go into it with our eyes open. And if someone tries to tell you that adopting an open source product to replace a commercial one will save you money, ask them to show you the cost analysis before you think about how to spend those savings. Chances are, they are just wishing for free beer.
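To put rough numbers on that, here's a back-of-the-envelope sketch using the salary and benefits figures above. The extra staffing level and the commercial license price are hypothetical assumptions of mine, not real quotes; the point is only how quickly "free" catches up.

# Rough total-cost comparison of "free" vs. commercial software.
# Salary and benefits come from the figures above; the staffing level
# and license price are made-up assumptions for illustration.
salary = 60000        # one IT professional per year, Flagstaff estimate
benefits = 0.30       # benefits add roughly 30% on top of salary
extra_staff = 1.5     # hypothetical extra FTEs to support the open source product

open_source_cost = extra_staff * salary * (1 + benefits)
commercial_cost = 80000   # hypothetical annual license plus support contract

print(f"Open source (staffing): ${open_source_cost:,.0f} per year")   # $117,000
print(f"Commercial (license):   ${commercial_cost:,.0f} per year")    # $80,000

With these assumptions, the "free" product costs more. Change the assumptions and the answer flips, which is exactly why you should insist on seeing the cost analysis rather than the wish.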
Monopolies in computing, and the conformity that results, make life easy in the short term, but they are ultimately bad for everyone. If we're all using the same hardware and software, training and support are so much simpler, and there are fewer compatibility problems. When a new version of a software suite comes out, "just move everyone at once" is the mantra of IT. So why is uniformity a problem? Allowing choice creates competition, which reduces the price and generates innovation. Often, we're pressured to buy a second product from the same vendor solely because it works with the first product. At that point, we're locked in and have to keep paying up or suffer the consequences. Monopolies create monocultures. Both in biology and computing, infections spread more rapidly and are more devastating in "clonal" populations where, once the exploit is found, everyone is vulnerable. A major software "upgrade" will slow down older machines, hastening their obsolescence, and reduce productivity while people re-learn the tool. In most cases, people are happy with the current version and justifiably resist the change. And if everyone is doing everything exactly the same way, there may be less creativity and less innovation. We need to take the long view and let people use the tools they want and keep the software they're happy with. It's ok to resist change that is not compelling. We don't need to spend that money. We need to demand that software vendors let users of older versions open the new version's files, and that people on Macs, Linux and Windows can all share documents seamlessly. We need to push software makers to keep older versions of their software patched, so that they remain secure. If they don't comply, switch to a vendor that will. We'll all be better off when we stop moving in lockstep.
Here at e-Learning, we work in a lovely historic building with single-pane windows (for those of us lucky enough to have a window), an HVAC system that makes it either too hot or too cold all the time, asbestos in the walls, floors and ceilings, and a really outdated data network. That wouldn't matter so much except that computing is what we do, so the Internet is kind of important. So we've got superfast gigabit ethernet to the switches in the network closet just down the hall and nothing but sub-cat-3 (untwisted pair) to the desktop. It's like the superhighway runs right over our heads but there's no on-ramp. We can't drill through the walls because of the asbestos, so what should we do? Here's a resource page for testing network speed. It's ironic that my home network is faster than the one at work.
Update: Problem solved in an unorthodox manner. Ask me for details. We now have 100,000 kbps (100 Mbps) throughout the department, up from 10,000 kbps (10 Mbps). Ten times faster is a rather noticeable improvement.
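For a feel of what that factor of ten means in practice, here's a quick sketch computing the time to move a video lecture over the old and new links. The 50 MB file size is just an example I picked.

# Time to transfer a 50 MB video lecture at the old and new link speeds.
file_mb = 50
for kbps in (10000, 100000):                  # old speed, new speed
    seconds = (file_mb * 8 * 1000) / kbps     # MB -> megabits -> kilobits -> s
    print(f"{kbps} kbps: {seconds:.0f} seconds")
# 10000 kbps: 40 seconds
# 100000 kbps: 4 seconds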
How do we represent numeric data to make it more comprehensible? Over one third of the human brain is dedicated to processing visual information. So whether it's choosing a name for your baby, following Napoleon's march into Russia, seeing who endorsed whom in the presidential election, finding words related to the one you type, or deciding when to buy a new car, taking those raw numbers and representing them visually makes all the difference. Here's a great example: check out NewsDots to see how the stories in the news are interrelated. And a newer one: watch Darwin's "Origin of Species" evolve over its six editions at the Preservation of Favoured Traces website. Very cool.
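If you want to try this yourself, here's a minimal Python sketch using matplotlib. The data points are invented purely for illustration; the idea is that a trend you'd miss in a column of numbers is unmistakable as a line.

# Plot a made-up trend: hard to see as a list, obvious as a picture.
import matplotlib.pyplot as plt

years = [2005, 2006, 2007, 2008, 2009]
values = [4, 5, 9, 14, 22]   # invented data points

plt.plot(years, values, marker="o")
plt.xticks(years)
plt.xlabel("Year")
plt.ylabel("Value")
plt.title("The same numbers, as a picture")
plt.show()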
Walt Mossberg, renowned technology columnist for the Wall Street Journal, recently labeled large IT shops such as those found on university campuses "the most regressive and poisonous force in technology today." Wow! Them's fightin' words. Is there something to this, or is Walt just trying to boost circulation? At our own campus, Central IT is very centralized, has an enormous budget and, I'm sorry to say, is increasingly slipping in its ability to deliver the reliable core services that our users take for granted. A recent Educause study found that security has taken the top spot for the last three years in IT priorities, and has been in the top five for the last 10 years. Central IT has been spending all of its time trying to shore up the foundation, but they never get to the good stuff. That's the reason the fed-up end-users are doing an end-run around those flaky central services in favor of the distributed, free and open, stable and secure alternatives offered by Google and Yahoo and a vast collection of Web 2.0 companies you've probably never heard of, where the uptime is 99.9%. As some colleagues from ASU rather bluntly put it, "It's time for free range learning. The central IT teat has dried up."
Bill Gates has retired. But does it matter? Can Ballmer turn the ship around? Vista has been a PR disaster so momentous they are changing the product name and spinning like crazy, and the latest version of Office is a bloated, buggy mess all wrapped in ribbons. Nobody in my office wanted to "upgrade," but we started receiving files we couldn't open from people who had the new version, and so begins the forced migration. Everyone who moves to the new version of Office is like a zombie who infects 10 more people. Open Office is making modest gains at Microsoft's expense, and Google Docs has reminded Microsoft that innovation in an office suite is still possible; they have created a collaborative experience far beyond "Track Changes." But Microsoft's twin cash cows are in trouble. Now Microsoft has discontinued selling eight-year-old Windows XP. The logic seems to be that if the world won't willingly adopt Vista, we'll force the matter. Will Apple, Linux, or OpenOffice benefit? They're all doing fine, but they probably won't gain much from this. Microsoft is the new IBM. It may be a slow-motion supertanker headed for the rocks, but it will remain profitable for years to come. Cringely has a good article on the subject.
9.18.08 Ads about nothing: Have you seen those new Microsoft ads with Jerry Seinfeld? They should have expected that a guy whose comedy is about nothing would make ads about nothing. Did I hear right that they spent $30 million on that ad campaign? I'd want my money back. Sure, the ads are mildly amusing, but only because it's so ridiculous to imagine one of the world's wealthiest people bargain hunting at the discount shoe store. What does this tell me about Microsoft, Windows Vista, or the PC platform? Ummm...I have no idea. Well neither, apparently, does anyone else. The ads have now been abruptly pulled and Microsoft says, "We meant to do that."
10.27.08: The Cloud: Microsoft seems to have decided that if the world isn't interested in Vista, they must not want an OS of the standard sort at all. This throwing-out-the-baby-with-the-bathwater approach has led them to The Cloud. But Apple seems to be doing pretty well with MacOS X, and Ubuntu and other Linux variants are wildly popular. In fact, even Windows 7 looks very promising. Maybe someday, when everything is networked everywhere, the cloud concept will make sense. But if and when it does, you can bet that Google will be there first. Perhaps Microsoft's strategy is to get to version 3 by the time a cloud OS becomes viable, so they will have a product that's ready to market?
10.14.09 Update: Google the terms "Danger and Sidekick" or read this to learn about Microsoft's latest debacle with cloud computing.
Recently our university has been encouraging employees to turn off lights, turn down thermostats, and attend conferences virtually instead of in person, ostensibly to reduce our global carbon footprint and be more green. It's hard to tell whether that appeal is sincere, though, because it both helps the environment and saves the university money, while placing most of the hardships on the employees. It reminds me of the hotels that plead with you to save our planet by skipping the daily sheet change and towel wash, but then feed you breakfast on disposable plastic plates, cups and utensils. Clearly, the green they are most interested in saving is their money. If the appeal were that they could lower your bill if you were less wasteful, that would be a far more sincere and convincing argument.
What kind of green are we being asked to save?
Are we as quick to do things that are good for the planet if they don't save us money? We have an award-winning, LEED-certified (platinum level) green building on campus, which is great, but why aren't all our new buildings green? At least part of the reason is that they are more expensive to build. Having one makes a statement. Having many is a bigger commitment. What about things that could help the environment, but would force us to change our time-honored management practices, like telecommuting? Telecommuting takes cars off the road, reduces parking demand on campus, saves the workplace electricity, and potentially lengthens the workday by eliminating commute time. It also increases efficiency by eliminating many of the distractions of the office. When I telecommute, I can be incredibly productive in my quiet home office. Unfortunately, management can be suspicious of telecommuting because of the assumption that an employee in his/her seat at work can't goof off, and because an employee working from home can't be as closely monitored. Here's the truth about telecommuting. If managers set clear deliverables and check progress regularly, then it's the quality and quantity of work and not the number of hours spent warming a seat that gets measured. Bottom line? If you have responsible employees and effective managers, telecommuting works great. But it's kind of lonely. And if you have flaky employees, there's no guarantee they're being productive even if they warm their seats at work for eight hours a day.
As a consultant for the Beyond team at Blackboard, I assisted in a small way in the development of the new Scholar product. Try out this cool new social networking and social bookmarking tool that integrates with Bb Vista. Not to be confused with Google Scholar, which is also cool but another thing entirely.
College campuses are surprisingly numerous in the metro D.C. area. I interviewed staff and took lots of notes and photos on how other schools spend their money on classrooms and technology. I'm pleased to announce that my Learning Spaces article was recently published in Educause Quarterly. You can also read the more comprehensive web version of the report or browse the photos on Flickr.
So now we've got iTunesNAU working great, but a funny thing happened. Teaching was not revolutionized. Hmm. I wonder why not? Podcasting has its place, don't get me wrong. But podcasting takes a method of instruction (lecture) we all agree doesn't work very well and makes it less interactive. So if I told you that you could choose between going to a dull lecture at 8:00 am Tues/Thurs or watch/listen to a recording of the dull lecture on your own time, which is better? Exactly. The one that has an off switch.
In a meeting I recently attended, an IT manager announced with pride that "we throttle bandwidth on ResNet." Ironically, just yesterday I listened to educational technology expert Phil Long of MIT describe how students there are encouraged by the administration to use the bandwidth and the big screens in lecture halls at night for hosting Halo tournaments! While I know we're not MIT, maybe we should be asking those heavy users on our residential network how we can serve them better rather than shutting them down?