If Everything Will Eventually Be Obsolete, Should We Bother Learning Anything At All?
from the miseducation dept
For many years, Jakob Nielsen has been regarded by some folks as the authority on web usability — particularly people in the media, who find it really hard to write articles about the topic without quoting him. Still, plenty of people don’t agree with his ideas, even though web usability and user-interface design remain important topics. He’s ventured a little outside that territory with the latest self-published column on his web site, arguing that teachers and schools are screwing up (via Guardian Unlimited) by teaching kids how to use specific computer applications instead of “life-long computer skills”. He also offers a list of such skills kids should be taught. Some of them are just silly, like “workplace ergonomics” and, unsurprisingly, “user testing and other basic usability guidelines”, while others seem rather spurious or vague, like skills for dealing with information overload and “writing for online readers”. But the problem with the rest — like “search strategies” and “computerized presentation skills” — is that it would make little sense to teach these general topics without using something like Google or PowerPoint to illustrate them. Nielsen doesn’t seem to realize, or perhaps is unwilling to accept, that training people in specific applications and teaching deeper concepts aren’t mutually exclusive. Quite often, the best way to learn the “life-long skills” he’s so fond of is by, you know, actually using the tools and applications available today. Plenty of people got their start in computers not through formal instruction, but by simply messing around with them, and over time many of those deeper concepts were absorbed unconsciously or simply learned — despite Nielsen’s contention that students are unlikely to learn them on their own.
In effect, Nielsen is saying that students shouldn’t be taught anything concrete about computers because the programs in use today will eventually be upgraded or replaced. But that claim undermines his entire point: after all, it seems just as unlikely that the “life-long” skills he lists will stand the test of time.
Comments on “If Everything Will Eventually Be Obsolete, Should We Bother Learning Anything At All?”
you make some good points.
the main thing that many schools/teachers fail to teach is critical thinking. we get so frustrated with the students and what they didn’t learn from their previous teachers that we are inclined to hold their hands too much, coddling them a little bit more each year.
if you are able to develop the kids’ critical thinking, they will figure out so much more by themselves. that is probably what jakob is getting at, but maybe he’s saying it in the wrong way. you can give kids some skills in powerpoint, but you also need to SHOW THEM that both the hard and soft skills they learn there are indeed applicable to other programs, and in other situations that have nothing to do with “computerized presentation skills.”
i learned how to program from messing around with BASIC on an Apple II+ in the early 80s. the problem-solving skills i began to develop back then served as a foundation for both careers i’ve had since: software developer and English teacher.
Well, I’m currently doing a multimedia course and we use Photoshop CS2. The skills I’m learning there apply to almost every piece of image editing software I have come across. I don’t see how you can learn to use a program and not realize that the skills and procedures you’re picking up are applicable to other programs and software, now and in the future. Even if you don’t ‘realize’ it, you’d use them subconsciously anyway.
I agree. I’m doing Computing at uni, and we’ve been told to use Macromedia Director, which has been discontinued now. It’s not so much about learning how to use specific software; it’s about learning the skills to adapt to a different program. I’d never used Director before, and after I graduate I probably never will again. Computer skills are learnt from practice — you can’t teach computer theory without the hands-on.
In 20 years, people like Paul and Ash who are disagreeing with what he’s saying will come to realize that he is right.
I took a “computer science” course at a university around 1987, as part of the science requirement to earn my degree. While not everything I studied at the university has proven all that useful, absolutely nothing I studied in that computer course has any value whatsoever today. One of the major parts of the course was learning to use Lotus 1-2-3 for DOS. Knowing that has zero value today. Not even any of the skills involved in that have any use today.
Just wait another 10 or 20 years, Paul and Ash. None of the skills you’re learning now will have any value at all anymore.
Sorry but no
Plenty of the skills I learnt 10 years ago are worthless now, and yes, in certain areas my knowledge has slipped (sorry, but when it comes to DOS 5 I am no longer your man…)
However, in other areas my knowledge has grown dramatically as my career has turned corners, etc.
I would LOVE to have kept up to date with everything but I simply don’t have the time or capacity, as each of the fields is constantly expanding as well. It’d be like trying to climb a tree by hugging the trunk and then expecting your arms to grow longer and still allow this method half way up
This is why you need to continue to develop your skills in the workplace. Yes, analytical thinking techniques would be very useful things to teach properly (I get sick of technicians applying ‘a banana is yellow, so all that is yellow must be a banana’ logic to situations), but I also get sick of people who come to the workplace knowing the theory of 3D relational databases or whatever but not being able to type a single line of SQL.
There is a neat little balance between theory and practice, and in all honesty I think the schools in the UK are closer to it than the universities (for their respective levels of education, obviously).
Using the tools available at the time gives a student knowledge of them, which can usually be leveraged in whole or in part in other applications, but more importantly it makes that knowledge hands-on. Remember: “Tell me, and I’ll forget. Show me, and I may not remember. Involve me, and I’ll understand.”
So I suppose you never had to use the command prompt in Windows… knowledge of DOS certainly does help with that… and you couldn’t apply any of your knowledge of Lotus to any other office product?
learn fundamentals, not specifics
Can anyone imagine an engineering curriculum consisting of courses like “How to Use Screwdrivers” or “The History of Hammering”? It is far more important to teach how to analyze a problem and formulate solutions using any tools at hand, whatever they may be, even if they are not invented yet. This is where we will spur innovation in methods, hardware, and many other disciplines. If all you learn is Lotus 1-2-3 with no context, you cannot easily move to other spreadsheet apps — but in an ideal world, one should logically relate to and flow into the other, if it is approached properly.
Does anyone recall the axiom that “if the only tool you have is a hammer, every problem will resemble a nail”?
I teach introduction to computer science classes at a major university. We use C++ to teach programming. Notice I didn’t say we teach C++. I teach loops, not C++’s loop, yet I use C++’s loop as my example. I teach debugging. Sure, I use Visual Studio .NET, but the principles transcend the software tool.
I have such a problem with educational programs teaching a networking course that essentially just prepares someone for the first Cisco certification test. Principles outlast technologies, which is one reason technical schools don’t get the respect that four-year universities get.
My students’ degrees should still be relevant 40 years from now. Of course, the specific technologies will be long gone and they will have to be lifelong learners to stay relevant, but the approaches to solving problems and learning new technologies will not change that much.
My brother-in-law went to technical school and has made himself quite valuable in his current position, but he can’t get out of his current job because his degree only prepared him for a version of software that is no longer supported — except by his company.
I think you guys are wrong; I agree with everything he says. As a student I can safely say that this is a much bigger issue than people think it is. The vast majority of kids I know have NO skills or competence when it comes to adapting to a new program or computing situation. All their knowledge is specific to Microsoft/Google technologies and products. The other problem is that whenever they are confronted with a problem, instead of trying to understand what is going on, they just give up and say COMPUTERS ARE TOO COMPLICATED… Adults are actually much worse, so finding teachers who are ABLE to teach this level of confidence to students is next to impossible.
i question whether the ideals of teaching the “computer using process” as opposed to simply teaching “program using process” is even a tangible thing (example: understanding the concept of loops and conditional logic as opposed to c++ loops, as mentioned above). it’s similar to math, you can teach someone derivations by example, but there are many students, when presented with another problem, who cannot extend their knowledge of working with the previous examples to the unfamiliar ones. all you can do is provide the process (by example) and hope they understand how to adapt the process to new problems.
Everything I know I....
This guy is nuts. I’m an IT professional, a 3D animator, and general CG fx person. My IT skills I pretty much taught myself. My learning started way back in 7th grade when I would play with computer systems. My 3D skills I picked up when I started college, as a hobby to help get away from the daily grind of class work. I went to a top-10 college for computer science expecting to learn a whole lot. The sad fact was that I already knew most of what they were going to try and teach me. The same went for my 3D skills. I recently went back to school seeking a degree in 3D animation and vfx, and was again disappointed to see I would already know most of what they were going to try and teach me. The only thing of value I got from both places was the degrees. The concepts you must know in order to put any of these software tools to use apply to more than just the tools. And all these applications are just that: tools. To paint a masterpiece you need more than just some paint, a canvas, and a brush.
I like the analogy that Mattman10987 used about construction tools; computer software can be modeled in the same context. As an example of what Mattman10987 said, let’s look at tools. We know in general that a screwdriver is used to put screws into wood or metal. Is it not still a screwdriver whether it’s a flathead or a Phillips? The same could be said about a hammer: hammers come in various sizes and shapes, but they still have the primary purpose of driving nails (among other uses beyond the scope of this comment). Software is really no different; a word processor creates and edits documents no matter how it looks or what tools it has to get your job done.
You can extend this logic to cars. The main thing is that you know a car has four wheels, a motor, and steering; no matter what fancy gadgets it has, that does not change the fact that you still have a car whose primary purpose is to be driven.
Now, it is nice to understand ALL the functions of a given software application, but do the majority of people really need that understanding? No. Most, like some people I know who drive cars, just want it to work: they get in, turn the key, and go; occasionally they remember to put gas in and pay attention to the warning lights.
So a hammer drives nails, a screwdriver drives screws, and a word processor creates and edits documents. It is all conceptual, and that is what we need to teach; the extras we can learn on our own.
The ability to apply theory is just as important as learning it. If I can figure out how to solve the problem, but have no idea how to make my solution actually work with what I’m doing…what good does that do anyone? You do have to apply the theory while you’re teaching it, so that as the student moves on to other things they can relate the old to the new and while the tools change the ideas behind implementing the solution do not.
Nielsen is right...
… well, to some degree. Teaching people deeper concepts, such as critical thinking and abstraction can be extremely useful, and not just with computers.
But… those are longer-term skills, and in the short term folks need to learn current technology, including (sometimes) specific applications or hardware.
Will my software be obsolete in several years? Maybe. But that’s beside the point if I can’t use it and be successful right now.
1) Learning to use the tools of the trade is absolutely essential. I graduated with a degree in MIS, but the curriculum dictated that the instruction was almost exclusively theoretical, so as not to tie us down to one database paradigm that could be outdated soon. Seems reasonable, but do you know how hard it is to find a job when you’ve never used anything besides the very basics of Access? What am I supposed to do, go set up an Oracle server in my apartment for tinkering? I’m now in accounting, and my degree was mostly a waste.
2) Learning the finer points of almost any subject is also a waste of time nowadays. Such a huge portion of human knowledge is just a few clicks away that the rote memorization of facts of yesteryear is obsolete.
Education needs to be more focused on laying a foundation in a broad number of areas without getting too incredibly detailed in any one, but with just enough depth that you know how to look for the specifics as needed.
that’s what i question, whether it’s possible to really teach this application of theory. you either get it or you don’t…
This discussion could benefit from an understanding of Bloom’s taxonomy of educational objectives (http://www.coun.uvic.ca/learn/program/hndouts/bloom.html).
The problem is that, whether theoretical or applied, many programs teach “Knowledge” and encourage “Comprehension” and, to a lesser extent, “Application,” but fail to require students to demonstrate that they can perform “Analysis,” “Synthesis,” and “Evaluation.”
When students get out of a program, they should have the application toolset and theoretical foundation they learned, yes, but more importantly, they should be able to take ideas from multiple domains and apply that critical thinking to new problems.
Many small programs suffer from emphasizing “tools” (so their students can get a job right out of college), and many larger programs suffer from emphasizing “theory” (so their students can have that reputation), but both fail to really prepare students.
Full of it
You’re full of it. All software is obsolete in a couple of years, and training kids on specific applications without a general understanding of computers is useless.
I could highlight this by pointing out the trouble people had with IE7 and its new interface. What is the friggin’ point of learning an application if MS or another company is just going to reinvent how it works in a couple of years?
General computer knowledge doesn’t go obsolete in a few years; only specific program knowledge does. Stop wasting kids’ time with MS applications that will become incompatible with the newer versions before you can blink. That’s not knowledge; that’s wasted time.
Full of it
it’s only a waste of time for those who can’t adapt what they’ve learned to other programs. you can’t generalize and say that nobody will be able to adapt to a new program by learning an older program. it’s how anybody who grew up with a computer developed their computer skills. it’s how, after teaching myself how to use the programming logic of vb, i was able to sleep through java and c++ classes and still have no problem writing code in those languages. i learned the logic of high level programming, after that it was just a syntax issue, which really doesn’t require “learning”. my real concern is whether this ability to adapt is a teachable thing…
Training and Learning
As others have noted, it’s important to learn both general skills (understanding the process, the “why”) and also the specifics of how they’re applied (the “how”).
You really can’t separate the two and be very effective; they go hand in hand. If you understand why something works, and not just how, you gain a deeper understanding and are able to figure things out when the “how” changes.
To allude to the article, it’s stated that learning a specific application isn’t as useful as learning lifelong computer skills… one is really just a subset of the other.
Learning Microsoft Office 2003, for example, is a snapshot in time and a component of a larger set of skills: understanding how formatting, keyboard shortcuts, printing, mail merge, etc. work (whatever skill you choose to insert here). As the Office programs change, the underlying understanding of why things work the way they do, and why they NEED to work the way they do, remains fairly constant. So as the programs change, the person’s ability to pick up the new skills increases.
That’s pretty much the beauty of most modern OSes: they are all built upon a similar set of commands and function the same way, so you can jump from one program to another. Even going from Windows to Linux, there are enough similarities for someone to make the jump fairly easily (though digging under the hood there are, of course, a TON of differences).