Is It Plagiarism… Or Is It Wikipedia-Like Collaboration?

from the rethinking-plagiarism dept

Over the past few years, we’ve been forced to start rethinking the concept of plagiarism quite a bit, with help from folks like Malcolm Gladwell and Jonathan Lethem, who have both noticed that creative works tend to build on those that came before — and that derivative works can often be an art form in themselves, if not an inspiration for additional creative works. Still, there is something of a kneejerk reaction that plagiarism is bad — potentially for very good reasons. Passing off someone else’s work as your own isn’t particularly nice, and from a social standpoint, getting caught doing so can really damage someone’s reputation. One of the biggest concerns these days, not surprisingly, is the increasing claim that students plagiarize all sorts of things from the internet — even to the point where some worry that children today don’t even think it’s wrong to simply pass someone else’s work off as their own.

A new study looking at “personal essays” written for university admission supports this theory by pointing out repeated examples of plagiarism, where applicants pretty clearly took “personal essays” from certain websites and used the ideas and personal experiences in them as their own. One of the most popular, apparently, was an essay about a fascination with chemistry that began with the applicant setting fire to his or her pajamas at age 8. Apparently, that particular scenario happened to 234 individuals… Or, more likely, just one of them, and the rest took the idea from the fact that the essay was posted to a site showing “successful” personal statements. Most of the essays weren’t plagiarized directly — they just built on the idea. Of course, rather than just condemning the practice, Jeremy Wagstaff has a very interesting observation. He suggests that perhaps many of the applicants don’t think of it as “plagiarism” but more like wiki-style collaboration. That is, they’ve grown up in an age of internet collaboration where no one person “owns” the content, but that content is an ongoing process of ideas that anyone can participate in. In such a world, the idea of “plagiarism” has little meaning. Adding a paragraph to a Wikipedia entry isn’t plagiarizing the rest of the entry.

Of course, some Wikipedia detractors may find this to be yet another troublesome sign — that Wikipedia is teaching children to plagiarize. However, a more reasonable way of looking at it is that it’s teaching students the value of collaborative work, and of building on the ideas of those who have come before them. That’s a valuable lesson. None of this, of course, excuses passing off someone else’s work as your own — especially in a situation like a personal statement to gain admission into a university. However, it could help explain plagiarism among students, showing that it’s not all about getting off easy by copying content, but more about a collaborative approach to content. If that’s the case, the response shouldn’t be to focus on the moral or ethical issues of “copying,” but on doing a better job of teaching students the borderline between collaborative work and independent work.


Comments on “Is It Plagiarism… Or Is It Wikipedia-Like Collaboration?”

Anonymous Coward says:

Students will always try to find a shortcut, and plagirism is a shortcut. To be honest, I know quiet a few fellow students that don’t feel guilty for copying someone else’s work. Especially if they believe themselves to be capable of doing the work, but not have enough time to get it done as neatley, or less often out of lazyness (I say that out of personal experience with said students).

It would help if teacher’s knew what plagirism is before they say its bad. It wasn’t until college I met an english teacher that knew what it really was.

Sixth grade teacher says:

Re: Re:

You went to college??? Plagiarism is spelled wrong twice, you used the word “quiet” instead of “quite,” the word “neatly” doesn’t have two e’s, and “laziness” doesn’t have a “y.” Your English teacher should have taught you that the “e” should have been capitalized. It is wise to remain anonymous!

thinlizzy151 (user link) says:

That's just it...

Will the schools, which are having a hard enough time teaching kids the basics of proper English and math skills, be able to get them to understand both the value of originality and of collaborative effort, and the difference between the two? Will they be able to get them to appreciate where collaboration ends and plagiarism begins? And let us not forget that it is also OUR job as parents to actually take the time and effort to teach these basic values to our children, and not be so busy with our own lives that we leave it all to the educational system.

misanthropic humanist says:

myth of originality

Is there such a thing as an original idea? Does an idea ever die, or be born, or does it just hop from one host to another, mutating as it goes? Potter’s “Cold Lazarus” explores this in asides, but very few people seem to really understand the “meme”: that we are living through other people’s ideas, and that our own twists on them, when taken up by the world, simultaneously immortalise us and rob us of ourselves.

But you can’t have it both ways. What you won’t set free will never live. Creation is not an individual’s act; it is by definition an act of sharing. And every true creator has the humility to recognise that ideas seem to come from a source that is not the true self: we are the sum of experiences and hand-me-down knowledge.

Those “common memories” are a form of what used to be called folklore. We all have experiences and thoughts our conscious mind finds easier to express in the words of other people. I think Chomsky once said, though he rephrased Ayer, that without this gestalt communication would be impossible. Language is common not because of words and symbols we agree on, but because of meanings and interpretations we assign to them, it’s all about the semantics.

It’s the ego clinging to identity that makes us believe ideas can belong, be owned by people. The pressure to be an individual in a grossly conformist world is what drives the urge to pretend we have special, unique thoughts. To be only one of 6 billion other minds that are largely the same is a frightening and depersonalising thought for many.

But whether we recognise it or not we are each participants in a vast open market of ideas. The older you get, the more you notice it happening. Phrases you coin or jokes you make up come back at you. And it doesn’t matter who you are or whether you try, that’s the way life is. Everyone at some time, quite by accident, gives birth to something that will outlive them. I notice the warm feeling Mike gets when anyone talks of the “Streisand Effect”. I have my own expressions that, when I hear them, occasionally give me the good feeling of knowing I was instrumental in propagating or originating them.

When I hear people make egocentric claims it reminds me of the line in the Simpsons, “You know those annoying radio commercials where two people yabber back and forth…. I invented those!” ( As if that were something to be proud of)

So you could say that plagiarism doesn’t exist: we are all plagiarists, because we fundamentally misunderstand the nature of creativity.

What does exist is laziness, academic sloppiness, hubris, ego and the insecurity from lack of identity that leads people to pretend to be what they are not. And there’s nothing intrinsically wrong with co-opting and profiting from other people’s ideas either; the wrongdoing is when people seek to create a deception of originality.

The film and music business are havens for these types of people, who lack the humility to understand their own cultural context and have the audacious conceit to pass off age-old devices as their own. For example, the history of Disney is nothing but a litany of repackaged folklore.

It doesn’t help that we live in a system that recognises the concept of “intellectual property” and gives legal credence to it. If we truly understood art, science and creativity the concept of intellectual property would not exist at all.

Somewhat circularly, intellectual property comes about and persists precisely because of those insecure and conceited people who can’t stand to be part of the crowd and who elevate themselves to glory and profit by claiming other people’s ideas as their own. It’s a bit like the problem of drugs, where supply, demand and motive are all wrapped up in a self-perpetuating mess. If we scrapped the very idea of intellectual property tomorrow nobody would have the motive to “steal” another’s work, because no such possibility would exist.

Hua Fang (profile) says:

Re: myth of originality

Originality of any idea must be considered against the background of the history of all ideas that have been published, globally or locally, throughout the whole of human civilization. Before the information age (say, 10-20 years ago), it was close to impossible to check the actual truthfulness of any individual idea’s originality; one almost blindly accepted the confirmation statement from the mainstream media, which was always presented in writing on paper or in non-digital formats (microfilm, etc.). The difficulty of accessing the ultimate source for confirmation was generally too great for any individual reader to overcome. As a result, the myth of originality of many ideas had to remain. Nowadays, new tools such as the Internet and powerful computers bring us some hope of ending the myth of originality for any individual idea. However, philosophically and technically, one great leap must be made in order to truly materialize this “originality checking” as a spontaneous reasoning process. Such a mechanism is proposed and presented on my web site. Please write your comments to me.

misanthropic humanist says:

Re: Re: myth of originality

Very interesting Hua, I think your work has the right spirit. I know this domain by some different terminology which I hope you can understand. A lot of what you discuss falls into the field I know of as “expert systems” and “knowledge based systems”, ES & KBS.

The problems that arise, for a computational solution to such a search problem, are the brittleness of natural language codification and parsing, and the problem of eliminating circularity.

Let’s avoid the first of those, because we could discuss it all day long 🙂

Circularity is more troubling in many ways. Many things in human knowledge are defined in terms of themselves; for example, the negative integers, the commutative and distributive laws of mathematics and the nature of multiplication and addition are all tied up in the same nest. That doesn’t mean they are wrong or of no utility. But it does mean they can’t be represented in anything equivalent to a tree (a non-cyclic graph). That in turn causes many problems for computers.
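To make the point concrete, here is a minimal sketch (my own illustration, not from the comment) of why cyclic definitions resist tree-shaped representation: model the knowledge base as a directed graph of "defined in terms of" links and run a depth-first search. A back-edge means the concepts form a cycle and no tree layout exists.

```python
# Hypothetical sketch: a knowledge base whose concepts reference each other
# forms a directed graph; DFS tells us whether it can be laid out as a tree
# (acyclic) or contains the kind of circularity discussed above.

def has_cycle(graph):
    """Return True if the directed graph (dict: node -> list of nodes) contains a cycle."""
    WHITE, GREY, BLACK = 0, 1, 2   # unvisited / on current DFS path / fully explored
    colour = {node: WHITE for node in graph}

    def visit(node):
        colour[node] = GREY
        for nxt in graph.get(node, []):
            if colour.get(nxt, WHITE) == GREY:   # back-edge: a cycle
                return True
            if colour.get(nxt, WHITE) == WHITE and visit(nxt):
                return True
        colour[node] = BLACK
        return False

    return any(colour[n] == WHITE and visit(n) for n in list(graph))

# Multiplication defined via addition, addition via negation's inverse, and so on:
mutual = {"multiplication": ["addition"],
          "addition": ["negation"],
          "negation": ["multiplication"]}
# A plain taxonomy, by contrast, is a tree:
tree_like = {"animal": ["dog", "cat"], "dog": [], "cat": []}

print(has_cycle(mutual))     # True  - cannot be represented as a tree
print(has_cycle(tree_like))  # False - a plain hierarchy
```

The concept names here are illustrative stand-ins; the structural point is only that mutually defined concepts produce cycles that tree-based representations cannot hold.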

If you are considering only facts, let’s say football results, then no problem. Expert systems already shine when dealing with this kind of data. But consider medical expert systems for diagnosis of complicated disease. There may be omissions, inaccuracies and sometimes pieces of sub-problem knowledge that are simply wrong.
This isn’t intractable in itself; you can use backtracking and fuzzy forms of predicate logic that help you route around partial data. But, finally, there is a class of KB problems that are “strange”, not merely fuzzy — for example, mathematical questions like the number theory example I just gave you. How would you decide who first “discovered” disjoint space, as it applies to set theory (union, intersect), logic (and, or) and arithmetic (add, multiply)? Those are all equivalences of the same concept expressed in different symbolic form. In other words, how can you distinguish between branches of human knowledge that appear to be disjoint but are in fact about the same thing, unless you have profound understanding at the computational level? I suggest that if you have solved this you have already solved a much larger and more significant AI problem than the application you talk of.

Philosophically, your machine that could determine if any idea is original or not would be capable of original thought, if only by random hypothesis and exhaustive search. That would probably turn a few heads in the AI field too 🙂

Anyway, aim for the stars and you will achieve much even if you fall terribly short.

Btw, your description of the book conundrum has many resonances of Searle’s “Chinese Room”. Although I realise you are not proposing strong AI, you still have a problem: such a machine could only be trusted, never proven. Its behaviour to the outside observer may be spectacularly accurate and consistent, but it would only ever be “black box” behaviour. That would have certain, er, legal ramifications, to say the least 🙂 I am impressed that you get this and arrive at the idea of internalised “experts”, each with a simultaneous hypothesis that cannot be reconciled, but can at least be given as output to be examined by a human. Have you studied automatic theorem proving? I think you would get a lot of great insights to spur your work on by reading this subject. You will see that many assumptions you make about computability are much more difficult than they first seem.

K12 IT Admin says:

It's how teachers handle it...

Whether it’s treated as blanket plagiarism or just another source.

Here we combat it by disallowing Wikipedia as a direct reference, however the students are required to list any Wikipedia entries they gleaned information from. We then check the entries against the assignment text, to minimise flagrant plagiarism.

Of course it doesn’t help when they leave “Refine your search” or “Enter search items here” at the end of their copy text.

|333173|3|_||3 says:

In my first ever Chem lecture, the lecturer was listing the top rules of life at Uni, and the second was to make sure we knew the difference between plagiarism and collaboration.
The plagiarism filters work by searching the internet for phrases, and comparing results. If it finds that a lot of very similar phrases appear from the same sources, it flags it as possible plagiarism.
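The phrase-matching approach described above can be sketched in a few lines (a toy illustration of my own, not how any particular commercial filter is built: real services add stemming, fuzzy matching and web-scale indexes). Break the submission into overlapping word n-grams and measure how many also appear in a known source; a high overlap flags the text for human review.

```python
# Hypothetical sketch of phrase-based plagiarism detection:
# compare overlapping 5-word phrases between a submission and a source.

def ngrams(text, n=5):
    """All overlapping n-word phrases in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's n-grams that also occur in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)

source = ("my fascination with chemistry began when i set fire to "
          "my pajamas at the age of eight")
copied = ("my fascination with chemistry began when i set fire to "
          "my favorite pajamas at the age of eight")
original = ("i first loved biology after finding a frog in the garden "
            "behind our house one summer")

print(round(overlap_score(copied, source), 2))    # high overlap: flag for review
print(round(overlap_score(original, source), 2))  # near zero: no match
```

Note how inserting a single word ("favorite") only breaks the n-grams that span the change, which is why near-copies with light edits still score high — exactly the "very similar phrases from the same sources" behaviour the comment describes.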

Ashlayne (user link) says:

Creative Commons

To Swiss Cheese Monster: Yes, it’s true that today’s children are the only ones who can truly say they’re growing up in the culture of Wikipedia. But what a lot of people are also discounting these days is Creative Commons.

“The Creative Commons (CC) is a non-profit organization devoted to expanding the range of creative work available for others legally to build upon and share.” (Pulled, ironically, from the Wikipedia article about CC.)

Basically, my take on the entire issue is this: ideas are meant to be re-used, recycled, and rewritten. The actual stories themselves… not so much. I can write a story about a modern-day magic user trying to fit into today’s society, then say I got the idea from Jim Butcher’s Dresden Files series. But I’m not rewriting Dresden and claiming it as my own. People do the same thing all the time with Superman, Batman, and other trademarked superheroes and characters. The difference is, the intelligent writers don’t try to claim something they’ve rewritten as their own.

You can write a story involving Superman, and it’s understood that you aren’t the one who originally created Superman. It’s the same with any idea.

Michael Long says:


If you wrote on a resume that your fascination for chemistry began with your setting fire to your pajamas at age 8, and it didn’t happen, then that’s not plagiarism, and it’s not “content”, it’s lying.

It’s not your experience, it didn’t happen to you, and any attempt to pass it off as such, especially in a resume or application, is simply dishonest. And probably symptomatic as well.

If you’d so casually pass off a lie there, then what else are you lying about? Were you actually president of the xyz club? What else do we need to check?

And how trustworthy are you going to be in the future?

Dam says:

It Can Be Easily Prevented

Plagiarism is the failure to cite source material, not solely the act of copying another’s work. Anyone who attempts to pass off substantial amounts of someone else’s work as their own is plain lazy, or maybe unable to think for themselves.

Learning how to do research and create an original piece of work is a skill that’s just not taught in US schools. And, in places where it is, many students simply snooze through it.

charlie potatoes (profile) says:


i’m a freelance writer and in lean times i write college essays for the well-heeled under-achiever. much of my work is for theology students, and i write on the subject of ethics. they don’t seem to see the irony. they are not supposed to pass my work off as their own, of course, but rather use my essay as a guide.. yes.. right.. wink wink, nudge nudge.. but sometimes i wonder where they get the money... it’s what i think of as expensive.

David says:

Re: plagiarism (charlie potatoes)

Charlie’s lean-time enterprise for “well-heeled under-achievers” is the most serious form of cheating. For most purposes, assessors are not interested in original ideas, but in effective paraphrase and synthesis of existing ideas, properly attributed. There is a chance at detecting cheats who borrow past students’ papers or Internet resources. But an intelligent ghost writer is paid to write a custom work. It is entirely original in terms of what academics expect, and entirely plagiarised if it lacks the true author’s name.

Charlie potatoes notes the irony of theology students passing off works on ethics as their own. There’s additional irony in Charlie’s report: he is complicit in this most unethical practice. If the students are ‘unaware’ of the irony, but Charlie is aware (being better-read on the subject of ethics), who is the less ethical?

What this practice does is to bring back the pre-eminence of the classroom exam as the key assessment tool. It’s the only way to truly judge what students themselves have produced. Unfortunately, some genuine students who write poorly under sweatshop pressure will be forever suspected of foul play, if their take-home work is consistently better.

misanthropic humanist says:

Re: Fragile memory

John. Yes, I was kind of alluding to exactly that sort of thing. Our brains spend all day long soaking up information of which we only process a tiny fraction. The rest isn’t just dropped; it becomes part of that “creative gestalt” which we draw on in dreams and problem solving. The only way you could ever ensure that a person’s thoughts were entirely original is to isolate them on a desert island for life. But you’d never get to ask them, because long before you realised they hadn’t even developed language they’d probably kill you with a spear and eat you.

astronouth7303 says:

Not from Wikis

As The Swiss Cheese Monster said, those currently enrolling in college have not “grown up” with Wikipedia.

There is still a significant portion of the graduating class here who do not understand that Wikipedia is not just another site, that it is editable. While this may be a source of some of the issues, I suspect that those most aware of Wikipedia’s goal are also those most aware of what plagiarism is and how to recognize it.

Much like how people pin urban violence on video games. The violence was there before the games. I’m willing to bet that the influence of the media (Wikipedia, video games) has had little effect on the activities (plagiarism, violence).

Desultorypolemic says:


Here is the problem: when the original information is erroneous, the casual attitude toward plagiarising it results in its endless replication. It is at the opposite end of the moral spectrum to try to justify plagiarism as some kind of “collaborative” effort when the information was not offered by the original author for that purpose.

Arochone says:


Just as a bit of a side note here to the ‘writing about an experience you didn’t have is lying and we can’t trust you then’:

I’m currently in high school, and most of our English teachers say ‘if you can’t come up with a good experience of your own, make one up. It’s not the content that matters, it’s how you write about it.’
Seems quite true to me. Who cares where they got the idea from? That’s not really what they’re being judged on. Even if it’s the coolest experience you’ve ever heard about, if they can’t write worth shit, chances are they won’t get accepted. What’s the difference if it actually happened, or if you’re exaggerating a bit, or if you’re flat out making it up or taking it from someone else?

DiNap44 says:

The question of "Common Knowledge"

The one area I haven’t heard a comment on is that of “Common Knowledge.” According to Indiana University’s Writing Tutorial Services website:

“Common knowledge: facts that can be found in numerous places and are likely to be known by a lot of people.

Example: John F. Kennedy was elected President of the United States in 1960.

This is generally known information. You do not need to document this fact.

However, you must document facts that are not generally known and ideas that interpret facts.

Example: According the American Family Leave Coalition’s new book, Family Issues and Congress, President Bush’s relationship with Congress has hindered family leave legislation (6).

The idea that “Bush’s relationship with Congress has hindered family leave legislation” is not a fact but an interpretation; consequently, you need to cite your source.”

With websites like Wiki, etc., one could argue that one does not have to fear plagiarism when using facts on a subject in a research paper, because they are now readily available on the web and can be considered Common Knowledge. This would only cover facts on a subject, not copying others’ experiences or opinions.
