E Is For Everything -- But Why?

There is an unfortunate tendency for e-learning evangelists to try to come up with as many e-words as possible when promoting the benefits of e-learning. Why?

I suppose the idea is to generate excitement, and to energise one’s colleagues. But to my mind, this is mere gimmickry.

I’ve seen it done with the 'C' in ICT as well. That stands for 'communication' or 'communications', but I’ve known people to embellish and complement it with 'collaboration', 'co-operation' and the like.

(Curiously, I have never seen it done with 'd', as in 'digital', or any other letter.)

If that were all there was to it, this tendency would be merely annoying. However, I believe it has a subtle but real detrimental effect, in two ways.

Firstly, just as it is often the case that a piece of writing is diminished in direct proportion to the number of adjectives used, so is the authority of a discipline lessened as more and more attributes are generated for it. It seems to denote a certain lack of confidence: you don't see geography teachers babbling on about how good, great or gritty their subject is; you don't hear historians trying to convince people that their area of study brings happiness, or that it reduces harm or hubris.

To quote from Hamlet, "The lady doth protest too much, methinks".

Secondly, and worse, it can actually do some positive harm. As long as the myth is propagated that e-learning is different to learning, or that an e-portfolio is fundamentally different to a portfolio, some teachers and their principals will quite happily continue as if the e-revolution has nothing to do with them. Worse, concerns over e-safety could easily mask the fact that all teachers should feel responsible for children’s safety, and that there is not a subset of safety that can be delegated to the ICT staff.

As far as I am concerned, the sooner we drop the 'e' from everything, the better.


An earlier version of this article first appeared on 29 July 2009.

Upcoming Article on Mission Statements

I have a thing about mission statements: basically, I think they're a waste of time, effort and resources. But there's always an implied mission statement, and that, in my opinion, is a reflection on leadership.

An article about this will appear tomorrow morning at 8 am, but if you want to read it now, you can: there's a slightly different version of it at the Technology and Learning Blog.

If you're an ICT or Ed Tech Leader, do take a few minutes (literally, just around five minutes) to complete a survey on what issues ICT leaders face today. The initial results will be published soon. Thanks.

Personal Learning Networks

A couple of recent posts by Miguel Guhlin -- see Vastness of You - Plurk Me No More and To PLN or Not -- and a comment on the latter by Paula Nagle, made me think about personal learning networks, or PLNs.

What IS a PLN?

Clearly, it’s the group of people with whom one interacts, online in the first instance. I have to say I have a bit of an objection to the term, because it gives the impression of being rather self-centred, as though everyone in your learning network is there to help you learn. Help you learn? What about them?

We learn most when we’re discussing or teaching, and although that’s what goes on in PLNs, it’s not exactly explicit from the name. Perhaps Personal Interaction Network (PIN) would be better (which would no doubt lead some people to talk about their PIN numbers…).

Can your PLN be too large?

I wonder if this term, “too large”, has any real meaning in an asynchronous world. I can put out a message on Twitter this morning, and have a response from someone on the other side of the world this evening. In that sense, is there such a thing as “too large”?

Can anyone join?

What is the qualification for becoming a member of someone’s PLN? For me, it’s having something useful and relevant to say. I don’t care if someone has been blogging for only five minutes: if their first few blog posts are interesting, I’ll follow them. I find it embarrassing, though obviously flattering, when people who follow me say they really look forward to reading even more wisdom, or when people I follow on Twitter express the hope that I’ll find it worth my while. We should try to get away from this sort of hero worship: it’s not healthy. It’s not even accurate: someone who has been active in the “edublogosphere” for five minutes can be just as “wise” as someone who has been here for years. Perhaps even more so, because they come to it all with fresh eyes. In fact, a member of your PLN could be the guy who runs the café down the road!

What other value is there in a PLN, apart from learning?

As Paula has pointed out (see post and comment referred to earlier), it’s wonderful when people in your PLN make themselves known to you – and vice versa – at conferences. A PLN can also be a great source of support, especially when the trolls are having a feeding frenzy.

So what are your thoughts on PLNs?

The Big Issues for ICT Leaders: Forthcoming Initial Results Announcement

This is just a quick heads-up to say that in the next day or so I aim to publish a snapshot of the results so far of a survey I set up about a week ago. If you haven't already done so, take the survey now -- it takes only about 5 minutes.

Read the original article about it if you missed it: The Big Issues for ICT Leaders

Does ICT Improve Learning?

The intuitive answer for those of us involved in ICT is “of course it does”. However, the evidence from research is not conclusive. I think the reason is that it’s actually very difficult to carry out robust research in this area. As the impact of ICT has been a topic for discussion recently on the Naace and Mirandanet mailing lists, I thought it might be useful to try to clarify the issues as I see them.

The question “Does ICT improve learning?” naturally leads on to a set of other questions that need to be addressed:

What ICT?

The question as stated is too broad. A computer is not the same as a suite of computers. It’s not even the same as a laptop, which is not the same as a handheld device. Software is not the same as hardware, and generic software, such as a spreadsheet, is not the same as specific applications, such as maths tuition software.

What other factors are present?

ICT doesn’t happen in a vacuum. What is the environment in which the technology is being used? How is the lesson being conducted? What is the level of technical expertise of the teacher? What is the level of teaching expertise of the teacher? These and other factors mentioned in this article are not stand-alone either: they interact with each other to produce a complex set of circumstances.

What is the ICT being used for?

What is being taught? There is some evidence to suggest that computers are used for low-level and boring tasks like word processing, in which case comparing technology-“rich” lessons with non-technology-rich lessons is not comparing like with like. On the other hand, technology can be, and often is, used to facilitate exploration and discussion. Since these are educationally beneficial techniques in their own right, the matter of validity needs to be scrutinised (see below).

How is the impact of the ICT being evaluated?

There are several ways in which this might be done, each with its own advantages and disadvantages. For example, in-depth case studies yield rich data but may be difficult to generalise from. There are also three other problems. One is that it is difficult to conduct experiments using a suitable control group, because no teacher wishes to try something which may disadvantage a particular group of students. Another is the so-called “starry night” effect, in which case studies focus (naturally) on the successful projects whilst ignoring all the ones which either failed or were not believed to have delivered the same level of benefits. Finally, there is the danger, common to all kinds of evaluation study, that the methodology itself may affect the outcome.

What exactly is being measured?

This is the issue of validity, already touched upon. Are we measuring the ability of a teacher to conduct a technology-rich lesson, in which case it’s the effectiveness of the teacher rather than the ICT that is being weighed up? By implication, it may be the quality and quantity of professional development which is being measured. It may be students’ home environments that are inadvertently being evaluated, or student-staff relationships.

How much is ICT being used?

I suggest there may be a difference between schools in which ICT is being used more or less everywhere, and those in which it’s hardly being used at all. In the former, presumably both teachers and students would be accustomed to using it, there would be a good explicit support structure in the form of technical support and professional development, and a sound hidden support structure in the form of being able to discuss ideas with colleagues over lunch or a cup of coffee.

Is there an experimenter effect going on?

This is the phenomenon whereby the results of a study confirm or tie in with the expectations of the people or organisation responsible for the study. This is an unconscious process, not a deliberate attempt to cheat. I’ve explained it in my article called Is Plagiarism Really a Problem?

Conclusion

My own feeling -- backed up by experience -- is that in the right set of circumstances, the use of ICT can lead to profound learning gains. However, rather than falling into the trap of arguing whether ICT is “good” or “bad”, we need to move the debate onto a much sounder intellectual basis.

Further reading

I’d highly recommend Rachel M. Pilkington, “Measuring the Impact of Information Technology on Students’ Learning”, in The International Handbook of Information Technology in Primary and Secondary Education (Springer, 2008).

What are the big issues facing ICT (Ed Tech) leaders? Please take a very short survey to help us find out.

A (Hopefully Temporary) Email Problem

One day after I told a colleague that one of my email providers had experienced problems only three times in the last ten years, there was, and still is, what they call a "major outage". What that means is that if you have sent me an email and not received a response, it's because I haven't yet received it. If you resend it to terry[at]terry-freedman.org.uk it should get to me with no trouble (I'm reluctant to say it will get to me: look what happened the last time I praised an email service).

Professional Development in Technology

I recently came across a blog by a Head of English in a school. It’s interesting to hear the views of a non-ICT specialist about what works or might work in getting teachers engaged. There are some very useful points made in the post entitled Professional Development in Schools:

Listening to staff after PD, their number one complaint is about not getting time to play and make stuff with what they just learned

This is absolutely correct in my experience. In fact, one of the most successful training sessions I ever ran was one where I allowed the teachers to spend three hours playing and experimenting, with a technician and me on hand to give advice and guidance when asked. Teachers often think that they have to be doing and speaking all the time. They don’t.

Make sure the project is based on something that can actually be used in the classroom (not just an excuse to try new tools) following a sound curriculum planning process.

Something which ought not to need saying, but it’s all too often the case that people fall into the trap of pursuing gadgets and widgets for their own sake. The key question to ask about anything in education is “So what?”. If you can’t answer that question truthfully and convincingly in terms of students’ learning outcomes, then why are you undertaking that activity?

Another idea is that of “Lunch and Learns”, taken from Bianca Hewes’ blog. The idea is that you run short lunchtime sessions which teachers may attend in order to refresh their knowledge of, or be introduced to, an application. I have to say that although I can see the attractiveness of this, I am ambivalent about it, for the following reasons.

Firstly, I have come to the conclusion, rightly or wrongly, that the best thing to do at lunchtime is have lunch, followed by doing the crossword, chatting with friends, going for a walk or staring into space. I can’t see how working at lunchtime can be effective or even healthy – which is why for the past eight years I have eschewed breakfast meetings whenever possible.

On the other hand, I can see that lunch and learns are an attractive alternative to twilights and learns. Perhaps the important thing is to experiment and find out what appeals most to your colleagues.

The author of the blog, M Giddins, surprised me by saying that she avidly followed my 31 Days to Become a Better Ed Tech Leader series -- “surprised” because I’d written the series for ed tech leaders rather than other subject leaders, and it hadn’t occurred to me that others might find it useful. I put this to her, and she responded by saying:

I think now that any leader in education also falls into the role of educational technology leader in some ways. I have a faculty that need to be guided in their quest for technology integration and I need to be both the one who models, leads and inspires as well as the solver of the practical problems sometimes inherent in the integration of technology. Your series was very clear about the WHY behind the practical solutions that you offered, which made it possible to apply different solutions to suit my situation.

Finally, there is a link to a list of tools which is definitely worth exploring. The ones I know about already have a rightful place on the list, and I’m looking forward to exploring the others.

This précis of the article hardly does it justice, so do take the time to read the original, which is as inspiring as it is engagingly written.

Other articles you may find useful

31 Days to Become a Better Ed Tech Leader: Are You REALLY an Ed Tech Leader (ictineducation.org)

What are the big issues facing ed tech leaders today?

Please take five minutes to complete a survey about this:

Ed Tech Leadership Issues

Clay Shirky on The Times Paywall

A couple of days ago The Guardian interviewed Clay Shirky, thereby giving him great, and free, publicity for his latest book. I mention this purely because Shirky is reported as saying:

… that people are more creative and generous than we had ever imagined, and would rather use their free time participating in amateur online activities such as Wikipedia – for no financial reward – because they satisfy the primal human urge for creativity and connectedness.

The bit about no financial reward doesn’t apply to Shirky himself: his book costs £20.

Don’t get me wrong: I’m all in favour of people earning a living from their writing activities, as long as they don’t fall into the trap of suggesting that if the writing is on the internet, it should be free. Indeed, I’ve noticed this about most people who say all content should be free: they either charge for their own or they earn a salary, meaning they don’t have to charge the consumer directly. It also means, of course, that they draw the line at providing their own expertise for free all the time.

Shirky says:

Just as the invention of the printing press transformed society, the internet's capacity for "an unlimited amount of zero-cost reproduction of any digital item by anyone who owns a computer" has removed the barrier to universal participation…

But the cost is not zero. Maintaining a web presence costs money unless you don’t mind putting up with loads of advertisements, or you have the technical ability and time to maintain your own server, or you have some benefactor, such as an employer, who provides the stuff free of charge. Even then, there’s a cost somewhere down the line in terms of things like backup storage and antivirus protection. And since when was someone’s time free? See also this paper about the costs of digital storage for the British Library.

Shirky seems to have a rather bizarre view of business:

Here's what worries me about the paywall. When we talk about newspapers, we talk about them being critical for informing the public; we never say they're critical for informing their customers. We assume that the value of the news ramifies outwards from the readership to society as a whole. OK, I buy that. But what Murdoch is signing up to do is to prevent that value from escaping. He wants to only inform his customers, he doesn't want his stories to be shared and circulated widely. In fact, his ability to charge for the paywall is going to come down to his ability to lock the public out of the conversation convened by the Times.

Actually, every business tries to limit consumption of (the bulk of) its products to its customers, otherwise it wouldn’t be a business. That's how businesses work: by charging some people in exchange for providing a product or service, and then not supplying it to people who don't pay. The first group of people is known as "customers" or "clients". What we seem to have here is yet another example of muddled thinking, as also exemplified by Chris Anderson's Free, which I discussed here.

Economics 101 states that the more effectively you can prevent people in group A, the customers, from providing the product or service to group B, the non-customers, the more you can charge for the service. Murdoch's problem is that anyone can share the content of The Times, if not the articles themselves, with anyone else. But there's nothing evil or wicked about his wanting to "prevent that value from escaping", unless you take the view that it's fine for some businesses to want to do that but not others. How would you justify that?

You can read the interview here. Be sure to read the comments too.

The Big Issues for ICT Leaders

In the series called 31 Days to Become a Better Ed Tech Leader, I covered a range of issues that I believe are key ones for Ed Tech/ICT Leaders.

But what do you think?

I've created a very short survey which seeks to determine the three most important issues as far as ICT leaders are concerned.

Please take a few moments to complete it. You will find it here:

Ed Tech Leadership Issues

Thank you!

Update on the Amazing Web 2.0 Projects Book


As featured in the TES!

Find out all about the book from here. It’s free!

If you already know about it, I have a confession:

Thanks to Nyree Scott, of Christ Church University, Canterbury, for pointing out an error to me: Year 1 is 5-6 year olds, not 6-7 year olds. Don't know how I came to make such a daft mistake, but it's all corrected now!

And now for some up-to-date stats:

The Myebook version has been read 2,759 times.

The Slideshare version has been read 625 times.

The Scribd version has been read 586 times.

The YouPublish version has been read 14 times. (Come on, be fair: I only published it there properly last night, and I haven’t even told anyone about it until now!)

It has been downloaded 15,143 times.

Is the Venue the Message?

I raced into the #futurising conference room 20 minutes late, having arrived 10 minutes early. Except that I wasn’t late, because the organisers had thoughtfully put the talk back 10 minutes, and the presenter was still trying to get something to work. I turned to a fella behind me. “That was the slowest-moving queue I’ve ever been in,” I said. “And I’m still early!”

Came the reply: “This has been organised by Arts people”, delivered in a tone which meant that this was not merely the most feasible explanation, but the only one.

I could see what he meant. The officials were amiably efficient, but in an other-worldly sort of way. I’m used to conferences where there are people in power suits holding clipboards, timing everything to the second and then flapping when there is a delay. This was more like being part of one of those runny watercolours you see in the Tate Gallery or along the Bayswater Road.

But despite that, or probably because of it, the conference worked. It was interesting, “edgy” and, judging by the tweets, useful. But I think a large part of its success was down to the venue.

The Nicholls & Clarke Building in London used to be a Victorian warehouse, which was used as a workplace until just a few years ago. One of the buildings used to be two separate ones, with a narrow alley between them. This is known as “Ripper Alley”, as it was thought to be one of the routes used by Jack the Ripper. You can see how dark and terrifying London once was: look at the pictures and shudder.

The thing about buildings, as anyone involved in designing learning spaces will tell you, is that the nature of the design affects the nature and quality of the activities that go on inside it. We all know this, intuitively if nothing else, and yet we keep insisting on holding ICT conferences in ordinary, traditional venues. How can you think out of the box if you’re sitting in one?

Most of us are familiar with Marshall McLuhan’s "The Medium is the Message". Might it not also be the case that the venue is the message too?

There are more photos on Flickr.

Is the ICT Curriculum Fit for Purpose?

ICT in the ‘old’ National Curriculum as it stands in my opinion is completely unfit for purpose. A curriculum written 10 years ago can in no way reflect the changes in technology and the skills that children need to be taught in the modern world.

This is the view of Steve Kirkpatrick, as expressed in an article called The future of ICT in the curriculum? on his excellent Teaching With Technology blog. I have a lot of respect for Mr Kp, as he styles himself, so I went back to basics and had a look at the 1999 Programme of Study, and its updated online version (primary and secondary – Key Stage 3 and Key Stage 4).

Sorry, Steve, I have to completely disagree with you. It may not be all flashing lights, so to speak, but that is precisely the point. The Programme of Study, and its associated Level Descriptions, were written in a deliberately technology-free way in order to future-proof them. Indeed, one could argue that the weakest parts are the examples. Even the updated online version, with its example of “multimedia” (as compared with the original “sound” and “graphics”), is starting to curl at the edges as new technologies like virtual worlds and, more recently, augmented reality have stumbled into the educational spotlight.

Steve goes on to say:

The problem is that the ICT curriculum needs to be developed from the ground up and not from the top down.

That’s no problem. The ICT Programme of Study is “vague” enough for any creative bunch of teachers to invent their own ICT curriculum and make it match the Programme of Study. For example, read my Delegation Case Study for information on how I and a group of ICT teachers went about this around 12 years ago. The scheme of work we used, and adapted to our own purposes, not only satisfied the then existing Programme of Study for ICT, it also matched the 1999 rewrite -- and could still be used, with a bit of tweaking, obviously, today. My point is that I have always seen the ICT Programme of Study as enabling rather than restricting.

Steve says:

Can we as educators develop a skill based ICT curriculum that is relevant and low cost that will deliver for future learners?

Skills-based? Aaaaargh!! What happens when the skills become completely irrelevant (like in about a year, if that)? The only viable curriculum, in my opinion, is one which takes a problem-solving approach, and in which the relevant skills are learnt as needed.

Where do you stand on these issues?

When To Procrastinate

Procrastination, n. The action or habit of postponing or putting something off; delay, dilatoriness. Often with the sense of deferring through indecision, when early action would have been preferable. Oxford English Dictionary.

 

My intention was to arise from the settee and take the tea things into the kitchen. I’d managed to reach Stage two of the three-stage procedure (Stage one is thinking about it, Stage two is announcing it, Stage three is doing it). Having discovered that thinking about it had no effect, I made a dynamic and bold statement that I was going to do the deed. (I think what I actually said was something along the lines of, “I suppose I ought to drag my carcass into an upright position so I can take all this detritus away”, but let’s not split hairs.)

In response, my father-in-law, whose name is Frank, came out with a statement that really ought to be immortalised as “Frank’s Law of Procrastination”. He said:

If you're slow enough, someone else will do it.

Sound advice, and so true, generally speaking. But after laughing, I started to think that there are times when procrastination is, actually, the most sensible course of action. Or inaction. And although procrastination usually has negative and unflattering connotations, if you look at the OED’s definition (above), you’ll notice that it says “Often with the sense of indecision…”. Often, not always. There is, it seems, nothing oxymoronic about the phrase “planned procrastination”.

So when would procrastination be a good strategy to adopt? I can think of a number of situations.

Freedman’s Variation of Frank’s Law of Procrastination

If you wait long enough, someone else will beta test it.

There are those of us who, whilst liking the sense of exhilaration one gets from trying out something completely new, have become rather fed up with trashed computer systems, security holes and other unforeseen consequences. These days, I never buy anything until it’s on at least version 3.

Freedman’s Law of Intemperate Emails

We all know this one, and I’m surprised that as far as I can find out, nobody else has so egotistically given their name to it (my excuse is that I needed a snappy heading to this bit). When you hammer out an email reply telling your correspondent to do something to themselves which is anatomically impossible, that’s when you hit the Send key when you meant to hit the Delete key. Having done something like that myself once, I now draft a response in my word processor, or as an email reply but with the name(s) of the recipient(s) removed, so that even if I do accidentally hit the Send key nothing will happen.

Freedman’s Law of Decision-Taking

(You can tell that I’m on a roll here, can’t you?) I’m very good at taking decisions, but I’d not be the right person to have in command on a battlefield. I like to look at the situation from different angles, seek other people’s opinions and then sleep on it. Obviously there are exceptions to every rule (I wonder if that rule has an exception?), but I usually find that if I resist my urge to respond straight away I end up thinking of nuances and issues which had previously escaped me.

A good example of how planned procrastination is a useful device is when a client says they would like the bid, or case study, or vision document or whatever I’m writing for them to include X. It seems a good idea at first, until I think about it and realise that including X will mean also including Y and Z in order to explain and contextualise X, and doing all that would put us way over the word limit. But after sitting on it for a day, I realise that if I said W (do keep up at the back), it would get across the whole idea of X but without going into so much detail.

Bottom line

We live in an age when instantaneous responses are possible, expected and, furthermore, highly valued. But I think we need to ensure that youngsters are taught the value of waiting and thinking, in spite of all the pressures to do otherwise.

If you enjoyed reading this article, you’ll probably also like 21 rules for computer users.

Technology Destroying Love of Reading

It must be true, because Sir Tom Stoppard says so. At least The Register, unlike the mainstream news sources I've looked at (The Daily Telegraph, The Guardian and The Independent), all of which seem to have merely published a press release, had the decency to (a) strike a cynical tone and (b) do some basic research. It says:

... the latest figures show 10,000+ students enrolling to study English last year, making it the seventh most popular subject - far ahead of maths, sciences or engineering. Another 7,800 enrolled to study combinations of humanities and languages, and 8,510 more for History.

All of which completely contradicts what Sir Tom said.

I've got nothing against Sir Tom -- I really like his Rosencrantz and Guildenstern Are Dead -- but I do get "exercised" when celebrities -- actors, authors, chat show hosts -- make this sort of blanket pronouncement, which appears to be based on no evidence at all or, being charitable, on the speaker's own experience.

Well, everyone is entitled to their point of view I suppose, but it's a great pity that all the newspapers seem to do is publish the press release as is. Thank goodness for mavericks like The Register!

See also "Is plagiarism really a problem?"

Some Statistics about the Amazing Web 2.0 Projects Book


Since its publication in March 2010, the Amazing Web 2.0 projects book has been:

  • Downloaded 14,770 times.
  • Viewed 2,748 times in Myebook.
  • Viewed 544 times in SlideShare.
  • Viewed 429 times in Scribd.

Read more about it here.

Download it by clicking on the link below:


Thanks to Nyree Scott, of the University of Canterbury, for pointing out an error to me: Year 1 is 5-6 year olds, not 6-7 year olds. Don't know how I came to make such a daft mistake, but it's all corrected now!

Bad Habit

It’s five a.m., and the world around my house is only just beginning to emerge from the shortest night of the year. What will, in a few hours’ time, be the distant din of traffic is presently a mere hum. Even the birds are too tired to sing. There’s no sound, no email, no phone call and no text messages. This is the time of day to be a writer, in England, in summer.

So what has prompted these mental meanderings? Although I am not one to suffer from a lack of anything to say when I metaphorically put pen to paper (some, like the one who unsubscribed from my Feedblitz notification service yesterday because of “Too many updates”, would say the reverse is true), I couldn’t resist buying “The Writer’s Block” when I saw it on offer for just a few pounds. Packed with photos, short articles and suggestions, this book is meant to kick-start your imagination in order to help you get past -- you’re ahead of me, I can tell -- writer’s block.

Well, one of the entries is “Describe one of your bad habits.” After struggling for a while to think of any bad habits (only recently I had my halo polished by a team of professionals), I came up with my worst one (in my opinion at least): staying up too late. At the time I should be going to bed, I make myself a cup of tea and start reading blogs, or writing. And I read. And write. And watch videos. And quickly check my email. And read. And check my email again. And so on, until I realise with horror that it’s 1:30 am. Thus it is that the technology, which makes it easy to do all these things, and my lack of willpower, which makes it hard for me not to do them, conspire to give me late nights, when what I really ought to be doing is what I did this time: get up early, which is my best time for doing stuff anyway.

The basic law of life with technology is that there’s always one more thing, which is my generalised version of Lubarsky’s Law of Cybernetic Entomology: There’s always one more bug (see 21 Rules for Computer Users for 20 further digital insights). There’s always one more website to check, always one more blog to read, always one more email to respond to. Always one more reference to check. This is why Computers in Classrooms can sometimes be weeks overdue. I’m almost ready to publish it when I see an article and think “Perhaps I should bookmark that, as it may be relevant to this issue.” When I embarked on my seminal work, the magnum opus entitled “Managing ICT”, I polished it off in a couple of months with almost no revisions. That’s because it was back in 1998, when research was still partially done in a library (Google had just started as a beta service), and blogs hadn’t even been conceived yet. I’ve been working on another few books and they are taking forever because I keep coming across relevant articles, and people make relevant comments on my own articles and in Twitter.

Like I said, there’s always one more thing.

There’s a wider, deeper, and more important issue here, I think. I was brought up under the tyranny of the maxim “If it’s worth doing, it’s worth doing properly”, which is actually logically untrue (if it were true you’d be spending the maximum amount of time and effort on everything you do in order to perfect it; you’d never get any sleep). But really the only way to deal sensibly with the world of today is to cultivate an understanding, and the practice, of the “good enough” approach. There comes a time when one just has to say, “This may not be perfect, but it is good enough, and spending another hour, or day, or week, on it may improve it, but any benefits of doing so will be outweighed by the cost in terms of the other things I could be doing instead.”

In my opinion, that’s my real bad habit: not having the wisdom, the willpower and, yes, the self-confidence to know when what I’ve done is “good enough”.

See also "Efficiency? Don't Make Me Laugh!"

Is Plagiarism Really a Problem?

I don’t often get annoyed when I read the newspaper these days -- well, not more than once per page anyway -- but an article in today’s Guardian entitled “Internet plagiarism rising in schools”, with the subheading “Half of university students also prepared to submit essays bought off internet, according to research”, really wound me up. This is for several reasons.

Firstly, the research was carried out by a researcher from the University of Manchester, and the results will be presented at a conference called The Plagiarism Conference, sponsored by, amongst others, a company called nLearning, which supplies plagiarism-detecting software. Now come on: how likely is it that they would sponsor a conference in which someone comes along and says “Hey! Our research shows that you really don’t need to be buying plagiarism-busting applications!”

Don’t get me wrong. I’m not suggesting that the research was fabricated or misreported, or that anyone has said or done anything which is underhand. The fact is that there is a tendency for research results to reflect the views or principles of the researcher or organisation involved.

I first heard about this phenomenon when I was studying Psychology at uni. It was an option I took in my first year, and in one of our experiments we looked at something called the Experimenter Effect. It was fascinating really. Paired off, we students were given the role of either experimenter or subject, and then each experimenter was given an instruction sheet to read to our subject, explaining the nature of the task he or she would be doing. The sheet included the directive to read out the instructions exactly as they were set out, apart from the last paragraph. That final paragraph told me that the task was impossible. What I didn’t know at the time was that other experimenters’ final paragraph said the precise opposite, that the task was as easy as falling off a log.

Despite, as we all thought, carrying out our instructions to the letter, and reading the sheet out exactly as it was written, i.e. with no deviation from the text or even giving our words a particular nuance, those of us who were told the task was impossible witnessed our subjects flailing and failing abysmally, whilst our more optimistic colleagues saw their subjects succeed with glee.

The same sort of thing was discovered many years ago in the field of Economics, in which it was found that the (to all intents and purposes objective) research of left-wing think tanks tended to reveal things like, for example, the official rate of unemployment was an understatement of the true figure, whilst their right-wing counterparts’ research demonstrated errors in the opposite direction. There was no suggestion that anyone was being economical with the truth.

It seems to me, therefore, that the results of research are coloured by hidden influences such as expectations, underlying methodology, the type of questions asked, and so on. I don’t think truly objective research is possible, and I would even apply that “law” to my own humble efforts. For example, it is hardly surprising that when I set out to find out how teachers were using Web 2.0 applications in their classrooms, and what the outcomes were for students, I discovered that teachers who use blogging and so on in their lessons universally report that it had a profoundly positive effect on their students’ learning. (Read all about it in the Amazing Web 2.0 Projects Book, which is not only stupendous, but also free!)

Bottom line: I tend to take all research results, especially the ones I read about in newspapers, with a pinch of salt. And I say "especially" because I find it very depressing that stories like this seem always to be reported without any critical faculty whatsoever being exercised. Like those stories that pop up every so often in which someone starts ranting that kids don't know how to use apostrophes these days, a clear indication if ever there was one of the wholesale failure of teachers, schools and society in general -- and it is mentioned, almost in passing, that the ranter has just published "Apostrophes for Dummies". I know journalists are busy people, with deadlines and stuff, but surely they could at least raise an eyebrow?

Secondly, I refuse to believe that 50% of university students are cheats or potential cheats.

Thirdly, what exactly has changed over the last however many years apart from, perhaps, the ease with which one can buy essays? I recall a “student” I was put in contact with through a private tuition agency offering to pay me three times the hourly rate to write an essay he could copy and pass off as his own. I refused, and he was so upset and angry that he complained to the agency about me, telling them that I had made the offer to him! That was 25 years ago. As far as I can see, the difference is that now he would go to a website and anonymously purchase an essay written anonymously by someone who has basically abandoned all pretence of being professional or ethical.

Fourthly, how come their tutors need software to tell them if their students are cheating? If you read your students’ essays over the year, and listen to them debating in seminars, how could you fail to notice if their writing suddenly used different language, different sentence structures or just seemed different?

Well, maybe university tutors deal with hundreds of (to them) faceless students these days. But schools? I mean, why should any school need a computer to tell it that its kids are “cheating”?

And are they even cheating? There’s an old maxim that if you steal from one writer it’s called plagiarism, but if you steal from lots of writers it’s called research. Do youngsters actually know the difference between plagiarism and research unless they’re taught?

This is nothing new either. In my very first teaching job, when I taught Economics, I set an essay to answer the question, “What are the causes of unemployment?”. When I had marked the essays I gave the class feedback as follows:

That essay you did for me was tackled really well. The only thing I would say, though, to save us all a lot of bother next time, is that instead of copying several pages straight out of a textbook, just hand me in a sheet of paper with your name on, together with the title of the textbook you’d like to copy from, and the relevant page numbers, and I’ll mark the book instead.

So how did I know they’d copied large swathes of textbooks? First of all, I possessed all the main textbooks and knew them quite well. I knew the way their authors expressed things. But more importantly, I knew my students, so when the lad who would usually come out with such gems as “My granddad wouldn’t of got any work if he hadn’t gone out looking for it” handed in an essay which was full of sentences like “Indeed, we can surmise from observation of the effects of tax incentives on industry in regional development areas …”, something told me that he may not have written it all by himself.

You don’t need technology to detect plagiarism, cheating, copying or whatever you wish to call it. What you need is teachers who know their students, and common sense -- and, from the powers-that-be, time for teachers to get to know their students, and the freedom to trust and rely on their own professional judgement (because that, when it becomes subconscious, is actually “common sense”).

Moreover, if students really are cheating, we need to ask ourselves some questions, such as:

  • Are they really cheating, or have they simply not understood that that isn’t real research, or do they simply lack the literary skills to summarise or reword passages they read in articles and books?
  • If it turns out that they are cheating, is that because we seem to be living in a society in which it increasingly appears to be the case that the end is regarded as justifying the means?

If there is any truth in that latter suggestion, perhaps we would agree with Cassius in Shakespeare’s Julius Caesar: “The fault, dear Brutus, is not in our stars, but in ourselves.”

Delete Cyberbullying

If you wouldn't say it in person, why say it online?

The National Crime Prevention Council in the USA has addressed cyberbullying in a number of short videos. They make the point very well: why behave differently online to how you would conduct yourself offline?

There's another, perhaps less obvious, message that comes across when you watch the videos. To quote from Edmund Burke,

The only thing necessary for the triumph of evil is for good men to do nothing.

Use this as a starting point for discussion with pupils. Perhaps they could make their own cyberbullying video too: that approach has been used to great effect in a number of schools.

 

Who'd Have Thought It?

An interesting video that highlights just how amazing the mobile technology we probably take for granted really is. OK, it's an advertisement, but I think it could make a nice starting point for a discussion with pupils.

One of the projects I used to set students when I was teaching was to envisage their library of the future. Some of the outlandish ideas they came up with have since come to pass. So I wonder where youngsters think technology is going?