Dealing With Data Loss: A Look at the Problem and a Possible Solution

I once wrote, somewhat flippantly but not entirely jokingly, that if you live in the UK and pick up a newspaper on any particular day there is almost certain to be yet another news report about a government laptop going missing. The very next day another of those articles appeared. My perception is that things have improved since then, but that could be because little has occurred for a while on a large enough scale, or frequently enough, to warrant the attention of the mass media.

The sorts of disaster I'm talking about include the occasion when it was reported that the UK's tax website had to be closed temporarily because:

"a memory stick containing confidential pass codes to the system was found in a pub car park."

That was repeated a few months later, along with another article stating that, according to official figures, one official is disciplined over data loss every day. And if that's the "official" figure, there is no doubt in my mind that the actual figure is higher. I wonder what it would be if you also took into account private companies and Local Authorities "losing" data?

I've even attended a seminar on the subject of missing data and laptops, where a number of experts gave talks on the problem. But it seems to me that the problem could actually be solved very quickly by changing the way we think about data.

One of the aspects of many ICT courses is the effects of IT on society. Perhaps this opinion piece (which, as you will see, is backed up by facts and figures) might be used as the starting point for a debate and other work on the subject.

The phenomenon

For those outside the UK who may not have heard about this phenomenon, these are what seem to be the common features of such cases.

1. Someone, for reasons best known to themselves, leaves their place of work with a laptop or memory stick containing the personal details of thousands -- or in one case, 25 million -- of people.

2. They leave the laptop or USB stick on a train, on the back seat of a car, or in some other equally safe place.

3. Someone discovers it and reports it to the authorities or the press.

4. There's a press release assuring us that the data was encrypted, but they've changed everything anyway, so there is no need for anyone to worry.

5. The person who lost the item is reprimanded or fired.

6. There's a lot of wringing of hands, promises of internal inquiries and so on.

7. It all goes quiet as the media focus on the next organisation to lose a load of data.

Terminology

To my mind, there is something wrong with the word "loss" in this context. I'm not sure exactly what the right word would be, but I think of it in much the same way as road accidents. Traffic "accidents" tend not to be called "accidents" these days, because most of them are caused by human error. The word "accident" conveys a sense of "not my fault", when actually most road crashes are someone's fault, as opposed to, say, mechanical failures or acts of God.

In the same way, losing thousands of people's details is not simply accidental, as the term "loss" implies. To leave a laptop lying around or to lose a memory stick in the street surely suggests a lack of attention. We all lose stuff -- I'm always putting things down and then retracing my steps mentally to work out what I did with them -- but I can assure you that when I leave the house with something really valuable, like my passport, I go to absurd lengths to prevent losing it, such as using a bulldog clip to attach it to the inside of my pocket. Or, despite wearing a jacket with zipped pockets, I check that it's still there every 5 minutes.

But wait...

Why do people feel the need to take such huge amounts of data away from the office in the first place? I've been working now for nearly 35 years, and in all that time I have never taken home the kind of data that seems to go missing virtually all the time now in the UK. When I did take data home, it consisted of pupils' names and their exam grades. School registers, which contained pupils' names and addresses and phone numbers, were never allowed off the premises.

These days, if people have to work from home, they should be able to access the data they need over a secure internet or extranet arrangement. I just don't see why there should ever be a need to physically remove the data from the place of work.

What to do about it?

Health and safety

As long as people continue to think of data loss as losing "data", there is never going to be a real appreciation of the possible consequences in human terms. There have been cases of armed forces personnel's details going AWOL, of fugitive criminals' details going astray, and of financial records going missing. See this article for a summary of this pretty bleak picture as it stood in August 2008, and then this article for more examples from the first few months of 2009. Just last month someone walked into a council office and walked out again carrying a laptop containing over 14,000 people's names and other details.

So surely the first thing we should do is redefine data loss as a health and safety hazard? According to a report last year into identity theft:

"More than 49% of the respondents reported stressed family life, 22% felt betrayed by unsupportive family members and friends, and 23% said their family didn't understand.

The strongest feelings expressed were: rage or anger, betrayal, unprotected by police, personal financial fears, sense of powerlessness, sense they were grieving, annoyed, frustrated, exhausted, sleep disturbances, an inability to trust people, and the desire to give up and stop fighting the system. ITRC long term emotional responses included: 8% felt suicidal (my emphasis), 19% feeling captive, 29% ready to give up and 10% felt that they have lost everything."

When we discuss e-safety with kids we talk about the need to keep their identity secret from strangers. There's an inconsistency if we fail to regard the losing of data, which could clearly lead to identity theft on a massive scale, as a health and safety issue too.

Now, if a company was poisoning its employees or the local populace with toxic waste or a contaminated water supply, they would risk being fined. The directors could even find themselves arrested on a charge of corporate manslaughter. I wonder what effect it would have on data loss if employees and their managers knew that if a memory stick ended up in a rubbish tip or whatever they could end up facing years in prison?

Learning from schools

Schools in the UK are subject to inspection every so often, and are also obliged to undertake self-evaluation. Why shouldn't companies have to do the same, and be expected to show high standards, and improvement over time, on a range of criteria, including data security?

Learning from photo libraries

If you're in the media business in the UK, and you need to hire photographic transparencies from a photo library, don't lose or damage them. Why not? Because you're likely to be fined between £400 (630 USD) and £600 (945 USD) for each one.

What if, applying this principle, companies or government departments were fined for each unit of data they lost? Even if only £1 per item was levied, losing 25 million names would be a costly business. Or do we as a nation think that in principle photos have more value than people?

Over to you

What do you (or your students) think of my suggestions?

This is an updated version of an article which appeared in 2008.

The Well-Fed Writer, by Peter Bowerman

Reviewed by Terry Freedman

What's a book on writing doing in a publication about educational ICT? Looked at from one point of view, it's completely out of place. However, that is not the only perspective available. Much of the ICT curriculum centres on the concept of audience. Whether it's preparing a presentation for a particular audience, or responding to user feedback, the work requires an attention to someone other than oneself, and something other than the technology. Peter Bowerman, the author of TWFW, has managed to forge a living out of writing. It follows, therefore, that he may be able to teach us something about audience, and have some useful web resources up his sleeve into the bargain.

The book is, in effect, a marketing manual for the would-be serious freelance writer. Thus there is much about how to choose products and services (free is not always second-rate compared to exorbitant, it turns out), and how to approach potential clients. There is good advice about website design and what you should provide on the site, a wealth of websites to explore, and guest sections by other writers (including a few I've come across in the blogosphere, and whom I respect as writers).

There are a couple of niggling things. One is that although Bowerman makes it clear that social networking sites are very important in today's economy (schools that ban them, please take note), he admits that he himself isn't a member of any of them. That is disappointing, because he might have been able to distil into a few bullet points the best way of making contacts in such spaces from his own first-hand experience.

As far as I can tell, there is no information about print-on-demand. Given that writers can be their own publishers these days, a section on that would not, I think, have gone amiss. There is a section about it in his companion book, The Well-Fed Publisher, in which he disparages the use of PoD (although at that time Lulu had only just appeared on the scene, and Bowerman himself had not yet used it).

However, given the readability of the book, such annoyances can be overlooked. Although the jocular (in parts) tone can start to sound a bit forced occasionally, it more often has the effect of making you want to look up that website or read such and such a blog.

Bottom line:

Perhaps not the most obvious choice for an ICT department in a school, but full of hidden gems and a cornucopia of resources. Buy it.

Related article: The case for print-on-demand.

A Teenager's View of Social Networking and Digital Citizenship

Elaine and I had the pleasure of chatting to Miller, a 15-year-old girl living in the USA. It is so refreshing to listen to someone who is so level-headed when it comes to issues such as cyber-bullying. It is also interesting to hear how blogging and other Web 2.0 applications helped Miller to find her writer's voice, and to deal with some difficult situations.

There is a lot in this: how her class handled a setback created inadvertently by Google, how their teacher laid down the rules and gave tuition on internet safety right up front, how their other teachers are learning from Miller and her classmates, and a lot more.

The stories I mentioned in which Facebook was involved are here:

Facebook and suicide prevention

Facebook and bankruptcy prevention

Her teacher, Vicki Davis, made the following comments on the recording:

Actually, the middle schoolers aren't using Jott; they are using cell phones in English. They are using Jott to proofread papers. We just use it for 9th grade (Year 9) but they just started charging so we had to discontinue it. That was pretty recent so Miller may not know it. I actually just canceled my Jott account but they were using it like crazy in the fall. Miller doesn't use the features requiring premium Jott.

I actually do not like Jonas brothers chat rooms, etc. That is a place for a lot of predators -- Woogi world is better than Club Penguin. But Miller and I differ on our opinion on that one.

On the issue of over-familiarity between students and their teachers, Vicki said it wasn't an issue in her school because it's a small community in which many people know each other anyway.

Miller mentioned PowerSchool. Their website is here.

The recording lasts just over 25 minutes.



Miller has also written a fantastic article for the Computers in Classrooms newsletter.

Acknowledgements

Thanks to Vicki Davis for her help and support in setting up this interview, and to Miller for her time.

The music after the introduction and at the end is Simple Soulman by The Groovebusters, and is under a Creative Commons licence. Hear the band at:
http://www.garageband.com/song?|pe1|S8LTM0LdsaSkYFexYGE

Disclaimer

Miller's views do not represent the views of her school, her teacher, or any other organisation to which she belongs; they are solely her own views and opinions.

If you enjoyed listening to this, you may also enjoy hearing our interview with Edith, an English teenager.

Computing at School

Last night I attended the Owers Lecture on the subject of "Can we reverse the decline in schools' computing, especially with girls?" (I'll report on this in due course.) As the group called Computing at School was mentioned a lot, I thought I'd reproduce the following article from the April 2009 issue of Computers in Classrooms. The conference mentioned has, obviously, been and gone, but I think it's worth retaining that information for the links and because the agenda is interesting in itself.

The group has recently produced a glossy magazine (insofar as a pdf can be described as 'glossy'!) and some teaching materials, which I intend to review.

My own interest in this (as it's now de rigueur to declare one's interest, however slight) is that I love messing about with programming, having dabbled in Visual Basic and Visual Basic for Applications. Indeed, my chapter in the Year 8 book in the ICT 4 Life series is all about addressing the sequencing aspects of the National Curriculum through the use of VBA in a spreadsheet.

There is a looming crisis in the world of computing, says Roger Davies.

As the speed of technological development increases, and with it the need for ever greater numbers of computer scientists, researchers and technologists, the numbers opting to study computing in higher education have halved in the last ten years. There are many reasons: the image of the discipline, the lack of a coherent study pathway in secondary education, and limited exposure to any computing before 16, to name just a few. Post-16, the numbers studying Computing are small. As a result, Computing teachers often feel isolated and face difficulties keeping up-to-date.

It is ironic that as ICT becomes increasingly ubiquitous, fewer children are being taught the fundamentals of computing, in particular programming. Bright students, of the kind who might make a career in computing, often progress in spite of, not because of, their school education.

Yet many children are curious about the technology we take for granted. They want to know how Google finds so many hits so quickly, and how it ranks them. How does an email get to its correct destination? How does file compression work? It is computing that gets iTunes onto their mobiles, allows them to stream videos from across the world and buy things safely online.
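To give a flavour of how approachable one of those questions can be, here is a minimal sketch of run-length encoding, one of the simplest compression ideas (written in Python purely by way of illustration; real formats are far more sophisticated): repetition is replaced by something shorter, and nothing is lost when it is unpacked again.

    def rle_encode(text):
        """Run-length encode a string: 'aaab' becomes [('a', 3), ('b', 1)]."""
        encoded = []
        for ch in text:
            if encoded and encoded[-1][0] == ch:
                encoded[-1] = (ch, encoded[-1][1] + 1)
            else:
                encoded.append((ch, 1))
        return encoded

    def rle_decode(encoded):
        """Reverse the encoding to recover the original text."""
        return "".join(ch * count for ch, count in encoded)

    message = "aaaaabbbccccccccaa"
    packed = rle_encode(message)
    print(packed)                         # [('a', 5), ('b', 3), ('c', 8), ('a', 2)]
    print(rle_decode(packed) == message)  # True: nothing has been lost

A nice follow-up question for pupils is why the 'compressed' version of ordinary English prose, which contains few long runs of repeated characters, usually ends up longer than the original; that leads straight into a discussion of when and why compression works.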

In recent years, diverse groups of enthusiasts have sought to bring these concepts to life in a way that is understandable for children. For example, Queen Mary College produce CS4Fn – a magazine aimed at secondary-age pupils with a wonderful supporting website. Based at Glasgow University, Computer Science Inside have worked with teachers to develop a growing number of resources, and in New Zealand the Computer Science Unplugged team have produced a marvellous collection of classroom activities to demonstrate computing concepts without the need for a computer.

If the thought of programming conjures up visions of blank faces staring at incomprehensible lines of code, it is time to rethink. There are many exciting resources that aim to introduce children to programming in enjoyable and engaging ways. GameMaker (developed at Utrecht University), Greenfoot (Kent University), Scratch (MIT) and Alice (Carnegie Mellon) are just some of the excellent free tools finding their way into schools.

The recent revision of the National Curriculum, with a new, welcome focus on sequencing, provides an opportunity to replant the computing flag within our Key Stage 3 (11-14 years old) ICT provision. Computing has a rich and deep tradition and it is time for teachers to rediscover it. Programming teaches children the skills to dissect problems, understand the logic and sequences that lie behind solutions, and construct those solutions so that a computer can execute them. These foundations provide generic and extendable skills that have value in many spheres beyond IT. As Nicholas Negroponte (architect of the OLPC project) commented:

"Computer programming is a powerful tool for children to 'learn learning,' that is, to learn the skills of thinking and problem-solving... Children who engage in programming transfer that kind of learning to other things."

There is something special in pupils being able to get a computer to dance to their own tune. In my experience computing projects are highly motivational because of their capacity to make pupils think and stretch them. But above all else, they can be fun. One of my Year 9 (14-15 year old) pupils observed, on completing a unit using GameMaker:

“That was great. You normally teach the boring bits of my Mum’s job”.

‘Computing At School’ is an open, informal working group of enthusiasts that aims to promote Computing at school. Its membership is broad, including teachers, examiners, parents, LEA advisors, university faculty, and employers. CAS was born out of our excitement with the discipline, a key goal being to put the fun back into teaching computing.

We would like to invite fellow teachers to an inaugural conference at Birmingham University on June 19th. Speakers will include Tim Bell (http://csunplugged.org/), Paul Curzon (http://www.cs4fn.org/), Michael Kölling (http://www.greenfoot.org) and Quintin Cutts (http://csi.dcs.gla.ac.uk/) amongst others. We hope this free event will provide an excellent opportunity to explore new ways to bring computing into our classrooms.

We hope the conference will provide a basis for creating an organization similar to the American Computer Science Teachers Association which has done much work to support teachers and promote a passion for computing. Please come and join us.

Further details about the conference and booking details can be found at http://computingatschool.org.uk/files/CAS_Conference_2009.pdf or by mailing conf2009@computingatschool.org.uk

Roger is Director of ICT, Queen Elizabeth School, Kirkby Lonsdale, Cumbria. He started the social network http://aqacomputing.ning.com/ which aims to provide a self help group for teachers involved in A Level Computing. He is a member of the CAS Working Group.

How good is the teaching of ICT? An interview with Edith, an English teenager

We're always interested in hearing the views of young people, so it was with great pleasure that Elaine and I interviewed Edith. Edith is a teenager living in England, and has some definite views about the teaching of information and communications technology (ICT).

I saw her, not for the first time, at a recent Teachmeet and was struck by her statement that she, and her peers, were being 'under-taught'. This ties in with what I reported in a recent newsletter:

"It's been found recently , by Ofsted, that teachers tend to teach ICT up to the limit of their own knowledge, and that this effectively holds children back."


In this interview we explore this and other issues. The podcast lasts just over 19 minutes.



The music after the introduction and at the end is Simple Soulman by The Groovebusters, and is under a Creative Commons licence. Hear the band at:  

http://www.garageband.com/song?|pe1|S8LTM0LdsaSkYFexYGE

Edith is 14 and attends school in England. She has spoken at Teachmeet events, such as the North London Teachmeet in 2009.

To respond to Edith, please submit a comment in the comments area below, or send me an email.

If you enjoyed listening to Edith's views you may also like our interview with Miller, an American teenager. That will be posted here on the 11th December 2009.

And you will probably enjoy the following: What are your kids learning while you're not looking?

That was the title of a presentation that Miles Berry and I did at the BETT Show 2009. Based on original research, it made it very clear that teachers make life more difficult for themselves, and less than interesting for their students, by ignoring what their students can already do.

For more information, including a link to Miles' blog on the subject and a slide show, see my article on What are your kids learning while you're not looking?  There is also a more up-to-date article I wrote for the IFIP newsletter, which is based in India.


21st century skills do not exist; here are 9 skills that do

Guest blogger Derek Blunt suggests what the real 21st century skills are.

Has there ever been such a frenzy of thinking and activity over a concept which does not even exist? I am referring, of course, to the ridiculous notion of so-called '21st century skills'.

Bloggers, teachers, employer organisations and even governments have fallen over themselves to produce documents 'proving' that 21st century skills are essential in the 21st century. Papers have been written. Rubrics have been created. In England the National Curriculum itself has been perverted from its course to include 'Personal Learning and Thinking Skills' (aptly pronounced, generally, as 'pelts').

I am surprised that I've yet to see jobs advertised: "Wanted: Dynamic Director of 21st Century Skills"; "Needed as soon as possible: 21st century skills co-ordinator."

The truth of the matter is that there are no such things as 21st century skills, only the skills that have always enabled people to get on in their lives since time immemorial.

Think about it for a second: 21st century skills can't include technical ability, because technology is changing so rapidly that what is far more useful is an ability to learn, and an ability to be flexible. Since when was that not the case?

Before the Industrial Revolution, people had to learn to be flexible when the harvest failed, or when the local squire decided to hog all the produce for himself.

During the Industrial Revolution, people had to learn new skills, and almost certainly didn't finish their working lives using the same skills as they started with.

Perhaps the pace of change was slower, in which case a candidate for the title '21st century skill' might be the ability to cope with change taking place at breakneck speed. I haven't seen that on any syllabus or rubric.

So what are the skills which are essential to every young person? They're certainly not the wishy-washy 'soft skills' like 'being a good team member', which is not even something you can measure. No. Any decent educational system will make sure that young people leave school being proficient in the following:

  1. Ability to size people up instantly. We don't have time to mess around with people who are going to mess us about or, worse, rip us off. An ability to spot charlatans and other ne'er-do-wells instantly and to act accordingly is essential.
  2. Aye, and there's the rub: to act accordingly. Too often we don't, choosing instead to ignore first impressions and intuition and to give the miscreant the benefit of the doubt. Malcolm Gladwell, in Blink, has convincingly shown us that making near-instant judgements is natural and normal. A good education system will teach youngsters to listen to that inner voice, regardless of what their reason is telling them.
  3. Next, an ability to network is crucial. What matters in the 21st century is being connected, both offline and online. All young people should have to take some form of Enterprise education in order to develop their networking abilities.
  4. A corollary of this is that a good school will ensure that every member of staff and every pupil belongs to a social network. It can be a closed (Ning) network if Facebook is felt to be too risky. People have to learn how to behave in such environments, which are becoming part of the normal work and leisure landscape in modern societies.
  5. Manners. We seem to have reared a generation of young people who feel that the world owes them a living and that grunting and snarling are appropriate forms of behaviour. You only have to look at the behaviour of the people who are rejected early on in shows like The X Factor or American Idol. How many of them say, "Thank you, panel. That is a really useful piece of free advice given by three experts at the top of their game"?
    Not a bit of it. What you see instead is the all-too-familiar curl of the lip and the barely decipherable mumble along the lines of "You'll be sorry you missed this opportunity when I'm famous!"
  6. An ability to write. I have no issue with text-speak. Used in the proper context it's fine. But I expect to see apostrophes used correctly, and not see a comma when a semi-colon should have been used.
  7. Skill in making small talk. Schools should run simulated parties in which pupils learn key skills like how to hold a glass of wine, how to make light conversation, and how to talk without spraying their audience with the mushroom vol-au-vent they've just put into their mouth.
  8. An ability to change course almost at the drop of a hat.
  9. Finally, how to think, and how to put a logical and coherent argument together are absolutely critical. To this end I would make Computer Programming compulsory from the first grade of primary school to the final grade.

Twenty-first century skills? On the contrary: what we need is a return to the basics which have served people well for as long as anyone can recall.


Derek Blunt: Blunt by name, blunt by nature.

The Case for Print-On-Demand

Terry Freedman's PoD books

The article below was first published on 20 December 2006. It still stacks up now, but I have one or two additional comments to make at the end.

What could be better than receiving a box of books? Receiving a box of books that you wrote, of course! Is there a place for self-publishing in schools?

The books I refer to are the two booklets I wrote, on Every Child Matters and Boring ICT lessons. These were produced by print-on-demand, through Lulu, but published by, and therefore assigned an ISBN by, Terry Freedman Ltd.

I ordered 10 copies of each in order to comply with the UK requirement to send 6 copies of a newly-published book to various libraries and agencies. And that, of course, leaves 4 copies of each for me to send to reviewers and casually leave lying about when guests come to the house....

But isn't this just a case of vanity publishing? Well, yes and no. "Yes", in the sense that you pay the costs of having it printed and bound, whereas in mainstream publishing those costs are borne by the publisher. And "Yes" in the sense that if it's a niche product it would be hard to find a mainstream publisher to take it on, which leaves doing it yourself as the only option. But "No" in the sense that you may, as in my case, have been approached by mainstream publishers and declined their advances, and therefore made a free choice about whom you want to publish your book. And also "No" if the book has virtually no market at all (cf The Long Tail), which is what I should like to consider now.

Print-on-demand is a very good option when you need very few copies of a book, perhaps even just one. The origination costs, ie the fixed costs of setting up the book, are not spread over a large number, and so the fixed cost per book is relatively high. On the other hand, you don't have the twin problems of trying to find (a) start-up capital and (b) room to store hundreds of copies. In the case of Lulu, it's easy to amend the text of your book very quickly too, which in education, and especially the educational technology field, is a must these days.
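A quick worked example, using invented figures rather than any real printer's prices, shows the arithmetic: a one-off origination cost spread over a large print run adds little to each copy, whereas a handful of copies must each carry a large slice of it.

    # Invented, illustrative figures only; real costs vary by book and printer.
    origination_cost = 300.0      # hypothetical one-off cost of setting the book up
    printing_cost_per_copy = 4.0  # hypothetical cost of printing and binding one copy

    for copies in (1, 10, 500):
        per_copy = origination_cost / copies + printing_cost_per_copy
        print(f"{copies} copies: about £{per_copy:.2f} per copy")

With those made-up numbers, a single copy works out at about £304, ten copies at about £34 each, and five hundred at about £4.60 each, which is why ordering only a handful of copies always looks expensive per book.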

So, what does all this mean for the ICT (Educational Technology) leader in a school?

I have long believed that if you want people to take something seriously and treat it with respect, it has to look good. What can look better than a publication which looks like it just came from a bookstore? Most schools do not have the facilities to be able to even begin to compete.

So, if I were a Head of Department or subject leader in a school now, I would use Lulu for a number of purposes:

  • The staff handbook
  • The 3 year strategic plan
  • Information about assessment
  • A year planner or calendar with important internal events (like report deadlines, term dates) and external events (like conferences) pre-filled in.
  • Students' completed projects (added Dec 09)
  • Students' leaving portfolios (added Dec 09)


If you wanted to produce your own textbook to distribute to all your students, it may be better, because cheaper, to go down a more traditional self-publishing route. That means finding a printer who does short print runs, ie 500 or 1000 copies. The biggest barrier to this avenue is the up-front cost.

I'm not convinced that such a strategy would be cost-effective: on the one hand, traditionally-published books are much cheaper as a rule; on the other hand, it's hard to beat the cost of a ring-binder and handouts or, of course, an online collection of resources.

But for the purposes of boosting your team's morale and creating a great impression with inspectors, having a dozen each of a few publications printed is hard to beat.

Reflections, two years on

Having read this article again, two years after I wrote it, the question arises: do I still agree with it? Broadly speaking, the answer is 'yes', but it's not quite as simple as that.

It is definitely the case that print-on-demand works out more expensive per copy than going to a short-run publisher. However, the issue for me would be: how many copies are required, or are you likely to sell? In other words, the narrower the niche, the more attractive print-on-demand becomes. So if, say, you want enough copies for your ICT team and perhaps a few more to hand around, I would think that print-on-demand is the way to go.

However, I would not recommend print-on-demand for fiction writing if you can possibly avoid it. Self-published fiction is still associated with rubbish that is not good enough for mainstream publishers to bother with. I think that perception is slowly changing, because most new writers simply do not get a look in these days, and there have been some notable self-published successes. (Update: I accidentally referred to 'non-fiction' in this paragraph in the original version; I have corrected this, although hopefully the context, and the following paragraphs, will have indicated that I'd made the error, which was a slip of the pen as it were.)

In fact, if you have the stamina and the time, there is probably a case for saying that the best thing you can do is self-publish your novel (say) and market it incessantly in the hope that it will come to the attention of a mainstream publisher. But don't count on that happening, not least because you will be hard-pressed to even get it reviewed.

There's another caveat here. The CEO of Lulu didn't do anyone any favours when he said earlier this year that Lulu publishes the worst collection of poetry in the history of mankind. (See this article for a report on that by Angela Hoy, and this article for a follow-up.) I should not go so far as to say that he did a Ratner, because everyone knows that Lulu does not edit manuscripts (you'd have to purchase that service as an extra), and that many, probably most, self-publishers have received 'critical' acclaim from nobody other than themselves and their families and friends, who for the most part are too caring and too polite to say, "Sorry, but you just can't write. Take up painting instead."

Even so, I don't think comments like that help the general perception, based on a bygone era which possibly never existed, in which manuscripts were either eagerly snapped up by publishers willing to invest money and time into them, or were taken to a vanity press.

People's perception of self-publishing is better in non-fiction, certainly in the UK, possibly because people recognise that a lot of non-fiction would not be commercially viable for a mainstream publisher. Also, if you are recognised as an expert within your field, people in the same field are almost certainly not going to be deterred by your book's self-published status.

Of course, these days you can easily avoid physical books altogether and go down the ebook route. But why not do both?

I'd be interested in hearing about your views and experiences in these areas.


My Nominations for the 2009 Edublog Awards

Here are my nominations for the 2009 Edublog Awards.

Best individual blog

This is an easy choice: Shelly Terrell’s Teacher Reboot Camp. With interesting articles, the ‘month in review’ series, challenges and guest bloggers, the site is always interesting and thought-provoking.

Best individual tweeter

Well, I’d like to nominate Shelly for this as well, on the grounds that she retweets incessantly. This sharing is what the Web is all about these days, and Shelly exemplifies the principle brilliantly.

I’d also like to nominate Neil Adam. Neil is absolutely brilliant to have at a conference, because he live blogs keynote talks in the form of tweets. I often use his twitter stream to double-check my own notes!

Best educational use of video

My nomination in this category is for Leon Cych, who publishes the Learn for Life blog. Whenever there is a conference going on in Britain, there’s a fair chance that Leon will be there live streaming or recording it. The results are of a very high, professional standard, and his recordings of events such as Mirandamod discussions are forming an important archive that will be useful for years to come.

Best group blog

I quite like the Technology and Learning Blog. I suppose I have to declare an interest, which is that I write for it every other Tuesday, but in fact I’d nominate it even if I didn’t. There’s a good bunch of writers there, and the result is quite a rich reading experience.

Best educational use of a social networking service

My nomination here is for the Digiteen Ning, set up by Julie Lindsay and Vicki Davis. The posts, mostly by teenagers, are usually incredibly thoughtful. Anyone who bemoans ‘the youth of today’ needs to visit this site.



Computers in Classrooms December Edition Just Published

Here's what it contains:

  • Editorial
  • Website news
  • Web 2.0 Projects Book
  • The K12 Online Conference
  • Mobile Learning Mirandamod
  • Are you taking Twitter too seriously?
  • How useful are elevator speeches?
  • What To Do When An Inspector Calls: 9 Suggestions
  • The Children, Schools and Families Bill
  • What the recent Ofsted report says about ICT
  • The New Ofsted Framework and ICT: 7 Key Points
  • Learning new software: Adobe CS4

Reviews section:

  • Your Justice, Your World - A Primary Teacher's Perspective
  • Your Justice, Your World - A Secondary Teacher's Perspective
  • WriteMonkey
  • Marxio Timer
  • The Making of a Digital World
  • The Well-Fed Writer
  • Totally Wired
  • Wikified Schools
  • Twitter Means Business
  • Grown Up Digital

Look here for details of how to subscribe (it's free).

You Need To Set a Good Example

If you want students to be good learners and users of technology, you have to set a good example. That is basically the message of Shelly Terrell's latest post, Most Teachers Don't Live There. Shelly asks:


If we are knowledge sharers, shouldn’t we continue to fill ourselves with knowledge?

If we want to inspire students to continue learning throughout their lives, then shouldn’t we continue to learn throughout our lives?

If we want motivated students who see learning as a journey, then shouldn’t we continue our journey?

If we want to motivate students to be the best in their fields, then shouldn’t we be the best in our fields?

If we want other educators to listen to our ideas, then shouldn’t we read about their ideas?

If we want support from our colleges, then shouldn’t we support their workshops and projects?

If we want students to use digital media responsibly, then shouldn’t we give them access and show them how?

If we want students to not let technology overtake their lives, then shouldn’t we teach them how to balance themselves?

How can we teach balance, if we don’t have any social media in our diet?

These are great questions, and they are spot on. Whether you work in a school or a Local Authority, for a company or for yourself, if you do nothing else you must at least be an excellent role model in your approach to education in general and to educational technology in particular.

In fact, I would go further than Shelly has, and say it's not only about setting a good example to our students, but to our colleagues as well.

Of course, some of Shelly's challenges are hard to meet, like the ones about balance. Recently we watched a programme called "Email is ruining my life", which looked at someone who sleeps with her Blackberry next to her in case an email comes through in the middle of the night, checks emails in the bathroom, checks them whilst having dinner.... I am not that bad, but I must be heading in that direction, because at the end of the programme Elaine said to me:

"Recognise anyone you know?"

I tried to plead the 5th, but that doesn't cut much ice in England.

It reminds me of this story:

A woman takes her little boy to see a Holy Man. She says, "Please tell my son to stop eating sugar."

He replies: "Certainly. Bring your son to me in three days' time."

Three days later, she returns with her son, and the Holy Man says to him, "Stop eating sugar."

The woman says, "Why couldn't you have told him that three days ago?"

"Because", he says, "Three days ago I had not stopped eating sugar."

Shelly's post is very challenging, I think, and she finishes it with a great challenge to the reader. Do head on over there to read her post in full.

Who Ya Gonna Call? Results of My 'Experts' Poll

A while ago I conducted a survey to find out who or where people turn to for expert help. Here is a quick snapshot of the results:

Who you turn to for answers

In a forthcoming issue of Computers in Classrooms I'll be adding more detail, such as what people suggested in the 'Other' category. Thanks to everyone who took part in the survey.


The Online Information Conference and other news

In this video I talk about the Online Information Conference. If you're in London and you see this in time (it finishes on 4th December 2009) you might like to get along, for reasons I describe.

If you can't get there, it's worth checking out the website for information and podcasts.

I've also included a short video I shot with a pocket video recorder called the Kodak Zi8, which I'm quite impressed with.

Other items mentioned include the next issue of Computers in Classrooms, which includes several book reviews, two reviews of the same website, current legislation in the works, elevator speeches and coping with inspection. That will be out very soon.

Plus information about the Web 2.0 Projects Book I'm working on, and my two presentations at BETT, which are:

Driving Your ICT Vision: how might advanced motoring techniques help us achieve our ICT goals?

Amazing Web 2.0 Projects: Real projects in real classrooms with real kids!

Wasteful Widgets #4: Maps

Like many bloggers, I have a map on site showing where visitors are coming from. Why?

I think there are two aspects to this question: why have a map, and why show it to the world?

On the first issue, it is quite nice to look at one's reach. I have to admit to a little thrill when I see that someone in Borneo, say, has been checking out my blog. And I find it fascinating that I can write something now, sitting here in my home near London, England, and seconds later someone on the other side of the world, in Australia or New Zealand, can be reading it. I am still intrigued despite having done this kind of stuff for nearly 15 years.

But the other question, about why show it, is more problematic. At first, I did so because I wanted people to see that I am being read internationally. I now feel that one should be able to take it as read that that's the case, and not make a big deal out of it.

I also think that, as with recent comments, it's a matter of context. When I took part in the Classroom 2.0 Live discussion, people were asked right at the start to click on a map to show where they were listening from. Seeing the whole world 'light up' as members of the audience did so was an incredible experience. And for me, as the guest speaker, it really brought home to me that, although I was loafing around in casual clothes and unshaven, I was addressing a global audience in just as real a way as if I had been speaking at a physical event.

So, given that there's not much point in displaying a map on my site, because it lacks context and smacks ever-so-slightly of egotism, why do I continue to do so?

The answer, I'm afraid, will be familiar to married men everywhere. My wife likes to look at it, and doesn't want to have to bother logging in in order to do so.

International ICT in education superstar I may be, but at home I know my place!

Other articles in the Wasteful Widgets mini-series

Wasteful Widgets #1: Most popular articles

Wasteful Widgets #2: Twitter Feeds, and 7 Reasons to Eschew Them

Wasteful Widgets #3: Recent Comments

Delete: The Virtue of Forgetting in the Digital Age

In "Sketches Among The Ruins of My Mind", Philip José Farmer depicts a nightmare scenario in which an object suddenly appears in our skies, and proceeds to remove everyone's memories, four days at a time. Gradually, people regress through their chronological age, ending up drooling like babies, and forgetting all their relationships and skills. As people realise what is happening, they resort to leaving themselves notes and tape recordings by which to tell or remind themselves, on waking up in the morning, what's been going on.

That's an extreme description of what might happen if we were unable, unaided, to remember anything about the last three days, but humankind has always tried to find ways of remembering.

John Mack, in "The Museum of the Mind", looks at how different people in different times and places have used artefacts such as paintings and sculptures to help them remember, a story he tells through the collections in the British Museum.

We have always been afraid of forgetting, which, as Viktor Mayer-Schönberger has pointed out in a recent lecture, is the "default setting" for human beings. However, we have now entered a digital age in which the balance between remembering and forgetting has been reversed. In other words, the default setting is now remembering, and we as a society have perfect memory.

A good thing? In some respects, of course; but Mayer-Schönberger fears that we have not fully considered the negative implications of perfect memory.

One of the sources he draws upon is the Argentinian writer Borges. In “Funes, The Memorious”, Borges provides us with a startlingly accurate insight into what a curse perfect memory would be for an individual person. “Startlingly accurate”? Yes, because decades after he wrote this we have discovered a handful of people in the world who have this rare ability, or rather affliction.

And the societal perspective on this?

As Mayer-Schönberger points out, a society that never forgets may stop forgiving. That unfortunate photo of yourself, or that article you wrote whilst a student, may come back to haunt you years, even decades, later.

Such a situation leads people to self-censor, not just in the here and now, but with one eye on the future. It reminds me of a science fiction story I read in which crime was effectively eradicated because the police used cameras that could go back in time to record actual events instead of people's recollections of them. The story centred on one man's attempt to commit the perfect murder: he had to engineer the situation to cause his victim to have a fatal heart attack, so that when the inevitable cameras came, they would record that he had caused the person no physical harm.

Mayer-Schönberger's suggestion is that we should remember to forget. Technology can help us by prompting us to specify expiration dates for the data we store.
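As a small sketch of that idea (my own illustration, not anything Mayer-Schönberger specifies), imagine every item we store carrying an expiry date of our choosing, with anything past its date quietly dropped:

    from datetime import date, timedelta

    # Each stored item carries the expiration date its owner chose for it.
    store = [
        {"item": "holiday photos", "expires": date.today() + timedelta(days=365 * 5)},
        {"item": "draft of an angry email", "expires": date.today() - timedelta(days=1)},
    ]

    def forget_expired(items, today=None):
        """Keep only the items whose expiration date has not yet passed."""
        today = today or date.today()
        return [entry for entry in items if entry["expires"] >= today]

    store = forget_expired(store)
    print([entry["item"] for entry in store])  # ['holiday photos']

Nothing clever is going on there; the point is simply that forgetting becomes the default unless we choose otherwise.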

It was a fascinating talk, which you can listen to. I am now in the process of reading his book, 'Delete: The Virtue of Forgetting in the Digital Age', and will review it in due course.

In the meantime, perhaps this is a topic that would make for a good discussion in ICT and even Citizenship lessons.

The books mentioned in this article are featured on my Amazon page, where they can be purchased, thereby providing me with a (very) modest additional income. Also mentioned on the page is Fictions, a collection of short stories by Borges that includes Funes, The Memorious. Although nothing to do with ICT in education as such, these stories make you think. And one, The Library of Babel, really does have echoes in the Web 2.0 world, as I described in this article about collaboration.

Also featured is Google Bomb, which covers similar ground, but looked at through the lens of online defamation and cyber-attacks.

Although I have yet to review them, I will say now that these books deserve a central place in your educational technology library.


Collaborative Approaches To Learning: Always A Good Thing?

Collaborative approaches to learning certainly have their place -- but not at the expense of the facts!

This is an updated version of an article which first appeared on Wed, 7 Sep 2005. That sounds like a long time ago, but I think the issues I was describing then are still relevant today. But I'd value your opinion on this matter. It's a longish article: go grab yourself a cup of tea.

In March 1923, in an interview with The New York Times, the British mountaineer George Leigh Mallory was asked why he wanted to climb Mount Everest, and replied, 'Because it's there'. That seems to be exactly the attitude of some educationalists when it comes to recent developments such as blogging, podcasting and wikis. That is to say, they use them purely and simply because they are there.

I'm all in favour of pioneering and trailblazing, but the downside is that evangelistic fervour can sometimes outweigh, or cloud over, any objective judgement. In my view, what we educationalists should be aiming for is not to get our students and colleagues to use technology, but to use appropriate technology appropriately. Unfortunately, that message sometimes seems to get lost in the hubbub.

I am thinking in particular of the apparently increasing adulation of, and reliance on, collaborative tools for the purpose of research, especially blogs, podcasts and wikis (the most well-known of the last is, of course, Wikipedia). In case you are new to all this, blogs are online journals, podcasts are recordings, usually in MP3 format, and wikis are web pages which can be edited live on the internet, either by anybody or by people who have subscribed to the group concerned. Wikipedia is an online encyclopaedia which features articles which can be published, then edited and counter-edited.

Is ‘truth’ relative or absolute?

Wikipedia in particular is often hailed as a fantastic resource, and one which has grown through collaboration by ordinary people. It is, if you will, a perfect example of democracy in action -- apparently, at least. The question we need to ask, however, is whether this and similar enterprises are actually useful.

For most people, and societies, the ultimate goal is absolute truth, not relativism. This isn't only a religious quest: in the field of finance, one of the main attributes of money is that it should be a measure of value which does not, in itself, change value. Hence, in modern societies, the attempts to fix a currency's value by pegging it to gold or to another, more stable, currency. Trying to measure the value of something if the value of money is constantly changing is like trying to measure the length of something with a ruler whose length keeps changing.

If relativism is not ok in our religious or economic lives, why should it be ok in our intellectual life? We all know that knowledge and understanding are constantly evolving, and that the self-evident "truths" of yesteryear are sometimes found to be wrong in the light of new evidence. That is disconcerting, to say the least, but at least it's a process that happens over years rather than overnight.

It's also a process that happens with the involvement of experts in their field. Now, I am not so naive as to be unaware that viewpoints which do not fit into the conventional wisdom of the age are unlikely to be heard. You only have to look at the experiences of Freud, Darwin and, in our own age, homeopaths and others to realise that. And the economist J M Keynes, when asked why he had failed his Economics examination at university, said that it was because he knew more about Economics than his tutors.

Nevertheless, you can't have an article published in a scientific journal or the Encyclopedia Britannica unless it has been scrutinised and vetted by another expert. This is in contrast to wikis, where for the most part anybody can come along and change an article without knowing the first thing about the subject area.

Two cheers for democracy*

Now, this may seem like a very anti-democratic point of view, and that's because it is -- in this context. If that sounds arrogant, consider this: if you are the world's leading expert in a particular area, do you really want some virtual passer-by to "improve" your work by chopping bits out or adding bits in? Of course not! But even if you are an ordinary expert, as distinct from a world-leading one, you will still not want someone correcting you. At least, not in that way. You might enjoy a good debate, and be open to having your views challenged, and you may even change your views through that process, but that, I would contend, is a very different situation.

Even more important, though, is the potential confusion it creates for students. Imagine finding a great fact to put in an essay, and then double-checking it the next day, only to find that it's disappeared. Does that mean it was incorrect, or that someone didn't like it? The only thing the student can do is to seek verification from another source. That's good practice, but the question is: what kind of source?

When I asked Limor Garcia, the inventor of Cellphedia** (a kind of mobile phone version of Wikipedia), how she would advise students to check the truth of the information they find, she said that people would be able to correct each other's answers, but also that they could check the answer in Google. That seems to me to beg the questions: (a) if you are going to check the answer in Google, why use Cellphedia? and (b) how would you know if the information you found in Google is correct?

The Library of Babel

Interestingly, this kind of paradox is not new. In a story called "The Library of Babel", written in 1941, the Argentinean writer Jorge Luis Borges describes a vast library in which there is not only a copy of every book ever written, but every book which could be written. There is, for example, a library catalogue, and an infinite number of variations of it. There is a marvellous passage in which he describes the quest for the "master" book:

"In some shelf of some hexagon, men reasoned, there must exist a book which is the cipher and perfect compendium of all the rest: some librarian has perused it, and it is analogous to a god. Vestiges of the worship of that remote functionary still persists in the language of this zone. Many pilgrimages have sought Him out. For a century they trod the most diverse routes in vain. How to locate the secret hexagon which harboured it? Someone proposed a regressive approach: in order to locate book A, first consult book B which will indicate the location of book A; in order to locate book B, first consult book C, and so on ad infinitum."

(J L Borges, The Library of Babel, in "Fictions", which is featured on our Amazon page)

The worrying development for me is not the invention and expansion of tools such as Wikipedia and Cellphedia. I actually think they have vast potential and are, in fact, tremendously exciting. From the point of view of the learning process, taking part in such collaboration is bound to engage or re-engage a lot of learners.

What I am more concerned about is the often uncritical stance of some educationalists in relation to these tools. For example, I have read articles which favourably compare Wikipedia to traditional encyclopaedias on the basis of weight, its ability to constantly change, its democratic ethos, and other characteristics. Surely the most important yardstick is accuracy? And a couple of months ago I met the Head of ICT at an independent secondary school who said, quite seriously, "We don't need to teach kids how to search the internet; they use Google and Wikipedia all the time at home."

Essential skills for users of ICT in education

We need to teach our students a number of skills or approaches when it comes to verifying information:

  • a questioning approach rather than a willingness to accept things at face value;
  • triangulation, which is the cross-checking of supposed facts with other sources of information;
  • in triangulation, the use of different types or sources of evidence; for example, there is no sense in cross-checking the accuracy of the comments I've made here by looking at other comments I've made: you should look in other sources; otherwise, it all becomes self-referential.

Above all, we educationalists should not fall into the trap of using a new technology in every situation just because it is there.

Conclusions

So what does this mean in terms of the educational benefits of services like Wikipedia, Cellphedia and, in a wider context, blogs and podcasts? Does it mean we should reject them entirely? The answer is that we need to treat them in the same way as we would encourage our students to treat any other source of information: with caution and, as stated above, to cross-check the information found using them.

We should also recognise that these new tools have some distinct advantages: they are fresh, and they allow "breaking news" in academic fields to be published with a lower burden of proof, meaning that a debate can be entered into at an earlier stage and by more people. They also enable the ordinary person and the maverick to have their say. Finally, they can have profound benefits in a social context, especially mobile phone-based services like Cellphedia (the need for which has, I would suggest, been superseded by the wonderful mobile phone apps that are available these days): imagine being able to go to a new area and find out where other people would recommend eating or staying (there are apps for exactly this).

Finally, taking part in such projects can be very useful for students, because it involves the skills of research, writing, collaboration and editing. It is easy enough to set up your own blog, podcast or wiki, as you will know if you've looked at the Web 2.0 Projects book.

In conclusion, we need to steer a fine line between using something in all situations, regardless of how appropriate it is, and rejecting it out of hand. I'm sure that the line is a wavy one as we continue to grapple with and debate these issues.

Postscript: The Demise of Wikipedia?

According to the London Evening Standard, editors are leaving Wikipedia in droves. Apparently, they don’t like the recently changed rules which, supposedly, make it harder to get away with writing rubbish or deleting good stuff. Read the comments too. Kate, for example, got fed up with her expert postings being deleted by some nameless and faceless person who decided that she hadn’t cited enough references. That sounds reasonable, but for me, having your work commented upon and rejected by someone who won’t or can’t even give you their name is unacceptable.

* Apologies to E M Forster.

** Unfortunately, at the time of writing the Cellphedia website seems to be unavailable.


Wasteful Widgets #3: Recent Comments

Many blogs display recent comments, either by the blog owner or other people. I've implemented this myself in the past, but now have reservations about doing so, for the following reasons.

Firstly, I'm not a legal expert but I should have thought that if you're going to display people's comments somewhere other than where they originally posted them, you should at least warn people that you may do so. A lot of blogs don't.

Even if you don't need to from a legal point of view (and I imagine that would depend on which country you reside in), it seems to me to be the right thing to do anyway, which is why the Terms and Privacy policy on this website states that if you post a comment it may be used elsewhere on the site, or in the newsletter.

However, there is also a moral dimension: is it right to take someone's comment out of context, without at least some clarifying text? Perhaps most of the time this won't be an issue, but imagine this scenario:

Suppose I read on someone's blog that she wrote an article for a commercial magazine for no pay, in order to get her foot in the door, and has now been told that they are not interested in commissioning any paid work from her. However, because her article was pretty good, they would always be interested in receiving more articles; they are just not willing to pay for them.

I might write a comment like, "Your first mistake was writing an article for free. You should always agree on the fee before putting pen to paper, as it were."

Taken out of context, that could be quite reputation-damaging. It suggests, for example, that I would only write an article if I am going to be paid money for it. Anyone reading the comment will not have the benefit of seeing the context in which it was made.

In this respect, automatically posting recent comments suffers from a similar consideration to posting Twitter conversations, ie they only make complete sense in context.

As for posting your own comments automatically, I don't see the point in that at all, unless it's to demonstrate to all and sundry how ubiquitous you and your wisdom are. But again, taken out of context, your own comments have little meaning in my opinion.

What I think would be quite handy would be an application that collates comments from all over the place on a particular blog post. I sometimes have few comments on the blog itself, but plenty appear elsewhere, such as on Twitter or Facebook.
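To make that idea slightly more concrete, here is a minimal sketch of how such a collation tool might work. Everything in it is hypothetical: the source labels and the adapter function are made up, and a real tool would need proper API code for each service, which I am not attempting here. The point is simply the idea of merging comments about one post from several places into a single list.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Iterable, List


@dataclass
class Comment:
    source: str        # e.g. "blog", "twitter", "facebook" -- labels are illustrative
    author: str
    text: str
    posted_at: datetime


# An "adapter" is simply any function that, given a post URL, returns the
# comments about that post from one particular place. Each real service
# would need its own (authenticated) API code behind this interface.
Adapter = Callable[[str], Iterable[Comment]]


def collate_comments(post_url: str, adapters: List[Adapter]) -> List[Comment]:
    """Gather comments about one post from several sources, newest first."""
    collected: List[Comment] = []
    for adapter in adapters:
        collected.extend(adapter(post_url))
    return sorted(collected, key=lambda c: c.posted_at, reverse=True)


if __name__ == "__main__":
    # A fake adapter, purely for demonstration.
    def blog_comments(post_url: str) -> List[Comment]:
        return [Comment("blog", "Terry", "Interesting point!",
                        datetime(2009, 11, 30, 9, 0))]

    for comment in collate_comments("https://example.com/my-post", [blog_comments]):
        print(f"[{comment.source}] {comment.author}: {comment.text}")
```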

I think overall, my objection to automatic comment posting from an educational point of view is that it represents a poor use of ICT in education. To my mind, ICT should seek to solve a problem or answer a question, not be used just for its own sake. Perhaps if someone could explain the point of displaying comments somewhere other than where they were put in the first place I'd feel differently about it.

Wasteful Widgets #2: Twitter Feeds, and 7 Reasons to Eschew Them

Many websites have a section in which their current Twitter conversation is shown. I've played around with this myself, and after some time decided that it was not something I wanted to continue with, for the following reasons.

Firstly, it just looks so aesthetically awful on most websites. Maybe that's to do with broader issues, like the blog's template or the blog owner's design skills, but to my eye it usually just looks like a mess. In fact, there's one blog I checked out recently where the Twitter feed was so prominent that it took me a moment or two to work out where the actual latest article was.

When I tried it out I put it on a separate page on its own. That overcame the messiness problem, but it only served to emphasise my second objection.

It seems to me that Twitter is, fundamentally, a conversation, and that conversations take place within a context, especially a temporal context. To take a snapshot of a conversation -- which is itself taking it out of context -- and then put it somewhere else entirely, is surely a double whammy? How can that snippet of conversation be meaningful, except by pure chance?

Josie Fraser is making an effort to make her Twitter stream more meaningful, and it will be interesting to see how that works out, but I'm not holding my breath.

Thirdly, what's the point of it anyway? For me, the idea of Twitter as conversation is that I'd like people to converse with me, not look at what is, in effect, a transcript of a conversation I'm having.

The widget I tried out made matters worse because, for some reason, it showed only my side of the conversation. So you would see these disembodied pronouncements which, if anything, made me look like a complete moron. That leads me on to…

Fourthly, when I was trying it out, because I knew that the conversation, or my side of it at least, would appear on my website, I found myself starting to look over my shoulder at myself, which is physically impossible, I know, but hopefully you will get my drift. I would start to think, "How will this look to anyone who doesn't know me?", and so I began to think twice before I replied with LOL or "Oh no" or whatever. It placed an unnecessary and self-imposed block on my self-expression.

Even if all these objections could be overcome, there is a fifth one. This blog is entirely about ICT in education. Maybe that degree of nicheness makes me the most boring person on earth, but that's the way it is. In Twitter and other places, though, I have more wide-ranging conversations. Having those, or parts of them, appear on my blog would serve only to dilute it as far as I'm concerned.

You may argue that it would be nice to see a different side to me. I agree that it's always nice to see other dimensions of the people whose blogs we read. The answer is: follow me on Twitter! I'll probably follow you back, and we will both gain. Or read my other blog, where I write about anything and everything, when I find the time.

I can see that there may be some value in publishing a Twitter stream from a list you belong to, especially if it's a specialised list. But then, for me, there's another objection:

I don't know what the people I converse with are going to say. Most of them, most of the time, don't say anything which might embarrass me, but every so often one of them will swear or imply swearing. If they did so in a comment I would refuse to publish the comment, but I don't have that facility in publishing a Twitter stream (as far as I know). The swearing doesn't happen very often, but I don't want it to appear on my website at all.
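As far as I know, none of the widgets I've tried offers that sort of moderation, but if you were displaying the tweets with your own code, the filtering step itself would be trivial. Here is a minimal, hypothetical sketch: it assumes you already have the tweet texts in hand (however you fetched them) and simply drops any that contain a word on your own block list before they ever reach the page.

```python
import re
from typing import Iterable, List


def filter_tweets(tweets: Iterable[str], blocked_words: Iterable[str]) -> List[str]:
    """Return only the tweets that contain none of the blocked words.

    Matching is case-insensitive and on whole words, so a block list
    containing "ass" does not catch the word "class", for example.
    """
    patterns = [re.compile(rf"\b{re.escape(word)}\b", re.IGNORECASE)
                for word in blocked_words]
    return [tweet for tweet in tweets
            if not any(pattern.search(tweet) for pattern in patterns)]


if __name__ == "__main__":
    sample = ["Great session on ICT today!", "That talk was bloody awful"]
    print(filter_tweets(sample, ["bloody"]))   # -> ['Great session on ICT today!']
```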

Finally, this highlights a really important issue. I think one of the things we ought to be teaching young people, and demonstrating, is that we control the technology, or should do. By placing code on your website which puts you, in effect, at the mercy of anyone who, whether inadvertently or not, says something you'd rather not see under your name, you're modelling the exact opposite, ie the technology is in control while you are a passive bystander.

All things considered, I think that placing a Twitter stream on a website is definitely a solution to a problem. It's just that I haven't figured out the problem yet.

Wasteful Widgets #1: Most Popular Articles

We're always hearing about new widgets. I love experimenting with widgets, but I think it's easy to get carried away with the wizardry of widgets. Some of them are, in my opinion, a waste of time, and I thought I'd share my views on some of these.

One thing I see a lot of is the Most Popular Articles widget. The idea is that people can see at a glance which articles on your website or blog have been most clicked on. I've messed around with this myself, and when I started this new website just over a month ago, I was a little disappointed that Squarespace provides no obvious way of displaying this information to the public.

"But", said my wife. "Isn't that sort of widget just a self-fulfilling prophecy?"

She was right. The existence of such a widget is designed to encourage people to click on your most popular articles, thereby making them even more popular. I suppose it's based on the idea that all these people can't be wrong.

But what if the current zeitgeist changes? If your most popular articles all seem to be about X and the new zeitgeist is Y, doesn't that immediately put people off, especially first-time visitors to your blog?

In any case, far better, in a way, would be to encourage people to read your least popular articles.

A more useful variation of this sort of widget is one which provides links to related articles, which is why I like using Zemanta.

That is not to say that knowing which articles are the most popular isn't useful. I use the information to try and guide my writing, to some extent.

I think if you want to draw people's attention to other articles on your site, the best approaches are either to list the most recent articles (the approach I've adopted, and one which has also been adopted by Windowsbytes, where it works really well, I think), or to show which other posts are related to the one currently being looked at, as seen at Problogger (although Darren Rowse, who owns it, also includes a popular articles section). When I come across sites that do one or both of these I tend to get drawn in, which is exactly what the blog owner wants to happen.

If we transfer these ideas to an educational context, it seems to me that a widget in a virtual learning environment which automatically (or semi-automatically) showed related articles or links would be very useful indeed.

It would certainly be more useful, and probably less fraught with potential problems, than one which showed the most popular articles or links.
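For what it's worth, the matching behind such a related-articles widget need not be complicated. Below is a rough sketch of one way it might be done, using TF-IDF similarity over the text of the articles. The article titles and text are entirely made up, and a real VLE or blog platform would obviously supply its own content and links; it simply illustrates the general technique.

```python
# A rough sketch of a "related articles" recommender using TF-IDF similarity.
# Requires scikit-learn (pip install scikit-learn). The articles are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = {
    "Using wikis in the classroom": "wiki collaboration students editing project work",
    "Podcasting for parents": "podcast lessons parents long division audio",
    "Getting started with class blogs": "blog students writing comments audience",
}

titles = list(articles.keys())
tfidf = TfidfVectorizer().fit_transform(articles.values())
similarity = cosine_similarity(tfidf)   # one row of similarity scores per article


def related_to(title, how_many=2):
    """Return the titles most similar to the given one, best match first."""
    index = titles.index(title)
    ranked = similarity[index].argsort()[::-1]   # indices, most similar first
    return [titles[i] for i in ranked if i != index][:how_many]


print(related_to("Using wikis in the classroom"))
```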

Awards for the best use of technology in schools

Becta has announced that inner-city comprehensive Broadgreen International School in Liverpool and independent Prospect House School in Putney, London, are the top two schools in the UK when it comes to the best use of technology.

I think it's worth reading the article below, and going to the Awards website (see below), in order to pick up some more ideas about what constitutes great use of ICT in schools.

The prestigious accolade of ‘Best Whole School’ is given to only one secondary (high) and one primary (elementary) school each year. The two schools beat more than 100 other schools across the country to collect their awards at Bristol’s newest science venue, At-Bristol, last night.

Both schools have demonstrated that they have successfully placed technology at the heart of learning as well as wider school management. This has helped to enhance teaching and bring lessons to life, and it shows how technology makes a difference not only in the classroom, but at home within the family and across other school activities. As I will say in my talk at Classroom 2.0 Live tomorrow,

"People ask: how can I use this application in my teaching? That’s starting with the technology and hoping it will lead to the education bit. A better question is: what applications can I use to help my students achieve X? That starts with the education and leads on to the technology. I think there’s a reason our area of expertise is sometimes called ‘educational technology’ as opposed to ‘technological education’!"

Broadgreen International School impressed the judges with its futuristic technology centre and use of ICT to involve the wider community in every aspect of the school’s life including its deaf resource base and a lively ‘silver surfers’ group.

The school’s Deaf Resource Base was able to create an online British sign language version of ‘Living in the Blitz’ for history lessons. It is fully accessible by deaf pupils, allowing them to work alongside and complete the same work as hearing pupils.

Students and teacher at Broadgreen School

The school also has ‘Silver Surfers’ groups for older members of the community learning to use technology. Age is no barrier: Les, an original member of the group who will be ninety next birthday, regularly communicates via webcam and has his own blog about his wartime experiences.

When giving advice to other schools looking to boost ICT, Peter Banks, Assistant Headteacher, says: “Use the Becta self review framework to see where you are and how you can improve. Ensure your ICT equipment is up to date and sustainable in terms of financing. Visit schools that are using ICT well so you can learn from them.”

I would certainly agree with all this, and made that last point myself in the article 10 Ways to Become an Inspirational Teacher.

The SRF is something I very much go along with too. It's comprehensive, and at the same time generic enough to incorporate as-yet-uninvented technology.

Prospect House has students who are confident, enthusiastic and independent in their use of computers, mobile devices, digital cameras and virtual learning platforms across the entire curriculum. From reviewing their sporting performances on screen to creating animations in art lessons, technology is used in every lesson to help students achieve more. It sounds like the school has successfully embedded the use of ICT right across the curriculum.

The school also posts podcasts of lessons on its Virtual Learning Environment, so that parents can see how, say, long division is taught. This has helped to raise parental involvement in their children's work.

Pupil at Prospect House

When reflecting on why the school won the award, Dianne Barratt, the Headmistress, says it comes down to a shared vision held by the Senior Leadership Team, including the school governors, combined with an enthusiastic staff, all of whom are committed to developing their practice with the aid of technology.

For more information on the Awards, please visit the Winners 2009 website, where you will find details of other winners, as well as further information about each one along with short videos. On a personal level, I was delighted to learn that The Havering ICT support service was a joint winner in the Support for Schools section.

Acknowledgements

Thanks to Kate Brennan of Shiny Red for information, case studies and photos. I’ve amended the written stuff (not the photos!) with permission. Thanks also to Dave Smith of Havering for additional information. Read Dave’s Havering blog for more ICT-related news.

Further information

The ICT Excellence Awards is an awards scheme open to all schools which aims to identify and reward excellence in Information & Communications Technology (ICT). The awards acknowledge UK schools approaching technology in outstanding or innovative ways.

See also the Next Generation Learning website.

 

Robot rights

"I will NOT have any daughter of mine bringing a robot into this house!"

You can just imagine the family rows of the future, should technology ever reach the point where it isn't possible to distinguish between humans and non-humans merely by looking at them.

And what of the ethnic monitoring forms of the future? Will employers have to ensure that a certain percentage of their workforce is non-human?

An article in the Daily Telegraph reports that people have already started to think about such matters:

"Society must decide if it is willing to accept relationships between humans and robots before the machines become so sophisticated they start demanding rights, a legal expert has warned."

I recall reading a short story some years ago in which a man discovers that he is not human but a robot, and has to leave his job because of antagonism which, I suppose, would be classified as 'robotism'. It is grist to my mill: as I argued recently, science fiction can be a great starting point for discussion in a whole range of areas.

Furthermore, as this story in the Telegraph shows, the pace of technological change is such that we cannot assume that, just because something is still confined to the fiction shelves of the bookshop, its real-world implications are not worth thinking about.

What Anna Russel, the legal expert referred to, has done is to extrapolate from current technological developments to potential problems for the future. This kind of exercise can be quite useful in getting students to think about the (possible) effects of technology on society, which is part of the National Curriculum in England and Wales and the curriculum of other countries.