ICT & Computing in Education

Articles on education technology and related topics
12 things I've learnt about assessing Computing and ICT

September 30, 2016

I've been thinking about, doing, and running courses in the art and science of assessing what kids know, understand and can do when it comes to Computing and ICT for a long time. Here are 12 things I've learnt. Four of these have been published as an article on the ICT & Computing in Education website, but this is the full list.

The more I learn, the more I realise I don't know

I think this must be true of any sphere of knowledge. The more you know, the more nuances you see, the more you become aware of the alternatives.

I experienced the same thing when I studied Economics at school. When I was 18 I thought of myself as a budding economist. By the time I'd finished my degree in Economics I thought of myself as someone who didn't know all that much in the total scheme of things.

Getting back to assessing Computing and ICT, when I was at the Qualifications and Curriculum Authority we would spend whole days looking at examples of assessment questions and examples of pupils' work, arguing back and forth about whether this particular question was "valid", or whether that particular sample of work really proved that the student knew her stuff.

It isn't straightforward.

Don't believe simplistic solutions

I like simple solutions, but not simplistic ones. A simple (partial) solution might be to say you're going to give pupils a short baseline test at the start of each new topic. That's an eminently sensible thing to do, and with the right tool and the right approach, it shouldn't take long.

A simplistic solution is usually introduced with the phrase "All you have to do is...". In my opinion, any sentence starting like that presages a simplistic "solution" that is not a solution at all.

Consolidation is not progress

I'm certainly in favour of checking that students have really understood a concept, such as conditionality. But I've sometimes seen instructions that tell you that if a student gets the right answer three times to the same sort of problem, they have achieved more than if they get it right only once or twice.

No they haven't.

All that proves is that either they have consolidated their understanding, knowledge or skills, or they have learnt how to answer that sort of question. To evaluate progress, you have to set different types of problem, to come at it from several angles.

As I used to say when we had Levels, achieving a Level 3 ten times doesn't make you a Level 30, it makes you a Level 3.

Groupthink can lead you astray

It's easy to assume that when everyone is saying or doing the same thing, then they must be right -- the "wisdom of crowds" argument. But as Crispin Weston points out in his article, It's the technology, stupid!, the book of that name is somewhat disparaging about the so-called "wisdom" of crowds.

Let's put it this way. Since many of the approaches to assessing Computing, while useful, have effectively reinvented Levels, you have nothing to lose and everything to gain by thinking through all the issues for yourself. That is far better than assuming everyone else is right, or buying an assessment product without knowing how it works (which I referred to as outsourcing assessment to an algorithm).

It’s hard, and fascinating

Assessment is a fascinating area to explore. There are so many potential pitfalls. For example, is this test valid? Is there even such a thing as validity? (Dylan Wiliam, in a conference on assessment, suggested that validity is not an intrinsic quality of a test: what matters is the purpose for which you use the test. I think this is obvious once you see it stated, but we tend to become lazy in our use of language. We use shorthand terms, and then forget that they were intended to be shorthand.)

Collaboration is essential

Collaboration doesn’t prevent groupthink, but if you collaborate with a diverse range of people, it makes it less likely. That’s what I like to think, anyway.

You need to know how all-in-one solutions work

If you buy a package that has curriculum materials and assessment items built in, and those assessment items can be used to tell you how your students are doing, then you need to know how it’s arriving at its conclusions.

You will probably not be able to find out the fine detail because of intellectual property issues, but you should know how it works broadly speaking.

For example, does it arbitrarily assign some sort of level or grade depending on how a student does on a test? If so, is the test fit for purpose?

Or does it track what the student is doing on-screen, and then try to decide how competent the student is? If so, how does it do that, in general terms? Does it sound like it makes intellectual sense?

More on this in the article outsourcing assessment to an algorithm.

You must be able to exercise professional judgement

Whatever a test tells you about a student, you need to be able to override the result to take into account other factors. For example, I’ve known students to flounder on a test because they could see how more than one answer could be correct, depending on what assumptions you made. That’s obviously a poorly constructed test, so the student shouldn’t be penalised for it.

Senior leaders are wedded to numbers…

For various reasons in England — I can’t really speak for other countries — senior leadership teams insist on numbers. Never mind the fact that officially we no longer have to report Levels. As far as many schools are concerned, Levels are alive and well.

…So you need to accommodate them

If you work in a school where you have to produce numbers, e.g. Levels, you need a means of converting your preferred assessment approach into numbers. There's no point in refusing to do so, because you'll probably end up being formally disciplined. Nor is there any point in pointing out that we're encouraged not to use Levels because of their drawbacks: the senior leadership will probably agree with you, but that won't change anything. You are just going to have to go with the flow.

I’ve looked at how to do so in this article: How to convert your assessment system to Levels or Grades.
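The article linked above covers the detail; as a rough illustration (the bands, thresholds and Levels here are entirely made up for the example), the conversion can be as simple as a lookup table from your preferred descriptive bands to the numbers the school system expects:

```python
# Hypothetical mapping from descriptive assessment bands to numeric Levels.
# The band names and Level values are illustrative only.
BAND_TO_LEVEL = {
    "emerging": 3,
    "developing": 4,
    "secure": 5,
    "exceeding": 6,
}

def to_level(band: str) -> int:
    """Convert a descriptive band into a numeric Level for reporting."""
    try:
        return BAND_TO_LEVEL[band.lower()]
    except KeyError:
        raise ValueError(f"Unknown band: {band!r}")

print(to_level("secure"))  # prints 5
```

The point is not the particular mapping, but that the conversion is mechanical and kept separate from your day-to-day assessment, so you can report numbers without letting them drive your teaching.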

Your approach needs to be compatible with the school's system

Even if your particular approach is brilliant, it has to tie in with the school’s system. I think this is easier if everyone in the school is using the same program. Where things become a bit difficult is where widely different approaches are used, and/or different programs.

If you’re in this predicament, I think you will need to do one of the following:

• Drop your approach altogether.
• Find a way of converting the results given by your approach into the form that the school as a whole is using.
• Use your approach on a day-to-day basis, but the school's approach when you have to submit results to the school's database.

Your approach should be based on a theory

Any system of assessment should have a theoretical underpinning, so you can justify why you use it, and can modify it easily when circumstances change.

I hope you find these points useful.

An earlier version of this article was published in Digital Education, a free newsletter. Look on this page for details: Newsletters.

Advert: My “assessing Computing” courses

The challenges of assessment are not insurmountable

If you would like some guidance in getting on top of assessing ICT and Computing, I'll be running two courses in March 2017. However, if you can't wait that long, why not contact me with a view to running a bespoke course for your school or a group of schools? You can see what other people have said about my courses, including a bespoke one I did for a cluster of schools in Guernsey, here: Course Testimonials. If a whole day sounds like too much, I'm happy to run a workshop or give a talk at your next conference or training day.

In Assessment, Using and Teaching Computing & ICT Tags assessing ICT, assessing Computing, assessment

(c) Terry Freedman All Rights Reserved