
ICT & Computing in Education

Articles on education technology and related topics

12 things I've learnt about assessing Computing and ICT

September 30, 2016

I've been thinking about, doing, and running courses in the art and science of assessing what kids know, understand and can do when it comes to Computing and ICT for a long time. Here are 12 things I've learnt. Four of these have been published as an article on the ICT & Computing in Education website, but this is the full list.

The more I learn, the more I realise I don't know

I think this must be true of any sphere of knowledge. The more you know, the more nuances you see, the more you become aware of the alternatives.

I experienced the same thing when I studied Economics at school. When I was 18 I thought of myself as a budding economist. By the time I'd finished my degree in Economics I thought of myself as someone who didn't know all that much in the total scheme of things.

Getting back to assessing Computing and ICT, when I was at the Qualifications and Curriculum Authority we would spend whole days looking at examples of assessment questions and examples of pupils' work, arguing back and forth about whether this particular question was "valid", or whether that particular sample of work really proved that the student knew her stuff.

It isn't straightforward.

Don't believe simplistic solutions

I like simple solutions, but not simplistic ones. A simple (partial) solution might be to say you're going to give pupils a short baseline test at the start of each new topic. That's an eminently sensible thing to do, and with the right tool and the right approach, it shouldn't take long.

A simplistic solution is usually introduced with the phrase "All you have to do is...". In my opinion, any sentence starting like that presages a simplistic "solution" that is not a solution at all.

Consolidation is not progress

I'm certainly in favour of checking that students have really understood a concept, such as conditionality. But I've sometimes seen instructions that tell you that if a student gets the right answer three times to the same sort of problem, they have achieved more than if they get it right only once or twice.

No they haven't.

All that proves is that either they have consolidated their understanding, knowledge or skills, or they have learnt how to answer that sort of question. To evaluate progress, you have to set different types of problem, to come at it from several angles.

As I used to say when we had Levels, achieving a Level 3 ten times doesn't make you a Level 30, it makes you a Level 3.

Groupthink can lead you astray

It's easy to assume that when everyone is saying or doing the same thing, then they must be right -- the "wisdom of crowds" argument. But as Crispin Weston points out in his article, It's the technology, stupid!, the book of that name is somewhat disparaging about the so-called "wisdom" of crowds.

Let's put it this way. Since many of the approaches to assessing Computing, while useful, have effectively reinvented Levels, you have nothing to lose and everything to gain by thinking through all the issues for yourself. That is far better than assuming everyone else is right, or buying an assessment product without knowing how it works (which I referred to as outsourcing assessment to an algorithm).

It’s hard, and fascinating

Assessment is a fascinating area to explore. There are so many potential pitfalls. For example, is this test valid? Is there even such a thing as validity? (Dylan Wiliam, in a conference on assessment, suggested that validity is not an intrinsic quality of a test: what matters is the purpose for which you use the test. I think this is obvious once you see it stated, but we tend to become lazy in our use of language. We use shorthand terms, and then forget that they were intended to be shorthand.)

Collaboration is essential

Collaboration doesn’t prevent groupthink, but if you collaborate with a diverse range of people, it makes it less likely. That’s what I like to think, anyway.

You need to know how all-in-one solutions work

If you buy a package that has curriculum materials and assessment items built in, and those assessment items can be used to tell you how your students are doing, then you need to know how it’s arriving at its conclusions.

You will probably not be able to find out the fine detail because of intellectual property issues, but you should know how it works broadly speaking.

For example, does it arbitrarily assign some sort of level or grade depending on how a student does on a test? If so, is the test fit for purpose?

Or does it track what the student is doing on-screen, and then try to decide how competent the student is? If so, how does it do that, in general terms? Does it sound like it makes intellectual sense?

More on this in the article outsourcing assessment to an algorithm.

You must be able to exercise professional judgement

Whatever a test tells you about a student, you need to be able to override the result to take into account other factors. For example, I’ve known students to flounder on a test because they could see how more than one answer could be correct, depending on what assumptions you made. That’s obviously a poorly constructed test, so the student shouldn’t be penalised for it.

Senior leaders are wedded to numbers…

For various reasons in England — I can’t really speak for other countries — senior leadership teams insist on numbers. Never mind the fact that officially we no longer have to report Levels. As far as many schools are concerned, Levels are alive and well.

…So you need to accommodate them

If you work in a school where you have to produce numbers, e.g. Levels, you need to have a means of converting your preferred assessment approach into numbers. There is no point in refusing to do so, because you’ll probably end up being formally disciplined. Nor is there any point in pointing out that we’re encouraged not to use Levels because of their drawbacks: the senior leadership will probably agree with you, but that won’t change anything. You are just going to have to go with the flow.

I’ve looked at how to do so in this article: How to convert your assessment system to Levels or Grades.
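One way of producing such a conversion is simply a lookup table from your own descriptive judgements to the numbers the school's system expects. As a minimal sketch, assuming hypothetical band names and level values (these are illustrative, not any official scale):

```python
# Hypothetical sketch: mapping descriptive assessment bands to the
# numeric levels a school's reporting system expects. The band names
# and the numbers assigned to them are assumptions for illustration.
BAND_TO_LEVEL = {
    "emerging": 3,
    "developing": 4,
    "secure": 5,
    "exceeding": 6,
}

def to_level(band: str) -> int:
    """Convert a descriptive judgement into a reportable number."""
    try:
        return BAND_TO_LEVEL[band.lower()]
    except KeyError:
        raise ValueError(f"Unknown band: {band!r}")

print(to_level("Secure"))  # 5
```

The point of keeping the mapping in one explicit table is that you can carry on using your preferred descriptive approach day to day, and change the conversion in a single place if the school's reporting requirements change.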

Your approach needs to be compatible with the school's system

Even if your particular approach is brilliant, it has to tie in with the school’s system. I think this is easier if everyone in the school is using the same program. Things become difficult when widely different approaches, and/or different programs, are in use.

If you’re in this predicament, I think you will need to do one of the following:

• Drop your approach altogether.

• Find a way of converting the results given by your approach into the form that the school as a whole is using.

• Use your approach on a day-to-day basis, but the school’s approach when you have to submit results to the school’s database.

Your approach should be based on a theory

Any system of assessment should have a theoretical underpinning, so you can justify why you use it, and can modify it easily when circumstances change.

I hope you find these points useful.

An earlier version of this article was published in Digital Education, a free newsletter. Look on this page for details: Newsletters.

Advert: My “assessing Computing” courses

The challenges of assessment are not insurmountable

If you would like some guidance in getting on top of assessing ICT and Computing, I’ll be running two courses in March 2017. However, if you cannot wait that long then why not contact me with a view to running a bespoke course for your school or a group of schools? You can see what other people have said about my courses, including a bespoke one I did for a cluster of schools in Guernsey, here: Course Testimonials. If a whole day sounds like too much, then I'm happy to run a workshop or give a talk at your next conference or training day.


(c) Terry Freedman All Rights Reserved