Safeguarding the future – how we should be teaching ICT in schools

Hands up if you studied IT, ICT or IS at high school in the UK.

Keep your hand up if you found the experience rewarding, enjoyable or useful.

Finally keep your hand up if you regularly apply what you were taught in your day to day job as an IT professional.

At this point there is one guy in the entire country with his hand still raised and he’s the lucky dude who got a job programming a turtle to draw spirals on a big sheet of paper :)

IT in UK schools sucks. In fact it sucks big time. Don’t take my word for it: the UK government’s own Secretary of State for Education says so as well, describing the ICT curriculum as “demotivating and dull” – ouch!

I think we can all agree that the current system – teaching the next generation how to use applications they either a) already know how to use or b) have no interest in using anyway – is incredibly limiting and is definitely hurting the UK’s ability to play a larger part in the digital world.

So how should we be teaching IT in schools? Well I have a couple of ideas.

1) Stop teaching “IT” as a single subject. It’s ridiculous!

At school you study geography and history as separate subjects – you don’t study “humanities” as some weird combined course. Likewise you study chemistry, physics & biology separately.

Why is this separation important to IT in schools?

Because it’s a subject with massive scope (possibly the largest scope in any teachable area) and distinct areas of overlapping relevance to other subjects.

Specifically IT can be broken down into:

  • Hardware or Computer Engineering, which has relationships with physics and the more traditional engineering disciplines and covers the building of computers, computer peripherals, component design, PCB design, and all that good, solder-based stuff that makes computers and the world at large actually work.
  • Software Development, which overlaps with maths and linguistics and covers programming from a low level up.
  • Computer Systems (e.g. networks and the general theory of computer systems), which ties into both of the above and provides background and a high-level overview of computing.

These three core areas should form the basis of any IT syllabus and are by no means exhaustive. You also have the related aspects of computer history, specific development areas (web, games, 3D, embedded, etc.), AI, computer security and many, many more.

You may think that high school is too early to start looking at this stuff, but I would point you to the “big names” in the IT world who entered university already knowing some of the so-called advanced computer science they were supposed to be taught there.

2) Get teachers who know what they’re talking about

I don’t subscribe to the “Those who can’t, teach” theory, primarily because I’ve had great teachers in the past who very definitely “could”. And whilst the English and science departments are overflowing with competent teachers with experience in their field, the IT department seems to be made up of cover teachers or disinterested mid-to-late-career teachers who took a two-day course on how to deliver the subject.

Hell, in most schools the guy (or girl) who looks after the school network is better informed than the head of the IT department!

We need to start getting young, enthusiastic people into schools to teach children about all aspects of the subject (see point 1) and we need to do it now!

Why young? Because IT itself is ever young and the people who know what’s going on are young. And yes, there are plenty of experienced workers in the industry but if you want to engage students to learn then the best way to do it is to stick youth and enthusiasm front and centre in the classroom.

3) Don’t teach kids how to use the web – teach them how to build it!

I’ll fess up here – I’m a web developer and it’s my passion so I may sound biased when I make this next statement. The web is the future of computing.

Yes, there is a heck of a lot of stuff that sits underneath the web (servers, operating systems, web application stacks, switches and much more), but the web as a development platform is one of the most exciting places in the world to work.

If software development is the act of creation – creating something from nothing more than imagination – then web software development is creation with reach. A web app can instantly be reached by pretty much the entire planet and we need to encourage the next generation to embrace this reach and start thinking in terms of global innovation from the get go.

I realise that not everyone can create the next Facebook, but pretty much the entire class will be using the current Facebook (or whatever it is that those crazy kids are using by then), and that gives teachers an amazing way in: take something students are already engaged with and show them how it’s built.
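To make that concrete, here’s the sort of thing I’m picturing for an early lesson – a toy “class feed” served by a few lines of Python using nothing but the standard library (the names and posts are invented for illustration, and Python is just one reasonable choice of teaching language). It’s trivial, but it makes the point that the sites students use every day are, underneath it all, programs sending text over the network.

```python
# A minimal "first lesson" sketch: a tiny web server serving a Facebook-style
# feed of posts, built with only Python's standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hard-coded "posts" standing in for a real database.
POSTS = [
    ("Alice", "Just built my first web page in class!"),
    ("Bob", "Turns out the web is made of text. Who knew?"),
]

class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Build an HTML page from the list of posts.
        items = "".join(f"<li><b>{name}</b>: {text}</li>" for name, text in POSTS)
        body = f"<html><body><h1>Class feed</h1><ul>{items}</ul></body></html>"
        payload = body.encode("utf-8")

        # Send a standard HTTP response: status line, headers, then the page.
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    # Visit http://localhost:8000 in a browser to see the "feed".
    HTTPServer(("localhost", 8000), FeedHandler).serve_forever()
```

From there the same lesson scales naturally: swap the hard-coded list for a form that lets the class post their own messages, and you’ve gone from “using the web” to “building it” in an afternoon.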

4) Create a flexible syllabus framework and stay out of the way of change

The biggest risk to IT in schools is pace. Traditional subjects have the luxury of spending years (sometimes decades) polishing and revising their syllabi. History doesn’t change, the classics are still the classics, and even science is watered down to a level at which all but the most fundamental theory changes can safely be ignored until the next syllabus review comes around.

By contrast, IT moves so damn quickly that, by the time a syllabus drawn up in the usual way is signed off, it is already out of date. Whatever the government and school boards come up with needs to be high level rather than specific, and it needs to avoid getting locked into any given technology.

5) Integrate with other subjects

You show me a subject in high school and I’ll show you how the direct application of software and computing would improve it. The whole world is filled with computers and computing use cases, and it’s very unlikely that, once they’ve done their time in school, students will find a job that doesn’t involve computing in one form or another. And yet in a lot of schools there’s no real attention paid to the IT aspect of a given subject, despite it being a core tenet of the national curriculum!

Let’s get computer use into every single class and show the students how the real world works.

6) Mind the (gender) gap

A tricky one to actually enforce but very important: high school covers significant phases of social and personal development and is a great place to start addressing the male domination of the IT world. By stating from the get-go that computing is relevant and useful to both sexes, we can hopefully redress the balance so that, by the time students are making university choices, they aren’t faced with the same prejudicial weight that is currently present in the industry.

7) Show that the world is more than Microsoft

Another personal peeve, but education as a whole needs to get away from the corporate world and teach more than just Microsoft technologies. Certainly from a development and networking perspective, Linux is better suited as a training platform. For the hardware stuff the OS is largely irrelevant, which adds more weight to the “alternative platform” argument.

Yes, people need to be taught how to use Microsoft Office and should be able to find their way around Windows 19 or whatever is shipping out of Redmond at the time, but they should also be taught the fundamentals of OS usage across the board to provide a rounded and applicable base for future computing.

In summary

If I were to say that “this is an exciting time to be working in IT” I’d be lying: there hasn’t been a dull time in computing for the past 75 years. But there are certain projects (like the Raspberry Pi computer project) around at the moment that give our education system excellent opportunities to re-engage with its students and avoid further lost generations.

Proper attention to computing in schools will reduce unemployment, shore up the country’s ailing economy and ensure that the UK isn’t left behind … because at the moment we’re trailing towards the back of the race, and for a country that can boast the man who beat Enigma, the inventor of the World Wide Web and the father of the computer, that is an utter travesty.