When I was appointed ICT Coordinator at a large special needs school serving the Caerphilly Borough in the summer of 2011, I drew up an action plan to update ICT provision in the school. I decided to start with a bottom-up approach, to enable access for those pupils who had no, or very limited, access to the curriculum. This group comprised, in the main, pupils who were classed as having profound and multiple learning difficulties (PMLD) or severe learning difficulties (SLD) and were working below National Curriculum levels at around P3 to P6. It also included students with specific physical disabilities as a result of conditions such as Rett syndrome or cerebral palsy, and pupils with sensory processing issues who were prone to breaking conventional equipment.
With the needs of these pupils in mind, the school researched and trialled gesture-based, non-touch technologies, also now termed “natural user interfaces”. You may be unfamiliar with the term but you use these interfaces every day if you have a smartphone, tablet or advanced games console.
The school is now using an eye-gaze system that enables pupils to control a computer with their eye movements, an interactive floor projection system, a set of tablets and software that uses a motion sensing device.
The school has also set up a professional learning community (PLC) with other special and mainstream schools in the South Wales area to look at how best to use tablet technology with pupils with SEN. We meet regularly to share ideas and good practice. We are currently also assessing and evaluating video evidence to show progression and increased levels of engagement, wellbeing and task completion.
In an SEN setting, tablets should not be thought of as gaming devices, mobile internet browsers or poor relations to PCs. They are important pieces of equipment that can do many things other devices simply cannot do. Tablets are portable, small and light. This means that they can be used in the classroom but also in therapy rooms, sensory rooms, corridors, playgrounds, sensory gardens, and assemblies, and they can be transported easily into the community, to respite provision and into homes. A PC sits on a desk but a tablet can be used on laps, beds, beanbags, propped up on the floor or in tents. With mounts and stands, it can be put into any position around the pupil who is using it.
We have traditionally accessed PC programs via keyboards, mice, touch screens, interactive whiteboards and switches. These can be very inclusive devices but they can also exclude pupils. Screens and whiteboards are almost always vertical and fixed to a wall or desk. Switches, although very accessible, are limited to cause and effect actions, and keyboards and mice (even enlarged keyboards and roller balls) are obviously not suited to many pupils with SLD or PMLD.
In addition to their mobility, it is the variety of input and output methods available that sets tablets apart in the classroom and makes them an important device in their own right. Tablets respond to pressure, motion and the number of fingers (or a nose, elbow, wrist or cheek) used in touching them. As with switches, pupils with any controllable body movements can input information into a tablet. Importantly, though, unlike a switch, the input is not limited to “off” or “on”; pupils can move from side to side and up and down and thereby create a different response each time. A switch also usually causes an effect on a computer or screen that is in a different place to the switch itself, meaning that the pupil has to concentrate on two points – the switch and the screen. The tablet is the switch and its effect in one; the visuals and sound emanate from where the hand, fingers or face are on the device, giving a more immediate and rewarding cause and effect response.
Computers run programs but tablets run apps (short for “applications”). There is a huge range of apps available and the trick is to get the right ones for you. Luckily there are a few good guides on the web.
Visual apps respond to touch to create an endless array of visual effects, such as fireworks, where a touch on the screen creates one explosion, a longer touch creates a larger explosion and a swipe of a finger creates multiple explosions that follow the finger trace. There are apps that replicate water and produce ripples when touched, fluid apps that mimic multi-coloured, sparkling gloop, and particle apps where thousands of lines or dots are attracted to a touch of the pupil’s fingers. These apps can provide visual magic, especially in a darkened room and, given the mobility of the tablet, they can open up whole new interactive sensory experiences for individual pupils or small groups. The touch element also means that pupils can pop balloons or bubbles on the screen, make a duck quack, a horse neigh, a favourite song play or a cartoon character move.
It doesn’t stop there, as touching the screen can cause sounds as well as visual effects. The tablet can become a piano keyboard, guitar strings, a zither, drums or a panpipe set, using a swipe of the finger to create the notes. Software programmers have also used the technology to create new types of musical interaction, where the user touches different parts of the screen to create different notes, pitches or tempos, or set in motion expanding discs that create notes when they touch each other.
You can touch the screen to produce musical raindrops, create repetitive beats and place your own strings to pluck and swipe. Tablets also have a microphone built in, so they can be used to record sounds, speech and singing. Apps can capture sounds and vocalisations that the pupils make and instantly repeat them, perhaps in a comical voice, or instantly remix the pupil’s short spoken phrase to make a whole song. The pupil’s voice can also change visual elements on the screen, by making a face talk or by affecting an on-screen graphic. This gives instant feedback and is a great motivator for the pupils’ vocalisations and emerging speech.
Some tablets also react to shaking, rotating, tilting or other movement in space. This gives a new level of control for programs; balls on screen can be made to noisily bounce off the edges of the screen when the tablet is tilted, or particles can behave like water as the device is shifted around. Some simple programs use this effect to change the screen colour when the device is tilted or rotated, so even simple gross body movements can be converted into a visual or auditory effect.
So, apart from touch control, sound interaction, tilting, rotating, creating music, cause and effect activities and light effects, what else can the tablet do? Well, as the technology gets better, so do the ideas; programmers are realising that now you don’t even need to touch the device to get a response. One app uses the tablet’s inbuilt camera to judge how far away a part of the body is, so when this body part moves the pitch and tone of a musical note alters. Another app uses the camera to track movement and creates shiny coloured balls around the user, so they can watch themselves sparkling on the screen.
Our PLC’s work with tablets has taken, quite literally, a very “hands-on” approach, focusing on trying them out with pupils in creative ways to see what they can do. One teenage boy with Angelman syndrome (working at P4 level) normally engages with objects for very short periods and is usually more interested in the switch than the response it is creating. One of his favourite objects is a mirror, which he uses to look at himself and others. We put a tablet in a protective cover and used the camera mode on it to serve as a mirror to engage him. This idea was then quickly expanded so that when he put the “mirror” to his face it would produce sounds and visual effects on the screen, right in front of his eyes. We also used the depth-camera sound-making program so that he started to move his face away from the device and notice the changing sound responses that he was creating. He is now learning to interact in different ways with his “mirror” as it has, to all intents and purposes, started to interact back with him.
An older teenage pupil with ASD has serious sensory processing issues; he requires and gives a lot of deep pressure, is often agitated and on the move and will hit and crush equipment. Switches and touchscreens only interest him for very short periods of time. With a tablet, though, he can take it with him, hit at the screen as much as he likes (in its protective cover) and engage with it where and when he wants, whether sitting, standing or lying down. He has now shown that he can choose the app he wants and will interact with it using the touch screen for longer periods. He also has the opportunity to self-regulate the sensory input he is getting; for example, he can move it closer to his ears if he wants more audio feedback.
Another boy has severe cerebral palsy and can use a switch mounted by his head or elbows to engage simple cause and effect experiences, like playing music tracks or a recording of his mother singing to him. Now, a tablet can also be mounted by his head and he can use the small range of movement that he has to move his head over the surface of the screen in different directions. Instead of just having a “play” option, his movements now create different effects each time he interacts with musical programs. In a recent occupational therapy session, he refused to interact with the tablet screen at all until his favourite app was put on for him, then he spent ten minutes creating lovely Chinese zither sounds.
Of course, there are so many other ways in which tablets can be used in schools – literacy and numeracy development and general curriculum delivery, for example, are all very well suited to tablet support. What I hope I have highlighted in this article, though, is just how important tablets can be for pupils whose access to the curriculum and learning opportunities is severely limited. For many of these pupils, especially those with PMLD and SLD, tablets can provide a whole new means of interaction, offering opportunities for multi-sensory experiences and learning that were previously unavailable to them. The ability of tablet technology to engage and inspire even the most hard-to-reach pupils should not be underestimated.
Anthony Rhys has taught pupils with SLD and autism for ten years. He is the ICT Coordinator and on the senior leadership team at Trinity Fields School and Resource Centre, Wales:
The school heads up two professional learning communities (PLCs) looking at tablet technology and pupils with SLD and gesture based technology. Anthony runs wiki websites for each PLC. Information on apps, the PLCs’ work and additional resources are available at: