Automation, deskilling and safety

Submitted by AWL on 28 July, 2015 - 5:16 Author: Bruce Robinson

Martin Thomas’ criticisms of my review of Nicholas Carr’s book on automation (Solidarity 370) focus on two related issues: the deskilling effects of automation and my rejection of the full automation of safety-critical systems through e.g. driverless cars or pilotless planes. On deskilling, I think there is one misunderstanding and one difference.

Firstly, I do not “want to have all traditional skills kept in general use” indefinitely. I am not proposing we return to handloom weaving or horse-drawn carriages. But I doubt that Martin, as a maths teacher, believes his students should not know their times tables because they can now use a calculator. The point I was making, with Carr, is that automated devices which are easy to use lead quickly to the atrophy or loss of skills that may matter in everyday life, in certain necessarily skilled tasks or, at the most extreme point, in fundamental human functions and capacities.

Martin argues that new skills will replace old ones and that we should therefore not worry too much. This ignores the possibility that the shift may destroy valuable knowledge and experience that matter in particular labour processes.

One example is the shift in the control of industrial production processes, such as the production of chemicals, that results from computerisation. In place of hands-on physical control and direct knowledge of the process, the operator of a computerised plant sits in front of a screen on which the process is represented symbolically through software. The operator can interact only at one or more removes from the real process, and only by means of the tools the system designer has provided for manipulating the representation on the screen. The skills associated with direct interaction decline, and the operator becomes totally dependent on the software.

This can have dangerous effects on humans’ ability to control the automated task (and also on job satisfaction). It may be, as Martin writes, that most aviation deaths in the US take place in private, non-automated planes, but Carr describes several incidents in which hundreds of people died because pilots either misinterpreted the messages from the system controlling the plane or had lost the ability to respond to them correctly. As systems become more complex and integrated, they become less transparent to their users, especially in emergencies. The human may either not be allowed to intervene or lack the skill to do so.

Even the most highly automated tasks require scope for human intervention or override. Martin is quick to point out the deficiencies and inconsistencies of humans — pilots can be “inattentive, distracted, unwell, drunk, sleepy” — but does not list any of the limitations, dangers or deficiencies of software-based automated systems. Software is always a model of a partial slice of the world, and there will always be contingencies it has no way of dealing with. Software testing can never be 100% effective once programs reach any size. Software also has to assume certain ways of working that may not correspond to how people actually do their jobs. In large systems, neither the designers nor the programmers may have a detailed understanding of how the system as a whole works.
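To make the point about testing concrete, here is a back-of-envelope sketch (my own illustration, not an example from Carr): even a trivial function that adds two 32-bit numbers has 2^64 possible inputs, so no test suite can try them all; testing can only ever sample the input space.

```python
def add_with_overflow_check(a: int, b: int) -> int:
    """A deliberately simple function: add two numbers, assuming a
    32-bit signed result range (a hypothetical example)."""
    result = a + b
    if result > 2**31 - 1 or result < -(2**31):
        raise OverflowError("result outside 32-bit signed range")
    return result

# Counting the input space: every (a, b) pair of 32-bit values.
total_inputs = (2**32) ** 2            # 2**64 possible input pairs
tests_per_second = 10**9               # an optimistic testing rate
seconds_per_year = 60 * 60 * 24 * 365
years_needed = total_inputs / (tests_per_second * seconds_per_year)
print(f"Exhaustive testing would take ~{years_needed:.0f} years")
# Prints ~585 years: even for this toy function, tests can only
# sample inputs, and real systems are vastly more complex.
```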

These problems can be mitigated, but not overcome. In many cases we might accept a reasonable statistical likelihood that something will work; is that true of pilotless planes, though? There are choices to be made here, and socialists should not adopt the default position that machines are superior to humans (or vice versa). These choices include how humans and machines should interact, and an assessment of which types of skill and knowledge should be maintained, and how they should be modified, in automation.

Where there is little skill to start with and the worker already functions almost as an extension of the machine, there may be little worth preserving, and automation can be as extensive as is safe and feasible. But sometimes it is necessary to take conscious measures to ensure that certain skills do remain, ranging from teaching times tables to ensuring that the surgeons who, Martin assures us, will retain their manual skills despite the superior performance of robots actually receive practical training.

We need to assess individual technologies critically, uncovering the social, economic and technical choices and aims they embody, before deciding whether they should be accepted, rejected or modified to be more compatible with our aims. As I said in my review, “another automation is possible”; it is also necessary.
