Risk criteria

False alarms in early warning systems

The risk of an accidental nuclear war stems primarily from early warning systems. These rely on sensors and highly complex computer systems and networks to detect and evaluate possible attacks by nuclear missiles. Such systems can produce false alarms with very different causes (e.g. hardware faults, software errors, operating errors or misinterpreted sensor signals). In peacetime and in phases of political détente, the risk that the evaluation of an alarm message leads to a nuclear strike is very low: in case of doubt, the alarm is assumed to be false.
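
To make this concrete, here is a minimal Bayesian sketch (all numbers are invented for illustration, not real system data) of why an alarm in peacetime is, in case of doubt, rationally judged to be false: with a tiny prior probability of a real attack, even a reliable warning system yields a low posterior probability that the alarm is genuine.

    # Minimal sketch with assumed numbers: P(attack | alarm) via Bayes' rule.
    def posterior_attack(prior, hit_rate, false_alarm_rate):
        # P(alarm) = P(alarm | attack) P(attack) + P(alarm | no attack) P(no attack)
        p_alarm = hit_rate * prior + false_alarm_rate * (1 - prior)
        return hit_rate * prior / p_alarm

    # Assumed peacetime values: attack prior 1e-6, detection rate 99 %,
    # false alarm rate 0.1 % per evaluation.
    print(posterior_attack(1e-6, 0.99, 0.001))  # ~0.001: the alarm is almost certainly false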

Political crises – several events

The situation can change drastically in political crises, possibly accompanied by mutual threats, or when further events occur in temporal proximity to a false alarm. In evaluating such an alarm, analysts search for causes, i.e. they attempt to find causal relationships between the events. If such connections are found and appear logically plausible, there is a great danger that they are accepted, i.e. that the alarm message is taken to be valid, even if independent events merely coincided by chance.
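
In the Bayesian sketch above, a crisis mainly changes the prior probability assigned to a real attack; with the same (assumed) sensor quality, an identical alarm then becomes far more credible:

    # Same toy model, redefined here so the snippet runs on its own.
    def posterior_attack(prior, hit_rate=0.99, false_alarm_rate=0.001):
        p_alarm = hit_rate * prior + false_alarm_rate * (1 - prior)
        return hit_rate * prior / p_alarm

    for prior in (1e-6, 1e-3, 1e-1):  # assumed: peacetime, tension, acute crisis
        print(f"prior {prior:g}: P(attack | alarm) = {posterior_attack(prior):.3f}")
    # prior 1e-06 -> 0.001, prior 0.001 -> ~0.498, prior 0.1 -> ~0.991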

Alarm chains

The risks can be aggravated by alerting chains. An alarm signal from an early warning system can cause armed forces to be put on alert. The adversary detects such activities, which in a conflict situation can raise their own alert readiness, and this in turn feeds back into the first side's assessment of the situation. If a false alarm about attacking nuclear missiles occurs in a crisis situation with mutual threats and events classified as hostile, a chain reaction with ever higher alert levels can be set in motion within minutes and spiral out of control.
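
The feedback described here can be illustrated with a toy model (the dynamics and the gain value are pure assumptions, not a description of any real command system): each side raises its alert level in proportion to the other side's visible alert activity, so a single false alarm is mutually amplified step by step.

    # Toy model of an alerting chain: mutual reinforcement of alert levels.
    # One update step stands for one evaluation cycle ("minutes").
    def escalate(gain, steps, trigger=1.0):
        a, b = trigger, 0.0                    # side A reacts first to a false alarm
        for minute in range(1, steps + 1):
            a, b = a + gain * b, b + gain * a  # each side reacts to the other
            print(f"step {minute}: alert A = {a:.1f}, alert B = {b:.1f}")

    escalate(gain=0.5, steps=6)
    # Because each side adds the other's activity to its own level, any
    # positive gain makes the levels grow without bound: the chain reaction
    # "gets out of control".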

Error-free software is not feasible

Errors can never be ruled out in a complex system and can be caused by humans as well as by computers. For complex applications, it is technically impossible to produce error-free software. Even if a piece of software is proven correct using program verification techniques, such proofs are only possible relative to a formal specification, and the specification itself may contain errors. An important method for reducing errors in software development is testing, but testing an early warning system under real conditions is hardly possible.
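
A small invented example of how a verification proof can be hollow when the specification itself is wrong: the function below provably satisfies the written rule, but the written rule is not the intended one.

    # Invented specification: "an attack warning is valid if more than 3
    # independent sensors confirm it." The code implements this exactly.
    def alarm_valid(confirmations: int, sensors_online: int) -> bool:
        return confirmations > 3   # provably matches the written spec

    # If the *intended* rule was "a majority of the sensors currently online",
    # then 4 spurious confirmations out of 24 working sensors still pass the
    # proof -- the error lies in the specification, not in the code.
    print(alarm_valid(confirmations=4, sensors_online=24))  # True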

Rare errors are particularly dangerous

Even if early warning systems could be improved so that false alarms occur only very rarely, security would not increase. Alarm messages that occur only rarely are unusual and difficult to interpret, which significantly increases the risk that they are taken seriously, i.e. treated as valid. This applies in particular in crisis situations, or when other events occur that can be related to the alarm.
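
The Bayesian sketch from above (again with assumed numbers) reproduces this effect: the rarer false alarms become, the more diagnostic a single alarm is, so treating it as valid becomes the statistically "correct" reaction.

    # Same assumed model: lowering the false alarm rate raises P(attack | alarm).
    def posterior_attack(prior, hit_rate, false_alarm_rate):
        p_alarm = hit_rate * prior + false_alarm_rate * (1 - prior)
        return hit_rate * prior / p_alarm

    prior = 1e-5  # assumed background probability of a real attack
    for far in (1e-2, 1e-4, 1e-6):  # ever "better" early warning systems
        print(f"false alarm rate {far:g}: "
              f"P(attack | alarm) = {posterior_attack(prior, 0.99, far):.3f}")
    # 1e-02 -> 0.001 (dismissed as false), 1e-06 -> ~0.908 (taken as valid)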

Not predictable – sudden event

An “accidental nuclear war” is not directly predictable. As with other accidents in technical systems, there is no advance warning. Like a “normal accident”, an accidental nuclear war can break out suddenly, within a few minutes, and afterwards no correction is possible. After normal accidents, measures are usually taken to avoid such risks in the future. After a nuclear exchange, such a future will hardly exist. With the risk of nuclear war, we cannot wait for a first “accident” in the form of an “accidental nuclear war” before taking action to reduce these risks.

Doomsday Clock

Since 1947, atomic scientists have been setting a “Doomsday Clock” to alert the public to the current risk of nuclear war. It currently stands at 100 seconds to midnight, closer to midnight than ever before.

A detailed description of these interrelationships can be found here: http://www.fwes.info/fwes-21-1-en.pdf