By the outbreak of war in 1939, Nazi Germany had thoroughly penetrated Royal Navy encryption systems, and the Navy's change of its main codebook caused barely a hiccough to German Navy cryptanalysts. In fact it took until 1943 to secure RN and allied naval encryption, and during that time the German Navy destroyed millions of tons of allied shipping and came close to cutting off the flow of food and military supplies to the UK from North America. Just as Bletchley
Park exploited poor security by the German military in their use of encryption,
the German Navy's B-Dienst was able to exploit the poor security of the codes
used by the Admiralty. How could this have happened? A note written years later
by Captain D A "Willie" Wilson RN identifies areas where the
Admiralty got it wrong. I think the issues it raises are as relevant today as
they were between the wars, because although security today addresses
technological challenges that our forebears couldn't have imagined, the
foundations of security and the mindset of the security practitioner are
fundamentally the same.
The
Admiralty got things so badly wrong because there was no single coordinating body
managing Comsec in the UK, and no recognisable centre or standard of Comsec
expertise. GC&CS had a responsibility to provide advice on Comsec to civil
ministries, and could be approached by the service ministries if they so wished.
Within GC&CS, just as Comint was the responsibility of Denniston, the Head,
Comsec was the responsibility of Travis, the Deputy Head. However, Travis was also responsible for collection management; for the reporting, indexing, and distribution of intelligence reports; for the GC&CS Registry; for liaison with SIS; and, from 1938, for liaison with the service ministries. All of this was in addition to his responsibility (to 'C', interestingly, not to the Head of GC&CS) for the British Codes Section.
The fundamental problem was that nobody questioned the principle that good cryptanalysts would make good communications security advisers, when in fact their advice would be limited by the extent of their cryptanalytic skills. In a non-mechanical environment, this meant cryptanalysts saying "this is the sort of code I can't break" and assuming (a) that if they couldn't break it, nobody else could, granting that code an effective invulnerability; and (b) since cryptanalysts didn't do traffic analysis, that crypt security was the only security that mattered, and that wireless security (secure traffic management) wasn't something that needed to be considered.
1. Don't split responsibilities among different
people.
The Director of Naval Intelligence (DNI) was responsible for technical and physical security, but was dependent, on technical cypher matters, on the advice given by GC&CS. The Director of the Signal Department (DSD) was responsible for providing the means of communication and for providing the Coding Staff, but he had no responsibility for wireless security (i.e. for denying the enemy intelligence from traffic analysis). The Paymaster Director General (PDG) had responsibility for the provision and training of cypher staff, but had no responsibility for the cyphers used. In the Fleet, the Secretary to a Flag Officer was the Squadron or Fleet Cypher Officer, responsible for seeing to the proper use of cyphers afloat. Within the Admiralty, M Branch of the Secretariat was responsible for the distribution of Cyphers and Codes through the Navy, and the Secretariat provided the Admiralty cypher office, known as "War Registry".
2. Choose
your advisers carefully
Between the wars, GC&CS was required to advise the services on security, but could not mandate it. In fact Travis advised the
Admiralty, Tiltman advised the War Office, and Josh Cooper advised the Air
Ministry, but they didn't talk to each other about security, and didn't consult
any of the rest of the cryptanalytic staff in GC&CS. Their advice was,
therefore, strictly limited. Cypher security was the subject of a lecture on the Accountant Officers' Technical Course (a prerequisite for a Secretary to a Flag Officer), but by the beginning of the 1930s this was a lecture about
cryptanalysis given by Bodsworth and Knox, and was unlikely to have been
illuminating. The Long Signal Course for officers specialising in
communications had no lectures at all on communications security.
3. Know when to ask for a second opinion
The key piece of advice that the Admiralty needed was about
the security of the 'Long Subtractor', the key used to encipher messages once
encoded. In effect, this was a one-time pad which was reused. How many times could it be reused without compromising security? Theoretically, it should never be reused, but in practice, could it be? Wilson wasn't happy with the response: using a stretch of key once gave absolute security, twice almost guaranteed it, three times wasn't really dangerous, more than three times was, and using the same piece of key five or six times was positively dangerous. This was very bad advice, and
Wilson knew it, but there was no move to change adviser (nor any obvious other adviser to turn to).
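It is worth spelling out why this was so dangerous. Subtractor systems of this kind combined code groups and key digit by digit, modulo 10, without carrying, so two messages enciphered on the same stretch of key sit 'in depth': subtracting one intercept from the other cancels the key completely. A minimal sketch in Python, with groups and key invented purely for illustration:

```python
# A minimal sketch, with invented code groups and key, of why reusing
# a long-subtractor key is fatal: the cipher is digit-wise, mod-10,
# non-carrying subtraction of the key from the encoded message.

def digitwise(a: str, b: str, sign: int) -> str:
    """Combine two equal-length digit strings modulo 10, no carries."""
    return "".join(str((int(x) + sign * int(y)) % 10) for x, y in zip(a, b))

def encipher(groups, key):
    """Subtract the key stretch, group by group, from the code groups."""
    return [digitwise(g, k, -1) for g, k in zip(groups, key)]

key  = ["4821", "0937", "5566"]   # one stretch of subtractor key
msg1 = ["1234", "5678", "9012"]   # code groups of message 1
msg2 = ["2222", "5678", "3333"]   # message 2, sent on the SAME key stretch

ct1, ct2 = encipher(msg1, key), encipher(msg2, key)

# The cryptanalyst subtracts one intercept from the other: the key
# cancels out, leaving only the difference of the code groups.
diff_ct = [digitwise(a, b, -1) for a, b in zip(ct1, ct2)]
diff_pt = [digitwise(a, b, -1) for a, b in zip(msg1, msg2)]
assert diff_ct == diff_pt         # the key has vanished entirely
print(diff_ct)                    # ['9012', '0000', '6789'] -- the '0000'
                                  # betrays a code group common to both
```

Once messages are in depth, a handful of guessed high-frequency code groups is enough to start reading both of them, and the recovered key then unlocks every other message sent on the same stretch.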
4. Security doesn't come second to intelligence
The Admiralty had two billets in GC&CS for officers:
the idea was that this would be a source of officers with practical experience
in security. But everyone in GC&CS knew that the department's main job –
its important job – was intelligence production, and the officers posted in
became, for the most part, cryptanalysts. Intelligence breakthroughs are of
tremendous value: if your security is no good, it will be your adversary who
makes the intelligence breakthrough by exploiting your communications.
5. Don't do the enemy's job for him
The RN used three different basic codebooks: Tactical Code (three-letter groups, not reencrypted); Administrative Code (five-figure groups, reencrypted with a Long Subtractor); and Naval Cypher (used for operational traffic: four-figure groups, reencrypted with a Long Subtractor). This meant that anybody
intercepting a naval message immediately knew, without decryption, what sort of
message had been sent. And although different Long Subtractor tables were used
for different regions and commands, the indicator – the explanation of the
start point in the relevant table – wasn't disguised at all. The German Navy's
cryptanalysts had it easy.
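To see just how easy, here is a toy classifier in Python (the intercepts are invented) that sorts traffic by the shape of its groups alone, before any cryptanalysis has been done; the undisguised indicator then told B-Dienst exactly which stretch of key to try.

```python
# A toy illustration of how group shape alone identified the traffic
# type. Message formats follow the RN scheme described above; the
# intercepted groups themselves are invented.
import re

def classify(intercept: str) -> str:
    groups = intercept.split()
    if all(re.fullmatch(r"[A-Z]{3}", g) for g in groups):
        return "Tactical Code (three-letter groups)"
    if all(re.fullmatch(r"\d{5}", g) for g in groups):
        return "Administrative Code (five-figure groups)"
    if all(re.fullmatch(r"\d{4}", g) for g in groups):
        return "Naval Cypher (four-figure groups): operational traffic"
    return "unknown"

print(classify("4821 0937 5566 1204"))  # flagged as operational at once
print(classify("QXB LDR TMA"))          # tactical chatter
```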
6. Don't assume before you understand
In order to gauge how many of each reciphering table to
print and distribute (a job that would always be complex given the Navy's
worldwide extent), the Admiralty had to make assumptions. It assumed that the proportion of administrative to operational traffic in wartime would be around 80:20. It was wrong: the ratio was the other way round. If the Royal Navy officers
in GC&CS had been doing the job for which they had been posted in, they
would have seen this by looking at traffic levels in Italian Naval material
collected for GC&CS during the Abyssinian Crisis and the Spanish Civil War.
7. Even if it ain't broke it might still be worth
fixing
According to Wilson, by 1937, when Typex, the UK version of Enigma, came into service, Lord Mountbatten was pressing for it to be used by the Royal Navy but got nowhere, because there was no single person with the authority to bring a new encryption system into use. Lord Mountbatten
concluded that it was just too difficult, but even putting Typex on a few ships
in the North Atlantic could have made a tremendous difference. I asked the Duke
of Edinburgh about this in 2013 and he confirmed it. (That is probably the most
outrageous name drop you'll see today.)
8. Who needs what?
As a matter of routine, Commands, shore establishments and
major ships were given copies of all reciphering tables, so that anybody could
communicate securely with anybody else. But why (for example) would CinC Nore
need to communicate securely and directly with the Senior Naval Officer
Upper Yangtze? Why couldn't the few (I imagine) messages between most territorial commands be relayed through the Admiralty? Apart from the
difficulty of printing and distributing so much key material, it meant that if
a ship was thought possibly to have been captured by enemy forces, all the key material on board had to be assumed to be compromised.
9. Understand the vulnerability of your mode of
communication as well as of your crypto
As Head of a new Admiralty section, NID10/DSD10, early in the war, Wilson set up a team to look at the way the Navy communicated, to see whether the Germans might be able to derive intelligence from analysis of the traffic. This had never been done before.
10. Security versus Operability
Security and operability don't always have to be in competition: designers of systems need to understand how users communicate so that they can design for practicality alongside security; system administrators need to ensure
that the reasons for the rules they impose are understood; and users mustn't
try to subvert the rules and compromise security simply to make life easier. In
order to make Royal Navy comms practice secure against German traffic analysis,
DSD10/NID10 introduced a series of security measures without explaining them to
the signalmen who had to implement them. As a result, they were widely ignored
and subverted. When this was discovered, the signalmen asked for a senior officer to investigate the matter. Admiral Somerville (who had been in charge of wireless comms for the Mediterranean Fleet during the First World War) was asked to do this. He came down firmly on the side of security, but took the trouble to explain why to the signalmen, from whom there was no further opposition to the new measures.