Cadzow Communications Consulting Ltd


Cadzow Communications Consulting Ltd, commonly referred to as C3L, is a consulting company offering advice and development in standardisation related to security (confidentiality, integrity and availability - CIA) and privacy. When we say consult we mean working on problems and finding solutions to them. Our problem and solution space is standardisation, and within that we specialise in security for telecommunications within the ICT domain, covering privacy protection, protocol design, system analysis and system test.

Our tag line, "security needs planning", describes much of our approach. Like all things you need a goal: unless you know what you are trying to secure you cannot know whether the security you are providing is effective. This is at the root of what has begun to be termed Design for Assurance and Privacy by Design, both of which have been spearheaded by C3L. We will continue to refine these paradigms to set the standard in standards development (hence our secondary tag line, "setting the standard in making standards").

Supporting standardisation - FP7 and H2020 projects

Why support standardisation? Simply put, standards are key building blocks of modular systems. It would be difficult to plumb a house, or drive a car, if every plumber and manufacturer did things their own way. As it stands we have commonly accepted ways of doing things - standard fittings for pipes, a standard layout of car controls and so forth.

Standards don't simply appear; they take time to develop and to be adopted. In some cases this happens "de facto", i.e. industry and users simply adopt one way of doing things, it becomes the accepted norm, and through a raft of commercial and non-commercial licensing agreements a standard is set. Often "de facto" standards are set by formal and semi-formal standards bodies, where players with a common interest sit down together and agree a set of specifications that identify how parts of the system work together (the interface specifications). What distinguishes standards from product specifications is that a standard sets out what something has to achieve and not (generally) how to achieve it. Standards are therefore written to state exactly what is mandatory (using keywords such as "shall" (the EU preference) or "must" (favoured in the USA)) and what is strongly recommended (using the keyword "should").

Sometimes industry cannot, or is unwilling to, set a standard by itself, and in these cases "de jure" standards are prepared, often by the same people and the same formal and semi-formal standards development organisations as "de facto" standards. De jure standards are those required to ensure safety or to satisfy some regulation, and they are generally legally binding: to enter the market you have to comply with them. Of course, in some markets de facto standards carry similar market-entry force but are not legally enforceable. The standards market has changed in the past few decades, particularly in telecommunications: fewer and fewer of the standards produced are de jure, and increasingly they are born in a de facto environment for common self-regulated deployment.

Even the simplest of standards can absorb years of effort: attending meetings, building consensus and of course writing, testing and validating to ensure the resultant standard is fit for purpose.

Standards development has over time developed a tool-set of its own, and there is a degree of expertise in using this tool-set and applying it successfully. Most organisations do not develop that expertise, or do not retain it for long: the demands of everyday business are such that in most cases standards experts cannot be retained as only standards experts. Developing a standard is a long process and requires preparing contributions alongside critical review of other contributors' contributions. In many cases there is a delicate judgement to make in backing a commercial competitor's proposal, as the long-term strategic goal may be better served by supporting them than by pursuing the short-term tactical goal of having your own ideas and input accepted.


Summary

Good randomness that leads to high entropy, or sources of entropy that lead to true randomness, cannot be ignored. If the underlying source of randomness is weak (i.e. not really random, or random over only a very small range) then any dependent security function will be weakened. The attacker is not going to be stupid and try to break the crypto engine and the protocols if weak randomness offers an easier attack vector.

Debunking Crypto Myths - the movie plot misdirection

The media these days are full of misguided reporting on the strength of the cryptographic tools we use. I have not seen it reported that cryptography is a controlled tool, with restrictions on its use and its exportability. I have similarly not seen any realistic reporting of just how good modern cryptography is. So I'm going to offer some points of note and leave it to readers to apply some thought to the process.

There is a set of international treaties on the use of what is termed dual-use technology (i.e. technology that has merit in civil business but can be seen as a weapon in the wrong hands). The main impact is that for certain classes of use the effective key length, and access to the cryptographic algorithm in equipment, are restricted. In practice most commercial applications of cryptographic algorithms are incredibly strong. In mobile phones, where the radio interface is encrypted, it is quite simply infeasible for anyone to decrypt data captured off the radio link and recover the content. In our web browsing and e-commerce it is similarly infeasible that anyone intercepting our traffic will ever be able to decrypt the content. Quite simply, the cryptographic tools we see in use work where they need to.

There is a lot of guff written in the press about breaking encryption. I watch too much TV and too many films where the techie says "it's 128-bit encryption, it'll take me a few hours to break" and all they have is a wee laptop. Think of how big that number is and how many keys fit into 128 bits: it is 2 to the power of 128, which is 340,282,366,920,938,463,463,374,607,431,768,211,456 keys, or roughly 3.4 × 10^38. Now if you could check, say, 1000 billion keys a minute it would still take about 3.4 × 10^26 minutes to sweep the key space. That's a lot of minutes: roughly 6.5 × 10^20 years, tens of billions of times the age of the universe, so the galaxy will be long gone before you make a sizeable dent in the pile. Quite simply you cannot attack modern encryption using brute force. When you see somebody claiming it on TV remember this - it is fiction, and it moves the plot along nicely, but it ain't like that in the real world.
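
For the sceptical, the arithmetic is easy to check. A minimal sketch in Python; the trillion-keys-a-minute rate is our illustrative assumption, and is wildly generous for any laptop:

    keyspace = 2 ** 128                 # number of possible 128-bit keys
    rate = 10 ** 12                     # assumed: 1000 billion key trials per minute
    minutes = keyspace / rate           # minutes to sweep the whole key space
    years = minutes / (60 * 24 * 365)   # convert minutes to years

    print(f"{keyspace:.2e} keys")       # 3.40e+38 keys
    print(f"{minutes:.2e} minutes")     # 3.40e+26 minutes
    print(f"{years:.2e} years")         # 6.47e+20 years

Even at a billion times that trial rate the answer is still measured in hundreds of billions of years.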

There has been press speculation about "back doors" in crypto-algorithms. For the algorithms we rely on this is nonsense - they are quite simply too tested, too open and too critical to be purposefully weakened. If I have the key I can decrypt the content; if I don't, I can't. That's it. It is an old principle (Kerckhoffs's principle): leave the security to the key and the key alone.

Of course somebody can get access to the content - that's what keys are for. You lock it up with a key and you unlock it with a key. If you want to let someone into your house you give them the key; if you want someone to access your encrypted content, do likewise. Just in case you're wondering: there are no "skeleton" keys in good modern cryptography.

Some of our cryptography has a finite lifetime, though, as it depends on "hard" problems remaining hard. The work I'm doing for ETSI TC CYBER covers the impact of quantum computing on the viability of cryptography, and how to keep (at least) one step ahead of the attackers.

I'd like readers to go away with the knowledge that TV and movies are doing cryptography a huge disservice - it works, and it works well.

Secure Cryptographic Mechanisms - entropy and randomness

Cryptographic security is non-trivial and requires an understanding of a number of concepts, two of which, entropy and randomness, are strongly linked. In communications security there is often an assumption that any transmitted message will differ from all other transmitted messages; however, in some communications systems there may be very strong commonality between transmitted messages that can be exploited. A goal of cryptography is to mask the similarity between messages, commonly referred to as maximising the entropy of a transmitted message. If a message to be encrypted has low entropy, the cryptographer has to raise the entropy prior to encryption or as part of the encryption process. There are a number of examples of messages with inherently low entropy:

  • Short text messages.
  • Telematics status messages.
  • Call setup messages.

Message entropy is discussed in a number of mathematical sources, but at the root is Shannon's "A Mathematical Theory of Communication". Essentially, if the attacker knows or guesses that the message can take only a small set of values, the probability of correctly guessing bit N+1 after receiving bit N tends towards 1, whereas for a random binary alphabet the probability of a correct guess should always be 0.5. In a cryptographic context, where Alice is sending a message m to Bob in the form of a binary string, the rule of thumb is that the greater the entropy of the message m, the more guesses an attacker needs to guess m. After encryption of message m to generate ciphertext c, the entropy of c should be as high as possible.
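
To make this concrete, here is a minimal sketch in Python of empirical (per-byte) Shannon entropy; the function name and sample messages are ours, purely illustrative:

    import math
    import os
    from collections import Counter

    def shannon_entropy(message: bytes) -> float:
        """Empirical Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # A repetitive telematics-style status message has low entropy...
    print(shannon_entropy(b"STATUS:OK;" * 20))   # ~2.9 bits per byte
    # ...while random bytes approach the 8 bits-per-byte maximum.
    print(shannon_entropy(os.urandom(4096)))     # ~7.99 bits per byte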

The rule of thumb for randomness is that if an attacker gets access to all the historic random elements (all N values), this must give zero information towards correctly guessing the value of the (N+1)th element. If this condition is met then the element can be considered to have a random value, but only with respect to the previous elements. We also need to determine whether the randomness can be emulated: even if prior outputs give no greater likelihood of guessing the (N+1)th element, we have to be assured that knowledge of the context does not allow us to guess it.

The source of entropy that seeds a system's random number generator therefore has to be good, yet many sources are poor, and further are unsuited to standalone devices. For example, sources of entropy may include movement of the mouse (not available in embedded and virtual systems with no GUI), the amount of free memory (of little use in a virtualised environment where the VMs have fixed memory allocations), or CPU temperature readings (in server farms the load will be balanced and trimmed so that readings sit in a very small, optimised range, with very high load factors in many VM environments too). If our sources of entropy are random over only a small range then their value is reduced to that range - so we should not claim 128-bit security when our randomness varies over only (say) a 4-bit range, as sketched below.
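
The 4-bit case is worth spelling out. A hypothetical sketch (the device and its seeding scheme are invented for illustration) of why a small-range seed collapses security, whatever the advertised key length:

    import random

    # Hypothetical device whose PRNG is seeded from an entropy source
    # that only ever yields one of 16 values (a 4-bit range).
    def device_keystream(seed: int, length: int = 16) -> bytes:
        rng = random.Random(seed)            # deterministic, NOT cryptographic
        return bytes(rng.randrange(256) for _ in range(length))

    captured = device_keystream(seed=11)     # output observed by the attacker

    # The attacker simply enumerates all 16 candidate seeds.
    for candidate in range(16):
        if device_keystream(candidate) == captured:
            print("seed recovered:", candidate)   # -> seed recovered: 11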

Thus the ability of a system to generate random numbers is central to most forms of modern cryptography. Typically cryptographic suites, such as OpenSSL, rely on operating-system-provided methods for sourcing random numbers; examples include /dev/(u)random on *nix and rand_s() on Windows. None of these methods provides truly random numbers - that would require a physical source of randomness. Instead they are seeded by sources of entropy within the system and subsequently updated periodically. Typically when an operating system shuts down a pseudo-random seed is written to disk, and this seed is used as an additional source of entropy when the operating system starts back up again, ensuring that the machine does not boot in the same entropy state as it has booted in previously. Systems providing insufficient randomness have been shown to compromise the integrity of security suites running on them; examples include the breaking of Netscape's implementation of SSL and the ability to predict Java session IDs.
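
The practical advice that falls out of this is to take randomness from the operating system rather than rolling your own. A minimal sketch in Python, whose os.urandom and secrets module draw on exactly these kinds of OS sources:

    import os
    import secrets

    key = os.urandom(16)                 # 128 bits from the OS CSPRNG
    session_id = secrets.token_hex(16)   # same source, hex-encoded, e.g. for session IDs

    print(key.hex())
    print(session_id)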

Analysing PRNGs

Analysing PRNGs is a difficult task. The Linux PRNG, for example, is part of the kernel, so any modifications to enable introspection require that the kernel be re-compiled (such as those described in "Not-so-Random Numbers in Virtualized Linux and the Whirlwind RNG"). This re-compilation may affect kernel-provided entropy sources in such a way that the experiment no longer represents what would happen on an unmodified kernel. Gutterman, Pinkas and Reinman get around this problem by using a user-mode simulator of the Linux RNG for their experimentation; however, producing such a simulator is no mean feat. They also comment that although the source code for the RNG is available, it is poorly documented, and what documentation there is is not up to date.
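
Short of instrumenting the kernel, a PRNG's output can at least be screened statistically. Here is a minimal sketch of the frequency ("monobit") test in the spirit of NIST SP 800-22; passing it says very little, but failing it is damning:

    import math
    import os

    def monobit_p_value(data: bytes) -> float:
        """Frequency (monobit) test: are ones and zeros roughly balanced?"""
        n = len(data) * 8
        ones = sum(bin(byte).count("1") for byte in data)
        s_obs = abs(2 * ones - n) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2))   # p < 0.01 suggests bias

    print(monobit_p_value(os.urandom(125000)))   # usually well above 0.01
    print(monobit_p_value(b"\x00" * 125000))     # 0.0: constant output fails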

Straightforward assignment as consultant

It is very simple to engage us on a per diem basis. The rate is broadly negotiable but is priced to be competitive in the market for security and privacy analysts and standards experts. Our preference is for well-defined roles, even if a role has an uncertain outcome: in standards (see the sponsorship discussion) not everything submitted will be approved, and in some cases the politics will be against us, while in security we cannot offer 100% guarantees of success. To kick off the process simply send an email; we will reply and begin the dialogue. Just like a lawyer, though, we're not keen on free consulting during the negotiation phase, so the dialogue may appear abrupt, but this is a business contract we're trying to develop and we don't want to appear too easy to extract free advice from.

Training delivery

Notwithstanding any of the caveats regarding standards and development, we feel (that is, Scott feels) it is important to make sure there are experts in the field, and we aim to ensure that training in technologies such as TETRA, ITS and cyber-security is available to those wishing to develop their expertise. In some cases we will simply direct you to other providers if we feel their material is better suited to you, but we will also offer a set of pre-defined courses on set dates and at set locations. These will run at or near normal commercial rates (we're not going to undercut the competition, but rather offer the expertise you need at a competitive rate).

In addition to any training courses developed in-house we will partner with appropriate parties. For now this means WrayCastle covering TETRA, although we do plan to develop refinements to their course and perhaps (with permission) to develop our own in parallel. In the i-locate project we are similarly providing input to the training modules for privacy and security, and these form part of the overall project offering. In this latter case the courses are recorded webinars.

Training development

There will be times when off-the-shelf courses from ourselves and the wider market simply fail to meet your requirements. In such cases we will work with you and other partners to develop a bespoke, or near-bespoke, course. This will be more expensive than an existing course, but the additional cost is offset by the tailored fit.

Sponsorship as engagement

In times gone by, artists and artisans often had their work supported by simple patronage. In sport and the arts today we see that role being taken by sponsors: when a brand sponsors the league cup or a Formula 1 team it doesn't mean they are directly marketing the football team or the car, but they win by association with a winner. Recognising that C3L is best of breed, and that association with the best is a good thing, I'm going to take the relatively unusual step of explicitly listing sponsorship, or patronage, as a means of maintaining C3L at the forefront of standards development and security work. The suggestions listed below are for four levels of sponsorship (Bronze, Silver, Gold and Platinum), with increasingly interactive relationships between C3L and our patrons or sponsors as the levels go up.

Sponsorship, or patronage, is not a trivial undertaking, and the levels suggested are not insignificant. As noted above, however, even the simplest of standards can absorb years of effort in meetings, consensus building, writing, testing and validating. In light of this, every sponsor will be acknowledged on this website and on the documentation produced by the company with the assistance of their sponsorship, whether contributions to standards or the company's marketing material. In all cases the sponsor's wishes regarding visibility will be respected. Forms of sponsor visibility may include:

  • your name or company logo incorporated in any publicity material produced
  • joint involvement in standards tutorials, seminars, conferences and so forth

The following list suggests some levels of support for a patronage or sponsorship opportunity.

  • BRONZE £12,500 per annum excluding VAT
  • SILVER £25,000 per annum excluding VAT
  • GOLD £50,000 per annum excluding VAT
  • PLATINUM £100,000 per annum excluding VAT

Contact details
