Improving the Security of Cryptographic Protocol Standards 
Security protocols are commonplace in computing; they form the foundations of trust for users interacting with computers, from secure banking to social media applications. While these protocols are complex and sometimes difficult to appreciate from an end user's point of view, the expectation is that they are well defined not just from an implementation perspective but from a standards point of view, with respect to threat models, security properties and security guarantees. The research paper 'Improving the security of cryptographic protocol standards' reviews several well-known protocols (WiMAX, EAP and ISO/IEC 9798) with respect to their security guarantees. Analysis of these protocols highlighted several vulnerabilities of varying severity, and it is somewhat surprising that some of them are well known. It should come as no surprise that such vulnerabilities are best dealt with before standardization is finalized: dealing with weaknesses after implementation is usually time consuming and expensive, and likely to damage reputation.
During a standardization process it seems prudent to define both security properties and threat models; once defined, these can become benchmarks for analytical tools such as the Scyther tool. By defining such properties and models, a security protocol can begin to state the guarantees it offers. Using a documented standard such as ISO/IEC JTC 1/SC 27 'Verification of cryptographic protocols', it should be no surprise that improved security protocols would result. This standard identifies a certification process for protocol designs with several levels of certification (PAL, protocol assurance level); to obtain the highest level of certification, a protocol would need to utilize formal methods with appropriate proofs underpinning its security guarantees. The foundation of such a standardized approach requires unambiguous definitions, allowing security properties and threat models to be stated precisely. Once such a foundation exists, protocols can be compared against it. Furthermore, a common foundation would allow tools such as Scyther to evolve out of the research community into industry, notwithstanding that such tools themselves require certification. Defining these foundations is no trivial undertaking; research should be directed towards a common framework that accommodates the differing perspectives of security researchers, network engineers and cryptographers. The net result of a common framework is that tools such as Scyther can evolve and improve security overall by making vulnerability discovery more efficient; for example, the vulnerability in the Needham-Schroeder public-key protocol took many years to discover.
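The Needham-Schroeder vulnerability mentioned above is Lowe's man-in-the-middle attack, in which an intruder I relays a session started by A onto B. It can be sketched symbolically; the helper functions and names below are illustrative, not drawn from any particular tool:

```python
# Symbolic sketch of Lowe's attack on the Needham-Schroeder
# public-key protocol. Encryption is modeled as a tagged tuple that
# only the key owner can open; all names here are illustrative.

def enc(pk, payload):
    """Symbolic public-key encryption."""
    return ("enc", pk, payload)

def dec(agent, msg):
    """Only the owner of the matching private key may decrypt."""
    kind, pk, payload = msg
    assert kind == "enc" and pk == ("pk", agent), "wrong key"
    return payload

# 1. A starts a session with the intruder I:  A -> I : {Na, A}pk(I)
Na = ("nonce", "A")
msg1 = enc(("pk", "I"), (Na, "A"))

# 1'. I opens it with its own key and replays it to B as if from A.
na, claimed_sender = dec("I", msg1)
msg1_forged = enc(("pk", "B"), (na, claimed_sender))

# 2. B answers "A"; the reply is encrypted for A, so I just forwards it.
b_believes_peer = dec("B", msg1_forged)[1]   # B records its peer as A
Nb = ("nonce", "B")
msg2 = enc(("pk", "A"), (na, Nb))

# 3. A, still talking to I, returns Nb encrypted for I -- leaking it.
_, nb_received = dec("A", msg2)
msg3 = enc(("pk", "I"), (nb_received,))
intruder_nb = dec("I", msg3)[0]

# 3'. I completes B's run, so B accepts the session with "A".
msg3_forged = enc(("pk", "B"), (intruder_nb,))
assert dec("B", msg3_forged)[0] == Nb

# B is fooled about its peer, and I holds B's supposedly secret nonce.
assert b_believes_peer == "A" and intruder_nb == Nb
```

The attack requires no cryptanalysis at all, which is why it lay undetected for so long and why automated state exploration of the kind Scyther performs finds it quickly.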
Relying on mature protocols does not necessarily yield a 'secure' foundation. This was evident with ISO/IEC 9798, which originated in 1991 and, despite several revisions, was only formally analyzed in 2010 using the Scyther tool; that analysis uncovered a number of vulnerabilities.
Abadi and Needham have identified several prudent best practices for cryptographic protocols; many published vulnerabilities would have been mitigated had these practices been applied. Protocols tend to be defined by well-intentioned expert individuals and groups, and are most often scrutinized through peer review. A recurring area of debate concerns 'secure' systems and why, to date, we have yet to create them. In addition to Abadi and Needham's prudent approaches being sidelined, Saltzer and Schroeder's principles often appear to be neglected.
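One of Abadi and Needham's principles, being explicit about principal names inside encrypted messages, is precisely the repair Lowe proposed for Needham-Schroeder: the responder includes its own name in message 2, and the initiator rejects the message if that name does not match its intended peer. A minimal sketch under the same symbolic conventions (helper names are illustrative):

```python
# Sketch of the "be explicit about principal names" principle applied
# to the Needham-Schroeder responder message (Lowe's fix).
# Helper names are illustrative.

def enc(pk, payload):
    """Symbolic public-key encryption."""
    return ("enc", pk, payload)

def dec(agent, msg):
    """Only the owner of the matching private key may decrypt."""
    kind, pk, payload = msg
    assert kind == "enc" and pk == ("pk", agent), "wrong key"
    return payload

def a_accepts_msg2(a_nonce, a_peer, msg2):
    """A accepts message 2 only if it echoes A's nonce AND names the
    peer A is actually running the protocol with."""
    na, nb, responder = dec("A", msg2)
    return na == a_nonce and responder == a_peer

Na, Nb = ("nonce", "A"), ("nonce", "B")

# Honest run: A talks to B, and B names itself in message 2 -- accepted.
assert a_accepts_msg2(Na, "B", enc(("pk", "A"), (Na, Nb, "B")))

# Attack run: A is talking to I, but the forwarded reply names B.
# The name mismatch exposes the man in the middle, so A aborts.
assert not a_accepts_msg2(Na, "I", enc(("pk", "A"), (Na, Nb, "B")))
```

The fix costs one extra field per message, illustrating how cheap these prudent practices usually are compared with the cost of repairing a deployed standard.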
Basin et al., Improving the Security of Cryptographic Protocol Standards, http://www.cs.ox.ac.uk/people/cas.cremers/downloads/papers/BCMRW2013-standards-draft.pdf, 2014.
Cas Cremers, The Scyther Tool.
Gavin Lowe, An Attack on the Needham-Schroeder Public-Key Authentication Protocol, 1995.
Martin Abadi and Roger Needham, Prudent Engineering Practice for Cryptographic Protocols.