Robust Design: Quality vs. Security

Monday, June 14th, 2010 by Robert Cravotta

I had a conversation recently with Nat Hillary, a field application engineer at LDRA Technologies, about examples of software fault tolerance, quality, and security. Our conversation identified many questions and paths that I would like to research further. One such path relates to how software systems that are not fault tolerant may present vulnerabilities that attackers can use to compromise the system. A system’s vulnerability or resistance to software security exploits is largely a specification, design, and implementation quality issue. However, the fact that secure systems require high quality does not mean that high quality systems are also secure systems, because quality and security are measured against different metrics.


Determining a system’s quality involves measuring and ensuring that each component, separately and together, fits or behaves within some specified range of tolerance. The focus is on whether the system can perform its function within acceptable limits rather than on the complete elimination of all variability. How tightly or loosely a component’s permitted tolerance is set balances the cost and difficulty of manufacturing identical components, and the cumulative impact of allowing variability among the components, against the system’s ability to perform its intended function. For example, many software systems ship with some number of known minor implementation defects (bugs) because the remaining bugs do not prevent the system from operating within tolerances during the expected and likely use scenarios. The software in this case is identical from unit to unit, but the variability of the other components in the system can introduce differences in the system’s behavior. I will talk about an exhibit at this year’s ESC San Jose that demonstrated this variability in a future post.
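To make the tolerance idea concrete, here is a rough sketch in C of the kind of acceptance check I am describing. The nominal value, the tolerance band, and the function name are made up for illustration; the point is only that quality asks whether a measured value falls inside a permitted range, not whether it is exactly nominal.

#include <stdbool.h>
#include <stdio.h>

/* Illustrative only: a component "passes" if its measured value falls
   within the specified tolerance band around the nominal value. The
   nominal value and band below are made-up numbers. */
#define NOMINAL_MV    3300   /* nominal supply rail, in millivolts */
#define TOLERANCE_MV   165   /* roughly +/- 5% permitted variation */

static bool within_tolerance(int measured_mv)
{
    return (measured_mv >= NOMINAL_MV - TOLERANCE_MV) &&
           (measured_mv <= NOMINAL_MV + TOLERANCE_MV);
}

int main(void)
{
    int sample_mv = 3410;    /* hypothetical measurement from one unit */

    printf("%d mV is %s tolerance\n", sample_mv,
           within_tolerance(sample_mv) ? "within" : "outside");
    return 0;
}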


In contrast, a system’s security depends on protecting its vulnerabilities from being exercised under extraordinary conditions. A single vulnerability, under the right extraordinary conditions, can compromise the system’s proper operation. However, much as with quality, a system’s security does not depend on a perfect implementation. If the system can isolate and contain its vulnerabilities, it can still be good enough to operate in the real world (see the sketch after the list below). The 2008 report “Enhancing the Development Life Cycle to Produce Secure Software” identifies that secure software exhibits:


1. Dependability (Correct and Predictable Execution): Justifiable confidence can be attained that software, when executed, functions only as intended;

2. Trustworthiness: No exploitable vulnerabilities or malicious logic exist in the software, either intentionally or unintentionally inserted;

3. Resilience (and Survivability): If compromised, damage to the software will be minimized, and it will recover quickly to an acceptable level of operating capacity.
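As a rough illustration of the resilience property, the C sketch below shows a controller that reacts to a detected fault by shedding function rather than failing outright. The mode names and the fault threshold are hypothetical; the point is that isolating and containing a fault lets the system keep operating at a reduced but acceptable capacity.

#include <stdio.h>

/* Hypothetical containment sketch: when a subsystem reports faults,
   the controller drops to a reduced but safe level of operation
   instead of failing outright. All names and thresholds are
   illustrative. */
enum mode { MODE_FULL, MODE_DEGRADED, MODE_SAFE_STOP };

static enum mode handle_fault(enum mode current, int fault_count)
{
    if (fault_count == 0)
        return current;        /* no fault: keep the current capacity   */
    if (fault_count < 3)
        return MODE_DEGRADED;  /* contain it: shed the faulty function  */
    return MODE_SAFE_STOP;     /* persistent fault: fail to a safe stop */
}

int main(void)
{
    enum mode m = MODE_FULL;

    m = handle_fault(m, 1);    /* first fault detected */
    printf("operating mode after fault: %d\n", m);
    return 0;
}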


An example of a software system vulnerability that has a fault-tolerant solution is the buffer overflow. A buffer overflow exploit takes advantage of functions that do not perform proper bounds checking; the Computer Security Technology Planning Study first publicly documented the technique in 1972. Static analysis tools can help developers avoid this type of vulnerability by identifying array overflows and underflows, as well as places where signed and unsigned data types are improperly used. Using this fault-tolerant approach helps a software system exhibit the three secure software properties listed above.
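To show the kind of defect a static analysis tool flags, here is a minimal C sketch of an overflow-prone copy next to a bounds-checked rewrite. The buffer size and function names are hypothetical; only the pattern matters.

#include <stddef.h>
#include <string.h>

#define CMD_BUF_LEN 16         /* illustrative buffer size */

/* Unsafe: strcpy() copies until the terminating NUL regardless of the
   destination size, so an 'input' longer than CMD_BUF_LEN overruns
   'cmd'. This is the pattern a static analyzer flags. */
void parse_command_unsafe(const char *input)
{
    char cmd[CMD_BUF_LEN];
    strcpy(cmd, input);        /* no bounds check */
    /* ... act on cmd ... */
}

/* Safer: check the length first and refuse input that does not fit,
   so a malformed or malicious request cannot overrun the buffer. */
int parse_command_checked(const char *input)
{
    char cmd[CMD_BUF_LEN];
    size_t len = strlen(input);

    if (len >= CMD_BUF_LEN)
        return -1;             /* contain the fault: reject the input */

    memcpy(cmd, input, len + 1);   /* copy includes the terminating NUL */
    /* ... act on cmd ... */
    return 0;
}

int main(void)
{
    return parse_command_checked("status") == 0 ? 0 : 1;
}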

[Editor's Note: This was originally posted on the Embedded Master]
