
Lifting the Fog: Cyber Security vs. Functional Safety, the CIA Triad, and Cryptography Basics for Embedded Systems
In our previous blog post, we explored the growing importance of cyber security in industrial systems and introduced the EU Cyber Resilience Act (CRA). The CRA is a game-changer for the embedded market, requiring manufacturers to ensure robust cyber security for their digital products. Today, we’ll dive deeper into key concepts like the difference between cyber security and functional safety, the CIA Triad, and the basics of cryptography, with practical examples from the embedded field.
Let’s start by clarifying an important distinction: cyber security and functional safety.
Cyber Security vs. Functional Safety
In industrial environments, cyber security and functional safety address different challenges. They are distinct but interconnected concepts.
Figure 1: Cyber security is not functional safety.
(1) https://www.itgovernance.co.uk/what-is-cybersecurity
(2) https://www.tuv.com/landingpage/en/functional-safety-meets-cybersecurity/main-navigation/functional-safety/
Cyber security protects systems against deliberate attacks, such as hacking or tampering, as well as unintentional misuse. For example, a compromised robotic system could cause safety hazards if an attacker manipulates its controls.
→ Cyber security protects the machine from humans.
Functional safety ensures that a system behaves predictably and safely, even in the event of internal failures or external environmental challenges. For example, an industrial robot stopping its movement when encountering an obstruction is a matter of functional safety.
→ Functional safety protects humans from the machine.
Although distinct, both are critical to ensuring safe and reliable industrial operations.
Figure 2: CySec vs. FuSa - Generated with DALL·E
The CIA Triad: The Pillars of Cyber Security
At the core of cyber security lies the CIA Triad: Confidentiality, Integrity, and Availability. These three principles are fundamental to protecting embedded systems and their data.
Figure 3: CIA Triad
(1) https://www.nccoe.nist.gov/publication/1800-26/VolA/index.html
Confidentiality
Confidentiality ensures that sensitive data is only accessible to authorized individuals. For example, in an embedded medical device, patient data must remain confidential, protected from unauthorized access or eavesdropping during transmission.
Integrity
Integrity ensures that data is accurate and unaltered. In an industrial controller, commands sent to machinery must remain intact. If attackers alter these commands, it could lead to equipment malfunction or damage.
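To make this tangible, here is a minimal sketch of a message-integrity check in Python, using only the standard library. It assumes the controller and the machine already share a secret key; the key value, command format, and function names are invented for illustration, and a real controller would typically implement this in C with the key held in secure storage.

```python
import hashlib
import hmac

# Hypothetical pre-shared secret; on a real device it would live in secure storage.
SHARED_KEY = b"example-shared-secret"

def tag_command(command: bytes) -> bytes:
    """Sender side: compute an HMAC-SHA256 tag over the command."""
    return hmac.new(SHARED_KEY, command, hashlib.sha256).digest()

def verify_command(command: bytes, tag: bytes) -> bool:
    """Receiver side: recompute the tag and compare in constant time."""
    expected = hmac.new(SHARED_KEY, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

command = b"SET_SPEED 1200"
tag = tag_command(command)
assert verify_command(command, tag)                 # unmodified command is accepted
assert not verify_command(b"SET_SPEED 9999", tag)   # altered command is rejected
```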
Availability
Availability ensures that systems and data are accessible when needed. For instance, in a factory, an embedded system monitoring critical machinery must always be operational to avoid downtime or safety risks.
Together, these pillars ensure that embedded systems are secure, reliable, and functional, even in the face of cyber threats.
Cryptography Basics for Embedded Systems
To support the CIA Triad, embedded systems rely heavily on cryptography. There are two main types of encryption:
Symmetric Encryption: Uses the same key for encryption and decryption. It’s fast and efficient but requires secure key sharing between parties (a short sketch follows this list).
Asymmetric Encryption: Uses a pair of mathematically related keys, one public and one private. The public key is shared openly, while the private key is kept secret.
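As a rough sketch of the symmetric case, the example below uses the Python cryptography package with AES-GCM; the algorithm, key size, and message content are illustrative assumptions rather than recommendations for any particular product.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# One shared key is used for both encryption and decryption.
shared_key = AESGCM.generate_key(bit_length=128)

# A fresh nonce per message; it may be public but must never repeat for the same key.
nonce = os.urandom(12)

# Sender: encrypt (and authenticate) the message with the shared key.
ciphertext = AESGCM(shared_key).encrypt(nonce, b"sensor reading: 42", None)

# Receiver: needs exactly the same key, which is the key-distribution challenge.
plaintext = AESGCM(shared_key).decrypt(nonce, ciphertext, None)
assert plaintext == b"sensor reading: 42"
```

The fact that both ends need the very same key is the sharing problem mentioned above; asymmetric encryption, illustrated in the next sections, is one way around it.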
Asymmetric Encryption in Embedded Systems
Asymmetric encryption is particularly important in embedded systems for secure communications and authentication. It helps address all three aspects of the CIA Triad:
Confidentiality: Public and private keys ensure that only the intended recipient can decrypt the data, keeping it confidential.
Integrity: Digital signatures, created with private keys, let the recipient verify that data hasn’t been altered in transit.
Availability: Cryptographically authenticated communication helps devices reject forged or tampered traffic, limiting the impact of some attacks that aim to disrupt service, such as certain denial-of-service (DoS) attacks.
Explaining Public and Private Keys
Think of a public key as a mailbox that anyone can drop letters (messages) into, but only the person with the private key (the mailbox key) can open and read them. Here’s how it works in practice:
- A sender encrypts a message using the recipient’s public key.
- The recipient uses their private key to decrypt the message.
This ensures that only the intended recipient can read the message, even if it’s intercepted during transmission.
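Here is a minimal sketch of that mailbox flow, using the Python cryptography package; RSA-OAEP stands in for whatever scheme a real device would actually use, and the key size and message are illustrative only.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The recipient generates the key pair; the public key is the "mailbox slot".
recipient_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public_key = recipient_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: anyone holding the public key can "drop a letter in".
ciphertext = recipient_public_key.encrypt(b"meter reading: 17.3 kWh", oaep)

# Recipient: only the private key (the "mailbox key") can open it.
message = recipient_private_key.decrypt(ciphertext, oaep)
assert message == b"meter reading: 17.3 kWh"
```

In practice, asymmetric encryption is usually used only to exchange a symmetric session key, which then encrypts the bulk of the data, but the ownership model of the keys is exactly as above.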
Real-World Example: Embedded Systems and Cryptography
In the embedded market, asymmetric encryption is commonly used in secure boot processes. Secure boot ensures that only trusted software is executed on a device. Here’s how it works:
- A device manufacturer digitally signs the software using their private key.
- During boot, the device uses the manufacturer’s public key to verify the signature.
- If the verification succeeds, the device proceeds; otherwise, it halts to prevent running malicious software.
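The sketch below mirrors those steps on a development machine, using the Python cryptography package with RSA-PSS signatures. In a real device the check runs in boot ROM or an early-stage bootloader (usually in C), with the public key provisioned in hardware, so the names, key size, and firmware bytes here are purely hypothetical.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# RSA-PSS with SHA-256 is one common signature scheme; the choice is illustrative.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Manufacturer side (offline): sign the firmware image with the private key.
manufacturer_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
firmware_image = b"\x7fELF hypothetical firmware bytes"
signature = manufacturer_key.sign(firmware_image, pss, hashes.SHA256())

# Device side (at boot): only the matching public key is stored on the device.
trusted_public_key = manufacturer_key.public_key()

def secure_boot(image: bytes, sig: bytes) -> None:
    """Verify the image signature before handing over control."""
    try:
        trusted_public_key.verify(sig, image, pss, hashes.SHA256())
    except InvalidSignature:
        raise SystemExit("Signature check failed: refusing to boot")
    print("Signature OK: handing control to the firmware")

secure_boot(firmware_image, signature)                   # verification succeeds
# secure_boot(firmware_image + b"tampered", signature)   # would refuse to boot
```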
This check protects the integrity and authenticity of the software running on the device while helping maintain operational availability.
Stay tuned – be informed – stay ahead!
Cyber security in embedded systems is about more than just protecting against threats: it’s about ensuring trust, reliability, and safety. By understanding the difference between cyber security and functional safety, applying the CIA Triad, and leveraging cryptography, manufacturers can build systems that are not only compliant with regulations like the Cyber Resilience Act but also resilient against evolving cyber threats.
In future posts, we’ll delve deeper into specific aspects of the CRA, explore advanced security features, and discuss how computer-on-modules can help streamline compliance and enhance security. Stay tuned as we continue lifting the fog around cyber security!