Designing hardware for data privacy

By Richard Quinnell

Ensuring privacy of electronic data requires data security, but a secure design does not necessarily assure data privacy. Developers must consider the two together.

As Internet-connected devices become more prevalent, they are fueling an increasing risk to privacy. Fortunately, there are now many off-the-shelf chips and services available to help designs resist intrusion and prevent unauthorized access to private data. The key lies in identifying the specific threats that need mitigation.

Broadly stated, privacy entails keeping designated information inaccessible without authorization from the information’s owner. Privacy involves security; information cannot be kept private without also keeping it secure. But they are not the same thing. Security for information also involves preventing malicious alteration or destruction. In that sense, information can be kept secure without being kept private.

Editor’s Note: Our technological capabilities are developing faster than the laws that dictate how they can be used. There are also many gray areas that require a lot of deliberation. This article is part of an AspenCore Special Project that examines the technology, what we can do with it, and what we should do with it. Other articles cover the crossroads of cybersecurity and privacy, facial recognition, security in smart meters, data privacy laws, and more.

There are many reasons to keep information private. One of the foremost is the host of governmental regulations and requirements worldwide for maintaining the privacy of information that can be associated with an individual person, known as personally identifiable information (PII). Much of what constitutes PII is obvious – things like a person’s name, address, account numbers, location, health status, images, and activities are all PII that most would recognize as meriting privacy.

But other types of information that might need to be kept secure are not as obviously private, because the information in and of itself is not personally identifiable. Paired with a second piece of information, however, that first piece can become identifiable, thus requiring privacy. For example, a connected thermostat’s temperature setting may not seem like it needs privacy protection. Somewhere a thermostat is set to 55 degrees. So, what? Pair that setting with information about the thermostat’s current location, however, and it suddenly becomes personal. Someone knowing that a specific house has its thermostat set to 55 degrees may allow them to infer that the owners are on vacation and the house is relatively safe to burgle.

In electronic devices the information, or data, that may need to be kept secure is substantial. Both data at rest, i.e., in memory or storage, and data in motion, passing through the network or along wires and circuits, need protection. If that protection involves encryption, which it usually does, then the encryption keys must also be protected. And in addition to the personal data and keys present or moving about, designers may need to ensure that system software and firmware are safe from tampering, lest a “bad actor” alter system behavior to make it disclose information that was to be kept private.
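One common building block for detecting firmware tampering is a keyed integrity tag. The sketch below, in Python for illustration, shows the idea using HMAC-SHA256; the key name and images here are hypothetical, and in a real device the key would live in a hardware-protected key store (a secure element or on-chip key vault), never in plain firmware:

```python
import hmac
import hashlib

# Hypothetical device-unique key; in practice this would be held in
# hardware-protected storage, not embedded in code like this.
DEVICE_KEY = b"example-key-held-in-secure-element"

def firmware_tag(image: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over a firmware image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes) -> bool:
    """Accept the image only if its tag matches; a mismatch
    indicates possible tampering."""
    return hmac.compare_digest(firmware_tag(image), tag)

image = b"\x00\x01\x02\x03"   # stand-in for a firmware blob
tag = firmware_tag(image)
```

At boot, the device recomputes the tag over the stored image and refuses to run code whose tag does not match. This detects alteration but is only as strong as the protection around the key itself.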

There are also a substantial, and growing, number of ways in which electronic data security can be compromised. Remote attackers can eavesdrop on network communications. A “man in the middle” can intercept messages in transit and pretend to be the intended recipient, passing along messages between the legitimate parties to conceal the interception. A remote attacker also can send messages that trigger system weaknesses, insert malware during provisioning or over-the-air updates, or exploit other code vulnerabilities.

Figure 1 There are many forms of attack that can compromise data security, requiring many different countermeasures. (Image source: ARM)

If the attacker has physical access, or even just proximity, to a system then additional attack avenues open, both intrusive and non-intrusive. Probing of signal buses can reveal private information as it flows through the system. Monitoring power lines or EMI radiation can provide information that allows recovery of encryption keys. Altering the content of or even replacing code storage devices is also possible with physical access, as is stripping away packaging to expose die for probing and analysis.
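Timing is one such non-intrusive side channel: even a naive secret-comparison routine can leak information through how long it takes to fail. The contrast below is a minimal sketch (in Python, with illustrative function names) of the standard countermeasure, a constant-time comparison:

```python
def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns at the FIRST mismatching byte, so the time taken
    # reveals how many leading bytes of a guess are correct --
    # an attacker can recover a secret one byte at a time.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # Examines every byte regardless of where mismatches occur,
    # so running time no longer depends on the secret's contents.
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0
```

The same principle motivates hardware countermeasures against power and EMI analysis: make the observable behavior independent of the secret being processed.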

Physical attacks on data security can also occur at times other than after a device is deployed. Bad actors can copy encryption keys, or even clone whole systems, during manufacturing and assembly, for instance. A cloned device can then operate in a system as though it belongs there, bypassing nearly all forms of security with “inside” activity. And once a device is retired from service and discarded, someone can acquire it and extract information at their leisure.

No single method provides security against the myriad possible attacks, and implementing every available countermeasure can be prohibitively expensive. Attack opportunities and risk also vary by application, so developers must choose carefully which methods to implement. The place to start is before detailed design begins, by creating a threat model. The model should consider not only the device itself, but the systems connected to it. It should also examine the device’s normal operational setting, as well as its entire life cycle.

In developing the threat model, there are several steps developers should take:

  1. Identify the assets (data) they need to protect. For privacy this will include any PII present or in motion through the device. It also must include encryption keys if used, and may need to include system firmware, identity codes, message metadata, and more.
  2. Identify the types of attack these assets might be subject to, estimate their likelihood, and evaluate a successful attack’s impact. Be sure to include the device’s whole lifecycle, including manufacturing, deployment, and disposal. This will help identify which threats are serious enough to require mitigation.
  3. Determine the countermeasures that will be required and their implementation cost in terms of design time, performance, and bill of materials.
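The three steps above amount to building a table of assets, attacks, and judged risk. A minimal sketch of such a record, with entirely illustrative entries and ratings, might look like this in Python:

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """One row of a threat model; all values here are illustrative."""
    asset: str           # step 1: what needs protecting
    attack: str          # step 2: how it could be compromised
    likelihood: str      # e.g. "low" / "medium" / "high"
    impact: str          # consequence of a successful attack
    countermeasure: str  # step 3: candidate mitigation, if any
    mitigate: bool       # judged serious enough to address?

model = [
    Threat("encryption keys", "power-analysis key recovery",
           "medium", "high", "side-channel-resistant crypto core", True),
    Threat("thermostat setting + location", "network eavesdropping",
           "high", "medium", "encrypt all device links", True),
    Threat("firmware", "malware inserted via OTA update",
           "medium", "high", "authenticated update images", True),
    Threat("discarded device", "data extraction after disposal",
           "low", "low", "none (risk accepted)", False),
]

# Step 3 is applied only to threats judged worth mitigating.
to_mitigate = [t for t in model if t.mitigate]
```

The value of the exercise is less the data structure than the discipline: every asset gets an explicit decision, including the deliberate choice to accept some risks rather than pay for their countermeasures.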

The threat model thus developed will help guide the hardware and software design attributes required for protecting data privacy. Fortunately, the semiconductor industry has been developing numerous devices and services that address most threats, and it offers developers support in identifying and applying the right countermeasures.

[Continue reading on EDN US: Data security hardware and services]

Rich Quinnell is an engineer, writer, and Global Managing Editor for the AspenCore Network.
