It’s no secret that wearable technology, such as smartwatches and fitness trackers, is increasingly becoming a key part of our everyday lives. However, as with any trend, cybercriminals are hot on its heels, ready to exploit any vulnerabilities they find. In October 2021, the European Commission announced it was laying down new legal requirements for wireless devices to ensure that consumer privacy and personal data are protected. However, those rules will only apply to new devices, and it will be three years before they come into force. So, what do we need to know? In our recent episode of SureCloud Live: Cyber Threat Briefing, Senior Cybersecurity Consultant Hugh Raynor and Risk Advisory Senior Director Craig Moores told us more about the current security threats posed by wearable devices.
Our session began by looking at big players in the space like Apple and Fitbit. These wearable devices are generally quite good in terms of security, as they can pair with a smartphone or computer, which handles the encryption and data-security elements. However, there are also cheaper devices available from smaller, lesser-known brands that operate independently. These are the more concerning devices, as their ability to handle encryption is almost non-existent due to their small processing chips. Your data has to leave the device because it needs to be ingested somewhere else for you to view it, and that often happens in the ether, where we have no control over it.
What kind of data are we talking about? Many devices can count our steps, monitor calories burned, track heart rates, etc., and wearable devices also often hold personal and demographic information. But that’s not all. Hugh and Craig discussed how most of these devices have a microphone that is always listening, right on your wrist. There could be real consequences if these devices were compromised: not just an invasion of privacy, but also identity theft. There’s probably enough information in your day-to-day conversations for a threat actor to use for impersonation and account takeover.
And it isn’t just watches or fitness trackers. There are medical devices that are also wearable technologies, such as glucose sensors and pacemakers. These are essential for people with medical conditions, but they are almost always designed with their primary function in mind, meaning security can be left as an afterthought. Again, since the device is so small that it just doesn’t have the processing power to handle modern encryption standards, it would be a real problem if a hacker were to take control, especially if the device is inside your body.
Another problem is that, with limited on-device storage, wearable tech must be in constant communication with other devices. Hugh and Craig said this is a big concern, especially when it comes to keeping them up to date. The majority of devices are built with a transmission capability, but that’s it; all they know how to do is send out your data. This makes it very difficult to deliver new firmware, patches or fixes to them. And the data, generally, is not encrypted. Often it is encoded using a proprietary encoding algorithm, but such encoding can be reverse-engineered. And even if the data is encoded, that still does nothing to protect against replay attacks.
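To see why encoding is not a substitute for encryption, consider a minimal, hypothetical sketch. The fixed-key XOR scheme below is an invented stand-in for the kind of proprietary obfuscation a budget wearable might apply; the key name, packet contents and field layout are all assumptions for illustration, not any real device’s protocol.

```python
# Hypothetical example: a fixed-key XOR "encoding" standing in for a
# proprietary scheme a low-end wearable might use. This is obfuscation,
# not encryption.

KEY = b"\x5a\x3c\x7e"  # hypothetical fixed key baked into the firmware

def encode(payload: bytes) -> bytes:
    """XOR each byte with a repeating key. XOR is its own inverse, so
    calling encode() again on the output recovers the original bytes."""
    return bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(payload))

# The device transmits a heart-rate reading over the air:
reading = b"HR:072"
packet = encode(reading)

# 1. Reverse engineering: one predictable field (here the "HR:" prefix)
#    leaks the key, because plaintext XOR ciphertext == key stream.
recovered_key = bytes(p ^ c for p, c in zip(b"HR:", packet))
assert recovered_key == KEY

# 2. Replay attack: an attacker who captured `packet` doesn't even need
#    to decode it. With no nonce, counter or timestamp in the protocol,
#    retransmitting the same bytes later looks like a fresh reading.
replayed = packet
assert encode(replayed) == reading  # receiver decodes it as valid data
```

The broader point is that defeating replay requires freshness (a nonce or counter) and authentication (e.g. a keyed MAC), both of which demand more processing and protocol design than these small chips typically get.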
Hugh and Craig explained that not many people realize who their data is shared with, especially when it comes to third-party integrations. If you have a sleep tracker, for example, there might be several apps that you initially installed on your phone but whose data carries across to your watch as well. The key concern here is the number of interdependencies. Often the sharing is mentioned as part of a master terms of service agreement, which people might not have read or don’t remember.
They then discussed how many apps, like Strava, are built as social applications. So, it’s not just one organization you share your data with; the whole point is that you put it out there for anyone to access. A few years ago, there was an incident with Strava where you could spot the outline of a US military base in the Middle East because the troops all had wearable technology. That is pretty valuable to someone who wants to do some harm. In general, we would like to think that we’re still in control of our own data, but ultimately that data has to go to lots of other places in order for the service to work as intended.
It would seem there are a lot of potential threats and vulnerabilities in wearable technology, and while these are mostly personal devices, there is still cause for concern among organizations. For example, take a smartwatch, which mirrors what your phone does. If paired with a corporate phone, it expands the organization’s attack surface and opens the door to a potential compromise that the organization is not in control of. Then there are smart glasses, such as Google Glass, or those being developed for the ‘Metaverse’ by Meta. These glasses create privacy implications as they are constantly recording information: what if an employee were to bring something like that into the corporate environment? When we think about bring-your-own-device (BYOD) in the workplace, it should no longer just be a conversation about phones. As wearable technology becomes more and more prevalent, it will be another piece of the puzzle that has the potential to become a very large target for cybercriminals.