Fighting the Rogue Toaster Army: Why Secure Coding in Embedded Systems is Our Defensive Edge

There are plenty of pop culture references to rogue AI and robots, and appliances turning on their human masters. It is the stuff of science fiction, fun, and fantasy, but with IoT and connected devices becoming more prevalent in our homes, we need more discussion around cybersecurity and safety.

Software is all around us, and it's very easy to forget just how much we're relying on lines of code to do all those clever things that provide us so much innovation and convenience.

Much like web-based software, APIs, and mobile apps, embedded systems can harbor vulnerable code that an attacker can exploit once it is uncovered.

While it's unlikely that an army of toasters is coming to enslave the human race as the result of a cyberattack (although the Tesla Bot is a bit concerning), malicious cyber events are still possible. Some of our cars, planes, and medical devices also rely on intricate embedded systems code to perform key tasks, and the prospect of these objects being compromised is potentially life-threatening.

Much like every other type of software out there, developers are among the first to get their hands on the code, right at the beginning of the creation phase. And much like every other type of software, this can be the breeding ground for insidious, common vulnerabilities that could go undetected before the product goes live.

Developers are not security experts, nor should any company expect them to play that role, but they can be equipped with a far stronger arsenal to tackle the kind of threats that are relevant to them. Embedded systems - typically written in C and C++ - will only become more common as our tech needs continue to grow and change, and specialized, hands-on security training for the developers working in this environment is an essential defensive strategy against cyberattacks.

Exploding air fryers, wayward vehicles… are we in real danger?

While there are some standards and regulations around secure development best practices to keep us safe, we need to make far more precise, meaningful strides towards all types of software security. It might seem far-fetched to think of the harm that could be caused by someone hacking into an air fryer, but it has happened in the form of a remote code execution attack (allowing the threat actor to raise the temperature to dangerous levels), as have vulnerabilities leading to vehicle takeovers.

Vehicles are especially complex, with multiple embedded systems onboard, each taking care of micro functions: everything from automatic wipers to engine and braking capabilities. Intertwined with an ever-increasing stack of communication technologies like Wi-Fi, Bluetooth, and GPS, the connected vehicle represents a complex digital infrastructure that is exposed to multiple attack vectors. And with 76.3 million connected vehicles expected to hit roads globally by 2023, that is an enormous set of defensive foundations to lay for true safety.

MISRA is a key organization in the fight against embedded systems threats, having developed guidelines to facilitate code safety, security, portability, and reliability in the context of embedded systems. These guidelines are a north star for the standards that every company should strive for in its embedded systems projects.

However, writing and shipping code that adheres to this gold standard takes embedded systems engineers who are confident - not to mention security-aware - on the tools.
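To give a flavor of what guidelines of this kind push engineers towards, here is a minimal sketch - not a reproduction of any specific MISRA rule, and the function and buffer names are invented for illustration - contrasting a risky pattern (heap allocation on a constrained device, with a silently ignored failure path) against a statically allocated, bounds-checked alternative in the spirit of those guidelines:

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define SAMPLE_COUNT 16u

/* Risky pattern (hypothetical example): heap allocation on a
 * resource-constrained target, with a failure path that is silently
 * ignored. If malloc() returns NULL, the memcpy() dereferences it. */
void store_samples_risky(const int16_t *samples)
{
    int16_t *buf = malloc(SAMPLE_COUNT * sizeof *buf);
    memcpy(buf, samples, SAMPLE_COUNT * sizeof *buf);
    /* ... process buf ... */
    free(buf);
}

/* Closer to the spirit of the guidelines: static allocation, explicit
 * bounds checking, and a status code the caller is expected to act on. */
int store_samples_checked(const int16_t *samples, size_t count)
{
    static int16_t buf[SAMPLE_COUNT];

    if ((samples == NULL) || (count > SAMPLE_COUNT)) {
        return -1;  /* reject bad input instead of writing out of bounds */
    }
    memcpy(buf, samples, count * sizeof buf[0]);
    /* ... process buf ... */
    return 0;
}

int main(void)
{
    int16_t samples[SAMPLE_COUNT] = { 0 };
    return (store_samples_checked(samples, SAMPLE_COUNT) == 0) ? 0 : 1;
}

Static analysis tools that check guideline compliance would flag the first pattern automatically; the point of upskilling is that developers recognize and avoid it before any tool ever runs.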

Why is embedded systems security upskilling so specific?

The C and C++ programming languages are geriatric by today's standards, yet remain widely used. They form the functioning core of most embedded systems codebases, and Embedded C/C++ enjoys a shiny, modern life as part of the connected device world.

These languages have rather ancient roots - and display familiar vulnerability patterns, like injection flaws and buffer overflows - but for developers to truly succeed at mitigating security bugs in embedded systems, they must get hands-on with code that mimics the environments they work in. Generic training in general C security practices simply won't be as potent or memorable as extra time and care spent working in an Embedded C context.

With anywhere from a dozen to over one hundred embedded systems in a modern vehicle, it's imperative that developers are given precision training on what to look for, and how to fix it, right in the IDE.

Protecting embedded systems from the start is everyone's responsibility

The status quo in many organizations is that speed of development trumps security, at least when it comes to developer responsibility. Developers are rarely assessed on their ability to produce secure code; rapid development of awesome features is the marker of success. The demand for software is only going to increase, but this is a culture that has set us up for a losing battle against vulnerabilities, and the subsequent cyberattacks they allow.

If developers are not trained, that's not their fault, and it's a hole that someone in the AppSec team needs to help fill by recommending the right accessible (not to mention assessable) programs of upskilling for their entire development community. Right at the beginning of a software development project, security needs to be a top consideration, with everyone - especially developers - given what they need to play their part.

Getting hands-on with embedded systems security problems

Buffer overflows, injection flaws, and business logic bugs are all common pitfalls in embedded systems development. When buried deep in a labyrinth of microcontrollers in a single vehicle or device, they can spell disaster from a security perspective.

Buffer overflow is especially prevalent, and if you want to take a deep dive into how it helped compromise that air fryer we talked about before (allowing remote code execution), check out this report on CVE-2020-28592.
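To make the coding pattern concrete before you jump into the challenge, here is a minimal sketch - not the actual CVE-2020-28592 code, and the function and buffer names are hypothetical - of a handler that copies an attacker-controlled payload into a fixed-size stack buffer without checking its length, alongside a bounds-checked fix:

#include <stdint.h>
#include <string.h>

#define CMD_BUF_SIZE 32u

/* Vulnerable (hypothetical example): trusts the length supplied with the
 * incoming packet. If len exceeds CMD_BUF_SIZE, memcpy() writes past the
 * end of cmd and corrupts the stack - a classic buffer overflow. */
void handle_command_unsafe(const uint8_t *payload, uint16_t len)
{
    char cmd[CMD_BUF_SIZE];

    memcpy(cmd, payload, len);       /* no bounds check */
    cmd[CMD_BUF_SIZE - 1u] = '\0';
    /* ... parse and act on cmd ... */
}

/* Safer: reject oversized or missing input before copying, and never
 * copy more bytes than the destination buffer can hold. */
int handle_command_safe(const uint8_t *payload, uint16_t len)
{
    char cmd[CMD_BUF_SIZE];

    if ((payload == NULL) || (len >= CMD_BUF_SIZE)) {
        return -1;                   /* drop the malformed packet */
    }
    memcpy(cmd, payload, len);
    cmd[len] = '\0';
    /* ... parse and act on cmd ... */
    return 0;
}

int main(void)
{
    const uint8_t oversized[64] = { 0 };

    /* handle_command_unsafe(oversized, sizeof oversized); would smash the stack */
    return (handle_command_safe(oversized, (uint16_t)sizeof oversized) == -1) ? 0 : 1;
}

The fix is a single comparison; spotting the missing comparison deep inside real firmware is the hard part, which is exactly what the challenge below asks you to do.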

Now, it's time to get hands-on with a buffer overflow vulnerability, in real embedded C/C++ code. Play this challenge to see if you can locate, identify, and fix the poor coding patterns that lead to this insidious bug:

[PLAY NOW]

How did you do? Visit www.securecodewarrior.com for precision, effective training on embedded systems security.

