Last December, Amazon notified millions of its Ring video surveillance doorbell customers that hackers had gained access, through the company’s security devices, to live camera feeds inside their homes.
One Mississippi family reported hackers had used a Ring security camera to spy on their 8-year-old daughter in her bedroom and speak to the child via the device.
The risk of having your home security camera hacked is fairly high, according to Ian Harris, UC Irvine professor of computer science. He sees evidence of it all the time.
Harris heads the recently launched Cyber Test Range at CALIT2. The lab was created to simulate different types of cyberattacks on IoT-connected devices – the billions of smart appliances and gadgets, from home security cameras and internet routers to health care trackers and smart lightbulbs, that connect to each other and the internet.
The lab sets up and launches attacks against a variety of systems. “We launch attacks over the network. We launch physical attacks. We open the device and see if we can steal data. We see if it can be reverse-engineered,” says Harris. “The goal is to find out how secure the device is in general, and how resistant it is to attacks.”
The Cyber Test Range is one of three research projects overseen by UCI’s Cybersecurity Policy & Research Institute (CPRI) and its executive director, Bryan Cunningham. The effort to advance the institute’s investigations into cybersecurity vulnerabilities is funded in part by a $1.4 million gift from the Herman P. & Sophia Taubman Foundation.
Security vulnerabilities in IoT devices can yield a host of dangers. Hacking into a home security camera allows cybercriminals to see inside a house and know if someone is home. Most cameras also have microphones, allowing hackers to eavesdrop on or record conversations. Temperature sensors and smart thermostats can help criminals determine when someone is out of town. And because these devices are connected to the internet, hackers can move laterally to hijack other connected devices in a home.
Vulnerable IoT devices are also at risk of being exploited as botnets. Cybercriminals can infect these devices with malicious software that lets them hijack the processing power of the small devices. Thousands of infected devices, known collectively as a botnet, can produce computing power to rival that of a supercomputer; they can be used to route traffic and launch cyberattacks designed to overwhelm servers and disrupt the internet.
The problem with security today isn’t that you can’t make things secure. “You can make things secure, but it’s going to cost you, and nobody wants to spend the money,” Harris says.
One reason is perceived value, he says. “If the cost of a device increases 20 percent because it’s more secure, people probably won’t see the value nearly as much as when the cost is reflected in a brand new, cool feature.” Harris also believes programmers hold some of the blame. “There are people who write the code and design these systems but they’re not trained in how to write secure code. Programmers learn how to get their code to work; security is optional,” he says.
The work Harris, Cunningham and the Cyber Test Range are doing is multidisciplinary, encompassing law, policy and government standards alongside security. CPRI’s Cunningham, an expert in cybersecurity law and policy, is a former White House lawyer who served as deputy legal adviser to then-National Security Adviser Condoleezza Rice. He also served six years in the Clinton administration as a senior CIA officer and federal prosecutor.
Cunningham works closely with UCI law professor Shauhin Talesh, who explores the role insurance companies play in helping organizations comply with privacy laws and deal with cyber theft. The U.S. doesn’t have a law that regulates security and privacy across the entire economy. “In the absence of this, insurance companies are setting the standards for online cybersecurity,” Cunningham says. “They’re becoming the de facto cyber regulators.”
Another key player in the lab’s effort is Scott Jordan, UCI professor of computer science, who leads research on government regulation and standards and works to develop regulations for laws governing connected-device security and privacy.
California is the first state in the country with an IoT device security law. Senate Bill 327 requires companies that make IoT devices to incorporate minimum security features into every device. CPRI is helping to write draft regulations for the new law. “Much of how the law will be implemented will depend on these regulations,” Cunningham says.
The Cyber Test Range also develops prototype systems based on standards recommendations to evaluate strength and security, as well as other properties that could be affected. “If the standard makes a prototype completely secure but it doubles the cost or slows performance, there’s no way a company is going to adopt that,” Cunningham says.
The lab expects to take advantage of CENIC, a high-performance, high-bandwidth network that links California’s research universities. The Cyber Test Range currently conducts local attacks within the CALIT2 Building, but using the network would allow for more realistic remote attacks. “To do that, we need an isolatable network because we’re running real attacks; we can’t let them get out into the internet,” says Harris.
Another research problem: even when companies do build security into their devices, they’re not necessarily communicating and coordinating with other companies building devices that will connect to their product. One example is home device hubs.
Consumers may have several smart TVs from different manufacturers that aren’t configured to talk to each other. This can cause user frustration and raises the possibility that security features will be disabled or never turned on. “The Cyber Test Range will be working to find ways to create new technologies that will allow devices to be secure even when they aren’t originally designed to be compatible,” Cunningham says.
By improving security and regulation standards, he foresees opportunities to greatly enhance consumer apps. For example, many vehicles are already equipped with an onboard safety feature that can sense when there has been an accident and call for medical help. But what if you’ve had an automobile accident and the interface on your Apple Watch could transmit your vital statistics to the paramedics before they arrive?
It’s unlikely that Apple and Chevrolet are talking to each other about how to make this happen, Cunningham says. “It’s impossible for an automobile manufacturer to predict every type of app being built that can connect to their car.” The solution would be uniform standards for all medical and health technology, not unlike specific standards now required for Bluetooth devices.
“There are 10,000 things we could be working on,” he adds. “It’s a blessing and a curse for cybersecurity. In some ways the biggest challenge we have is figuring out what to prioritize and focus on, but it’s a good problem to have as long as we’re being funded.”
– Sharon Henry