Tag Archives: internetofthings

Securing the Internet of Things through Class-Action Lawsuits

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/02/securing_the_in.html

This law journal article discusses the role of class-action litigation in securing the Internet of Things.

Basically, the article postulates that (1) market realities will produce insecure IoT devices, and (2) political failures will leave that industry unregulated. Result: insecure IoT. It proposes proactive class action litigation against manufacturers of unsafe and unsecured IoT devices before those devices cause unnecessary injury or death. It’s a lot to read, but it’s an interesting take on how to secure this otherwise disastrously insecure world.

And it was inspired by my book, Click Here to Kill Everybody.

Security in 2020: Revisited

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/02/security_in_202_1.html

Ten years ago, I wrote an essay: “Security in 2020.” Well, it’s finally 2020. I think I did pretty well. Here’s what I said back then:

There’s really no such thing as security in the abstract. Security can only be defined in relation to something else. You’re secure from something or against something. In the next 10 years, the traditional definition of IT security — that it protects you from hackers, criminals, and other bad guys — will undergo a radical shift. Instead of protecting you from the bad guys, it will increasingly protect businesses and their business models from you.

Ten years ago, the big conceptual change in IT security was deperimeterization. A wordlike grouping of 18 letters with both a prefix and a suffix, it has to be the ugliest word our industry invented. The concept, though — the dissolution of the strict boundaries between the internal and external network — was both real and important.

There’s more deperimeterization today than there ever was. Customer and partner access, guest access, outsourced e-mail, VPNs; to the extent there is an organizational network boundary, it’s so full of holes that it’s sometimes easier to pretend it isn’t there. The most important change, though, is conceptual. We used to think of a network as a fortress, with the good guys on the inside and the bad guys on the outside, and walls and gates and guards to ensure that only the good guys got inside. Modern networks are more like cities, dynamic and complex entities with many different boundaries within them. The access, authorization, and trust relationships are even more complicated.

Today, two other conceptual changes matter. The first is consumerization. Another ponderous invented word, it’s the idea that consumers get the cool new gadgets first, and demand to do their work on them. Employees already have their laptops configured just the way they like them, and they don’t want another one just for getting through the corporate VPN. They’re already reading their mail on their BlackBerrys or iPads. They already have a home computer, and it’s cooler than the standard issue IT department machine. Network administrators are increasingly losing control over clients.

This trend will only increase. Consumer devices will become trendier, cheaper, and more integrated; and younger people are already used to using their own stuff on their school networks. It’s a recapitulation of the PC revolution. The centralized computer center concept was shaken by people buying PCs to run VisiCalc; now it’s iPads and Android smartphones.

The second conceptual change comes from cloud computing: our increasing tendency to store our data elsewhere. Call it decentralization: our email, photos, books, music, and documents are stored somewhere, and accessible to us through our consumer devices. The younger you are, the more you expect to get your digital stuff on the closest screen available. This is an important trend, because it signals the end of the hardware and operating system battles we’ve all lived with. Windows vs. Mac doesn’t matter when all you need is a web browser. Computers become temporary; user backup becomes irrelevant. It’s all out there somewhere — and users are increasingly losing control over their data.

During the next 10 years, three new conceptual changes will emerge, two of which we can already see the beginnings of. The first I’ll call deconcentration. The general-purpose computer is dying and being replaced by special-purpose devices. Some of them, like the iPhone, seem general purpose but are strictly controlled by their providers. Others, like Internet-enabled game machines or digital cameras, are truly special purpose. In 10 years, most computers will be small, specialized, and ubiquitous.

Even on what are ostensibly general-purpose devices, we’re seeing more special-purpose applications. Sure, you could use the iPhone’s web browser to access the New York Times website, but it’s much easier to use the NYT’s special iPhone app. As computers become smaller and cheaper, this trend will only continue. It’ll be easier to use special-purpose hardware and software. And companies, wanting more control over their users’ experience, will push this trend.

The second is decustomerization — now I get to invent the really ugly words — the idea that we get more of our IT functionality without any business relationship. We’re all part of this trend: every search engine gives away its services in exchange for the ability to advertise. It’s not just Google and Bing; most webmail and social networking sites offer free basic service in exchange for advertising, possibly with premium services for money. Most websites, even useful ones that take the place of client software, are free; they are either run altruistically or to facilitate advertising.

Soon it will be hardware. In 1999, Internet startup FreePC tried to make money by giving away computers in exchange for the ability to monitor users’ surfing and purchasing habits. The company failed, but computers have only gotten cheaper since then. It won’t be long before giving away netbooks in exchange for advertising will be a viable business. Or giving away digital cameras. Already there are companies that give away long-distance minutes in exchange for advertising. Free cell phones aren’t far off. Of course, not all IT hardware will be free. Some of the new cool hardware will cost too much to be free, and there will always be a need for concentrated computing power close to the user — game systems are an obvious example — but those will be the exception. Where the hardware costs too much to just give away, however, we’ll see free or highly subsidized hardware in exchange for locked-in service; that’s already the way cell phones are sold.

This is important because it destroys what’s left of the normal business relationship between IT companies and their users. We’re not Google’s customers; we’re Google’s product that they sell to their customers. It’s a three-way relationship: us, the IT service provider, and the advertiser or data buyer. And as these noncustomer IT relationships proliferate, we’ll see more IT companies treating us as products. If I buy a Dell computer, then I’m obviously a Dell customer; but if I get a Dell computer for free in exchange for access to my life, it’s much less obvious whom I’m entering a business relationship with. Facebook’s continual ratcheting down of user privacy in order to satisfy its actual customers — the advertisers — and enhance its revenue is just a hint of what’s to come.

The third conceptual change I’ve termed depersonization: computing that removes the user, either partially or entirely. Expect to see more software agents: programs that do things on your behalf, such as prioritize your email based on your observed preferences or send you personalized sales announcements based on your past behavior. The “people who liked this also liked” feature on many retail websites is just the beginning. A website that alerts you if a plane ticket to your favorite destination drops below a certain price is simplistic but useful, and some sites already offer this functionality. Ten years won’t be enough time to solve the serious artificial intelligence problems required to fully realize intelligent agents, but the agents of that time will be both sophisticated and commonplace, and they’ll need less direct input from you.

Similarly, connecting objects to the Internet will soon be cheap enough to be viable. There’s already considerable research into Internet-enabled medical devices, smart power grids that communicate with smart phones, and networked automobiles. Nike sneakers can already communicate with your iPhone. Your phone already tells the network where you are. Internet-enabled appliances are already in limited use, but soon they will be the norm. Businesses will acquire smart HVAC units, smart elevators, and smart inventory systems. And, as short-range communications — like RFID and Bluetooth — become cheaper, everything becomes smart.

The “Internet of things” won’t need you to communicate. The smart appliances in your smart home will talk directly to the power company. Your smart car will talk to road sensors and, eventually, other cars. Your clothes will talk to your dry cleaner. Your phone will talk to vending machines; they already do in some countries. The ramifications of this are hard to imagine; it’s likely to be weirder and less orderly than the contemporary press describes it. But certainly smart objects will be talking about you, and you probably won’t have much control over what they’re saying.

One old trend: deperimeterization. Two current trends: consumerization and decentralization. Three future trends: deconcentration, decustomerization, and depersonization. That’s IT in 2020 — it’s not under your control, it’s doing things without your knowledge and consent, and it’s not necessarily acting in your best interests. And this is how things will be when they’re working as they’re intended to work; I haven’t even started talking about the bad guys yet.

That’s because IT security in 2020 will be less about protecting you from traditional bad guys, and more about protecting corporate business models from you. Deperimeterization assumes everyone is untrusted until proven otherwise. Consumerization requires networks to assume all user devices are untrustworthy until proven otherwise. Decentralization and deconcentration won’t work if you’re able to hack the devices to run unauthorized software or access unauthorized data. Decustomerization won’t be viable unless you’re unable to bypass the ads, or whatever the vendor uses to monetize you. And depersonization requires the autonomous devices to be, well, autonomous.

In 2020 — 10 years from now — Moore’s Law predicts that computers will be 100 times more powerful. That’ll change things in ways we can’t know, but we do know that human nature never changes. Cory Doctorow rightly pointed out that all complex ecosystems have parasites. Society’s traditional parasites are criminals, but a broader definition makes more sense here. As we users lose control of those systems and IT providers gain control for their own purposes, the definition of “parasite” will shift. Whether they’re criminals trying to drain your bank account, movie watchers trying to bypass whatever copy protection studios are using to protect their profits, or Facebook users trying to use the service without giving up their privacy or being forced to watch ads, parasites will continue to try to take advantage of IT systems. They’ll exist, just as they always have existed, and — like today — security is going to have a hard time keeping up with them.

Welcome to the future. Companies will use technical security measures, backed up by legal security measures, to protect their business models. And unless you’re a model user, the parasite will be you.

My only real complaint with the essay is that I used “decentralization” in a nonstandard manner, and didn’t explain it well. I meant that our personal data will become decentralized; instead of it all being on our own computers, it will be on the computers of various cloud providers. But that causes a massive centralization of all of our data. I should have explicitly called out the risks of that.

Otherwise, I’m happy with what I wrote ten years ago.

Half a Million IoT Device Passwords Published

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2020/01/half_a_million_.html

It’s a list of easy-to-guess Telnet passwords for IoT devices that were on the Internet as recently as last October and November. Useful for anyone putting together a bot network:

A hacker has published this week a massive list of Telnet credentials for more than 515,000 servers, home routers, and IoT (Internet of Things) “smart” devices.

The list, which was published on a popular hacking forum, includes each device’s IP address, along with a username and password for the Telnet service, a remote access protocol that can be used to control devices over the internet.

According to experts who spoke to ZDNet this week, and a statement from the leaker himself, the list was compiled by scanning the entire internet for devices that were exposing their Telnet port. The hacker then tried using (1) factory-set default usernames and passwords, or (2) custom but easy-to-guess password combinations.
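For context, “exposing their Telnet port” simply means the device answers TCP connections on port 23. Here is a minimal Python sketch, intended only for devices you own and administer, that checks whether that port is reachable from your network; the address below is a placeholder.

    # Minimal sketch: check whether a device you administer exposes its Telnet port.
    # Only probe hosts you own; scanning other people's devices may be illegal.
    import socket

    HOSTS = ["192.168.1.1"]  # placeholder address on your own network
    TELNET_PORT = 23

    def telnet_exposed(host: str, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to the Telnet port succeeds."""
        try:
            with socket.create_connection((host, TELNET_PORT), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        for host in HOSTS:
            status = "EXPOSED" if telnet_exposed(host) else "closed or filtered"
            print(f"{host}: Telnet {status}")

An exposed port is only half of the problem; the attack described above also depends on the device accepting a factory-default or trivially guessable password.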

Lousy IoT Security

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/12/lousy_iot_secur.html

DTEN makes smart screens and whiteboards for videoconferencing systems. Forescout found that their security is terrible:

In total, our researchers discovered five vulnerabilities of four different kinds:

  • Data exposure: PDF files of shared whiteboards (e.g. meeting notes) and other sensitive files (e.g., OTA — over-the-air updates) were stored in a publicly accessible AWS S3 bucket that also lacked TLS encryption (CVE-2019-16270, CVE-2019-16274).
  • Unauthenticated web server: a web server running Android OS on port 8080 discloses all whiteboards stored locally on the device (CVE-2019-16271).
  • Arbitrary code execution: unauthenticated root shell access through Android Debug Bridge (ADB) leads to arbitrary code execution and system administration (CVE-2019-16273).
  • Access to Factory Settings: provides full administrative access and thus a covert ability to capture Windows host data from Android, including the Zoom meeting content (audio, video, screenshare) (CVE-2019-16272).

These aren’t subtle vulnerabilities. These are stupid design decisions made by engineers who had no idea how to create a secure system. And this, in a nutshell, is the problem with the Internet of Things.
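To make the first of those concrete: the whiteboard PDFs sat in an S3 bucket that was both publicly readable and reachable without TLS. Here is a hedged sketch, not DTEN’s actual configuration, of how a vendor could lock such a bucket down using boto3; the bucket name is a placeholder.

    # Sketch of standard S3 hardening for a bucket like the one described above:
    # block all public access and refuse requests that arrive over plain HTTP.
    # The bucket name is a placeholder, not DTEN's real bucket.
    import json
    import boto3

    BUCKET = "example-whiteboard-storage"

    s3 = boto3.client("s3")

    # 1. Turn on S3 Block Public Access for the bucket.
    s3.put_public_access_block(
        Bucket=BUCKET,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )

    # 2. Deny any request that does not use TLS (aws:SecureTransport == false).
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{BUCKET}",
                    f"arn:aws:s3:::{BUCKET}/*",
                ],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }
    s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))

Both controls are off-the-shelf AWS features; nothing exotic was needed to avoid the exposure.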

From a Wired article:

One issue that jumped out at the researchers: The DTEN system stored notes and annotations written through the whiteboard feature in an Amazon Web Services bucket that was exposed on the open internet. This means that customers could have accessed PDFs of each other’s slides, screenshots, and notes just by changing the numbers in the URL they used to view their own. Or anyone could have remotely nabbed the entire trove of customers’ data. Additionally, DTEN hadn’t set up HTTPS web encryption on the customer web server to protect connections from prying eyes. DTEN fixed both of these issues on October 7. A few weeks later, the company also fixed a similar whiteboard PDF access issue that would have allowed anyone on a company’s network to access all of its stored whiteboard data.

[…]

The researchers also discovered two ways that an attacker on the same network as DTEN devices could manipulate the video conferencing units to monitor all video and audio feeds and, in one case, to take full control. DTEN hardware runs Android primarily, but uses Microsoft Windows for Zoom. The researchers found that they can access a development tool known as “Android Debug Bridge,” either wirelessly or through USB ports or ethernet, to take over a unit. The other bug also relates to exposed Android factory settings. The researchers note that attempting to implement both operating systems creates more opportunities for misconfigurations and exposure. DTEN says that it will push patches for both bugs by the end of the year.

Boing Boing article.

NTSB Investigation of Fatal Driverless Car Accident

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/11/ntsb_investigat.html

Autonomous systems are going to have to do much better than this.

The Uber car that hit and killed Elaine Herzberg in Tempe, Ariz., in March 2018 could not recognize all pedestrians, and was being driven by an operator likely distracted by streaming video, according to documents released by the U.S. National Transportation Safety Board (NTSB) this week.

But while the technical failures and omissions in Uber’s self-driving car program are shocking, the NTSB investigation also highlights safety failures that include the vehicle operator’s lapses, lax corporate governance of the project, and limited public oversight.

The details of what happened in the seconds before the collision are worth reading. They describe a cascading series of issues that led to the collision and the fatality.

As computers continue to become part of things, and affect the world in a direct physical manner, this kind of thing will become even more important.

Measuring the Security of IoT Devices

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/10/measuring_the_s.html

In August, CyberITL completed a large-scale survey of software security practices in the IoT environment, by looking at the compiled software.

Data Collected:

  • 22 Vendors
  • 1,294 Products
  • 4,956 Firmware versions
  • 3,333,411 Binaries analyzed
  • Date range of data: 2003-03-24 to 2019-01-24 (varies by vendor, most up to 2018 releases)

[…]

This dataset contains products such as home routers, enterprise equipment, smart cameras, security devices, and more. It represents a wide range of devices found in home, enterprise, and government deployments.

Vendors include Asus, Belkin, DLink, Linksys, Moxa, Tenda, Trendnet, and Ubiquiti.

CyberITL’s methodology is not source code analysis. They look at the actual firmware. And they don’t look for vulnerabilities; they look for secure coding practices that indicate that the company is taking security seriously, and whose lack pretty much guarantees that there will be vulnerabilities. These include address space layout randomization and stack guards.
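For a sense of what those checks look like in practice, here is a rough Python sketch, using the third-party pyelftools package rather than CITL’s actual tooling, that inspects a single ELF binary for PIE (which ASLR depends on), stack canaries, and a non-executable stack.

    # Illustrative hardening check for one ELF binary; not CITL's methodology or code.
    import sys
    from elftools.elf.elffile import ELFFile

    PF_X = 0x1  # "executable" flag on a program header

    def check_hardening(path: str) -> dict:
        with open(path, "rb") as f:
            elf = ELFFile(f)

            # PIE binaries are ET_DYN; traditional non-PIE executables are ET_EXEC.
            pie = elf.header["e_type"] == "ET_DYN"

            # Stack canaries show up as a reference to __stack_chk_fail.
            canary = False
            for sec_name in (".dynsym", ".symtab"):
                section = elf.get_section_by_name(sec_name)
                if section is not None and any(
                    sym.name == "__stack_chk_fail" for sym in section.iter_symbols()
                ):
                    canary = True

            # A non-executable stack is signalled by PT_GNU_STACK without the exec flag.
            nx = any(
                seg["p_type"] == "PT_GNU_STACK" and not (seg["p_flags"] & PF_X)
                for seg in elf.iter_segments()
            )

        return {"PIE/ASLR": pie, "stack canary": canary, "NX stack": nx}

    if __name__ == "__main__":
        for feature, present in check_hardening(sys.argv[1]).items():
            print(f"{feature}: {'yes' if present else 'NO'}")

Run against every binary extracted from a firmware image, checks like these give a rough measure of whether a vendor’s build process enables basic exploit mitigations at all.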

A summary of their results.

CITL identified a number of important takeaways from this study:

  • On average, updates were more likely to remove hardening features than add them.
  • Within our 15 year data set, there have been no positive trends from any one vendor.
  • MIPS is both the most common CPU architecture and least hardened on average.
  • There are a large number of duplicate binaries across multiple vendors, indicating a common build system or toolchain.

Their website contains the raw data.

Another Attack Against Driverless Cars

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/07/another_attack_.html

In this piece of research, attackers successfully attack a driverless car system — Renault Captur’s “Level 0” autopilot (Level 0 systems advise human drivers but do not directly operate cars) — by following the car with drones that project images of fake road signs in 100 ms bursts. Each burst is too short for human perception, but long enough to fool the autopilot’s sensors.

Boing Boing post.

Resetting Your GE Smart Light Bulb

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/07/resetting_your_.html

If you need to reset the software in your GE smart light bulb — firmware version 2.8 or later — just follow these easy instructions:

Start with your bulb off for at least 5 seconds.

  1. Turn on for 8 seconds
  2. Turn off for 2 seconds
  3. Turn on for 8 seconds
  4. Turn off for 2 seconds
  5. Turn on for 8 seconds
  6. Turn off for 2 seconds
  7. Turn on for 8 seconds
  8. Turn off for 2 seconds
  9. Turn on for 8 seconds
  10. Turn off for 2 seconds
  11. Turn on

Bulb will flash on and off 3 times if it has been successfully reset.
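If you would rather not count seconds in your head, here is a trivial Python timer, a sketch and not anything from GE, that paces you through the sequence while you flip the physical switch.

    # Paces you through the GE bulb reset sequence described above.
    import time

    def ge_bulb_reset_timer(cycles: int = 5, on_secs: int = 8, off_secs: int = 2) -> None:
        input("Make sure the bulb has been OFF for at least 5 seconds, then press Enter...")
        for i in range(cycles):
            print(f"[{i + 1}/{cycles}] Turn the bulb ON now (wait {on_secs} seconds)")
            time.sleep(on_secs)
            print(f"[{i + 1}/{cycles}] Turn the bulb OFF now (wait {off_secs} seconds)")
            time.sleep(off_secs)
        print("Turn the bulb ON and leave it on. It should flash three times if the reset worked.")

    if __name__ == "__main__":
        ge_bulb_reset_timer()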

Welcome to the future!

Zipcar Disruption

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/03/zipcar_disrupti.html

This isn’t a security story, but it easily could have been. Last Saturday, Zipcar had a system outage: “an outage experienced by a third party telecommunications vendor disrupted connections between the company’s vehicles and its reservation software.”

That didn’t just mean people couldn’t get cars they reserved. Sometimes it meant they couldn’t get the cars they were already driving to work:

Andrew Jones of Roxbury was stuck on hold with customer service for at least a half-hour while he and his wife waited inside a Zipcar that would not turn back on after they stopped to fill it up with gas.

“We were just waiting and waiting for the call back,” he said.

Customers in other states, including New York, California, and Oregon, reported a similar problem. One user who tweeted about issues with a Zipcar vehicle listed his location as Toronto.

Some, like Jones, stayed with the inoperative cars. Others, including Tina Penman in Portland, Ore., and Heather Reid in Cambridge, abandoned their Zipcar. Penman took an Uber home, while Reid walked from the grocery store back to her apartment.

This is a reliability issue that turns into a safety issue. Systems that touch the direct physical world like this need better fail-safe defaults.

An Argument that Cybersecurity Is Basically Okay

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/03/an_argument_tha.html

Andrew Odlyzko’s new essay is worth reading — “Cybersecurity is not very important”:

Abstract: There is a rising tide of security breaches. There is an even faster rising tide of hysteria over the ostensible reason for these breaches, namely the deficient state of our information infrastructure. Yet the world is doing remarkably well overall, and has not suffered any of the oft-threatened giant digital catastrophes. This continuing general progress of society suggests that cyber security is not very important. Adaptations to cyberspace of techniques that worked to protect the traditional physical world have been the main means of mitigating the problems that occurred. This “chewing gum and baling wire” approach is likely to continue to be the basic method of handling problems that arise, and to provide adequate levels of security.

I am reminded of these two essays. And, as I said in the blog post about those two essays:

This is true, and is something I worry will change in a world of physically capable computers. Automation, autonomy, and physical agency will make computer security a matter of life and death, and not just a matter of data.

The Latest in Creepy Spyware

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/03/the_latest_in_c.html

The Nest home alarm system shipped with a secret microphone, which — according to the company — was only an accidental secret:

On Tuesday, a Google spokesperson told Business Insider the company had made an “error.”

“The on-device microphone was never intended to be a secret and should have been listed in the tech specs,” the spokesperson said. “That was an error on our part.”

Where are the consumer protection agencies? They should be all over this.

And while they’re figuring out which laws Google broke, they should also look at American Airlines. Turns out that some of their seats have built-in cameras:

American Airlines spokesperson Ross Feinstein confirmed to BuzzFeed News that cameras are present on some of the airlines’ in-flight entertainment systems, but said “they have never been activated, and American is not considering using them.” Feinstein added, “Cameras are a standard feature on many in-flight entertainment systems used by multiple airlines. Manufacturers of those systems have included cameras for possible future uses, such as hand gestures to control in-flight entertainment.”

That makes it all okay, doesn’t it?

Actually, I kind of understand the airline seat camera thing. My guess is that whoever designed the in-flight entertainment system just specced a standard tablet computer, and they all came with unnecessary features like cameras. This is how we end up with refrigerators with Internet connectivity and Roombas with microphones. It’s cheaper to leave the functionality in than it is to remove it.

Still, we need better disclosure laws.

Security Flaws in Children’s Smart Watches

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/01/security_flaws_3.html

A year ago, the Norwegian Consumer Council published an excellent security analysis of children’s GPS-connected smart watches. The security was terrible. Not only could parents track the children, anyone else could also track the children.

A recent analysis checked if anything had improved after that torrent of bad press. Short answer: no.

Guess what: a train wreck. Anyone could access the entire database, including real-time child location, name, parents’ details, etc. Not just Gator watches either — the same back end covered multiple brands and tens of thousands of watches.

The Gator web backend was passing the user level as a parameter. Changing that value to another number gave super admin access throughout the platform. The system failed to validate that the user had the appropriate permission to take admin control!

This means that an attacker could get full access to all account information and all watch information. They could view any user of the system and any device on the system, including its location. They could manipulate everything and even change users’ emails/passwords to lock them out of their watch.

In fairness, upon our reporting of the vulnerability to them, Gator got it fixed in 48 hours.
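The underlying mistake is a classic one: trusting a privilege level supplied by the client. The fix is to derive the caller’s role from server-side session state and check it on every privileged endpoint. Here is an illustrative Flask sketch of that pattern; it is not Gator’s actual code, and the endpoint and user store are hypothetical.

    # Illustrative pattern only: authorization decided on the server, never taken
    # from a client-supplied "user level" parameter. All names are hypothetical.
    from flask import Flask, abort, jsonify, session

    app = Flask(__name__)
    app.secret_key = "change-me"  # placeholder; load from secure configuration

    USERS = {  # stand-in for the real user store
        1: {"role": "parent"},
        2: {"role": "admin"},
    }

    def current_user_role() -> str:
        # The role comes from server-side session state established at login,
        # never from a query parameter or form field the client can edit.
        user = USERS.get(session.get("user_id"))
        if user is None:
            abort(401)  # not logged in
        return user["role"]

    @app.route("/admin/accounts")
    def list_all_accounts():
        if current_user_role() != "admin":
            abort(403)  # authenticated, but not authorized for admin data
        return jsonify({"accounts": sorted(USERS)})

A check along these lines on the admin endpoints would have made the parameter tampering described above useless.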

This is a lesson in the limits of naming and shaming: publishing vulnerabilities in an effort to get companies to improve their security. If a company is specifically named, it is likely to improve the specific vulnerability described. But that is unlikely to translate into improved security practices in the future. If an industry, or product category, is named generally, nothing is likely to happen. This is one of the reasons I am a proponent of regulation.

News article.

EDITED TO ADD (2/13): The EU has acted in a similar case.

Security Analysis of the LIFX Smart Light Bulb

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/01/security_analys_6.html

The security is terrible:

In a very limited amount of time, three vulnerabilities have been discovered:

  • Wifi credentials of the user have been recovered (stored in plaintext in the flash memory).
  • No security settings: the device is completely open (no secure boot, debug interface not disabled, no flash encryption).
  • Root certificate and RSA private key have been extracted.

Boing Boing post.

Japanese Government Will Hack Citizens’ IoT Devices

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/01/japanese_govern.html

The Japanese government is going to run penetration tests against all the IoT devices in their country, in an effort to (1) figure out what’s insecure, and (2) help consumers secure them:

The survey is scheduled to kick off next month, when authorities plan to test the password security of over 200 million IoT devices, beginning with routers and web cameras. Devices in people’s homes and on enterprise networks will be tested alike.

[…]

The Japanese government’s decision to log into users’ IoT devices has sparked outrage in Japan. Many have argued that this is an unnecessary step, as the same results could be achieved by just sending a security alert to all users, as there’s no guarantee that the users found to be using default or easy-to-guess passwords would change their passwords after being notified in private.

However, the government’s plan has its technical merits. Many of today’s IoT and router botnets are being built by hackers who take over devices with default or easy-to-guess passwords.

Hackers can also build botnets with the help of exploits and vulnerabilities in router firmware, but the easiest way to assemble a botnet is by collecting the ones that users have failed to secure with custom passwords.

Securing these devices is often a pain, as some expose Telnet or SSH ports online without the users’ knowledge, and very few users know how to change the passwords for those services. Other devices come with secret backdoor accounts that in some cases can’t be removed without a firmware update.

I am interested in the results of this survey. Japan isn’t very different from other industrialized nations in this regard, so their findings will be general. I am less optimistic about the country’s ability to secure all of this stuff — especially before the 2020 Summer Olympics.

Hacking Construction Cranes

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/01/hacking_constru.html

Construction cranes are vulnerable to hacking:

In our research and vulnerability discoveries, we found that weaknesses in the controllers can be (easily) taken advantage of to move full-sized machines such as cranes used in construction sites and factories. In the different attack classes that we’ve outlined, we were able to perform the attacks quickly and even switch on the controlled machine despite an operator’s having issued an emergency stop (e-stop).

The core of the problem lies in how, instead of depending on standard wireless technologies, these industrial remote controllers rely on proprietary RF protocols, which are decades old and primarily focused on safety at the expense of security. It wasn’t until the arrival of Industry 4.0, as well as the continuing adoption of the industrial internet of things (IIoT), that industries began to acknowledge the pressing need for security.

News article. Report.

Why Internet Security Is So Bad

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2019/01/why_internet_se.html

I recently read two different essays that make the point that while Internet security is terrible, it really doesn’t affect people enough to make it an issue.

This is true, and is something I worry will change in a world of physically capable computers. Automation, autonomy, and physical agency will make computer security a matter of life and death, and not just a matter of data.