Surveillance-as-a-service industry needs to be brought to heel

Here we go again: another example of government surveillance involving smartphones from Apple and Google has emerged, and it shows how sophisticated government-backed attacks can be and why there is good reason to keep mobile platforms locked down.

What happened?

I don’t want to focus too much on the news, but in a nutshell:

  • Google’s Threat Analysis Group (TAG) has released information about the hack.
  • The attack was developed by the Italian surveillance vendor RCS Labs.
  • The attack has been used in Italy and Kazakhstan, and possibly elsewhere.
  • Some generations of the attack were carried out with the help of ISPs.
  • On iOS, attackers abused Apple’s enterprise certificate tools, which enable in-house app distribution.
  • Around nine different attack techniques were used.

The attack works like this: the target is sent a unique link designed to trick them into downloading and installing a malicious app. In some cases, the spies worked with an ISP to disable the target’s data connection, then urged the target to download the app in order to restore that connection.

Apple has fixed the zero-day exploits used in these attacks. It had previously warned that bad actors were abusing the tools that let businesses distribute apps in-house. The revelations tie in with recent news from Lookout Labs about enterprise-grade Android spyware called Hermit.

What are the risks?

The problem here is that surveillance technology has been commercialized. Powers that were historically available only to governments are now also in the hands of private contractors. That represents a risk: highly confidential tools can be leaked, exploited, reverse-engineered, and misused.

As Google puts it: “Our findings underscore the extent to which commercial surveillance vendors have proliferated capabilities historically only used by governments with the technical expertise to develop and operationalize exploits. This makes the internet less safe and threatens the trust on which users depend.”

Not only that, but these private surveillance firms are enabling dangerous hacking tools to proliferate, while handing these high-tech snooping capabilities to governments – some of which appear to be spying on dissidents, journalists, political opponents, and human rights activists.

An even bigger danger is that Google is already tracking at least 30 spyware makers, which suggests the commercial surveillance-as-a-service industry is robust. It also means that it is now theoretically possible for even the least credible government to access such tools – and given how many identified threats rely on exploits uncovered by cybercriminals, it seems reasonable to assume this is another income stream that encourages malicious research.

The problem is that these cozy links between private surveillance firms and cybercrime will not always flow in one direction. These exploits – at least some of which seem difficult enough to discover that only a government would have the resources to find them – will eventually leak.

And while Apple, Google, and everyone else remain committed to a cat-and-mouse game to prevent this kind of criminality, closing exploits where they can, the risk is that any government-mandated backdoor or device security flaw will eventually slip into the commercial market, from where it will reach the criminal one.

Europe’s data protection regulator warns: “Revelations about Pegasus spyware have raised very serious questions about fundamental rights and especially the potential impact of modern spyware tools on privacy and data protection rights.”

That is not to say there are no valid reasons for security research. Flaws exist in every system, and we need people to be motivated to find them; without the efforts of security researchers of various kinds, security updates would not exist at all. Apple pays researchers up to six figures for identifying vulnerabilities in its systems.

What happens next?

Earlier this year, the European Union’s data protection supervisor called for a ban on the use of the NSO Group’s infamous Pegasus software. In fact, the call went further, directly seeking a “ban on the development and deployment of spyware with the capability of Pegasus.”

NSO Group is now apparently up for sale.

The EU further argues that, should such exploits be used in exceptional situations, companies such as NSO should be subject to regulatory oversight of that use. As part of this, they must respect EU law, judicial review, and criminal procedural rights, and agree to no importing of illegal intelligence, no political abuse of national security, and to supporting civil society.

In other words, these companies need to be brought to heel.

What can you do?

Since its announcement about NSO Group last year, Apple has published the following best-practice recommendations to help mitigate such risks:

  • Update devices to the latest software, which includes the latest security fixes.
  • Protect devices with a passcode.
  • Use two-factor authentication and a strong password for your Apple ID.
  • Install apps only from the App Store.
  • Use strong and unique passwords online.
  • Don’t click on links or attachments from unknown senders.
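Most of the advice above is behavioral, but the “strong and unique passwords” point is easy to put into practice programmatically. As a minimal sketch (the function name and length are my own choices), Python’s standard-library `secrets` module generates cryptographically strong random passwords:

```python
import secrets
import string


def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation.

    Uses secrets.choice, which relies on the OS's cryptographically
    secure random number generator rather than the predictable `random` module.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


# Generate a fresh, unique password for each account.
print(generate_password())
print(generate_password(32))
```

In practice, a password manager does this (and the “unique per site” bookkeeping) for you; the point is simply that strong passwords should be machine-generated, not invented.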

Follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2022 IDG Communications, Inc.
