As many of you may have been tracking, the Australian Parliament has been deliberating on a piece of legislation for the last couple of months called the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018. This bill seeks to give Australian law enforcement and intelligence agencies powers to force tech companies and telcos operating in Australia to do any number of things. Here’s a partial list:
(a) removing one or more forms of electronic protection that are or were applied by, or on behalf of, the provider; or
(b) providing technical information; or
(c) installing, maintaining, testing or using software or equipment; or
(d) ensuring that information obtained in connection with the execution of a warrant or authorisation is given in a particular format; or
(f) assisting with the testing, modification, development or maintenance of a technology or capability; or
(h) modifying, or facilitating the modification of, any of the characteristics of a service provided by the designated communications provider; or
(j) an act or thing done to conceal the fact that any thing has been done covertly in the performance of a function, or the exercise of a power, conferred by a law of the Commonwealth, a State or a Territory, so far as the function or power relates to:
(i) enforcing the criminal law and laws imposing pecuniary penalties; or
(ii) assisting the enforcement of the criminal laws in force in a foreign country; or
(iii) the interests of Australia’s national security, the interests of Australia’s foreign relations or the interests of Australia’s national economic well-being.
Pretty scary stuff, huh? We thought so too. We’ve been tracking the progress of the bill and have been trying to analyse it to work out what the implications might be for software companies both inside and outside of Australia, and especially for Loki.
The Assistance and Access Bill 2018 (shortened to AAA18 for the rest of this article) gives Australian agencies the ability to issue three types of notices to ‘communications service providers.’ The definition of ‘provider’ in the legislation is very broad: pretty much anyone who provides any service or product that involves the internet could fall under its scope. The notices increase in scope and obligation, and are called Technical Assistance Requests (TARs), Technical Assistance Notices (TANs), and Technical Capability Notices (TCNs). The last of these is a legally enforced instruction to create or modify features to give an agency a new technical capability. Although a TCN must come from the Attorney-General of Australia, the scope for new espionage tools to be created under such a notice is extremely broad.
The scariest thing about this bill is the penalties for providers who leak information about an investigation or notice, or who refuse to comply. Jail sentences as long as 10 years could apply to a whistleblower, and given that these notices can be issued to companies and individuals that provide services to Australians, they could reach almost anyone around the world. With strong extradition treaties, this legislation could touch people across the nations of the Five Eyes Alliance (UK, US, AU, NZ, CA) and beyond. Some, myself included, strongly suspect that this is a coordinated effort by the Five Eyes alliance to gain access to the world’s most popular applications: the UK recently pushed through an amendment to the already controversial Investigatory Powers Act 2016 that aligns it quite closely with this new Australian legislation. The two even use similar terms, most notably the Technical Capability Notice.
Thankfully, AAA18 does explicitly state that these notices cannot be used to force a company to break its own encryption, introduce security flaws, or deliberately ignore existing flaws. It also explicitly says that these notices cannot be used to introduce a ‘systemic weakness’ into the product or service. There were a great many concerns about the definition of ‘systemic weakness’ being too vague in the legislation. The final amendment to the bill gave us the following definitions:
systemic vulnerability means a vulnerability that affects a whole class of technology, but does not include a vulnerability that is selectively introduced to one or more target technologies that are connected with a particular person. For this purpose, it is immaterial whether the person can be identified.
systemic weakness means a weakness that affects a whole class of technology, but does not include a weakness that is selectively introduced to one or more target technologies that are connected with a particular person. For this purpose, it is immaterial whether the person can be identified.
target technology :
(a) for the purposes of this Part, a particular carriage service, so far as the service is used, or is likely to be used, (whether directly or indirectly) by a particular person, is a target technology that is connected with that person; and
(b) for the purposes of this Part, a particular electronic service, so far as the service is used, or is likely to be used, (whether directly or indirectly) by a particular person, is a target technology that is connected with that person; and
(c) for the purposes of this Part, particular software installed, or to be installed, on:
(i) a particular computer; or
(ii) a particular item of equipment;
used, or likely to be used, (whether directly or indirectly) by a particular person is a target technology that is connected with that person; and
(d) for the purposes of this Part, a particular update of software that has been installed on:
(i) a particular computer; or
(ii) a particular item of equipment;
used, or likely to be used, (whether directly or indirectly) by a particular person is a target technology that is connected with that person; and
(e) for the purposes of this Part, a particular item of customer equipment used, or likely to be used, (whether directly or indirectly) by a particular person is a target technology that is connected with that person; and
(f) for the purposes of this Part, a particular data processing device used, or likely to be used, (whether directly or indirectly) by a particular person is a target technology that is connected with that person.
For the purposes of paragraphs (a), (b), (c), (d), (e) and (f), it is immaterial whether the person can be identified.
What this effectively means is that providers cannot be forced to make changes to their products that negatively affect every user of that product. Instead, they can only be ordered to create the means by which they could selectively inject weaknesses or vulnerabilities into a specific product in use by a specific targeted person.
However, AAA18 gives agencies the legislative authority to create and install monitoring tools and other intrusive mechanisms in all kinds of software and hardware. In this iteration of the bill, these tools can only be switched on and used to target crimes punishable by three years’ imprisonment or more. However, there is nothing to say that won’t change, and the bill requires very little oversight. It is also deeply problematic that these tools will exist at all: if they fall into the wrong hands, the effects will be devastating. The NSA in the US developed a range of surveillance techniques that were eventually leaked and used against American interests by criminals and foreign governments, and the same will likely happen here.
The introduction of AAA18 and its UK equivalent means that applications such as WhatsApp, Signal, Facebook Messenger, Gmail, and any other popular communications medium can silently turn into a monitoring device for ASIO, ASD, the AFP, GCHQ, MI5, and so on. The companies behind these products can’t utter a peep about these notices or their repercussions, or warn their users. They are even given protections that indemnify them against any civil suits arising from a later discovery of eroded privacy.
This bill does not require these companies to break the encryption of their systems, but there are plenty of other things these agencies could force companies to create and install, such as tools that would allow them to remotely pull information from a specific user’s device post-encryption, or to gain access to these services’ servers, where treasure troves of metadata could be harvested. Every ‘private’ messaging application out there could now be forced to send data back to intelligence agencies about who is communicating with whom and when the communications are taking place. This isn’t hypothetical anymore – these powers are now law, and companies across the world can be compelled to follow them.
We will not know how widely these powers are being deployed until the first annual report is released. All it takes is one TCN for each of Facebook, Google, and WhatsApp, and the vast majority of private communications will be open to compromise at the whim of the UK and Australian intelligence and law enforcement communities. The information they gather can easily be shared across the Five Eyes Alliance, effectively handing these tools to the governments of the whole English-speaking world.
For advocates of digital privacy such as myself, this is a very concerning development. If anything, it only strengthens the need to shift the paradigm in communication tools so that they are decentralised, open source and private by default. If we succeed, laws such as these can’t dissolve our ability to collectively access private spaces online. I have said this before, but access to these online spaces is critical to a healthy 21st century democracy, and our governments are too quick to dismiss the need for widespread digital security.
How it will affect Loki
Obviously, we were terrified when we first saw this bill. The potential for the project to be entirely undermined by this legislation did not go unnoticed. We began to consider how we might set up failsafes: ways to let people catch bad code being injected into our codebase, or paying someone external to Loki to regularly inspect the binaries we release and ensure they are not leaking extra information or mismatching the codebase in some way.
If we were to be issued a TCN, we would not be able to tell anyone about it. If we set up some sort of canary system, we could be imprisoned. So whatever failsafe we did set up would have to be external to Loki, and would have to be regularly auditing us to make sure we haven’t been compromised before a TCN was issued.
We also considered that we might have to leave Australia altogether. However, the legislation allows agencies to target any company whether or not it is based in Australia, and most of the Loki team currently lives in Australia and has family here, so this would be a very extreme and difficult move to make. None of us would ever be able to return home if we were issued a notice and refused to comply, and we might simply have been extradited from wherever we moved to anyway, so this option seemed both extreme and ineffective. New amendments to the bill make such a move unnecessary.
Our analysis of the bill and its proposed amendments led us to the following conclusions:
- With the addition of the amendment defining ‘systemic weakness,’ any modification to the Loki source code that gives authorities new capabilities would almost certainly be classed as a systemic weakness. Thus, we see it as nearly impossible that we would be forced to implement any privacy-degrading code in the software we release to the public.
- It is feasible that we could be forced to develop alternative client software with additional data-collecting features for authorities to use. Thankfully, because the network is regulated by Service Nodes, the authorities would still need to own upwards of 40% of the entire Service Node network to make this effective for widespread surveillance. Given the economic assumptions outlined in our Cryptoeconomics proposal, this is likely to be prohibitively expensive for a government, particularly if usage of Loki is high enough to make issuing a TCN worthwhile in the first place.
- As long as we are able to keep our code open source, we can guarantee that our code will always be auditable. As such, it may make sense to go over our existing licenses and convert them to GPLv3 to prevent closed source software derived from our implementation from appearing. We will require written permission from the copyright holders of the projects we have forked in order to achieve this.
- As Loki is a decentralised system with extensive privacy protections, there is very little we as the Loki team can do to de-anonymise our users even if we wanted to. The most we could provide to authorities is information about known attacks such as deep packet inspection (DPI) and traffic-shape correlation, but we hold no more information than any other node operator on the network. This means that even if we are issued a notice, we won’t be of much use to these agencies.
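To give a rough sense of why partial control of the network is not enough for widespread surveillance, here is a small illustrative calculation. The hop count and ownership fractions below are assumptions for the example, not measurements of the Loki network: if an adversary controls a fraction p of routing nodes and a path is built from k independently chosen hops, the chance that every hop on a given path is adversarial is roughly p raised to the power k.

```python
def path_compromise_probability(p: float, hops: int) -> float:
    """Chance that all `hops` independently chosen relays on a path are
    adversarial, assuming the adversary controls fraction `p` of the network.

    This is a simplified model: it ignores stake weighting and path-selection
    rules, and is meant only to illustrate why surveillance at scale requires
    owning a large share of the network.
    """
    return p ** hops

# Even at 40% network ownership, a 3-hop path is fully adversarial
# only about 6.4% of the time (0.4 ** 3 is roughly 0.064).
```

Surveilling most traffic, rather than the occasional unlucky path, therefore pushes the required ownership fraction far higher, which is where the economic cost argument above comes in.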
The chances that Loki will eventually be issued a notice are fairly high, but such a notice would not compromise the privacy that our system provides.
The same cannot be said of other communication platforms, where the service provider has central control over the data being transported. Your phone number is attached to your WhatsApp and Signal accounts, and all of the metadata that you create is now up for grabs. The case for a system like Loki has never been stronger. Only by decentralising the routing and storage of communications can they become truly private under this new legislation.
What you can do to stay private on the internet
With this new legislation, it is important to understand and be vigilant about our privacy on the internet. Wherever practical, using open source software is a good start to protecting one’s privacy. Open source software is much more trustworthy than closed source applications, as the code is generally reviewed by people from all over the world. If backdoors or questionable features are added to such software, alarm bells should ring shortly afterwards.
Whenever possible, you should also build the open source software you use from source. This removes another layer of trust, as you’ll know the application you build contains exactly what is in the codebase you want to use. Nothing more, nothing less. It is entirely feasible that modified versions of applications could be put in place of the real ones if a notice compels a developer to do so. Reproducible builds help mitigate this, and we are now striving towards them, but it is your responsibility to check that you weren’t served a fake version of the application you want to use.
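As a minimal sketch of what such a check can look like, the snippet below compares a downloaded file’s SHA-256 digest against a published value, ideally one obtained over a separate channel such as a signed release announcement. The function name and usage here are illustrative, not part of any official Loki tooling:

```python
import hashlib

def verify_sha256(path: str, expected_hex: str) -> bool:
    """Return True only if the file at `path` hashes to the published digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in chunks so large binaries don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.strip().lower()
```

A mismatch doesn’t tell you who tampered with the file, only that what you have is not what the developers published, which is exactly the signal you need before running it.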
VPNs are a decent first step in protecting network privacy, but this legislation could severely undermine their utility. If the VPN you use is compelled to install monitoring systems in their service, you might just be handing your browsing and connection information straight over to the authorities and be none the wiser.
A better approach is to use some sort of mixnet. Tor is the obvious choice for the time being, but when Lokinet is launched, you should use it, as you’ll be able to use essentially any program you like straight out of the box (it’s both TCP and UDP friendly). You’ll enjoy low latency and a dedicated network of incentivised nodes that you can reasonably trust not to be largely composed of nodes run by surveillance agencies.
I hope this article has clarified the situation for you, and given you some insight into why decentralised privacy is so important. If you’d like to follow what we’re doing at Loki, head to https://loki.network
Simon