In which I attempt to be pragmatic.
Are you allowed to run whatever computer program you want on the hardware you own? This is a question where freedom, practicality, and reality all collide into a mess.
Google has recently announced that Android users will only be able to install apps which have been digitally signed by developers who have registered their name and other legal details with Google. To many people, this signals the death of "sideloading" - the ability to install apps which don't originate from the official app store.
I'm a fully paid-up member of the Cory Doctorow fanclub. Back in 2011, he gave a speech called "The Coming War on General Computation". In it, he rails against the idea that our computers could become traitorous; serving the needs of someone other than their owner. Do we want to live in a future where our computers refuse to obey our commands? No! Neither law nor technology should conspire to reduce our freedom to compute.
There are, I think, two small cracks in that argument.
The first is that a user has no right to run anyone else's code, if the code owner doesn't want to make it available to them. Consider a bank which has an app. When customers are scammed, the bank is often liable. The bank wants to reduce its liability so it says "you can't run our app on a rooted phone".
Is that fair? Probably not. Rooting allows a user to fully control and customise their device. But rooting also allows malware to intercept communications, send commands, and perform unwanted actions. I think the bank has the right to say "your machine is too risky - we don't want our code to run on it."
The same is true of video games with strong "anti-cheat" protection. It is disruptive to other players - and to the business model - if untrustworthy clients can disrupt the game. Again, it probably isn't fair to ban users who run on permissive software, but it is a rational choice by the manufacturer. And, yet again, I think software authors probably should be able to restrict things which cause them harm.
So, from their point of view it is pragmatic to insist that their software can only be loaded from a trustworthy location.
But that's not the only thing Google is proposing. Let's look at their announcement:
We’ve seen how malicious actors hide behind anonymity to harm users by impersonating developers and using their brand image to create convincing fake apps. The scale of this threat is significant: our recent analysis found over 50 times more malware from internet-sideloaded sources than on apps available through Google Play.
Back in the early days of Android, you could just install any app and it would run, no questions asked. That was a touchingly naïve approach to security - extremely easy to use but left users vulnerable.
A few years later, Android changed to show users the permissions an app was requesting. Here's a genuine screenshot from an app which I tried to sideload in 2013:
No rational user would install a purported battery app with that scary list of permissions, right? Wrong!
We know that users don't read and they especially don't read security warnings.
There is no UI tweak that will prevent users from bypassing these scary warnings. There is no amount of education you can provide to reliably make people stop and think.
Here's the story of a bank literally telling a man he was being scammed and he still proceeded to transfer funds to a fraudster.
It emerged that, in this case, Lloyds had done a really good job of not only spotting the potential fraud but alerting James to it. The bank blocked a number of transactions, it spoke to James on the phone to warn him and even called him into a branch to speak to him face-to-face.
Here's another one where a victim deliberately lied to their bank even after acknowledging that they had been told it was a scam.
Android now requires you to deliberately turn on the ability to side-load. It will give you prompts and warnings, force you to take specific actions, give you pop-ups and all sorts of confirmation steps.
And people still click through.
Let's go back to Google's announcement. This change isn't being rolled out worldwide immediately. They say:
This change will start in a few select countries specifically impacted by these forms of fraudulent app scams, often from repeat perpetrators.
…
September 2026: These requirements go into effect in Brazil, Indonesia, Singapore, and Thailand. At this point, any app installed on a certified Android device in these regions must be registered by a verified developer.
The police in Singapore have a page warning about the prevalence of these scams. They describe how victims are tricked or coerced into turning off all their phone's security features.
Similarly, there are estimates that Brazil lost US$54 billion to scams in 2024 (albeit not all through apps).
There are anecdotal reports from Indonesia which show how easily people fall for these fake apps.
Thailand is also under an ongoing onslaught of malicious apps with some apps raking in huge amounts of money.
It is absolutely rational that government, police, and civic society groups want to find ways to stop these scams.
Google is afraid that if Android's reputation is tarnished as the "Scam OS" then users will move to more secure devices.
Financial institutions might stop providing functionality to Android devices as a way to protect their customers. Which would lead to those users seeking alternate phones.
Society as a whole wants to protect vulnerable people. We all bear the cost of dealing with criminal activity like this.
Given that sideloaded Android apps are clearly a massive vector for fraud, it obviously behoves Google to find a way to secure their platform as much as possible.
And Yet…
This is quite obviously a bullshit powerplay by Google to ensnare the commons. Not content with closing down parts of the Android Open Source Project, stuffing more and more vital software behind its proprietary services, and freezing out small manufacturers - now it wants the name and shoe-size of every developer!
Fuck that!
I want to use my phone to run the code that I write. I want to run my friends' code. I want to play with cool open source projects by people in far-away lands.
I remember The Day Google Deleted Me - we cannot have these lumbering monsters gatekeeping what we do on our machines.
Back in the days when I was a BlackBerry developer, we had to wait ages for RIM's code-signing server to become available. I'm pretty sure the same problem affected Symbian - if Nokia was down that day, you couldn't release any code.
Going back to their statement:
To be clear, developers will have the same freedom to distribute their apps directly to users through sideloading or to use any app store they prefer.
This is a lie. I can only distribute a sideloaded app if Google doesn't nuke my account. If I piss off someone there, or they click the wrong button, or they change the requirements so I'm no longer eligible - my content disappears.
They promise that Android will still be open to student and hobbyist developers - but would you believe anything those monkey-punchers say? Oh, and what a fricking insult to call a legion of Open Source developers "hobbyists"!
I hate it.
I also don't see how this is going to help. I guess if scammers all use the same ID, then it'll be easy for Android to super-nuke all the scam apps.
Perhaps when you install a sideloaded app you'll see "This app was made by John Smith - not a company. Here's his photo. Got any complaints? Call his number."
But what's going to happen is that people will get their IDs stolen, or be induced to register as a developer and then sign some malware. They'll also be victims.
So What's The Solution?
I've tried to be pragmatic, but there's something of a dilemma here.
- Users should be free to run whatever code they like.
- Vulnerable members of society should be protected from scams.
Do we accept that a megacorporation should keep everyone safe at the expense of a few pesky nerds wanting to run some janky code?
Do we say that the right to run free software is more important than granny being protected from scammers?
Do we pour billions into educating users not to click "yes" to every prompt they see?
Do we try and build a super-secure Operating System which, somehow, gives users complete freedom without exposing them to risk?
Do we hope that Google won't suddenly start extorting developers, users, and society as a whole?
Do we chase down and punish everyone who releases a scam app?
Do we stick an AI on every phone to detect scam apps and refuse to run them if they're dodgy?
I don't know the answers to any of these questions and - if I'm honest - I don't like asking them.
33 thoughts on “Is it possible to allow sideloading *and* keep users safe?”
@Edent Is this just all about installing from trustworthy sources? Play Store is meant to be a trustworthy source. If a user wants to add another source, warn them, but don't block them. Gamers cheating isn't worth taking freedom away from everyone over. Scammers will always find a way to scam people.
None of this seems like a new problem to me. Benjamin Franklin: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."
| Reply to original comment on mastodonapp.uk
Mr Rando
Which bank do you bank with that is liable for any scams against you? It is quite clear that if you or your computer transfers money from your account, you're responsible, so you can remove that from your list of reasons. Banks are way better lawyered up and insured.
@edent
Take a look at https://www.bbc.co.uk/news/articles/cy94vz4zd7zo
Most UK banks have to refund people who are defrauded.
JohnSmith
I am ok with the bank telling me what hardware and/or software I can run their app on. ALL apps have requirements (I can't run Safari on Windows). I can moan, but still, it's their choice. But hardware manufacturers (via Google's imposition on all of them) telling me which developers/apps to run on my hardware, that's a different ball game.
You should have to write a 1000 word essay on why you should be allowed to enable rights on a sideloaded app before being able to install it. 😉
| Reply to original comment on bsky.app
@Edent The bank has another option: They can say to me, "You can only run our app on your rooted phone if you sign this waiver that we're not responsible for whatever happens." Why not - they already had me sign such a thing in relation to other products.
Similarly the game co. They _could_ say, "Our game is only certified safe on platforms X and Y. If you run it on anything else you assume all liability for the consequences."
99.9% of users are never going to want to do that anyway.
| Reply to original comment on indieweb.social
@Edent I think I arrive at much the same place as you: a big part of my objection is that Google are the gatekeepers.
Google and their ham-fisted policy implementations are *also* the reason I choose to use F-Droid over the Play store.
I'd have more sympathy for the plans if the trust store was controlled/handled by someone else - that would at least strike the compromise between protecting freedoms and the vulnerable.
But, Google can't be trusted with that kind of power
| Reply to original comment on mastodon.bentasker.co.uk
@Edent agree with the comment saying to avoid the loaded term sideloading, when you mean install.
| Reply to original comment on qoto.org
This is also a case of controlling third party app stores and in-app payment processors while appearing politically neutral, very much in response to the Epic vs Google loss and European DMA demands. Indie devs are just collateral damage.
@blog @Gargron
| Reply to original comment on mas.to
@Edent I recall that banks tend to be the first to point the finger at 'insecure' client devices, yet their own systems may be full of holes and open to social engineering attacks. I spent ages going through PCI DSS for full stack certification, only to have the bank outsource their own networking to the cheapest bidder and try to persuade me to change the endpoint for *unencrypted transfers* with an unauthenticated, unexpected phone call. Turns out it was genuine. Sigh.
| Reply to original comment on mastodon.online
New regulations, particularly Europe's Digital Markets Act (DMA), are compelling tech platforms to permit app sideloading—the installation of applications from sources other than official app stores. While intended to foster user freedom and competition, this shift introduces significant security challenges, making it exceedingly difficult to ensure user safety, according
| Reply to original comment on 200666.xyz
@ben @Edent my dilemma is that I don’t see an obvious answer for who else it could be. You don’t want a single government to control it but the groups like the Linux or Apache foundations which might be trusted don’t have the resources to vet people and handle appeals.
| Reply to original comment on code4lib.social
@Edent The weird thing about the banking apps example is that pretty much every bank that has a mobile app also has a website with all the same features as the mobile app. Banks say they need all these attestation and fingerprint APIs on mobile for security and act like the world will end if you root your phone, but you can still make a transfer through the website on a 20-year-old laptop running Linux Mint or whatever.
| Reply to original comment on mas.to
@Edent Personally, I don't trust any device with my data that I don't have root on, because that means that somebody else has more control over it than I do.
| Reply to original comment on mastodon.online
@Edent mind you, I'm technically competent, most of the sort of people who are taken in by scammers are not, and are easily persuaded to do dubious tasks.
| Reply to original comment on mastodon.online
@Edent Personally I think the right path is a bit of a mix of things. We must - must! - be able to run the software of our own choosing on our devices if we wish to do so. I think there can also be a place for voluntarily trusting a company to verify apps or developers, but it must be voluntary.
Perhaps this can be partly a decision made at the time of purchase. Maybe you can't undo it without factory resetting the phone, which few people will be willing to do on a whim
| Reply to original comment on jawns.club
@acdha @Edent Agreed, and things quickly get even more complicated if you decide there should be multiple (because then you've basically got TLS cert authorities again).
But, as TFA says, I'm also struggling to see the benefit - they'll be able to yank certs as they become aware of them, but by then the harm is likely already done and the next attempt will use some other mule's identity instead
| Reply to original comment on mastodon.bentasker.co.uk
Is it possible to allow sideloading and keep users safe? | Hacker News
| Reply to original comment on news.ycombinator.com
@Edent
Attempting _nuance_ on social media? Brave soul
@mozilla faces a similar dilemma with Firefox extensions in the face of a relentless wave of assholes and scammers. It's a depressing no-win position to be in
| Reply to original comment on infosec.exchange
James Addison
I'd be more comfortable sideloading a program onto my phone if the source code to that program is available - e.g. it can be categorised as free and open source software.
There is a problem, though: programs are usually distributed as binaries - and sometimes compiling the same source code produces different binaries. How could I know that the program I'm installing to the phone is genuinely the same program described by the source code?
The answer that I find acceptable is that the program should not only be free/open source, but also that it should build reproducibly[1], so that I know for certain that the bits/bytes I'm installing are what I would get if I (or a friend) compiled the program from source code myself.
Build reproducibility does not guarantee that the program has no bugs or problems. But it does ensure that if problems occur due to the program's logic, then the source code can be inspected to identify the cause. In my opinion this not only allows continuous progress/improvement, but also enables good software engineering practices.
Preventing scams becomes a question of sufficient review and inspection of the code - and of modifications to the code. This is a social and ongoing process.
[1] - https://www.reproducible-builds.org
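The verification step this comment describes boils down to a bit-for-bit comparison. Here's a minimal sketch in Python; the function names are illustrative, and real reproducible-build verification (as done by projects like F-Droid) also has to account for signing metadata before comparing:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large APKs don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_reproducible_build(downloaded: str, locally_built: str) -> bool:
    """A reproducible build is verified iff the distributed binary is
    bit-for-bit identical to one compiled independently from source."""
    return sha256_of(downloaded) == sha256_of(locally_built)
```

If the two digests differ, either the build isn't reproducible or the distributed binary doesn't come from the published source - which is exactly the property the commenter wants to be able to check.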
@Edent I think you answered your own question - there is no foolproof solution to allow 1 device to be secure and have the freedom to run untrusted software. The solution is to use 2 devices. One trusted, the other not.
| Reply to original comment on floss.social
@blog @Gargron Software from unofficial sources should be installable as some kind of containers. That would help.
Also it seems we should consider buying two devices, one for banking and other for playing around. 😅
| Reply to original comment on fosstodon.org
@williamoconnell @Edent indeed. If they actually cared about this, the obvious solution would be to stop courting developers to make apps and stop pushing users to install them when it comes to things where the site is perfectly capable.
Instead, we see orgs choose to deliberately hobble their site, sometimes to the point of being nonfunctional, all so they can induce people to approve overly broad requests for app permissions for their *own* apps, which they're perfectly okay with, of course.
| Reply to original comment on kosmos.social
3 comments
@blog
This is a central problem of what makes societies: where do we put trust? In a capitalist world we are pushed all the way to not trust each other but only the State and companies, not because we want to, but because we have to. It's only in this world that trusting Google is the best thing to do to prevent malware.
Which is why we absolutely need to build another system where we have collective circles of trust. No need to wait for the Revolution to happen, we have already started it: our Fediverse instances are bundles of trust, oftentimes not for profit, where we can ask each other questions and support each other. This is, to me, the model where we need to go: those-who-know must take some time to show those-who-don't around, teach them how to actually use a smartphone, install F-Droid and then trust them with everything that's on F-Droid, ....
To me the pragmatic (I hate that word) approach is to build mutual aid communities, bring back the human touch. Yes, sideloading is full of malware, but the solution is to turn to your local geek groups. LUGs used to do that; they should be brought forward into the 21st century, where the main device is a smartphone.
@Gargron
| Reply to original comment on blah.rako.space
Dr. Tim
"Consider a bank which has an app."
Sure, let's. Every (modern, reputable) bank also has a webpage which lets you log in and perform account operations. Are you claiming that they're OK running Javascript (and a lot of it, judging by how slow it is) on my computer, but not x86/ARM machine code, even in a sandbox -- unless the OS prevents me from installing other applications which haven't been through the OS manufacturer's approval process?
No part of this makes any sense.
Javascript is way easier to mess with than machine code, both the delivered application and the platform itself. Sandboxes should be perfectly sufficient to keep applications isolated from each other, so the presence of other applications (or how they were installed) is irrelevant. And the approval process itself is so infamously cursory that plenty of sketchy apps make it through all the time. Besides, banks are fine with the current state of affairs on the web, so they've already got the proper interfaces and abstractions!
I might be OK with OS vendors adding restrictions for the benefit of banking software, but only if the proposed restrictions actually provided some possible mechanism which might conceivably help. Is Android's web browser also going to limit users to visiting pre-approved URLs? Because if not, it's absurd to think that doing this for applications will solve anything.
@edent
I don't know where in the world you are, but the UK has lots of banks which are app-only. No legacy websites to secure.
Dr Tim
Interesting. I’m in the USA. We have one or two banks here…
Note that this only affects one prong of my comment. We still have sandboxes. We still have laughably bad app approval processes. Bad guys still have debuggers and packer sniffers.
@edent
Android already has sandboxes. It isn't possible for one app to interfere with another - https://source.android.com/docs/security/app-sandbox
But that doesn't stop someone downloading a fake bank app and putting their real details in.
I think Terence might be baiting me 🙂
Good article, though. I think the underlying question is not “should we attempt to keep vulnerable people safe from scams” but “who should be charged with protecting people from scams?” To my mind, that’s definitely not huge corporations that could – and will – use that power to generate profits that would have made Rockefeller blush.
| Reply to original comment on www.ianbetteridge.com
Olivier Barthelemy
Hi.
Interesting piece, and important debate. I think you forget a key option: user choice. Car analogy: cars that can go well past the speed limit, or that are old enough to no longer meet today's road-worthiness standards. Pizza analogy: pineapple. People choose to use, and abuse, those.
This freedom comes with 2 issues:
1- unqualified users taking exaggerated risks. The OS must publish very clearly a safety status: Green = from store, Yellow = authenticated sideload, Red = anonymous sideload. And remind the user regularly / at all times of the status of their device and of each app.
2- hijacking of one app by another app. This might/could/should be solved by sandboxing, and if the OS itself is compromised (custom ROMs...), apps should be able to refuse to run.
The issue with forcing devs to reveal themselves is authoritarian govs (incl the US now) going after devs or users, and the possibility that app's supply chain might get specifically poisoned. Security through obscurity isn't very effective, but it's better than a big red arrow pointing at Mr "free-speech app" dev and their server/userbase.
In the end, I think of the mobile ecosystems as infrastructure, on the same level as roads and the electricity grid. We can be unsafe on roads, and plug in stuff that'll catch fire or even blow up the grid, because the alternative of cars limited in their features/capabilities/even destinations, and devices having to be validated/sold/made by the grid operator, is unacceptable both from a functionality/innovation standpoint and from a competition standpoint. Ditto apps.
Terence Eden’s upset at Google’s proposed changes to Android to further lock-down the ecosystem, and I’m concerned too.
| Reply to original comment on danq.me
I agree with everyone else. Using a commercial phrase like "sideloading" is already buying into the unreal view these for-profits are trying to spread. The word is install. It's all much clearer if you frame the issue using actual concepts instead of marketing ones.
We need to accept that smart phones are not computers. They are obviously not general purpose computers. They are not owned by the end user and cannot be owned by them at multiple layers in the stack from the PHY and transmit licensing to the applications themselves.
Because of this we need to stop computing on smart phones. I know that's a rough pill to swallow. But that's just the legal and practical reality that intrudes into day to day use more and more. They aren't computers. What they are is amazing dumb terminals for banking, communication systems, and hot spots for using real general purpose computers that you can and do own.
More comments on Mastodon.