Cofield: And when bad things happen, we should figure out what went wrong, so we can learn from it and correct it.
House: So that we can assign blame instead of recognizing that bad things sometimes happen. It was nobody’s fault.
– House, S8E11, Nobody’s Fault
Programmers are lazy. You might’ve seen a quote - attributed to various tech idols, pick your favourite - that goes along the lines of “a good programmer is a lazy programmer, because laziness drives them to reduce their workload”.
See, I use the computer every day, and have since I was young. I use lots of different programs, I use a web browser with lots of different sites. One thing they have in common is that they are usually broken and buggy in inscrutable ways; ways that make you think, “how did this even make it out of the IDE” (special shoutouts to how the Kotlin IntelliJ plugin spams errors every single update on relatively bog-standard projects). When I use this software, I think about that quote about lazy programmers.
Today, the 7th of June 2023, the discovery of the Fractureiser worm (or Nekolauncher) was publicly announced by prominent modding figures. Fractureiser is a semi-self-replicating piece of malware that specifically targets modded Minecraft users. It is a three-stage worm that infects all Java archives on the system and scrapes credentials to be sent to a remote server.
If this is news to you, please, read this. Then come back here.
This is not the first piece of Minecraft-related malware, and it certainly won’t be the last, but it is (as far as I know) the first one that specifically targets both modded end users and modded developers. The saddest part is that it was entirely preventable: it was the result of multiple points of failure, and a single fix at any one of them would have killed this on day one. But programmers are lazy.
Point #1 - Distribution
The problem with writing malware is that you need to find a way to get it onto your target’s systems. There’s a few ways to do this, in reverse order of difficulty:
- (if you have physical access) Physically install it. Who’s gonna stop you?
- Convincing somebody to download and run it
- Compromising somebody’s account and then remotely installing it
- A zero-day exploit in something
Fractureiser uses a combination of options 2 and 3; most malware using option 2 hosts its own distribution servers and just needs to trick you into going there. The worm, however, was hosted directly on CurseForge and distributed to end users through the CF launcher or other modded launchers that directly download the mods.
Please note that you might see some people talk about CurseForge’s content moderation, or claim this was the fault of their moderation for not catching the malware. This is complete bullshit. Outside of the really, really obvious tells (such as a .so packed in with the jar), statically analysing compiled Java code is prohibitively difficult and very easily circumvented. Unless you want to block every single upload on somebody poring through the decompiled code to see if it’s doing something wrong - and this is difficult, even for seasoned reverse engineers! - there’s no real way to vet mod releases. CurseForge’s real problem is that they pretend to vet mods far more thoroughly than they actually do.
(EDIT 2023-06-08T01:41: Corrected the way this was spread.)
Fractureiser was initially spread by sockpuppet accounts spamming fake mods disguised with names similar to real mods, and then may have spread further by compromising the CurseForge access tokens of mod authors who downloaded the mod. It contains the ability to inject its downloader code into fresh JARs (even non-mods, sorta rootkitting an entire system if you use Java anywhere else), allowing it to spread further through legitimate channels.
The problem here is that CF allows uploading new artefacts without multi-factor authentication. How many times are we going to learn this lesson? Requiring MFA to upload artefacts would have shut this attack down immediately. Sure, an actively malicious author would have knowingly uploaded the malware anyway, but it stops the worm from replicating into other mod uploads! Instead, any mod may be unknowingly compromised. But this would be a slightly different flow, and developers are lazy, so nobody wants to implement it lest they face the backlash. PyPI announced mandatory 2FA for uploads multiple months ago, despite backlash, so it is possible.
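And this isn’t expensive to build, either. The codes from an authenticator app are just HOTP (RFC 4226) keyed on the current 30-second window; a minimal sketch of that computation, using only the JDK (class and method names are mine, purely illustrative, and this skips everything a real deployment needs, like constant-time comparison and rate limiting):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal HOTP (RFC 4226) sketch - the building block behind the
// six-digit TOTP codes that authenticator apps display.
public class Hotp {
    public static int generate(byte[] secret, long counter) throws Exception {
        // HMAC-SHA1 over the counter encoded as 8 big-endian bytes.
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret, "HmacSHA1"));
        byte[] hash = mac.doFinal(ByteBuffer.allocate(8).putLong(counter).array());

        // Dynamic truncation: pick 4 bytes at an offset derived from the hash.
        int offset = hash[hash.length - 1] & 0x0f;
        int bin = ((hash[offset] & 0x7f) << 24)
                | ((hash[offset + 1] & 0xff) << 16)
                | ((hash[offset + 2] & 0xff) << 8)
                | (hash[offset + 3] & 0xff);
        return bin % 1_000_000; // six-digit code
    }

    public static void main(String[] args) throws Exception {
        byte[] secret = "12345678901234567890".getBytes(StandardCharsets.US_ASCII);
        // TOTP is just HOTP with counter = unixTime / 30.
        long counter = System.currentTimeMillis() / 1000 / 30;
        System.out.printf("%06d%n", generate(secret, counter));
    }
}
```

The server side stores the shared secret, computes the same code for the current window, and rejects the upload if they don’t match. That’s the whole trick.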
Point #2 - Mod Signing
Okay. If you can’t stop the malicious mod files from being uploaded, you can at least stop them from being executed on the client-side. This is not a solved problem but we’ve gotten pretty close! The answer is cryptographic signatures.
If you use a Linux system, your security depends on cryptographic signatures. Every single distribution uses PGP signatures to sign their packages that your package manager installs to prevent Mallorys from intercepting them and replacing the binaries with malicious ones. This is achieved using asymmetric cryptography; when you install your system, a set of known good keys is installed alongside it from the installation medium or live environment and all packages are checked against this set of keys. If a package is signed with a key it doesn’t recognise, it fails very loudly to install.
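The mechanics are worth seeing concretely. Here’s a toy sketch of detached signing and verification - the same shape as what a package manager does, minus the web of trust - using only the JDK’s own crypto APIs (the artefact bytes and names are made up for illustration):

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

// Toy sketch: the distributor signs an artefact with their private key,
// and the client verifies it against a pre-installed public key.
public class SignDemo {
    public static void main(String[] args) throws Exception {
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair pair = gen.generateKeyPair(); // the distributor's keypair

        byte[] artefact = "mod-1.0.jar contents".getBytes();

        // Distributor side: sign the artefact bytes.
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(pair.getPrivate());
        signer.update(artefact);
        byte[] sig = signer.sign();

        // Client side: verify against the known-good public key.
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(pair.getPublic());
        verifier.update(artefact);
        System.out.println("genuine: " + verifier.verify(sig)); // prints true

        // A tampered artefact fails verification, loudly.
        verifier.initVerify(pair.getPublic());
        verifier.update("mod-1.0.jar contents, plus a worm".getBytes());
        System.out.println("tampered: " + verifier.verify(sig)); // prints false
    }
}
```

The only hard part - and it is genuinely hard - is getting the public key onto the client through a channel the attacker can’t touch, which is why distros bake the keys into the installation medium.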
Java has a mechanism for signing JAR files; the official client and server JARs are signed. Remember deleting META-INF? That’s because it contains signatures and checksums for the JAR that Java verifies. Invalid signature? Gone. In fact, Forge had JAR signing ten years ago, but nobody adopted it because it was slightly annoying. The fault here kinda lies with Forge, too, as they should have enforced it harder. No signature? No mod. This would have prevented Fractureiser from gaining a foothold, as the modified jars would have simply failed to load.
Cryptographic signatures and JAR signing like this are difficult to implement - I will admit that. But it’s not completely insurmountable. The wider web uses TLS for exactly this purpose and that works great! The key problem (no pun intended) is figuring out what certificate to trust and where; for example, it’s no good embedding the public key into the JAR file itself, as malware can simply swap it out. The wider internet solves this with Certificate Authorities: a set of public keys that are built into every single browser and operating system, used to build a chain of trust up to the end user’s certificate, whose ownership has been verified by said certificate authority. If you’re not me and you’re reading this, you’ve gone through at least two different TLS certificates to see this. My Let’s Encrypt TLS certificate proves that I own this website, and (for some reason) has the ability to emit digital signatures that I could then use to prove that any mod in the tf.veriny namespace is owned by me and nobody else.
Now, I only practice cryptography recreationally and I’m certainly not a mod loader developer, so I can’t offer concrete suggestions - nor will I. But there’s nothing about this that isn’t feasible from a technical perspective. It just isn’t implemented anymore.
Point #3 - Mod Supply Chain
(This is a tangential point, but it’s mentioned in the Fractureiser docs, so I’ll talk about it too.)
Mods are developed (for the most part) using a tool called Gradle. Gradle is fantastic! It’s overhated by people who write thousand-line-long Groovy buildscripts and have never tried to understand it. Sure, it was kinda really bad a few versions ago but the 8.x versions are perfectly fine. Most importantly, Gradle is pluggable, or in other words it supports plugins to customise the functionality of your build.
Mod development with Gradle depends on one of two plugins: ForgeGradle or Fabric Loom. (Basically everything that isn’t one of these is actually Loom in disguise; honorable mention goes to things like VanillaGradle or Minivan, which are Loom’s organs arranged into different shapes.) I’ll focus on Loom because it’s what I’m most familiar with, as a primarily Fabric developer.
Here’s a nice list of possible entry points into Loom that an aspiring malware author could use to create another worm like Fractureiser:
- The Fabric example project depends on Loom 1.2-SNAPSHOT instead of pinning a version. Gradle will silently redownload newer snapshot versions of plugins or dependencies, so all you need to do to compromise everyone is get a poisoned snapshot onto a plugin repository that a Gradle project can use. (It’s not truly silent, but you can disguise this, sorta…) This does require some conscious effort to actually make people use your malicious repository, but given that most people hate Gradle so much they just copy-paste buildscripts everywhere, it’s not as hard as it might seem.
- Loom adds repositories to your project, unsolicited, with no way of disabling it. A secondary plugin can swoop in, swap these repositories out, and nobody will notice, because who really knows what Loom adds? Gradle will output what it’s downloading - but, again, that might just be a client dependency that Loom implicitly adds, unless you look closely. Or it might be something from a buildscript you copy-pasted and don’t understand.
- A lot of mod dependencies aren’t hosted on their own Maven repository. Instead, you use CurseMaven or Modrinth’s fake maven to depend on them. See Point #1.
- A lot of people use a plugin to automatically publish to CurseForge or Modrinth, filling in a token from environment variables so that the API token isn’t sitting obviously in the git repo. The vuln is pretty obvious here: a malicious plugin that somebody gratefully copy-pasted into their buildscript (because they hate Gradle and don’t want to learn it) can just run the publication task after injecting some generated code. Realistically, this is by far the best injection point for a Fractureiser V2: socially engineer somebody having Gradle problems into including your malicious plugin, then auto-publish a backdoored release using the convenient token they provided you. Again, this is entirely defeated by mandatory 2FA when uploading files! See Point #1!
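Some of these holes can be shrunk today with a bit of buildscript discipline. A sketch of two mitigations - pinning the plugin to an exact version, and using Gradle’s repository content filtering so no repository can serve artefacts outside the groups you expect from it (the version number and URLs here are illustrative, check them against your own setup):

```groovy
// settings.gradle - pin an exact Loom version instead of a -SNAPSHOT,
// so Gradle never silently swaps the plugin out from under you.
pluginManagement {
    plugins {
        id 'fabric-loom' version '1.2.7' // exact pin; 1.2.7 is illustrative
    }
}

// build.gradle - restrict what each repository may serve, so a swapped-out
// or compromised repository can't shadow dependencies it shouldn't own.
repositories {
    exclusiveContent {
        forRepository {
            maven { url 'https://maven.fabricmc.net/' }
        }
        filter {
            includeGroup 'net.fabricmc'
        }
    }
    mavenCentral()
}
```

This doesn’t fix repositories that plugins inject behind your back, but it raises the cost of the snapshot-swap and repository-shadowing tricks considerably.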
I am really mad about this malware. I wasn’t personally affected, but I could’ve been as I’ve been extensively playing modded Minecraft in the last few weeks. And if I was, it wouldn’t’ve been my fault!
In all honesty, if you get malware on your computer in 2023, it is usually your fault. You clicked a dodgy link, you downloaded an obviously fake PDF, et cetera, et cetera. Sometimes you get got by a zero-day; even then, it’s sometimes your fault for not updating. If you got got by Fractureiser, however: it’s not your fault! You did nothing wrong! It’s a systemic failure. This could have been prevented twice over.
This will happen again unless these issues are addressed. The door has been opened. Copycats will spawn.