I guess people have already offered some community-focused opinions on this.
Realistically, you might not believe it, but machines are very good at what they do. Computers can crunch complex maths and learn from the large amount of data the network generates to better understand how players act and interact. A properly trained and programmed piece of anticheat software can be much more efficient than a human guessing by eye whether a player is using disallowed mods. Edit: The reason it is so difficult is that a client can also play around with packets to deceive the server. It's a cat-and-mouse game, like Tom and Jerry: server and cheat-client developers will constantly update their systems to stay ahead of each other. Hopefully we'll get to a point where behavioural detection is effective enough that manual intervention is rarely needed.
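As a rough illustration of the "learn from behaviour" idea, here's a minimal sketch (all names and numbers are hypothetical, and real anticheats are far more sophisticated): it flags players whose clicks-per-second falls far outside a baseline of known-legitimate play, using a simple z-score.

```python
import statistics

def zscore_flags(baseline, observations, threshold=3.0):
    """Flag observations lying more than `threshold` standard
    deviations above a baseline of legitimate behaviour.

    baseline:     list of clicks-per-second values from known-clean play
    observations: dict of player name -> measured clicks-per-second
    Returns the set of player names that look suspicious.
    """
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline)
    if stdev == 0:
        return set()  # degenerate baseline; nothing to compare against
    return {
        name for name, cps in observations.items()
        if (cps - mean) / stdev > threshold
    }

# Hypothetical data: legitimate players click roughly 6-7 times per
# second; one account sustains 22, which no human can manage.
baseline = [6.1, 7.4, 5.8, 6.9, 6.3, 7.0]
observed = {"bob": 7.1, "sus": 22.0}
print(zscore_flags(baseline, observed))  # only "sus" is flagged
```

A real system would combine many signals (aim angles, reach, packet timing) rather than a single metric, which is exactly where a trained model beats a hand-written threshold.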
Some players might be banned falsely by such a system while it learns, but these systems can be built to compensate for confounding factors like the quality of a player's connection (high latency can make legitimate play look suspicious). This reduces the load on staff, as well. They are volunteers, after all; nobody wants to spend their free time watching over hackers forever. They're players, just like you, who choose to spend some of their time helping with moderation, and their job should not be made harder when technology can relieve them.