Chess and Silos: Stockfish's Buffer Overflow as a Parable
"What we’ve got here is a failure to communicate." ~ The Captain, Cool Hand Luke (1967)
"So, so what? / I'm still a rock star / I got my rock moves / And I don't need you" ~ Pink, So What (2008)
Last week, the community that develops and maintains the open-source chess engine Stockfish witnessed an age-old dance, not between pawn and rook, but between security researchers and developers. This time-honored ritual is more formally known as the Security Defect Report.
The vulnerability discovered in Stockfish is something of a classic, so much so that buffer overflows made the original OWASP Top 10 list. The issue was flagged by user ZealanL, who found that if a player inputs a position list (essentially a text-based representation of an ongoing chess game) with too many moves, the excess moves begin to overwrite data outside their designated area, corrupting other parts of Stockfish's machine state. The best-case outcome is a crash; the worst case is the engine being tricked into executing attacker code disguised as a chess game.
Surely, the developers saw this report, thanked ZealanL for their findings, and immediately began securing this input, right? Well, not exactly. What ensued was a back-and-forth debate, a dance all too familiar to seasoned security researchers and developers. The Captain from Cool Hand Luke might have diagnosed this as a failure to communicate, and many echoed Pink's defiant lyrics, asking, "So what?" The developers didn't need ZealanL's input, and as the tension escalated, the conversation was closed.
The "So What" does matter. The developers' initial response was that Stockfish was free to crash on any illegal move list, that is, a list impossible to reach without breaking the rules of chess, such as having too many queens on the board at once. It became clear that the developers respected the rules of chess more than the rules of software security. They felt that a user audacious enough to violate the rules of chess could live with crashing their own instance and whatever consequences followed. Security experts countered that this was a buffer overflow, a major concern, but their warning fell on disinterested ears.
This cultural disconnect happens daily in development shops and security labs worldwide. Everyone has their "So Whats" that drive their daily priorities and actions. The "So What" dance begins when a security professional advises, "When you do this, don't do that," and the developer responds, "Why shouldn't I do that?" What follows is a back-and-forth where each side speaks according to its own set of priorities without ever truly understanding the other's perspective.
One common strategy security researchers use is creating a proof-of-concept to demonstrate the severity of a vulnerability. Developers then expect such a proof with every vulnerability reported. This dance occurs where trust is lacking and vision is not shared. If security only presents restrictions, developers will leverage every ticketing system, workflow, policy clause, and chain of command to keep them at arm's length, effectively creating a silo.
This silo enables them to do what they're incentivized to do—create functionality and push new features—while minimizing obstacles. It’s a self-defense mechanism.
In future posts, I'll discuss breaking down silos, but the "quick" fix involves having leadership redefine developers' priorities to include security, or the security team fostering interpersonal relationships with the development team rather than only showing up when there are problems. Outreach and threat intel briefings are beneficial, as are security expos and talks from external speakers. Security champions are another excellent way to embed security professionals into pre-existing silos and help dismantle them.
In the end, user Vondele closed the thread without the issue being addressed or resolved, despite the dedicated efforts of ZealanL and other security researchers to demonstrate the problem, communicate its severity, and propose a fix. The rationale was that crashing on bad input was acceptable, and that a classic OWASP Top 10 alumnus like the buffer overflow posed no threat to their game of chess.
Security doesn't always win; sometimes it gets outplayed. However, that doesn't mean security has to lose in your organization. Never stop communicating, never stop advocating for prioritizing security, and never stop arguing that for your company to do what it does well, it must do it securely.
There are always more words to spend on a topic like this one, but I've hit my budget for now. Stay secure, and never forget the humans.