Why we need to support, not punish, curious security minds
Recent reports of teen security researcher Bill Demirkapi exposing major vulnerabilities in software used by his school certainly brought back some memories. I remember being the curious kid, lifting the hood on software to take a peek underneath and see how it all worked - and, more importantly, whether I could break it. For decades, software engineers have sought continuous improvement and fortification of their products, and the security community (while a little cheeky in its approach, at times) plays an important role in discovering flaws and potential disasters, hopefully before a bad guy does the same.
However, the issue here is that in response to his discoveries, he wore a minor suspension from school - and that only came after he exhausted all avenues to contact the company (Follett Corporation) privately, finally opting for a rather public blast to identify himself and demonstrate his ability to breach their system. His repeated attempts to ethically warn Follett Corporation went without reply, while the software remained vulnerable and mountains of student data sat fairly easily exposed, as much of it was unencrypted.
He also hunted bugs in another firm’s software: Blackboard’s. While Blackboard’s data at least had encryption, potential attackers could have reached in and nabbed millions more records. Both this software and Follett’s product were in use by his school.
The ‘evil hacker’ narrative is problematic.
Demirkapi presented his findings at this year’s DEF CON, with the more mischievous details of his antics getting crowd applause. Rightly so, really - while he was initially in the bad books and faced many hurdles to get his discoveries recognized, Follett Corporation was reportedly thankful for his efforts and acted on his advice, ultimately making their software more secure and averting the crisis of becoming another data breach statistic. He’ll also be attending the Rochester Institute of Technology after completing high school, so it’s clear that he’s on the right path to becoming an in-demand security specialist.
As a security guy myself, it’s difficult not to take issue with how this situation was handled. Though all’s well that ends well in this case, he was initially treated like an annoying script kiddie putting his nose where it didn’t belong. A Google search of the incident turns up articles that refer to him as a “hacker” (in the mind of the security layman, this positions him as the villain in a lot of ways), when in fact his approach (and that of many others) is what helps keep our data safe.
We need inquisitive, clever and security-focused people looking under the hood, and we need it happening far more often. As of July, over four billion records have been exposed in malicious data breaches this year alone. You can potentially add another fifty million to that figure, thanks to the August breach of fashion and lifestyle brand, Poshmark.
We’re making the same mistakes, and even more worryingly, they’re often facepalm-inducing, simple vulnerabilities that keep tripping us up.
Cross-site scripting and SQL injection haven’t gone away.
As reported by WIRED, Blackboard's Community Engagement software and Follett's Student Information System were found by Demirkapi to contain common security bugs like cross-site scripting (XSS) and SQL injection, both of which have been thorns in the side of security specialists since the 1990s. We’ve endured their existence for a really long time, and much like Hypercolor t-shirts and floppy disks, they should be a distant memory by now.
But they’re not, and it’s clear that not enough developers have the security awareness to stop introducing them into their code. Scanning tools and manual code reviews can only do so much, and there are far more complex security problems than XSS and SQL injection where these expensive and time-consuming measures would be better spent.
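To see why SQL injection is so avoidable, here’s a minimal sketch using Python’s built-in sqlite3 module and an invented `students` table (a hypothetical example, not the actual schema of either product). The vulnerable version builds the query with string interpolation; the safe version binds the input as a parameter:

```python
import sqlite3

# Hypothetical in-memory database for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER, name TEXT)")
conn.execute("INSERT INTO students VALUES (1, 'alice')")

def find_student_vulnerable(name):
    # String interpolation lets attacker-controlled input rewrite the query.
    query = f"SELECT id FROM students WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_student_safe(name):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT id FROM students WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
# The classic payload returns every row from the vulnerable version...
print(find_student_vulnerable(payload))  # [(1,)]
# ...but matches nothing when bound as a parameter.
print(find_student_safe(payload))        # []
```

The fix is a one-line habit, which is exactly why this bug class shouldn’t survive a code review, let alone ship.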
People like Bill Demirkapi should inspire developers to create a higher standard of code; at just 17, he breached two high-traffic systems by way of threat vectors that should have been sniffed out and corrected before the code was even committed.
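The other bug class named above, XSS, reduces to the same pattern: untrusted input reaching an interpreter unescaped. A minimal sketch with hypothetical render functions, using Python’s standard html.escape:

```python
from html import escape

def render_comment_vulnerable(comment):
    # Untrusted input pasted straight into markup: a stored XSS hole.
    return f"<p>{comment}</p>"

def render_comment_safe(comment):
    # Escaping turns markup characters into inert entities.
    return f"<p>{escape(comment)}</p>"

payload = "<script>alert('xss')</script>"
print(render_comment_vulnerable(payload))  # script tag survives intact
print(render_comment_safe(payload))        # &lt;script&gt;... rendered as text
```

In practice a templating engine with auto-escaping enabled does this for you by default; the vulnerability usually comes from bypassing it.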
Gamification: The key to engagement?
I’ve written a lot on why developers remain largely disengaged with security, and the short answer is that not a lot is being done at the organizational or educational level to foster security-aware developers. When companies take the time to build a security culture that rewards and recognizes engagement, including implementing training that speaks the language of developers and motivates them to keep trying, these pesky vulnerability relics start to disappear from the software we use.
Demirkapi obviously has an extracurricular interest in security, and has taken the time to learn how to reverse-engineer malware, spot flaws and, well, break stuff that doesn’t appear broken from the outside. However, in speaking to VICE (and via his DEF CON slides), he made an interesting statement on his self-education… he gamified it:
“With the goal being to find something in my school’s software, it was a fun, gamified way of teaching myself a significant amount of penetration testing. Although I started my research with the intent to learn more, I ended up finding out things were a lot worse than I expected,” he said.
While not every developer is going to want to specialize in security, every developer should be given the opportunity to become security-aware, with the basics serving as almost a “license to code” within an organization, especially those in control of masses of our sensitive data. If the simplest security holes can be prevented by every developer before they’re even written, we’re in a much safer position against those who seek to wreak havoc.