The password in question, “solarwinds123,” was almost laughably easy. The high-stakes drama that it may have triggered, with Russian hackers spying on federal agencies and businesses, was downright cinematic.

But the news that poured out of IT management company SolarWinds earlier this year, with bad actors from Russia exploiting the company’s security weaknesses to cause possibly the worst security breach in U.S. history, is neither entertaining nor funny. It’s deadly serious, and it has created a chorus of commentators asking the same resounding question: what did SolarWinds know about its vulnerabilities, and why didn’t someone act earlier?

The ethical issues that arise in the wake of discovered security vulnerabilities are vast and murky. And sometimes, the warning signs are familiar and almost too obvious. Like poker players, many of these ecosystems have a “tell” that a skilled observer can easily identify.

These discoveries come as little surprise. But what does often shock is the reaction — or lack thereof — that corporations, businesses and government security entities give us when they are notified about these vulnerable links in the chain.

Far too often, the conversation about how and when to disclose security weaknesses shifts from a dialogue to a one-way monologue. Even more troubling, sometimes there is no conversation at all. Many organizations shut the door on security companies, practitioners and even white-hat hackers with no financial incentive, all of whom are attempting to ring the alarm bell. Or companies make empty promises to review and remedy, promises that often go unfulfilled.

Forty-seven percent of cybersecurity professionals investigate only 10 to 20 threats per day, according to a report from CriticalStart. Sixty-eight percent reported that up to three-quarters of the threats they do investigate are false positives.

And in the midst of this sluggish pace, there is massive burnout to contend with: that same report revealed that nearly half of all cybersecurity professionals experienced up to 25 percent turnover in their organizations last year, and 38 percent receive less than a full week’s worth of cybersecurity training each year. This is a volcano waiting to erupt.

Part of the problem likely stems from the fact that many organizations never put a disclosure system in place to begin with. Without a clear, easily followed process embedded in company culture, a practice of blame-shifting and finger-pointing takes hold instead. And for many CISOs, the nagging issue of a potential security breach and the ethical mandate to disclose and create dialogue becomes just another task on the to-do list. It is pushed aside and carried over, week after week. And sometimes, after long enough, it is swept under the rug and forgotten.

Sometimes this head-in-the-sand behavior, even if it is born of desperation, overwork and understaffing, turns into something deliberate.

But while companies are dragging their feet, bad actors are mobilizing their armies. In my own work, I’ve met more CISOs than I care to admit who create an email address that doesn’t even follow their company’s standard format. This makes them harder to contact, and therefore essentially impossible to alert. Some organizations’ existing disclosure programs are even designated “top secret,” bound by strict NDAs and accessible by invitation only. The drawbridge is always up; the moat is presumed impassable. And what organizations don’t know, they are not obligated to address or resolve. I’ve also run into plenty of organizations that declare outright that they don’t want to receive disclosures, because they have no desire and/or no capacity to deal with the liabilities those disclosures create.
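One low-cost way to lower that drawbridge, for organizations willing to be contacted, is the security.txt convention standardized in RFC 9116: a small plain-text file served at /.well-known/security.txt that tells researchers exactly where to send a report. A minimal sketch follows; the addresses and URLs are placeholders, not real endpoints:

```text
# Served at https://example.com/.well-known/security.txt (RFC 9116)
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Encryption: https://example.com/pgp-key.txt
Policy: https://example.com/security-policy
Preferred-Languages: en
```

Publishing even this much signals that a report will reach a human, which is precisely what the hard-to-find email addresses described above are designed to prevent.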

But as we saw clearly with SolarWinds, ignoring a security problem doesn’t make it go away. If anything, left unattended it festers and grows until it can do far more than cause annoyance and frustration. It can become the straw that breaks the company’s back.

To move forward and shift the culture around disclosures, the first challenge is to find the sweet spot between being approachable and actually inviting hacking attempts. The door needs to be open to those who wish to raise alarms, but firmly closed to those who want to break the frame and crash right into the building.

Too many CISOs are stuck between two terrible options: if they don’t catch an issue, they are bad at their jobs. But if they do catch it and then fail to act on it, they risk being blamed for a security failure and losing their reputation, or worse, their livelihood.

Proper regulations for ethical disclosures need to right the balance of this entirely lopsided situation. Some companies now allow disclosers to direct a financial reward to a charity of their choice rather than pocketing it themselves. Recognizing the value that disclosures bring to a company’s security is a great first step, but it must be followed by a concrete plan that lays out the steps toward a CISO’s desired outcome.

The bottom line: CISOs must do what’s right in terms of disclosures, and to motivate them in that direction, the burden of identifying and inventing what’s right needs to be shifted. We need to set the rules for them. Without regulations, we’re all just holding our breath and waiting for the next attack.