An ex-Uber chief security officer has been sentenced to probation after being found guilty of trying to cover up a 2016 data breach.
In a statement released by the U.S. Attorney's Office, Joseph Sullivan was sentenced to serve a three-year term of probation and ordered to pay a $50,000 fine.
In October 2022, a federal jury convicted Joseph Sullivan, the former Chief Security Officer (CSO) of Uber Technologies, Inc., of obstruction of proceedings of the Federal Trade Commission (FTC) and misprision of felony in connection with his attempted cover-up of a 2016 hack of Uber.
Sullivan was hired as Uber’s first chief security officer in 2015. In November 2016, ten days after Sullivan testified, under oath, to the FTC regarding Uber’s data security practices, he learned the company was hacked again.
According to reports, although the data breach occurred in 2016, it wasn’t revealed until 2017. Uber admitted that the breach, which affected 57 million users, was covered up and that $100,000 in bitcoin was paid to the threat actors to ensure the information wasn’t made public. According to the Department of Justice (DOJ), Sullivan took several steps to prevent the FTC from finding out and arranged to pay off the hackers in exchange for them signing non-disclosure agreements.
“Sullivan executed a scheme to prevent any knowledge of the breach from reaching the FTC,” the U.S. Attorney's Office, Northern District of California said in a statement. “For example, Sullivan told a subordinate that they ‘can’t let this get out,’ instructed them that the information needed to be ‘tightly controlled,’ and that the story outside of the security group was to be that ‘this investigation does not exist.’”
Security leaders weigh in
Security magazine asked enterprise security leaders their thoughts on the recent sentencing of Sullivan and what impacts the ruling may have on the broader landscape of the industry and cybersecurity.
How does the Joseph Sullivan case highlight the importance of transparency and accountability in cybersecurity incidents?
Roy Akerman, Co-Founder and CEO, Rezonate: Infosec is a business risk, spanning finances and the services provided to customers, and it belongs to all C-levels and boards, not just the CISO. Operational cybersecurity processes must reflect that in every action taken, from investigation to remediation of security breaches. Incidents are never a mere technological issue; when an incident arises, it should be escalated to senior management with the utmost transparency about the risk, the actions taken, and the path to resolution. CISOs must know that at the end of the day they’re managing a business risk.
Linn F. Freedman, Partner and Data Privacy + Cybersecurity Team Chair, Robinson+Cole: Covering up an incident is not within the definition of “transparency.” This case is outside the norm, and the penalty is indicative of that fact. It illustrates that cooperating with regulators during an investigation is expected and part of the normal process. As a former regulator, my experience is that regulators are generally open to discussion with companies about matters under investigation and understand the balance between providing information necessary to evaluate the incident and protecting proprietary or irrelevant information. Covering up is obviously not something that regulators look favorably upon. This case shows how important it is to abide by applicable laws and to cooperate with regulators following an incident. The risk that is magnified here is an obvious one: if a company knowingly keeps an incident from a regulator, the price paid will be high when the regulator finds out about it.
David Maynor, Senior Director of Threat Intelligence, Cybrary: I don't blame organizations that aren't immediately transparent when they first learn of a breach. They need time to assess what was compromised and what might have been compromised. At that point, you don't want to broadcast what happened or your response plan, since the threat actor could be listening as well. That said, there is a clock running on these situations, and once the adversary has been exorcised from the network, it is time to talk to your comms people and build a timeline of events and steps taken. So, transparency and accountability are important, but top priority must go to triage and stopping the bleeding.
Ted Miracco, CEO, Approov Mobile Security: This case brings the important issues of transparency and accountability to the forefront, even if the sentencing was light and the accountability was limited. It would have been ideal if the former CEO, general counsel and any board members who condoned this payout had also been held accountable, but this is a good start on opening executives’ eyes to the risks of not disclosing data breaches.
I would hope that this case would both encourage higher ethical standards and timely disclosures of data breaches and simultaneously discourage payouts to criminal hackers. That is the best-case scenario; the worst-case scenario is that CISOs become the sacrificial lambs in cases like these, while the CEOs, boards and others who are complicit in these decisions avoid accountability.
What impact will this case have on the broader cybersecurity and data privacy landscape?
Akerman: Board and CXOs will require ongoing reporting and a continuous approach to cyber-risk management. This requirement will also come from the investors and customers themselves.
Freedman: In my view, this case is an outlier, and the penalty is one that is meant to prove that point. Nonetheless, it reinforces to all companies that cooperating with regulators is expected, and the penalty for failing to do so — or even more egregiously, covering up an incident — could be high.
Stephen Gates, Principal Security SME, Horizon3.ai: As more people learn about the outcome of this case, hopefully they come to realize that trying to hide a breach rarely pays off. Eventually, the truth always seems to prevail regardless of attempts to suppress it. The impact could be something along the lines of more legislation, penalties and prosecutions, as a precedent has pretty much just been established.
Maynor: A lot of infosec practitioners today don’t have a strong grasp of the laws that govern cyberspace, particularly the Computer Fraud and Abuse Act (CFAA). A lot of threat hunting and forensics work violates the CFAA if a practitioner does something as simple as tracing where data is being exfiltrated to, or using the credentials the attacker used to log in to a box to see what is going on. While that example violates the CFAA, it is a common industry practice that relies on a belief that the practitioner is a good guy, and no one would arrest and prosecute a good guy. This case knocks that belief on its head. What people are learning is that no one arrests good guys — unless the DOJ really wants you to face punishment for something, and then the CFAA is back on the table.
U.S. officials have come to places like Black Hat/DEF CON to reassure attendees that no one wants to arrest good people doing good work, yet they will not repeal or even amend the CFAA. It is too powerful a tool for DOJ to use when they are backed into a corner.
What the Uber CSO did isn’t an outlier; it’s business as usual. This is a common problem between boards and CISO-type roles. The board will say, “who will rid me of this troublesome hacker,” and they expect the security practitioners to make it go away.
Miracco: This case should ultimately be beneficial for the broader cybersecurity industry in that it brings the issue front and center in the boardroom. CISOs need to be given the tools, technology and resources to fight off attackers, and boards must recognize the tremendous responsibility that comes with the CISO role. The other beneficiaries of the case should be consumers, as users of these mobile apps have a right to know promptly when their data has been compromised, and the companies responsible for the breach should have to do a lot more than offer free credit monitoring services for 12 months. Consumers should hold companies accountable for not promptly disclosing data privacy violations, and the perpetrators of these cover-ups should face more severe consequences than just probation.
What lessons can other organizations learn from the Uber data breach and “cover-up” that led to Sullivan’s sentencing?
Akerman: Time to report an incident, the best way to model and assess risk, ownership of roles and responsibilities across senior management, who the CISO should report to, etc.
Freedman: The lesson learned is to accept responsibility for an incident, follow applicable laws and cooperate with investigations that follow the incident. I am not sure it is such a profound lesson. This case should discourage any company from thinking that covering up an incident is a good strategy.
Gates: Look, nothing is perfect. Breaches will happen and organizations will recover. However, the real key regarding a breach is to put more effort into proving and documenting that due care and due diligence are being performed. If you can prove to the public and law enforcement that you were following and enforcing the fundamentals of due care and due diligence — that everything possible was being done, based on industry standards and best practices, to prevent a breach and minimize its impact — then you’re likely to receive forgiveness instead of punishment.
Maynor: “Cover-up” is a bit salacious. As previously stated, there are legitimate reasons not to disclose immediately, especially if you think the threat actor might still have a toehold in your network. There is also an argument to be made that once one group pops you, more will try. I think the bug bounty idea was an elegant solution to the problem, but I’m a hacker and not a lawyer.
Miracco: The cost of a data breach is high and bound to go much higher. The cost of cyber insurance is also rising and, in some cases, it is becoming completely unavailable. This should tell organizations that it is high time to get serious about building security into their development budgets, providing the cycles to pen test systems adequately, and designing systems that are resilient to emerging threats. The days of moving fast and breaking things need to come to an end, especially when it is the customers who get impacted by the breaking process.
How can security professionals and legal teams at an organization work together to ensure that cybersecurity incidents are handled in a lawful and ethical manner?
Akerman: Cross-collaboration between teams, with clear duties for each, is a best practice in an era when, unfortunately, security breaches are part of the new norm. Incident response procedures must be put in place and acknowledged by all players; this goes beyond just security’s responsibility, extending to the complete business risk across legal, HR, security, GRC, investors and more.
Freedman: Companies may wish to consider implementing an incident response plan that addresses the steps to follow during a security incident. That plan usually will assign an incident response team to respond to the incident. The plan will designate roles for team members, and will include mitigation of the event from an information technology standpoint, as well as designate roles for legal, compliance, communications and others in the organization. When the legal team is involved, legal obligations, as well as risk, are determined. If an organization has a plan in place, and the team members follow their roles, covering up an incident would be very difficult to accomplish. In my experience, companies abide by their legal obligations and cooperate with investigations that follow an incident. Having a plan in place facilitates the ability to handle incidents in a lawful and ethical manner.
Gates: Due care is all about taking reasonable steps to safeguard your company and clients. Due diligence is about identifying and reducing risk. When both are performed and well documented, handling cybersecurity incidents becomes less worrisome because you can prove you did everything possible to avoid them. The real key to achieving this is for the legal teams and security pros to develop and enforce these kinds of internal security policies together. Then, if a breach does happen, organizations should have no fear of handling the incident in a lawful and ethical manner. Any oversights, neglect or carelessness will likely be uncovered through internal policy enforcement anyway.
Maynor: The role of CISO has blown up recently, with big money and perks being offered to anyone willing to take it. The responsibility, despite the job description, is to insulate higher execs from security-related fallout. They are the fall guys. I have no doubt the Uber board patted themselves on the back that the breach got handled and the CISO firewall worked.
What people will take away from this is that if there isn’t a CISO in their org, they should get one fast, then empower them to deal with incidents any way they see fit. When a government agency comes knocking, throw the CISO out to them as an offering.
This isn’t the takeaway people want, but it is the takeaway I have heard in C-suite circles. As far as practitioners go, I won’t change anything I do because of this… The only headway cybersecurity is making now is through a cycle of threat intelligence, detection engineering and proactive hunts. Describing these things to legal gives them stomachaches, and some find it hard to believe that other companies and vendors actually do this proactive work. Bad guys do not care about your legal process.
Miracco: I think organizations need to have employment policies that protect security professionals and make the legal team ultimately responsible for decisions regarding these incidents. The security professional should be empowered and required to make full and accurate disclosures about an incident, and the legal team, executives and board of directors should be empowered to handle the response and also have to take responsibility for those decisions.