Security researcher Jeremiah Fowler and the Website Planet team discovered an unsecured database containing 886,521,320 records. The data, however, came from a test database, not from actual patient records. (See editor's note below.)
"Upon further research there were multiple references to Deep 6 AI, including internal emails and usernames," WebsitePlanet says. Researchers notified Deep 6 AI of the unsecured database and the company took action to secure the test database.
The dataset totaled 68.53 GB, and the data it contained fell into the following categories:
Date, document type, physician note, encounter ID (an interaction between a patient and healthcare provider to deliver healthcare services), patient ID, note, UUID, patient type, doctor notes, date of service, note type (e.g., Nursing/other), and detailed note text.
According to Fowler, the database was publicly accessible to anyone with an internet connection, leaving it at risk of a ransomware attack.
Editor's note: Deep 6 AI provided the following statement to Security magazine:
Despite recent claims, no personal or patient health data was accessed, leaked or at risk from a Deep 6 AI proof-of-concept database.
In August, a security researcher accessed a test environment that contained dummy data from MIT's Medical Information Mart of Intensive Care (MIMIC) system, an industry standard source for de-identified health-related test data. To confirm, no real patient data or records were included in this ephemeral test environment, and it was completely isolated from our production systems.
Based on current reporting, we have confirmed that the recent claims reference MIMIC data, and there was no access to real patient records. When the researcher notified us in August, we immediately secured the test environment to ensure there was no further concern.
Data security and privacy is a top priority at Deep 6 AI, and the responsibility to protect data is at the core of our business and top-of-mind for all our people.