Mental Health, AI, and Data Mining: A Question of Ethics

With patient privacy and safety of utmost importance, this post examines both the benefits and drawbacks of using artificial intelligence (AI) and data mining in mental healthcare.

By Ryan Koch


One in four people in the United States will be diagnosed with a mental health condition at some point during their lifetime, and rates of mental illness have not improved over the last 30 years despite advances in treatment. Some attribute this stagnation to the destigmatization of mental illness over the last 10 to 15 years, particularly with the advent of social media, which may have led more people to seek diagnosis and treatment. Either way, the scope of mental illness is large, affecting millions of people not only in the United States but worldwide. With recent advances in technology, could artificial intelligence (AI) and data mining be used to significantly improve the lives of those who suffer from mental illness?

The health informatics implications of mental illness are likewise significant. Storing highly sensitive psychiatry and therapy notes electronically poses a real privacy risk to patients, but it supports continuity of care. Virtual therapy sessions offered through companies such as BetterHelp have become commonplace, providing ease of access for patients and providers but also introducing security and privacy risks. Today, I am going to focus specifically on the role of artificial intelligence and big data in mental health treatment. A study from Austria reviews what is referred to as “iHealth,” or “intelligent health,” exploring the therapeutic implications of AI and big data analytics in mental healthcare (Rubeis, 2022). The author describes iHealth as providing richer contextual data from a patient’s physical environment through self-reports, self-monitoring, and wearable sensor modules, combined with data mining technology for real-time assessment and patient-centered treatment (Rubeis, 2022). However, the evidence regarding iHealth is minimal and often unclear due to the novelty of the interventions (Rubeis, 2022).
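
To make the iHealth idea more concrete, here is a minimal, purely illustrative sketch of how a daily self-report and wearable-sensor readings might be combined into a simple real-time flag for follow-up. The field names, thresholds, and the two-signal rule are my own assumptions for demonstration, not anything described by Rubeis (2022), and they have no clinical validity.

```python
from dataclasses import dataclass

# A hypothetical daily observation combining a patient self-report with wearable data.
# All field names and thresholds below are illustrative assumptions, not clinical values.
@dataclass
class DailyObservation:
    mood_self_report: int   # 1 (very low) to 10 (very good), entered by the patient
    sleep_hours: float      # from a wearable sleep tracker
    step_count: int         # from a wearable activity tracker

def needs_followup(obs: DailyObservation) -> bool:
    """Flag a day for clinician review using simple, made-up heuristics."""
    low_mood = obs.mood_self_report <= 3
    poor_sleep = obs.sleep_hours < 5
    low_activity = obs.step_count < 1000
    # Require at least two converging signals so a single noisy reading
    # does not trigger an alert on its own.
    return sum([low_mood, poor_sleep, low_activity]) >= 2

if __name__ == "__main__":
    today = DailyObservation(mood_self_report=2, sleep_hours=4.5, step_count=800)
    print("Flag for follow-up:", needs_followup(today))  # -> True
```

In a real iHealth system, hand-written rules like these would be replaced by validated models and reviewed by clinicians before influencing care.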

Digitally aided self-monitoring has been applied to a variety of disorders, including depression (Dogan et al., 2017), bipolar disorder (Faurholt-Jepsen et al., 2016), borderline personality disorder (Tsanas et al., 2016), and eating disorders (Tregarthen et al., 2019), with some evidence of effectiveness for each in terms of symptom reduction or medication adherence. However, the user experience of self-monitoring was often suboptimal: many users found it a constant reminder of their mental illness or reported increased anxiety due to fear of constant surveillance (Rubeis, 2022).

Data mining typically refers to collecting large amounts of data and then building models from the patterns found by learning algorithms (Dipnall et al., 2016). It is currently thought that data mining can support more tailored treatments based on specific health indicators, leading to earlier interventions, which may reduce hospitalizations, costs, and patients’ symptoms (Rubeis, 2022). However, the threats this intervention poses to both personal and data privacy concern patients and providers alike (Rubeis, 2022). Furthermore, data have often been shown to be biased, particularly through confirmation bias among providers (Rubeis, 2022). The failure to include social determinants of health in modeling systems may also lead to decontextualization of a patient’s mental illness (Rubeis, 2022).
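
As a rough illustration of the kind of pattern learning described above, the sketch below fits a simple classifier on synthetic (randomly generated) self-monitoring data to predict a hypothetical “would benefit from earlier intervention” label. The features, the label rule, and the model choice are assumptions made purely for demonstration; a real system would require validated clinical data and rigorous evaluation.

```python
# Illustrative only: fit a simple classifier on synthetic self-monitoring data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500

# Synthetic weekly features: average mood (1-10), average sleep hours, missed medication doses.
X = np.column_stack([
    rng.integers(1, 11, n),      # average mood rating
    rng.normal(6.5, 1.5, n),     # average nightly sleep hours
    rng.poisson(1.0, n),         # missed doses per week
])

# A made-up "earlier intervention" label tied loosely to low mood and poor sleep.
y = ((X[:, 0] <= 4) & (X[:, 1] < 6.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Even in this toy example, the model can only reproduce whatever patterns, and whatever biases, exist in the data it is given, which is exactly the concern about biased or decontextualized models raised above.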

With any new technology, it is critical to consider all of the ethical implications for patient autonomy and data so that a tool that could be used for good does not have negative unintended consequences. Based on my research into the use of AI and data mining within the field of mental health, my recommendation is to improve upon existing regulations so that patient privacy and autonomy are protected while still allowing the use of these potentially life-saving tools. Such regulations should include enforcing quality standards and ensuring patients have complete control over their mental health data (i.e., informed consent must be obtained before AI is used in a patient's treatment). Furthermore, the social determinants of health must be included when developing interpretive models for mental health. Finally, practitioners must be well educated in these tools before offering them to patients, again emphasizing informed consent and patient autonomy while ensuring the provider-patient relationship maintains its therapeutic alliance.

References:

  1. Dipnall, J., Pasco, J.A., Berk, M., et al. (2016). Fusing data mining, machine learning, and traditional statistics to detect biomarkers associated with depression. PLoS ONE, 11(2). https://doi.org/10.1371/journal.pone.0148195.
  2. Dogan, E., Sander, S., Wagner, X., et al. (2017). Smartphone-based monitoring of objective and subjective data in affective disorders: where are we and where are we going? Systematic review. Journal of Medical Internet Research, 19(7): e262. https://doi.org/10.2196/jmir.7006.
  3. Faurholt-Jepsen, M., Munkholm, M., Frost, M., et al. (2016). Electronic self-monitoring of mood using IT platforms in adult patients with bipolar disorder: a systematic review of the validity and evidence. BMC Psychiatry, 16: 7. https://doi.org/10.1186/s12888-016-0713-0.
  4. Rubeis, G. (2022). iHealth: The ethics of artificial intelligence and big data in mental healthcare. Internet Interventions, 28: 100518. https://doi.org/10.1016/j.invent.2022.100518.
  5. Tregarthen, J., Paik, J., Sadeh-Sharvit, S., et al. (2019). Comparing a tailored self-help mobile app with a standard self-monitoring app for the treatment of eating disorder symptoms: randomized controlled trial. JMIR Mental Health, 6(11): e14972. https://doi.org/10.2196/14972.
  6. Tsanas, A., Saunders, K.E.A., Bilderbeck, A.C., et al. (2016). Daily longitudinal self-monitoring of mood variability in bipolar disorder and borderline personality disorder. Journal of Affective Disorders, 205: 225-233. https://doi.org/10.1016/j.jad.2016.06.065.