OpenAI has unveiled ChatGPT Health, a new feature within the ChatGPT platform positioned as a “healthcare ally.” The feature gives users a dedicated, secure space for health-related conversations, with chat history and memory kept separate from the rest of ChatGPT. OpenAI encourages users to connect personal medical records and wellness apps such as Apple Health, Peloton, MyFitnessPal, Weight Watchers, and Function so that responses can be more tailored to their health.
Medical record integration is handled through a partnership with b.well, which provides the backend for uploading medical documents and connects to approximately 2.2 million healthcare providers. Access to ChatGPT Health is currently limited to a beta group, and interested users must join a waitlist, though OpenAI plans to roll the product out gradually to all users regardless of subscription level.
While users are urged to connect their medical data for more personalized responses, OpenAI maintains that ChatGPT Health is not intended as a diagnostic or treatment tool. The company notes that people often turn to AI for healthcare advice outside office hours, with users in underserved rural communities sending an average of 600,000 health-related messages each week. Past incidents of misleading AI advice have raised concerns, notably a case in which a user misread guidance on replacing table salt and ended up hospitalized.
According to OpenAI, over 230 million individuals worldwide seek health and wellness advice from ChatGPT weekly. The firm has collaborated with over 260 medical professionals to refine its service, emphasizing that ChatGPT can assist users in understanding test results, preparing for doctor appointments, and exploring diet and exercise options.
Mental health, notably, was underrepresented in the announcement. OpenAI’s CEO of Applications, Fidji Simo, said that mental health falls within the health domain, while acknowledging the importance of directing users in distress toward professional help and supportive resources. Concerns persist that AI tools could exacerbate health anxiety, particularly among individuals predisposed to such conditions.
On security, OpenAI says ChatGPT Health operates in a space designed to protect sensitive information: conversations are protected by multiple layers of encryption and are not used to train its foundational models. Data may nonetheless be shared when legally required, a caveat underscored by security issues the company has faced in the past.
Importantly, Gross, OpenAI’s head of health, clarified that ChatGPT Health is not covered by HIPAA, since those rules apply chiefly to professional healthcare settings rather than consumer products. The launch places OpenAI at the forefront of integrating AI into healthcare and offers a glimpse of what personalized support for health management could look like.
