New preclinical research suggests that bowel preparation procedures for colonoscopies may temporarily alter gut balance, with potentially underappreciated effects in patients with compromised gastrointestinal health.
The study, published…


Reading promises so much: better mental health, a sense of wellbeing, cultural and educational enrichment, even greater confidence and eloquence.
It sounds irresistible; yet for many of us, the reality is very different. Half of the adults…
Maunakea, Hawaiʻi – Astronomers using W. M. Keck Observatory on Maunakea, Hawaiʻi Island have uncovered the largest and most extended…

Self-collection of vaginal samples for human papillomavirus (HPV) testing is now considered an acceptable, recommended option for cervical cancer screening, per recent updates to the American Cancer Society’s (ACS) guidelines published…

Liv McMahon, Technology reporter
OpenAI has launched a new ChatGPT feature in the US which can analyse people’s medical records to give them better answers, but campaigners warn it raises privacy concerns.
The firm wants people to share their medical records along with data from apps like MyFitnessPal, which will be analysed to give personalised advice.
OpenAI said conversations in ChatGPT Health would be stored separately from other chats and would not be used to train its AI tools – as well as clarifying it was not intended to be used for “diagnosis or treatment”.
Andrew Crawford, of US non-profit the Center for Democracy and Technology, said it was “crucial” to maintain “airtight” safeguards around users’ health information.
It is unclear if or when the feature may be introduced in the UK.
“New AI health tools offer the promise of empowering patients and promoting better health outcomes, but health data is some of the most sensitive information people can share and it must be protected,” Crawford said.
He said AI firms were “leaning hard” into finding ways to bring more personalisation to their services to boost value.
“Especially as OpenAI moves to explore advertising as a business model, it’s crucial that separation between this sort of health data and memories that ChatGPT captures from other conversations is airtight,” he said.
According to OpenAI, more than 230 million people ask its chatbot questions about their health and wellbeing every week.
In a blog post, it said ChatGPT Health had “enhanced privacy to protect sensitive data”.
Users can share data from apps like Apple Health, Peloton and MyFitnessPal, as well as provide medical records, which can be used to give more relevant responses to their health queries.
OpenAI said its health feature was designed to “support, not replace, medical care”.
Generative AI chatbots and tools can be prone to generating false or misleading information, often presenting it in a matter-of-fact, convincing way.
But Max Sinclair, chief executive and founder of AI marketing platform Azoma, said OpenAI was positioning its chatbot as a “trusted medical adviser”.
He described the launch of ChatGPT Health as a “watershed moment” and one that could “reshape both patient care and retail” – influencing not just how people access medical information but also what they may buy to treat their problems.
Sinclair said the tech could amount to a “game-changer” for OpenAI amid increased competition from rival AI chatbots, particularly Google’s Gemini.
The company said it would initially make Health available to a “small group of early users” and has opened a waitlist for those seeking access.
As well as being unavailable in the UK, the feature has not been launched in Switzerland or the European Economic Area, where tech firms must meet strict rules about processing and protecting user data.
But in the US, Crawford said the launch meant some firms not bound by privacy protections “will be collecting, sharing, and using people’s health data”.
“Since it’s up to each company to set the rules for how health data is collected, used, shared, and stored, inadequate data protections and policies can put sensitive health information in real danger,” he said.


Leung described feeling overwhelmed by the barrage of negativity, saying the experience was “truly surreal”
Harry Potter actress Katie Leung has opened up…
County emergency medical partner AMR reports 60% rise and shares guidance on when to seek emergency care
DeKalb County is alerting residents to a significant increase in flu-related emergency calls, based on data and observations from the…

Consumer decision-making is entering a new era. AI-driven tools, sharper expectations of fairness and transparency, and the need for meaningful human interaction are reshaping what shoppers consider acceptable, valuable, and worth paying for. Price alone no longer defines value – quality, trust, and emotional connection matter more than ever.
What matters to today’s consumer 2026, the latest report from the Capgemini Research Institute, explores how AI, personalization, and emotion are influencing consumer choices, and what brands must do to deliver experiences that feel transparent, adaptive, and human. Key findings, based on a survey of 12,000 consumers across 12 countries, include:
What matters to today’s consumer 2026 is essential for CMOs, digital and e-commerce heads, loyalty and consumer insights leads, product executives, and category managers, offering practical guidance for strategy, pricing, and technology teams seeking to meet evolving consumer expectations by:
To discover how empowered consumers are redefining value, and how brands can respond by building transparency, trust, and emotional engagement in the age of AI-enabled shopping, download the report today.