Google’s Bioacoustic AI ‘HeAR’ is Here To Help Detect Early Signs of Disease


Highlights

  • Google has developed an AI model called HeAR (Health Acoustic Representations).
  • HeAR can analyze body sounds like coughs and breaths to detect conditions like tuberculosis (TB).
  • The model was trained on 300 million audio samples.
  • Google has partnered with India’s Salcit Technologies to enhance early detection of TB using the HeAR model.

Today, it is almost customary to do a Google search every time a health issue crops up, whether for you or for someone in your extended family.

Google has become our go-to health adviser.

From the search engine, we learn about health issues, their symptoms, possible causes and even treatments.

And now it appears the concept of Google as our health adviser is getting a serious upgrade, with artificial intelligence coming to the forefront.

While taking health advice from a Google search is often dismissed as unreliable, Google itself is now working on an AI model that can help detect early signs of disease.

It is a bioacoustic foundation model that may help improve health outcomes for people across India.

In a recent blog post, Google announced that the AI model is named “HeAR,” an acronym for Health Acoustic Representations.

While announcing the new development, Google outlined the importance of sound and speech in diagnosis.

Google’s Bioacoustic AI ‘HeAR’ is Here To Help Detect Early Signs of Disease.
Photo by: yourstory.com

Our bodies make a range of sounds, from coughs to speech and even breathing, that can reveal a multitude of information about our health.

As per Google, conditions that can be detected with the help of body sounds include tuberculosis (TB) and chronic obstructive pulmonary disease (COPD).

“Earlier this year, we introduced Health Acoustic Representations, or HeAR, a bioacoustic foundation model designed to help researchers build models that can listen to human sounds and flag early signs of disease. The Google Research team trained HeAR on 300 million pieces of audio data curated from a diverse and de-identified dataset, and we trained the cough model, in particular, using roughly 100 million cough sounds,” Google mentioned in the blog.
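In practice, a foundation model like this produces general-purpose audio embeddings, and researchers then fit small task-specific models on top of them. The sketch below illustrates that pattern in a generic way; the embeddings, labels and dimensions are synthetic placeholders, not Google's actual HeAR API or data.

```python
# Minimal sketch: training a downstream cough classifier on top of
# embeddings from a bioacoustic foundation model. All inputs here are
# synthetic stand-ins; real work would use the foundation model's actual
# outputs and clinically labelled recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Pretend each cough recording has already been mapped to a fixed-length
# embedding vector by the foundation model (dimension chosen arbitrarily).
n_recordings, embedding_dim = 1000, 512
embeddings = rng.normal(size=(n_recordings, embedding_dim))
labels = rng.integers(0, 2, size=n_recordings)  # 1 = screen-positive, 0 = negative

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0
)

# A small "linear probe" is a common way to adapt frozen foundation-model
# embeddings to a specific screening task with limited labelled data.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

The appeal of this setup is that the heavy lifting (learning what coughs, breaths and speech sound like) is done once by the foundation model, so partners such as Salcit only need relatively small labelled datasets to tune a screening model for a condition like TB.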

To bring this vision closer to reality, Google has partnered with India’s Salcit Technologies, a respiratory healthcare company known for its AI bioacoustic model called Swaasa.

Google has developed an AI model called HeAR (Health Acoustic Representations).
Photo by: Info 3

Swaasa uses the sound of a patient’s cough to assess lung health and is currently being refined to improve the early detection of tuberculosis.

However, convincing doctors to trust and adopt technology like HeAR for diagnosis could be challenging.

Yet the backing of reputable organizations, such as the UN-hosted Stop TB Partnership, lends credibility to HeAR’s potential.

In the near future, you might find yourself being screened for disease by Google’s AI, all through the power of your smartphone.

“Solutions like HeAR will enable AI-powered acoustic analysis to break new ground in tuberculosis screening and detection, offering a potentially low-impact, accessible tool to those who need it most,” said Zhi Zhen Qin, digital health specialist with the Stop TB Partnership.

“Every missed case of tuberculosis is a tragedy; every late diagnosis, a heartbreak,” said Sujay Kakarmath, a product manager at Google Research working on HeAR. “Acoustic biomarkers offer the potential to rewrite this narrative. I am deeply grateful for the role HeAR can play in this transformative journey.” Google also points to support for this approach from organizations including the Stop TB Partnership, a United Nations-hosted organization that brings together TB experts and affected communities with the goal of ending TB by 2030.

FAQs

Q1. What is Google’s new AI model HeAR designed to do?

Answer. Google’s AI model HeAR (Health Acoustic Representations) is designed to detect signs of diseases using bioacoustic data, such as sounds from coughs and breaths.

Q2. What health conditions can HeAR help detect?

Answer. HeAR can help detect conditions like tuberculosis (TB) and chronic obstructive pulmonary disease (COPD) by analyzing body sounds.

Q3. How was the HeAR model trained?

Answer. The HeAR model was trained on 300 million audio samples, with the cough model in particular trained on roughly 100 million cough sounds, to identify health-related patterns.

Also Read: Google Bard Expands Globally with Gemini Pro and Launches New Image Generation Feature

Also Read: Google Rebrands Bard to Gemini; Introduced Access to Gemini Ultra 1.0 and Mobile App for Android, iOS

Also Read: Google Gemini Update Brings Automatic Voice Commands, To Expand to Headphones
