<h1>Health AI research LLM updates</h1>
<p><em>Published March 14, 2023</em></p>
<p>We’ve spent the past several years researching artificial intelligence (AI) for healthcare — exploring how it can help detect diseases early, expand access to care and more. We’ve taken a “move slow and test things” approach to prove efficacy, equity, helpfulness and safety above all. Today, at our annual health event, The Check Up, we shared health AI updates, including progress on our medical large language model (LLM) research, partnerships that are bringing solutions into real-world settings, and new ways AI can help with disease detection. Here’s a look at what’s new.</p>
<h3>Ongoing research on Med-PaLM 2, our expert-level medical LLM</h3>
<p>Recent progress in large language models (LLMs) — AI tools that demonstrate capabilities in language understanding and generation — has opened up new ways to use AI to solve real-world problems. However, unlike some other LLM use cases, applications of AI in the medical field require the utmost focus on safety, equity and bias to protect patient well-being. To work toward developing AI tools that can retrieve medical knowledge, accurately answer medical questions and provide reasoning, we’ve invested in medical LLM research.</p>
<p>Last year we built Med-PaLM, a version of PaLM tuned for the medical domain. Med-PaLM was the first to obtain a “passing score” (&gt;60%) on U.S. medical licensing-style questions. The model not only answered multiple-choice and open-ended questions accurately, but also provided rationales and evaluated its own responses.</p>
<p>Recently, our next iteration, Med-PaLM 2, consistently performed at an “expert” doctor level on medical exam questions, scoring 85%. That is an 18-percentage-point improvement over Med-PaLM’s previous performance and far surpasses similar AI models.</p>
<p>While this is exciting progress, there’s still a lot of work to be done to make sure this technology can work in real-world settings. Our models were tested against 14 criteria — including scientific factuality, precision, medical consensus, reasoning, bias and harm — and evaluated by clinicians and non-clinicians from a range of backgrounds and countries. Through this evaluation, we found significant gaps when it comes to answering medical questions and meeting our product excellence standards. We look forward to working with researchers and the global medical community to close these gaps and understand how this technology can help improve health delivery.</p>
<h3>New partners for AI-assisted ultrasound</h3>
<p>In recent years, sensor technology has evolved to make ultrasound devices more affordable and portable. But these devices often require experts with years of experience to conduct exams and interpret the images, and many low-resource areas face a shortage of ultrasound specialists. To help bridge this divide, we’re building AI models that can simplify acquiring and interpreting ultrasound images — identifying important information like gestational age in expectant mothers and early signs of breast cancer.</p>
<p>We’re partnering with Jacaranda Health, a Kenya-based nonprofit focused on improving health outcomes for mothers and babies in government hospitals, to research digital solutions that can help them reach their goal. In Sub-Saharan Africa, maternal mortality remains high, and there is a shortage of workers trained to operate traditional high-cost ultrasound machines. Through this partnership, we’ll conduct exploratory research to understand the current approach to ultrasound delivery in Kenya and explore how new AI tools can support point-of-care ultrasound for pregnant women.</p>
<p>We’re also partnering with Chang Gung Memorial Hospital (CGMH) in Taiwan to explore using ultrasound for breast cancer detection. Mammograms, which are X-rays of the breast, are typically used to screen for breast cancer and are a proven approach to reducing mortality. However, screening programs aren’t available in many regions due to high costs. Further, we know that mammograms can be less effective for certain populations, including those with higher breast density. With CGMH, we’re exploring whether our AI models can help with early detection of breast cancer using ultrasound.</p>
<h3>AI for cancer treatment planning with Mayo Clinic</h3>
<p>Over the past three years, we’ve partnered with Mayo Clinic to explore how AI can support the tedious, time-consuming process of planning for radiotherapy, a common cancer treatment used to treat more than half of cancers in the U.S. The most labor-intensive step in the planning process is a technique called “contouring,” in which clinicians draw lines on CT scans to separate areas of cancer from nearby healthy tissue that can be damaged by radiation during treatment. This process can take up to seven hours for a single patient.</p>
<p>We’ll soon publish research on the findings of our study and the radiotherapy model we developed. As of today, we’re formalizing our agreement with Mayo Clinic to explore further research, model development and commercialization. Taking these next steps with Mayo Clinic means that together we can extend the reach of our model, with the goal of helping more patients receive radiotherapy treatment sooner.</p>
<h3>Bringing tuberculosis screening to thousands</h3>
<p>Building on years of health AI research, we’re working with partners on the ground to bring our research on AI-powered chest X-ray screening for tuberculosis (TB) into the care setting. According to the WHO, TB is the ninth leading cause of death worldwide, with over 25% of TB deaths occurring in Africa. While TB is treatable, cost-effective screening solutions are needed to catch the disease early and reduce community spread.</p>
<p>We’re partnering with an AI-based organization headed by Right to Care, a not-for-profit entity with extensive experience in TB care within Africa, to make AI-powered screenings widely available across Sub-Saharan Africa. Our partners have committed to donating 100,000 free AI-powered TB screenings during the collaboration to help with early detection and treatment of TB and reduce the spread of the disease.</p>
<p>In healthcare, there is enormous potential for AI to augment diagnostic and treatment planning processes — especially through partnerships that help bring high-quality care to the communities that need it most.</p>
<p><a href="https://blog.google/technology/health/ai-llm-medpalm-research-thecheckup/">Source link</a></p>