Some cool AI healthcare projects

What was built at the OOP hackathon?

Looking to hire the best talent in healthcare? Check out the OOP Talent Collective - where vetted candidates are looking for their next gig. Learn more here or check it out yourself.

Hire from the Out-Of-Pocket talent collective

Healthcare 101 Crash Course

A crash course on the basics of US healthcare in a simple-to-understand and fun way. Understand who the different stakeholders are, how money flows, and the trends shaping the industry.
Learn more

Featured Jobs

Finance Associate - Spark Advisors

  • Spark Advisors helps seniors enroll in Medicare and understand their benefits by monitoring coverage, figuring out the right benefits, and dealing with insurance issues. They're hiring a finance associate.

Data Engineer - firsthand

  • firsthand is building technology and services to dramatically change the lives of those with serious mental illness who have fallen through the gaps in the safety net. They are hiring a data engineer to build first-of-its-kind infrastructure to empower their peer-led care team.

Data Scientist - J2 Health

  • J2 Health brings together best-in-class data and purpose-built software to enable healthcare organizations to optimize provider network performance. They're hiring a data scientist.

Looking for a job in health tech? Check out the other awesome healthcare jobs on the job board + give your preferences to get alerted to new postings.

Check Out The Job Board

We hosted our first ever hackathon last month. I learned a few things.

  • About a third of attendees flew in for the hackathon, and about a third were totally new to healthcare. This was an excellent way to blend fresh eyes on a problem with people who had actual experience with it. We should have more events like that.
  • The large language models are advancing insanely fast. Claude 3.5 Sonnet came out a few days before the hackathon, and OpenAI had released a low-latency version of its model (GPT-4o) a month prior, which enabled so many new use cases at the hackathon.
  • My body physically cannot do all nighters anymore. My brain was just looping “BBL DRIZZZZAYYYY” at 2 am. 
  • Hackathons are filled with very positive energy. Most conferences are either selling or complaining. Everyone at the hackathon was thinking about what could be possible in healthcare. There’s so much happiness and beauty when you’re not HIPAA compliant.
  • A week before the hackathon, a bunch of people reached out about judging the hackathon so they could use it in their O-1 visa application. I also learned there are entire events that get O-1 visa applicants to pay to be a judge. 
  • It is shockingly easy to make a branded energy drink. Maybe a little TOO easy honestly.
Breakfast of champions, Dinner of psychopaths

But the real lesson is that you can build some crazy shit with generative AI tools today. Here’s a quick roundup of the demos that people built in 36 hours, with a bit of my commentary on how I think about the viability of the project or that slice of the healthcare ecosystem. 

I highly recommend taking a look at what’s possible because they might inspire you to try some of these at your job. And don’t be shy to reach out to the teams!

Autofill PDFs for SNAP, Medicaid, etc.

***Winner of the “I’m using this tomorrow” award - the most practical application

Name of the Project -  Lighthouse.AI

Who Contributed - 

Albert Cai

Nathan Leung

Irene Jiang

What does it do? 

In the US, 100M+ people have unmet health & social needs, and many don't receive the gov't benefits they're eligible for (Medicaid, SNAP, etc.) because of administrative burdens - too much paperwork that takes too much time. We wanted to build a tool that makes it easier to fill out these program applications and wrangle PDFs with tedious, repetitive questions.

In our demo, users can upload a PDF and the GPT-4o-powered assistant helps identify which pages need to be filled with information. Users can also upload pictures/scans of existing documents (ex: driver's license, pay stub, tax return) and the assistant automatically fills some fields in - saving time, energy, and hopefully someday closing health equity gaps!
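
To make the mechanics concrete, here's a minimal sketch of the "upload a scan, prefill the form" step, assuming the OpenAI Python SDK. The form field names are invented for illustration, not Lighthouse.AI's actual schema.

```python
# Minimal sketch of the "upload a scan, prefill the form" step.
# Assumes the OpenAI Python SDK; the form fields below are made up for
# illustration, not Lighthouse.AI's actual schema.
import base64
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

FORM_FIELDS = ["full_name", "date_of_birth", "address", "monthly_income"]

def extract_fields(image_path: str) -> dict:
    """Ask GPT-4o to pull form-relevant fields out of a scanned document."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": (
                    "This is a scan of a personal document (e.g. driver's license or pay stub). "
                    f"Return a JSON object with any of these keys you can find: {FORM_FIELDS}. "
                    "Omit keys you cannot find. Do not guess."
                )},
                {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    return json.loads(response.choices[0].message.content)

# prefilled = extract_fields("drivers_license.jpg")
# -> {"full_name": "...", "date_of_birth": "...", "address": "..."}
```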


Hackathon Demo: Youtube

Usable Demo: Link to the app

Contact info - 

alcai@umich.edu

18nleung@gmail.com

NK Commentary - We care so much about avoiding fraud in the US that we use onerous government forms and enrollment processes that end up preventing the people who need these services from actually getting them. Using large language models to make this easier is awesome. You could also imagine a world where an LLM can translate the forms into a different language or even provide visuals to explain what the form is talking about, tailored to the person filling it out.

I’d love to see them pointed to more forms that consumers find to be a total headache to fill out.

Robocalls to Check Pharmacy Inventory

***Winner of the people’s choice award aka. The audience vote!

Name of the Project -  StockScout

Who Contributed - 

William Pang

Andrey Risukhin

Emily Tang

Chriz Turitzin

What does it do? 

AI-powered agent to check pharmacy stock. While the provider is with the patient, StockScout AI calls local pharmacies to find stock. Never send an Rx to a pharmacy without stock again. 

They showed snippets of real audio calls with local pharmacies; highly recommend watching.

Recorded Demo - Link to the demo

Live Demo - Youtube

Contact info - 

NK Commentary - The fact that this bot could navigate the phone tree and press the right button was already cool and showed that it could make the right judgment based on the intent of what was being asked.

But the wildest part about this demo is seeing a bot figure out the cadence of speech in a conversation. The latency in these large language models is now so low that the robocall even knew, during the pause where the pharmacist was saying “let me check…”, to wait about 1.5 seconds and slip in a “thank you” while the pause was drawn out. It’s subtle, but it’s one of those things that’s pretty mind blowing to see. 
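
To make that turn-taking behavior concrete, here's a toy end-of-turn detector, assuming the webrtcvad package for voice activity detection. The filler-phrase hold heuristic is my own illustration of the behavior above, not StockScout's actual logic.

```python
# Toy end-of-turn detector for a voice agent. Assumes the webrtcvad package and
# 16 kHz, 16-bit mono PCM audio in 30 ms frames. The "filler" hold heuristic is
# an illustration of the behavior described above, not StockScout's actual code.
import webrtcvad

vad = webrtcvad.Vad(2)          # aggressiveness 0-3
SAMPLE_RATE = 16_000
FRAME_MS = 30

FILLERS = ("let me check", "one moment", "hold on")

def should_respond(frames: list[bytes], partial_transcript: str) -> bool:
    """Return True when the other speaker has paused long enough to reply."""
    silence_ms = 0
    for frame in reversed(frames):              # walk back from the most recent frame
        if vad.is_speech(frame, SAMPLE_RATE):
            break
        silence_ms += FRAME_MS

    # Hold longer if the last thing heard sounds like "let me check...",
    # so the agent waits out the pause instead of talking over it.
    text = partial_transcript.lower().rstrip(" .…")
    threshold_ms = 1500 if text.endswith(FILLERS) else 600
    return silence_ms >= threshold_ms
```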

I actually talked about the pharmacy stock use case specifically in a previous post, and this project convinced me that lots of companies will use robocalls to build their own datasets very soon.

Air Traffic Control for Hospital Beds

***Winner of the “Damn Son” Award aka. the application that was most mind-blowing

Name of the Project -  Bedrail.AI

Who Contributed - 

Andrew Lokker

Chien Ho

Colin DuRant

Shlok Natarajan

What does it do? 

With Bedrail.AI, our team sought to leverage the power of generative AI to build an app that could serve as an "air traffic control" for hospital bed and staff management, allowing administration to respond in real-time — starting from the first 911 call.

Bedrail is a web app that allows users to visualize and customize hospital bed and nurse staffing capacity across different units like the ER, the ICU, and specialty wards such as cardiology or orthopedics. Incoming patients can be triaged and placed in real time, either via clinical conversations fed through Deepscribe or via a text interface, with triage and placement powered by the Anthropic API. 
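
To give a sense of what the text-triage step could look like, here's a rough sketch assuming the Anthropic Python SDK; the unit list and prompt are illustrative, not Bedrail.AI's actual code.

```python
# Rough sketch of the text-based triage/placement step, assuming the Anthropic
# Python SDK. The unit list and prompt are illustrative, not Bedrail.AI's actual logic.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

UNITS = ["ER", "ICU", "cardiology", "orthopedics", "general medicine"]

def suggest_placement(call_transcript: str) -> str:
    """Map an incoming 911/EMT transcript to a suggested unit."""
    message = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=50,
        system=(
            "You help a hospital bed-management team. Given an EMS call transcript, "
            f"reply with exactly one unit from this list: {', '.join(UNITS)}."
        ),
        messages=[{"role": "user", "content": call_transcript}],
    )
    return message.content[0].text.strip()

# suggest_placement("62-year-old male, crushing chest pain radiating to left arm, diaphoretic...")
# -> "cardiology" (or "ER", depending on the model's read)
```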


Screen recording: Recorded walkthrough 

Hackathon Demo: Youtube

Contact info - 

  • ajlokker@gmail.com
  • chienqho@gmail.com
  • colin.durant@gmail.com
  • shlok.natarajan@gmail.com

NK Commentary - If there’s one thing hospitals care about, it’s heads in beds. Also getting certain heads the f*** out of certain beds (respectfully)

Capacity planning for beds during COVID was a huge issue, as everyone knows. The most interesting part of the demo to me was triaging the EMT call and quickly figuring out what bed the patient should go to before the patient even got to the hospital. Helping the dispatch teams and EMTs quickly figure out the beds ACROSS hospitals and quickly pushing the patient data from ambulance to hospital feels like it could be really useful. 

Turning Video into Assisted Daily Living Documentation

***Winner of the Most Fun award - The Project That Put A Smile On Your Face

Name of the Project -  AiDL: Nurse Scribe

Who Contributed - 

Allison

Kajari

Kevin

Vansi

What does it do? 

Skilled Nursing Facilities are required to document the ADL tasks (e.g. grooming, transfers) their nurses and staff assist patients with. However, their staff are running room to room helping patients and don't have time to document completed ADLs. We built an AI assistant that uses ambient audio and video to automatically document tasks and level of support in the CMS-required Minimum Data Set (MDS) standard - with no added work for nurses!
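
To make the documentation step concrete, here's a minimal sketch of classifying one assisted task from a captured frame and transcript, assuming the OpenAI Python SDK; the ADL categories and support levels are a simplified stand-in for MDS-style coding, not the team's actual schema.

```python
# Minimal sketch of the "classify an assisted task from ambient capture" step.
# Assumes the OpenAI Python SDK; the ADL categories and support levels below are a
# simplified illustration of MDS-style coding, not the team's actual schema.
import base64
import json
from openai import OpenAI

client = OpenAI()

ADL_TASKS = ["grooming", "dressing", "toileting", "transfer", "eating"]
SUPPORT_LEVELS = ["independent", "supervision", "limited assistance",
                  "extensive assistance", "total dependence"]

def document_adl(frame_path: str, audio_transcript: str) -> dict:
    """Turn one video frame plus the surrounding audio transcript into an ADL note."""
    with open(frame_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": (
                    "You document ADL assistance in a skilled nursing facility. "
                    f"From the image and transcript, return JSON with 'task' (one of {ADL_TASKS}) "
                    f"and 'support_level' (one of {SUPPORT_LEVELS}).\n\nTranscript: {audio_transcript}"
                )},
                {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    return json.loads(response.choices[0].message.content)
```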


Recorded Demo: Live walkthrough

Hackathon Demo: Youtube

Contact info - 

NK Commentary - People have been clowning these “AI hardware pin” companies, but this demo shows how much potential they have in healthcare. Putting it on nurses and caregivers and capturing the tasks they did passively to get them reimbursed is nuts.

It does also raise the question of…do we think body cams on frontline staff are net positive? My hunch is yes - you can audit videos to make sure that these more vulnerable patients aren’t being abused AND also reduce the amount of admin work the caregivers/nurses have to do.

Medical Interpretation

Name of the Project -  Interpreter in your Pocket

Who Contributed - 

Emily Gu

Mohak Jain

Chris Zou

What does it do? 

Medical interpreters are important in many care settings, but they are often 1) unavailable or 2) not great at translating. We built a web app that provides culturally competent translation via voice input/output, using Chinese as a proof-of-concept language. In the background, we're using the Whisper API for voice transcription, GPT-4o for text translation, and ElevenLabs for the output voice. We're excited that despite how simple it is, the app passes online interpreter exam samples and can translate a basic doctor-patient conversation we constructed!
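
Since they spelled out the pipeline, here's a minimal sketch of how those three pieces could chain together, assuming the OpenAI Python SDK and the ElevenLabs REST API; the voice ID and prompt wording are placeholders, not the team's actual values.

```python
# Minimal sketch of the interpretation pipeline described above: Whisper for
# transcription, GPT-4o for translation, ElevenLabs for the output voice.
# The voice ID and prompt wording are placeholders, not the team's actual values.
import requests
from openai import OpenAI

client = OpenAI()
ELEVENLABS_API_KEY = "YOUR_KEY"   # placeholder
VOICE_ID = "YOUR_VOICE_ID"        # placeholder

def interpret(audio_path: str, target_language: str = "Chinese") -> bytes:
    # 1. Speech -> text
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=f).text

    # 2. Text -> culturally competent translation
    translation = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": (
                f"You are a medical interpreter. Translate into {target_language}, "
                "preserving clinical meaning and using plain, culturally appropriate phrasing."
            )},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    # 3. Translation -> audio (ElevenLabs text-to-speech REST endpoint)
    tts = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": ELEVENLABS_API_KEY},
        json={"text": translation, "model_id": "eleven_multilingual_v2"},
    )
    return tts.content  # audio bytes to play back to the patient
```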


Demo - Youtube

Contact info - 

eyyg123@berkeley.edu

mohakjain@berkeley.edu

cwzou@berkeley.edu

NK Commentary - Akron Children’s Hospital pays up to $1.2M a year for translation services! There are somewhere between 17K-19K medical interpreters in the US. These services mostly don’t get reimbursed.

This is going to end up being an interesting question about the risk of AI interpretation vs. humans - would hospitals rather use humans, who currently carry a lower risk of getting it wrong, even if some hospitals are basically eating the cost or don’t have access at all? What level of performance does an AI have to hit to be officially considered safe enough to replace an interpreter? 

This feels like a clear area where we can reduce admin spend.

Quickly Assessing The ROI Of An Intervention

Name of the Project -  Refract

Who Contributed - 

Lindsay Zimmerman

Shashin Chokshi

Bea Capistrant

What does it do?

Refract accelerates the development of effective, value-driven health interventions. It leverages Claude AI and specialized models like PubMedBERT to:

  • Analyze datasets like claims to figure out which conditions drive the most spend
  • Assess the available literature on interventions that might help treat those conditions
  • Build prototypes of that intervention (a rough sketch of the first two steps follows below)
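
Here's a rough sketch of what the first two steps could look like, assuming the Anthropic Python SDK and the PubMedBERT checkpoint on Hugging Face; the prompts and data handling are illustrative, not Refract's actual pipeline.

```python
# Rough sketch of the first two steps: ask Claude which condition drives the most
# spend in a claims summary, then rank candidate intervention abstracts against that
# condition with PubMedBERT embeddings. Prompts and data are illustrative, and the
# checkpoint name may have moved on Hugging Face.
import anthropic
import torch
from transformers import AutoModel, AutoTokenizer

claude = anthropic.Anthropic()
MODEL_ID = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
tok = AutoTokenizer.from_pretrained(MODEL_ID)
bert = AutoModel.from_pretrained(MODEL_ID)

def top_spend_condition(claims_summary: str) -> str:
    msg = claude.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=30,
        messages=[{"role": "user", "content":
            f"From this claims spend summary, name the single condition driving the most spend:\n{claims_summary}"}],
    )
    return msg.content[0].text.strip()

def embed(texts: list[str]) -> torch.Tensor:
    batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = bert(**batch).last_hidden_state
    mask = batch["attention_mask"].unsqueeze(-1)
    return (out * mask).sum(1) / mask.sum(1)      # mean pooling over real tokens

def rank_literature(condition: str, abstracts: list[str]) -> list[tuple[float, str]]:
    vecs = embed([condition] + abstracts)
    sims = torch.nn.functional.cosine_similarity(vecs[0:1], vecs[1:])
    return sorted(zip(sims.tolist(), abstracts), reverse=True)
```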

Hackathon Demo - Youtube

Contact info - lindsay@sociumhealth.org

NK commentary: I have a whole thesis around “the productization of consultants” in healthcare, and I think this fits that mold. There are a lot of people who get paid well to provide the expertise to do these things. AI can now provide lightweight expertise to entities like employers to do analyses like this themselves.

But they also made fun of me so I’m dinging them.

Care Journeys, But It’s A Video Game

Name of the Project -  Pokemon inspired AI doctor creator and care plan.

Who Contributed - 

Rachel Kim

Jose Rodriguez

Mimu Jung

Matthew Woo

What does it do? 

A creative exploration of what it might look like if patients created their own AI doctor based on their preferences (e.g. communication style, comprehensiveness), which would then get to know their family and come up with a personal care plan.



Hackathon Demo - YouTube

Contact info - Matthew@summerhealth.com

NK Commentary - I thought this one was very cute and fun. Video games have nailed engagement, maybe healthcare can learn something from them. I thought it was cool to see a patient journey actually put onto a trail - you can imagine being able to see how far you’ve come, milestones you’ve hit, potentially where forks in the road might be, or how close you are to the end. Turning a care journey into a fun visual for patients might just make the whole thing easier to understand.

Better Imaging Summaries

Name of the Project -  DAN (Data Augmentation for an N-1)

Who Contributed - 

Dan Conger

Blair Myers

Daniel Sam Pete Thiyagu

James Leonard

What does it do? 

The tool enhances shared decision-making by extracting data from Electronic Health Records (EHRs) and translating it into digestible, clinically relevant information. Utilizing an agentic RAG pipeline, it explains imaging reports in plain language, citing peer-reviewed literature, and ensures clear communication by summarizing information into draft messages for providers.

TL;DR: it helps patients make sense of the imaging report that shows up in their inbox.
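
To make the idea concrete, here's a heavily simplified, single-step sketch of the retrieve-then-explain loop, assuming the OpenAI Python SDK; the project describes an agentic RAG pipeline, and the snippet library here is just a stand-in for indexed peer-reviewed literature.

```python
# Minimal, single-step sketch of the retrieve-then-explain idea (the real project
# describes an agentic RAG pipeline; this is just the core retrieval + plain-language
# step). The snippet "library" is a stand-in for indexed peer-reviewed literature.
import numpy as np
from openai import OpenAI

client = OpenAI()

LIBRARY = [
    "Hepatic hemangiomas are common benign liver lesions that rarely require treatment.",
    "Simple renal cysts are frequent incidental findings and are almost always benign.",
    "Pulmonary nodules under 6 mm in low-risk patients generally need no routine follow-up.",
]

def _embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

LIB_VECS = _embed(LIBRARY)

def explain_report(report_text: str, k: int = 2) -> str:
    query = _embed([report_text])[0]
    sims = LIB_VECS @ query / (np.linalg.norm(LIB_VECS, axis=1) * np.linalg.norm(query))
    context = "\n".join(LIBRARY[i] for i in sims.argsort()[::-1][:k])

    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": (
            "Explain this imaging report to a patient in plain language, citing the "
            f"background facts where relevant.\n\nBackground:\n{context}\n\nReport:\n{report_text}"
        )}],
    ).choices[0].message.content
```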



Loom Demo: DAN 3 minute video

Hackathon Demo: Youtube

Contact info - 

JamesFLeonard@gmail.com

Bedmyers@gmail.com

dconger2@gmail.com

danielsamfdo@gmail.com

NK Commentary - Imaging reports really highlight the divide of text written for other doctors vs. patients. If one more radiologist calls my liver unremarkable they’re gonna have to square up.

If you’ve ever seen an imaging report, it’s like someone just started smashing the keyboard with their fist and threw in very scary sounding words like “lesions”. When patients see this, it can be very confusing and anxiety inducing and most patients have the same questions. “What does this mean?”, “how common is this?”, “what are next steps?”, etc.

This has been exacerbated now that the reports are auto-released to patients thanks to the 21st Century Cures Act. Patients see these results and have no idea what they mean.

I think this is a fantastic place where AI can make a huge difference - quickly generate reports to help patients understand what’s happening in their images, hit the FAQs, and have it already loaded into the EHR for the physician.

AI Care Navigators

Name of the Project -  FLOW

Who Contributed - 

Kanon Mori

Carlos García Morán

Suman Sigdel

What does it do? 

Patients with chronic illnesses are completely lost -- what are the steps in my treatment journey, do I need to schedule them, and how much are they going to cost?? FLOW is a personal care navigator that on the front end, empowers patients with just the right amount of information, and on the back end, conducts all administrative chores for a seamless care experience. 

In the future, we can integrate features like seamless scheduling through an Epic integration, and proactive prior authorizations made possible by predicting use.



Hackathon Demo - Youtube

User Demo: flow-oop.vercel.app

Contact info - 

kanon76@ucla.edu

cgmoran32@gmail.com

sumansid1113@gmail.com

NK Commentary - If you could see the entire care plan, estimated dates for each step, etc., that would be great. My hunch is that hospitals don’t even know this internally, because so many things can shift with scheduling that it would be impossible to provide it with any certainty.

But even just being able to visualize the steps and options going forward when you get a diagnosis feels very helpful. I feel like when a doctor has told me about the potential options, I try to make a visual in my head anyway. 

A Better Family History Tool

Name of the Project -  AI-powered genealogy health insights

Who Contributed - 

Samir Chowdhury

Daniel Kotin

Jinny Yoo

What does it do? 

Our app is a generative AI-enabled family health history platform that allows patients to easily share and interact with their family health history. We generate preventative care recommendations and help justify insurance coverage for these recommended steps, in turn allowing patients to be more involved in the decision-making process for their care.

Our project uses a Flask-based web application with Python backend, HTML/JavaScript frontend, Cytoscape.js for graph visualization, and Anthropic's Claude AI for health data analysis and query processing.
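
To give a sense of what the graph-serving piece could look like, here's a minimal sketch of a Flask endpoint that returns family-history records in the element format Cytoscape.js expects; the records and route are invented for illustration, not the team's actual code.

```python
# Minimal sketch of the graph-serving piece: a Flask endpoint that turns stored
# family-history records into the element format Cytoscape.js expects on the
# frontend. The records and fields here are invented for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

FAMILY = [
    {"id": "me",      "name": "Patient", "conditions": []},
    {"id": "mother",  "name": "Mother",  "conditions": ["type 2 diabetes"], "parent_of": "me"},
    {"id": "grandpa", "name": "Grandpa", "conditions": ["colon cancer"],    "parent_of": "mother"},
]

@app.route("/api/family-graph")
def family_graph():
    nodes = [{"data": {"id": p["id"],
                       "label": p["name"],
                       "conditions": p["conditions"]}} for p in FAMILY]
    edges = [{"data": {"source": p["id"], "target": p["parent_of"]}}
             for p in FAMILY if "parent_of" in p]
    # Cytoscape.js takes a flat list of node and edge elements.
    return jsonify(nodes + edges)

# Frontend: cytoscape({ container, elements: await fetch('/api/family-graph').then(r => r.json()) })
```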



Hackathon Demo - Youtube

Contact info - 

Our LinkedIn dms are open!

NK Commentary - I always thought an interesting wedge into a new personal health record would be one where it’s a multiplayer record that you and your family use. Family histories are pretty hard to keep track of today and usually just rely on patients remembering what their drunk aunt told them at Thanksgiving when she was spilling all the tea about the family. Having the records automatically linked together and auto-filling family history would be great.

Robocalls for Pre-Charting Documentation

Name of the Project -  Charty.ai

Who Contributed - 

Carlos Martinez

Ganga Nadella

Sophia Clark

Veronica Nutting

What does it do? 

Charty.ai takes the content of a provider's most recent outpatient notes (SOAP notes) and then uses an LLM (gpt-3.5-turbo) to summarize and prepare questions for a patient follow-up based on action items from the plan recommended in the SOAP note (e.g. "Have you been taking 10mg of Metformin daily as prescribed?").

The LLM will "call" and speak to the patient to ask about their plan and will summarize the patient's answers to generate a report for a PCP's pre-charting documentation. The backend is built with Node.js and we generate a viewable report using HTML.
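
Here's a rough sketch of the question-generation step, assuming the OpenAI Python SDK; the SOAP note and prompt are fabricated examples, not Charty.ai's actual code.

```python
# Rough sketch of the question-generation step described above, assuming the
# OpenAI Python SDK. The SOAP note here is a fabricated example.
from openai import OpenAI

client = OpenAI()

def followup_questions(soap_note: str) -> list[str]:
    """Turn the plan section of the last visit's SOAP note into call questions."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": (
            "From the Plan section of this SOAP note, write 3-5 short, plain-language "
            "follow-up questions a caller could ask the patient before their next visit. "
            "One question per line.\n\n" + soap_note
        )}],
    )
    return [q.strip("- ").strip() for q in resp.choices[0].message.content.splitlines() if q.strip()]

# followup_questions("S: ... O: ... A: Type 2 diabetes, poorly controlled. "
#                    "P: Start metformin 500mg BID, check fasting glucose daily, follow up in 4 weeks.")
# -> ["Have you been taking metformin 500mg twice a day as prescribed?", ...]
```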

They simulated the phone call and showed the summary pop up in the EHR.

Hackathon Demo - Youtube

Contact info - 

NK Commentary - When I saw this demo, it made me realize that there’s going to be a huge divide in the near future. Some people will see robocalling to follow up with you as a totally normal part of the patient visit, and some people will see it as equivalent to the robo spam calls we get today and/or as a sign the doctor’s office doesn’t even take the time for them. 

Personalized Lab Result Reports (...from a grandmother)

Name of the Project -  What Would Granny Say

Who Contributed - 

Claire North

Akash Chaurasia

Arshia Kapil

Pratik Katte

What does it do? 

"What Would Granny Say?" is an innovative bloodwork chatbot that demystifies bloodwork results for patients using a warm, calming "Granny personality." It offers personalized medical advice tailored to your specific results while reducing healthcare provider burden by cutting down on lab result-related messages and calls. By providing accurate, personalized information and actionable steps, it helps reduce patient anxiety caused by googling generic medical information, empowering patients to make everyday lifestyle changes and improve their health year by year.

This chatbot was built using OpenAI’s gpt-4o model, Flask for the backend, and React for the front end.
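
A minimal sketch of what that backend could look like, assuming Flask and the OpenAI Python SDK; the route name and Granny prompt are illustrative, not the team's actual code.

```python
# Minimal sketch of the backend described above: a Flask route that sends lab
# results plus the patient's question to gpt-4o with a "Granny" system prompt.
# Route name and prompt wording are illustrative, not the team's actual code.
from flask import Flask, jsonify, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()

GRANNY_PROMPT = (
    "You are a warm, reassuring grandmother who happens to understand lab results. "
    "Explain what the values mean in plain language, suggest everyday lifestyle steps, "
    "and remind the patient to confirm anything important with their doctor."
)

@app.route("/api/ask-granny", methods=["POST"])
def ask_granny():
    body = request.get_json()
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": GRANNY_PROMPT},
            {"role": "user", "content": f"My results:\n{body['lab_results']}\n\nMy question: {body['question']}"},
        ],
    ).choices[0].message.content
    return jsonify({"reply": reply})
```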

Blood-Work Granny, wasn’t that a villain in Avatar: The Last Airbender?

Recorded Walkthrough -  walkthrough

Hackathon Demo - Youtube

Contact info - 

claireknorth13@gmail.com 

NK Commentary - Auto-released lab results are very confusing, and large language models will likely be used to help patients make sense of them and to help doctors write quick summaries of what’s happening. I liked that in their project they showed things like “recipes to address some of the issues you’re facing”. People always talk about how doctors aren’t trained in nutrition as much as they should be; I can see these large language models helping to fill in the gap and give patients more personalized suggestions for non-clinical things like food or exercise. 

Fighting Your Medical Bills (Automatically)

Name of the Project -  Medical Bill Fighter 

Who Contributed - 

Hannah Sennik

Kate Zellmer

James Xu

What does it do? 

80% of medical bills contain errors. A huge burden is placed on patients to find these downstream errors and figure out what they can do about it. We used Voiceflow to build a chatbot that guides people through this cumbersome and frustrating process. It leverages a knowledge base of common medical billing errors, a combination of rule and LLM based decision making, and an email API to draft a dispute that can be sent to a provider's billing department.
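
Here's a sketch of the rule-plus-LLM split they describe (the actual project orchestrates this in Voiceflow); the duplicate-charge rule and dispute prompt are illustrative, not the team's actual logic.

```python
# Sketch of the rule + LLM split described above (the actual project orchestrates
# this in Voiceflow). The duplicate-charge rule and dispute prompt are illustrative.
from openai import OpenAI

client = OpenAI()

def find_duplicate_charges(line_items: list[dict]) -> list[dict]:
    """Simple rule: flag identical CPT codes billed on the same date of service."""
    seen, duplicates = set(), []
    for item in line_items:
        key = (item["cpt_code"], item["date_of_service"])
        if key in seen:
            duplicates.append(item)
        seen.add(key)
    return duplicates

def draft_dispute(duplicates: list[dict], provider_name: str) -> str:
    """LLM step: turn flagged errors into a polite dispute email."""
    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": (
            f"Draft a short, firm but polite email to {provider_name}'s billing department "
            f"disputing these apparently duplicated charges and requesting an itemized review:\n{duplicates}"
        )}],
    ).choices[0].message.content
```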



Hackathon Demo - Youtube

Contact info - 

NK Commentary - A lesser known offshoot of Street Fighter, medical bill fighter lets you Hadouken that level 5 E&M visit.

I like AI tools for consumers to fight back. With more data shifting into the hands of patients via 21st Century Cures Act, soon you’ll be able to cross reference what happened in the hospital with your actual bill to see if it was coded properly. On top of drafting an angry email to your provider, there should also be a “snitching” function that anonymously posts on twitter every time a provider is overbilling someone and how much the overbilling was.

Giving Proactive Nudges to Close Care Gaps

Name of the Project -  Closing Care Gaps for Chronic Diseases

Who Contributed - 

Nagarjuna Tella

Shailesh Dudala

Surya Chappa

William Laolagi

What does it do? 

The value-based care model is focused on reducing the costs of a bloated healthcare system by incentivizing the quality of care provided and improving patient outcomes, with an emphasis on preventing the progression of chronic diseases. While providers are aware of these goals, most providers are already overburdened with other administrative work in addition to patient-facing work. Additionally, care gap reports are surfaced from multiple payers and cannot readily be used to proactively close patient care gaps.

Our project surfaces these care gaps in near real-time by obtaining the CCDAs sent from the practice to the payer and extracting relevant metadata using a RAG-based GenAI pipeline to calculate the care gaps. This information is then overlaid on the physician's EMR using a SMART on FHIR app, showing care gaps at both the patient-population level and the patient-specific level. This real-time information can be used by the physician to tailor conversations and treatment plans for patients accordingly.

TL;DR - Proactively tells docs what to watch out for based on reports the payer sends.
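
To make the care-gap calculation concrete, here's a very simplified sketch of pulling one gap out of a CCDA, assuming lxml; real CCDAs vary a lot in structure, and this is not the team's actual RAG pipeline.

```python
# Very simplified sketch of the "extract a care gap from a CCDA" idea: look for the
# most recent HbA1c result observation (LOINC 4548-4) and flag a gap if it's older
# than six months. Real CCDAs vary a lot in structure; this assumes a well-formed
# result observation and is not the team's actual pipeline.
from datetime import datetime, timedelta
from lxml import etree

NS = {"hl7": "urn:hl7-org:v3"}
HBA1C_LOINC = "4548-4"

def hba1c_gap(ccda_path: str, max_age_days: int = 183) -> bool:
    doc = etree.parse(ccda_path)
    times = doc.xpath(
        f'//hl7:observation[hl7:code/@code="{HBA1C_LOINC}"]/hl7:effectiveTime/@value',
        namespaces=NS,
    )
    if not times:
        return True  # no A1c on record at all -> open care gap
    latest = max(datetime.strptime(t[:8], "%Y%m%d") for t in times)
    return datetime.now() - latest > timedelta(days=max_age_days)
```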


Hackathon Demo - Youtube

Contact info - 

1. nagarjunatella@hotmail.com

2. shaileshdudala@icloud.com

3. surya.chappa@icloud.com

4. william.laolagi@gmail.com

NK Commentary - There was an interesting discussion with the judges after this presentation about whether doctors would listen to reports from payers about what to do with their patients, which I think actually gets to a key premise question for AI companies: is this problem a technical issue or an incentive alignment issue? 

The things that payers care about in value-based care arrangements might be very different from what doctors want, even though you’d think they’d be aligned. For care gaps in particular, payers sometimes really want the doctors to optimize for closing the care gaps that give them the most bonus payments, which may not be how the doctor prioritizes things.

AI Nurse Triage Lines

Name of the Project -  ETHOS

Who Contributed - 

Pavan Agrawal

Justin Lin

Vaibhav Verma

What does it do? 

ETHOS is a voice agent that helps patients triage their medical concerns, reducing the need to call a nurse support line that health systems or insurance companies run today. We spent the bulk of our time evaluating the efficacy of LLMs for this task, where we measured how good an LLM is at helping patients understand their options for treatment and the level of care needed.

We simulated hundreds of conversations between patients and the voice agent to measure how well GPT-4 and Anthropic's Claude 3.5 Sonnet work. We found that GPT-4 very closely follows industry-standard triage protocols, such as Schmitt-Thompson's, and there's low variance between the first aid and level of care suggested when prompted with triage instructions.
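
Here's a sketch of what that evaluation loop could look like, assuming the OpenAI Python SDK; the vignettes, reference labels, and prompt are invented for illustration (the team benchmarked against Schmitt-Thompson-style protocols).

```python
# Sketch of the evaluation loop described above: run simulated patient vignettes
# through a triage prompt and compare the suggested level of care to a reference
# label. Vignettes, labels, and prompt are invented for illustration.
from openai import OpenAI

client = OpenAI()

LEVELS = ["self-care at home", "see a doctor within 24 hours", "go to the ER now"]

VIGNETTES = [
    ("Mild sore throat for two days, no fever, drinking fluids fine.", "self-care at home"),
    ("Crushing chest pain and shortness of breath for 20 minutes.", "go to the ER now"),
]

def triage(complaint: str) -> str:
    return client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content":
                f"You are a nurse triage assistant. Reply with exactly one of: {LEVELS}."},
            {"role": "user", "content": complaint},
        ],
    ).choices[0].message.content.strip().lower()

def accuracy() -> float:
    hits = sum(triage(complaint) == label for complaint, label in VIGNETTES)
    return hits / len(VIGNETTES)
```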


Demo - Link for the app

Hackathon Demo - Youtube

Contact info - 

pavan@edith.ai

NK Commentary - While I’m skeptical projects like this will fully automate nurse triage lines, I could see this being an excellent tool to get the basic information from a patient and then put patients in the queue based on severity for a nurse. 

Also it feels like audio alone is missing the real potential here. With multimodal models, it would be cool if you could have a nurse FACETIME line so you can show visuals. A lot of the call is just spent explaining visuals. 

I could see payers leaning into this and making it easier to access and more multimodal so that people go to them instead of urgent care.

Conclusion, Sponsors and Parting Thoughts

We’re definitely going to be doing another hackathon next year; the tools are getting better and healthcare needs these kinds of builder spaces. Plus I got to try my hand at standup comedy, about which someone at the happy hour afterwards said “wait, when did you do the standup?”


And finally, I want to thank our highest-tier sponsors, who were integral parts of the event. They gave talks, hackers built on their tools, and we had a great time with them!

  • Canvas Medical empowers organizations to safely and quickly leverage AI and accelerated computing to improve care delivery through its pre-built, condition-specific EMR solutions or its highly extensible and customizable EMR platform.

  • ​Abridge's AI-powered platform improves clinical documentation efficiencies while enabling clinicians to focus on what matters most—their patients. Interested in joining their mission to power deeper understanding in healthcare? Check out open roles here.

  • Autoblocks AI helps product teams improve the reliability and accuracy of GenAI features. Applications to their Health Tech Partner Program close this week. This is an exclusive opportunity to work 1:1 with the Autoblocks team to evaluate & improve the accuracy of your LLM-based products: Apply Now.

We’re scheming up next year’s hackathon already, so if you’re trying to get people to build on top of your shit, hit us up. 

It was a good time

Thinkboi out, 

Nikhil aka. “Hack city b**** hack hack city”

Twitter: ​@nikillinit​

IG: ​@outofpockethealth​

Other posts: ​outofpocket.health/posts​

--

{{sub-form}}

‎If you’re enjoying the newsletter, do me a solid and shoot this over to a friend or healthcare slack channel and tell them to sign up. The line between unemployment and founder of a startup is traction and whether your parents believe you have a job.

Let's Keep In Touch
