June 13, 2025
Recently, we decided to sunset our nurse call device. This post is a reflection on everything we worked on, what we accomplished, and what we learned.
This was a direction we had pursued for 8 months: 253 days of building, testing, and iterating.
We spoke with nurses, doctors, administrators, and just about anyone in healthcare who would take a meeting. We saw the potential, and so did the people we spoke with, for what we were building to fundamentally change how patients and caregivers communicate. Here's a bit about the journey.
What happens when a patient can't call for help?
In the earliest stages, that question led me to think about what could be done to bridge this gap, and why it's a problem that hasn't been solved. Whether a patient is unable to use a physical call bell or the device has simply fallen out of reach, there are moments when communication is cut off exactly where it's needed most.
In search of an answer, I put a team together and visited Johns Hopkins University for a hackathon, a chance to prototype a solution and get feedback. We sketched out interfaces that relied on alternative input modalities for patients with limited use of their hands or with speech difficulties.
Digging deeper, I found that there are many situations in which patients are left without a way to communicate their needs. A patient may drop their call bell on the floor and be unable to reach it without risking a fall. Another patient recovering from surgery may be too weak or in too much discomfort to press a button. Others may be intubated and unable to speak, or living with conditions like ALS that limit their ability to use traditional devices.
There's also the issue of limited resources. Short staffing prevents timely responses, increasing the risk of a patient taking matters into their own hands and attempting to move without help. In those moments, something as simple as trying to get out of bed unassisted can lead to falls, preventable injuries, and setbacks in recovery.
These realizations drove everything we did next.
We started simple. The earliest version of the product was essentially an iPad app built on the then-beta eye-tracking feature. The user would look at one of six icons on the screen and hold their gaze to select it, and the request would be routed to the appropriate staff member via the companion app. It was clunky, barebones, and rushed, but it proved our concept and started conversations.
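If you're curious about the mechanics, dwell selection is essentially a timer. Here's a minimal sketch in Python; the class name, the sample rate, and the 1.5-second threshold are illustrative assumptions, not our production values:

```python
import time

class DwellSelector:
    """Fires a selection when the gaze rests on one target long enough."""

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.gaze_started_at = None

    def update(self, target, now=None):
        """Feed one gaze sample (the icon under the gaze, or None).
        Returns the selected icon once the dwell threshold is met."""
        now = time.monotonic() if now is None else now
        if target != self.current_target:
            # Gaze moved to a new icon (or off-screen): restart the timer.
            self.current_target = target
            self.gaze_started_at = now
            return None
        if target is not None and now - self.gaze_started_at >= self.dwell_seconds:
            # Held long enough: select once, then reset.
            self.current_target = None
            self.gaze_started_at = None
            return target
        return None

# Samples arriving at ~10 Hz while the patient looks at the "water" icon:
selector = DwellSelector()
for i in range(20):
    if (selected := selector.update("water", now=i * 0.1)):
        print("Selected:", selected)  # fires once, at the 1.5 s mark
        break
```

The reset after firing matters: without it, a patient who keeps looking at the same icon would trigger the same request over and over.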
Next, we focused on usability. The second version introduced the option to record a message, which was then transcribed and sent as a notification. We mounted the tablet and tested it in a hospital environment.
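The flow behind that feature was a straight pipeline: record, transcribe, notify. A rough Python sketch, with stand-in functions where a speech-to-text service and the companion app's push channel would plug in; none of these names come from our actual stack:

```python
from dataclasses import dataclass

@dataclass
class PatientRequest:
    room: str
    transcript: str

def transcribe(audio_bytes: bytes) -> str:
    """Stand-in for a speech-to-text call (on-device or cloud)."""
    raise NotImplementedError

def notify_staff(request: PatientRequest) -> None:
    """Stand-in for the companion app's push-notification channel."""
    raise NotImplementedError

def handle_recording(room: str, audio_bytes: bytes) -> None:
    # Record -> transcribe -> notify, end to end.
    transcript = transcribe(audio_bytes)
    notify_staff(PatientRequest(room=room, transcript=transcript))
```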
The third version of the product moved beyond software. Much of the feedback we received pointed to the need for a cost-effective solution; facilities almost always opt for the cheapest option when evaluating nurse call systems. So we started building a hardware prototype stripped to the core functionality: a mounted device accessible to a patient lying face up.
With two card-shaped selectors, we cut the functionality down to a single choice: call the patient's assigned nurse or call a CNA. A patient can route their call to the appropriate staff member using only their eyes, without ever touching a physical call device.
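Stripped down that far, the routing logic becomes almost trivial. A hedged sketch in Python, with hypothetical roster and shift lookups standing in for whatever a facility's systems would actually provide:

```python
def lookup_assigned_nurse(room: str) -> str:
    # Stand-in for a roster lookup: who is this room's assigned nurse?
    return f"nurse-for-{room}"

def lookup_on_duty_cna(room: str) -> str:
    # Stand-in for a shift-schedule lookup for the room's unit.
    return "cna-on-duty"

def route_call(selected_card: str, room: str) -> str:
    """Map the card the patient dwelled on to the person who gets paged."""
    if selected_card == "nurse_card":
        return lookup_assigned_nurse(room)
    if selected_card == "cna_card":
        return lookup_on_duty_cna(room)
    raise ValueError(f"unknown selector: {selected_card}")

print(route_call("nurse_card", "302"))  # -> nurse-for-302
```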
Despite the progress, we realized that the barriers to adoption were greater than mere technical challenges. Hospitals are risk-averse. Budgets are tight. Regulations are complex. And introducing a new system, no matter how promising, requires a level of trust and validation that, quite frankly, we didn't have.
This doesn't mean it can't be done, nor does it mean we're abandoning our mission. We're proud of what we built, and we're redirecting our energy to move faster towards the future we set out to create.
- Yonatan