This post was originally featured on HIStalk.
What is the Smartphone Physical, how did it get started, and how did you get involved?
The Smartphone Physical originated from a collaboration between Medgadget, TEDMED, and Nurture by Steelcase. As a Medgadget editor, I had been approached by TEDMED in the fall of 2012 to come up with an interactive experience for delegates. It was the perfect opportunity to implement an idea I had been developing through my dual roles. First, as a medical student at Johns Hopkins, where I’ve been learning how to perform physical exams on patients. Second, as an editor of Medgadget, for which I’ve been covering digital health apps and devices that are becoming increasingly capable of collecting clinically relevant information.
I noticed that many of the procedures I was performing as a medical student, such as taking blood pressure and visualizing patients’ ear drums and optic discs, were being reproduced on the digital health side with apps and peripheral devices. I combined the two perspectives into the Smartphone Physical to see how many clinically relevant exam maneuvers we could perform using smartphone-based technologies. To this end, I formed partnerships with many of the app creators and device manufacturers to develop the Smartphone Physical concept, and then recruited a team of forward-looking medical students and biomedical engineers to help perform the physical exams. In the process we’ve found some interesting use cases, ranging from telemedicine and occupational health to screening and medical education.
Can you elaborate on the reception to and next steps for the Smartphone Physical?
Overall we’ve received a lot of support from the digital health community as well as a tremendous amount of interest from the healthcare professional and patient communities. Since TEDMED we’ve performed at least 500 Smartphone Physicals and in the process have realized that there are some important obstacles to overcome if these digital health technologies are to cross the chasm and evolve from toys into tools. I summarize these in what I like to call the stairway to digital health: awareness, compliance, evidence, and reimbursement (ACER). The Smartphone Physical, among other initiatives and companies, is working to tackle these issues.
With regard to awareness, we recently showcased the Smartphone Physical at the American Medical Association, where hundreds of medical students got their hands on these apps and devices. As a medical student and co-founder of the med ed tech company Osmosis, I’m particularly interested in medical education as a way to reach the next generation of clinicians. In terms of compliance, the FDA just released its finalized guidance on mHealth and many of the Smartphone Physical apps/devices will be regulated as a result. That’s to be expected because they’re tackling some hard problems, such as clinical data collection and diagnostics.
In addition to government compliance, one of the most formidable challenges – not technologically, but politically – is getting the (patient-collected and validated) data into the electronic health record so it is available on-demand. And it can’t be a simple data dump, but rather needs to be structured in a way that insight can be gleaned easily. Big data does not imply big insight.
Evidence is a third key area that we are interested in. We talk anecdotally about improved outcomes or improved engagement of patients, but these need to be proven using randomized controlled trials. To this end, we’ve heard from a number of institutions that want to run digital health studies to determine whether these devices actually enhance patient engagement, reduce costs, and improve outcomes.
Perhaps the greatest challenge is the reimbursement question. Physicians and other clinicians will not use digital health devices and apps until there’s a clear path to being reimbursed for the time they invest in these upgrades and workflow changes.
Can you provide an example of a device that’s succeeding along these dimensions?
In general, device manufacturers keep their adoption metrics close to their chests. In some cases, this is because adoption is fairly low. For example, as a medical student at Hopkins, I’ve met hundreds of practicing doctors and I haven’t seen one who uses a smartphone-based ophthalmoscope or stethoscope. In part this is because many devices are fairly new and just getting FDA approval; that is, they still need to climb the ACER stairway. One device that has succeeded along these dimensions is AliveCor’s Heart Monitor. They’ve generated tremendous awareness, were FDA approved last year, and are working hard to generate evidence; recently they reported positive results from their SEARCH-AF study. They are also figuring out the business model, which will involve a data play.
Sounds like most of the innovation is going on in diagnostics, not treatments. Do you have any insights as to why that might be?
That’s a good question. The relative types of apps and devices can be viewed as a funnel. At the top, the majority are simply informational or reference, for example to look up drug interactions or calculate GFR. The next layer enables data collection, and includes many of the Smartphone Physical apps/devices such as the iSpO2 and blood pressure cuff. It’s a big jump, though, to the next layer, which moves from collecting data to actually offering a diagnosis. A number of dermatology apps have tried doing this with computer vision, though they have received mixed results. Smartphone Physical companies such as AliveCor and CellScope are trying to create machine learning algorithms to actually diagnose things like atrial fibrillation or ear infections, so I’d expect to see some fairly sophisticated apps coming out in the near future.
Have you heard of Shazam, the music-identification service?
Shazam is able to tell you what you’re listening to by matching parameters extracted from the waveform of the music. Similarly, an ECG can be deconstructed into various parameters. If we create a library with millions of ECG recordings that are tagged to diagnoses, an app may eventually become good enough to go beyond simple data collection and move toward data analysis. Again, this is the difference between big data and big insight. Even so, there will likely be a human expert, at least for the next five to 10 years, making the final call on the diagnosis, because machines are fallible.
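As a toy sketch of this matching idea (not AliveCor’s or anyone else’s actual algorithm, and with made-up parameters and labels), a labeled ECG library could be queried by finding the nearest recording in parameter space:

```python
import math

# Hypothetical parameters extracted from an ECG waveform:
# (mean heart rate in bpm, RR-interval variability, QRS duration in ms).
# A real system would derive these via signal processing on the raw trace.
LIBRARY = [
    ((72, 0.04, 90), "normal sinus rhythm"),
    ((75, 0.05, 95), "normal sinus rhythm"),
    ((110, 0.30, 88), "atrial fibrillation"),  # irregularly irregular RR intervals
    ((125, 0.35, 92), "atrial fibrillation"),
]

def diagnose(params):
    """Return the label of the closest recording in the library
    (1-nearest-neighbor on Euclidean distance over the parameters)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(LIBRARY, key=lambda entry: dist(entry[0], params))[1]

print(diagnose((118, 0.28, 90)))  # closest to an atrial fibrillation example
```

With millions of tagged recordings instead of four, the same lookup-by-similarity principle is what would let the app suggest a diagnosis rather than merely record data.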
Similarly, algorithms may be applied to other data collection apps/devices to extract useful diagnoses. For example, continuous oxygen saturation combined with EEG data may be used to diagnose sleep apnea at home. Repeated weight and blood pressure measurements can help diagnose hypertension.
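To illustrate the repeated-measurement idea with a minimal sketch (the 140/90 mmHg cutoff is the common rule of thumb for elevated blood pressure; the flagging logic here is invented for illustration, not any device’s actual algorithm):

```python
def flag_possible_hypertension(readings, threshold=(140, 90), min_elevated=3):
    """Flag a home blood pressure log for clinician follow-up if at least
    `min_elevated` readings are at or above the (systolic, diastolic) threshold."""
    sys_t, dia_t = threshold
    elevated = sum(1 for systolic, diastolic in readings
                   if systolic >= sys_t or diastolic >= dia_t)
    return elevated >= min_elevated

# A week of home readings as (systolic, diastolic) pairs in mmHg.
home_log = [(138, 88), (145, 92), (150, 95), (142, 89)]
print(flag_possible_hypertension(home_log))  # True: three readings are elevated
```

Even a simple rule like this only becomes useful once the data lands, structured, in the health record where a clinician can act on it, which is the point about big data versus big insight.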
In terms of treatment, there are few if any devices that currently do that. I know of at least one group working on a smartphone-controlled biofeedback machine, which would be great for the field of physical therapy. One key issue to keep in mind is that you do not necessarily want patients self-diagnosing or self-treating based on this generation of apps and devices. Eventually that will be the Holy Grail of healthcare – the engaged and empowered patient.
How do you see funding playing out? Will these devices be crowd-funded through platforms like Kickstarter or will they go the more traditional VC route?
The Smartphone Physical is a great case study for this. We included nine devices, which could be categorized into three broad groups. The first group was very consumer-facing: weight, blood pressure, and oxygen saturation monitors. The middle group was also primarily consumer-facing, but unlike the first group was meant for specific situations rather than daily use. These included the ECG, spirometer, and otoscope for ear examination. The third group was primarily provider-facing and included advanced devices such as the stethoscope, ophthalmoscope, and ultrasound.
The specific end user is the first indicator I’d look at when judging whether an app/device would crowd-fund successfully. For example, the Scanadu Scout was marketed as a medical tricorder for general consumer use and raised a record $1.6M on IndieGoGo. Similarly, I think if CellScope decided to crowd-fund their otoscope, it would be quite popular, since many parents or grandparents would want it for their young ones. This is not to say that provider-focused apps and devices won’t be successful, since there are certainly a number of med tech-focused crowd-funding sites now that attract those in the industry.
Broadly speaking, digital health is well poised for crowd-funding and crowd-investing. Healthcare as a whole, as you know, has been a very capital-intensive field. If you wanted to bring a new medical device such as a stent to market, you would need a lot of capital and would face significant regulatory pressure. Digital health apps and devices are much cheaper because the battery, monitor, processors, sensors, and transmitters are built into the smartphone and don’t necessarily need to be duplicated in the device itself. Indeed, the CellScope otoscope and AliveCor heart monitor are cases in point. This has reduced the amount of capital needed, and with the FDA’s final guidance the regulatory uncertainty has also decreased.
Any final thoughts?
We’ve been speaking with a lot of stakeholders who are interested in the Smartphone Physical, ranging from occupational health offices to pharmaceutical companies to hospitals and clinics. We’d be happy to forge connections with HIStalk readers who are interested in digital health and our specific application. They can contact us via our website or by sending an e-mail.