Non-invasive glucose monitor EasyGlucose takes home Microsoft’s Imagine Cup and $100K

Microsoft’s yearly Imagine Cup student startup competition crowned its latest winner today: EasyGlucose, a non-invasive, smartphone-based method for diabetics to monitor their blood glucose. It and the two other similarly beneficial finalists presented today at Microsoft’s Build developer conference.

The Imagine Cup brings together winners of many local student competitions around the world, with a focus on social good and, of course, Microsoft services like Azure. Last year’s winner was a smart prosthetic forearm that uses a camera in the palm to identify the object it is meant to grasp. (They were on hand today as well, with an improved prototype.)

The three finalists hailed from the U.K., India and the U.S.; EasyGlucose was a one-person team from my alma mater UCLA.

EasyGlucose takes advantage of machine learning’s knack for spotting the signal in noisy data, in this case the little details of the eyeball’s iris. It turns out, as creator Bryan Chiang explained in his presentation, that the iris’s “ridges, crypts and furrows” hide little hints as to their possessor’s blood glucose levels.

EasyGlucose presents at the Imagine Cup finals

These features aren’t the kind of thing you can see with the naked eye (or rather, on the naked eye), but by clipping a macro lens onto a smartphone camera, Chiang was able to get a clear enough picture that his computer vision algorithms were able to analyze them.

The resulting blood glucose measurement is significantly better than any non-invasive method and more than good enough to serve in place of the most common method used by diabetics: stabbing themselves with a needle every couple of hours. Currently EasyGlucose gets within 7 percent of the pinprick method, well above what’s needed for “clinical accuracy,” and Chiang is working on closing that gap. No doubt this innovation will be welcomed warmly by the community, as will the low cost: $10 for the lens adapter, and $20 per month for continued support via the app.
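Chiang hasn’t published his pipeline, but the general shape of such a system is familiar: extract numeric features from an iris image, fit a regression model mapping them to glucose readings, and score it with the mean absolute relative difference (MARD) metric used for glucose monitors. A toy sketch with entirely synthetic data (the single “crypt-texture” feature and its correlation here are hypothetical, purely for illustration):

```python
import random

def fit_linear(xs, ys):
    # ordinary least squares for one feature: y ≈ a*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def mard(pred, ref):
    # mean absolute relative difference vs. reference pinprick values
    return sum(abs(p - r) / r for p, r in zip(pred, ref)) / len(pred)

random.seed(0)
# hypothetical data: an iris texture score loosely tracking glucose (mg/dL)
ref_glucose = [random.uniform(70, 180) for _ in range(200)]
feature = [0.01 * g + random.gauss(0, 0.05) for g in ref_glucose]

a, b = fit_linear(feature[:150], ref_glucose[:150])   # train split
pred = [a * x + b for x in feature[150:]]             # held-out split
print(f"MARD: {mard(pred, ref_glucose[150:]):.1%}")
```

In a real system the feature extraction (the hard part) would come from computer vision over the iris image, and the model would be far richer than a single linear fit; the MARD evaluation against fingerstick references is the standard part.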

It’s not a home run, or not just yet: Naturally, a technology like this can’t go straight from the lab (or in this case, the dorm) to international deployment. It needs FDA approval first, though it likely won’t have as protracted a review period as, say, a brand-new cancer treatment or surgical device. In the meantime, EasyGlucose has a patent pending, so no one can eat its lunch while it navigates the red tape.

As the winner, Chiang gets $100,000, plus $50,000 in Azure credit, plus the coveted one-on-one mentoring session with Microsoft CEO Satya Nadella.

The other two Imagine Cup finalists also used computer vision (among other things) in service of social good.

Caeli is taking on the issue of air pollution by producing custom high-performance air filter masks intended for people with chronic respiratory conditions who have to live in polluted areas. This is a serious problem in many places that cheap or off-the-shelf filters can’t really solve.

It uses your phone’s front-facing camera to scan your face and pick the mask shape that makes the best seal against your face. What’s the point of a high-tech filter if the unwanted particles just creep in the sides?

Part of the mask is a custom-designed compact nebulizer for anyone who needs medication delivered in mist form, for instance someone with asthma. The medicine is delivered automatically according to the dosage and schedule set in the app, which also tracks pollution levels in the area so the user can avoid hot zones.

Finderr is a compelling solution to the problem of visually impaired people being unable to find items they’ve left around their home. By using a custom camera and computer vision algorithm, the service watches the home and tracks the placement of everyday items: keys, bags, groceries and so on. Just don’t lose your phone, as you’ll need that to find the other stuff.

You call up the app and tell it (by speaking) what you’re looking for, then the phone’s camera determines your location relative to the item, giving you audio feedback that guides you to it in a sort of “getting warmer” style, and a big visual indicator for those who can see it.
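The “getting warmer” interaction can be pictured as a simple loop: at each step, compare the user’s current distance to the item against the previous distance and speak the appropriate cue. A minimal sketch (the coordinates and cue names are made up for illustration, not Finderr’s actual interface):

```python
import math

def guidance(user_xy, item_xy, prev_dist):
    # one step of "getting warmer" feedback: did this move shrink the distance?
    dist = math.dist(user_xy, item_xy)
    if prev_dist is None:
        cue = "start"
    elif dist < prev_dist:
        cue = "warmer"
    elif dist > prev_dist:
        cue = "colder"
    else:
        cue = "steady"
    return cue, dist

# hypothetical walk toward keys left at (4, 3)
keys = (4.0, 3.0)
path = [(0.0, 0.0), (1.0, 1.0), (1.0, 0.0), (3.0, 2.5)]
prev, cues = None, []
for pos in path:
    cue, prev = guidance(pos, keys, prev)
    cues.append(cue)
print(cues)
```

The real app would get the user’s position from the phone camera and the item’s position from the ceiling cameras, and render the cue as audio rather than text.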

After their presentations, I asked the creators a few questions about upcoming challenges, since as is usual in the Imagine Cup, these companies are extremely early-stage.

Right now EasyGlucose is working well, but Chiang emphasized that the model still needs lots more data and testing across multiple demographics. It’s trained on 15,000 eye images, but many more will be necessary to get the kind of data they’ll need to present to the FDA.

Finderr recognizes all the images in the widely used ImageNet database, but the team’s Ferdinand Loesch pointed out that others can be added very easily with 100 images to train with. As for the upfront cost, the U.K. offers a £500 grant to visually impaired people for this sort of thing, and they engineered the 360-degree ceiling-mounted camera to minimize the number needed to cover the home.
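Loesch didn’t describe how those 100-image classes are added, but one common way to bolt a new class onto an existing recognizer is a nearest-centroid classifier over image embeddings: average the embeddings of the new class’s examples and classify queries by the closest centroid. A toy sketch with fake embeddings (the item names and vectors are hypothetical):

```python
import random

def centroid(vecs):
    # mean embedding vector of one class's training images
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

def classify(vec, centroids):
    # nearest centroid by squared Euclidean distance
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: d2(vec, centroids[name]))

random.seed(1)
def fake_embeddings(center, n=100, noise=0.1):
    # stand-in for 100 image embeddings clustered around a class center
    return [[c + random.gauss(0, noise) for c in center] for _ in range(n)]

centroids = {
    "keys": centroid(fake_embeddings([1.0, 0.0, 0.0])),
    "bag":  centroid(fake_embeddings([0.0, 1.0, 0.0])),
}
# adding a new household item is just one more centroid from ~100 examples
centroids["mug"] = centroid(fake_embeddings([0.0, 0.0, 1.0]))

print(classify([0.05, -0.02, 0.9], centroids))
```

Real embeddings would come from a pretrained vision network rather than random draws, which is exactly why so few examples per new class can suffice.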

Caeli noted that the nebulizer, which really is a medical device in its own right, could be sold and promoted on its own, perhaps licensed to medical device manufacturers. There are other smart masks coming out, but he had a fairly low opinion of them (not surprising in a competitor, but there isn’t some big market leader they need to dethrone). He also pointed out that in the target market of India (from which they plan to expand later) it isn’t as difficult to get insurance to cover this kind of thing.

While these are early-stage companies, they aren’t hobbies — though, admittedly, many of their founders are working on them between classes. I wouldn’t be surprised to hear more about them and others from Imagine Cup pulling in funding and hiring in the next year.

Source: TechCrunch