I’m using a free Android app called BlindTool. Created by Joseph Paul Cohen, a Ph.D. candidate at the University of Massachusetts Boston, BlindTool lets you simply aim your phone at an object, and the app will try to identify it within a second.
Today, our computers have become absurdly good at identifying objects. Trained on more than a million images of mundane items like photocopiers and trash cans, our best neural nets, built by companies such as Google and Microsoft, can actually name things better than humans can. The catch is that, for the most part, these systems require powerful PCs or even servers in the cloud to process the information; they’re simply not practical for narrating someone’s surroundings in day-to-day life.
BlindTool, on the other hand, fits on a smartphone and runs as a completely self-contained app. How is this possible? Therein lies the trade-off. Whereas the most advanced neural nets are trained on images spanning as many as 37,000 categories, BlindTool is trained on a mere 1,000 categories of images (which, in fairness to the scale at play here, still represents 150GB of image files).
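To make the "self-contained" part concrete, here is a minimal Kotlin sketch of what fully on-device classification can look like, assuming a TensorFlow Lite model; BlindTool's own stack may differ, and the class name and label list here are purely illustrative.

```kotlin
// Hypothetical sketch: a 1,000-class image classifier that runs entirely
// on the phone, with no server round-trip. Model file and labels are
// placeholders, not BlindTool's actual assets.
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer

class OnDeviceClassifier(modelFile: File, private val labels: List<String>) {
    private val interpreter = Interpreter(modelFile)

    /** Returns the best label and its confidence for a preprocessed image buffer. */
    fun classify(input: ByteBuffer): Pair<String, Float> {
        val output = Array(1) { FloatArray(labels.size) }  // 1 x 1000 class scores
        interpreter.run(input, output)                     // inference on the device itself
        val scores = output[0]
        var best = 0
        for (i in scores.indices) if (scores[i] > scores[best]) best = i
        return labels[best] to scores[best]
    }
}
```

Because both the model and the inference loop live on the phone, the app can respond in about a second without any network connection, at the cost of a much smaller vocabulary of objects.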
However, BlindTool isn’t always so accurate. That relatively small training set is what lets users get almost instant feedback, but it also means the app is frequently wrong. Mark Wilson at FastCoDesign reports that when he tried the app in his apartment, it identified his Christmas tree as a feather boa and a door as an armoire. When Andrew LaSane at Mental Floss tested it out, the app named a toothbrush as a letter opener and insisted that a trashcan was a toilet seat.
Cohen designed the app with these inaccuracies in mind, and built in measures to let users know how confident (or unconfident) BlindTool is in a particular identification. The app speaks an identification aloud only if it’s at least 30 percent sure that it’s correct. The app also vibrates according to its confidence level, and will vibrate at full intensity only when it’s 90 percent sure that it’s right.
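That description maps naturally onto a small piece of feedback logic. Below is a hedged Kotlin sketch using standard Android APIs (TextToSpeech and VibrationEffect); the 30 percent and 90 percent thresholds come from the article, while the function name, the linear amplitude mapping, and the vibration duration are assumptions for illustration.

```kotlin
// Hypothetical sketch of confidence-based feedback: stay silent below 30%,
// speak the label above it, and scale vibration strength with confidence,
// reaching full intensity at 90% and up. Requires API 26+ for VibrationEffect.
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator
import android.speech.tts.TextToSpeech

fun announce(context: Context, tts: TextToSpeech, label: String, confidence: Float) {
    if (confidence < 0.30f) return  // too unsure: say nothing

    tts.speak(label, TextToSpeech.QUEUE_FLUSH, null, "blindtool-id")

    // Map the 30%..90% confidence band onto amplitude 1..255, clamping so
    // anything at or above 90% vibrates at full strength.
    val fraction = ((confidence - 0.30f) / 0.60f).coerceIn(0f, 1f)
    val amplitude = (1 + fraction * 254).toInt()
    val vibrator = context.getSystemService(Vibrator::class.java)
    vibrator?.vibrate(VibrationEffect.createOneShot(150L, amplitude))
}
```

The nice property of this design is that the user gets a second, silent channel of information: even when the spoken label is wrong, a weak buzz signals that the guess should be taken with a grain of salt.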
Link: Android
via: codesign