Most people over the age of 50 wear reading glasses, despite never having needed eyeglasses before. Why? As people age, the lenses in their eyes stiffen and become less able to change shape, which the eye relies on to focus on objects at different distances. This condition, known as presbyopia, makes reading particularly difficult, at least without holding the book at just the right distance for the size of the font.
At Stanford University, PhD candidate Nitish Padmanaban is working on a solution: eyewear that pairs an adjustable lens with eye-tracking software and a depth camera, so the system can estimate what the wearer is looking at and how far away that object is. Padmanaban joins the podcast to cover the details of how he developed this technology, how it could be implemented, the challenges he still needs to overcome, and where his future work is headed.
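To make the idea concrete, here is a minimal sketch of the core computation such eyewear performs: read out the depth at the gaze point, convert that distance into a focusing demand in diopters, and have the lens supply whatever the aging eye no longer can. All function names, window sizes, and the accommodation figure below are illustrative assumptions, not details of Padmanaban's actual system.

```python
import numpy as np

def fixation_distance_m(depth_map: np.ndarray, gaze_px: tuple[int, int],
                        window: int = 5) -> float:
    """Median depth (meters) in a small window around the gaze point,
    which is more robust to depth-sensor noise than a single pixel."""
    r, c = gaze_px
    patch = depth_map[max(r - window, 0):r + window + 1,
                      max(c - window, 0):c + window + 1]
    return float(np.median(patch))

def lens_power_diopters(distance_m: float,
                        max_accommodation_d: float = 0.5) -> float:
    """Optical power the adjustable lens must add so the eye can focus
    at `distance_m`, given the wearer's remaining accommodation range
    (0.5 D here is an assumed figure for an older eye)."""
    demand = 1.0 / distance_m          # focusing demand in diopters
    return max(demand - max_accommodation_d, 0.0)

# Example: a book at 0.4 m demands 1/0.4 = 2.5 D of focus; if the eye
# supplies only 0.5 D, the lens contributes the remaining 2.0 D.
depth = np.full((480, 640), 0.4)       # synthetic depth map: 0.4 m everywhere
d = fixation_distance_m(depth, (240, 320))
print(lens_power_diopters(d))
```

The real system must also handle gaze estimation error, depth holes, and lens response time, but the distance-to-diopters conversion above is the basic relationship it exploits.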
Interested in learning more? Press play and visit https://www.computationalimaging.org/.