Researchers at MIT have created an imaging system using artificial intelligence to effectively see around corners.

The new algorithm has been developed by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and can work with cameras on smartphones.

The system uses information about light reflected from objects or people around a corner to assess their speed and trajectory. According to MIT, the development has major implications, ranging from emergency response to self-driving cars.

As an example, the researchers use the scenario of walking along an L-shaped hallway, with a wall between you and the objects around the corner. Those objects cast a ‘fuzzy shadow’ in the form of a small amount of light on the ground. That fuzzy shadow is called the ‘penumbra’, and when it falls in the line of sight of the person walking down the corridor, it provides the key to the system.

Dubbed ‘CornerCameras’, the researchers’ system uses video of the penumbra to create a series of one-dimensional images that are then stitched together and can provide information about the objects on the other side of the corner.

By analysing the penumbra across many video frames, rather than looking at just one image, the ‘CornerCameras’ system can identify separate objects and determine their speed and trajectory.
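The idea of collapsing the penumbra into one-dimensional images and stacking them over time can be illustrated with a short sketch. The code below is a hypothetical simplification, not MIT’s actual algorithm: it bins the pixels of a penumbra patch by their angle around an assumed corner point, averages each bin into a 1D profile, and stacks the profiles from successive frames so that a moving hidden object traces a visible streak.

```python
import numpy as np

def penumbra_to_1d(frame, num_angles=8):
    """Collapse a 2D penumbra patch into a 1D angular profile.

    Pixels are binned by their angle around the wall's corner
    (assumed to sit at the patch's top-left); each bin is averaged.
    """
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    angles = np.arctan2(ys + 0.5, xs + 0.5)  # spans (0, pi/2) around the corner
    bins = np.minimum((angles / (np.pi / 2) * num_angles).astype(int),
                      num_angles - 1)
    return np.array([frame[bins == b].mean() for b in range(num_angles)])

def stitch_over_time(frames, num_angles=8):
    """Stack per-frame 1D profiles into a 2D (time x angle) image."""
    return np.stack([penumbra_to_1d(f, num_angles) for f in frames])

# Toy example: a hidden 'object' sweeps across the scene, brightening
# successive angular sectors of the penumbra in successive frames.
frames = []
for t in range(8):
    f = np.zeros((64, 64))
    ys, xs = np.mgrid[0:64, 0:64]
    ang = np.arctan2(ys + 0.5, xs + 0.5)
    f[(ang >= t * np.pi / 16) & (ang < (t + 1) * np.pi / 16)] = 1.0
    frames.append(f)

spacetime = stitch_over_time(frames)
# The bright diagonal in `spacetime` traces the object's angular
# position over time; its slope indicates speed and direction.
print(spacetime.argmax(axis=1))  # brightest angle bin per frame
```

In the stacked (time × angle) image, a stationary object produces a vertical stripe while a moving one produces a slanted streak, which is what lets the system separate objects and estimate their motion from otherwise featureless shadows.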

Katherine Bouman, lead author of a paper on the system, commented on its effectiveness: “Even though those objects aren’t actually visible to the camera, we can look at how their movements affect the penumbra to determine where they are and where they’re going.

“In this way, we show that walls and other obstructions with edges can be exploited as naturally-occurring ‘cameras’ that reveal the hidden scenes beyond them.”

Bouman sees applications ranging from helping firefighters find people in burning buildings to letting drivers see into their blind spots. Technology has been developed in the past to help cameras ‘see’ around corners, but these earlier systems, referred to as ‘time-of-flight’ cameras, typically relied on lasers.

These cameras are typically expensive and easily confused by ambient light, especially outdoors. In contrast, the new system performed better outdoors than the researchers expected, even in poor weather conditions.