Conventional cameras will now be able to see in three dimensions

Researchers at Stanford University have developed a new approach that allows standard image sensors to see light in three dimensions. That is, these ordinary cameras could soon be used to measure the distance to objects.

Standard image sensors, already installed in almost every smartphone in use today, capture the brightness and color of light. Based on a standard sensor technology known as CMOS, these cameras get smaller and more powerful every year and now offer resolutions in the tens of megapixels. But so far they have seen in only two dimensions, producing flat 2D images.

Measuring the distance to objects using light is currently possible only with specialized and expensive lidars (from LIDAR, Light Detection and Ranging). Most often they are installed in self-driving cars, where the lidar serves as a collision avoidance system that uses lasers to determine the distance to objects. The problem is that existing lidar systems are large and bulky.
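The distance measurement a lidar performs rests on a simple time-of-flight relation: a laser pulse travels to the target and back at the speed of light, so the distance is half the round-trip time multiplied by that speed. A minimal sketch of this principle (the 200 ns figure is an illustrative value, not from the article):

```python
# Time-of-flight principle behind lidar ranging:
# distance = (speed of light) * (round-trip time) / 2
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a target given the laser pulse's round-trip time."""
    return C * t_seconds / 2.0

# A pulse returning after 200 nanoseconds corresponds to roughly 30 meters.
d = distance_from_round_trip(200e-9)
print(f"{d:.2f} m")
```

The factor of two accounts for the light traveling to the object and back; the hard engineering problem, and the reason lidars are bulky and expensive, is timing that round trip with sub-nanosecond precision.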

Scientists from Stanford addressed this problem by developing a new approach that allows standard image sensors to see light in three dimensions. The solution is based on a thin lithium niobate plate coated with two transparent electrodes.

Lithium niobate is piezoelectric: when electricity is applied through the electrodes, its crystal lattice changes shape, vibrating at very high, predictable, and controllable frequencies. As it vibrates, the lithium niobate strongly modulates the light; with the addition of a pair of polarizers, this new modulator effectively turns the light on and off several million times a second.
