Tech Startup (Lucid) CEO Claims AI to Replace Depth Sensors In Smartphones


Published By - Debra Bruce

We have all thought about AI and how it can make our lives easier, but who would have thought it could replace the various sensors in our smartphones? If you think what you are reading is a hoax, you will be surprised once you read on. Lucid, a tech startup led by Han Jin (CEO) and Adam Rowell (CTO), has made the bold claim that AI can soon replace the depth sensors in smartphones. The claim comes on the back of the company's latest AI technology, which learns over time to infer depth and stereoscopic data from ordinary images, reconstructing depth in software and thereby enabling AI to replace dedicated depth sensors altogether.

How Is AI Transforming Smartphones?

If TV adverts are any clue, companies such as Oppo have already started marketing artificial intelligence based smartphones that take on the role of image beautifiers. Lucid is currently pitching its real-time 3D fusion technology to smartphone OEMs, asking them to incorporate the AI-based camera application: rather than bearing the hardware cost of depth sensors, OEMs would pay a minimal license fee for the artificial intelligence module. The company claims that one further benefit is that the AI module does not need regular updates.

Most smartphone software requires timely OTA updates, but Lucid's AI is capable of learning on its own over time and hence, the company says, requires none. Some of the more interesting Lucid AI applications include perceiving three dimensions and distances, correcting surface textures, compensating for bad or directional lighting, and using depth perception to produce cleaner images captured on smartphones.

Now, one has to think about it: should we still depend on hardware to get our images right, or should we let software take over? Is the software ready to handle every situation it might encounter? Lucid certainly seems confident, with over three years' worth of camera and drone imaging data captured and processed by its artificial intelligence. Moreover, the price point is in their favor: the IR-based depth sensor in the current iPhone X costs about $60 apiece, while an AI-based solution could do the same job with a dual-camera setup costing about $10. Something definitely to think about!
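Lucid has not published its algorithm, but the basic geometry behind getting depth from a cheap dual-camera setup is standard stereo triangulation: the farther an object is, the smaller the shift (disparity) between its positions in the two camera images. A minimal sketch, with purely hypothetical focal-length, baseline, and disparity values:

```python
# Illustrative only: depth from stereo disparity, the geometric principle
# that lets a dual-camera setup stand in for a dedicated IR depth sensor.
# All numbers below are made-up example values, not Lucid's parameters.

def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = f * B / d.

    focal_length_px -- camera focal length in pixels
    baseline_m      -- distance between the two camera lenses in metres
    disparity_px    -- pixel shift of the same point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 1000-px focal length, 1 cm baseline, 20-px disparity
print(depth_from_disparity(1000.0, 0.01, 20.0))  # 0.5 (metres)
```

In practice the hard part is finding the per-pixel disparity reliably under bad lighting and texture-poor surfaces, which is where a learned model (rather than classical block matching) is claimed to help.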