EETimes on Event-Driven Sensor Use Cases

EETimes has published the article "Neuromorphic Vision Sensors Eye the Future of Autonomy" by Anne-Françoise Pelé. A few quotes:

“Why do we say that an event-based vision sensor is neuromorphic? Because each pixel is a neuron, and it totally makes sense to have the artificial intelligence next to the pixel,” Pierre Cambou, principal analyst at Yole Développement (Lyon, France) told EE Times.
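The "each pixel is a neuron" point refers to the standard dynamic vision sensor (DVS) principle: every pixel independently tracks its own log-intensity and emits an ON or OFF event only when the change since its last event crosses a threshold, so static parts of the scene produce no data at all. Below is a minimal Python sketch of that principle; the frame-based input, the threshold value, and the function name are illustrative assumptions, not iniVation's actual pixel circuit.

```python
import numpy as np

def dvs_events(frames, threshold=0.2, eps=1e-6):
    """Minimal sketch of DVS-style event generation from a stack of frames.

    frames    : sequence of 2-D intensity arrays (stand-in for continuous light)
    threshold : log-intensity change a pixel must see before it fires
                (0.2 is an illustrative value, not a datasheet figure)
    Returns a list of (t, y, x, polarity) events; polarity +1 = brighter, -1 = darker.
    """
    events = []
    # Each pixel remembers the log intensity at which it last fired.
    ref = np.log(frames[0] + eps)
    for t, frame in enumerate(frames[1:], start=1):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        on = diff > threshold    # pixels that brightened past the threshold
        off = diff < -threshold  # pixels that darkened past the threshold
        for y, x in zip(*np.nonzero(on)):
            events.append((t, int(y), int(x), +1))
        for y, x in zip(*np.nonzero(off)):
            events.append((t, int(y), int(x), -1))
        # Only firing pixels update their reference; "silent" pixels stay quiet.
        ref = np.where(on | off, log_i, ref)
    return events

# A static scene produces no events; only the pixel that changes fires.
frames = [np.ones((4, 4)) for _ in range(3)]
frames[2][1, 2] = 2.0  # one pixel brightens in the last frame
print(dvs_events(frames))  # -> [(2, 1, 2, 1)]
```

Static scenes generate essentially no events, which is exactly the property the power-consumption discussion further down relies on.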

“It has taken a while for us to come up with a good strategy,” iniVation’s CEO Kynan Eng said in an interview. While other companies perform high-speed counting, Eng said “it is no big deal counting objects at high speed” since conventional cameras can reach “a thousand frames per second, even more.” If applications don’t need to respond immediately, then “there is no point using our sensors.”

“I would [categorize] industrial vision as a relatively low risk, but low volume market,” said Eng. Hence, there has been little interest from venture funds. With an eye toward organic growth, iniVation is thinking in terms of economies of scale. Through its 2019 partnership with Samsung, iniVation shifted from manufacturing and silicon sales to selling cameras to the machine vision industry. “You can sell the $100 silicon, or you can package it in a camera and sell a $1,000 camera,” noted the Yole analyst Cambou.

“We recognized that it did not make sense for us to become a chip company,” Eng said. “We could raise a billion, and it would still not be enough to make the chip ourselves. People were asking us why our cameras were expensive and how we could make them cheap.” Partnering with Samsung, “makes that question go away.”

“A window for mobile will open in 2021 or 2022,” said Cambou. “Today, we have five cameras on the back of a Huawei phone.” Moving forward, he continued, “I don’t see anything else than an always-on neuromorphic camera. Some people talk about multispectral, but I am more thinking about always-on awareness.” An event-based camera could enable touchless interactions such as locking and unlocking phones.

Event-based cameras are power-efficient because consumption scales with pixel activity: “silent” pixels draw almost no energy. That’s a selling point as autonomous vehicles transition from internal combustion to electric drivetrains. For car companies, “power consumption is much more important than what I thought initially,” said Eng. “In their current planning for electric cars, if a car uses a 4 kW total power budget at constant speed, half of that is for moving the car and the other half is for the computing. Every watt you can save on the compute, you can add to the range of the car or have a smaller battery.”
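Eng's arithmetic is worth making explicit: at constant speed, range scales inversely with total draw, so with his example split of 2 kW for propulsion and 2 kW for compute, every watt trimmed from the compute side buys a proportional gain in range (or a proportionally smaller battery). The sketch below only reworks the quoted numbers; the specific wattage savings are illustrative assumptions.

```python
# Back-of-the-envelope check of Eng's example: a 4 kW budget at constant speed,
# split evenly between propulsion (2 kW) and compute (2 kW). At fixed speed,
# range scales inversely with total draw, so the relative gain from a compute
# saving needs no assumption about battery size or cruising speed.
DRIVE_W = 2000.0    # propulsion draw from the quoted example
COMPUTE_W = 2000.0  # compute draw from the quoted example

def range_gain(watts_saved):
    """Fractional range increase from trimming `watts_saved` off the compute load."""
    before = DRIVE_W + COMPUTE_W
    after = DRIVE_W + COMPUTE_W - watts_saved
    return before / after - 1.0

for saved in (100, 500, 1000):  # illustrative savings, not figures from the article
    print(f"save {saved:4d} W of compute -> ~{range_gain(saved):.1%} more range")
# save  100 W of compute -> ~2.6% more range
# save  500 W of compute -> ~14.3% more range
# save 1000 W of compute -> ~33.3% more range
```

The proportionality is the point Eng is making: the saving shows up directly as extra range or a smaller battery, regardless of pack size.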


[Image: iniVation’s DAVIS346 DVS camera]

