How All Programmable Revolutionizes Embedded Vision

[Image: person riding in a self-driving car. Caption: Autonomous driving is just the start of EV solutions.]

In 1982, Knight Rider brought us KITT, an artificially intelligent car that fought crime through high-tech features like embedded vision. More than 35 years later, automakers’ aspirations are racing past driver-only and assisted automation, such as blind-spot monitoring and adaptive cruise control. They have their eyes on the prize: fully autonomous driving.

In fact, IHS Automotive predicts autonomous car sales will hit 21 million by 2035.

Embedded vision (EV) replicates the human ability to take in visual information and process it in order to make decisions. EV simply does it with cameras, cables and CPUs, allowing machines like cars to absorb visual information and make decisions of their own.

That creates a host of design challenges:

  • High-performance demands: Enabling an embedded vision system to perform analytics in real time is a complex task. The higher the image resolution and frame rate, the more computational power is required to process the data and extract meaningful information from it. The growing challenge for designers is that this must be done faster, and at lower power, than ever before; the advent of machine learning algorithms will only exacerbate these demands (see the back-of-the-envelope sketch after this list).
  • Complex programming environment: Building a design that is differentiated and responsive, yet able to immediately adapt to the latest algorithms and image sensors, multiplies complexity. Designers face consequential choices about which tools and emerging techniques will let them build a high-quality design.
  • Shortened design cycles: Systems must be highly differentiated, extremely responsive and able to immediately adapt to the latest algorithms and image sensors. They must also hit the market faster than the competition. With shortened design cycles, designers are forced to choose between creating next-generation architectures and getting their IP to market on deadline.
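To see how quickly the performance numbers grow, consider a single 1080p camera at 60 frames per second. The C++ sketch below is a back-of-the-envelope calculation with illustrative figures only; the 100 operations per pixel is an assumed cost for a modest filter chain, not a measured one.

    #include <cstdint>
    #include <iostream>

    int main() {
        const std::uint64_t width  = 1920;      // 1080p frame width
        const std::uint64_t height = 1080;      // 1080p frame height
        const std::uint64_t fps    = 60;        // frames per second
        const std::uint64_t opsPerPixel = 100;  // assumed cost of a modest filter chain

        const std::uint64_t pixelsPerSec = width * height * fps;        // ~124 million pixels/s
        const std::uint64_t opsPerSec    = pixelsPerSec * opsPerPixel;  // ~12.4 billion ops/s

        std::cout << pixelsPerSec << " pixels/s, " << opsPerSec << " ops/s\n";
    }

Move to 4K, add machine learning inference, or multiply by the several cameras an autonomous vehicle carries, and the compute budget climbs by an order of magnitude or more.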

Let’s take our autonomous driving example. This EV application, which promises to simplify a common task for people globally, is a deeply complex system with multiple interactions between all of its parts (a simplified sketch follows the list):

  • Sensing: processing raw frame-by-frame data via in-vehicle sensors
  • Perception: taking data to do object detection, classification and positioning
  • Mapping: identifying safe driving areas and objects within a mapped area
  • Localizing: pairing information with the vehicle’s high-accuracy GPS
  • Route/path planning: determining short and long-term driving paths—including incident reaction
  • Motion planning: navigating vehicle control strategies appropriate for selected routes
  • Vehicle control: issuing braking, throttling, steering and suspension commands while driving
  • Driver interaction: providing feedback to the driver, sensing driver intent and handing off control
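To make the data flow concrete, here is a deliberately simplified C++ sketch of that pipeline. Every type and function is a hypothetical stub standing in for what is, in practice, a major subsystem:

    #include <iostream>
    #include <vector>

    // Hypothetical placeholder types; real systems carry far richer payloads.
    struct Frame   { int id = 0; };                   // raw sensor data
    struct Object  { int cls = 0; };                  // detected, classified object
    struct Pose    { double x = 0, y = 0; };          // vehicle position
    struct Path    { std::vector<Pose> waypoints; };  // planned route segment
    struct Command { double brake = 0, steer = 0; };  // actuator command

    // Stub stages, one per step in the list above.
    std::vector<Object> perceive(const Frame&) { return {Object{}}; }
    Pose    localize(const std::vector<Object>&) { return Pose{}; }
    Path    planPath(const Pose& p) { return Path{{p}}; }
    Command planMotion(const Path&) { return Command{}; }

    int main() {
        Frame frame{1};                    // Sensing: one raw camera frame
        auto objects = perceive(frame);    // Perception: detect and classify
        auto pose    = localize(objects);  // Localization: fuse with GPS
        auto path    = planPath(pose);     // Route/path planning
        auto cmd     = planMotion(path);   // Motion planning
        std::cout << "brake=" << cmd.brake << " steer=" << cmd.steer << "\n";  // Vehicle control
    }

In a real vehicle these stages run concurrently, at different rates, across multiple compute elements, which is exactly why partitioning them well matters.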

Pulling this off used to be quite a challenge: whether the vehicle is a tiny sports car or a large truck, truly autonomous driving requires a network of cameras covering every corner of it.

But All Programmable SoCs have brought more clarity to this complex process.

Previous-generation ADAS systems required an external processor to implement the algorithms for image processing and analytics. Such ASSP-based architectures relied on proprietary interface protocols and were more challenging to customize for feature differentiation.

With the advent of All Programmable MPSoCs, software bottlenecks can be accelerated in high-performance programmable logic while retaining the reconfigurability required for rapid upgrades. Designers may choose a software-defined development flow within a familiar, Eclipse-based environment using the C and C++ languages, and leverage hardware-optimized image processing libraries such as OpenCV to achieve an optimal partition of embedded vision algorithms between software and hardware.
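As a concrete, if simplified, example of the kind of kernel that gets partitioned, here is a plain C++ snippet using the standard OpenCV API. The input file name is hypothetical, and the hardware-optimized libraries mentioned above provide equivalents for functions like these that can run in programmable logic:

    #include <opencv2/opencv.hpp>

    int main() {
        cv::Mat frame = cv::imread("road.png");  // hypothetical camera frame on disk
        if (frame.empty()) return 1;

        cv::Mat gray, edges;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);  // color to grayscale
        cv::Sobel(gray, edges, CV_8U, 1, 0);            // horizontal gradients: an edge cue

        cv::imwrite("edges.png", edges);  // candidate stage for hardware offload
        return 0;
    }

Profiling a pipeline like this identifies the hot functions; those are the ones worth moving into programmable logic while the rest stays in software.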

As the auto industry transitions from ADAS to autonomous driving, ever greater degrees of sensor fusion will combine visible-light cameras, radar and LIDAR systems distributed across the vehicle, connected over high-speed serial links. Combining multiple sensor interfaces, analytics and vehicle control in one system helps designers create lower-power, higher-efficiency data paths, enabling a self-driving car to prevent a break-in before the first window pane is shattered, or to stop dead in its tracks to avoid a collision with an obstacle.
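The payoff of integrating those interfaces on one device is that fusion can happen close to the data. The toy C++ sketch below combines a camera’s bearing estimate with a radar’s range and closing speed to make a braking decision; every type, value and threshold here is hypothetical:

    #include <iostream>

    struct CameraDetection { double bearingDeg; };                  // where the obstacle is
    struct RadarReturn     { double rangeM; double closingMps; };   // how far and how fast
    struct FusedTrack      { double bearingDeg, rangeM, closingMps; };

    // Naive fusion: take the complementary strengths of each sensor.
    FusedTrack fuse(const CameraDetection& cam, const RadarReturn& radar) {
        return {cam.bearingDeg, radar.rangeM, radar.closingMps};
    }

    int main() {
        FusedTrack t = fuse({2.5}, {12.0, 8.0});           // example readings
        double timeToCollision = t.rangeM / t.closingMps;  // 1.5 s in this example
        if (timeToCollision < 2.0)                         // hypothetical threshold
            std::cout << "Emergency brake!\n";             // vehicle control decision
    }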

But this integration does more than just simplify design. It also solves problems for end customers.

Right now, most EV in cars reaches only level 0 or level 1 on the autonomous driving spectrum, giving the person in the driver’s seat aids like blind-spot monitoring or lane-keeping assistance. Fully driverless cars (level 5) are the hardest version of EV to pull off. But considering that 80% of accidents are a result of distracted driving, according to the NHTSA, self-driving cars are also the key to safer roads for us all. And these high-efficiency, low-power solutions put EV within reach at accessible price points.

We’re driving toward new innovations in embedded vision – and the future of driving is only the beginning.
