
Choosing the right image sensor for machine vision applications


Image sensor fundamentals

Image sensors, sometimes called imagers, convert light (photons) or other forms of electromagnetic radiation into electric charge (electrons). By measuring the radiation emanating from an object at many points, an image of that object can be constructed, captured, and displayed. The smallest unit of a digital image is called a pixel.

Image sensors are fundamentally monochromatic. To facilitate color imaging for visible light, the most common approach is to place a matrix of color filters above the sensor. The filter array separates incoming light into its red, green, and blue (RGB) components so that each pixel samples just one color; the outputs of neighboring pixels are then interpolated to reconstruct a full-color image.

CCD vs. CMOS image sensors 

You may remember how early digital cameras were plagued by short battery life: a dozen shots might be all you could take before the battery needed changing or recharging. One major reason was their use of power-hungry charge-coupled device (CCD) image sensors. CMOS image sensors emerged in the 1990s, and by the mid-2000s CMOS had become the dominant image sensing technology, a trend driven by its lower cost and lower power consumption compared to CCDs. There were early concerns that CMOS sensors generated more noise, degrading image quality, but this problem has been largely mitigated as the technology has advanced. In recent years, the two major suppliers of CCD image sensors have exited the business to focus their R&D resources on CMOS, rendering CCD technology effectively obsolete.

How CCD image sensors work

Charge-coupled devices were originally developed as semiconductor memories and first appeared in 1969. Three years later, Fairchild Semiconductor demonstrated a prototype CCD for photography with a 100 x 100-pixel array, and by the mid-1980s, megapixel CCD image sensors had been demonstrated. The digital photography era had begun.

A CCD image sensor comprises a capacitor array located behind a lens, and in the case of color sensors, an RGB filter array. Each capacitor represents an image pixel. Photons falling on the capacitors generate electric charge proportional to the light intensity at that point in the array. The charge is passed from each capacitor to its neighbour and then through the final capacitor to an amplifier that converts the accumulated charge into a voltage. The process is repeated to build up the image and the analog voltages generated are digitized using an analog-to-digital converter (ADC) before further processing.


Figure 1: CCD vs CMOS image capture

How CMOS sensors work

In contrast to CCDs, CMOS sensors convert charge to voltage at each pixel and amplify the signal at pixel level before passing it to an ADC. Figure 1 illustrates this difference. This is a much faster process than CCD image capture, so it facilitates higher frame rates in video cameras. The energy consumption of CMOS image sensors can be up to an order of magnitude lower than for an equivalent CCD device, they cost less to manufacture, and they are constantly improving as CMOS process technology develops. As a result, CMOS image sensors have replaced CCDs in most applications.

Understanding pixels and image sensor resolution

The more pixels a sensor has, the greater the resolution of the image. The smaller those pixels are, the more of them fit on a sensor of a given size and the more detail you can discern. Today, you can buy image sensors with 100 million pixels, each measuring 1.5 microns x 1.5 microns, and even consumer cameras can provide 48-megapixel (MP) resolution. As a general guide, though, the higher the resolution, the higher the cost of the image sensor. It's also worth bearing in mind that monochrome image sensors cost less than color ones for a given resolution, so monochrome should be considered for applications where color is not important.

Although CMOS sensors use semiconductor technology, the process cannot follow Moore's Law in the way that other semiconductors do. Because the wavelength of visible light is around 0.5 microns, diffraction effects limit the minimum usable pixel size and hence the maximum resolution for a given sensor size. Marginal improvements are likely, but eventually the only way to increase resolution will be to make larger sensor chips. In any case, small pixels are not always desirable: larger pixels capture more light and exhibit a better signal-to-noise ratio, so pixel size is a significant trade-off. In practice, image sensors for industrial applications typically use pixel sizes between 1.5 microns and 10 microns.
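As a rough illustration, the relationship between sensor dimensions, pixel pitch, and resolution can be sketched in a few lines of Python. The 1/1.8-inch active area (7.2 mm x 5.4 mm) and 2.4-micron pitch below are illustrative values, not a specific device:

```python
# Rough resolution estimate from sensor active area and pixel pitch.
# Dimensions and pitch here are example values, not a real part number.

def pixel_count(width_mm, height_mm, pixel_pitch_um):
    """Approximate column/row counts and total pixels for a sensor."""
    cols = round(width_mm * 1000 / pixel_pitch_um)
    rows = round(height_mm * 1000 / pixel_pitch_um)
    return cols, rows, cols * rows

cols, rows, total = pixel_count(7.2, 5.4, 2.4)
print(cols, rows, total)  # 3000 2250 6750000 (about 6.75 MP)
```

Halving the pitch to 1.2 microns would quadruple the pixel count on the same area, which is exactly the trade-off against light capture described above.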

Although resolution is often the first parameter considered when choosing an image sensor, it is not the only important characteristic. There are many others, but the most common considerations are sensor size, frame rate, shutter type, responsivity, and dynamic range. Interface types, power consumption, operating temperature range, and mechanical formats are also important, but these apply to any system component, so we'll focus on those specific to image sensors here.

Understanding image sensor sizes

For a given lens arrangement, the larger the sensor, the wider the field of view captured. The nomenclature of image sensor sizes is historic and confusing, but most imagers adhere, at least approximately, to the 4:3 aspect ratio of early television pictures. 'Standard' sensor sizes range from ¼ inch to 35mm, the largest being referred to as 'full frame' because it closely matches the 35mm film frame. The table below shows the metric dimensions for each standard size.

Standard size   Diagonal (mm)   Horizontal (mm)   Vertical (mm)
1/4 inch        4.5             3.6               2.7
1/3 inch        6               4.8               3.6
1/2.5 inch      7.2             5.8               4.3
1/2 inch        8               6.4               4.8
1/1.8 inch      9               7.2               5.4
2/3 inch        11              8.8               6.6
1 inch          16              12.8              9.6
1.1 inch        17.6            14.08             10.56
4/3 inch        21.6            17.3              13
APS-C           27.9            22.4              16.8
APS-H           35.5            29.2              20.2
35mm            43.3            36                24

The metric dimensions of standard image sensor sizes
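The diagonal column is simply the Pythagorean combination of the horizontal and vertical dimensions, which is easy to verify; a short sketch using the 1/2-inch row as an example:

```python
import math

def diagonal_mm(horizontal_mm, vertical_mm):
    """Sensor diagonal from its horizontal and vertical active dimensions."""
    return math.hypot(horizontal_mm, vertical_mm)

# 1/2-inch sensor: 6.4 mm x 4.8 mm active area
print(round(diagonal_mm(6.4, 4.8), 1))  # 8.0
```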

In any camera, image sensor performance is only part of the story. It’s important to select a compatible lens to minimize distortion, particularly at the edges of the image. The larger the sensor, the more challenging and expensive this issue becomes.

Image sensor frame rates

The frame rate is the number of images captured per second, expressed as frames per second (FPS). The maximum frame rate depends on the shutter speed, the resolution (which determines how much data must be transferred for each frame), how fast data can be read from the sensor, and the data transfer speed the system supports. To monitor fast production lines, you need higher frame rates.

When using image sensors for production line inspection, the calculation of frame rate is simple. If the objects to be inspected move along the line at a rate of ten per second, the minimum frame rate needed is 10 FPS to capture one shot of each object. 
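A minimal sketch of that calculation, assuming objects are evenly spaced on a belt (the speed and spacing values are illustrative):

```python
def min_frame_rate(line_speed_m_per_s, object_pitch_m, shots_per_object=1):
    """Minimum FPS so each object passing the camera is imaged
    at least shots_per_object times."""
    objects_per_second = line_speed_m_per_s / object_pitch_m
    return objects_per_second * shots_per_object

# Belt moving at 1 m/s with objects spaced 0.1 m apart -> 10 objects/s
print(min_frame_rate(1.0, 0.1))  # 10.0 FPS minimum
```

In practice you would add headroom above this minimum to tolerate variations in line speed and object spacing.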

Most industrial image sensors have frame rates of a few tens of FPS, but rates range from less than 1 FPS to monochrome sensors that operate at over 800 FPS.

Electronic rolling shutters vs. global shutters 


Figure 2: This onsemi 16 MP, global shutter image sensor for industrial cameras offers up to 65 FPS and is available in monochrome and color versions

How the pixels on an image sensor are exposed to light depends on the type of shutter the camera employs. Electronic rolling shutters expose, sample, and read each pixel in sequence rather than simultaneously. If a machine vision camera is viewing a fast-moving conveyor line, for example, this can distort the shape of objects in the image. The effect is usually only a problem for fast-moving subjects, but wherever there is motion in the image, a global shutter, which exposes and samples all pixels simultaneously (and then reads each pixel sequentially), may be preferred.

Global shutters have long been used with CCD image sensors but, until recently, the availability of global shutter CMOS imagers was limited by the greater process complexity of manufacturing suitable pixels. Economical global shutter image sensors are now available from manufacturers such as ams OSRAM and onsemi. Global shutter sensors are larger and more expensive than similar rolling shutter types, but they offer higher sensitivity and responsivity, which makes them attractive for many industrial applications.

Image sensor sensitivity and responsivity

Sensitivity and responsivity are sometimes used interchangeably, and incorrectly, in image sensor data sheets. Sensitivity refers to the minimum input stimulus needed to produce an output with a specific signal-to-noise ratio or other defined threshold. The output of an image sensor is not perfectly linear with the intensity of light reaching it: at low light levels, the output may be noisy and erratic; as illumination increases, output voltage rises linearly with illumination until saturation sets in and the curve flattens out. The slope of this linear region, the relationship between incident illumination and output voltage, is the responsivity of the sensor. Responsivity is expressed in volts per lux-second, but where the sensor includes an ADC to convert the analog response to a digital value, it may be stated in least significant bits (LSBs) per lux-second. Typical responsivity figures for image sensors range from 1 to 10 volts per lux-second.
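The linear region can be sketched as a simple model; the responsivity, scene illuminance, exposure time, and assumed 1.0 V saturation level below are illustrative values, not taken from any specific data sheet:

```python
def sensor_output_v(responsivity_v_per_lux_s, illuminance_lux, exposure_s,
                    saturation_v=1.0):
    """Output voltage in the linear region: responsivity times exposure
    in lux-seconds, clipped at an assumed saturation voltage."""
    v = responsivity_v_per_lux_s * illuminance_lux * exposure_s
    return min(v, saturation_v)

# 5 V per lux-second sensor, 100 lux scene, 0.2 ms exposure
print(sensor_output_v(5.0, 100.0, 0.0002))  # 0.1 V, well below saturation
```

Doubling either the illuminance or the exposure time doubles the output, until the clip at saturation, which is where the linear model stops applying.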

Incidentally, one lux is the illumination produced by one lumen of light distributed evenly over one square meter. The lumen is a measure of the visible light emitted by a source. To give some idea of scale, one lumen has been described as roughly the light intensity you get at a distance of one foot from a birthday cake candle. It's not bright: a 100W incandescent light bulb produces approximately 1,600 lumens.

The dynamic range of image sensors

The dynamic range of an image sensor defines the span between the minimum and maximum light levels at which the sensor can produce a useful output signal. The higher the dynamic range, the greater the sensor's ability to capture high-contrast scenes. Dynamic range can be expressed as a ratio but is more often quoted in decibels (dB). The human eye has a dynamic range of about 100,000:1, or 100dB. The typical dynamic range of a machine vision sensor is around 60dB to 80dB.
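The ratio-to-decibel conversion uses 20·log10 for voltage-like quantities, which is why 100,000:1 corresponds to 100dB. A quick sketch:

```python
import math

def ratio_to_db(ratio):
    """Dynamic range in dB from a contrast ratio
    (20*log10, for voltage-like quantities)."""
    return 20 * math.log10(ratio)

def db_to_ratio(db):
    """Inverse conversion: dB back to a contrast ratio."""
    return 10 ** (db / 20)

print(ratio_to_db(100_000))  # 100.0 -> the ~100dB of the human eye
print(db_to_ratio(60))       # 1000.0 -> a typical machine vision sensor
```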

How much do image sensors cost?

The sheer breadth of sensor types and formats makes it impossible to generalize about cost. Furthermore, it's not possible to compare the costs of CMOS and CCD sensors directly, even though we know the latter are more expensive to make, because the formats of common devices tend to differ. For example, CCDs remain popular for linear scanning applications, where typical image sizes range from 1,500 x 1 pixels to 10,680 x 4 pixels. In small quantities for development, these devices range from around 25 USD to 250 USD each. Monochrome versions are the lowest cost, and color versions with greater than 2-megapixel resolution are the most expensive.

Priced on the same basis, the most popular CMOS image sensors, which have resolutions between one and five megapixels for industrial applications, typically cost somewhere between 25 USD and 35 USD each.

In all cases, prices are significantly lower at production volumes.

What is smart vision?

Increasingly, vision systems are connected to computers that run artificial intelligence (AI) algorithms. One result is smart machine vision. 2D and 3D images are captured, grouped, and analyzed far faster and more accurately than humans could perform these functions. Depending on factors such as camera speed and resolution, the tiniest inconsistencies can be detected on a production line and where necessary, corrective action initiated. This may mean identifying a dry solder joint on a printed circuit board, a tiny pattern flaw in woven cloth, or a card with a pill missing on a pharmaceutical packing line.

Image sensors are also used to analyze liquids. For example, by measuring light scatter, sensors coupled with a computer vision system can determine which compounds are present in raw milk. Because these tests don't involve adding any potential contaminant, no milk is wasted, and they support both quality monitoring and early identification of dairy diseases.

Rapid AI analysis of video footage now plays an important role in security too. For example, law enforcement agencies use smart vision systems to identify criminals as they pass through airports or other transportation hubs. Cameras connected to AI-enabled computers facilitate video analysis at speeds that are impossible for humans to emulate.

Image sensors are the essence of cameras used in smart vision systems that make us more productive and keep us safe.

Other selection factors

This article has touched upon the factors that are most commonly critical in selecting the best image sensor for a given application. As mentioned earlier, analog/digital interfaces, physical form factor, power consumption, and environmental and regulatory factors need to be considered too. 

A simple selection tool for image sensors can be found by going to Avnet.com and typing 'Image Sensors' into the search box. The tool provides a good starting point in finding the ideal sensor for your application.

Should your application call for 3D image sensing, this article provides a good overview of the available technologies.

About the Author

David Hustler, Image Sensor Ecosystem Management Specialist

David Hustler has a strong engineering background in mixed signal image sensor products. His current...
