RGB Camera

The first generation of cameras and photography was introduced around 1816 [1], and cameras have since been used in almost every application we know. Despite some inherent advantages of analog film photography, such as higher spatial resolution, the transition to digital imaging was inevitable [2]. Photoreceptor cells in the human eye respond more strongly to violet (~440 nm), green (~540 nm), and yellow (~570 nm) light than to other bands [3]. Similarly, in digital photography, sensors are designed to be sensitive to specific bandwidths, and at least three bands are combined to form a color image. For example, blue (~450-490 nm), green (~520-560 nm), and red (~635-700 nm) bands are combined to form a color image in RGB space. Sensors based on this band combination are referred to as RGB cameras, and their primary purpose is to mimic the human eye in the digital world. These cameras are readily available off the shelf and have been used in many agricultural studies.

RGB cameras are extensively used to study phenomena in which plants show visual symptoms, such as diseases that affect the color composition of the leaves or visible pests and fungi on the leaves [4]. This type of research requires the high spatial resolution that is more readily attainable in the visible region than in other bands, owing to the high energy level in this region, as discussed in the previous section. Sometimes a camera is sensitive to all (or a large portion of) the wavelengths in the visible range, producing panchromatic images, which usually have higher spatial resolution than images formed from separate bands. This is because the detectors of a panchromatic camera receive the cumulative energy of the whole spectral range, so smaller detectors can be utilized while still sustaining a high signal-to-noise ratio. Overall, images acquired in the visible region, whether panchromatic or RGB, offer a high spatial resolution that makes them a natural choice for studies requiring the finest details [5].

Moreover, RGB cameras have been widely utilized for greenness identification using various visible-band spectral indices, as described in the following table (a short computational sketch follows the table):

Table 4. Some of the common vegetation indices used for greenness identification using RGB cameras.

| Index name | Index formula | Equation number | Reference |
| --- | --- | --- | --- |
| The Excess Green index | ExG = kG - (R + B), 1.5 < k < 2.5 | Eq. 1 | [6] |
| The Excess Green Minus Excess Red index | ExGR = ExG + G - 1.4R | Eq. 2 | [7] |
| The Vegetative index | VEG = G / (R^a B^(1-a)), a ≈ 0.667 | Eq. 3 | [8] |
| The Color Index of Vegetation Extraction | CIVE = 0.441R - 0.811G + 0.385B + 18.787 | Eq. 4 | [9] |
| The Combined index | COM = 0.36 ExG + 0.47 CIVE + 0.17 VEG | Eq. 5 | [10] |
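As a minimal illustration of how these indices are evaluated in practice, the Python sketch below computes Eqs. 1-5 on a NumPy image array. The normalization to chromatic coordinates (r = R/(R + G + B), etc.) and the choice k = 2 in ExG are common conventions assumed here rather than requirements of the table.

```python
import numpy as np

def greenness_indices(rgb):
    """Evaluate the Table 4 indices on an (H, W, 3) float RGB array.

    Bands are first normalized to chromatic coordinates
    (r = R / (R + G + B), etc.), a common convention for these indices.
    """
    total = rgb.sum(axis=2) + 1e-9                       # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))

    exg = 2.0 * g - r - b                                # Eq. 1, with k = 2
    exgr = exg + g - 1.4 * r                             # Eq. 2
    veg = g / (r ** 0.667 * b ** 0.333 + 1e-9)           # Eq. 3, a = 0.667
    cive = 0.441 * r - 0.811 * g + 0.385 * b + 18.787    # Eq. 4
    com = 0.36 * exg + 0.47 * cive + 0.17 * veg          # Eq. 5
    return {"ExG": exg, "ExGR": exgr, "VEG": veg, "CIVE": cive, "COM": com}
```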

 

Various combinations of the R, G, and B bands are intended to minimize environmental and lighting effects and achieve the best segmentation of green vegetation from the rest of the image. However, the lack of a stable threshold across scenes and illumination conditions limits the usage of these indices. Cameras that take advantage of at least one band in the Near-Infrared (NIR) region, such as Color-Infrared (CIR) or multispectral cameras, perform significantly better than RGB cameras that rely solely on the visible region, both for vegetation segmentation [11] and for canopy cover estimation throughout the season [12].
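To make the thresholding step concrete, the sketch below binarizes an ExG image with Otsu's method (scikit-image is an assumed dependency). Because the threshold is re-estimated from each image's histogram, it shifts with scene content and illumination, which is exactly the instability noted above.

```python
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_mask(rgb):
    """Segment green vegetation by Otsu-thresholding an ExG image."""
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b            # Eq. 1, with k = 2
    t = threshold_otsu(exg)          # data-driven, lighting-sensitive threshold
    return exg > t                   # True where a pixel is labeled vegetation
```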

One point that should be considered when using an RGB camera as a remote sensing device is that most RGB cameras are designed to capture a relatively wide range of wavelengths in each of the red, green, and blue regions and store them as separate arrays. The central wavelength and bandwidth differ from sensor to sensor, producing some inconsistency in the results that should be acknowledged. Moreover, some measurements require a very narrow target wavelength, unaffected by adjacent wavelengths. In such cases the digital sensor must be sensitive to the proposed narrow bands only, and an RGB camera cannot be used even though the target bands lie in the visible region. For instance, the Photochemical Reflectance Index (PRI), a powerful indicator of productivity and stress in both agricultural and forest ecosystems [13], is derived from narrowband reflectance, tied to xanthophyll pigment activity, at 531 and 570 nm: PRI = (R531 - R570) / (R531 + R570). Using an RGB camera for this index will not produce satisfactory results unless the camera is modified to filter out unrelated bands.
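As a minimal sketch of that computation, assuming reflectance arrays already extracted at the two target wavelengths (the names r531 and r570 are illustrative):

```python
import numpy as np

def pri(r531, r570):
    """PRI = (R531 - R570) / (R531 + R570) from narrowband reflectance.

    r531 and r570 are reflectance arrays at 531 nm and 570 nm; a stock
    RGB camera cannot isolate such narrow bands without modification.
    """
    r531 = np.asarray(r531, dtype=float)
    r570 = np.asarray(r570, dtype=float)
    return (r531 - r570) / (r531 + r570 + 1e-9)
```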

 

References

[1]       “History of the camera,” Wikipedia. Jun. 19, 2020. Accessed: Jul. 08, 2020. [Online]. Available: https://en.wikipedia.org/w/index.php?title=History_of_the_camera&oldid=963373872

[2]       J. B. Campbell and R. H. Wynne, Introduction to remote sensing. Guilford Press, 2011.

[3]       R. Hunt, The Reproduction of Colour, 6th ed. John Wiley & Sons, 2004.

[4]       K. Padmavathi and K. Thangadurai, “Implementation of RGB and grayscale images in plant leaves disease detection–comparative study,” Indian Journal of Science and Technology, vol. 9, no. 6, pp. 1–6, 2016.

[5]       T. Zhao, B. Stark, Y. Q. Chen, A. L. Ray, and D. Doll, “Challenges in Water Stress Quantification Using Small Unmanned Aerial System (sUAS): Lessons from a Growing Season of Almond,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 88, no. 2–4, pp. 721–735, Dec. 2017, doi: 10.1007/s10846-017-0513-x.

[6]       D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, “Shape features for identifying young weeds using image analysis,” Transactions of the ASAE, vol. 38, no. 1, pp. 271–281, 1995.

[7]       J. C. Neto, A combined statistical-soft computing approach for classification and mapping weed species in minimum-tillage systems. The University of Nebraska-Lincoln, 2004.

[8]       T. Hague, N. D. Tillett, and H. Wheeler, “Automated crop and weed monitoring in widely spaced cereals,” Precision Agriculture, vol. 7, no. 1, pp. 21–32, 2006.

[9]       T. Kataoka, T. Kaneko, H. Okamoto, and S. Hata, “Crop growth estimation system using machine vision,” in Proceedings 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), 2003, vol. 2, pp. b1079–b1083.

[10]     M. Montalvo, J. M. Guerrero, J. Romeo, L. Emmi, M. Guijarro, and G. Pajares, “Automatic expert system for weeds/crops identification in images from maize fields,” Expert Systems with Applications, vol. 40, no. 1, pp. 75–82, 2013.

[11]     H. Zheng et al., “Evaluation of RGB, color-infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice,” Remote Sensing, vol. 10, no. 6, p. 824, 2018.

[12]     A. Ashapure, J. Jung, A. Chang, S. Oh, M. Maeda, and J. Landivar, “A comparative study of RGB and multispectral sensor-based cotton canopy cover modelling using multi-temporal UAS data,” Remote Sensing, vol. 11, no. 23, p. 2757, 2019.

[13]     T. Adão et al., “Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry,” Remote Sensing, vol. 9, no. 11, p. 1110, 2017.