26 Hardware processing
The book is still taking shape, and your feedback is an important part of the process. Suggestions of all kinds are welcome—whether it’s fixing small errors, raising bigger questions, or offering new perspectives. I’ll do my best to respond, but please keep in mind that the text will continue to change significantly over the next two years.
You can share comments through GitHub Issues.
Feel free to open a new issue or join an existing discussion. To make feedback easier to address, please point to the section you have in mind—by section number or a short snippet of text. Adding a label characterizing your issue would also be helpful.
Last updated: October 15, 2025
26.1 Hardware processing overview
To this point, we have considered the environmental light field (Section 2.1), how it is measured by the optics (Chapter 7), and how it is then converted into electrons at each pixel (Section 15.1). We have also considered how the visual system would capture the same light field.
The first part of this chapter describes the image-processing steps that convert the data measured by the sensor into a signal that can be viewed by a person.
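As a rough illustration of that conversion, the chain of steps might be sketched as below. This is a minimal, hypothetical pipeline operating on already-demosaicked linear sensor values; the black level, white-balance gains, color matrix, and gamma here are placeholder numbers, not calibrated values from any particular sensor or display.

```python
import numpy as np

def sensor_to_display(raw_rgb, black_level=64.0, white_level=1023.0,
                      wb_gains=(2.0, 1.0, 1.5), ccm=np.eye(3), gamma=1/2.2):
    """Sketch of the chain from linear sensor values to display RGB.

    raw_rgb: H x W x 3 array of linear sensor values (already demosaicked).
    All parameter values are illustrative placeholders.
    """
    # Remove the sensor's black level and scale to [0, 1].
    x = (np.asarray(raw_rgb, dtype=float) - black_level) / (white_level - black_level)
    x = np.clip(x, 0.0, 1.0)
    # White balance: per-channel gains for the scene illuminant.
    x = x * np.asarray(wb_gains)
    # Color correction: 3x3 matrix from sensor color space to display primaries.
    x = np.clip(x @ np.asarray(ccm).T, 0.0, 1.0)
    # Gamma encoding for the (assumed RGB) display.
    return x ** gamma
```

A saturated raw value maps to 1.0 and a value at the black level maps to 0.0; everything in between is shaped by the white balance, color matrix, and gamma.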
It seems as if the display chapter should come first, so that we can refer to it here. Or we might just assume that readers know displays are typically RGB and hope for the best?
Also, the human section on color might include some minimal display description that we could use here as the basic display model, with more advanced displays to come later (Section 25.1).
26.2 Lens shading
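As a placeholder sketch for this section: lens shading (vignetting) makes pixel responses fall off away from the optical center, and a common correction multiplies the image by a spatial gain map. The quadratic radial model and the `falloff` parameter below are illustrative assumptions; in practice the gain map comes from a flat-field calibration.

```python
import numpy as np

def lens_shading_gain(h, w, falloff=0.3):
    """Radial gain map: 1.0 at the image center, (1 + falloff) at the corners.

    The quadratic falloff is an illustrative stand-in for a calibrated map.
    """
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.hypot(yy - cy, xx - cx)        # distance from the image center
    rmax = np.hypot(cy, cx)               # distance to a corner
    return 1.0 + falloff * (r / rmax) ** 2

def correct_shading(image, falloff=0.3):
    """Apply the gain map to a 2D image to flatten its response."""
    h, w = image.shape
    return image * lens_shading_gain(h, w, falloff)
```

A uniformly lit scene captured through a vignetting lens (bright center, dim corners) would be flattened by this gain; applying it to an already-flat image instead brightens the corners.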
26.3 Acquisition policy
Should this go here, or earlier, in the sensor-system chapter?
26.4 Demosaicking
Space-color correlations?
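A minimal baseline is bilinear interpolation of each Bayer channel, which treats the channels independently and therefore ignores exactly the space-color correlations noted above; more advanced methods exploit them. The RGGB layout and the normalized-convolution border handling below are assumptions of this sketch.

```python
import numpy as np

def conv2(img, k):
    """Small same-size 2D convolution with zero padding (symmetric kernels)."""
    kh, kw = k.shape
    p = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (illustrative baseline).

    Each channel is interpolated from its own samples only, via normalized
    convolution (interpolated value = weighted sum / weight sum), which also
    handles image borders.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0   # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0   # B at odd rows/cols
    g_mask = 1.0 - r_mask - b_mask                        # G on the quincunx
    k_rb = np.array([[0.25, 0.5, 0.25],
                     [0.5,  1.0, 0.5],
                     [0.25, 0.5, 0.25]])                  # R/B interpolation
    k_g = np.array([[0.0,  0.25, 0.0],
                    [0.25, 1.0,  0.25],
                    [0.0,  0.25, 0.0]])                   # G interpolation
    out = np.zeros((h, w, 3))
    for ch, (mask, kern) in enumerate([(r_mask, k_rb), (g_mask, k_g),
                                       (b_mask, k_rb)]):
        out[..., ch] = conv2(raw * mask, kern) / conv2(mask, kern)
    return out
```

On a uniform gray mosaic the three interpolated channels all reproduce the input value; on real scenes, per-channel interpolation produces the familiar zipper and false-color artifacts near edges, which is what correlation-aware methods aim to fix.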