In astronomical photo processing, you often hear the words ‘stretched’ and ‘linear’ used to describe the state of an image. A linear image is usually quite dark, except for the very brightest parts, while a stretched image is one you can share with others (you don’t need software like PixInsight to view it).

In this article, I explain exactly what’s meant by the terms stretched, non-linear, unstretched, and linear. By the end, you will have a more intuitive understanding of these terms and of the transforms you apply to your images.

Understanding the Histogram

Before we can delve into the topic of stretched versus linear images, you must first understand the contents of an image at the pixel level.

Consider the following image of a scale of gray squares arranged from black to white:

The number below each square indicates the amount of whiteness of the pixels in the square above it, with zero indicating no whiteness (pure black) and 255 representing full whiteness (pure white). This range of pixel values from 0 to 255 is referred to as an 8-bit range because a computer can represent each value using eight binary digits, or bits: in binary, 0 is written as 0000 0000 and 255 as 1111 1111.

The black square on the left is labeled zero, which means there is no whiteness in the square. The next square is labeled 13, which corresponds to about 5% white (13/255 ≈ 0.05); the square after that is labeled 25, or about 10% white (25/255 ≈ 0.10); 128, or 50% white, sits at the very center of the chart. The squares continue all the way to 255, which corresponds to 100% white. These numbers are called the intensity of the corresponding pixels.
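
If you want to check these percentages yourself, here is a small Python sketch; the values are simply the labels from the chart above, and nothing in it depends on any particular image-processing software:

    # Convert 8-bit intensity values (0-255) to fractions of full white.
    values = [0, 13, 25, 128, 255]      # labels taken from the gray-scale chart

    for v in values:
        fraction = v / 255              # 255 is the largest value an 8-bit number can hold
        print(f"{v:3d} -> {fraction:.0%} white")

    # Prints:
    #   0 -> 0% white
    #  13 -> 5% white
    #  25 -> 10% white
    # 128 -> 50% white
    # 255 -> 100% white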

If I were to plot a chart with intensity on the X-axis and the number of pixels at each intensity on the Y-axis, I would end up with a chart similar to this one:

The X-axis corresponds to the numbers below each of the squares (the intensity), and the Y-axis shows how many pixels have that intensity (I used just 10 pixels for the example).
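
To reproduce a bar chart like this in Python, here is a minimal sketch using NumPy and Matplotlib; I am assuming each square contributes 10 pixels of a single intensity, and only the labeled squares from the example are included:

    import numpy as np
    import matplotlib.pyplot as plt

    # Toy data: assume each gray square contributes 10 pixels of one intensity.
    intensities = np.array([0, 13, 25, 128, 255])   # only the labeled squares
    pixels = np.repeat(intensities, 10)              # 10 pixels per square

    # Count how many pixels fall into each of the 256 possible 8-bit values.
    counts, _ = np.histogram(pixels, bins=256, range=(0, 256))

    plt.bar(range(256), counts, width=1.0, color="gray")
    plt.xlabel("Intensity (0 = black, 255 = white)")
    plt.ylabel("Number of pixels")
    plt.title("Histogram of the gray-scale example")
    plt.show()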

For a more practical example, take a look at the histogram that Photopea.com plots for this image:

  1. Point your browser to photopea.com
  2. Select File – Open from the menu at the top-left
  3. Select the grayscale image
  4. From the menu, select Window – Histogram

The histogram that Photopea.com displays is below:

The histogram is almost identical to the bar graph. The blob you see at the bottom of the fourth line is just the extra grey in the image, and the thick line on the left represents all of the white pixels in the image (there are more white pixels than just those in the corresponding square because the image’s white background is counted as well).

Now let’s consider a more practical example of a monochrome image of NGC 6992:

Plotting its histogram, we end up with the following image:

The histogram looks quite different from the last one because most of the pixel values are over on the left side of the chart. This is expected: most of the image is near the black end of the scale, with relatively few pixels at the white end, represented by the stars. A small number of pixels sit in the middle of the histogram, the result of the nebulosity and other lighter tones; their numbers are small compared to the rest of the image, which is mostly near-black sky background.
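
If you’d like to plot a histogram like this for one of your own monochrome images, here is a sketch using Pillow, NumPy, and Matplotlib; the file name is a placeholder, and the optional logarithmic Y-axis just makes the small mid-tone counts easier to see:

    import numpy as np
    import matplotlib.pyplot as plt
    from PIL import Image

    # "ngc6992.png" is a placeholder; any 8-bit monochrome export will do.
    img = Image.open("ngc6992.png").convert("L")     # "L" = 8-bit grayscale
    pixels = np.asarray(img).ravel()

    counts, _ = np.histogram(pixels, bins=256, range=(0, 256))

    plt.bar(range(256), counts, width=1.0, color="gray")
    # plt.yscale("log")   # optional: reveals the relatively few mid-tone pixels
    plt.xlabel("Intensity (0 = black, 255 = white)")
    plt.ylabel("Number of pixels")
    plt.show()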

Now that we’ve established how a histogram describes the distribution of pixel intensities in an image, let’s move on to the transfer function: the mathematical tool that transforms a linear image, typically dark and seemingly underexposed, into a non-linear one in which previously hidden details become visible.

Understanding the Transfer Function

A transfer function is a mathematical formula that maps, or transforms, an input pixel value to an output pixel value.

For example, consider the following chart, the histogram we saw earlier of NGC 6992, but this time, the transfer function appears as a diagonal line on the chart:

The transfer function in this case maps each input value to an identical output value (a one-to-one, identity mapping), so the output image would be identical to the input image.
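
In code, the one-to-one case is just the identity function; a tiny sketch on pixel values normalized to the 0-1 range:

    import numpy as np

    def identity_transfer(pixels):
        """One-to-one transfer function: the output equals the input."""
        return pixels

    pixels = np.array([0.00, 0.05, 0.10, 0.50, 1.00])   # normalized values
    print(identity_transfer(pixels))                     # identical to the input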

Now, consider the following transfer function:

This time, the transfer function is a curved line that rises steeply and then starts to level off at a certain point. The resulting histogram would look like this:

This histogram is quite a bit different from the original: all of the pixel values have moved to the right a little, indicating that the output pixels are brighter than the input pixels. Here’s the corresponding image after we apply the above transfer function:

The resulting image is quite a bit brighter than the original because the transfer function, rising steeply and then leveling off, has pushed the pixel values to the right: pixels that were near black are now more grey, pixels that were grey are a little whiter, and pixels that were white remain white.
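
One simple way to build a curve that rises steeply and then levels off is a gamma (power) curve. This is not necessarily the exact curve used in the figure, just an illustrative brightening transfer function on normalized pixel values:

    import numpy as np

    def gamma_transfer(pixels, gamma=0.4):
        """Brightening transfer function: rises steeply near black, levels off near white.

        A gamma below 1 lifts the dark values while leaving pure white at white.
        """
        return np.power(pixels, gamma)

    pixels = np.array([0.00, 0.05, 0.10, 0.50, 1.00])
    print(gamma_transfer(pixels))
    # Near-black values move well to the right, mid greys brighten too,
    # black (0.0) stays black, and pure white (1.0) stays white.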

The transfer function does most of its work on pixel values near the middle of the intensity range; these values are called the midtones, so the transfer function is called the Midtones Transfer Function, or MTF.
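
For reference, the MTF used in PixInsight-style tools is commonly written as MTF(x) = ((m - 1) * x) / ((2m - 1) * x - m), where x is the input pixel value in the 0-1 range and m is the midtones balance: the input value that gets mapped to exactly 0.5. Here is a sketch of that formula in Python; the value of m below is just illustrative:

    import numpy as np

    def mtf(x, m=0.25):
        """Midtones transfer function on values normalized to 0-1.

        m is the midtones balance: the input value mapped to 0.5.
        The function maps 0 -> 0 and 1 -> 1, and lifts the midtones when m < 0.5.
        """
        x = np.asarray(x, dtype=float)
        return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

    print(mtf([0.0, 0.25, 0.5, 1.0], m=0.25))
    # 0 stays at 0, 0.25 is lifted to 0.5, 0.5 is lifted to 0.75, and 1 stays at 1.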

Now that you have an idea of what the histogram and transfer function are, we can discuss linear and non-linear, or stretched, images. The transfer function is what differentiates the two, and it ultimately determines how we perceive the data captured by the telescope.

Linear and Non-linear Images Defined

A linear image is one where the transfer function maps pixels one-to-one: the input maps directly to the output. In other words, a linear image is the image as it was acquired by the camera attached to the telescope, with its pixel values left untouched.

A non-linear image is one where the transfer function changes, in some way, how pixels are mapped from input to output.

Consider this linear image of NGC 6992 – the image is mostly black, except for the very brightest stars:

The image’s histogram looks like you would expect:

The pixels that are close to black are all bunched up at the left side of the histogram and are hardly visible on the chart; zooming in 20x on the left side reveals the spike in the number of pixels near the black end:

Now, let’s apply a more realistic transfer function to this image (I have zoomed in 30x on the left side of the histogram to show you the transfer function):

In this case, the transfer function looks similar to the one we saw before, the MTF, except that it is zoomed in quite a bit, so only a small range of pixel values is being manipulated. This is what the output histogram looks like:

From this, you can see that the pixels are redistributed so that the pixels that were almost black are now grey, and the remaining pixels slowly approach white. This is the output image:

You can see quite a bit of detail in this image because pixels that were close to black are now spread out across the image’s grey values, and pixels that were already bright have become a little brighter.

The transfer function is a curve – this is what it looks like when it’s zoomed out:

The mathematical formula that represents the curve is said to be non-linear because it rises quickly at the beginning and then levels off later. If the curve were the diagonal line we saw earlier, the transfer function would be linear because it maps input to output, one-to-one.

Because the transfer function is non-linear, it is said to stretch the pixel values in the image – this is why we refer to this type of image as a non-linear, or stretched, image.
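
To make this concrete, here is a simplified sketch of how such a stretch could be applied to a linear image in code. It repeats the mtf() function from the earlier sketch so it stands on its own, and the constants (a target background of 0.25 and a shadow clip 2.8 median absolute deviations below the median) are loosely modeled on the auto-stretch defaults found in popular astro tools; treat the whole thing as illustrative rather than the exact algorithm used for the figures above:

    import numpy as np

    def mtf(x, m):
        """Midtones transfer function on values normalized to 0-1 (see earlier sketch)."""
        x = np.asarray(x, dtype=float)
        return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

    def auto_stretch(linear, target_bkg=0.25, shadow_sigma=2.8):
        """Simplified auto-stretch for a linear image normalized to 0-1.

        Clips the shadows a few MADs below the median, then chooses the midtones
        balance so that the image median lands near target_bkg.  Assumes the
        image is not a single constant value.
        """
        median = np.median(linear)
        mad = np.median(np.abs(linear - median))              # median absolute deviation
        shadows = np.clip(median - shadow_sigma * 1.4826 * mad, 0.0, 1.0)

        # Rescale so the shadow point becomes 0, then use the MTF's symmetry:
        # the balance that sends the rescaled median to target_bkg is
        # mtf(rescaled_median, m=target_bkg).
        rescaled = np.clip((linear - shadows) / max(1.0 - shadows, 1e-6), 0.0, 1.0)
        m = mtf(np.median(rescaled), m=target_bkg)
        return mtf(rescaled, m=m)

    # Usage: `linear` is assumed to be a float NumPy array of camera data scaled to 0-1.
    # stretched = auto_stretch(linear)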

There are many variations on transfer functions; however, for astronomical images, the MTF I have already shown you is the most common one for converting from a linear to a non-linear image. Naturally, there are other transfer functions you can apply once your image is non-linear, and these can also be non-linear functions that brighten or darken certain parts of your image.
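
For example, a gentle S-curve is a common follow-up adjustment on a non-linear image: it darkens the shadows slightly and brightens the highlights slightly to add contrast around the midtones. There is no single standard formula for it; the logistic-based version below is just one illustrative way to build such a curve:

    import numpy as np

    def s_curve(pixels, strength=6.0):
        """Gentle S-curve for an already-stretched (non-linear) image.

        Values below 0.5 are pushed slightly darker and values above 0.5 slightly
        brighter, adding contrast around the midtones.  The curve is rescaled so
        that 0 maps to 0, 0.5 to 0.5, and 1 to 1.
        """
        x = np.asarray(pixels, dtype=float)
        raw = 1.0 / (1.0 + np.exp(-strength * (x - 0.5)))   # logistic curve
        lo = 1.0 / (1.0 + np.exp(strength * 0.5))           # value at x = 0
        hi = 1.0 / (1.0 + np.exp(-strength * 0.5))          # value at x = 1
        return (raw - lo) / (hi - lo)

    print(s_curve(np.array([0.0, 0.25, 0.5, 0.75, 1.0])))
    # Shadows get a little darker, highlights a little brighter, endpoints unchanged.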

Conclusion

Understanding the distinction between linear and non-linear images gives you a powerful toolset for transforming raw astronomical data into compelling visuals. Whether you’re a seasoned astrophotographer or just starting out, mastering these concepts helps you get the most from your imaging sessions and produce images that faithfully represent the celestial objects you capture. By choosing your transfer functions carefully, you can greatly enhance the visual impact of your images and share the beauty of the universe with a wider audience. Every image is a valuable record of the sky, and a well-applied stretch unlocks its full potential to inform and inspire.