The DynamicBackgroundExtraction process, or DBE, is a powerful process in PixInsight that corrects a broad range of common issues like gradients, vignetting, and other flat field problems. It is recommended to use DBE early in your processing workflow. DBE works equally well on monochrome and color images. If you use DBE with a color image, the process splits your image into its individual channels, runs DBE on each, and then recombines them, so there’s no need to apply DBE to your individual R, G, and B images separately.
DBE works by taking measurements from your image at samples you place (or elect to have placed for you), modeling the background from those measurements, and then compensating for the variation, resulting in a more even background.
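To make the measure–model–compensate idea concrete, here is a minimal Python sketch using NumPy. This is an illustration of the general technique, not PixInsight’s actual algorithm: it measures sample medians across a synthetic frame with a linear gradient, fits a smooth surface to them, and subtracts the fitted model.

```python
import numpy as np

# Illustrative sketch of the idea behind DBE, not PixInsight's actual
# implementation: fit a smooth surface to background samples, then use
# it to flatten the gradient.

h, w = 200, 200
yy, xx = np.mgrid[0:h, 0:w]
# Synthetic "image": a flat sky of 0.1 plus a linear gradient.
image = 0.1 + 0.0005 * xx + 0.0002 * yy

# "Samples": the median pixel value in small boxes on a coarse grid.
xs, ys, vals = [], [], []
for cy in range(20, h, 40):
    for cx in range(20, w, 40):
        xs.append(cx)
        ys.append(cy)
        vals.append(np.median(image[cy - 5:cy + 5, cx - 5:cx + 5]))

# Model the background as a first-order surface v = a + b*x + c*y,
# fitted to the sample measurements by least squares.
A = np.column_stack([np.ones(len(xs)), xs, ys])
a, b, c = np.linalg.lstsq(A, np.array(vals), rcond=None)[0]
model = a + b * xx + c * yy

# Compensate: subtract the model, restoring the mean sky level.
corrected = image - model + model.mean()
print(corrected.std())
```

The corrected frame has essentially no remaining variation, while the original has the full gradient. DBE does the same job with a far more flexible background model and robust sample rejection.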
Because this process takes measurements from your image using samples, you need to have a good understanding of samples and what their various parameters mean.
Sample Download
In case you don’t have your own image and want to follow along in this article, you can download a sample observation of Messier 63 – the Sunflower galaxy. The ZIP file includes the luminance and R, G, and B images that are already aligned so you can start working with the image right away.
Download sample observation of Messier 63 (43 MB)
Understanding what DBE Does
The best way to explain what DBE does is to demonstrate its effect with an example. Earlier I provided a link to download the sample data for this article. That sample contains L, R, G, and B images, and here is what the luminance image looks like when it’s opened in PixInsight:
This image has some important issues: the top-left corner and both bottom corners are darker than the rest of the image, and there’s a gradient radiating out from the center of the image that darkens toward the edges.
After applying DBE with some carefully selected samples and some changes to the default settings in DBE, this is what the image looks like:
The difference is significant – this image has a more even background throughout and is ready for further processing.
For the RGB image, this is what the image looked like before applying DBE:
There are areas with a lot of green, other areas where red and green are blended, and the corners are particularly green (except the bottom-left, which is more red).
This is what the image looked like after applying DBE:
Again, the difference is significant: the background is much more even and the problems with the image appear to have been resolved.
Understanding DBE Samples
If you want to follow along, open the M63_L.fit image, and press CTRL+A to enable the Auto Stretch ScreenTransferFunction.
Start the DynamicBackgroundExtraction process by selecting from the menu Process – BackgroundModelization – DynamicBackgroundExtraction, or select it from <All Processes> option under the Process menu.
Click somewhere in the image to associate DBE with the image on your PixInsight desktop.
Click somewhere on the background sky to place a sample and the upper part of your DBE process should look something like the following:
Here are some points about what you are looking at:
- At the top-left of the box is the sample number, and this increments with each sample you place
- The Anchor X and Anchor Y fields show the location of the sample within the image
- The Radius field shows the sample’s radius in pixels
- The R/K field (and the G and B fields below it) show the value of the pixels within the sample
- The Wr field shows the sample’s relative weighting – note the approximate value in this field now
- The box on the right side shows what the pixels look like within the sample
As you just learned, the box on the right shows you the pixels within the sample, but it also shows something else – which pixels are being included in the sample. For example, place a sample – by clicking on your image – on a star, and your screen might look like the following:
The star in the box is black, and there are some white pixels in the box as well. This shows that the star is being rejected and is not part of the sample. You’ll also notice that the number in the Wr box is lower than it was when you placed a sample on the background sky, meaning this sample contributes less to the algorithm’s calculations than the sample containing only background sky.
The Wr value ranges between 0 and 1; you can think of it as the fraction of the sample’s pixels that are usable by the algorithm when modeling the background.
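The relationship between rejected pixels and a sample’s weight can be sketched like this. The weighting rule below is a hypothetical illustration (the fraction of pixels within a few robust standard deviations of the sample median); PixInsight’s actual rejection and weighting are more sophisticated.

```python
import numpy as np

# Hypothetical Wr-style weight for illustration: the fraction of a
# sample's pixels within k robust standard deviations of the sample
# median. PixInsight's actual rejection rule is more sophisticated.

def sample_weight(pixels, k=3.0):
    med = np.median(pixels)
    sigma = 1.4826 * np.median(np.abs(pixels - med))  # robust sigma (MAD)
    if sigma == 0:
        return 1.0
    return float(np.mean(np.abs(pixels - med) <= k * sigma))

rng = np.random.default_rng(0)
sky = rng.normal(0.1, 0.001, size=(15, 15))   # pure background sky
star = sky.copy()
star[4:11, 4:11] = 0.9                        # a bright star in the sample

print(sample_weight(sky))    # near 1.0
print(sample_weight(star))   # noticeably lower
```

The star’s pixels are far from the sample median, so they are rejected and the weight drops – exactly the behavior you see in the Wr field when a sample lands on a star.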
The sample radius is probably quite small, and you can make it bigger by entering a larger Radius value in the sample’s Radius box. Any subsequent samples you place after changing the Radius will have the new radius.
When placing samples, you need to ensure that you pick samples that truly represent background sky and not stars, parts of a galaxy, nebulosity, or other aspects of your image that represent the signal in your image.
Sometimes DBE will consider a sample you place unusable, as shown in the following screenshot:
The sample color is red, indicating that it is being excluded from consideration by the DBE algorithm. Of course, the solution to this is to move the sample, and you can do that by simply selecting the sample and dragging it. However, there are cases where moving the sample is not practical.
If your sample does indeed represent background sky but is being rejected, expand the Model Parameters (1) section and increase the Tolerance.
The Tolerance field controls how much of a sample is used: increasing it raises the Wr (weight) of all of the samples.
While many people simply increase the Tolerance to some large value (the maximum is 10), I recommend using the smallest Tolerance value that’s practical because it affects how the DBE algorithm builds the model of the background sky. Think of the Tolerance as controlling how bright a region can be and still be considered background sky, with higher values admitting brighter regions.
Before moving on, you can increase the size of all of the samples on your image at once by opening the Sample Generation section, entering a larger value in ‘Default sample radius’, and then clicking ‘Resize All’. Generally, you want your samples large enough to capture the region of sky you are sampling, but not so large that they encompass stars, nebulosity, or other high-signal areas.
Automatically Generating Samples
While you can certainly place samples yourself, DBE can help you by generating samples on its own. It uses the Tolerance and ‘Shadows relaxation’ settings (under Model Parameters (1)), along with ‘Minimum sample weight’ and ‘Samples per row’ (under Sample Generation), and covers your image in samples.
When placing samples automatically, the default value of 0.75 for ‘Minimum sample weight’ does a good job of avoiding areas of high signal like stars and parts of galaxies.
Set the sample size and samples per row you want, then click the Generate Samples button to generate them.
Using the automatic process is not recommended because DBE is intended to be used by you and you are the best judge of what is in your image. The automatic process simply covers your image with a lot of samples, and this can affect the background model that DBE produces, often resulting in less than optimal results. If you want to use an automatic process, you are better off using AutomaticBackgroundExtractor (ABE) instead.
How To Place Samples On an Image
You need to study your image to figure out where to place samples. Take a look at your image and consider the sources of additive light, or the gradient across your image.
When placing samples, it is quality not quantity that makes the big difference.
For the sample image, this is where I placed my samples:
On studying the image, I saw a darker region at the bottom right, the bottom left, and a slightly darker region in the upper parts of the image. I also saw some areas that needed extra attention from DBE, so I placed more samples in those regions. I avoided areas of high signal – like the galaxy itself and the stars. There was a nebulous region at the left side of the image, and I placed a sample close to it because the background was not even there.
Understanding Target Image Correction Settings
The Target Image Correction settings area of DBE is shown below:
This tells DBE what to do with the model it generates.
The Correction parameter has two values: Subtraction and Division.
Use Subtraction when dealing with image variations caused by things like moonlight, light pollution, glow from a bright nearby star, and other sources that add a constant value to your image.
Use Division for all other cases – for example, gradients not caused by the sources mentioned for Subtraction, vignetting, and other multiplicative artifacts. I tend to use Division in about 80% of the images I process.
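The difference between the two modes can be sketched numerically. The example below uses synthetic data and assumes a perfectly recovered background model: additive contamination flattens under subtraction, multiplicative falloff like vignetting flattens under division, and using subtraction on a vignetted frame leaves objects dimmed even though the sky looks flat.

```python
import numpy as np

# Illustrative sketch of Subtraction vs Division with synthetic data,
# assuming DBE recovered the background model perfectly.

yy, xx = np.mgrid[0:100, 0:100]
sky = np.full((100, 100), 0.2)
sky[10:15, 10:15] += 0.5          # a small "galaxy" near one corner

# Additive contamination (light-pollution gradient): subtract it.
add_model = 0.001 * xx
sub_corrected = (sky + add_model) - add_model

# Multiplicative contamination (vignetting falloff): divide it out.
r = np.hypot(xx - 50, yy - 50)
vignette = 1.0 - 0.3 * r / r.max()
div_corrected = (sky * vignette) / vignette

# Both correct modes recover the original frame, galaxy included.
print(np.allclose(sub_corrected, sky), np.allclose(div_corrected, sky))

# Subtracting an additive model of the sky falloff from the vignetted
# frame flattens the sky but leaves the galaxy dimmed by the vignette.
bg_model = 0.2 * vignette - 0.2
sub_wrong = (sky * vignette) - bg_model
```

This is why multiplicative problems like vignetting call for Division: subtraction can flatten the background while still leaving object brightness scaled by the falloff.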
The Normalize option attempts to neutralize the background: it calculates the average value of all of the samples in your image and then adds that average back to the corrected background. You generally don’t need to use this parameter and can leave it unchecked.
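As a tiny sketch of that idea (a simplified model following the description above, not PixInsight’s documented implementation):

```python
import numpy as np

# Simplified sketch of the Normalize idea: after the correction, add
# back the average of the measured sample values so the background
# keeps a sensible level instead of sitting near zero.

image = np.array([[0.10, 0.12],
                  [0.14, 0.16]])
model = image.copy()                     # a perfect background model
sample_values = np.array([0.10, 0.16])   # values measured at two samples

corrected = image - model                # flat, but the level is 0.0
normalized = corrected + sample_values.mean()
print(normalized)
```

After normalization the flat background sits at the average sample level rather than at zero.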
The ‘Discard background model’ option discards the background model instead of producing it as a separate image. It is recommended that you leave this unchecked and always review the background model to ensure it correctly captures the problems in your image.
The ‘Replace target image’ option replaces your image with the result of applying DBE. It is recommended that you enable this setting because it adds the DBE process to the history of your image (which you can find in the History Explorer). With this option enabled, you can go back at any time and review the settings you used for DBE.
Evaluating The Background Model
As you just learned, you have the option of discarding the background model; however, this is not recommended because the model actually provides you with a lot of information about your image.
For example, this is the background model of the sample image for this chapter:
The background model makes apparent exactly what DBE is using to correct your image. In this background model we can see that the overall structure is smooth, with no large variations between regions. We can also confirm that three corners of the image are darker than the rest of the image and that there’s a gradient from the center of the image.
If the background model is significantly different from what you see in your image, chances are that you have misplaced one or more samples, and possibly have a Tolerance value that is too high.
In summary, it is recommended to always review the background model to ensure it is consistent with what you see in your original image.
Using DBE More Than Once
With some difficult images, it may be necessary to use DBE more than once. For example, you might still see a gradient in your image after applying DBE the first time. In this case, feel free to apply DBE again with a different set of samples.
Summary of Using DBE
Here’s a summary of what you just learned:
- You need to have at least one image open on your PixInsight desktop
- Once DBE is open on your desktop, click anywhere in your image to associate your image with the DBE process
- Place samples on true background sky, avoiding things like stars, galaxies, nebulosity or other areas of high signal
- Adjust the size of your samples to capture a reasonable amount of background sky
- Carefully adjust the Tolerance to include regions of brighter sky
- Set the ‘Correction’ parameter appropriately (refer to the section called ‘Understanding Target Image Correction Settings’ above)
- Enable the ‘Replace target image’ option to add the DBE process to your image’s history so that you can recover the settings you used in the future (assuming you save your workspace as a project)
- Always evaluate the background model to ensure it is consistent with what you see in your image
- You may need to use DBE more than once with difficult images
Final Thoughts
DBE is a powerful tool that can make a dramatic difference in your images – here are some final pointers:
- Use DBE early in your workflow
- Avoid using the automatic sample generation option
- DBE works equally well on monochrome as well as color images
- Sometimes a background is so complex that it requires more than one application of DBE
Update
There’s another effective way to remove the background in your astronomical images – it’s called GraXpert and is a standalone, open-source application for Windows, Mac, and Linux – read about it here.
Conclusion
In this article, you learned about DynamicBackgroundExtraction, or DBE. You learned about what it’s used for, you learned about samples, you learned about how to evaluate the background model, and you learned the overall process for using DBE on your images.
More Articles In This Series
This article is part of a whole series of articles about processing images using PixInsight:
- If you are using the LRGB/broadband processing workflow, click here for the index article for processing broadband images.
- If you are using the narrowband processing workflow, click here for the index article for narrowband images.