Remote sensing

Cloud Mask: What Makes A Difference For Data Accuracy

satellite imagery with clouds

Satellite imagery benefits multiple industries through a wide range of applications. Yet most of them would not be possible without cloud masking techniques. Unlike synthetic-aperture radar (SAR), optical satellite sensors cannot provide a clear image on a cloudy day, so the issue has to be addressed. This is where cloud masking proves to be an efficient solution, making the process seamless: while satellites observe the situation from the sky, EOSDA processes the spatial data with specialized algorithms to produce actionable analytics.

Why Is Cloud Masking Necessary?

Cloud masking in remote sensing prepares imagery for processing and improves product generation.

In preprocessing, cloud coverage is not the only trouble. Visibility is also reduced by the shadows clouds cast, which decrease the reflectance signal of target objects. Clouds (and, correspondingly, their shadows) differ in shape, size, and altitude, depending on the geographic position and climatic peculiarities of the studied region.

Information about clouds is available from multiple sources, unlike information about their shadows, which definitely matters when it comes to image accuracy. Another consideration is mask scalability: the image resolution should be sufficient to zoom not only into the entire field but into its separate zones as well. A lack of detail leads to errors.

satellite imagery of coast covered with clouds

Where Are Cloud-Free Satellite Images Used?

Since unidentified clouds cause analytic errors, the issue requires proper control. Cloud masking is essential for many spheres relying on remote sensing and change tracking, for both governmental and commercial needs. A raster cloud mask over an image series improves analytics for agriculture, forestry, oil & gas, mining, construction, transportation, communications, environmental protection, military and law enforcement, emergency and disaster response, etc. In particular, big data is used to generate additional layers that give extra insights to researchers and businesses.

For example, such layers are incorporated into farming software. They provide growers with the most recent and credible information to make informed decisions.

EOS Crop Monitoring

Access high-resolution satellite images to ensure effective field management!

There are many scenarios for applying the masking technique, depending on the purposes and the task stack. Regardless, when analyzing an image series, to get more accurate results it is necessary either:

  • to study the image fragment without clouds, or
  • to replace the unclear fragments with clear ones whenever possible.

In any case, clouds have a negative impact on the analysis of useful data. This is why it is necessary to remove them from the analysis process to reduce the possibility of errors in the estimated results.
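The two strategies above (studying only the clear fragment, or filling unclear fragments from another acquisition) can be sketched in numpy. This is a minimal illustration with toy arrays, not EOSDA's pipeline: it assumes two co-registered images of the same scene and boolean masks that are `True` where a pixel is contaminated.

```python
import numpy as np

def cloud_free_composite(img_a, mask_a, img_b, mask_b):
    """Merge two co-registered acquisitions of one scene.

    mask_* is True where a pixel is contaminated by cloud or shadow.
    Prefer image A; fall back to image B; leave NaN where both are unclear.
    """
    out = np.full(img_a.shape, np.nan, dtype=float)
    out[~mask_b] = img_b[~mask_b]   # fill from B first ...
    out[~mask_a] = img_a[~mask_a]   # ... then overwrite with A where A is clear
    return out

# Toy 2x2 scene: A is cloudy in the top row, B in the left column.
img_a = np.array([[1.0, 2.0], [3.0, 4.0]])
img_b = np.array([[5.0, 6.0], [7.0, 8.0]])
mask_a = np.array([[True, True], [False, False]])
mask_b = np.array([[True, False], [True, False]])

composite = cloud_free_composite(img_a, mask_a, img_b, mask_b)
# top-left pixel is cloudy in both images, so it stays NaN
```

Pixels that remain NaN are simply excluded from downstream analysis, which is exactly the point of masking.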

How The Cloud Mask From EOSDA Works

The EOSDA technology helps identify and differentiate dense cloud coverage, atmospheric haze, and shadows. Open land cover and N/A (i.e., the “unseen” territory in an ordinary satellite image) constitute additional classification categories.

The EOSDA specialists employ a convolutional neural network (CNN) for detection. The CNN architecture is chosen with respect to operation time and quality. The network is of the encoder-decoder type, which proves efficient and is typically used for cloud segmentation.
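The defining property of an encoder-decoder network is that the encoder compresses spatial resolution while the decoder restores it, so the output is a per-pixel score at the input's size. The numpy sketch below mimics only that shape flow (max pooling down, nearest-neighbour upsampling back up); the real network's layers, depth, and weights are not disclosed in the source.

```python
import numpy as np

def max_pool_2x(x):
    """2x2 max pooling: the 'encoder' halves spatial resolution."""
    h, w = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample_2x(x):
    """Nearest-neighbour upsampling: the 'decoder' restores resolution."""
    return np.kron(x, np.ones((2, 2)))

# A 4x4 'image': the encoder compresses it, the decoder maps back to
# full size, so the network can emit one cloudiness score per pixel.
image = np.arange(16, dtype=float).reshape(4, 4)
encoded = max_pool_2x(image)      # (2, 2) feature map
decoded = upsample_2x(encoded)    # back to (4, 4), same grid as the input
```

Because the output grid matches the input grid, the result can be thresholded directly into a pixel-wise cloud mask.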

Does It Make Sense To Separate Clouds From Haze?

The essential difference is that EOSDA no longer uses classification at the output. The standard classification distinguishes clouds and haze as separate categories, yet this is not quite justified. They are similar objects that differ in relative atmospheric humidity: the concentration of water vapor is higher in clouds and lower in haze. For this reason, it may be hard to separate one from the other, as their boundaries can be obscure. Taking this into consideration, the EOSDA science team has elaborated its own approach to the neural network's activation function that allows a more accurate differentiation.
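The contrast can be illustrated with a toy example. A softmax-plus-argmax head forces an ambiguous haze/cloud pixel into one hard bin, while a single continuous score preserves the ambiguity. The logits and the sigmoid used here are purely illustrative stand-ins; EOSDA's actual activation function is proprietary and not described in the source.

```python
import numpy as np

def softmax(logits):
    """Standard softmax over class logits."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def sigmoid(z):
    """Squash one logit into a (0, 1) density score."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical per-pixel logits for three classes: clear, haze, cloud.
# Haze and cloud logits are nearly equal, yet argmax must pick one bin,
# discarding the information that the pixel sits on the boundary.
logits = np.array([0.1, 2.0, 2.1])
hard_class = int(np.argmax(softmax(logits)))   # 2 -> "cloud"; haze info lost

# A regression-style score instead keeps vapor density as a continuum
# that each downstream task can threshold as it sees fit.
density = sigmoid(0.4)   # roughly 0.6: between haze and dense cloud
```

This is why collapsing haze and cloud into hard categories loses exactly the signal that makes the two hard to separate in the first place.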

The training set prepared by the markup specialists enabled experiments to determine the role of different bands in identification; the remote sensing wavelengths used for cloud masking really matter. Since EOSDA uses imagery retrieved by Sentinel-2, the scientists based their work on the theoretical guidelines from the satellite documentation and at first employed the recommended Sentinel-2 cloud mask bands. The experiments somewhat changed the choice of channels for detection. To be precise, there is no haze in the classification at all. Instead, there are clouds that can be categorized depending on the density of atmospheric water vapor.

encoder-decoder convolutional model

Proprietary Concept For Flexible Shadow Identification

Shadows are detected with the same neural network, and the principle of operation is the same. However, EOSDA uses a slightly different activation function at the neural network output and, correspondingly, a slightly different error function. While others apply standard classification with a softmax loss, the company's specialists employ a sophisticated regression that better fits the concept. Such an approach allows flexibility when it comes to haze, clouds, and their shadows. This way, the EOSDA science team takes the neural network's output and masks target objects with different thresholds, depending on the task specifics.

For example, semitransparent haze can be acceptable for one task and completely unacceptable for another. The same applies to shadows: the team distinguishes deep shadows from dense clouds and light shadows from haze.
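Task-specific thresholding over a continuous score could look like the sketch below. The score values and cut-offs are invented for illustration; the source only states that different thresholds are applied per task, not what those thresholds are.

```python
import numpy as np

# Hypothetical per-pixel density scores from a regression head:
# 0 = clear, 1 = dense cloud / deep shadow; values in between
# correspond to haze or light shadow.
scores = np.array([0.05, 0.30, 0.55, 0.90])

# A haze-tolerant task masks only dense cover ...
strict_mask = scores >= 0.8    # masks 1 pixel
# ... while a haze-sensitive task masks anything that is not clear.
lenient_mask = scores >= 0.2   # masks 3 pixels
```

One network output thus serves both tasks; only the threshold changes.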

So, cloud masking at EOSDA goes far beyond the typically used softmax-loss classification. The company’s proprietary concept gives more flexible and accurate results.

The EOSDA mask can be used through an API. Please contact the sales department for details.

EOSDA Cloud Mask Vs. Sentinel-2 Cloud Mask

Cloud masking in Sentinel-2 is rather conventional and approximate. It is better suited for rough filtering during the preliminary search for images for further analysis, so achieving more accuracy demands additional processing. Cloud masks obtained with the EOSDA algorithms are more accurate and give a more realistic picture.
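One common way to quantify how much two masks disagree is intersection-over-union (IoU). The sketch below compares two toy boolean masks; the actual agreement figures between the EOSDA and Sentinel-2 masks are not given in the source.

```python
import numpy as np

def mask_iou(a, b):
    """Intersection-over-union of two boolean cloud masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

# Illustrative masks: the coarser mask over-flags one clear pixel.
fine   = np.array([True, True, False, False])
coarse = np.array([True, True, True,  False])
agreement = mask_iou(fine, coarse)   # 2/3
```

An IoU of 1.0 would mean the masks agree pixel for pixel; lower values expose the over- or under-flagging of the coarser mask.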

The screens below compare EOSDA vs. Sentinel-2 masks.

comparison of EOSDA and Sentinel-2 cloud masks


Cloud Mask Implementation In Crop Monitoring

In the Crop Monitoring tool, masks remove useless data that could otherwise negatively affect the analysis results.

However, it would be wrong to assume that cloud detection is the same as cloud masking. These are two different processes. Detection aims at identifying clouds in the image and obtaining a mask. Masking is the next step: it uses the obtained mask to “hide” useless data, for example, while generating indices in the product.
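The two steps can be sketched for a vegetation index such as NDVI, computed as (NIR − Red) / (NIR + Red). Detection supplies the boolean mask; masking then hides the flagged pixels before statistics are derived. The reflectance values below are invented for the example.

```python
import numpy as np

# Step 1 (detection) has already produced a boolean mask.
nir = np.array([0.6, 0.5, 0.4])   # near-infrared reflectance
red = np.array([0.1, 0.2, 0.3])   # red reflectance
cloud_mask = np.array([False, True, False])   # middle pixel is cloudy

# Step 2 (masking): compute NDVI, then hide cloudy pixels from analytics.
ndvi = (nir - red) / (nir + red)
ndvi[cloud_mask] = np.nan           # masked pixels drop out of statistics
mean_ndvi = np.nanmean(ndvi)        # average over clear pixels only
```

Without the masking step, the cloudy pixel's bogus NDVI would silently skew the field average.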

In particular, a 60 m mask is used in Crop Monitoring, while a finer 20 m spatial resolution is provided upon request. Such data is available in vector format, too.
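Relating the two resolutions is a matter of grid geometry: one 60 m cell covers a 3×3 block of 20 m cells. The sketch below splits a coarse mask onto the finer grid by nearest-neighbour replication; note this only changes the grid, while a native 20 m mask (as offered on request) would actually resolve finer detail.

```python
import numpy as np

def to_20m_grid(mask_60m):
    """Replicate each 60 m cell onto the 3x3 block of 20 m cells it covers."""
    return np.kron(mask_60m.astype(int), np.ones((3, 3), dtype=int)).astype(bool)

mask_60m = np.array([[True, False]])   # two coarse cells
mask_20m = to_20m_grid(mask_60m)       # shape (3, 6): 9 fine cells per coarse cell
```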

With a precise mask, EOSDA enables its customers to analyze satellite images more accurately by excluding images with 60% cloudiness or more, as too dense a cover in the image can decrease the quality of the obtained results.
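The scene-level filter described above reduces to computing the cloudy fraction of each mask and applying the 60% cut-off. The dates and masks below are invented for illustration.

```python
import numpy as np

def cloud_fraction(mask):
    """Share of pixels flagged as cloud in a boolean mask."""
    return float(mask.mean())

# Hypothetical acquisition dates with their cloud masks.
scenes = {
    "2023-06-01": np.array([True, True, True, False]),   # 75% cloudy -> rejected
    "2023-06-06": np.array([True, False, False, False]), # 25% cloudy -> kept
}

# Keep only scenes below the 60% cloudiness cut-off used in Crop Monitoring.
usable = [date for date, mask in scenes.items() if cloud_fraction(mask) < 0.6]
```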

Enjoy The Difference

The EOSDA scientists calculate multiple indices, and clouded images are undesirable for all of them. So, the best option for more accuracy is to remove unsuitable imagery from the analysis.

As it turns out, it is also important to differentiate deep shadows from dense clouds, and light shadows from haze. Furthermore, haze can be ignored for some tasks but not for others.

The point is to be flexible and adjust the results to the customer’s needs, and EOSDA does exactly that. Use the EOSDA cloud mask and enjoy extra accuracy with this innovative approach to keep ahead of the game.