SoNonLocalMeansFilterProcessing Class Reference
[Smoothing And Denoising]

ImageViz SoNonLocalMeansFilterProcessing engine

#include <ImageViz/Engines/ImageFiltering/SmoothingAndDenoising/SoNonLocalMeansFilterProcessing.h>

Inheritance diagram for SoNonLocalMeansFilterProcessing:
SoNonLocalMeansFilterProcessing → SoImageVizEngine → SoEngine → SoFieldContainer → SoBase → SoRefCounter, SoTypedObject


Public Types

enum  KernelShape {
  CUBE = 0,
  BALL = 1
}

Public Member Functions

 SoNonLocalMeansFilterProcessing ()

Public Attributes

SoSFEnum computeMode
SoSFImageDataAdapter inImage
SoSFEnum kernelShape
SoSFInt32 kernelSize
SoSFInt32 patchSize
SoSFFloat similarity
SoImageVizEngineOutput< SoSFImageDataAdapter, SoImageDataAdapter * >  outImage

Detailed Description

ImageViz SoNonLocalMeansFilterProcessing engine

The SoNonLocalMeansFilterProcessing engine implements the non-local means algorithm for image denoising.

The algorithm computes a weighted mean over a large set of voxels (the search window) around the voxel to be denoised (the target voxel). The weight of each voxel in the search window is a function of its similarity to the target voxel: a voxel very similar to the target voxel is assigned a weight close to 1.0, while a voxel very different from the target voxel is assigned a weight close to 0 and has little impact on the final denoised value.

The similarity between a voxel and the target voxel is based on the squared intensity differences between two patches of size patchSize, centered respectively on the current voxel and on the target voxel.

Images are often corrupted by noise introduced by the acquisition process. For an image $y$ defined on the image domain $\Omega$, the acquisition noise is modeled as additive white Gaussian noise:

\[ y[k] = x[k] + e[k], \forall k \in \Omega \]

where

\[ e[k] \sim \mathcal{N}(0, \sigma), \forall k \in \Omega \]

The denoised image $z$ is an estimate of $x$ computed as follows:

\[ z[k] = \frac{\sum_{n\in\mathbb{Z}^3} w[k,n]\cdot y[k+n]}{\sum_{n\in\mathbb{Z}^3} w[k,n]} \]

with

\[ w[k,n] = f(n)\exp\left(-\frac{ssd(k,n)}{\lambda}\right) \]

where:

  • $ssd(k,n)$ is the sum of squared intensity differences between the patch of size patchSize centered on the voxel $k$ and the patch centered on the voxel $k+n$,
  • $f(n)$ is a spatial window function that is non-zero only for offsets $n$ inside the search window,
  • $\lambda$ is a smoothing parameter derived from the similarity field.

The algorithm looks for matches within the search window area around each voxel.
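
For concreteness, the patch distance in the weight above can be written out explicitly. The following formulation is a sketch consistent with the description of patchSize, where $\mathcal{P}$ denotes the set of offsets of a patch of edge size patchSize centered on the origin (this notation is introduced here; the exact normalization used by the engine is not stated):

\[ ssd(k,n) = \sum_{p \in \mathcal{P}} \left( y[k+p] - y[k+n+p] \right)^2 \]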

The naive implementation of the previous equation has a complexity of $\mathcal{O}(P^3 \cdot S^3 \cdot N \cdot M \cdot Q)$, which can be reduced to $\mathcal{O}(S^3 \cdot N \cdot M \cdot Q)$, where $N \times M \times Q$ are the dimensions of the input image, P is the patchSize, and S is the kernelSize.
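
To make the nested structure behind these complexities concrete, here is a minimal, self-contained C++ sketch of the naive non-local means loop described above. It operates on a plain float volume rather than on the ImageViz API, assumes a cubic search window and clamped borders, and all names (denoiseNaive, nx/ny/nz, patchRadius, searchRadius, lambda) are illustrative, not part of the engine:

#include <algorithm>
#include <cmath>
#include <vector>

// Naive non-local means on a dense float volume, for illustration only.
// Border voxels are handled by clamping indices, which is one possible policy.
std::vector<float> denoiseNaive(const std::vector<float>& y,
                                int nx, int ny, int nz,
                                int patchRadius,   // patch extends patchRadius voxels around its center
                                int searchRadius,  // search window radius, i.e. kernelSize
                                float lambda)      // smoothing parameter in exp(-ssd/lambda)
{
  auto at = [&](int x, int yy, int zz) -> float {
    x  = std::min(std::max(x, 0),  nx - 1);
    yy = std::min(std::max(yy, 0), ny - 1);
    zz = std::min(std::max(zz, 0), nz - 1);
    return y[(static_cast<size_t>(zz) * ny + yy) * nx + x];
  };

  std::vector<float> z(y.size());
  for (int k2 = 0; k2 < nz; ++k2)
    for (int k1 = 0; k1 < ny; ++k1)
      for (int k0 = 0; k0 < nx; ++k0) {
        double num = 0.0, den = 0.0;
        // Search window: O(S^3) candidate offsets n around the target voxel k.
        for (int n2 = -searchRadius; n2 <= searchRadius; ++n2)
          for (int n1 = -searchRadius; n1 <= searchRadius; ++n1)
            for (int n0 = -searchRadius; n0 <= searchRadius; ++n0) {
              // Patch comparison: O(P^3) squared intensity differences per candidate.
              double ssd = 0.0;
              for (int p2 = -patchRadius; p2 <= patchRadius; ++p2)
                for (int p1 = -patchRadius; p1 <= patchRadius; ++p1)
                  for (int p0 = -patchRadius; p0 <= patchRadius; ++p0) {
                    double d = at(k0 + p0, k1 + p1, k2 + p2)
                             - at(k0 + n0 + p0, k1 + n1 + p1, k2 + n2 + p2);
                    ssd += d * d;
                  }
              double w = std::exp(-ssd / lambda);  // f(n) is taken as 1 inside the cubic window
              num += w * at(k0 + n0, k1 + n1, k2 + n2);
              den += w;
            }
        z[(static_cast<size_t>(k2) * ny + k1) * nx + k0] = static_cast<float>(num / den);
      }
  return z;
}

The three offset loops correspond to the $S^3$ factor and the inner patch loops to the $P^3$ factor of the naive complexity; the engine's optimized formulation removes the $P^3$ factor.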

FILE FORMAT/DEFAULT


Library references: non_local_mean non_local_mean3d


Member Enumeration Documentation

enum SoNonLocalMeansFilterProcessing::KernelShape

The shape of the kernel.

Enumerator:
CUBE 

The shape is a hypercube (i.e., a square in 2D mode or a cube in 3D mode).

BALL 

The shape is a Euclidean ball (i.e., a disk in 2D or a sphere in 3D).


Constructor & Destructor Documentation

SoNonLocalMeansFilterProcessing::SoNonLocalMeansFilterProcessing (  ) 

Constructor.


Member Data Documentation

SoSFEnum SoNonLocalMeansFilterProcessing::computeMode

Select the compute mode (2D, 3D, or AUTO). Use enum ComputeMode.

Default is MODE_AUTO.

SoSFImageDataAdapter SoNonLocalMeansFilterProcessing::inImage

The input image.

Default value is NULL. Supported types include: grayscale, binary, and label images.

SoSFEnum SoNonLocalMeansFilterProcessing::kernelShape

The search window shape to apply.

The shape of the search window can be used to improve the run time: choosing a ball shape reduces the computation time by 50% (in 3D). Use enum KernelShape. Default is BALL.

SoSFInt32 SoNonLocalMeansFilterProcessing::kernelSize

The search window size to apply.

  • In case of a square/cube, a value N produces a square/cube with a side length of 2N+1 voxels.
  • In case of a disk/ball, a value N produces a disk/ball with a diameter of 2N+1 voxels.

The larger the search window, the better the results usually are, but its size also affects the run time significantly. This value has to be large enough that similar structures can be found within the search window area; a value that is too small results in simple blurring of the image because there is not enough structural data within the search window. For example, a value of 6 extends the search window 6 voxels in each direction, i.e. 13 voxels across. Default value is 6.

SoImageVizEngineOutput< SoSFImageDataAdapter, SoImageDataAdapter * > SoNonLocalMeansFilterProcessing::outImage

The output image.

Default value is NULL. Supported types include: grayscale, binary, label, and color images.

SoSFInt32 SoNonLocalMeansFilterProcessing::patchSize

The patch box size to apply.

The weight of a voxel in the search window is computed by comparing the neighborhood of this voxel with the neighborhood of the target voxel. The value represents the edge size, in voxels, of the neighborhood volume (the neighborhood is a cube) and affects the quality of the result. If this value is either much smaller or much larger than the fine structures in the data, the algorithm shows little or no effect at all. This parameter has almost no effect on the computation time. Default value is 3.

SoSFFloat SoNonLocalMeansFilterProcessing::similarity

The similarity used to compute the weight assigned to each voxel in the search window (see the expression of $w[k,n]$ above).

The squared similarity is proportional to the standard deviation of the assumed Gaussian noise of the input image. As a result, the larger the value, the more the resulting image is smoothed. The similarity has no effect on the computation time. Default value is 0.6f.


The documentation for this class was generated from the following file:

  • ImageViz/Engines/ImageFiltering/SmoothingAndDenoising/SoNonLocalMeansFilterProcessing.h

Open Inventor Toolkit reference manual, generated on 15 Mar 2023
Copyright © Thermo Fisher Scientific All rights reserved.
http://www.openinventor.com/