|Number of pages||124|
|Cover lamination||matte|
|Place of publication||Göttingen|
|Place of dissertation||Braunschweig|
|Date of publication||15.04.2019|
|Classification||Doctoral thesis|
|Keywords||Angewandte Mathematik, Optimierung, Variationsmethoden, mathematische Bildverarbeitung, Deflektometrie, Primal-Dual Algorithmus, Totalvariation, verallgemeinerte Totalvariation, Regularisierungen, Dualitätslücke, applied mathematics, optimization, variational methods, mathematical imaging, denoising, deflectometry, primal-dual algorithm, total variation, total generalized variation, regularization, primal-dual gaps|
|URL of external webpage||https://www.tu-braunschweig.de/iaa/personal/komander|
Consider manufactured parts such as screws, car doors, lenses, or laser mirrors. All of these parts must pass quality inspections that check for unwanted bumps or scratches. Different methods exist to measure such parts, and the goal is to describe the measured object exactly by the data. This is one example of a so-called inverse problem.
The application considered in the first part of this thesis is a data fusion process. The given dataset results from a deflectometric measurement process; such processes measure specular objects, such as lenses or mirrors. The aim is to calculate a dataset that describes the measured object exactly. The measured data consists of two separately acquired types of data that differ in accuracy due to different sensitivity to noise. We resolve this issue with a data fusion process: we solve a minimization problem that uses the more accurate data as a reference value and thereby improves the accuracy of the other.
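The fusion idea can be illustrated on a toy one-dimensional model. The sketch below assumes, purely for illustration, that slope measurements are the more accurate dataset and height samples the noisier one; the quadratic coupling term and the weight `alpha` are hypothetical choices for this sketch, not the formulation developed in the thesis.

```python
import numpy as np

def fuse(h, g, alpha=0.05):
    """Fuse noisy height samples h with more accurate slope samples g by
    minimizing  ||D u - g||^2 + alpha * ||u - h||^2  over u, where D is a
    forward-difference operator. (Illustrative 1-D model only.)"""
    n = len(h)
    # forward-difference matrix D: (n-1) x n
    D = np.zeros((n - 1, n))
    for i in range(n - 1):
        D[i, i] = -1.0
        D[i, i + 1] = 1.0
    # normal equations: (D^T D + alpha I) u = D^T g + alpha h
    A = D.T @ D + alpha * np.eye(n)
    b = D.T @ g + alpha * h
    return np.linalg.solve(A, b)

# toy data: true profile u(x) = x^2 on [0, 1]
x = np.linspace(0.0, 1.0, 50)
u_true = x ** 2
rng = np.random.default_rng(0)
h = u_true + 0.05 * rng.standard_normal(50)             # noisy heights
g = np.diff(u_true) + 0.001 * rng.standard_normal(49)   # accurate slopes
u = fuse(h, g)
```

The accurate slope data fixes the shape of the reconstruction, while the weak coupling to the noisy heights anchors its absolute level, so the fused result is closer to the true profile than the height measurement alone.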
Building on these insights, we develop new approaches to image denoising. We formulate minimization problems that use suitable reference values. In image denoising, the reference value we want to use is an approximation of the image gradient; consequently, our approaches first calculate such an approximation and then use it as a reference value, making them two-stage methods. Another way to prevent the staircasing effect (the tendency of total variation regularization to produce piecewise constant artifacts) is to use higher orders of differentiation within the regularization term. One such approach, proposed in 2010, is the total generalized variation (TGV) functional. We propose different combinations of these functionals and in this way formulate minimization problems that are in some sense equivalent to the TGV problem. One advantage of some of these problems lies in simple parameter choice rules that perform as well as the TGV problem. Additionally, the duality gaps of these new problems are finite, whereas the primal-dual gap of the TGV problem is usually infinite; hence, they can be used to create a reasonable stopping criterion for the optimization process. A further advantage is the decreased runtime of the two-stage methods, since the problem is divided into two smaller problems.