
Cuvillier Verlag

30 years of expertise in academic publishing
International specialist publisher for science and business

Deflectometry and Image Denoising

Print edition
EUR 48.90

E-book
EUR 34.20

Birgit Komander (Author)

Preview

Reading sample, PDF (680 KB)
Table of contents, PDF (520 KB)

ISBN-13 (print edition): 9783736999978
ISBN-13 (e-book): 9783736989979
Language: English
Number of pages: 124
Cover lamination: matte
Edition: 1st
Place of publication: Göttingen
Place of doctorate: Braunschweig
Publication date: 15 April 2019
General classification: Dissertation
Subject areas: Mathematics; Applied Mathematics
Keywords: applied mathematics, optimization, variational methods, mathematical imaging, denoising, deflectometry, primal-dual algorithm, total variation, total generalized variation, regularization, primal-dual gaps
External homepage: https://www.tu-braunschweig.de/iaa/personal/komander
Description

Consider manufactured parts such as screws, car doors, lenses, or mirrors for lasers. All of these parts have to go through quality inspections that check for unwanted bumps or scratches. There are different methods for measuring manufactured parts; the goal is always to describe the measured object exactly by the data. This is one example of a so-called inverse problem.
The application we consider in the first part of this thesis is a data fusion process. The given dataset is the result of a deflectometric measurement process; such processes deal with the measurement of specular objects, such as lenses or mirrors. The aim is to calculate a dataset that describes the measured object exactly. The measured data consists of two separately acquired types of data that suffer from different accuracies due to their different sensitivities to noise. We resolve this issue with a data fusion process: we solve a minimization problem that uses the more accurate data as a reference value and thereby improves the other.
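The fusion idea can be sketched in a deliberately simplified 1-D setting. The function name, the forward-difference model, and all parameter choices below are illustrative assumptions, not the thesis's actual deflectometry model:

```python
import numpy as np

def fuse(positions, grad_ref, lam=10.0, iters=800):
    """Data fusion as a minimization problem (illustrative 1-D sketch):

        min_u ||u - positions||^2 + lam * ||D u - grad_ref||^2

    The more accurate gradient-type data `grad_ref` serves as a reference
    value that improves the noisier positional data. Solved by plain
    gradient descent; D is the forward-difference operator."""
    u = positions.astype(float).copy()
    step = 1.0 / (2.0 + 8.0 * lam)   # 1 / Lipschitz bound of the gradient
    for _ in range(iters):
        r = np.diff(u) - grad_ref    # residual D u - grad_ref
        g = 2.0 * (u - positions)    # gradient of the data term
        g[:-1] -= 2.0 * lam * r      # plus 2*lam * D^T r (adjoint of diff)
        g[1:] += 2.0 * lam * r
        u -= step * g
    return u
```

In the deflectometric setting, `positions` would play the role of the noisier positional measurements and `grad_ref` that of the more accurately measured slope data.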
Using the insights gained, we are able to develop new approaches to image denoising. We formulate minimization problems that use suitable reference values. In image denoising, the reference value we want to use is an approximation of the image gradient vectors. Consequently, our approaches first calculate such an approximation and then use it as a reference value; they are therefore two-stage methods.

Another way to prevent the staircasing effect is to use higher orders of differentiation within the regularization term. One such approach, proposed in 2010, is the total generalized variation (TGV) functional. We propose different combinations of these functionals and are thus able to formulate different minimization problems that are, in a certain sense, equivalent to the TGV problem. One advantage of some of these problems lies in easy parameter choice rules that perform as well as those of the TGV problem. Additionally, the duality gaps of these new problems are finite, instead of infinite as is usually the case for the primal-dual gap of the TGV problem. Hence, they can be used to create a reasonable stopping criterion for the optimization process. A further advantage is the decreased runtime of the two-stage methods, since the problem is divided into two smaller problems.
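The role of a finite primal-dual gap as a stopping criterion can be illustrated on a much simpler model problem: plain 1-D total-variation (ROF) denoising solved with a Chambolle-Pock-type primal-dual iteration. This sketch is an assumption-laden stand-in; the thesis's TGV-equivalent formulations are more involved:

```python
import numpy as np

def D(u):
    """Forward differences."""
    return np.diff(u)

def Dt(p):
    """Adjoint of D (a negative divergence)."""
    out = np.zeros(p.size + 1)
    out[:-1] -= p
    out[1:] += p
    return out

def tv_denoise_pd(f, lam=1.0, tol=1e-3, max_iters=20000):
    """Primal-dual solver for 1-D TV denoising,

        min_u 0.5 * ||u - f||^2 + lam * ||D u||_1,

    stopped as soon as the (here finite) primal-dual gap drops below
    `tol`. Step sizes use the operator-norm bound ||D||^2 <= 4."""
    tau = sigma = 0.25               # tau * sigma * ||D||^2 <= 1
    u = f.astype(float).copy()
    ubar = u.copy()
    p = np.zeros(f.size - 1)
    for _ in range(max_iters):
        # dual ascent step, then projection onto {|p| <= lam}
        p = np.clip(p + sigma * D(ubar), -lam, lam)
        u_old = u
        # primal descent step: prox of the quadratic data term
        u = (u - tau * Dt(p) + tau * f) / (1.0 + tau)
        ubar = 2.0 * u - u_old
        # primal-dual gap as a stopping criterion (weak duality: gap >= 0)
        primal = 0.5 * np.sum((u - f) ** 2) + lam * np.sum(np.abs(D(u)))
        dtp = Dt(p)
        dual = np.dot(dtp, f) - 0.5 * np.sum(dtp ** 2)
        if primal - dual < tol:
            break
    return u
```

Because the gap is finite and nonnegative for every feasible dual variable, it gives a computable, monotonically meaningful measure of optimality; this is exactly the property that the infinite TGV primal-dual gap lacks and that the equivalent reformulations recover.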