Data driven regularization by projection

Andrea Aspri (RICAM)
Joint work with Y. Korolev and O. Scherzer

Joint meeting Fudan University and RICAM, Shanghai & Linz, 10 June 2020
Motivation

Given Au = y and noisy measurements y^δ such that ||y − y^δ|| ≤ δ.

Goal: given the measurements y^δ, reconstruct the unknown quantity u.

Assume we have pairs {u_i, y_i}, i = 1, ..., n, such that A u_i = y_i.

Questions:
- How can we use these pairs in the reconstruction process?
- How can we use these pairs when A is not explicitly known?
Learning an operator: is it possible?

Main goal: develop stable algorithms for finding u such that Au = y without explicit knowledge of A, having only
- training pairs {u_i, y_i}, i = 1, ..., n, such that A u_i = y_i;
- noisy measurements y^δ such that ||y − y^δ|| ≤ δ.

Question: is there a regularization method capable of learning a linear operator?

Spoiler: yes, there is...
Terminology & Notation

- u_i are the "training images"; y_i are the "training data".
- U_n := span{u_i}_{i=1,...,n} and Y_n := span{y_i}_{i=1,...,n}.
- P_{U_n} and P_{Y_n} are the orthogonal projections onto U_n and Y_n, respectively.
- u† is the solution to Au = y.
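In a discretized setting, the projections P_{U_n} and P_{Y_n} can be built from the training vectors alone. A minimal numpy sketch (all names and dimensions here are illustrative assumptions, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 5
U_mat = rng.standard_normal((m, n))   # columns play the role of training images u_1,...,u_n

# Orthonormal basis of U_n = span{u_1,...,u_n} via a thin QR factorization
Q, _ = np.linalg.qr(U_mat)

def project(Q, v):
    """Orthogonal projection of v onto the column span of the orthonormal matrix Q."""
    return Q @ (Q.T @ v)

v = rng.standard_normal(m)
p = project(Q, v)
# The residual v - p is orthogonal to every training image:
assert np.allclose(U_mat.T @ (v - p), 0.0, atol=1e-10)
```

The same construction applied to the y_i yields P_{Y_n}.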
Main assumptions

1. Operator
- A : U → Y, with U, Y Hilbert spaces.
- A is bounded, linear and injective (but A^{-1} is unbounded).

2. Data
- Linear independence of {u_i}_{i=1,...,n} for all n ∈ N.
- Uniform boundedness: there exists C_u > 0 such that ||u_i|| ≤ C_u for all i ∈ N.
- Sequentiality: the training pairs are nested, i.e. {u_i, y_i}_{i=1,...,n+1} = {u_i, y_i}_{i=1,...,n} ∪ {u_{n+1}, y_{n+1}}, hence U_n ⊂ U_{n+1} and Y_n ⊂ Y_{n+1} for all n ∈ N.
- Density: the training-image spaces are dense in U, i.e. the closure of ∪_{n ∈ N} U_n equals U.

Consequences: the training data y_i are linearly independent and uniformly bounded as well, and the training-data spaces are dense in R(A).
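For a finite training set, the data assumptions are straightforward to check numerically. A small sketch on synthetic data (not from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 50, 8
U_mat = rng.standard_normal((m, n))   # columns u_1,...,u_n

# Linear independence of {u_1,...,u_k} for every k: each leading block has
# full column rank, so the subspaces U_1 ⊂ U_2 ⊂ ... ⊂ U_n are strictly nested.
for k in range(1, n + 1):
    assert np.linalg.matrix_rank(U_mat[:, :k]) == k

# Uniform boundedness: every training image has norm at most C_u.
C_u = np.linalg.norm(U_mat, axis=0).max()
assert all(np.linalg.norm(U_mat[:, i]) <= C_u for i in range(n))
```

If A is injective, the same rank check on the columns of [y_1 ... y_n] succeeds automatically, which is the first consequence stated above.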
Regularization by projection

Approximate u† using the minimum-norm solution (MNS) of finite-dimensional problems.

Least-squares projection:
(1) A P_n u = y, with P_n an orthogonal projection onto a finite-dimensional subspace of U.

Dual least-squares projection:
(2) Q_n A u = Q_n y, with Q_n an orthogonal projection onto a finite-dimensional subspace of Y.

Our idea to use training pairs: choose P_n = P_{U_n} in (1) and Q_n = P_{Y_n} in (2).

It can be proven that the MNS of (1) is u_{U_n} = A^{-1} P_{Y_n} y, and the MNS of (2) is u_{Y_n} = P_{A* Y_n} u†. In general u_{U_n} does not converge to u†; in contrast, u_{Y_n} → u†.

Engl, H. W., Hanke, M. and Neubauer, A., Regularization of Inverse Problems, Springer (1996).
Seidman, T. I., Nonconvergence results for the application of least-squares estimation to ill-posed problems, J. Opt. Th. Appl. (1980).