Abstract: 

The roadmap of the talk is:

1. All linear numerical methods for solving linear PDE problems using finitely many given data can be reformulated as linear discrete recovery formulas for solution values in terms of the given data.

2. These recovery formulas are linear functionals on a reproducing kernel Hilbert space (RKHS), e.g., a Sobolev space.

3. The pointwise error of such a formula is also a linear functional on that space.

4. The norm of the pointwise error functional can be evaluated exactly and explicitly using the kernel of the RKHS. It provides explicit and quantitative pointwise worst-case error bounds as percentages of the (unknown) RKHS norm of the true solution of the PDE problem; a formula sketch is given after this list.

5. This allows a fair comparison of all linear PDE solvers, including finite elements (standard or generalized, with or without multigrid) and various meshless methods, in weak or strong form. The comparison is independent of how the solver actually carries out its calculations, since it focuses on the error of the recovery formula uniquely associated with the solver, not on the solver itself. Special features of algorithms, such as different numerical integration strategies, enter the comparison only indirectly.

6. Some numerical examples will compare the errors of finite element methods with those of certain meshless techniques, under the assumption that they use the same input data and work on a fixed Sobolev space.

7. One can ask for a linear PDE solver that minimizes the pointwise error in the above sense. The solution turns out to be the meshless symmetric kernel-based collocation method dating back to the nineties. Examples will show how far standard finite-element methods and certain meshless techniques are from the error-optimal solver; a small numerical sketch follows this list.
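
A minimal sketch of the calculation behind items 2–4, in notation chosen here for illustration (none of the symbols are from the abstract itself): assume the true solution u lies in an RKHS \(\mathcal{H}\) with kernel \(K\), the given data are the values \(\lambda_1(u),\dots,\lambda_N(u)\) of continuous linear functionals (PDE and boundary data), and the solver's recovery formula is \(\tilde{u}(x) = \sum_{j=1}^{N} a_j(x)\,\lambda_j(u)\). The pointwise error at x is the value of the error functional

\[
  \epsilon_x := \delta_x - \sum_{j=1}^{N} a_j(x)\,\lambda_j,
  \qquad
  u(x) - \tilde{u}(x) = \epsilon_x(u),
\]

and its dual norm is explicitly computable from the kernel alone,

\[
  \|\epsilon_x\|_{\mathcal{H}^*}^2
  = K(x,x)
  - 2\sum_{j=1}^{N} a_j(x)\,\lambda_j^{y} K(x,y)
  + \sum_{j,k=1}^{N} a_j(x)\,a_k(x)\,\lambda_j^{y}\lambda_k^{z} K(y,z),
\]

where superscripts indicate the variable a functional acts on. This yields the worst-case bound

\[
  |u(x) - \tilde{u}(x)| \le \|\epsilon_x\|_{\mathcal{H}^*}\,\|u\|_{\mathcal{H}},
\]

which is the quantitative, solver-independent error measure referred to above.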
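
A minimal numerical sketch of how such worst-case constants can be compared (illustrative only: the kernel, nodes, and weights below are assumptions, and the data functionals are simplified to point evaluations rather than the PDE functionals of the talk):

    import numpy as np

    def kernel(x, y):
        # Exponential (Matern-1/2) kernel; its native space is a Sobolev-type space on the real line.
        return np.exp(-np.abs(x - y))

    def worst_case_error(x, nodes, a):
        # Dual norm of the error functional delta_x - sum_j a_j * delta_{x_j},
        # evaluated via the explicit quadratic form in the kernel.
        A = kernel(nodes[:, None], nodes[None, :])   # Gram matrix of the data functionals
        kx = kernel(x, nodes)                        # kernel values K(x, x_j)
        sq = kernel(x, x) - 2 * a @ kx + a @ A @ a
        return np.sqrt(max(sq, 0.0))

    nodes = np.linspace(0.0, 1.0, 7)   # hypothetical data sites
    x = 0.37                           # evaluation point

    # Generic recovery weights, standing in for an arbitrary linear solver's formula:
    a_generic = np.full(len(nodes), 1.0 / len(nodes))

    # Error-optimal weights: minimize the quadratic form by solving the kernel system.
    # With point-evaluation data this is plain kernel interpolation, the simplest analogue
    # of the symmetric kernel-based collocation optimality claimed in item 7.
    A = kernel(nodes[:, None], nodes[None, :])
    a_optimal = np.linalg.solve(A, kernel(x, nodes))

    print("worst-case constant, generic weights:", worst_case_error(x, nodes, a_generic))
    print("worst-case constant, optimal weights:", worst_case_error(x, nodes, a_optimal))

The printed numbers are the factors multiplying the unknown RKHS norm of the true solution in the pointwise error bound; the optimal value is the classical power function of kernel interpolation, and carrying out the same minimization for genuine PDE data functionals leads to the symmetric collocation method of item 7.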

Speaker

Prof. Robert Schaback

Affiliation

University of Göttingen, Germany

Date

Thu, 11/04/2013 - 10:00am to 11:00am

Venue

Red Center 4082