Examples
Here we want to mention and link some examples and applications that, in our opinion, nicely showcase the functionalities of TorchPhysics and the PINN idea.
More examples can be found in the examples folder.
Poisson problem
One of the simplest applications is the forward solution of a Poisson equation:
This problem is part of the tutorial and is therefore explained there in detail. The corresponding implementation can be found here.
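To illustrate the general idea behind a PINN forward solve, here is a minimal plain-PyTorch sketch (not the TorchPhysics API) for an assumed one-dimensional Poisson problem \(-u'' = \pi^2 \sin(\pi x)\) on \((0, 1)\) with \(u(0) = u(1) = 0\); the network, sample counts, and right-hand side are illustrative choices only:

```python
import torch

torch.manual_seed(0)

# Small fully connected network approximating the solution u(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    # x: (N, 1) collocation points with requires_grad=True.
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = torch.pi**2 * torch.sin(torch.pi * x)
    return -d2u - f  # residual of -u'' = f

x_interior = torch.rand(128, 1, requires_grad=True)  # interior samples
x_boundary = torch.tensor([[0.0], [1.0]])            # boundary points

# PINN loss: mean squared PDE residual plus boundary-condition penalty.
loss = pde_residual(x_interior).pow(2).mean() + net(x_boundary).pow(2).mean()
loss.backward()  # gradients for an optimizer step
```

In a real training run this loss would be minimized over many optimizer steps with resampled collocation points; TorchPhysics wraps these ingredients (domains, samplers, conditions) into reusable objects.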
Learning parameter dependencies
A natural extension of the PINN approach is to learn parameter dependencies that appear in the differential equation. A simple example is the problem:
where we want to train a family of solutions for \(k \in [0, 2]\); here we want to find the function \(u(x, k) = e^{kx}\). This example is implemented in: simple-parameter-dependency-notebook
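The core trick is to treat the parameter as an additional network input. As a sketch, assume the underlying ODE is \(u_x = k\,u\) with \(u(0, k) = 1\) (which has \(u(x, k) = e^{kx}\) as its solution); the network then maps the pair \((x, k)\) to \(u\):

```python
import torch

torch.manual_seed(0)

# Network input is the concatenated pair (x, k); output is u(x, k).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

# Sample x in [0, 1] and the parameter k in [0, 2] independently.
x = torch.rand(256, 1, requires_grad=True)
k = 2.0 * torch.rand(256, 1)

u = net(torch.cat([x, k], dim=1))
u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
ode_loss = (u_x - k * u).pow(2).mean()       # residual of u_x = k * u

# Initial condition u(0, k) = 1 for freshly sampled k.
x0 = torch.zeros(64, 1)
k0 = 2.0 * torch.rand(64, 1)
ic_loss = (net(torch.cat([x0, k0], dim=1)) - 1.0).pow(2).mean()

loss = ode_loss + ic_loss
```

After training, evaluating the network at a fixed \(k\) yields the corresponding member of the solution family, without retraining.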
This approach also works for more complex problems; see for example this notebook, where we apply the idea to the heat equation.
Inverse heat equation
For an inverse problem we consider the heat equation:
with \(\Omega = [0, 10] \times [0, 10]\), where \(D\) can be either a constant or a function itself. We start with some data \(u(t_i, x_i)\) and want to find the corresponding \(D\).
The aim of the following two examples is to show how one can implement this in TorchPhysics:
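For the constant-\(D\) case, the idea can be sketched in plain PyTorch (not the TorchPhysics API): make \(D\) a trainable parameter and minimize a data-misfit term plus the heat-equation residual \(u_t - D\,\Delta u\). The measurement values below are random placeholders; in the notebooks they would be real simulation or sensor data:

```python
import torch

torch.manual_seed(0)

# Network input is (t, x1, x2); output is u(t, x).
net = torch.nn.Sequential(
    torch.nn.Linear(3, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
log_D = torch.nn.Parameter(torch.zeros(1))  # learn D > 0 via exp(log_D)

# Placeholder measurements u(t_i, x_i) on [0, 1] x [0, 10]^2.
t_obs = torch.rand(50, 1)
x_obs = 10.0 * torch.rand(50, 2)
u_obs = torch.rand(50, 1)  # stand-in for real data

inp = torch.cat([t_obs, x_obs], dim=1).requires_grad_(True)
u = net(inp)
grads = torch.autograd.grad(u.sum(), inp, create_graph=True)[0]
u_t = grads[:, :1]
laplacian = 0.0
for i in (1, 2):  # second derivatives w.r.t. the two spatial inputs
    g2 = torch.autograd.grad(grads[:, i].sum(), inp, create_graph=True)[0]
    laplacian = laplacian + g2[:, i:i+1]

D = log_D.exp()
loss = (u - u_obs).pow(2).mean() + (u_t - D * laplacian).pow(2).mean()
loss.backward()  # gradients flow into both the network and log_D
```

Optimizing this loss fits the network to the data while simultaneously adjusting \(D\) so that the fitted field satisfies the heat equation.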
Heat equation on moving domain
To demonstrate how easily one can create a time- (or parameter-) dependent domain, we consider the PDE:
Here \(\Omega\) is a circle with a moving hole and \(\Gamma_\text{in}(t)\) is the boundary of the hole. The animation on the main page shows the solution of this problem.
Link to the notebook: moving-domain-notebook
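Conceptually, a moving domain just means that the collocation sampler depends on \(t\). The sketch below (pure NumPy, with an assumed geometry: unit outer circle, hole of radius 0.25 whose center moves with time) shows rejection sampling of such a domain; TorchPhysics provides domain operations that handle this automatically:

```python
import numpy as np

rng = np.random.default_rng(0)

def hole_center(t):
    # Assumed motion: the hole center travels on a circle of radius 0.5.
    return 0.5 * np.array([np.cos(t), np.sin(t)])

def sample_domain(t, n):
    """Rejection-sample n points inside the unit circle, outside the hole at time t."""
    c = hole_center(t)
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, size=2)
        if np.linalg.norm(p) <= 1.0 and np.linalg.norm(p - c) >= 0.25:
            pts.append(p)
    return np.array(pts)

pts = sample_domain(t=0.3, n=200)  # collocation points at one time slice
```

During training, one would draw such samples for many values of \(t\), so the PDE residual is enforced on the domain as it moves.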
Using hard constraints
For some problems, it is advantageous to build prior knowledge into the network architecture (e.g. scaling the network output or fixing its values on the boundary). This can easily be achieved in TorchPhysics and is demonstrated in this hard-constrains-notebook. There we consider the system:
where the high frequency is problematic for the usual PINN-approach.
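A common way to fix boundary values exactly is to multiply the network output by a function that vanishes on the boundary. As a one-dimensional sketch with assumed boundary conditions \(u(0) = u(1) = 0\):

```python
import torch

torch.manual_seed(0)

raw = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def u(x):
    # Hard constraint (assumed BCs u(0) = u(1) = 0): the factor x(1 - x)
    # vanishes at x = 0 and x = 1, so the boundary condition holds exactly
    # for any network output -- no boundary loss term is needed.
    return x * (1.0 - x) * raw(x)

x_boundary = torch.tensor([[0.0], [1.0]])
u_boundary = u(x_boundary)  # exactly zero, by construction
```

Since the boundary condition is satisfied by construction, the optimizer only has to fit the PDE residual, which often helps for stiff or high-frequency problems like the one in the notebook.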
Interface jump
For an example where we want to solve a problem with a discontinuous solution, we study, for \(\Omega = (0, 1) \times (0, 1)\) and \(\Gamma\) the line from \((0.5, 0)\) to \((0.5, 1)\), the PDE:
with \(i = 1, 2\) and the solution \(u=(u_1, u_2)\), split up into left and right part.
For this problem we need two networks, since a single network can, in general, not approximate the jump in the solution. Therefore, this example focuses on training two neural networks on disjoint domains, coupled over the interface.
Link to the notebook: jump-notebook
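The coupling itself becomes just another loss term evaluated on interface points. As a sketch (the prescribed jump value and the exact coupling condition are assumptions here; the notebook defines the actual interface conditions):

```python
import torch

torch.manual_seed(0)

# One network per subdomain: left (x1 < 0.5) and right (x1 > 0.5).
def make_net():
    return torch.nn.Sequential(
        torch.nn.Linear(2, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1),
    )

net_left, net_right = make_net(), make_net()

# Sample points on the interface Gamma = {0.5} x (0, 1).
y = torch.rand(64, 1)
gamma = torch.cat([torch.full_like(y, 0.5), y], dim=1)

# Assumed coupling: prescribe a jump [u] = u_2 - u_1 = 1 across Gamma.
jump = 1.0
interface_loss = (net_right(gamma) - net_left(gamma) - jump).pow(2).mean()
# The full loss would add the PDE residual of each network on its own
# subdomain (and, depending on the problem, a flux-continuity term).
```

Because each network only ever sees points from its own subdomain, each one approximates a smooth function, and the discontinuity is represented by the pair.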