Article

Hierarchical deep-learning neural networks: finite elements and beyond

Journal

COMPUTATIONAL MECHANICS
Volume 67, Issue 1, Pages 207-230

Publisher

SPRINGER
DOI: 10.1007/s00466-020-01928-9

Keywords

Neural network interpolation functions; Data-driven; r- and rh-adaptivity; Fundamental building block; Rational functions (e.g. RKPM, NURBS and IGA)

Funding

  1. Research Experiences for Undergraduates (REU) [CMMI-1762035]
  2. National Natural Science Foundation of China [11832001, 11988102, 11890681]
  3. NSF [CMMI-1762035]


The hierarchical deep-learning neural network (HiDeNN) is systematically developed through the construction of structured deep neural networks (DNNs) in a hierarchical manner, and a special case of HiDeNN for representing the Finite Element Method (HiDeNN-FEM for short) is established. In HiDeNN-FEM, weights and biases are functions of the nodal positions, hence the training process in HiDeNN-FEM includes the optimization of the nodal coordinates. This is the spirit of r-adaptivity, and it increases both the local and global accuracy of the interpolants. By fixing the number of hidden layers and increasing the number of neurons during training of the DNNs, rh-adaptivity can be achieved, which leads to further improvement of the solution accuracy. The generalization to rational functions is achieved by developing three fundamental building blocks for constructing deep hierarchical neural networks: linear functions, multiplication, and inversion. With these building blocks, the class of deep-learning interpolation functions is demonstrated for interpolation theories such as Lagrange polynomials, NURBS, isogeometric analysis, the reproducing kernel particle method, and others. In HiDeNN-FEM, enrichment functions constructed through the multiplication of neurons are equivalent to the enrichment in standard finite element methods, that is, the generalized, extended, and partition-of-unity finite element methods. Numerical examples performed with HiDeNN-FEM exhibit reduced approximation error compared with standard FEM. Finally, an outlook for generalizing HiDeNN to high-order continuity in multiple dimensions and to topology optimization is illustrated through the hierarchy of the proposed DNNs.
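The abstract's core construction is easiest to see in one dimension: a piecewise-linear finite-element shape function is exactly a small ReLU network whose weights and biases are functions of the nodal coordinates, so the same gradient step that trains the degrees of freedom also moves the nodes (r-adaptivity). The sketch below is a minimal illustration in JAX, not the authors' code; the function names (hat, u_h, loss), the node layout, the toy target u(x) = x^2, and the optimizer settings are all assumptions chosen for a small fitting problem, and only the linear/ReLU building block is used (the multiplication and inversion blocks would extend this toward rational functions such as NURBS).

```python
# Minimal sketch (assumed, not the authors' code) of the HiDeNN-FEM idea in 1D:
# each linear FE shape function is a 3-neuron ReLU combination whose weights
# and biases depend on the nodal coordinates, so r-adaptivity is plain
# gradient descent on the interior node positions.
import jax
import jax.numpy as jnp

def hat(x, xl, xc, xr):
    """Linear FE shape function for node xc with neighbors xl, xr,
    written exactly with ReLU activations."""
    hl, hr = xc - xl, xr - xc
    return (jax.nn.relu(x - xl) / hl
            - jax.nn.relu(x - xc) * (1.0 / hl + 1.0 / hr)
            + jax.nn.relu(x - xr) / hr)

def u_h(x, nodes, dofs):
    """FE interpolant u_h(x) = sum_i u_i N_i(x); ghost nodes pad the ends
    so boundary shape functions are well defined."""
    padded = jnp.concatenate([nodes[:1] - 1.0, nodes, nodes[-1:] + 1.0])
    basis = jax.vmap(
        lambda i: hat(x, padded[i], padded[i + 1], padded[i + 2])
    )(jnp.arange(nodes.size))
    return jnp.dot(dofs, basis)

def loss(params, xq, targets):
    """L2 misfit to a target function at sample points; interior nodes are
    sorted so the mesh stays ordered while they move during training."""
    interior = jnp.sort(params["interior"])
    nodes = jnp.concatenate([jnp.array([0.0]), interior, jnp.array([1.0])])
    preds = jax.vmap(lambda x: u_h(x, nodes, params["dofs"]))(xq)
    return jnp.mean((preds - targets) ** 2)

# Toy problem: fit u(x) = x**2 on [0, 1]. Both the nodal values (the usual
# FEM unknowns) and the interior nodal positions (r-adaptivity) are trained.
xq = jnp.linspace(0.0, 1.0, 200)
targets = xq ** 2
params = {"interior": jnp.array([0.25, 0.5, 0.75]),
          "dofs": jnp.zeros(5)}
grad_fn = jax.jit(jax.grad(loss))
for _ in range(500):
    grads = grad_fn(params, xq, targets)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)
```

Because the nodal positions enter the loss only through the weights and biases of the ReLU neurons, relocating the nodes requires no machinery beyond automatic differentiation; in a production setting one would additionally bound the node spacing to avoid degenerate elements.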


Reviews

Primary Rating

4.7 (not enough ratings)

Secondary Ratings

Novelty: -
Significance: -
Scientific rigor: -
