IP 7. Hierarchical tensor decomposition and approximation
Chair: Peter Benner

The (numerical) linear algebra related to tensor computations plays an important role in the theoretical and applied sciences: applied mathematics, biology, chemistry, information sciences, physics and many other areas. In this talk we consider high-order or high-dimensional tensors of the form A ∈ R^{n×⋯×n} (with d factors) as they appear in multivariate (discrete or continuous) approximation problems. These require special techniques in order to allow representation and approximation in high dimension d (cf. the curse of dimensionality). A typical example is the canonical polyadic (CP) format
A_{i_1,…,i_d} = \sum_{j=1}^{r} a_1^{j}(i_1) ⋯ a_d^{j}(i_d)
that requires O(d·n·r) degrees of freedom for the representation. However, this simple format comes with several numerical obstructions because it is highly non-linear. In the last three years some new hierarchical formats (namely Tensor Train and Hierarchical Tucker) have been developed and analysed. These include all tensors of CP format (with rank parameter r fixed) as a subset, but in addition they allow for some nice linear algebra approaches, e.g. a hierarchical SVD of complexity O(d·n·r³). The hierarchical SVD is similar to the SVD for matrices in the sense that it allows us to reliably compute a quasi-best approximation in the hierarchical format. We will outline the basic linear algebra behind the new approaches as well as open questions and possible applications.
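As a rough illustration of these ideas (a minimal numpy sketch, not code from the talk), the following builds a small tensor from a CP representation with d·n·r degrees of freedom and then recompresses it by sequential truncated SVDs in the spirit of the hierarchical SVD, here in its Tensor Train variant; all sizes, names and tolerances are assumptions chosen for the demonstration.

# Illustrative sketch (assumed parameters), not the speaker's code.
import numpy as np

d, n, r = 4, 10, 3          # order, mode size, CP rank (illustrative)
rng = np.random.default_rng(0)

# CP representation: d factor matrices of size n x r, i.e. d*n*r numbers
# instead of the n**d entries of the full tensor.
factors = [rng.standard_normal((n, r)) for _ in range(d)]

# Expand to the full tensor A[i1,...,id] = sum_j a_1^j(i1) ... a_d^j(id).
# Only feasible for small d and n; the hierarchical formats exist
# precisely to avoid this step in high dimensions.
T = factors[0]                                   # shape (n, r)
for a in factors[1:]:
    T = np.einsum('...j,ij->...ij', T, a)        # append one mode
A = T.sum(axis=-1)                               # shape (n,) * d


def tt_svd(tensor, eps=1e-12):
    """Compress a full tensor by d-1 truncated SVDs (TT-SVD).

    Returns cores G_k of shape (r_{k-1}, n_k, r_k); truncating each SVD
    at relative tolerance eps yields a quasi-best approximation.
    """
    shape = tensor.shape
    cores, rank = [], 1
    C = tensor.reshape(rank * shape[0], -1)
    for k in range(tensor.ndim - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        keep = max(1, int(np.sum(s > eps * s[0])))   # truncation rank
        cores.append(U[:, :keep].reshape(rank, shape[k], keep))
        rank = keep
        C = (s[:keep, None] * Vt[:keep]).reshape(rank * shape[k + 1], -1)
    cores.append(C.reshape(rank, shape[-1], 1))
    return cores


def tt_to_full(cores):
    """Contract the TT cores back to the full tensor (for checking only)."""
    T = cores[0]
    for G in cores[1:]:
        T = np.einsum('...a,aib->...ib', T, G)
    return T.reshape(T.shape[1:-1])

cores = tt_svd(A)
print("TT ranks:", [G.shape[2] for G in cores[:-1]])   # bounded by r = 3
err = np.linalg.norm(tt_to_full(cores) - A) / np.linalg.norm(A)
print("relative error:", err)                          # ~ machine precision

In this sketch the recovered TT ranks stay bounded by the CP rank r, matching the statement that the hierarchical formats contain all CP tensors of fixed rank, and the truncated SVDs deliver the quasi-best approximation mentioned above.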
Lars Grasedyck
Institute for Geometry and Practical Mathematics, RWTH Aachen, Germany