# Tree tensor networks, associated singular values and high-dimensional approximation

- *Baum Tensor-Netzwerke, assoziierte Singulärwerte und hochdimensionale Approximation* (German title)

Krämer, Sebastian; Grasedyck, Lars (Thesis advisor); Schneider, Reinhold (Thesis advisor); Bachmayr, Markus (Thesis advisor)

*Aachen (2020)*

Dissertation / PhD Thesis

Dissertation, RWTH Aachen University, 2020

### Abstract

In this thesis, we develop an algebraic and graph-theoretical reinterpretation of tensor networks and formats. We investigate properties of the associated singular values and demonstrate their importance for high-dimensional approximation, in particular for model complexity adaptation. This leads us to a concept of stability for iterative optimization methods, which we discuss at length for discrete matrix and tensor completion. We further generalize these ideas to the approximate interpolation of scattered data, and demonstrate the potential of the introduced algorithms on a data set that describes rolling press experiments. These largely algorithmic considerations are supplemented and supported by a theoretical examination of the interrelation between tensor singular values and its relation to the quantum marginal problem.

Tensor networks are essentially multilinear maps that reflect the connections between collections of tensors. In the first part, we discuss how two familiar concepts in mathematics yield an arithmetic that naturally describes such networks and formalizes the underlying, simple graph structures through universal assertions. The practicability of this calculus is reflected in the straightforward implementations that we provide, also of well-known algorithms. The central theorem of this thesis is the generalizing tree singular value decomposition which, while not novel in its basic idea, incorporates various gauge conditions stemming from different, corresponding tensor formats.

In the second part, we discuss details of high-dimensional alternating least squares optimization in tree tensor networks, that is, those families of tensors whose networks form tree graphs. Due to the special properties of this class of formats, even high-dimensional problems can be handled effectively, in particular when the arising linear subproblems are solved via a conjugate gradient method.
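For a linear tree, a decomposition of this kind reduces to the well-known TT-SVD: successive singular value decompositions of matricizations of the tensor. The following NumPy sketch illustrates that standard construction only; it is not the thesis's implementation, and the names `tt_svd`, `max_rank`, and `tol` are our own choices.

```python
import numpy as np

def tt_svd(tensor, max_rank=None, tol=0.0):
    """Decompose a d-dimensional array into tensor train cores via
    successive SVDs of matricizations (standard TT-SVD sketch)."""
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        # truncate by relative tolerance and/or a maximal rank
        keep = max(1, int(np.sum(s > tol * s[0]))) if s[0] > 0 else 1
        if max_rank is not None:
            keep = min(keep, max_rank)
        u, s, vt = u[:, :keep], s[:keep], vt[:keep, :]
        cores.append(u.reshape(rank, dims[k], keep))  # left-orthogonal core
        rank = keep
        # carry the remainder on to the next matricization
        mat = (np.diag(s) @ vt).reshape(rank * dims[k + 1], -1)
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores
```

With `tol=0.0` and no rank cap the decomposition is exact, and contracting the cores in order recovers the original tensor; the singular values discarded at step `k` bound the error of the corresponding matricization.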
Subsequent to this introductory segment, we investigate the meaning of singular values in this context. As the model complexity is determined by the tensor ranks of the iterate, their proper calibration becomes essential in order to obtain reasonable solutions to recovery problems. Based on a specific definition of stability, we introduce and discuss modifications of standard alternating least squares as well as the relation to reweighted ℓ1-minimization. In particular, we demonstrate the use of these concepts in rank-adaptive algorithms. These are further generalized from the discrete to the continuous setting, which we apply to the approximate interpolation of rolling press simulations.

As the singular values associated to a tensor network stem from different matricizations of the same tensor, the question of their interrelation arises. In the third part, we first show that the tensor feasibility problem is equivalent to a version of the quantum marginal problem. While the latter has been well known in physics for multiple decades, the tensor version originating from mathematics has only recently been considered. We transfer several results into our setting and subsequently utilize the tree singular value decomposition to decouple high-dimensional feasibility problems into much simpler, smaller ones. Last but not least, we specifically consider this situation for the tensor train format, which leads us to cone theory, so-called honeycombs, and the application of linear programming algorithms.
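The simplest instance of the recovery problems mentioned above is low-rank matrix completion by plain alternating least squares. The sketch below shows only that textbook baseline, assuming a fixed, known rank; the stabilized and rank-adaptive modifications developed in the thesis are not included, and the function name and regularization parameter `reg` are illustrative assumptions.

```python
import numpy as np

def als_matrix_completion(M, mask, rank, n_iter=50, reg=1e-8):
    """Fit U @ V.T to the entries of M where mask is True, by
    alternating least squares over the rows of U and V."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    eye = reg * np.eye(rank)  # small regularization for stability
    for _ in range(n_iter):
        # each row of U solves a small least squares problem
        # restricted to the observed entries of that row
        for i in range(m):
            A = V[mask[i]]
            U[i] = np.linalg.solve(A.T @ A + eye, A.T @ M[i, mask[i]])
        for j in range(n):
            A = U[mask[:, j]]
            V[j] = np.linalg.solve(A.T @ A + eye, A.T @ M[mask[:, j], j])
    return U, V
```

The model complexity here is the fixed parameter `rank`; the thesis's rank-adaptive methods instead monitor the singular values of the iterate to calibrate it during the optimization.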

### Identifier

- DOI: 10.18154/RWTH-2020-05412
- RWTH PUBLICATIONS: RWTH-2020-05412