The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format

Reads:

520

Authors:

S. Holtz, T. Rohwedder, R. Schneider

Abstract:

Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical r-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295–2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 5 (2009), p. 706], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, giving...
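To make the TT representation mentioned in the abstract concrete, the following is a minimal NumPy sketch (not code from the paper, and not the ALS/MALS algorithms themselves): a d-way tensor is stored as a chain of order-3 cores G_k of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1, and each entry is a product of matrices sliced from the cores. The helper names tt_random, tt_entry, and tt_full are illustrative assumptions, not an established API.

```python
import numpy as np

# Illustrative sketch of the tensor train (TT) format, assuming cores
# G_k of shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1, so that
#   A[i1, ..., id] = G_1[:, i1, :] @ G_2[:, i2, :] @ ... @ G_d[:, id, :].

def tt_random(dims, rank, seed=0):
    """Random TT cores with a uniform internal rank (boundary ranks are 1)."""
    rng = np.random.default_rng(seed)
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.standard_normal((ranks[k], n, ranks[k + 1]))
            for k, n in enumerate(dims)]

def tt_entry(cores, index):
    """Evaluate a single entry A[index] as a product of core slices."""
    v = np.ones((1, 1))
    for G, i in zip(cores, index):
        v = v @ G[:, i, :]          # contract one core at a time
    return v[0, 0]

def tt_full(cores):
    """Reconstruct the full tensor (only feasible for tiny examples)."""
    full = cores[0]                  # shape (1, n_1, r_1)
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([-1], [0]))
    return full[0, ..., 0]           # drop the trivial boundary ranks

if __name__ == "__main__":
    dims, rank = (4, 5, 6, 3), 2
    cores = tt_random(dims, rank)
    A = tt_full(cores)
    idx = (1, 3, 2, 0)
    # Entry-wise evaluation and full reconstruction must agree.
    print(np.isclose(A[idx], tt_entry(cores, idx)))     # True
    # Storage of the cores vs. the full tensor.
    print(sum(G.size for G in cores), A.size)
```

The storage comparison in the example illustrates the point made in the abstract: the TT format needs only the sum of the core sizes (linear in the order d for fixed ranks), whereas the full tensor grows exponentially with d.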


DOI:

10.1137/100818893

Cited by:

290

Year:

2012
