Paper Title


Tensorized Random Projections

Authors

Rakhshan, Beheshteh T., Rabusseau, Guillaume

Abstract


We introduce a novel random projection technique for efficiently reducing the dimension of very high-dimensional tensors. Building upon classical results on Gaussian random projections and Johnson-Lindenstrauss transforms (JLT), we propose two tensorized random projection maps relying on the tensor train (TT) and CP decomposition formats, respectively. The two maps offer very low memory requirements and can be applied efficiently when the inputs are low rank tensors given in the CP or TT format. Our theoretical analysis shows that the dense Gaussian matrix in JLT can be replaced by a low-rank tensor implicitly represented in compressed form with random factors, while still approximately preserving the Euclidean distance of the projected inputs. In addition, our results reveal that the TT format is substantially superior to CP in terms of the size of the random projection needed to achieve the same distortion ratio. Experiments on synthetic data validate our theoretical analysis and demonstrate the superiority of the TT decomposition.
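The idea in the abstract can be illustrated with a minimal sketch: each output coordinate of the projection is an inner product between the input tensor and a random TT tensor with i.i.d. Gaussian cores, computed core by core so the dense tensor is never materialized. This is a hypothetical NumPy illustration of the general principle, not the paper's exact construction; the function name, TT rank, and normalization constant are assumptions chosen here so that the squared norm is preserved in expectation.

```python
import numpy as np

def tt_random_projection(x, k, rank=2, seed=0):
    """Project an order-N tensor x to R^k via k implicit random TT tensors.

    Illustrative sketch (not the paper's exact map): the i-th output is
    <A_i, x>, where A_i is a TT tensor whose cores have i.i.d. N(0, 1)
    entries, and the result is rescaled so that E[||y||^2] = ||x||^2.
    """
    rng = np.random.default_rng(seed)
    dims = x.shape
    N = len(dims)
    ranks = [1] + [rank] * (N - 1) + [1]  # TT ranks r_0 = r_N = 1
    y = np.empty(k)
    for i in range(k):
        # Contract <A_i, x> core by core; A_i is never formed explicitly.
        C = x.reshape(1, -1)  # shape (r_0, d_1 * ... * d_N)
        for n in range(N):
            G = rng.standard_normal((ranks[n], dims[n], ranks[n + 1]))
            # Fold the next mode into the row index, then absorb the core.
            C = C.reshape(ranks[n] * dims[n], -1)
            C = G.reshape(ranks[n] * dims[n], ranks[n + 1]).T @ C
        y[i] = C.item()
    # For i.i.d. N(0,1) cores, E[<A_i, x>^2] = rank**(N-1) * ||x||^2,
    # so this normalization makes E[||y||^2] = ||x||^2.
    return y / np.sqrt(k * rank ** (N - 1))
```

As a quick sanity check, projecting a small order-3 tensor with a moderately large k should roughly preserve its squared Euclidean norm, consistent with the JLT-style guarantee described in the abstract.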
