Tensor Contraction & Regression Networks

Authors

  • Jean Kossaifi
  • Zachary C. Lipton
  • Aran Khanna
  • Tommaso Furlanello
  • Anima Anandkumar
Abstract

To date, most convolutional neural network architectures output predictions by flattening 3rd-order activation tensors and applying fully-connected output layers. This approach has two drawbacks: (i) we lose rich, multi-modal structure during the flattening process and (ii) fully-connected layers require many parameters. We present the first attempt to circumvent these issues by expressing the output of a neural network directly as the result of a multi-linear mapping from an activation tensor to the output. By imposing low-rank constraints on the regression tensor, we can efficiently solve problems for which existing solutions are badly parametrized. Our proposed tensor regression layer replaces flattening operations and fully-connected layers by leveraging multi-modal structure in the data and expressing the regression weights via a low-rank tensor decomposition. Additionally, we combine tensor regression with tensor contraction to further increase efficiency. Augmenting the VGG and ResNet architectures, we demonstrate large reductions in the number of parameters with negligible impact on performance on the ImageNet dataset.
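
To make the mechanism concrete, the sketch below builds a Tucker-factorized tensor regression layer in PyTorch. It is only a minimal illustration of the idea described in the abstract, not the authors' code: the class name `TuckerTRL`, the rank arguments, and the initialization scale are placeholder choices for this example. The full regression weight is rebuilt from a small core tensor and four factor matrices, then contracted directly against the unflattened activation tensor.

```python
import torch
import torch.nn as nn


class TuckerTRL(nn.Module):
    """Tensor regression layer with a Tucker-factorized weight tensor (illustrative sketch)."""

    def __init__(self, input_shape, n_outputs, ranks):
        super().__init__()
        c, h, w = input_shape          # channels, height, width of the activation tensor
        rc, rh, rw, ro = ranks         # Tucker ranks, one per mode
        self.core = nn.Parameter(torch.randn(rc, rh, rw, ro) * 0.02)
        self.U_c = nn.Parameter(torch.randn(c, rc) * 0.02)
        self.U_h = nn.Parameter(torch.randn(h, rh) * 0.02)
        self.U_w = nn.Parameter(torch.randn(w, rw) * 0.02)
        self.U_o = nn.Parameter(torch.randn(n_outputs, ro) * 0.02)
        self.bias = nn.Parameter(torch.zeros(n_outputs))

    def forward(self, x):              # x: (batch, c, h, w), no flattening
        # Rebuild W = core x1 U_c x2 U_h x3 U_w x4 U_o, with shape (c, h, w, n_outputs).
        w_full = torch.einsum('abcd,ia,jb,kc,od->ijko',
                              self.core, self.U_c, self.U_h, self.U_w, self.U_o)
        # Contract the activation tensor against W over the channel and spatial modes.
        return torch.einsum('nijk,ijko->no', x, w_full) + self.bias


# Usage: replaces `flatten + nn.Linear(c * h * w, n_outputs)` at the top of a CNN.
layer = TuckerTRL(input_shape=(512, 7, 7), n_outputs=1000, ranks=(64, 3, 3, 64))
y = layer(torch.randn(8, 512, 7, 7))   # -> (8, 1000)
```

The tensor contraction step mentioned in the abstract can be sketched analogously, by first projecting each mode of the activation tensor with a small factor matrix before applying the regression.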

Similar articles

Duality of Graphical Models and Tensor Networks

In this article we show the duality between tensor networks and undirected graphical models with discrete variables. We study tensor networks on hypergraphs, which we call tensor hypernetworks. We show that the tensor hypernetwork on a hypergraph exactly corresponds to the graphical model given by the dual hypergraph. We translate various notions under duality. For example, marginalization in a...
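
The correspondence this abstract alludes to can already be seen in the simplest case: contracting a network of factor tensors over their shared indices computes the same number as summing a factorized distribution over its variables. The Python snippet below checks this for a two-factor chain of binary variables; it is only a toy illustration of the duality, not the paper's hypergraph construction.

```python
import numpy as np

# Three binary variables x1, x2, x3 with pairwise factors f12, f23 (an undirected chain).
# Graphical-model view: the partition function is
#   Z = sum_{x1,x2,x3} f12[x1, x2] * f23[x2, x3].
# Tensor-network view: Z is the full contraction of the two factor tensors,
# joined over the shared index x2 and summed over the open indices x1, x3.
rng = np.random.default_rng(0)
f12 = rng.random((2, 2))
f23 = rng.random((2, 2))

Z_sum = sum(f12[a, b] * f23[b, c] for a in range(2) for b in range(2) for c in range(2))
Z_contract = np.einsum('ab,bc->', f12, f23)
assert np.isclose(Z_sum, Z_contract)
```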

Geometrical Deformation Analysis of Gotvand-Olya Dam Using Permanent Geodetic Monitoring Network Observations

In this paper, a two-dimensional deformation analysis of the Gotvand-Olya dam is carried out using daily, monthly, seasonal and annual displacement vectors derived from permanent observations of the dam's geodetic monitoring network. The strain tensor and its invariant parameters, such as dilatation and maximum shear, are computed as well. Nonlinear finite element interpolation based on C1 Cubic Bezier int...
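
For reference, the invariants mentioned here are straightforward to compute once a 2D strain tensor is available. The short snippet below applies the textbook formulas (dilatation as the trace, maximum shear from the principal strains); the numeric values are made up for illustration and have nothing to do with the Gotvand-Olya data.

```python
import numpy as np

# Hypothetical 2D strain tensor [[e_xx, e_xy], [e_xy, e_yy]] from displacement gradients.
strain = np.array([[3.0e-5,  1.2e-5],
                   [1.2e-5, -0.8e-5]])

e_xx, e_yy, e_xy = strain[0, 0], strain[1, 1], strain[0, 1]

dilatation = e_xx + e_yy                                      # trace: areal expansion or contraction
max_shear = np.sqrt(((e_xx - e_yy) / 2.0) ** 2 + e_xy ** 2)   # half the principal-strain difference

print(f"dilatation = {dilatation:.2e}, maximum shear = {max_shear:.2e}")
```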

Tensor Regression Networks with various Low-Rank Tensor Approximations

Tensor regression networks achieve a high rate of compression of model parameters in multilayer perceptrons (MLPs) while having only a slight impact on performance. The tensor regression layer imposes low-rank constraints on the regression weight tensor and replaces the flattening operation of a traditional MLP. We investigate tensor regression networks using various low-rank tensor approximations, aiming to...
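
As one concrete instance of such an approximation (this excerpt does not say which factorizations the paper actually studies, so the choice of CP here is an assumption for illustration), the regression weight tensor can be written as a sum of `rank` rank-one terms and contracted against the activation without ever materializing the full weight:

```python
import torch
import torch.nn as nn


class CPTRL(nn.Module):
    """Tensor regression layer with a CP-factorized weight tensor (illustrative sketch)."""

    def __init__(self, input_shape, n_outputs, rank):
        super().__init__()
        c, h, w = input_shape
        # W[i, j, k, o] = sum_r A[i, r] * B[j, r] * C[k, r] * D[o, r]
        self.A = nn.Parameter(torch.randn(c, rank) * 0.02)
        self.B = nn.Parameter(torch.randn(h, rank) * 0.02)
        self.C = nn.Parameter(torch.randn(w, rank) * 0.02)
        self.D = nn.Parameter(torch.randn(n_outputs, rank) * 0.02)
        self.bias = nn.Parameter(torch.zeros(n_outputs))

    def forward(self, x):  # x: (batch, c, h, w)
        # Contract the activation against each factor; the full W is never formed.
        t = torch.einsum('nijk,ir,jr,kr->nr', x, self.A, self.B, self.C)
        return t @ self.D.t() + self.bias
```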

Audiometric findings with voluntary tensor tympani contraction

BACKGROUND: Tensor tympani contraction may have a "signature" audiogram. This study demonstrates audiometric findings during voluntary tensor tympani contraction. METHODS: Five volunteers possessing the ability to voluntarily contract their tensor tympani muscles were identified and enrolled. Tensor tympani contraction was confirmed with characteristic tympanometry findings. Study subjects unde...

Symmetric curvature tensor

Recently, we have used the symmetric bracket of vector fields and developed the notion of the symmetric derivation. Using this machinery, we have defined the concept of symmetric curvature. This concept is natural and is related to the notions of divergence and Laplacian of vector fields. This concept is also related to the derivations on the algebra of symmetric forms, which has been discu...

Journal title:

Volume   Issue

Pages  -

Publication date: 2017