TAGA 24 Conference Resources

Abstracts, videos, and slides


JM Landsberg

Title: Spaces of matrices of bounded rank

Neriman Tokcan

Title: Tensor methods for omics data analysis

Abstract: The influx of high-dimensional, multimodal omics data presents exciting opportunities for omics-based discoveries. However, the escalating complexity and dimensionality of these datasets pose formidable challenges in terms of data analysis. As advancements in genomics technology push the boundaries of data complexity, there is a parallel need for methodological innovations. Tensor methods have emerged as particularly promising tools for addressing these challenges.

In this talk, we will explore how tensors can naturally capture higher-order relationships and interactions within high-dimensional omics datasets. We will also delve into how tensor-based approaches facilitate integrative analysis by combining heterogeneous data modalities, leading to comprehensive insights into biological systems. Through illustrative case studies, we will demonstrate the utility and impact of these methods in genomic discoveries and personalized medicine. Finally, we will discuss future directions and potential advancements in this field.

Alessandro Oneto

Title: Hadamard-Hitchcock Decompositions of Tensors


Hitchcock decompositions of tensors (a.k.a. Canonical Polyadic or PARAFAC decompositions) encode mixtures of independence models on discrete random variables, namely, statistical models given by one hidden discrete variable connected to a set of observed ones. Hadamard products, i.e., entry-wise products, of Hitchcock decompositions encode products of mixtures of independence models on discrete random variables, namely, statistical models given by a complete bipartite graph with a layer of hidden discrete variables and a layer of observed ones (a.k.a. Restricted Boltzmann Machines).
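As an assumed toy illustration (not from the abstract): the Hadamard product of two tensors given in Hitchcock/CP form again admits a CP form whose factor columns are entry-wise products of the original factor columns, so its rank is at most the product of the two ranks. All dimensions below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, s = 3, 2, 2  # illustrative size and ranks

A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
D, E, F = (rng.standard_normal((n, s)) for _ in range(3))

T = np.einsum('ik,jk,lk->ijl', A, B, C)  # Hitchcock/CP form, rank <= r
S = np.einsum('ik,jk,lk->ijl', D, E, F)  # second CP tensor, rank <= s

H = T * S  # Hadamard (entry-wise) product

# H has a CP form with r*s terms whose factor columns are entry-wise
# products of the original columns (a transposed Khatri-Rao product).
def col_products(X, Y):
    return np.einsum('ik,il->ikl', X, Y).reshape(n, r * s)

H_check = np.einsum('ik,jk,lk->ijl',
                    col_products(A, D), col_products(B, E), col_products(C, F))
assert np.allclose(H, H_check)
```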

In this talk, after recalling some recent literature about these models, I will present a new generic identifiability result of such decompositions (under numerical assumptions on the number of states). Our approach also suggests an algorithm to compute a Hadamard-Hitchcock decomposition of a generic tensor.

This is a joint work with Nick Vannieuwenhoven (KU Leuven).

Nick Vannieuwenhoven

Title: Sensitivity of tensor decompositions


The tensor rank decomposition (CPD) expresses a tensor as a linear combination of elementary tensors. It has applications in chemometrics, computer science, machine learning, psychometrics, signal processing, and statistics. Its uniqueness properties render it suitable for data analysis tasks in which the elementary tensors are the quantities of interest. However, in applications, the idealized mathematical model is always corrupted by measurement errors. For a robust interpretation of the data, it is therefore imperative to quantify how sensitive these elementary tensors are to perturbations of the whole tensor. I will give an overview of recent results on the sensitivity of the tensor rank decomposition, including the matrix case, established with my collaborators Carlos Beltran, Paul Breiding, and Nick Dewaele.
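As a concrete reminder (an illustrative sketch, not from the talk), a rank-r CPD of a third-order tensor is a sum of r elementary (rank-1) outer products; the dimension and rank below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 2  # tensor dimension and rank (illustrative values)

# Factor matrices: column k of each holds one elementary (rank-1) term.
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))

# T = sum_k a_k (x) b_k (x) c_k -- the tensor rank decomposition (CPD).
T = np.einsum('ik,jk,lk->ijl', A, B, C)

# Each elementary tensor is the outer product of three factor columns.
T_check = sum(np.einsum('i,j,l->ijl', A[:, k], B[:, k], C[:, k])
              for k in range(r))
assert np.allclose(T, T_check)
```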

Cole Gigliotti

Title: Skewness (sometimes) Does Imply Causation


An oft-repeated phrase in lay statistics is that correlation is not causation. This is true; however, one can tease out underlying causation by studying the third-order moment tensor as well as the correlation matrix. In this short talk, we will see that the vanishing locus of polynomial equations in the coefficients of these two tensors will allow us to completely recover the causal graph underlying our data for the well-studied class of linear, non-Gaussian, acyclic causal models. In the first portion of the talk, we will set up the problem, which will include a short introduction to causal models. We will end by exhibiting a set of polynomials solving the problem, all of which happen to be minors of certain matrices with a graph-theoretic flavor.
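As a small illustration (assumed example, not from the talk): the two objects the abstract studies, the second-moment (correlation/covariance) matrix and the third-order moment tensor, can be estimated from centered samples. The toy data below follows a linear, non-Gaussian, acyclic model with a single edge; all names and coefficients are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10_000

# Toy model: x1 is non-Gaussian (centered exponential), x2 = 2*x1 + noise,
# i.e. a linear, non-Gaussian, acyclic model with the edge x1 -> x2.
x1 = rng.exponential(size=N) - 1.0
x2 = 2.0 * x1 + (rng.exponential(size=N) - 1.0)
X = np.column_stack([x1, x2])
X -= X.mean(axis=0)  # center the samples

M2 = np.einsum('ni,nj->ij', X, X) / N      # second-moment (covariance) matrix
M3 = np.einsum('ni,nj,nk->ijk', X, X, X) / N  # third-order moment tensor
```

Because the noise is skewed, M3 is nonzero, which is exactly what makes the causal direction identifiable in this model class.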

Edinah Koffi Gnang

Title: A combinatorial perspective on the tensor rank and the BM rank


We describe a combinatorial perspective on hypermatrix actions on vector spaces. We also discuss subtle differences which separate the tensor rank from the Bhattacharya-Mesner (BM) rank.

The talk is based on joint work with Rongyu Xu.

Pascal Schweizer

Title: Recent developments on the Weisfeiler-Leman Algorithm


The Weisfeiler-Leman algorithm is used in various contexts such as symmetry detection, isomorphism problems, and machine learning on graphs. I will give a gentle introduction to the algorithm and then detail recent results obtained with various coauthors.

Martin Kassabov

Title: Sometimes row reducing bus timetables can be useful


I will describe an almost “nonsensical” algorithm which resembles something generated by AI, i.e., taking a bunch of steps often used in data science and chaining them together in a seemingly random way, which surprisingly can be used to “simplify” some tensors by making their support smaller.

(This is joint work with James Wilson and Pete Brooksbank.)

Chris Liu

Title: QuickSylver - Fast solutions to Simultaneous Sylvester Systems


Solving the system of matrix equations (\forall i) XA_i + B_iY = C_i, also called a Simultaneous Sylvester System, is key to computing the adjoint and centroid of a tensor. Counting constraints, there are n^3 equations in 2n^2 variables. This suggests finding solutions in O(n^6) operations by solving the flattened matrix-vector system. But viewing the Simultaneous Sylvester System as a single tensor equation unlocks insights for an O(n^3) algorithm for finding solutions. We describe the key ideas and illustrate the benefit of reasoning with tensor network diagrams over flattened matrices.
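To make the setup concrete, here is a minimal NumPy sketch of the naive flattened approach the abstract contrasts against (dimensions, variable names, and the use of least squares are illustrative assumptions, not the authors' fast algorithm).

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 3, 2  # matrix size and number of equations (illustrative)

# Build a consistent instance: pick X, Y, then set C_i = X A_i + B_i Y.
A = rng.standard_normal((m, n, n))
B = rng.standard_normal((m, n, n))
X_true = rng.standard_normal((n, n))
Y_true = rng.standard_normal((n, n))
C = np.stack([X_true @ A[i] + B[i] @ Y_true for i in range(m)])

# Naive flattening: with row-major vec, vec(X A) = (I (x) A^T) vec(X) and
# vec(B Y) = (B (x) I) vec(Y).  Stacking all m equations gives an
# (m n^2) x (2 n^2) linear system -- the expensive route the abstract
# contrasts with its O(n^3) tensor-equation approach.
I = np.eye(n)
rows = [np.hstack([np.kron(I, A[i].T), np.kron(B[i], I)]) for i in range(m)]
M = np.vstack(rows)
rhs = np.concatenate([C[i].ravel() for i in range(m)])
sol, *_ = np.linalg.lstsq(M, rhs, rcond=None)
X_hat, Y_hat = sol[: n * n].reshape(n, n), sol[n * n:].reshape(n, n)

# The recovered pair solves every equation (solutions need not be unique).
assert all(np.allclose(X_hat @ A[i] + B[i] @ Y_hat, C[i], atol=1e-6)
           for i in range(m))
```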

This is in joint work with Josh Maglione and James Wilson.

Xiaorui Sun

Title: Faster Isomorphism for p-Groups of Class 2 and Exponent p

Austin Conner

Title: Border apolarity and fast matrix multiplication

Hirotachi Abo

Title: Algebro-geometric approaches to the tensor eigenproblem


Eigenvectors of tensors, extensions of eigenvectors of matrices, were introduced independently by L.-H. Lim and L. Qi in 2005 and have since been studied in numerical multilinear algebra. Recently, the concept of tensor eigenvectors has drawn attention from the algebraic geometry community because algebraic geometry has proven to provide useful techniques for the tensor eigenproblem. This talk aims to discuss algebro-geometric aspects of the spectral theory of tensors.
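As an illustrative sketch (an assumed example, not from the talk): for a symmetric order-3 tensor, a (Z-)eigenpair satisfies T x x = λ x with ||x|| = 1. For an orthogonally decomposable tensor, the simple tensor power iteration converges to a robust eigenpair; the sizes and eigenvalues below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

# An orthogonally decomposable symmetric tensor: T = sum_k lam_k v_k^{(x)3}
# with orthonormal v_k; its robust eigenpairs are exactly (lam_k, v_k).
V, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns
lams = np.array([3.0, 2.0, 1.5, 1.0])             # illustrative eigenvalues
T = np.einsum('k,ik,jk,lk->ijl', lams, V, V, V)

def txx(T, x):
    # (T x x)_i = sum_{j,k} T_ijk x_j x_k
    return np.einsum('ijk,j,k->i', T, x, x)

# Tensor power iteration: x <- T x x / ||T x x||.  For orthogonally
# decomposable tensors this converges to a robust eigenvector.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(50):
    y = txx(T, x)
    x = y / np.linalg.norm(y)
eigval = x @ txx(T, x)
# At convergence the eigenpair equation T x x = eigval * x holds.
```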

This work is licensed under a Creative Commons Attribution 4.0 International License

Additional Questions?

If you have additional questions, feel free to reach out to a maintainer / contributor on the contact page.

Conference Day 1 TAGA24

Conference Day 2 TAGA24
