Tensor-Based Least-Squares Solutions for Multirelational Signals and Applications
Publication Date
5-1-2024
Document Type
Article
Publication Title
IEEE Transactions on Cybernetics
Volume
54
Issue
5
DOI
10.1109/TCYB.2023.3265279
First Page
2852
Last Page
2865
Abstract
The least-squares (LS) approach is widely adopted for linear regression analysis, as it yields the solution to an arbitrary critically-, over-, or under-determined system. Such linear regression analysis can readily be applied to linear estimation and equalization in signal processing for cybernetics. Nonetheless, the existing LS approach is limited by the dimensionality of the data; that is, the exact LS solution can involve only a data matrix. As the dimension of the data increases and the data need to be represented by a tensor, the corresponding exact tensor-based LS (TLS) solution does not exist because of the lack of a pertinent mathematical framework. Recently, alternatives such as tensor decomposition and tensor unfolding were proposed to approximate the TLS solutions to linear regression problems involving tensor data, but these techniques cannot provide the exact, or true, TLS solution. In this work, we make the first attempt to present a new mathematical framework that facilitates exact TLS solutions involving tensor data. To demonstrate the applicability of the proposed scheme, numerical experiments on machine learning and robust speech recognition are presented, and the associated memory and computational complexities are also studied.
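For context, below is a minimal sketch of the two baselines the abstract contrasts with the exact TLS framework: an ordinary matrix-based LS solve via the pseudoinverse, and an unfolding-based approximation for tensor-valued data. The array shapes, variable names, and the choice of mode-1 unfolding are illustrative assumptions; the paper's exact TLS solution is not reproduced here.

```python
import numpy as np

# Conventional matrix-based LS: solve min_x ||A x - b||_2 for a possibly
# over- or under-determined system via the Moore-Penrose pseudoinverse.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))   # data matrix (observations x features)
b = rng.standard_normal(100)         # measurements
x_ls = np.linalg.pinv(A) @ b         # exact matrix LS solution

# Unfolding-based approximation for tensor data (the kind of baseline the
# abstract notes cannot yield the exact TLS solution): flatten a 3-way data
# tensor along its first mode and fall back on the matrix LS machinery.
T = rng.standard_normal((100, 4, 5))     # hypothetical 3-way data tensor
A_unfolded = T.reshape(T.shape[0], -1)   # mode-1 unfolding -> 100 x 20 matrix
x_approx = np.linalg.lstsq(A_unfolded, b, rcond=None)[0]

print(x_ls.shape, x_approx.shape)  # (10,), (20,)
```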
Funding Number
LEQSF(2021-22)-RD-A-34
Keywords
High-dimensional linear regression, joint polynomials-fitting, robust speech recognition, tensor inverse, tensor-based least-squares (TLS) methods
Department
Applied Data Science
Recommended Citation
Shih Yu Chang and Hsiao Chun Wu, "Tensor-Based Least-Squares Solutions for Multirelational Signals and Applications," IEEE Transactions on Cybernetics 54, no. 5 (2024): 2852-2865. https://doi.org/10.1109/TCYB.2023.3265279