Junkyu Lee, Michael Katz, et al.
NeurIPS 2023
We propose and investigate two new methods to approximate f(A)b for large, sparse, Hermitian matrices A. Computations of this form play an important role in numerous signal processing and machine learning tasks. The main idea behind both methods is to first estimate the spectral density of A, and then find polynomials of a fixed order that better approximate the function f on areas of the spectrum with a higher density of eigenvalues. Compared to state-of-the-art methods such as the Lanczos method and truncated Chebyshev expansion, the proposed methods tend to provide more accurate approximations of f(A)b at lower polynomial orders, and for matrices A with a large number of distinct interior eigenvalues and a small spectral width. We also explore the application of these techniques to (i) fast estimation of the norms of localized graph spectral filter dictionary atoms, and (ii) fast filtering of time-vertex signals.
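The recipe described in this abstract (estimate the spectral density of A, fit a fixed-order polynomial to f with more weight where eigenvalues cluster, then apply the polynomial to b using only matrix-vector products) can be illustrated with a minimal sketch. This is not the paper's implementation: the Lanczos-based density surrogate, the weighted least-squares fit, and helper names such as lanczos_ritz and spectrum_adapted_poly_fAb are assumptions chosen for illustration only.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def lanczos_ritz(A, m=60, seed=0):
    """Plain Lanczos (no reorthogonalisation) from a random start vector.
    The Ritz values of the tridiagonal matrix T spread over the spectrum
    of A and serve here as a crude spectral-density surrogate."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_prev, beta = np.zeros(n), 0.0
    alphas, betas = [], []
    for _ in range(min(m, n)):
        w = A @ q - beta * q_prev
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:
            break
        q_prev, q = q, w / beta
    T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
    return np.linalg.eigvalsh(T)


def spectrum_adapted_poly_fAb(A, b, f, order=10, num_nodes=200):
    """Approximate f(A) b with a degree-`order` polynomial fitted to f by
    weighted least squares, where the weights follow an estimated spectral
    density of A; the polynomial is applied with mat-vecs only."""
    ritz = lanczos_ritz(A)
    # Histogram of Ritz values stands in for the spectral density.
    hist, edges = np.histogram(ritz, bins=20, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    nodes = np.linspace(edges[0], edges[-1], num_nodes)
    density = np.interp(nodes, centers, hist) + 1e-3  # keep weights positive
    # Weighted LS fit: residuals are scaled by sqrt(density), so squared
    # errors are weighted by the estimated eigenvalue density.
    coeffs = np.polynomial.polynomial.polyfit(nodes, f(nodes), order,
                                              w=np.sqrt(density))
    # Horner evaluation of p(A) b: y = c_K b, then y <- A y + c_k b.
    y = coeffs[-1] * b
    for c in coeffs[-2::-1]:
        y = A @ y + c * b
    return y


if __name__ == "__main__":
    # Small sparse Hermitian test matrix: 1-D discrete Laplacian.
    n = 500
    main, off = 2.0 * np.ones(n), -1.0 * np.ones(n - 1)
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
    b = np.random.default_rng(1).standard_normal(n)

    approx = spectrum_adapted_poly_fAb(A, b, np.exp, order=12)
    exact = spla.expm_multiply(A, b)  # reference for f = exp
    print("relative error:",
          np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```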
Michael Hersche, Geethan Karunaratne, et al.
CVPR 2022
Tengfei Ma, Patrick Ferber, et al.
AAAI 2020
Ngoc Lan Hoang, Alexander Zadorojniy
INFORMS 2022