In the numerical linear algebra community, it is thought that to obtain nearly-optimal bounds for various problems such as rank computation, finding a maximal linearly independent subset of columns, regression, low-rank approximation, maximum matching on general graphs, and linear matroid union, one would need to resolve the logarithmic factors in the sketching dimension of existing constant-factor approximation oblivious subspace embeddings. We show how to bypass this question using a refined sketching technique, and obtain optimal or nearly optimal bounds for these problems. In particular, for constant-factor regression and low-rank approximation, we give the first optimal algorithms for the current matrix multiplication exponent.
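To make the sketch-and-solve paradigm behind oblivious subspace embeddings concrete, here is a minimal Python sketch of CountSketch-style sparse embedding applied to least-squares regression. This is an illustrative assumption, not the paper's refined sketching technique: the `countsketch` helper, the sketch size `m = 2000`, and the synthetic data are chosen only for demonstration.

```python
import numpy as np

def countsketch(A, m, rng):
    """Apply a CountSketch matrix S (m rows) to A.

    Each input row is hashed to one of m buckets with a random sign,
    so S @ A costs time proportional to the number of nonzeros of A.
    """
    n = A.shape[0]
    rows = rng.integers(0, m, size=n)            # bucket for each input row
    signs = rng.choice([-1.0, 1.0], size=n)      # random sign for each input row
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, rows, signs[:, None] * A)      # signed bucket-wise sums
    return SA

# Sketch-and-solve: solve min_x ||S(Ax - b)|| in place of min_x ||Ax - b||.
rng = np.random.default_rng(0)
n, d = 10_000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

m = 2000                                         # illustrative sketch size
SAb = countsketch(np.hstack([A, b[:, None]]), m, rng)
SA, Sb = SAb[:, :d], SAb[:, d]

x_sketch, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Ratio close to 1 indicates the sketched solution is a good approximation.
print(np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b))
```

The point of the example is the cost profile: forming the sketch takes time proportional to the nonzeros of A, after which the regression is solved on a much smaller m-by-d problem; the paper's contribution concerns removing the extra logarithmic factors in the required sketch size m.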

Author(s) : Nadiia Chepurko, Kenneth L. Clarkson, Praneeth Kacham, David P. Woodruff

Links : PDF - Abstract


Keywords : optimal - matrix - current - multiplication - exponent
