* Sparse Convolution: Computing the convolution of two vectors is a fundamental algorithmic primitive. In the *sparse* convolution problem we assume that the input and output vectors have at most $t$ nonzero entries, and the goal is to design algorithms whose running times depend on $t$ rather than on the vector length $n$. For the special case where all entries are nonnegative, which is particularly important for algorithm design, it has been known for twenty years that sparse convolutions can be computed in near-linear randomized time $O(t \log^2 n)$. In this thesis we develop a randomized algorithm with running time $O(t \log t)$, which is *optimal* (under some mild assumptions), and the first near-linear *deterministic* algorithm for sparse nonnegative convolution. We also present an application of these results, leading to seemingly unrelated fine-grained lower bounds against distance oracles in graphs.
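  To make the primitive concrete, here is a minimal sketch (not the thesis algorithm): sparse vectors are represented as dictionaries mapping indices to their nonzero entries, and the convolution is computed by the naive pairwise method in $O(t^2)$ time; the thesis results reduce this to near-linear in $t$.

  ```python
  def sparse_convolution(a, b):
      # a, b: dicts mapping index -> nonzero entry.
      # Computes c[k] = sum over i + j = k of a[i] * b[j],
      # touching only the O(t^2) pairs of nonzero entries.
      c = {}
      for i, ai in a.items():
          for j, bj in b.items():
              c[i + j] = c.get(i + j, 0) + ai * bj
      return c

  # Example: (1 + x^5) * (2 + x^5) = 2 + 3x^5 + x^10
  print(sparse_convolution({0: 1, 5: 1}, {0: 2, 5: 1}))
  # -> {0: 2, 5: 3, 10: 1}
  ```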
* Sublinear Edit Distance: The edit distance of two strings is a well-studied similarity measure with numerous applications in computational biology. While computing the edit distance exactly provably requires quadratic time, a long line of research has led to a constant-factor approximation algorithm in almost-linear time. Perhaps surprisingly, when the edit distance is $k$, it is also possible to approximate it within a large factor $O(k)$ in *sublinear time* $\widetilde{O}(\frac nk + k^{O(1)})$. We drastically improve the approximation factor of the known sublinear algorithms from $O(k)$ to $k^{o(1)}$ while preserving the $\widetilde{O}(\frac nk + k^{O(1)})$ running time.
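  For reference, the exact quadratic-time baseline that the approximation algorithms above circumvent is the classic dynamic program; a minimal single-row sketch:

  ```python
  def edit_distance(s, t):
      # Classic dynamic program in O(|s| * |t|) time: dp[j] holds the
      # edit distance between the current prefix of s and t[:j],
      # updated row by row using one array plus one carried value.
      m = len(t)
      dp = list(range(m + 1))
      for i, cs in enumerate(s, 1):
          prev, dp[0] = dp[0], i
          for j, ct in enumerate(t, 1):
              prev, dp[j] = dp[j], min(dp[j] + 1,        # delete from s
                                       dp[j - 1] + 1,    # insert into s
                                       prev + (cs != ct))  # substitute/match
      return dp[m]

  print(edit_distance("kitten", "sitting"))  # -> 3
  ```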