Abstract:
Geometric representation learning has shown great promise for important tasks in artificial intelligence and machine learning. However, how to integrate non-Euclidean representations with standard machine learning methods remains an open problem. In this work, we consider the task of regression onto hyperbolic space, for which we propose two approaches: a non-parametric kernel method, for which we also prove excess risk bounds, and a parametric deep learning model informed by the geodesics of the target space. By recasting predictions on trees as manifold regression problems, we demonstrate the applications of our approach on two challenging tasks: 1) hierarchical classification via label embeddings and 2) inventing new concepts by predicting their embeddings in a continuous representation of a base taxonomy. In our experiments, we find that the proposed estimators outperform their naive counterparts that perform regression in the ambient Euclidean space.