09/07/2020

A Nearly Optimal Variant of the Perceptron Algorithm for the Uniform Distribution on the Unit Sphere

Marco Schmalhofer

Keywords: Supervised learning, Halfspace Learning

Abstract: We present a simple perceptron-like algorithm that learns origin-centered halfspaces in $\mathbb{R}^n$ with accuracy $1-\epsilon$ and confidence $1-\delta$ in time $\mathcal{O}\left(\frac{n^2}{\epsilon}\left(\log \frac{1}{\epsilon}+\log \frac{1}{\delta}\right)\right)$ using $\mathcal{O}\left(\frac{n}{\epsilon}\left(\log \frac{1}{\epsilon}+\log \frac{1}{\delta}\right)\right)$ labeled examples drawn uniformly from the unit $n$-sphere. This improves upon the algorithms given in \cite{DBLP:journals/neco/Baum90}, \cite{DBLP:journals/ipl/Long94} and \cite{DBLP:conf/colt/Servedio99}. The time and sample complexity of our algorithm match the lower bounds given in \cite{DBLP:journals/tnn/Long95} up to logarithmic factors.
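The abstract does not spell out the update rule of the paper's variant. For orientation only, the sketch below shows the classic perceptron baseline for origin-centered halfspaces on examples drawn uniformly from the unit sphere; the function names (sample_unit_sphere, perceptron_halfspace) and all parameters are illustrative assumptions and do not reproduce the paper's algorithm or its complexity guarantees.

```python
import numpy as np

def sample_unit_sphere(n, m, rng):
    """Draw m points uniformly from the unit sphere in R^n."""
    x = rng.standard_normal((m, n))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def perceptron_halfspace(X, y, passes=10):
    """Classic perceptron updates for an origin-centered halfspace sign(<w, x>).

    X: (m, n) examples on the unit sphere, y: labels in {-1, +1}.
    Returns a weight vector w, the learned normal of the halfspace.
    (Baseline sketch only -- not the paper's variant.)
    """
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(passes):
        for i in range(m):
            if y[i] * np.dot(w, X[i]) <= 0:  # misclassified (or on the boundary)
                w += y[i] * X[i]             # standard additive perceptron update
    return w

# Usage: recover a random target halfspace from uniform sphere samples.
rng = np.random.default_rng(0)
n, m = 20, 5000
w_star = sample_unit_sphere(n, 1, rng)[0]      # hidden target normal
X = sample_unit_sphere(n, m, rng)
y = np.sign(X @ w_star)
w_hat = perceptron_halfspace(X, y)
X_test = sample_unit_sphere(n, 2000, rng)
err = np.mean(np.sign(X_test @ w_hat) != np.sign(X_test @ w_star))
print(f"test error: {err:.3f}")
```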

