Semi-supervised Learning in Reproducing Kernel Hilbert Spaces Using Local Invariances
dc.contributor.author | LEE, Wee Sun | en_US |
dc.contributor.author | ZHANG, Xinhua | en_US |
dc.contributor.author | TEH, Yee Whye | en_US |
dc.date.accessioned | 2006-03-14T02:03:37Z | en_US |
dc.date.accessioned | 2017-01-23T07:00:00Z | |
dc.date.available | 2006-03-14T02:03:37Z | en_US |
dc.date.available | 2017-01-23T07:00:00Z | |
dc.date.issued | 2006-03-14T02:03:37Z | en_US |
dc.description.abstract | We propose a framework for semi-supervised learning in reproducing kernel Hilbert spaces using local invariances that explicitly characterize the behavior of the target function around both labeled and unlabeled data instances. Such invariances include invariance to small changes in the data instances, invariance to averaging over a small neighborhood around data instances, and invariance to local transformations such as translation and rotation. These invariances are enforced approximately by minimizing loss functions on derivatives and local averages of the function. We use a regularized cost function, consisting of the sum of these loss functions plus a penalty on the squared norm of the function, and give a representer theorem showing that an optimal function can be represented as a linear combination of a finite number of basis functions. For the representer theorem to hold, the derivatives and local averages must be bounded linear functionals in the reproducing kernel Hilbert space. We show that this is the case for the reproducing kernel Hilbert spaces defined by Gaussian and polynomial kernels. | en_US |
dc.format.extent | 715690 bytes | en_US |
dc.format.mimetype | application/pdf | en_US |
dc.identifier.uri | https://dl.comp.nus.edu.sg/xmlui/handle/1900.100/1914 | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartofseries | TRB3/06 | en_US |
dc.title | Semi-supervised Learning in Reproducing Kernel Hilbert Spaces Using Local Invariances | en_US |
dc.type | Technical Report | en_US |
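To make the regularized cost in the abstract concrete, below is a minimal numerical sketch of the kind of objective described: a squared loss on labeled points, an invariance penalty at unlabeled points, and a squared RKHS-norm penalty, with the function expanded over a finite set of Gaussian-kernel basis functions as the representer theorem allows. This is not the report's exact formulation: the derivative penalty is approximated here by finite differences rather than exact derivative functionals, and the kernel bandwidth sigma and the weights lam, gamma, eps are illustrative choices, not values from the report.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def fit_invariant_krr(X_lab, y, X_unlab, sigma=1.0, lam=1e-2, gamma=1e-1, eps=1e-2):
    """Regularized least squares with an invariance penalty (illustrative sketch).

    Objective, in the coefficients a of f(x) = sum_k a_k K(b_k, x):
        ||K_LB a - y||^2  +  gamma * ||D a||^2  +  lam * a^T K_BB a
    where the rows of D are finite-difference approximations of the partial
    derivatives of f at the unlabeled points, so the middle term penalizes
    sensitivity of f to small changes in the unlabeled instances.
    """
    d = X_lab.shape[1]
    # Perturbed copies of each unlabeled point along every coordinate axis.
    X_pert = (X_unlab[:, None, :] + eps * np.eye(d)[None, :, :]).reshape(-1, d)
    # Finite set of basis points: labeled, unlabeled, and perturbed points.
    B = np.vstack([X_lab, X_unlab, X_pert])

    K_BB = gaussian_kernel(B, B, sigma)        # RKHS norm term:  a^T K_BB a
    K_LB = gaussian_kernel(X_lab, B, sigma)    # evaluations at labeled points
    K_UB = gaussian_kernel(X_unlab, B, sigma)  # evaluations at unlabeled points
    K_PB = gaussian_kernel(X_pert, B, sigma)   # evaluations at perturbed points

    # Finite-difference approximation of the partial derivatives of f at each
    # unlabeled point: (f(x + eps * e_k) - f(x)) / eps, for every axis e_k.
    D = (K_PB - np.repeat(K_UB, d, axis=0)) / eps

    A = K_LB.T @ K_LB + gamma * D.T @ D + lam * K_BB
    A += 1e-10 * np.eye(len(B))                # jitter for numerical stability
    alpha = np.linalg.solve(A, K_LB.T @ y)
    return alpha, B

# Toy usage: a few labeled points, many unlabeled ones.
rng = np.random.default_rng(0)
X_lab = rng.normal(size=(10, 2))
y = np.sin(X_lab[:, 0])
X_unlab = rng.normal(size=(200, 2))
alpha, B = fit_invariant_krr(X_lab, y, X_unlab)
X_test = rng.normal(size=(5, 2))
predictions = gaussian_kernel(X_test, B) @ alpha
```

The same template would accommodate the local-averaging invariance mentioned in the abstract by replacing the finite-difference rows of D with averages of kernel evaluations over a small neighborhood of each unlabeled point.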