Browsing by Author "TEH, Yee Whye"
Showing 2 of 2 results.
- A Bayesian Interpretation of Interpolated Kneser-Ney (2006-02-02) TEH, Yee Whye
  Interpolated Kneser-Ney is one of the best smoothing methods for n-gram language models. Previous explanations for its superiority have been based on intuitive and empirical justifications of specific properties of the method. We propose a novel interpretation of interpolated Kneser-Ney as approximate inference in a hierarchical Bayesian model consisting of Pitman-Yor processes. As opposed to past explanations, our interpretation can recover exactly the formulation of interpolated Kneser-Ney, and performs better than interpolated Kneser-Ney when a better inference procedure is used.
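The interpolated Kneser-Ney estimator this abstract refers to can be sketched for the bigram case as follows. This is a minimal illustrative implementation, not code from the paper; the discount value of 0.75 and the toy corpus interface are assumptions made for the example:

```python
from collections import Counter

def interpolated_kneser_ney(bigrams, discount=0.75):
    """Bigram interpolated Kneser-Ney probabilities (illustrative sketch).

    bigrams: list of observed (context, word) pairs from a corpus.
    Returns a function p(w, u) giving P_KN(w | u).
    """
    bigram_counts = Counter(bigrams)
    context_counts = Counter(u for u, _ in bigrams)
    bigram_types = set(bigrams)
    # Continuation count: in how many distinct contexts does each word appear?
    continuation = Counter(w for _, w in bigram_types)
    # How many distinct word types follow each context?
    follow_types = Counter(u for u, _ in bigram_types)
    total_types = len(bigram_types)

    def p(w, u):
        c_u = context_counts[u]
        if c_u == 0:
            # Unseen context: fall back to the continuation distribution.
            return continuation[w] / total_types
        # Absolute discounting of the raw bigram count.
        discounted = max(bigram_counts[(u, w)] - discount, 0.0) / c_u
        # Interpolation weight: probability mass freed by the discounting.
        lam = discount * follow_types[u] / c_u
        return discounted + lam * continuation[w] / total_types

    return p
```

For each context, the discounted terms plus the redistributed mass sum to one, which is the property the hierarchical Pitman-Yor interpretation reproduces exactly.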
- Semi-supervised Learning in Reproducing Kernel Hilbert Spaces Using Local Invariances (2006-03-14) LEE, Wee Sun; ZHANG, Xinhua; TEH, Yee Whye
  We propose a framework for semi-supervised learning in reproducing kernel Hilbert spaces using local invariances that explicitly characterize the behavior of the target function around both labeled and unlabeled data instances. Such invariances include: invariance to small changes to the data instances, invariance to averaging across a small neighbourhood around data instances, and invariance to local transformations such as translation and rotation. These invariances are approximated by minimizing loss functions on derivatives and local averages of the functions. We use a regularized cost function, consisting of the sum of loss functions penalized with the squared norm of the function, and give a representer theorem showing that an optimal function can be represented as a linear combination of a finite number of basis functions. For the representer theorem to hold, the derivatives and local averages are required to be bounded linear functionals in the reproducing kernel Hilbert space. We show that this is true in the reproducing kernel Hilbert spaces defined by Gaussian and polynomial kernels.
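The representer-theorem structure the abstract describes (optimal function = finite linear combination of kernel basis functions) can be illustrated in its simplest form with a squared-loss cost in the Gaussian-kernel RKHS. This sketch omits the paper's derivative and local-average invariance terms and reduces to plain kernel ridge regression; the function names and parameter choices are assumptions for the example:

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2), a kernel whose RKHS the
    # paper shows admits bounded derivative and local-average functionals.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rkhs_regularized_fit(X, y, lam=0.1, gamma=1.0):
    """Minimize sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2 over the RKHS.

    By the representer theorem the minimizer has the finite form
    f(x) = sum_i alpha_i k(x_i, x), so the infinite-dimensional problem
    collapses to a linear system in the coefficients alpha.
    """
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

    def f(X_new):
        # Evaluate the kernel expansion at new points.
        return gaussian_kernel(X_new, X, gamma) @ alpha

    return f
```

In the paper's full framework, the expansion would also include basis functions contributed by the derivative and local-average functionals evaluated at labeled and unlabeled instances, which is why their boundedness in the RKHS matters.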