
Semi-supervised Vector-valued Learning: Improved Bounds and Algorithms (1909.04883v4)

Published 11 Sep 2019 in cs.LG and stat.ML

Abstract: Vector-valued learning, where the output space admits a vector-valued structure, is an important problem that covers a broad family of domains, e.g., multi-task learning and transfer learning. Using local Rademacher complexity and unlabeled data, we derive novel semi-supervised excess risk bounds for general vector-valued learning from both the kernel and linear perspectives. The derived bounds are much sharper than existing ones, and the convergence rates are improved from the square root of the labeled sample size to the square root of the total sample size, or depend directly on the labeled sample size. Motivated by our theoretical analysis, we propose a general semi-supervised algorithm for efficiently learning vector-valued functions, incorporating both local Rademacher complexity and Laplacian regularization. Extensive experimental results illustrate that the proposed algorithm significantly outperforms the compared methods, which coincides with our theoretical findings.
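
To make the Laplacian-regularization ingredient concrete, below is a minimal sketch of a linear, Laplacian-regularized least-squares learner for vector-valued outputs: unlabeled inputs enter only through a graph Laplacian built over all data. This is an illustration under assumed choices (squared loss, RBF affinity, closed-form solve), not the paper's actual algorithm, which additionally uses local Rademacher complexity regularization.

```python
import numpy as np

def fit_laplacian_rls(X_lab, Y_lab, X_unlab, lam=1e-2, gamma=1e-2, sigma=1.0):
    """Semi-supervised, Laplacian-regularized least squares with vector-valued outputs.

    Solves  min_W ||X_lab W - Y_lab||_F^2 + lam ||W||_F^2 + gamma tr(W^T X^T L X W),
    where X stacks labeled and unlabeled inputs and L is an (unnormalized) graph
    Laplacian built from an RBF affinity over all inputs.  This is a sketch; the
    loss, affinity, and hyperparameters are assumptions, not the paper's method.
    """
    X = np.vstack([X_lab, X_unlab])                       # all inputs, labeled first
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    S = np.exp(-sq_dists / (2.0 * sigma ** 2))            # RBF affinity matrix
    np.fill_diagonal(S, 0.0)
    L = np.diag(S.sum(axis=1)) - S                        # unnormalized graph Laplacian

    d = X.shape[1]
    A = X_lab.T @ X_lab + lam * np.eye(d) + gamma * (X.T @ L @ X)
    W = np.linalg.solve(A, X_lab.T @ Y_lab)               # d x (output dimension)
    return W

# Toy usage: 2-dimensional (vector-valued) outputs, few labeled and many unlabeled points.
rng = np.random.default_rng(0)
X_lab = rng.normal(size=(20, 5))
Y_lab = X_lab @ rng.normal(size=(5, 2))                   # linear ground truth, 2 outputs
X_unlab = rng.normal(size=(200, 5))
W = fit_laplacian_rls(X_lab, Y_lab, X_unlab)
Y_pred = X_lab @ W
```

The unlabeled sample only affects the penalty term, which is what allows the excess risk to improve with the total sample size rather than the labeled sample size alone.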

