
Localized Linear Regression in Networked Data (1903.11178v2)

Published 26 Mar 2019 in cs.LG and stat.ML

Abstract: The network Lasso (nLasso) has been proposed recently as an efficient learning algorithm for massive networked data sets (big data over networks). It extends the well-known least absolute shrinkage and selection operator (Lasso) from learning sparse (generalized) linear models to network models. Efficient implementations of the nLasso have been obtained using convex optimization methods that lend themselves to scalable message passing protocols. In this paper, we analyze the statistical properties of nLasso when applied to localized linear regression problems involving networked data. Our main result is a sufficient condition on the network structure and available label information such that nLasso accurately learns a localized linear regression model from a few labeled data points. We also provide an implementation of nLasso for localized linear regression by specializing a primal-dual method for solving the convex (non-smooth) nLasso problem.
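The abstract's central idea can be illustrated with a minimal sketch: nLasso fits one regression weight per network node, couples neighboring nodes through a total-variation penalty on the edges, and is solved with a primal-dual (Chambolle-Pock-style) iteration. The graph, the labeled nodes, and the regularization strength below are illustrative assumptions, not the paper's exact setup, and the node weights are scalars rather than full linear models for brevity.

```python
# Hedged sketch of the nLasso idea: localized (per-node) regression with a
# TV penalty on edges, solved by a Chambolle-Pock primal-dual iteration.
# Graph, labels, lambda, and step sizes are illustrative assumptions.
import numpy as np

# Chain graph with 6 nodes; edges as (i, j) pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
n_nodes, n_edges = 6, len(edges)

# Incidence matrix D: row e has +1 at one endpoint, -1 at the other,
# so (D @ w)[e] = w_i - w_j is the disagreement across edge e.
D = np.zeros((n_edges, n_nodes))
for e, (i, j) in enumerate(edges):
    D[e, i], D[e, j] = 1.0, -1.0

# Labels available only at a few nodes ("a few labeled data points").
labels = {0: 0.0, 1: 0.0, 4: 1.0, 5: 1.0}
mask, y = np.zeros(n_nodes), np.zeros(n_nodes)
for i, v in labels.items():
    mask[i], y[i] = 1.0, v

lam = 0.2    # strength of the edge (TV) penalty lam * sum_e |w_i - w_j|
step = 0.45  # tau = sigma; valid since tau * sigma * ||D||^2 < 1 here

w = np.zeros(n_nodes)   # primal variable: one weight per node
u = np.zeros(n_edges)   # dual variable: one multiplier per edge
w_bar = w.copy()
for _ in range(2000):
    # Dual step: gradient ascent, then projection onto [-lam, lam]
    # (the prox of the conjugate of the TV penalty).
    u = np.clip(u + step * D @ w_bar, -lam, lam)
    # Primal step: gradient descent on the coupling term, then the prox
    # of the squared loss, which only acts at labeled nodes.
    w_new = w - step * D.T @ u
    w_new = np.where(mask > 0, (w_new + step * y) / (1 + step), w_new)
    # Over-relaxation as required by the Chambolle-Pock scheme.
    w_bar = 2 * w_new - w
    w = w_new

print(np.round(w, 2))  # labeled ends settle near 0.1 and 0.9
```

The per-iteration operations touch only a node's incident edges, which is what makes the method amenable to the scalable message passing implementations the abstract mentions.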

Authors (2)
  1. Alexander Jung (56 papers)
  2. Nguyen Tran (7 papers)
Citations (27)
