A Hybrid Instance-based Transfer Learning Method (1812.01063v1)

Published 3 Dec 2018 in cs.LG, cs.AI, cs.CV, and stat.ML

Abstract: In recent years, supervised machine learning models have demonstrated tremendous success in a variety of application domains. Despite the promising results, these successful models are data-hungry and their performance relies heavily on the size of training data. However, in many healthcare applications it is difficult to collect sufficiently large training datasets. Transfer learning can help overcome this issue by transferring the knowledge from readily available datasets (source) to a new dataset (target). In this work, we propose a hybrid instance-based transfer learning method that outperforms a set of baselines including state-of-the-art instance-based transfer learning approaches. Our method uses a probabilistic weighting strategy to fuse information from the source domain to the model learned in the target domain. Our method is generic, applicable to multiple source domains, and robust with respect to negative transfer. We demonstrate the effectiveness of our approach through extensive experiments for two different applications.
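The abstract describes probabilistically weighting source-domain instances when training a target-domain model. The sketch below illustrates the general family of instance-weighted transfer learning the paper builds on, not the authors' specific method: a domain classifier scores how target-like each source instance is, and those scores become sample weights for the task model. All data and model choices here are hypothetical stand-ins.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic source/target data (stand-ins for real source/target datasets).
# The target domain is shifted relative to the source to mimic domain gap.
X_src = rng.normal(0.0, 1.0, size=(200, 5))
y_src = (X_src[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)
X_tgt = rng.normal(0.5, 1.0, size=(20, 5))
y_tgt = (X_tgt[:, 0] > 0.5).astype(int)

# Step 1: train a domain classifier to distinguish source (0) from target (1).
X_dom = np.vstack([X_src, X_tgt])
y_dom = np.concatenate([np.zeros(len(X_src)), np.ones(len(X_tgt))])
dom_clf = LogisticRegression().fit(X_dom, y_dom)

# Step 2: weight each source instance by its estimated probability of
# resembling the target domain; dissimilar instances get low weight,
# which helps guard against negative transfer.
w_src = dom_clf.predict_proba(X_src)[:, 1]

# Step 3: train the task model on pooled data, with source instances
# down-weighted and target instances at full weight.
X_all = np.vstack([X_src, X_tgt])
y_all = np.concatenate([y_src, y_tgt])
w_all = np.concatenate([w_src, np.ones(len(X_tgt))])
clf = LogisticRegression().fit(X_all, y_all, sample_weight=w_all)

print("target accuracy:", clf.score(X_tgt, y_tgt))
```

With multiple source domains, the same weighting step can be applied per source dataset before pooling, which is one way such methods stay robust to unhelpful sources.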

Authors (7)
  1. Azin Asgarian (7 papers)
  2. Parinaz Sobhani (6 papers)
  3. Ji Chao Zhang (2 papers)
  4. Madalin Mihailescu (1 paper)
  5. Ariel Sibilia (3 papers)
  6. Ahmed Bilal Ashraf (3 papers)
  7. Babak Taati (27 papers)
Citations (16)
