
r-softmax: Generalized Softmax with Controllable Sparsity Rate (2304.05243v3)

Published 11 Apr 2023 in cs.LG

Abstract: Artificial neural network models now achieve remarkable results in many disciplines. Functions mapping a model's representation to a probability distribution are an inseparable part of deep learning solutions. Although softmax is the commonly accepted probability mapping function in the machine learning community, it cannot return sparse outputs and always assigns positive probability to every position. In this paper, we propose r-softmax, a modification of softmax that outputs a sparse probability distribution with a controllable sparsity rate. In contrast to existing sparse probability mapping functions, we provide an intuitive mechanism for controlling the output sparsity level. We show on several multi-label datasets that r-softmax outperforms other sparse alternatives to softmax and is highly competitive with the original softmax. We also apply r-softmax to the self-attention module of a pre-trained transformer language model and demonstrate that this leads to improved performance when fine-tuning the model on different natural language processing tasks.
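To make the idea of a controllable sparsity rate concrete, here is a minimal illustrative sketch of one way such a mapping can behave. This is an assumption-laden toy, not the paper's exact r-softmax construction: given a sparsity rate r in [0, 1), it keeps only the top (1 - r) fraction of logits and applies a numerically stabilized softmax over them, so exactly the bottom r-fraction of positions receives zero probability.

```python
import math

def r_softmax_sketch(logits, r):
    """Toy sparse probability mapping with sparsity rate r in [0, 1).

    Hypothetical illustration (not the paper's definition): zero out the
    bottom r-fraction of logits, then softmax over the remaining entries.
    """
    n = len(logits)
    # number of positions allowed to stay positive; at least one survives
    k = max(1, math.ceil((1 - r) * n))
    # indices of the k largest logits
    kept = sorted(range(n), key=lambda i: logits[i], reverse=True)[:k]
    # subtract the max over kept entries to stabilize the exponentials
    m = max(logits[i] for i in kept)
    exps = {i: math.exp(logits[i] - m) for i in kept}
    z = sum(exps.values())
    # positions outside `kept` get exactly zero probability
    return [exps[i] / z if i in exps else 0.0 for i in range(n)]
```

With r = 0 this reduces to an ordinary softmax (every position positive); raising r drives more entries to exact zero while the survivors are renormalized to sum to one. The paper's actual r-softmax provides a differentiable mechanism for this trade-off, which the hard top-k cut above does not capture.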

Authors (4)
  1. Klaudia Bałazy (8 papers)
  2. Łukasz Struski (37 papers)
  3. Marek Śmieja (48 papers)
  4. Jacek Tabor (106 papers)
Citations (2)
