
Improve Language Modelling for Code Completion through Statement Level Language Model based on Statement Embedding Generated by BiLSTM (1909.11503v2)

Published 25 Sep 2019 in cs.SE

Abstract: Language models such as RNNs, LSTMs, and their variants have been widely used as generative models in natural language processing. In the last few years, treating source code as natural language, parsing it into a token sequence, and training a language model such as an LSTM on that sequence have been the state-of-the-art approach to building a generative model for code completion. However, for source code with hundreds of statements, traditional LSTM models and attention-based LSTM models fail to capture the long-term dependencies of source code. In this paper, we propose a novel statement-level language model (SLM) that uses a BiLSTM to generate an embedding for each statement. A standard LSTM is adopted in SLM to iterate over and accumulate the embeddings of the statements in context to help predict the next token. A statement-level attention mechanism is also adopted in the model. The proposed SLM targets token-level code completion. Experiments on inner-project and cross-project data sets indicate that the proposed statement-level language model with attention (SLM) outperforms all other state-of-the-art models on the token-level code completion task.
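The abstract describes a hierarchical data flow: a BiLSTM encodes each statement into one vector, a standard LSTM accumulates those statement embeddings across the context, and statement-level attention weights the accumulated states when predicting the next token. The toy sketch below illustrates only that data flow, not the paper's actual model: the BiLSTM and LSTM are replaced by trivial stand-ins (an embedding average and a tanh accumulator), and all names, dimensions, and vocabulary entries are illustrative assumptions.

```python
import math
import random

random.seed(0)
DIM = 8  # illustrative embedding size, not from the paper


def rand_vec(dim):
    return [random.uniform(-0.1, 0.1) for _ in range(dim)]


# Hypothetical token embedding table (a real model would learn these).
vocab = {tok: rand_vec(DIM) for tok in ["int", "x", "=", "0", "+", "1", ";"]}


def encode_statement(tokens):
    """Stand-in for the paper's BiLSTM statement encoder:
    average the token embeddings to get one vector per statement."""
    vecs = [vocab.get(t, [0.0] * DIM) for t in tokens]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]


def rnn_step(state, x):
    """Stand-in for the standard LSTM that iterates over and
    accumulates statement embeddings: a simple tanh accumulator."""
    return [math.tanh(s + xi) for s, xi in zip(state, x)]


def attend(context_states, query):
    """Statement-level attention: softmax over dot products with the
    query state, then a weighted sum of the context states."""
    scores = [sum(s[i] * query[i] for i in range(DIM)) for s in context_states]
    m = max(scores)
    exps = [math.exp(sc - m) for sc in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(w * s[i] for w, s in zip(weights, context_states))
            for i in range(DIM)]


# Two toy statements of context.
statements = [["int", "x", "=", "0", ";"],
              ["x", "=", "x", "+", "1", ";"]]

state = [0.0] * DIM
states = []
for st in statements:
    state = rnn_step(state, encode_statement(st))
    states.append(state)

# Attention-weighted context vector; a real SLM would feed this
# (plus the token-level state) into a softmax over the vocabulary.
context = attend(states, states[-1])
print(len(context))  # prints 8
```

In the actual model the two stand-ins are learned recurrent networks and the context vector conditions a token-level prediction head; the sketch only shows how statement embeddings, accumulation, and attention compose.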

