TACO: Topics in Algorithmic COde generation dataset (2312.14852v3)
Abstract: We introduce TACO, an open-source, large-scale code generation dataset focused on algorithmic topics, designed to provide a more challenging training dataset and evaluation benchmark for code generation models. TACO consists of competition-level programming problems, selected for their difficulty, to enhance and evaluate problem understanding and reasoning abilities in real-world programming scenarios. The training and test sets contain 25,433 and 1,000 coding problems, respectively, with up to 1.55 million diverse solution answers. Moreover, each TACO problem carries several fine-grained labels, such as task topics, algorithms, programming skills, and difficulty level, providing a more precise reference for training and evaluating code generation models. The dataset and evaluation scripts are available on the Hugging Face Hub (https://huggingface.co/datasets/BAAI/TACO) and GitHub (https://github.com/FlagOpen/TACO).
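The fine-grained labels described above could be used to slice the dataset by algorithm or difficulty. A minimal sketch of that kind of filtering is below; the field names ("tags", "difficulty") and the mocked records are assumptions for illustration, not the dataset's confirmed schema, and the real data would be loaded from the Hugging Face Hub (BAAI/TACO) rather than constructed inline.

```python
def filter_problems(problems, tag=None, difficulty=None):
    """Return problems matching an algorithm tag and/or a difficulty level."""
    out = []
    for p in problems:
        if tag is not None and tag not in p.get("tags", []):
            continue
        if difficulty is not None and p.get("difficulty") != difficulty:
            continue
        out.append(p)
    return out


# Mocked records standing in for TACO entries (schema is assumed).
problems = [
    {"question": "Shortest path in a grid", "tags": ["graphs", "bfs"], "difficulty": "MEDIUM"},
    {"question": "Count inversions", "tags": ["divide and conquer"], "difficulty": "HARD"},
    {"question": "Two-sum", "tags": ["hash table"], "difficulty": "EASY"},
]

print(len(filter_problems(problems, tag="graphs")))       # 1
print(len(filter_problems(problems, difficulty="HARD")))  # 1
```

In practice, such label-based subsets make it possible to train or evaluate a model on a single algorithm family (e.g. graph problems) or a single difficulty tier, which is the kind of precise reference the abstract describes.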