
Automatic Generation of German Drama Texts Using Fine Tuned GPT-2 Models (2301.03119v2)

Published 8 Jan 2023 in cs.CL

Abstract: This study is devoted to the automatic generation of German drama texts. We suggest an approach consisting of two key steps: fine-tuning a GPT-2 model (the outline model) to generate outlines of scenes based on keywords, and fine-tuning a second model (the generation model) to generate scenes from those outlines. The input for the neural models comprises two datasets: the German Drama Corpus (GerDraCor) and the German Text Archive (Deutsches Textarchiv, DTA). To estimate the effectiveness of the proposed method, our models are compared with baseline GPT-2 models. Our models perform well according to automatic quantitative evaluation, but manual qualitative analysis reveals the poor quality of the generated texts. This may be due to the quality of the dataset or the training inputs.

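The two-step pipeline described in the abstract (keywords → scene outline, outline → scene text) can be sketched with the Hugging Face transformers library. The snippet below is a minimal illustration, not the authors' code: the base checkpoint, the separator token, the hyperparameters, and the toy training pairs are all assumptions made for the sake of the example.

```python
# Minimal sketch of the two-stage fine-tuning pipeline from the abstract.
# The model name, separator token, and toy data are assumptions for
# illustration only; they are not taken from the paper.

from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "dbmdz/german-gpt2"  # assumed German GPT-2 checkpoint
SEP = " <outline> "               # assumed separator between input and target


def fine_tune(pairs, output_dir):
    """Fine-tune a causal LM on 'source SEP target' strings (one stage)."""
    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

    # Concatenate source and target into a single causal-LM training string.
    texts = [src + SEP + tgt + tokenizer.eos_token for src, tgt in pairs]
    dataset = Dataset.from_dict({"text": texts}).map(
        lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
        remove_columns=["text"],
    )

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir=output_dir,
            num_train_epochs=3,
            per_device_train_batch_size=2,
        ),
        train_dataset=dataset,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    return tokenizer, model


# Stage 1: outline model, keywords -> scene outline (toy pairs).
outline_pairs = [("Liebe, Verrat, Schloss", "Ein junger Graf entdeckt den Verrat ...")]
fine_tune(outline_pairs, "outline-model")

# Stage 2: generation model, scene outline -> full scene text (toy pairs).
scene_pairs = [("Ein junger Graf entdeckt den Verrat ...", "GRAF. Wie konntest du ...")]
fine_tune(scene_pairs, "generation-model")
```

At inference time the two stages would be chained: keywords are fed to the outline model, and its generated outline is passed as the prompt to the generation model to produce the scene text.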