Paper Title

Efficient and Training-Free Control of Language Generation

Paper Authors

Shangda Wu, Maosong Sun

Paper Abstract

In recent years, there has been a growing interest in the development of language models capable of generating text with controllable attributes. While several approaches have been proposed, many of these methods require condition-specific data or significant computational resources. In this study, we propose a novel method called Gamma Sampling, which enables controllable language generation without the need for any training data and maintains a fast generation speed. Gamma Sampling incorporates attribute-related information into the sampling process, effectively guiding the language model to produce text with desired attributes. Our experimental results demonstrate that Gamma Sampling, when applied to GPT2, outperforms representative baselines in terms of diversity, attribute relevance, and overall quality of the generated samples.
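The abstract states that Gamma Sampling "incorporates attribute-related information into the sampling process" without giving details. As a rough illustration of that general idea (reweighting the next-token distribution toward attribute-related tokens at decoding time, with no training), here is a minimal NumPy sketch. The function name, the interpretation of `gamma` as a target probability mass, and the attribute-token-ID set are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def attribute_steered_sample(logits, attribute_token_ids, gamma=0.5, rng=None):
    """Illustrative attribute-steered sampling (an assumption-based sketch,
    not the paper's exact Gamma Sampling algorithm): rescale probabilities
    so attribute-related tokens jointly receive probability mass `gamma`."""
    rng = rng or np.random.default_rng()
    # Softmax over the model's next-token logits.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Boolean mask over the vocabulary marking attribute-related tokens.
    mask = np.zeros_like(probs, dtype=bool)
    mask[attribute_token_ids] = True
    attr_mass = probs[mask].sum()
    if 0 < attr_mass < 1:
        # Scale attribute tokens to total mass gamma; non-attribute tokens
        # share the remaining 1 - gamma, preserving relative order within
        # each group.
        probs[mask] *= gamma / attr_mass
        probs[~mask] *= (1 - gamma) / (1 - attr_mass)
    return int(rng.choice(len(probs), p=probs))
```

Because the adjustment is applied only at sampling time, a sketch like this needs no condition-specific data or fine-tuning and adds negligible per-token cost, which matches the efficiency claim in the abstract.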
