COLLIE: Systematic Construction of
Constrained Text Generation Tasks

Shunyu Yao*   Howard Chen*   Austin Wang*   Runzhe Yang*   Karthik Narasimhan
(* authors contributed equally)

We propose COLLIE, a framework for easy specification of constraint structures, extraction of examples, rendering of natural-language instructions, and evaluation of model generations.

The steps of the whole pipeline are described below (referring to the figure above):
  1. Specification: the user specifies the constraint structure without specific target values (expressed as *)
  2. Extraction: the constraint structure is used to extract, from text corpora, examples that contain concrete target values
  3. Rendering: the constraint structure and target values are rendered into a natural-language instruction
  4. Evaluation: the model's generation is evaluated against the constraint and the extracted examples
In this example, the model (gpt-3.5-turbo) violates the constraints by exceeding the word limit and placing the word "mankind" at the end instead of at the specified position.
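
To make the pipeline concrete, below is a minimal sketch in plain Python of a single constraint structure being specified, rendered into an instruction, and used to evaluate a generation. The class and method names (SentenceConstraint, render, check) are illustrative only and are not the actual COLLIE API.

# Illustrative sketch only: these names are hypothetical, not the COLLIE API.
from dataclasses import dataclass

@dataclass
class SentenceConstraint:
    max_words: int   # step 1: constraint structure, with target values filled in later
    word: str        # the word that must appear ...
    position: int    # ... at this 0-indexed position

    def render(self) -> str:
        # step 3: render the structure + target values as a natural-language instruction
        return (f"Write a sentence with at most {self.max_words} words, "
                f"where word {self.position + 1} is '{self.word}'.")

    def check(self, generation: str) -> bool:
        # step 4: evaluate the model's generation against the constraint
        words = generation.strip().rstrip(".!?").split()
        return (len(words) <= self.max_words
                and self.position < len(words)
                and words[self.position].lower() == self.word.lower())

constraint = SentenceConstraint(max_words=10, word="mankind", position=4)
print(constraint.render())
# The sentence below is too long and places "mankind" at the end, so this prints False.
print(constraint.check("That's one small step for a man, one giant leap for mankind."))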

Paper Abstract

Text generation under constraints has seen increasing interest in natural language processing, especially with the rapidly improving capabilities of large language models. However, existing benchmarks for constrained generation usually focus on fixed constraint types (e.g., generate a sentence containing certain words) that have proved to be easy for state-of-the-art models like GPT-4. We present COLLIE, a grammar-based framework that allows the specification of rich, compositional constraints with diverse generation levels (word, sentence, paragraph, passage) and modeling challenges (e.g., language understanding, logical reasoning, counting, semantic planning). We also develop tools for automatic extraction of task instances given a constraint structure and a raw text corpus. Using COLLIE, we compile the COLLIE-v1 dataset with 2,080 instances comprising 13 constraint structures. We perform systematic experiments across five state-of-the-art instruction-tuned language models and analyze their performances to reveal shortcomings. COLLIE is designed to be extensible and lightweight, and we hope the community finds it useful to develop more complex constraints and evaluations in the future.
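
As a rough illustration of what "compositional" means here: atomic checks can be combined with logical operators into richer constraints spanning different generation levels. The helper names below (contains, max_sentences, all_of) are hypothetical sketches and do not correspond to the actual COLLIE grammar.

# Hypothetical sketch of compositional constraints, not the COLLIE grammar.
from typing import Callable

Check = Callable[[str], bool]

def contains(word: str) -> Check:
    # atomic word-level constraint: the given word appears in the text
    return lambda text: word.lower() in text.lower().split()

def max_sentences(n: int) -> Check:
    # atomic passage-level constraint: at most n sentence-ending punctuation marks
    return lambda text: sum(text.count(p) for p in ".!?") <= n

def all_of(*checks: Check) -> Check:
    # composition: logical AND over atomic constraints
    return lambda text: all(c(text) for c in checks)

paragraph_constraint = all_of(contains("collie"), max_sentences(3))
print(paragraph_constraint("A collie is a herding dog. It is smart."))  # True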


Citation

@misc{yao2023collie,
    title={COLLIE: Systematic Construction of Constrained Text Generation Tasks}, 
    author={Shunyu Yao and Howard Chen and Austin W. Hanjie and Runzhe Yang and Karthik Narasimhan},
    year={2023},
    eprint={2307.08689},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}