In this work, we discuss methods of prompt programming, emphasizing the usefulness of considering prompts through the lens of natural language. We explore techniques for exploiting the capacity of narrative and cultural anchors to encode nuanced intentions. We suggest that the function of few-shot examples in these cases is better described as locating an already-learned task rather than meta-learning. Informed by this more encompassing theory of prompt programming, we also introduce the idea of a metaprompt that seeds the model to generate its own natural language prompts for a range of tasks. Finally, we discuss how these more general methods of interacting with language models can be incorporated into existing and future benchmarks and practical applications.
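As a rough illustration of the metaprompt idea described above (this is a hypothetical sketch, not code from the paper), a metaprompt can be thought of as a template that, given a task description, asks the model to write its own natural language prompt for that task. The template wording and function names here are illustrative assumptions; the generated text would then be sent to a language model, which is omitted.

```python
# Hypothetical sketch of a "metaprompt": a prompt that asks a language
# model to author its own task-specific prompt. The template text and
# the name build_metaprompt are illustrative, not taken from the paper.

METAPROMPT_TEMPLATE = (
    "I want to solve the following task with a language model.\n"
    "Task: {task}\n"
    "Write a clear natural language prompt that would lead the model "
    "to perform this task well.\n"
    "Prompt:"
)

def build_metaprompt(task: str) -> str:
    """Fill the metaprompt template with a plain-language task description."""
    return METAPROMPT_TEMPLATE.format(task=task)

if __name__ == "__main__":
    # The resulting string would be given to a language model, whose
    # completion serves as the prompt for the downstream task.
    print(build_metaprompt("translate French to English"))
```

The point of the sketch is that a single fixed template covers a range of tasks: only the task description changes, and the model itself supplies the task-specific prompt.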

Authors: Laria Reynolds, Kyle McDonell

Links: PDF - Abstract

