When using Large Language Models (LLMs) such as ChatGPT, there are multiple ways to generate responses. When you pose a question or ask the generative AI to do something, you are creating a prompt. There are several different types of prompts, and the number keeps growing as people learn more about how LLMs work.
Prompt Patterns: Building consistency
Prompt patterns, that is, structured ways of phrasing a request, give users more control over the output: the wording of each response still varies, but the model's behaviour (the user's intention) stays consistent.
Read more: Flipped Interaction Pattern
Instead of prompting the GenAI for a response, ask the GenAI to ask you questions in order to create a more comprehensive response. To use this pattern, your prompt could be written as follows: Example: Act as a learning designer and ask me questions about my Adult Education Research Methods 400-level course in…
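The flipped interaction idea can be sketched as a small prompt builder. This is a minimal illustration, not part of any library: the function name and wording are assumptions, and the resulting string would be sent to whatever chat interface or API you actually use.

```python
# Sketch of the Flipped Interaction Pattern: the prompt instructs the
# model to do the asking, one question at a time, before answering.
# `flipped_interaction_prompt` is a hypothetical helper for illustration.

def flipped_interaction_prompt(role: str, topic: str, goal: str) -> str:
    """Build a prompt that flips the interaction: the model interviews you."""
    return (
        f"Act as {role}. Ask me questions, one at a time, about {topic} "
        f"until you have enough information to {goal}. "
        "Do not produce the final result until I say 'done'."
    )

prompt = flipped_interaction_prompt(
    role="a learning designer",
    topic="my Adult Education Research Methods 400-level course",
    goal="draft a comprehensive course outline",
)
print(prompt)
```

The "one at a time" and "until I say 'done'" clauses keep the model from dumping all its questions at once or answering prematurely.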
Read more: Reflect on Reflection (RoR)
Ask the model to evaluate the response it generated and then to rewrite that response based on the evaluation. To use this pattern, your prompt could be written as follows: Example: Evaluate the recipe you have just shared based on ease, deliciousness and clarity. Now rewrite the recipe.
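The evaluate-then-rewrite follow-up can be generated from a list of criteria. A minimal sketch, assuming a hypothetical helper name; only the returned string matters:

```python
# Sketch of Reflect on Reflection (RoR): a follow-up prompt asking the
# model to judge its own output against named criteria, then rewrite it.

def ror_followup(criteria: list[str], artifact: str = "response") -> str:
    """Build the RoR follow-up prompt for a previously generated artifact."""
    crit = ", ".join(criteria)
    return (
        f"Evaluate the {artifact} you have just shared based on {crit}. "
        f"Now rewrite the {artifact} to address that evaluation."
    )

followup = ror_followup(["ease", "deliciousness", "clarity"], artifact="recipe")
print(followup)
```

Because this is a follow-up turn, it relies on the model still having its previous answer in the conversation context.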
Read more: Chain of Thought Prompting (CoT)
Ask the model to show its work step by step, explaining its reasoning. This approach improves the quality of the output and can help you understand the reasoning used to produce it. To use this pattern, your prompt should include the following statement(s): Example A juggler can juggle 16 balls in total. Half…
Read more: A Prompt Game
The goal of this pattern is to create a game that you can use with ChatGPT or other GenAI to practice effective prompting. To use this pattern, your prompt could be written as follows: Example: I am currently taking a course on learning how to prompt with large language models. Create a prompt game to…
Read more: Cognitive Verifier Pattern
The intent of this pattern is to ensure the large language model is clear on the query. It will ask questions to better understand and refine it. To use this pattern, your prompt could be written as follows: Example
Read more: Few Shot
Few-shot learning: the model is given a small number of examples per new class or task. This is a more practical scenario than one-shot learning, as it allows the model to see a bit more data for each new class and improve its generalization. Examples
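In prompting, few-shot means placing a handful of worked input/output pairs in the prompt itself so the model can infer the task format. A minimal sketch, with hypothetical names and a made-up sentiment task for illustration:

```python
# Sketch of a few-shot prompt builder: in-context examples are prepended
# to the new input, ending with an empty "Output:" for the model to fill.

def few_shot_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Format (input, output) example pairs followed by the new query."""
    shots = "\n".join(f"Input: {x}\nOutput: {y}" for x, y in examples)
    return f"{shots}\nInput: {query}\nOutput:"

fs_prompt = few_shot_prompt(
    [("I loved this film", "positive"), ("Terrible pacing", "negative")],
    "The soundtrack was wonderful",
)
print(fs_prompt)
```

Keeping every example in the same "Input:/Output:" shape is what lets the model pick up the pattern from only two or three shots.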
Read more: Single Shot
A “single shot” prompt refers to a concise and self-contained input provided to a large language model in a single instance, typically consisting of a short sentence or a few words. Unlike multi-turn conversations or dialogue-based prompts, a single shot prompt doesn’t rely on context from previous interactions and is treated as an isolated input.…
Read more: Subdivide Questions
LLMs can often reason better if a question is subdivided into additional questions that can then be used to answer the original question. To use this pattern, your prompt could be written as follows: When you are asked a question, follow these rules: Examples
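The subdivision rules can be baked into a reusable prompt template. A sketch under the assumption of a hypothetical helper name and sample question:

```python
# Sketch of the question-subdivision pattern: the prompt tells the model
# to break the original question into simpler sub-questions, answer each,
# and then combine those answers into a final response.

def subdivide_prompt(question: str, n: int = 3) -> str:
    """Build a prompt that forces decomposition before answering."""
    return (
        "When you are asked a question, follow these rules: "
        f"first, list {n} simpler sub-questions whose answers would help "
        "answer the original question; next, answer each sub-question; "
        "finally, combine those answers into a single response.\n"
        f"Question: {question}"
    )

sub_prompt = subdivide_prompt("Why do interest rate rises slow inflation?")
print(sub_prompt)
```

The fixed rule list makes the decomposition explicit in the output, so you can check which sub-questions the model actually used.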
Read more: The QEC Model
This prompt technique is composed of the following statements: Example 1, Example 2. Did you know? For each output generated, you can ask ChatGPT which criteria were used to generate that response, and then ask it to regenerate the response based on those criteria, a technique called Reflect on Reflection (RoR).
Read more: The Audience Persona Pattern
For the audience persona pattern, instead of providing rules, ask the AI to produce output tailored to a specified audience persona. Examples