Prompting and Prompt Engineering Best Practices
- Featured Insights
- November 24, 2023
What Is Prompting?
Prompting, or prompt engineering, is the practice of crafting the input we give to LLMs (Large Language Models), and understanding it is vital for interacting with them effectively. A prompt lets the model draft responses based on instructions, contextual details, and optional supporting components. Efficient prompt engineering optimizes these interactions and improves accuracy in problem-solving.
Prompt content can be mandatory (the primary input) or optional (supplementary information). This article will explore prompt types and discuss zero-shot, few-shot, and chain-of-thought techniques. Acquiring prompt engineering skills is essential as AI tools play an increasing role in daily life.
Understanding Prompts
Prompts are pivotal in engaging with models, necessitating precise and customized natural language directives to guarantee precise outcomes. The right prompt will comprise multiple components like contextual details, queries, and even partial inputs, which the model employs to craft responses. These responses may encompass diverse tasks such as text generation or summarization, code creation, image and video production, grammar correction in multiple languages, sourcing online information to answer questions, and beyond.
Types Of Prompt Content
Mandatory
This is the main part of the prompt that the model responds to, such as questions or tasks like completing sentences, generating content, summarizing text, answering questions that require web research, and more.
Optional
Optional content is additional information, context, or examples that can enhance a prompt's effectiveness. Although this part of the prompt is not mandatory, it helps increase the accuracy of the results. Examples of context include instructions that tell the model how it should behave, or reference material the model can draw on when answering.
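To make the mandatory/optional split concrete, here is a minimal sketch of assembling a prompt from its parts. The `build_prompt` helper and its field names are hypothetical, not from any particular library; the point is only that optional context is prepended when present and omitted otherwise.

```python
def build_prompt(instruction, input_text, context=None):
    """Assemble a prompt from a mandatory instruction and input text,
    plus optional context that steers the model's behavior."""
    parts = []
    if context:
        # Optional content: behavior hints or reference material.
        parts.append(f"Context: {context}")
    parts.append(instruction)   # mandatory: the task itself
    parts.append(input_text)    # mandatory: the text to operate on
    return "\n\n".join(parts)

prompt = build_prompt(
    instruction="Summarize the text below in one sentence.",
    input_text="Large language models generate text conditioned on a prompt.",
    context="You are a concise technical editor.",
)
print(prompt)
```

The same two mandatory pieces produce a valid prompt on their own; adding context simply gives the model more to anchor its response on.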
Prompt Examples
Here are some prompt examples:
- Extract all the important entities mentioned in the text below. First, extract all person names, then extract all addresses of each person, then extract all person’s ages and finally extract the country of residence of each person. (add text)
- Provide ideas to decorate an apartment using a minimalistic style.
- Classify the following items from largest to smallest.
- Give me a list of things I should bring on a camping trip.
- Can you give me a list of top places to visit in New York in August?
- Provide a summary of the text below containing all main ideas organized in bullet points.
- Make the below text clearer and more concise, and correct all grammar errors.
Prompt Techniques
Numerous prompting techniques exist, and the most prevalent methods are as follows:
Direct prompting (Zero-shot)
This technique gives the model a direct instruction with no examples, relying entirely on its trained capabilities. For instance, you can instruct the model to classify the sentiment of a specific text, and it will deliver a response. Even though no example classifications are provided, the model understands the instruction and can label the text as, say, negative; rephrasing the same instruction typically produces the same result.
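A zero-shot prompt for the sentiment task described above might be built like this. This is an illustrative sketch (the wording of the instruction is an assumption, not a required format); the key property is that the prompt contains an instruction and the input, but no example classifications.

```python
def zero_shot_prompt(text):
    # Direct instruction only: no examples are given, so the model
    # relies entirely on what it learned during training.
    return (
        "Classify the sentiment of the following text as "
        f"positive, negative, or neutral.\n\nText: {text}\nSentiment:"
    )

print(zero_shot_prompt("The delivery was late and the box was damaged."))
```

Ending the prompt with "Sentiment:" nudges the model to answer with just the label rather than a full sentence.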
Few-shot prompting
The few-shot approach supplies the model with specific examples that serve as guiding instances for generating the intended result. It proves especially valuable when a precise instruction is hard to formulate; instead, you demonstrate the desired outcome to the model through a handful of input/output examples.
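The demonstrations can be prepended to the query as labeled pairs. This is a minimal sketch of that pattern (the "Text:"/"Sentiment:" labels are an illustrative convention, not a requirement): the model infers both the task and the answer format from the examples.

```python
def few_shot_prompt(examples, query):
    """Prepend labeled input/output pairs so the model can infer
    the task and the answer format from the demonstrations."""
    lines = [f"Text: {text}\nSentiment: {label}" for text, label in examples]
    # The query uses the same template, with the label left blank
    # for the model to fill in.
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)

examples = [
    ("I loved this product!", "positive"),
    ("Terrible customer service.", "negative"),
]
print(few_shot_prompt(examples, "The food was okay, nothing special."))
```

Two or three well-chosen examples are often enough; consistency of the template across examples matters more than their number.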
Chain-of-thought prompting
Chain-of-thought prompting is a strategy that elicits the model's reasoning capabilities by asking it to work through a problem step by step before answering. For instance, when asked a multi-step arithmetic question, the model could simply give a plain answer such as "13 apples", but with chain-of-thought prompting it also describes the reasoning that led to that result.
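A simple way to trigger this behavior is to append a reasoning cue to the question. The sketch below uses the common "Let's think step by step" phrasing; the apple question is a hypothetical example of the kind of multi-step arithmetic (10 - 3 + 6 = 13) where showing the intermediate steps helps.

```python
def chain_of_thought_prompt(question):
    # Appending a reasoning cue prompts the model to spell out
    # intermediate steps instead of jumping to the final answer.
    return f"{question}\nLet's think step by step."

print(chain_of_thought_prompt(
    "I had 10 apples, gave away 3, then bought 6 more. "
    "How many apples do I have now?"
))
```

The same effect can also be achieved few-shot style, by including worked examples whose answers contain the reasoning.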
Tips To Create A Prompt
- Get the latest model version, read through the latest documentation, and understand the model’s capabilities, strengths, weaknesses, and any specific syntax that would help the model perform better.
- Be as specific as possible, and provide clear instructions on the desired outcome, style, length, constraints, and format. Provide context or examples to the model.
- Use different techniques, experiment with different words and styles, fine-tune when needed and select the technique that produces the best results.
- Break down complex tasks into a sequence of simpler prompts.
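The last tip, decomposing a complex task, can be sketched as a small pipeline in which each step issues a focused prompt and feeds its answer into the next. Here `ask_model` is a hypothetical stand-in for a real LLM API call, included only so the example is self-contained.

```python
def ask_model(prompt):
    # Placeholder: in practice this would call an LLM API.
    return f"<model response to: {prompt[:30]}...>"

def summarize_in_two_steps(document):
    """Illustrative decomposition: instead of one complex prompt,
    each step is a simpler prompt whose answer feeds the next."""
    # Step 1: a narrow extraction task.
    facts = ask_model(f"Extract the key facts from this text:\n{document}")
    # Step 2: a narrow writing task built on step 1's output.
    return ask_model(
        f"Write a one-paragraph summary using these facts:\n{facts}"
    )

print(summarize_in_two_steps("Some long document text..."))
```

Each sub-prompt is easier for the model to get right than a single prompt that asks for extraction and summarization at once, and intermediate outputs can be inspected or corrected between steps.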
Conclusion
Prompt engineering is a skill everyone in the digital age should acquire, as AI tools are becoming increasingly relevant to our day-to-day activities. Using different techniques and applying the main tips in this article will help you get more accurate results.
Getting to know the model through its documentation, fine-tuning, and experimenting with different scenarios will help the user obtain the expected outcome.
Sources
- https://developers.google.com/machine-learning/resources/prompt-eng
- https://developers.generativeai.google/guide/prompt_best_practices
- https://platform.openai.com/docs/guides/code/code-completion-private-beta
- https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-openai-api