10 Tips for Improving Your AI Prompts — LLM Prompt Engineering
In the rapidly evolving field of natural language processing (NLP), prompt engineering has emerged as a crucial skill. By carefully crafting inputs or queries, we can guide generative AI models to produce desired outcomes with precision and creativity. Here, we share ten invaluable tips for mastering the art of prompt engineering, leveraging cutting-edge tools and techniques to enhance your generative AI outputs.
1. Understand the Purpose
Generative AI models create original content based on learned patterns and examples from vast amounts of data. However, these models are not mind readers; they rely heavily on the details and context provided in the prompts to generate accurate and pertinent content. Therefore, providing as much detailed and clear information in your prompts is essential for obtaining the best possible outputs from these AI systems.
By putting effort into crafting specific prompts, you ensure that the AI system can produce content that closely aligns with your expectations and requirements. This is key in maximizing the efficiency and effectiveness of the interaction, resulting in more relevant and high-quality responses.
The quality of the responses is a function of the quality of the information in your prompt, so putting a little extra effort into the prompt goes a long way.
2. Be Specific
When creating prompts, specificity is your best friend. Vague or broad prompts can lead to equally vague responses. Instead, use specific details, clear instructions, and well-defined parameters to guide the AI towards generating the precise output you need. The more sharply you define your needs, the better the AI can meet them.
3. Set Clear Objectives
Decide what you want to achieve with your prompt before you craft it. Are you looking for a creative story, a factual explanation, or perhaps a piece of dialogue? By setting clear objectives, you can tailor your prompt to align tightly with your desired end result, making it easier for the AI to deliver a suitable response.
4. Use Examples
Examples can greatly enhance the AI’s understanding of what you’re seeking. By providing sample outputs within your prompt, you give the AI a concrete idea of your expectations. This can be particularly useful for creative tasks, such as generating poetry or designing marketing copy, where nuance and style play a crucial role.
Few-shot prompting is the technique of including a few examples of the kind of response you want alongside your prompt. It has been shown to substantially improve response accuracy, although it is not perfect.
Many-shot prompting extends the same idea, supplying dozens or even hundreds of examples rather than the two or three typical of few-shot prompting.
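A minimal sketch of how a few-shot prompt can be assembled programmatically. The example pairs and the `Input:`/`Output:` formatting are illustrative assumptions, not a fixed standard:

```python
# Build a few-shot prompt from (input, output) example pairs
# followed by the new query the model should complete.

def build_few_shot_prompt(examples, query):
    parts = []
    for question, answer in examples:
        parts.append(f"Input: {question}\nOutput: {answer}")
    # The final block leaves "Output:" empty for the model to fill in.
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

# Two sentiment-classification examples guide the model's response style.
examples = [
    ("The movie was fantastic!", "positive"),
    ("I want my money back.", "negative"),
]
prompt = build_few_shot_prompt(examples, "Best purchase I've made all year.")
print(prompt)
```

The resulting string can be sent to any LLM; the examples anchor both the task and the expected output format.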
5. Context is Key
Providing context within your prompt helps the AI understand the background of your request. This includes relevant details, scenarios, or background information that can steer the model’s response to be more accurate and contextually appropriate.
There are several ways you can provide context:
Retrieval Augmented Generation – Using similarity search over your prompt to retrieve relevant context from a vector database, which is then used to contextualise the prompt further. Thanks to developments in vector databases and similarity search, this is the most common and effective approach. RAG has emerged as an increasingly impactful area within generative AI for its ability to improve the usefulness of LLMs on specific knowledge-based tasks.
Images or PDFs – Some LLM interfaces are multi-modal, meaning they can interpret images or extract text from PDFs and use it to answer your prompt more accurately. You can also build your own tools that convert images or PDFs into text, which can then be injected into your prompt for more relevant responses.
Manually Insert Context – Context windows are now large enough that it is more viable than ever to fill them with information about your task or use case. However, avoid including too much: first, for privacy reasons; second, because a model's ability to precisely pick out information from the window and generate tokens relating to it degrades as context window utilisation increases. For example, if specific facts or figures from an annual report you are analysing sit in the middle of a long text, they may easily be missed or misinterpreted. This is referred to as the needle-in-a-haystack problem.
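The retrieval step of RAG can be sketched in miniature. This toy version uses word-overlap cosine similarity in place of a real embedding model and vector database; the documents and query are invented for illustration:

```python
# Toy RAG retrieval: score documents against the query by cosine
# similarity of their word-count vectors, then prepend the best match.
from collections import Counter
import math

def cosine(a, b):
    num = sum(a[w] * b[w] for w in a)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query, documents, k=1):
    q = Counter(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return ranked[:k]

docs = [
    "Revenue grew 12% year over year in the annual report.",
    "The office cafeteria menu changes on Mondays.",
]
question = "what was the revenue growth this year?"
context = retrieve(question, docs)[0]
augmented_prompt = f"Context: {context}\n\nQuestion: {question}"
print(augmented_prompt)
```

A production system would swap the word counts for dense embeddings and the list for a vector database, but the shape of the pipeline — retrieve, then inject into the prompt — is the same.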
6. Experiment and Iterate
Effective prompt engineering often involves experimentation. Don’t be afraid to tweak and iterate on your prompts to see how different variations affect the AI’s responses.
Even the smallest tweaks in language used in your prompts can have a significant impact on the responses so play around and see what works.
This trial-and-error approach can lead you to the most effective and efficient way to frame your prompts.
7. Harness Feedback Loops
Utilise feedback from the outputs to continuously improve your prompts. This involves reviewing the AI’s responses and refining your prompts accordingly. By creating a feedback loop, you can progressively enhance the quality and relevance of the generated content.
How would you do it?
Create a prompt store of all your useful prompts and the responses they produced
Update the store with new prompts as models and techniques evolve
Refine prompts based on previous responses to achieve your new and desired AI response
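The steps above can be sketched as a small prompt store. The class name, fields, and example entries are all hypothetical, just to show the shape of the feedback loop:

```python
# A minimal prompt store: record each prompt alongside the response it
# produced and notes on how to refine it next time.
import json

class PromptStore:
    def __init__(self):
        self.entries = []

    def record(self, prompt, response, notes=""):
        self.entries.append({"prompt": prompt, "response": response, "notes": notes})

    def latest(self, n=1):
        # Return the most recent entries so new prompts can be refined
        # against what was tried before.
        return self.entries[-n:]

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.entries, f, indent=2)

store = PromptStore()
store.record("Summarise the report", "A vague summary...",
             notes="Ask for bullet points next time")
store.record("Summarise the report as 5 bullet points", "Five crisp bullets...")
print(store.latest()[0]["prompt"])
```

Even a plain JSON file like this makes it much easier to see which phrasings worked and to re-test old prompts when a new model version ships.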
8. Adjust The Parameters
This tip only applies if you have access to a prompt playground or work with LLMs a little earlier in the software stack.
If you are developing software that uses LLM APIs, you can tune parameters such as top_p and temperature, two values that affect the randomness of responses. top_p ranges from 0 to 1, while temperature ranges from 0 to 1 in some APIs and up to 2 in others.
Higher values lead to more creative but inherently more random responses, because the next token is no longer always selected from the set of most probable tokens but can instead come from slightly lower-probability ones.
This is particularly useful in creative tasks.
For example, our TEV1 AI image generator makes use of top_p to create more interesting or off the wall artworks.
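To see what top_p actually does, here is a toy illustration of nucleus sampling's filtering step. The probability table is invented for demonstration and is not how any particular API implements it internally:

```python
# Keep the smallest set of tokens whose cumulative probability reaches
# top_p, then renormalise; sampling is restricted to that set.

def top_p_filter(token_probs, top_p):
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append((token, p))
        cumulative += p
        if cumulative >= top_p:
            break
    total = sum(p for _, p in kept)
    return {t: p / total for t, p in kept}

probs = {"sky": 0.5, "sea": 0.3, "void": 0.15, "cheese": 0.05}

# Low top_p keeps only the most probable token: deterministic output.
print(top_p_filter(probs, 0.5))   # only "sky" survives
# Higher top_p admits lower-probability tokens: more varied output.
print(top_p_filter(probs, 0.95))  # "sky", "sea" and "void" survive
```

This is why raising top_p (or temperature) makes responses feel more creative: unlikely-but-plausible tokens stay in the running.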
9. Use AI
If you are unsure what prompt your task might benefit from, ask AI to improve your current prompt!
For example, say you were trying to generate an SEO description for your article; there are many factors you might consider when writing an effective description.
“generate a prompt for generating an SEO description for a given article” turns into
“Generate an engaging, keyword-rich SEO description for an article [INPUT ARTICLE INFORMATION]. Target a [AUDIENCE TYPE] aged 25–40, focus on productivity and work-life balance, and keep it under 160 characters.”
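This "prompt-to-improve-a-prompt" pattern can be wrapped in a small helper. The wording of the meta-prompt template below is an assumption, written for illustration:

```python
# Wrap a rough prompt in a meta-prompt asking the model to rewrite it
# into something more specific, with placeholders and constraints.

def meta_prompt(rough_prompt):
    return (
        "You are a prompt engineering assistant. Rewrite the following rough "
        "prompt so it is specific, includes placeholders for variable inputs, "
        "and states constraints explicitly.\n\n"
        f"Rough prompt: {rough_prompt}"
    )

improved_request = meta_prompt("generate an SEO description for a given article")
print(improved_request)
```

Sending `improved_request` to the model yields a refined prompt like the SEO example above, which you can then fill in and reuse.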
10. Keep Learning
The field of AI and NLP is constantly evolving. Stay updated with the latest research, techniques, and tools in prompt engineering.
Thanks for reading. Hopefully, you found some value in a few of the tips, if not all of them, to enhance your creative tasks and generative AI outputs. On behalf of Asycd, we appreciate your engagement and look forward to helping you further. If you have any queries or need additional assistance, feel free to reach out.