In this tutorial, we will explore promptrefiner: a tiny Python tool I created to craft effective system prompts for your local LLM, with the help of the GPT-4 model.
The Python code for this article is available here:
https://github.com/amirarsalan90/promptrefiner.git
Crafting an effective and detailed system prompt can be a challenging process that often requires multiple rounds of trial and error, particularly when working with smaller LLMs such as a 7B language model. Unlike a larger model, which can generally interpret and follow less detailed prompts, a smaller large language model like Mistral 7B is more sensitive to the wording of your system prompt.
Let’s imagine a scenario where you’re working with a text that discusses a few individuals and their contributions or roles. You want your local language model, say Mistral 7B, to distill this information into a list of Python strings, each pairing a name with its associated details from the text. Take the following paragraph as a case: