Prompt Library
This is a growing collection of prompts that you can import into the Prompts CMS.
How to import a prompt
To import a prompt into your agent, head to the Prompts CMS and click the 'import' button in the top right.
Note: This collection of prompts is growing - we'll add more examples each time we create a new template!
General Prompts
The prompt step is used to generate a response and show it directly to the customer.
Sales agent
The simplest way to use Voiceflow is the same way you use ChatGPT: a single loop with one large prompt. This prompt asks the user specific questions and then sends a final message when it's done.
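As a rough illustration, a sales-agent prompt in this style could look something like the sketch below. The exact wording and the {company_name} variable are placeholders, not the template from the library:

```
You are a friendly sales agent for {company_name}.
Ask the user the following questions, one at a time, waiting for an answer before moving on:
1. What is their name and company?
2. What problem are they trying to solve?
3. What is their budget and timeline?
Once you have answers to all three questions, thank the user and end with: "I have all the information I need."

[{company_name} is a placeholder - swap in your own variable or company name.]
```

The closing phrase matters: the Evaluate Conversation prompt further down looks for wording like "I have all the information I need" to decide when the conversation should end.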
Knowledge Base - Simple answer (ideal for Voice)
To generate an answer using information from the knowledge base, you first need to search the knowledge base with the KB search step (item 1 in the image below). The relevant information is saved to a variable. In this case, we have called that variable {chunks}.
The two prompts below show how to use that information to generate a simple answer (for voice) and a detailed answer (for chat).
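For example, a voice-friendly prompt built on the {chunks} variable might look roughly like this sketch (the wording is illustrative, and {last_utterance} stands in for wherever you store the customer's question):

```
Answer the customer's question using only the information below.

Information:
{chunks}

Question:
{last_utterance}

Reply in one or two short, conversational sentences suitable for being read aloud.
Do not use lists, headings, or markdown. If the information does not contain the answer, say you don't know.
```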
Knowledge Base - detailed answer (ideal for chat)
Used in the same way as the prompt above.
This prompt uses the information from the Knowledge Base (saved in the {chunks} variable) to generate a highly detailed answer that is properly formatted with titles, bullet points, markdown, etc. Ideal for a chat experience.
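A sketch of what the detailed, chat-oriented variant could look like (again, illustrative wording rather than the exact library template):

```
Answer the customer's question using only the information below.

Information:
{chunks}

Question:
{last_utterance}

Write a detailed answer formatted in markdown: start with a short title, use headings where helpful, and present lists as bullet points.
If the information does not contain the answer, say so instead of guessing.
```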
Prompts with the Set Step
Prompts can be attached to the Set step to save the output to a variable for later use. Here are some examples.
Optimize the customer's question with memory
Sometimes customers don't ask questions that have all the necessary context. This prompt combines the conversation memory with the last question the user asked (saved in {last_utterance}) to generate a question that is better optimized for AI search.
This is especially valuable when you use the Knowledge Base search step since it doesn't use memory by default.
In this example we optimize the customer's question and save it to the {optimized_question} variable. Then we use that variable in the KB search step for higher quality results.
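A minimal sketch of such a rewrite prompt, attached to a Set step that writes its output into {optimized_question}. The {memory} variable is a placeholder for however you expose the conversation history:

```
Here is the conversation so far:
{memory}

Here is the customer's latest question:
{last_utterance}

Rewrite the question as a single, standalone question that includes any context from the conversation needed to understand it.
Output only the rewritten question, with no explanation.

[{memory} is a placeholder variable name - replace it with the variable that holds your conversation memory.]
```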
Detect the customer's language
Another scenario is detecting the customer's language. In this case we use a prompt to determine the language and save it to a variable called {language_JSON}. We then use that variable in a later prompt to tell the AI to respond in the language indicated in {language_JSON}.
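A sketch of what the detection prompt could look like when attached to a Set step that saves its output to {language_JSON}. The JSON shape shown here is an assumption, not necessarily the format used by the library template:

```
Detect the language of the following message:
{last_utterance}

Respond with JSON only, in exactly this format:
{"language": "<English name of the language>", "code": "<ISO 639-1 code>"}
```

A later prompt can then include an instruction along the lines of "Respond in the language described in {language_JSON}."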
Prompts with condition steps
Prompts can be attached to a condition step to have AI process logic for you. Below are a few great examples.
Evaluate Conversation
This prompt is used to evaluate whether the conversation should end or keep going. It looks at the last message the agent sent to the user to check whether it contains anything similar to "I have all the information I need". Here is an example of how it's used in a conversation. (Download this template)
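A rough sketch of a condition-step prompt along these lines; {last_agent_message} is a placeholder for whichever variable holds the agent's previous reply:

```
Here is the last message the agent sent to the user:
{last_agent_message}

Does this message indicate that the agent has all the information it needs (for example, wording similar to "I have all the information I need")?
Answer with exactly one word: "end" if it does, "continue" if it does not.

[{last_agent_message} is a placeholder variable name - replace it with the variable your flow uses.]
```

The condition step can then branch on whether the output is "end" or "continue".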
Check if a clarifying question is required
This prompt is intended to be used with the Knowledge Base search step. It looks at the question the customer asked (saved in {last_utterance}) and the information that was found in the knowledge base (saved in the {chunks} variable).
If it determines that a clarifying question is required, it outputs a #. Here is an example of how it is used in a flow.
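As a sketch, the prompt could look something like this. The # output comes straight from the description above; the fallback output when no clarification is needed is an assumption:

```
The customer asked:
{last_utterance}

The knowledge base returned the following information:
{chunks}

If the question is ambiguous, or the information covers several different cases and you cannot tell which one the customer means, output a single "#" character.
Otherwise, output the single word "answer".
```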