Create Prompt
After you've clearly outlined the use cases for your project, the next pivotal step is to craft prompts that will power these tasks with the intelligence of Large Language Models (LLMs). Creating effective prompts is essential for eliciting the desired responses from your chosen LLMs, thereby enhancing the functionality and value of your application.
Here’s how to get started with creating prompts:
Select a Task: Begin by associating your new prompt with one of the previously defined tasks. This helps ensure that your prompt is targeted and relevant.
Craft Your Prompt Text: Write the actual text of your prompt. This should be constructed carefully to guide the LLM towards generating the type of response or content that fulfills your use case requirements.
The first and most important step here is to select the LLM you want this prompt to use.
To ensure that our system generates the most relevant and accurate responses for you, please follow these instructions when submitting your request:
The System Text is designed to provide context to the AI about the kind of response or information you're looking for. While optional, including this can help tailor the AI's output more closely to your needs.
If you choose to provide System Text: Think of it as setting the scene for the AI. Describe the overall context or specify the tone, style, or format you expect in the response. For example, if you're turning user text into a LinkedIn post, the System Text could describe the persona traits the post should be written for.
The User Text is where you pose your specific question or request to the AI. This is your main input and should be as clear and detailed as possible.
Crafting Your User Text: Clearly state what you need from the AI. Be specific about your request, and if relevant, include any critical details that will help the AI understand exactly what you're looking for. For instance, if you're seeking a social media post, specify the topic, desired length, and any key points that must be included.
For example, for the LinkedIn use case above, the two fields might look like this (illustrative only; the {variable} placeholder syntax shown here is an assumption and may differ in your workspace):

System Prompt Text: You are a social media copywriter. Write in a professional, engaging tone suited to a {persona} audience.

User Prompt Text: Create a LinkedIn post from the following thoughts: {user_thoughts}. Keep it under {max_words} words and include a call to action.
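Conceptually, the System and User Prompt Texts map to the system and user message roles found in chat-style LLM APIs. The sketch below uses the OpenAI Python client purely for illustration (LangFlair makes this call for you, and the model name is a placeholder):

```python
# Illustration only: how system text and user text typically map to the
# "system" and "user" message roles of a chat-style LLM API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever model you selected
    messages=[
        {"role": "system", "content": "You are a social media copywriter."},
        {"role": "user", "content": "Create a LinkedIn post from the following thoughts: ..."},
    ],
)
print(response.choices[0].message.content)
```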
Further Customizing Your Prompt
After inputting the basic details in the "Create Prompt" screen, you'll move to the editing screen for more advanced configurations. This stage allows you to fine-tune your prompt to achieve the best possible outcomes from your LangFlair API calls.
For scenarios where you need to select specific prompts based on certain criteria, you can use "Filter Params". This feature is particularly useful when managing multiple prompts for the same use case across different platforms, such as creating AI-enhanced posts from user thoughts for LinkedIn, Twitter, etc. Detailed examples and code snippets for utilizing these filters will be provided in later sections of this documentation.
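Until those sections, here is a rough sketch of the idea. The code below is hypothetical: the endpoint URL, the use_case and platform parameter names, and the fetch_prompt helper are placeholders, not the actual LangFlair SDK.

```python
# Hypothetical sketch -- NOT the actual LangFlair SDK. It only illustrates
# selecting one of several prompts for the same use case via filter params.
import requests

PROMPTS_ENDPOINT = "https://api.example.com/v1/prompts"  # placeholder URL

def fetch_prompt(use_case: str, platform: str, api_key: str) -> dict:
    """Fetch the prompt for a use case, narrowed by a 'platform' filter param."""
    response = requests.get(
        PROMPTS_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        params={"use_case": use_case, "platform": platform},  # filter params
    )
    response.raise_for_status()
    return response.json()

# The same use case can resolve to a different prompt per platform:
linkedin_prompt = fetch_prompt("social-post", "linkedin", api_key="YOUR_KEY")
twitter_prompt = fetch_prompt("social-post", "twitter", api_key="YOUR_KEY")
```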
While LangFlair offers the capability to set additional parameters for the LLMs, this option should be approached with caution:
Caution Advised: We strongly recommend against adjusting LLM parameters unless you have a thorough understanding of their implications. Incorrect settings could lead to suboptimal results or unexpected behaviors from the LLM.
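For orientation, these are the kinds of sampling parameters LLM providers commonly expose. The exact names and sensible values vary by model, so treat the snippet below as illustrative rather than a recommended configuration:

```python
# Common LLM sampling parameters (names vary by provider; values illustrative).
llm_params = {
    "temperature": 0.7,  # higher = more varied output, lower = more deterministic
    "top_p": 1.0,        # nucleus-sampling cutoff on the token probability mass
    "max_tokens": 512,   # upper bound on the length of the generated response
}
```

If you are unsure what a parameter does, leaving it at the model's default is the safest choice.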
Before deploying your prompt in a live environment, it's crucial to test its performance and output. The "Test Variables" section facilitates this:
Utilize Test Variables: These are drawn from the variables used in your System and User Prompt Texts. By providing a value for each, you can simulate real-world inputs and assess the prompt's output directly from the console.
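Testing with variables amounts to substituting your sample values into the prompt templates and reviewing the rendered result. A minimal sketch, assuming the {variable} placeholder syntax from the example above (the variable names are illustrative):

```python
# Illustration only: simulating how test variable values fill a prompt template.
user_prompt_template = (
    "Create a LinkedIn post from the following thoughts: {user_thoughts}. "
    "Keep it under {max_words} words and include a call to action."
)

test_variables = {
    "user_thoughts": "We just shipped our new analytics dashboard.",
    "max_words": 120,
}

rendered_prompt = user_prompt_template.format(**test_variables)
print(rendered_prompt)  # review this output before going live
```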
This phase of prompt creation is instrumental in ensuring that your prompts are precisely tailored and optimized for their intended applications, contributing to the overall success of your project with LangFlair.