LangFlair Docs

Create Prompt


Last updated 10 months ago

After you've clearly outlined the use cases for your project, the next pivotal step is to craft prompts that will power these tasks with the intelligence of Large Language Models (LLMs). Creating effective prompts is essential for eliciting the desired responses from your chosen LLMs, thereby enhancing the functionality and value of your application.

Here’s how to get started with creating prompts:

  • Select a Task: Begin by associating your new prompt with one of the previously defined tasks. This helps ensure that your prompt is targeted and relevant.

  • Craft Your Prompt Text: Write the actual text of your prompt. This should be constructed carefully to guide the LLM towards generating the type of response or content that fulfills your use case requirements.

The first and most important step here is to select the LLM model you want to use for this prompt.

To ensure that our system generates the most relevant and accurate responses for you, please follow these instructions when submitting your request:

System Text (Optional):

The System Text is designed to provide context to the AI about the kind of response or information you're looking for. While optional, including this can help tailor the AI's output more closely to your needs.

  • If you choose to provide System Text: Think of it as setting the scene for the AI. Describe the overall context or specify the tone, style, or format you expect in the response. For example, if you're generating a LinkedIn post from user text, the System Text could describe the persona and tone the post should adopt.

User Text:

The User Text is where you pose your specific question or request to the AI. This is your main input and should be as clear and detailed as possible.

  • Crafting Your User Text: Clearly state what you need from the AI. Be specific about your request, and if relevant, include any critical details that will help the AI understand exactly what you're looking for. For instance, if you're seeking a social media post, specify the topic, desired length, and any key points that must be included.

Example System Prompt Text:

You are an AI assistant skilled at crafting engaging LinkedIn posts from provided text input. Your goal is to create a post that captivates the audience, expands on the core theme in an insightful manner, and concludes with a gentle yet meaningful call to action.

Present the output as a final post that can be posted directly on LinkedIn.

{{additional_instructions}} 

Example User Prompt Text:

Here is my thought: {{thought_text}}. Please create a LinkedIn post for it.
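The `{{...}}` placeholders in the example prompts above are template variables that are filled in with real values at call time. As a minimal sketch of how that substitution works (an illustration only, not LangFlair's actual implementation), a simple regex-based renderer looks like this:

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with the supplied values."""
    def substitute(match):
        name = match.group(1)
        return str(variables[name])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

user_prompt = "Here is my thought: {{thought_text}}. Please create a LinkedIn post for it."
rendered = render_template(user_prompt, {"thought_text": "AI is changing how we write"})
```

Any variable used in your System or User Prompt Text must be supplied at call time, or the placeholder would remain unresolved.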

Further Customizing Your Prompt

After inputting the basic details in the "Create Prompt" screen, you'll move to the editing screen for more advanced configurations. This stage allows you to fine-tune your prompt to achieve the best possible outcomes from your LangFlair API calls.

Adding Filter Parameters

For scenarios where you need to select specific prompts based on certain criteria, you can use "Filter Params". This feature is particularly useful when managing multiple prompts for the same use case across different platforms, such as creating AI-enhanced posts from user thoughts for LinkedIn, Twitter, etc. Detailed examples and code snippets for utilizing these filters will be provided in later sections of this documentation.
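Conceptually, filter params let the caller pick one prompt out of several that serve the same task. The sketch below illustrates the idea with hypothetical record fields (`task`, `filter_params`, `name` are placeholders for illustration, not LangFlair's actual schema; the real API calls are covered in later sections):

```python
# Illustrative prompt records; "filter_params" distinguishes platform variants.
prompts = [
    {"task": "social_post", "filter_params": {"platform": "linkedin"}, "name": "LinkedIn v1"},
    {"task": "social_post", "filter_params": {"platform": "twitter"}, "name": "Twitter v1"},
]

def select_prompt(prompts, task, **filters):
    """Return the first prompt matching the task and all given filter params."""
    for p in prompts:
        if p["task"] == task and all(
            p["filter_params"].get(k) == v for k, v in filters.items()
        ):
            return p
    return None

chosen = select_prompt(prompts, "social_post", platform="linkedin")
```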

Optional LLM Parameters

While LangFlair offers the capability to set additional parameters for the LLMs, this option should be approached with caution:

  • Caution Advised: We strongly recommend against adjusting LLM parameters unless you have a thorough understanding of their implications. Incorrect settings could lead to suboptimal results or unexpected behaviors from the LLM.
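Parameters such as `temperature`, `top_p`, and `max_tokens` are common across most LLM providers, and small changes can noticeably alter output quality. One defensive pattern, sketched below with assumed defaults (not LangFlair's actual settings), is to merge overrides onto known-good values and reject anything unrecognized:

```python
# Illustrative defaults; actual supported parameters depend on the chosen LLM.
DEFAULTS = {"temperature": 0.7, "top_p": 1.0, "max_tokens": 512}

def merge_llm_params(overrides: dict) -> dict:
    """Merge user overrides onto defaults, rejecting unknown parameter names."""
    unknown = set(overrides) - set(DEFAULTS)
    if unknown:
        raise ValueError(f"Unknown LLM parameters: {sorted(unknown)}")
    return {**DEFAULTS, **overrides}

params = merge_llm_params({"temperature": 0.2})
```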

Testing Your Prompt

Before deploying your prompt in a live environment, it's crucial to test its performance and output. The "Test Variables" section facilitates this:

  • Utilize Test Variables: These variables are derived from the ones used in your System and User Prompt Texts. By providing values for these variables, you can simulate real-world inputs and assess the prompt's effectiveness directly from the console.
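Since the test variables are derived from the placeholders in your prompt texts, the set of variables to fill in can be recovered mechanically. A minimal sketch (assuming the `{{name}}` placeholder syntax shown earlier):

```python
import re

def extract_variables(*prompt_texts: str) -> list:
    """Collect unique {{variable}} names across the given prompt texts, in order."""
    seen = []
    for text in prompt_texts:
        for name in re.findall(r"\{\{\s*(\w+)\s*\}\}", text):
            if name not in seen:
                seen.append(name)
    return seen

system_text = "You are an AI assistant... {{additional_instructions}}"
user_text = "Here is my thought: {{thought_text}}."
variables = extract_variables(system_text, user_text)
```

Supplying a test value for each extracted variable lets you exercise the fully rendered prompt from the console before it ever reaches production traffic.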

This phase of prompt creation is instrumental in ensuring that your prompts are precisely tailored and optimized for their intended applications, contributing to the overall success of your project with LangFlair.

[Screenshot: Add Prompt to a Task]
[Screenshot: Add Prompt]
[Screenshot: Select the right Model for your prompt]
[Screenshot: Prompt More Options]