Working with LangFlair APIs
LangFlair APIs are designed with a broad range of product and engineering use cases in mind, offering flexible solutions to integrate Large Language Model (LLM) capabilities into your applications. Here's how to get started:
Endpoint
Base URL: https://www.langflair.com/api
Authentication
LangFlair uses team API keys for authentication, which can be generated from the team management screen. To authenticate your API calls, pass the key as a URL parameter.
Example:
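A minimal sketch; the query parameter name api_key and the endpoint placeholder are assumptions, not documented names:

```bash
# Hypothetical sketch: the "api_key" query parameter name is an assumption.
# Replace YOUR_TEAM_API_KEY with the key generated on the team management screen,
# and <endpoint> with the API you are calling.
curl "https://www.langflair.com/api/<endpoint>?api_key=YOUR_TEAM_API_KEY"
```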
Core APIs
LangFlair provides three main APIs for handling both self-managed and LangFlair-managed LLM calls:
1. Prompt API
Use this API to retrieve the prompt for a given use case if you plan to handle LLM calls on your own.
CURL Command:
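The exact request is not reproduced here; the sketch below assumes a /prompt path and "use_case" and "api_key" parameter names (all assumptions), plus the optional "filter_params" described next.

```bash
# Hypothetical sketch: the /prompt path and the "api_key"/"use_case"
# parameter names are assumptions; "filter_params" is optional.
curl -X POST "https://www.langflair.com/api/prompt?api_key=YOUR_TEAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "use_case": "welcome_email",
        "filter_params": {"language": "en"}
      }'
```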
Here "filter_params" are optional. They are required if you have created prompts for given use case with filters.
Output Format:
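The exact response schema is not reproduced here; a plausible shape, with all field names assumed, might look like:

```json
{
  "system_prompt": "You are a helpful onboarding assistant...",
  "user_prompt": "Write a welcome email for {{user_name}}.",
  "llm": "gpt-4"
}
```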
You can now take the returned system and user prompt text and make the specified LLM call on your end.
2. Execution API (Recommended)
For an end-to-end managed LLM call response, use this API. LangFlair manages the LLM calls, simplifying the process.
CURL Command:
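The exact request is not reproduced here; the sketch below assumes an /execute path and "use_case" and "api_key" parameter names (all assumptions); the remaining fields are explained after the example.

```bash
# Hypothetical sketch: the /execute path and the "api_key"/"use_case"
# parameter names are assumptions; the other fields are described below.
curl -X POST "https://www.langflair.com/api/execute?api_key=YOUR_TEAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "use_case": "welcome_email",
        "template_params": {"user_name": "Ada"},
        "filter_params": {"language": "en"},
        "context_id": "session-1234"
      }'
```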
Here "template_params" correspond to variables in both system and user prompt texts. "filter_params" are optional for refining prompt selections by additional criteria. "context_id" and "previous_messages," also optional, enhance LLM interactions with prior context. Use a unique "context_id" for LangFlair-managed contexts, or manually provide "previous_messages" for alternative context handling.
Output Format:
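The exact response schema is not reproduced here; a plausible shape (field names other than prompt_call_id are assumed) could be:

```json
{
  "prompt_call_id": "pc_0123456789",
  "response": "Hi Ada, welcome aboard!"
}
```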
Make a note of the prompt_call_id, as it is used by the Feedback API below.
3. Feedback API
Capture end-user feedback on the product experience generated from LLM output. This data is invaluable for prompt optimization.
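No request example is shown for this API; a minimal sketch, assuming a /feedback path and "rating"/"comment" fields (all assumptions), might look like:

```bash
# Hypothetical sketch: the /feedback path, the "api_key" parameter, and the
# "rating"/"comment" fields are assumptions; prompt_call_id comes from the
# Execution API response above.
curl -X POST "https://www.langflair.com/api/feedback?api_key=YOUR_TEAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "prompt_call_id": "pc_0123456789",
        "rating": 5,
        "comment": "The generated email was spot on."
      }'
```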
Library Support
Currently, LangFlair does not offer Python or Node libraries for calling various LLMs from user code. We are actively working on developing these resources and will keep our users updated on our progress.
By leveraging these APIs, you can seamlessly integrate sophisticated LLM functionalities into your applications, enhancing them with AI-powered content generation, analysis, and interaction capabilities.