garak.generators.langchain_serve

class garak.generators.langchain_serve.LangChainServeLLMGenerator(name=None, config_root=<module 'garak._config'>)

Bases: Generator

Class supporting LangChain Serve LLM interfaces via HTTP POST requests.

This class facilitates communication with LangChain Serve’s LLMs through a web API, making it possible to utilize external LLMs not directly integrated into the LangChain library. It requires setting up an API endpoint using LangChain Serve.

The generator sends prompts to the specified LLM via HTTP POST and retrieves the generated text response. Ensure the API endpoint is correctly set up and accessible before running.

Inherits from Garak’s base Generator class, extending its capabilities to support web-based LLM services. The API endpoint is set through the ‘LANGCHAIN_SERVE_URI’ environment variable, which should be the base URI of the LangChain Serve deployment. The ‘invoke’ endpoint is then appended to this URI to form the full API endpoint URL.
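The URI handling described above can be sketched as follows. This is a minimal illustration, not garak's actual implementation; the `rstrip`-based joining is an assumption about how a trailing slash would be tolerated.

```python
import os

# Assume the deployment base URI has been exported, as in the example below.
os.environ["LANGCHAIN_SERVE_URI"] = "http://127.0.0.1:8000/rag-chroma-private"

# The 'invoke' endpoint is appended to the base URI to form the full URL.
base_uri = os.environ["LANGCHAIN_SERVE_URI"]
invoke_url = f"{base_uri.rstrip('/')}/invoke"
print(invoke_url)  # http://127.0.0.1:8000/rag-chroma-private/invoke
```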

Example of setting up the environment variable:

export LANGCHAIN_SERVE_URI=http://127.0.0.1:8000/rag-chroma-private
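With the environment variable set, the POST request to the invoke endpoint can be sketched as below. The `{"input": ...}` payload shape follows common LangServe conventions and is an assumption here, not a guarantee about garak's wire format; the request is constructed but not sent, since sending requires a running deployment.

```python
import json
import urllib.request

# Full endpoint URL: base URI from LANGCHAIN_SERVE_URI plus '/invoke'.
invoke_url = "http://127.0.0.1:8000/rag-chroma-private/invoke"

# Assumed payload shape for a LangServe-style deployment.
payload = json.dumps({"input": "Briefly explain retrieval-augmented generation."}).encode()

request = urllib.request.Request(
    invoke_url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(request.get_method(), request.full_url)

# Sending the request requires a live LangChain Serve deployment:
# with urllib.request.urlopen(request) as resp:
#     print(json.loads(resp.read()))
```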

DEFAULT_PARAMS = {'config_hash': 'default', 'context_len': None, 'max_tokens': 150, 'temperature': None, 'top_k': None}
ENV_VAR = 'LANGCHAIN_SERVE_URI'
config_hash = 'default'
generator_family_name = 'LangChainServe'