garak.generators.langchain
LangChain generator support
- class garak.generators.langchain.LangChainLLMGenerator(name='', config_root=<module 'garak._config'>)
Bases: Generator
Class supporting LangChain LLM interfaces
- See LangChain’s documentation for its list of supported models.
Calls invoke() with the prompt and relays the response. There is no per-LLM specific checking, so make sure the right environment variables are set.
Set --model_name to the LLM type required.
Explicitly, garak delegates the majority of responsibility here:
the generator calls invoke() on the LLM, which seems to be the most widely supported method
LangChain-relevant environment variables need to be set by the user before running
There’s no support for chains, just the LangChain LLM interface.
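The invoke() relay described above can be sketched as follows. StubLLM is a hypothetical stand-in for a real LangChain LLM (e.g. one from langchain_openai), so the pattern runs without credentials or network access; it is an illustration of the interface, not garak's actual implementation.

```python
# Minimal sketch of the interface LangChainLLMGenerator relies on: a
# LangChain LLM exposes invoke(), which takes a prompt string and returns
# the model's completion. StubLLM is a hypothetical stand-in for a real
# LangChain LLM class.
class StubLLM:
    def invoke(self, prompt: str) -> str:
        # A real LangChain LLM would call the provider's API here,
        # reading credentials from environment variables.
        return f"echo: {prompt}"

def generate(llm, prompt: str) -> str:
    # Mirrors what the generator does per call: relay the prompt to
    # invoke() and return the response unmodified.
    return llm.invoke(prompt)

print(generate(StubLLM(), "hello"))  # → echo: hello
```

Because only invoke() is used, any LangChain LLM class that supports it should slot in without generator-side changes.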
- DEFAULT_PARAMS = {'context_len': None, 'frequency_penalty': 0.0, 'k': 0, 'max_tokens': 150, 'p': 0.75, 'presence_penalty': 0.0, 'preset': None, 'stop': [], 'temperature': 0.75, 'top_k': None}
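The defaults above are merged with any user-supplied generator options. A simplified sketch of that merge, assuming a plain dict-override semantics (garak's actual configuration machinery does more validation than this):

```python
# Subset of the generator's defaults, copied from DEFAULT_PARAMS above.
DEFAULT_PARAMS = {
    "temperature": 0.75,
    "max_tokens": 150,
    "top_k": None,
    "stop": [],
}

def resolve_params(overrides: dict) -> dict:
    # User-supplied values win over the defaults; keys the user does not
    # set fall through to DEFAULT_PARAMS. This is a sketch of the merge,
    # not garak's real config code.
    return {**DEFAULT_PARAMS, **overrides}

params = resolve_params({"temperature": 0.2})
print(params["temperature"], params["max_tokens"])  # → 0.2 150
```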
- generator_family_name = 'LangChain'