garak.generators.litellm

LiteLLM model support

Support for LiteLLM, which allows calling LLM APIs using the OpenAI format.

Depending on the provider inferred from the model name, LiteLLM automatically reads the API key from the corresponding environment variable (e.g. OPENAI_API_KEY for OpenAI models).
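For instance, a run against an OpenAI model could set the key in the process environment before garak starts. This is a minimal sketch; the key value is a placeholder:

```python
import os

# LiteLLM looks up the provider's key in the environment; OpenAI models
# use OPENAI_API_KEY. Replace the placeholder with a real key.
os.environ["OPENAI_API_KEY"] = "sk-..."
```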

For example, supply a JSON generator option file like this to connect LiteLLM to Ollama's OpenAI-compatible API:

```json
{
    "litellm": {
        "LiteLLMGenerator": {
            "api_base": "http://localhost:11434/v1",
            "provider": "openai"
        }
    }
}
```

Then, when invoking garak, pass it the path to the generator option file:

```
python -m garak --model_type litellm --model_name "phi" --generator_option_file ollama_base.json -p dan
```

class garak.generators.litellm.LiteLLMGenerator(name: str = '', generations: int = 10, config_root=<module 'garak._config'>)

Bases: Generator

Generator wrapper using LiteLLM to allow access to different providers using the OpenAI API format.

DEFAULT_PARAMS = {'context_len': None, 'frequency_penalty': 0.0, 'max_tokens': 150, 'presence_penalty': 0.0, 'stop': ['#', ';'], 'temperature': 0.7, 'top_k': None, 'top_p': 1.0}
generator_family_name = 'LiteLLM'
supports_multiple_generations = True
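As a minimal sketch of programmatic use, assuming the base Generator exposes a generate(prompt) method returning a list of completions (this interface has changed across garak releases, so check the installed version's Generator base class):

```python
from garak.generators.litellm import LiteLLMGenerator

# Name an OpenAI-format model; LiteLLM resolves the provider and reads
# the matching API key (e.g. OPENAI_API_KEY) from the environment.
generator = LiteLLMGenerator(name="gpt-3.5-turbo", generations=2)

# Assumption: generate(prompt) returns a list of completions
# (LiteLLMGenerator sets supports_multiple_generations = True).
for output in generator.generate("Say hello."):
    print(output)
```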