garak.generators.octo
OctoML LLM interface
- class garak.generators.octo.InferenceEndpoint(name='', config_root=<module 'garak._config'>)
Bases: OctoGenerator
Interface for OctoAI private endpoints
Pass the model URL as the name, e.g. https://llama-2-70b-chat-xxx.octoai.run/v1/chat/completions
This module tries to guess the internal model name, storing it in self.octo_model. We don't have access to private models, so we don't know the format; if garak guesses wrong, please open a ticket.
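One plausible way to guess a model name from a private endpoint URL is to take the first subdomain of the hostname and strip the trailing deployment suffix. This is an illustrative sketch only, not garak's actual implementation; the function name and the assumed URL format are hypothetical.

```python
from urllib.parse import urlparse

def guess_octo_model(endpoint_url: str) -> str:
    """Sketch: infer a model name from a private OctoAI endpoint URL.

    Assumes the hostname looks like '<model-name>-<suffix>.octoai.run',
    e.g. 'llama-2-70b-chat-xxx.octoai.run'. Real private endpoints may
    use a different format.
    """
    host = urlparse(endpoint_url).hostname or ""
    subdomain = host.split(".")[0]       # e.g. 'llama-2-70b-chat-xxx'
    # Drop the trailing deployment suffix after the last hyphen.
    return subdomain.rsplit("-", 1)[0]   # e.g. 'llama-2-70b-chat'

print(guess_octo_model("https://llama-2-70b-chat-xxx.octoai.run/v1/chat/completions"))
```

A guess like this fails for model names without a hyphenated suffix, which is why the docs above ask users to open a ticket when the guess is wrong.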
- class garak.generators.octo.OctoGenerator(name='', config_root=<module 'garak._config'>)
Bases: Generator
Interface for OctoAI public endpoints
Pass the model name as the name, e.g. llama-2-13b-chat-fp16. For more details, see https://octoai.cloud/tools/text.
- DEFAULT_PARAMS = {'context_len': None, 'max_tokens': 128, 'presence_penalty': 0, 'temperature': 0.1, 'top_k': None, 'top_p': 1}
- ENV_VAR = 'OCTO_API_TOKEN'
- generator_family_name = 'OctoAI'
- supports_multiple_generations = False
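To show how the class attributes above fit together, here is a minimal sketch of reading the API token from the documented environment variable and merging per-run overrides into the default generation parameters. The merge logic is illustrative; garak's actual configuration plumbing works through config_root, not a plain dict update.

```python
import os

# Copied from the DEFAULT_PARAMS documented above.
DEFAULT_PARAMS = {
    "context_len": None,
    "max_tokens": 128,
    "presence_penalty": 0,
    "temperature": 0.1,
    "top_k": None,
    "top_p": 1,
}

# ENV_VAR documented above; the token must be set before running garak.
api_token = os.environ.get("OCTO_API_TOKEN")

# Hypothetical override merge: user-supplied values shadow the defaults.
overrides = {"temperature": 0.7, "max_tokens": 256}
params = {**DEFAULT_PARAMS, **overrides}
```

Because supports_multiple_generations is False, garak will call the generator once per requested generation rather than batching them.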