Looks like OpenAI is hardcoded here; I think it should reference the model I've set up in my settings (a local server on my network).
```python
def generate(prompt, engine, api_base, api_key):
    openai.base_url, openai.api_key = api_base + '/', api_key
    #print('calling engine', engine, 'at endpoint', openai.api_base)
    #print('prompt:', prompt)
    response = openai.completions.create(prompt=prompt,
                                         max_tokens=1,
                                         n=1,
                                         temperature=0,
                                         logprobs=100,
                                         model=engine).dict()
    return response
```
(loom/util/multiverse_util.py, lines 8 to 18 in 91ca920)
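One way to avoid mutating the module-level `openai` globals would be to build the request from whatever endpoint the settings supply. As a minimal sketch (not the project's actual code), here is the same call expressed with only the standard library against an OpenAI-compatible `/completions` route; the `api_base`/`api_key` parameter names mirror the snippet above, and with the SDK the equivalent fix would be constructing a per-call `openai.OpenAI(base_url=..., api_key=...)` client:

```python
import json
import urllib.request

def generate(prompt, engine, api_base, api_key):
    """POST a completion request to whatever endpoint the caller's
    settings provide (e.g. a local OpenAI-compatible server).

    Sketch only: assumes api_base points at an OpenAI-compatible API root.
    """
    req = urllib.request.Request(
        api_base.rstrip('/') + '/completions',
        data=json.dumps({
            'prompt': prompt,
            'max_tokens': 1,
            'n': 1,
            'temperature': 0,
            'logprobs': 100,
            'model': engine,
        }).encode('utf-8'),
        headers={
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {api_key}',
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

With this shape, the endpoint is chosen entirely by the caller, so a settings object holding a LAN server URL would just be passed in as `api_base`.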