Are You on the Latest version?
You did a git pull and are running the latest version/build?
Yes
Please describe the bug
I downloaded and started a local LLM (mistral-7b-instruct-v0.2.Q8_0.llamafile).
A new console (DOS-style) window opened showing the LLAMAFILE logo:
software: llamafile 0.9.0
model: mistral-7b-instruct-v0.2.Q8_0.llamafile
compute: Intel Core i9-10900F CPU @ 2.80GHz (skylake)
server: http://127.0.0.1:8080/
but when I go to Remote LLM Chat (Horizontal), select Llama.cpp,
and send a simple "Hello" to the bot, this error appears:
Llama: Error occurred while processing summary with llama: ('Connection aborted.', ConnectionResetError(10054, "Connessione in corso interrotta forzatamente dall'host remoto", None, 10054, None))
(The Italian message translates to: "An existing connection was forcibly closed by the remote host.")
And in the llamafile console window:
software: llamafile 0.9.0
model: mistral-7b-instruct-v0.2.Q8_0.llamafile
compute: Intel Core i9-10900F CPU @ 2.80GHz (skylake)
server: http://127.0.0.1:8080/
A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.
libc++abi: terminating due to uncaught exception of type nlohmann::json_abi_v3_11_3::detail::type_error: [json.exception.type_error.306] cannot use value() with string
error: Uncaught SIGABRT (SI_TKILL) at 0 on DESKTOP-VUAULBG pid 25492 tid 12056
/E/tldw/tldw/llamafile.exe
Protocol not available
Windows Cosmopolitan 4.0.2 MODE=x86_64 DESKTOP-VUAULBG 10.0
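For what it's worth, the client-side ConnectionResetError (10054) looks like a symptom of this server-side SIGABRT rather than a separate network problem: when the llamafile process aborts mid-request, the OS resets the open TCP connection and the chat client reports error 10054. A minimal sketch (hypothetical port, plain sockets, not llamafile's actual code) that reproduces the same client-side error by aborting the server end of a connection:

```python
import socket
import struct
import threading

# Hypothetical local port; llamafile listens on 8080, but any free port works here.
HOST, PORT = "127.0.0.1", 18080

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind((HOST, PORT))
srv.listen(1)

def crashing_server():
    """Accept one connection, then abort it with a TCP RST,
    mimicking a server process that dies mid-request."""
    conn, _ = srv.accept()
    conn.recv(1024)  # read the client's request
    # SO_LINGER with a zero timeout makes close() send RST instead of FIN:
    conn.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER, struct.pack("ii", 1, 0))
    conn.close()
    srv.close()

t = threading.Thread(target=crashing_server)
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, PORT))
client.sendall(b"POST /completion HTTP/1.1\r\n\r\n")
try:
    client.recv(1024)  # the server-side reset surfaces here
    result = "no error"
except ConnectionResetError:
    # Reported as WinError 10054 on Windows, errno 104 (ECONNRESET) on Linux.
    result = "connection reset"
finally:
    client.close()
t.join()
print(result)
```

So the connection error in the chat UI and the libc++abi/SIGABRT crash in the console are the same event, and the JSON type_error in the server is the likely root cause.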
Is the bug reproducible reliably?
Yes
Desktop (please complete the following information):
OS: Windows 10
Browser: Chrome
Date you performed the last git pull: Today