
libc++abi: terminating due to uncaught exception of type nlohmann::json_abi_v3_11_3 #550

Open
blademckain opened this issue Feb 14, 2025 · 2 comments

@blademckain

Are you on the latest version?
You did a git pull and are running the latest version/build?
Y

Please describe the bug
I downloaded and started a local LLM (mistral-7b-instruct-v0.2.Q8_0.llamafile).
A new console window opened with the LLAMAFILE logo:
software: llamafile 0.9.0
model: mistral-7b-instruct-v0.2.Q8_0.llamafile
compute: Intel Core i9-10900F CPU @ 2.80GHz (skylake)
server: http://127.0.0.1:8080/

but when I go to Remote LLM Chat (Horizontal) -> select Llama.cpp
and send a simple "Hello" to the bot, this error appears:
Llama: Error occurred while processing summary with llama: ('Connection aborted.', ConnectionResetError(10054, "An existing connection was forcibly closed by the remote host", None, 10054, None))

And in the llamafile console/debug window:
software: llamafile 0.9.0
model: mistral-7b-instruct-v0.2.Q8_0.llamafile
compute: Intel Core i9-10900F CPU @ 2.80GHz (skylake)
server: http://127.0.0.1:8080/

A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the human's questions.

libc++abi: terminating due to uncaught exception of type nlohmann::json_abi_v3_11_3::detail::type_error: [json.exception.type_error.306] cannot use value() with string

error: Uncaught SIGABRT (SI_TKILL) at 0 on DESKTOP-VUAULBG pid 25492 tid 12056
/E/tldw/tldw/llamafile.exe
Protocol not available
Windows Cosmopolitan 4.0.2 MODE=x86_64 DESKTOP-VUAULBG 10.0
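To isolate whether the crash happens in llamafile's own request parsing (rather than in the app), one can bypass the UI and POST directly to the server shown in the log. A minimal diagnostic sketch in Python, assuming the llama.cpp-style `/completion` endpoint that llamafile serves; the helper name `build_completion_request` is hypothetical, not part of either project:

```python
import json
import urllib.request

def build_completion_request(prompt: str, n_predict: int = 64) -> urllib.request.Request:
    """Build a POST to the llama.cpp-style /completion endpoint.

    The server parses this body with nlohmann::json; a field with an
    unexpected type (e.g. a non-string "prompt") is the kind of input
    that can raise json.exception.type_error on the server side.
    """
    body = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode()
    return urllib.request.Request(
        "http://127.0.0.1:8080/completion",  # server URL from the log above
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("Hello")
# To actually send it (requires the llamafile server to be running):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

If a well-formed request like this still crashes the server, the bug is in llamafile; if it succeeds, the app's request payload is the suspect.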

Is the bug reproducible reliably?
Yes

Desktop (please complete the following information):

  • OS: Windows 10
  • Browser: Chrome
  • Date you performed the last git pull: Today
@blademckain blademckain added the bug Something isn't working label Feb 14, 2025
rmusser01 (Owner) commented Feb 14, 2025

That looks like an error in llamafile and not my app. Please try with the llama.cpp server binary and see if the same error occurs.


rmusser01 (Owner) commented:

Adding to this, I verified that this behavior occurs while using llamafile and not llama.cpp.

Without doing too much testing, I believe that it's related to the JSON library llamafile uses and its handling of parsing errors.
