Elektrine

Don Antonio Magino

@DonAntonioMagino@feddit.nl

The most learned Dr. Antonio Magino, professor and mathematician of the city of Bologna in Lombardy.

0 Followers
0 Following
Joined June 11, 2023

Posts

In reply to DonAntonioMagino
@DonAntonioMagino@feddit.nl · Apr 10, 2026

I couldn’t get it to install… Something about ldconfig not being in the path.

I’ll try pip later, then…

EDIT: never mind, this is a barebones version (‘lmster’) anyway.

But going back to version 0.3.39-2 (which I found here) works perfectly, so great!

EDIT AGAIN: not quite perfectly. It wouldn’t start Gemma 4, giving an error about its architecture. Seems like the model is too new? I’m now trying 0.4.6-1, and this version does run Gemma 4.
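The ldconfig complaint usually means the installer couldn’t find ldconfig on the PATH: on openSUSE it lives in /sbin, which a regular user’s shell often doesn’t include. A minimal check, assuming a standard Leap layout:

```shell
# ldconfig normally sits in /sbin (or /usr/sbin), which regular user
# shells may not have on PATH. Append those directories if missing:
case ":$PATH:" in
  *:/sbin:*|*:/usr/sbin:*) : ;;               # already reachable
  *) export PATH="$PATH:/sbin:/usr/sbin" ;;
esac
command -v ldconfig                           # prints its location if found
```

Re-running the installer in the same shell after this should get past the PATH check, though whatever the installer does next is its own business.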

In reply to DonAntonioMagino
@DonAntonioMagino@feddit.nl · Apr 09, 2026

You mean the Debian installer? Seems like a bad idea on OpenSUSE.

EDIT: looking through the bug reports on GitHub, I’ve found a very recent report identical to what I’ve described, so the issue doesn’t seem to be isolated, at least.

In reply to DonAntonioMagino
@DonAntonioMagino@feddit.nl · Apr 09, 2026
It detects my hardware at least, but for whatever reason the Hub is empty and I can’t download the default Jan model... But it works when I import the models manually.
In reply to DonAntonioMagino
@DonAntonioMagino@feddit.nl · Apr 09, 2026
Thanks for the tips. How do I go about doing this? What I’ve tried is running LM Studio from the console with --no-sandbox. No idea if that has something to do with what you’re referring to. I’ve also tried running it with sudo, which gives this error message:

::: spoiler error output
[15557:0409/215138.000614:ERROR:ui/ozone/platform/x11/ozone_platform_x11.cc:249] Missing X server or $DISPLAY
[15557:0409/215138.000647:ERROR:ui/aura/env.cc:257] The platform failed to initialize. Exiting.
Segmentatiefout (segmentation fault)
:::
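The “Missing X server or $DISPLAY” error isn’t specific to LM Studio: plain sudo resets the environment, so the X variables a GUI app needs are gone by the time it starts. The effect can be simulated without root using env -i:

```shell
# Plain sudo strips most of the caller's environment, so DISPLAY and
# XAUTHORITY vanish before the GUI process starts; env -i reproduces
# that stripped environment for demonstration:
env -i sh -c 'echo "DISPLAY is: ${DISPLAY:-<unset>}"'
# prints: DISPLAY is: <unset>

# If running as root were really necessary, sudo -E preserves the
# caller's environment (filename below is hypothetical):
#   sudo -E ./LM-Studio.AppImage --no-sandbox
```

That said, running a desktop app under sudo is rarely the right fix; the segfault after the X error suggests the app simply can’t cope with the missing display.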
@DonAntonioMagino@feddit.nl in linux · Apr 09, 2026

[(Pretty much) solved] Can’t get LM Studio to work on OpenSUSE Leap 16

Solution: download an earlier version of the LM Studio AppImage. Or 0.4.6-1, which works better for me.

I recently bought a new computer, and, after again trying Windows 11 for a bit, I decided I wanted to keep using OpenSUSE. Probably unpopular here: I enjoy screwing around with local LLM models. I used LM Studio on my old computer (on which I had also installed OpenSUSE Leap). I also tested it on my new computer in Windows 11, and it worked very nicely.

Now I’m trying on my new computer with OpenSUSE Leap 16, and it doesn’t work at all. Specifically: no runtimes or engines are present, and my hardware isn’t recognised at all - not my GPU, not my CPU, nothing. I’m thinking it’s a driver issue. I’ve looked around quite a bit, and also looked up (what seem to me) the most important error messages I got when running the AppImage from the console:

::: spoiler console output
[BackendManager] Surveying hardware with backends with options: {"type":"newAndSelected"}
[BackendManager] Surveying new engine 'llama.cpp-linux-x86_64-avx2@2.12.0'
[ProcessForkingProvider][NodeProcessForker] Spawned process 13407
[ProcessForkingProvider][NodeProcessForker] Exited process 13407
21:17:29.644 › Failed to survey hardware with engine 'llama.cpp-linux-x86_64-avx2@2.12.0': LMSCore load lib failed - child process with PID 13407 exited with code 127
[BackendManager] Survey for engine 'llama.cpp-linux-x86_64-avx2@2.12.0' took 9.47ms
[BackendManager] Surveying new engine 'llama.cpp-linux-x86_64-nvidia-cuda-avx2@2.12.0'
[ProcessForkingProvider][NodeProcessForker] Spawned process 13408
[ProcessForkingProvider][NodeProcessForker] Exited process 13408
21:17:29.648 › Failed to survey hardware with engine 'llama.cpp-linux-x86_64-nvidia-cuda-avx2@2.12.0': LMSCore load lib failed - child process with PID 13408 exited with code 127
[BackendManager] Survey for engine 'llama.cpp-linux-x86_64-nvidia-cuda-avx2@2.12.0' took 3.70ms
[BackendManager] Surveying new engine 'llama.cpp-linux-x86_64-vulkan-avx2@2.12.0'
[ProcessForkingProvider][NodeProcessForker] Spawned process 13409
[ProcessForkingProvider][NodeProcessForker] Exited process 13409
21:17:29.651 › Failed to survey hardware with engine 'llama.cpp-linux-x86_64-vulkan-avx2@2.12.0': LMSCore load lib failed - child process with PID 13409 exited with code 127
[BackendManager] Survey for engine 'llama.cpp-linux-x86_64-vulkan-avx2@2.12.0' took 3.57ms
:::

This is my system with installed drivers:

I did get Ollama to work… Any thoughts?
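The recurring “exited with code 127” in those logs is the conventional exit status for “command not found”, and for a dynamically linked binary it also appears when the loader can’t resolve a shared library - which would fit a driver or runtime-library problem. A quick demonstration of the status, plus the usual diagnostic:

```shell
# 127 is the shell's exit status when a command cannot be found at all:
sh -c 'no-such-command-here' 2>/dev/null
echo "exit status: $?"                 # prints: exit status: 127

# For a binary that exists but dies at load time, ldd lists its shared
# library dependencies; any "not found" line is the missing one. Shown
# on /bin/ls here as a stand-in for an LM Studio engine binary:
ldd /bin/ls | grep 'not found' || echo "all libraries resolved"
```

Running ldd against whichever file the failing llama.cpp engine ships as (the exact path inside LM Studio’s data directory isn’t shown in the logs) would confirm which library the child process couldn’t load.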