Local desktop runtime
LM Studio
GUI-first local LLMs: discover models, chat, ship a localhost API — popular with evaluators.
Official site

Axes (0–100)
- Local control / custody: 86
- Open stack (models & tooling): 78
- Regulatory posture (curated): 38
- Interoperability: 72
Last reviewed: 2026-04-07
Facts (curated)
- Positioning
- Closed-source app, free for personal use; strong model browser and one-click runs on Mac/Win/Linux.
- Vs QVAC
- Excellent for quickly trying model weights, but it does not offer QVAC's vertically integrated Tether product and device-agent story.
Pros
Lowest-friction option for users who avoid the terminal but still want local inference and an OpenAI-compatible localhost server.
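To illustrate what "OpenAI-compatible localhost serving" means in practice, here is a minimal sketch that builds a standard chat-completions request against a locally running server. The port (1234 is LM Studio's documented default) and the placeholder model name are assumptions; adjust them to your local setup. The sketch constructs the endpoint and wire-format payload without sending it:

```python
import json

# LM Studio's local server defaults to port 1234 (configurable in the app).
BASE_URL = "http://localhost:1234/v1"

# Standard OpenAI chat-completions payload; any OpenAI-compatible client
# (official SDKs, curl, LangChain, etc.) can point at BASE_URL instead of
# api.openai.com and work unchanged.
payload = {
    "model": "local-model",  # placeholder: LM Studio serves the loaded model
    "messages": [
        {"role": "user", "content": "Say hello in one sentence."}
    ],
    "temperature": 0.7,
}

endpoint = f"{BASE_URL}/chat/completions"
body = json.dumps(payload)
print(endpoint)
```

Because the wire format matches OpenAI's, existing tooling needs only a base-URL change to target the local server.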
Cons / risks
Enterprise features (RBAC, audit logging, SLAs) are not part of the default offering; deployments are often paired with other tooling to fill those gaps.
Related links
- LM Studio — local LLM desktop app
LM Studio · 2026-03-22