
LM Studio

Your local AI toolkit.

Free

Description

LM Studio provides a user-friendly desktop interface for discovering, downloading, and running open-source Large Language Models (LLMs) directly on your personal computer. Designed for simplicity, it lets users operate models such as Llama, DeepSeek, Mistral, and Phi entirely offline, keeping all processing, and therefore your data, on your own machine. The application supports Windows, Apple Silicon Macs (M1-M4), and Linux.

Beyond basic chat, LM Studio can run a local inference server that exposes downloaded models to other applications through an API. It also supports Retrieval-Augmented Generation (RAG), letting you chat with your local documents. While the main GUI application is proprietary, LM Studio offers open-source SDKs for Python and TypeScript for building local AI-powered applications. Under the hood it builds on inference engines such as llama.cpp, providing efficient model execution without requiring users to manage complex setups.
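As a sketch of how such a local API is typically consumed: the server speaks an OpenAI-compatible REST protocol, so any HTTP client can talk to it. The snippet below builds a chat-completion request using only the Python standard library. The host, port (1234 is a common default), and the model name are illustrative assumptions, and the actual network call is left commented out so the sketch stands alone without a running server.

```python
import json
import urllib.request

# Assumption: the local server listens on port 1234 and exposes an
# OpenAI-compatible /v1/chat/completions endpoint.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Construct an OpenAI-style chat completion request for the local server."""
    payload = {
        "model": model,  # hypothetical identifier; substitute a model you have loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize llama.cpp in one sentence.")
# To actually send it (requires the local server to be running):
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the protocol is OpenAI-compatible, existing client libraries can usually be pointed at the local base URL instead of a cloud endpoint.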

Key Features

  • Run LLMs Locally & Offline: Operate models on your computer without an internet connection.
  • Model Discovery & Download: Find and download open-source LLMs (GGUF, MLX formats).
  • Local LLM Server: Host models locally for use via an API.
  • Chat Interface: Interact with downloaded LLMs through a simple UI.
  • Chat with Local Documents (RAG): Use Retrieval-Augmented Generation with your files.
  • Cross-Platform Support: Works on Mac (M1-M4), Windows (x86/ARM), and Linux (x86).
  • Developer SDKs: Python and TypeScript libraries for building local AI apps.
  • Beginner Friendly: Get started with local LLMs without prior expertise.
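The RAG feature above can be illustrated with a minimal retrieval step. Assuming a local embedding model produces a vector per document chunk (the toy vectors below stand in for real embeddings), a naive retriever ranks chunks by cosine similarity to the query embedding; only this ranking math, which is pure Python, is shown.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec: list[float], chunks: list[tuple[str, list[float]]], k: int = 3) -> list[str]:
    """Return the k chunk texts most similar to the query embedding."""
    ranked = sorted(chunks, key=lambda c: cosine_similarity(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy 2-d vectors stand in for real embeddings from an embedding model.
docs = [("about cats", [1.0, 0.0]), ("about dogs", [0.0, 1.0])]
print(top_k([0.9, 0.1], docs, k=1))  # → ['about cats']
```

In a full pipeline, the retrieved chunks would be prepended to the chat prompt so the model can answer from your documents.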

Use Cases

  • Running LLMs privately without data leaving the local machine.
  • Developing and testing AI applications using a local inference server.
  • Experimenting with various open-source LLMs easily.
  • Interacting with local documents for information retrieval or analysis.
  • Exploring LLM capabilities without requiring cloud services.

Frequently Asked Questions

Does LM Studio collect any data?

No. Your data remains private and local to your machine. LM Studio is designed for privacy and offline operation.

What are the minimum hardware / software requirements?

LM Studio runs on Apple Silicon Macs (M1/M2/M3/M4), Windows PCs (x86 or ARM), and Linux PCs (x86). On x86 systems, a processor with AVX2 support is required.

What kind of models can I run?

You can run compatible Large Language Models (LLMs) from Hugging Face in GGUF (llama.cpp) format, as well as MLX format (Mac only). GGUF text embedding models are also supported. Image generation models are not yet supported.

Is LM Studio open source?

The LM Studio GUI app is not open source. However, LM Studio's CLI (lms), Core SDK, and MLX inferencing engine are MIT-licensed and open source.

Can I use LM Studio at my company or organization?

Please fill out the LM Studio @ Work request form for inquiries about organizational use.
