| Function | Description |
| --- | --- |
| check_lms_version | Check whether the installed LM Studio CLI meets the minimum version requirement |
| has_lms | Check if LM Studio CLI is installed |
| install_lmstudio | Help the user install or update LM Studio |
| list_models | List available models |
| lms_chat | Chat completion with LM Studio |
| lms_chat_batch | Batch chat completion with LM Studio |
| lms_chat_native | Chat completion via the native API |
| lms_chat_openai | Chat completion via the OpenAI-compatible API |
| lms_chat_openresponses | Chat completion via the OpenResponses API |
| lms_daemon_start | Start the LM Studio headless daemon |
| lms_daemon_status | Check the global status of LM Studio |
| lms_daemon_stop | Stop the LM Studio headless daemon |
| lms_download | Download a model via the REST API |
| lms_download_status | Get the status of a download job |
| lms_load | Load a model via the REST API |
| lms_path | Get the absolute path to the lms executable |
| lms_score_expected | Calculate expected scores and uncertainty from logprobs |
| lms_server_start | Start the LM Studio local server |
| lms_server_status | Check the status of the LM Studio server |
| lms_server_stop | Stop the LM Studio local server |
| lms_unload | Unload a model from memory via the REST API |
| lms_unload_all | Unload all models from memory |
| with_lms_daemon | Run code with the LM Studio daemon active |
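
The chat helpers above ultimately talk to LM Studio's OpenAI-compatible endpoint, which the local server exposes at `http://localhost:1234/v1` by default. As a rough illustration of the request such a wrapper sends, here is a minimal Python sketch; the model identifier is hypothetical, and this shows the general shape of an OpenAI-compatible chat request rather than this package's actual implementation:

```python
import json
import urllib.request

def build_chat_request(model, messages, base_url="http://localhost:1234/v1"):
    """Construct an HTTP request for an OpenAI-compatible chat completion.

    base_url defaults to LM Studio's documented local server address.
    """
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical model name; use whatever list_models reports on your machine.
req = build_chat_request(
    "qwen2.5-7b-instruct",
    [{"role": "user", "content": "Hello"}],
)
# Sending it (requires a running server, e.g. after lms_server_start):
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The request is only dispatched when the local server is up, which is why helpers such as `with_lms_daemon` wrap user code between a daemon start and stop.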