inferenceLMStudio
Version 1.0.0 (2.41 KB) by
Ahmad Faisal Ayob (A. F. Ayob)
Calling LM Studio (v0.2.10) from MATLAB using curlCommand
LM-Studio-MATLAB-Call
A demonstration of calling the LM Studio HTTP server, which behaves like OpenAI's API, from MATLAB.
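The underlying HTTP request can be exercised directly before wiring it into MATLAB; a minimal sketch, assuming LM Studio's default server address and port (`localhost:1234`) and its OpenAI-compatible chat-completions route (the model is whichever one is loaded in LM Studio):

```shell
# Send a single user message to the local LM Studio server.
# Requires the "Local Inference Server" to be running in LM Studio.
curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [ { "role": "user", "content": "hello" } ],
        "temperature": 0.7
      }'
```

The response is a JSON object whose reply text sits at `choices[0].message.content`, matching OpenAI's chat-completions schema.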
function outputText = inferenceLMStudio(userString)
%INFERENCELMSTUDIO Calling LM Studio (v0.2.10) from MATLAB using curlCommand
%A demonstration of a MATLAB call to the LM Studio HTTP server, which behaves like OpenAI's API
% Examples/Steps:
% 1: Load a model in LM Studio
% 2: In LM Studio, navigate to the "Local Inference Server" tab
% 3: Start the server
% 4: outputText = inferenceLMStudio('hello')
% Copyright 2024 Ahmad Faisal Ayob, VSG Labs Sdn Bhd
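Only the help header of the function is shown above. A minimal sketch of how such a curl-based call might be implemented, assuming LM Studio's default server address (`http://localhost:1234`) and its OpenAI-compatible `/v1/chat/completions` endpoint; the function name, payload fields, and response parsing here are illustrative, not the author's actual implementation:

```matlab
function outputText = inferenceLMStudioSketch(userString)
%INFERENCELMSTUDIOSKETCH Hypothetical curl-based call to LM Studio's local server
%   Assumes the "Local Inference Server" is running on the default port 1234.
url = 'http://localhost:1234/v1/chat/completions';

% Build the OpenAI-style chat payload; the outer cell keeps "messages" a JSON array.
payload = struct('messages', {{struct('role', 'user', 'content', userString)}}, ...
                 'temperature', 0.7);
body = jsonencode(payload);

% Quote the JSON body with single quotes (works in macOS/Linux shells;
% Windows cmd.exe needs different quoting/escaping of the inner double quotes).
curlCommand = sprintf('curl -s %s -H "Content-Type: application/json" -d ''%s''', ...
                      url, body);
[status, response] = system(curlCommand);
if status ~= 0
    error('inferenceLMStudioSketch:curlFailed', 'curl call failed: %s', response);
end

% Extract the reply text from the OpenAI-compatible response schema.
parsed = jsondecode(response);
outputText = parsed.choices(1).message.content;
end
```

Usage mirrors the steps above: with a model loaded and the server started, `outputText = inferenceLMStudioSketch('hello')` returns the model's reply as a character vector.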
Cite As
Ahmad Faisal Ayob (A. F. Ayob) (2026). inferenceLMStudio (https://www.mathworks.com/matlabcentral/fileexchange/158141-inferencelmstudio), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with
R2023b
Compatible with any release
Platform Compatibility
Windows  macOS  Linux
Tags
| Version | Published | Release Notes |
|---|---|---|
| 1.0.0 | | |
