useModelList

Hook to discover available models on a local LLM runtime. Fetches the model list on mount and provides a refresh function.

Signature

```ts
function useModelList(options?: ModelListOptions): ModelListResult;
```

Parameters

options

Type: ModelListOptions · Optional

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| `endpoint` | `string` | `"http://localhost:11434"` | LLM server URL |
| `backend` | `Backend` | Auto-detected | Backend type |

Return value

Returns a ModelListResult object:

| Property | Type | Description |
| --- | --- | --- |
| `models` | `LocalModel[]` | Array of available models |
| `isLoading` | `boolean` | Whether the model list is loading |
| `error` | `Error \| null` | Any error that occurred |
| `refresh` | `() => void` | Re-fetch the model list |

LocalModel

```ts
interface LocalModel {
  name: string;        // Model name (e.g. "gemma3:1b")
  size?: number;       // Size in bytes
  modifiedAt?: string; // Last modified timestamp
  digest?: string;     // Model digest hash
}
```

Behavior

  1. Fetches models automatically on mount
  2. Handles both Ollama ({ models: [...] }) and OpenAI-compatible ({ data: [...] }) response formats
  3. Call refresh() to re-fetch manually
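The dual-format handling in step 2 can be sketched as a small normalization function. This is a hypothetical illustration, not the hook's actual source: the helper name and the exact fields read from each response are assumptions, based on the documented Ollama (`{ models: [...] }`) and OpenAI-compatible (`{ data: [...] }`) shapes.

```typescript
interface LocalModel {
  name: string;
  size?: number;
  modifiedAt?: string;
  digest?: string;
}

// Hypothetical sketch: map either response shape to LocalModel[].
function normalizeModelResponse(body: any): LocalModel[] {
  if (Array.isArray(body?.models)) {
    // Ollama /api/tags shape: rich per-model metadata
    return body.models.map((m: any) => ({
      name: m.name,
      size: m.size,
      modifiedAt: m.modified_at,
      digest: m.digest,
    }));
  }
  if (Array.isArray(body?.data)) {
    // OpenAI-compatible /v1/models shape: only an id is guaranteed
    return body.data.map((m: any) => ({ name: m.id }));
  }
  return []; // unknown shape: surface no models rather than throw
}
```

Either way, callers see a single `LocalModel[]`, so components like the selector below don't need to know which backend answered.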

Examples

Model selector

```tsx
import { useModelList } from "use-local-llm";

function ModelSelector({ onSelect }: { onSelect: (model: string) => void }) {
  const { models, isLoading, error, refresh } = useModelList();

  if (isLoading) return <p>Loading models...</p>;
  if (error) return <p>Error: {error.message}</p>;

  return (
    <div>
      <select onChange={(e) => onSelect(e.target.value)}>
        {models.map((m) => (
          <option key={m.name} value={m.name}>
            {m.name} {m.size ? `(${(m.size / 1e9).toFixed(1)} GB)` : ""}
          </option>
        ))}
      </select>
      <button onClick={refresh}>Refresh</button>
    </div>
  );
}
```

LM Studio models

```tsx
const { models } = useModelList({
  endpoint: "http://localhost:1234",
  backend: "lmstudio",
});
```

Model dashboard

```tsx
import { useModelList } from "use-local-llm";

function ModelDashboard() {
  const { models, isLoading, refresh } = useModelList();

  return (
    <div>
      <h2>
        Available Models ({models.length})
        <button onClick={refresh} disabled={isLoading}>
          {isLoading ? "Loading..." : "↻ Refresh"}
        </button>
      </h2>

      <table>
        <thead>
          <tr>
            <th>Name</th>
            <th>Size</th>
            <th>Modified</th>
          </tr>
        </thead>
        <tbody>
          {models.map((m) => (
            <tr key={m.name}>
              <td><code>{m.name}</code></td>
              <td>{m.size ? `${(m.size / 1e9).toFixed(1)} GB` : "—"}</td>
              <td>{m.modifiedAt ? new Date(m.modifiedAt).toLocaleDateString() : "—"}</td>
            </tr>
          ))}
        </tbody>
      </table>
    </div>
  );
}
```

API paths by backend

| Backend | Endpoint |
| --- | --- |
| Ollama | `GET /api/tags` |
| LM Studio | `GET /v1/models` |
| llama.cpp | `GET /v1/models` |
| OpenAI-compatible | `GET /v1/models` |
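The table above reduces to one branch: Ollama uses its native `/api/tags` path, while every other backend lists models at the OpenAI-style `/v1/models`. A minimal sketch of that URL selection (the helper name and `Backend` union are assumptions for illustration):

```typescript
// Hypothetical Backend union matching the table rows above.
type Backend = "ollama" | "lmstudio" | "llamacpp" | "openai-compatible";

// Build the model-list URL: Ollama is the only backend with a custom path.
function modelListUrl(endpoint: string, backend: Backend): string {
  const path = backend === "ollama" ? "/api/tags" : "/v1/models";
  return endpoint.replace(/\/+$/, "") + path; // tolerate a trailing slash
}
```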

Source

src/hooks/useModelList.ts