# edgible ai
Higher-level helpers for running a local LLM (Ollama) and exposing it as an Edgible application. Convenient for getting a private API endpoint to a model you’re hosting yourself.
## edgible ai setup

End-to-end setup: install Ollama, pull a model, optionally install Open WebUI, and optionally expose either as an Edgible application.
```sh
edgible ai setup --model llama3 --auto-install
edgible ai setup --model llama3 --expose-ollama --setup-webui
```

| Flag | Description |
|---|---|
| `--model <name>` | Model to pull (e.g. `llama3`, `mistral`). |
| `--auto-install` | Install Ollama if missing. |
| `--local-only` | Don’t create Edgible applications. |
| `--expose-ollama` | Publish the Ollama API as an application. |
| `--setup-webui` | Install Open WebUI on the device. |
| `--device-id <id>` | Device to place on (defaults to local). |
| `--ollama-device-id <id>` | Override the placement of just the Ollama workload. |
| `--webui-device-id <id>` | Override the placement of just the WebUI workload. |
| `--gateway-ids <ids>` | Pin to specific gateways. |
| `--webui-deployment <type>` | How WebUI runs (`docker` or `managed-process`). |
| `--non-interactive` | Fail rather than prompt. |
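Once `--expose-ollama` has published the API, clients speak plain Ollama HTTP against the application URL. A minimal sketch of building a request for Ollama's documented `POST /api/generate` endpoint — the host below is a placeholder for whatever URL Edgible assigns, not a real address:

```python
import json

# Hypothetical application URL assigned by Edgible (placeholder only).
OLLAMA_URL = "http://my-app.example.com"

def generate_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a non-streaming generate call.

    The endpoint path and the model/prompt/stream fields come from
    Ollama's public REST API; nothing here is Edgible-specific.
    """
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return f"{OLLAMA_URL}/api/generate", body.encode()

url, body = generate_request("llama3", "Say hello in one word.")
print(url)  # http://my-app.example.com/api/generate
```

Any HTTP client can then `POST` that body to the URL; with `"stream": False` Ollama returns a single JSON object whose `response` field holds the completion.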
## edgible ai stop

Stop Ollama and any UI you started with `setup`.
```sh
edgible ai stop
```

## edgible ai serve
Start Open WebUI locally, pointing at a (possibly different) Ollama URL.
```sh
edgible ai serve --port 3200 --ollama-url http://localhost:11434
```

| Flag | Description |
|---|---|
| `--port <n>` | Port WebUI listens on. Default: `3200`. |
| `--ollama-url <url>` | Where to reach Ollama. |
| `-d, --detached` | Run in the background. Default: `true`. |
## edgible ai status

Show whether Ollama and WebUI are running.

```sh
edgible ai status
```

## edgible ai test
Run a one-shot prompt against Ollama and print the response.
```sh
edgible ai test --model llama3
```
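A one-shot prompt like this presumably maps to Ollama's `POST /api/generate`. When streaming is enabled, Ollama emits one JSON object per line, and the full answer is the concatenation of the `response` fields up to the object with `"done": true`. A minimal sketch of that reassembly, using made-up sample lines:

```python
import json

def join_stream(lines: list[str]) -> str:
    """Concatenate the `response` chunks of an Ollama streaming reply.

    The per-line JSON shape ({"response": ..., "done": ...}) is Ollama's
    documented streaming format; the sample data below is illustrative.
    """
    out = []
    for line in lines:
        chunk = json.loads(line)
        out.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(out)

sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": true}',
]
print(join_stream(sample))  # Hello!
```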
## edgible ai teardown

Stop services and remove Edgible applications.
```sh
edgible ai teardown --stop-ollama --remove-volumes
```

| Flag | Description |
|---|---|
| `--stop-ollama` | Also stop Ollama on the device. |
| `--remove-volumes` | Also delete model storage volumes. |