Auto-Starting Software
By default, the device runs `/home/distiller/distiller-cm5-python/spin_up.sh` at boot. This script launches the main app you see on-screen. If you want to disable it, or add or remove processes, edit this file.
Work Environment
We use uv for Python dependency management (think of it as a faster `pip install`). All dependencies are listed in `requirements.txt`.
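If you ever need to recreate the environment by hand, a generic uv invocation looks like the sketch below. The Distiller image may already ship with a prepared environment, so treat this as a general illustration of uv usage rather than the project's official setup command:

```bash
# Install everything listed in requirements.txt into the active virtual environment.
# (Generic uv usage; your device may already have this done for you.)
uv pip install -r requirements.txt
```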
Pay attention to two repositories:
- distiller-cm5-python – contains the agent module (MCP server, MCP client, LLM backend), UI module, and configuration files.
- distiller-cm5-sdk – provides SDKs for Whisper transcription (mic), Piper TTS (speaker), and RGB LED control.
Recommended Way to Start Coding on Distiller
As mentioned in the Quickstart guide, we recommend using Cursor (a fork of VS Code) to set up your coding environment. Once you've connected to your Distiller device via SSH using the Cursor server, you're ready to go; see the Tutorial Video for a walkthrough.
When Vibe Coding
To start, let Cursor access your working folder:

/home/distiller/distiller-cm5-python/distiller_cm5_python/mcp_server
Create your Python file with a name ending in `_server.py`. Our main UI will automatically detect and load any MCP server script. (Learn more about MCP here.)
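As an illustration, a minimal server could look like the sketch below. It uses FastMCP from the standard MCP Python SDK; the file name `weather_server.py`, the example tool, and the choice of FastMCP are assumptions made for this sketch, so check the existing `*_server.py` files in the folder for the exact conventions this project expects.

```python
# weather_server.py - hypothetical example; name it <something>_server.py so the UI picks it up.
from mcp.server.fastmcp import FastMCP

# Create an MCP server instance with a human-readable name.
mcp = FastMCP("weather-demo")

@mcp.tool()
def get_temperature(city: str) -> str:
    """Return a canned temperature reading for the given city."""
    # A real tool would query a sensor or a web API here.
    return f"It is currently 21 °C in {city}."

if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch and talk to this script.
    mcp.run()
```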
We also recommend using `/home/distiller/distiller-cm5-python/llms.txt` to help brainstorm ideas and kickstart your scripting.
Configurations You Need to Know
- Set your default MCP server for CLI mode
  Running `python main.py` starts the app in CLI mode for quick testing. It loads the default task configuration from `/home/distiller/distiller-cm5-python/distiller_cm5_python/utils/default_config.json`. Make sure to set the `server_script_path` to the MCP server you want to test with; see the config sketch after this list for an example.
- Set the Model You Want to Use
  The device includes a Qwen2.5 3B model located in `/home/distiller/distiller-cm5-python/distiller_cm5_python/llm_server/models`. If you'd like to try a different model (in GGUF format), simply download it to the same folder and update the model name in the config under `llm_providers -> model_name`; see the sketch after this list for an example.
- What If I Want to Use an Online Model?
  We've prepared a way to connect with OpenRouter, where you can choose from a variety of models to try. Make sure to pick a model that supports tool calling and streaming. (Learn more here.) To switch over (see the second sketch after this list):
  - Set the `model_name` and `api_key` you receive from OpenRouter
  - Switch the `active_llm_provider` from `llama-cpp` to `openrouter`
- Adjust Logging Level for Better Debugging
  To enable more detailed logs, simply change `logging -> level` from `INFO` to `DEBUG`. Note: when `DEBUG` level is enabled:
  - A `debug_message_traffic_<timestamp>.json` file will be created in the current folder, containing the full conversation history of that session.
  - An `event_logs/event_log_<timestamp>.jsonl` file will also be generated, capturing all events sent to the UI.
- Update System Prompt
  The `default_system_prompt` applies to all conversation sessions. Adjust it as needed to guide the model's behavior.
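Putting the settings above together, an edited `default_config.json` might look roughly like the sketch below. The exact nesting of keys can differ between releases, and every value shown (script path, model file, prompt) is a placeholder, so compare against the file already on your device. The `//` comments are only explanatory and are not valid in strict JSON.

```jsonc
{
  // MCP server that CLI mode launches by default (placeholder file name).
  "server_script_path": "/home/distiller/distiller-cm5-python/distiller_cm5_python/mcp_server/weather_server.py",

  // Use the local llama.cpp backend with a GGUF model from llm_server/models.
  "active_llm_provider": "llama-cpp",
  "llm_providers": {
    "model_name": "qwen2.5-3b-instruct-q4_k_m.gguf"  // placeholder GGUF file name
  },

  // Change to "DEBUG" for the verbose traffic/event logs described above.
  "logging": { "level": "INFO" },

  // Applied to every conversation session.
  "default_system_prompt": "You are a helpful assistant running on the Distiller."
}
```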
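If you switch to OpenRouter instead, the relevant keys might change as sketched below. Where exactly `model_name` and `api_key` live under `llm_providers` is an assumption of this example, and the model ID and key are placeholders.

```jsonc
{
  // Route requests to OpenRouter instead of the on-device llama.cpp backend.
  "active_llm_provider": "openrouter",
  "llm_providers": {
    // Pick a hosted model that supports tool calling and streaming (placeholder ID).
    "model_name": "openai/gpt-4o-mini",
    "api_key": "sk-or-REPLACE_ME"
  }
}
```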
From the `/home/distiller/distiller-cm5-python` folder, the fastest way to validate your changes is by running:
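The original command is not reproduced here, but based on the CLI-mode description above it is presumably the plain entry-point launch; treat the following as an assumed reconstruction:

```bash
cd /home/distiller/distiller-cm5-python
# Start the app in CLI mode for quick testing (uses default_config.json).
python main.py
```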


