
# phi4minecraftbot

A client-side Minecraft mod that uses a local Ollama `phi4` model to answer chat: on a chat message starting with `!phi4 [question]`, it posts `playerName: [answer]`.
A Fabric client mod for Minecraft 1.21.1.
## What the mod does
- Listens to all client-side chat; any message containing `!phi4` triggers a request to your local Ollama `phi4` model.
- Immediately posts `[phi4] phi4 is thinking...` to server chat, then streams the model reply into chat.
- The first streamed chunk is prefixed with the triggering player's name (e.g., `Player: text`); later chunks send just the text.
- Responses are buffered until a newline or sentence-ending punctuation, so chat updates arrive in readable pieces.
- Runs only on the client; the original `!phi4` messages remain visible.
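The sentence buffering described above can be sketched as follows. This is an illustrative example, not the mod's actual code: the class and method names are hypothetical, and only the behavior stated in this README (emit on newline or sentence-ending punctuation, trim whitespace, drop newline separators) is assumed.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sentence buffer: streamed fragments accumulate until a
// newline or sentence-ending punctuation, then the trimmed chunk is
// emitted as one readable chat line.
public class SentenceBuffer {
    private final StringBuilder pending = new StringBuilder();

    // Feed one streamed fragment; returns any chunks ready to post to chat.
    public List<String> feed(String fragment) {
        List<String> ready = new ArrayList<>();
        for (char c : fragment.toCharArray()) {
            pending.append(c);
            if (c == '\n' || c == '.' || c == '!' || c == '?') {
                String chunk = pending.toString().trim();
                if (!chunk.isEmpty()) {
                    ready.add(chunk); // trim() discards newline separators between chunks
                }
                pending.setLength(0);
            }
        }
        return ready;
    }
}
```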
## How to use it
- Keep Ollama running locally on `http://localhost:11434` with the `phi4` model available.
- Say `!phi4` followed by any text (even empty) in chat; the mod will forward the prompt to Ollama and stream back the reply for everyone to see.
- Only one request per player runs at a time; extra triggers from the same player show a private notice: `[phi4] Request already in progress; please wait.`
- If the Ollama call fails or times out (60s), you see a private system chat message: `[phi4] An error has occured!`
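The one-request-per-player rule could be implemented with a concurrent set of in-flight player IDs; the sketch below is a hypothetical illustration (the `RequestGate` name and sentinel UUID are assumptions, not the mod's actual code). A null UUID stands in for a message with no identifiable sender, which collapses onto one shared slot.

```java
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical gate enforcing one in-flight request per player.
public class RequestGate {
    // Shared sentinel slot for messages without an identifiable sender.
    private static final UUID UNKNOWN_SENDER = new UUID(0L, 0L);
    private final ConcurrentHashMap.KeySetView<UUID, Boolean> inFlight =
            ConcurrentHashMap.newKeySet();

    // Returns true if the request may start; false if one is already running.
    public boolean tryAcquire(UUID player) {
        return inFlight.add(player == null ? UNKNOWN_SENDER : player);
    }

    // Call when the request completes or fails, so the player can ask again.
    public void release(UUID player) {
        inFlight.remove(player == null ? UNKNOWN_SENDER : player);
    }
}
```

Mapping all unidentifiable senders to one sentinel is what produces the shared-queue-slot limitation noted at the end of this README.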
## Notable implementation details
- Uses the streaming `/api/generate` endpoint with body `{ "model": "phi4", "prompt": <question>, "stream": true }`.
- Sentence buffering trims leading/trailing whitespace around each emitted chunk and discards newline separators between chunks.
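A streaming call to `/api/generate` could look like the sketch below, using the JDK's built-in `HttpClient` (each line of the response body is one JSON object). This is an assumption-laden illustration, not the mod's actual code: the `OllamaStream` class is hypothetical, and the `"response"` field is pulled out with a crude substring scan so the example needs no JSON library.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.function.Consumer;
import java.util.stream.Stream;

// Hypothetical streaming client for Ollama's /api/generate endpoint.
public class OllamaStream {
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static void generate(String prompt, Consumer<String> onToken) throws Exception {
        String body = "{\"model\":\"phi4\",\"prompt\":" + quote(prompt) + ",\"stream\":true}";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .timeout(Duration.ofSeconds(60)) // matches the 60s timeout mentioned above
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        // ofLines() delivers each NDJSON line of the streamed response as it arrives.
        HttpResponse<Stream<String>> response =
                CLIENT.send(request, HttpResponse.BodyHandlers.ofLines());
        response.body().forEach(line -> {
            String token = extractResponse(line);
            if (token != null) onToken.accept(token);
        });
    }

    // Minimal JSON string escaping for the prompt.
    static String quote(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    // Naive extraction of the "response" field; real code would use a JSON library.
    static String extractResponse(String jsonLine) {
        int i = jsonLine.indexOf("\"response\":\"");
        if (i < 0) return null;
        int start = i + "\"response\":\"".length();
        int end = jsonLine.indexOf('"', start);
        while (end > 0 && jsonLine.charAt(end - 1) == '\\') {
            end = jsonLine.indexOf('"', end + 1); // skip escaped quotes
        }
        if (end < 0) return null;
        return jsonLine.substring(start, end).replace("\\n", "\n").replace("\\\"", "\"");
    }
}
```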
## Limitations and future ideas
- Requires a reachable local Ollama service; no fallback to remote endpoints.
- Placeholder lines are not removed when the response finishes.
- Requests from players without an identifiable sender share a single queue slot, so only one of those can run concurrently.