AI Chat 💬 Tutorial
You can interact with Enola’s AI using any of the following commands. They all have an --agents argument to configure AI Agents.
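As a rough sketch of how that flag is passed (the value below is just a placeholder; see the individual command pages for the actual agent specification format):

./enola chat --agents=<agents-spec>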
Web UI Server
For the simplest possible initial demo, just launch:
docker pull ghcr.io/enola-dev/enola:main
docker run --rm --volume "$PWD":/app/CWD/:Z --tty -p7070:7070 ghcr.io/enola-dev/enola:main \
server --chatPort=7070 --lm=echo:/
This gives you a Chat web UI at http://localhost:7070 which, for starters, simply echoes back everything you write.
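To quickly verify that the server is up, you can fetch the chat page from another terminal (just a plain HTTP check, nothing Enola-specific):

curl -i http://localhost:7070/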
Of course, this also works with real AI Language Models; for more details, see server.
Terminal (TUI)
If you prefer using the terminal instead of a Web UI, then just launch:
./enola chat
For more details, see chat.
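If you want to try the same echo model in the terminal, the --lm flag shown above for the server should work here as well (an assumption based on this page's server example; the chat documentation has the details):

./enola chat --lm=echo:/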
AI CLI
See the ai command.