# Chat 💬 (Shell)
Enola’s Chat is The Shell for the 2030s+ and the “last Chat you’ll ever need”.
After installing, you can use it like this:
## Screencast
## Usage
```shell
$ HOSTNAME=demo ./enola chat <docs/use/chat/demo.input
Welcome here! Type /help if you're lost.
runner@demo in #Lobby> /help
System> Enola.dev vf68d7b4 -- Commands:
/whoami - Show your user details.
/commands - Available commands.
/help - This help.
Say "ping" to get a "pong" back.
@echo ... echoes your message back to you.
runner@demo in #Lobby> @echo hi
Echoer> hi
runner@demo in #Lobby> hello, world
runner@demo in #Lobby> /whoami
System> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
<subject://demo/runner> rdfs:label "runner@demo" .
runner@demo in #Lobby> whoami
demo> runner
runner@demo in #Lobby> pwd
demo> /home/runner/work/enola/enola
runner@demo in #Lobby>
```
## Exec
Note how the last two “messages” in the example chat above were `whoami` and `pwd`. They were both executed as commands on the local system. (Whereas `/whoami`, with the leading slash, was not a system executable but a built-in command.)
Some words are both common UNIX command names and natural-language words with which you might well start a prompt to the LLM. To avoid confusion, a hard-coded list excludes such words from being executed as commands (e.g. `who`, `time`, or `uptime`).
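For illustration, such an exclusion check could be sketched in shell like this. This is only a sketch; the actual list and logic inside Enola may well differ:

```shell
# Illustrative sketch only: an exclusion list for auto-execution.
# Words on the list are treated as chat messages, everything else as commands.
is_excluded() {
  case "$1" in
    who|time|uptime) return 0 ;;  # ambiguous words: never auto-executed
    *) return 1 ;;
  esac
}

is_excluded who   && echo "who: chat message"
is_excluded uname || echo "uname: executed as command"
```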
Prefixing input with `$` followed by a space (to avoid potential future confusion with environment variables) forces it to be interpreted as a command to execute (e.g. `$ uptime`).
Commands are currently executed using `/usr/bin/env bash -c ...`, but this may change in the future.
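For illustration, this is roughly equivalent to the following, assuming `INPUT` holds the line you typed (a sketch of the mechanism, not Enola's actual code):

```shell
# Hedged sketch: hand a captured input line to bash, as described above.
INPUT='echo hello from chat'
/usr/bin/env bash -c "$INPUT"
# → hello from chat
```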
## AI
If you have Ollama up and running locally on its default port `11434`, then this Chat will have an `LLM>` participant using `gemma3:1b`, which will chime into the conversation, like this:
```shell
$ ./enola chat
Welcome here! Type /help if you're lost.
vorburger@yara in #Lobby> hi
LLM> Hi there! How’s your day going so far? 😊
Is there anything you'd like to chat about, or anything I can help you with today?
vorburger@yara in #Lobby> how ya feel'
LLM> As an AI, I don't have feelings in the same way humans do. However, I can say that I’m functioning well and ready to assist you! 😊
It’s a good day for me to be here. How about you? How are *you* feeling today?
```
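Before starting the chat, you can check whether Ollama is reachable on that port. This probe is a generic sketch, not an Enola command; it assumes `curl` is installed and uses Ollama's `/api/version` endpoint:

```shell
# Hedged sketch: probe the default Ollama port before starting the chat.
if curl -fsS http://localhost:11434/api/version >/dev/null 2>&1; then
  echo "Ollama reachable; LLM> will join the chat"
else
  echo "Ollama not reachable; chat still works, but without the LLM> participant"
fi
```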
This will be extended to support other 🔮 LLMs and 🪛 Tools and 🕵🏾‍♀️ Agents in the future; watch this space.
## SSH
This Chat feature is also available via an SSH server!

Exec is only enabled in (local) `enola chat`, but disabled over SSH.

`/whoami` includes the public key of the user connected over SSH.
## Configuration
Your `~/.inputrc` and `/etc/inputrc` are loaded on startup. For example, to bind `Ctrl-Backspace`, do `echo '"\C-h": backward-kill-word' >> ~/.inputrc`.
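To try this out without touching your real configuration, you can write the binding to a throwaway file first and inspect it (a side-effect-free variant of the command above; `printf` is used instead of `echo` because some shells' `echo` interprets backslash escapes):

```shell
# Hedged example: write the readline binding to a temporary file
# instead of ~/.inputrc, so your real configuration is untouched.
tmp_inputrc="$(mktemp)"
printf '%s\n' '"\C-h": backward-kill-word' >> "$tmp_inputrc"
cat "$tmp_inputrc"
# → "\C-h": backward-kill-word
rm -f "$tmp_inputrc"
```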
The line editor has mouse support; e.g. you can click on the prompt line to move the cursor.

History is persisted, e.g. in `~/.local/share/enola/history`.
See the Help doc for all other options.