I published a video showing an experimental model that interacts with basic emotions and is accessible via an API.
What appears in the video:
- An HTTP server (using HTTP.jl) handling requests.
- The model responding to emotions (anger, joy, sadness) in an experimental manner.
- The `serve_lstm.jl` code shown in the video (with dummy/hidden paths for security purposes).
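To illustrate the serving setup only (not the sealed model), here is a minimal HTTP.jl sketch. The route, port, and handler are assumptions for demonstration; the actual `serve_lstm.jl` code is not public, so a placeholder response stands in for the model call:

```julia
using HTTP

# Emotions mentioned in the video; purely illustrative.
const EMOTIONS = ("anger", "joy", "sadness")

# Hypothetical handler: expects the emotion name as the request body.
function handle(req::HTTP.Request)
    emotion = strip(String(req.body))
    if emotion in EMOTIONS
        # A real server would forward `emotion` to the (sealed) model here.
        return HTTP.Response(200, "received: $emotion")
    end
    return HTTP.Response(400, "unknown emotion")
end

# Start a blocking server (commented out in this sketch):
# HTTP.serve(handle, "127.0.0.1", 8080)
```

This only shows the request-handling shape HTTP.jl provides; none of it reflects the model's internals.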
Current improvements:
- Loss: 5.18 → 4.9998
- Info%: 24.7% → 28.3%
The core model code is completely sealed: it cannot be discussed, requested, or disclosed. This video is the maximum that can be shown at this stage.
The original post with the video is on LinkedIn.

