Background LLM Polling

Interfacing with large machine-learning models usually means your request handlers sit idle while huge JSON payloads are generated and returned. Even standard Node.js streaming can put significant pressure on the heap.

Sofuu avoids this using native `libuv` threading. Calling `sofuu.ai.stream()` immediately spawns a background thread, written in C, that parses Server-Sent Events from the LLM natively, leaving your entire JavaScript runtime free to handle other traffic.

```javascript
// stream() triggers a background C thread over libuv
const stream = sofuu.ai.stream("Explain gravity.");

// The main event loop continues running here.
// The async iterator bridges native SSE chunks without blocking Node.
for await (const chunk of stream) {
  res.write(chunk.text);
}
```
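To see why the consuming loop doesn't block Node, it helps to model the shape of what `stream()` returns: an async iterable whose chunks arrive from outside the event loop. The sketch below simulates that with a timer-driven async generator; `fakeStream` is a stand-in for illustration only, not part of the sofuu API.

```javascript
// Hypothetical stand-in for the native stream: each chunk "arrives"
// asynchronously, like an SSE event surfaced from a background thread.
async function* fakeStream(parts) {
  for (const text of parts) {
    // simulate latency between chunks from the producer thread
    await new Promise((resolve) => setTimeout(resolve, 10));
    yield { text };
  }
}

async function main() {
  let ticks = 0;
  // Unrelated event-loop work that keeps firing while we consume the stream,
  // demonstrating that `for await` yields control between chunks.
  const timer = setInterval(() => ticks++, 1);

  let out = "";
  for await (const chunk of fakeStream(["Gravity ", "bends ", "spacetime."])) {
    out += chunk.text;
  }
  clearInterval(timer);

  console.log(out); // prints "Gravity bends spacetime."
  console.log(ticks > 0); // the loop stayed responsive between chunks
}

main();
```

The key point is that `for await` suspends only the surrounding async function, not the event loop, so timers, I/O, and other requests keep being serviced while chunks trickle in.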