One engineer.
One infrastructure.
For everyone.
I'm Salka Elmadani. I built Inference-X from scratch in Morocco — a 305 KB binary that runs any AI model on any hardware, with no cloud, no account, no limit. I'm still building. This page explains what, why, and how you can help.
"The best engine is the one you don't notice. You should hear the model, not the framework."
The problem isn't the AI models — they're extraordinary. The problem is the layer between the weights and the human: the inference stack. It's bloated, cloud-dependent, and controlled by a handful of companies.
Same model. Cleaner signal. Every unnecessary step removed.
There is no team. No VC timeline. No roadmap driven by investor pressure.
Servers: €53/month keeps all 6 inference-x.com services running.
Development time — the engine, the organ pipeline, the forge tools, the store architecture — all built alone, in the margins of everything else.
Direct donation via PayPal
Even €5 covers a day of server time. No account, no subscription.
| Channel | Details |
| --- | --- |
| 𝕏 / Twitter | @ElmadaniSa13111 — fastest response |
| Email | Elmadani.SALKA@proton.me |
| Code | git.inference-x.com/inference-x |
| Site | inference-x.com |
"I don't beg. I build. If you see what I see — you already know what to do."
Morocco → the world · Salka Elmadani, 2024–2026