Support the builder

One engineer.
One infrastructure.
For everyone.

I'm Salka Elmadani. I built Inference-X from scratch in Morocco — a 305 KB binary that runs any AI model on any hardware, with no cloud, no account, no limit. I'm still building. This page explains what, why, and how you can help.

"The best engine is the one you don't notice. You should hear the model, not the framework."
What I'm building

The problem isn't the AI models — they're extraordinary. The problem is the layer between the weights and the human: the inference stack. It's bloated, cloud-dependent, and controlled by a handful of companies.

Standard engine: weights → framework → dequant buffer → matmul → buffer → output
Inference-X: weights → fused dequant+dot → output

305 KB. 2 steps. Zero buffers. Zero noise.

Same model. Cleaner signal. Every unnecessary step removed.

How your support is used

There is no team. No VC timeline. No roadmap driven by investor pressure.

Servers: €53/month keeps all 6 inference-x.com services running.

Development time — the engine, the organ pipeline, the forge tools, the store architecture. All built alone, in the margins of everything else.

Support the infrastructure
Other ways to help
💻 Code: Open a PR. Add a backend. The engine is public.
🌍 Craton Admin: Represent your region. Maintain the community in your language.
📢 Signal: Share, repost, talk about it. Every developer who finds this is a potential builder.
⚖️ Holding (Zug, CH): SALKA HOLDING SA is forming. Swiss corporate-structure expertise welcome.
Contact
𝕏 / Twitter: @ElmadaniSa13111 — fastest response
Email: Elmadani.SALKA@proton.me
Code: git.inference-x.com/inference-x
Site: inference-x.com
"I don't beg. I build. If you see what I see — you already know what to do."

Morocco → the world · Salka Elmadani, 2024–2026