What if I have more than 5,000 SKUs?
Basic caps at 5,000 SKUs. Beyond that you'll want Managed (coming soon), where build pipelines do incremental rebuilds and we configure the system to your scale. Or self-host with Free if you can run the build infrastructure yourself.
Do you take a cut of sales?
No, never. We don't touch the money. Your processor (Stripe, Paddle, etc.) handles payments and pays you directly. We charge a flat platform fee and that's it.
What about AI costs?
Build-time AI runs through your own provider key (OpenRouter recommended). Basic includes a small managed quota for convenience. AI output is cached on source content hash, so unchanged products don't regenerate, and a daily rebuild typically does almost no AI work.
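The cache-on-content-hash idea is simple enough to sketch. This is a minimal illustration, not the platform's actual implementation: a product's source fields are hashed, and the hash keys an on-disk cache, so an unchanged product never triggers a model call. All names here (cache path, function names) are hypothetical.

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path(".ai-cache")  # hypothetical on-disk cache location

def content_hash(product: dict) -> str:
    # Canonical JSON (sorted keys) so the same content always hashes the same.
    canonical = json.dumps(product, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def generate_description(product: dict, call_model) -> str:
    # Return cached AI output if the source content is unchanged,
    # otherwise call the model once and store the result.
    CACHE_DIR.mkdir(exist_ok=True)
    entry = CACHE_DIR / f"{content_hash(product)}.txt"
    if entry.exists():
        return entry.read_text()   # cache hit: no AI call
    text = call_model(product)     # cache miss: one AI call
    entry.write_text(text)
    return text
```

Because the key is derived from the content itself, a daily rebuild over a mostly unchanged catalog hits the cache for nearly every product, which is why it typically does almost no AI work.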
Can I leave?
Yes. Your catalog lives in your data store. Your customers and money live at your processor. Your code lives in your repo. We hold pseudonymous order records you can export at any time. Nothing locks you in.
High-risk vertical (cannabis, adult, kratom)?
We don't care. We're not the processor. Bring whatever specialty processor will work with your vertical, and we provide the catalog and orders layer. Mainstream platforms tend to refuse high-risk merchants entirely; we're agnostic.
Can I run the AI provider locally?
Yes. Ollama and LM Studio work well for build-time AI when your CI runs on your own hardware. Build-time is exactly the use case local LLMs are good at: long-running, latency-tolerant, off-the-clock. Cost is zero.
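In practice, pointing a build at a local model is often just a base-URL swap, since Ollama exposes an OpenAI-compatible endpoint on localhost. A hypothetical CI configuration (variable names depend on your build pipeline) might look like:

```shell
# Hypothetical env vars -- names depend on your build pipeline.
export AI_BASE_URL="http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint
export AI_MODEL="llama3.1"                      # any model you've pulled locally
export AI_API_KEY="ollama"                      # placeholder; the local endpoint ignores it
```

The same shape works for LM Studio, which serves its own OpenAI-compatible endpoint on a local port.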