When Cursor is offline: the local-only features that still work
Cursor's AI features — Tab, Chat, Composer — all require a network connection. Here's what keeps working offline, and how to set up a local model fallback before you need one.
Running local models in Cline with Ollama: when it's worth the trouble
Cline supports Ollama as a model provider, which means you can run a fully offline coding agent on your laptop. The setup is straightforward. The quality gap is large. Here's how to think about both.
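Before pointing Cline at a local provider, it helps to confirm the Ollama server is actually up. The sketch below assumes Ollama's default endpoint (`http://localhost:11434`) and its `/api/tags` model-listing route; the model name in the hint is just an example, not a recommendation.

```python
# Preflight check: is a local Ollama server reachable before configuring Cline?
# Assumes Ollama's default port (11434); adjust OLLAMA_URL if yours differs.
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def ollama_status(base_url: str = OLLAMA_URL) -> str:
    """Return a short status string for the local Ollama server."""
    try:
        # /api/tags lists the models pulled onto this machine.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            models = [m["name"] for m in json.load(resp).get("models", [])]
        if models:
            return "Ollama is running with models: " + ", ".join(models)
        return "Ollama is running but has no models; try `ollama pull qwen2.5-coder`"
    except (urllib.error.URLError, OSError):
        return "Ollama is not reachable; start it with `ollama serve`"

print(ollama_status())
```

Running this before a flight or an outage tells you whether the fallback is ready, rather than discovering a missing model when you're already offline.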