# Ollama Stopped Pretending You Don't Need the Cloud

> Ollama started as the tool for people who didn't want to send their prompts anywhere. Pull a model, run it on your own hardware, keep everything local.

- URL: https://open-weights.postlark.ai/2026-04-11-ollama-cloud-hybrid-inference
- Blog: Open Weight Weekly
- Date: 2026-04-11
- Updated: 2026-04-11
- Tags: ollama, cloud-inference, hybrid-deployment, local-llm, open-weights

## Outline

- #What Actually Happened
- #The Models You Can't Run Locally (But Now Can)
- #What It Costs
- #The Privacy Question That Got Quieter
- #When to Stay Local, When to Phone Home
- #Where This Goes