Open Source / MIT License

Your AI runs here.
Not in the cloud.

Chat, generate images, and create videos with one local app. No subscriptions, no data collection, no restrictions.

Get Started View Source
Locally Uncensored — AI Chat Interface
$ git clone https://github.com/PurpleDoubleD/locally-uncensored.git && cd locally-uncensored && setup.bat
Windows — installs Node.js and Ollama, downloads a model, and launches the app.
Linux / macOS instructions
Capabilities
Three tools. One interface.
Stop switching between Ollama, ComfyUI, and AnimateDiff in separate tabs.

AI Chat

Powered by Ollama. Run any model locally — uncensored, private, with streaming responses and thinking display.
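Streaming works the way Ollama's REST API does: `POST /api/chat` on `localhost:11434` returns newline-delimited JSON, one chunk per line, each carrying a fragment of the reply. A minimal sketch of assembling those chunks into the full response (the sample chunks below are illustrative, not captured output):

```python
import json

def assemble_stream(ndjson_lines):
    """Concatenate content fragments from an Ollama /api/chat
    streaming response (one JSON object per line)."""
    reply = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        if chunk.get("done"):
            break
        reply.append(chunk["message"]["content"])
    return "".join(reply)

# Chunks shaped like Ollama's streaming output (illustrative values)
chunks = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"done": true}',
]
print(assemble_stream(chunks))  # -> Hello!
```

In practice the lines would come from an HTTP response body read incrementally, which is what makes token-by-token display possible.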

Image Generation

Text-to-image via ComfyUI. SDXL, FLUX, Pony checkpoints. Full parameter control, real-time preview.
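"Full parameter control" means the usual diffusion knobs: prompt, seed, steps, CFG scale, sampler, resolution. A hypothetical sketch of how such parameters might be collected and defaulted before being mapped into a ComfyUI workflow (field names and defaults are illustrative, not the app's actual schema):

```python
from dataclasses import dataclass, asdict

@dataclass
class ImageParams:
    # Illustrative defaults; the app's real schema may differ.
    prompt: str
    negative_prompt: str = ""
    seed: int = -1          # -1 = randomize per generation
    steps: int = 25
    cfg_scale: float = 7.0
    sampler: str = "euler"
    width: int = 1024
    height: int = 1024

params = ImageParams(prompt="a lighthouse at dusk", steps=30)
payload = asdict(params)  # would be wired into a ComfyUI workflow graph
print(payload["steps"], payload["sampler"])
```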

Video Generation

Wan 2.1/2.2 and AnimateDiff support. Generate video clips on your own GPU, no cloud API needed.

25+ Personas

Pre-built characters from Helpful Assistant to Roast Master. Switch personalities without prompt engineering.
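Under the hood, persona switching amounts to swapping the system prompt ahead of the conversation. A minimal sketch (the persona texts here are invented examples, not the app's shipped prompts):

```python
# Hypothetical persona table; the shipped app bundles 25+ of these.
PERSONAS = {
    "Helpful Assistant": "You are a concise, helpful assistant.",
    "Roast Master": "You answer everything with playful, biting sarcasm.",
}

def build_messages(persona: str, user_text: str) -> list:
    """Prefix the conversation with the chosen persona's system prompt."""
    return [
        {"role": "system", "content": PERSONAS[persona]},
        {"role": "user", "content": user_text},
    ]

msgs = build_messages("Roast Master", "Review my code.")
print(msgs[0]["role"], len(msgs))  # system 2
```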

Model Manager

Browse, install, and switch models from the app. Auto-detects text, image, and video models across all backends.
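One plausible way to auto-detect model types is name-based heuristics; this sketch is an assumption about the approach, not the app's actual detection code:

```python
# Illustrative heuristics only; real detection would likely also inspect
# backend metadata (Ollama tags, ComfyUI checkpoint folders, etc.).
TEXT_HINTS = ("llama", "mistral", "qwen", "gemma")
IMAGE_HINTS = ("sdxl", "flux", "pony")
VIDEO_HINTS = ("wan", "animatediff")

def detect_model_type(name: str) -> str:
    """Classify a model by substrings in its (lowercased) name."""
    n = name.lower()
    if any(h in n for h in VIDEO_HINTS):
        return "video"
    if any(h in n for h in IMAGE_HINTS):
        return "image"
    if any(h in n for h in TEXT_HINTS):
        return "text"
    return "unknown"

print(detect_model_type("sdxl_base_1.0.safetensors"))  # image
```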

100% Private

Everything runs on localhost. No telemetry, no cloud, no accounts. Your data never leaves your machine.

Image and Video Generation
Image and video generation with full parameter control
Model Manager
One-click model installation with hardware recommendations
Comparison
What others don't do.
Every competitor handles chat. None of them combine all three.
| Feature | Locally Uncensored | Open WebUI | LM Studio | SillyTavern |
|---|---|---|---|---|
| AI Chat | Yes | Yes | Yes | Yes |
| Image Generation | Yes | No | No | No |
| Video Generation | Yes | No | No | No |
| Uncensored by Default | Yes | No | No | Partial |
| One-Click Setup | Yes | Docker | Yes | Node.js |
| Built-in Personas | 25+ | No | No | Manual |
| Open Source | MIT | MIT | No | AGPL |
| No Docker Required | Yes | No | Yes | Yes |

Run your own AI stack.

Free, open source, and yours to keep. No sign-up required.

View on GitHub Join the Discussion