

You’re probably looking for an abliterated model. Be sure you can run it first, since hosting models locally needs plenty of VRAM. You’ll want plenty of system RAM too in the case of GGUF models.
I’d have to write half a book here to explain how to use them, but that information is freely available online. If you don’t have a beefy GPU, look into how to host GGUF models instead.
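As a very rough sketch of the memory question (my own back-of-envelope numbers, not from any official spec), you can estimate how much VRAM/RAM a quantized GGUF model needs from its parameter count and bits per weight:

```python
def gguf_memory_gb(params_billion: float, bits_per_weight: float = 4.5,
                   overhead: float = 1.2) -> float:
    """Rough memory estimate for a quantized model, in GB.

    bits_per_weight is ~4.5 for Q4_K_M-style quants and ~8 for Q8_0;
    the overhead factor (a guess) covers KV cache and runtime buffers.
    """
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params * bits/8 bytes, in GB
    return weights_gb * overhead

# e.g. a 7B model at ~4.5 bits per weight:
print(round(gguf_memory_gb(7), 1))  # → 4.7
```

So a 7B model at 4-bit quantization lands around 5 GB, which is why GGUF is popular for machines without a beefy GPU: whatever doesn't fit in VRAM spills into regular RAM.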





Because I love everything open source.