@karebu if you want i can share my docker-compose.yml but it's literally like, just the port and an environment variable and that's it. i have an instance of open webui for it but that's just because it's pretty
@karebu if you're hosting ollama in docker and your amd gpu isn't officially supported (anything below 6800 i think?) then you pass an environment variable based on which GPU you've got (for me it's HSA_OVERRIDE_GFX_VERSION=10.3.0, you can find them here)
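for reference, a compose file like the one described above could look roughly like this — this is a sketch, not the poster's actual file; the `ollama/ollama:rocm` image tag and the `/dev/kfd` + `/dev/dri` device mappings are assumptions based on Ollama's ROCm setup docs, and the override value is the one mentioned for the 6700 XT:

```yaml
services:
  ollama:
    image: ollama/ollama:rocm          # ROCm build of the ollama image
    ports:
      - "11434:11434"                  # default ollama API port
    environment:
      # spoof a supported GPU generation; the value depends on your card
      # (10.3.0 is the RDNA2 override used above for a 6700 XT)
      - HSA_OVERRIDE_GFX_VERSION=10.3.0
    devices:
      # ROCm needs direct access to the kernel driver and DRI devices
      - /dev/kfd
      - /dev/dri
    volumes:
      - ollama:/root/.ollama           # persist downloaded models

volumes:
  ollama:
```

the env var only matters on unsupported cards; on an officially supported gpu you'd drop it and keep the rest.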
@karebu i'm using my 6700XT for it and it's performing like any other 14b model. the difference is that it will display the thought process as regular text so for an actual response you need to wait a bit longer
@fiore oh my god I INSTALLED ICESHRIMP.NET ON 700 PLANES AND LET THEM FEDERATE WITH EACH OTHER AND FLY IN DIFFERENT DIRECTIONS. THE ONE WHO DISCONNECTS LAST TAKES 1 MILLION DOLLARS!!
i want fedi instances working on toasters . i want fedi instances on dehumidifiers . i want fedi instances on . On . On . I dont even know . I just Want them there .
@sneexy i mean im saying that peanut butter jelly time was a forum classic. idk if thats 100% true but i definitely saw it when i used to lurk forums years ago