The meta list of various /aicg/ guides on running local

back to the comfy times when we had pyg in our OP

(Updated 2024-10-02)



Guides

https://rentry.org/freellamas

  • the one that started it all anew; even though this great anon doesn't host a proxy, you can still get the settings and setup info here

https://rentry.org/hostfreellamas

  • guide on how to set up a local model for yourself or for others - free on your own machine

https://rentry.org/colabfreellamas

https://rentry.org/koboldcpp_colab_guide

  • guides on how to set up a local model for yourself or for others with KoboldCpp - free on Colab

https://docs.sillytavern.app/usage/local-llm-guide/how-to-use-a-self-hosted-model

  • from the SillyTavern docs, guide on how to set up a local model for yourself for use in SillyTavern - free on your own machine

https://vast.ai/docs/guides/oobabooga

  • guide on how to set up a local model for yourself or for others - paid (anon was shilling this, no idea how much it costs)

https://rentry.org/aicglocal

  • guide on how to set up a local model for yourself with KoboldCpp - free on your own machine (see the API sketch after this list for what the finished setup exposes)

https://rentry.org/llama_v2_sillytavern

  • from /lmg/, guide on how to set up a local model for yourself with KoboldCpp - free on your own machine

https://rentry.org/better-llama-roleplay

  • stolen from /lmg/, listed there under "LLaMA RP Proxy"

https://rentry.org/stheno-guide

  • from /lmg/, model-specific guide for Stheno-L2-13B with SimpleProxy, for after you've already got it working using the aforementioned guides

https://rentry.org/easylocalnvidia

  • guide on how to set up a local model for yourself with KoboldCpp - free on your own machine

https://rentry.org/ky239

  • borrowed from /CHAG/, guide on how to set up a local model for yourself with a lot of additional explanations - free on Colab

https://rentry.org/MixtralForRetards

https://rentry.org/mixtral_vastai_for_dummies

  • model-specific guides on how to set up Mixtral for yourself or for others with KoboldCpp - paid

https://rentry.org/MagnumProxy#localcloud-hosting

  • model-specific guide on how to set up Magnum for yourself with KoboldCpp
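
For reference, once any of the self-hosted guides above gets a backend running, what you end up with is just a local HTTP endpoint that you (or SillyTavern) can talk to. Below is a minimal Python sketch of poking that endpoint directly, assuming KoboldCpp's defaults (port 5001, KoboldAI-compatible /api/v1/generate); the URL, prompt, and sampler values here are placeholder assumptions, not something taken from the guides themselves.

```python
# Minimal sketch: query a locally running KoboldCpp backend.
# Assumes the default port 5001 and the KoboldAI-style /api/v1/generate
# endpoint; adjust the URL and values to match whatever your guide set up.
import json
import urllib.request

KOBOLDCPP_URL = "http://localhost:5001/api/v1/generate"  # assumed default

payload = {
    "prompt": "You are a roleplay assistant.\nUser: Hello!\nAssistant:",
    "max_length": 120,    # tokens to generate
    "temperature": 0.7,   # example sampler value, tune to taste
}

req = urllib.request.Request(
    KOBOLDCPP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read().decode("utf-8"))

# The generated continuation comes back under results[0].text
print(result["results"][0]["text"])
```

Pointing SillyTavern's KoboldCpp / Text Completion API connection at the same address should do essentially the same thing under the hood.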

Ayumi's LLM Role Play & ERP Ranking

https://rentry.org/ayumi_erp_rating obsolete
https://ayumi.m8geil.de/erp4_chatlogs/

  • stolen from /lmg/, a huge table of various models' performance in automated benchmarks; admits being flawed
    (attached pic is a snapshot from 23.08.2023 and is very old - thanks, Weird Constructor)

More rankings

https://rentry.org/lmg-13b-showdown obsolete

  • stolen from /lmg/, a small test of 13B models popular at the time

https://old.reddit.com/user/WolframRavenwolf/submitted/?sort=new

  • linked here, LLM comparisons with ST roleplay tests

https://snombler.neocities.org/logs dead

  • some tests of various LLMs specifically for long-form roleplay, performed by a botmaker

https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard garbage

  • stolen from /lmg/, looks like a ranking of foundation models

https://rentry.org/thecelltest obsolete

  • a small test of multiple models, checks understanding of a specific scenario

https://oobabooga.github.io/benchmark.html

  • stolen from /lmg/, a test consisting of 48 manually written, private multiple-choice questions

https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard

  • measures the amount of uncensored/controversial information an LLM knows and is willing to tell the user

https://huggingface.co/spaces/flowers-team/StickToYourRoleLeaderboard

  • compares LLMs on undesired sensitivity to context changes, i.e. whether they can stick to their role whatever the context



Sister rentries:

Email for feedback and suggestions:
