Leaked PixArt alpha model.
https://pixeldrain.com/u/jJUAsyiq
Runs out of VRAM on a 3090 when using the provided interface.py here https://github.com/PixArt-alpha/PixArt-alpha
Loading it with T5 in 8-bit seems to take about 14GB of VRAM; if you're incompetent, replace their diffusion/model/t5.py with this: https://pastebin.com/LZpT6Tj2
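Very roughly, the 8-bit trick boils down to something like this (sketch only; the repo's t5.py actually wraps T5 in its own T5Embedder class and loads the XXL weights from a local directory, so the model name and kwargs below are placeholders, not the exact patch):

import torch
from transformers import T5Tokenizer, T5EncoderModel

name = "google/t5-v1_1-xxl"  # placeholder id; the repo points at a local checkpoint dir

tokenizer = T5Tokenizer.from_pretrained(name)
text_encoder = T5EncoderModel.from_pretrained(
    name,
    load_in_8bit=True,   # this is the part that needs a working bitsandbytes
    device_map="auto",
)

tokens = tokenizer("a corgi wearing sunglasses", return_tensors="pt").to(text_encoder.device)
with torch.no_grad():
    emb = text_encoder(**tokens).last_hidden_state  # prompt embeddings fed to the DiT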
If you're loading T5 in 8-bit, you'll need a working bitsandbytes install. On Windows you can add it to your venv with:
python -m pip install bitsandbytes==0.41.1 --prefer-binary --extra-index-url=https://jllllll.github.io/bitsandbytes-windows-webui
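Quick sanity check that it imports and sees your GPU (just an assumption-free version/CUDA print, nothing official):

python -c "import bitsandbytes as bnb, torch; print(bnb.__version__, torch.cuda.is_available())"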