Raw GPT-J-6B step_383500_slim.tar.zstd

The original model checkpoint for the JAX codebase, as released by EleutherAI and used by the original GPT-J-6B colab. Supports only the top-p and temperature sampling options (see the sketch after the links below), but runs on a TPU.
magnet:?xt=urn:btih:def49bbe532984d1131864533b0ff32910b2894d&dn=step_383500_slim.tar.zstd
MEGA: https://mega.nz/file/rtwSFCia#e9kll5HSYajqkMQzztVJXr08iaU-cm0QZIctnKzVyc8
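For reference, here is a minimal sketch of what temperature plus top-p (nucleus) sampling does to a logits vector, written with PyTorch for readability. The function name and default values are illustrative and are not taken from the original colab.

    import torch

    def sample_top_p(logits, temperature=0.8, top_p=0.9):
        # Scale logits by temperature, then convert to probabilities.
        probs = torch.softmax(logits / temperature, dim=-1)

        # Sort tokens by probability and keep the smallest set whose
        # cumulative mass reaches top_p (the "nucleus"); the top token
        # is always kept.
        sorted_probs, sorted_idx = torch.sort(probs, descending=True)
        cumulative = torch.cumsum(sorted_probs, dim=-1)
        keep = cumulative - sorted_probs < top_p
        sorted_probs[~keep] = 0.0
        sorted_probs = sorted_probs / sorted_probs.sum()

        # Draw one token id from the truncated, renormalized distribution.
        choice = torch.multinomial(sorted_probs, num_samples=1)
        return sorted_idx[choice].item()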

Kobolded GPT-J-6B j6b_ckpt.tar

Converted to Torch, e.g. for use with the GPT-J-6B Rev 2 colab. Supports additional sampling options such as repetition penalty (see the sketch after the links below), but requires a GPU with more than 12 GB of VRAM.
magnet:?xt=urn:btih:7d6e0a0af0fa8f8effef096b9efc24f8e0a16021&dn=j6b_ckpt.tar
MEGA: https://mega.nz/file/CtIQDRpA#xJLlxR3culc6Vv2SyiZ06Cs8i3ioDZMK4Jy2wnHnSpE
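For reference, here is a minimal sketch of the repetition-penalty idea this checkpoint enables: logits of tokens already present in the context are scaled down before sampling. The function name and penalty value are illustrative; the Rev 2 colab's exact implementation may differ.

    import torch

    def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
        logits = logits.clone()
        for token_id in set(generated_ids):
            score = logits[token_id]
            # Divide positive scores and multiply negative ones so that
            # repeated tokens always become less likely.
            logits[token_id] = score / penalty if score > 0 else score * penalty
        return logits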

PyTorch Conversion Tool

Downloads the raw model from the-eye, converts it into a Torch-compatible model suitable for use with the Rev 2 colab, and saves it to your Google Drive.
https://colab.research.google.com/drive/1JPSg7b7gfUd499iHOBg95UcNJUj-mJik
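For reference, here is a minimal sketch of the "save it to your Google Drive" step as it typically looks in a Colab notebook: mount Drive, then copy the converted checkpoint into it. The paths below are hypothetical placeholders, not the ones the conversion notebook actually uses.

    import shutil
    from google.colab import drive

    # Mounting prompts for authorization the first time it runs.
    drive.mount("/content/drive")

    converted_checkpoint = "/content/j6b_ckpt.tar"              # hypothetical local path
    drive_destination = "/content/drive/MyDrive/j6b_ckpt.tar"   # hypothetical Drive path
    shutil.copy(converted_checkpoint, drive_destination)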

Pub: 12 Jun 2021 10:19 UTC
Edit: 13 Jun 2021 22:09 UTC