Run Llama 2 Web UI on Colab or LOCALLY!

Llama 2 is the latest model from Meta (Facebook), and this tutorial shows you how to run the 4-bit quantized Llama 2 model on the free Colab tier.
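The Colab itself runs everything through the text-generation-webui interface, but as a rough illustration of what 4-bit loading looks like in code, here is a minimal sketch using Hugging Face transformers + bitsandbytes. The model ID, quantization settings, and prompt are assumptions for the example, not settings taken from the video:

# Minimal sketch: load Llama 2 7B Chat in 4-bit with transformers + bitsandbytes.
# Assumes you have accepted the Llama 2 license on Hugging Face and have a GPU available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed model ID for illustration

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # 4-bit quantization
    bnb_4bit_compute_dtype=torch.float16,  # do the matmuls in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                     # place layers on the available GPU
)

prompt = "Explain what 4-bit quantization does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))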

Camenduru's Repo https://github.com/camenduru/text-generation-webui-colab

Colab used in the video - https://colab.research.google.com/github/camenduru/text-generation-webui-colab/blob/main/llama-2-7b-chat.ipynb

To run locally (on Linux):

Download the Colab notebook as an .ipynb file
Run all cells (make sure you have a GPU set up and ready); see the sketch after these steps
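As a minimal sketch of the local run, the downloaded notebook can be executed programmatically with nbformat + nbclient. The filename and the use of nbclient are assumptions; you could just as well open the notebook in Jupyter and choose "Run All":

# Minimal sketch: execute the downloaded Colab notebook locally.
# Assumes the file is saved as "llama-2-7b-chat.ipynb" and that
# nbformat and nbclient are installed (pip install nbformat nbclient).
import nbformat
from nbclient import NotebookClient

nb = nbformat.read("llama-2-7b-chat.ipynb", as_version=4)
client = NotebookClient(nb, timeout=None)  # no per-cell timeout; model download is slow
client.execute()

nbformat.write(nb, "llama-2-7b-chat.executed.ipynb")  # save the executed copy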


❤️ If you want to support the channel ❤️
Support here:
Patreon - https://www.patreon.com/1littlecoder/
Ko-Fi - https://ko-fi.com/1littlecoder