Try Llama-3 in a Colab Notebook (colab.research.google.com)
5 points by danielhanchen on April 18, 2024 | 1 comment


Made a Colab notebook to finetune Llama-3 2x faster with 70% less VRAM on free T4 GPUs, and you can also run inference with it directly!
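
For anyone who hasn't used the notebook: this is not its actual code, just a rough sketch of what a 4-bit LoRA finetune of Llama-3 8B looks like with plain transformers + peft + bitsandbytes. The notebook layers custom kernels on top of this general recipe to get the 2x speed and 70% VRAM savings, and the model name and hyperparameters here are illustrative (the official meta-llama repo requires accepting the license on the HF Hub):

    # Sketch: 4-bit QLoRA-style finetuning setup for Llama-3 8B on a Colab T4
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model

    model_id = "meta-llama/Meta-Llama-3-8B"  # gated repo; needs HF access approval

    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,                     # 4-bit weights fit the 8B model in 16 GB
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.float16,  # T4 (Turing) has no bfloat16 support
    )

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, quantization_config=bnb_config, device_map="auto"
    )

    # Attach small LoRA adapters so only a few million parameters are trained,
    # which is what keeps VRAM usage low enough for a free T4.
    lora = LoraConfig(
        r=16, lora_alpha=16, lora_dropout=0.0,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()

From there you'd hand the model and a tokenized dataset to your trainer of choice; the same 4-bit model can also be used as-is for inference.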

15 trillion tokens is yikes!



