You can run that and deploy your own GPT-2 API locally or in the cloud. I'd also recommend checking out Max Woolf's gpt-2-simple, a library that makes it really easy to fine-tune GPT-2 on your own text (if you've played the free version of AI Dungeon, this is how they trained their model):
Out of the box, probably not. GPT-3 is pretrained on a much larger corpus and is a much higher-capacity model, which is why it's been performing so well on zero-shot learning examples like the one above. That being said, you should try it.
Note that the huggingface library is a Python library, so you'd technically be writing some code, though its API is super user-friendly and the community is very helpful.
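To show how little code that actually means, here's a minimal generation example with the huggingface transformers pipeline API. The prompt and generation settings are arbitrary choices for illustration, and the "gpt2" checkpoint (the small 124M model) downloads on first run:

```python
from transformers import pipeline

# "gpt2" is the 124M checkpoint; "gpt2-medium", "gpt2-large", etc.
# swap in the same way if you have the RAM for them
generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation of an arbitrary prompt
result = generator("The meaning of life is", max_new_tokens=20)
print(result[0]["generated_text"])
```

That's the whole program: the pipeline handles tokenization, model loading, and decoding for you.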