Experimenting with the GPT-2 models and text generation
I’m experimenting with the GPT-2 774M model to generate text from a prompt. Starting it up with: python3 src/interactive_conditional_samples.py --temperature=0.7 --model_name=774M --nsamples=1 --length=100 and then providing a relevant prompt such as: “The problem with training Machine Learning models today is that the model is only as good as the data it is trained …
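The --temperature flag controls how sampling reshapes the model’s output distribution: logits are divided by the temperature before the softmax, so values below 1.0 (like 0.7 here) sharpen the distribution toward the most likely tokens, while values above 1.0 flatten it. A minimal sketch of that mechanism, using made-up logits rather than real GPT-2 output:

```python
import math
import random

def sample_with_temperature(logits, temperature=0.7, rng=None):
    """Sample a token index from logits after temperature scaling."""
    # Divide logits by temperature: lower temperature sharpens the
    # distribution (more deterministic), higher flattens it (more random).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the resulting categorical distribution.
    rng = rng or random.Random(0)
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1

# Hypothetical logits for a 3-token vocabulary; at a very low temperature
# the highest logit dominates and is picked almost every time.
token = sample_with_temperature([2.0, 1.0, 0.1], temperature=0.01)
```

With temperature 0.7, as in the command above, the distribution is only mildly sharpened, so generations stay varied while avoiding the least likely tokens.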