Write With Transformer uber/pplm

PPLM builds on top of other large transformer-based generative models (like GPT-2), enabling finer-grained control over attributes of the generated text (e.g. gradually switching topic 🐱 or sentiment 😃).
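At its core, PPLM steers generation by nudging the model's activations up the gradient of an attribute model, the simplest being a bag of words for a topic. A minimal NumPy sketch of that idea, applied directly to a toy next-token distribution (the vocabulary, bag indices, logits, and step size here are illustrative assumptions, not real GPT-2 values):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy vocabulary and logits -- illustrative values, not real GPT-2 outputs.
vocab = ["the", "dog", "cat", "kitten", "ran"]
bag_idx = [2, 3]                      # hypothetical "cat" topic bag of words
logits = np.array([2.0, 1.0, 0.0, -1.0, 0.5])

before = softmax(logits)[bag_idx].sum()   # topic mass before perturbation

step_size = 0.5
for _ in range(10):
    p = softmax(logits)
    S = p[bag_idx].sum()
    mask = np.zeros_like(p)
    mask[bag_idx] = 1.0
    # Gradient of the bag-of-words attribute loss -log(S) w.r.t. the logits
    grad = p - mask * p / S
    logits -= step_size * grad            # nudge the logits toward the topic

after = softmax(logits)[bag_idx].sum()    # topic mass after perturbation
```

Repeating this small gradient step is what makes the topic shift gradual rather than abrupt: topic-word mass (`after`) grows a little with each iteration while the rest of the distribution is left largely intact.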

⚠️ 🐍 We had to turn off the PPLM machine as it was costly to host – try it locally using transformers, or contact us if you really need it as a hosted service. 🐍 ⚠️

Written by Transformer · transformer.huggingface.co 🦄
Model & decoder settings
- Bag-of-words
- Discriminators: pplm
- Step size
- KL-scale
- GM-scale
- Num iterations (impacts generation time)
- Gen. length (impacts generation time)
- Use sampling
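How these knobs interact can be sketched in NumPy: the step size scales the gradient nudge toward the topic bag, repeated for the given number of iterations; the KL-scale penalizes drift from the unperturbed distribution; and the GM-scale geometrically fuses the perturbed and original distributions before sampling. All values and word indices below are illustrative assumptions, not the demo's defaults:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
orig_logits = rng.normal(size=8)      # stand-in for the model's logits
pert_logits = orig_logits.copy()
bag_idx = [2, 5]                      # hypothetical topic-word ids

step_size, num_iterations = 0.3, 5    # "Step size", "Num iterations"
kl_scale, gm_scale = 0.01, 0.9        # "KL-scale", "GM-scale"

p_orig = softmax(orig_logits)
for _ in range(num_iterations):
    p = softmax(pert_logits)
    S = p[bag_idx].sum()
    mask = np.zeros_like(p)
    mask[bag_idx] = 1.0
    # Bag-of-words loss gradient, as in the sketch above
    grad_bow = p - mask * p / S
    # KL(p || p_orig) gradient, keeping p close to the unperturbed distribution
    kl = np.sum(p * np.log(p / p_orig))
    grad_kl = p * (np.log(p / p_orig) - kl)
    pert_logits -= step_size * (grad_bow + kl_scale * grad_kl)

# Geometric-mean fusion of perturbed and original distributions
p_pert = softmax(pert_logits)
p_fused = p_pert**gm_scale * p_orig**(1 - gm_scale)
p_fused /= p_fused.sum()
```

With "Use sampling" on, the next token would be drawn from `p_fused`; raising the KL-scale or lowering the GM-scale pulls generation back toward the unmodified model, which is why both settings trade fluency against control strength.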