
fellow_nerd t1_ivsv8dh wrote

I'm using Hugging Face's Python transformers library with GPT-2. I'm trying to bias it to insert newlines, but I found that the logit scores of the newline tokens during generation are -inf. Is this a constraint imposed by the library itself so that generated text contains no newlines, or is it the GPT-2 model that won't produce them?
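
For anyone landing here with the same question: the supported way to nudge a token during generation is a custom LogitsProcessor passed to generate(). Below is a minimal sketch; the bias value, prompt, and generation settings are arbitrary placeholders, not anything from this thread. The clone() is there precisely because of the view pitfall described in the edits below.

    from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                              LogitsProcessor, LogitsProcessorList)

    class NewlineBias(LogitsProcessor):
        """Add a constant bonus to the newline token's logit at every step."""

        def __init__(self, newline_id, bias=4.0):
            self.newline_id = newline_id
            self.bias = bias

        def __call__(self, input_ids, scores):
            scores = scores.clone()          # avoid mutating the tensor in place
            scores[:, self.newline_id] += self.bias
            return scores

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    newline_id = tokenizer("\n")["input_ids"][0]   # 198 in GPT-2's BPE vocab

    inputs = tokenizer("Some text to continue", return_tensors="pt")
    out = model.generate(
        **inputs,
        max_new_tokens=60,
        do_sample=True,
        logits_processor=LogitsProcessorList([NewlineBias(newline_id)]),
    )
    print(tokenizer.decode(out[0]))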

EDIT: Ah, I set the scores of the newline tokens to -inf myself in earlier generations. Does this affect future text generation?

EDIT: Lesson learned. Slices of tensors are views: if you mutate the original, you mutate the slice.
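
A tiny demo of the pitfall, in case it's not obvious: basic indexing in PyTorch returns a view over the same storage, so writes to the original show up in the slice (and vice versa).

    import torch

    scores = torch.zeros(2, 5)
    row = scores[0]                 # basic indexing returns a view, not a copy
    scores[0, 3] = float("-inf")    # mutate the original...
    print(row[3])                   # tensor(-inf): ...and the view sees it
    print(row.data_ptr() == scores.data_ptr())  # True: same underlying storage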

EDIT: No, that was a lie; I just got my order of operations wrong.

EDIT: I did it. GPT-2 can now add paragraphs to horrible un-line-broken text from terrible MTL'd web novel chapters.
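
The thread doesn't spell out the final approach, but one way to get this effect with the same LogitsProcessor machinery is constrained decoding: at every step, allow only the next token of the source text or a newline, so the model copies the text verbatim and only decides where the breaks go. A sketch under those assumptions (greedy decoding, batch size 1; the class and all names are mine, not from the thread):

    import torch
    from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                              LogitsProcessor, LogitsProcessorList)

    class ParagraphInserter(LogitsProcessor):
        """Allow only the next source token or a newline at each step, so the
        model reproduces the source and only chooses where the breaks go."""

        def __init__(self, source_ids, newline_id, eos_id, prompt_len):
            self.source_ids = source_ids
            self.newline_id = newline_id
            self.eos_id = eos_id
            self.prompt_len = prompt_len

        def __call__(self, input_ids, scores):
            generated = input_ids[0, self.prompt_len:]
            # Position in the source = generated tokens minus inserted newlines.
            pos = int((generated != self.newline_id).sum())
            mask = torch.full_like(scores, float("-inf"))
            if pos < len(self.source_ids):
                mask[0, self.source_ids[pos]] = 0.0  # next source token...
                mask[0, self.newline_id] = 0.0       # ...or a paragraph break
            else:
                mask[0, self.eos_id] = 0.0           # source exhausted: stop
            return scores + mask

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    text = "one long unbroken blob of machine-translated prose ..."
    source_ids = tokenizer(text)["input_ids"]
    newline_id = tokenizer("\n")["input_ids"][0]

    inputs = tokenizer(tokenizer.bos_token, return_tensors="pt")
    prompt_len = inputs["input_ids"].shape[1]
    processor = ParagraphInserter(source_ids, newline_id,
                                  tokenizer.eos_token_id, prompt_len)
    out = model.generate(
        **inputs,
        max_new_tokens=2 * len(source_ids) + 1,
        do_sample=False,
        logits_processor=LogitsProcessorList([processor]),
    )
    print(tokenizer.decode(out[0][prompt_len:]))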
