I was asked about my process for creating neural net poetry, and thought it would be fun to write it out. If you want to have a look at my stuff, you can get my zines here. Everything is free but donations are greatly appreciated.

So I have two distinct ways of generating neural net poetry, really. I’m going to write this as a guide in case anyone else wants to give it a try.

The first, and my favourite way, is when i’m working with the base model of gpt-2 - this means i’m just using it as it comes, without any fine-tuning of my own, through an interface like talktotransformer.com or through google colab. What I do is, I either write 3-4 lines of poetry to use as a prompt, or I take 3-4 lines of a poem i like (either by me or by someone like Keats or Emily Dickinson) and use that as a prompt.
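
(If you go the colab route and want a sense of what the code side looks like, here’s a minimal sketch using the huggingface transformers library, which also serves up gpt-2. The prompt is the opening of an Emily Dickinson poem, and the sampling settings are just starting points i’d tweak - this isn’t the code behind talktotransformer.com, it’s only an illustration:)

    # a minimal sketch: sampling from base gpt-2 with huggingface transformers.
    # sampling settings are illustrative starting points - tweak to taste.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = ("Because I could not stop for Death -\n"
              "He kindly stopped for me -\n"
              "The Carriage held but just Ourselves -\n"
              "And Immortality.")

    # ask for a few continuations at once, so there's something to choose from
    outputs = generator(prompt, max_length=150, do_sample=True,
                        temperature=0.9, num_return_sequences=3)

    for out in outputs:
        print(out["generated_text"])
        print("---")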

Usually I will use those lines a few times to get an idea of what kind of output it’s going to give me, and then I start to adjust. So if it’s starting to give me some biblical-sounding nonsense, which is a hazard with anything vaguely old-fashioned, I will try to change the language a bit to replace any words that i think might be causing that. I might also take nouns out and replace them with weirder, more ostentatious nouns. I will probably try to include lots of different types of nouns, etc. - I don’t want them all to be similar to each other, or all in the same register as each other. It can be nonsense - nobody is going to see this but me, so it doesn’t have to read like good poetry on its own. But each word i use is giving the neural net something to generate from… i like to use language that is specific and unusual, so that i’m more likely to get something unusual as my output. And i want to use words that contrast with each other for similar reasons. I don’t want everything to be super pastoral or super sad, unless i’m going in with an idea of a type of pastoral poem that i want to make, i guess.

So, let’s say i start with the opening lines of The Waste Land:

    April is the cruellest month, breeding
    Lilacs out of the dead land, mixing
    Memory and desire, stirring
    Dull roots with spring rain.

Now, these could get me something interesting! But i’m more interested right now in the syntax than the words themselves. So i want to start replacing nouns with other nouns, adjectives with other adjectives, and so on. And i’m going to remove that last full stop, so the neural net doesn’t think that this sentence has ended and it has to start with something completely new. I’m also going to edit it so that the lines don’t start with capital letters. I play around with this - sometimes i get good results with capitalised lines, sometimes it works better without. It’s all about what kind of response the neural net gives to it, in combination with what you’ve written.
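
(If you’re scripting this rather than typing into a box, that little cleanup step is easy to automate. A toy sketch - prep_seed is just a name i made up for illustration:)

    # drop the trailing full stop and lowercase the first letter of each line,
    # so the model treats the seed as an unfinished sentence
    def prep_seed(text: str) -> str:
        text = text.rstrip().rstrip(".")
        return "\n".join(line[:1].lower() + line[1:]
                         for line in text.splitlines())

    print(prep_seed("April is the cruellest month, breeding\n"
                    "Lilacs out of the dead land, mixing\n"
                    "Memory and desire, stirring\n"
                    "Dull roots with spring rain."))
    # -> april is the cruellest month, breeding ... dull roots with spring rain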

So, after playing Eliot madlibs, i might end up with a seed text that looks something like this:

    october is the oldest month, baking
    roses in the dead land, mixing
    sumac and seaweed, singing
    root worms with rituals all night

It follows the loose structure/syntax of The Waste Land (although i’ve swapped “spring rain” for “rituals all night”, because any formulation to do with weather often brings lots of boring weather-y results), but it’s a bit garbled, and pretty goth. I’ve put in specific food words, and some words to do with magic and ritual, because that’s what i’d like to see in the text i generate. This is how the first response looks:

    for the harvest to come: crack
    the shells of the nightbugs and hear
    tongues made of twigs screeching
    in fury and hunger. open
    your hand into the marrow of the
    earth. hang tight to the outer
    shell of a beetle that was already dead
    before you knew it. swirl
    the guts of your artichoke
    into a whirlpool in a waltz
    to remember. sing to see if the tide
    will come.


I like this! It actually feels suspiciously like a poem so i’m going to pick a few phrases to google to make sure it’s not just recycling a poem the neural net was trained on… ok, no results for “tongues made of twigs” or “guts of your artichoke”, so this seems to be a neural net-written original. If i’m lucky enough to get something like this, i might decide it’s basically there already. I cut off anything extraneous at the beginning or end (in this case it gave me part of a biography of a poet the net had made up to go with the poem) and might edit linebreaks (i don’t like ‘sing to see if the tide / will come’, i’d probably put the linebreak elsewhere) but otherwise it’s kind of complete, imo. The work was all done before the generating.

Ok, so i’m going to hit generate again a few times. It’s not giving me anything else that feels as complete, but i like parts of this:

    saying grace after every bowl
    of soup. i have heard
    of weddings now, heard
    sweet souls said goodbye to one
    another in song as the winter
    rot set in and blossoms fell.
    now, almost in full leaf
    the agapanthus bursts through the door,
    sending up their lacy, winged blossoms,
    light as feathers, but everywhere
    an abiding darkness is beating
    and words that make me cry:
    —and now, with aching beauty,
    let us sing:
    sing it out for those left behind
    if they are part of you—

Some of these lines are cool, but it’s too long and some of it is boring and cliched imo. I don’t like “aching beauty” or “light as feathers”.

So, as is the case here, sometimes i might get a response that i don’t find as interesting overall, but there might be a few lines in there i like, or i might get a response i like, but it cuts off before it reaches what i think the end should be. In that case, i’ll copy across what i like into a google docs file and keep generating, and see if the same seed throws up anything else that seems to go well with it. I might change a word or two in the seed text and see if that tweaks my responses. I might also edit lines that the neural net has given me, if i mostly like them but wish a word was different. It’s fine – this isn’t supposed to be a science experiment, i’m just trying to make something cool.

Or, if i want to try and make something that directly follows on from text i’ve already generated, i might just copy the last few lines of that into the neural net and generate based on that.
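
(Scripted, that trick looks something like this - reusing the generator pipeline from the sketch further up; keep_lines is just an illustrative knob, not anything official:)

    # feed the last few lines of an earlier generation back in as the new prompt
    def continue_from(generator, generated: str, keep_lines: int = 3) -> str:
        tail = "\n".join(generated.strip().splitlines()[-keep_lines:])
        out = generator(tail, max_length=150, do_sample=True,
                        temperature=0.9, num_return_sequences=1)
        return out[0]["generated_text"]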

Stitching lines and stanzas together can take time. Sometimes i’m happy sticking with a small fragment. But it can be really fun to put a longer poem together. I don’t worry about having a narrative, or having something that makes any kind of sense. It’s all about the images, and the sounds.

It might take a few tries to get anything you’re interested in, even with a good seed text. Neural nets are not reliable - they’re not consciousnesses creating work, of course, just strange tools that generate text, and they will sometimes get distracted by a stray word in a different register, and suddenly spit out text that looks like it belongs in a technical manual, or an email, or harry potter fanfiction. Or the output might just repeat the word “october” over and over again for some reason. Whatever. Try it a few more times before you decide to change to something else.

I hope this is clear! Let me know if you have any more questions.

The other way i generate involves training a gpt-2 model on a corpus of poetry by a writer like Emily Dickinson or John Keats, and then giving it prompts like the ones above, or prompts that are just the title of a poem, and letting it go wild. This can give good results, but it’s a bit trickier to do. If you’d like to give it a try, Max Woolf has a Google Colab notebook that walks you through the process of training a small gpt-2 model, and then you just need to play around with whatever prompts interest you.
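
(For a rough idea of what the notebook has you do, here’s a sketch of the gpt-2-simple library it’s built on - the corpus filename and the title prompt are made-up placeholders standing in for your own:)

    # fine-tune the small gpt-2 on a plain-text corpus with gpt-2-simple,
    # then generate from a title-style prompt. filenames here are placeholders.
    import gpt_2_simple as gpt2

    gpt2.download_gpt2(model_name="124M")         # the small model

    sess = gpt2.start_tf_sess()
    gpt2.finetune(sess, dataset="dickinson.txt",  # one big text file of poems
                  model_name="124M", steps=1000)

    gpt2.generate(sess, prefix="The Moth and the Sea",
                  length=200, temperature=0.9)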

Max Woolf’s notebook also lets you prompt the base model without any fine-tuning, and it’s free, so if you run out of your allowance on talktotransformer.com that’s the best place to go next.