We are very nearly ready. So, coming up next, just to let you know: you are in Tent C. Oh, Tent C! I had no idea. This is my favorite tent of all of them. Don't tell Tent A, it'll get jealous. But here in Tent C today you are going to be seeing an amazing talk by Barney Livingston, talking about his Poetoid Lyricam. Have I pronounced that right? Yes? Sure, why not. So what I need is for you to start clapping, but very, very gently. Can you start clapping very gently? Yeah, and now just bring it up a bit, a bit more, a bit more, a bit more. Please welcome the Poetoid Lyricam!

Thank you. My name is Barney, and I'd like to talk to you about this Poetoid Lyricam thing that I made. Before I start, I'd like to give a disclaimer in case there are any real poets in the room: what this produces is probably best described as prose poetry, or free verse, or maybe bad poetry. If there are any actual poets here, I'm quite interested in collaborating to improve the poeticalness of it.

Okay, so I'll start with a demonstration. I'm going to take a poem of you all. In case you're worried about being photographed: the actual image is only kept briefly and is then thrown away, and there's no network or anything. If you're still worried, then hide your face now. We have to wait for it to develop, and I've no idea what it's going to say, so I apologize if it's less than complimentary.

Okay: "The city. A group of people sitting on a bench. Many images of people on a city street. People are sitting while standing in a street under flags. City standing kids. The woman is talking near other kids standing in the street. Kids."

Okay, so how does that work? Inside there is a Raspberry Pi B+. Until recently there was a 3B, but I seem to have killed the serial on that recently. There's a Pi camera in the front, a nano thermal printer from Adafruit, and two 18650 lithium batteries in series to make about 8 volts, with a Pimoroni Wide Input SHIM attached to the Pi. The printer wants 5 to 9 volts, and the Pi and SHIM want 3 to 16 volts.
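The voltage arithmetic here can be checked quickly. Assuming typical 18650 figures of roughly 3.0 V per cell empty and 4.2 V per cell full (my numbers, not from the talk), two cells in series span about 6.0 to 8.4 V:

```python
# Approximate per-cell voltage range for an 18650 lithium cell.
CELL_MIN, CELL_MAX = 3.0, 4.2

# Two cells in series: voltages add.
pack = (2 * CELL_MIN, 2 * CELL_MAX)   # (6.0, 8.4)

def within(pack, lo, hi):
    """True if the whole pack range sits inside the device's input window."""
    return lo <= pack[0] and pack[1] <= hi

print(within(pack, 5.0, 9.0))    # printer window, 5-9 V -> True
print(within(pack, 3.0, 16.0))   # Pi + Wide Input SHIM window, 3-16 V -> True
```

So the pack stays inside both input windows over its whole charge range.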
So they both work within the range of the batteries, and the batteries tend to last more than a day, I find.

As for the camera itself: it was originally called the Super Color Swinger, made by Polaroid in about 1975. It was a pretty terrible camera and the film is no longer made, so I don't feel guilty about gutting it. It has the advantage of having a lot of space inside, and I was able to keep the shutter button working. It also has a handy locking function.

This is the third version of this. The first version I made for EMF in 2016. It used to send the poem to a server in my house via the Wi-Fi, and if you remember EMF in 2016, the Wi-Fi was not very reliable, so the camera wasn't very reliable. So I was determined to make it entirely self-contained, which I've done. The second iteration was much too slow: it took close to a minute for the poem to come out. But I've spent quite a lot of time fixing that, and as you saw, it's reasonably quick now.

Before I talk about the intricacies of the software, I'd like to go back and have a look at the history of how this project came about. 1916, during the First World War.
A group of artist refugees gathered in Zurich. Their reaction to the horror going on around them was to abandon the usual forms of art and embrace the absurd. They chose a name for their group by selecting a word at random from a French-German dictionary. That name was Dada.

A prominent member of the group was the Romanian poet Tristan Tzara. The Dadaists wrote a lot of manifestos, and in 1920 he wrote the manifesto on feeble love and bitter love, in which he gave instructions for how to make a Dadaist poem:

To make a Dadaist poem: take a newspaper. Take some scissors. Choose from the paper an article of the length you want to make your poem. Cut out the article. Next, carefully cut out each of the words that make up this article and put them all in a bag. Shake gently. Next, take out each cutting one after the other. Copy conscientiously in the order in which they left the bag. The poem will resemble you. And there you are: an infinitely original author of charming sensibility, even though unappreciated by the vulgar herd.

This became known as the cut-up technique. Several people across literature and music have employed it. The Beat author William S. Burroughs used it in several of his books, after his friend and collaborator Brion Gysin rediscovered the technique by accident while cutting out a mount for a drawing with a Stanley blade on top of a pile of newspapers. Thom Yorke used it on Radiohead's Kid A album. And also this chap:

"When we were in Los Angeles in '74, you were still using that technique of cut-ups. Do you still use it?"
"Yeah, increasingly so, to a great extent on Outside. Even, say, on the new album, Earthling: if you put three or four disassociated ideas together and create awkward relationships with them, the unconscious intelligence that comes from those pairings is really quite startling sometimes, quite provocative. A friend of mine in San Francisco developed a program for me on the computer which enables me to do it really quickly."

"So instead of going through the laborious process of cutting things up, you use your computer?"

"Yeah, and you can work with far more material. So I'll take articles out of newspapers, poems that I've written, pieces of other people's books, and put them all into this little warehouse, this container of information, and then hit the random button, and it'll randomize everything, and I'll get reams of paper back out of it with interesting ideas. And then I'll either take sentences verbatim as it spews them out, or there might be something within a sentence which triggers off an idea."

I love how pleased he is with this program. I think I'd have liked to have been David Bowie's software developer.

Around about this time, I was born. Growing up, I suppose the influence of the Dadaists came to me through this country's tradition of absurdist comedy: Spike Milligan and Monty Python, Vic and Bob, all that lot.

Probably my first encounter with generating text came in the form of Mrs Hathaway's Knickers. This was a children's party game invented by my grandfather. Each child was given a piece of paper with a noun or noun phrase written on it. The adult would start reading from a book; at some point they would point at a child, and that child would then read their word. For example: "Mrs Hathaway's knickers", Mrs
Hathaway being one of my mom's teachers. This was hilarious.

Like many of you here of my age, I learned to program on a BBC Micro. In fact, the exact BBC Micro is over in the bar, showing Twitter, at the moment. I was about 13. My friend Peter Jones (not that one) and I collaborated on a program called Hoed. I'll demonstrate it now. As you can see, the first line is based on the title of the poem by the Poet Master Grunthos the Flatulent of the Azgoths of Kria, the second worst poets in the galaxy, from the section on Vogon poetry in Douglas Adams's Hitchhiker's Guide.

This is basically a more sophisticated version of Mrs Hathaway's Knickers, in that as well as nouns it incorporates verbs, adjectives, and present participles. It consists of a large amount of very buggy spaghetti code, and you can probably see some bugs sometimes. But at its heart there are two lines: two template lines with placeholder variables that are replaced by words or chunks of text selected at random from lists of the appropriate type. This is similar to the game Mad Libs. With the right choice of templates and carefully crafted lists, this technique can be used to produce some interesting and amusing results. Much better than ours, certainly.

Factbot 1 is a Twitter bot written by Eric Drass which plays with the idea that people on Twitter generally will believe plausible-sounding facts. This bot dates back to before the current fake news disaster.
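The two-line template trick can be sketched in a few lines of Python. The word lists and templates below are invented for illustration; they are not the ones from Hoed:

```python
import random
import re

# Invented example lists; the real program had larger, better-crafted ones.
WORDS = {
    "noun": ["teapot", "axolotl", "gruntbuggly"],
    "verb": ["gurgles", "lurks", "shimmers"],
    "adj":  ["turgid", "luminous", "plaintive"],
}
TEMPLATES = [
    "oh {adj} {noun}, thy {noun} {verb} like a {adj} {noun}",
    "see how the {noun} {verb} upon the {adj} {noun}",
]

def fill(template: str) -> str:
    """Replace each {slot} independently with a random word of that type."""
    return re.sub(r"\{(\w+)\}",
                  lambda m: random.choice(WORDS[m.group(1)]),
                  template)

print(fill(random.choice(TEMPLATES)))
```

Because each placeholder is substituted independently, the same slot type can produce different words within one line, which is most of the fun.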
You can hear more about it from him, in fact, shortly after this talk on Stage B. Frequently it's not quite as plausible, though.

The Yoko Ono Bot, by Rob Manuel of B3ta, combines templates based on Yoko Ono's tweets and writings with text from several lists of things like foods, computers, and celebrities.

Tracery is a JavaScript library written by Kate Compton which uses grammars described in JSON to produce text in a similar way. So these words are replaced by items from these lists, and the results come out like that. This is used by a site called Cheap Bots Done Quick, created by George Buckingham, that allows anyone to make Twitter bots easily. And many have, and I think they tend to make Twitter slightly less horrible than it is generally. These are a couple of my favorites. This one is a kind of Dungeons and Dragons set in Ikea. Have some tasty bargains.

So, now back to stuff I made, and Markov chains. I first discovered these when playing with the Dissociated Press program built into the text editor Emacs, which is very strange in itself, as I'm a vi user. Markov chains are built from existing bodies of text, which are cut up, usually into single words, and then a graph is built which encodes the likelihood that a word will be followed by another word. The graph can then be traversed randomly to build a new text, similar to the original and usually somewhat grammatically correct.

It's probably easier if I demonstrate. Right, so at the top is a sentence that we're going to take apart and turn into a graph. I start with "a", so we have an edge from start going to "a". Then we take "Markov", which comes off "a"; then "chain", then "is", and then from "is" we go back to "a"; then "stochastic" comes off "a", then "model", and then stop: that's the end of the sentence. Okay, and then we have another sentence, so, same thing: we begin at the start node. Then we have "Andrey", "Markov".
We already have "Markov", so we go back to there. "Was"; here's "a" again, back to the "a" node, a popular node. "Russian", "mathematician", stop. Okay, final sentence: "a cat is a furry mammal". So what happens this time? Start, then to "a", but we already have an edge from start to "a", so we update that edge to two. Then "cat" comes off "a", and then "is", and then we already have an edge from "is" to "a", so that goes to two, and then "furry", "mammal", and stop.

Okay, so now we have a complete graph of our text, and I can show you what happens if we traverse it. So, start at the start node. Then, for instance, it's quite likely that we go to "a", because that edge is weighted two against one. So, "a"; then we could go to "cat"; then we go to "is"; then we have to go back to "a", and then we can go anywhere we like, so we could say "stochastic", "model": "a cat is a stochastic model". There we go. Okay, we do it again: start, then maybe this time we get "Andrey Markov was a furry mammal". And so, yeah, that's how Markov chains work for generating text. Generally, the larger the body of text you feed it, the better the results, because it has more variety.

Sam our Cosgrave is a Twitter bot I made based on Markov chains. What it does is regularly search Twitter for the hashtag #haiku and record what it finds. It's been running for several years now and has accumulated close to 900,000 haikus. It then uses these to build three Markov chains, one per line. There's another example of its output here.

Inspire Ration is another Markov bot I made. This one feeds on thousands of inspirational-quote-type texts. Markov chain text tends to be quite incoherent, but sometimes it will spit out the occasional gem.

Eventually, I got bored with Markov chains and moved on to AI.
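The graph-building and traversal just demonstrated fits in a few lines of Python. This is a minimal sketch (the function names are mine, and the corpus is the one from the slides), counting edge weights in a dictionary and picking each next word in proportion to its weight:

```python
import random
from collections import defaultdict

START, STOP = "<start>", "<stop>"

def build_chain(sentences):
    """Build a weighted graph: graph[a][b] counts how often
    word b follows word a in the corpus."""
    graph = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        words = [START] + sentence.lower().split() + [STOP]
        for a, b in zip(words, words[1:]):
            graph[a][b] += 1
    return graph

def generate(chain):
    """Walk the graph from START, choosing each next word with
    probability proportional to its edge weight, until STOP."""
    word, out = START, []
    while True:
        nxt = chain[word]
        word = random.choices(list(nxt), weights=list(nxt.values()))[0]
        if word == STOP:
            return " ".join(out)
        out.append(word)

chain = build_chain([
    "a Markov chain is a stochastic model",
    "Andrey Markov was a Russian mathematician",
    "a cat is a furry mammal",
])
print(generate(chain))
```

Running this a few times produces exactly the kind of output from the demo, including the occasional "a cat is a stochastic model".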
I started playing with character recurrent neural networks (char-rnn) in the Torch framework. These can be used in kind of a similar way to Markov chains, in that you feed them a large body of text and then they learn how to produce text that looks somewhat like the original. So I had a laugh with that and produced some bizarre new episodes of Friends.

Then I found the NeuralTalk2 model. This is a neural network that takes an image as input and produces captions. It's trained on the MS COCO dataset, which is a set of about 300,000 photos with five captions each. This is what I used to create the original Lyricam, and I noticed, when I was playing with it, that the captions had a kind of vaguely poetic feeling.

In November 2016 I took part in NaNoGenMo, National Novel Generation Month, similar to NaNoWriMo, but where the aim is to spend a month generating a novel of 50,000 words. This is the result: AI AI. This is an AI's take on the film A.I. I extracted 5,036 stills from the movie and had NeuralTalk2 produce a caption for each. Then I formatted the captions into sentences, possibly joining two; then paragraphs from sets of sentences; and chapters from a number of paragraphs. Chapter titles are the most common word of five or more letters from the chapter that has not already been a chapter title. Finally, I formatted it all through LaTeX to produce a PDF, and my friend Libby was kind enough to get one printed for me.

I'll read you a sample. Chapter 9: Picture. "The nameless are located on the slippery side of the indiscernible. A man is taking a picture of a mirror. Lighted man with a train bike looking bad at window, and a man is taking a picture of himself in a mirror. There is a man in a restaurant using the phone. A woman is holding a cat in a kitchen."

I'm afraid you can't get much sense of the film from the book. It occasionally talks about children and teddy bears, which I gather are in the film.
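That chapter-title rule is simple enough to sketch. `chapter_title` is my name for it, not the author's, and the fallback string is an assumption:

```python
import re
from collections import Counter

def chapter_title(chapter_text: str, used: set) -> str:
    """Pick the most common word of five or more letters in the
    chapter that hasn't already been used as a chapter title."""
    words = re.findall(r"[a-z]{5,}", chapter_text.lower())
    for word, _count in Counter(words).most_common():
        if word not in used:
            used.add(word)
            return word
    return "untitled"   # fallback if every candidate is taken

used = set()
print(chapter_title("a man is taking a picture of a mirror picture", used))
# prints "picture" (it appears twice; "taking" and "mirror" once each)
```

`Counter.most_common` yields candidates in descending frequency, so the first one not already claimed by an earlier chapter wins.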
I've not actually seen it.

So, now back to the camera. Torch doesn't work on Raspberry Pis, so I had to switch to using TensorFlow, and the im2txt model, which is much the same as NeuralTalk2, and similarly trained on the MS COCO dataset. When the shutter's pressed, an image is captured and stored in a Python array. This is then passed to the neural net, and four captions are generated. NeuralTalk2 had an option called temperature, which you could use to adjust the sensibility of the output; I generally had it turned down to quite insensible. im2txt is more sensible, in that it always tries to create the most accurate captions possible, so I had to modify it to dumb down the search code and introduce a random element.

The captions are then cleaned up and fed through the EngTagger library, which tags each word with a part of speech: nouns, verbs, adjectives, prepositions, etcetera. Then there's a random chance that it will break the line on a noun or a verb, a conjunction like "and" or "or", or a preposition like "for" or "of". This kind of makes a vaguely poem-shaped block of text. Sometimes it will repeat some of the nouns in list form. Finally, the result is sent to the printer.

So that's it. As I said at the start, I'd love to improve the quality of the poetry, and I'd be interested in collaborating with people on that. There's a lot of scope for employing various techniques for transforming the captions. I'd also like to investigate changing the neural net itself to be more poetic; the only reason I haven't looked into this is that, on the hardware I've got, training the model from scratch would probably take weeks.

I'll be wandering around for the rest of the event with my poet's hat on, taking poems, so if you'd like one, please ask. These are my contact details. I should also thank my employer, Lobster Pictures, for being understanding while I've prepared this talk and prepared for EMF. Thank you.

Does anyone have any questions for our speaker?
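The line-breaking step described in the talk can be sketched like this. A real implementation would use a proper part-of-speech tagger (EngTagger, per the talk); here a tiny hand-made tag table stands in for it, and `poemify` is my name, not the author's:

```python
import random

# Toy part-of-speech table standing in for a real tagger.
# CC = coordinating conjunction, IN = preposition.
TAGS = {"and": "CC", "or": "CC", "of": "IN", "for": "IN",
        "in": "IN", "on": "IN", "near": "IN"}
BREAK_AFTER = {"CC", "IN"}

def poemify(caption: str, chance: float = 0.5) -> str:
    """Randomly break a caption into lines after conjunctions and
    prepositions, producing a vaguely poem-shaped block of text."""
    lines, line = [], []
    for word in caption.split():
        line.append(word)
        if TAGS.get(word) in BREAK_AFTER and random.random() < chance:
            lines.append(" ".join(line))
            line = []
    if line:
        lines.append(" ".join(line))
    return "\n".join(lines)

print(poemify("a woman is holding a cat in a kitchen near a window"))
```

With `chance` at 1.0 every eligible word forces a break; at 0.0 the caption comes out unchanged, so the randomness is what keeps each poem's shape different.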
We'll have to wait for it to develop. Does anyone else have a question while we wait? I can read it afterwards.

"How big is the database of images and text that you use on the Raspberry Pi?"

Well, the original database of images was 300,000 or so, but for the model that it produces, I think the checkpoint file is about 180 megabytes or something like that, which fits quite comfortably in RAM. It takes about a minute to load off the SD card, though.

Okay, we want to hear this. "Ode to the two glasses of wine. An older fellow is wearing green and white, looks like a tie. A man wearing a tie and a hat looks at their teeth. For the evening, an older man wearing a yellow shirt taking two glasses of wine. He holding a glass. Hair."

Any other questions? Well then, I think on that lovely note, let's have another round of applause for our speaker.