
GPT-2 Poetry Test #5 : Screencast 2019-05-08 12:46:45


GPT-2 345M model retrained on a custom poetry corpus for ReadingRites (2019). More info: http://glia.ca/rerites

---

Retrained the GPT-2 345M model on 26.1 MB of contemporary poetry. Fifth test of GPT-2 poetry.

I didn't watch as much this time. Was busy. Would return to peer at the screen now and then as one does at a house plant, scrutinizing it for growth, examining its current outbursts, the stray leaf, a new bud. On some returns nothing seems to have changed... the incoherent exuberance of the machine shows through: its modes of using a recurrent refrain or a word fixation to offer the semblance of story eventually grow transparent, a child's ploy. Yet, speculatively, I imagine how the large, full, unreleased OpenAI model, if finetuned on an amalgam of contemporary and Gutenberg corpora, might yield some extremely intriguing simulacrums, proximal to and even exceeding human expertise in short verse sprints.

Text here: https://docs.google.com/document/d/1t-YThmixCI0J3dBlCFq_MRYzlk2qS3tc-ePDq1a_MLY/edit?usp=sharing

---

GPT-2 is an attention-based transformer model created by OpenAI: https://openai.com/blog/better-language-models/

It can be retrained (finetuned) using code from https://github.com/nshepperd/gpt-2 (a sketch of that workflow appears below).

I learned of this from Gwern: https://www.gwern.net/GPT-2

---

The binnacle glows with earth--dark coral rubble; it bears its marks
And marks wolves from miles to far. A shaggy beach discards
The dead tide's tide here; at their feet rats devour
The stars that never outlived the Gods.

---

File: Screencast 2019-05-08 12:46:45_GPT-2_test4
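
For anyone wanting to reproduce the retraining step, here is a minimal sketch of the fine-tuning workflow with the nshepperd fork linked above. The corpus filename, run name, and the absence of tuned hyperparameters are illustrative assumptions, not the settings used for this test; the repository's README has the authoritative flags.

    # Fetch the code and the released 345M weights.
    git clone https://github.com/nshepperd/gpt-2
    cd gpt-2
    pip install -r requirements.txt
    python download_model.py 345M

    # Encode the plain-text poetry corpus into the .npz format train.py expects.
    # "poetry-corpus.txt" is a stand-in name for your own corpus file.
    python encode.py poetry-corpus.txt poetry-corpus.npz

    # Fine-tune the 345M model on the encoded corpus.
    PYTHONPATH=src python train.py --dataset poetry-corpus.npz --model_name 345M --run_name poetry-345M

During training the script periodically prints the loss and writes sample generations, which is presumably the kind of output the screencast captures scrolling by.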
