Generating Inform 7 with recurrent neural networks

You folks might enjoy my latest blog post: lea.zone/blog/inform-7-plus-torch-rnn/

Very interesting!

That is at once weird and surprisingly compelling.

My favorite bit is probably `The description of the player is "It is stone."` Kinda sounds like a neat mini-prompt: make a game where that line makes sense…

I’d like to see a jam or a comp based around this; each entrant gets a chunk of generated code and is tasked to make a small game that’s at least loosely based on it.

I love this.

I think my favorite thing about it is the inevitable appearance of an NPC named Clark, since at some point “Clark” became my default name for NPCs in Recipe Book examples and test suite code.

Oh, ha, I’d never noticed! I do indeed see 101 instances of “Clark” in the combined corpus. (Including, cleverly, Clarks Kent and Gable as well as William Clark.)

Yeah, it became a running joke between me and Graham. The biggest/most annoying test in the test suite is the one that tests a lot of permutations of NPC commands (asking Clark to take an object that is a backdrop, asking Clark to drive a vehicle that you are or are not riding in, …), so Graham built up a certain amount of resentment around this character.

(Don’t judge. When you’re writing several hundred examples and test cases you have to learn to make your own fun.)

:smiley: :smiley: generated innuendo :slight_smile:

Hey, depending on how accurately you model the human body, we’re all fluid containers in a certain sense…

This is wonderful nonsense.

I ran the Standard Rules through a Markov chain once. It was mildly amusing. One of the more memorable outputs was "The plural of woman is usually not pushable between rooms.", which seems like a good rule to live by.
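For anyone who wants to try this at home, a word-level Markov chain is only a few lines of Python. This is a minimal sketch, not the script I actually used; the tiny inline corpus just stands in for the Standard Rules text:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word prefix to the list of words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Start from a random prefix and walk the chain, up to `length` words."""
    rng = random.Random(seed)
    prefix = rng.choice(list(chain))
    out = list(prefix)
    while len(out) < length:
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:
            break  # dead end: this prefix only ever appeared at the end
        out.append(rng.choice(followers))
    return " ".join(out)

# Stand-in corpus; in practice you'd feed in the whole Standard Rules file.
corpus = ("A woman is usually not pushable between rooms. "
          "A door is usually not pushable between rooms. "
          "The plural of woman is women.")
chain = build_chain(corpus)
print(generate(chain, length=12, seed=1))
```

Because shared prefixes like "is usually" appear after both "woman" and "door", the walk can splice the sentences together, which is exactly where lines like the one above come from.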