Glk & Terps, open dialog, mostly Inform 7

I tried to give this posting a deliberately broad subject. I kind of wish we had a board on this forum for “Technical proposals, thoughts, ideas” regarding interpreters / extensions / Glk - and not just mainline story authoring.

Anyway, the first topic I’d like to toss on the table is Glk timers, and the second is hinted keyboard input. As I understand it, Glk has one and only one timer. So if an Inform 7 story wanted to have a “bomb that goes off in 7 minutes” and also do some fancy real-time character input, those two features would compete for the same timer?

Jon Ingold made an interesting real-time typing extension for Inform 7 back in 2011. You can see it in action in the demo he made, “The Wilds Of Orkney.gblorb”, download here: threeedgedsword.wordpress.com/2 … interface/

I’m trying to get this working reliably via RemGlk —> Git on Android. It strikes me that this kind of idea is positive: it keeps the keyboard command-line tradition alive while also offering a friendly way for an author to hint valid words. But technically the timer design is kind of a nightmare - it polls every 1ms as he coded it. I can crash Gargoyle with it pretty easily.

My experience in trying to build a Glk implementation is that I really wish timers were more explicit about their intention. Are they for scheduled events (“bomb in 5 minutes”), real-time gaming (Tetris, aka Freefall), or something like this - keystroke follow-up? I tried “The Wilds Of Orkney.gblorb” in my code with the timer entirely disabled, and it kind of works. What the code seems to want is to know when input has paused for more than, say, 20ms - some kind of “let me know when the keyboard settles down” event. It gets this by polling constantly.
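The “keyboard settles down” idea above doesn’t actually need a 1ms poll. A minimal sketch, assuming a coarse timer armed via glk_request_timer_events() (say 50ms) and character events via glk_request_char_event(): the interpreter keeps a timestamp of the last keystroke and, on each evtype_Timer event, checks whether the pause has exceeded a threshold. The Glk calls themselves are left as comments; only the bookkeeping is shown, and the names here (settle_state etc.) are invented for illustration.

```c
#include <stdint.h>

typedef struct {
    uint32_t last_key_ms;   /* timestamp of the most recent keystroke */
    uint32_t settle_ms;     /* how long a pause counts as "settled"   */
    int      pending;       /* 1 if keys arrived since the last settle */
} settle_state;

void settle_init(settle_state *s, uint32_t settle_ms) {
    s->last_key_ms = 0;
    s->settle_ms = settle_ms;
    s->pending = 0;
}

/* Call on each evtype_CharInput event. */
void settle_on_key(settle_state *s, uint32_t now_ms) {
    s->last_key_ms = now_ms;
    s->pending = 1;
}

/* Call on each evtype_Timer event; returns 1 exactly once per pause,
   i.e. the "keyboard has settled" notification the extension wants. */
int settle_poll(settle_state *s, uint32_t now_ms) {
    if (s->pending && now_ms - s->last_key_ms >= s->settle_ms) {
        s->pending = 0;
        return 1;
    }
    return 0;
}
```

With a 50ms timer tick the notification arrives at most one tick late, which is far gentler on the display library than a 1ms poll.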

Input has become a very different thing than it was in the days of Zork. There’s speech input and output (Incant for Android was built around this - YouTube here: forums.androidcentral.com/attach … -46-55.jpg), space-only keystrokes to continue, etc. All of that has evolved on mobile to accommodate guided input on small screens. Google just shipped a major new keyboard release in the past 10 days: Gboard, for both Apple and Android.

Anyway, I want to better understand timers in Glk and the conflicts between those two purposes. Beyond that, there is perhaps an idea here for the ‘smart input line’ that mobile puts a lot of weight on. Jon’s work seems to have a lot of very interesting ideas about how flexible the Inform engine can be regarding hinted input. I’ve mentioned his 2007 work with multiple windows and hyperlinks: https://intfiction.org/t/help-5-window-interface-of-dead-cities-by-jon-ingold-6m62/10906/1 - and keeping these techniques alive with Inform 7 6M62 source code.

This board is sufficient at the current traffic level. (And if the traffic level were higher, I wouldn’t be more involved in the discussion than I am now.) (You can assume that I am already putting as much time into IF tool support as I can schedule.)

The answer is, it’s not specific to any of those. It’s a feature that I put into the spec because I thought it would be useful. I kept the spec minimal (one timer) because I wanted to keep the display-library requirements as simple as possible. For the timer, it seemed that anything with multiple timers could be scheduled by game logic.
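The “scheduled by game logic” point can be made concrete. A sketch, with hypothetical names (gtimer, gtimer_next_interval - none of this is in the Glk spec): the game keeps its own list of deadlines and always arms glk_request_timer_events() with the interval to the nearest one, re-arming after each evtype_Timer event.

```c
#include <stdint.h>

typedef struct {
    uint32_t deadline_ms; /* absolute time at which this timer fires */
    int      active;
} gtimer;

/* Return the interval (ms) to pass to glk_request_timer_events():
   the time until the earliest active deadline, or 0 meaning "no
   timer needed". Already-expired timers count as due in 1 ms. */
uint32_t gtimer_next_interval(const gtimer *t, int n, uint32_t now_ms) {
    uint32_t best = 0;
    int found = 0;
    for (int i = 0; i < n; i++) {
        if (!t[i].active)
            continue;
        uint32_t wait = (t[i].deadline_ms <= now_ms)
                        ? 1 : t[i].deadline_ms - now_ms;
        if (!found || wait < best) {
            best = wait;
            found = 1;
        }
    }
    return found ? best : 0;
}
```

The “bomb in 7 minutes” and a typing follow-up then stop competing: each is just an entry in the game’s own table, and the single Glk timer serves both.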

The real-life result is that if you’re setting a one-millisecond timer, you’re not going to get good results. Okay. Don’t do that. If you find you need to do that in order to solve a problem, the problem is not really getting solved.

(Gargoyle shouldn’t be crashing, of course. That might be a threading issue.)

Features like autocomplete are therefore better handled at the interpreter level, rather than the game level. This has always been the intention; that’s why IosGlk can rely on the iOS native on-screen keyboard without game changes. You are correct that the game has no way to hint an autocomplete vocabulary, or numeric/password/etc input. Those are possible extensions but I don’t see them happening soon.

Soon? Your message starts out saying (paraphrasing) “I personally have limited time for IF ideas” and then ends with “won’t happen soon”. The issue is that it already happened! Jon Ingold demonstrated and prototyped these features in Inform 7’s natural language - a dedicated window of hyperlinks (Dead Cities, 2007), hinted input (“The Wilds Of Orkney.gblorb”, mentioned in this posting) - way back in 2007 and 2011. The open source code was there, with working examples. What these were really showing was ideas for changes to Glk to understand Interactive Fiction, but implemented using the only means possible at the time: keyboard polling with the single timer, and creating a Glk window of hints (story-specific vocabulary) to replace and/or supplement the readline() prompt at the bottom of the window in both cases.

Favoring Inform 7 for a moment: the vocabulary of words has always been somewhat predictable. “go west”, “smell flower”. If localization is brought into the picture, a numerically indexed hyperlink system would be ideal, because translators could cross-reference the numbers to translations (even to fit screen sizes, which Android has fully supported for a decade now). But there is an entrenched purist attitude favoring the full-size keyboard and keystrokes - and Glk is right in the center of that, extremely focused on keystrokes and single characters rather than the meaningful words that are the bread and butter of novels and fiction! Why not the same kind of approach for hinted input? The Chinese or Hindi phrase for “go west” could be indexed to a local operating system keyboard and translated via the established means of the operating system. The same goes for menus (choice fiction): why not define a Glk idea of a menu - an indexed list of quotes, with a numeric code for which one was picked - instead of the keystroke-driven menus I’ve seen in hundreds of Inform 7 games? A modern operating system has to be aware of localization all the way out to the extreme edge: the marketing material (screenshots) all has to be in the language of the end user, and it extends all the way out to the search engines at Amazon / iTunes / Google Play / Ubuntu Software that bring the user into the app.

Jon Ingold was showing all this off, along with the fiction-writer author syntax. What was missing was a Glk API for making it something that any Interactive Fiction system that comes along could abstract (not just Inform 7)! Instead, I see thousands of apps being developed and used without Glk, because it still thinks in terms of lines and keystrokes - and that’s not how novels and stories are built. I’m not the one who made keyboards disappear for many youth in the transition to interactive swipe touch screens! That happened around all of us. I’m only observing what’s going on, and how the most popular open platform for computing, Android mobile, has seen failure after failure for years at being implemented! The same goes for speech recognition: if you know the vocabulary and can hint to the operating system which words you are listening for, the results dramatically improve. All this would have been there if Glk had exposed an API for hinting words / vocabulary, as Jon was showing from 2007 to 2011 - and stories compiled 10 years ago would have had the immediate benefit of having the dictionary handy before it even gets into the interpreter engine! And hard-core traditional players could tell their app to discard the hints that Glk provides, to make things more difficult.

I feel like there is a very heavy attitude of “not invented here” on the technical side of the community (both Glk and the app developers who hold the signing keys for publishing updates). Open APIs for hinted keyboards - which exist on the most popular computing devices in history - are still being questioned instead of understood for just how tied they are to the concepts of prose-language Interactive Fiction novels and stories! It’s a dream scenario on mobile to know the vocabulary of the words as the user types them! Huge labor has been spent on this in computer science over the past decade! And here you have a natural-language parser concept that’s been around since the mid-1970s, built on a specific limited vocabulary and even a rather predictable grammar ordering.

Compass directions in games were practically invented by Interactive Fiction. “Go west”, “Go east”, “Go southwest” appear in thousands of stories over half a century! Yet Glk doesn’t seem to think an API needs to exist to numerically index these phrases for a 12-year-old child on an iPad in Tokyo who might want to swipe or tilt in compass directions! Why can’t Glk have an API to send a list of valid exits for a room, so that a UI can color which directions are valid and invalid? Inform 7 knows these concepts at the authoring level, but Glk strips them down to meaningless ASCII encodings that have to have regexes applied on the other end! Glk seems like an API that doesn’t understand fiction and room movements at all! What about an API to send the inventory to a client to present its own way? Inventory is a pretty standard Interactive Fiction concept across multiple interpreters (and one Jon Ingold took time to share code for in 2007). The code for selecting inventory could be similar to an indexed navigation menu, and localized (on a small device it could be “swipe screens to view inventory”; on a larger device, a window onscreen, or even a second display monitor). Integer indexes and string lists are sent via Glk, and the app sends back into Glk which ones were interacted with. Inform 7 extensions could prototype these Glk API calls as they are worked out. Even CheapGlk could do an inventory window by just putting it into the single-window TextBuffer stream when the player enters a special command.
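The exit-hinting part of this could be extremely small. A sketch, with every name invented for illustration (there is no such call in Glk today): the game sends a bitmask of valid compass exits - imagine a hypothetical glk_window_set_exits(win, mask) - and the UI layer colors or enables directions by testing bits.

```c
#include <string.h>

/* One bit per classic compass direction (hypothetical encoding). */
enum {
    exit_North = 1 << 0, exit_South = 1 << 1,
    exit_East  = 1 << 2, exit_West  = 1 << 3,
    exit_NE    = 1 << 4, exit_NW    = 1 << 5,
    exit_SE    = 1 << 6, exit_SW    = 1 << 7,
    exit_Up    = 1 << 8, exit_Down  = 1 << 9
};

/* Map an English direction word to its bit, or 0 if unknown.
   A localized UI would map its own words to the same bits. */
unsigned exit_bit(const char *word) {
    static const struct { const char *w; unsigned b; } tab[] = {
        {"north", exit_North}, {"south", exit_South},
        {"east",  exit_East},  {"west",  exit_West},
        {"northeast", exit_NE}, {"northwest", exit_NW},
        {"southeast", exit_SE}, {"southwest", exit_SW},
        {"up", exit_Up}, {"down", exit_Down},
    };
    for (unsigned i = 0; i < sizeof tab / sizeof tab[0]; i++)
        if (strcmp(word, tab[i].w) == 0)
            return tab[i].b;
    return 0;
}

/* 1 if the hinted mask says this direction is a valid exit. */
int exit_valid(unsigned mask, const char *word) {
    unsigned b = exit_bit(word);
    return b != 0 && (mask & b) != 0;
}
```

Because the wire format is just an integer, the same hint works for a swipe UI, a compass overlay, or a speech grammar, without the client ever applying regexes to window text.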

Jon Ingold was doing amazing work back in 2007 and 2011, showing off the need to standardize these concepts in Glk. He was putting hints right next to the readline() prompt - anticipatory, visionary work on what was coming down the pipe. But it’s pretty frustrating that this hasn’t been seen as a general-purpose, multiple-interpreter Glk need, instead of just a need to fix now-broken Inform 7 extensions. And what I’ve seen is mobile apps being built that entirely avoid the established interpreters and Glk, with its heavy focus on readline() and getchar() conventions. What an Interactive Fiction input API should have at its heart is a readHumanWord() type concept (a callback when space or backspace is pressed) that authors can set as directives in their authoring - a novice player could be shown hints for the noun structure coming next, and a limited device (such as speech or a smartwatch) could drop the presentation but still use those hints internally. Of course, legacy char and readline() can remain in the API, never to be deprecated.

Someone went so far as to make an entire app, and open source it, 2.5 years ago, that highlights this problem - both speech output and speech recognition. That person brought it to this community, and it was promptly ignored for what it is: a request that Glk have word hinting so that the speech recognition vocabulary could be tied into the operating system. How does “press space to continue” work in Glk with a speech interface? youtube.com/watch?v=D6i7c7j … e=youtu.be – “Press space to continue” may seem natural to someone who has carried it all the way from the interpreter, in only one human language, through decades of C code, but it doesn’t make any kind of sense in a speech interface - and the world of prose-language fiction could have a standard API for such interaction. Both “press space to continue” and menus are the realm of Glk, and localization presented these issues long before speech recognition did. With an indexed API call to present “press space to continue”, you could translate it to Korean at the other end of the API and fit the needs of the player’s human environment. That’s not just a shuffle of printf() and readline(). And with the concept in Glk, new interpreters could be built that never use the legacy concepts of char and line at all.

Just this past year, university students in Germany were instructed by their professor to demonstrate an augmented-readline concept - they added the obvious words to an Android app (really a readHumanWord() idea, like Jon demonstrated with timers). Something that really could have been in Glk shortly after Jon published his several blog posts on the topic back in 2011. ref: IF Archive Hugo Downloads

Right now I would have to dig deep into the Glulx interpreter and teach it to build me a list of words out of the compiled Inform 7 story, or build a static tool to extract the words - because the Glk API has ignored Jon Ingold’s work, which sits at the very heart of the natural-human-language concerns of Interactive Fiction. If, instead, the API treated menus as indexed lists, and hinted the words and verbs and nouns specific to the story, it would be serving the player instead of C 32-bit unsigned byte structures.

The message I have gotten around here is that there is always, and I mean always, a chicken-and-egg issue of “no interpreter (app) supports it” regarding the critical bridge that is Glk. But Jon Ingold was implementing these Glk ideas in the two examples I gave of hinted-input replacements for readline(). What I see is that Glk ignored this and didn’t recognize that natural human languages are at the very heart of Interactive Fiction, not keystrokes! Keystrokes were just the primitive tools of the 1978 VT100 terminal. All along, the fiction itself was built around a dictionary! Why should Glk throw up its hands and say the human word “Xyzzy” is Greek to me, just chars? It could be identified as South Korean, Japanese, or Greek before it even gets past Glk and into the interpreter!

I’m very frustrated by how little understanding Glk has of human languages, and how little concern this seems to be to anyone. A menu is what you see at a restaurant; you use it to pick what you want to order - Chinese restaurants in Europe or the Americas often even let you order by numeric index, to cross the hybrid language and culinary barrier! Why can’t Glk see this common Interactive Fiction problem? I have poured my shitty human language onto this page to express that, and I’m pretty sure it will mostly serve to irritate and offend, and not be heard! Just as Jon Ingold’s far better human expression wasn’t heard as a Glk API request specific to the unique realm of Interactive Fiction - which is clearly what it seems to be to me, in his own way - and Glk is right there ignoring what he did in 2007 and 2011… and I’m being told today that it’s a radical idea, when I see hinted keyboards in use on the public bus every day, by people who have no idea just how much Inform 7 is built on the concept of human languages and their verb-noun structures!

P.S. I started out saying that I wish there was a forum here for more open dialog and organization around technical ideas, not just tools - because it strikes me that people like Jon Ingold knew to stay away and stick to blogs (which mostly got ignored here), because things seem very “not invented here”, with a kind of inbred attitude toward new ideas. The sore thumb of Android and Glulx, and the ho-hum ignoring of Incant’s speech code on GitHub, collecting dust now for 2.5 years, is I think similarly an example of “staying away from complex conversation”, because people here are concerned about the politics of who is in power (including social power and beyond, as in Michel Foucault’s ideas) instead of what actually fits the common needs of Interactive Fiction input and output - for which the topic of “word hinting” for speech and keyboard via a standard Glk API seems blindingly obvious! DavidC wanted the “name of room” exposed in his API and has code to show for it. This seems a lot like Jon Ingold’s experience here with the local ‘regimes of truth’ about what Glk is. The idea of knowing the room name as an object fits well with an API that knows something about novels and fiction prose! Glk doesn’t seem to care at all about names of rooms as human objects; instead it has a generic idea of a “status window with grids of characters”. DavidC is one of the few I see staying here with progressive ideas on input/output that fit natural language - but I also note he has abandoned Glk entirely, with its keystroke + readline focus. I personally would have given up a month or two ago if RemGlk hadn’t given me an “atomic transaction / frame update” input/output concept beyond the primitive character concept of Glk. Compared to DavidC, Jon was more directly addressing, in 2007 and 2011, that Glk needed to move forward - and he produced open source code while explaining the benefits to a fiction storyteller.

To add something to Glk, aren’t we really just talking about adding some function definitions to glk.h? Why is there so much resistance to Jon’s excellent examples of smart input in Dead Cities (2007) and hinted input in The Wilds Of Orkney.gblorb (2011)? These are exactly what has become ultra-mainstream on Android in apps of all kinds (SMS, email, etc). What function names and related constant values would need to be defined to do this in a universal way? The problem isn’t inherently difficult to understand; it’s pretty obvious, if you work with a modern mobile-phone “smart keyboard”, what Jon was showing: a “smart client” input where you can see your choices before the Glk callback into the interpreter. Numerically indexed menu-selection systems are also a well-understood, standard interface idea on practically all operating systems. Here are 300 items and 300 integer indexes - it’s up to the app to figure out how to page/scroll/pick them before calling back to the API. Even CheapGlk ‘as the app’ could devise a menu system to pick from a list of 300 items and call back into Glk with the index. This is the kind of thing BBS systems in the 1980s did regularly in their API definitions - and that isn’t just Android; it’s the kind of small-scale community I would associate this one with.

Adding it to Glk means defining a function, its parameters, and its datatypes: an array of strings and an array of glui32 indexes. We could also define a readHumanWord() function in glk.h - and that’s how you add to the API. It’s not going to break any old code, because these functions are not called by anything currently.
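As a sketch of the shape such an addition might take: the declaration below is hypothetical (no such function exists in glk.h today), and under it is the kind of plain-text fallback a CheapGlk-style library could use - print the choices numbered, read a line, and turn the typed reply into a zero-based index. Only the parsing helper is implemented here.

```c
#include <stdlib.h>

/* Hypothetical addition, for illustration only:
   present `count` choices and return the picked index,
   or (glui32)-1 on cancel.

   glui32 glk_select_menu(char **choices, glui32 count);
*/

/* Fallback parser for a text-only library: turn the player's typed
   reply ("3") into a zero-based index, or -1 if it is not a valid
   selection for a menu of `count` items (numbered 1..count). */
int menu_parse_choice(const char *line, int count) {
    char *end;
    long n = strtol(line, &end, 10);
    if (end == line || *end != '\0')
        return -1;               /* not a plain number */
    if (n < 1 || n > count)
        return -1;               /* out of range */
    return (int)(n - 1);
}
```

A graphical library would implement the same entry point with native widgets; the story code would never know the difference, which is the whole point of putting it in the API.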

Then the standard function names are published and out there. I can develop my interpreter application to use those APIs, and future people who come along with the same need can call them. I’m not asking you to change Gargoyle or to change the published Apple Store iPhone app - I’m asking that we specify an API as a published paper “standard”, so that it’s there for future use. I am not asking for a new version of Inform 7 to be implemented this week. I’m asking that the function names, and the arguments to said functions, be agreed on and published! And, in this situation, I am literally pointing to actual working code from 2007 and 2011 that demonstrates the features. The prototypes, Inform 7 extensions, and draft discussion have been out there for a decade.

Glk already has a whole bunch of compile conditions, such as “Graphics” and “Unicode”, so a new category like “HintedInput” could define these. But really, the Glk version number alone distinguishes what’s been implemented easily enough. Graphics and sound are nice, but here we are talking about the core of Interactive Fiction: words, human language. Menus, word hints, “press to continue” prompts, hyperlink phrases (which an augmented readline with graphics icons could replace).

All of this would be very easy to implement using the proposed CSS/JS additions to Glk - that’s the great benefit of an extensible system. To implement it directly in Glk would be much more work: Glk doesn’t currently have any way of passing whole complex arrays out, and passing out an array of NULL-terminated C strings might not be the best way to do it. At the very least we’re looking at two functions, to cover Unicode. And the interaction between the word index and the pending line input would have to be defined - passing back an index seems at odds with completing a word in the pending input.
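One way to reconcile the index-versus-pending-input tension (a sketch, not a spec): rather than handing an index back to the game, the library could complete the word directly in the pending line-input buffer - the same buffer the game passed to glk_request_line_event(). The helper below, with an invented name, replaces the trailing partial word in that buffer with the chosen hint; a Unicode twin operating on glui32 arrays would be the second of the two functions mentioned above.

```c
#include <string.h>

/* Replace the trailing partial word in a pending line-input buffer
   with the chosen hint word. Returns the new length, or -1 if the
   completion would overflow the buffer the game provided. */
int complete_pending_word(char *buf, int len, int maxlen,
                          const char *hint) {
    int start = len;
    while (start > 0 && buf[start - 1] != ' ')
        start--;                       /* find start of partial word */
    int hlen = (int)strlen(hint);
    if (start + hlen > maxlen)
        return -1;                     /* would not fit */
    memcpy(buf + start, hint, hlen);
    return start + hlen;
}
```

With this shape, the game never sees an index at all: it simply receives the finished line through the normal evtype_LineInput path, so no interaction rules between the two mechanisms need defining.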

An alternative would be a Blorb chunk with a list of words to be added to the interpreter’s dictionary.
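Since Blorb is an IFF container, that chunk idea is mechanically simple: four-byte type, big-endian 32-bit length, payload, plus a pad byte after an odd-length body. A sketch - the 'Wrds' chunk type and the NUL-separated word payload below are invented for illustration and are not part of the Blorb spec.

```c
#include <stdint.h>
#include <string.h>

/* Write one IFF chunk (header + payload) into `out`, which must be
   large enough. Returns the total bytes written, including the pad
   byte IFF requires after an odd-length payload. */
int iff_write_chunk(unsigned char *out, const char type[4],
                    const unsigned char *payload, uint32_t len) {
    memcpy(out, type, 4);
    out[4] = (unsigned char)(len >> 24);   /* big-endian length */
    out[5] = (unsigned char)(len >> 16);
    out[6] = (unsigned char)(len >> 8);
    out[7] = (unsigned char)(len);
    memcpy(out + 8, payload, len);
    int total = 8 + (int)len;
    if (len & 1)
        out[total++] = 0;                  /* pad to even length */
    return total;
}
```

An interpreter that didn’t know the chunk type would skip it automatically, which is how IFF keeps such extensions backward compatible.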

Multiple timers would make sense and I’ve proposed it before. I don’t think it would take too much to implement, but I would need to hear from more interpreter authors first.

But perhaps the best thing to do would be to ask Zarf for a block of Glk codes, and then you can spec and implement a proof of concept for your ideas. It will be much easier to get the other interpreter authors on board with a working example.

Yikes. Taking a group of highly intelligent people from around the world with the same hobby and asking them to work toward a coherent vision is probably not going to happen on any forum. Wanting such a thing is understandable. Expecting it, or being confused by its absence, is probably asking way too much. The IF community doesn’t work in any coherent manner. That there’s some core development on tools is more passionate coincidence than visionary, directed behavior.

That said, the gravity of most IF tool development is intended to maintain a backwards compatibility that leaves out a great deal of contemporary architectural concepts. It’s not the intent of anyone to ignore current trends. It’s more that making incremental steps that authors understand is just how things go around here.

My work on fyrevm-web is strictly my vision. Sure, I’d love help and I’ve asked plenty of times…but again the gravity of Inform 7 development is connected to the existing legacy tool set, including Glk. I’m not going to cry about it. I’m just going to build it, make games with it, and if no one uses it except me, that’s okay.

So if you have your own opinions and visions, it’s probably healthiest to focus on it as the sole worker-bee. If someone offers their help, great. I wouldn’t expect it. If you need questions answered, I’ve found everyone to be fairly good about that when they have time. (I catch zarf on ifMUD or ask questions here).

And oh by the way…I’ve paid money to contractors, probably over $4000, to get fyrevm-web to where it is, plus hundreds of hours of my own time. Most people just do the work in their spare time.

Yes, clearly I’m frustrated. Four different Glulx apps, each with its own variation of Glk, have been made for Android and open sourced. I find little evidence of serious conversation about this activity, or about the worldwide shift of computing toward mobile. And stating the truth - that Glk is at the center of making an app and has largely ignored mobile - gets immediate push-back. Jon Ingold’s “The Wilds Of Orkney.gblorb” is exactly what is mainstream in mobile apps; people do this with SMS, chat, email, etc every day. “So if you have your own opinions and visions, it’s probably healthiest to focus on it as the sole worker-bee” - that’s what the four apps on GitHub that run Glulx did, and one by one they got ignored. One might say that waiting for the occasional passing comet or solar eclipse is part of the issue. The issues seem to involve more than one developer and one app. The apps have been there; the issues are self-evident if you try them, just as Jon’s “The Wilds Of Orkney.gblorb” from 2011 has been there - any daily mobile user would immediately recognize the user-interface ideas in play. I’m frustrated because now I’m trying to finish off the 5th app, as the solar eclipse didn’t align with the already-existing app (Son of Hunky Punk) finishing off its Glulx and Glk in the past 3 months. Maybe the issues go beyond making code as a lone wolf and tossing it up on GitHub. I will run out of time soon enough and go away - just like the previous four independent apps.

The only thing I would have an opinion on is that I think the order of interest is web > cross platform desktop > iOS > Android, so you may be attacking the bottom. It may be that web based IF that can be turned into an Android app, similar to Lectrote, is the way to go.

I’m sorry for frustration getting the better of me. As I said at the top, and in the P.S. of my second post, I feel like a dedicated forum for Glk, the interpreters that use it, and related technical issues is really what’s needed here. I could start a blog, but blogs don’t have equality of who posts, who can come along and start a new topic, etc. Gargoyle is the example of a multi-headed interpreter app built on Glk - and, to a lesser degree, Son of Hunky Punk - but I got caught up in seeing how the different input apps that Gargoyle doesn’t run - CYOA, as it’s called around here? - are most often being done outside Glk in the mobile world.

What I started to see was a vision of a Glk that treats basic CYOA elements as API UI concepts, so that any new interpreter can come along and access them - starting with the most basic type of API call, such as [press any key to continue] (perhaps with a timeout option). Instead of just a Glk that knows how to present characters to a window, it would know basic concepts: menus, continue prompts, hinted input, words instead of lines and chars, etc.

But I got caught up in that idea, and caught up in the frustration of four different partial Glks on Android, all incomplete or buggy - and here I am coding a 5th one! I’m sorry about getting so emotional about seeing these apps (and really the fiction works they support) and their various ideas of Glk go ignored at times. Jon Ingold’s blogs and extensions (Interactive Parsing.i7x, version 4) articulated things that seemed strongly intuitive to me as Glk concepts, not Inform or interpreter needs. I spilled it all out in a rushed, frustrated way because I got caught up in very difficult code (1ms timers that bomb Gargoyle), viewed these as long-standing prototypes that have been ignored, and didn’t take the time to organize, propose, and discuss a “smarter Glk” as a supplement to, or next generation of, the “primitive Glk” we know today. I jumped the gun with half-baked ideas, motivated by frustration. Sorry.

Let me get back to finishing off this new app based on Glk 0.7.4 as we know it today, and excuse my overwhelmed expressions of personal frustration and wishful thinking. I was out of line, and I don’t want to sound ungrateful beyond a momentary lapse of reason.