AI and Liquidware

With the advent of LLMs, we now have tools capable of generating tools on the fly—like the demo from Anthropic. This isn’t just an impressive technical feat; it represents a potential paradigm shift in information technology.

Until now, the digital world has revolved around two pillars: hardware and software. The introduction of software transformed hardware by enabling reprogrammability, unlocking new possibilities for solving diverse problems on a single machine and spawning innovative business models. This shift fundamentally changed how humans interact with machines: new ideas could be prototyped, novel concepts implemented, and the only real constraint was the power of the hardware itself.

Yet, there was another critical limitation: the learning curve. Mastering a specialized software tool often requires days or weeks of training—understanding its logic, memorizing menus, and internalizing hotkeys. Proficiency accelerates workflow, but only after significant investment. In this sense, software mirrors hardware: with practice, the tool becomes an extension of the user’s body. Neuroscience confirms this: skilled users operate tools intuitively, achieving remarkable efficiency. I’ve personally configured hardware control surfaces for colorists, allowing them to leverage muscle memory and bypass cumbersome interfaces entirely.

Autodesk Lustre Control Surface refurbished by the amazing Mazze @Angry Face

With AI-generated interfaces—as in this proof of concept from Marmelab—we enter the era of what I call “liquidware”: tools that require neither expertise nor training, only the ability to articulate your needs clearly. This shift is transformative, as it empowers a much broader audience to create tools for accessing and manipulating data. Previously, developing such tools required specialists, making the process complex, expensive, and time-consuming—whether building a custom spreadsheet solution or an entire information system. The barriers were high, and the inconveniences many.

Limitations exist, however:

  1. Articulating Needs: Many people struggle to express their goals explicitly, and chatbots can be challenging for them. While my experience shows that chatbots can effectively guide users toward specificity, some still find it difficult. A significant part of an engineer’s role is to “guesstimate”—to use experience to fill gaps, test options, and anticipate unspoken needs. This ability often distinguishes exceptional engineers (or engineering firms) from mediocre ones. The best engineers don’t just deliver what’s asked; they recognize when a client’s specifications are flawed and proactively address the underlying issues.
  2. Imagination and Serendipity: Requesting a single, precise tool for a specific task can limit flexibility in data manipulation. The serendipitous process of exploring tools often sparks new ideas. Traditional tools are typically designed by communities with shared needs and perspectives, embedding collective wisdom that can help solve problems in unexpected ways. While chatbots are improving at suggesting solutions, they often lack the critical mindset to challenge assumptions or propose truly innovative approaches.
  3. Muscle Memory and Physical Interaction: Achieving “muscle memory” with liquidware remains a challenge—unless it’s mapped to hardware or we embrace technologies like Neuralink. Less intrusive EEG solutions exist today, with promising results, but I believe much of our intelligence is tied to movement. Some of my best ideas, for instance, have come to me while cycling.

Your next computer interface?

Ultimately, we will likely see a decline in traditional software tools and apps in the consumer space—a world that has already moved beyond the concepts of “files” and “storage.” This shift will fundamentally transform the role of many software engineers, who will need to adopt a “data-first” mindset, rather than treating data as a byproduct of the GUI.
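As a minimal sketch of that "data-first" mindset: the record and its serialization are defined independently of any interface, so a generated (and disposable) UI can consume the schema while the data outlives the tool. The `MediaAsset` type and its fields are hypothetical, chosen purely for illustration.

```python
import json
from dataclasses import dataclass, asdict

# Data-first design: the domain object and its portable serialization
# exist independently of any GUI. A generated interface is a throwaway
# consumer of this schema; the data endures.
@dataclass
class MediaAsset:  # hypothetical domain object, for illustration only
    title: str
    duration_seconds: float
    codec: str

def to_portable(asset: MediaAsset) -> str:
    """Serialize to a GUI-agnostic format any generated tool can read."""
    return json.dumps(asdict(asset))

asset = MediaAsset(title="Interview take 3",
                   duration_seconds=124.5,
                   codec="ProRes 422")
print(to_portable(asset))
```

The point of the sketch is the separation of concerns: nothing in the data layer knows or cares which interface, human-built or LLM-generated, will render it.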

In the B2B market, there is already a growing awareness of the importance of structured data. We can expect to see more tools designed to connect data and build context, making systems more integrated and efficient. However, this requires industry-wide collaboration to standardize how data is described and structured. Semantic technologies will play a key role here, as demonstrated by initiatives in the media industry—such as the EBU’s EBUCore+ project and MovieLabs’ Ontology for Media Creation—where working groups are actively developing shared frameworks for data description.
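To make the semantic-technologies point concrete, here is a minimal sketch of what such a shared description might look like as JSON-LD. The namespace and property names below are placeholders, not actual EBUCore+ or Ontology for Media Creation terms; a real system would reference the published ontologies instead.

```python
import json

# Illustrative JSON-LD description of a media asset. The "ex:" vocabulary
# is a made-up placeholder namespace, NOT the real EBUCore+ or OMC terms;
# it only shows the shape of a linked, self-describing record.
asset_description = {
    "@context": {"ex": "https://example.org/media#"},  # hypothetical namespace
    "@id": "ex:clip-042",
    "@type": "ex:VideoClip",
    "ex:title": "Interview take 3",
    "ex:partOfProduction": {"@id": "ex:production-7"},
}

doc = json.dumps(asset_description, indent=2)
print(doc)
```

Because each term resolves to a shared vocabulary, two systems that have never seen each other's databases can still agree on what `ex:partOfProduction` means, which is exactly the kind of interoperability those working groups are after.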

Very interesting topics and changes lie ahead. I'm very much looking forward to driving through them, hoping that liquidware doesn't turn into vaporware too often!

This text was written with the assistance of Le Chat; no animals were harmed in the process. If you're interested in this discussion, you're welcome to continue it on the Splectrum Discord.

Originally published on Substack.