Building text-to-app problem solver – Moneta Studio case study

Category: Artificial Intelligence Services

Tooploox delivered the development, design, and implementation of the Moneta Studio product – a conversational app-building assistant designed to provide the user with a web app that solves their problem.

 

The system is powered by generative AI and uses an AI agent to orchestrate the process of app creation and interaction with the user.


What is generative AI (GenAI)?

Generative AI is a relatively new buzzword, yet the techniques at its core have been around for much longer. At its most basic, GenAI is about generating new data from input data, called a prompt. The models may operate in a text-to-text, text-to-image, image-to-video, or various other arrangements.

 

Generative AI has transformed business by enabling individuals to automate multiple tasks and workflows, shifting a large part of their work from crafting to overseeing the process and validating the outcomes. This is a true paradigm shift, with countless professionals gaining access to their own personal jack-of-all-trades assistants. Exploding Topics data shows that 92% of Fortune 500 companies have already adopted GenAI tools and that the generative AI market reached a value of $44.89 billion in 2024.

 

What makes this market unique is that it is powered by ideas. The products just generate data; the context, the use, and the implementation of that data are up to the users. But this is changing.

The client

Moneta Studio (or simply Moneta) is an innovative player in the market, aiming to enable users to bring their ideas to life faster and more easily.

The company is building a text-to-application platform that enables users to create web apps using prompts, in the same way Midjourney creates images or ChatGPT writes its responses. The user does not need to code, implement, compile, or design these apps. All the work is delivered using the chat interface. 

Personal apps – the paradigm shift

The company aims to bring a conversational interface to app creation. While it may be possible to prepare the code, the layout, and multiple other assets using a more common GenAI stack, Moneta aims to deliver a fully functional platform where the new app simply arrives, with no knowledge required of how to connect the building blocks. As such, the tool can be used for ideation, prototyping, testing proofs of concept, or building micro-apps for personal use.

What makes Moneta different from a typical app builder is its approach: the system works problem-first rather than solution-first. The difference is vital. A solution-first system focuses on a tool, while a problem-first system aims to solve a problem. If the user approaches Moneta with “I need a task tracking system,” they will get a solution to that underlying need rather than a pre-built product to adapt.

This is a huge change compared to the traditional approach. Instead of bending the available software to fit their needs imperfectly, one can get a super-tailored solution, delivered for the sake of solving a particular, unique problem. If the user is the only person in the world who has this problem – that’s OK. Or at least OK for an AI-powered assisting tool like Moneta; not exactly for commercial software vendors.

The challenge is so immense that one may not know where to start. But luckily – Tooploox was there to help.

The challenge

Moneta Studio is a visionary project that needs to tackle multiple risks, challenges, and obstacles in order to deliver the system as designed.

Costs (API for conversational AI)

With a chat interface powering the system, it was necessary to use a powerful Large Language Model to drive the interaction between the platform and the user. Initially, the company used the ChatGPT API, which made the solution costly overall, with every processed token paid for.

Apart from the cost of the tokens themselves, it was necessary to provide infrastructure for the apps created by users. The usage may be minute for a single user, yet at scale it adds up to numbers that require serious money and resources.

Usage and optimization

The mounting costs of both tokens and infrastructure created pressure for optimization. Much of the system’s appetite can be reduced with software tweaks alone, including replacing components and rewriting some code. This was one step toward making the platform more stable, affordable, and less prone to cost spikes.

Building and scaling the tool 

Last but not least, the tool needed to be built. It is a project incomparable to anything else – essentially a text-to-app AI tool. This comes with a multitude of challenges, from planning, through design and implementation on the go, to maintenance.

Tooploox is an AI software development company, with a deep understanding of the complexity of the process, so creating a system that delivers end-to-end app production was a challenge of a scope we understood. 

New functionalities, new features, more stable environment

Finally, the platform itself is not a project to be finished and forgotten about. The users and founders constantly request new features, for example, seamless integration with existing solutions. Thus, rebuilding the proof of concept into a full-fledged and usable product is a challenge in itself.

But nothing to be really scared of.  


The solution

As mentioned above, the scope and scale of the project were nearly intimidating in terms of complexity and sophistication. That’s why the Tooploox team decided to take on the work with a focus on particular cases and challenges.

Transfer from API to custom implementation

The Large Language Model is the heart and soul of Moneta’s offering, responsible not only for maintaining the conversation with the user but also for generating the final code. Managing it is the key element of running the system.

 

To give users more freedom and to optimize costs, the founders decided to add Anthropic’s Claude models to the tech stack. The system connects with them through openrouter.ai. Alternatively, the user may connect to OpenAI’s ChatGPT through the Microsoft Azure stack.
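To illustrate what such dual routing can look like in practice, the sketch below sends a single chat turn either to Claude via openrouter.ai or to an Azure-hosted OpenAI deployment. This is not Moneta’s actual code: the model name, the Azure resource and deployment placeholders, the API version, and the environment variable names are assumptions.

```typescript
// Minimal sketch (assumptions, not Moneta's implementation): route one chat turn
// either to Claude through openrouter.ai or to an OpenAI model hosted on Azure.
// Both endpoints accept an OpenAI-style chat-completions payload.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function chat(
  provider: "openrouter" | "azure",
  messages: ChatMessage[]
): Promise<string> {
  let url: string;
  let headers: Record<string, string>;
  let model: string | undefined;

  if (provider === "openrouter") {
    // OpenRouter: one bearer key, model chosen per request.
    url = "https://openrouter.ai/api/v1/chat/completions";
    headers = { Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}` };
    model = "anthropic/claude-3.5-sonnet"; // Claude served through OpenRouter (assumed model)
  } else {
    // Azure OpenAI: the model is fixed by the deployment named in the URL and
    // authentication uses an "api-key" header. Resource, deployment, and
    // api-version below are placeholders.
    url =
      "https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT/chat/completions?api-version=2024-02-01";
    headers = { "api-key": process.env.AZURE_OPENAI_KEY ?? "" };
  }

  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json", ...headers },
    body: JSON.stringify(model ? { model, messages } : { messages }),
  });

  const data = await response.json();
  return data.choices[0].message.content; // assistant reply text
}
```

Because both providers speak the same chat-completions format, swapping models behind a single interface becomes mostly a matter of configuration.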

LLM usage and optimization

Large Language Models are expensive to use and generate a lot of data to manage and work with. To reduce this cost, the Tooploox team implemented a tool calls interface. 


    What are tool calls?

    Tool calls are a solution provided by OpenAI to easily access external tools or data providers. Using a tool call, the developer may write a custom function that connects the OpenAI API with an external system and delivers an answer based on both. For example, it may combine weather data and an LLM to get information about the current weather in Amsterdam or Paris.

    To save on memory use, the team decided to limit the amount of stored information to the 5 most recent tool calls in history. This reduced input token spending by limiting the history sent to the LLM provider, as sketched below.
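The snippet below is a hedged sketch of both ideas: a weather-lookup function exposed to the model through OpenAI-style tool definitions, and a trimmer that keeps only the five most recent tool exchanges before each request. The tool name, its parameters, and the MAX_TOOL_CALLS constant are illustrative assumptions, not Moneta’s real values.

```typescript
// A hypothetical external data provider declared as an OpenAI-style tool;
// this array would be passed as the `tools` parameter of a chat-completions request.
const tools = [
  {
    type: "function" as const,
    function: {
      name: "get_current_weather",
      description: "Return the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];

type Msg = {
  role: "system" | "user" | "assistant" | "tool";
  content: string | null;
  tool_calls?: { id: string }[]; // present on assistant messages that request a tool
  tool_call_id?: string;         // present on tool-result messages
};

const MAX_TOOL_CALLS = 5; // assumed cap, matching the "5 most recent" described above

// Drop all but the newest MAX_TOOL_CALLS tool exchanges (the assistant message that
// requested the tool plus its matching "tool" result), shrinking the history sent
// to the LLM provider and therefore the input tokens paid for.
function trimToolHistory(history: Msg[]): Msg[] {
  const recentIds = history
    .filter((m) => m.role === "tool" && m.tool_call_id)
    .map((m) => m.tool_call_id!)
    .slice(-MAX_TOOL_CALLS); // ids of the newest tool results
  const keep = new Set(recentIds);

  return history.filter((m) => {
    if (m.role === "tool") return keep.has(m.tool_call_id!);
    if (m.role === "assistant" && m.tool_calls) {
      // keep the requesting assistant message only if one of its calls survives
      return m.tool_calls.some((c) => keep.has(c.id));
    }
    return true; // regular user/assistant/system turns stay
  });
}
```

Trimming whole exchanges (the assistant request together with its tool result) keeps the conversation valid for the provider while still capping the input tokens.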

Rebuilding the Application Architecture

The whole purpose of Moneta Studio lies in delivering a text-to-app service; backend operations and interactions with the LLM are only auxiliary activities. The Tooploox team suggested and implemented a set of technology solutions to facilitate the work of the apps generated by users.

UI Integration and prototyping

Initially the company aimed for beautiful interface generation. Yet while the UI “sells” the app, prototyping is about delivering working software. This forced a shift toward data storage and integration with external services. 

The latest versions of the platform aim to make interacting with data easier. By creating interfaces for data in an automated way, the company opened space for new groups of users with various needs. The system may, for instance, be used to deliver a quick analysis of data stored in multiple systems and various formats. The challenge is eased by cooperation with the LLM, which can, for example, match headings or merge various formats into a shared standard.
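One way such heading matching could work is sketched below: the LLM is asked to map arbitrary column headers onto a canonical schema, and rows are then renamed so that differently formatted sources can be analysed together. The canonical column list, the prompt wording, and the `complete` callback (any text-in/text-out LLM call) are assumptions for illustration.

```typescript
// Canonical schema the merged data should follow (assumed example columns).
const CANONICAL_COLUMNS = ["date", "customer", "amount", "currency"];

// Ask the LLM to map the headers of an uploaded file onto the canonical columns.
// `complete` is any function that sends a prompt to an LLM and returns its text reply.
async function mapHeaders(
  rawHeaders: string[],
  complete: (prompt: string) => Promise<string>
): Promise<Record<string, string>> {
  const prompt = [
    `Map each input column header to one of: ${CANONICAL_COLUMNS.join(", ")}.`,
    `Input headers: ${JSON.stringify(rawHeaders)}.`,
    `Answer with a JSON object {"<input header>": "<canonical column>"} and nothing else.`,
  ].join("\n");

  // The model returns e.g. {"Transaction Date": "date", "Client": "customer", ...}
  return JSON.parse(await complete(prompt));
}

// Rename the keys of each record so rows from differently formatted files
// share one standard shape and can be analysed together.
function normalizeRows(
  rows: Record<string, string>[],
  mapping: Record<string, string>
): Record<string, string>[] {
  return rows.map((row) =>
    Object.fromEntries(
      Object.entries(row).map(([key, value]) => [mapping[key] ?? key, value])
    )
  );
}
```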

The effect

Text-to-text or text-to-image models are something we’ve become used to. Creating a text-to-working-app model was a totally different story. But it worked – and the outcome is different than one would initially expect.

Moneta is not your go-to system when you have an app in mind, but rather a super-sophisticated assistant that can create an app on the go, when you need it – no matter if it is a shared to-do list, a meal counseling service, or anything else you can think of, including an assistant that searches for LEGO brick instructions. Because – why not?

Moneta Studio generates deployable mini-applications that have their own logic, user interface, and data storage systems. The apps can be developed in a “multiplayer” model, where multiple users can cooperate to develop an app together, for example, by working on different parts or functions. 

The system generates HTML, CSS, and JavaScript code. All images required for the UI or any other purpose are generated using DALL-E. To save on tokens and computing power, the generated code is stored and re-executed when necessary, instead of being generated anew each time.
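A minimal sketch of that store-and-re-execute pattern is shown below. The in-memory store and the `generateApp` callback stand in for whatever persistence layer and generation pipeline the platform actually uses; both are assumptions.

```typescript
// Generated app artifacts as described above: HTML, CSS, and JavaScript.
type GeneratedApp = { html: string; css: string; js: string };

const appStore = new Map<string, GeneratedApp>(); // stand-in for a real database

// Return a previously generated app if it exists; otherwise run the expensive
// LLM generation step once and persist the result for future executions.
async function getApp(
  appId: string,
  generateApp: () => Promise<GeneratedApp>
): Promise<GeneratedApp> {
  const cached = appStore.get(appId);
  if (cached) return cached;        // re-use: no new tokens or compute spent

  const fresh = await generateApp(); // generate only on the first request
  appStore.set(appId, fresh);
  return fresh;
}
```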

Connection and collaboration

The tool can interact with the user not only through the text interface but can also be controlled with voice commands, and it replies in a natural, conversational way. The system also shines with its set of integrations with popular systems and solutions.
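How Moneta captures voice input is not detailed here; for orientation, the sketch below shows one common way to turn speech into a text prompt in the browser, using the Web Speech API, so that the recognised utterance can be fed into the same chat flow as typed text.

```typescript
// Hedged sketch of browser-side voice capture (an assumption, not Moneta's code):
// use the Web Speech API to transcribe one utterance and hand it to the chat flow.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

function listenForCommand(onCommand: (text: string) => void): void {
  const recognition = new SpeechRecognitionImpl();
  recognition.lang = "en-US";
  recognition.interimResults = false;

  recognition.onresult = (event: any) => {
    const transcript = event.results[0][0].transcript; // recognised utterance
    onCommand(transcript); // feed it into the normal text-prompt pipeline
  };

  recognition.start();
}
```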

 

 


The future

The system is constantly evolving to provide users with more options and ways to use it. The company aims to further automate the bug-handling process and enrich the library of possible integrations.


Let our specialists solve the problems and tackle the challenges that hold you back from conquering the world.

Let’s talk