How to use AI Studio with Mealie as the LLM Recipe Parser

My family is trying Mealie.io as our recipe management system, and I found an easy way to use Nebius AI Studio with Mealie as the LLM Recipe Parser. This post documents the beginning of my journey, as this path hasn’t been documented yet.

For now, I am only using AI Studio as the recipe parser within Mealie. The image-processing capabilities aren’t working reliably yet, and I’ll update this post as I figure that part out.

What is Mealie.io?

As quick context, Mealie.io is an open-source recipe management system that you can easily self-host. The software lets you import and manage recipes for yourself and your friends and family. This is particularly interesting for our family, as we have amassed quite the collection of recipes over the decades. We’re currently using Plan To Eat to manage our recipes, but we’d much rather manage them locally than in a SaaS that subjects us to its pricing and uptime whims.

Installing the software locally was relatively easy using the official Docker images. I set our local instance up on my Synology NAS using Marius Bogdan’s guidance, which also covered installing Portainer to compose and manage the container backend. While Portainer is new to me, I love how it simplifies the composition and management process.

Why add an LLM to Mealie?

The Mealie recipe import process is fantastic and works most of the time for us, but it had trouble importing the ingredients from a couple of food blog sites that we follow. Thankfully, Mealie provides the ability to connect the backend to an OpenAI-compatible LLM.

I selected AI Studio for four reasons: (1) I use Claude, but it doesn’t support the OpenAI protocol; (2) I didn’t want to add the complexity of OpenRouter just for home recipe imports; (3) I didn’t feel like learning how to self-host an open LLM; and (4) I want a simple LLM experience – no compute management, no container management, etc.

For these reasons, Nebius AI Studio felt like a no-brainer – they provide a wide selection of inference models hosted on their own infrastructure at a much better price. For example, below are screenshots comparing a couple of Nebius models to OpenAI’s cheaper models. Even for GPT nano, Nebius is half the cost, and their models are fine-tunable. Additionally, I am interested in learning more about open LLM/inference models, and this gives me an easy onramp into that experience.

Screenshots comparing model costs at Nebius and OpenAI.

So with an LLM provider in mind, let’s connect AI Studio to Mealie as the LLM recipe parser.

Connecting AI Studio to Mealie

It’s really easy to connect Nebius AI Studio to Mealie:

  1. Select the LLM model that you want to use
  2. Retrieve your API key for AI Studio
  3. Update the Mealie configuration file
  4. Validate that it works in Mealie

Step 1: Select your LLM model

There are a lot of LLM options out there – I am starting out this week testing openai/gpt-oss-20b (for text-only parsing) and google/gemma-3-27b-it (for image processing), but I’ll be looking at a few additional models in the coming month.

For me, the choice will likely come down to model performance and how fine-tuning progresses for this use case. So far, both models are performing well without the need for fine-tuning, but I will update this as I learn more.

Step 2: Retrieve an API key from AI Studio

To get an API key that you can use in Mealie, complete the following steps (the visual aids below illustrate them):

  1. Log in to Nebius AI Studio
  2. Click the ‘Get API key’ button
  3. In the popup dialog, provide a name for the API key (e.g., ‘Mealie App’)
  4. Click ‘Create’ to create the API key
  5. Copy the new API key using the clipboard icon (and save this somewhere safe!)
  6. Click ‘Close’ to close the dialog
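Before wiring the key into Mealie, you can sanity-check it with a few lines of Python. This sketch only builds the OpenAI-style chat-completions request that any OpenAI-compatible client (including Mealie) sends to AI Studio; the helper function is mine, not part of any SDK, and the actual network call is commented out so you can opt in.

```python
import json
import urllib.request

# Hypothetical helper: builds the chat-completions request an
# OpenAI-compatible client would send. The endpoint path and the
# Bearer-token header follow the OpenAI API convention.
def build_chat_request(base_url, api_key, model, prompt):
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "https://api.studio.nebius.com/v1/",
    "API_KEY",  # replace with the key you created above
    "openai/gpt-oss-20b",
    "Say hello",
)
# Uncomment to actually call AI Studio with your real key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If the uncommented call returns a response instead of a 401, the key is good and you can move on to the Mealie configuration.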
Screenshot for Install Steps - Creating a new API key in AI Studio
Screenshot for Install Steps - Prompt to save the API key

Step 3: Update the Mealie configuration file

Following the instructions for Mealie’s OpenAI integration, I used the following steps to update Mealie running on my Synology NAS:

  1. Open up your Portainer instance and connect to your local environment
  2. Select Stacks from the left navigation pane
  3. Select the stack you want to update – for example, ‘Mealie’
  4. Within the Stack details page, click Editor to edit the stack’s Docker Compose file
  5. Add the OpenAI settings to the end of the environment section of the Compose file
  6. Click the Update the stack button to apply the changes
  7. When prompted to confirm, click the Update button and Portainer will update Mealie’s stack

There are four environment configuration lines that you’ll need for AI Studio (replacing API_KEY with your real API key from Step 2) to do recipe parsing using the LLM:

OPENAI_BASE_URL: https://api.studio.nebius.com/v1/
OPENAI_API_KEY: API_KEY
OPENAI_MODEL: openai/gpt-oss-20b
OPENAI_ENABLE_IMAGE_SERVICES: false
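
For context, here is roughly where those lines land in the Compose file. The service name, image tag, and existing settings are placeholders for whatever your stack already uses – only the four OPENAI_* lines are new:

```yaml
services:
  mealie:
    image: ghcr.io/mealie-recipes/mealie:latest  # your image/tag may differ
    environment:
      # ...your existing Mealie settings...
      OPENAI_BASE_URL: https://api.studio.nebius.com/v1/
      OPENAI_API_KEY: API_KEY  # replace with your key from Step 2
      OPENAI_MODEL: openai/gpt-oss-20b
      OPENAI_ENABLE_IMAGE_SERVICES: false
```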

To use Mealie’s image services, we need to enable them and add a new configuration line that increases the maximum number of tokens to support the extra processing.

OPENAI_BASE_URL: https://api.studio.nebius.com/v1/
OPENAI_API_KEY: API_KEY
OPENAI_MODEL: google/gemma-3-27b-it
OPENAI_ENABLE_IMAGE_SERVICES: true
OPENAI_CUSTOM_PARAMS: '{"max_tokens": 4096, "temperature": 0.5}' 

Below are screenshots of the Portainer experience, for those who want to follow along visually.

Screenshot for Install Steps - Environment list in Portainer
Screenshot for Install Steps - Stack list for a Portainer environment
Screenshot for Install Steps - Updating the configuration settings for a stack in Portainer
Screenshot for Install Steps - Portainer update confirmation dialog

Step 4: Validate that it works in Mealie

Now point your browser to your Mealie instance (e.g., 192.168.1.120:9250) and do the following to verify that everything is wired up:

  1. Once logged in as an admin user, click Settings
  2. Select Admin Settings from the fly-out menu
  3. Verify that ‘OpenAI Ready’ has a green checkmark, marking it as working properly
Mealie Screenshot - Accessing Admin Settings

Once Mealie sees the OpenAI connection, you can use the debug menu to validate that the parsers work:

  • OpenAI verifies that image processing works. The screenshot below shows a recipe tested using this debug option.
  • Parser validates how text is processed. The main thing to check here is how the ingredients section is parsed.
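To make concrete what you’re checking for, here is a rough approximation of the quantity/unit/food split that the LLM-backed parser should produce for each ingredient line. The regex and field names are illustrative only – they are not Mealie’s actual schema or logic:

```python
import re

# Illustrative only: approximates the quantity/unit/food split the
# LLM parser performs. Real parsing handles far messier input.
UNITS = {"cup", "cups", "tbsp", "tsp", "g", "kg", "oz", "lb", "ml", "l"}

def split_ingredient(line):
    m = re.match(r"^\s*([\d/.]+)?\s*(\S+)?\s*(.*)$", line)
    qty, maybe_unit, rest = m.group(1), m.group(2), m.group(3)
    if maybe_unit and maybe_unit.lower() in UNITS:
        return {"quantity": qty, "unit": maybe_unit, "food": rest}
    # No recognized unit: fold the second token back into the food name
    food = " ".join(t for t in [maybe_unit, rest] if t)
    return {"quantity": qty, "unit": None, "food": food}

print(split_ingredient("2 cups all-purpose flour"))
print(split_ingredient("3 eggs"))
```

When the Parser debug option splits your trickiest ingredient lines this cleanly, the LLM connection is doing its job.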
Mealie screenshot - debugging OpenAI image processing

Next Step – Getting image processing to work

As mentioned earlier in the post, parsing recipes from an image is not yet working reliably for me.

  • The OpenAI image processor returns success (see screen capture above)
  • The LLM successfully parses the recipe and returns a response (once I increased the maximum number of tokens)
  • Beyond that, results are inconsistent – some very basic recipe images process successfully, while others fail with the general ‘Something Went Wrong!’ message (see screenshot below)
Oops - something went wrong?

I’m in the process of debugging Mealie to figure out what’s happening – my current thought is that the LLM is returning a different JSON structure than Mealie is expecting. If you have thoughts or suggestions, let me know.
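One quick way to test that theory is to capture the raw LLM response (for example, from Mealie’s container logs) and diff it against the fields you expect. The expected keys below are my guess at the shape for debugging purposes, not Mealie’s documented schema:

```python
import json

# Keys I *expect* the recipe object to contain -- a debugging guess,
# not Mealie's documented schema.
EXPECTED_KEYS = {"name", "description", "recipe_yield", "ingredients", "instructions"}

def missing_keys(raw_response):
    """Return the expected keys absent from the LLM's JSON reply."""
    try:
        data = json.loads(raw_response)
    except json.JSONDecodeError:
        return EXPECTED_KEYS  # not even valid JSON
    return EXPECTED_KEYS - set(data)

sample = '{"name": "Pancakes", "ingredients": ["2 cups flour"], "instructions": ["Mix."]}'
print(sorted(missing_keys(sample)))
```

If the failing images consistently produce responses with missing or renamed keys, that would point to a prompt or schema mismatch rather than a model capability problem.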

Next Step – Fine tuning the LLM

I also want to experiment with different models and fine-tuning to see if that makes a difference in the parsing process. Everything seems to “just work” at the moment, but imports still require human review during the parsing process.

That’s it – it’s time to start importing

That’s all you need to do. Honestly, I was surprised at both how easy it was to add an LLM to Mealie and how much power an open LLM provides out-of-the-box.