JSON Mode

JSON mode instructs the LLM to return its output as JSON and, if you provide a schema in the prompt, to conform to that schema. To activate JSON mode, you first need to select a model that supports it.


Supported Models

Currently, the following models support JSON mode.

OpenAI

  • gpt-4o
  • gpt-4o-2024-08-06
  • gpt-4o-mini
  • gpt-4-turbo
  • gpt-4-turbo-preview
  • gpt-4-0125-preview
  • gpt-4-1106-preview
  • gpt-3.5-turbo
  • gpt-3.5-turbo-0125
  • gpt-3.5-turbo-1106

Google

  • gemini-1.5-pro
  • gemini-1.5-flash
  • gemini-1.5-flash-8b

Together

  • Mistral-7B-Instruct-v0.1
  • Mixtral-8x7B-Instruct-v0.1

Use JSON Mode in your Pipe

To use JSON mode, select one of the supported models in your Pipe. A JSON mode toggle will appear in the Pipe IDE; turn it ON to activate JSON mode.

Additionally, you can provide a schema in the system prompt or messages to further guide the output.
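
For example, you could embed the schema directly in the system prompt. The sketch below is only illustrative: the schema, field names, and user message are hypothetical, and the exact shape of your messages array depends on your Pipe setup.

```ts
// Illustrative JSON schema; adjust the fields to your use case.
const schema = {
  type: "object",
  properties: {
    summary: { type: "string" },
    sentiment: { type: "string", enum: ["positive", "neutral", "negative"] },
  },
  required: ["summary", "sentiment"],
};

// Embed the schema in the system prompt so the model knows the expected shape.
const messages = [
  {
    role: "system",
    content:
      "Respond only with JSON that conforms to this schema:\n" +
      JSON.stringify(schema, null, 2),
  },
  { role: "user", content: "Summarize this review: I love the battery life!" },
];

console.log(messages);
```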

Alternative

If you are using a model that does not support JSON mode, ask the model to produce JSON output and include the schema in your prompt. The LLM will try to conform to the schema as closely as possible, but without JSON mode the output is not guaranteed to be valid JSON.
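
Because the output is not enforced in this case, it helps to parse the reply defensively. The sketch below assumes a hypothetical `completion` string returned by your model call; models without JSON mode sometimes wrap the JSON in extra prose or code fences.

```ts
// Hypothetical raw completion text returned by a model without JSON mode.
const completion = `Here is the result: {"summary": "Great product", "sentiment": "positive"}`;

// Extract the first JSON object from the reply before parsing, since the
// model may surround it with explanatory text.
function extractJson(text: string): unknown {
  const match = text.match(/\{[\s\S]*\}/);
  if (!match) {
    throw new Error("No JSON object found in model output");
  }
  return JSON.parse(match[0]);
}

console.log(extractJson(completion)); // { summary: "Great product", sentiment: "positive" }
```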