Guide: How to use Vision

A step-by-step guide to using the Vision capabilities of a Vision model in Langbase Pipes.


In this guide, we will learn how to send an image to a Vision model in a Langbase Pipe and have it answer questions about the image.

What is Vision?

LLMs with vision capabilities can take images as input, understand them, and generate text-based answers about them. Vision models can be used to answer questions about images, generate captions, or describe visual content. They are also used for tasks like OCR, image classification, and object detection.

LLM Type                    Input           Output
Unimodal without Vision     Text            Text
Multimodal with Vision      Text + Image    Text

Let's say we send an image of an iridescent green sweat bee perched on a flower to a Vision model and ask it to describe the image.

The Vision model will process the image and generate a text-based response like this.

The image depicts an iridescent green sweat bee, likely of the genus Agapostemon or Augochlorini.

In the image, the bee is perched on a flower, likely foraging for nectar or pollen, which is a common behavior for these pollinators.

How to use Vision in Langbase Pipes?

Vision is supported in Langbase Pipes across different LLM providers, including OpenAI, Anthropic, Google, and more. Using Vision in Langbase Pipes is simple. You can send images in the API request and get text answers about them.

Sending Images to a Pipe for Vision

First, select a Vision model that supports image input in your Langbase Pipe. You can choose from a variety of Vision models from different LLM providers. For example, OpenAI's gpt-4o or Anthropic's claude-3.5-sonnet.

The Pipe Run API matches the OpenAI spec for Vision requests. When running the pipe, provide the image in a message inside the messages array.

Here is what your messages will look like for vision requests:

// Pipe Run API
"messages": [
	{
		"role": "user",
		"content": [
			{
				"type": "text",
				"text": "What is in this image?"
			},
			{
				"type": "image_url",
				"image_url": {
					"url": "data:image/png;base64,iVBOR...xyz" // base64 encoded image
				}
			}
		]
	}
]

In the above example, we send a base64-encoded image as a data URL in the image_url field. The vision model processes the image and returns a text response.

Follow the Run Pipe API spec for detailed request types.
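
For a quick reference, here is a minimal TypeScript sketch of the same request using fetch. It assumes your Pipe API key is available as a LANGBASE_PIPE_API_KEY environment variable, and <image_base64> is a placeholder for your encoded image:

// Minimal sketch: a vision request to the Pipe Run API using fetch.
// LANGBASE_PIPE_API_KEY and <image_base64> are placeholders.
const response = await fetch('https://api.langbase.com/v1/pipes/run', {
	method: 'POST',
	headers: {
		'Content-Type': 'application/json',
		Authorization: `Bearer ${process.env.LANGBASE_PIPE_API_KEY}`,
	},
	body: JSON.stringify({
		stream: false,
		messages: [
			{
				role: 'user',
				content: [
					{ type: 'text', text: 'What is in this image?' },
					{
						type: 'image_url',
						image_url: { url: 'data:image/png;base64,<image_base64>' },
					},
				],
			},
		],
	}),
});

const data = await response.json();
console.log(data);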

Image Input Guidelines for Vision

Here are some considerations when using vision in Langbase Pipes:

  1. Message Format

    • Images can be passed in user role messages.
    • Message content must be an array of content parts (text and images) in vision requests, whereas in text-only requests the message content is a plain string.
  2. Image URL

    • The image_url field is used to pass the image URL, which can be:
      1. Base64 encoded images: Supported by all providers.
      2. Public URLs: Supported only by OpenAI.
  3. Provider-specific limits

    • Different LLM providers may impose varying restrictions on image size, format, and the number of images per request.
    • Refer to the specific provider’s documentation for precise limits.
    • Langbase imposes no additional restrictions.
  4. Image Quality Settings (OpenAI only)

    • OpenAI models support an optional detail field in the image_url object for controlling image quality.
    • The detail field can be set to low, high, or auto to control the fidelity of the image sent to the model, as shown in the snippet below.
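
For illustration, an image_url content part with the detail field set might look like this (OpenAI models only; <image_base64> is a placeholder):

{
	"type": "image_url",
	"image_url": {
		"url": "data:image/png;base64,<image_base64>",
		"detail": "low"
	}
}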

Examples

Here are some example Pipe Run requests using Vision models in Langbase Pipes.

Example 1: Sending a Base64 Image

Here is an example of sending a base64 image in a Pipe Run API request. The base64 string below encodes a sample image of colorful squares.

curl https://api.langbase.com/v1/pipes/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer <YOUR_PIPE_API_KEY>' \
-d '{
  "stream": false,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Describe this image."
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAgAAAAIACAIAAAB7GkOtAAANG0lEQVR4nOzXf6/fdX3G8XPaLziBQmu7TSitVWGtdC5S6IEKnvFjskUZcgQqwqwiMEm2cIaigktBWDvYJtDqiuKZM2v5Fea6QIHCRiKCoxsQB7TUShyQTAu2heCQdutq6624EpLr8bgB1+vzxyd55j04b/03h5JOPuAd0f2j/uSc6P7Qsuz8se88O7q/e/vq6P49V8+J7q+e9p3o/r2TpkX3X9i7Nrq/9OvZ/39i9RHR/bEzt0T3l+3ZF91/cseq6P6CHz8e3Z8UXQfgTUsAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQaXvu2j2QPXLIxur95cF92//Fro/tzjn80ur/kU9ui+2MPjUf3z967K7q/9dMrovsXXbQ3uv/i8Mei+7/2t4dH9+8e/Ul0/43BZdH98z53XXT/1I98I7rvBQBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBqemDwSPXD9utXR/RWH/DK6v+x9z0T3L9z6oej+f3zoiOj+ziXD0f25v39ndH/lv22K7j87eXF0/5OvL43uv+34+6P72z6xI7p/4Jefju6P7fjj6P4V586N7nsBAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBrcd8InogSOefm90/5rTT43u773zsej+Ly7eL7q/9qL50f3H/vKV6P6D9/4iuv+BY/eP7m9e+HJ0f/mRo9H9Jyc/EN0/c3BzdH/R1uei+zd9bFV0//nlT0X3vQAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFKDZy59LnpgYsui6P4P79wY3V86463R/bkzDozuzzvp0uj+9GPPi+5vP3lDdP/fP3NNdH/8/qui+/9/6rej+5fP/PPo/iGbzorun/7JNdH9+TMuiO4fPWUkuu8FAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUGj708yujB+Ycd090/5/uOy66/7unzI/uH/ntf4nu/9acQXT//ZdPju5/5a5Z0f3D5o1E93cfuSq6v3PnW6P7n7lv/+j+4iXTovvD886P7t/wV4dE96dOvTK67wUAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQarH9pYfTAB6c8G92/+JBF0f0nvrczuv+FxS9E92+4Ynt0f8X4hdH9524aRPdnzJmI7r8x+vbo/o5FH47un7FkbXT/R4euie5vmbwsun/70B9E9ydtuz67H10H4E1LAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUGpx7zsLogZ/+64bo/lf/Ojo/NLpnT3R/xWEHRfePmPn16P7Eoj+M7n93+i3R/ZOPOTy6Pzb+8+j+jQ//OLo/Pn1WdP+R7+2K7q9Y/Zbs/pKp0f0bN74W3fcCACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKDTZtvyN6YO633hXdf37TzOj+I9ftju5PPDAa3X/Ha7Oi+zOnnR/dv/X+W6P7f3f1yuj+lSufj+7ffPj06P69+90W3V+zeVV0//CRhdH9i6ceFN1fe/lL0X0vAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACg1GD5yNXRAyNjs6L7//w/S6P7t7zySHR/zcJV0f3bLv7t6P5n3/mz6P7Qhoei86dvOz66v3j9tOj+TesfiO7Pu/vA6P66G++I7j946Ibo/p+9uCC6P+dP90X3vQAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFKDr95+T/TAN67fFd1fevtT0f3//uZvRvfXPfq/0f2Dfv0r0f1pn50a3R97eF90/0s7Ph3d/+6e46P7z7zwcHT/zC9NRPefmHRLdP/7+16I7s9e8Gp0/4t7/z667wUAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQanHLc16IHTrvooej+/ifNi+5veerh6P61G0+L7q/88rTo/uMHL43uXzr9H6L7P3jwhOj+y0edFd1/9ZIPRPe/OP890f0Z678V3R9M2h3dX7fp/6L7S8d3Rfe9AABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUoNXZ/1X9MB3DtgW3X/kmjOj+0M3r43Oz7/ijOj+lC0nRfff+L0l0f35Z9wZ3Z+7LPv9nz9qanT/lKHXo
/uPHvxcdH/0xEuj+2M/+350/7DPrYvuz96yX3TfCwCglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKDV86D9+PHrgRzP/Jrr/tR8uiO5fcv6J0f1Xzr4ruv/Mihej+wfOWB3d3zCYEd0/98ILovu/3Jr9f04fGonuz3vfydH98dm3R/eP+fAPovsbX3tXdP/oyzZE970AAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSw49NnBs98PpdT0f3T7j1/uj+HVedEt0/+P1vj+5fufiy6P67H/1odP+A0dHo/uyjj4zunzjp59H9hdfeGN2fe9q7o/tzNr83uv87//l4dH/BDddF9zdOfym67wUAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQaPuaEj0cP/MaUs6L7L1/9luj+R1+6N7r/RxufiO5/6tmfRPff89Nd0f3xzcuj+3c/eU50/4NXzY7uj21dGd3fdOJIdP8LV6yJ7u/Y/4Ho/vIpx0T3/+LaC6L7XgAApQQAoJQAAJQSAIBSAgBQSgAASgkAQCkBACglAAClBACglAAAlBIAgFICAFBKAABKCQBAKQEAKCUAAKUEAKCUAACUEgCAUgIAUEoAAEoJAEApAQAoJQAApQQAoJQAAJQSAIBSAgBQSgAASgkAQKlfBQAA//+DUW1hSVkpbAAAAABJRU5ErkJggg=="
          }
        }
      ]
    }
  ]
}'
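
If you need to produce the base64 data URL yourself, here is a small Node.js sketch (photo.png is a placeholder path; adjust the MIME type to match your image format):

// Read a local image and build the data URL expected by the image_url field.
import { readFileSync } from 'node:fs';

const imageBase64 = readFileSync('photo.png').toString('base64');
const dataUrl = `data:image/png;base64,${imageBase64}`; // goes into image_url.url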

Example 2: Sending an Image as a Public URL (supported by OpenAI only)

Public image URLs are only supported by OpenAI, so make sure you are using an OpenAI model.

curl https://api.langbase.com/v1/pipes/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer <YOUR_PIPE_API_KEY>' \
-d '{
  "stream": false,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Describe this image."
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "https://upload.wikimedia.org/wikipedia/commons/b/b5/Iridescent.green.sweat.bee1.jpg"
          }
        }
      ]
    }
  ]
}'

Example 3: Sending multiple images

You can also send multiple images attached to the same message.

curl https://api.langbase.com/v1/pipes/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer <YOUR_PIPE_API_KEY>' \
-d '{
  "stream": false,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "How are these images different?"
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "<image_1_base64>"
          }
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "<image_2_base64>"
          }
        }
      ]
    }
  ]
}'

Replace <image_1_base64> and <image_2_base64> with the base64 encoded images you want to send.
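
If you assemble requests programmatically, one way to build the content array for several images is a sketch like the following, where the placeholders stand for raw base64 strings:

// Sketch: turning a list of base64-encoded images into image_url content parts.
const images = ['<image_1_base64>', '<image_2_base64>'];

const content = [
	{ type: 'text', text: 'How are these images different?' },
	...images.map((b64) => ({
		type: 'image_url',
		image_url: { url: `data:image/png;base64,${b64}` },
	})),
];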

Example 4: Sending multiple images in conversation turns (chat)

You can also send multiple images in different messages across conversation turns.

Let's say you start the conversation with the first image:

curl https://api.langbase.com/v1/pipes/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer <YOUR_PIPE_API_KEY>' \
-d '{
  "stream": false,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Describe this image."
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "<image_1_base64>"
          }
        }
      ]
    }
  ]
}'

Then, in the next turn, send the second image along with the thread_id returned from the first request:

curl https://api.langbase.com/v1/pipes/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer <YOUR_PIPE_API_KEY>' \
-d '{
  "stream": false,
  "thread_id": "<thread_id>",
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Is this image different from the previous one?"
        },
        {
          "type": "image_url",
          "image_url": {
            "url": "<image_2_base64>"
          }
        }
      ]
    }
  ]
}'

By including the thread_id returned from the first request in the second request, your Langbase Pipe automatically continues the conversation from the previous turn.

FAQs

  • Make sure to use the correct Vision model that supports image input.
  • Langbase currently supports Vision models from OpenAI, Anthropic, and Google. More providers will be supported soon.
  • Vision support is live on the Langbase API. Vision in the Studio playground is coming soon.
  • Langbase currently does not store images sent to Vision models.