# toAISdkStream()

Converts Mastra streams (agent, network, or workflow) to AI SDK-compatible streams. Use this function when you need to manually transform Mastra streams for use with AI SDK's `createUIMessageStream()` and `createUIMessageStreamResponse()`.

This is useful when building custom streaming endpoints outside Mastra's provided route helpers such as [`chatRoute()`](https://mastra.ai/reference/ai-sdk/chat-route) or [`workflowRoute()`](https://mastra.ai/reference/ai-sdk/workflow-route).

By default, `toAISdkStream()` emits the AI SDK v5 stream contract. If your app is typed against AI SDK v6, pass `version: 'v6'` in the options object.

## Structured output in UI streams

When the source agent stream includes a final structured output object, `toAISdkStream()` emits it as a custom AI SDK UI data part:

```json
{
  "type": "data-structured-output",
  "data": {
    "object": {}
  }
}
```

The `object` field contains the full structured output value; it corresponds to Mastra's final structured output chunk, mapped into the AI SDK UI stream. Partial structured output chunks are not emitted.
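On the client, you can pick this part out of a UI message's parts array. The helper below is a minimal sketch, assuming the part shape shown in the JSON above (the `StructuredOutputPart` type and `extractStructuredOutput` name are illustrative, not a published API):

```typescript
// Hypothetical client-side helper. The part shape below is an
// assumption based on the JSON chunk above, not a published API.
type StructuredOutputPart = {
  type: 'data-structured-output'
  data: { object: Record<string, unknown> }
}

// Scan a UI message's parts for the final structured output, if any.
function extractStructuredOutput(
  parts: Array<{ type: string }>,
): Record<string, unknown> | undefined {
  const match = parts.find(
    (p): p is StructuredOutputPart => p.type === 'data-structured-output',
  )
  return match?.data.object
}
```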

## Usage example

Next.js App Router example:

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const myAgent = mastra.getAgent('weatherAgent')
  const stream = await myAgent.stream(messages)

  const uiMessageStream = createUIMessageStream({
    originalMessages: messages,
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, { from: 'agent' })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

> **Tip:** Pass `messages` to `originalMessages` in `createUIMessageStream()` to avoid duplicated assistant messages in the UI. See [Troubleshooting: Repeated Assistant Messages](https://ai-sdk.dev/docs/troubleshooting/repeated-assistant-messages) for details.

## Parameters

The first parameter is the Mastra stream to convert. It can be one of:

- `MastraModelOutput` - An agent stream from `agent.stream()`
- `MastraAgentNetworkStream` - A network stream from `agent.network()`
- `MastraWorkflowStream` or `WorkflowRunOutput` - A workflow stream

The second parameter is an options object:

**version** (`'v5' | 'v6'`): Selects the AI SDK stream contract to emit. Omit it or pass `'v5'` for the default behavior. Pass `'v6'` when your app is typed against AI SDK v6 response helpers. (Default: `'v5'`)

**from** (`'agent' | 'network' | 'workflow'`): The type of Mastra stream being converted. (Default: `'agent'`)

**lastMessageId** (`string`): (Agent only) The ID of the last message in the conversation.

**sendStart** (`boolean`): (Agent only) Whether to send start events in the stream. (Default: `true`)

**sendFinish** (`boolean`): (Agent only) Whether to send finish events in the stream. (Default: `true`)

**sendReasoning** (`boolean`): (Agent only) Whether to include reasoning-delta chunks in the stream. Set to `true` to stream reasoning content from models that support extended thinking. (Default: `false`)

**sendSources** (`boolean`): (Agent only) Whether to include source citations in the output. (Default: `false`)

**includeTextStreamParts** (`boolean`): (Workflow only) Whether to include text stream parts in the output. (Default: `true`)

**messageMetadata** (`(options: { part: UIMessageStreamPart }) => Record<string, unknown> | undefined`): (Agent only) A function that receives the current stream part and returns metadata to attach to start and finish chunks.

**onError** (`(error: unknown) => string`): (Agent only) A function to handle errors during stream conversion. Receives the error and should return a string representation.
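As a minimal sketch, an `onError` handler might unwrap `Error` instances and fall back to a generic message (the fallback text here is an arbitrary choice, not a library default):

```typescript
// Minimal sketch of an onError handler for toAISdkStream().
// It receives whatever the stream throws and must return a string;
// the fallback message below is an arbitrary choice.
const onError = (error: unknown): string =>
  error instanceof Error ? error.message : 'An unknown error occurred'

// Passed as: toAISdkStream(stream, { from: 'agent', onError })
```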

## Examples

### Converting a workflow stream

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { input } = await req.json()
  const workflow = mastra.getWorkflow('myWorkflow')
  const run = workflow.createRun()
  const stream = await run.stream({ inputData: input })

  const uiMessageStream = createUIMessageStream({
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, { from: 'workflow' })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

### Converting a network stream

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const routingAgent = mastra.getAgent('routingAgent')
  const stream = await routingAgent.network(messages)

  const uiMessageStream = createUIMessageStream({
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, { from: 'network' })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

### Converting an agent stream with reasoning enabled

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const reasoningAgent = mastra.getAgent('reasoningAgent')
  const stream = await reasoningAgent.stream(messages, {
    providerOptions: {
      openai: { reasoningEffort: 'high' },
    },
  })

  const uiMessageStream = createUIMessageStream({
    originalMessages: messages,
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, {
        from: 'agent',
        sendReasoning: true,
      })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

### Converting an agent stream for AI SDK v6

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const myAgent = mastra.getAgent('weatherAgent')
  const stream = await myAgent.stream(messages)

  const uiMessageStream = createUIMessageStream({
    originalMessages: messages,
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, {
        from: 'agent',
        version: 'v6',
      })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

### Using `messageMetadata`

```typescript
import { mastra } from '../../mastra'
import { createUIMessageStream, createUIMessageStreamResponse } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const myAgent = mastra.getAgent('weatherAgent')
  const stream = await myAgent.stream(messages)

  const uiMessageStream = createUIMessageStream({
    originalMessages: messages,
    execute: async ({ writer }) => {
      for await (const part of toAISdkStream(stream, {
        from: 'agent',
        messageMetadata: ({ part }) => ({
          timestamp: Date.now(),
          partType: part.type,
        }),
      })) {
        await writer.write(part)
      }
    },
  })

  return createUIMessageStreamResponse({
    stream: uiMessageStream,
  })
}
```

### Client-side stream transformation

When using the Mastra client SDK (`@mastra/client-js`) in the browser, you can convert its streams to AI SDK format:

```typescript
import { MastraClient } from '@mastra/client-js'
import { createUIMessageStream } from 'ai'
import { toAISdkStream } from '@mastra/ai-sdk'
import type { ChunkType, MastraModelOutput } from '@mastra/core/stream'

const client = new MastraClient({
  baseUrl: 'http://localhost:4111',
})

const agent = client.getAgent('weatherAgent')
const response = await agent.stream('What is the weather in Tokyo?')

// Convert the client SDK stream to a ReadableStream<ChunkType>
const chunkStream = new ReadableStream<ChunkType>({
  async start(controller) {
    await response.processDataStream({
      onChunk: async chunk => {
        controller.enqueue(chunk)
      },
    })
    controller.close()
  },
})

// Transform to AI SDK format
const uiMessageStream = createUIMessageStream({
  execute: async ({ writer }) => {
    for await (const part of toAISdkStream(chunkStream as unknown as MastraModelOutput, {
      from: 'agent',
    })) {
      await writer.write(part)
    }
  },
})

for await (const part of uiMessageStream) {
  console.log(part)
}
```