Streaming responses incompatible with Expo server runtime #309

@GoestaHuppenbauer

Description

TanStack AI version

0.5.1

Framework/Library version

Expo SDK 52, Expo Router v3, @expo/server

Describe the bug and the steps to reproduce it

Description

Expo's server runtime does not support streaming ReadableStream responses in API routes. The response is never delivered to the client, and the client's isLoading flag immediately flips back to false.

Both toServerSentEventsResponse and toHttpResponse return ReadableStream-based responses, which crash in Expo's server runtime with:

TypeError: Cannot read properties of undefined (reading 'statusText')
at respond (@expo/server/src/vendor/http.ts:99:31)

Steps to reproduce:

  1. Create an Expo API route
  2. Use chat() to create a stream
  3. Return toServerSentEventsResponse(stream) or toHttpResponse(stream)
  4. Call the route from a React Native client

Workaround:

Collect all chunks and return as a single JSON response:

// Buffer the whole stream before responding, so Expo's server
// runtime never has to send a ReadableStream body.
const chunks = [];
for await (const chunk of stream) {
  chunks.push(chunk);
}
return new Response(JSON.stringify(chunks), {
  headers: { 'Content-Type': 'application/json' },
});
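A self-contained sketch of that workaround, with the TanStack AI stream mocked as a plain async generator (the chunk shape used here is an assumption, not the library's actual format):

```javascript
// Mock of the async-iterable stream chat() yields; real chunks come
// from TanStack AI and their exact shape may differ.
async function* mockStream() {
  yield { delta: 'Hello' };
  yield { delta: ', world' };
}

// Collect every chunk, then answer with a single JSON body so the
// Expo server runtime never sees a streaming ReadableStream response.
async function bufferedHandler(stream) {
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
  return new Response(JSON.stringify(chunks), {
    headers: { 'Content-Type': 'application/json' },
  });
}
```

The trade-off is that the client loses incremental rendering: nothing arrives until the model has finished generating.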

Your Minimal, Reproducible Example - (Sandbox Highly Recommended)

Snack

Screenshots or Videos (Optional)

No response

Do you intend to try to help solve this bug with your own PR?

Yes, I think I know how to fix it and will discuss it in the comments of this issue

Terms & Code of Conduct

  • I agree to follow this project's Code of Conduct
  • I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.
