[ai] Fix collectUiMessages option by accumulating chunks in a separate step call #784
base: main
Conversation
🦋 Changeset detected — latest commit: 6d2df7d. The changes in this PR will be included in the next version bump. This PR includes changesets to release 2 packages.
📊 Benchmark Results

Benchmarks: workflow with no steps; workflow with 1 step; workflow with 10 sequential steps; Promise.all with 10 and 25 concurrent steps; Promise.race with 10 and 25 concurrent steps; stream benchmarks (workflow with stream, includes TTFB metrics). Each benchmark runs in 💻 Local Development and ▲ Production (Vercel), with 🔍 observability links for Express, Nitro, and Next.js (Turbopack). Summary: fastest framework by world and fastest world by framework, with the winner determined by most benchmark wins.
🧪 E2E Test Results — ❌ Some tests failed

Failed tests:
- 🪟 Windows (1 failed): nextjs-turbopack (1 failed)
- 🌍 Community Worlds (21 failed): mongodb (1 failed), redis (1 failed), starter (18 failed), turso (1 failed)

Details by category:
- ✅ ▲ Vercel Production
- ✅ 💻 Local Development
- ✅ 📦 Local Production
- ✅ 🐘 Local Postgres
- ❌ 🪟 Windows
- ❌ 🌍 Community Worlds

❌ Some E2E test jobs failed: check the workflow run for details.
When a step function is invoked, only capture `this` if it has a `classId` property (indicating it's a class constructor registered for serialization by the SWC plugin). This prevents serialization errors when step functions are called as methods on arbitrary objects (e.g., `tool.execute()`) where the object has non-serializable properties like Zod schemas.
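Conceptually, the guard boils down to something like the following sketch. This is not the repo's actual implementation; the `classId` marker shape and the helper name `shouldCaptureThis` are assumptions for illustration.

```ts
// Minimal sketch of the `this`-capture guard described above (illustrative only).
// `classId` is the marker the SWC plugin is said to attach to classes that are
// registered for serialization; its exact shape is assumed here.
function shouldCaptureThis(thisArg: unknown): boolean {
  // Capture `this` only when it carries a `classId`, i.e. it comes from a class
  // registered for serialization. Arbitrary receivers (e.g. a tool object that
  // holds Zod schemas) are skipped so they never reach the serializer.
  return typeof thisArg === "object" && thisArg !== null && "classId" in thisArg;
}

// Hypothetical usage at the step-invocation boundary:
// const captured = shouldCaptureThis(thisArg) ? thisArg : undefined;
```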
This is a refactor of #768 which moves the accumulation logic into a step. The test harness I was using to test #768 didn't correctly account for the workflow transformation, so `collectUiMessages` didn't actually work as intended before (also reported in issue #739).

This PR moves the final accumulation call into a `"use step"` function at the end of the `agent.stream` call. The actual logic still uses the AI SDK's `readUIMessageStream` internally, just like before.

I also considered pre-creating a stream in a setup step, double-piping into that stream, and then passing it to a final step that calls `readUIMessageStream` on it. In a non-workflow world this would be a performance improvement (native stream use), but since we await step boundaries anyway it would not help here, and it would make the content less observable.

I tested this in `flight-booking-app`. See the final step it makes here that gets the chunks and returns the `uiMessages`:


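For reference, a rough sketch of what such a final accumulation step can look like, assuming the AI SDK's `readUIMessageStream` API and the `"use step"` directive mentioned above. The function name `collectUiMessagesStep`, the chunk buffering, and the per-id bookkeeping are illustrative, not the PR's exact implementation.

```ts
import { readUIMessageStream, type UIMessage, type UIMessageChunk } from "ai";

// Illustrative final step: replay the buffered chunks as a ReadableStream and
// let readUIMessageStream rebuild the UI messages from them.
async function collectUiMessagesStep(
  chunks: UIMessageChunk[],
): Promise<UIMessage[]> {
  "use step";

  const stream = new ReadableStream<UIMessageChunk>({
    start(controller) {
      for (const chunk of chunks) controller.enqueue(chunk);
      controller.close();
    },
  });

  // readUIMessageStream yields progressively accumulated message snapshots;
  // keep the latest snapshot per message id so the result is the final state.
  const byId = new Map<string, UIMessage>();
  for await (const message of readUIMessageStream({ stream })) {
    byId.set(message.id, message);
  }
  return [...byId.values()];
}
```

Because the chunks are plain values by the time this step runs, the accumulation stays observable at the step boundary, which is the trade-off the description weighs against the double-piping alternative.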