
Add support for Lambda Response Streaming in ASP.NET Core bridge packages #2293

Draft
normj wants to merge 4 commits into normj/response-streaming from normj/aspnetcore-responsestreaming

Conversation


@normj normj commented Mar 10, 2026

Issue #, if available:
#1635

Description of changes:
PR #2288 adds support for Lambda Response Streaming in Amazon.Lambda.Core and Amazon.Lambda.RuntimeSupport. This PR builds on top of that by letting users of the ASP.NET Core bridge packages opt in to Lambda Response Streaming when marshalling the response back to the client. This allows ASP.NET Core Lambda users to stream the response back to the client while it is still being created.

Important Note
To maximize code reuse between the standard buffering approach and streaming, I updated the build targets from .NET 6 and 8 to .NET 8 and 10. That makes this a major version bump, but since .NET 6 has been out of support in Lambda for some time, this is a safe change. Dropping .NET 6 also allowed removing some #if conditional-compilation blocks.

Here is an example ASP.NET Core application that sets the EnableResponseStreaming property to true, telling the library to create a Lambda response stream and write the response to it as it is being generated.

using Amazon.Lambda.Core;
using Microsoft.AspNetCore.Mvc;

#pragma warning disable CA2252 // suppress the preview-features warning for EnableResponseStreaming

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddAWSLambdaHosting(LambdaEventSource.RestApi, options =>
{
    options.EnableResponseStreaming = true;
});

var app = builder.Build();


app.UseHttpsRedirection();
app.UseAuthorization();

app.MapGet("/", () => "Welcome to running ASP.NET Core Minimal API on AWS Lambda (Updated)");

app.MapGet("/streaming-test", async ([FromServices] ILogger<Program> logger, HttpContext context) =>
{
    var cancellationToken = context.RequestAborted;
    if (context.Items["LambdaContext"] is ILambdaContext lambdaContext)
    {
        // Stop streaming shortly before the Lambda invocation times out.
        var source = new CancellationTokenSource(lambdaContext.RemainingTime.Add(TimeSpan.FromSeconds(-5)));
        cancellationToken = source.Token;
    }

    logger.LogInformation("Starting streaming");

    context.Response.ContentType = "text/plain";
    context.Response.StatusCode = 200;

    logger.LogInformation("Creating stream");
    using var stream = context.Response.BodyWriter.AsStream();
    logger.LogInformation("Got BodyWriter as stream");
    using var writer = new StreamWriter(stream, leaveOpen: true);
    for (var i = 1; i <= 1000000; i++)
    {
        var message = $"Hello - {i}";
        await writer.WriteLineAsync(message);

        if (i % 100 == 0)
        {
            logger.LogInformation("Logged {Count} messages", i);
            await writer.FlushAsync(); // push buffered lines to the response stream
            await Task.Delay(1);
        }

        if (cancellationToken.IsCancellationRequested)
        {
            await writer.WriteLineAsync("Request cancelled, stopping the stream.");
            break;
        }
    }
});

app.Run();

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

@normj normj marked this pull request as draft March 10, 2026 23:38
@normj normj force-pushed the normj/response-streaming branch 5 times, most recently from 1471d93 to d0861c6 Compare March 12, 2026 07:05
@normj normj marked this pull request as ready for review March 12, 2026 18:53

Copilot AI left a comment


Pull request overview

This PR adds opt-in Lambda Response Streaming support to the ASP.NET Core bridge packages (Amazon.Lambda.AspNetCoreServer + Hosting), enabling responses to be streamed back to the caller as they’re produced. It also updates bridge packages and test projects to target .NET 8 and .NET 10 to simplify the streaming implementation and reduce conditional compilation.

Changes:

  • Add EnableResponseStreaming (preview) to the ASP.NET Core Lambda function base class and implement a streaming execution path that writes via LambdaResponseStreamFactory.CreateHttpStream.
  • Introduce StreamingResponseBodyFeature to bridge ASP.NET Core response-body APIs (Stream/PipeWriter/SendFileAsync) to Lambda response streaming semantics.
  • Update TFMs across bridge packages and tests from net6/net8 to net8/net10 and add extensive unit tests for the streaming path.

Reviewed changes

Copilot reviewed 24 out of 24 changed files in this pull request and generated 8 comments.

Summary per file:

  • Libraries/src/Amazon.Lambda.AspNetCoreServer/AbstractAspNetCoreFunction.cs — Adds the preview streaming flag, streaming prelude builder, and streaming request execution path.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer/Internal/StreamingResponseBodyFeature.cs — New IHttpResponseBodyFeature that buffers pre-start bytes and streams post-start bytes to Lambda response streams.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer/Internal/InvokeFeatures.cs — Removes NET6 conditional compilation around newer ASP.NET Core feature interfaces.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer/Internal/HttpRequestMessageConverter.cs — Removes NET8 conditional compilation (now always compiled under net8/net10).
  • Libraries/src/Amazon.Lambda.AspNetCoreServer/Amazon.Lambda.AspNetCoreServer.csproj — Targets net8/net10 and exposes internals to the test assembly.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer.Hosting/HostingOptions.cs — Adds the preview EnableResponseStreaming option (doc needs correction).
  • Libraries/src/Amazon.Lambda.AspNetCoreServer.Hosting/Internal/LambdaRuntimeSupportServer.cs — Wires HostingOptions.EnableResponseStreaming into the handler instance.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer.Hosting/ServiceCollectionExtensions.cs — Removes NET8 conditional compilation around the SnapStart helper API.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer.Hosting/Internal/GetBeforeSnapshotRequestsCollector.cs — Removes NET8 conditional compilation so the type is always available under net8/net10.
  • Libraries/src/Amazon.Lambda.AspNetCoreServer.Hosting/Amazon.Lambda.AspNetCoreServer.Hosting.csproj — Targets net8/net10 (TFM update).
  • Libraries/src/Amazon.Lambda.Logging.AspNetCore/Amazon.Lambda.Logging.AspNetCore.csproj — Targets net8/net10 (TFM update).
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/Amazon.Lambda.AspNetCoreServer.Test.csproj — Targets net8/net10 and imports common.props.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/TestApiGatewayHttpApiV2Calls.cs — Removes NET8-only test guard due to new TFMs.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/StreamingResponseBodyFeatureTests.cs — New tests validating buffering/flush ordering and file-sending behavior.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/StreamingFunctionHandlerAsyncTests.cs — New tests covering streaming-path behavior and exception scenarios.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/ResponseStreamingPropertyTests.cs — New property-style tests around streaming behavior and prelude building.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Test/BuildStreamingPreludeTests.cs — New unit tests for status/header/cookie mapping into the streaming prelude.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Hosting.Tests/ResponseStreamingHostingTests.cs — New tests validating HostingOptions wiring via AddAWSLambdaHosting.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Hosting.Tests/ResponseStreamingPropertyTests.cs — New tests validating handler wrapper routing/selection behavior.
  • Libraries/test/Amazon.Lambda.AspNetCoreServer.Hosting.Tests/AddAWSLambdaBeforeSnapshotRequestTests.cs — Removes NET8-only test guard due to new TFMs.
  • Libraries/test/TestWebApp/TestWebApp.csproj — Targets net8/net10 (TFM update).
  • Libraries/test/TestMinimalAPIApp/TestMinimalAPIApp.csproj — Targets net10 (TFM update).
  • Libraries/test/Amazon.Lambda.Logging.AspNetCore.Tests/Amazon.Lambda.Logging.AspNetCore.Tests.csproj — Targets net10 (TFM update).
  • .autover/changes/f0d5a912-bcfa-4244-96cb-ac3c847f877c.json — Records major bumps and streaming preview notes (wording needs minor fixes).


Comment on lines +73 to +75
CapturedFeatures = aspNetCoreItemFeature as InvokeFeatures;
PipelineSetupAction?.Invoke(CapturedFeatures);
base.PostMarshallItemsFeatureFeature(aspNetCoreItemFeature, lambdaRequest, lambdaContext);

Copilot AI Mar 12, 2026


PipelineSetupAction returns a Task but is invoked without awaiting inside PostMarshallItemsFeatureFeature. This can produce unobserved task exceptions and makes test setup order nondeterministic if an async delegate is ever provided. Consider changing this to an Action<InvokeFeatures> (since the override is synchronous) or synchronously waiting on the task (and surfacing any exceptions) to keep the tests deterministic.
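If the delegate stays asynchronous, the synchronous override could observe it deterministically along these lines (a sketch; SetupRunner is a hypothetical helper, not code from this PR):

```csharp
using System;
using System.Threading.Tasks;

public static class SetupRunner
{
    // Synchronously wait on an async setup delegate so any exception surfaces in
    // the caller instead of becoming an unobserved task. GetAwaiter().GetResult()
    // rethrows the original exception rather than wrapping it in AggregateException.
    public static void RunSetup<T>(Func<T, Task> setup, T state)
    {
        setup?.Invoke(state).GetAwaiter().GetResult();
    }
}
```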

Comment on lines +776 to +780
if (!streamOpened && IncludeUnhandledExceptionDetailInResponse)
{
var errorPrelude = new Amazon.Lambda.Core.ResponseStreaming.HttpResponseStreamPrelude
{
StatusCode = System.Net.HttpStatusCode.InternalServerError

Copilot AI Mar 12, 2026


In the exception path, if the pipeline throws before the response stream is opened and IncludeUnhandledExceptionDetailInResponse is false, no Lambda response stream is created and the handler will still return default (null). That will cause the runtime to fall back to sending a serialized null response (buffered path), which is incorrect for streaming mode. Consider always opening a streaming response with a 500 prelude (optionally with an empty/generic body), or rethrowing so the runtime reports an invocation error when the stream was never created.

Comment on lines +91 to +93
if (_started) return;
_started = true;


Copilot AI Mar 12, 2026


StartAsync sets _started = true before executing OnStarting callbacks and before opening the Lambda stream. If either the callbacks or _streamOpener() throws, the feature is left permanently "started" with _lambdaStream still null, so later calls to StartAsync/CompleteAsync become no-ops and the buffered bytes may never be flushed. Also, the current _started check/set is not thread-safe and can race under concurrent writes/flushes. Consider using an interlocked/lock-based one-time initialization and only marking started after the stream has been successfully opened (or resetting state on failure).
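A minimal sketch of the suggested one-time initialization using Interlocked, where "started" is recorded only after the stream opens successfully and a failed open can be retried. The type and member names below are illustrative, not the PR's:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Illustrative one-time starter: only the first caller opens the stream, the
// started state is set only after success, and a failed open can be retried.
public sealed class OneTimeStarter
{
    private int _state; // 0 = not started, 1 = starting, 2 = started

    public async Task StartAsync(Func<Task> openStream)
    {
        // First caller transitions 0 -> 1; concurrent or later callers return.
        if (Interlocked.CompareExchange(ref _state, 1, 0) != 0)
            return;

        try
        {
            await openStream(); // run OnStarting callbacks + open the Lambda stream
            Volatile.Write(ref _state, 2); // mark started only after success
        }
        catch
        {
            Volatile.Write(ref _state, 0); // reset so a later StartAsync can retry
            throw;
        }
    }
}
```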

Comment on lines +205 to +209
await Inner.FlushAsync(cancellationToken);
// Recreate inner writer against the Lambda stream after StartAsync.
_inner = null;
await _feature.StartAsync(cancellationToken);
// Inner now wraps _lambdaStream; nothing extra to flush (StartAsync already

Copilot AI Mar 12, 2026


FlushAsync ignores the FlushResult returned by the inner writer (including cancellation/completion) and always returns a new FlushResult(false,false) when it triggers StartAsync. This can mask cancellation and make upstream pipeline logic think the flush succeeded. Consider propagating the actual FlushResult (or at least returning isCanceled/isCompleted accurately) and avoiding opening the stream if the initial flush was canceled.
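One way to propagate the real FlushResult, sketched as a self-contained delegating PipeWriter. ForwardingPipeWriter and onFirstFlush are hypothetical names standing in for the PR's wrapper and its StartAsync hook:

```csharp
using System;
using System.IO;
using System.IO.Pipelines;
using System.Threading;
using System.Threading.Tasks;

// Sketch: a delegating PipeWriter that returns the inner writer's real
// FlushResult so cancellation/completion are not masked by a fabricated result.
public sealed class ForwardingPipeWriter : PipeWriter
{
    private readonly PipeWriter _inner;
    private readonly Func<CancellationToken, Task> _onFirstFlush; // e.g. StartAsync
    private bool _started;

    public ForwardingPipeWriter(PipeWriter inner, Func<CancellationToken, Task> onFirstFlush)
    {
        _inner = inner;
        _onFirstFlush = onFirstFlush;
    }

    public override async ValueTask<FlushResult> FlushAsync(CancellationToken ct = default)
    {
        var result = await _inner.FlushAsync(ct);
        // Only open the stream if the flush actually succeeded.
        if (!result.IsCanceled && !result.IsCompleted && !_started)
        {
            _started = true;
            await _onFirstFlush(ct);
        }
        return result; // propagate the real result, not new FlushResult(false, false)
    }

    public override void Advance(int bytes) => _inner.Advance(bytes);
    public override Memory<byte> GetMemory(int sizeHint = 0) => _inner.GetMemory(sizeHint);
    public override Span<byte> GetSpan(int sizeHint = 0) => _inner.GetSpan(sizeHint);
    public override void CancelPendingFlush() => _inner.CancelPendingFlush();
    public override void Complete(Exception exception = null) => _inner.Complete(exception);
}
```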

Comment on lines +228 to +230
// Complete (sync) — delegate
public override void Complete(Exception exception = null) => Inner.Complete(exception);


Copilot AI Mar 12, 2026


PipeWriter.Complete(Exception?) (sync) currently just delegates to the inner writer without ensuring the Lambda response stream is opened. If ASP.NET Core (or middleware) calls the sync Complete path before any flush, the response prelude/stream may never be created. Consider mirroring the CompleteAsync behavior here (ensure a flush + StartAsync happens when not started) or explicitly document why the sync path is safe to ignore.

Suggested change
// Complete (sync) — delegate
public override void Complete(Exception exception = null) => Inner.Complete(exception);
// Complete (sync) — mirror CompleteAsync behavior to ensure the response is started.
public override void Complete(Exception exception = null)
{
if (!_feature._started)
{
// Flush buffered bytes into the pre-start buffer, then open the stream.
Inner.FlushAsync().GetAwaiter().GetResult();
_inner = null;
_feature.StartAsync().GetAwaiter().GetResult();
}
Inner.Complete(exception);
}

Comment on lines +37 to +39
/// When true, the Lambda hosting server will invoke <c>StreamingFunctionHandlerAsync</c>
/// instead of <c>FunctionHandlerAsync</c>, enabling Lambda response streaming.
/// Requires net8.0 or later.

Copilot AI Mar 12, 2026


The XML doc for EnableResponseStreaming mentions invoking StreamingFunctionHandlerAsync, but the implementation toggles behavior inside FunctionHandlerAsync via the EnableResponseStreaming property (no StreamingFunctionHandlerAsync method exists). This doc mismatch may confuse users; consider updating the comment to describe the actual behavior (FunctionHandlerAsync switches to streaming mode and returns null while writing to a Lambda response stream).

Suggested change
/// When true, the Lambda hosting server will invoke <c>StreamingFunctionHandlerAsync</c>
/// instead of <c>FunctionHandlerAsync</c>, enabling Lambda response streaming.
/// Requires net8.0 or later.
/// When true, the Lambda hosting server enables Lambda response streaming behavior
/// when invoking <c>FunctionHandlerAsync</c>. In streaming mode,
/// <c>FunctionHandlerAsync</c> writes directly to the Lambda response stream and
/// returns <c>null</c>. Requires net8.0 or later.

"Type": "Major",
"ChangelogMessages": [
"[Breaking] Update build targets from .NET 6 and 8 to .NET 8 and 10",
"[Preview] Add support Lambda Response Streaming enabled by setting the EnableResponseStreaming property from the base class AbstractAspNetCoreFunction"

Copilot AI Mar 12, 2026


Changelog message grammar: "Add support Lambda Response Streaming enabled by setting..." is missing "for" and reads awkwardly. Consider rephrasing to something like "[Preview] Add support for Lambda response streaming, enabled by setting the EnableResponseStreaming property on AbstractAspNetCoreFunction" for clarity.

"Type": "Major",
"ChangelogMessages": [
"[Breaking] Update build targets from .NET 6 and 8 to .NET 8 and 10",
"[Preview] Add support Lambda Response Streaming enabled by setting the EnableResponseStreaming property on the HostingOptions object passed into the AddAWSLambdaHosting method"

Copilot AI Mar 12, 2026


Changelog message grammar: "Add support Lambda Response Streaming enabled by setting..." is missing "for" and would be clearer if rephrased. Consider something like "[Preview] Add support for Lambda response streaming, enabled by setting EnableResponseStreaming on HostingOptions passed to AddAWSLambdaHosting".

@normj normj marked this pull request as draft March 13, 2026 06:32