When dealing with network streams, sockets, or large files in .NET, developers often look for ways to process data efficiently. That's where the System.IO.Pipelines API comes in: a low-level, high-performance abstraction for working with streaming data.

At the center of this API is `PipeReader`, which provides a fast way to consume bytes from a buffer. And with .NET 10, things just got even better: `JsonSerializer.DeserializeAsync` now directly supports `PipeReader`.
A Quick Refresher: What is PipeReader?

A `PipeReader` represents the reading side of a pipeline. It's built for performance scenarios where data doesn't arrive all at once, but rather in chunks (like from a socket or file stream).
- Producer → writes bytes into a `PipeWriter`
- Consumer → reads those bytes using a `PipeReader`
This design avoids unnecessary data copying, supports async APIs naturally, and allows you to process data incrementally.
Example – Basic PipeReader usage:

```csharp
using System.Buffers;
using System.IO.Pipelines;
using System.Text;

var pipe = new Pipe();

// Producer: write a message into the pipe, flush it, and signal completion.
_ = Task.Run(async () =>
{
    var writer = pipe.Writer;
    byte[] msg = Encoding.UTF8.GetBytes("Pipeline rocks!");
    writer.Write(msg);
    await writer.FlushAsync();
    writer.Complete();
});

// Consumer: read chunks until the producer completes the pipe.
while (true)
{
    var result = await pipe.Reader.ReadAsync();
    var buffer = result.Buffer;

    foreach (var segment in buffer)
        Console.Write(Encoding.UTF8.GetString(segment.Span));

    // Mark everything as consumed so the pipe can reuse the memory.
    pipe.Reader.AdvanceTo(buffer.End);

    if (result.IsCompleted) break;
}

pipe.Reader.Complete();
```
The Limitation Before .NET 10

While `PipeReader` was excellent for raw data, integrating it with JSON APIs had a catch: you couldn't directly use it with `JsonSerializer`. Instead, you had to first convert it to a `Stream`:

```csharp
using var stream = pipeReader.AsStream();
var person = await JsonSerializer.DeserializeAsync<Person>(stream);
```
This worked, but it added:
- Extra memory allocations
- Performance overhead
- More boilerplate code
The .NET 10 Upgrade

With .NET 10, you can now deserialize JSON directly from a `PipeReader`:

```csharp
Person? person = await JsonSerializer.DeserializeAsync<Person>(pipeReader);
```

That's it: no conversion step, no `AsStream()`, just clean and efficient deserialization.
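To see the new overload end to end, here's a minimal, self-contained sketch. The `Person` record and the JSON payload are made up for illustration: a producer writes a JSON document into a `Pipe`, and the consumer deserializes straight from the `PipeReader`.

```csharp
using System.IO.Pipelines;
using System.Text.Json;

var pipe = new Pipe();

// Producer: write a JSON document into the pipe, then signal completion.
_ = Task.Run(async () =>
{
    await pipe.Writer.WriteAsync("""{"Name":"Morteza","Age":30}"""u8.ToArray());
    await pipe.Writer.CompleteAsync();
});

// Consumer: deserialize directly from the PipeReader (new in .NET 10).
Person? person = await JsonSerializer.DeserializeAsync<Person>(pipe.Reader);
Console.WriteLine(person?.Name); // Morteza

// Hypothetical type used by the examples in this post.
public record Person(string Name, int Age);
```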
Quick Comparison

| Scenario | Before .NET 10 | After .NET 10 |
|---|---|---|
| Deserialize from PipeReader | ❌ Not supported directly → must use `.AsStream()` | ✅ Supported natively by `JsonSerializer` |
| Performance | Extra allocations & memory copying | Zero-copy, faster |
| Code clarity | Boilerplate required | Clean one-liner |
| Networking scenarios | More complex integration | Natural fit with pipelines |
Example: JSON Over a Socket

Here's how you might combine `PipeReader` with `JsonSerializer` when handling socket data:

```csharp
var pipe = new Pipe();
_ = FillPipeFromSocketAsync(socket, pipe.Writer);

var person = await JsonSerializer.DeserializeAsync<Person>(pipe.Reader);
Console.WriteLine($"Received person: {person?.Name}");
```

The pipeline continuously fills from the socket, while `JsonSerializer` consumes the JSON as it streams in, with no extra conversions required. A possible shape for the fill helper is sketched below.
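`FillPipeFromSocketAsync` isn't defined above, so here's one way such a helper could look; the method name and the socket details are assumptions for illustration, not an official API. It rents memory from the `PipeWriter`, receives into it from the socket, and flushes to the pipe until the connection closes.

```csharp
using System.IO.Pipelines;
using System.Net.Sockets;

// Hypothetical producer: copies bytes from a connected Socket into a PipeWriter.
static async Task FillPipeFromSocketAsync(Socket socket, PipeWriter writer)
{
    while (true)
    {
        // Ask the pipe for a buffer to write into (at least 512 bytes).
        Memory<byte> memory = writer.GetMemory(512);

        int bytesRead = await socket.ReceiveAsync(memory, SocketFlags.None);
        if (bytesRead == 0)
            break; // remote side closed the connection

        // Tell the pipe how much was written, then make it visible to the reader.
        writer.Advance(bytesRead);
        FlushResult result = await writer.FlushAsync();
        if (result.IsCompleted)
            break; // reader stopped reading
    }

    // Signal that no more data will be written.
    await writer.CompleteAsync();
}
```

With a helper along these lines, the socket-to-pipe copy and the JSON deserialization run concurrently, which is exactly what the snippet above relies on.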
Wrap-Up

If you're working on high-performance .NET applications, such as servers, networking libraries, or real-time data processors, you'll want to keep an eye on `PipeReader`. And now, with direct `JsonSerializer` support in .NET 10, your JSON workloads can be both simpler and faster.
Tags: dotnet, csharp, performance, pipelines
I’m Morteza Jangjoo and “Explaining things I wish someone had explained to me”