Chrome 85 has an experimental implementation of request streams, which means you can start sending a request before you have the whole body available.

You could use this to:

  • Warm up the server. In other words, you can start the request once the user focuses a text input field and get all of the headers out of the way, then wait until the user presses 'send' before sending the data they entered.
  • Gradually send data generated on the client, such as audio, video, or input data.
  • Recreate web sockets over HTTP.

But since this is a low-level web platform feature, don't limit yourself to my ideas. Maybe you can think of a much more exciting use case for request streaming.
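To make the first idea concrete, here's a minimal sketch of the 'warm up the server' pattern using an identity TransformStream (covered further down): start the fetch when the field gains focus, but only write the body once the user submits. The /submit endpoint and the form wiring are assumptions for illustration:

```javascript
// A sketch of "warming up" the server: the request starts immediately,
// but the body is only written later, when the user hits send.
function createDeferredBody() {
  // Identity stream: whatever we write to `writable` becomes the request body.
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  return {
    body: readable.pipeThrough(new TextEncoderStream()),
    async send(text) {
      await writer.write(text);
      await writer.close();
    },
  };
}

// Hypothetical usage (/submit, input, and form are made-up examples):
// input.addEventListener('focus', () => {
//   const { body, send } = createDeferredBody();
//   const responsePromise = fetch('/submit', { method: 'POST', body });
//   form.addEventListener('submit', (event) => {
//     event.preventDefault();
//     send(input.value); // the body goes out now; headers went out on focus
//   });
// }, { once: true });
```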

Try out request streams

Enable support during the origin trial phase

Fetch request streams are available in an origin trial as of Chrome 85. The origin trial is expected to end in Chrome 87.

Origin trials allow you to try new features and give feedback on their usability, practicality, and effectiveness to the web standards community. For more information, see the Origin Trials Guide for Web Developers. To sign up for this or any other origin trial, visit the registration page.

Register for the origin trial

  1. Request a token for your origin.
  2. Add the token to your pages. There are two ways to do it:
    • Add an origin-trial <meta> tag to the head of each page. For example, it might look something like this:
      <meta http-equiv="origin-trial" content="TOKEN_GOES_HERE">
    • If you can configure your server, you can also add the token using an Origin-Trial HTTP header. The resulting response header should look like this:
      Origin-Trial: TOKEN_GOES_HERE

Enabling via chrome://flags

Test request streams in Chrome 85 by flipping an experimental flag:
enable-experimental-web-platform-features.

Demo

This shows how you can stream data from the user to the server, and send back data that can be processed in real time.

Yeah okay, not the most imaginative example, I just wanted to keep it simple, okay?

Anyway, how does this work?

Previously on the exciting adventures of fetch streams

Response streams have been available in all modern browsers for a while. They allow you to access parts of a response as they arrive from the server:

const response = await fetch(url);
const reader = response.body.getReader();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log('Received', value);
}

console.log('Response fully received');

Each value is a Uint8Array of bytes. The number of arrays you get and the size of each depends on the speed of the network. If you're on a fast connection, you'll get fewer, larger 'chunks' of data. If you're on a slow connection, you'll get more, smaller chunks.

If you want to convert the bytes into text, you can use
TextDecoder, or the newer transform stream if your target browsers support it:

const response = await fetch(url);
const reader = response.body
  .pipeThrough(new TextDecoderStream())
  .getReader();

TextDecoderStream is a transform stream that grabs all those Uint8Array
chunks and converts them to strings.

Streams are great because you can start acting on the data as it arrives. For instance, if you're receiving a list of 100 'results', you can display the first result as soon as you receive it, rather than waiting for all 100.
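For instance, acting on results as they arrive could look like this sketch, which splits a streamed text body into newline-delimited records; the newline-delimited response format is an assumption for illustration:

```javascript
// Yields complete lines from a stream of text chunks as soon as they arrive,
// so the first record can be shown before the rest of the response lands.
async function* streamLines(textStream) {
  const reader = textStream.getReader();
  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for the next chunk
    yield* lines;
  }
  if (buffer) yield buffer;
}

// Hypothetical usage with a fetched response:
// const textStream = (await fetch(url)).body.pipeThrough(new TextDecoderStream());
// for await (const line of streamLines(textStream)) showResult(JSON.parse(line));
```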

Anyway, that's response streams; the exciting new thing I want to talk about is request streams.

Streaming request bodies

Requests can have bodies:

await fetch(url, {
method: 'POST',
body: requestBody,
});

Previously, you needed the whole body ready to go before you could start the request, but now in Chrome 85 you can provide your own ReadableStream of data:

function wait(milliseconds) {
  return new Promise((resolve) => setTimeout(resolve, milliseconds));
}

const stream = new ReadableStream({
  async start(controller) {
    await wait(1000);
    controller.enqueue('This ');
    await wait(1000);
    controller.enqueue('is ');
    await wait(1000);
    controller.enqueue('a ');
    await wait(1000);
    controller.enqueue('slow ');
    await wait(1000);
    controller.enqueue('request.');
    controller.close();
  },
}).pipeThrough(new TextEncoderStream());

fetch(url, {
  method: 'POST',
  headers: { 'Content-Type': 'text/plain' },
  body: stream,
});

This will send "This is a slow request." to the server, one word at a time, with a one-second pause between each word.

Each chunk of a request body must be a Uint8Array of bytes, so I'm using
pipeThrough(new TextEncoderStream()) to do the conversion for me.

Writable streams

Sometimes it's easier to work with streams when you have a WritableStream. You can do this using an 'identity' stream, which is a readable/writable pair that takes anything passed to its writable end and sends it to the readable end. You can create one of these by creating a TransformStream without any arguments:

const { readable, writable } = new TransformStream();

const responsePromise = fetch(url, {
  method: 'POST',
  body: readable,
});

Now anything you send to the writable stream will be part of the request. This lets you compose streams together. For example, here's a silly example where data is fetched from one URL, compressed, and sent to another URL:


// Fetch from url1:
const response = await fetch(url1);
const { readable, writable } = new TransformStream();

// Compress the data from url1:
response.body
  .pipeThrough(new CompressionStream('gzip'))
  .pipeTo(writable);

// Post the compressed data to url2:
await fetch(url2, {
  method: 'POST',
  body: readable,
});

The example above uses compression streams to compress arbitrary data using gzip.
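To get a feel for what the compression step produces, here's a small round trip that compresses a string and decompresses it again. It assumes your environment supports CompressionStream/DecompressionStream and async iteration of streams:

```javascript
// Compress text with gzip, then decompress it back to a string.
async function gzipRoundTrip(text) {
  const compressed = new Blob([text])
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  const decompressed = compressed
    .pipeThrough(new DecompressionStream('gzip'))
    .pipeThrough(new TextDecoderStream());
  let out = '';
  for await (const chunk of decompressed) out += chunk;
  return out;
}
```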

Feature detection

If you provide a body object that the browser doesn't specifically handle, it will call toString() on the object and use the result as the body. If the browser doesn't support request streams, that means the request body becomes
"[object ReadableStream]" – probably not what you want to send to the server. To avoid this, use feature detection:

const supportsRequestStreams = !new Request('', {
  body: new ReadableStream(),
  method: 'POST',
}).headers.has('Content-Type');

if (supportsRequestStreams) {
  // …
} else {
  // …
}

This works because the browser adds a Content-Type header of
text/plain;charset=UTF-8 to the request if the body is text. The browser only treats the body as text if it doesn't support request streams; otherwise, it won't add a Content-Type header at all.

Restrictions

Streaming requests are a new power for the web, so they come with a few restrictions:

Restricted redirects

Some forms of HTTP redirect require the browser to resend the body of the request to another URL. To support this, the browser would have to buffer the contents of the stream, which defeats the point, so it doesn't do that.

Instead, if the request has a streaming body and the response is an HTTP redirect other than 303, the fetch will reject and the redirect won't be followed.

303 redirects are allowed as they explicitly change the method to GET and discard the request body.

HTTP/2 only by default

By default, the fetch will be rejected if the connection isn't HTTP/2. If you want to use streaming requests over HTTP/1.1, you need to opt in:

await fetch(url, {
  method: 'POST',
  body: stream,
  allowHTTP1ForStreamingUpload: true,
});

Caution:
allowHTTP1ForStreamingUpload is non-standard and will only be used as part of Chrome's experimental implementation.

According to HTTP/1.1 rules, request and response bodies must either send a
Content-Length header, so the other side knows how much data it will receive, or change the format of the message to use chunked encoding. With chunked encoding, the body is split into parts, each with its own content length.
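As a rough illustration, this is approximately how one chunk is framed with chunked encoding; it's a simplified sketch that ignores chunk extensions and trailers:

```javascript
// Each chunk is prefixed with its byte length in hex, followed by CRLF;
// a zero-length chunk ("0\r\n\r\n") terminates the body.
function encodeChunk(text) {
  const byteLength = new TextEncoder().encode(text).length;
  return byteLength.toString(16) + '\r\n' + text + '\r\n';
}
```

For example, encodeChunk('This ') produces "5\r\nThis \r\n".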

Chunked encoding is quite common when it comes to HTTP/1.1 responses, but very rare when it comes to requests. Because of this, Chrome is a little worried about compatibility, so it's opt-in for now.

This isn't a problem for HTTP/2, as HTTP/2 data is always 'chunked', although it calls the chunks
frames. Chunked encoding wasn't introduced until HTTP/1.1, so requests with streaming bodies will always fail on HTTP/1 connections.

Depending on how the trial goes, the spec will either restrict streaming requests to HTTP/2, or allow them for both HTTP/1.1 and HTTP/2.

No duplex communication

One little-known feature of HTTP (although whether this is standard behavior depends on who you ask) is that you can start receiving the response while you're still sending the request. However, it's so little-known that it isn't well supported by servers, or browsers.

In Chrome's current implementation, you won't get the response until the body has been fully sent. In the following example, responsePromise won't resolve until the readable stream has been closed. Anything the server sends before that point will be buffered.

const responsePromise = fetch(url, {
  method: 'POST',
  body: readableStream,
});

The next best thing to duplex communication is to make one fetch with a streaming request, then make another fetch to receive the streaming response. The server will need some way to associate these two requests, such as an ID in the URL. That's how the demo works.
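A sketch of that two-fetch pattern might look like this; the /upload and /download endpoints, and the id parameter the server uses to pair the two requests, are assumptions:

```javascript
// One fetch streams the request body up; a second fetch receives the
// streamed response. A shared id lets the server associate the two.
function duplexFetch(baseUrl, bodyStream) {
  // A random id purely for illustration; a real app might use crypto APIs.
  const id = Math.random().toString(36).slice(2);
  const uploadPromise = fetch(`${baseUrl}/upload?id=${id}`, {
    method: 'POST',
    body: bodyStream,
  });
  const responsePromise = fetch(`${baseUrl}/download?id=${id}`);
  return { uploadPromise, responsePromise };
}
```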

Potential problems

Yeah, so… this is a new feature, and it isn't widely used on the internet today. Here are some issues to watch out for:

Incompatibility on the server side

Some app servers don't support streaming requests, and instead wait for the full request to be received before letting you see any of it, which defeats the point. Instead, use an app server that supports streaming, such as
NodeJS.

But you're not out of the woods yet! The application server, such as NodeJS, usually sits behind another server, often called a "front-end server," which may in turn sit behind a CDN. If any of those decide to buffer the request before handing it to the next server in the chain, you lose the benefit of request streaming.

Also, if you're using HTTP/1.1, one of the servers may not be ready for chunked encoding and may fail with an error. But hey, at least you can test that, and try to switch servers if needed.

... long sigh ...

Incompatibility beyond your control

If you're using HTTPS, you don't need to worry about proxies between you and the user, but the user may be running a proxy on their machine. Some internet-protection software does this to allow it to monitor everything that goes between the browser and the network.

There may be cases where this software buffers request bodies, or, in the case of HTTP/1.1, doesn't expect chunked encoding and breaks in some interesting way.

At this time, it is unclear how often, if at all, this will happen.

If you want to guard against this, you can create a 'feature test' similar to the demo above, where you try to stream some data without closing the stream. If the server receives the data, it can respond via a different fetch. Once this happens, you know the client supports streaming requests end-to-end.

Comments welcome

Feedback from the community is crucial for the design of new APIs, so please try it out and tell us what you think. If you find any bugs, please report them, but if you have general feedback, please send it to the blink-network-dev Google Group.

Photo by Laura Lefurgey-Smith on Unsplash