currentscurrents t1_j8zq4tn wrote

>I wouldn’t say it’s common to design networks with information flow in mind

I disagree. The entire point of the attention mechanism in transformers is to have a second neural network control the flow of information: learned attention weights decide how much each position contributes to every other position.
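As a rough sketch of what I mean (single-head, no batching or masking; all the names and sizes here are just illustrative):

```python
import torch
import torch.nn.functional as F

def attention(x, w_q, w_k, w_v):
    # The "second network": learned projections whose job is to decide,
    # for every pair of positions, how much information flows between them.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
    weights = F.softmax(scores, dim=-1)  # each row sums to 1: a routing table
    return weights @ v                   # information flows along those weights

# toy usage: 4 tokens, width 8
d = 8
x = torch.randn(4, d)
w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
out = attention(x, w_q, w_k, w_v)  # shape (4, 8)
```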

Similarly, the autoencoder structure is ubiquitous these days, and it's built around forcing information through a bottleneck. Some information must be thrown away, so the network learns which parts of the data are most important to keep, which amounts to learning the structure of the data.
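A bare-bones version of that bottleneck (the layer sizes are arbitrary, just small enough to force the squeeze):

```python
import torch.nn as nn

# 784-dim input forced through a 32-dim bottleneck: most of the input
# can't survive the squeeze, so the network has to learn what to keep.
autoencoder = nn.Sequential(
    nn.Linear(784, 128), nn.ReLU(),
    nn.Linear(128, 32),            # the bottleneck
    nn.ReLU(),
    nn.Linear(32, 128), nn.ReLU(),
    nn.Linear(128, 784),           # reconstruction
)
# trained by minimizing something like nn.MSELoss()(autoencoder(x), x)
```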

I'd say many of the recent great ideas in the field have come from manipulating information flow in interesting ways.

11

Phoneaccount25732 t1_j90eyfv wrote

This is my preferred interpretation of ResNets too.
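Under that reading, a residual block is just an unconditional bypass for information, something like this toy version (fully-connected here for brevity; the actual paper uses conv layers):

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x):
        # The identity path lets information flow through untouched;
        # the body only has to learn a correction on top of it.
        return x + self.body(x)
```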

5

currentscurrents t1_j90hs2i wrote

Yeah, the skip connections give higher layers direct access to information from lower layers. Same goes for U-Nets; a U-Net is basically an autoencoder with skip connections.
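To make that concrete, here's a toy one-level "U-Net" (real ones stack several of these levels; every size here is made up):

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """An autoencoder whose decoder also sees the encoder's input
    directly, via a concatenating skip connection."""
    def __init__(self):
        super().__init__()
        self.down = nn.Conv2d(1, 8, 3, stride=2, padding=1)  # encode, halve resolution
        self.up = nn.ConvTranspose2d(8, 8, 2, stride=2)      # decode, restore resolution
        self.mix = nn.Conv2d(8 + 1, 1, 3, padding=1)         # 8 decoded + 1 skipped channel

    def forward(self, x):
        h = torch.relu(self.down(x))
        u = torch.relu(self.up(h))
        # skip connection: lower-layer information bypasses the bottleneck
        return self.mix(torch.cat([u, x], dim=1))

y = TinyUNet()(torch.randn(1, 1, 32, 32))  # shape (1, 1, 32, 32)
```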

6