
carbocation t1_iyb3f36 wrote

While convolution is a bit funky with tabular data (what locality are you exploiting?), I think that attention is a mechanism that might make sense in the deep learning context for tabular data. For example, take a look at recent work such as https://openreview.net/forum?id=i_Q1yrOegLY (code and PDF linked from there).
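To make the idea concrete, here's a rough sketch (not that paper's exact model, just the general pattern of treating each column as a token and letting columns attend to each other), assuming PyTorch; the class name and sizes are purely illustrative:

```python
import torch
import torch.nn as nn

class TabularAttention(nn.Module):
    """Embed each numeric column as a token, then apply self-attention across columns."""
    def __init__(self, n_features: int, d_model: int = 32, n_heads: int = 4):
        super().__init__()
        # One learned (weight, bias) embedding per feature/column.
        self.weight = nn.Parameter(torch.randn(n_features, d_model))
        self.bias = nn.Parameter(torch.zeros(n_features, d_model))
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(n_features * d_model, 1)

    def forward(self, x):  # x: (batch, n_features)
        tokens = x.unsqueeze(-1) * self.weight + self.bias   # (batch, n_features, d_model)
        out, _ = self.attn(tokens, tokens, tokens)           # columns attend to columns
        return self.head(out.flatten(1))                     # (batch, 1)

model = TabularAttention(n_features=10)
print(model(torch.randn(8, 10)).shape)  # torch.Size([8, 1])
```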

6

eternalmathstudent OP t1_iyb3ny8 wrote

I didn't want to use ResNet as-is; I don't need the convolutional layers themselves. I'm looking for general-purpose residual blocks with skip connections.
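Concretely, something along these lines (just a rough PyTorch sketch of a fully-connected residual block, no convolutions; the layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

class ResidualMLPBlock(nn.Module):
    """A fully-connected residual block: y = x + f(x), no convolutions."""
    def __init__(self, dim: int, hidden: int = 128, dropout: float = 0.1):
        super().__init__()
        self.f = nn.Sequential(
            nn.LayerNorm(dim),
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return x + self.f(x)  # skip connection around the MLP

# Stack a few blocks on top of an input projection for tabular features.
model = nn.Sequential(
    nn.Linear(20, 64),                        # 20 input features -> 64-dim representation
    *[ResidualMLPBlock(64) for _ in range(3)],
    nn.Linear(64, 1),                         # e.g. a regression head
)
print(model(torch.randn(8, 20)).shape)  # torch.Size([8, 1])
```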

2

carbocation t1_iybb5a8 wrote

Yes, which is why I think you'll find that link particularly interesting: they comment on exactly that (and on attention).

3

rjog74 t1_iybbf02 wrote

Any particular reason for ResNet specifically, if what you're after is general-purpose residual blocks?

3