Comments


ok531441 t1_iqrfsup wrote

Did you just make something up and ask us to explain it to you?

8

Laafheid t1_iqr68yz wrote

Why would you want to do this?

2

dasayan05 t1_iqrlbf7 wrote

Where did you get this arch from? Any reference? It's not clear what this is supposed to mean.

2

McMa t1_iqrd23r wrote

Regardless of your end goals and how much sense this makes or not, this is not too difficult to implement:

Let's create a "virtual hidden layer" between the input and hidden layers. The weights (and biases) between the input and the "virtual hidden layer" are normal weights, just like the ones in your network. The weights between the virtual hidden layer and the original hidden layer are frozen at the value 1, with biases at 0. And voilà, now you are passing the formerly unknown values from the hidden unit nodes to each other.
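A minimal sketch of that wiring in NumPy (all names and sizes here are mine, just for illustration): the diagonal of the second weight matrix is frozen at 1 so each node forwards its own value, and the off-diagonal entries play the role of the trainable node-to-node connections.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3

# Input -> "virtual hidden layer": ordinary trainable weights.
W1 = rng.normal(size=(n_hidden, n_in))
b1 = np.zeros(n_hidden)

# Virtual hidden -> original hidden: diagonal frozen to 1
# (each node forwards its own value); the off-diagonal entries
# are the trainable "lateral" connections between hidden nodes.
L = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
np.fill_diagonal(L, 1.0)   # frozen identity part
b2 = np.zeros(n_hidden)    # bias frozen at 0

def forward(x):
    v = np.tanh(W1 @ x + b1)   # virtual hidden layer
    h = np.tanh(L @ v + b2)    # hidden layer with lateral mixing
    return h

x = rng.normal(size=n_in)
print(forward(x).shape)  # (3,)
```

During training you would mask the gradient on the diagonal of `L` (and on `b2`) so those entries stay frozen; in a framework like PyTorch the same effect is usually achieved with a gradient hook or by re-writing the frozen entries after each optimizer step.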

I’m not sure what this would be good for, but let us know if you find something interesting.

1

abystoma t1_iqrtynd wrote

Thanks for your response. What exactly do you mean by "virtual hidden layer"? Is it a layer used just to initialize the input into the hidden layer and then never used again, something like that?

1

McMa t1_iqtcdfp wrote

Just a normal hidden layer. The connections between nodes of the same layer in your schema can be thought of as two layers, where each node forwards its exact value to the corresponding node of the next layer (weight frozen to 1) and the rest of the weights get trained.

1

harharveryfunny t1_iqrodvi wrote

What you have drawn there is a recurrent model with a feedback loop. The only way that can work is if what you feed back into a layer/node comes from the *previous* input/time-step; otherwise you've got an infinite loop.

You can certainly build something like that if you want (the 2nd "layer" becomes an LSTM or RNN), but this is total overkill if all you are trying to build is a binary classifier. Depending on your input, a single fully-connected layer may work; otherwise add more layers.
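To make the time-step point concrete, here is a bare-bones recurrent cell in NumPy (names and sizes are mine, not from the thread): the hidden-to-hidden feedback always consumes the hidden state from the *previous* step, which is exactly what breaks the would-be infinite loop.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden = 4, 3

W_x = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input -> hidden
W_h = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # hidden -> hidden (feedback)
b = np.zeros(n_hidden)
w_out = rng.normal(size=n_hidden)                       # hidden -> binary logit

def step(x, h_prev):
    # Feedback uses h from the previous time step only.
    return np.tanh(W_x @ x + W_h @ h_prev + b)

def classify(sequence):
    h = np.zeros(n_hidden)      # initial hidden state
    for x in sequence:
        h = step(x, h)
    logit = w_out @ h
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> probability

seq = rng.normal(size=(5, n_in))
p = classify(seq)
```

If the inputs are not actually sequential, none of this machinery is needed and a plain feed-forward classifier is the simpler choice, as the comment says.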

1