XecutionStyle t1_j78jsua wrote

Error drives learning:

If Error ∝ (Target - Output)

Then you start your network with random weights (so the Output is random and the error is large). When you pass the error back through the network, the weights are adjusted in proportion to the error. Over time, the weights settle where (Target - Output) is as low as possible.
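
To make "adjusted in proportion to the error" concrete: for a single weight w with Output = w · Input and squared error E = (Target - Output)^2, the standard gradient-descent update is

w ← w + η · (Target - Output) · Input

(η is just a small step size you choose). A large error moves the weight a lot; a zero error leaves it alone.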

This concept holds in any setting: if you're working with image data, no matter what architecture produces the Output, you still compare it with the Target (or 'label', the length of the Pagrus fish in your case) and pass the Error back through the network to improve it iteratively.

Try building the simplest neuron: 1 input -> 1 output, and use backpropagation to train it until convergence.
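
Here's a minimal sketch of that neuron in Python (the toy data, learning rate, and epoch count are all illustrative choices, not prescriptions):

```python
# Simplest neuron: 1 input -> 1 output, one weight, trained with
# gradient descent on squared error (Target - Output)^2.

import random

w = random.uniform(-1.0, 1.0)  # start with a random weight
lr = 0.01                      # learning rate (step size)

# Toy dataset: the neuron should learn w close to 2
data = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0, 2.5]]

for epoch in range(200):
    total_error = 0.0
    for x, target in data:
        output = w * x            # forward pass
        error = target - output   # Error is proportional to (Target - Output)
        w += lr * error * x       # backprop step: dE/dw = -(error) * x
        total_error += error ** 2
    if total_error < 1e-8:        # converged
        break

print(f"learned w = {w:.4f} (target 2.0)")
```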

For your assignment you could use a CNN to produce the Output (though a simple feed-forward network would work too: you're outputting a single value for total length, so it's really a regression task). A CNN's weights are shared internally (the window you shift across the image), but they're trained the same way: compute the Output, compare it with the actual length of the Pagrus fish whose image you passed in, get the Error, and the method above for improving the network applies.
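
If you go the CNN route, a sketch along these lines shows how the single regression output and the loss line up with the Error idea above. This uses Keras; the input size and layer widths are assumptions for illustration, not your assignment's required architecture:

```python
# Minimal sketch: CNN regressing one value (fish length) from an image.
# 128x128 grayscale input and layer sizes are assumptions.

import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(128, 128, 1)),
    layers.Conv2D(16, 3, activation="relu"),  # shared weights: the window shifted across the image
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),  # single output: predicted total length
])

# MSE is (Target - Output)^2 averaged over the batch, so minimizing it
# drives (Target - Output) toward zero, exactly as described above.
model.compile(optimizer="adam", loss="mse")

# model.fit(images, lengths, epochs=50)  # images: (N,128,128,1), lengths: (N,)
```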
