Topic closed
How does a neural network perform a non-linear function on its inputs? The way I see it: say your neural network looks like this (A, B, C, D, E are the nodes; A and B are inputs, E is the output):
A B
|X|
C D
 V
 E
And the weights are ac, ad, bc, bd, ce, de (e.g. ac means the weight from A to C).
C would equal ac*A+bc*B
D would equal ad*A+bd*B
Right?
Then E would equal ce*C+de*D
Substituting the results above for C and D, we get
ce*(ac*A+bc*B)+de*(ad*A+bd*B).
Simplifying:
ce*ac*A+ce*bc*B+de*ad*A+de*bd*B
Which also equals (undoing the distributive property):
A*(ce*ac+de*ad)+B*(ce*bc+de*bd)
Since the weights don't change, we can think of them as all constants:
A*C1+B*C2
This means that E is a linear function of A and B.
Where is this wrong? I know neural nets are more interesting than this...
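To make this concrete, here's a tiny Python sketch of the same network. The weight values are made-up numbers just for illustration; the point is that the collapsed constants C1 and C2 give exactly the same output as the full two-layer calculation.

# Sketch of the network above with no activation function.
# Weight values are made up purely for illustration.
ac, ad, bc, bd, ce, de = 0.5, -1.2, 0.3, 0.8, 2.0, -0.7

def network(A, B):
    C = ac*A + bc*B
    D = ad*A + bd*B
    return ce*C + de*D

# Collapse the two layers into the constants C1 and C2 from above.
C1 = ce*ac + de*ad
C2 = ce*bc + de*bd

def collapsed(A, B):
    return A*C1 + B*C2

print(network(3.0, 4.0))    # prints the same value...
print(collapsed(3.0, 4.0))  # ...as this, so E is a linear function of A and B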
Bump... Anyone?
Bump.
Bump... Anyone?
You could ask ohaiderstudios; he's programming a neural net in Python. He might be able to help you.
How would I ask him?
Oh hey, cool! I was just searching 'neural net,' and found this! I'm flattered that you should recommend me, Magnie.
Unfortunately, while I can follow the logic your question is using, I don't see anything wrong with it. The only thing I can think of is that although you said,
Molybdenum wrote:
Since the weights don't change, we can think of them as all constants:
I think that in neural nets the weights do change (feel free to correct me if I'm wrong). But other than that, I'm not smart enough to help you either.
Last edited by ohaiderstudios (2012-06-29 15:01:11)
ohaiderstudios wrote:
Oh hey, cool! I was just searching 'neural net,' and found this! I'm flattered that you should recommend me, Magnie.
Unfortunately, while I can follow the logic your question is using, I don't see anything wrong with it. The only thing I can think of is that although you said,
Molybdenum wrote:
Since the weights don't change, we can think of them as all constants:
I think that in neural nets the weights do change (feel free to correct me if I'm wrong). But other than that, I'm not smart enough to help you either.
Well, I thought that each neural net had a certain weight for each link, and that it held them constant while "processing" the data, but that when learning, it changed the weights based on the data/relationships it had learned. Something like the sketch below is what I mean.
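Here's a rough Python sketch of that idea. The single-neuron delta-rule update and the learning rate value are just illustrative assumptions, not how any particular net has to do it:

# Sketch: weights stay fixed while "processing", and only change
# during a learning step (delta rule on a single linear neuron).
weights = [0.1, -0.3]   # made-up starting weights for the two inputs
learning_rate = 0.05    # assumed value, just for illustration

def process(inputs):
    # forward pass: the weights are treated as constants here
    return sum(w * x for w, x in zip(weights, inputs))

def learn(inputs, target):
    # learning step: the weights change based on the error
    error = target - process(inputs)
    for i, x in enumerate(inputs):
        weights[i] += learning_rate * error * x

print(process([1.0, 2.0]))  # uses the current weights
learn([1.0, 2.0], 1.5)      # only here do the weights get adjusted
print(process([1.0, 2.0]))  # output has moved toward the target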
{Grr... 180 second rule. Maybe a neural net can learn how to get used to it }
Molybdenum wrote:
ohaiderstudios wrote:
Oh hey, cool! I was just searching 'neural net,' and found this! I'm flattered that you should recommend me, Magnie.
Unfortunately, while I can follow the logic your question is using, I don't see anything wrong with it. The only thing I can think of is that although you said,
Molybdenum wrote:
Since the weights don't change, we can think of them as all constants:
I think that in neural nets the weights do change (feel free to correct me if I'm wrong). But other than that, I'm not smart enough to help you either.
Well, I thought that each neural net had a certain weight for each link, and that it held them constant while "processing" the data, but that when learning, it changed the weights based on the data/relationships it had learned.
{Grr... 180 second rule. Maybe a neural net can learn how to get used to it }
Yeah, that would be correct; I guess I just misunderstood what you meant.