This article is rated Start-class on Knowledge's content assessment scale.
It is of interest to the following WikiProjects:

WikiProject Artificial Intelligence: This article is within the scope of WikiProject Artificial Intelligence, a collaborative effort to improve the coverage of Artificial intelligence on Knowledge. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

What is h_j? Is it the value previously sent to the activation function?

Yep. It's the weighted sum of all the outputs of the neurons going into j. 64.231.108.197 (talk) 03:08, 4 March 2009 (UTC)
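For readers landing here, the exchange above can be sketched in a few lines of Python. This is a minimal illustration; the vectors x and w and the choice of a sigmoid activation are made up for the example, not taken from the article.

```python
import math

x = [0.5, -1.0, 2.0]   # outputs of the neurons going into j
w = [0.1, 0.4, -0.2]   # weights w_ji on those connections

# h_j is the net input of neuron j: the weighted sum of its inputs,
# computed *before* the activation function is applied.
h_j = sum(w_i * x_i for w_i, x_i in zip(w, x))   # h_j = sum_i w_ji * x_i
y_j = 1.0 / (1.0 + math.exp(-h_j))               # activation applied to h_j

print(h_j)  # about -0.75
```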
The indices on ω_{ji} switch between ω_{ij} and back, probably an oversight. I will alter it to what I think it should be, correct me if I'm wrong.

Disputed

This article suggests training perceptrons with a squared error loss function. That's a pretty bad idea, as the gradient of that loss function is no good for classification tasks -- it doesn't approximate the actual zero-one loss very well and is therefore quite slow to converge. A variant of the hinge loss, ℓ(y) = max(0, −t⋅y), works much better in practice.

Unfortunately, I don't have a source for this -- but the fact that squared error loss is inappropriate should be covered in any good textbook that covers linear models. Qwertyus (talk) 13:20, 1 December 2012 (UTC)
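The contrast drawn in the comment above can be made concrete with a small sketch. The numbers, and the choice of a linear unit with score y = w · x and target t in {−1, +1}, are my own illustration, not from the article.

```python
x = [1.0, 2.0]
w = [0.5, -0.5]
t = 1.0
y = sum(wi * xi for wi, xi in zip(w, x))  # raw score: -0.5, i.e. misclassified

# Squared error 0.5*(t - y)**2: its gradient -(t - y)*x keeps pushing
# even on examples already classified correctly with a large margin,
# which is why it tracks the zero-one loss poorly.
grad_squared = [-(t - y) * xi for xi in x]

# Hinge variant max(0, -t*y): the (sub)gradient is -t*x on misclassified
# examples and zero once t*y > 0, so correct points stop moving w.
grad_hinge = [-t * xi for xi in x] if t * y < 0 else [0.0 for _ in x]

print(grad_squared, grad_hinge)  # prints [-1.5, -3.0] [-1.0, -2.0]
```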
As far as I can see, there are two understandings of the "perceptron" term. The first is the original Rosenblatt's perceptron with the step function as an activation function. The second is just a synonym for an artificial neuron or even a network. The first seems to be prevalent now (?), but the second continues to thrive: this article uses it, as does for example Multilayer perceptron. The article Perceptron follows the first definition, except for the notion of the Delta rule there, which makes no sense under it. If I'm right, this mix-up must be resolved explicitly in all affected articles. Sadly, I'm not sure I'm right :). Igogo3000 (talk) —Preceding undated comment added 02:43, 9 January 2014 (UTC)

Well, the Perceptron article already contains the indication that "multilayer perceptron" is a misnomer. But still there is no agreement whether we call, for example, a linear neuron a perceptron. Igogo3000 (talk) 03:28, 9 January 2014 (UTC)

I've never seen linear regression models being called perceptrons. I'm not too familiar with the old NN terminology, but I think even a step neuron trained for an MSE loss function is not a perceptron, but an ADALINE. (Note that the application of the step function is only used for classification, and classification must *always* use such a function, even in logistic regression, SVMs and multilayer nets.) QVVERTYVS (hm?) 10:42, 9 January 2014 (UTC)

These slides from an MIT professor confirm that this is used to train perceptrons. Removed the disputed tag. QVVERTYVS (hm?) 21:23, 18 February 2014 (UTC)

@Qwertyus: that function isn't differentiable, is it? Also, this is the article for the delta rule. Using a different error function would result in a different learning rule, and therefore it wouldn't be the delta rule. 71.50.59.119 (talk) 04:33, 29 January 2015 (UTC)

The point about the hinge loss was really a side remark. My main contention is that the article should point out connections between the delta rule and similar concepts in statistics and optimization: squared error, linear regression. A lot of the early NN folks were reinventing (or independently discovering) things that have different names in different fields. QVVERTYVS (hm?) 12:27, 29 January 2015 (UTC)
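The connection claimed above can be illustrated with a short sketch: for a linear unit, the delta rule update w += eta·(t − y)·x is exactly one stochastic gradient step on the squared error 0.5·(t − y)², i.e. the least-mean-squares update of linear regression. Everything here (data, learning rate, epoch count) is assumed for demonstration only.

```python
import random

random.seed(0)
true_w = [2.0, -1.0]
data = []
for _ in range(100):
    x = [random.gauss(0, 1), random.gauss(0, 1)]
    t = true_w[0] * x[0] + true_w[1] * x[1]   # noiseless linear target
    data.append((x, t))

w = [0.0, 0.0]
eta = 0.05
for _ in range(50):                            # epochs
    for x, t in data:
        y = w[0] * x[0] + w[1] * x[1]          # linear activation
        for i in range(2):
            w[i] += eta * (t - y) * x[i]       # the delta rule step

# w converges to true_w, the ordinary least-squares solution
print([round(wi, 3) for wi in w])
```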
External links modified

Hello fellow Wikipedians,

I have just modified one external link on Delta rule. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

Added archive https://web.archive.org/web/20160304032228/http://uhavax.hartford.edu/compsci/neural-networks-delta-rule.html to http://uhavax.hartford.edu/compsci/neural-networks-delta-rule.html

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.

If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 15:03, 10 December 2016 (UTC)

Category: WikiProject Artificial Intelligence articles