BRN (ASX) - BrainChip Holdings Ltd

Ultra-low power neuromorphic execution of convolutional neural networks

    Scepticism is healthy, but agendas that dress themselves up as scepticism to achieve an end goal dishonestly are not. The full quote from Geoffrey Hinton's 2017 remarks makes clear that this person is not being intellectually honest and has cherry-picked from those comments, as anyone with even the shallowest understanding of Akida technology will attest. I have emboldened and underlined his concluding remarks, which give the lie to this person's supposedly honest scepticism:

    In 1986, Geoffrey Hinton co-authored a paper that, three decades later, is central to the explosion of artificial intelligence. But Hinton says his breakthrough method should be dispensed with, and a new path to AI found.

    Speaking with Axios on the sidelines of an AI conference in Toronto on Wednesday, Hinton, a professor emeritus at the University of Toronto and a Google researcher, said he is now "deeply suspicious" of back-propagation, the workhorse method that underlies most of the advances we are seeing in the AI field today, including the capacity to sort through photos and talk to Siri. "My view is throw it all away and start again," he said.

    The bottom line: Other scientists at the conference said back-propagation still has a core role in AI's future. But Hinton said that, to push materially ahead, entirely new methods will probably have to be invented. "Max Planck said, 'Science progresses one funeral at a time.' The future depends on some graduate student who is deeply suspicious of everything I have said."

    How it works: In back propagation, labels or "weights" are used to represent a photo or voice within a brain-like neural layer. The weights are then adjusted and readjusted, layer by layer, until the network can perform an intelligent function with the fewest possible errors.

    But Hinton suggested that, to get to where neural networks are able to become intelligent on their own, what is known as "unsupervised learning," "I suspect that means getting rid of back-propagation."

    "I don't think it's how the brain works," he said. "We clearly don't need all the labeled data."

 