Volume 5: No. 13

Neural networks

Honeywell has been awarded Patent 5,373,452 for using neural networks to make an "intangible sensor," able to identify discrete values of an intangible property of a substance. [Greg Aharonian, PATNEWS, misc.int-property, 1/27/95.] (I'd guess that it "rotates the sensor space" to infer quantities that can't be measured directly, such as human "intelligence factors.")

A $4 neural-network chip called the RSC-164 is available from start-up company Sensory Circuits Inc., 1735 N. First St., San Jose, CA 95112-4511; (408) 452-1000, (408) 452-1025 FAX. [, comp.speech, 1/27/95.] (I don't know if they're selling to the public.)

"Most neural nets used for data analysis _are_ statistical models. The exceptions work only with noise-free data, such as ART clustering algorithms that do not yield statistically consistent estimates with noisy data. Basic Kohonen networks (not LVQ or SOM) are very similar to k-means clustering. If the activation of the output nodes is determined by Euclidean distance rather than the more usual scalar product, then a Kohonen network is really an alternative algorithm for k-means clustering." [Warren S. Sarle , comp.ai.neural-nets, 1/25/95.]
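Sarle's point about Euclidean-distance Kohonen networks being an alternative algorithm for k-means can be sketched in a few lines. The following is a minimal illustration, not any particular package's implementation: an online winner-take-all vector quantizer with Euclidean winner selection and no neighborhood function, so each update nudges only the winning codebook vector toward the sample. The function names and parameters here are my own, chosen for the example.

```python
import random

def kohonen_vq(data, k, epochs=50, lr0=0.5, seed=0):
    """Online Kohonen-style vector quantization: Euclidean winner
    selection, no neighborhood.  Under these settings the codebook
    vectors drift toward cluster means, approximating k-means."""
    rng = random.Random(seed)
    # Initialize the codebook from k randomly chosen data points.
    codebook = [list(v) for v in rng.sample(data, k)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)  # decaying learning rate
        for x in data:
            # Winner = codebook vector nearest in Euclidean distance.
            w = min(range(k),
                    key=lambda j: sum((c - a) ** 2
                                      for c, a in zip(codebook[j], x)))
            # Move only the winner a fraction lr toward the sample.
            codebook[w] = [c + lr * (a - c)
                           for c, a in zip(codebook[w], x)]
    return codebook

def assign(data, codebook):
    """Label each point with the index of its nearest codebook vector."""
    return [min(range(len(codebook)),
                key=lambda j: sum((c - a) ** 2
                                  for c, a in zip(codebook[j], x)))
            for x in data]
```

On well-separated data, e.g. two clumps around (0, 0) and (10, 10), `assign(data, kohonen_vq(data, 2))` partitions the points the same way k-means would; with a scalar-product winner rule instead, the equivalence breaks down, which is Sarle's caveat.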

"A simulation study comparing various Kohonen networks (in Neuralworks Professional II) with a k-means clustering program (FASTCLUS in the SAS/STAT product) was done by P.V. Balakrishnan et al., "A study of the classification capabilities of neural networks using unsupervised learning," Psychometrika, 59, 509-525 (1994). I recommend that neural net researchers read this article as an example of how to do simulations. The clustering program produced fewer classification errors than any of the Kohonen nets for all combinations of the number of clusters, the number of inputs, and the noise level. The error rate for the clustering program was sensitive to the amount of noise, as one would expect, but only slightly sensitive to the number of inputs (the more, the better) and not sensitive to the number of clusters. The error rates for the Kohonen nets were most sensitive to the number of clusters, with considerably less sensitivity to the number of inputs or the noise level (very strange!). Overall, the Kohonen nets made almost 10 times as many errors as the clustering program." [Ibid.] (I've edited Sarle's text to fit TCC's limited space.)

Neuron Digest is back after a 3-month lapse. Send a "subscribe" or "subscribe desired@net.address" subject line or message to . Submissions will now be redistributed in roughly the order received, rather than grouped by type. Volume numbers will now match the year number, starting with 95. Back issues can be retrieved via a "help" subject line sent to , or send an "archive" subject and "get new_letter" message to for a full description of the service. [Peter Marvit , 4/2/95.]