Monday, March 14, 2011


HyperNEAT in Neural Networks and Ontologies in NLP - Why Do They Seem Promising?

1. HyperNEAT

Excerpts from the site:

"In short, HyperNEAT is based on a theory of representation that hypothesizes that a good representation for an artificial neural network should be able to describe its pattern of connectivity compactly.

This kind of description is called an encoding. The encoding in HyperNEAT, called compositional pattern producing networks, is designed to represent patterns with regularities such as symmetry, repetition, and repetition with variation.

(...)

The other unique and important facet of HyperNEAT is that it actually sees the geometry of the problem domain. (...) To put it more technically, HyperNEAT computes the connectivity of its neural networks as a function of their geometry.

(...)

NEAT stands for NeuroEvolution of Augmenting Topologies. It is a method for evolving artificial neural networks with an evolutionary algorithm. NEAT implements the idea that it is most effective to start evolution with small, simple networks and allow them to become increasingly complex over generations. That way, just as organisms in nature increased in complexity since the first cell, so do neural networks in NEAT. This process of continual elaboration allows finding highly sophisticated and complex neural networks."

...
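
The complexification idea in the last paragraph is easy to sketch. Here is a toy illustration in Python - my own, not the actual NEAT algorithm (it omits speciation, innovation numbers and fitness evaluation entirely) - showing a genome that starts minimal and grows through NEAT's two structural mutations:

import random

# A toy sketch of NEAT-style complexification. A genome is a set of
# nodes plus weighted connections, and it only ever grows.

def add_connection(genome):
    # NEAT's add-connection mutation: wire up two existing nodes.
    src, dst = random.sample(genome["nodes"], 2)
    genome["connections"].append((src, dst, random.uniform(-1.0, 1.0)))

def add_node(genome):
    # NEAT's add-node mutation: split an existing connection with a
    # new node; the incoming weight is 1.0, the outgoing one keeps
    # the old weight, so behavior is initially preserved.
    i = random.randrange(len(genome["connections"]))
    src, dst, w = genome["connections"].pop(i)
    new = max(genome["nodes"]) + 1
    genome["nodes"].append(new)
    genome["connections"] += [(src, new, 1.0), (new, dst, w)]

# Start minimal - two inputs wired straight to one output -
# and elaborate over "generations".
genome = {"nodes": [0, 1, 2],
          "connections": [(0, 2, 0.5), (1, 2, -0.5)]}
for _ in range(5):
    random.choice([add_connection, add_node])(genome)
print(len(genome["nodes"]), "nodes,", len(genome["connections"]), "connections")

Structure is only ever added, so the networks become increasingly complex over generations, exactly as the quote describes.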


That is:

- Compression / minimum message length
- Repetition as a cue for patterns (symmetry is a form of repetition as well)
- Incremental growth (from small scale to large scale)
- Coordinates (topology in connectivity) - see the sketch below
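
To make the compression, repetition and coordinates points concrete, here is a minimal sketch of the encoding idea. One big assumption, flagged in the comments: the CPPN below is a fixed, hand-written function, whereas in real HyperNEAT it is itself an evolved network.

import math

# A toy CPPN: a function from the coordinates of two substrate
# neurons to the weight of the connection between them. In real
# HyperNEAT this function is itself evolved; here it is hand-written,
# just to show the encoding idea.
def cppn(x1, y1, x2, y2):
    dx, dy = x2 - x1, y2 - y1
    dist2 = dx * dx + dy * dy
    # Gaussian of distance -> symmetric weights; sine of dx ->
    # repetition along the x axis.
    return math.exp(-dist2) * math.sin(3.0 * dx)

# Query it over a small 2D substrate: a 5x5 grid of neurons. The
# whole 625-entry weight matrix is generated from one compact
# function instead of being stored weight by weight - that is the
# compression the quoted passage refers to.
coords = [(x / 2.0, y / 2.0) for x in range(-2, 3) for y in range(-2, 3)]
weights = {(a, b): cppn(*a, *b) for a in coords for b in coords}
print(len(coords), "neurons ->", len(weights), "connection weights")

Symmetry and repetition come out for free here, because they are properties of the generating function rather than of individually stored weights.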

2. Ontologies in NLP/Computational Linguistics

Basically, an ontology here is a semantic network, i.e. a set of relations between concepts. WordNet is a sort of ontology. The issue is that ontologies are often designed by hand. There are statistical methods as well, but they're missing something I've mentioned many times in the series What's Wrong With NLP.
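
To see what such a network looks like, here is WordNet's hypernym (is-a) chain for "dog", queried through NLTK. This assumes NLTK with its WordNet data installed, as noted in the comment:

# Assumes NLTK with the WordNet data downloaded:
#   import nltk; nltk.download('wordnet')
from nltk.corpus import wordnet as wn

# Walk up the hypernym ("is-a") relations from 'dog' to the root
# of the hierarchy - a tiny slice of the hand-built semantic network.
synset = wn.synsets('dog')[0]      # dog.n.01
while synset.hypernyms():
    print(synset.name())
    synset = synset.hypernyms()[0]
print(synset.name())               # the root: entity.n.01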

Why is this useful?

- Because it resembles a real cognitive hierarchy - it is a "skeleton" hierarchy

At the same time, though, such hand-designed ontologies are prone to being too rigid and unable to extend themselves.
