GreatMindsWorking.com is a site dedicated to news from fields including A.I., computational linguistics, robotics, developmental psychology, machine learning, and cognitive science, with special focus on language-related technologies.

This site also provides information about Experience-Based Language Acquisition (EBLA), the software system that I developed as part of my dissertation research at the LSU Department of Computer Science.

Brian E. Pangburn
May 27, 2003

Words, words, words...

Flu sent in the following links...

The first, Lexical FreeNet, is sort of a thesaurus meets Six Degrees of Kevin Bacon. The second is WordNet, a lexical database for the English language.
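To get a rough feel for what a lexical database offers, here is a toy sketch of a WordNet-style structure: words grouped into synonym sets ("synsets") linked by "is-a" (hypernym) relations. The entries and identifiers below are illustrative only, not pulled from WordNet itself.

```python
# Toy sketch of a WordNet-style lexical database: words grouped into
# synonym sets (synsets), with hypernym ("is-a") links between synsets.
# Entries are illustrative, not taken from WordNet itself.

synsets = {
    "animal.n.01": {"lemmas": ["animal", "creature"], "hypernym": None},
    "canine.n.01": {"lemmas": ["canine"], "hypernym": "animal.n.01"},
    "dog.n.01": {"lemmas": ["dog", "domestic_dog"], "hypernym": "canine.n.01"},
}

def hypernym_chain(synset_id):
    """Walk the 'is-a' links up toward the root, as WordNet browsers do."""
    chain = []
    while synset_id is not None:
        chain.append(synset_id)
        synset_id = synsets[synset_id]["hypernym"]
    return chain

print(hypernym_chain("dog.n.01"))
# -> ['dog.n.01', 'canine.n.01', 'animal.n.01']
```

Lexical FreeNet layers many more relation types (rhymes, anagrams, "triggers") on top of this same graph-walking idea.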

Tools for NLP

Here is a page that I came across that contains many software links for natural language processing (NLP). It is maintained by Kenji Kita, a professor in the Department of Information Science and Intelligent Systems at Tokushima University in Japan.

Our World Community...

I found this link particularly touching. It is nice to see such a display of empathy from the international community.


Neural Net that Mimics Presence of Nitric Oxide

Short BBC article introducing some new twists added to neural nets by researchers at the University of Sussex, UK.
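The article is brief, so here is a toy illustration (my own sketch, not the Sussex researchers' actual model) of the general idea: a neuron's response can be modulated by the local concentration of a diffusing gas, the way nitric oxide is thought to modulate nearby neurons in real brains.

```python
import math

# Toy illustration (not the Sussex model itself): a sigmoid unit whose
# gain is scaled by the local concentration of a diffusing "gas".

def gas_concentration(distance, strength=1.0, decay=0.5):
    """Gas concentration falls off with distance from the emitting neuron."""
    return strength * math.exp(-decay * distance)

def modulated_output(net_input, gas):
    """Gas raises the gain (slope) of the activation function."""
    gain = 1.0 + gas
    return 1.0 / (1.0 + math.exp(-gain * net_input))

near = modulated_output(0.5, gas_concentration(0.0))  # strong modulation
far = modulated_output(0.5, gas_concentration(5.0))   # weak modulation
print(near > far)  # same input, different response depending on position
```

The interesting twist is that modulation depends on spatial layout, not just wiring, so two identically connected neurons can behave differently.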

Some Background for Experience Based Language Acquisition

I just wanted to give a little background about my own research and the EBLA project. This link to the downloads section allows you to view/download the abstract and introduction to my dissertation proposal titled "Development of a Computational Model for Human Language Acquisition".

Over time I will start adding some of the actual Java code for the model.

AI Company Making Great Progress?

Artificial Intelligence NV (Ai), a company based in Israel, has been in the news quite a bit lately with stories of HAL, Windows-based software that is acquiring language from scratch based on statistical models and positive/negative feedback. They claim that HAL now has the language ability of an 18-month-old baby! You can read stories about HAL here and here.

While I think these guys are on the right track, from what I can tell, HAL's language is not grounded in anything. HAL might be able to correctly use "apple" in a sentence, but has no perceptual knowledge of what an apple is.
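Ai has not published the details of HAL's algorithm, but the general recipe they describe (statistics over raw text plus trainer feedback) can be sketched in a few lines. Everything below is my own illustrative guess at the flavor of the approach, not their implementation.

```python
from collections import defaultdict

# Generic sketch of learning word transitions from raw text plus
# positive/negative feedback. Ai's actual HAL algorithm is unpublished;
# this only illustrates the general statistics-plus-feedback idea.

weights = defaultdict(lambda: defaultdict(float))

def observe(sentence):
    """Count word-to-word transitions in a training sentence."""
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        weights[prev][nxt] += 1.0

def feedback(prev, nxt, reward):
    """Trainer approval (+) or disapproval (-) adjusts a transition."""
    weights[prev][nxt] = max(0.0, weights[prev][nxt] + reward)

def next_word(prev):
    """Pick the most strongly weighted continuation, if any."""
    options = weights.get(prev)
    if not options:
        return None
    return max(options, key=options.get)

observe("the dog chased the ball")
observe("the dog ate the ball")
feedback("dog", "ate", -2.0)   # trainer disapproves of this continuation
print(next_word("dog"))        # -> 'chased'
```

Note that nothing here connects "ball" to an actual ball, which is exactly the grounding problem mentioned above.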

Introduction to Neural Networks

This link to IBM DeveloperWorks has a very good introduction to neural networks. At the bottom of the page, there is a nice list of links to additional resources.
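For readers who want something hands-on alongside the article, here is a minimal example of the kind of network most introductions start with: a single perceptron trained on the logical AND function (my own sketch, not code from the IBM article).

```python
# A single perceptron learning logical AND via the perceptron rule:
# nudge each weight toward reducing the error on each example.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

# Inputs and targets for AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):  # a few passes over the data suffice
    for (x1, x2), target in data:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])
# -> [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions like AND; getting past that limitation (e.g., XOR) is what multi-layer networks and backpropagation are for.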

Error Fixed

If you were one of the few people visiting this new site in the past 12 or so hours and got a "1030: Got error -1 from table handler" notice after the introduction, the problem has been fixed. The server hosting the database backend for this site maxed out its disk quota. We have deleted some unnecessary files and everything is fine now.

Neural Theory of Language (NTL) Research Group

The Neural Theory of Language (NTL) research group brings together computer scientists, linguists, cognitive scientists, and psychologists at U.C. Berkeley to study the acquisition and use of language and how such processes might be modeled with neural computing.

Deb Roy

Deb Roy is an Assistant Professor of Media Arts and Sciences and Director of the Cognitive Machines Group at the MIT Media Laboratory. He has done significant work on modeling grounded human language learning. His projects include the development of Toco, a robot which can learn shape names and colors.
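The core trick behind this kind of grounded word learning can be sketched very simply: track which perceptual features co-occur with which word labels across many scenes. This is purely my own illustration of the flavor of the task, not Roy's actual model.

```python
from collections import defaultdict

# Illustrative sketch (not Roy's model): cross-situational learning that
# associates heard word labels with co-occurring perceptual features.

cooccur = defaultdict(lambda: defaultdict(int))

def observe(words, features):
    """One scene: the words heard and the features perceived together."""
    for w in words:
        for f in features:
            cooccur[w][f] += 1

observe(["red", "ball"], ["color:red", "shape:sphere"])
observe(["red", "block"], ["color:red", "shape:cube"])
observe(["blue", "ball"], ["color:blue", "shape:sphere"])

def best_feature(word):
    """The feature most consistently paired with the word."""
    return max(cooccur[word], key=cooccur[word].get)

print(best_feature("red"))   # -> 'color:red'
print(best_feature("ball"))  # -> 'shape:sphere'
```

Ambiguity in any single scene ("red" could mean the color or the shape) washes out across scenes, which is why exposure to varied contexts matters so much, for robots and toddlers alike.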
