(by George W. Smith, 1991) this was a Natural Language Processing book. mostly dealing with text, but it also had some interesting bits about speech (both recognition and synthesis). i started out reading the book very closely, but ended up skimming about halfway through. i've attempted to grok NLP a couple of times now, and it is just so dense that i get bogged down, can't take it anymore, and have to escape. with other AI specialties, i have to up my math skill set to understand what is going on. with NLP, i have to work on grammar ... yuck! if you read this blog then you should know that my grammar is about at a 5th grade level. sometimes my sentences are even yoda-like due to periodic bouts of dyslexia. i don't even run the grammar check in MS Word, because it depresses me. so at this point i'm just training my own brain as i would a neural net: each time i show it the material, it picks up a little more than it did before. eventually it should reach a critical mass so that i might actually attempt to tackle this problem. regardless of my own shortcomings, this was one of the better NLP books that i have looked at.
having just read the PalmPilot guy's AI book, in which he talks about beating the handwriting recognition problem by reducing the input set with 'graffiti' style strokes, i wonder: why don't we do the same sort of thing for language? most of NLP's problems come from all the special cases that can occur. we could define a reduced vocabulary and grammar for computers to understand, somewhere in between command-and-control and dictation. it would be english - ebonics - homonyms - slang - etc... = 'speaketti'
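just to make the idea concrete, here is a toy sketch of what a 'speaketti' style reduced language might look like: a tiny fixed vocabulary and one rigid sentence pattern, so the parser never has to handle english's special cases. the grammar and word lists here are entirely made up for illustration, not from the book.

```python
# toy 'speaketti' sketch: a reduced vocabulary plus a rigid
# VERB [DETERMINER] NOUN grammar. anything outside that shape
# is rejected, the way graffiti rejected free-form handwriting.
# all words and the grammar itself are invented for this example.

VERBS = {"open", "close", "find", "delete"}
DETERMINERS = {"the", "a"}
NOUNS = {"file", "folder", "message", "window"}

def parse(utterance):
    """accept only VERB [DET] NOUN; return a (verb, noun) action or None."""
    words = utterance.lower().split()
    if not words or words[0] not in VERBS:
        return None
    rest = words[1:]
    if rest and rest[0] in DETERMINERS:
        rest = rest[1:]  # determiner is optional
    if len(rest) == 1 and rest[0] in NOUNS:
        return (words[0], rest[0])
    return None  # outside the reduced grammar

print(parse("open the file"))            # ('open', 'file')
print(parse("could you maybe open it"))  # None
```

the point being: a full dictation system has to disambiguate homonyms, slang, and free word order, while this accepts a handful of shapes and punts on everything else, just like command-and-control, but with a vocabulary you could actually grow.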