I have recently started reading about animals and language. Animal use of language is really interesting to me. However, I do not regard it as an area where significant breakthroughs in social science or natural science are likely, unless of course we eventually encounter an alien civilization, intelligent life from some other planet.
In the context of the generative grammar hypothesis, one interesting finding is that animals trained in limited human-language communication do not use it to ask questions. This is interesting because it parallels Aristotle, who regarded animals as having only vegetative functions. For Aristotle, the natural slave can “apprehend, but not form questions”. So the fact that animals rarely or never appear to pose questions to humans may be a warrant for generative grammar. Likewise, it appears that animals trained to use human language do not do so using grammatical rules. That too is a warrant for Chomsky’s argument. However, those warrants are still far from any proof of the generative grammar hypothesis, and they do not contradict the various observed grammatical divergences which tend to disprove generative grammar.
Chomsky, recall, argues that language is like a self-similar self-extracting archive, a recursive, i.e. self-referential, structure, and that this structure is universal. I do not believe I have misread or misunderstood Chomsky’s thesis. He argues that all human grammars can be modelled using one function; that function, if it existed, could be better modelled recursively than iteratively. However, I have pointed out many instances of divergences in human grammar which could not arise were all grammatical structures self-similar. Chomsky’s generative grammar would be a meta-grammar which would have to account for both prepositions and postpositions, for languages which do not use tenses and those which do, and for languages which do or do not use cases, and if so, which cases. Well, it is perhaps mathematically possible to construct such a meta-grammar. However, it is also possible to model the universe using a geocentric model; the math is just needlessly complicated. Occam’s razor and similar heuristics indicate that a simpler theory which accounts for all the observed data is better than a more complex one. The theory that all language grew out of a Nostratic model but then evolved in radically different directions is simpler, has at least as much explanatory value, and actually corresponds to the observed linguistic facts.
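To make the recursion claim concrete, here is a minimal sketch of what a recursive, self-referential grammar rule looks like: a noun phrase that can embed a relative clause which itself contains a noun phrase. The function name and the toy vocabulary are my own invention for illustration, not anything drawn from Chomsky’s formalism.

```python
def noun_phrase(depth):
    """Generate a toy noun phrase; recursion models embedded relative clauses.

    A rule of the form NP -> "the cat" | "the cat that" NP "saw"
    refers to itself, which is the self-similar structure at issue.
    """
    base = "the cat"
    if depth == 0:
        return base
    # Each level of recursion embeds another noun phrase inside a clause.
    return f"{base} that {noun_phrase(depth - 1)} saw"

print(noun_phrase(0))  # the cat
print(noun_phrase(2))  # the cat that the cat that the cat saw saw
```

The point of the example is only that such a rule is easy to state recursively; whether one universal function of this kind underlies every attested human grammar is exactly what is in dispute.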
The limits observed in trained instances of human-cognizable language use by animals are
1) no questions
2) no grammar
These limits, while *consistent* with Chomsky’s generative grammar hypothesis, are not a *proof* of it. Chomsky argues that human grammar is universally the same, i.e. inherent. These findings in animal language do not contradict that. However, they are also consistent with the Nostratic-evolutionary hypothesis which I am presenting as a better, more accurate explanation of the origins of language and the evolution of grammar. If generative grammar were true, then grammars would not evolve. However, grammars do evolve, as can be seen even within the English language.