Comments on Faculty of Language: Complexity redux redux

davidadger (https://www.blogger.com/profile/00821774928618824698), 2013-01-15 10:05:

Apologies for not getting back to this before. Norbert was arguing that it's important that the faculty of language produces data structures that are well suited to their interactions with other parts of cognition; let's say, primarily, certain kinds of thinking (say, those that involve situations, participants in them, their temporal and locational structure, asserting and denying aspects of those situations, etc.; things we admittedly know little about). "Well suited" means that the mapping between properties of parts of the data structures produced by FoL and properties of those in thinking is simple (where an isomorphic mapping would be the simplest, and we measure complexity by departures from that in a Goodmanesque way). The relevant notion of computational complexity is then, I think, a notion about the selection of an optimal algorithm for producing the relevant kind of data structures.

Before I understood this, it used to puzzle me, for example, why Merge should be taken to be binary set formation rather than n-ary set formation, which might be seen as less restrictive and hence simpler in some sense; but if the data structures require some kind of embedding, then an optimal way of getting that kind of data structure might be via an algorithm that creates binary structures. Ditto for things like short vs. long dependencies: a long dependency has to be created out of shorter ones because the relevant data structure needs to map neatly to, say, situations or individuals.

The relevant notion of computational complexity is then just about optimising algorithms for the creation of certain kinds of data structures. But it's crucial that the optimisation is bounded by the empirical conditions on the data structures' interactions with other systems of cognition. As Norbert hinted above, that doesn't require that issues of actual memory limitations in processing etc. are relevant to the optimisation: that would be something extra and unexpected on my reading of this.
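To make the binary vs. n-ary contrast concrete, here is a minimal sketch, assuming Merge is modelled simply as set formation over atoms (lexical items); the function names and the toy example are purely illustrative, not anyone's actual proposal:

```python
# A minimal sketch of binary Merge vs. hypothetical n-ary set formation.
# Assumption: syntactic objects are either atoms (strings) or frozensets
# of syntactic objects. All names here are illustrative.
from typing import FrozenSet, Union

Atom = str
SynObj = Union[Atom, FrozenSet]


def merge_binary(a: SynObj, b: SynObj) -> FrozenSet:
    """Binary Merge: forms the two-membered set {a, b}."""
    return frozenset([a, b])


def merge_nary(*items: SynObj) -> FrozenSet:
    """Hypothetical n-ary operation: forms {x1, ..., xn} in one step."""
    return frozenset(items)


def depth(obj: SynObj) -> int:
    """Embedding depth of a syntactic object (0 for an atom)."""
    if isinstance(obj, frozenset):
        return 1 + max(depth(x) for x in obj)
    return 0


# Binary Merge forces embedding: {{the, cat}, slept} has depth 2,
# so the constituent {the, cat} is recoverable from the structure itself.
nested = merge_binary(merge_binary("the", "cat"), "slept")
print(depth(nested))  # 2

# One-step n-ary formation yields the flat set {the, cat, slept}:
# depth 1, with no internal constituency for a mapping to exploit.
flat = merge_nary("the", "cat", "slept")
print(depth(flat))  # 1
```

The point of the sketch: if the systems that consume these data structures need embedding (constituency that can map onto situations, participants, and so on), the apparently "simpler" n-ary operation never delivers it, whereas the binary operation delivers it for free.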