1. What’s a lexical atom? Are syntactic atoms the same as phonological, semantic, morphological atoms? What’s the relation between a lexical atom and a concept? What’s the relation between a lexical atom and a word?
2. How/when do syntactic atoms enter a derivation? Are all atoms of a piece or do they enter G operations in different ways and at different points?
3. Is there a relatively small class of basic syntactic operations? If so, what are they? Can the class of such operations be reduced to just one or two? Are any peculiarly “linguistic”? Right now we have: Merge, Move, Bind, Agree, Probe, Lower, Obviate, Delete, Label. Are all of these primitive? Are all linguistically proprietary?
4. How transparent are the rules of the performance system and those of the competence system? Are the rules that get used in performance the same as the basic rules if such exist? Or if there are basic operations, do these compile to form larger units that are the rules that performance systems use (e.g. constructions)? Or, assuming that rules like passive are not primitives of UG, might they nonetheless be rules that the parser uses? Is some version of the DTC viable and if so what does it tell us about the primitives of the competence system?
5. How is variation acquired? Is there enough information in the PLD to allow the attested variation to be “induced”? How much information is there in the PLD? What are the limits of the PLD (e.g. degree 0, 0+, 1)? How do kids actually use the PLD in fixing their Gs (what does the learning path look like)?
6. Is there a bound on possible G (syntactic) variation? P&P theories assume that there is, i.e. that there is only a finite number of ways that Gs can vary (up to lexical difference). Is this so? In other words, are parameter theories right?
7. Are all parameters lexical parameters (micro vs macro variation)?
8. Is there a universal base (Cinque) and if so what’s its source (e.g. is there a semantic basis for the hierarchy and if so what does this tell us about semantic primitives)?
9. Are the assumptions behind the ideal speaker-hearer model reasonable (note: we all believe that they are literally false)? In particular, how misleading is the idea that LADs get all of their data at once, i.e. does loosening this idealization lead to qualitatively different theories of G? If not, what does loosening the assumption buy us? How important is the fact that Gs are acquired incrementally to the grammatical end states attained? Does incremental learning make the acquisition problem harder or easier, and if so, how?
10. Can any feature be a grammatical feature or are there limits on the kinds of features a G can have? So phi features exist. Is this a historical accident or a feature of FL? What’s the inventory of possible features? Can any feature be, in principle, grammatically recruited?
11. What’s the role of UTAH and the thematic hierarchy in UG? In G? Do we need well-defined theta roles? Do DPs always have well-defined (or definable) theta roles? Does it matter? Where does the theta hierarchy come from?
12. Why do predicates have at most three arguments?
13. Is there a distinction between syntactic and semantic binding or are they the “same” thing, i.e. X syntactically binds Y iff X semantically binds Y? And if they are different, are there diagnostics to tell them apart? If there are two ways to bind, why are there two ways to do the same thing?
14. How many kinds of locality conditions does UG allow (A-movement locality, Case/agreement locality, A’-locality, binding locality, thematic locality, selection locality)? Can these different notions be unified? What’s the relation, if any, between phases and minimality? Are both required?
15. Are islands the products of G or simply complexity effects? If the former, are islands derivational or interface effects? How are they to be derived given current conceptions of locality?
16. How are ECP effects to be integrated into the grammar given phase-based assumptions and the copy theory of movement?
17. What’s a G? In P&P it was a vector of P values. Now?
18. Why do Gs seem to clump into languages? Why is the space of possible Gs clumpy?
19. What kinds of relations are grammatical? Antecedence? Agreement? Selection? Can any relation be grammaticized? Topicalization yes, but “suitable mate movement” no? Is there an inventory of basic “constructions” that Gs try to realize (an idea that Emmon Bach once mooted)?
20. What are the phase heads and why?
21. Are phrases labeled and if so why? Are they required by the interface (Chomsky PoP) or by Gs? What is the evidence that the interfaces need labeled structures?
22. What is pied piping?
23. Is movement/merge free or feature driven?
24. Are labels endocentric or exocentric or both?
25. What’s the semantic structure of a linguistic unit? Conjunction-like structure a la Davidson or Function/Argument structure a la Montague? Or some of each?
26. Is there an intrinsic bound on the PLD that kids use to fix their Gs and if there is what is it and where does the bound come from?
27. Can Gs be ordered wrt simplicity? If so, how? MDL? Algorithmic complexity? A practical case or two to fix ideas would be very nice to have.
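To fix ideas, here is one toy version of an MDL-style ordering. The “grammars” here are just lexicons (inventories of atoms), and the encoding costs are invented for illustration: a grammar costs roughly 8 bits per character to write down, and each token of data costs log2(|lexicon|) bits once the lexicon is fixed. A word-based lexicon pays a bigger up-front grammar cost than a character-based one, but compresses a repetitive corpus far better overall — the flavor of trade-off an MDL ordering of Gs would adjudicate.

```python
from math import log2

def description_length(lexicon, tokens):
    """Two-part MDL score: DL(grammar) + DL(data | grammar).

    Assumed encoding scheme (for illustration only): ~8 bits per
    character to list each lexical entry plus a separator; each data
    token then costs log2(|lexicon|) bits.
    """
    grammar_bits = 8 * sum(len(item) + 1 for item in lexicon)
    data_bits = len(tokens) * log2(len(lexicon))
    return grammar_bits + data_bits

# A small, repetitive "corpus"
corpus = ("the dog barks " * 20) + ("the cat sleeps " * 20)

# Candidate G1: atoms are characters (tiny grammar, expensive data)
char_tokens = list(corpus)
char_lexicon = sorted(set(char_tokens))

# Candidate G2: atoms are words (bigger grammar, cheap reusable atoms)
word_tokens = corpus.split()
word_lexicon = sorted(set(word_tokens))

dl_char = description_length(char_lexicon, char_tokens)
dl_word = description_length(word_lexicon, word_tokens)
print(dl_char, dl_word)  # the word-based "grammar" scores better (lower)
```

The interesting questions start exactly where this sketch cheats: the ordering depends entirely on the chosen encoding scheme, which is the MDL analogue of choosing an evaluation metric.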
28. Why are determiners conservative?
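For concreteness: conservativity says D(A)(B) holds iff D(A)(A∩B) does, i.e. only the restrictor-internal part of the scope set matters (“Every dog barks” iff “Every dog is a dog that barks”). The property can be brute-force checked over a small universe. The determiner denotations below are the standard textbook ones; “only”, treated as a determiner, is the usual non-conservative foil.

```python
from itertools import chain, combinations

def subsets(universe):
    """All subsets of a finite universe, as frozensets."""
    return [frozenset(c) for c in chain.from_iterable(
        combinations(universe, r) for r in range(len(universe) + 1))]

# Generalized determiners: relations between restrictor A and scope B
every = lambda A, B: A <= B
some  = lambda A, B: bool(A & B)
no    = lambda A, B: not (A & B)
only  = lambda A, B: B <= A   # "only" as a determiner: the classic non-conservative case

def is_conservative(det, universe):
    """D is conservative iff D(A)(B) <=> D(A)(A & B) for all A, B."""
    return all(det(A, B) == det(A, A & B)
               for A in subsets(universe) for B in subsets(universe))

U = {1, 2, 3}
print([is_conservative(d, U) for d in (every, some, no, only)])
# → [True, True, True, False]
```

The check makes the puzzle vivid: non-conservative meanings like “only” are perfectly definable and even lexicalized (as an adverb/focus particle), yet natural language determiners apparently never express them. Why not is the open question.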
29. Why do we have superlative ‘est’ meaning roughly ‘most’ but no analogous morpheme meaning ‘least’? I.e. ‘Biggest’ means most big (more big than the others). Why is there nothing like ‘biglest’ meaning least big (less big than the others)?
30. What are the sources for grammatical recursion? Is merge a basic or complex operation?
31. What’s the relation between a feature and a head? Can heads be arbitrarily large bundles of features? If not what is the upper bundling bound?
32. Is there a syntax below X0, and if there is, how does it compare to that above X0?
33. Is grammaticality gradient? I.e., need we assume that there are grades of grammaticality? Note: we all assume that there are grades of acceptability.
34. What’s an interface constraint? Can performance constraints (e.g. structure of memory) influence the shape of the representations of the competence system and if so how? I.e. are such general performance constraints interface constraints?
35. Your turn….