David Poeppel and Michael Gazzaniga put together a great ideas symposium at the last CNS meetings. I heard about the event from Ellen Lau, who thought it great fun, as well as provocative and instructive. Boy, was she dead on. I watched the presentations (here), and there is a little commentary and some useful links here. The latter also links to the videos if you want one-stop shopping. The videos are short, 20 minutes each, and so easily watchable. I particularly urge you to look at the first two, by Gallistel and Ryan, and the last one by Krakauer (I really liked it, and he was a hoot). But they are all excellent and give a good sense of what is going on now.
A word about Randy's presentation: the thing that struck me (again) is how coherent the picture he is painting is. There is a deep story here, and it starts from seemingly innocuous starting points but quickly gets one into deep waters. Curiously (and importantly), both he and Ryan (who seems to disagree with most everything Randy put forward, though I really didn't see how he did, or even that he did) agree that the idea that memory is coded in synaptic weights in neural nets is a DEAD idea. Ryan noted that it has been categorically shown to be wrong. There may be something to synaptic connections, but it is NOT weights and adjustments to them. This seems like a real big deal to me if this is the cog-neuro consensus now. It puts a very deep nail into standard connectionist-associationist conceptions of mind/brain.
A few more remarks: First, it looks like modularity is back big time. Everyone buys into it. Krakauer noted that even (especially) the deep learning people are buying into this big time, just when cog-neuro types were running away from the idea. He notes the irony here. But it looks like this idea is back and everyone loves it again.
Second, instinct is back and so are biologically rich constraints on learning. See Krakauer again on "cost functions" and how they are intrinsic to the systems and where all the action is. Yes, I was nodding my head in agreement.
So modules, innate cost functions, classical computations, and even recursion. All were topics here, and all well received. It is a good time to try to tie GG to cog-neuro. Watch the videos and enjoy.