Monday, August 28, 2017

The normalization of science

The title for this post is meant to suggest Kuhn’s distinction between revolutionary and normal science. The post is prompted by an article in PNAS that Jeff Lidz sent me. It’s by the mathematical brothers Geman. The claim in the opinion piece (OP) is that contemporary science is decidedly small bore and lacks the theoretical and explanatory ambitions of earlier scientific inquiry. This despite the fact that there are more scientists doing science and more money spent on research today than ever before (I wish that were as obvious in linguistics!). OP’s take is that, despite this, today
…advances are mostly incremental, and largely focused on newer and faster ways to gather and store information, communicate, or be entertained. (9384)
Rather than aiming to deliver abstract “unifying theories” concerning basic “mechanisms”, the research challenge is taken to be “more about computation, simulation and “big data”-style empiricism” (9385).
FoLers may recognize that this somewhat jaundiced view fits with my own narrower pet peeves concerning theory within GG. I have complained more than once that theoretical speculation is currently held in low regard. In fact, I believe (and have said so before) that many (IMO, most) practitioners consider theory to be, at best, useless ornamentation and, at worst, little more than horse doodoo. What is prized is careful description, corralling recalcitrant data points, and smoothing well-known generalizations. More general explanatory ambitions are treated with suspicion and held to extremely high standards if given any hearing at all. This, at least, is my view of the current scene (and, IMO, the general hostility towards Minimalist speculation reflects this). The Geman brothers think that the disdain for fundamental unifying theory is part of the larger current scientific ethos. Their note asks what mechanisms drive it.

Before going through their claims, let me point out that neither the Brothers Geman (BG) nor yours truly want to be understood as dissing the less theory-driven empirical work that is being done. Both BG and I appreciate how hard it is to do this work, and we also appreciate its importance. That is not the point. Rather, the point is to observe that nowadays only this kind of work is valued and that the field strongly marginalizes theoretical work that has different ambitions (e.g., unification, reduction, conceptual clarification).

OP canvasses several reasons for why this might be so. It considers a few endogenous factors, for example, that the problems scientists tackle today are just harder in that they are “unsimplifiable,” not “amenable to abstraction” (9385). OP replies that “many natural phenomena seem mysterious and hopelessly complex before being truly understood.” I would add that the passion for description fits poorly with the readiness to idealize and, if not tempered, it will make the abstraction required for fruitful theorizing impossible. We need to elevate explanatory “oomph” as a virtue alongside data coverage if we are to get beyond “big data”-style empiricism.

But, OP does not think that this is the main impetus behind small bore science. It thinks the problems are cultural. This comes in two parts, one of which is pretty standard by now (see here for some discussion and references) and the other more original (or at least I have never considered it). Let me begin with the first, more standard, observations.

OP believes that scientists today face an incentive system that rewards small bore projects. Fat CVs gain promotion, kudos, grants, and recognition. And fat CVs are best pursued by searching for the “minimal publishable unit” (I loved this term; MPUs should become a standard measure) and seeking the best venues for public exposure (wide-ranging exposure and publicity being the current coin of the scientific realm). So publish often and be as splashy as possible is what the incentive system encourages, and OP thinks that this promotes conservative research strategies that discourage doing something new, different, and theoretically novel. Note that this assumes that theoretical novelty is risky in that its rewards are only evident in the longer term. This strikes me as a reasonable bet. However, I think that there is also a tension here: why splashiness should encourage conservatism is unclear to me. Perhaps by splashy OP just means getting and remaining in the public eye, rather than doing something truly original and daring.

OP claims that the review process also functions as a conservative mechanism discouraging big ideas:

In academia, the two most important sources of feedback scientists receive about their performance are the written evaluations following the submission of papers for publication and proposals for research funding. Unfortunately, in both cases, the peer review process rarely supports pursuing paths that sharply diverge from the mainstream direction, or even from researchers’ own previously published work. (9386)

As I noted, these two observations are not novel (see here for example), even if they may be well placed. Frankly, I would love to hear from younger colleagues about whether this rings true for them. How deeply do these incentives work to encourage some styles of research and discourage others within linguistics? I think they do, but I would love to hear what my younger colleagues think. I can say from my seat on tenure and promotion committees that CV size matters, though bulk alone is not sufficient. There is a hierarchy of journals, and publishing in these is a prerequisite for hiring and promotion, as are the all-important letters. I will say a bit more about this at the end when I comment on OP’s one suggested fix.

OP makes two other cultural observations that I have not seen discussed before concerning how the internet may have changed the conduct of research in unfortunate ways. The first strikes me as a bit fuddy-duddy, in the sense that it sounds like the complaint an old person makes about youngsters. In fact, we hear this claim daily in the popular press regarding the apparent inability of those under 30 to focus, given their bad multi-tasking habits. OP carries this complaint over to young researchers who, by constantly being “on-line” and/or “messaging”, end up suffering from a kind of research ADHD. Here is OP (9385):

Less discussed is the possible effect on creativity: Finding organized explanations for the world around us, and solutions for our existential problems, is hard work and requires intense and sustained concentration. Constant external stimulation may inhibit deep thinking. In fact, is it even possible to think creatively while online? Perhaps “thinking out of the box” has become rare because the Internet is itself a box.

This may be true, though I am not sure that I believe it (though, being old, I am inclined to believe it). My younger colleagues don’t seem to be distracted by these new communicative instruments nearly as much as I am. They seem used to them and treat them as just more useful tools. But again, I might be wrong and would love to know from younger colleagues whether they think that there is any truth to this.

A second aspect of being connected that OP mentions rings more true to me. Here the issue is not “how we communicate” but “how much” we do so. OP identifies an “epidemic of communication” fed by “easy travel, many more meetings, relentless email and a low threshold for interaction” (9385). OP makes the interesting suggestion that this may be way too much of a good thing. Why so? Because it encourages “cognitive inbreeding.” Here is OP again (9385):

Communication is necessary, but, if there is too much communication, it starts to look like everyone is working in pretty much the same direction. A current example is the mass migration to “deep learning” in machine intelligence.

The first sentence is the useful point. I included the second one for spite, because I don’t like Deep Learning and anything that takes a whack at its current fashionability is ok with me. But the main point is interesting and worth considering. OP even provides a nice analogy with speciation in evolution. Evolution relies on diverse gene pools, which requires some isolation of different populations. Too much interaction threatens to homogenize the gene pool and, by analogy, the set of acceptable ideas, which in turn makes originality harder. The epidemic of communication also encourages teamwork, by making collaboration easier. In OP’s opinion, theory is largely a solitary matter and requires an iconoclastic bent of mind, something that is not fostered by an emphasis on team projects and too much collaboration. In place of “big ideas” the new technology fosters “big projects.” That’s the view.

I am not sure that I agree, but it is an intriguing suggestion. It fits with three things I have noticed.

First, people would rather do anything than think. Thinking is really hard and frustrating. I know that when I am working on something I cannot get my head around, I am always looking for something else to read (“can’t get down to the problem until I have mastered all the relevant literature”). I also tidy my office (well, a little). What I don’t do is stare at the problem and stare at the problem and think hard. It’s just too much work, and frustrating. So, I can believe that the ready availability of things to read makes it all the easier to avoid the hard work of thinking.

Second, and now I am mainly focused on theoretical syntax, we have found no good replacement for Chomsky’s regular revolutions. Let me be careful here. There is a trope suggesting that GG undergoes a revolution every decade or so, and this is used as an indication that GG has made no scientific progress. I think that this is bunk. As I have noted before, I think that our knowledge has accumulated and later theory has largely conserved earlier findings. Still, there is a grain of truth to the trope, though contrary to accepted wisdom the purported “revolutions” have been very good for the field, for they have made room for those not invested in the old ideas to advance new ones. In other words, Chomsky’s constantly pulling the rug out from under his old students and shifting attention to new problems, technology, and subject matter acted to disrupt complacency and make room for new ways of thinking (to the irritation of the oldsters, and I speak from experience here). And this was very healthy. In fact, the periods of high theory all started in roughly this way. Chomsky was not the only purveyor of new ideas, but he was a reliable source of intellectual disruption. We have, IMO, far less of this now. Rather, we have much more careful filigree descriptive work, but less exciting theoretical novelty. We really need more fights and less consensus. At any rate, this is consonant with OP’s main point, and I find it congenial.

Third, I think that we as a field have come to prize looking busy. In my department, for example, it is noticed if someone is not “participating,” and we track how many conferences students present at and how many papers they publish (visible metrics of success that we share with our academic overlords). The idea that a grad student’s main job is to sit and think and play with ideas is considered a bit quaint. Everyone needs to be doing something. But sitting and thinking is not doing in quite the same way that participating in every research group is. It’s harder and more solitary and less valued nowadays, or so it seems. I doubt that this is true only of UMD.

OP makes one more important point, though sadly it is not something we can do much about right now. OP notes that jobs were once very plentiful, as were grants. This made it possible to explore different ideas that might not pan out, for your livelihood was not at stake if you swam against the tide or if it took some time before the ideas hit paydirt. I suspect that this is a big cause of the current atmosphere of conformity and timidity that OP identifies. In situations where decisions are made by committees and openings are scarce, the aim is not to offend. Careful, conventional filigree work is safer, and playing it safe is a good idea when options are few.

That’s more or less OP’s analysis. It also has one suggestion for making things better, a small one. Here it is (9386):

Change the criteria for measuring performance. In essence, go back in time. Discard numerical performance metrics, which many believe have negative impacts on scientific inquiry … Suppose, instead, every hiring and promotion decision were mainly based on reviewing a small number of publications chosen by the candidate. The rational reaction would be to spend more time on each project, be less inclined to join large teams in small roles, and spend less time taking professional selfies. Perhaps we can then return to a culture of great ideas and great discoveries.


I like this idea, but I am not sure it will fly. Why? Because it requires that institutions exercise judgment and trust their local colleagues’ considered opinions. It is easy to count CV entries (it is also “objective” (and hence less liable to abuse)). It’s much harder to evaluate actual research sympathetically and intelligently (and it is also necessarily personal and “subjective” (and so more liable to abuse)). And it is even harder to evaluate evaluations for sympathy and intelligence. At the very least, this is all very labor intensive. I don’t see it happening. What I can see happening is a cut-down version of this. We should begin to institutionalize one question (I call it “Poeppel’s question” because he made it the lead-off query in every one of his lab meetings) concerning the work we read, review, hear presented, or advise on: What’s the point? Why should anyone care? If a demand for an answer to this question becomes institutionalized, it will force us all to think more expansively and will promote another, less descriptive, dimension of evaluation. It’s a small thing, a doable thing. I think it might have surprisingly positive effects. I for one intend to start right away.
