Plus ça change…

June 9, 2011 § 3 Comments

From Hubert M. Blalock, Jr., “The Real and Unrealized Contributions of Quantitative Sociology,” American Sociological Review 54 (1989): 447–460:

… one finds a large number of journal articles that briefly discuss the measurement of selected variables, that also admit to the probability of errors, but that then effectively announce to the reader that the subsequent empirical analysis and related interpretations will proceed as if there were absolutely no measurement errors whatsoever!

This is but one illustration of the more general point that methodological ideas are adopted when it is relatively easy and costless to do so, but that they are resisted or totally ignored when it is to the investigator’s vested interest to do so.  There is also the related tendency to attempt to substitute sophisticated data analysis techniques for inadequate data collection procedures, which are of course far more costly and time consuming. (p. 450)

The Disgruntled Sociologist notes with some despair that this has not changed much at all over the past 22 years. In fact, the tendency to deploy fancy techniques has increased as it has gotten easier and easier to do so. (Yes, this is your fault, Stata.) And ironically, while data collection has — thanks to online data sources — in many respects gotten easier, TDS detects little improvement in the creativity applied to data collection. The vast majority of graduate students today are looking to download their dissertation data.

Blalock then goes for the kill (pp. 457-458):

Sociology is not a high-quality discipline. Over at least the past three decades our undergraduate and graduate applicants have consistently scored near the very bottom on standardized tests, not only with respect to quantitative reasoning scores but verbal reasoning scores as well. [Followed by a long list of examples: undergrad curricula, graduate training, promotion criteria, journal standards.]

Finally, our professional associations, and especially the ASA, also need to face the quality question head on.  In the early 1970s, when I first served on ASA Council, “quality” was a dirty word suggesting elitism and an attempt to impose orthodoxy. I even encountered instances where potential journal editors were passed over because it was argued that their standards would be too demanding!  …

In the end, ASA policies are influenced rather heavily by those whom we elect to office, particularly those elected to Council and the Publications Committee.  I am not too optimistic that “politics” within the ASA will change dramatically over the coming years. If not, it will remain for our leading departments to take our quality problem much more seriously than we have in the past.

Sigh. It would appear that Blalock’s pessimism was well-warranted.


§ 3 Responses to Plus ça change…

  • krippendorf says:

    My sense, from sitting on many a graduate admissions committee, is that it’s not just that the mean quality of “inputs” is fairly low, but there is a real shortage in absolute terms in the number of students who are unambiguously above the bar. (There’s some cross-faculty and cross-department variation in how that’s defined, but in general I think the overlap is pretty large.)

    This creates competition for the best students, but it also increases the pressure for departments that aren’t likely to win these competitions to take greater risks. Sometimes these risks pan out but often they don’t, producing mediocre outputs. Or, departments can start admitting students with strong applied math or technical backgrounds, e.g., from engineering or computer science, but at the risk of producing students who have good technical skills but little engagement with, or even interest in, sociology.

    The upshot is that sociology needs more programs with rigorous undergraduate training — the Reed/John Pocks of the world — and, frankly, fewer or smaller graduate programs. Except at my university.

    Call me a pessimist, but I’d guess that the quality of data collection has gotten worse since Blalock’s day. Efforts to train students in the full range of methods, or to allow them to specialize quite early in their training (gotta publish, after all), have come at the cost of squeezing out required training in survey design, sampling, question framing, etc. I don’t even remember this being offered as an elective in my grad program. Maybe survey design is seen as too technical, or too old-fashioned, or falsely assumed to be irrelevant to newer methods, or simply too boring; or maybe there just aren’t enough people qualified to teach it anymore. Regardless, sociology needs more Nora Cate Schaeffers almost as much as it needs more John Pocks.

    • Agreed on all counts, especially the point about the need for more training in research design and data collection strategies. When the modal ASR paper equates “ethnography” with “I talked to someone” then there is something wrong.

      It is not clear why sociology departments teach statistics at all. Historically, it might have made sense to the extent that sociologists were early adopters of some techniques (e.g., event history, structural equation modeling) compared to other social sciences. But that is far from true about most of the stuff that is taught today. There is nothing particularly sociological about a covariance matrix, but the insistence on acting as if there is comes at real costs. For example, sociology is seriously behind in the use of identification strategies. Instead, the quality of the inputs means that many departments devote resources to teaching people the difference between a mean and a median.

      Of course, change is hard. The politics of billets attached to teaching stats aside, most faculty think that the right kind of training is the kind they received in graduate school – how else to account for their success? That kind of selection on the dependent variable, of course, is further evidence of the need for training in research design.

  • James says:

    Just a quick note on this from the other side of the pond. My view is that the intellectual breadth of institutional sociology ought to be a good thing. Max Weber, of course, was not a donkey. However, too much “research” is allowed to be neither scholarship nor science. The official methodology is now “critique”. The strange thing about this technique is that it simultaneously fails to “commit sociology”, as Stephen Harper put it, while encouraging a public view of sociology which makes sociological critique less likely to be listened to. It also makes for emotionally driven teaching and seminars.

    We saw this here with Boris Johnson’s similar comments following the London riots. Most recently, the French PM, Manuel Valls, made comparable comments about sociological ‘excuses’ for the Paris attacks. Three sociologists representing the ASES and the AFS (roughly the French counterparts of the ASA) replied in Le Monde, saying that what sociology does is explain, not excuse. But quite often it doesn’t, here, and any attempt at narrower interrogation of facts or theories without an obvious political message seems to get ignored as either reactionary, irrelevant, or uninteresting. Which is ironic. Plus ça change indeed. I would like to make the further leap and say this filters into sociological education as well and explains everything you guys are complaining about, but that would be ironic too.


You are currently reading Plus ça change… at The Disgruntled Sociologist.
