When is it appropriate to rely heavily on technical vocabulary and theoretical frameworks in academic writing, especially in the humanities? I think many people take for granted that this is simply what we do: just as physicists create theories of the physical world and then apply them to experiments, humanities scholars create critical or analytical frameworks and apply them to texts, discourses, philosophical problems, and so on. But I'm not sure we should take this for granted. I'm not sure what we should be doing in the humanities, but I'm starting to think that constant traffic in theory and technical jargon may often be just so much bullshit and obfuscation.
Aspirations to activism in areas like cultural studies and gender studies notwithstanding, the humanities have, for the most part, undergone a fairly systematic disengagement from social life over the last half century or more, at least in the American academy. Part of this trend toward professionalization involves treating what we do as some sort of secret, specialized knowledge, too difficult or arcane for the common person to fathom. When we deploy the technical apparatus of counterfactual possible-worlds semantics, Bayesian probability theory, textual deconstruction, or performance theory, we tend to exclude potential audiences, and the result is our increasing irrelevance.
I'm not attempting here to argue against technical work, or to say that it has no value, but merely to say that one has to be clear about the reason for applying a given technical framework, and one has to actually have a reason. One should also be clear-eyed about the gains and losses that come with professionalization. Because, really, let's be honest: we academics working in the humanities don't have any sort of special knowledge. We are engaged in various kinds of projects, some more difficult than others, but there is no reason that accessibility can't be a major consideration. We need to be careful about being too technical.
I think there is a case to be made (though I won't make it here) that the following pattern occurs in both the sciences and the humanities. Someone develops a new framework or method, or extends an old one, and in applying it generates some novel and interesting results. Seeing a winner, others take it up and apply it to other things, also with some fruitful success. Still others make even less fruitful but still fairly natural applications, perhaps demonstrating the breadth of this type of analysis or this theory. But many also attempt, and by dint of cleverness or stubbornness manage, to fit the framework to situations where it not only fails to illuminate, but also requires many ad hoc assumptions, metaphorical extensions of the vocabulary, or dubious re-descriptions of the evidence. If we align these applications on a spectrum from most fruitful to most ad hoc and uninformative, we can take a rough measure of a discipline, area, or research tradition according to the proportion at each end (for those with some background in philosophy of science, think of Lakatos on progressive versus degenerating research programmes).
Given not only that specialization can alienate the public and diminish a serious duty toward public scholarship, but also that the process carries these inherent pitfalls, it is important to ask whether one's 'theory' is mere window-dressing on the details one hopes to bring to light, or whether it is a crucial feature of the analysis. Can I do all the work I want, do the analysis I need to do, without relying on the technical apparatus of Deweyan epistemology? Can you write that paper without introducing a bunch of pseudo-logical formulae? These questions need to matter more in academia, even though I think the answer may often be "no."