- The first is one I discussed with my psychology PhD student sister-in-law and her psychology PhD student fiance over the holiday (their field has a lot of the same publication bias issues). They seemed intrigued. The idea is that you submit articles before your results are in and have them provisionally accepted for publication. So you have your motivating discussion, any lit review, your description of the data and the methodological setup, and any descriptive statistics. You might even have an initial conclusions section that discusses what to make of the results depending on which way they go. This gets refereed and provisionally accepted. Then you run the results and submit the final paper. Of course, if there are any major problems with execution the journal could reject it, but the journal could publish online all the completed papers whose provisional acceptance it rescinded. This should, I imagine, cut down a lot on the publication bias in empirical results, and publishing the rescinded acceptances online would keep the process from being abused.
- The second one was inspired just now by this post on Stumbling and Mumbling (HT - Gavin Kennedy), which discusses the extent to which economists are isolated from "real life," as Coase claims. To a certain extent this may be true, but I think it's one of those over-hyped criticisms that people who like to be critics of the mainstream make off the cuff without much real evidence. As the post points out, there's a lot of very useful stuff in economics for "real life" that many in "real life" also use. The recent Nobel Prize winners are an excellent example, and you can think of giants in the field like Gary Becker who are acclaimed precisely because they pay attention to "real life." Anyway, I think it's an overstated criticism, but there's obviously something there too.
A solution would be to increase the extent to which economists do field work and interviews. I never really saw this problem at the Urban Institute, for example, because field work - or what was often referred to there as "qualitative research" - was a major part of the research program. The people who gave you estimates of the disincentive effects of welfare programs actually went out and talked to welfare recipients or administrators, and they were surrounded by and always talking to people who did the same. It was impossible to fall into the thoughtless libertarian/conservative line that only speaks about welfare in terms of the disincentive effects, or the thoughtless progressive approach that assumes away the disincentive effects, because you're actually there seeing it in action (and then, of course, going back and working with the data as well). It's helpful working with Hal Salzman on my science and engineering workforce stuff too, because he - as a sociologist - does a lot of his work in the field. It's a nice balance with the data work that I do. Obviously this would be tough to push out to the discipline. It would require changing the way a lot of economists think about the science, and it might require more money. But it would be a good change.