(Note I previously wrote this in sections for the AJPS Blog. I thought I would put it all in one place on my own blog. Nothing like recycling).
At the end of four years with AJPS (2010-2013) I thought I would write what seem to be common failings in manuscripts that come to AJPS. My sense is that these same failings are true for other journals as well. There is no secret to getting published. It takes creativity, a well-formulated question, an appropriate design for answering that question, an enormous amount of hard work and excellence. This is easy to write, but hard to pinpoint. It is much easier to detail those things that will derail an interesting manuscript when it comes to publishing.
I organize my comments around the “seven deadly sins”: greed, sloth, gluttony, wrath, lust, envy, and pride. Each represents a common failing for authors. Hopefully these comments will be taken as useful advice and not as sermons.
Greed. “An excessive desire and pursuit of material goods.”
I begin with greed, and for most editors this is commonly expressed as “playing the lottery.” An author blinded by greed believes that getting into a journal is a random event. Editors are incapable of exercising judgment and reviewers are picked randomly. Therefore one may as well start at the top and send the manuscript to one of the “Big 3.” Once rejected, go on to the next. Once those are exhausted, move down to second-tier journals and try them one after another. Typically this means submitting the manuscript the afternoon following a rejection without reading the reviews. After all, the process is stochastic. You might get lucky and hit the jackpot right off the bat. This is very unlikely.
Editors take on the job because they are willing to exercise judgment and they often have a vision of what is valued in research. They do not roll a die to determine what makes it and what doesn’t. Likewise, reviewers are not i.i.d. Editors do not pick reviewers as a random draw from the reviewer pool. I typically want a portfolio of reviewers who can comment on the general merit of the question, address the research design and empirical strategy (if there is one), and know the subfield. Rarely is it the case that all of these features are embodied in a single reviewer. Moreover, I do not think I would want a single reviewer to give me advice. So a number of reviewers are chosen, and they each offer a slightly different critical perspective. Reviewers are chosen with a purpose, not randomly.
It should not be a shock when the same reviewer has previously seen the manuscript at a different journal. This is fine. I would like to know whether the advice the reviewer previously offered was followed. The most telling reviews are those that are identical to the previous review – largely because the manuscript has not changed. From my standpoint, how likely is it that an author will undertake revisions if the author is not even willing to pay attention to a reviewer’s prior efforts? Granted, sometimes reviewers are off base. However, I have seen very few reviews that offer absolutely no useful advice to an author.
To avoid greed an author should be self-reflective about the content of her manuscript and honestly evaluate where it should go. Before I write up a piece of research I pinpoint my target audience and I select the journal that caters to that audience. Not everything I write belongs in a general journal – much of what I want to say is narrow, but useful for the subfield. I start with sending the manuscript to what I think is the appropriate journal and where it will have an impact.
Sloth. “Laziness and the avoidance of work.”
A hallmark of sloth is when a manuscript arrives that is sloppy, disorganized, poorly written, and plainly still a draft. In the era of word processors, spell checkers and grammar correction, it is odd to receive manuscripts that are rushed, mistake prone and incomplete. But it happens.
One way to think about journals is that you have one shot. You are presenting the journal with your very best work. You may as well make it as clean and well written as possible. While you may think that copy editing, upon acceptance, can rid your manuscript of its worst offenses, think again. Reviewers are going to notice instances where the prose is unclear, where typos abound and when a key figure is omitted. A manuscript that does not live up to minimal standards of formatting is likely to set off alarms for the editor and reviewers. What inference should these gatekeepers make if your manuscript appears slothful? The logical inference is to assume that the science underlying the work is equally sloppy.
Avoiding sloth is easy. First, see what the journal wants. All journals have a style sheet that you should consult before beginning to write. Make certain you fit the style. It is not that hard to do and it will save you plenty of time. Second, make certain the manuscript is perfect. When you have finished the manuscript and you are ready to send it to the journal, stop! Wait a few days, and then go back and read it critically, checking for errors. Coming back to a manuscript with a fresh eye often catches those pesky errors that remain (and do not become overly dependent on spell correction). The point is that there is no hurry. After spending months or years getting your manuscript ready, a few more days will not hurt you.
Gluttony. “An overindulgence.”
I ran full steam into gluttony when I imposed an 8,500-word limit on manuscripts. Many authors bitterly complained that they needed 50 or 60 pages to demonstrate their result and that to ask otherwise was to hold back knowledge. It did not surprise me that those same authors, once it was clear that the constraint was going to be enforced, found that a manuscript could be turned in that was 8,498 words in length. Gluttony is manifest in excessive verbiage.
Gluttony takes on various guises in a manuscript. One culprit is the literature review. Gluttons prefer to be exhaustive (and exhausting) in their review of the literature. This means citing and detailing every piece that is tangentially related to the topic. The defense of gluttony is that surely a reviewer will object if she is not cited. Yet such a literature review usually ends up looking like an encyclopedia and fails to put the current research into its context. Even worse, it crowds out the contribution. A gourmand (rather than a glutton) will write a literature review that accentuates the research, highlighting the contribution.
Gluttons are also fond of taking a long and winding road to the data. The result is a flabby discussion of the research design and structure of the data. Of course the aim is well-intentioned because, in the interest of full transparency, the author believes that every decision about case selection and coding must be put on the table. Yet most readers want to see the core elements of the design and data and move to the results. By all means, write up the full-blown version of the data. It should be available for researchers (and at AJPS it is more than welcome in the Supporting Information). An interested reader who wants to know more should have that information at her fingertips – but it should not be inflicted on all readers.
Finally, gluttons are fond of showing off their methodological prowess. For an empirical piece this may mean including every statistical check in the full text. If there is room for two models in a table, why not eight? The glutton knows no bounds. Of course, every robustness test will also be included, crammed into the manuscript, crowding out the point of the research. Robustness tests are critical and readers should demand them. However, they may not always belong in the main text. Again, at AJPS the Supporting Information is the logical place to put robustness checks. Strangely enough, the glutton should thrive in this environment. Sadly, however, the SI is usually just a collection of tables with little indication of why they are important. The SI should be written as a stand-alone document in which interested readers can follow the logic intended by the author in offering additional tests and information.
To protect against gluttony authors should re-read and edit their manuscripts before submitting to a journal. The key question while editing should always be: “Is this necessary?” It never hurts to have peers or colleagues give your manuscript a quick reading, especially if you ask them to comment on it over coffee (your treat). You might ask them whether sections can be cut or clarified. Science is a collective process and we learn from one another through our interactions.
Wrath. “Uncontrolled feeling of hatred and anger.”
This is a common failing that usually accompanies a letter rejecting a manuscript. It takes the form of “The editor is an idiot and the reviewers are jerks.” While one or the other may be true, it is unlikely that they are conjointly true. At most journals reviewers are carefully selected for their expertise in the subfield and for their capacity to assess the impact of the manuscript. The editor values their advice and you should value it too.
Wrath leads to two kinds of problems. First, by getting angry, you risk ignoring good advice. Rejections hurt and you should feel free to get mad, shout, rail against the powers that be and then wait two days. After that try to figure out what the editor and reviewers are telling you. If your manuscript was reviewed, that’s great. It means that you made it past one hurdle and you received the intense scrutiny of your peers. Make the best of it.
Wrath creates a second problem if you immediately write a blistering email detailing the degree and type of idiot the editor must be. After all, two of the reviewers suggested that the manuscript might possibly be given an R&R. A journal is not a democracy. If it were, then the median reviewer would dictate outcomes and we would have median science. Earlier I mentioned that excellence is a requirement. Arguing with an editor (the person who will ultimately make the decision to publish) is not a great strategy. By all means write the blistering email, and then put it aside for two days. It will be cathartic. On sober second thought, it will also go unsent. Sometimes a reviewer or the editor may be mistaken. Politely write a reasoned note indicating why. I have been known to change my mind. I am more likely to do so when an argument is made for why I should.
To avoid wrath, please feel free to get angry, rant to your pet, and then put your rejection aside. After a few days return to the manuscript and the reviews and try to figure out what was being said. Often you are probably right that the reviewers have missed your point. But that should be a signal that you have not clearly communicated your point. All reviews and letters from the editor contain some nugget of information that will help you when revising the manuscript for another journal.
Lust. “An intense desire.”
Lust often appears as a wanton commitment to a tool or a finding. When reading a manuscript, lust leaps off the page when an author exclaims that the latest complex estimation/modeling/textual technique is the only solution for this problem (and all others). If only the prose matched the passionate embrace of the technique. It rarely does, and all too often the chosen method ends up a one-night stand.
The danger with lust, whether it is a method or a finding, is that an author loses perspective and fails to be self critical. Who knows the research project best? Obviously it is the researcher carrying out the work. But being blinded by technique, it is easy to ignore problems with the research. Reviewers and editors will quickly find those blind spots and reject the manuscript. It is good science not to be too enamored with your tools, bright and shiny though they may be.
Authors also lust after a finding. Before fully embracing a finding, as attractive and fresh as it may seem, step back and add some perspective. Is the finding spurious? How robust is it to alternative specifications? Will the finding persist? These are all questions that should be posed before committing the finding to paper. Yet lust can easily overcome sensibility. If you don’t raise these questions, others not blinded by lust surely will.
To avoid lust an author needs to be self-critical. Don’t be misled by a novel technique or finding. Make certain what you have to say is robust. After all, if you get published, you will have to live with your article for the rest of your career. You should prefer something that will hold up under scrutiny, rather than be shown to be flawed.
Envy. “An insatiable desire to possess what another has.”
Envy manifests itself in the publications of others. Scholars want to get their work into the journals to further the science. Envy arises when furthering the science takes a backseat to careerism and merely counting publications. Journals have limited capacity, so publication often looks like a zero-sum game.
A common source of envy is the view that “an inferior paper was published and my manuscript will provide a much needed correction.” This is rarely a concern with replication, which is a valuable part of the scientific enterprise. Instead it typically involves a minor extension of a well-known result. Adding yet another control variable may be useful for the subfield, but it does not make for a path-breaking manuscript. Worse, with a cluttered set of independent variables, it may not be clear what inference to draw. While an author may be envious of another’s publication, it does not mean that the new, improved, minor contribution should automatically be published in a general journal. It may mean that the manuscript is a good contribution to the subfield, and with plenty of subfield journals available, that is where an author should head first.
Another source of envy is the view that journals are clubs where those who get in do so on the basis of knowing the editor or having special connections with the Editorial Board. Naturally this leads to envy of purported club members. I wish it were true that I was a gatekeeper to a secret club, because I would have loved to figure out a way to collect rents from authors submitting to AJPS. Alas, I am not the richer for being editor. AJPS, like almost all journals, takes conflicts of interest very seriously. I never handle a manuscript from my colleagues or graduate students. An Associate Editor handles anything that might look like a conflict of interest. Likewise, in selecting reviewers we avoid conflicts of interest, which includes people at the same institution, co-authors, and people who directed the author’s dissertation. We occasionally make mistakes, but they are very rare.
Rather than being envious, authors should worry about their contribution and how it furthers knowledge.
Pride. “An excessive admiration of self.”
Pride is the root of all sin. For authors, overweening pride in a manuscript is often a combination of all the other sins. Pride leads one to think that the manuscript’s importance is self-evident. So much so that its importance may never be written down, but simply asserted by the author. Feeling that the manuscript stands on its own and needs no further explanation is problematic. Readers (and reviewers) are hardly omniscient and should not be expected to read an author’s mind. What an author is doing should be clearly spelled out. But pride can be blinding, and this manifests itself in two ways.
First, the manuscript is either not asking or not answering a research question. While the manuscript may have bright and shiny new techniques (see Lust) and fresh data, unless it is linked with a clear, theoretically motivated question, it will make little sense. This will be clear to a reviewer. But pride in the technique or the data can be blinding and cause the author to ignore detailing how the analysis ties in with a research question. A result without a motivating question is an anecdote.
Second, the manuscript does not engage a relevant debate. Again pride in technique may lead the author to ignore the fact that the debate was settled long ago. Often this is due to an author carrying out the analysis, isolating a finding, and then doing a cursory job of finding a literature that might fit the finding. This usually means missing the fact that others have already made important contributions to resolving the question. It is painful to learn from the editor or a reviewer that, while the empirics are competently executed, the question was long ago resolved and that the manuscript has re-invented the wheel.
As with all of the sins, pride can be overcome with humility. Researchers know their own work better than anyone else. Being honest about weaknesses in your work is a crucial part of the process of science. Recognizing, rather than covering up, those weaknesses is important for generating knowledge. Realizing that what you are doing is building science, rather than advancing your career, should provide a healthy dose of humility.
I have learned about these sins both the hard way (through my own mistakes) and by reading thousands of manuscripts over the past four years. I don’t pretend to be the ultimate authority on publishing, so take my thoughts with a grain of salt. However, if this description of sins to avoid gets you to think more carefully about your work and how you present it, then it should make for better science.