Monthly Archives: April 2014

Advice to Junior Faculty

Rather than writing my own (stale) advice for junior faculty I thought I would bring together advice offered by others.  Many of these postings are of recent vintage, although I remember some of the same advice given to me long ago.  What is remarkable is the consistency of the advice to junior faculty.

Chris Blattman’s posting on “Advice for new Assistant Professors” got me thinking about what kind of advice I might give. His list is short, sweet and to the point. It boils down to: get your research done; pick your research wisely (with a view toward the tenure clock); and don’t be afraid to seek out mentors to give you advice and feedback. He also has a cogent and important argument for using social media as part of your research and service obligations. Chris’s posting gave rise to several responses, including Tom Pepinsky’s plea for considering book chapters and Laura McLay’s posting on the importance of time management. Of special note is the point that one should guard weekends and evenings for all those things that also matter in life. Working smarter (and shorter hours) is better than working longer hours.

Pamela Oliver offers very useful advice for navigating the personalities of the Department you are suddenly immersed in. Departments are like extended families. You’re stuck with them, but that doesn’t mean you have to like everyone. Her point 1 is quite true – learn not to take anything personally. Of course, her points 10 and 11 warn you about settings in which you should take it personally. This includes both overt and subtle racism or sexism. One should never have to put up with such things. She also warns that your first year or two will seem horrible. I always advise graduate students going off to their first job that their first year will be terrible. You’ll try to do too much, you’ll try too hard to fit in and you’ll be trying to learn a new culture. Take it easy and focus on the things that are really important that first year (see the rich advice given in the prior paragraph and given by Dr. Karen here). In the same vein, useful lists of advice are given by John Regehr and Billie Hara.

Finally, there is very good advice about staying sane while on the tenure track. Radhika Nagpal, in “The Awesomest 7-year Postdoc …”, notes how you should carve out space for yourself. A longer read is Shane Henderson’s “Staying Sane on the Tenure Track,” in which he notes that it is always important to keep your life and that of others around you in perspective. As he notes, if you obsess about your job and your discipline, ask yourself “whether your next-door neighbor knows the most famous person in your field.” My guess is that the answer is no.

This advice has a common theme: it’s just a job and there is more to life than living in your office/lab. This was made clear to me by my dissertation advisor after I had been in my job for three years. She was visiting Rice to give a talk. In a private discussion with me, she suddenly asked about my “back-up plan.” For a second I was a bit puzzled, but I knew she meant what I would do if I didn’t get tenure. I told her that I had my eye on some property where I could open a garage rebuilding old British sports cars and run a wine shop on the side. These were interests I had at the time and I could have made a reasonable living at it. She smiled and said that was a healthy approach to tenure. I turned the tables and asked what her back-up plan had been. Without hesitation she said that she and her husband had gone on the market together in 1965, and if nothing was forthcoming, they were going to move to San Francisco and open a woodworking shop in Haight-Ashbury (prescient and well in advance of the Summer of Love). Fortunately for political science, they were both hired at Indiana University, and Lin Ostrom later went on to win the Nobel Prize in Economics. What woodworking lost, the social sciences gained. The point is that finding balance in work and play is important.

Finally, I would be remiss if I didn’t note the series of blog posts edited by Meg Shannon in The Political Methodologist. It contains a good deal of advice for all of us about mentoring women and about the barriers we sometimes set up against them.


What Should Journals Do?

While I was attending the 11th meeting of EGAP (Experiments in Governance and Politics) this weekend, a session was held on transparency and replication. The discussion was fascinating. For the panel, Don Green, Macartan Humphreys and Jenny Smith presented a paper entitled “Read it, understand it, believe it, use it: Principles and proposals for a more credible research publication.” The paper presents a challenge to the academy and to journals in particular. The goal is nothing less than defining “best practices” for social science journals. I see the authors as challenging business-as-usual and pressing for a response from professional associations and journal editors alike. I am not in a position of power in a professional association, nor am I an editor. But I have some experience with the latter and thought I would share some of my own thoughts.

The paper begins from the premise that fraud, deceit and cheating are serious concerns, especially for the social sciences. While no evidence is mustered for whether these problems are widespread, the more general point is that even a few bad apples will spoil the rest of us. I discount this basis for their argument. They also argue that these proposed changes will force the social sciences to be more careful with inference and lead us to commit fewer false positives. With this I wholeheartedly agree. At heart are two concerns – the first is aligning the incentives of researchers and journals, and the second is aligning the incentives of researchers and their discipline. I’ll take each in order.

Aligning the incentives of researchers and journals.

Table 2 from their paper (which I’ve reproduced here) summarizes the 14 proposed innovations and details the costs and risks attached to each, as well as whether coordination is needed across journals. The assumption is that if journals implement these, then the incentives for researchers will change and science will improve. Many of these changes are worth considering as a challenge to the current way we run journals. But I have some concerns. Innovation #4 asks for open access. This same point is being pushed by Congress, especially for research funded by Federal grants. The point is reasonable, and as Green, Humphreys and Smith recognize, it will require a new business model for journals. While the general journals in the social sciences are supported by professional associations, this is not true for all. Many journals are run on a shoestring and supported by publishers. The revenue streams are uneven for all journals and most could not survive if abandoned by publishers. Journals could easily forgo publishing hard copies of each issue and put everything online. However, the cost of printing an issue is minor compared with the basic costs of production. These include staffing support (even the general journals run on a shoestring), support for electronic submission and review systems, copyediting and numerous other costs. While some of these costs could be recouped from submission fees, most social scientists are averse to such fees. Realistically, submission fees could be as high as $500 per manuscript for general journals moving to such a model.

A number of the proposed innovations are of minimal cost – at least to authors. Numbers 7 and 8 are great ideas. They ensure data quality by asking editors and in-house staff to vet replication files before a manuscript is sent out for review. While authors bear no costs, the journals will. Where will these resources come from? Not from publishers if open access is imposed. If I had it to do all over again, I would probably ask for support for an in-house statistician. To use that person’s skills optimally, I would require that authors of any manuscript given an R&R submit all of their files before the revision is reviewed. This only partially gets at what is suggested, but it is feasible for a general journal. For subfield journals, this is more difficult.

Innovation 10 is a great check on the robustness of a finding. It asks that publication be withheld for a year. A manuscript that has been tentatively accepted – the data vetted, the review process survived, the editor’s muster passed – should then be put out for public comment. So far, so good. But this further delays publication and puts junior faculty at risk. It is probably the case that getting good science is far more important than the careers of faculty. Yet there is a question of whether delay serves science. Published work does attract interest – especially if there are flaws in that work. But good science may be worth the wait. Who moderates the period of public commenting? The Editor? She is likely busy with a myriad of other things and cannot diligently attend to all postings as they come through. The author certainly shouldn’t be trusted with moderating. Public comment could be left unmoderated. But will public comment devolve into some version of the “rumors” postings? This would do little to serve the interests of science.

Innovation 2 is intriguing. Authors would submit their manuscripts fully written, but without results. The aim is to get Editors and reviewers to focus on the hypotheses, constructs and research design without reference to the significance of the findings. This might serve to welcome null results and nudge findings away from the p-just-less-than-.05 phenomenon. As with many things, however, this will increase the waiting time for authors. Passing the bar for a data-blind review may mean a second review before deciding on an R&R. It will be impossible to gauge effect sizes from a data-blind review. Admittedly this would press an additional standard on researchers – specifying a priori what constitutes an important effect. In the best of all possible worlds, such a manuscript would be identical to a registered design, but it will be impossible to know whether the researcher “peeked” at the data before sending in the data-blind manuscript (and decided effect sizes and detailed the analysis plan post hoc).

Innovations related to open data and materials are critical. I see no reason that all journals shouldn’t mandate this. I discovered that authors are very willing to comply when asked to provide such materials contingent on final acceptance. Of course, such material should be stored on a publicly accessible platform. There are many. Expecting researchers to store and maintain materials on their own websites is not a long-term solution. Many of the innovations proposed in this paper are going to require a huge change. The investment of time and energy by editors and reviewers is going to be substantial. It is important to remember that both give of their time and create public goods. Science operates on the basis of scientists making sacrifices in their own work to provide public goods. I worry that many of these innovations will crowd out the provision of public goods. That would be to the detriment of science.

Aligning the incentives of researchers and the discipline.

Green, Humphreys and Smith pay less attention to this aspect of aligning interests. Yet it is important. If nothing else, it is important to change the incentives that disciplines create for scholars. First, it is important to encourage replication of findings in different contexts. At present there is too much emphasis on pursuing novelty at the cost of credibility. There is little incentive for researchers to replicate a novel finding in a different setting. This is different from replicating with the same data. While the latter is useful to ensure that a finding is not due to a mistake, replicating in a different environment or with different data helps provide confidence in the veracity of a finding. The complaint is that journals are not interested in replication. This is perhaps true of the general journals. But plenty of field-specific journals are open to replications with new data. More importantly, senior faculty should reward junior faculty for producing corroborating findings.

Second, the community needs to support norms for scholars who pre-register their designs and analysis plans. There are very few subfields in political science that wouldn’t benefit from pre-registration. Observational studies can easily detail their hypotheses, research design, variable construction and analysis plan well before touching data. While most people realize that research is rarely carried out in this manner (many of us go fishing once we have the data), it is also the case that most of us do some version of pre-registration when we embark on our research. We have a very good idea of what model we are using to focus the research, we have a clear design in mind, we know the data we want to use and we have a plan for analysis. We just don’t write it down. Doing so is valuable. This is not to say that what we find unfolds exactly as we expected. However, failing to find what I was looking for is very instructive – I learn a lot from what is unexpected. Registration is simply a way of reminding me when something is unexpected.

Third, we need to change the way we teach our graduate students. In my first-year graduate seminar or in my experimental design seminar I force students to write a pre-analysis paper. I want them to focus on the model, the hypotheses, the constructs and the design. I do not want them to focus on finding someone else’s data set from which they can cobble together bad proxies for the things they want to measure. I would rather they focus on being creative about their research design and their outcome variables. Of course this means that graduate students will not get the jump on their peers by churning out research right from the beginning. However, I would rather they learn how to do the research properly in the first place, rather than learning by trial and error.

Finally, we need to move the focus away from the “big 3.” We put enormous pressure on our junior faculty to get their work into the general journals. The reality is that there is only limited space in the general journals. If researchers spend their time crafting all of their work to try to “hit” the big journals, they will be spinning their wheels. I would much rather see us reward our junior colleagues for producing a full portfolio of work focused on a carefully defined research program. If this means producing a number of papers that carefully replicate and corroborate an interesting “fact,” then that should be fine. It will give us greater confidence in our knowledge. Isn’t that what we’re supposed to be doing?

What Can We Do?

Last Saturday I was on a panel at the Midwest Political Science Association entitled “NSF …”  Representative Dan Lipinski (D-IL) provided his thoughts about what is likely to happen to NSF funding in the House. From his vantage point on the House Subcommittee on Research and Technology he was able to give us some insight into views on the social sciences. The message was bleak. The House Republicans like science. But science is defined as the “hard” sciences. The social sciences are not viewed as “real” sciences, and as such, are ripe for being zeroed out. Skip Lupia, in his comments, noted that the social sciences are indeed under attack. Consequently, it is important that we articulate what it is that we are capable of doing. Sadly, social scientists are often not at the table when policy decisions are made. Yet we have the fact-based evidence that can inform those decisions.

While I added to the doom and gloom, I also suggested things that we can do as individuals and scholars.  It strikes me that there are six easy things that many of us can do.

  • First, we should lean on our professional Associations. Political scientists have produced a plethora of studies on interest groups and lobbying. Why don’t we put our research into practice? We know that lobbying can pay off and that Associations can band together more easily than individuals. Part of what a professional Association does is solve the collective action problem inherent in representing interests. Jennifer Nicoll Victor details how critical lobbying was in saving Political Science the last time around. We have also learned that we should target our friends on the Hill – contributions are not likely to change the minds of those opposed, and contributions are not likely to buy votes. But those already predisposed toward an issue are a bit more receptive to granting access and to listening to the arguments lobbyists present.
  • Second, we need to target our audience. While some of us have access to Members of Congress, these are busy people. Few MoCs have the time to take my phone call or respond to my email or letter. But almost all of us have former students, trained over the years, who now serve as staffers for legislators in Washington. We should use those contacts to let them know what kind of research we are doing and why it is important. Direct lobbying of staffers (like MoCs) will probably not be useful. Cultivating staffers by giving them some insight into our research and how it applies to them is useful. People on the Hill need to hear about what we’re doing and why it is important.
  • Third, those of us who are lucky enough to be funded by the National Science Foundation need to provide our Program Officers with practical, understandable outcomes from our research. This doesn’t mean dumbing down the research, but rather making what we do accessible. Basic research sometimes looks like it is far from practical or applied. Nonetheless, it is important to communicate how and why we carried out the research. It is especially important to give our Program Officers the kind of stories that can be passed along to policy makers.
  • Fourth, pressure our professional Associations to highlight the work they are sponsoring through their meetings and journals. It was good to see the MPSA organize the Empire Lecture Series talks at the 2014 meeting. These talks, by senior researchers, were accessible, characterized the field and broadly pointed out what we know. The journals should take further steps to advertise the work they are publishing. AJPS has taken to blogging about new articles that are forthcoming or have been published. Most of these posts are written by the authors; they are short and designed to be widely accessible. Other journals should follow suit.
  • Fifth, authors should take some responsibility for advertising their research. This shouldn’t be seen as self-promotion, but rather as promoting research produced by the discipline. All of us have access to University Media Relations. While it is sometimes difficult to get through to this understaffed group, it is important to remember that they are our friends. Make friends with someone in Media Relations and pitch them your recently published work. Often this can be done by writing a couple of paragraphs and sending them a copy of the article. They aren’t going to publicize everything, but when they do, they will work very hard to get the word out.
  • Sixth, as individual scholars we need to do more to expand our audience. Gone are the days when we could hide behind our jargon and expect the outside world to give us money to study what we wish without further explanation. Other scholars (particularly those in the natural sciences) are doing a great job of showing how their research is exciting and pushes the boundaries of science. They are perfectly comfortable using social media to promote their work. Luckily, a number of our junior colleagues are also comfortable using social media – whether blogging, tweeting, using Facebook or using YouTube-like animations to reach a broader audience. As a dinosaur in the discipline, what I can do is value and reward my junior colleagues for making the effort to reach out. Social media should be counted as an integral part of teaching, service and research. Scholars shouldn’t be told that it is a waste of time and that they would be better off writing “real” research. Of course research is important, but so too is getting that work out to a broader audience. Personally, I’d much rather have a million hits on a YouTube video touting some of my research than have that same piece garner 150 cites. I’d rather see the same in my colleagues as well.

Publishing (And the Seven Deadly Sins)

(Note: I previously wrote this in sections for the AJPS Blog. I thought I would put it all in one place on my own blog. Nothing like recycling.)

At the end of four years with AJPS (2010-2013) I thought I would write about what seem to be common failings in manuscripts that come to AJPS. My sense is that these same failings are true for other journals as well. There is no secret to getting published. It takes creativity, a well-formulated question, an appropriate design for answering that question, an enormous amount of hard work and excellence. This is easy to write, but hard to pinpoint. It is much easier to detail those things that will derail an interesting manuscript when it comes to publishing.

I organize my comments around the “seven deadly sins.” These include greed, sloth, gluttony, wrath, lust, envy, and pride. Each represents a common failing for authors. Hopefully these comments will be taken as useful advice and not as sermons.

Greed. “An excessive desire and pursuit of material goods.”

I begin with greed, and for most editors this is commonly expressed as “playing the lottery.” An author blinded by greed believes that getting into a journal is a random event. Editors are incapable of exercising judgment and reviewers are picked randomly. Therefore one may as well start at the top and send the manuscript to one of the “Big 3.” Once rejected, go on to the next. Once those are exhausted, move down to second-tier journals and try them one after another. Typically this means submitting the manuscript the afternoon following a rejection without reading the reviews. After all, the process is stochastic. You might get lucky and hit the jackpot right off the bat. This is very unlikely.

Editors take on the job because they are willing to exercise judgment and they often have a vision of what is valued in research. They do not roll a die to determine what makes it and what doesn’t. Likewise, reviewers are not i.i.d. Editors do not pick reviewers as a random draw from the reviewer pool. I typically want a portfolio of reviewers who can comment on the general merit of the question, address the research design and empirical strategy (if there is one) and know the subfield. Rarely is it the case that all of these features are embodied in a single reviewer. Moreover, I do not think I would want a single reviewer to give me advice. So a number of reviewers are chosen and they each offer a slightly different critical perspective. Reviewers are chosen with a purpose, not randomly.

It should not be a shock when the same reviewer has previously seen the manuscript at a different journal. This is fine. I would like to know whether the advice the reviewer previously offered was followed. The most telling reviews are those that are identical to the previous review – largely because the manuscript has not changed. From my standpoint, how likely is an author to undertake revisions if the author is not even willing to pay attention to a reviewer’s prior efforts? Granted, sometimes reviewers are off base. However, I have seen very few reviews that offer absolutely no useful advice to an author.

To avoid greed an author should be self-reflective about the content of her manuscript and honestly evaluate where it should go. Before I write up a piece of research I pinpoint my target audience and I select the journal that caters to that audience. Not everything I write belongs in a general journal – much of what I want to say is narrow, but useful for the subfield. I start with sending the manuscript to what I think is the appropriate journal and where it will have an impact.

Sloth. “Laziness.”

A hallmark of sloth is a manuscript that arrives sloppy, disorganized, poorly written and just plain drafty. In the era of word processors, spell checkers and grammar correction, it is odd to receive manuscripts that are rushed, mistake-prone and incomplete. But it happens.

One way to think about journals is that you have one shot. You are presenting the journal with your very best work. You may as well make it as clean and well written as possible. While you may think that copy editing, upon acceptance, can rid your manuscript of its worst offenses, think again. Reviewers are going to notice instances where the prose is unclear, where typos abound and where a key figure is omitted. A manuscript that does not live up to minimal standards of formatting is likely to set off alarms for the editor and reviewers. What inference should these gatekeepers make if your manuscript appears slothful? The logical inference is that the science underlying the work is equally sloppy.

Avoiding sloth is easy. First, see what the journal wants. All journals have a style sheet that you should consult before beginning to write. Make certain you fit the style. It is not that hard to do and it will save you plenty of time. Second, make certain the manuscript is perfect. When you have finished the manuscript and you are ready to send it to the journal, stop! Wait a few days, and then go back and read it critically, checking for errors. Coming back to a manuscript with fresh eyes often turns up those pesky errors that remain (and do not become overly dependent on spell correction). The point is that there is no hurry. After spending months or years getting your manuscript ready, a few more days will not hurt you.

Gluttony. “An overindulgence.”

I ran full steam into gluttony when I imposed an 8,500-word limit on manuscripts. Many authors bitterly complained that they needed 50 or 60 pages to demonstrate their result and that to ask otherwise was to hold back knowledge. It did not surprise me that those same authors, once it was clear that the constraint was going to be enforced, found that a manuscript could be turned in that was 8,498 words in length. Gluttony is manifest in excessive verbiage.

Gluttony takes on various guises in a manuscript. One culprit is the literature review. Gluttons prefer to be exhaustive (and exhausting) in their review of the literature. This means citing and detailing every piece that is tangentially related to the topic. The defense of gluttony is that surely a reviewer will object if she is not cited. Yet such a literature review usually ends up looking like an encyclopedia and fails to put the current research into context. Even worse, it crowds out the contribution. A gourmet (rather than a glutton) will write a literature review that accentuates the research, highlighting the contribution.

Gluttons are also fond of taking a long and winding road to the data. The result is a flabby discussion of the research design and structure of the data. Of course the aim is well intentioned because, in the interest of full transparency, the author believes that every decision about case selection and coding must be put on the table. Yet most readers want to see the core elements of the design and data and then move to the results. By all means write the full-blown description of the data. It should be available for researchers (and at AJPS it is more than welcome in the Supporting Information). An interested reader who wants to know more should have that information at her fingertips – but it should not be inflicted on all readers.

Finally, gluttons are fond of showing off their methodological prowess. For an empirical piece this may mean including every statistical check in the full text. If there is room for two models in a table, why not eight? The glutton knows no bounds. Of course, every robustness test will also be included, crammed into the manuscript, crowding out the point of the research. Robustness tests are critical and readers should demand them. However, they may not always belong in the main text. Again, at AJPS the Supporting Information is the logical place to put robustness checks. Strangely enough, the glutton should thrive in this environment. Sadly, however, the SI is usually just a collection of tables with little indication of why they are important. The SI should be written as a stand-alone document in which interested readers can follow the logic intended by the author in offering additional tests and information.

To protect against gluttony authors should re-read and edit their manuscripts before submitting to a journal. The key question while editing should always be: “Is this necessary?” It never hurts to have peers or colleagues give your manuscript a quick reading, especially if you ask them to comment on it over coffee (your treat). You might ask them whether sections can be cut or clarified. Science is a collective process and we learn from one another through our interactions.

Wrath. “Uncontrolled feeling of hatred and anger.”

This is a common failing that usually accompanies a letter rejecting a manuscript. It takes the form of “The editor is an idiot and the reviewers are jerks.” While one or the other may be true, it is unlikely that they are conjointly true. At most journals reviewers are carefully selected for their expertise in the subfield and for their capacity to assess the impact of the manuscript. The editor values their advice and you should value it too.

Wrath leads to two kinds of problems. First, by getting angry, you risk ignoring good advice. Rejections hurt and you should feel free to get mad, shout, rail against the powers that be and then wait two days. After that try to figure out what the editor and reviewers are telling you. If your manuscript was reviewed, that’s great. It means that you made it past one hurdle and you received the intense scrutiny of your peers. Make the best of it.

Wrath creates a second problem if you immediately write a blistering email detailing the degree and type of idiot the editor must be. After all, two of the reviewers suggested that the manuscript might possibly be given an R&R. A journal is not a democracy. If it were, then the median reviewer would dictate outcomes and we would have median science. Earlier I mentioned that excellence is a requirement. Arguing with an editor (the person who will ultimately make the decision to publish) is not a great strategy. By all means write the blistering email, and then put it aside for two days. It will be cathartic. On sober second thought, it will also never be sent. Sometimes a reviewer or the editor may be mistaken. Politely write a reasoned note indicating why. I have been known to change my mind. I am more likely to do so when an argument is made for why I should.

To avoid wrath please feel free to get angry, rant to your pet and then put your rejection aside. After a few days return to the manuscript and the reviews and try to figure out what was being said. Oftentimes you are probably right in that the reviewers have missed your point. But that should be a signal that you have not clearly communicated your point. All reviews and letters from the editor contain some nugget of information that will help you when revising the manuscript for another journal.

Lust. “An intense desire.”

Lust often appears as a wanton commitment to a tool or a finding. When reading a manuscript, lust leaps off the pages when an author exclaims that the latest complex estimation/modeling/textual technique is the only solution for this (and every other) problem. If only the prose matched the passionate embrace of the technique. It rarely does, and all too often the chosen method ends up a one-night stand.

The danger with lust, whether for a method or a finding, is that an author loses perspective and fails to be self-critical. Who knows the research project best? Obviously it is the researcher carrying out the work. But an author blinded by technique can easily ignore problems with the research. Reviewers and editors will quickly find those blind spots and reject the manuscript. It is good science not to be too enamored of your tools, bright and shiny though they may be.

Authors also lust after a finding. Before fully embracing a finding, as attractive and fresh as it may seem, step back and add some perspective. Is the finding spurious? How robust is it to alternative specifications? Will the finding persist? These are all questions that should be posed before committing the finding to paper. Yet lust can easily overcome sensibility. If you don’t raise these questions, others not blinded by lust surely will.

To avoid lust an author needs to be self-critical. Don’t get misled by a novel technique or finding. Make certain what you have to say is robust. After all, if you get published, you will have to live with your article for the rest of your career. You should prefer something that will hold up under scrutiny, rather than be shown to be flawed.

Envy. “An insatiable desire to possess what another has.”

Envy manifests itself in reaction to the publications of others. Scholars want to get their work into the journals to further the science. Envy arises when furthering the science takes a backseat to careerism and merely counting publications. Journals have limited capacity, so publication often looks like a zero-sum game.

A common source of envy is the view that “an inferior paper was published and my manuscript will provide a much-needed correction.” This is rarely a concern with replication, which is a valuable part of the scientific enterprise. Instead it typically involves a minor extension of a well-known result. Adding yet another control variable may be useful for the subfield, but it does not make for a path-breaking manuscript. Worse, with a cluttered set of independent variables, it may not be clear what inference to draw. While an author may be envious of another’s publication, it does not mean that the new, improved, minor contribution should automatically be published in a general journal. It may mean that the manuscript is a good contribution to the subfield, and with plenty of subfield journals available, that should be the first place an author heads.

Another source of envy is the view that journals are clubs where those who get in do so on the basis of knowing the editor or having special connections with the Editorial Board. Naturally this leads to envy of purported club members. I wish it were true that I was a gatekeeper to a secret club, because I would have loved to figure out a way to collect rents from authors submitting to AJPS. Alas, I am not the richer for being editor. AJPS, like almost all journals, takes conflicts of interest very seriously. I never handle a manuscript from my colleagues or graduate students. An Associate Editor handles anything that might look like a conflict of interest. Likewise, in selecting reviewers we avoid conflicts of interest, which includes people at the same institution, co-authors and people who directed the author’s dissertation. We occasionally make mistakes, but they are very rare.

Rather than being envious, authors should worry about their contribution and how it furthers knowledge.

Pride. “An excessive admiration of self.”

Pride is the root of all sin. For authors, overweening pride in a manuscript is often a combination of all the other sins. Pride leads one to think that the manuscript’s importance is self-evident. So much so that its importance may never be written down, but simply asserted by the author. Feeling that the manuscript stands on its own and needs no further explanation is problematic. Readers (and reviewers) are hardly omniscient and should not be expected to read an author’s mind. What an author is doing should be clearly spelled out. But pride can be blinding, and this manifests itself in two ways.

First, the manuscript is either not asking or not answering a research question. While the manuscript may have bright and shiny new techniques (see Lust) and fresh data, unless it is linked with a clear, theoretically motivated question, it will make little sense. This will be clear to a reviewer. But pride in the technique or the data can be blinding and cause the author to ignore detailing how the analysis ties in with a research question. A result without a motivating question is an anecdote.

Second, the manuscript does not engage a relevant debate. Again pride in technique may lead the author to ignore the fact that the debate was settled long ago. Often this is due to an author carrying out the analysis, isolating a finding, and then doing a cursory job of finding a literature that might fit the finding. This usually means missing the fact that others have already made important contributions to resolving the question. It is painful to learn from the editor or a reviewer that, while the empirics are competently executed, the question was long ago resolved and that the manuscript has re-invented the wheel.

As with all of the sins, pride can be overcome with humility. Researchers know their own work better than anyone else. Being honest about weaknesses in your work is a crucial part of the process of science. Recognizing, rather than covering up, those weaknesses is important for generating knowledge. Realizing that what you are doing is building science, rather than advancing your career, should provide a healthy dose of humility.

I have learned about these sins both the hard way (through my own mistakes) and by reading thousands of manuscripts over the past four years. I don’t pretend to be the ultimate authority on publishing, so take my thoughts with a grain of salt. However, if this description of sins to avoid gets you to think more carefully about your work and how you present it, then it should make for better science.