
Our reporter was a data point in a study of scientific careers. She and others have questions

Tomorrow's Research

Message Number: 1706


Folks:

 

The posting below is an article written in response to the material in TP Msg. #1695, "Scientists Don't Stay for Long in Their Jobs Anymore: Study," from Inside Higher Ed, posted on February 2, 2019 [https://tomprof.stanford.edu/posting/1695]. The article below is from: "Our reporter was a data point in a study of scientific careers. She and others have questions," by Katie Langin, December 14, 2018, Science Careers (doi:10.1126/science.caredit.aaw3840), https://www.sciencemag.org/careers/2018/12/our-reporter-was-data-point-study-scientific-careers-she-and-others-have-questions. Reprinted with permission from AAAS.

 

Regards,

 

Rick Reis 

reis@stanford.edu

UP NEXT: How to Use ‘Active Learning’ to Teach Critical Thinking in the Lab

 

Tomorrow’s Research

 

---------- 1,484 words ----------

 

Our reporter was a data point in a study of scientific careers. She and others have questions

It’s not every day that you realize you’re a data point in a scientific study—and a misrepresented data point at that. But that’s what happened to a number of current and former scientists—including me—while reading a study reporting that scientific careers have become significantly shorter in the past 50 years, published earlier this week in the Proceedings of the National Academy of Sciences (PNAS).

“Half of academic scientists leave the field within 5 years,” The Washington Post headline trumpeted. “New study says scientists are leaving academic work at unprecedented rate,” read Inside Higher Ed. It's a message that’s likely familiar to those who follow the plight of today’s early-career researchers, and many shared the paper on social media as yet more evidence that systemic change is urgently needed.

Meanwhile, Twitter also erupted with comments from people who thought that the methods—using paper authorship in a handful of journals as a proxy for academic activity—were flawed, relying on a too-narrow definition of who counts as a scientist. As a prolific associate professor of ecology at the University of Canterbury in Christchurch, New Zealand, tweeted, “Shocked to discover that I am an early career dropout with no lead authorship ever (according to authors' definition)! Defining research active careers using 9 ecology journals is flawed - even if the message resonates.”

The study sought to figure out whether the “half-life” of a scientific cohort has changed over the past 50 or so years, spurred in part by growing concern about the dysfunction of the academic training model and job market. “We are reaching a point that doesn’t really look sustainable,” says Staša Milojević, the lead author of the study and an associate professor of informatics at Indiana University in Bloomington. Given the number of people who leave academia to pursue jobs elsewhere, Ph.D. programs should do more to recognize diverse career paths and provide appropriate training options, she says. There shouldn’t be a “single train for everybody.”

With that backdrop in mind, Milojević and her colleagues used a big-data approach to look at changes in the career landscape by teasing apart scientists who are actively publishing from people who’ve published but—for whatever reason—aren’t publishing anymore. They focused on a subset of journals in three disciplines—six for astronomy, nine for ecology, and 12 for robotics—amounting to nearly 300,000 publications authored by 71,164 astronomers; 20,704 ecologists; and 17,646 roboticists. The authors chose those journals because they “are well established, usually publish a large fraction of original research in a particular field, and are considered to be good representatives of those fields,” they wrote in the paper. 

But many critics say that these limited data are not sufficient to draw the conclusions claimed in the study, with some offering their own publication record as evidence. In addition to the University of Canterbury ecologist, an actively publishing ecosystem scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, wrote on Twitter, “I got a PhD in Ecology in 1988 and have never once published in the journals defined as in my field.”

The “Twitter-storm” of criticism primarily focused on ecology—the field I worked in as a scientist, and one that has evolved rapidly since the study start date in the 1960s. The same issues may not apply to the other disciplines Milojević examined. For astronomy, at least, there’s evidence that the results hold.

In 2016, Peter Yoachim—an astronomer who works as a staff scientist at the University of Washington in Seattle—conducted a similar, as-yet unpublished analysis focusing on that discipline, though with a crucial difference. He identified astronomers based on publications in astronomy journals; then—in a departure from the PNAS paper’s methods—he counted publications in a broader range of journals when calculating the length of time an author was an active, publishing scientist. His results are “very similar” to what the authors reported in the PNAS paper, he says. (Milojević and her colleagues didn’t look at all possible journals in part because it’s computationally challenging to match names between papers—a problem that would have been amplified if they’d cast a wide net, potentially leading to more false-positive matches.)

“I’m not going to defend their techniques,” Yoachim says. But “astronomy is a really small field” with a limited number of journals, so looking at only six journals in the PNAS study may not have biased the findings too badly, he notes. He agreed, though, that the approach could be problematic for other fields, such as ecology.

My story

I’m no longer a practicing scientist—I'm a reporter and editor for Science Careers—but I was one of the 20,704 ecologists whose names were included in the Milojević database, and one whose career trajectory was misrepresented. I published a chapter of my master’s thesis in one of the “chosen” ecology journals—Oecologia—in 2007. That pegged me as an ecologist from the study’s perspective. (Milojević confirmed this.)

The researchers then tried to link that publication with any ecology publications I authored after that—in essence, to figure out how long I stuck around in academia (or, as the study authors phrased it, to figure out my “ultimate survival status in science”). Since that 2007 publication, I’ve earned a Ph.D. in ecology and published 14 peer-reviewed papers in 12 journals. My publication record extends through this month’s issue of Conservation Genetics. In theory, my scientific career length—based on my publication record—should have been estimated to be 13 years. Yet the study’s methods rendered me a “transient”—an author with a single publication whose estimated career length was only 1 year—because none of my pre- or post-2007 papers were published in any of the chosen journals.
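To make that calculation concrete, here is a minimal Python sketch of the kind of classification described above. It uses an invented publication record and a placeholder journal list rather than the study's actual code, data, or journal set; the inclusive-year convention, the journal names other than those mentioned in the article, and the function name are assumptions for illustration only.

```python
# Hypothetical illustration of the career-length estimate described above;
# not the PNAS authors' code, data, or exact definitions.

# Placeholder "chosen" journal list (the study used nine ecology journals;
# only Oecologia is named in the article, the rest are invented here).
CHOSEN_JOURNALS = {"Oecologia", "Chosen Journal B", "Chosen Journal C"}

def estimated_career(publications, journal_set):
    """Return (career length in years, label) for one author.

    publications: list of (year, journal) tuples.
    Career length is taken as the inclusive span between the first and last
    papers that appear in journal_set -- one plausible convention; the
    study's exact definition may differ. Authors with a single qualifying
    paper are labeled "transient" with a length of 1 year.
    """
    years = sorted(year for year, journal in publications if journal in journal_set)
    if not years:
        return None, "never counted as an ecologist"
    if len(years) == 1:
        return 1, "transient"
    return years[-1] - years[0] + 1, "active"

# Invented record loosely patterned on the case described in the text:
# a 2006-2018 publication history with only one paper in the chosen set.
pubs = [(2006, "Some Other Ecology Journal"),
        (2007, "Oecologia"),
        (2018, "Conservation Genetics")]

print(estimated_career(pubs, CHOSEN_JOURNALS))        # -> (1, 'transient')
print(estimated_career(pubs, {j for _, j in pubs}))   # -> (13, 'active')
```

The same publication history yields a 1-year "transient" or a 13-year active career depending only on which journals are allowed to count, which is the crux of the criticism described in this section.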

The study authors acknowledge in the paper that the dataset’s “incompleteness … may affect the determination of career length.” But they go on to write that it shouldn’t affect analyses looking at trends over time because there’s no reason to believe that the level of “incompleteness” would be different now than, say, 50 years ago.

Is that assumption reasonable? An education researcher isn’t so sure, pointing out that over the study’s timespan the set of journals being examined was likely a “shrinking” percentage of the total number of journals in each field.

To figure out whether that was the case for my (now former) discipline, I took a look at the 30 top-ranked ecology journals based on International Scientific Indexing impact factors. Most of the journals published their first issue in the 1980s or later, long after 1961—the year when the first cohort of ecologists analyzed in the PNAS study began publishing. (Of the “chosen” ecology journals, only two of the nine even existed in 1961.)

[Figure: Top-ranked ecology journals. Most of today's top 30 ecology journals started in the 1980s and 1990s. Graphic and data: K. Langin/Science]

That means that an ecologist trying to publish their work in 1961 likely had fewer journals to choose from than an ecologist trying to publish their work in 2010. What’s more, there are more scientists trying to publish now, so competition is more severe. In other words, it’s plausible to me that some modern-day ecologists’ careers could look shorter simply because the PNAS study’s methods are not well equipped to capture the full range of journals they could be publishing in—an issue that may not apply as much to previous generations of ecologists.

When I relayed my concerns to Milojević, she told me that “it’s a point well taken.” She doesn’t think that the incompleteness of the dataset biased the study’s central conclusion—that the career lengths of actively publishing scientists have shortened in the past 50 years—but added that she and her colleagues are planning a follow-up study that looks at a broader range of disciplines. They’re also working to refine their methods so that they come up with better estimates for the length of time scientists were actively publishing. “We should capture more; we should be able to capture you and the others that we have missed,” she told me.

For my part, I look forward to seeing that study and finding out whether Milojević’s conclusions hold—not only because reporting on scientific workforce issues is my day job, but also because it’s something I’ve experienced firsthand. I am one of the many publishing scientists who have purposefully (and happily) left academia to pursue a career elsewhere—and I agree with Milojević that Ph.D. programs should do a better job preparing people like me for nonacademic careers.

But until that new study comes out, I think we should avoid pointing to the PNAS study as a solid example of academia’s revolving door, at least for ecology. I’d rather point to a study where I’m an accurate data point.

 

