It wasn’t hard to achieve gender balance.

If you aren’t aware of this figure by now you should be. Credit: Moss-Racusin et al. 2012.

A couple of weeks ago my colleagues and I submitted a session proposal to ESA (Paleoecological patterns, ecological processes, modeled scenarios: Crossing temporal scales to understand an uncertain future) for the 100th anniversary meeting in Baltimore. I’m very proud of our session proposal.  Along with a great topic (and one dear to my heart) we had a long list of potential speakers, but we had to whittle it down to eight for the actual submission.

The speaker list consists of four male and four female researchers, a mix of early-career and established scientists from three continents. It wasn’t hard. We were aware of the problem of gender bias, so we thought of people whose work we respect, who have new and exciting viewpoints, and whom we would like to see at ESA. We didn’t try to shoehorn anybody in with false quotas, and we didn’t pick people to force a balance. We simply picked the best people.

Of the people we invited, only two turned us down. While much has been said about higher rates of declined invitations among female researchers (here, and here for the counterpoint), both of the people who turned us down were male, so maybe we’re past that now?

This is the first time I’ve tried to organize a session and I’m very happy with the results (although I may have jinxed myself!). I think the session will be excellent because we have a strong speaker list and a great narrative thread running through it. But my point is: it was so easy that there ought to be very little excuse for a skewed gender balance.

PS. Having now been self-congratulatory about gender, I want to acknowledge that this speaker list does not address diversity in toto, which has been and continues to be an issue in ecology and the sciences in general. Recognizing there’s a problem is the first step to overcoming our unconscious biases.

Upcoming Conferences – Part I: Geological Society of America

Figure 1. GSA 2014 in beautiful downtown Vancouver! I listened to them building this convention center every day for months on end, so it had better look nice!

Much of the research I do is interdisciplinary. While I often consider myself a paleoecologist, it is just as true that I am a physical geographer, earth scientist, ecologist, geoinformatician and computational biologist, in an order that varies with the project at hand. I think Jacquelyn Gill once suggested I call myself a paleobiogeoecoclimatologinformatician, and I honestly think I’m probably forgetting a piece of that. Anyway, with that in mind I find myself at a variety of conferences each year. I am as much at home at the Botanical Society of America, the American Geophysical Union, or the Geological Society of America‘s annual conferences as I am at the Canadian Association of Palynologists’ meeting. The next one for me is this October’s Geological Society of America meeting in Vancouver, BC.

Since Vancouver was home for me while I did my Ph.D., I jumped at the opportunity to present at GSA. One session in particular was of great interest: Where in the World? Access and Availability to Geoscience Data II. I’ve been involved with the Neotoma Paleoecological Database for a while now, and have been actively working on an R package for the database. In particular I wanted to talk about my experiences at the intersection of Neotoma (the data provider) and PalEON (a data consumer and provider). From my viewpoint both large-scale projects have benefited immensely from the partnership. The PalEON project gets large volumes of data from the Neotoma Database, but we have also been able to leverage our connections to begin contributing data back to the database, and, because of our particular needs, we have acted as a test case for the development of the API, the R package, and many of the new upload tools for Neotoma. Much of EarthCube’s success in the near future is going to depend on its ability to gain community trust and engagement. The partnership between large-scale ecological research projects and broad community databases provides exactly the kind of synergy needed to improve this sort of collaboration and, ultimately, future research success. That talk:

Session No. 253
Where in the World? Access and Availability to Geoscience Data II
Vancouver Convention Centre-West 116/117
Tuesday, 21 October 2014: 1:00 PM-5:00 PM

I was also invited by Amy Myrbo to speak about some of our work looking at chronologies in Neotoma. This may seem overly technical, perhaps not particularly ecological, but the way we deal with chronologies in paleoecological data is critically important for modelling past ecosystem changes. Without an accurate understanding of time, it is nearly impossible to make sense of ecological patterns in the past, or to apply them in a meaningful way to modern ecological theory. I’ll talk a bit about the lessons we’ve learned from re-building chronologies in PalEON, particularly some of the great work Andria Dawson has been doing, and how to deal with issues of time when working with large datasets, with specific focus on Neotoma. That talk:

Session No. 325
Recent Advances in Limnogeology (1:00 – 5:00pm)
Vancouver Convention Centre-West: 213
Wednesday, 22 October 2014

I’m also going to be speaking at the Biodiversity Lunchtime Internal Seminar Series (BLISS) at UBC on Monday, October 20th from noon to 1:00PM, and again for the Biological Sciences Seminar Series at Simon Fraser University on Wednesday October 22 at 3:30PM (as long as the 135 is running on time!).

Lastly, if you are a paleoecologist or palynologist, CAP, the Canadian Association of Palynologists, is having their Annual General Meeting on Wednesday October 22nd at 11:30am at Mahony & Sons, right by the convention. All are welcome, so please come out and join us. Everyone loves meetings!

Ecosystem services have always been with us. Using the past to explore their dynamics.

Figure 1. Hoping we’ll find a proxy for beautiful sunsets. Image from: BrendelSignature at en.wikipedia

Clean water, forest products, clean air. The value of ecosystem services has received a lot of attention in the past several years. In 1997 Robert Costanza and co-authors provided one of the first real valuations of ecosystem services (Costanza et al., 1997), estimating that, at an annual subsidy of ~$33 trillion, they provide more value to human society than the entire global GNP at the time (~$18 trillion). Following Costanza’s paper and the Millennium Ecosystem Assessment (2005), considerable research effort has gone into understanding the extent of ecosystem services around the globe and at multiple scales. However, the widespread application of the term ‘ecosystem service’ has led some to question the consistency with which it is used (Seppelt et al., 2011).

Regardless, it is clear that ecosystems play a vital role in maintaining critical social functions. Whether providing clean and safe drinking water, wood for building, or a sense of relief at the sight of a beautiful sunrise, ecosystem services are a fundamental (and often undervalued) component of our well-being. Given this, it is no surprise that we have become keenly interested in projecting how the value of these services will change in the future under shifting land use and global change. Joshua Lawler and co-authors (2014) use land-use trends to estimate changes in ecosystem services in the continental United States under three different conservation scenarios, but don’t attach value to the shifts. In Landscape Ecology, Monica Turner and co-authors (2013) lay out several key research questions to help resolve our uncertainty about the effects of future change on ecosystem service provisioning. One of the key questions in that paper is: “How well will understanding of past landscape dynamics and ecosystem services inform the future?”

The use of paleoecology in understanding ecosystem service function and change is still in its infancy, but John Dearing and co-authors (2012) have provided an excellent road map in their paper “Extending the timescale and range of ecosystem services through paleoenvironmental analyses, exemplified in the lower Yangtze basin”. Table 1 of the paper provides a long list of ecosystem services and possible related paleoecological proxies, linking food production to pollen microfossils, fresh water provision to diatom assemblages, and air quality regulation to spherical carbonaceous particles, all of which, incidentally, can be found in lake sediment records.

The challenges of using the paleo-record remain, and it is critical that researchers begin to address the methods by which we cross scales, from the paleo-record to modern ecological time scales, and on to future projections. Excellent work by McLauchlan et al. (2014) in BioScience is beginning to do just this. Exploring the ways in which we can integrate paleoecological processes into modern ecological theory is critical for understanding the long time-scale processes that ultimately help regulate the provision of ecosystem services.


Costanza, R. et al. 1997. The value of the world’s ecosystem services and natural capital. Nature. 387, 253-260.

Dearing, JA. et al. 2012. Extending the timescale and range of ecosystem services through paleoenvironmental analyses, exemplified in the lower Yangtze basin. Proceedings of the National Academy of Sciences, 109(18), E1111-E1120. [PDF]

Lawler, JJ. et al. 2014. Projected land-use change impacts on ecosystem services in the United States. Proceedings of the National Academy of Sciences, 111(20), 7492-7497. [PDF]

McLauchlan, KK. et al. 2014. Reconstructing disturbances and their biogeochemical consequences over multiple timescales. BioScience. [PDF]

Seppelt, R. et al. 2011. A quantitative review of ecosystem service studies: approaches, shortcomings and the road ahead. Journal of Applied Ecology, 48(3), 630-636. [PDF]

Turner, MG. et al. 2013. Consequences of spatial heterogeneity for ecosystem services in changing forest landscapes: priorities for future research. Landscape Ecology, 28(6), 1081-1097. [PDF]

Quaternary Science . . . on Mars . . . three billion years ago.


Cross-posted from the Open Quaternary blog.

Originally posted on OpenQuaternary Discussions:

For a curious person, one of the great benefits of being a Quaternary researcher is the breadth of research that is relevant to your own questions. The recent publication of fifty key questions in paleoecology (Seddon et al., 2014) reflects this breadth, spanning questions about human needs, biogeophysical processes, ecological processes, and a broad range of other issues. The editorial board of Open Quaternary reflects this incredible disciplinary breadth as well. To me it is clear that the Quaternary sciences are an amalgam of multiple disciplines and, at the same time, a broadly interdisciplinary pursuit. To be successful one must maintain deep disciplinary knowledge in a core topic, as well as breadth across topics such as ecology, anthropology, and geology (specifically geochronology), along with a good grounding in statistics and climatology.

One of the things that is not always quite as apparent is the breadth…


We’re reading the same paper, but we’re getting different messages.

Earlier I posted about an interesting paper by Jankó and colleagues in Geoforum about similarities and differences in citation patterns between the IPCC and the NIPCC.  It turns out I wasn’t the only one interested in it.

It was pointed out to me that Judith Curry, Watts Up With That, and the Heartland Institute have all written posts about the paper. Their key takeaway seems to be that the citation patterns of the two reports are similar, and that Jankó and co-authors include language indicating that reflexively dismissing ‘skeptic’ arguments does a disservice to scientific advancement. This does an injustice to Jankó and colleagues because it misrepresents what I believe is very interesting work on the underpinnings of scientific inquiry, particularly around climate change. Much of the support Bast and Curry see in the paper comes from a single sentence, associated with a citation of a 2013 paper by Myanna Lahsen.

My reading of the sentence:

But when we take the contrarian arguments seriously, there is a chance to bring together the differing views and knowledge claims of the disputing ‘interpretive communities’ (Lahsen, 2013b).

is not to say that we need to accept their arguments as alternative facts, but that the reflexive dismissal of contrarian viewpoints limits our ability to engage with and understand them. The cited paper itself, “Anatomy of dissent: A cultural analysis of climate skepticism”, certainly shows little support for skepticism. Myanna Lahsen has done some excellent work understanding climate change denial (skepticism?) from a sociological/anthropological viewpoint. Indeed, her arguments in the cited reference point more to the fact that scientists need to work harder to engage with skeptics in an effort to avoid cultural backlash. She is not arguing that skeptics pose acceptable alternative models to anthropogenic climate change. Take this sentence from “Anatomy of dissent” (the same paper cited by Jankó and colleagues):

To promote their agenda, powerful backlash actors have frequently adopted deceptive strategies to create the fictitious appearance of broad grassroots and scientific support.

Does this in any way suggest that we ought to be taking contrarian arguments seriously because they are valid?  No, we are being asked to take them seriously because by understanding their backgrounds and motivations we can begin to address the causes of backlash against climate science, and move forward toward solutions.

I argued in my last post that the IPCC and NIPCC are not equally acceptable models for global climate and climate change just because they use the same citations. Interestingly, just because Bast, Curry, and I read the same paper doesn’t mean we came to the same conclusions either.

Working on the frozen finger!

This post also appears on the PalEON Blog.  There are some great posts there so check it out!

We’re at Camp PalEON this week. It’s lots of fun and I think the attendees get a lot out of it. Effectively, we’re trying to distill the process behind the entire seven-year project into one week of intensive learning. We teach probability theory, Bayesian methods, ecosystem modelling, dendrochronology, paleoecology and pollen analysis, age modelling, and vegetation reconstruction to seventeen lucky early-career researchers in six intensive days (people were still plunking away at 11pm last night, our first day!).

We spend a lot of our time at the University of Notre Dame’s Environmental Research Center indoors looking at computers, but we had a very nice time yesterday afternoon. I hung out on a raft with Jack Williams and Jason McLachlan, coring with a frozen finger. The frozen finger is a special kind of corer used to recover lake sediment while preserving the sediment stratigraphy much more cleanly than many other coring techniques.

Jack Williams and Jason McLachlan filling the core casing with ethanol so that the cold slurry conducts to the outer wall. (photo credit: Jody Peters)

When using the frozen finger we fill the base of the corer with dry ice, and suspend the dry ice in ethanol to create an incredibly cold surface.  We then drop the casing into the lake sediment.  The sediment freezes to the surface of the core casing over the course of ten or fifteen minutes before we pull the corer back to the surface of the lake.  The freeze-corer (or frozen finger) is often used for ancient DNA studies (e.g., Anderson-Carpenter et al., 2011) since the freezing process helps stabilize DNA in the lake sediment until the core can be brought back to the lab and analyzed.

The sediment on the outside of the core casing is peeled off carefully and wrapped before it is stored in a cooler of dry ice. (photo credit: Jody Peters)

Jason McLachlan and I are going to sieve the sediment ourselves later this afternoon so that the workshop participants can take a look at lake sediment, pick charcoal, and find macrofossils tonight. Meanwhile, everyone is hand-coding an MCMC model in R and, later today, learning about Midwestern paleoecology. All in all, it’s a great course and I’m happy to be involved with it for a second time. Hopefully we’ll have some more posts, but in the meantime we’ve made the preliminary readings open to the public on our project wiki, and most of our R work is up on GitHub so that you can take a look and work along.

How do you edit someone else’s code?

As academics I like to think that we’ve become fairly used to editing text documents. Whether handwriting on printed documents (fairly old school, but cool), adding comments on PDFs, or using some form of “track changes” I think we’ve learned how to do the editing, and how to incorporate those edits into a finished draft. Large collaborative projects are still often a source of difficulty (how do you deal with ten simultaneous edits of the same draft?!) but we deal.

Figure 1. If your revisions look like this you should strongly question your choice of (code) reviewer.

I’m working on several projects now that use R as a central component in analysis, and now we’re not just editing the text documents, we’re editing the code as well.

People are beginning to migrate to version control software, and the literature increasingly discusses the utility of software development practices (e.g., Scheller et al., 2010), but given that scientific adoption of programming tools is still in its early stages, there’s no sense that we can expect people to immediately pick up all the associated tools that go along with them. Yes, it would be great if people started using GitHub or BitBucket (or other hosted version control services) right away, but many are still getting used to basic programming concepts (btw, Tim Poisot has some great tips for Learning to Code in Ecology).

The other issue is that collaborating with graduate students is still a murky area. How much editing of code can you do before you’ve started doing their work for them? I think we generally have a sense of where the boundaries are for written work, but if code is part of ‘doing the experiment’, how much can you do? Editing is an opportunity to teach good coding practice, and to teach new tools to improve reproducibility and ease of use, but give the student too much and you’ve programmed everything for them.

I’m learning as I go here, and I’d appreciate tips from others (in the comments, or on twitter), but this is what I’ve started doing when working with graduate students:

  • Commenting using a ‘special’ tag: comments in R begin with an octothorpe (#); I use #* to differentiate my comments from a collaborator’s. This is fairly extensible: someone else could use ‘#s’ or ‘#a’ if there are multiple collaborators.
  • Where there are major structural changes (e.g., moving repeated code into functions) I’ll comment heavily at the top, then build the function once. Inside the function I’ll explain what else needs to be done so that I haven’t done it all for them.
  • If similar things need to be done further down the code I’ll comment “This needs to be done as above” in a bit more detail, so they have a template and know where they’re going.
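As a minimal sketch of what this looks like in practice (the file names, function, and data are all hypothetical, purely for illustration), a reviewed chunk of a student’s R script might end up something like this:

```r
# Student's original code: read in the site data.
pollen <- read.csv("site_counts.csv")

#* I've pulled your normalization step into a function so it can be
#* reused for every site; note that rowSums() replaces the loop you had.
prop_counts <- function(counts) {
  counts / rowSums(counts)
}

pollen_props <- prop_counts(pollen[, -1])  # drop the site-name column first

#* This needs to be done as above for the charcoal data below: write a
#* function that takes the raw counts and returns proportions, then
#* apply it to each site yourself.
```

The #* tag makes it trivial to find (or strip) the reviewer’s comments later with a simple search, without ambiguity about who wrote what.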

The tricky part about editing code is that it needs to work, so it can be frustratingly difficult to do half-edits without introducing all sorts of bugs or errors.  So if code review is part of your editing process, how do you accomplish it?