Just published: can sandbox models be educational and fun?

A new paper by me and education expert Bridget Mulvey grapples with the question: analogue sandbox models are cool, but are they effective teaching tools?

Analogue sandbox models are a way of demonstrating tectonic deformation processes in the classroom: the weirdness of physical scaling laws means that slowly squeezing and stretching a tub of sand produces faults and folds like those produced in the crust over geological timescales.
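
(For the curious, the scaling argument goes roughly like this: for the model to be dynamically similar to the real crust, stresses – including the cohesion of the material – have to shrink in proportion to the product of the density, gravity and length ratios between model and nature. The back-of-the-envelope sketch below uses assumed, order-of-magnitude numbers, not the specific parameters from our paper.)

```python
# Back-of-the-envelope analogue-model scaling (Hubbert-style similarity).
# All numbers are assumed, order-of-magnitude values for a generic sandbox
# setup, not the parameters used in the paper.

# Model-to-nature ratios (model value / natural value)
length_ratio = 0.01 / 1000.0      # ~1 cm of sand represents ~1 km of crust
density_ratio = 1500.0 / 2700.0   # dry sand (~1500 kg/m^3) vs upper crust (~2700 kg/m^3)
gravity_ratio = 1.0               # both deform under ordinary Earth gravity

# For dynamic similarity, stresses (including cohesion) must scale as
# density_ratio * gravity_ratio * length_ratio.
stress_ratio = density_ratio * gravity_ratio * length_ratio

crust_cohesion_pa = 20e6          # assume ~20 MPa cohesion for brittle crustal rock
target_model_cohesion_pa = crust_cohesion_pa * stress_ratio

print(f"stress ratio: {stress_ratio:.1e}")
print(f"target model cohesion: {target_model_cohesion_pa:.0f} Pa")
# ~100 Pa -- close to the near-zero cohesion of dry sand, which is why slowly
# squeezing a tub of sand mimics brittle deformation of the crust.
```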

Example of a sandbox model experiment where layers of colored sand have been deformed by horizontal compression. Photo by Chris Rowan.
Another compressional experiment, but this time with a layer of icing sugar (confectioner’s sugar), which is more cohesive and therefore stronger, beneath the layers of sand, and a layer of glass microbeads, which are weaker, within them. Photo by Chris Rowan.

After building a sandbox model for some research, I wanted to use it in my classes, but the results of the first attempts were… disappointing. The students enjoyed running the experiments, but it didn’t seem to help them understand any better what structures you get in response to different strains, or what effect weaker or stronger layers have.

So, inspired by this article in Eos on cycle-based learning, I developed an activity where we did multiple runs of experiments, with students sketching predictions of what would happen beforehand, assessing those predictions afterwards, and also reassessing their predictions for experiments that had yet to be run. We kept track of how students’ understanding developed over the multiple cycles by scoring their predictive sketches for how realistic they were. We also tested their general spatial skills before and after the activity.
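
As a rough illustration of the kind of pre/post comparison this involves – the scores and the use of a normalized gain below are purely hypothetical, not the scoring scheme from the paper:

```python
# Hypothetical pre/post comparison using Hake's normalized gain; the scoring
# scheme and the numbers here are my own illustration, not taken from the paper.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the available improvement that was actually achieved."""
    return (post - pre) / (max_score - pre)

# Made-up spatial-skills scores for two students
low_starter = normalized_gain(pre=30, post=65)    # big jump from a low start
high_starter = normalized_gain(pre=80, post=85)   # already near the ceiling

print(f"low pre-test scorer:  g = {low_starter:.2f}")
print(f"high pre-test scorer: g = {high_starter:.2f}")
```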

And we did see improvements! Especially among students who had low scores on the spatial skills test taken before the activity, who did much, much better on the post-test. And, importantly, students still seemed to enjoy this more structured activity.

So yes: analogue sandbox models are cool, and can be effective teaching tools – if you design an activity that helps students focus on the things you want them to learn.

This post was collated from this Twitter thread.

Categories: geology, publication, science education, structures, tectonics

The long-term seismic impact of megathrust earthquakes

Here’s a very interesting analysis of aftershock patterns in the wake of M9+ megathrust events: the aftershocks in a ‘core’ region closest to the rupture shut off within a few years of the main shock, after which seismicity might remain very low for centuries. However, within a larger ‘corona’ of stressed rocks around this core region, seismicity is boosted for decades.

Model of aftershock rate against time, relative to background levels, for 300 years after a large megathrust earthquake. The schematic in the top right shows the relative distribution of the core (blue) and corona (red) zones, which are plotted separately. In the core, the blue line shows a period of activity below background starting within a few years and persisting for several centuries; in the corona, the red line shows a decline to background within a few decades. From Stein & Toda (2022).
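
To get a feel for the pattern being described – and this is just a toy sketch, with made-up functional forms and parameters rather than Stein & Toda’s actual model – you can mock up a ‘core’ rate that is briefly boosted and then suppressed below background for centuries, alongside a ‘corona’ rate that decays back towards background over decades:

```python
# Toy sketch of the core/corona behaviour described above -- NOT Stein & Toda's
# actual model: the functional forms and every parameter are assumed, purely
# for illustration.
import numpy as np

t = np.linspace(1, 300, 300)      # years after the mainshock

# Corona: Omori-like aftershock decay back towards the background rate (= 1)
corona = 1.0 + 50.0 / (t + 1.0)   # boosted for decades, then roughly background

# Core: brief aftershock burst, then a stress shadow holds the rate below
# background for centuries while the fault slowly reloads
core = 20.0 * np.exp(-t / 2.0) + (1.0 - np.exp(-t / 2.0)) * (0.1 + 0.9 * t / 1000.0)

for yr in (1, 10, 50, 100, 300):
    i = yr - 1
    print(f"year {yr:3d}: core = {core[i]:5.2f}x background, corona = {corona[i]:5.2f}x")
```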

One thing I like about this model is how it reconciles the known history of large earthquakes on the Cascadia megathrust with its near-total lack of historical seismicity, which for some time led us to dangerously underestimate the risk it posed to the Pacific Northwest: it’s still recovering from the last rupture in 1700. Furthermore, as Cascadia evolves towards rupturing again in the future, we might expect to see a bit more low-level seismicity in the ‘core’ region.

[collated from this Twitter thread]

Categories: earthquakes, geohazards, geology

Not enough people get taught Earth Science, and that’s a problem for all of us

This article articulates an increasingly concerning question: in a world where increased exposure to natural hazards, resource scarcity and the consequences of climate change are amongst the most critical issues facing our society, why does Earth Science get no love in our education system?

I spend a lot of time teaching non-science majors basic Earth Science, and it does sting a little when students say they took your class because they thought it would be easy, not because it’s interesting or important. Sometimes, I manage to change their minds, which is quite nice. And my job!

But I can’t help but worry about the many, many people who I don’t even get the chance to convince. Should whether people have the information they need to make well-informed decisions about the defining issues of this century depend on their going to college and taking a non-compulsory course to meet their general education requirements? I’d argue not.

Furthermore, we can’t ignore the potential impact this has on the lack of diversity in Earth Science: how many fantastic people who could have contributed to our understanding of the planet, and how to sustainably live on it, have passed Earth Science by, not even realising it was there?

Categories: geology, science education

Weird rocky exoplanets

Obtaining the composition of rocky exoplanets from the spectral signature produced when their dying parent star eats them is already pretty mind-blowing. But the results – which suggest that on some of these worlds, quartz is substituted for olivine in their mantles? That’s a mind-supernova.

I’m not even sure how you could get such a composition – coalescing from a particularly iron-poor protoplanetary nebula, perhaps? But given how Earth’s mantle is dominated by olivine and its pressure-induced phase changes, there could be profound differences in how an olivine-free mantle convects and loses heat.

As this figure from the paper shows, the data also point to another kind of rocky exoplanet rich in periclase (MgO) rather than orthopyroxene. In other words, a significant non-silicate phase!

Figure 3 of Putirka & Xu (2021), plotting estimated compositions of exoplanetary mantles (large diamonds) on one of three ternary diagrams. Top left: components are olivine, orthopyroxene (Opx) and clinopyroxene (Cpx), as in the Earth’s mantle. Note how the light grey and dark grey diamonds plot way outside the ternary diagram, indicating that at least one of the three components is not present. Centre: components are quartz (replacing olivine, as in Earth’s continental crust), Opx, and Cpx. Bottom right: olivine, Cpx and periclase (MgO), with periclase replacing Opx.
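
For anyone wondering how a three-component mineral mode becomes a point on diagrams like these: it’s just a renormalisation plus a linear mapping into triangle coordinates. The example fractions below are invented for illustration, not values from Putirka & Xu (2021).

```python
# Generic recipe for plotting a three-component mineral mode on a ternary
# diagram; the example fractions are invented, not values from Putirka & Xu (2021).

def ternary_xy(a: float, b: float, c: float) -> tuple[float, float]:
    """Map fractions of components (a, b, c) to 2D plot coordinates.

    Component a plots at the bottom-left corner, b at the bottom-right, c at the apex.
    """
    total = a + b + c
    a, b, c = a / total, b / total, c / total   # renormalise so the fractions sum to 1
    x = b + 0.5 * c
    y = (3 ** 0.5 / 2.0) * c
    return x, y

# Hypothetical olivine-dominated (Earth-like) mantle mode: olivine, Opx, Cpx
print(ternary_xy(a=0.6, b=0.25, c=0.15))   # plots towards the olivine corner
```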

So what does this mean? The authors suggest that “quartz-rich mantles might create thicker crusts, while the periclase-saturated mantles could plausibly yield, on a wet planet like Earth, crusts made of serpentinite.”

Of course, there are some sizeable (acknowledged) caveats, given the rather extreme detection method. We’ll have to see whether these inferences stand up to scrutiny. But it’s yet more evidence that beyond our solar system, many surprises about the way planets work await.

[This post was collated from this Twitter thread]

Categories: geology, planets, rocks & minerals

How to spend a lot of money on a problem without making any progress in solving it, nuclear fusion edition

I’ve been known on occasion to mock fusion for being eternally 25 years in the future, and this article on the latest potential advances doesn’t really help me assess how credible the people and approaches that star in it actually are. But there is some eyebrow-raising information in the background that gives some context to the long wait for the fusion dawn.

In particular, a report from the U.S. Energy Research and Development Administration in 1976 projected that if $9 billion per year was spent on research, practical fusion energy could be achieved by 1990. Reduce that to $1 billion per year, and the projection was “Fusion Never.” And guess what?

“[$1 billion]’s about what’s been spent…Pretty close to the maximum amount you could spend in order to never get there.”

Current annual spending by the US government on fusion research: $670 million. In contrast, the estimated annual cost of US fossil fuel subsidies is $650 billion.
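
Putting those quoted figures side by side (no inflation adjustment attempted; these are simply the numbers cited above):

```python
# Putting the quoted figures side by side (no inflation adjustment attempted;
# these are simply the numbers cited above).
fusion_spend = 670e6        # current annual US government spending on fusion research
fossil_subsidies = 650e9    # estimated annual cost of US fossil fuel subsidies
erda_target = 9e9           # 1976 ERDA annual funding level projected to deliver fusion by 1990

print(f"subsidies / fusion spending: {fossil_subsidies / fusion_spend:.0f}x")
print(f"subsidies / ERDA target:     {fossil_subsidies / erda_target:.0f}x")
```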

There is, of course, no proof that $9 billion/year would actually have brought working fusion’s perpetually-25-years-away horizon any closer. But opting for ‘definitely not enough money to work’ whilst demonstrating the ability to throw orders of magnitude more money elsewhere in the energy sector is, at best, poor strategic foresight. At worst, it looks a lot like what you’d do if you wanted to make it look like you were pushing for an energy breakthrough without materially threatening the status quo.

With the incidental bonus of turning that potential breakthrough into a punchline. My scoffing suddenly feels a little hollow.

[This post was collated from this Twitter thread]

Categories: climate crisis, society