A Balanced Complexity

An interesting relationship to consider:

  1. A balanced complexity of ecosystem sounds = environmental health
  2. A balanced complexity of brain activity = mental health

If you’re interested in the concept of self-organized criticality or networks, more here:

Like students, plants give up after years of failure, too

By Famartin (Own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
“The plant life of Australia’s outback may have “given up”, according to satellite-based maps tracking the impact of changing climatic conditions, such as rainfall and temperature, on the world’s ecosystems.

“The study suggests the vegetation of our interior does not respond to sudden increases in rainfall because it has “learned” that drought will soon follow. . . .

“‘Sometimes when you subject an ecosystem to some kind of disturbance, such as a drought or fire, they behave differently depending on their past,’ he explained. . . .

“‘They don’t care if it is good favourable conditions now, because they know it is temporary and it is not worth investing in growing more at this time because they become bigger and it is a lot more to care of when the drought returns,’ he said.”

—Dani Cooper, “Global satellite map highlights sensitivity of Australia’s plants to changes in rainfall and temperature” on ABC Science News

A Brain Is More than the Sum of Its Parts

“Why does the brain transcend bell-curve averages?  One possible explanation is that the brain lacks a privileged scale because its functioning cannot be reduced to component parts (i.e., neurons).  Rather, it is the complex interactions between parts which give rise to phenomena at all spatial and temporal scales. . . . Like averages, reductionism is deeply ingrained in our scientific thinking.  Water is explained in terms of molecules, molecules in terms of atoms, etc.  If the brain is reducible to simpler parts, it should also exhibit a privileged scale of organization.

And yet, it does not.  A unifying mechanism for power law behavior in the brain and other systems is that of self-organized criticality (SOC).  According to this model, systems such as the brain operate on the brink of instability, exhibiting slow processes that build energy and fast processes that dissipate energy.  In such systems, small causes have effects of many sizes. Imagine you are at the beach building a sand pile.  As you add sand, the pile gets taller until its slope reaches a critical angle where it can barely support more sand.  Steadily adding more sand will result in avalanches ranging in size from a few grains to significant portions of the pile.  The avalanches are a scale invariant emergent property. Studying individual grains of sand tells you little about avalanches.”

—Joel Frohlich, “Scale Invariance: A Cautionary Tale Against Reductionism” on Knowing Neurons (HT Alexis Madrigal’s newsletter)
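
The sand pile in Frohlich’s passage can be made concrete in a few lines of code. Below is a minimal sketch (in Python, my own illustration) of the classic Bak-Tang-Wiesenfeld sandpile model, the standard formalization of the beach pile he describes; the grid size, toppling threshold, and number of grains dropped are arbitrary choices, not parameters from any study.

    import random
    from collections import Counter

    # Minimal sketch of the Bak-Tang-Wiesenfeld sandpile model.
    # All parameters here are arbitrary, illustrative choices.
    SIZE = 20        # grid is SIZE x SIZE; grains falling off the edge are lost
    THRESHOLD = 4    # a cell topples once it holds this many grains

    def drop_grain(grid):
        """Add one grain at a random cell, then relax the pile.
        Returns the avalanche size (number of topplings triggered)."""
        x, y = random.randrange(SIZE), random.randrange(SIZE)
        grid[x][y] += 1
        avalanche = 0
        unstable = [(x, y)]
        while unstable:
            cx, cy = unstable.pop()
            if grid[cx][cy] < THRESHOLD:
                continue
            grid[cx][cy] -= THRESHOLD          # the cell topples...
            avalanche += 1
            for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if 0 <= nx < SIZE and 0 <= ny < SIZE:
                    grid[nx][ny] += 1          # ...shedding one grain per neighbor
                    unstable.append((nx, ny))
        return avalanche

    grid = [[0] * SIZE for _ in range(SIZE)]
    sizes = Counter()
    for _ in range(100_000):
        s = drop_grain(grid)
        if s:
            sizes[s] += 1

    # No single "typical" avalanche emerges: counts fall off smoothly
    # from a few grains to pile-spanning events.
    for size in sorted(sizes)[:10]:
        print(f"avalanches of size {size}: {sizes[size]}")

Run it and the avalanche counts fall off roughly as a power law: tiny avalanches are common, huge ones rare but never impossible, with no privileged size in between. And, just as Frohlich says, nothing in the rule for a single cell would tell you that.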

On Failure, Uncertainty, and Risk

I came across this really interesting video shortly after writing about the concept of “explosive networks.” In this overview of his book, Greg Ip explains how efforts to increase safety and stability often end up producing greater catastrophe (fire prevention included). I think his ideas pair well with our broader conversations about the relationship between complex systems and chaos.

Ip asks an interesting question: How can we allow danger to make us safe?

He gives the example of the aviation industry: it is now far safer to fly than to drive, thanks to the pressure and transparency generated by the disasters that have occurred. He also frames this in terms of the economy, acknowledging that risk-taking is ultimately what increases wealth.

In the world of education, there has recently been much talk about the importance of failure in learning, and it’s interesting how this parallels broader discussions of complexity and uncertainty in other sectors. For example, if you really want to geek out, you can watch a video of a panel of Tyler Cowen, Jared Bernstein, and Alex Pollock debating economic principles. What I found interesting was that all of the panelists implicitly agreed that human beings suffer from psychological limitations, which result in greater uncertainty and unpredictability. We have frequently examined this topic here under the banner of “cognitive bias.”

Much of the work we do in my school’s Support Services department (we’ve decided to rebrand the term “special education”) is to try to shift students’ perceptions of themselves. Often the greatest barrier to student learning is not disability, nor even the content and tasks demanded by rigorous academic subjects, but rather a student’s belief that they are unable to do the work, or that asking the questions necessary to clarify their understanding is simply not worth the “risk” of appearing “stupid.”

To bring the classroom side of things back to Greg Ip’s question: How can we allow danger to make us safe? I think in a school we can never completely remove the psychological “dangers” of peer and self-perception that come with difficult and complex academic content and tasks. The question in a school is not how we can make content easier, or how we can ignore the reality that failing in front of others is inherently risky, but rather how we can increase students’ willingness to take the risks necessary for learning. And as Ip suggests of the aviation industry, being transparent about the smaller failures and misconceptions that inevitably do occur along the way may be of greater benefit in the long run, preventing much greater disasters from occurring farther down the line.

These Little Avalanches

Visitor running down a dune in Great Sand Dunes National Park.
“If you frequently trigger small cascades, you never get really massive events, but you [sacrifice] all that short-term profit,” D’Souza explained. “If you prevent cascades at all costs, you might make a lot of profit, but eventually a cascade is going to happen, and it will be so massive it [could] wipe out your entire profit.”

—Jennifer Ouellette, “The New Laws of Explosive Networks” on Quanta Magazine

This quote, referring to a concept termed “explosive percolation,” runs parallel to best practice in fire prevention.

After decades of overzealous fire prevention (think: Smokey the Bear), we’ve ended up in a situation where apocalyptic wildfires have become the norm. Experts have come to recognize that fire prevention now requires smaller controlled burns, or, where controlled burns are too risky, actively thinning underbrush and trees through human labor.

The concept of “explosive percolation” also relates to an idea we’ve explored here before, “self-organized criticality,” in which complex systems maintain stability via “small avalanches,” spontaneously poising themselves at the transition between chaos and order.
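
D’Souza’s tradeoff is easy to see in miniature. The toy simulation below (my own sketch, not from the Quanta article) lets “stress” build by one unit per step while a random spark occasionally releases everything that has accumulated; an optional cap triggers small, deliberate releases instead. The cap, spark rate, and step count are all invented for illustration.

    import random

    # Toy model of "frequent small cascades" vs. "prevent cascades at
    # all costs." Every number here is an arbitrary illustrative choice.
    random.seed(1)
    STEPS = 10_000
    SPARK_RATE = 0.01   # chance per step of an unplanned cascade

    def largest_cascade(cap=None):
        """If cap is set, deliberately release stress whenever it exceeds
        the cap (frequent small cascades); otherwise suppress everything
        and wait for a spark. Returns the biggest single cascade seen."""
        stress, biggest = 0, 0
        for _ in range(STEPS):
            stress += 1
            if cap is not None and stress > cap:
                biggest = max(biggest, stress)   # small, planned release
                stress = 0
            elif random.random() < SPARK_RATE:
                biggest = max(biggest, stress)   # unplanned release of it all
                stress = 0
        return biggest

    print("largest cascade, frequent small releases:", largest_cascade(cap=20))
    print("largest cascade, suppression only:       ", largest_cascade())

With frequent small releases, the largest cascade stays pinned near the cap; with suppression alone, it is typically many times larger. That is the “wipe out your entire profit” scenario, and the controlled-burn logic, in a dozen lines.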

In schools, this reinforces the notion that maintaining stability and order within a school community (or classroom) requires “frequently triggering small cascades” of new learning and activities, interspersed among the stable norms, rituals, and traditions that any school or teacher maintains.

In schools where order is so strictly maintained as to suffer from a “blind application of rules,” greater disorder may await further down the line. As always, a healthy balance necessitates diversity.