Archive for the ‘Complex Adaptive Systems’ Category

Gremlins (Part 2)

May 28, 2015

Today’s “Gremlins” — Complexity & Uncertainty

Consider an airline that has fine-tuned its daily flight schedule and passenger seat pricing with yield management techniques. The strategy is set to optimize asset deployment and return on investment for a flight. And that works very well until the system becomes destabilized.

One or more planes become inoperable as they experience unexpected equipment failure and are grounded. Depending on a number of factors, like degree of connectivity and inter-dependency, you might see a phenomenon called “cascading failure,” as the failure of one “node” (the grounding of one airplane) cascades through the system, scrambling connecting flights and stranding ticket holders awaiting the grounded airplane.

The chief scientist for a corporation that pioneered airline yield management described the problem: “John, I made a fortune for airlines through incremental increases in pricing and scheduling strategies over the years. The result was a very fragile system.”

You will see similar examples of these phenomena in the electric grid, in financial markets (systemic risk), in the spread of contagion through pandemics, and in counterfeit or adulterated food entering the supply chain.

And as you might guess, network theory and other tools in our complexity tool box can provide insights and “what if” alternatives for re-wiring highly complex, time-dependent systems such as yield management systems. The pertinent question is no longer focused on optimality; rather, the focus shifts to expediency and survivability. The beauty of network theory is that it can add value in describing an enormous number of problems and opportunities.
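The cascade dynamic in the airline example can be sketched in a few lines. This is purely my own illustration (the toy flight network and the `cascade` helper are invented, not any airline’s actual system): one grounded aircraft disrupts every flight downstream of it.

```python
from collections import deque

# Toy dependency graph: each flight "feeds" the flights that depend on
# its aircraft or crew. Entirely illustrative.
feeds = {
    "F1": ["F2", "F3"],   # F1's aircraft continues on as F2 and F3
    "F2": ["F4"],
    "F3": ["F5"],
    "F4": [],
    "F5": ["F6"],
    "F6": [],
}

def cascade(failed_node, graph):
    """Return every flight disrupted when one node fails (breadth-first)."""
    disrupted, queue = set(), deque([failed_node])
    while queue:
        node = queue.popleft()
        if node in disrupted:
            continue
        disrupted.add(node)
        queue.extend(graph.get(node, []))
    return disrupted

print(sorted(cascade("F1", feeds)))  # one grounded plane scrambles all six flights
```

Grounding F1 takes out the whole chain; grounding a leaf like F4 disrupts only itself. The more tightly optimized (interconnected) the schedule, the larger the set returned for any single failure.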

The same idea holds for commercial aviation and for the layout of cities in urban planning; its supply chain applications are almost infinite; and its military applications (some of which business can borrow) are the foundation of “netcentric operations,” where real-time communications link levels of command on the battlefield. And as you might surmise, network theory is the foundational theory for communication systems and, more recently, has been incorporated into social media analysis.

As Barbara Tuchman described in The Guns of August, once German mobilization for World War One began, an activity inexorably tied to precise and highly interdependent actions synchronized with the German railroad operators, it could not be stopped without throwing the entire mobilization into a chaos that would take weeks to recover from, in the meantime leaving Germany helpless to respond to any attack. Once they started to mobilize there was, literally, no turning back without complete and utter confusion. War was inevitable.

Path dependency is a phenomenon that occurs when no physical or intellectual slack is built into the system. “In the strongest version of path dependencies, path transformation is presumed to be highly unlikely except through rare radical ruptures or re-orientations, which are associated with violent external shocks.”

Are there theories and tools to help overcome path dependency risk and the other risks associated with highly connected and interdependent systems?


Into The Matrix – Adding new tools and perspectives


I developed this matrix to match the right tool or approach to levels of complexity and uncertainty. One of the things we see with successful consulting firms is the tendency toward a certain path dependency of their own when it comes to preferences in models and approaches.

There is an overwhelming bias toward focusing our problem-solving approaches inside the red circle of The Matrix. The tools inside the red circle, e.g. cost-benefit analysis, linear programming, ROI, predict-then-act models, etc., do not work in environments exhibiting high levels of complexity and uncertainty, e.g. complex adaptive systems (aka CAS). They work for the industrial engineer dealing with relatively simple systems designed and run by human beings. The value of the matrix is that it puts methods and client expectations into a realistic perspective.

I built this model for the Department of Homeland Security, Science & Technology Directorate, which faces extremely high levels of complexity and uncertainty (aka “wicked problems”).


Wicked problems can only be successfully tamed by moving outside your comfort zone, going beyond the red circle.

Closing thoughts…

First, increasing complexity is not a passing fad. It is the new norm and it continues to produce high levels of uncertainty and change that are challenging the basic assumptions that made us successful in the past.

Second, change is not continuous or smooth. It comes in bursts. It is like a house cat that sleeps most of the day and then suddenly starts “bouncing off the walls.” That’s why we are so interested in agility and resilience.

Third, the market is brutally efficient. As the prevailing paradigm fails it will be replaced by new norms, assumptions, values and expectations. What doesn’t work is replaced – that’s evolution. That holds true for business models, political constitutions, and consulting firms. The irony is that while CAS are inherently unpredictable, there is a deterministic certainty that we can’t avoid complexity, and that we need to build in resilience in our systems/businesses if we want to make the evolutionary “cut.”

Fourth, all risk is transitive. If you are at risk on the Internet, and I depend on you, then I too am at risk.

Complexity, Sharks & Risk Consultants – How “Internal Auditor” Magazine Got It Wrong

August 8, 2011

They got it wrong. Internal Auditor published an article by Neil Baker, “Managing the Complexity of Risk,” claiming that “The ISO 31000 framework aims to provide a foundation for effective risk management within the organization.”  Well…not so fast.

“Complexity” has become something of a buzz word in today’s business culture.  But I think our understanding of the word is vague. Naming something is not the same as actually knowing anything about what you just named (see “The Red Wagon Principal: Knowing Is Better Than Naming”).  The misappropriation of a concept is always done with the best intentions.  The problem with Neil’s article is that it sets false expectations for the reader: “Well, you know, we risk experts are on top of this ‘complexity thing’ and we’ve got these magic bullets, checklists and procedures. We’ll show Mr. Complexity who’s boss.”


Let me be perfectly clear: ISO 31000 (and COSO for that matter) has absolutely nothing to do with managing complexity or uncertainty – regarding risk or otherwise. In simple terms: no matter how thinly you slice it, it’s still baloney.

I am not so much worried about “new” risks – there is not much new under the sun. I am, however, worried about certain types of risks, especially those we’re confident about understanding.  It is always a mistake to get cocky when it comes to risk.  A little ‘bump’ here and a little ‘change’ there and before you know it, what we thought we knew all about is cloaked in fog and uncertainty.  All of a sudden our historic data and expert opinions no longer hold water.  By the way, have you heard what happened to Amanda the Risk Expert?

Vacation at Amity Goes Wrong! Risk Consultant Eaten by Bruce-the-Shark

Amanda was an ERM professional looking forward to a vacation at the seaside community of Amity.  She elicited expert opinions and facilitated a risk self-assessment with the Mayor, several Aldermen, and the Amity Chamber of Commerce, all of whom assured her – they were experts, you know – that she would enjoy a safe, quiet and relaxing stay at their little piece of heaven.  If nothing else, Amanda knew her ISO 31000.

Unfortunately, she was promptly eaten by Bruce-The-Shark the first evening of her arrival as she went for a midnight swim.

I won’t bother to comment on how self-interest can cloud “expert” assessments. As for predictive analytics?  She didn’t have any data suggesting a history of shark attacks at or around Amity.  Now that may suggest crummy data, or it may mean that sharks seldom frequented the Atlantic waters around Amity. But the lack of data suggesting shark attacks around Amity did not mean it was safe to go into the water!  Remember the story about the turkey who gets a nice breakfast of corn every morning, except for that one morning right before Thanksgiving.  And then whack! That, by the way, is called “the problem of induction.” The world is a very dynamical place.

Amanda should have focused on “consequence” rather than “threat.”  Lots of things can go wrong when you’re swimming by yourself, at night, in the Atlantic.  The threat could be a shark, or a cramp, or you could get run over by a speed boat.  Who knows?  And that’s my point. The threat is not predictable but the consequences of any of a multitude of “bad things happening” are, e.g. dying alone at midnight in the Atlantic.  In Amanda’s case the consequence was existential.

The biggest problem for Amanda was her mind-set.  She was using an epistemologically flawed paradigm, i.e. her professional approach didn’t hold water.  The late Thomas Kuhn would have called it “received knowledge,” i.e. the insights we acquire through school and in our profession which are seldom challenged. The paradigm the “late” Amanda used was retrospective and opinion-based. She assumed that today is pretty much like yesterday and that tomorrow will be pretty much like today. She assumed stability and continuity, or what economists call “equilibrium.”  Truly complex adaptive systems, like sharks, have one characteristic that both economists and ocean swimmers hate: surprise.

Who would have predicted:

  • The S&P downgrade of United States to “AA” status?
  • Twitter
  • The Macarena

Nobody saw the demise of poor Amanda coming, right?  Remember these two points, if nothing else….

Complexity = Surprise

Wrong Paradigm = Fish Bait

“You’re gonna need a bigger boat.” Chief Brody

Great White sharks, like deep drilling oil spills, financial meltdowns, and terrorist attacks, are not predictable; they constitute surprise events with significant consequences.

“We’re gonna need a bigger paradigm.” 

ISO 31000 and COSO are more than adequate frameworks for well-behaved randomness, and for simple systems characterized by linearity, equilibrium and stability.  But not so much for higher levels of randomness and complexity.  Like Chief Brody, we need a bigger and more expansive paradigm, one that can deal with non-linear systems operating far from equilibrium and exhibiting wild randomness.  And, of course, surprises like Bruce-The-Shark.

Amanda would have been better off going to Las Vegas.  At least predictive modeling works there – the House always comes out on top.
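The Las Vegas remark holds because casino games are stationary systems with fixed, known probabilities, so expectation calculations actually apply. A quick check using American roulette as my own example (38 pockets, 18 of which win an even-money bet on red):

```python
from fractions import Fraction

# American roulette: 38 pockets (1-36 plus 0 and 00); an even-money
# bet on red wins on 18 of them. My example, not from the post.
p_win = Fraction(18, 38)
ev_per_dollar = p_win * 1 + (1 - p_win) * (-1)

print(float(ev_per_dollar))  # ≈ -0.0526: the House keeps ~5.3 cents per $1 bet
```

The distribution never “adapts,” so this negative expectation grinds away identically on every spin. That is exactly the stationarity that sharks, markets, and other complex adaptive systems refuse to provide.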

Bon Appétit!

Hey Accenture! Thanks for the Really Cool App! A Review of Their Global Risk Management Study & Thoughts on Competition

July 8, 2011

Accenture isn’t “top of mind” when we think of ERM…for now.  I recall a Senior Director at a consulting firm I worked for (it’s no longer in business) telling an auditorium of about 1,000 consultants: “I’m not afraid of Accenture.  They don’t scare me.”  Ah huh.

I thought: “That’s because you haven’t come up against them in the market.”   Look, this Risk Report is more than a compilation of statistics and trends, it tells us a lot about Accenture’s corporate culture and what’s important to them.  But first, go get the App! The App tells you a lot about this corporate culture.

The App – They ought to charge you for it. A couple of clicks and you’ve got customized and mobile knowledge management. Works on your iPhone, iPad, Android device or laptop. The only way they could have made it better is to have tossed in Keyboard Cat for good measure. Accenture wants to be your E-Buddy and they’ll go to a lot of expense to spoon-feed you great info.

Okay, why do consultants conduct and publish painfully correct and beautifully pictured risk research?

  1. To prove they are smart and pretty
  2. To give staff something to do
  3. To build relationships

Pie charts never made a sale. It’s all about relationships.  That’s part of Accenture’s DNA. That kind of thinking is dangerous…for the competition. That’s why they are aggressively promoting this Report and using new media to do it.  It is a tool to build relationships.  Research is not an end in itself – and this is very good research, mind you.  But then so is everybody else’s!

Aside from the classy mobile App, Accenture does everything but invite you for coffee at Starbucks with your industry Partner. The Report lists private emails for the Partners. I can’t even get my kid to reply to my emails; what chance do I have with a real, live Partner? Well, I gave it a try with two, Michael Chagares (Cross-Industry) and Shelly Hurley (Global Resources). Both replied! Steve Culp, the Managing Director, is just a click away too. Nice touch.  Nice example of executive leadership.  Culture, folks, culture.

Accenture doesn’t cover as many market verticals as the competition, but they drill down with the same expertise and level of insight you expect from a Big 5.  So how do you differentiate?  Bring something unique and valuable to the table, something that is insightful. Consider the following excerpt from their risk report.

 “Due to the systemic nature of emerging risk, and the severe potential impact of such events, companies with slight advantages in detecting and managing emerging risks can obtain significant competitive opportunities.” (p.42 Accenture, 2011 The Report…)

Accenture is actually describing Brian Arthur’s Law of Increasing Returns[1] (although it is doubtful they are aware of this), where a tiny advantage early on can produce amazing compounding effects over time.  They need to translate a beautiful theory into something tangible the client can understand. Here’s an example…

In 2000 Philips had a “clean room” fire in its microchip/wafer plant in New Mexico. Both Ericsson and Nokia bought chips from Philips. Philips estimated about a week’s delay in production. Nokia played it safe and sent teams around the globe to line up other sources. Ericsson’s response was more laid-back. As described by Sheffi in The Resilient Enterprise[2], the head of Ericsson’s consumer electronics division didn’t hear about the chip problem until weeks after the fire. Once they realized the magnitude of the problem, it was too late. Nokia had already locked up all the surplus capacity. The plant was down for 9 months. Ericsson took a $2.34 billion loss in the company’s mobile phone division and ended up being forced into a JV with Sony. During the first 6 months of the crisis Nokia’s share of the handset market went from 27% to 30%; Ericsson’s went from 12% to 9%. That’s the competitive advantage of risk management and crisis management joined at the hip…oops, I meant fully integrated across silos.
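Arthur’s increasing-returns dynamic can be sketched as a Polya-urn-style simulation. This is entirely my own toy model (not Accenture’s or Arthur’s formulation): each arriving customer joins vendor A with probability equal to A’s current share, so whatever small lead emerges early feeds back on itself.

```python
import random

def market_sim(seed_a, seed_b, customers=10_000, seed=42):
    """Polya-urn sketch of increasing returns: a customer picks vendor A
    with probability equal to A's current market share."""
    rng = random.Random(seed)
    a, b = seed_a, seed_b
    for _ in range(customers):
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return a / (a + b)  # A's final market share

# A 55:45 head start looks trivial, yet the feedback loop "locks in"
# early leads; final shares routinely drift far from 50:50 across seeds.
print(market_sim(55, 45))
```

Run it with different `seed` values and the final split wanders widely, which is the point: under increasing returns the winner is decided by small early accidents (a week’s head start on surplus chip capacity, say), not by any later equilibrium.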

Complexity — The consequences of being caught napping are even worse today. We’re actually dealing with multi-tier risk — interconnected vulnerabilities and consequences that are often as opaque as they are deadly.  They cannot be truly understood with prevailing ERM methodologies. It’s like trying to repair a computer with a hammer.  You might get lucky, but odds are you will just make matters worse.  The varying degrees of interdependence and interconnectivity between capital markets, manufacturing and consumer products markets exhibit eye-crossing complexity.  Optimizing everything from supply chains and flight schedules to capital market trading has made a lot of money…and made us more vulnerable to catastrophic failure, made the networks more fragile.

Two Paradigms

I think we’re going to have two paradigms, both operating simultaneously and dealing with completely different sorts of risks. One deals with fairly well-known risks, using self-assessments, risk registers, and the traditional tool kit. The other deals with Complexity Enriched Risk[3], or CER, and uses a different approach. In the CER environment the “ERM Assessor” morphs into an “ERM Consultant” as the overall operational and financial environment becomes “enriched” by complexity. We need to understand, and help our clients to understand, this strange new environment. I think we all will become more and more familiar with these terms in risk management…

  • Complex Adaptive Systems
  • Resilience – ecological models and engineering models
  • Agent Based Modeling
  • Network Theory and Network Analysis
  • Behavioral Economics
  • Tiered Risk
  • Non-Linear Behavior and Non-Linear Modeling
  • Thresholds, Tipping Points & Phase Changes
  • Stationarity

To add value in a CER environment the consultant must provide expert, substantive advice, not just on the ERM process but on specific risks at hand. This is outside the comfort zone of many ERM practices.

It’s all about rules for auditors, it’s all about relationships for consultants. Both are going to have to learn about complexity at the tool kit, grass roots level to meet 21st century risk.

[2] Yossi Sheffi, The Resilient Enterprise, MIT Press, 2005, pp. 8-9.

[3] Coined by John Marke…kinda catchy eh?

Viva The Revolution in Predictive Analytics! A Shameless Parody of Bogart’s “Treasure of Sierra Madre” and Hemingway’s “Old Man & The Sea.”

June 25, 2011

Okay, I poke fun at the rising elite of consulting, i.e. those involved with predictive analytics or “the revolution in analytics.” Yet the parody raises serious issues around the capabilities and limitations of certain kinds of analytical modeling.  I am, by the way, a “fan” of quantitative methods…so keep smiling, kick back and have a Corona.

The Old Man & The Blogger

The Blogger sat in The Cantina on a hot, lazy afternoon sipping a Corona.  Managers and marketing types gathered around him, hanging on his every word.   He spoke of “integrated business planning” and “executives getting fast answers to important questions.”   Then he crossed the line. “I think we’re about to witness a revolution in how companies use analytics in business processes. I don’t use that overworked term lightly. I expect this to be as revolutionary as the impact that client/server computing had on transaction processing and related systems such as ERP and CRM.” Oh, yeah?

Revolution!  The Old Man, a bearded Operations Research analyst in the back of the bar stirred from his siesta.  He moved slowly, almost painfully as ancient bones strained to come alive. But his eyes were coal-black, glinting with some sort of passion or maybe just crazy from the heat.  He tipped his sombrero and squinted…the eyes said it all: “I’ve been in the mountains with Fidel!”  A relic of the past hung from the Old Man’s belt, an HP RPN[1] calculator.

They said he belonged to the old Operations Research Society of America (ORSA).   They said when he has too much tequila he does matrix math with a pencil and paper.  He scares the undergraduates.  Maybe he had been in the Sierra del Escambray’s with Fidel doing some sort of weird linear programming or optimization?  Rumors?  No one really knows.

It’s About Insight    “Revolution in Analytics?  Revolution in Analytics?!  We don’t need no stinking Revolution in Analytics.” he mocked.   “Hey Amigo, we’ve been doing quant math and running yield management optimization programs since before you were born.”

The Old Man was getting more and more agitated…  “It ain’t about crunching numbers!  It’s about insight!  Ain’t nobody s-plained to you about complex adaptive systems or CAS? Eh?   No? Well you can kiss your assumptions about equilibrium, auto-correlation, and stationarity goodbye!   No more deterministic solutions!”  Getting in The Blogger’s face now: “Do you know anything about indeterminacy and heuristics?  Emergent properties? Yeah! I didn’t think so.”   The Old Man was on a roll now and there was no stopping him.  He was animated, a cross between Zorba the Greek and Hunter S. Thompson.  It was frightening but fascinating at the same time…we were spellbound.

 “You go down to Santa Fe (he meant the Santa Fe Institute which does research on CAS) and ask them what they think of predictive analytics!  And they will tell you that CAS solution landscapes DANCE!  Because the system adapts, it morphs, and it laughs at us!”

But now the Old Man was spent.  He was tired and needed rest. So he went down to the sea to watch the waves dance and was lost in the hubris of the past.

Hopeful but Cautious – Don’t “Pull A McNamara!”

The pros know the capabilities and limitations of analytical modeling.  They understand stationarity, emergent properties, and adaptive behavior in CAS, etc.  But others…well, not so much.  And that can cause problems.

There’s a lot of hype going around about analytics.  And that’s to be expected in fields that re-emerge every twenty or so years into the managerial mainstream.  We had Operations Research during World War Two, the RAND years of quantitative systems analysis during the Kennedy and Johnson Administrations, Chaos Theory and the hedge funds during the 1990s, and so on.  When good data and good theory converge the results can be astounding, as in the airlines’ yield management and optimization innovations.  We also can look at predictive analytics in maintenance and equipment failures, and I recently wrote a piece reviewing a JAMA study on how performance analytics were reducing heart attack mortality during hospital stays. Even though this is a “healthcare” orientation, the methodology and epistemology are brilliant and can serve as a template for virtually any industry.

Although I poke fun at combining quantitative analysis with the high levels of uncertainty accompanying CAS, I am optimistic we can accommodate both views simultaneously (see my CERTS posting).  However, when bad theory dominates, as in the RAND/McNamara employment of Planning, Programming & Budgeting during the Vietnam War…well, in that case a lot of good people died. We can never let that sort of analytical hubris happen again…see Marke, “Approaches to Risk Under Conditions of Uncertainty and Complexity,” presented at the Society for Risk Analysis 2007, for a critique of McNamara’s approach.

So here’s the deal: if you position yourselves as a bunch of quants who do regression analysis or multivariate data analysis or whatever (being defined by the tool kit rather than the problem), you are doomed to work on relatively deterministic problems conforming to stationarity and living in equilibrium. Alternatively, if you proceed to highly stochastic or indeterminate problems, the chances of hubristic failure are very high.  The two small PowerPoint pages at the bottom of this posting graphically explain the typology. Remember the folks at RAND and don’t “Pull A McNamara.”

I think there is a middle ground.  It focuses on a multi-disciplinary approach that is not dominated by the tool kit.  Yeah, yeah, I know….you already spent a lot of money positioning this “new” service as “analytical.”  Re-positioning is gonna drive the marketing-types nuts.

Or maybe it’s just time to go down to the sea and watch the waves…

Here are slides from the SRA presentation (referenced and linked above) that may prove helpful (click to enlarge). They were adapted from John Sutherland’s A General Systems Philosophy for the Social and Behavioral Sciences (1973).  I am happy to discuss at length should there be interest, and I would also be happy to work with “quant” professionals who would like to incorporate complexity and CAS theory into their work.

[1] RPN stands for “Reverse Polish Notation”

The Rule of LGOPs (Little Groups of Paratroopers) A Metaphor for Resilience

June 6, 2011

On this, the 67th anniversary of the World War II D-Day invasion, it is only fitting to remind ourselves that rarely do things go as planned in battle.

The Prussian military strategist Carl von Clausewitz called it the “fog of war.” It must have been pretty foggy on the night of June 5th and the morning of June 6th, 1944, off the coast of Normandy. In the pre-dawn hours Airborne troopers were dropped all over the field of battle, few hitting the “drop zone” as planned…

Rule of LGOPs
After the demise of the best Airborne plan, a most terrifying effect occurs on the battlefield.

This effect is known as the Rule of LGOPs. This is, in its purest form, small groups of 19-year-old American Paratroopers. They are well-trained, armed-to-the-teeth and lack serious adult supervision. They collectively remember the Commander’s intent as “March to the sound of the guns and kill anyone who is not dressed like you…” …or something like that. Happily they go about the day’s work…

The Rule of LGOPs is instructive:
– They shared a common vision
– The vision was simple, easy to understand, and unambiguous
– They were trained to improvise and take the initiative
– They needed to be told what to do, not how to do it

The Rule of LGOPs is, of course, a metaphor for resilience. All Armies, by the way, believe their soldiers are the best, the bravest, the most noble. But not all are the most resilient or adaptable. To be sure, I am not denigrating planning, whether that structured thought effort is military, homeland security, or risk assessment (which I include as a type of planning). But anticipation must go hand in glove with adaptability.  Life is full of surprises.

The full paper is available here and explores some interesting philosophical issues, especially epistemology — the theory of knowledge and our world views.  Rarely do things go as planned. Rarely does everybody hit the drop zone.

Click the following for “Blood Upon The Risers”  and a tribute to the American Paratrooper.

The Pogo Pathology or Where Did All Those Cats Come From?!!

June 4, 2011

A pathology can be thought of as a deviation from a normal, healthy condition.  In a real sense pathology is the antithesis of sustainability or resilience.  Ironically, there are pathologies associated with sustainability and resilience programs…programs that have turned out worse after we’ve gotten our hands on them.

We are incessant meddlers, in both man-made and natural systems.  And sometimes things go wrong, sometimes horribly wrong, even with the best intentions.  More complexity increases the ways things can go wrong and shortens our response time.  We design flood control programs that cause catastrophic flooding.  Natural Resource Managers introduce predators into environments where they ravage ecosystems. Regulators encourage competition and efficiency in banking, and “too big to fail” banks nearly cause the collapse of the global financial system.  Our success in reducing forest fires leads to massive conflagrations far worse than had we left well enough alone. You get the picture?  If I were Pogo I would be sitting there with a double barrel shotgun yelling: “Get outta my swamp! All you do is screw things up!”

When One FacePalm Won’t Cut It – In 1949 five domestic cats were introduced to Marion Island to deal with a mouse population out of control. The cats were not neutered.


By 1977 their population had grown to around 3,500, and the cats had developed a taste for the native birds, threatening to drive them to extinction. A similar situation prevailed on Macquarie Island (Tasmania).  Cats (unneutered) were introduced to kill rats, mice and rabbits. And it worked, until they multiplied like crazy and started eating the endangered birds they were supposed to protect. 3,000 cats were culled from the population and, as is typical for a complex adaptive system, the island ecosystem did just that: it adapted. True to form, the rabbit population ballooned to 130,000, causing tremendous damage to vegetation, while the rat population grew likewise, developing a taste for bird chicks.

In April 2014, after nearly three years of monitoring with no sign of surviving rabbits, rats, or mice, the project was declared a success.

As a species we are ill-suited to deal with increasing uncertainty and complexity. This is especially true regarding sustainability and resilience, which usually involve our dealing with complex adaptive systems — either natural or man-made. In retrospect some of these concepts from complexity may have come in handy.

Perhaps the most grievous error was assuming the “Engineer’s View of Resilience.” Here management assumes equilibrium, stability, or stationarity, and we engineer/design solutions anticipating equilibrium.  Our paradigm seeks efficiency and constancy, and assumes predictability.  They favor the Gaussian/normal distribution, and were obviously absent from engineering or “B” school the day exponential distributions were introduced. Cats know nothing of statistical distributions, Gaussian or otherwise.  Aside from eating, sleeping and playing, cats produce kittens.  Fertile cats, introduced into a closed ecosystem without predator species, produce lots and lots of kittens, and do so at a non-linear rate…or damn fast.
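A back-of-the-envelope check of the Marion Island numbers quoted above, assuming simple exponential growth (my simplification; real population dynamics are messier), shows how innocent the per-year rate looks next to the result:

```python
# Marion Island figures from the post: 5 cats in 1949, ~3,500 by 1977.
n0, n_final, years = 5, 3500, 1977 - 1949

# Implied annual growth factor under exponential growth N(t) = n0 * g**t
g = (n_final / n0) ** (1 / years)
print(f"annual growth factor ≈ {g:.3f}")   # ≈ 1.26, i.e. roughly 26% per year

# A "modest" 26%/year, compounded, is anything but modest:
for t in (10, 20, 28):
    print(t, round(n0 * g ** t))
```

Each single year’s increase would pass a Gaussian-minded manager’s sniff test; it is the compounding over 28 years that produces the facepalm. That is what “non-linear rate…or damn fast” means in practice.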

Axioms We Will Explore
We need to think about things a bit differently than we are used to.  Let’s start with some axioms…rules of thumb.  You may have favorites in this list, so let me know and we can begin there.

“First Do No Harm” – “Primum Non Nocere.” Taken from medical ethics, it means it may be better not to do something, or even to do nothing, than to risk causing more harm than good. This advice is antithetical to type “A” personalities.

The technique of Failure Mode & Effect Analysis, borrowed from Risk Management, may be helpful here. How might things go wrong? Spend some time thinking about this. How do you do that? Get people who do NOT think like you on the team. Diversity is about more than political correctness. President Kennedy used these techniques:

  • Focus on the problem as a whole
  • Surface underlying assumptions about the problem/opportunity
  • If everybody agrees, then something is wrong
  • Seek to synthesize input from a variety of sources
  • Slow down (yes, I know, this is a rough one)

I think one of the most grievous elements of decision making is how we value speed in arriving at a solution. You don’t get extra credit for “solving” the problem quickly. Decide in haste, repent at leisure. Which brings us to the next point: all solutions are hypothetical.

“Sustainability Policies Are Hypotheses”  Hypotheses suggest tentative relationships. We have limited information and limited understanding of the variables and how they interact.  These interactions are also dynamical so proposed solutions become all the more tentative.  Two wonderful questions to keep in mind:

1) How do I know the solution is working?

2) How do I turn it off if it fails or goes out of control?
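In software terms, those two questions map onto monitoring plus a kill switch. Here is a minimal sketch; every name, number, and the toy “intervention” are hypothetical, invented purely to illustrate the pattern:

```python
def run_intervention(step, metric, baseline, abort_factor=1.5, max_steps=100):
    """Run a tentative 'solution' while answering both questions each step:
    (1) Is it working?      -> compare metric() against the pre-intervention baseline
    (2) Can we turn it off? -> abort as soon as the metric degrades past a bound
    """
    history = []
    for _ in range(max_steps):
        step()                       # apply one increment of the intervention
        value = metric()
        history.append(value)
        if value > baseline * abort_factor:   # going out of control
            return "aborted", history
    return "completed", history

# Toy usage: an 'intervention' that quietly makes the problem worse.
state = {"pests": 100}

def step():                          # each step the fix backfires a little
    state["pests"] = int(state["pests"] * 1.1)

def metric():
    return state["pests"]

status, trace = run_intervention(step, metric, baseline=100)
print(status, trace)   # aborts once pests exceed 1.5x the baseline
```

The design point is that the abort condition is decided before the intervention starts, not negotiated afterward; the Marion Island cats had no such bound, so the system’s own adaptation supplied one, badly.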

Doubt, Due Diligence & Complementary Solutions – Falsification was a technique made popular by the philosopher Karl Popper. In practicing sustainability and resilience we are also practicing science. Our hypothetical policies need to be subjected to harsh criticism, with us being the harshest critics. Take the Devil’s Advocate position and try proving the argument false! I guarantee you will get new insights. Make it part of your due diligence.

Never be satisfied with only one answer.  Niels Bohr, one of the fathers of quantum physics, believed that a single explanation cannot exhaust the richness of experience; rather, other complementary or even paradoxical explanations must be present.

Notes to NGOs and Granting Foundations: You may grant between several thousand and several hundred thousand dollars for resilience or sustainability projects. While positive results are always welcome, negative results could, in the long run, prove much more valuable.  Foundations must be advocates not only of sustainable and resilient work, but of the highest quality of science.

“Question Authority” Sustainability and resilience issues are extra-normal. Nobody knows anything for sure.

Many of the problems, and their tentative solutions, are relatively new and novel. Basically, they haven’t been around long enough for “experts” to develop real depth. New variables pop out of nowhere and interact in ways we haven’t seen before.

Experts thrive and flourish in a “command & control” environment (aka the federal government). The Florida Everglades is a case in point. Efforts to “develop” the Everglades sought to harmonize three very different variables: agriculture, urbanization, and conservation. We drained it, then constructed a massive infrastructure to control flooding and mitigate damage from hurricanes. The results: we have significantly reduced the area of natural habitat, created dramatic declines in water quality, and made the region increasingly vulnerable to extreme weather. For a detailed discussion see “Resilience Thinking” by Brian Walker. Expert opinion, backed up with models, sand tables and simulations, played a part in creating a mess in Pogo’s backyard. That brings us to the next axiom: be suspicious of the tools in the experts’ toolbox.

“Be Suspicious of Models, Simulations and Quants”  Better to read the entrails of a freshly killed goat than the output of a quantitative model, especially a simulation. At least later you can eat the goat.

It’s not the data that will kill you, it’s the underlying assumptions. “You don’t like the results? Okay, we’ll tweak the assumptions.” Predictive analytics faces similar pitfalls. Sustainability projects can often be contentious and political. Opposing sides often bring in consultants (aka hired guns) who produce different findings while looking at the same issue.
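To make the point concrete, here is a toy illustration (my own invented numbers, not from any engagement described in this post): two hired guns can share the exact same cash-flow data and still deliver opposite findings, because one buried assumption, the discount rate, does all the work.

```python
# Toy example: identical data, different assumed discount rates.
def npv(cash_flows, rate):
    """Net present value of annual cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Ten years of identical $100 cash flows: the "data" both consultants share.
flows = [100.0] * 10

optimist = npv(flows, 0.05)    # consultant A assumes money costs 5%
pessimist = npv(flows, 0.15)   # consultant B assumes 15%

print(round(optimist, 2))   # 772.17
print(round(pessimist, 2))  # 501.88
```

Same ten rows of data, a 54% swing in “value.” The assumption, not the data, drives the finding.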

I guarantee you the “quants” will get it wrong and, anyway, nobody will be able to understand it to begin with. In all seriousness, quantitative models need a stable environment where historic statistics remain relevant; dynamically changing markets, technologies and human behavior provide no such thing. We can produce extremely precise forecasts for daily electricity usage. On the other hand, we haven’t seen much success in quantitative systems designed to “Beat The (Wall) Street” or break the bank in Las Vegas. Randomness and indeterminacy are, unfortunately, facts of life.

“Question Received Knowledge”
We teach invincibility.  We teach that everything is possible, that science conquers all, that nature is ours for domination.

We reward expertise in executing processes or templates. We teach “the model” and its application in an environment of low uncertainty and low complexity. Remember what Thomas Kuhn said about “received knowledge” in The Structure of Scientific Revolutions? No? Not a big surprise. Kuhn is rarely taught to undergraduate or masters students; and for that matter, most PhDs skate by with some dopey “research methods” class that neglects to teach epistemology. Read Kuhn if you want an “a-ha” moment.

The simple explanation, in case you’re not interested in reading Kuhn, is that professionals become path dependent…in the way they think, in their assumptions about how things work, and with the models that have consistently shown success. All well and good until the environment changes, and what worked so well in the past becomes less and less relevant to the evolving problems at hand. Today’s environment is quickly becoming more and more complex, and with it comes more and more uncertainty.

“Learn About Complexity & Uncertainty”

We are dealing with ecosystems, whether a rain forest or a city. These are complex adaptive systems. Brian Walker, an Australian environmental scientist, wrote an exceptional book called “Resilience Thinking.” It is very accessible and a quick read. I have also done work in sustainability and resilience. If I can be of help, give me a call or send an email. Happy to chat.

John Marke, 636-458-1917

Re-Thinking Systems, Particle Physics and Candy Mints

February 14, 2011

CERTS: Is it a breath mint or a candy mint? It is perhaps the second most vexing taxonomic dilemma of the 20th century, with the wave-particle duality of quantum physics edging out first place by a hair. CERTS is part of American advertising folklore. Since we’re not gunning for a fight we’ll just call it a “mint” for the time being. Anyway, “the mint” was famously promoted during the 1960s and 1970s: one spokesperson would insist “CERTS is a breath mint!” while another, just as adamantly, declared, “CERTS is a candy mint!” Before either a divorce or a fist fight ensued, an unseen announcer would resolve the issue by declaring: “It’s two, two, two mints in one!”

If CERTS can simultaneously be both a “breath mint and candy mint,” cannot a system be both open and closed at the same time (for the pedantic reader: exhibit characteristics of both open and closed systems)? If so, it has a profound impact on how we look at organizations and how we frame organizational and economic problems!

• An open system is inherently unstable, with multiple equilibrium points; it is complex and adaptive, full of surprises, and unpredictable; on the other hand…

• A closed system is stable, existing at or near one equilibrium point, entropic (with friction and the second law of thermodynamics kicking in), deterministic, based on immutable doctrine, and predictable.
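You can see both behaviors in a few lines of code. The logistic map is a standard toy model from complexity science (my choice of illustration, not something from the original post): the very same equation acts like a closed system at one parameter setting and like an open one at another.

```python
# Logistic map: x -> r * x * (1 - x). One knob, two worlds.
def trajectory(r, x0, steps=500):
    """Iterate the map from x0 and return the whole path."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# "Closed-system" regime: at r = 2.5 every start converges to the single
# equilibrium x = 1 - 1/r = 0.6. Stable, deterministic, predictable.
settled = trajectory(2.5, 0.1)[-1]

# "Open-system" regime: at r = 3.9 the map never settles, and two starting
# points differing by one part in a million end up nowhere near each other.
a = trajectory(3.9, 0.100000)
b = trajectory(3.9, 0.100001)
spread = max(abs(x - y) for x, y in zip(a[400:], b[400:]))

print(round(settled, 6))  # 0.6
print(spread > 0.1)       # True: full of surprises
```

Same rule, different parameter: one regime rewards prediction and optimization, the other punishes them. Which consultant you need depends on which regime you are in.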

Think of it this way: there are “breath mint consultants, theorists, and vendors” and “candy mint consultants, theorists and vendors” out there….all competing for your attention and your money. And all are competing in good faith according to their paradigm or how they see the world. Some want you to think about your business as if it were a closed system, they believe in predictive metrics, reducing slack in the system and optimization. Likewise there are those who say “It’s an open, adaptive system! Don’t try to predict anything, be resilient and agile!” They will also help you redesign your business or government, and tell you about contingency management and increasing slack in the system (a philosophy that is diametrically opposed to optimization). Who is right???

It’s Not A Question of “Either/Or”……. Reality Is “Both/And”

I tend toward the complex adaptive (open) systems point of view. But there is one embarrassing little problem with that: sometimes you can predict! And you can do so even for the most complex of adaptive systems!

Here is a good example from no less than the Journal of the American Medical Association (JAMA, April 26, 2006 – Vol. 295, No. 16), which published the study “Association Between Hospital Process Performance and Outcomes Among Patients With Acute Coronary Syndromes.” The results: “A significant association between care process and outcomes was found, supporting the use of broad, guideline-based performance metrics as a means of assessing and helping improve hospital quality.” This means that performance-based metrics can help save the lives of heart attack and stroke patients.

Does this mean I am turning in my membership card to The Complex Adaptive Systems (CAS) Club? No, but what I am saying is this: stop thinking in terms of “either/or” and start thinking in terms of “both/and.”

CAS theory works, but not all the time. Predictive modeling works, but not all the time… keep an open mind, if not an open system! Although space prohibits a full discussion of this theory, here’s a peek at some of the research I am doing relating performance, predictive metrics, and risk. Let’s assume for a moment that businesses and governments go through a life cycle, very simply depicted as follows: this is the “sigmoidal” or “S” curve, drawn with a fore loop and a back loop (it’s really a Möbius strip, but I’m not good at 3-D geometry).

The area between the “r” and “K” is stable and we can use “predictive” modeling here, i.e. regression, time series, maybe some operations research modeling. Predictive modeling doesn’t work when you go past the tipping point at “K” and the system experiences catastrophic failure or, as they say in physics, a “step change.” But let’s go a little further:

1. It is a qualitative model, i.e. there are no scales on either axis of the graph (I have hopes of changing that)

2. Systems operating in the area between “r” and “K”  behave more like closed systems than open systems; and they are amenable to predictive modeling and performance optimization with techniques like linear programming.

Then the environment becomes more volatile and life throws us a curve.

3. At the top of the “S” curve, at “Kappa or K” we experience sudden and discontinuous change, e.g. a heart attack, a financial meltdown, nuclear war, or bankruptcy.  Such change reasserts the dominance of the political, economic, social, technological or climatic over our system.

4. This process is going on simultaneously in different parts of the organization and those parts will necessarily be at different phases on the cycle! Think of a series of “nests” resting one on top of the other….multiple hierarchies (or as C.S. Holling calls it, a Panarchy).
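Points 2 and 3 above can be sketched in code (a toy of my own construction, with made-up numbers; the real model’s axes are unscaled): a straight line fitted in the stable region between “r” and “K” extrapolates confidently, and misses the step change at “K” entirely.

```python
import math

def system(t):
    """Toy S-curve: logistic growth toward K = 100, then a step change
    (catastrophic failure) at t = 60."""
    if t >= 60:
        return 5.0  # the back loop: sudden, discontinuous collapse
    return 100 / (1 + math.exp(-0.15 * (t - 30)))

# Fit a least-squares line, by hand, to the stable region (t = 20..40).
ts = list(range(20, 41))
ys = [system(t) for t in ts]
t_bar = sum(ts) / len(ts)
y_bar = sum(ys) / len(ys)
slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys))
         / sum((t - t_bar) ** 2 for t in ts))
intercept = y_bar - slope * t_bar

forecast = intercept + slope * 70   # the model's confident prediction at t = 70
actual = system(70)                 # what the system actually does past K

print(forecast > 100)   # True: the model sees nothing but growth
print(actual)           # 5.0: the system has already collapsed
```

Inside the stable region the fit is excellent, which is exactly what makes it dangerous: nothing in the regression warns you that the regime itself is about to change.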

RESILIENCE: The model describes a life cycle. All (open or closed) systems go through life cycles. The model shows something that we intuitively know: sooner or later our system is going to go out of whack. Thresholds, scales, feedback loops, and domains are going to change in unpredictable ways, and for that matter, the catalyst for this catastrophic change is probably going to be something that takes us by surprise. Think RMS Titanic, the ship that was supposed to be unsinkable, and the iceberg that wasn’t supposed to be where it was.

You do not have to know how everything is connected to everything else, though you do have to have at least an idea about thresholds, scales, feedback loops and other elements of your system.  As Walker and Salt point out in Resilience Practice (p.23, Island Press, 2012) “Resilience thinking involves requisite simplicity: figure out the minimum but sufficient information needed to manage your system for the values that you hold to be important.”

N.B. This is easier said than done. Consultants take note: this is not an analytical exercise. It is an exercise in negotiating a story, a narrative, with clients, and doing so in an environment that can be highly political and fraught with uncertainty.

And one final point: a system can be both open and closed simultaneously, just as an electron can be both wave and particle simultaneously or, for that matter, CERTS can be both a breath mint and a candy mint! So keep an open mind.

John Marke © 2011

The Curly Factor – How to Profit From Coarse Behavior

January 20, 2011

We talk about “risk intelligence,” and various gurus and philosophers have called it “the capacity to learn about risk from experience and a special kind of intelligence for thinking about risk and uncertainty.”

I’m not exactly sure what that means, except that you’re going to pay a lot to find out; and you’re also likely to get involved with estimating probabilities about threats, quantifying vulnerabilities, and maybe sophisticated probabilistic models.

But what if you are not very sophisticated?  Is there hope?  Actually it pays to be “simple” when it comes to dealing with high impact risk.  That’s what The Curly Factor is all about.

Curly was a very simple guy. He was easily the dumbest but best loved character of the Three Stooges. There isn’t a culture in the world where people don’t recognize Curly. Since he’s been dead for half a century and made a bunch of low-budget “shorts” of questionable quality in the 1930s, that’s a pretty remarkable achievement.

So what is The Curly Factor? In two words:

“coarse behavior.”

No, not the kind of coarse behavior your mother yelled at you about. Curly only noticed very basic signals from his environment. Curly didn’t have elaborate or sophisticated decision-making rules. He wouldn’t know probability from a pot roast. But when they passed the hors d’oeuvres at a fancy black-tie party, Curly didn’t just take two; he merrily took the whole tray! Subtlety was lost on him.

Curly only paid attention to the essential. We can just as easily substitute “essential” for “existential.” All other information is filtered out.

Is the key to emulate Curly and not the sophisticates with their complicated probabilities and models?  In certain instances that is exactly the case.

Although risk “experts” are loath to admit it, the class of risk that is most threatening…the risks for which there are no probability distributions…are the ones beyond our control, the ones we didn’t see coming. Traditional risk management isn’t equipped to deal with these killer risks, the ones that can bring your company or government to its knees overnight. There are emerging tools and techniques, but they have yet to become part of the profession’s DNA. It’s all about the science of complexity and resilience, but more on that later.

So is the answer to do dumb things? Well, not exactly dumb…more like “less sophisticated.”

Curly engages in what economists call “sub-optimal behavior.” Yes, that means he does dumb things, or at least things that seem dumb to the rest of us. The corporate world has programmed us to go for gains at the margin, get a quick success, become more efficient, and do the rational thing. You win a lot of short-term victories, but there is solid evidence that species that are prolific and successful in the short term die out after being hit with an unanticipated event. Sub-optimization can often equal survival.
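Here is a toy simulation of that trade-off (entirely my own construction, loosely inspired by Bookstaber’s coarse-behavior argument, with invented numbers): a lean “optimizer” out-earns Curly every normal year, and a single unanticipated shock settles the question of survival.

```python
def run(buffer_fraction, years=20, shock_year=12):
    """Deterministic toy firm. Deployed capital earns 10% a year; in the
    shock year everything deployed is lost (think grounded fleet). A firm
    whose capital falls below 0.2 of its starting 1.0 goes under."""
    capital = 1.0
    for year in range(years):
        reserve = capital * buffer_fraction          # idle slack, but safe
        deployed = capital * (1 - buffer_fraction)   # working, and at risk
        deployed = 0.0 if year == shock_year else deployed * 1.10
        capital = deployed + reserve
        if capital < 0.2:
            return 0.0                               # de-selected
    return capital

lean = run(buffer_fraction=0.05)    # the optimizer: minimal slack
curly = run(buffer_fraction=0.40)   # coarse rule: a big dumb buffer

print(lean)         # 0.0 -- wiped out by the shock
print(curly > 1.0)  # True -- sub-optimal, but alive and ahead
```

Run the same simulation with no shock and the lean firm finishes roughly twice as rich. Optimization is perfectly rational, right up until it isn’t.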

Some of us have developed great talents for dealing with known risks but don’t have the understanding or aptitude for dealing with the uncertain.  Worse yet, people are often programmed and rewarded for focusing on the subtle.  To understand this you have to change your frame of reference, think about things a little differently.

Try this thought experiment: You are the President of British Petroleum and you have a vision of the Deepwater Horizon disaster a year before it takes place. You see the explosion, the fire, and the months of oil spilling into the Gulf. But here’s the catch: you can’t tell anyone about it. And that means you cannot focus remediation on that drilling platform. What do you do? Where do you focus your corporate risk management efforts?

Yeah, you ignore the subtleties of traditional risk management. You ignore the extraneous and focus like a laser on only that information that is directly relevant to your survival. That is exactly what Curly would do.

Now you understand the advantages of coarse behavior. There is, of course, nothing wrong with traditional risk management techniques, and I am not suggesting abandoning them. They are necessary but not sufficient to survive in today’s global and highly complex environment. If you haven’t developed the insight and the science to deal with uncertainty, maybe you ought to think about doing so. Every now and again ask yourself “What would Curly do?” And always take the whole tray when they pass the hors d’oeuvres.

Note: This is a work in progress and I am happy to discuss my ongoing research. E-mail me and I will send you Rick Bookstaber’s original work on coarse behavior, called “Optimization of Coarse Behavior,” or you can go to my profile on LinkedIn and download it from the archive there. One warning, however: it is a “little” math intense.

John Marke © 2015

Performance, Risk, Complexity & All That Jazz

November 18, 2010

What do we mean by “performance?”  It’s not a philosophical question to ponder over a Starbucks.  The world is awash with complexity and uncertainty. It’s show time, folks.

Performance? Are your expectations realistic? Better yet, are your expectations relevant? How do you know?

We know from ecology that some entities – fish, fauna, football teams, multi-nationals – become “super adapted” or “super competent” in relation to their environment, i.e. they develop unique characteristics that give them a competitive edge. However, when the environment changes, whether you are a fish in a pond or a football team, sometimes you need to abandon what worked so well in the past and move on to develop other skills…or become “de-selected,” a euphemism for being terminated with extreme prejudice.

At the very core of this issue is “what is performance?” If we get this wrong, nothing else is going to work. And I am not as concerned about the measurement issue as about getting the “what is?” question right. Legend has it that the Emperor Nero played the harp while Rome burned…perhaps the ancient Roman equivalent of collecting irrelevant KPIs (key performance indicators) while the city went up in smoke. But I digress…

At one time I thought performance could be easily captured! That was back in the days when everybody thought “market cap” was the be-all and end-all of performance measurement. Then it became obvious that market factors often overshadowed company-specific factors. Research (courtesy of McKinsey & Co.) showed that from 1990 to 2000 about 70% of the returns to individual companies were due to market factors and only about 30% were due to company-specific factors. There are so many elements beyond management’s control, i.e. trader psychology, herd behavior, and most certainly the velocity of trades in a global economy. Let me be even more blunt: “Why am I busting my tail controlling costs and investing in product improvements when they only account for 30% of my value?” Good question.

Next question: when did markets get so damned smart? I have a good friend, an actuary, whose philosophy is “you can never fool the market.” Of course he made that rather sweeping statement prior to the 2007 financial meltdown on The Street and the remarkable Mr. Bernie “Trust Me” Madoff. So you can still do everything “right” and end up in Bankruptcy Court.

This raises a question: “what is the purpose or goal of the corporation?”

I knew the answer from my work in national defense – survival.

Plain and simple: at the end of the day your Army must be  functional, operational and ready to respond. If you invest in horse cavalry and the era of the tank is dawning – you lose. If your doctrine is massed artillery and the era of maneuver warfare arrives – you lose. There is an evolutionary logic at work here and it is brilliantly captured in the concept of generational warfare. Is there anything like that in business?

Yes.  If you do not survive you have absolutely no hope of increasing shareholder value (assuming you buy into “agency theory” in the first place).

Eric Beinhocker, a McKinsey Fellow,  uses an evolutionary framework for thinking about performance, suggesting that evolution is a search algorithm for fit designs. For Eric, the objective of any business is survival. The fit are selected to go on, to continue to live and prosper. “The objective function of any schema must be the survival and replication…”  Makes sense to me!

Fitness is an appealing, intuitive, fairly low tech, commonsensical way of thinking about performance.  The market senses designs that are poor, and lets them wither away (sometimes quite quickly). Obviously business shares a few thorny problems with evolutionary biology in determining the exact mechanisms for selection, but we can leave that for another day. For now leave the “how” of selection as a black box.

Here’s the deal: If you’re fit, you get to stick around.

And Beinhocker adds a little spice to the story. He references work done in the late 1990’s and early 2000’s that competitive advantage is both rare and short lived in both the biological and business worlds. Speaking from empirical data, he demonstrates there is no sustainable competitive advantage (apologies to Michael Porter ) only a never-ending race to create new sources of temporary advantage.

“This then changes our definition of an excellent company from one that has continuous high performance for very long periods (an achievement that is almost non-existent) to one that can string together a series of temporary advantages over time – in other words running the ‘red queen’ race from Alice in Wonderland, i.e. running with all one’s might to just stay in place.”

Yes, but that was then (the 1990s); this is now. Beinhocker may be too conservative. Perhaps survival is today’s core strategic issue. Forget about high-end performance or even the “red queen race,” and hope you don’t end up like Lehman Brothers.

The Problem With Beinhocker…Times Have Changed, Gotten More Complex

Beinhocker is a great theorist, but his data is old and, I think, less relevant in today’s highly connected and interdependent world. Complexity has exacerbated risk. Beinhocker was an optimist!

Empirical studies always have data lag. The world has changed since the 1980s and 1990s. Stationarity – the assumption that the distribution of outcomes does not change – can lead to tragic outcomes. Not everything fits under the bell curve, and failing to account for the randomness generated by the interplay of complex systems creates gigantic “black swans” that can devour whole industries.
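A quick back-of-the-envelope on why the bell-curve assumption matters (standard distribution math; the Student-t alternative is my choice for illustration, not fitted to any market): ask how often a 5-sigma day “should” occur under each assumption.

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def t3_tail(t):
    """P(T > t) for a Student-t with 3 degrees of freedom, which has a
    closed-form CDF: F(t) = 1/2 + (theta + sin(theta)cos(theta)) / pi,
    where theta = arctan(t / sqrt(3))."""
    theta = math.atan(t / math.sqrt(3))
    return 0.5 - (theta + math.sin(theta) * math.cos(theta)) / math.pi

p_thin = normal_tail(5.0)   # bell curve: ~2.9e-07
p_fat = t3_tail(5.0)        # modestly fat tail: ~7.7e-03

# At roughly 250 trading days a year, expected years between 5-sigma days:
years_thin = 1 / (p_thin * 250)   # ~14,000 years ("can't happen")
years_fat = 1 / (p_fat * 250)     # about twice a year

print(years_thin > 10_000)  # True
print(years_fat < 1)        # True
```

Under the bell curve a 5-sigma day is a once-in-140-centuries event; with even modestly fat tails it is a semi-annual occurrence. Models calibrated to the first assumption get devoured by the second.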

Is today about survival, endurance and longevity? Maybe so!

More about performance, risk and complexity in ongoing blog discussions.

Game Changing Risk & Globalization

July 8, 2010

November 1, 1913, New York, NY – College Football

In 1913 the Ivy League owned college football, and Army dominated the Ivy League. That November, a small, financially strapped Catholic men’s school went east to take a crack at the big guys. With over 5,000 spectators and sports writers from major New York newspapers in the stands, there was plenty of coverage for what promised to be a somewhat boring game between Army and a virtual unknown (at least outside of the Midwest).

Army was undefeated and had a shot at the National championship.  That was until Gus Dorais threw something called a “forward pass” to a receiver named Rockne….Knute Rockne. During the game, Dorais completed 14 of 17 passes for 243 yards.

Defense? Army had no defense! According to Rockne, “Everybody seemed astonished. There had been no hurdling, no tackling, no plunging, and no crushing….just a long distance touchdown by rapid transit.” There were no playbooks for defense against the forward pass…this had been a game of pure muscle and beef on the line. The operative words being “had been.” Nobody had done this before. Rockne was catching on a dead run at 30, 40 and 50 yards. As the “Fighting Irish” marched off the field the score was 35–13, and the rest, as they say, is history.

There are lessons to be learned here.  First, pay attention when the rules of the game change.  Yes, the forward pass was legal, but heavily restricted prior to the 1913 rules change. The old regulations limited the pass to 20 yards forward of the line of scrimmage, and to a stationary receiver, i.e. he could not be in motion.  Second, pay attention when the technology changes.  Another 1913 rule change altered the shape of the football from oval to spherical, allowing it to be thrown harder, further and more accurately. And third, adapt your playbook or suffer the consequences.

Global Supply Chain – 20XX

For the past 20 years our focus in supply chain has been locked into a “running game” of taming “demand uncertainty.” The state of the art is “just in time” manufacturing, where little inventory is carried before being put into immediate use. In addition, outsourcing has further reduced cost and transferred many non-core, and at times even core, business processes to globally dispersed, third-party facilities to take advantage of low labor costs and cheap global logistics.

There is no free lunch. You want global reach? Well, you also get global risk. Today more things can go wrong, in more places, and with more effect than ever before. Upstream trading partners are much less “visible” while simultaneously assuming more and more control over key manufacturing and process operations. Supply-side risk is the “forward pass” of globalization. Is it a threat? Ask Mattel. Ask P&G. Ask Colgate. Ask Wal-Mart.

Asymmetrical football

Had Bob Costas been around in 1913 for Notre Dame/Army he might have called it “asymmetrical football.” That’s when the weaker opponent uses a “game changing strategy” to negate the strength of its opponent. Simple: if you can’t win against the muscle of the defensive line, put the ball in the air. That was a smart strategy for Notre Dame, but there is an even more important lesson.

A second definition of “asymmetric” involves the multiplier effect. That’s when one unit of input into a system yields more than one unit of output. Continuing with our football example, prior to the ND/Army game, football was exclusively a running game. One new unit of input into the game – the pass – yielded numerous outputs, i.e. different combinations of moving the ball forward of the line of scrimmage. You could still run the ball; you could throw a short pass. You could throw a “Hail Mary.” You could run a screen; you could do a lateral and then pass, etc. For the more mathematically inclined, outcomes were no longer linearly bounded. For coaches, it was a supreme headache! Passing exponentially increased the complexity of the game and made any sort of accurate play prediction virtually impossible.

And another thing: Dorais wasn’t just throwing footballs; he was throwing bricks! And the New York media made sure the other coaches got the message…like a brick in the head, the forward pass couldn’t be ignored. Unfortunately, supply chain managers and their trusted advisors might have harder heads than college football coaches. Remember, people have an amazing ability to deny, displace, or otherwise ignore a threat, either failing to “notice” or failing to correctly interpret and/or act on game-changing events.

Email for complete paper.

