The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Author Archives: Bill Rider

Who knows the recipe for the secret sauce?

10 Wednesday Dec 2014

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Helping others is the secret sauce to a happy life.

― Todd Stocker

I used to work at McDonald's a long time ago. Most people know that a Big Mac is dressed with a secret sauce. It looks like Thousand Island dressing, but rest assured, it is a secret sauce of some sort. In a sense, the secret sauce is the trademark of the sandwich, its very identity, and it is a secret known only to a hallowed priesthood. Little did I know that in my chosen professional life I would be exposed to a completely different “secret sauce”.

A successful modeling and simulation code is very much the same; it has a trademark “secret sauce”. Usually the character of a code is determined by how it is made robust enough to run interesting applied problems. Someone special figured out how to take the combination of physical models, numerical methods, mesh, computer code, round-off error, input, output… and make it all work. This is rarely documented well, if at all. Quite often it is more than a little embarrassing: the naïve implementation of the same method usually doesn’t quite work. This is a dark art, the work of wizards, and the difference between success and failure.

The rub is that we are losing the recipes. In many places the people who developed the secret sauce are retiring and dying, and they aren’t being replaced. We are losing the community knowledge of the practices that lead to success. We may be in for a rude awakening because these aspects of modeling and simulation are underappreciated, undocumented and generally ignored. Sometimes the secret that makes the code work is somewhere between mildly and deeply embarrassing.

This is a very dirty little secret. Rebuilding this knowledge once lost is going to be very expensive. In most cases we don’t even recognize what we’re losing because the modern view of modeling and simulation doesn’t value it. It is essential and the sort of thing that isn’t being funded today.

Do Science and Politics Mix?

09 Tuesday Dec 2014

Posted by Bill Rider in Uncategorized

≈ 2 Comments

Conservatives are not necessarily stupid, but most stupid people are conservatives…

I never meant to say that the Conservatives are generally stupid. I meant to say that stupid people are generally Conservative. I believe that is so obviously and universally admitted a principle that I hardly think any gentleman will deny it.

― John Stuart Mill

Yesterday an opinion column appeared in Nature arguing that the scientific community should steer clear of partisan politics (http://www.nature.com/news/science-should-keep-out-of-partisan-politics-1.16473, by Daniel Sarewitz). Dr. Sarewitz’s arguments and sentiments are high-minded, but they only have traction in an ideal world where support and respect for science are bipartisan. We do not live there. We live in a world where science has been politicized and attacked. These attacks have come predominantly from one side, the conservatives and their dominant organization, the GOP. Until support for and attacks on science are more balanced, it is inevitable that the scientific community will align with one side over the other.

As mankind becomes more liberal, they will be more apt to allow that all those who conduct themselves as worthy members of the community are equally entitled to the protections of civil government. I hope ever to see America among the foremost nations of justice and liberality.

― President George Washington

The assault on science and reason by conservatives is seemingly endless. Leading the charge is the denial of climate change and the threat it poses to humanity. The reasoning for the denial is two-fold: the risk action on climate would impart to the ruling corporate class, and conservatives’ love of their wasteful, destructive lifestyles (which largely funnel profit to the corporate overlords). Further attacks come with their embrace of fundamentalist religious factions’ desire to indoctrinate our children with their myths in place of science (i.e., creationism). In other venues the conservatives attack science that goes against corporate greed, be it environmental science, medicine and especially the social sciences. Conservatives deny the underlying biological basis of homosexuality because of its implications for their religious beliefs. Time and time again it is their commitment to traditional religious belief over science and reason that drives a wedge.

The attacks on science and reason are by no means completely one-sided. Both liberals and conservatives fail to repudiate the various science deniers and neo-Luddite factions in their midst. For instance, liberal anti-science can be seen in the anti-vaccine, anti-GMO and anti-nuclear movements. Each of these is based on fear of technology and is fundamentally irrational. For instance, the coupling of liberal environmental leanings and anti-nuclear mindsets undermines support for action on climate change (https://williamjrider.wordpress.com/2014/06/06/why-climate-science-fails-to-convince/).

Of all the varieties of virtues, liberalism is the most beloved.

― Aristotle

Science is liberal and progressive by its very nature. The fundamental character of research puts it at odds with conservatism almost a priori. Nothing is wrong with religion per se, but when it stands in the way of knowledge and progress there is a problem. The conservatives use religion to control the masses and subjugate people, as they have for millennia. This takes the form of fundamentalism, which is the enemy of science, progress and humanity at large. Since science and reason threaten the means by which conservatives control the masses, they are a threat to their power. This is the core of the GOP’s repeated attacks on science, and it is why scientists will line up on the other side against them.

When the experts’ scientific knowledge is legitimated in terms of being rational, logical, efficient, educated, progressive, modern, and enlightened, what analogies can other segments of society . . . utilize to challenge them?

― Martin Guevara Urbina

 

The Unfinished Business of Modeling & Simulation

08 Monday Dec 2014

Posted by Bill Rider in Uncategorized

≈ 1 Comment

Where all think alike there is little danger of innovation.

—Edward Abbey

What do you do when you’re in a leadership position for a project that you’re sure is moving in the wrong direction?

If you’re a regular reader you can guess that it’s high performance computing, and the direction has been wrong for a couple of decades (IMHO). Today and tomorrow we are just kicking the can down the road. The current direction is virtually identical to what we’ve been doing since the early-to-mid-1990s. There isn’t a lot of visionary thinking to be had. On the one hand, the imminent demise of Moore’s law promises some degree of disruption; we can’t continue to make progress the way we have been, and change will be thrust on us. On the other hand, the conditions for visionary thinking, risk taking and progress are absent.

Without deviation from the norm, progress is not possible.

― Frank Zappa

While the business we are doing is stable today, the continued traditional emphasis has an aggregated impact on a host of issues. Being stable also means the business is stagnant, which isn’t good for the science. Progress in modeling and simulation has come largely in two regards: the computational power available has increased, and the way of conducting studies has matured. The stagnation is most evident in the codes; the methods and models in them are simply not moving forward meaningfully. In terms of methods and models, the codes are largely the same as those we used twenty years ago. Furthermore, most of the innovative and creative energy has gone into implementing the codes on modern computers. The result is a phalanx of missed opportunities whose implicit costs are massive. I’d like to sketch out some of these opportunities and the costs associated with missing them.

Societies in decline have no use for visionaries.

― Anaïs Nin

Historically, we have been rewarded greatly by improvements in algorithms, methods and models that exceeded the benefits of faster computers. Despite this track record, support for continued development along these lines has languished and dropped in intensity. One might surmise that we have already picked off the low-hanging fruit and made the easy breakthroughs. I’m far more optimistic: massive improvements and innovations are not just possible, but awaiting relatively easy synthesis into our current work. To achieve these gains we will have to discard some of the limitations we impose on the current execution of projects.

But knowing that things could be worse should not stop us from trying to make them better.

― Sheryl Sandberg

So what is holding us back?

A big part of the problem is the issue of “sunk cost”. The codes are now huge and quite complex. They represent massive investments of resources over years, if not decades. Program management is not interested in starting over, but rather in evolving capability forward. This is rather limited in scope, and largely takes the form of moving the codes whole cloth onto new computing platforms. For people with short time horizons (and/or attention spans) this is a safe path to success. The long-term costs are lost to the risk calculus currently employed. No one realizes that the code is merely a vehicle for intellectual products that can exploit automatic computation. Its value is based solely on the quality of the thinking behind it and the quality of its implementation. Virtually all the effort today goes into implementation rather than the thinking itself. Until we overcome this sunk-cost mentality, codes will remain intellectually static with respect to their defining applied character.

Restlessness is discontent — and discontent is the first necessity of progress. Show me a thoroughly satisfied man — and I will show you a failure.

― Thomas A. Edison

What are some of the things we are missing? Clearly one of the greatest sacrifices of the “sunk cost” code is static discretizations and models. The numerical methods that implement the physical models in codes are generally completely intertwined with the code’s basic structure. Over time, these aspects of the code become a virtual skeleton for everything else the code does. Skeletal replacement surgery usually kills the patient, and that cannot be allowed; therefore we get stuck. New discretizations could provide far more accurate solutions, and new models could provide greater fidelity to reality, but this has been taken off the table to maintain continuity of effort. Part of the work we need to conduct is a better understanding of how practical discretization accuracy is achieved. For most applications we don’t have smooth solutions, and the nominal notions of numerical accuracy do not hold. How do discretization choices impact this? And how can these choices be optimized given finite resources? Furthermore, changes in these areas are risky and never sure to succeed, while risk reduction and fear of failure are the preeminent maxims of project management today.
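The gap between nominal and practical accuracy is easy to demonstrate directly. Here is a minimal sketch (plain Python; the central-difference scheme and the test functions are illustrative choices of mine, not anything from a production code) that measures the observed convergence order by grid refinement:

```python
import math

def observed_order(errors, ratio=2.0):
    """Estimate the convergence order p from L1 errors on successively
    refined grids, assuming error ~ C * h**p with a fixed refinement ratio."""
    return [math.log(coarse / fine) / math.log(ratio)
            for coarse, fine in zip(errors, errors[1:])]

def l1_error(n, f, fprime):
    """L1 error of the second-order central difference on n+1 points in [0, 1]."""
    h = 1.0 / n
    total = 0.0
    for i in range(n + 1):
        x = i * h
        approx = (f(x + h) - f(x - h)) / (2.0 * h)  # formally O(h**2)
        total += abs(approx - fprime(x)) * h
    return total

# Smooth solution: the nominal (formal) order is actually observed.
smooth = [l1_error(n, math.sin, math.cos) for n in (32, 64, 128, 256)]

# Solution with a kink at x = 0.5: the observed order degrades to one --
# exactly the gap between nominal and practical accuracy discussed above.
kink = [l1_error(n, lambda x: abs(x - 0.5),
                 lambda x: 1.0 if x >= 0.5 else -1.0)
        for n in (32, 64, 128, 256)]

print([round(p, 2) for p in observed_order(smooth)])  # near 2.0
print([round(p, 2) for p in observed_order(kink)])    # near 1.0
```

The same refinement study applied to a real code, where solutions are rarely smooth, is what separates claimed accuracy from delivered accuracy.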

Anyone who says failure is not an option has also ruled out innovation.

—Seth Godin

Moving on to other more technical aspects of computing and potential benefits, I’ll touch on two other missing elements. One of these is stability theory. As I noted a couple of posts ago, robustness is a key to a code’s success. At a very deep level, robustness is a crude form of stability, and the crudeness is a symptom of failings in current stability theory. This implies that we could do far better with a more extensive and useful stability theory. Part of this is defining a form of stability that mathematically captures the requirements for producing robust, physical results. Today we simply don’t have this; stability theory is only a starting point, and we have to kludge our way to robustness.
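Von Neumann analysis is the classical linear tool behind the crude stability described above: it certifies only that Fourier modes of a linear model problem don’t grow, which falls well short of guaranteeing robust, physical results. A toy sketch for first-order upwind advection (my choice of scheme, purely for illustration):

```python
import cmath

def upwind_amplification(cfl, theta):
    """Von Neumann amplification factor g(theta) for first-order upwind
    applied to u_t + a u_x = 0 with cfl = a*dt/dx > 0.
    Update rule: u_j^{n+1} = u_j^n - cfl*(u_j^n - u_{j-1}^n)."""
    return 1.0 - cfl * (1.0 - cmath.exp(-1j * theta))

def max_amplification(cfl, samples=721):
    """Largest |g| over sampled wavenumbers; <= 1 means linearly stable."""
    return max(abs(upwind_amplification(cfl, 2.0 * cmath.pi * k / samples))
               for k in range(samples))

print(max_amplification(0.8))  # 1.0: stable under the CFL condition
print(max_amplification(1.2))  # > 1: unstable once the CFL number exceeds one
```

Note what this analysis cannot say: nothing about nonlinearity, boundaries, or whether the stable answer is physically meaningful — which is the unfinished business.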

Innovative solutions to new challenges seldom come from familiar places.

—Gyan Nagpal

A second area whose lack of progress we have suffered from is numerical linear algebra. We are thirty years on from the last big breakthrough, multigrid. Multigrid is viewed as the ultimate algorithm given its ideal scaling with respect to the number of unknowns (its cost is linear, while all other methods are superlinear). Since then we have moved to using multigrid as a preconditioner for Krylov methods, improving both, and implemented the method on modern computers (which is really hard). Thirty years is a long time, especially considering that earlier advances in this field came at a faster-than-decadal pace. A good question to ask is whether a sublinear method can be defined. Is multigrid really the ultimate algorithm? I suspect a sublinear method can be discovered, and work on “big data” is pointing the direction. Beyond this, we typically solve linear algebra problems far more accurately (to very small residuals) than is probably necessary. It is done almost reflexively, with a better-safe-than-sorry attitude. This is a huge waste of effort, and someone should come up with a sensible way to set solver tolerances and optimize computational resources.
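The point about solver tolerances can be made concrete with a toy experiment: the same Krylov solve costs fewer iterations at a loose tolerance than at a reflexively tight one. A minimal sketch (plain conjugate gradient on a 1D Poisson system, no multigrid preconditioner; all names and sizes are mine):

```python
import math

def cg(matvec, b, tol, max_iter=1000):
    """Plain conjugate gradient; stops when the relative residual drops
    below tol, returning the solution and the iteration count."""
    n = len(b)
    x = [0.0] * n
    r = list(b)            # residual for the zero initial guess
    p = list(r)
    rs = sum(ri * ri for ri in r)
    b_norm = math.sqrt(sum(bi * bi for bi in b)) or 1.0
    for it in range(1, max_iter + 1):
        ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = sum(ri * ri for ri in r)
        if math.sqrt(rs_new) / b_norm < tol:
            return x, it
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x, max_iter

def poisson_matvec(v):
    """1D Poisson (tridiagonal [-1, 2, -1]) matrix-vector product."""
    n = len(v)
    return [2.0 * v[i] - (v[i - 1] if i > 0 else 0.0)
                       - (v[i + 1] if i < n - 1 else 0.0)
            for i in range(n)]

b = [1.0] * 64
_, its_loose = cg(poisson_matvec, b, tol=1e-2)
_, its_tight = cg(poisson_matvec, b, tol=1e-10)
print(its_loose, its_tight)  # the loose tolerance needs no more iterations
```

In a code where the linear solve sits inside a nonlinear or time-stepping loop, the discretization error often dwarfs a 1e-2 residual, which is exactly why reflexively tight tolerances waste effort.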

The willingness to be a champion for stupid ideas is the key to greater creativity, innovation, fulfillment, inspiration, motivation, and success.

—Richie Norton

A big area for progress is uncertainty quantification. The methods today are clearly focused on modeling and parametric uncertainties using sampling methods. While sampling is general, it is inefficient. These are epistemic uncertainties, reflecting our lack of knowledge. Natural variability, or aleatory uncertainty, is largely unexplored computationally. This reflects pointedly on the modeling approach we use. Key to this is the generally homogeneous nature of material models, even though the materials are quite heterogeneous at the scale of the discretization. This is a clear place where the maintenance of codes over long periods of time works against progress. Most of the potentially more efficient uncertainty methods are deeply intrusive and don’t fit existing code bases. Further complicating matters, the development of these methods has not focused on models sufficient for applications; it has focused on “toy” problems. To progress we need to take significant risks and tackle real problems using innovative methods. Our system today is not set up to allow this.
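The sampling approach described above is worth sketching, because its generality and its inefficiency come from the same place: it treats the simulation as a black box, and its error shrinks only like 1/sqrt(N). The “model” here is a stand-in of my own invention for an expensive forward code:

```python
import math
import random

def model(k):
    """Toy 'simulation': a steady response 1/k to an uncertain
    conductivity-like parameter k. Stands in for an expensive code run."""
    return 1.0 / k

def mc_mean(n_samples, seed=0):
    """Sampling-based (Monte Carlo) propagation of a parametric
    uncertainty: draw the input, run the model, average the output."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        k = rng.uniform(1.0, 2.0)   # uncertain input parameter
        total += model(k)
    return total / n_samples

exact = math.log(2.0)  # E[1/k] for k ~ Uniform(1, 2), for comparison
for n in (10, 100, 10000):
    print(n, abs(mc_mean(n) - exact))  # error shrinks roughly like 1/sqrt(n)
```

Nothing here needed to look inside `model` — hence the generality — but halving the error costs four times the samples, and each sample is a full code run. The more efficient intrusive methods avoid that cost precisely by reaching inside the model, which is why they don’t fit legacy code bases.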

Dreamers are mocked as impractical. The truth is they are the most practical, as their innovations lead to progress and a better way of life for all of us.

—Robin S. Sharma

Expect to see a lot of money going into computing to support “extreme” or “exascale” initiatives. It is too bad that this effort is largely misplaced and inefficient. The chosen approach is grossly imbalanced and not informed by historical perspective. The work we are not doing is risky, but capable of massive benefit. Current management models seem immune to measuring opportunity cost while amplifying the tendency to avoid risk and failure at all costs.

Never confuse movement with action.

― Ernest Hemingway

 

Fear’s Unparalleled Cost

07 Sunday Dec 2014

Posted by Bill Rider in Uncategorized

≈ Leave a comment

The only thing we have to fear is fear itself.

― Franklin D. Roosevelt

The more I consider the state of affairs in our country, the more I realize that fear is our single greatest weakness. It has become our defining characteristic. Fear is driving everything we do as a nation, and it is choking us. FDR spoke those words to a generation whose early lives spat at fear, but whose actions later in life paved the way to its control. More than the loss of innovation that I wrote about last, we have lost our courage and become a nation of frightened cowards. We fight wars against weak nations for terrible reasons. We allow our vastly armed police force to terrorize our citizens. We imprison huge numbers of Americans without any thought to what it implies. We torture using methods we have executed people for. It’s all because we are afraid. We are a shadow of the nation that faced fascism because we have lost our nerve.

We are not afraid to entrust the American people with unpleasant facts, foreign ideas, alien philosophies, and competitive values. For a nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.

― John F. Kennedy

Both FDR and JFK spoke to an America that has shrunk from view, replaced by backwards-looking zealots wishing to retain a country they no longer deserve. Once upon a time our clear problems and threats spurred Americans to action and the acceptance of sacrifice. We were willing to confront problems and accept challenges. Now we simply fear losing the advantages we have been granted since the end of World War 2. We aren’t even willing to admit that they have already been lost. The world isn’t static, and others have been willing to accept the sacrifices Americans have shied away from. Our body politic is two parts fear and risk aversion and one part pure denial. The denial is American Exceptionalism, which takes everything we do and casts it as divine and immune from all consideration and perspective.

To share your weakness is to make yourself vulnerable; to make yourself vulnerable is to show your strength.

― Criss Jami

We have massive problems in our nation that need immediate attention. Without change, the problems will move from festering to metastasizing and exploding. Whether it is the curse of massive economic inequality, with its risks to the bulk of the population and its toxic impact on politics, or our continuing racial inequities, both are devoid of any progress. We are allowing inequality to continue undermining any reality behind the increasingly mythical “American Dream” while allowing the elite to buy elections “legally”. We have an abysmal level of social mobility; if you’re poor you’ll stay poor, and if you’re rich you’ll stay rich. Race is a continuing stain that will explode as minority and majority switch places. We run the risk of minority rule, which is a recipe for revolution, as is the scourge of inequality.

America is great because she is good. If America ceases to be good, America will cease to be great.

― Alexis de Tocqueville

The war on terror is the epitome of our collective fear. While 9/11 was tragic, it should never have resulted in the sort of resources devoted to its response. We have lost much of our soul to it. Terror has bred terror, and America has committed torture, murder and other crimes in its wake. We have sacrificed freedom and privacy in the name of fear. Terror kills very few Americans, even factoring in 9/11 or the lives of soldiers fighting overseas. Americans do a much better job of killing other Americans than terrorists do, be it citizen-to-citizen gunfire or our completely and utterly out-of-control police force.

On top of this we have a completely out-of-control prison system. It has become a modern-day Jim Crow with its racial imbalances, and our complete lack of perspective on it reflects terribly on all of us. We destroy more lives of fellow citizens with the moronic war on drugs than the war on terror could ever have. The criminalization of drugs is mostly about subjugating minorities and little about public safety (alcohol is a very dangerous drug, but it is the drug of choice for the white power structure). The drug war isn’t about safety; it’s a replacement for Jim Crow.

I explain that Americans at the level of popular culture, at the level of grassroots politics, were thinking very hard about what it would mean to have a country they didn’t believe was God’s chosen nation. What would it mean to not be the world’s policeman? What would it mean to conserve our resources? What would it mean to not treat our presidents as if they were kings? That was happening! And the tragedy of Ronald Reagan most profoundly wasn’t policy — although that was tragic enough — but it was robbing America of that conversation. Every time a politician stands before a microphone and utters this useless, pathetic cliché that America is the greatest country ever to exist, he’s basically wiping away the possibility that we can really think critically about our problems and our prospects. And to me, that’s tragic.

—Thomas Frank

Is Risk Aversion Killing Innovation?

05 Friday Dec 2014

Posted by Bill Rider in Uncategorized

≈ 2 Comments

Yesterday while working out I read a stunningly good article from Aeon (http://aeon.co/magazine/science/why-has-human-progress-ground-to-a-halt/) by Michael Hanlon. I thought it was well done and extremely thought provoking. It’s worth the time for you to read it yourself. In a nutshell, Hanlon focuses upon the unwillingness to take risks as the thing that is sapping our ability to progress both scientifically and socially.

In times of war or crisis, power is easily stolen from the many by the few on a promise of security. The more elusive or imaginary the foe, the better for manufacturing consent.

― Ronald Wright

Of course part of my attraction to the article was its similarity in thought to a number of my own posts:

https://williamjrider.wordpress.com/2014/11/17/progress-and-the-social-contract/

https://williamjrider.wordpress.com/2014/03/03/we-only-fund-low-risk-research-today/

https://williamjrider.wordpress.com/2013/11/27/trust/

https://williamjrider.wordpress.com/2014/11/12/its-all-about-trust/

https://williamjrider.wordpress.com/2013/12/06/postscript-on-trust-or-trust-and-inequality/

https://williamjrider.wordpress.com/2014/09/26/what-is-the-source-of-the-usas-tilt-to-the-right/

He did have a significant amount of additional thinking beyond anything I’ve written. I think his thesis that we have become risk averse as a society is worth a great deal of consideration. Moreover, he didn’t really pull out the big smoking gun with respect to risk-averse behavior: the “War on Terror”. Almost nothing demonstrates our commitment to not taking risks as well as this, and it has become a tool for amplifying fear, wielded by those taking distinct and specific advantage of it to feather their own beds.

A crisis is a terrible thing to waste.

― Paul Romer

While I agree with the general nature of Hanlon’s thesis, I think something deeper might be at work.

The 20th century was punctuated by three catastrophic crises: World War 1, the Great Depression and World War 2. The great leap forward for humanity took place in the wake of the Second World War’s carnage, powered by American hegemony, a rebuilding industrialized world and the Cold War. The great stagnation that started in the early 1970s has seen each of those elements come to a halt, along with an immense rise of two paired elements: massive inequality and conservatism. The rebuilding of the class of oligarchs is destroying the vast middle class that marked that period of great progress. The conservative movements are a direct response to the vast social (and technological) progress. Conservatism is a reaction driven by the outright fear of change that Hanlon identifies.

At first sign of crisis, the ignorant don’t panic because they don’t know what’s going on, and then later they panic precisely because they don’t know what’s going on.

― Jarod Kintz

Ironically, the oligarchs have relied upon the rejection of progressive ideals in business to power their accumulation of wealth. They have also relied upon the massive societal disruptions associated with technology to provide much of the means of creating wealth outside the established channels of the social order. The conservatives have come as a reaction to the sort of changes produced in the “Golden Quarter”, as Hanlon describes it. Fear of racial equality is driven by the loss of the white majority, and religious fundamentalism reacts to the sorts of freedoms earned during that period. All of this is amplified by the discomfort of new technology, while that technology creates change in society that wreaks havoc with the traditional social order.

Crisis is Good. Crisis is a Messenger.

― Bryant McGill

What will fix this and return us to the sort of progress that humanity should aspire toward? I fear it will be a new set of calamities that will surely be unleashed on society some time in the (near) future. We are approaching a serious instability, and I believe the events of 9/11 and the financial crisis of 2008 were merely pre-shocks to what is coming. As before during the 20th century, these calamities are the result of systemic imbalances and the violent end of eras of excess.

Treat this crisis as practice for the next crisis.

― John Parenti

Old Europe started to die in World War 1, and its wake helped set in motion the forces that created the depths of the Depression and the cataclysm of World War 2, which marked the end of Old Europe and the birth of that Golden Quarter. One must also remember that the excesses of the hyper-rich and inequality played a key role in how WWI and the Depression unfolded. These excesses unleashed a torrent of progressive action to fix the damage to society. It seems the same thing could unfold in the future to end the current era of stagnation and greed. Let’s hope not; one might hope we have the wisdom to turn away before things get so bad.

The bind we are in today is largely about trust and faith in each other. We don’t trust because we know how selfish, self-centered and fundamentally corrupt we are, and we assume everyone else is just as untrustworthy. Without trust, the ability to do anything important or great simply doesn’t exist. No one is worth investing in for the good of the whole; every action has become centered on the good of the self. Crisis and calamity are built by such selfishness. Unfortunately, America is the most selfish place in the world, bar none. You do the math: who is most likely to trigger the next calamity?

Then the shit hit the fan.

― John Kenneth Galbraith

The Scheduled Breakthrough

04 Thursday Dec 2014

Posted by Bill Rider in Uncategorized

≈ 1 Comment

In preparing for battle I have always found that plans are useless, but planning is indispensable.

― Dwight D. Eisenhower

It is quarterly review time, and it is a reminder of how terribly we run the Labs these days. We run our projects really well and simultaneously run the Labs into the ground in the process. Our mode of project/program management and accountability is crushing our ability to do meaningful work. We make plans for our research, which include milestones, Gantt charts, and the like. While I don’t have anything against planning per se, I have a lot against being held to the plans. The quarterly reports are exemplars of being held to a plan that should only be an initial trajectory, not the final destination.

I will grant you that this approach to project management has its place. A lot of rote construction projects should be done this way. A complex but often-executed project should run this way. I am talking about research. Research is the exemplar of what should absolutely not be run this way, yet we do it with almost Pavlovian glee.

Remember the two benefits of failure. First, if you do fail, you learn what doesn’t work; and second, the failure gives you the opportunity to try a new approach.

—Roger Von Oech

The biggest problem is that this approach to project management, along with time reporting, is crushing innovation in research. Innovation simply doesn’t work this way, and it’s likely that the management is actually undermining the capacity to do innovative work. We have come up with a name for what they ask for: “the scheduled breakthrough”. You predetermine what your “breakthrough” will be. Of course research plainly doesn’t work that way; that’s why it’s research. The more you are held to the breakthroughs you promised in your plan, the less likely real success will be.

Bureaucracy destroys initiative. There is little that bureaucrats hate more than innovation, especially innovation that produces better results than the old routines. Improvements always make those at the top of the heap look inept. Who enjoys appearing inept?

—Frank Herbert

Perhaps the worst thing about this approach is what the planning is doing to our ability to think outside the box. We are increasingly defining objectives that we know we can accomplish instead of reaching for new things. If an objective looks too difficult or risky, it will be rejected. The more we hold people to their written objectives, the more we rule out innovation and discovery. The only time people put risky objectives down in their plans is when the discovery has already been made. In other words, the plan is to deliver work you’ve already completed. This is even worse than mediocrity; it is stasis.

Following the rules of your industry will only get you so far.

—Max McKeown

This trend has been going on for several decades now, and the entire approach has led to a decline in the quality of the work at the Labs. The oversight and attention to detail are directly associated with working in an environment where trust is low. While I believe that my managers trust me, it is clear that our country does not trust the Labs or science as a whole. So we have this system to enforce accountability. Maybe it does that, but it also undermines the effectiveness of the work. It is choking our science to death. I believe many of the same things are happening more broadly in research at places like universities. We have more accountability and worse results. The system is seriously harming our country, and there is no end in sight.

Where all think alike there is little danger of innovation.

—Edward Abbey

I have the option of being either honest and ineffective or dishonest and effective. What a horrible choice I’m being offered from a horrible, stupid system. This is what happens when trust is low.

Plans are of little importance, but planning is essential.

― Winston Churchill

Robustness is Stability, Stability is Robustness, Almost

03 Wednesday Dec 2014

Posted by Bill Rider in Uncategorized

≈ 3 Comments

Recently, I wrote about the priorities in code development, putting accuracy and efficiency last in the list (https://williamjrider.wordpress.com/2014/11/21/robust-physical-flexible-accurate-and-efficient/). Part of the not-so-implied critique is that the relative emphasis in development today is very nearly the opposite of my list: high performance computing and applied mathematics are mostly concerned with efficiency (first) and accuracy (second). I believe these priorities do us a disservice, represent a surplus of hubris, and fail to recognize some rather bold unfinished business with respect to stability theory.

All that it is reasonable to ask for in a scientific calculation is stability, not accuracy.

–Nick Trefethen

I thought about what I wrote a few weeks ago and realized that when I say robust, I mean almost the same thing as stable. Well, almost the same is not the same. Robust is actually a stronger statement since it implies that the answer is useful in some sense. A stable calculation can certainly produce utter and complete gibberish (it may be even more dangerous to produce realistic-looking, but qualitatively and quantitatively useless, results). I might posit that robustness could be viewed as a stronger form of stability, one that provides a guarantee that the result should not be regarded as bullshit.

Perhaps this is the path forward I’m suggesting. The theory of PDE stability is rather sparse and barren compared to ODE theory. PDE stability is really quite simple conceptually, while ODE stability theory is rich with detail and nuance: one has useful and important concepts such as A-stability and L-stability, and appealing constructs such as relative stability and order stars, which have no parallel in PDE stability. I might be so bold as to suggest that PDE stability theory is incomplete and unfinished. We moved on to accuracy and efficiency and never returned to finish the foundation they should be built upon. We are left with a field that has serious problems with determining the quality and correctness of solutions (https://williamjrider.wordpress.com/2014/10/15/make-methods-better-by-breaking-them/, https://williamjrider.wordpress.com/2014/10/22/821/).
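To make the A-stability concept concrete, here is a minimal sketch (the function names and sampling parameters are my own choices): a one-step method applied to y′ = λy advances via y_{n+1} = R(z) y_n with z = λΔt, and A-stability requires |R(z)| ≤ 1 everywhere in the left half-plane. A finite sample can only probe this, not prove it, but it cleanly separates forward from backward Euler.

```python
import math
import cmath

def a_stable_on_sample(R, n=60, radius=50.0):
    """Return True if |R(z)| <= 1 at every point of a polar sample of the
    open left half-plane. A finite check, not a proof of A-stability."""
    for k in range(1, n + 1):
        r = radius * k / n
        for j in range(1, n):
            theta = math.pi / 2 + math.pi * j / n  # angles strictly in (pi/2, 3pi/2)
            z = r * cmath.exp(1j * theta)
            if abs(R(z)) > 1.0 + 1e-12:
                return False
    return True

# Stability functions for y' = lam*y with z = lam*dt.
forward_euler = lambda z: 1 + z          # only conditionally stable
backward_euler = lambda z: 1 / (1 - z)   # A-stable (indeed L-stable: R -> 0 as z -> -inf)
```

On this sample, backward Euler passes and forward Euler fails, which is exactly the distinction the ODE stability vocabulary was built to express.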

Maybe a useful concept would be robust stability: what are the conditions under which we can expect results to be physical and nonlinearly stable? Instead, the concept of robustness often gets a bad name because it implies tricks and artifices used to force results through. A key point is that robustness is necessary for codes to do useful work, yet the work of making methods robust is looked down upon. Doing this sort of work successfully resulted in a backhanded compliment/slight being thrown my way:

you’re really good at engineering methods.

Thanks, I think. It sounds a lot like,

you’re a really good liar

In thinking about numerical methods, perhaps the preeminent consideration is stability. As I stated, it is foundational for everything. Despite its centrality to the discussion today, stability is a relative latecomer to the basic repertoire of the numerical analyst, only being invented in 1947, while many basic concepts and methods precede it. Moreover, its invention in numerical analysis is extremely revealing about the fundamental nature of computational methods: having computers, and problems to solve with them, drives the development of methods.

Recently I gave a talk on the early history of CFD (https://williamjrider.wordpress.com/2014/05/30/lessons-from-the-history-of-cfd-computational-fluid-dynamics/) and did a bit of research on the origin of some basic concepts. One of my suppositions was that numerical stability theory for ODEs must have preceded that for PDEs. This was not true! PDEs came first. The reason is the availability and use of automatic computation (i.e., computers). Because of the application of PDEs to important defense work during and after World War II, the problem of stability had to be confronted. Large-scale use of computers for integrating ODEs didn’t come along until a few years later. The origins of stability theory and its recognition are related in a marvelous paper by Dahlquist [Dahlquist], which I wrote about earlier (https://williamjrider.wordpress.com/2014/08/08/what-came-first-the-method-or-the-math/). There I expressed my annoyance at the style of mathematics papers that obscures the necessary human element in science in what I believe to be a harmful manner. The lack of proper narrative allows the history and impact of applied math to be lost in the sands of time.

The PDE stability theory came first and was clearly articulated by John von Neumann, first communicated during lectures in February 1947 and in a report that same year [VNR47]. These same concepts appeared in print, albeit obliquely, in von Neumann and Goldstine [VNG47] and Crank and Nicolson’s classic [CN47]. Joe Grcar gives a stunning and full accounting of the work of von Neumann and Goldstine and its impact on applied mathematics and computing in SIAM Review [Grcar]. Since von Neumann had access to computing and saw its power, he saw stability issues first hand and tackled them. He had to; it bit him hard in 1944 [MR14]. His stability analysis methodology is still the gold standard for PDEs (https://williamjrider.wordpress.com/2014/07/15/conducting-von-neumann-stability-analysis/).
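The methodology itself fits in a few lines. As a sketch (the scheme is the textbook explicit heat-equation discretization; function names are mine): substituting a Fourier mode u_j^n = g^n e^{iθj} into the scheme yields an amplification factor g(θ), and von Neumann stability requires max|g| ≤ 1 over all angles.

```python
import math

# Explicit (FTCS) scheme for the heat equation u_t = u_xx:
#   u_j^{n+1} = u_j^n + nu * (u_{j+1}^n - 2 u_j^n + u_{j-1}^n),  nu = dt/dx^2.
# A Fourier mode u_j^n = g^n * exp(i*theta*j) gives the amplification factor
#   g(theta) = 1 - 4*nu*sin^2(theta/2).
def amplification(nu, theta):
    return 1.0 - 4.0 * nu * math.sin(theta / 2.0) ** 2

def max_growth(nu, samples=720):
    """Largest |g| over a sweep of Fourier angles; stability requires <= 1."""
    return max(abs(amplification(nu, 2.0 * math.pi * k / samples))
               for k in range(samples))
```

Sweeping ν recovers the classic stability limit ν ≤ 1/2: `max_growth(0.4)` stays at or below one, while `max_growth(0.6)` exceeds it (the θ = π mode grows).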

Another theme worth restating is the role that (mis-)classification of the early reports had in muddying the history. LA-657, the report containing the first mention of stability in numerical analysis, was classified until 1993 even though its content is clearly unclassified (https://williamjrider.wordpress.com/2014/11/20/the-seven-deadly-sins-of-secrecy/). As it turned out, the official unveiling of the ideas regarding stability of PDEs came in two papers in 1950 [VNR50, CFVN50].

As Dahlquist relays, the PDE world had a head start, and other important work was conducted, perhaps most significantly the equivalence theorem of Lax [LaxEquiv]. This theorem was largely recreated independently by Dahlquist two or three years later (he reports that Lax gave the theory in a seminar in 1953). The equivalence theorem states that, for a consistent approximation to a well-posed linear problem, stability is equivalent to convergence. Being rather flip about this: stability means getting an answer, and consistency means solving the right problem.
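The theorem’s bite shows up numerically. Here is a toy demonstration (grid size, CFL number, and step counts are illustrative choices of mine): two consistent discretizations of the advection equation u_t + a u_x = 0, one stable and one not. Per the equivalence theorem, only the stable one can converge; the other produces exponentially growing garbage.

```python
import math

def step_upwind(u, c):
    """First-order upwind for u_t + a*u_x = 0 (a > 0), periodic grid.
    Consistent, and stable for CFL number c <= 1."""
    return [u[j] - c * (u[j] - u[j - 1]) for j in range(len(u))]

def step_centered(u, c):
    """Forward-time centered-space: also consistent, but unconditionally
    unstable (|g|^2 = 1 + c^2 sin^2(theta) > 1 for every dt)."""
    n = len(u)
    return [u[j] - 0.5 * c * (u[(j + 1) % n] - u[j - 1]) for j in range(n)]

def final_amplitude(step, steps=200, n=16, c=0.8):
    """March a smooth sine wave and report the final max amplitude."""
    u = [math.sin(2.0 * math.pi * j / n) for j in range(n)]
    for _ in range(steps):
        u = step(u, c)
    return max(abs(x) for x in u)
```

After 200 steps the upwind solution stays bounded by its initial amplitude, while the centered scheme’s amplitude has grown by orders of magnitude: stability, not consistency, is what separates them.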

 

From there the ODE theory flowered and grew into the impressive tapestry we have today. A meaningful observation is that we have a grasp of the analytical theory for the solution of ODEs that still eludes us with PDEs. Perhaps the PDE stability theory would flow like water from a breaking dam were such an analytical theory available. I’m not so sure. Maybe the ODE theory is more the consequence of the efforts of a few people, or of a culture that was different from the culture responsible for PDEs. It’s worth thought and discussion.

The investigator should have a robust faith – and yet not believe.

― Claude Bernard

[LaxEquiv] Lax, Peter D., and Robert D. Richtmyer. “Survey of the stability of linear finite difference equations.” Communications on Pure and Applied Mathematics 9, no. 2 (1956): 267-293.

[VNG47] Von Neumann, John, and Herman H. Goldstine. “Numerical inverting of matrices of high order.” Bulletin of the American Mathematical Society 53, no. 11 (1947): 1021-1099.

[Dahlquist] Dahlquist, Germund. “33 years of numerical instability, Part I.” BIT Numerical Mathematics 25, no. 1 (1985): 188-204.

[CN47] Crank, John, and Phyllis Nicolson. “A practical method for numerical evaluation of solutions of partial differential equations of the heat-conduction type.” In Mathematical Proceedings of the Cambridge Philosophical Society, vol. 43, no. 1, pp. 50-67. Cambridge University Press, 1947.

[CFVN50] Charney, Jules G., Ragnar Fjörtoft, and John von Neumann. “Numerical integration of the barotropic vorticity equation.” Tellus 2, no. 4 (1950): 237-254.

[VNR50] Von Neumann, John, and Robert D. Richtmyer. “A method for the numerical calculation of hydrodynamic shocks.” Journal of Applied Physics 21, no. 3 (1950): 232-237.

[VNR47] Von Neumann, John, and Robert D. Richtmyer. “On the numerical solution of partial differential equations of parabolic type.” Los Alamos Scientific Laboratory Report LA-657, December 1947.

[Grcar] Grcar, Joseph F. “John von Neumann’s analysis of Gaussian elimination and the origins of modern numerical analysis.” SIAM Review 53, no. 4 (2011): 607-682.

[MR14] Mattsson, Ann E., and William J. Rider. “Artificial viscosity: back to the basics.” International Journal for Numerical Methods in Fluids (2014). DOI 10.1002/fld.3981

Are choices a blessing or a curse?

01 Monday Dec 2014

Posted by Bill Rider in Uncategorized

≈ Leave a comment

We are our choices.

― Jean-Paul Sartre

Life comes with many choices: what to do, what to eat, buy, watch, listen to, and so on. Depending on your personal tastes, these choices are wonderful or a burden. If you really care about something, quite often you demand choices to be happy; you won’t be pleased with limited options when you know something better isn’t even being offered. In other cases, where you aren’t emotionally invested, too many choices can be a burden and unwelcome. You just need something functional and aren’t willing to expend the effort to sift through a bunch of alternatives. This distinction happens over and over across our lives, both personal and professional.

What one person demands as a phalanx of options is a crushing affront to another. The demand for choice comes from the aficionado who sees the texture and variation among the options; to them, having no options can feel like acquiescing to something awful. This can be true even when the single option on offer is acknowledged as the best and would be chosen from among many. On the other hand, for someone who doesn’t care about the details, the mediocre is just fine. It isn’t that they wouldn’t like something better; it is that they can’t tell the difference or don’t care. This dichotomy exists in everyone and varies from topic to topic. It plays a huge role in science and engineering. I am certainly guilty of it, and I suspect all of you are too.

In any moment of decision, the best thing you can do is the right thing. The worst thing you can do is nothing.
― Theodore Roosevelt

A while back I wrote about what I don’t like about the finite element method (FEM) (https://williamjrider.wordpress.com/2014/08/01/what-do-i-have-against-the-finite-element-method/). Over the long weekend I was thinking about adaptivity and robustness in numerical methods. Some of my thoughts were extensions of the Riemann solver work discussed last week. When my thoughts turned to finite element methods, it dawned on me that I didn’t have many options, or more properly, the extensive choices offered by other frameworks. The choices I did have were limited in scope and flexibility. Some approaches to method adaptation were simply absent.

I realized that this was what really deeply bothered me about finite elements. It isn’t the method at all; it’s the lack of options available to engineer the method. For a lot of engineers the FEM is a “turn the crank” exercise. You get a mesh, and pick the degrees of freedom, put the governing equations into the weak form and integrate. You have a numerical method and you are done. For complex physics this approach can be woefully inadequate and with the FEM you aren’t left with much to do about it.

Working at Sandia, one thing is always true: the code you write will implement the FEM. With a new project, it would generally be very beneficial to have multiple valid discretizations on the same mesh. This would enable a number of things, such as error estimation, resilience against hardware errors, and more robust overall algorithms. The problem is that the FEM generally offers a single preferred discretization once the mesh and associated elements are chosen.

To some extent this is overstated. Some FEM methods offer a bit more in the way of options, such as discontinuous Galerkin. Additionally, one could choose to over- or under-integrate, lump the mass matrix, or apply a stabilization method. Even then, the available options for discretizing are rather barren compared with finite volume or finite difference methods. It feels like a straitjacket by contrast to their relatively unconstrained freedom. Even the options I once worked with were too constrained compared with the universe of possibilities I discovered in my most recent paper (“Revisiting Remap Methods,” DOI 10.1002/fld.3950).

The hardest choices in life aren’t between what’s right and what’s wrong but between what’s right and what’s best.
― Jamie Ford

For people whose job is doing analysis of physical or engineered systems with codes, the options are a burden. They just want something that works and don’t care much about the details. They graciously accept something better or improved even if they couldn’t articulate the reasons for the improvement. With commercial CFD codes this situation has become critical; these codes reflect a relatively stagnant state of affairs in CFD methods.

For me this is a particular issue in the area of shock physics. Most of the users of shock physics codes are completely happy with their options. For some, the code simply needs to run to completion and produce something that looks plausibly realistic. To me this seems like a god-awfully low standard, and I see methods that are antiquated and backwards. The code users usually only notice a new method when something bad happens: the code runs slower, the answer changes from the one they’ve grown accustomed to, or the code crashes. It is a rarity for a new method to be greeted as a benefit. The result is stagnation and a dearth of progress.

Sometimes you have to choose between a bunch of wrong choices and no right ones. You just have to choose which wrong choices feels the least wrong.
― Colleen Hoover

This trend is fairly broad. As numerical methods have matured, the codes based upon them have stabilized because users are generally satisfied with the options offered. Improvements in methodology are not high on their wish lists. Moreover, they have a general understanding of the methods the codes are based on and little interest in the methods that might be developed to improve upon them. As such, the bar for improving codes has been raised to a very high level. With the cost of implementing codes on new computer architectures growing, the tide has turned to a focus on running legacy methodology on modern computers, sapping the impetus to improve.

We have become a community that sees options as a burden. Other burdens, such as changes in computers, are sapping the appetite for the options that do exist. As time goes by, the blessings seem more and more distant and foreign to the thought process. Moreover, the users of codes don’t see the effort put into better methods as a virtue; they want to see a focus on improving the capacity to model the physical systems they are interested in. Part of this relates strongly to the missing elements in the education of people engaged in modeling and simulation. The impact of numerical methods on the modeling of physical systems is grossly under-appreciated, which leads to a loss of perspective. Methods in codes are extremely important and impactful (artificial, numerical and shock dissipation, anyone?). Users tend to come close to completely ignoring this aspect of their modeling because its impact is esoteric.

When faced with two equally tough choices, most people choose the third choice: to not choose.

― Jarod Kintz


Resistance is Futile

29 Saturday Nov 2014

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Time is an illusion.

― Albert Einstein

Time is relentless. As an opponent it is unbeatable and can only be temporarily held at bay. We all lose to it, with death the inevitable outcome. Science uses the second law of thermodynamics as the lord of time: it establishes a direction defined by the creation of greater disorder. In many ways the second law stands apart from other physical laws in its fundamental nature. It describes the basic character of change, but not its details.

But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

— Sir Arthur Stanley Eddington

Change is constant and must be responded to. The challenge of the continual flow of events provides the key distinction in response. On the one hand, conservatives resist the flow and attempt to retain the shape of the World. Liberals and progressives work to shape the change so that the World changes for the better. Where the conservative sees the best future in the past, the progressive sees the best future as a new beginning.

Change isn’t made by asking permission. Change is made by asking forgiveness, later.

― Seth Godin

These tendencies are seen in differing tastes in the arts. Take music, where oldies are the staple of conservatives who don’t warm to newer ideas. The old standards of their childhood and teen years make for a calming influence and sentimental listening. The progressive ear looks for new combinations rather than the familiar; invention and improvisation are greeted warmly as a new challenge to one’s tastes. For example, rap is viewed as not being music of any sort by the conservative ear, and greeted as stunningly original by the liberal ear. On the one hand the past is viewed as a template for the future; on the other, changes are seen as the opportunity for improvement. This tension is at the core of humanity’s struggle for mastery over time.

Our time is marked by certain emblematic moments such as 9/11, Nixon’s resignation, or the fall of the Berlin Wall. Each of these moments clearly defines a transition from everything before it to everything after it. Some of these moments are simply climaxes to the events preceding them. The horror of 9/11 started with the rise of Islam 1400 years ago, continuing with the Crusades, European colonialism, the oil crisis of the 70’s, American support for the Shah and his fall with the rise of Islamic fundamentalism, the Soviet invasion of Afghanistan, the American invasion of Iraq, and the constancy of Arab tension over Israel. Our response has assured that the war will continue and has only inflamed more terrorism. Rather than short-circuit the cycle of violence, we have amplified it and assured its continuation for another generation. We have learned nothing from the history leading up to the events of September 11, 2001.

Tradition becomes our security, and when the mind is secure it is in decay.

― Jiddu Krishnamurti

These developments highlight some of the key differences between conservative and liberal responses to crisis. The conservative response usually takes little note of history and applies power as the strategy. Power usually suits the powerful, being arguably simple and usually effective. Liberals and progressives, on the other hand, are eager to take a different path and try something new, but often encounter paralysis from analysis of the past. The different approach is often a failure, but when it succeeds the results are transformative. Power’s success only reinforces those applying it. Ultimately, when power encounters the right challenge, it fails and upsets the balance. In the end the power is reset, and eventually the balance is restored with a new structure at the helm.

Societies in decline have no use for visionaries.

― Anaïs Nin

In science, the same holds: conventional theories and approaches work almost all the time, but when they are overturned it is monumental. Even there, conservative approaches are the workhorse and the obvious choice. Every so often they are exposed by something progressive and new that produces results the old approaches could not. This is the sort of thing Kuhn wrote about with revolutions in science. As with other human endeavors, the liberal and progressive wing leads science’s advance; the day-in, day-out work of science is left to the conservative side of things.

 “Normal science” means research firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundation for its further practice.

— Thomas Kuhn

So we are left with a balance to achieve. How do we handle the inevitability of change from the remorseless march of time? Are we interested in the conservative approach leading to uninspired productivity? Or progressive breakthroughs that push us forward, but most often end in failure?

All the effort in the world won’t matter if you’re not inspired.

― Chuck Palahniuk

Adaptivity is Under Utilized

28 Friday Nov 2014

Posted by Bill Rider in Uncategorized

≈ Leave a comment

The measure of intelligence is the ability to change.

― Albert Einstein

In looking at the codes we work with today, one thing stands out: the methods used in production software are generally much simpler than they should be. Advances that should have been commonplace by now aren’t present. There seems to be a good reason for this; the complexity of implementing algorithms on modern computers biases choices toward the simple. The result is a relative stagnation in algorithms, with a telltale sign being that adaptive concepts are utilized far less than one would have imagined.

Extraordinary benefits also accrue to the tiny majority with the guts to quit early and refocus their efforts on something new.

― Seth Godin

The type of adaptivity most commonly seen is associated with adaptive grids (or “h” refinement). Grids lend themselves to straightforward understanding and impressive visualization. Even so, this form of adaptivity is seen far less than one might have predicted twenty years ago. Adaptivity takes other forms far less common than h-refinement, such as p-adaptivity, where the order of an algorithm is adjusted locally. A third classical form is r-adaptivity, where the mesh is moved locally to improve solutions; this is the second most common approach, in the guise of remesh-remap methods (or ALE codes). I’d like to chat about a handful of other approaches that could be big winners in the future, especially if combined with the classical approaches.

To improve is to change; to be perfect is to change often.

― Winston S. Churchill

One of the really big options to exercise with adaptivity is the algorithm itself. Simply changing the algorithm based on local solution characteristics should yield great enhancements in accuracy and robustness. Taken broadly, the concept has been around a long time even if it isn’t recognized as such. Right from the beginning, with Von Neumann and Richtmyer’s artificial viscosity, the addition of nonlinear dissipation renders the method adaptive: the dissipation is effectively zero if the flow is smooth, and dominant if the flow is discontinuous. Upwinding is another such approach, where the support (or stencil) for a method is biased by the physics for better (less accurate, but physical) results.
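The switch-like character of the Von Neumann–Richtmyer viscosity can be sketched in a few lines. This is a simplified scalar form of the quadratic q, with an illustrative coefficient; the full scheme lives inside a staggered-grid hydrodynamics update.

```python
def artificial_viscosity(rho, du, c_q=2.0):
    """Von Neumann-Richtmyer quadratic q: a pressure-like dissipation that is
    active only where the zone's velocity jump du = u_{j+1} - u_j indicates
    compression (du < 0), and identically zero in smooth or expanding flow.
    c_q is an O(1) tunable coefficient (illustrative value here)."""
    if du < 0.0:
        return c_q * rho * du * du
    return 0.0
```

Because q scales with (Δu)², it vanishes quadratically as the flow smooths out, which is precisely the nonlinear, self-adapting behavior described above.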

These are relatively simple ideas. More complex adaptation in algorithms is associated with methods that use nonlinear stencils, usually defined by limiters. These methods use a solution quality principle (typically monotonicity or positivity) to define how a computational stencil is chosen (FCT, MUSCL, and TVD are good examples). More advanced methods such as essentially non-oscillatory (ENO) or the elegant weighted ENO (WENO) method take this adaptivity up a notch. While algorithms like FCT and TVD are common in codes, ENO hasn’t caught on in serious codes, largely due to complexity and a lack of overall robustness. The robustness problems are probably due to the focus on accuracy over robustness as the key principle in stencil selection.

One area where adaptivity may be extremely useful is the construction of composite algorithms. The stencil selection in ENO or TVD is a good example, as each individual stencil is a consistent discretization itself; the method is made more effective and higher quality through the nonlinear selection procedure. Another good example of this principle is the compositing of multigrid methods with Krylov iterations. Neither method is as effective on its own: they suffer from either a lack of robustness (multigrid) or suboptimal scaling (Krylov). Together, the methods have become the standard. Part of the key to a good composite is the complementarity of the properties; in this case multigrid provides optimal scaling and Krylov offers stability. This isn’t entirely unlike TVD methods, where upwinding offers the stability and one of the candidate stencils offers optimal accuracy.
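The complementarity behind multigrid compositing can be seen in a toy. The sketch below (all names, grid sizes, and weights are my illustrative choices) composites a weighted-Jacobi smoother, which kills high-frequency error but stalls on smooth error, with a coarse-grid correction that handles the smooth part, for the 1D Poisson problem −u″ = f:

```python
import math

def residual(u, f, h):
    """r = f - A u for the 1D Poisson operator -u'' with zero boundaries."""
    n = len(u)
    return [f[i] - (2.0 * u[i]
                    - (u[i - 1] if i > 0 else 0.0)
                    - (u[i + 1] if i < n - 1 else 0.0)) / h ** 2
            for i in range(n)]

def jacobi(u, f, h, weight=2.0 / 3.0):
    """One weighted-Jacobi sweep: a good smoother but a very slow solver."""
    n = len(u)
    return [(1.0 - weight) * u[i]
            + weight * 0.5 * (h ** 2 * f[i]
                              + (u[i - 1] if i > 0 else 0.0)
                              + (u[i + 1] if i < n - 1 else 0.0))
            for i in range(n)]

def solve_tridiag(h, d):
    """Direct (Thomas) solve of the coarse Poisson system."""
    n = len(d)
    a, b = -1.0 / h ** 2, 2.0 / h ** 2
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = a / b, d[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i] = a / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def two_grid_cycle(u, f, h):
    """Smooth, restrict the residual, solve coarse exactly, correct, smooth."""
    for _ in range(2):
        u = jacobi(u, f, h)
    r = residual(u, f, h)
    m = (len(u) - 1) // 2                      # coarse interior points
    rc = [(r[2 * j] + 2.0 * r[2 * j + 1] + r[2 * j + 2]) / 4.0  # full weighting
          for j in range(m)]
    ec = solve_tridiag(2.0 * h, rc)
    e = [0.0] * len(u)
    for j in range(m):
        e[2 * j + 1] = ec[j]                   # injection at coincident points
    for j in range(m + 1):
        e[2 * j] = ((ec[j - 1] if j > 0 else 0.0)
                    + (ec[j] if j < m else 0.0)) / 2.0  # linear interpolation
    u = [ui + ei for ui, ei in zip(u, e)]
    for _ in range(2):
        u = jacobi(u, f, h)
    return u
```

On a 31-point grid with f = sin(πx), a handful of two-grid cycles drives the residual down by many orders of magnitude, while the same total number of Jacobi sweeps alone barely reduces it; neither component does this by itself, which is the composite-algorithm point.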

A third area to consider is adaptive modeling. One example can be found in multiscale methods, where a detailed (assumed more accurate) model of the physics is used to make up for a crude baseline model. In many cases multiple models might be considered valid or applicable, as in turbulence, failure, or fracture modeling; in other cases none of the available models might be applicable. It might make sense to solve all the models and establish conditions for choosing or compositing their effect on the solution. If done correctly, the limitations of a single model might be overcome through the selection procedure. In each of the cases mentioned above, the current approaches are woefully inadequate.

A general issue that, in my estimation, is holding adaptivity back is the relative balance of focus on accuracy over robustness. I believe that tipping the balance toward the robustness demanded by applications could yield real progress. In academic research, accuracy is almost always the focal point, often at the cost of robustness; efficiency is the second focal point that undermines adaptivity’s adoption by codes. As I noted in an earlier post, https://williamjrider.wordpress.com/2014/11/21/robust-physical-flexible-accurate-and-efficient/, the emphasis is often opposite to what applications demand. The combination of robustness-physicality-flexibility might do well to replace the typical accurate-efficient focus. The efficiency focus has hamstrung methods development for the whole of the MPP era, and the next generation of computers promises to make this worse. Combined with the research focus on accuracy, this spurs outright stagnation in the deployment of the adaptive approaches that ought to be dominating computation today.

The world as we have created it is a process of our thinking. It cannot be changed without changing our thinking.

― Albert Einstein

Despite massive advances in the raw power of computers, we have missed immense opportunities to unleash their full potential. The mindset that created this environment is still dominant; more emphasis is placed on running old methodology on new computers than on inventing new (better) methodologies optimal for the new computers. The result of this pervasive mismanagement is a loss of opportunity and a loss of potential. The end result is also a lack of true capability and problem-solving capacity on these computers. Over time this stagnation has cost us more problem-solving capability than we have gained from faster computers.

It’s never too late

…to start heading in the right direction.

― Seth Godin
