The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: November 2015

Calibration, Knobs and Uncertainty

Friday, 27 November 2015

Posted by Bill Rider in Uncategorized


When the number of factors coming into play in a phenomenological complex is too large scientific method in most cases fails. One need only think of the weather, in which case the prediction even for a few days ahead is impossible.

― Albert Einstein

One of the dirty little secrets of computing in the scientific and engineering worlds is that the vast majority of serious calculations are highly calibrated (and that’s the nice way to say it). In many important cases, the quality of the “prediction” is highly dependent upon models being calibrated against data. In some cases, calling these calibrated constructs “models” does modeling a great disservice; the calibration instruments are simply knobs used to tune the calculation. The tuning accounts for serious modeling shortcomings and often allows the simulation to produce results that approximate the fundamental balances of the physical system. Often, without the calibrated or knobbed “modeling,” the entire simulation is of little use and bears no resemblance to reality. In all cases this essential simulation practice creates a huge issue for proper and accurate uncertainty estimation.
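To make the knob idea concrete, here is a minimal sketch of what calibration typically looks like in practice: a model with one free coefficient (the knob) is tuned by least squares until its output matches a handful of measurements. The model form, the synthetic data, and the use of SciPy’s curve_fit are my own illustrative choices, not anything specific to the codes discussed in this post.

```python
# Hypothetical sketch: tuning a single "knob" so a toy model matches data.
# The model and data are invented; only the calibration pattern is the point.
import numpy as np
from scipy.optimize import curve_fit

def model(x, knob):
    """Toy model with a decay rate we cannot derive from first principles,
    so our ignorance is absorbed into one tunable coefficient."""
    return np.exp(-knob * x)

# Synthetic "experimental" data over a limited range of conditions.
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 2.0, 8)
y_data = np.exp(-0.7 * x_data) + 0.02 * rng.normal(size=x_data.size)

# Least-squares calibration of the knob against the data.
knob_fit, knob_cov = curve_fit(model, x_data, y_data, p0=[1.0])
print(f"calibrated knob = {knob_fit[0]:.3f} +/- {np.sqrt(knob_cov[0, 0]):.3f}")

# The calibrated model is only trustworthy near the conditions in x_data;
# using it at, say, x = 10 is an extrapolation the fit says nothing about.
```

The fitted knob and the covariance that comes with it are entirely creatures of the data; nothing in the fit says anything about conditions the data never visited.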

Confidence is ignorance. If you’re feeling cocky, it’s because there’s something you don’t know.

― Eoin Colfer

At some deep level the practice of calibrating simulations against data is entirely unavoidable. Behind this unavoidable reality is a more troubling conclusion: our knowledge of the World is substantially less than we might like to admit to ourselves. By the same token, the actual uncertainty in our knowledge is far larger than we are willing to admit. The sort of uncertainty that is present cannot be meaningfully addressed through a focus on more computing hardware (its assessment could be helped, but the problem not solved). This uncertainty can only be addressed through a systematic effort to improve models and to engage in broad experimental and observational science and engineering. If we work hard to actively understand reality better, the knobs can be reduced or even removed as knowledge grows. This is exactly the sort of risky work our current research culture eschews as a matter of course.

Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.

― Bertrand Russell

This aspect of modeling and simulation is essential, to varying degrees, in many application areas. If we are to advance the use and utility of modeling and simulation with confidence, it must be dealt with in a better and more honest way. It is useful to point to a number of important applications where calibration or knobs are essential to success. For air flow over an airplane or an automobile, turbulence modeling is essential, and turbulence is one of the key areas for calibrated results. Climate and weather modeling is another area where knobs are utterly essential. Plasma physics is yet another area where the modeling is so poor that calibration is absolutely necessary; inertial and magnetically confined fusion both require knobs to allow simulations to be useful. In addition to turbulence and mixing, various magnetic or laser physics add to the problems with simulation quality, which can only be dealt with effectively through calibration and knobs.

You couldn’t predict what was going to happen for one simple reason: people.

― Sara Sheridan

The conclusion that I’ve come to is that the uncertainty in calibrated or knobbed calculations has two distinct faces, each of which should be fully articulated by those conducting simulations. One is the best-case scenario of the simulated uncertainty, which depends on the modeling and its calibration being rather complete and accurate in capturing reality. The second is the pessimistic case where the uncertainty comes from the lack of knowledge that led to the need for calibration in the first place. If the simulation is calibrated, the truth is that the calibration is highly dependent upon the data used, and any guarantee of validity depends on matching the conditions closely associated with that data. Outside the range where the data was collected, the calibration should carry with it greater uncertainty. The further we move outside the range defined by the data, the greater the uncertainty.

This is most commonly seen in curve fitting using regression. Within the range of the data, the curve and the data are closely correlated and the standard uncertainties are relatively small. When the fit is taken outside the range of the data, the uncertainty grows much larger. In the assessment of uncertainty in calculations this is rarely taken into account. Generally, those using calculations are blithely unaware of whether the calibrations they rely on are well within their range of validity. Calibration is also imperfect and carries with it an error intrinsic to the determination of the settings. The uncertainty associated with the data itself is always an issue, whether one takes the optimistic or the more pessimistic face of uncertainty.
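Here is a small, self-contained illustration of that regression point, using made-up data and the standard prediction-interval formula for a straight-line fit. None of it comes from the post itself; it simply shows the band staying narrow inside the data range and growing as you extrapolate beyond it.

```python
# Hypothetical sketch: prediction intervals for a least-squares line widen
# as you move outside the range of the calibration data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 20)               # data collected on [0, 10]
y = 2.0 + 0.5 * x + rng.normal(scale=0.4, size=x.size)

n = x.size
slope, intercept = np.polyfit(x, y, 1)       # ordinary least-squares line
resid = y - (intercept + slope * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))      # residual standard error
t = stats.t.ppf(0.975, df=n - 2)             # two-sided 95% t critical value

def prediction_halfwidth(x_new):
    """Half-width of the 95% prediction interval at x_new."""
    return t * s * np.sqrt(1.0 + 1.0 / n
                           + (x_new - x.mean())**2 / np.sum((x - x.mean())**2))

for x_new in (5.0, 10.0, 20.0, 40.0):        # inside, at the edge, well outside
    print(f"x = {x_new:5.1f}   prediction +/- {prediction_halfwidth(x_new):.2f}")
```

The same qualitative behavior shows up, in a far less transparent form, when a calibrated simulation is pushed beyond the conditions of the data used to set its knobs.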

A potentially more problematic aspect of calibration is using the knobs to account for multiple effects (turbulence, mixing, plasma physics, radiation and numerical resolution are common). In these cases the knobs may account for a multitude of poorly understood physical phenomena, mystery physics and lack of numerical resolution. This creates a massive opportunity for severe cognitive dissonance, which is reflected in an over-confidence in simulation quality. Scientists using simulations like to give those funding their work greater confidence than the work should carry, because the actual uncertainty would trouble those paying for it. Moreover, the range of validity for such calculations is not well understood or explicitly stated. One of the key aspects of the calibration being necessary is that the calculation cannot reflect a real-world situation without it. The model simply misses key aspects of reality without the knobs (climate modeling is an essential example).

In the case of knobs accounting for numerical resolution, the effect is usually crystal clear: the calibration of the knob settings needs to be redone whenever the numerical resolution changes because a new, faster computer becomes available. The problem is that those conducting the calculations rarely make a careful accounting of this effect. They simply recalibrate the calculations and go on without ever making much of it. This often reflects a cavalier attitude toward computational simulation that rarely intersects with high quality. This lack of transparency can border on delusional. At best it is simply intellectually sloppy; at worst it reflects a core of intellectual dishonesty. In either case a better path is available to us.

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

In essence there are two uncertainties that matter: the calibrated uncertainty, where data is keeping the model reasonable, and the actual predictive uncertainty, which is much larger and reflects the lack of knowledge that makes the calibration necessary in the first place. Another aspect of modeling in the calibrated setting is the proper use of the model for computing quantities. If the quantity coming from the simulation can be tied to the data used for calibration, the calibrated uncertainty is a reasonable thing to use. If the quantity from the simulation is inferred and not directly calibrated, the larger uncertainty is appropriate. Thus we see that the calibrated model has intrinsic limitations, and cannot be used for predictions that go beyond the data’s physical implications. For example, climate modeling is certainly reasonable for examining the mean temperature of the Earth. On the other hand, the data associated with extreme weather events like flooding rains are not calibrated, and the uncertainty regarding their prediction under climate change is more problematic.

In modeling and simulation nothing comes for free. If the model needs to be calibrated to accurately simulate a system, the modeling is limited in an essential way. The limitations in the model are uncertainties about aspects of the system tied to the modeling inadequacies. Any predictions of the details associated with these aspects of the model are intrinsically uncertain. The key is the acknowledgement of the limitations associated with calibration. Calibration is needed to deal with uncertainty about modeling, and the lack of knowledge limits the applicability of simulation. A rational user applies such modeling cautiously. Unfortunately, people are not rational and tend to put far too much faith in these calibrated models. In these cases they engage in wishful thinking, and fail to account for the uncertainty in applying the simulations for prediction.

It is impossible to trap modern physics into predicting anything with perfect determinism because it deals with probabilities from the outset.

― Arthur Stanley Eddington

If we are to improve the science associated with modeling and simulation, the key is uncertainty. We should charter work that addresses the most important uncertainties through well-designed scientific investigations. Many of these mysteries cannot be addressed without adventurous experimentation. Current modeling approaches need to be overthrown and replaced with approaches free of their limitations (e.g., the pervasive mean-field models of today). No amount of raw computing power can solve any of these problems. Our current research programs in high performance computing are operating in complete ignorance of the approach necessary for progress.

All you need in this life is ignorance and confidence, and then success is sure.

― Mark Twain

Supercomputing Today is Big Money Chasing Small Ideas

Thursday, 19 November 2015

Posted by Bill Rider in Uncategorized


We adhere to the saying, “if it ain’t broke, don’t fix it,” while not really questioning whether “it” is “broke.”

― Clayton M. Christensen

Supercomputing is a trade show masquerading as a scientific conference, and at its core it is big money chasing small ideas. It takes place this week in Austin, and features the slogan “HPC Transforms”. The small idea is that all we need to do for modeling & simulation (and big data) to succeed is build faster computers. This isn’t a wrong idea per se, but rather a naïve and simplistic strategy that is suboptimal in the extreme. It’s what we are doing despite the vacuous thinking behind it. Unfortunately, we and other countries are prepared to spend big money on this strategy while overlooking the rather obvious and more balanced path to success. The balanced path is more difficult, challenging and risky, which is part of our unwillingness to pursue it. The tragedy that is unfolding is one of lost opportunity for true sustainable progress and massive societal impact.

“HPC Transforms” isn’t a bad or wrong idea either. The problem is that the basic transformation happened decades ago, and today HPC coasts on the pure inertia of that generation-old progress. It was the 1980s that marked the birth of HPC and its transformative power on science. If we look at HPC today we see a shell left over, with only massive computing hardware as the focus. The elements of progress and success that fed the original transformative power of HPC have been allowed to wither. The heart and soul of HPC is withering for lack of care and feeding. A once balanced and important effort has become a dismal shell of its former self. We have allowed shallow slogans to replace a once magnificent scientific field’s path to change.

This week brought some insightful commentary about Clayton Christensen’s theory of disruptive innovation (https://hbr.org/2015/12/what-is-disruptive-innovation or the reader’s digest version http://www.businessinsider.com/clay-christensen-defends-his-theory-of-disruption-2015-11), which has become a bit of a hollow mantra and buzzword in many places. For many, like those in HPC, it has become a shallow way of talking about the nature of Supercomputing. Instead I’ll submit that the last twenty years have been marked by a disruptive disinnovation. The parallel computing “revolution” has ripped the heart and soul from supercomputing and left a rotting husk behind. The next generation of computing will only accelerate the process that has lobotomized supercomputing and left a veritable zombie behind. The lobotomy is the removal of attention and focus from the two pieces of computing that are most responsible for impacting reality, which I am going to refer to as the heart and soul of HPC. It doesn’t need to be this way; the path we are taking is a conscious choice driven by naivety and risk-aversion.

If you defer investing your time and energy until you see that you need to, chances are it will already be too late.

― Clayton M. Christensen

So what is this opposing concept of disruptive disinnovation that I’m proposing? It is a new technology that you are forced into using and that undermines other important technologies. For supercomputing the concept is relatively easy to see. Computing has quickly transformed into a global economic colossus, but one focused on the mobile market, which derives its value primarily through mobility, connectivity and innovation in applications.

Traditional mainframe-style computing has changed, with a distinct lack of drive for raw computing power. Low power that allows long battery life became the design mantra for computer chips, and the easy performance improvements of Moore’s law ended last decade. At the same time we have a mantra that we must have the most powerful computer (measured by some stupid benchmark that is meaningless!). This demand for the fastest computer became some sort of empty national security issue, sold without a scintilla of comprehension of what makes these computers useful in the first place. The speed of the computer is one of the least important aspects of the real transformative power of supercomputing, and the most distant from its capacity to influence the real world.

The point is to enable us to claim to have the fastest computer, which naively is taken to mean we have the best science. In the process of using these new computers we undermine our modeling, methods and algorithmic work, because just using these new computers is so hard. Yet the quality of the science done with computers is completely and utterly predicated on the modeling used.

There are quarters that like to say that parallel computing was a disruptive innovation, except it made things worse. In the process we undermined the most important aspects of supercomputing to enable meaningless boasting. The concept is really simple to understand and communicate: it’s the apps, stupid. The true value of computers is the applications, not the hardware. If anything should be obvious about the mobile computing era, it is that the software determines the value of computing, and we have systematically undermined the value, content and quality of our software. When I say this it is not an SQE question, but a question of the application’s utility to impact reality.

What is this heart and soul of HPC?

Modeling is the heart of high performance computing. Modeling is the process of producing a mathematical model of the real world. HPC provided a path to solving a far greater variety and complexity of models scientifically, and opened new vistas for scientific exploration and engineering creation. When modeling is a living, breathing entity, it grows as it is critically compared with the reality it is supposed to represent. Some models die and others are born to replace them. Models breed, their genetic material mixing to produce better and more powerful offspring.

Today we have created walls that keep our models from breeding, growing and extending to become better and more relevant to the issues that society depends upon them to address. The whole modeling aspect of HPC is rather static and simply reflects a fixed point of view toward what we should be modeling. More than anything, the current slogan-based approach to HPC simply promulgates models from the past into the future by fiat rather than by explicit choice.

You view the world from within a model.

― Nassim Nicholas Taleb

Perhaps the worst thing about the lack of attention being paid to modeling is the extreme needs that go unmet and the degree of opportunity being lost. The degree of societal impact that supercomputing could be having is being horrendously shortchanged. The leadership is fixated on hardware primarily as a low-risk path to seeming progress (a gravy train that is about to end). A higher-risk path would be the support of work that evolves the utility of supercomputing into the modern world. The risk is higher, but the payoff would be potentially immense and truly transformative. We have deep scientific, engineering and societal questions that will be unanswered, or answered poorly, due to our risk aversion. For example, how does climate change impact the prevalence of extreme weather events? Our existing models can only infer this rather than simulate it directly. Other questions related to material failure, extremes of response for engineered systems, and numerous scientific challenges will remain beyond our collective grasp. All of this opportunity is missed because we are unwilling to robustly fund risky research that would challenge existing modeling approaches.

Risks must be taken because the greatest hazard in life is to risk nothing.

― Leo Buscaglia

The soul of HPC is methods and algorithms, which together power the results that the computer can produce. We used to invest a great deal in improving methods and algorithms to amplify the good that the computer does. Today we simply take what we have already developed and re-implement it to fit onto the modern monstrosities we call supercomputers. The drive to continually improve and extend existing methods and algorithms to new heights of quality and performance is gone. We have replaced it with the attitude that these areas are mature and well developed, not needing attention. Again, we can honestly assess this as a lost opportunity. In the past, methods and algorithms have produced as much gain in performance as the machines themselves. In effect they have been a powerful multiplier on the advances in hardware. Today we deny ourselves this effect, to the detriment of the transformation this conference is touting to the World.

All of this reflects a rather fundamental misunderstanding of what HPC is and could be. It is not a fully matured topic, nor is it ready to simply go into a station-keeping mode of operation. It still requires the extreme intellectual efforts and labors that put it in this transformative place societally. If HPC were more mature we might reasonably be more confident in its results. Instead HPC relies upon the bravado of boastful claims that hardly match the capability it truly has. Any focused attention on the credibility of computed results reveals that HPC has a great deal of work to do, and the focus on hardware does little to solve it. The greatest depth of work is found in modeling, closely followed by issues associated with methods and algorithms.

Instead of basing a program for making HPC transformative on empirical evidence, we have a program based on unsupported suppositions. Hardware is easily understood by the naïve masses, which include politicians and paper pushers. They see big computers making noise and lots of blinking lights. Models, methods and algorithms don’t have that appeal, yet without them the hardware is completely useless. With an investment in them we could make the hardware vastly more powerful and useful. The problem at hand isn’t that the new hardware is a bad investment; it is a good investment. The problem is how much better the new hardware could be with an appropriately balanced development program that systematically invested in modeling, methods and algorithms too.

People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.

― Clayton M. Christensen

Despite this we have systematically disinvested in the heart and soul of HPC. It is arguable that our actual capacity for solving problems has been harmed by this lack of investment to the tune of 10, 100 or even 1000 times. We could have HPC that is 1000 times more powerful today if we had simply put our resources into a path that had already been proven for decades. If we had a bolder and more nuanced view of supercomputing, the machines we are buying today could be vastly more powerful and impactful. Instead we clunk along and crow about a transformative capability that largely already happened. There are stunning potential payoffs societally that we are denying ourselves.

Modeling defines what a computer can do, and methods/algorithms define how well they can do it. What our leadership does not seem to realize is that no amount of computing power can do anything to improve a model that is not correct. The only answer that improves the ability to impact reality is a newer, better model of reality. The second aspect of supercomputing we miss is the degree to which methods and algorithms provide benefit.

Our computing capability today depends more on, and has received greater benefit from, the quality and efficiency of methods and algorithms than from hardware. Despite the clear evidence of their importance, we are shunning progress in methods and algorithms in order to focus on hardware. This is a complete and utter abdication of leadership. We are taking a naïve path simply because it is politically saleable and seemingly lower in obvious risk. The risk we push aside is short term; in the long term the risks we are taking on are massive and potentially fatal. Unfortunately we live in a World where our so-called leaders can make these choices without consequence.

This is an absolute and complete failure of our leadership. It is a tragedy of epic proportions. It reflects poorly on the intellectual integrity of the field. The choices made today reflect a mindset that erupted at the end of the Cold War and was successful then in keeping the DOE’s national labs alive. We have gotten into the habit of confusing survival with success. Instead of building from this survival strategy into something sustainable, the survival strategy has become the only strategy. If science were actually working properly, the lack of balance in HPC would have become evident. The Supercomputing meeting this week is an annual monument to the folly of our choices in investment in HPC.

I can only hope that saner, more intelligent and braver choices will be made in the not too distant future. If we do we can look forward to a smarter, less naïve and far bolder future with high performance computing that brings the transformative power of modeling and simulation to life. The tragedy of HPC today isn’t what it is doing; it is what isn’t being done and the immense opportunities squandered.

We all die. The goal isn’t to live forever, the goal is to create something that will.

― Chuck Palahniuk

 

Today’s “Accountability” Destroys Quality in Science

Friday, 13 November 2015

Posted by Bill Rider in Uncategorized


Accountability is generally a good thing. We are at our best when we are held accountable to our colleagues, our efforts and ourselves. So how can accountability ever be a bad thing? The way it’s done today is a vehicle of unparalleled destructive power.

There is nothing so useless as doing efficiently that which should not be done at all.

― Peter Drucker

Avoiding accountability is never a good thing. On the other hand, too much overbearing accountability starts to look like pervasive trust issues. The concomitant effects of working in a low-trust environment are corrosive to everything held accountable. As with most things, the key is balance between accountability and freedom; too much of either lowers performance. Today we have too little freedom and far too much accountability in a destructive form. For the sake of progress and quality a better balance must be restored. Today’s research environment is being held accountable in ways that reflect a lack of trust and a complete lack of faith in the people doing the work, and perhaps most importantly produce a dramatic lack of quality in the work (https://williamjrider.wordpress.com/2014/10/23/excellence-and-accountability/).
Accountability can be implemented in many ways, and today in science it looks like micromanagement. How can we make accountability (a generally good thing!) destructive? We define work that should be innovative and creative in terms of well-defined deliverables and milestones (https://williamjrider.wordpress.com/2014/12/04/the-scheduled-breakthrough/), which we must never fail to execute. An important thing that comes from research is finding out which ideas are distinctly bad. The right thing to do is to stop when you realize something is a bad idea and find a new one. Today we continue to plow along a path even when we know it’s the wrong one, because of the sort of contracts we are accountable to. Perhaps most importantly, the quality of the work rarely if ever enters into the accountability. We live in an environment where quality is simply assumed to be in place, and no one seems to have a direct and unbreakable commitment to it. In today’s accountability culture, quality is simply not part of the expectations.

It shows in everything we do.

We divvy up the work into smaller and smaller bins with well-defined deliverables and quarterly progress reports. The same principles that are corrupting our business world are being applied to science (https://williamjrider.wordpress.com/2014/10/10/corporate-principles-do-not-equal-good-management/). While these principles are arguably appropriate for business (the whole shareholder-value concept as the point of business), they are unremittingly damaging to science. Yet apply them we do, gleefully and wantonly. This is strangling the quality of the work being made accountable as surely as it wastes precious resources. Time and money are interchangeable, but the most unforgivable aspect of this is the waste of careers, talent and human potential on a cause that undermines more than it builds.

Small minds just like small stones can never create giant waves.

― Mehmet Murat ildan

The accountability we see today is destroying the ability to define, think about, and execute big ideas. We live in an era of small minds, small ideas and a general lack of accomplishment of anything that matters. People are encouraged to work very prescriptively and narrowly within their prescriptively and narrowly defined scope of work. Success often (if not always) depends on things outside the scope of the work we are accountably doing. How can we do something “out of the box” if we are driven to always stay in “the box”? We then say that since something is outside our scope of work, it is outside what we are responsible for. We then feel that ignoring things out of scope for our responsibilities is a duty we are accountable for. The present form of accountability allows one to ignore the big picture and execute the body of work promised whether it matters or not, whether it is useful or not, and whether it is quality work or not. It almost assures that the work done is not well integrated or adaptive to deeper understanding.

…If there is no risk, there is no reward.

― Christy Raedeke

Another impact of this small-minded thinking is a complete lack of ownership of anything bigger than what you are directly accountable for. You are encouraged to focus only on what you are directly being paid to focus on. Coupled with naïve, intellectually shallow management, you have a recipe for systematic mediocrity. Just as damning is the extreme risk aversion of the management and, increasingly, of rank-and-file scientists. This pervasive risk aversion almost assures that nothing of significance will be accomplished. One can work hard on meaningless tasks and feel successful, empowering an ever-diminishing quality standard for all the work touched by accountability. It assures that we will never accomplish anything big or important. In many cases this sort of approach is appropriate for building bridges, repaving roads or putting up a skyscraper. For research, science or high-end engineering it is harmful, damaging and ultimately a giant waste of money. We follow plans that do not stand the test of time and we fail to adjust to what we learn.

Our accounting systems are out of control. They spawn an ever-growing set of rules and accounts to manage the work. All of this is nothing more than a feel-good exercise for managers who mostly want to show “due diligence” and that they “manage risk”. No money is ever wasted doing anything (except that, increasingly, all the money is wasted). Instead we are squeezing the life out of our science, which manifests itself as low quality work. In a very real way low quality science is easier to manage, far more predictable and easy to make accountable. One can easily argue that really great science, with discovery and surprise, completely undermines accountability, so we implicitly try to remove it from the realm of possibility. Without discovery, serendipity and surprise, the whole enterprise is much more fitting to tried-and-true business principles. In light of where the accountability has come from, it might be good to take a hard look at these business principles and the consequences they have reaped.

We exist in an increasingly risk-averse (https://williamjrider.wordpress.com/2015/10/23/we-want-no-risk-and-complete-safety-we-get-mediocrity-and-decline/) and fearful society beset by massive inequality of income, wealth and opportunity. Many of these terrible outcomes can be traced directly to the sorts of business principles being applied to science. Such principles are completely oriented toward driving outcomes preferentially toward the “haves” and away from the “have-nots”. Ultimately, the biggest threat to the rich and powerful is change in the status quo. The sorts of management and accountability used today mostly work to undermine any real progress, which favors the status quo. Science is one of the major societal engines of progress and change. The rich and powerful are fearful of progress, and work to kill it. We are tremendously successful at killing progress, and modern accountability is one of the best tools for doing it.

Creativity requires the courage to let go of certainties.

― Erich Fromm

Quality suffers because of the loss of breadth of perspective and of the injection of ideas from divergent points of view. Creativity and innovation (i.e., discovery) are driven by broad and divergent perspectives. Most discoveries in science are simply repurposed ideas from another field. Discovery is the thing we need for progress in science and society. It is the very thing that our current accountability climate is destroying. Accountability helps to drive away any thoughts from outside the prescribed boundaries of the work. Another maxim of today is that the customer is always right. For us, the customers are working under similar accountability standards. Since they are “right” and just as starved for perspective, the customer works to narrow the focus. We get a negative flywheel effect where narrowing focus and perspective reinforce each other.

Never attribute to malice that which can be adequately explained by stupidity.

― Robert Heinlein

This has manifested itself as the loss of the Labs as honest brokers. The Labs are simply sycophants today who work on what they are paid to work on, a large-scale extension of the customer-is-always-right principle. They never provide even a scintilla of feedback to government programs because of the fear of having their funding cut. Instead they pile on to poorly constructed and intellectually shallow programs because they promise funding. Thus we get programs that are phenomenally shallow and intellectually empty, but are managed at a level that provides no freedom or innovation to rescue them from their mediocrity. The accountability means that the empty intellectual goals are executed to a tee, and any value that might have arisen from the resources is sacrificed on the altar of doing what you’re told to do.

When programs of the sort that the government is funding are integrated over decades, you see an immense decline in the institutions due to the loss of autonomy of the staff. Our National leadership in science simply corrodes, and younger scientists do not develop in any sort of coherent way. Careers are starved of the sorts of efforts needed to build them. We have created a generation of mediocre scientists who excel at obedience and simply grinding through projects. They are distinguished by their ability to produce the deliverables they promised and little else. Once-great institutions are steaming cauldrons of mediocrity and mostly just pork-barrel spending (I often joke that the execution of the Lab Mission is best achieved by going out and buying a car).

An inappropriate focus on money is the root of many of these problems. These days we will do almost anything for money, and money is the primary measure of everything (https://williamjrider.wordpress.com/2014/08/29/money-makes-for-terrible-priorities/). In particular, accountability for what the money is spent on provides the standard measure of success. Did we do what the money was supposed to pay for? If so, success is declared. Never mind that the money has been sub-divided into ever-smaller bins that effectively destroy the ability to achieve anything bigger and more coherent. The big ideas that would really make a huge difference to everyone never happen because we can’t ever produce a body of work coherent enough to succeed. We are always doing work “in the box”.

The end result of our current form of accountability is small-minded success. In other words, we succeed at many small unimportant things, but fail at large important things. The management can claim that everything is being done properly, but never produce anything that really succeeds in a big way. From the viewpoint of accountability we see nothing wrong: all deliverables are met and on time. True success would arise by attempting to succeed at bigger things, and sometimes failing. The big successes are the root of progress and the principal benefit of dreaming big and attempting to achieve big. In order to succeed big, one must be willing to fail big too. Today big failure surely brings congressional hearings and the all too familiar witch-hunt. Without the risk of failure we are left with small-minded success being the best we can do.

Big goals, trust and leadership are the cures. We need to prioritize progress and discovery by producing an environment that is tailored to produce them. Hand in hand with this is a level of faith in the human spirit and ingenuity. Let people believe that their work matters, with proof that they are contributing to a meaningful goal. Daniel Pink wrote a book called “Drive” that describes a workplace that is the utter antithesis of the sort of accountability science labors under today (http://www.amazon.com/Drive-Surprising-Truth-About-Motivates/dp/1594484805/ref=sr_1_1?ie=UTF8&qid=1447431195&sr=8-1&keywords=drive). I was stunned by how empowering his description of work could be, and how far from this vision I work under today. I might simply suggest that my management read that book and implement everything in it. The scary thing is that they did read it, and nothing came of it. The current system seems to be completely impervious to good ideas (or perhaps following the book would have been too empowering to the employees!). Of course the book suggests a large number of practices that are completely impossible under current rules and opposed by the whole concept of accountability we are under today.

It is completely ironic that the very forces that are pushing accountability down our throats are completely free of any accountability themselves. Our current political class is virtually invulnerable to any accountability from the voters. The rich and powerful overlords rule the masses with impunity. Their degree of wealth makes them completely resistant to accountability. The accountability thrust upon the rest of us is simply a tool to maintain and magnify their power through killing progress and assuring that the status quo that favors them is never threatened. Accountability is simply a way of crushing progress, and making sure that the current societal order is maintained.

I worry that only some external force and/or event will be able to dismantle the current system, and it will not be pretty or pleasant for anyone. The forces in power today are quite entrenched and resist any move that might reduce their stranglehold on the World.

The best way to find out if you can trust somebody is to trust them.

― Ernest Hemingway

Are we really stewarding anything but decline?

Friday, 6 November 2015

Posted by Bill Rider in Uncategorized



Never ascribe to malice that which is adequately explained by incompetence.

― Robert J. Hanlon

I’ve written mostly about modeling and simulation because that’s what I do and what I know best, but it’s part of a larger effort and a larger problem. I work for a massive effort known as science-based stockpile stewardship, where modeling and simulation is one of the major themes. This whole effort was conceived as a way of maintaining our confidence (faith) in our nuclear weapons in the absence of actually testing them. There is absolutely no technical reason not to test them; the idea of not testing is purely political. It is a good political stance from a moral and ethical point of view, and I have no issue with taking that stand on those grounds. From a scientific and engineering point of view it is an awful approach, clearly far from optimal and prone to difficulties. These difficulties could be a very good thing if harnessed appropriately, but today such utility is not present in the execution of our Lab’s mission. As one should always remember, nuclear weapons are political things, not scientific, and politics is always in charge.

The science-based stockpile stewardship program is celebrating its twenty-year anniversary. Our political leaders are declaring it to be a massive success. They have been busy taking a victory lap and crowing about its achievements. The greatest part of this success is high performance computing. These proclamations are at odds with reality. The truth is that the past 20 years have marked the downfall of the quality and superiority of our Labs and of the scientific supremacy of these institutions. The program should have been a powerful hedge against decline, and perhaps it has been. Perhaps without stockpile stewardship the Labs would be in even worse shape than they are today. That is a truly terrifying thought. We see a broad-based decline in the quality of the scientific output of the United States, and our nuclear weapons Labs are no different. It appears that the best days are behind us. It need not be this way with proper leadership and direction.

Confidence is something you feel before you truly understand the situation

― Julie E Czerneda

Nonetheless, given the stance of not testing, we should be in the business of doing the very best job possible within these self-imposed rules (i.e., no full-up testing). We are not, and we miss that mark by a massive degree. This is not on purpose, but rather the result of a stunning lack of clarity in objectives and priorities. We have allowed a host of other priorities to undermine success in this essential endeavor. Taking the fully integrated testing of the weapons off the table requires that we bring our very best to everything else we do.

I’ve written a great deal about how bad our approach to modeling and simulation is, but it’s the tip of the proverbial iceberg of incompetence and of steps that systematically undermine the work necessary to succeed. While modeling and simulation gets a lot of misdirected resources, the experimental and theoretical efforts at the Labs have been eviscerated. The impact of this evisceration on modeling and simulation is evident in issues with the actual credibility of simulation. This destruction has been done at the time when these efforts are needed the most. Instead, support for these essential scientific engines of progress has been “knee-capped”. Just as importantly, a positive work environment has been absolutely annihilated by how the Labs are managed.

Without the big integrated experiment to tell you what you need to know for confidence, all the other experiments need to be taken up a notch or two to fill the gap. Instead we have created an environment where experimental science has been lobotomized and exists in an atmosphere of extreme caution that almost assures the lack of the results necessary for healthy science. Hand in hand with the destruction of experimental science is the loss of any vibrancy in theoretical science. The necessary bond between experimental and theoretical science has been torn asunder. When working well, the two approaches push and pull each other to assure progress. With neither functioning, science grinds to a halt. Engineering is similarly dysfunctional. We do not know enough today to execute the mission. In a very real sense we will never know enough, but our growth of knowledge is completely dependent on a functioning engine of discovery powered primarily by experiment, but also by theory. Without either functioning properly, modeling and simulation is simply a recipe for over-confidence.

We can only see a short distance ahead, but we can see plenty there that needs to be done.

― Alan Turing

We have gotten to this point with the best of intentions, but the worst in performance and understanding of what it takes to be successful. We are not talking about malice on the part of our national leadership, which would be tantamount to treason, but rather the sort of incompetence that arises from the political chaos of the modern era. When we add a completely dysfunctional and spoiled public consciousness governed principally by fear, we have the recipe for wholesale decline and the seemingly systematic destruction of formerly great institutions. Make no mistake, we are destroying our technical base as surely as our worst enemy would, but through our own inept management and internal discord.

Let’s start with the first nail in the coffin, the “Tiger Teams” of the mid-1990s. We decided to apply to the National Labs the same forces that have made nuclear power economically unviable (nuclear power has been made massively expensive through overregulation and a legal environment that causes costs to explode through the time-integrated value of money). This isn’t actual safety, but rather the imposition of a massive paperwork and procedural burden on the Labs, which produces safety primarily by decreasing productivity to the level where nothing happens.

Science becomes so incremental that progress is glacial. You almost completely guarantee safety and, in the process, a complete lack of discovery. Experiments lose all their essence and utility in acting as a hedge against over-confidence by surprising us. Add the risk aversion discussed below, and you have experimental science that does almost nothing. As a result we get very little for our experimental dollar, and allow ourselves to do almost nothing innovative or exciting. So yes, safety is really important, and we need to produce a safe working environment. This same environment must also be a productive place. The productivity gains that we have seen in the private world have been systematically undermined at the Labs, not just by safety, but by two other drivers: risk aversion and security.

Guaranteed security is another pox on the Labs. This pox is impacting society as a whole, but the Labs suffer under an additional burden. We pay an immense tax on our lives by trying to defend ourselves from minuscule risks associated with terrorism. We have given up privacy as a society so that our security forces can find the scant number of terrorists who represent almost no actual risk to citizens. The security stance at the Labs is no different. We have almost no risk or danger of anything, yet we pay a huge price in terms of privacy, productivity and work environment to avoid vanishingly small risks. Instead of producing Labs that are so fantastic that we constantly push back the barriers of knowledge and stay ahead of our enemies, we kill ourselves with security. We keep ourselves from communicating, producing work and collaborating effectively for virtually no true benefit aside from soothing irrational fear.

Finally, we have a focus on accountability where we want to be guaranteed that no money is wasted at any time. Part of this is risk aversion, where research that might not pan out doesn’t get funded, because not panning out is viewed as failure. Instead, these failures are at the core of learning and growing. Failure is essential to learning and acquiring knowledge. Our accountability system is working to destroy the scientific method, the development of staff, and our ability to be the best. To some extent we account not because we need to, but because we can. Computers allow us to sub-divide our sources of money into ever-smaller bins, along with everyone’s time and effort. In the process we lose the crosscutting nature of the Labs’ science. We get a destruction of the multi-disciplinary science that is absolutely essential to doing the work of stewardship. Without multi-disciplinary science we will surely fail at this essential mission, and we are managing the Labs in a way that assures this outcome.

All of this is systematically crushing our workforce and its morale. In addition, we are failing to build the next generation of scientists and engineers with the level of quality necessary for the job. We are allowing the quality of the staff to degrade through the mismanagement of the entire enterprise at a National level. Without a commitment to real, unequivocal success in the stewardship mission, the entire activity is simply an exercise in futility.

We seek guaranteed safety, and that simply cannot happen without doing nothing at all. We seek a guaranteed lack of risk and no chance of failure, which is the antithesis of research and learning. Science is powered by risk and the voyage into the unknown. Without the unknown, an inherently risky thing, science is simply a curating of existing knowledge. Our security stance seems totally rational, especially in the post-9/11 world. Yet it is nothing more than fear mongering that strives to do the impossible: maintain tight control over information based on science, and maintain our advantage by keeping ourselves from using the best available technology. Instead of enhancing our productivity with technology and science, we hamstring ourselves to defend our possession of old knowledge. The power of the Labs and their staff is driven by achievement and discovery, and the push for safety, freedom from risk and total security is completely at odds with that and works to destroy what used to be our strength.

The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.

― George Bernard Shaw

When we look at the overall picture we see a system that is not working. We spend more than enough money on stockpile stewardship, but we spend the money foolishly. The money is being wasted on a whole bunch of things that have nothing to do with stewardship. Most of the resources are going into guaranteeing complete safety, complete absence of risk, complete security and complete accountability. It is a recipe for abject failure at the integrated job of safeguarding the Nation. We are failing in a monumental way while giving our country the picture of success. Of course the average American is easily fooled; if they weren’t, would our politics be so dysfunctional and dominated by fear-based appeals?

Evil people rely on the acquiescence of naive good people to allow them to continue with their evil.

― Stuart Aken

What could we be doing to make things better and step toward success?

The first thing we need is big, audacious goals with enough resources and freedom to solve the problems. Stockpile stewardship itself should be enough of a challenge, and we do have the resources to solve the problem. What we are missing is the freedom to get the job done; on top of that, resources are generally wasted on things that contribute nothing toward success. Actually, much of our resourcing goes directly into things that detract from success. Think about it: we spend most of our precious money undermining any chance at succeeding. One of the core issues is that we are not answering the new questions that today’s World is asking. Instead we are continuing to try to answer yesterday’s questions even when they are no longer relevant.

Theories might inspire you, but experiments will advance you.

― Amit Kalantri

Another way of making progress is to renew our intent toward building truly World-class scientists at the Labs. We can do this by harnessing the Labs’ missions to do work that challenges the boundaries of science. Today we are World class by definition and not through our actions. We can change this by simply addressing the challenges we have with a bold and aggressive research program. This will drive professional development to heights that today’s approach cannot match. Part of the key to developing people is to allow their work to be the engine of learning. For learning, failure and risk are key. Without failure we learn nothing; we just recreate the success we already know about. World-class science is about learning new things and cannot happen without failure, and failure is not tolerated today. Without failure science does not work.

A big piece of today’s issues with the Labs is the deep disconnect between experiment and theory that is necessary to drive science forward. Along with the admonitions against failure, the push and pull of experiment and theory has broken down. This tie must be re-established if scientific health and vitality are to be restored. When it works properly we see a competition between experimental science and theory. Sometimes experiments provide results that theory cannot explain, driving theory forward. At other times theory makes predictions that experiments have to progress to measure and confirm. Today we simply work in a mode where we continually confirm existing theory, and fail to push either into the unknown. Science cannot progress under such conditions.

Much of the problem with the lack of progress can be traced to the enormous time, effort and resources that go into useless regulation, training and paperwork. These efforts go far beyond the necessary level of seriousness in assuring safety and security, to trying to guarantee safety and security in all endeavors. These guarantees are foolish and lead to an overly cautious workplace where failure is ruled out by dictum and the risks necessary for progress are avoided. This leads to a lack of progress, meaning and excellence in science. It is a recipe for decline. We do not have a system that prioritizes productivity, progress and quality of work. We have lost perspective, balancing our efforts in favor of the seemingly safest and most secure mode of effort.

The stupid, naïve and unremittingly lazy thinking that permeates high performance computing isn’t just found there. It dominates the approach to stockpile stewardship. We are stewarding our nuclear weapons with a bunch of wishful thinking instead of a well-conceived and executed plan. We are in the process of systematically destroying the research excellence that has been the foundation of our National security. It is not malice, but rather societal incompetence that is leading us down this path. Increasingly, the faith in our current approach depends on the unreality of the whole nuclear weapons enterprise. The weapons haven’t been used for 70 years, and hopefully that lack of use will continue. If they are used we will be in a much different World, and one we are no longer ready for. I seriously worry that our lack of seriousness and pervasive naivety about the stewardship mission will haunt us. If we have screwed this up, history will not be kind to us.

You have attributed conditions to villainy that simply result from stupidity.

― Robert A. Heinlein
