The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: September 2017

Multimat 2017: Where did all the New Ideas go?

29 Friday Sep 2017

Posted by Bill Rider in Uncategorized

≈ 4 Comments

Science is what we have learned about how to keep from fooling ourselves.

― Richard Feynman

Last week was another trip to the Multimat conference, a biennial meeting of scientists who solve the equations of multiple material flows under highly energetic and compressible conditions. I’ve always greeted the meeting with great enthusiasm and returned to work bristling with new ideas and inspiration. In many ways this community is the tip of the intellectual spear for modeling and simulation capability. It also marks an anniversary of sorts, four years of blogging. My very first post here was about the San Francisco edition of the meeting that coincided with my 50th birthday (https://williamjrider.wordpress.com/2013/09/13/thoughts-about-multimat2013/). Two years ago we had a wonderful meeting in Würzburg, Germany (https://williamjrider.wordpress.com/2015/09/11/multimat2015-a-biannual-festival-on-computing-compressible-multiple-materials/). Every meeting has been wonderful and this was no exception, except in one very important and perhaps essential regard: the meeting seemed devoid of the usual exciting intellectual inspiration. What happened to the big ideas that flowed so easily in every previous meeting? Is it my imagination, or have the exciting new ideas dried up?

Do stuff. be clenched, curious. Not waiting for inspiration’s shove or society’s kiss on your forehead. Pay attention. It’s all about paying attention. attention is vitality. It connects you with others. It makes you eager. stay eager.

― Susan Sontag

This assessment might seem rather harsh, but upon reflecting on the previous meetings, it holds up under considerable scrutiny. Each previous meeting was full of moments where you are watching someone else’s talk and thinking, “I wish I’d thought of that, this is brilliant.” This is exactly what makes conferences so wonderful and important to attend; you get to cherry-pick people’s best ideas, accumulated at great effort, all at once. In the moment these ideas seem like Athena springing fully formed from Zeus’ brow! Your colleagues get to look like the geniuses they are and present their most creative and ingenious thoughts in an intellectual banquet (https://williamjrider.wordpress.com/2014/06/13/why-do-scientists-need-to-attend-conferences/, https://williamjrider.wordpress.com/2014/10/27/a-holistic-view-of-attending-conferences/). The reason for attending conferences isn’t to give talks; it is to learn new things taught by the smartest people you know. It is to meet and let ideas breed openly over laughter, food and drinks. You give talks as an act of repayment for the knowledge you are granted by being in the audience. Giving talks is pretty low on the list of reasons, but not in the mind of our overlords, which starts to get at the problems I’ll discuss below. Given the track record of this meeting my expectations were sky-high, and the lack of inspiring ideas left me slightly despondent.

A few more thoughts about the meeting are worth pointing out before getting to the dialog about fresh ideas, their importance and postulates for their recent absence. The meeting is attended by a collection of computational scientists (mathematics, physics, engineering, …) dominated by the nuclear “club”. This means Americans, French and British with a smattering of Russians and Chinese (who couldn’t come this year for undisclosed reasons). These scientists for the most part work at their nations’ respective nuclear weapons labs. Occasional others attend, like Israelis (an unacknowledged member of the club) along with a handful of Czechs, Italians, and Germans. As such the meeting serves as a proverbial checkup on the intellectual health of this important area of science at the West’s nuclear weapons labs. This year’s checkup should give everyone pause: the state of health is declining. There is a real lack of creative energy surrounding the heart of our most important codes. Many important codes are built around a powerful hydro solver that produces accurate, physically relevant solutions to the multi-material “hydrodynamic” equations. Previous meetings have seen a healthy resurgence of new ideas, but that upswing seems to have come to a staggering halt. These labs have also provided a deep well of inspired research that has benefited the broader scientific community, including weather, climate, astrophysics and a broad swath of engineering uses of computation.

In my opinion the reasons for this halt in creative energy are simple and straightforward. The foolhardy worldwide push for exascale computers is sucking the air out of the room. It is gobbling up all the resources and attention, leaving nothing for new ideas. This completely obsessive and unwise focus on the hardware is an attempt to continue the already dead Moore’s law. This push is draining the community of vitality, resources and focus. The reasons for the push are worth considering because they help define the increasingly hostile nature of the modern world toward science. The computers being built for the future are abysmal to use, and the efforts to move our codes to them are sucking all the energy from the Labs. Nothing is left for creative work; nothing is left for new ideas. Simply put, the continued use of old ideas is hard enough without adding these generally unusable computers to the mix. The reason is simple: the new computers completely suck. They are true monstrosities (in the classic definition of the word) and complete pieces of shit as scientific hardware. They are exactly the computers we don’t want to use. The price of forcing them down our throats is the destruction of any research that isn’t associated with simply making these awful computers work. Worse yet, the return on the massive investment of effort will be vanishingly small in terms of added modeling and simulation capability.

As noted, this whole direction is a foolish attempt to breathe life into the already rigid corpse of Moore’s law, now dead at every scale of computing and already a decade deceased at the level of computer chips. Note that the death of Moore’s law and the ascendancy of cell phones are strongly correlated, and that is probably not a coincidence. The truth of our real performance on computers is far more dire and damning of this entire effort. We have been getting an ever-lower proportion of the potential performance of our computers for 25 years. Each computer has a peak performance measured on silly, hardware-friendly benchmarks that no one gives a flying fuck about (the dense linear algebra LU decomposition, Linpack). This silly and useless benchmark is how we crown the fastest computer! Our actual code performance on these machines is truly atrocious, and the dirty little secret is that it gets worse every year. It was god-awful 20 years ago, and it has just gotten worse. Zero is a pretty good approximation to the proportion of the performance we get – generally much less than one percent. We mindfully ignore the situation just like one would ignore a cancer threatening to devour our lives. The attitude is generally, “look away, nothing to see here.” The exascale program is that cancer metastasized.
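
To put a number on “much less than one percent,” here is a rough back-of-the-envelope sketch in Python. Every number in it is a hypothetical placeholder, not a measurement from any real machine or code; the point is only the arithmetic of achieved rate versus peak.

```python
# Rough sketch: fraction of peak actually achieved by an application run.
# All numbers below are hypothetical placeholders for illustration only.
peak_tflops     = 20_000.0   # machine "peak" in TFLOP/s (assumed)
flops_performed = 4.0e18     # floating-point operations the code executed (assumed)
wall_seconds    = 36_000.0   # ten hours of wall-clock time (assumed)

achieved_tflops  = flops_performed / wall_seconds / 1.0e12
fraction_of_peak = achieved_tflops / peak_tflops
print(f"achieved {achieved_tflops:.0f} TFLOP/s, {100 * fraction_of_peak:.2f}% of peak")
# With these made-up numbers the answer is roughly half a percent of peak,
# i.e., zero is not a bad approximation.
```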

Part of the discussion about exascale needs to acknowledge the nature of choices and priorities in research. In isolation, the exascale program is an unambiguous good; it would be genuinely awesome to have – usable – exascale computers (https://williamjrider.wordpress.com/2014/09/19/what-would-we-actually-do-with-an-exascale-computer/). This good needs to be weighed against its cost and the impact of alternatives. It needs to be viewed through the lens of reality too. If one looks at the raw cost, the opportunity cost and the collateral damage, we can see that the exascale program is a massively negative force in science (https://williamjrider.wordpress.com/2016/06/27/we-have-already-lost-to-the-chinese-in-supercomputing-good-thing-it-doesnt-matter/, https://williamjrider.wordpress.com/2016/05/04/hpc-is-just-a-tool-modeling-simulation-is-what-is-important/, https://williamjrider.wordpress.com/2016/10/19/why-china-is-kicking-our-ass-in-hpc/). In isolation, without considering anything else, it is a clear positive. In the context of lost opportunities and effective use of available resources, the program is an unmitigated disaster. We will all be poorer for it as it lays waste to potential breakthroughs we will be denied in its wake. In today’s world we talk about things in isolation, free of the nuance and trade spaces that would make for a more robust and responsible conversation. Our leaders are irresponsible in the extreme for taking us down this path with no real discussion or debate taking place. The message in the trenches is “do what you’re paid to do and quit asking questions.”

The really dirty secret is that chasing exascale as a route to scientific discovery is simply bullshit of the highest and most expensive order. We would be far better served by simply figuring out how to use the hardware we already have. Figuring out how to efficiently use hardware we have had for decades would be a difficult and worthy endeavor. The punch line is that we could get orders of magnitude of improved performance out of the hardware we’ve been using for decades. Simply getting our codes to work more efficiently on the computers that already exist would meet most scientific goals without eviscerating the rest of computational science in the process. Instead we chase goals that are utterly meaningless. In the process we are destroying the research that has true and lasting value. The areas being ignored in the push for exascale have the capacity to provide far more scientific capability than even the most successful exascale program could possibly deliver. This brings me back to the meeting in Santa Fe and the lack of energy and exciting ideas. In the past the meeting has been a great survey of the active work from a creative and immensely talented group of people. As such this meeting is the proverbial canary in the coal mine. The ideas are dying right in front of our eyes.

This outcome is conflated with the general lack of intellectual vigor in any public discourse. The same lack of intellectual vigor has put this foolish exascale program in place. Ideas are viewed as counter-productive today in virtually every public square. Alarmingly, science is now suffering from the same ill. Experts and the intellectual elite are viewed unfavorably and their views held in suspicion. Their work is not supported, nor are projects and programs that depend on deep thinking, ideas or intellectual labor. The fingerprints of this systematic dumbing down of our work have reached computational science, and it is reaping a harvest of poisoned fruit. Another sign of the problem is the lack of engagement of our top scientists in driving new directions in research. Today, managers who have no active research of their own define the new directions. Every year our managers’ work gets further from any technical content. We have the blind leading the sighted, telling them to trust that they can see where we are going. This problem highlights the core of the issue: the only thing that matters today is money. What we spend the money on, and the value of that work for advancing science, is essentially meaningless.

Effectively we are seeing the crisis that has infested our broader public sphere moving into science. The lack of intellectual thought and vitality that is pushing our public discourse to the lowest common denominator is now attacking science. Rather than integrating the best in scientific judgment into our decisions on research direction, it is ignored. The experts are simply told to get in line with the right answer or be silent. In addition, the programs defined through this process then feed back to the scientific community, savaging the expertise further. The fact that this science is intimately connected to national and international security should put a sharper point on the topic. We are caught in a vicious cycle, and we are seeing the evidence in the hollowing out of good work at this conference. If one is looking for a poster child for bad research directions, the exascale programs are a good place to look. I’m sure other areas of science are suffering through similar ills. This global effort is genuinely poorly thought through and lacks any sort of intellectual curiosity.

Moving our focus back to exascale provides a useful case study of what is going wrong. We see that programs are defined by “getting funding” rather than by what needs to be done or what should be done. Arguments for funding need to be as simple as possible, and faster computers are a pitch simple enough for unrefined people to buy into. It sounds good, and technically unsophisticated people buy it hook, line and sinker. Computers are big and loud and have lots of flashing lights to impress managers, politicians and business people who know no better. Our scientists have been cowed into compliance and simply act happy to get money for doing something. A paycheck beats the alternative, and we should feel happy that we have that. The level of inspiration in the overall approach has basically fallen off a cliff, and new ideas are shunned because they just make things complicated. We are left with the lowest common denominator as the driving force. We have no stomach for nuance or subtlety.

Priority is placed on getting our existing codes working on the new, super expensive computers. The up-front cost of these computers is the tip of the proverbial cost iceberg. The explicit cost of the computers is their purchase price, their massive electrical bill and the cost of using these monstrosities. The computers are not the computers we want to use; they are the ones we are forced to use. As such, the cost of developing codes on these computers is extreme. These new computers are immensely unproductive environments. They are a huge tax on everyone’s efforts. This sucks the creative air from the room and leads to a reduction in the ability to do anything else. Since all the things being suffocated by exascale are more useful for modeling and simulation, the ability to actually improve our computational modeling is hurt. The only things that benefit from the exascale program are trivial and already exist as well-defined modeling efforts.

Increasingly everything is run through disconnected projects that are myopic by construction. The ability to do truly unique and groundbreaking science is completely savaged by this approach to management. Breakthroughs are rarely “eureka” moments where someone simply invents something completely new. Instead, most good research is made through connections to other good research. Conferences are great incubators for these connections. Well-defined and proven ideas are imported and redefined to make contributions to a new area. This requires people to work across discipline boundaries and learn about new things in depth. People need to engage deeply with one another, which is similarly undermined today by project management and the focus on information security. The key thing is exposure to new and related areas of endeavor and basic learning. The breakthroughs come episodically and do not lend themselves to the sort of project management in vogue today.

It isn’t like I came back with nothing. There were a couple of new things that really fall into the category of following up. In one case there was a continuation of a discussion of verification of shock tube problems with someone from Los Alamos. The discussion started in Las Vegas at the ASME VVUQ meeting and continued in Santa Fe. In a nutshell, we were trying to get cleaner verification results by dividing the problem into specific regions, each associated with a particular solution feature and the expectation of a different rate of convergence. We found something unexpected in the process that doesn’t seem to follow theoretical expectations. It’s worth some significant follow-up. A mysterious result is always something worth getting to the bottom of. The second bit of new intellectual blood came in direct response to my talk. I will also freely admit that my contribution wasn’t the best. I haven’t had much free energy at work to energize my own research; the same exascale demon is sucking my intellectual lifeblood out. I simply reported on a heretofore unreported structural failing of solvers. In summary, we find systematic, but small, violations of the second law of thermodynamics in rarefactions for modern and classical methods. This shouldn’t happen, and violations of the second law lead to unphysical solutions. All of this stems from identifying a brutal problem (https://williamjrider.wordpress.com/2017/06/09/brutal-problems-make-for-swift-progress/) that every general-purpose code fails at – what I call “Satan’s shock tube”, with 12-order-of-magnitude jumps in density and pressure approximating propagation of waves into a vacuum.
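
For readers who want a concrete picture of what such a diagnostic might look like, here is a minimal sketch of a global second-law check for a 1D ideal-gas shock tube solution. It is not the diagnostic we actually used; the array layout, the entropy proxy and the tolerance are all assumptions for illustration.

```python
# A minimal sketch (not the author's actual diagnostic) of a global second-law
# check for a 1D ideal-gas shock-tube solution. Assumes solver output arrays
# rho_hist[t][i], p_hist[t][i] on a uniform mesh with cell width dx.
import numpy as np

def total_entropy(rho, p, dx, gamma=1.4):
    """Integral of rho * ln(p / rho**gamma) over the domain (up to constants)."""
    s = np.log(p / rho**gamma)  # specific-entropy proxy for an ideal gas
    return np.sum(rho * s) * dx

def second_law_violations(rho_hist, p_hist, dx, gamma=1.4, tol=0.0):
    """Return time indices where the total entropy decreased (a violation)."""
    S = np.array([total_entropy(r, p, dx, gamma) for r, p in zip(rho_hist, p_hist)])
    dS = np.diff(S)
    return np.where(dS < -tol)[0] + 1
```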

We cannot live only for ourselves. A thousand fibers connect us with our fellow men; and among those fibers, as sympathetic threads, our actions run as causes, and they come back to us as effects.

― Herman Melville

Before closing I can say a thing or two about the meeting itself. None of these issues dulled the brilliance of the venue in Santa Fe, “the City Different.” While I was disappointed about not enjoying the meeting in some exotic European venue, Santa Fe is a fabulous place for a meeting. It is both old (by American standards) and wonderfully cosmopolitan. There is genuine beauty in the area, it boasts exceptional weather in the fall (the week didn’t disappoint), and it has a vibrant art community including the impressive and psychedelic Meow Wolf. Our hotel was nearly perfect: the Drury Plaza, housed in a remodeled (and supposedly haunted) old hospital. Two short blocks from the plaza, the hotel is enchanting and comfortable. We all shared two meals each day catered by the hotel’s exceptional restaurant. Having meals at the conference, together with the participants, is optimal and makes for a much-improved meeting compared to going out to restaurants.

We had a marvelous reception on the hotel’s rooftop bar, enjoying a typically gorgeous New Mexico early autumn sunset with flowing drinks, old friends and incredibly stimulating conversation. American laws virtually prohibit government funds from paying for alcohol, thus the drinks were courtesy of the British and French governments. This is one more idiotic prohibition on productivity and common sense that only undermines our collective efforts, especially the creative and collaborative ones. These laws have only gotten more prescriptive and limiting. We can no longer pay for interview lunches and dinners, much less business meetings. None of this reflects best practice for any business. The power of breaking bread and enjoying a drink to lubricate human interactions is well known. These restrictions only hurt our productivity and our capacity to produce valuable work. We are utterly delusional about the wisdom of these policies. All of this only serves to highlight the shortcomings in creative energy evident from the rather low level of vibrancy and the lack of exciting new ideas.

Never underestimate the power of human stupidity.

– Robert A. Heinlein

 

Testing the limits of our knowledge

22 Friday Sep 2017

Posted by Bill Rider in Uncategorized

≈ 2 Comments

The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.

― Daniel J. Boorstin

All knowledge hurts.

― Cassandra Clare

Science is an important source and process for generating knowledge for humanity. Much of science is a core of well-known and well-tested knowledge about the universe. Most of the activity for working scientists is at the boundaries of our knowledge, working to push back the current limits on what is known. The scientific method is there to provide structure and order to the expansion of knowledge. We have well-chosen and well-understood ways to test proposed knowledge. One method of using and testing our theoretical knowledge in science is computational simulation. Within computational work the use of verification and validation with uncertainty quantification is basically the scientific method in action (https://williamjrider.wordpress.com/2016/12/22/verification-and-validation-with-uncertainty-quantification-is-the-scientific-method/). One of the key activities in establishing our understanding of theory is the determination of uncertainties.

Unfortunately what we call “uncertainty quantification” is only a small piece of the uncertainty that needs to be evaluated in testing knowledge. Too often people do only this narrow part of uncertainty quantification and falsely believe it is sufficient for science.

The basic premise is that we can test our knowledge via the systematic understanding of uncertainties. If we examine the uncertainty in an honest and complete manner, the limits of knowledge can be explored. Some uncertainty can be reduced via greater computational effort, some uncertainty can be reduced with improved modeling, and some uncertainty is irreducible. It is absolutely essential to understand systematically the nature of what we do and don’t know. This takes careful and honest study, yet the strong tendency of people is to view all uncertainty as bad. This is not necessarily true. Uncertainty is completely unavoidable, and understanding the degree to which this is true can be a great unburdening. If an uncertainty is unavoidable, one can dispense with attempting to reduce it and simply figure out how to live with it.

If the uncertainty is irreducible and unavoidable, the problem with not assessing uncertainty, and thereby taking an implied value of ZERO for the uncertainty, becomes truly dangerous (https://williamjrider.wordpress.com/2016/04/22/the-default-uncertainty-is-always-zero/). In this case there is an uncertainty that should be there, and instead of building this knowledge into our work, we mindlessly ignore it. Sometimes it is actually mindfully ignored, which is utterly contemptible. This situation is so common as to be laughable, but it actually provides the source for tragedy. Weather and climate provide innumerable situations where ignorance of uncertainty may prove deadly in rather commonly encountered circumstances. As systems become more complex and energetic, chaotic character becomes more acute and common. This chaotic character leads to solutions that have natural variability. Understanding this natural variability is essential to understanding the system. Building this knowledge is the first step in moving to a capability to control and engineer it, and perhaps, if wise, reduce it. If one does not possess an understanding of what the variability is, it cannot be addressed via systematic engineering or accommodation.

Courage doesn’t happen when you have all the answers. It happens when you are ready to face the questions you have been avoiding your whole life.

― Shannon L. Alder

This entire issue comes up frequently. In the case of experiments for complex systems, the problem arises because of a lack of repeated experiments. Often such systems are complex and expensive, thus tests are carried out once. Even though any thoughtful examination of the situation would conclude that the results of the experiment are likely, almost certainly, highly variable, the experiment is treated as a unique event. Computational simulations of the experiment are viewed the same way; the calculation should try to reproduce this single experiment. This is potentially a recipe for disaster. A healthier point of view would be to look at this experiment as a single instance drawn from a probability distribution. If the experiment were repeated there would be different results. A computational simulation, if truly predictive, would do exactly the same thing. Of course the simulations themselves are designed to compute the average response of such an experiment (being based on mean field equations). All of this conspires to create big problems should the experiment actually draw from a low probability outcome (i.e., the tails of the distribution).
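
A minimal sketch, with made-up numbers, of what it means to treat a single experiment as one draw from a predictive distribution rather than as a unique, exact target. The prediction, the variability, and the observation below are hypothetical placeholders, and lumping the irreducible variability into one normal distribution is an assumption for illustration.

```python
# Treat a lone experiment as a draw from a predictive distribution.
import numpy as np
from scipy.stats import norm

sim_mean = 10.0   # mean-field simulation prediction (hypothetical)
sigma    = 0.8    # assumed irreducible experimental variability
observed = 11.9   # the single measured value (hypothetical)

# Where does the lone observation sit in the predictive distribution?
quantile = norm.cdf(observed, loc=sim_mean, scale=sigma)
print(f"observation falls at the {100 * quantile:.1f}th percentile")
# A value far into the tails warns that calibrating a model to this one
# experiment may chase an unlucky draw rather than the mean behavior.
```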

To address this systemic issue we need to reframe both experimental and theoretical practice. If an experiment fails to give repeatable results, we need to work to unveil the statistical nature of the results, especially for complex, high consequence systems. Theoretical models need to possess the same properties mindfully, rather than producing variability by happenstance (mean field models produce variability almost serendipitously). Computational simulations follow the theory by construction, and our knowledge and understanding would benefit greatly from building the variability into the theory more structurally and mindfully. Failure to address this issue systematically is an ever-growing limit for science. We have a major scientific gap open in front of us and we are failing to acknowledge and attack it with our scientific tools. It is simply ignored, almost by fiat. Changing our perspective would make a huge difference in experimental and theoretical science, and remove our collective heads from the sand on this matter.

Nothing in life is to be feared, it is only to be understood. Now is the time to understand more, so that we may fear less.

― Marie Curie

At a deeper and more fundamental level, the whole exploration of the true uncertainty is the search for understanding. We seek to define our level of precision for the modeling of something by systematically examining the levels of certainty (by proxy of studying uncertainty). We need to understand the quality of our knowledge, and a complete survey of uncertainty is a path to this end. For computational modeling there are three broad categories of uncertainty: the model itself, the model’s numerical solution, and the experimental fidelity used to grant confidence to the model. Each of these uncertainties can in turn be broken down into more detailed pieces. For example, we must compare to the model itself rather than to the error in the solution of the model. It is important to choose some structure for the uncertainty and commit to an estimate of all portions of that structure. One should never take a zero magnitude for the uncertainty of any structural element by ignoring it.

Any fool can know. The point is to understand.

― Albert Einstein

One of the clearest ways to undermine this quest for understanding the boundaries of our knowledge is ignoring a key uncertainty. There are several usual suspects for this willful uncertainty ignorance. Probably the most common uncertainty to be willfully ignorant of is numerical error. The key numerical error is discretization error, which arises from the need to make a continuous problem discrete and computable. The basic premise of computing is that more discrete degrees of freedom should produce a more accurate answer. By examining the rate at which this happens, the magnitude of the error can be estimated. Other estimates can be had through making some assumptions about the solution and relating the error to the nature of the solution (like the magnitude of estimated derivatives). Other, generally smaller, numerical errors arise from solving systems of equations to a specified tolerance, parallel consistency error and round-off error. In most circumstances these are much smaller than discretization error, but they are still non-zero.
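
As an illustration of that premise, here is a minimal sketch of the classic three-grid estimate of the observed convergence rate and discretization error. The quantity and the numbers are hypothetical, and a real verification study involves considerably more care than this.

```python
# A minimal sketch (illustrative, not any particular verification code) of
# estimating the observed convergence rate and discretization error from
# solutions of a scalar quantity on three systematically refined meshes.
import numpy as np

def observed_order_and_error(f_coarse, f_medium, f_fine, refinement_ratio=2.0):
    """Richardson-style estimate from three grid solutions."""
    r = refinement_ratio
    # observed rate of convergence
    p = np.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / np.log(r)
    # estimated discretization error in the finest-grid solution
    error_fine = abs(f_medium - f_fine) / (r**p - 1.0)
    return p, error_fine

# hypothetical values of some integrated quantity on 100, 200 and 400 cell meshes
p, err = observed_order_and_error(0.9120, 0.9345, 0.9440)
print(f"observed order ~ {p:.2f}, estimated error on finest mesh ~ {err:.2e}")
```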

Experimental observations are only experience carefully planned in advance, and designed to form a secure basis of new knowledge.

― Sir Ronald Fisher

A second category of uncertainty that is highly prone to being ignored is experimental variability. Often this is the direct consequence of only doing a single experiment. Rather than knowing the physics of the problem well enough to conclude that the experiment will be highly variable, this is ignored and we endeavor to model the single experiment as a unique, well-determined event. The result of this set of unjustified assumptions is wholesale ignorance of a systematic and irreducible uncertainty. This is truly scientific malpractice. Addressing this shortcoming should be the focus of significant effort experimentally, as well as in modeling and its numerical solution. It is a very large and largely unaddressed issue in science.

In addition to ignoring the intrinsic variability in the experiments, the more classical and even less excusable uncertainty often ignored is measurement error. This error is always present even in cases where the experiment is well posed and utterly reproducible. Measurements are always finite precision and have some degree of error and uncertainty. This finite value should always be reported as part of the experimental comparison even when the uncertainty is small. In a deep sense the small uncertainty is more important because it lends credence to the sense that the data is high quality. Not reporting the uncertainty simply leaves this vague and undefined.

The last area of uncertainty is modeling uncertainty. In the vast majority of cases this will be the largest source of uncertainty, though of course there will be exceptions. It has three major components: the choice of the overall discrete model, the choice of the models or equations themselves, and the coefficients defining the specific model. The first two areas are usually the largest part of the uncertainty, and unfortunately the most commonly ignored in assessments. The last area is the most commonly addressed because it is amenable to automatic evaluation. Even in this case the work is generally incomplete and lacks full disclosure of the uncertainty.

Today, in examining modeling uncertainty, we most often attack the least important piece systematically, and the more important modeling uncertainties are ignored. The “easy” uncertainty to attack is the coefficients in the model. This can be achieved using well-developed methods such as MCMC (Markov chain Monte Carlo). One can define a set of parameters to be varied and ranges for the variation. The calculations can be repeated using values drawn to efficiently sample the probability space of the calculation and produce the uncertainty. This sampling is done over a very high-dimensional space and carries significant errors. More often than not the degree of error associated with the undersampling is not included in the results. It most certainly should be.
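
To make the workflow concrete, here is a minimal sketch of the parametric piece only: a random-walk Metropolis sampler calibrating a single coefficient of a stand-in model against made-up data. The model, prior range, noise level and data are assumptions for illustration, not anything from an actual analysis.

```python
# A minimal sketch of parametric uncertainty estimation by MCMC sampling.
# Everything below (model, data, prior range, noise) is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    # stand-in for an expensive simulation with one uncertain coefficient
    return theta * x**2

x_data = np.array([0.5, 1.0, 1.5])
y_data = np.array([0.7, 2.6, 5.9])   # made-up "experimental" values
noise  = 0.3                          # assumed measurement standard deviation

def log_post(theta):
    if not (0.0 < theta < 10.0):      # uniform prior on an assumed range
        return -np.inf
    resid = y_data - model(theta, x_data)
    return -0.5 * np.sum((resid / noise) ** 2)

# random-walk Metropolis sampler
samples, theta = [], 2.0
lp = log_post(theta)
for _ in range(20000):
    prop = theta + 0.2 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

samples = np.array(samples[5000:])    # drop burn-in
print(f"theta ~ {samples.mean():.2f} +/- {samples.std():.2f} (parametric piece only)")
```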

The other two uncertainties are generally larger and thus more important to characterize. Unfortunately neither is amenable to the sort of turnkey, black-box approach the parametric uncertainty allows. As a result these uncertainties are usually completely ignored. These two areas of uncertainty are closely related. Any complex problem can be modeled in a variety of ways. An analyst ends up making innumerable assumptions and choices in the course of modeling a complex problem. One choice is the code used for the analysis and the specific settings within that code. Beyond this there are choices on how the problem is meshed, the boundary conditions, the initial conditions, the submodels used to close the equations, and the analysis of results. Each and every one of these choices can produce changes in the results. It is very uncommon to see a problem solved in different or remotely independent ways. As a result the uncertainty from the modeling and analysis is usually completely hidden.
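
A crude but honest alternative to ignoring these uncertainties is to solve the same problem in several remotely independent ways and simply report the spread. The sketch below illustrates the idea with hypothetical numbers; it is not a rigorous model-form uncertainty methodology, just the bare minimum that beats an implied value of zero.

```python
# A minimal sketch of a crude model-form uncertainty estimate: run the same
# problem with independent modeling choices and report the spread.
# The labels and values below are hypothetical placeholders.
import numpy as np

# peak pressure (say) predicted by four independent modeling choices
results = {
    "code A, fine mesh, closure 1":   412.0,
    "code A, fine mesh, closure 2":   431.0,
    "code B, medium mesh, closure 1": 398.0,
    "code B, fine mesh, closure 3":   445.0,
}

values = np.array(list(results.values()))
center = values.mean()
spread = values.max() - values.min()
print(f"central value {center:.0f}, model-form spread +/- {spread / 2:.0f}")
# Rough, but far better than the implied ZERO that comes from never varying
# the modeling choices at all.
```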

Science, my boy, is made up of mistakes, but they are mistakes which it is useful to make, because they lead little by little to the truth.

― Jules Verne

To truly test our knowledge in a topic we must be open to a full and honest evaluation of the uncertainties. Knowledge and uncertainty are two sides of the same coin. If you don’t know the uncertainty, you don’t know the extent of your knowledge. Too often we only do the assessment that is easy. In this assessment strategy we also implicitly choose uncertainty estimates of ZERO for information we know is uncertain, but difficult to assess. This is a pernicious and dangerous practice. Due diligence and responsibility should dictate that some sort of uncertainty be estimated for all sources. If we cannot estimate these uncertainties in a credible and knowledgeable manner, we have no business in modeling and our experiments are not yielding their full value. The only responsible act in this case would be to produce work that would make such a bounding assessment possible.

Negative results are just what I want. They’re just as valuable to me as positive results. I can never find the thing that does the job best until I find the ones that don’t.

― Thomas A. Edison

The Inspirational Promise of TeD Talks

15 Friday Sep 2017

Posted by Bill Rider in Uncategorized

≈ 1 Comment

Your number-one mission as a speaker is to take something that matters deeply to you and to rebuild it inside the minds of your listeners. We’ll call that something an idea.
― Chris J. Anderson

Every September my wife and I attend the local TeDx event here in Albuquerque. It is a marvelous way to spend the day, and it leaves a lasting impression on us. We immerse ourselves in inspiring, fresh ideas surrounded by like-minded people. It is empowering and wonderful to see the local community of progressive people together at once, listening, interacting and absorbing a selection of some of the best ideas in our community. This year’s event was great and, as always, several talks stood out, particularly Jannell MacAulay (Lt. Col., USAF) talking about applying mindfulness to work and life, and Olivia Gatwood’s inspiring poetry about the seemingly mundane aspects of life that speaks to far deeper issues in society. The smallest details are illustrative of the biggest concerns. Both of these talks made me want to think deeply about applying these lessons in some fashion to myself and improving my life consequentially.

That’s part of the point of TeD: the talks are part gospel of progress, part marketing of great ideas and part performance art. All of these things have great use to society in lifting up and celebrating a drive to be better and to progress toward a better future. Humanity has immense power to change the world around it for the better. We can look across the globe and witness the collective power of humanity to change its environment. A great deal of this change is harmful or thoughtless, but much of it is a source of wonder. Our understanding of the World around us and the worlds within us has changed our biological destiny.

We have transitioned from an animal fighting for survival during brief, violent lives to beings capable of higher thought and aspiration during unnaturally long and productive lives. We can think and invent new things instead of simply fighting to feed ourselves and reproduce a new generation of humans to struggle in an identical manner. We can also produce work whose only value is beauty and wonder. TeD provides a beacon for humanity’s best characteristics along with a hopeful, forward-looking community committed to positive common values. It is a powerful message that I’d like to take with me every day. I’d like to live out this promise with my actions, but the reality of work and life comes up short.

There was a speaker from my employer this year, and there always is. There wasn’t anyone from my former employer, the other major scientific Lab in our state (what was once one of the premier scientific institutions in the World, but that’s a thing of the past). Also noticeable is the lack of support for the local TeD organization by either employer. I’ll grant you that Los Alamos has supported it in the past, but no longer. There’s probably some petty and idiotic reason for the withdrawal of support. My employer, Sandia, doesn’t support it, and hasn’t ever. It looks like our local University doesn’t support it either. I know that Los Alamos did their own local TeD conference and perhaps they thought that was enough TeD for them. That’s the sad best-case scenario, and I don’t know what the full story is.

For Sandia it’s not particularly surprising as it’s not exactly a progressive, idea-centered place, and these days no place is anyway. The University should be, but the lack of financial support from the state could explain it (it’s a common characteristic of GOP governance to eviscerate universities). It is quite hard for me to express my level of disappointment in these institutions’ lack of civic support for progressive thought. It is stark testimony on the current state of affairs when two National Laboratories and a University cannot be supportive of a major source of progressive thought in the community they are embedded within. An active progressive and intellectual community in the areas where these institutions are located should be beneficial for recruiting and retaining progressive and intellectual staff. It is one sign that this sort of long view isn’t at work. It is a sign of the times.

TeD talks are often the focus of criticism for their approach and the general marketing quality strongly associated with their performance art nature. These critiques are valid and worth considering, including the often-superficial way difficult topics are covered. In many ways, where research papers can increasingly be criticized as merely the marketing of the actual work, TeD talks are simply the 30-second mass market advertisement of big ideas for big problems. Still, the talks provide a deeply inspiring pitch for big ideas that one can follow up on, and they provide the entry to something much better. I find a talk is a perfect opening to learning or thinking more about a topic, or merely being exposed to something new.

Control leads to compliance; autonomy leads to engagement.

– Daniel H. Pink

One prime example is one of my favorite talks of all time by Daniel Pink (https://www.ted.com/talks/dan_pink_on_motivation). This talk is basically a pitch for the book “Drive” and touches only superficially on the topic. The book itself is a distillation of very complex topics. All of this is true, but none of it undermines the value in the ideas. TeD provides a platform to inspire people to do more and get closer to the actual application of the ideas to their lives (not just buy Pink’s book, the true cynic’s take on the purpose). Interestingly, the managers at work were also reading Pink’s book and discussing the ideas therein. The rub was the observation that I could not identify a single thing recommended in Pink’s book that made it to the workplace. It seemed to me that the book simply inspired the management to a set of ideals that could not be realized. The managers aren’t really in charge; they are simply managing corporate compliance instead of managing in a way that maximizes the performance of their people. The Lab isn’t about progress any more; it is about everything but progress. Compliance and subservience have become the raison d’être.

For artists, scientists, inventors, schoolchildren, and the rest of us, intrinsic motivation (the drive to do something because it is interesting, challenging, and absorbing) is essential for high levels of creativity.

– Daniel H. Pink

Intrinsic motivation is conducive to creativity; controlling extrinsic motivation is detrimental to creativity.

–Daniel H. Pink

This deep frustration isn’t limited to TeD talks; it applies to almost every source of great advice or inspiration available. Almost every manager I know reads the Harvard Business Review. I read it too. It is full of wonderful ideas and approaches to improving the way we work. Yet it is impossible to point to anything ever done with all this great advice and inspiration. My workplace looks like all the “before” case studies in HBR, and more like them every day, not less. Nothing ever recommended happens at work, nothing is tried, nothing changes in the positive direction; it’s like we are committed to moving backwards. HBR is progressive in terms of the business world. The problem is that the status quo and central organizing principle today is anti-progressive. Progress is something everyone is afraid of, and the future appears to be terrifying and worth putting off for as long as possible. We see a genuinely horrible lurch toward an embrace of the past along with all its anger, bigotry, violence and fear. Fear is the driving force for avoiding anything that looks progressive.

Management isn’t about walking around and seeing if people are in their offices, he told me. It’s about creating conditions for people to do their best work.

– Daniel H. Pink

Now that I’ve firmly established the lack of relevance of TeD and progressive thought in my workplace, I can at least appreciate and apply it at a personal level. I’d love for work to reflect a place of genuine progress, but this seems a bridge too far today. Work is a big part of life and these observations are rather dismaying. Ideally, I’d like a workplace that reflects my own values. The truth of the matter is that this is nearly impossible for a progressive-minded person in America today. Even the bastions of progressive thought like universities are not working well. Society at large seems to be at war with elites, and progressive thought is far more under siege than whites or Christians are. I can ask the serious question: how many atheists are in Congress? How much well-proven and accepted science does our government already reject? Don’t get me started on our judicial system, or the war on drugs, both of which focus far more on oppressing minorities than on crime or drug abuse. The bottom line is the sense that we are in a societal backlash against change, so more progress seems to be impossible. We will be fighting to hold onto the progress we’ve already made.

Still, I can offer a set of TeD talks that have both inspired me and impacted my life for the better. They have either encouraged me to learn more, or to make a change, or simply to change perspective. I’ll start with a recent one where David Baron gave us an incredibly inspiring call to see the total eclipse in its totality (https://www.ted.com/talks/david_baron_you_owe_it_to_yourself_to_experience_a_total_solar_eclipse). I saw the talk and concluded that I simply had to go, and then I showed it to my wife to convince her. It did! We hopped into the car at midnight the day of the eclipse and drove eight hours to get from Northern Idaho to Eastern Oregon. We got off I-84 at Durkee, finding a wonderful community center with a lawn, and watched it with 50 people from all over the local area plus a couple from Berlin! The totality of the eclipse lasted only two minutes. It was part of a 22-hour day of driving over 800 miles, and it was totally and completely worth every second! Seeing the totality was one of the greatest experiences I can remember. My life was better for it, and my life was better for watching that TeD talk.

Another recent talk really provoked me to think about my priorities. It is a deep consideration of what your priorities are in terms of your health. Are you better off going to the gym, or going to a party or the bar? Conventional wisdom says the gym will extend your life the most, but perhaps not. Susan Pinker provides a compelling case that social connection is the key to a longer life (https://www.ted.com/talks/susan_pinker_the_secret_to_living_longer_may_be_your_social_life). This gets at the disparity between men and women, since women tend to connect in long, life-affirming friendships with greater ease than men. The talk is backed up by data and by visits to places where people live long lives. These people live in communities where they are entangled in each other’s lives almost by design. It gets to the priorities associated with health care and self-care, along with the benefits of our actions. Focusing on your social life is a genuinely beneficial act for prolonging your life.

Our modern computing world is a marvel, but it also has some rather pronounced downsides. In many ways our cell phones are making us far unhappier people. The phones and their apps are designed to grab and demand our attention. They can become sources of deep and pervasive anxiety. This is exactly what they are designed to do. As Adam Alter explains, an entire industry is set up to get as much of our attention as possible because our attention equals money, big money (https://www.ted.com/talks/adam_alter_why_our_screens_make_us_less_happy). He also explains that it doesn’t have to be like this. The same social engineering that has gone into making the phones so demanding could be harnessed to help us be better. If we balanced the naked profit motive with some measure of social responsibility, we might turn this problem into a benefit. This is a wonderfully inspiring idea; it is also terribly progressive and dangerous to the unfettered capitalism fueling this growing societal crisis.

Love rests on two pillars: surrender and autonomy. Our need for togetherness exists alongside our need for separateness.

– Esther Perel

The power of TeD extends to far deeper personal matters as well. A couple of talks by Esther Perel speak to reframing our love lives (https://www.ted.com/talks/esther_perel_the_secret_to_desire_in_a_long_term_relationship, https://www.ted.com/talks/esther_perel_rethinking_infidelity_a_talk_for_anyone_who_has_ever_loved). Perel defies conventional thought on love, marriage and infidelity, providing a counter-theory on all these matters. Her first talk is an accompaniment to her first book and tackles the thorny issue of keeping your long-term relationship hot and steamy. It is a challenge many of us have tackled, and no doubt struggled with. This struggle exists for good reasons, and knowing the reasons provides insight toward solutions. Perel powerfully explains the problem and speaks to working toward solutions.

The thornier issue of infidelity is the subject of the second talk (and her brand new book). As before, she tackles the topic from a totally different perspective. Her approach is unconventional and utterly refreshing. The new perspectives provide an alternative narrative for handling this all-too-common human failing. Explaining and understanding the complex roots of this all-too-common relationship problem can improve our lives. It is an alternative to the moral perspective that has failed to provide any solutions. Among the threads to concentrate on are the relatively new character of modern marriage in the history of humanity, and the consequences of the deep changes in the institution. One of the beauties of TeD is the exposure to fresh perspectives on old ideas alongside completely new ideas.

The very ingredients that nurture love (mutuality, reciprocity, protection, worry, and responsibility for the other) are sometimes the very ingredients that stifle desire.

– Esther Perel

Truth and courage aren’t always comfortable, but they’re never weakness.

– Brene Brown

The last talk I’ll highlight today is truly challenging to most of us. Brene Brown is a gifted and utterly approachable speaker presenting a topic that genuinely terrifies most of us: vulnerability (https://www.ted.com/talks/brene_brown_on_vulnerability). Being vulnerable is an immensely valuable characteristic that almost everyone struggles with. Vulnerability often equates with being weak, but also with being open and honest. That openness and honesty is the key to being a better person and developing better relationships. In many cases the weakness and honesty is shared only with yourself. In either case vulnerability provides an avenue to connection and an embrace of humanity that both frees you and allows deeper relationships to flourish. The freedom you give yourself allows you to grow, learn and overcome bad experiences.

What would you be glad you did–even if you failed?

– Brene Brown

I always wish that I could focus on most of what I hear at a local TeD event, but one must make choices; time and effort are limited. While I do plan to more mindfully apply mindfulness to my life, right now I’ll hedge toward the artistic side of things, if for no other reason than that I usually don’t. I will close by honoring the inspirational gift of Olivia Gatwood’s talk on poetry about seeking beauty and meaning in the mundane. I’ll write a narrative of a moment in my life that touched me deeply.

The Best Gift

A night of enchanting companionship was drawing to a close,

and I was longing for one last kiss before parting

Those early autumn nights are so welcoming,

 

the crisp nights promised, but not yet arrived,

summer still alive, but fading

I hadn’t even bothered to fully dress for the goodbye,

 

Conventions and neighbors be damned

It was a warm evening and my skin wanted to drink it in,

drink her in too, one last time

 

We slowly made our way out to my driveway

talking, still flirting, our banter unabated

The moon full, bright, and peeking between the gaps in the single cloud

 

adorning the sky as it illuminates the night

It will light her way home as a warm beacon

“Good,” I think, “you’ll be safe” on your long drive home

 

We draw close to each other, pressing hard while

savoring the time spent together fun and friendship

with a depth that was unexpected, but welcome

 

You ask, “What would you like for your birthday?”

My mind goes to my elaborate tattoo to adorn me soon,

“I’m already getting what I want for myself”

 

“I always ask for more time,” she said longingly

Her words cut me to the core,

of course, what else would she want?

 

My head spins with the truth revealed by her breathtaking honesty,

with words failing me for a breath or two, … or three

My mind opens with the realization of her precious offering

 

“I just want good memories”

Realization washes over me, she just gave me the best gift I could have hoped for

We kiss deeply and parted until we next renew making good memories

 

 

You are not special; you are replaceable

08 Friday Sep 2017

Posted by Bill Rider in Uncategorized

≈ 6 Comments

You are not special. You’re not a beautiful and unique snowflake. You’re the same decaying organic matter as everything else. We’re all part of the same compost heap. We’re the all-singing, all-dancing crap of the world.

– Chuck Palahniuk

This post was inspired by twin events: a comment from a dear friend, and watching the movie “Fight Club” again. This is my 300th blog post here. It’s been an amazing experience; thanks for reading.

If you consider the prospect of retirement and you feel that your place of work does not need you and would not suffer from your departure, you aren’t alone. This is an increasing trend for work today. You are an eminently replaceable cog in the machine, which can be interchanged with another person without any loss to the workplace. Your personal imprint on the products of work is not essential, and someone else could do exactly what you do. If you work in one of the many service industry jobs, or provide the basic execution of tasks, the work is highly prescribed and you versus someone else doesn’t matter much. If you are reliable, show up and work hard, you are a good worker, but someone else with all the same characteristics is just as good.

What’s measured improves

–Peter F. Drucker

I didn’t use to feel this way, but times have changed. I felt this way when I worked at McDonald’s for my first job. I was a hard worker, and a kick-ass grill man, opener, closer, and whatever else I did. I became a manager and ultimately the #2 man at a store. Still, I was 100% replaceable and in no way essential; the store worked just fine without me. I was interchangeable with another hard-working person. It isn’t really the best feeling; you’d like to be a person whose imprint on the World means something. This is an aspiration worth having, and when your work is truly creative, you add value in a way that no one else can replicate.

When I started working almost 30 years ago at Los Alamos, this dynamic felt a lot different. People mattered a lot, and an individual was important. Every individual was important, unique and worth the effort. As a person you felt the warm embrace of an incubator for aspiring scientists. You were encouraged to think of the big picture and the long term while learning and growing. The Lab was a warm and welcoming place where people were generous with knowledge, expertise and time. It was still hard work and incredibly demanding, but all in the spirit of service and work with value. I repaid the generosity through learning and growing as a professional. It was an amazing place to work, an incredible place to be, an environment to be treasured, and it made me who I am today.

Never attribute to malevolence what is merely due to incompetence

–Arthur C. Clarke

It was also a place that was out of time. It was a relic. The modern World came to Los Alamos and destroyed it, creating a shadow of its former greatness. The sort of values that made it such a National treasure and one of the greatest institutions could not coexist with today’s culture. The individuals so treasured and empowered by the scientific culture there were relabeled as “butthead cowboys,” troublemakers, and failures. The culture that was generous, long-term in thought, viewed the big picture and focused on National service was haphazardly dismantled. Empowerment was ripped away from the scientists and replaced with control. Caution replaced boldness, management removed generosity, all in the name of a formality of operations that removes anything unforeseen in outcomes. The modern world wants assured performance. Today Los Alamos is a mere shadow of itself, stumbling forward toward the abyss of mediocrity. Witnessing this happen was one of the greatest tragedies of my life.

People who don’t take risks generally make about two big mistakes a year. People who do take risks generally make about two big mistakes a year.

–Peter F. Drucker

Along with assured performance we lose serendipity and discovery. We lose learning and surprises, good and bad. We lose the value in the individual, and the ability to have one person make a positive difference. All of this is to keep one person from making a negative difference or to avoid mistakes and failures. The removal of mistakes and failures removes the engine of learning and real scientific discovery from the table as well. Each and every one of these steps is directly related to the fear of bad things happening. Every good is the flip side of a bad thing, and when you can’t accept the bad, you can’t have the good either. In the process the individual has been removed from importance. Everything is process today, and anything bad can supposedly be managed out of existence. No one looks at the downside to this, and the downside is sinister to the quality of the workplace.

Let’s be clear about what I’m talking about. This isn’t about being cavalier and careless. It isn’t an invitation to be dangerous or thoughtless. This is about making a best, earnest effort at something, and still failing. This is about doing difficult things that may not succeed, putting your best effort forward even if it falls short. In many ways we have lost the ability to distinguish between good and bad failure, with all failure viewed as bad, and punished. We have made the workplace an obsessively cautious and risk-averse place that lacks the soul it once embraced. We have lost the wonder and power of the supremely talented person in the prime of their creative powers to create game-changing things or knowledge.

The core problem is the unwillingness to deal with the inevitable risks and failures that come with empowering people. Instead of seeing risks and failures as a necessary element in enabling success, we have fallen victim to the fiction that we can manage risk and failure out of existence, all while assuring productivity. This is utterly foolish and antithetical to reality. The risks are necessary to strive for difficult and potentially great things. If one is working at the limit of one’s capability, the result is frequently failure, and the ensemble of failures paves the way for success. It tells us clearly what does not work, and provides the hard lessons that educate us. Somehow we have allowed the delusion that achievement can be had without risk and failure to creep into our collective consciousness.

Instead of encouraging and empowering our people to take risks while tolerating and learning from failure, we do the opposite. We steer people away from risky work, punish failure and discourage learning the lessons. It is as if we had suddenly become believers in the “free lunch”. True achievement is extremely difficult, and it is powered by the ability to attempt risky, almost impossible things. If failure is not used as an opportunity to learn, people become disempowered and avoid risks. This in turn kills achievement before it can even be conceived. The entire system would seem to be designed to disempower people and lower their potential for achievement.

The other aspect of this truly vicious cycle is the dismantling of expertise. Expertise is built on the back of years and years of failure. Of course this happens only if the failures are actively engaged as educational opportunities that empower the expert to take more thoughtful risks. These thoughtfully taken risks still fail, and perhaps fail most of the time. Gradually the failures of today begin to look like the achievements of yesterday. What we see as a failure today would have been a monumental achievement a decade ago. This is completely built on the back of seeing the failures of yesterday in the right light, and learning the lessons available from the experience.

When we empower people to take risks and grow them into experts, they also provide the knowledge necessary to mentor others. This was a key aspect of my early career experience at Los Alamos. At that time the Lab was teeming with experts who were generous with their time and knowledge. All you had to do was reach out and ask, and people helped you. The experts were eager to share their experience and knowledge with others in a spirit of collective generosity. Today we are managed to completely avoid this, with managed time and managed focus. We are trained not to be generous because that generosity would rob our “customers” of our effort and time. The flywheel of the experts of today helping to create the experts of tomorrow is being undone. People are trained neither to ask, nor to provide expertise freely.

What we are moving toward is a system that is less than the sum of its parts. What I started with was a system that added great value to every person, and was effectively far greater than the sum of its parts. The generosity that characterized my early career added immense value to every hour spent at work. Today this entire way of working is being torn apart by how we are managed. People can’t be generous if they have to account for all their time and charge it to a specific customer. The room for serendipity, discovery and the addition of personal value to activities is being removed to satisfy bean counters and small-minded people. We have allowed an irrational fear of one misspent dollar to waste billions of dollars and the productive potential of people’s lives. Worse yet, the whole apparatus erected to produce formal operations is ripping the creative force from the workplace and replacing it with soulless conformity. It matters less and less who we are each day; we are simply replaceable parts in a mindless machine.

I might be tempted to simply end the discussion here, but this conclusion is rather dismal. It is where we find ourselves today. We also know that the state of affairs can be significantly better. How can we get there from here? The first step would be some sort of collective decision that the current system isn’t working. From my perspective, the malaise and lack of effectiveness of our current system are so pervasive and evident that action to correct them is overdue. On the other hand, the current system serves the purposes of those in control quite well, and they are not predisposed to be agents of change. As such, the impetus for change is almost invariably external. It is usually extremely painful because the status quo does not want to be rooted out unless it is forced to be. The circumstances need to demand performance that the current system cannot produce, and as systems degrade this becomes ever more likely.

At the time, my life just seemed too complete, and maybe we have to break everything to make something better out of ourselves.

–Chuck Palahniuk

The current system is thoroughly disempowering and oriented toward explicit control of people’s actions. Keeping order and people in line while avoiding risk and failure are the core principles. The key to any change is trust, allowing the individual to move back to the center of the system. The upside of trust is the efficiency and effectiveness born from it; the downside is the possibility of failure, poor performance and various human failings. The system needs to be resilient to these inevitable problems with people. Trying to control and manage these failings away destroys most of the positive things individuals can provide. Empowerment needs to trump control and allow people’s natural inclination toward success to be central to organizational design.

In most cases being a good boss means hiring talented people and then getting out of their way.

–Tina Fey

We need to completely let go of the belief that we can manage all the bad things away and not lose all the good things in the process. Bad things, bad outcomes and bad behavior happen, and perhaps need to happen to have the good (in other words, “shit happens”). Today we are gripped by a belief that negative outcomes can be managed away. In the process of managing away bad outcomes, we destroy the foundation of everything good. To put it differently, we need to value the good and accept the bad as a necessary condition for enabling good outcomes. If one looks at failure as the engine of learning, we begin to realize that the bad is the foundation of the good. If we do not allow the bad things to happen, let people fuck things up, we can’t have really good things either. One requires the other, and our attempts to control bad outcomes remove a lot of good or even great outcomes at the same time.

An expert is someone who knows some of the worst mistakes that can be made in his subject, and how to avoid them.

– Werner Heisenberg

So to sum up, let’s trust people again. Let’s let them fail, fuck up and do bad things. Let’s let people learn from these failures, fuck-ups and painful experiences. These people will learn a lot, including very painful lessons, and get hurt deeply in the process. They will become wise, strong, and true experts at things. People who are trusted are empowered and love their work. They are efficient, productive and effective. They have passion for what they do, and give their work great loyalty. They will take risks in a fearless manner. They will be allowed to fail spectacularly, because spectacular success and breakthroughs only come from these fearlessly taken risks.

May I never be complete. May I never be content. May I never be perfect.

–Chuck Palahniuk

 

 

If you don’t know uncertainty, bounding is the first step to estimating it

01 Friday Sep 2017

Posted by Bill Rider in Uncategorized

≈ 1 Comment

Sometimes the hardest thing any of us can hope for is finding the courage to be honest with ourselves.

― Kira Saito

Today I’m writing about the unfortunate practice of failing to address uncertainty, and implicitly setting its value to zero, the smallest possible value. This approach is pernicious and ubiquitous in computational science (and a lot of other science). It is a direct threat to progress and far too acceptable in practice. I wrote at length decrying this standard practice, but it remains the most common practice in uncertainty quantification (https://williamjrider.wordpress.com/2016/04/22/the-default-uncertainty-is-always-zero/). In a nutshell, when someone doesn’t know what the uncertainty is, they simply assign it a value of zero. We can do something better, but first this needs to be recognized for what it is: systematic and accepted ignorance.

The reasons for not estimating uncertainties are legion. Sometimes it is just too hard (or people are lazy). Sometimes the way a problem is examined is constructed to exclude the uncertainty entirely (a common route to ignoring experimental variability and numerical error). In other cases the uncertainty is large and it is far more comfortable to be delusional about its size. Small uncertainty is comforting and implies a level of mastery that exudes confidence. Large uncertainty is worrying and implies a lack of control. For this reason getting away with choosing a zero uncertainty is a source of false confidence and unfounded comfort, but a deeply common human trait.

If we can manage to overcome the multitude of human failings underpinning the choice of the default zero uncertainty, we are still left with the task of doing something better. To be clear, the major impediment is recognizing that the zero estimate of uncertainty is not acceptable (most “customers” like the zero estimate because it seems better, even though it assuredly is not!). Most of the time we have a complete absence of information on which to base uncertainty estimates. In some cases we can avoid zero uncertainty estimates by being more disciplined and industrious; in other cases we can think about estimation from the beginning of the study and build it into the work. In many cases we only have expert judgment to rely upon for estimation. In this case we need a very simple and well-defined technique for providing an estimate.

Learning is not the accumulation of knowledge, but rather, one thing only: understanding

― Donna Jo Napoli

The best way to approach estimation is the time-honored practice of bounding the uncertainty. One should be able to provide clear evidence that the uncertainty is both larger than some known value and smaller than another. This provides bounds on the magnitude of the uncertainty. Depending on the purpose of the study, these bounds allow the results to be used more appropriately. They provide a reasonable, evidence-based uncertainty to energize progress and underpin credibility. If the estimate of the smallest possible uncertainty is that ubiquitous zero, the estimate should be rejected out of hand. The uncertainty is never ZERO, not ever. Nothing is known with absolute certainty. If the uncertainty is very small, there should be very strong evidence to support the bold assertion. We do know some things extremely well, like Planck’s constant, but it still has an uncertainty of finite size.

The flip side of the lower bound is the upper bound on the uncertainty. Generally speaking, there will be a worst case to consider, or something more severe than the scenario at hand. Such large uncertainties are likely to be quite uncomfortable to those engaged in the work. This should be uncomfortable if we are doing things right. The goal of this exercise is not to minimize uncertainties, but to get things right. If such bounding uncertainties are unavailable, one does not have the right to make high-consequence decisions with the results. This is the unpleasant aspect of the process: the delivery of the worst case. To be more concrete about the need for this part of the bounding exercise, if you don’t know how bad the uncertainty is, you have no business using the results for anything serious. As stated before, the bounding process needs to be evidence-based; the assignment of lower and upper bounds for uncertainty should have a specific and defensible basis.
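To make the bookkeeping concrete, here is a minimal Python sketch of how a bounded uncertainty might be recorded along with its supporting evidence. The `BoundedUncertainty` class and its fields are my own illustration, not an established tool; the point is simply that a zero lower bound gets rejected out of hand and every bound carries a stated basis.

```python
from dataclasses import dataclass

@dataclass
class BoundedUncertainty:
    """A hypothetical record of a bounded uncertainty estimate.

    lower, upper : bounds on the magnitude of the uncertainty
    evidence     : short statement of the defensible basis for the bounds
    """
    lower: float
    upper: float
    evidence: str

    def __post_init__(self):
        # The uncertainty is never zero; a zero lower bound is rejected out of hand.
        if self.lower <= 0.0:
            raise ValueError("lower bound must be positive: uncertainty is never zero")
        if self.upper < self.lower:
            raise ValueError("upper bound must be at least the lower bound")
        if not self.evidence.strip():
            raise ValueError("bounds require a stated, defensible basis")

# Example: a quantity believed uncertain by between 1% and 10% of its nominal value.
u = BoundedUncertainty(lower=0.01, upper=0.10,
                       evidence="replicate spread (lower); worst-case calibration drift (upper)")
```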

Belief can be manipulated. Only knowledge is dangerous.

― Frank Herbert

Once the bounds for the uncertainty are established, along with the associated evidence, some choices need to be made to use the information. To a large extent the most conservative choice is the easiest to defend, meaning that the upper bound for the uncertainty should be used. If the work is being engaged in honestly, this will be pessimistic, perhaps in the extreme. If one thinks about things in a probabilistic sense, the bounds establish an interval for the potential uncertainty. This interval is most defensibly treated with a uniform distribution. For most cases, using the midpoint averaging the lower and upper bounds is a reasonable choice. If the application associated with the decision-making is extremely important, the upper bound, or something skewed in that direction, is probably advisable.
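Continuing the hypothetical record above, a small sketch of how those choices might look in code: the midpoint for routine use, the upper bound when the decision is high consequence, and a uniform distribution over the interval when a probabilistic treatment is wanted. The skew toward the upper bound is one illustrative policy, not a prescription.

```python
import numpy as np

def point_estimate(lower, upper, high_consequence=False):
    """Collapse an uncertainty interval [lower, upper] to a single value:
    the midpoint for routine use, the upper bound for high-consequence decisions."""
    return upper if high_consequence else 0.5 * (lower + upper)

def sample_uncertainty(lower, upper, n=10_000, rng=None):
    """Treat the interval as a uniform distribution and draw samples from it,
    e.g. to propagate the bounded uncertainty through a larger study."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.uniform(lower, upper, size=n)

print(point_estimate(0.01, 0.10))                         # 0.055: midpoint
print(point_estimate(0.01, 0.10, high_consequence=True))  # 0.10: conservative upper bound
```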

To some extent this is a rather easy lift intellectually. The cultural difficulty is another thing altogether. The indefensible optimism associated with the default zero uncertainty is extremely appealing. It provides the user with a feeling that the results are good. People tend to feel that there is a single correct answer, and the smaller the uncertainty, the better they feel about that answer. Large uncertainty is associated with lack of knowledge and with low achievement. The precision usually communicated by the default, standard approach is highly seductive. It takes a great deal of courage to take on the full depth of uncertainty along with the honest admission of how much is not known. It is far easier to simply do nothing and assert far greater knowledge while providing no evidence for the assertion.

Uncertainty is a discomforting concept for people. Certainty is easy and comfortable, while uncertainty is difficult and carries doubt. It is problematic to consider the role of chance in events, and the fickle nature of reality. A great many important events occur largely by chance and could easily have turned out quite differently. Consider how often you encounter a near miss in life, something where danger seemed far too close and just missed you. When these events turn out disastrously they are tragedies; how often have similar tragedies been barely averted? The same dynamic plays out in experiments that are repeated. An attempt is made to make the experiment reproducible, yet occasionally something completely different unfolds. The repeated results are never exactly the same; there is always a small variation. These variations are the uncertainty, and depending on the experiment they have a magnitude.

What happens when you do the experiment exactly once? The simplest thing to do is to consider this experiment a completely determined event with no uncertainty at all. This is the knee-jerk response: to treat the single event as utterly and completely deterministic, with no variation whatsoever. Yet if the experiment were repeated with every attempt to make it as perfect as possible, it would still turn out slightly differently. This comes from the myriad of details associated with the experiment that determine the outcome. Generally, the more complex and energetic the phenomenon being examined, the greater the variation (unless there are powerful forces attracting a very specific solution). There is always a variation; the only question is how large it is. It is never, ever identically zero. The choice to view the experiment as perfectly repeatable is usually an unconscious one with no credible basis. It is an incorrect and unjustified assumption, usually made without a second thought. As such the choice is unquestionably bad for science or engineering. In many cases this unconscious choice is dangerous, and represents nothing more than wishful thinking.
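As an illustration of the difference replication makes, here is a short sketch with made-up numbers: with several repeats the sample spread gives a direct estimate of the variability, while a single measurement offers no spread at all, and the honest fallback is a bounded, expert-judgment estimate rather than the default zero.

```python
import numpy as np

# Hypothetical repeated measurements of the same nominally identical experiment.
repeats = np.array([4.97, 5.03, 5.10, 4.92, 5.05])

# With replicates, the sample standard deviation is a direct variability estimate.
variability = repeats.std(ddof=1)
print(f"estimated variability from {len(repeats)} repeats: {variability:.3f}")

# With a single measurement there is no spread to compute, and the default-zero
# temptation appears. The honest fallback is a bounded, expert-judgment estimate.
single = 5.02
lower, upper = 0.02, 0.20   # illustrative bounds with a stated basis
print(f"single measurement {single}: uncertainty bounded in [{lower}, {upper}]")
```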

to hope was to expect

― Jane Austen

 

 

 
