The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: September 2015

Today We Value Form Over Substance

25 Friday Sep 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Our greatest fear should not be of failure but of succeeding at things in life that don’t really matter.

― Francis Chan

Vacations are a necessary disconnection from the drive and drain of the everyday events of our lives. They are also necessary to provide perspective on things that become too commonplace in the day-in, day-out routine. My recent little vacation was no different. I loved both Germany and France, but came to appreciate the USA more too. Europeans, and particularly Parisians, smoke in tremendous numbers. It makes a beautiful city like Paris a bit less ideal and less comfortable. My wife recently had major surgery that limits her mobility, and European accommodations for disabilities and handicaps are terrible in comparison to the USA.

So the lesson to be learned is that, despite many issues, the USA is still a great place and better than Europe in some very real ways. These interesting observations are not the topic here, though; the topic is another observation born of vacation time and its numerous benefits.

Bureaucracy destroys initiative. There is little that bureaucrats hate more than innovation, especially innovation that produces better results than the old routines. Improvements always make those at the top of the heap look inept. Who enjoys appearing inept?

― Frank Herbert

Another thing I noted was the irritation I feel when formality trumps substance at work. Formality has its place, but when it stifles initiative, innovation and quality, it does more harm than good. There was an activity at work that I had finished prior to leaving. Anything related to the actual work of it was utterly complete. It involved reviewing some work and determining whether the team had completed the work they promised (it was, in our parlance, a milestone). They had, and in keeping with current practice for such things, the bar was set so low that they would have had an almost impossible time not succeeding. Despite the relative lack of substance to the entire affair, the old-fashioned memo to management was missing (the new-fashioned memo with electronic signature was finished and delivered). Here function was subservient to form, and effort was expended in a completely meaningless way.

I had committed to taking a real vacation; I was not going to waste Paris doing useless administrative work. I screened my e-mail and told the parties involved that I would deal with it upon my return. Yet people wouldn’t let go of this. They had to have this memo signed in the old-fashioned way. In the end I thought, what if they had put as much effort into doing old-fashioned work? What if, instead of dumbing down the work and making it meaningless to make sure of success, the work had been far-reaching and focused on extending the state of practice? Well, then it might have been worth a bit of extra effort, but the way we do work today, this administrative flourish simply added insult to injury.

Management cares about only one thing. Paperwork. They will forgive almost anything else – cost overruns, gross incompetence, criminal indictments – as long as the paperwork’s filled out properly. And in on time.

― Connie Willis

The experience makes it clear: today we value form over function, appearances over substance. It is an apt metaphor for how things work today. We’d rather be successful doing useless work than unsuccessful doing useful work. A pathetic success is more valuable than a noble failure. Any failure induces deep and unrelenting fear. The inability to fail is hampering the success of the broader enterprise to such a great degree as to threaten the quality of the entire institution (i.e., the National Lab system).

The issue of how to achieve the formality and process desired without destroying the essence of the Labs’ excellence has not been solved. Perhaps giving credit for innovative, risk-taking work, so that the inevitable failures are not penalized, would be a worthy start. In terms of impact on all the good that the Labs do, the loss of the engines of discovery they represent negatively impacts national security, the economy, the environment and the general state of human knowledge. Reversing these impacts and putting the Labs firmly in the positive column would be a monumental improvement.

It is hard to fail, but it is worse never to have tried to succeed.

― Theodore Roosevelt

The epitomes of these pathetic successes are milestones. Milestones measure our programs, and seemingly they denote important, critical work. Instead of driving accomplishments, the milestones sow the seeds of failure. This is not explicitly recognized failure, but the failure driven by the long-term decay of innovative, aggressive technical work. The reason is the view that milestones cannot fail for any reason, and this knowledge drives any and all risk from the definition of the work. People simply will not take a chance on anything if it is associated with a milestone. If we are to achieve excellence, this tendency must be reversed. Somehow we need to reward the taking of risks to power great achievements. We are poorer as a society for allowing the current mindset to become the standard.

Without risk, we systematically accomplish less innovative, important work, and ironically package the accomplishment of relatively pathetic activities as success. To ensure success, good work is rendered pathetic so that the risk of failure is completely removed. It happens all the time, over and over. It is so completely ingrained into the system that people don’t even realize what they are doing. To make matters worse, milestones command significant resources be committed toward their completion. So we have a multitude of sins revolving around milestones: lots of money going to execute low-risk research masquerading as important work.

Over time these milestones come to define the entire body of work. This approach to managing the work at the Labs is utterly corrosive and has aided the destruction of the Labs as paragons of technical excellence. We would be so much better off if a large majority of our milestones failed, and failed because they were so technically aggressive. Instead, all our milestones succeed because the technical work is chosen to be easy. Reversing this trend requires some degree of sophisticated thinking about success. In a sense, providing a benefit for conscientious risk-taking could help. We could still rely upon the current risk-averse thinking to provide systematic fallback positions, but we would avoid making the safe, low-risk path the default choice.

Only those who dare to fail greatly can ever achieve greatly.

― Robert F. Kennedy

One place where formality and substance collide constantly is the world of V&V. The conduct of V&V is replete with formality and I generally hate it. We have numerous frameworks and guides that define how it should be conducted. Doing V&V is complex and deep, never being fully defined or complete. Writing down the process for V&V is important simply for the primary need to grapple with the broader boundaries of what is needed. It is work that I do, and continue to do, but following a framework or guide isn’t the essence of what is needed to do V&V. An honest and forward-looking quality mindset is what V&V is about.

It is a commitment to understanding, curiosity and quality of work. All of these things are distinctly lacking in our current culture of formality over substance. People can cross all the t’s and dot all the i’s, yet completely fail to do a good job. Increasingly, good work is being replaced by formality of execution without the “soul” of quality. This is what I see: lots of lip service paid to completing work within the letter of the law, and very little attention to a spirit of excellence. We have created a system that embraces formality instead of excellence as the essence of professionalism. Instead, excellence should remain the central tenet of our professional work, with formality providing structure but not serving as the measure of it.

Bureaucracies force us to practice nonsense. And if you rehearse nonsense, you may one day find yourself the victim of it.

― Laurence Gonzales

Let’s see how this works in practice within V&V, and where a different perspective and commitment would yield radically different results.

I believe that V&V should first and foremost be cast as the determination of uncertainties in modeling and simulation (and necessarily experimentation as the basis for validation). Other voices speak to the need to define the credibility of the modeling and simulation enterprise, which is an important qualitative setting for V&V work. Both activities combine to provide a deep expression of commitment to excellence and due diligence that should provide a foundation for quality work.

I feel that uncertainties are the correct centering of the work in a scientific context. These uncertainties should always be quantitatively defined; that is, they should never be ZERO, but should always have a finite value. V&V should push people to make concrete quantitative estimates of uncertainty based on technical evidence accumulated through focused work. Sometimes this technical evidence is nothing more than expert judgment or accumulated experience, but most of the time it is much more. The true nature of what is seen in work done today, whether purely within the confines of customer work or in research shown at conferences or in journals, does not meet these principles. The failure to meet these principles isn’t a small, quibbling amount, but a profound systematic failure. The failure isn’t really a broad moral problem, but a consequence of fundamental human nature at work. Good work does not provide a systematic benefit, and in many cases actually provides a measurable harm to those conducting it.

Today, many major sources of uncertainty in modeling, simulation or experimentation are unidentified, unstudied and systematically reported as being identically ZERO. Often this value of ZERO is simply implicit. This means that the work doesn’t state that it is “ZERO,” but rather fails to examine the uncertainties at all, leaving them a nonentity. In other words, the benefit of doing no work at all is reporting a smaller uncertainty. The nature of the true uncertainty is invisible. This is a recipe for an absolute disaster.

A basic principle is that doing more work should result in smaller uncertainties. This is like statistical sampling, where gathering more samples systematically produces a smaller statistical error (look at the standard error in frequentist statistics). The same thing applies to modeling or numerical uncertainty. Doing more work should always reduce uncertainties, but the uncertainty is always finite and never identically ZERO. Instead, by doing no work at all, we allow people to report ZERO as the uncertainty. From that starting point, doing more work can only increase the reported uncertainty. If doing more work increases the uncertainty, the proper conclusion is that your initial uncertainty estimate was too small. The current state of affairs is a huge problem that undermines progress.
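
To make the sampling analogy concrete, here is a minimal sketch (Python with NumPy, purely illustrative and not tied to any particular application) showing the frequentist standard error of a sample mean shrinking like sigma over the square root of n as more samples are gathered; it keeps getting smaller with more work, but it is never identically ZERO.

import numpy as np

# Illustrative only: estimate the mean of a noisy "measurement" and watch the
# standard error shrink like sigma/sqrt(n). More work buys a smaller, but
# always finite, uncertainty.
rng = np.random.default_rng(42)
true_mean, sigma = 1.0, 0.3

for n in (10, 100, 1000, 10000):
    samples = true_mean + sigma * rng.standard_normal(n)
    std_err = samples.std(ddof=1) / np.sqrt(n)   # frequentist standard error
    print(f"n={n:6d}  mean={samples.mean():.4f}  standard error={std_err:.4f}")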

Here is a very common example of how this manifests itself in practice. The vast majority of computations, for all purposes, do nothing to estimate numerical errors and get away with reporting an effective value of ZERO for the numerical uncertainty. Instead of ZERO, if you have done little or nothing to structurally estimate uncertainties, your estimates should be larger than the truth to account for your lack of knowledge. Less knowledge should never be rewarded with reporting a smaller uncertainty.
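
As a sketch of what doing the work looks like for a single quantity of interest, here is a small Richardson-extrapolation, GCI-style numerical uncertainty estimate from three mesh resolutions. The values below are hypothetical and the formulas are the standard textbook grid-convergence estimates; treat this as an assumption-laden illustration rather than a complete procedure.

import math

# Hypothetical values of one quantity of interest computed on three meshes
# related by a constant refinement ratio r (fine, medium, coarse).
f_fine, f_med, f_coarse = 1.0213, 1.0342, 1.0751
r = 2.0

# Observed order of convergence from the three solutions.
p_obs = math.log((f_coarse - f_med) / (f_med - f_fine)) / math.log(r)

# Richardson error estimate for the fine grid and a GCI-style uncertainty with
# a safety factor (1.25 is customary when the observed order is trustworthy).
err_fine = (f_med - f_fine) / (r**p_obs - 1.0)
gci_fine = 1.25 * abs(err_fine)

print(f"observed order p = {p_obs:.2f}")
print(f"numerical uncertainty (GCI) = {gci_fine:.2e}")   # finite, not ZERO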

For example, you do some work and find out that the numerical uncertainty is larger than your original estimate. The consequence is that your original estimate was too small, and you should learn how to avoid this problem in the future. Next time, doing more work should be rewarded by getting to report a smaller uncertainty. You should also do the mea culpa and admit that your original estimate was overly optimistic. Remember, V&V is really about being honest about the limitations of modeling and simulation. Too often people get hung up on being able to do a complete technical assessment before reporting any uncertainty. If the full technical work cannot be executed, they end up presenting nothing at all, or ZERO.

People get away with not doing numerical error estimation in some funny ways. Here is an example that starts with the creation of the numerical model for a problem. If the model is created so that it uses all the reasonably available computing resources, it can avoid numerical error estimation in a couple of ways. Often these models are created with an emphasis on geometric resolution. Features of the problem are meshed using computational elements that are one element thick (or wide, or tall). As a result, the model cannot be (simply) coarsened to produce a cheaper model that can assist in computational uncertainty estimation. Because it has used all the reasonable resources, refining the model and completing a simulation is impossible without heroic efforts. You effectively have only a single mesh resolution to work with, by fiat.

Then they often claim that their numerical errors are really very small, so any effort to estimate these small errors would be a waste of time. This sort of twisted logic demands a firm, unequivocal response. First, if your numerical error is so small, then why are you using such a computationally demanding model? Couldn’t you get by with a bit more numerical error, since it’s so small as to be regarded as negligible? Of course their logic doesn’t go there, because their main idea is to avoid doing anything, not to actually estimate the numerical uncertainty or do anything with the information. In other words, this is a work avoidance strategy and complete BS, but there is more to worry about here.

It is troubling that people would rely upon meshes where a single element defines a length scale in the problem. Almost no numerical phenomena I am aware of are resolved by a single element, with the exception of integral properties such as conservation, and only if this is built into the formulation. Every quantity associated with that single element is there simply for integral effect, and could be accommodated with even less resolution. It is almost certainly not “fully resolved” in any way, shape or form. Despite these rather obvious realities of numerical modeling of physical phenomena, the practice persists and in fact flourishes. The credibility of such calculations should be regarded as quite suspect without extensive evidence to the contrary.

In the end we have embraced stupidity and naivety as principles, packaged as formality and process. The form of the work, well planned and executed as advertised, has come to define quality. Our work is delivered with an unwavering over-confidence that is not supported by the quality of the foundational work. We would be far better off looking to intelligence, curiosity and sophistication, with a dose of wariness, as the basis for work. Each of these characteristics forms a foundation that naturally yields the best effort possible, rather than the systematic reduction of risk that fails to push the boundaries of knowledge and capability. We need to structure our formal processes to encourage our best, rather than frighten away the very things we depend upon for success as individuals, institutions and a people.

 

Your Time Step Control is Wrong

18 Friday Sep 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment


No man needs a vacation so much as the man who has just had one.

― Elbert Hubbard

A couple of things before I jump in. First, I was on vacation early this week, and by vacation I mean vacation: nothing from work was happening. It was Paris, and that is too good to waste working. Second, the title of the post is only partially right; the situation is actually much worse than that.

Among the most widely held and accepted facts for solving a hyperbolic PDE explicitly is the time step estimate, and it is not good enough. Not good enough to be relied upon for production work. That is a pretty stunning thing to say, you might think, but it’s actually pretty damn obvious. Moreover, this basic fact also means that the same bounding estimates aren’t good enough to ensure proper entropy production either, since they are based on the same basic thing. In short order we have two fundamental pieces of the whole solution technology that can easily fall completely short of what is needed for production work.

It is probably holding progress back as more forgiving methods will be favored in the face of this shortcoming. What is more stunning is how bad the situation can actually be.

If you don’t already know, what is the accepted and standard approach? The time step size or dissipation is proportional to the characteristic speeds in the problem, and then to the other specifics of a given method. The characteristic speeds are the absolute key to everything. For simple one-dimensional gas dynamics the characteristics are $u \pm c$ and $u$, where $u$ is the fluid velocity and $c$ is the sound speed. A simple bound can be used, $\lambda_{\max} = |u| + c$. The famous CFL or Courant condition gives a time step of $\Delta t = \min(A\, \Delta x / \lambda_{\max})$, where $A$ is a positive constant typically less than one. Then you’re off to the races, computing solutions to gas dynamics.
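
For concreteness, here is a minimal sketch of that standard control in Python with NumPy, assuming an ideal gas; it is exactly the bound described above and therefore inherits all of its flaws.

import numpy as np

def standard_time_step(rho, u, p, dx, gamma=1.4, A=0.5):
    """Classic CFL control: dt = A * min(dx / lambda_max), lambda_max = |u| + c."""
    c = np.sqrt(gamma * p / rho)      # ideal-gas sound speed
    lam_max = np.abs(u) + c           # standard characteristic speed bound
    return A * np.min(dx / lam_max)

# Usage on a made-up state (Sod-like left and right values).
rho = np.array([1.0, 0.125]); u = np.array([0.0, 0.0]); p = np.array([1.0, 0.1])
print(standard_time_step(rho, u, p, dx=0.01))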

For dissipation purposes you look for the largest local wave speeds for a number of simple Riemann solvers, or for other dissipation mechanisms. This can be done locally or globally. If you choose the wave speed large enough, you are certain to meet the entropy condition.

The problem is that this way of doing things is as pervasive as it is wrong. The issue is that it ignores nonlinearities that can create far larger wave speeds. This happens on challenging problems and at the most challenging times for codes. Moreover, the worst situation for this entire aspect of the technology arises under seemingly innocuous circumstances, a rarefaction wave, although it becomes acute for very strong rarefactions. The worst case is flow into a vacuum state, where the issue can become profoundly bad. Previously I would account for the nonlinearity of sound speeds for shock waves by adding a term, similar to an artificial viscosity, to the sound speed. This term is only proportional to local velocity differences and does not account for pressure effects. The approach looks like this for an ideal gas, $C = C_0 + \frac{\gamma + 1}{2}|\Delta u|$. This helps the situation a good deal and makes the time step selection or the wave speed estimate better, but far worse things can happen (the vacuum state issue mentioned above).
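
A minimal sketch of that patched estimate, again assuming an ideal gas and taking |Δu| from neighboring cell differences (the details of where the velocity difference is sampled are my own illustrative choice):

import numpy as np

def augmented_time_step(rho, u, p, dx, gamma=1.4, A=0.5):
    """Time step using C = c + (gamma + 1)/2 * |du|, an artificial-viscosity-like
    augmentation of the sound speed built from local velocity differences."""
    c = np.sqrt(gamma * p / rho)
    du = np.zeros_like(u)
    du[:-1] = np.abs(np.diff(u))                      # jump at the right face
    du[1:] = np.maximum(du[1:], np.abs(np.diff(u)))   # and at the left face
    C = c + 0.5 * (gamma + 1.0) * du                  # nonlinearly augmented speed
    return A * np.min(dx / (np.abs(u) + C))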

Here is the problem that causes this issue to persist. To really get your arms around it requires a very expensive solution to the Riemann problem, the exact Riemann solver. Moreover, to get the bounding wave speed in general requires a nonlinear iteration using Newton’s method; checking my own code, up to 15 iterations of the solver. No one is going to do this to control time steps or estimate wave speeds.
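
To show just how much machinery the honest answer takes, here is a sketch of the exact ideal-gas bound using the standard textbook (Toro-style) pressure function and Newton iteration. This is my own reconstruction of the textbook algorithm for illustration, not anyone's production code, and it assumes an ideal gas throughout.

import numpy as np

def exact_max_wave_speed(rho_l, u_l, p_l, rho_r, u_r, p_r, gamma=1.4,
                         tol=1e-10, max_iter=20):
    """Bounding signal speed from the exact Riemann solution (ideal gas)."""
    c_l = np.sqrt(gamma * p_l / rho_l)
    c_r = np.sqrt(gamma * p_r / rho_r)

    def f_and_df(p, rho_k, p_k, c_k):
        # Shock branch (p > p_k) or rarefaction branch (p <= p_k) of the wave curve.
        if p > p_k:
            A = 2.0 / ((gamma + 1.0) * rho_k)
            B = (gamma - 1.0) / (gamma + 1.0) * p_k
            f = (p - p_k) * np.sqrt(A / (p + B))
            df = np.sqrt(A / (p + B)) * (1.0 - 0.5 * (p - p_k) / (p + B))
        else:
            f = 2.0 * c_k / (gamma - 1.0) * ((p / p_k) ** ((gamma - 1.0) / (2.0 * gamma)) - 1.0)
            df = (p / p_k) ** (-(gamma + 1.0) / (2.0 * gamma)) / (rho_k * c_k)
        return f, df

    # Newton iteration for the star-state pressure p*.
    p = max(1e-12, 0.5 * (p_l + p_r))
    for _ in range(max_iter):
        f_l, df_l = f_and_df(p, rho_l, p_l, c_l)
        f_r, df_r = f_and_df(p, rho_r, p_r, c_r)
        dp = (f_l + f_r + (u_r - u_l)) / (df_l + df_r)
        p = max(1e-12, p - dp)
        if abs(dp) < tol * p:
            break

    # Fastest left- and right-going signal speeds implied by p*.
    q = lambda p_star, p_k: np.sqrt((gamma + 1.0) / (2.0 * gamma) * p_star / p_k
                                    + (gamma - 1.0) / (2.0 * gamma))
    s_l = u_l - c_l * (q(p, p_l) if p > p_l else 1.0)   # shock or rarefaction head
    s_r = u_r + c_r * (q(p, p_r) if p > p_r else 1.0)
    return max(abs(s_l), abs(s_r))

# Sod's shock tube as a usage example.
print(exact_max_wave_speed(1.0, 0.0, 1.0, 0.125, 0.0, 0.1))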

Yuck!

I will postulate that this issue can be dealt with approximately. A first step would be to include the simplest nonlinear Riemann solver to estimate things; this uses the acoustic impedances to provide an estimate of the interior state of the Riemann fan. Here one puts the problem in a Lagrangian frame and solves the approximate wave curves, on the left side $-\rho_l C_l (u_* - u_l) = p_* - p_l$ and on the right side $\rho_r C_r (u_r - u_*) = p_r - p_*$, solving for $u_*$ and $p_*$. Then use the jump condition for density, $1/\rho_{*l} - 1/\rho_l = (u_* - u_l)/(\rho_l C_l)$, to find the interior densities (and similarly on the right). Then compute the interior sound speeds to get the bounds. The problem is that for a strong shock or rarefaction this approximation comes up very short, very short indeed.
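
Here is a minimal sketch of that acoustic, Lagrangian-frame estimate for an ideal gas; the star state then provides a sharper bound on the characteristic speeds than |u| + c of the input states alone.

import numpy as np

def acoustic_wave_speed_bound(rho_l, u_l, p_l, rho_r, u_r, p_r, gamma=1.4):
    """Wave speed bound from the acoustic (linearized) Riemann solver, using the
    impedances Z = rho*c to estimate the star state and the interior sound speeds."""
    c_l = np.sqrt(gamma * p_l / rho_l)
    c_r = np.sqrt(gamma * p_r / rho_r)
    Z_l, Z_r = rho_l * c_l, rho_r * c_r              # acoustic impedances

    # Star state from the two linearized wave curves:
    #   p* - p_l = -Z_l (u* - u_l)   and   p* - p_r = Z_r (u* - u_r)
    u_star = (Z_l * u_l + Z_r * u_r + (p_l - p_r)) / (Z_l + Z_r)
    p_star = max(p_l - Z_l * (u_star - u_l), 1e-12)

    # Interior specific volumes from the Lagrangian jump conditions, guarded
    # against unphysical values in strong interactions.
    tau_sl = max(1.0 / rho_l + (u_star - u_l) / Z_l, 1e-12)
    tau_sr = max(1.0 / rho_r + (u_r - u_star) / Z_r, 1e-12)

    # Interior sound speeds (ideal gas) and the resulting bound.
    c_sl = np.sqrt(gamma * p_star * tau_sl)
    c_sr = np.sqrt(gamma * p_star * tau_sr)
    return max(abs(u_l) + c_l, abs(u_r) + c_r,
               abs(u_star) + c_sl, abs(u_star) + c_sr)

# Sod's shock tube as a usage example.
print(acoustic_wave_speed_bound(1.0, 0.0, 1.0, 0.125, 0.0, 0.1))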

The way to do better is to use a quadratic approximation for the wave speeds, where the acoustic impedance changes to account for the strength of the interaction. I used these before to come up with a better approximate Riemann solver. The approximation is largely the same as before except that $\rho C_0 \rightarrow \rho C_0 + \rho s (u_* - u_0)$. Now you solve a quadratic problem, which is still closed form, but you have to deal with an unphysical root (which is straightforward using physical principles). For a strong rarefaction this still doesn’t work very well, because the wave curve has a local minimum at a much lower velocity than the vacuum velocity.
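
Below is a sketch of that quadratic version under my own assumed conventions: the impedance on each side stiffens or softens linearly with (u* - u_o) exactly as written above, the root nearest the acoustic estimate is kept as the physical one, and the solver falls back to the acoustic answer when the two wave curves fail to intersect (the strong-rarefaction pathology just mentioned). For an ideal gas the slope s lies between (gamma + 1)/4 in the weak-shock limit and (gamma + 1)/2 in the strong-shock limit; the choice below is an assumption.

import numpy as np

def quadratic_star_state(rho_l, u_l, p_l, rho_r, u_r, p_r, gamma=1.4, s=None):
    """Star state with velocity-dependent impedances Z = rho*(c + s*(u* - u_o)),
    giving a closed-form quadratic for u*; illustrative sign conventions."""
    if s is None:
        s = 0.25 * (gamma + 1.0)                     # weak-shock slope (assumed choice)
    c_l = np.sqrt(gamma * p_l / rho_l)
    c_r = np.sqrt(gamma * p_r / rho_r)
    Z_l, Z_r = rho_l * c_l, rho_r * c_r
    u_acoustic = (Z_l * u_l + Z_r * u_r + (p_l - p_r)) / (Z_l + Z_r)

    # Equate p* = p_l + rho_l*(c_l + s*(u_l - u*))*(u_l - u*)
    #    with p* = p_r + rho_r*(c_r + s*(u* - u_r))*(u* - u_r)
    # and collect a*u*^2 + b*u* + d = 0.
    a = s * (rho_l - rho_r)
    b = -(rho_l * c_l + 2.0 * rho_l * s * u_l) - (rho_r * c_r - 2.0 * rho_r * s * u_r)
    d = (p_l + rho_l * c_l * u_l + rho_l * s * u_l**2
         - p_r + rho_r * c_r * u_r - rho_r * s * u_r**2)

    if abs(a) < 1e-14:                               # quadratic degenerates to linear
        u_star = -d / b
    else:
        disc = b * b - 4.0 * a * d
        if disc < 0.0:                               # wave curves miss: fall back
            u_star = u_acoustic
        else:
            roots = ((-b + np.sqrt(disc)) / (2.0 * a), (-b - np.sqrt(disc)) / (2.0 * a))
            u_star = min(roots, key=lambda r: abs(r - u_acoustic))

    p_star = p_l + rho_l * (c_l + s * (u_l - u_star)) * (u_l - u_star)
    return u_star, p_star

# Colliding streams (two shocks): the quadratic estimate of p* is noticeably
# closer to the exact answer than the purely acoustic one.
print(quadratic_star_state(1.0, 1.0, 1.0, 1.0, -1.0, 1.0))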

I think this can be solved, but I need to work out the details and troubleshoot. Aside from this the details are similar to the description above.

Rider, William J. "An adaptive Riemann solver using a two-shock approximation." Computers & Fluids 28.6 (1999): 741-777.

Multimat2015: A Biannual Festival on Computing Compressible Multiple Materials

11 Friday Sep 2015

Posted by Bill Rider in Uncategorized

≈ 1 Comment

An expert is someone who knows some of the worst mistakes that can be made in his subject, and how to avoid them.
― Werner Heisenberg

One of my first posts talked about Multimat2013, a biannual meeting of scientists specializing in the computation of multi-material compressible flows. Last time, in 2013, we met in San Francisco; this time in Würzburg, Germany. These conferences began as a minisymposium at an international congress in 2000. The first actual “Multimat” was held in 2002 in Paris. I gave the very first talk at that meeting (and it was a near disaster, an international jet lag tale with admonitions about falling asleep when you arrive in Europe: don’t do it!). The second conference was in 2005, and then every two years thereafter. The spiritual leader for the meetings and general conference chairman is Misha Shashkov, a Lab fellow at Los Alamos. Taken as a whole, the meetings mark a remarkable evolution and renaissance for numerical methods, particularly Lagrangian frame shock capturing.

Sometimes going to a conference is completely justified by witnessing a single talk. This was one of those meetings. Most of the time we have to justify going to conferences by giving our own talks. The Multimat2015 conference was a stunning example of just how wrong-headed this point of view is. The point of going to a conference is to be exposed to new ideas from a different pool than you usually swim in. It is not to market or demonstrate our own ideas. This is not to say that giving talks at conferences isn’t valuable; it just isn’t the principal or most important reason for going. This is a key way in which our management simply misunderstands science.

The morning of the second day I had the pleasure of seeing a talk by Michael Dumbser (University of Trento). I’ve followed his career for a while and deeply appreciate the inventive and interesting work he does. For example, I find his PnPm methods to be an interesting approach to discretely attacking problems in a manner that may be vastly powerful. Nonetheless, I was ill prepared for the magnificent work he presented at Multimat2015. One of the things that has held discontinuous Galerkin methods back for years is nonlinear stabilization. I believe Michael has “solved” this problem, at least conceptually.

Like many brilliant ideas, his takes a problem that cannot be solved well and recasts it into a problem that we know how to solve. The key idea is to identify elements that need nonlinear stabilization (in other words, the action of a limiter). Once identified, these elements are converted into a number of finite volume elements corresponding to the degrees of freedom in the discontinuous basis used to discretize the larger element. Then a nonlinear stabilization is applied to the finite volumes (using monotonicity limiters, WENO, or whatever you like). Once the stabilized solution is found on the temporary finite volumes, the evolved original discontinuous basis is recovered from the finite volume solution. Wow, what an amazingly brilliant idea! This provides a methodology that can retain high sub-element resolution of discontinuous solutions.
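
To make the mechanics concrete, here is a small one-dimensional sketch of the back and forth between a DG element's Legendre modal coefficients and subcell finite-volume averages. The limiter is a trivial placeholder, and the whole thing is my own illustration of the reprojection idea rather than Dumbser's actual scheme.

import numpy as np
from numpy.polynomial import legendre as L

P = 3                            # polynomial degree of the DG element
N = P + 1                        # number of subcells = degrees of freedom
edges = np.linspace(-1.0, 1.0, N + 1)

def modal_to_subcell(coeffs):
    """Exact subcell averages of a Legendre-modal polynomial on [-1, 1]."""
    anti = L.legint(coeffs)                  # antiderivative of the series
    vals = L.legval(edges, anti)
    return np.diff(vals) / np.diff(edges)

# Matrix mapping modal coefficients to subcell averages, built column by column.
M = np.column_stack([modal_to_subcell(np.eye(N)[k]) for k in range(N)])

def subcell_to_modal(averages):
    """Recover modal coefficients whose subcell averages match the FV data."""
    return np.linalg.solve(M, averages)

# Usage sketch on a hypothetical "troubled" element: project to subcells, apply
# a stand-in limiter (here, clipping to assumed bounds), and reproject.
coeffs = np.array([0.5, 0.8, 0.3, 0.4])
ubar = modal_to_subcell(coeffs)
ubar_limited = np.clip(ubar, 0.0, 1.0)       # placeholder for a real FV/WENO update
print(subcell_to_modal(ubar_limited))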

The problem that remains is producing a nonlinear stabilization suitable for production use that goes beyond monotonicity preservation. This was the topic of my talk: how does one move to something better than mere monotonicity preservation as a nonlinear stabilization technique while staying robust enough for production use? We need methods that stabilize solutions physically, but retain accuracy to a larger degree while producing results robustly enough for a production setting. Once such a method is developed, it would improve Dumbser’s method quite easily. A good step forward would be methods that do not damp isolated, well-resolved extrema, and do so in a robust way. Just as first-order methods are the foundation for monotonicity preserving methods, I believe that monotonicity preserving methods can form the basis for extrema-preserving methods.

I often use a quote from Scott Adams to describe the principle for designing high-resolution methods, “Logically all things are created by a combination of simpler less capable components,” pointed out by Culbert Laney in his book Computational Gas Dynamics. The work of Dumbser is a perfect exemplar of this principle in many ways. Here the existing state of the art methods for Gas Dynamics are used as fundamental building blocks for stabilizing the discontinuous Galerkin methods.

Another theme from this meeting is the continued failure of the broader hyperbolic PDE community to quantify errors, or to quantify the performance of the methods used. We fail to do this even when we have an exact solution... Well, this isn’t entirely true. We quantify the errors when we have an exact solution that is continuously differentiable. So when the solution is smooth, we show the order of accuracy. Change the problem to something with a discontinuity, and the quantification always goes away, replaced with hand-waving arguments and expert judgment.

The reason is that the rate of convergence for methods is intrinsically limited to first order with a discontinuity. Everyone then assumes that the magnitude of the error is meaningless in this case. The truth is that the magnitude of the error can differ significantly from method to method, and an array of important details changes the error character of the methods. We have completely failed to report these results as a community. The archetype of this character is Sod’s shock tube, the “hello world” problem for shock physics. We have gotten into the habit of simply showing results for this problem that demonstrate the method is reasonable, but never reporting the error magnitude. The reality is that this error magnitude can vary by a factor of 10 for commonly used methods at the same grid resolution. Even larger variations occur for more difficult problems.
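
The bookkeeping being asked for is not complicated; here is a sketch. The exact solution and the numerical profiles below are hypothetical stand-ins (a single discontinuity plus synthetic first-order-like smearing); the point is simply to report a discrete L1 error and an observed convergence rate for a discontinuous problem instead of a picture.

import numpy as np

def l1_error(x, u_num, exact):
    """Discrete L1 error norm of a numerical solution against an exact one."""
    dx = x[1] - x[0]
    return np.sum(np.abs(u_num - exact(x))) * dx

# Stand-in "exact" density with a single discontinuity at x = 0.6.
exact = lambda x: np.where(x < 0.6, 1.0, 0.125)

errors = {}
for n in (200, 400):
    x = np.linspace(0.0, 1.0, n, endpoint=False) + 0.5 / n
    # Synthetic numerical solution: the exact profile smeared over a few cells.
    u = 0.5 * (1.125 - 0.875 * np.tanh((x - 0.6) * np.sqrt(n) / 2.0))
    errors[n] = l1_error(x, u, exact)
    print(f"n={n:4d}  L1 error = {errors[n]:.3e}")

print(f"observed rate = {np.log2(errors[200] / errors[400]):.2f}")  # about 0.5 here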

The problems with a lack of quantification continue and magnify in more than one dimension. For problems with discontinuities there are virtually no exact solutions in multiple dimensions (genuinely multi-dimensional problems, as opposed to one-dimensional problems run in multiple dimensions). One of the key aspects of multiple dimensions is vorticity. This renders problems chaotic and non-unique, which only amplifies the hand waving and expert statements on methods and their relative virtues. I believe we should be looking for ways to move past this habit and quantify the differences.

This character is in no small way holding back progress. As long as hand waving and expert judgment are the guide for quality, progress and improvements in methods will be hostage to personality instead of letting the scientific method guide choices.

The general issues with quantifying the performance of methods are these: where it is easy, it isn’t done; where it is hard, it isn’t even attempted. The interesting multidimensional problems all have vorticity, and the results are all compared in the infamous eyeball or viewgraph norm. Mixing and vorticity are essential, but never measured. All comparisons are expert-based and rely on hand-waving arguments, so the current experts and their methods will continue to hold sway and progress will wane.

The heart of the excitement at previous meetings, collocated, cell-centered Lagrangian methods, have now become so common and broadly used as to be passé. A number of good talks were given on this class of methods, showing a broad base of progress. It is remarkable that such methods have now perhaps displaced the classical staggered mesh methods originating with Von Neumann as the stock in trade of this community. The constant, iterative progress with these methods awaits the next leap in performance, and the hard work of transitioning to being a workhorse in production solutions. This work is ongoing and may ultimately provide the impetus for the next great leap forward.

Aside from this work, the meeting works within the natural tension between physics, mathematics, engineering and computer science, to its great benefit. In addition to this beneficial tension, the meeting also straddles the practical and pragmatic needs of code developers for production software and university research. Over the years we have witnessed a steady stream of ideas and problems flowing back and forth between these disparate communities. As such, the meeting is a potpourri of variety and perspective, providing many great ideas for moving the overall technical community forward through the creative use of these tensions.

We tend to overvalue the things we can measure and undervalue the things we cannot.
― John Hayes

Now I am looking forward to the next meeting two years hence in Santa Fe.

What Do We Want From Our Labs?

04 Friday Sep 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Some men are born mediocre, some men achieve mediocrity, and some men have mediocrity thrust upon them.

― Joseph Heller


In taking a deep breath and pausing from my daily grind, I considered the question, “What do we want from our Labs?” By Labs I mean the loose collection of largely government-sponsored research institutions supporting everything from national defense to space exploration. By “we,” I mean the Nation and its political, intellectual and collective leadership. Having worked at a Lab for the last 25-plus years under the auspices of the Department of Energy (working for two Labs, actually), I think this question is worth a lot of deep thinking and consideration.

A reasonable conclusion drawn from my experiences over that span of employment would be,

“We want to destroy the Labs as competent entities.”

I’m fairly sure this isn’t the intent of our National Leadership, but rather an outcome of other desires. Otherwise well-intentioned directives that have deeply damaging unintended consequences drive the destruction of the Labs. While calling the directives well-intentioned is probably correct, the mindset driving them is largely a combination of fear, control and unremittingly short-term thinking. Such destruction is largely a consequence of changes in National behavior that sweep across every political and economic dimension. Many of the themes parallel the driving forces of inequality and a host of other societal transformations.

One of the key aspects of the Labs that deeply influenced my own decision to work there was their seemingly enduring commitment to excellence. In the years since, this commitment has wavered and withered to the point where excellence gets lip service and little else. The environment no longer supports the development or maintenance of truly excellent science and engineering (despite frequent protestations of being “World Class”). Our National Labs were once truly World Class, but no more. Decades of neglect, and directives that conflict directly with achieving World Class results, have destroyed any and all claim to such lofty status. We are now part of an acceptance of mediocrity that we willfully ignore while ignorantly claiming to be excellent and world class.

World Class requires a commitment to quality of the sort we cannot muster, because it also requires a degree of independence and vigorous intellectual exchange that our masters can no longer tolerate. So here is my first answer to what they want from the Labs,

“obedience” and “compliance”

above all else. Follow rules and work on what you are told to work on. Neither of these values can ever lead to anything “World Class”. I would submit that these values could only undermine and work to decay everything needed to be excellent.

One place where the fundamental nature of excellence and the governance of the Labs intersect is the expectations on the work done. If one principle dominates the expectations the Labs must comply with, it is

“do not ever ever ever fail at anything”.

The kneejerk response from politicians, the public and the increasingly scandal-mongering media is “that seems like a really good thing.” If they don’t fail, then money won’t be wasted and they’ll just do good work. Wrong and wrong! If we don’t allow failure, we won’t allow success either; the two are intertwined as surely as life and death.

In reality this attitude has been slowly strangling the quality from the Labs. By never allowing failure, we march steadily toward mediocrity and away from excellence. No risks are ever taken, and the environment needed to develop World Class people, research and results is smothered. We see an ever-growing avalanche of accounting and accountability to make sure no money ever gets wasted doing anything that wasn’t pre-approved. Meticulous project planning swallows up any and all professional judgment, risk-taking or opportunity. Breakthroughs are scheduled years in advance, absurd as that sounds.

People forget how fast you did a job – but they remember how well you did it

― Howard Newton

The real and dominant outcome of all this surety of outcomes is a loss of excellence and innovation, and an ever-diminishing quality. The real losers will be the Nation, its security and its economic vitality into the future. Any vibrancy and supremacy we have in economics and security is the product of our science and technology excellence thirty to forty years ago. We are vulnerable to being preyed upon by whoever has the audacity to take risks and pursue the quality we have destroyed.

Another piece of the puzzle is the salaries and benefits of the employees and managers. At one level it is clear that we are paid well, or at least better than the average American. The problem is that in relative terms we are losing ground every year compared to where we once were. On the one hand we are told that we are World Class, yet we are compensated at a market-driven, market-average rate. Therefore we are either incredibly high-minded or the stupidest World Class performers out there. At the same time, management’s compensation has shot up, especially at the top of the Labs. On one hand this is simply an extension of the market-driven approach; our executives are compensated extremely well, though not in comparison to their private industry peers. In sum, the overall picture painted by compensation at the Labs is one of confusion, and of priorities radically out of step with the rhetoric.

All of this would be tragic enough in and of itself were it simply happening at the Labs alone. Instead these trends merely echo a larger chorus of destruction society-wide. We see all our institutions of excellence under a similar assault. Education is being savaged by many of the same forces. Colleges view the education of the next generation as a mere afterthought to balancing their books and filling their coffers. Students are a source of money rather than a mission. One need only look at what universities value; one thing is clear, educating students isn’t their priority. Businesses exist solely to enrich their executives and stockholders; customers and community be damned. Our national infrastructure is crumbling without anyone rising even to fix what already exists. Producing an infrastructure suitable for the current century is a bridge too far. All the while the Country pours endless resources into the Military-Industrial Complex for the purpose of fighting paper tigers and imaginary enemies. Meanwhile the internal enemies of our future are eating us from within and winning without the hint of a struggle.

In the final analysis, what are we losing by treating our Labs with such callousness and disregard? For the most part we are losing the opportunity to innovate, invent and discover the science and technology of the future. In the 25 years following World War II the Labs were an instrumental set of institutions that created the future and continued the discoveries that drove science and technology. The achievements of that era are the foundation of our supremacy in national defense and security, and in economic power. By destroying the Labs as we are doing, we assure that this supremacy will fade into history and that we will be overtaken by other powers. It is suicide by a slow-acting poison.

Quality means doing it right when no one is looking.

― Henry Ford

We have short-circuited the engines of excellence in trying to control risk and demonstrate accountability. The best way to control risk is to not take any, and we have excelled at this. By not taking risks we don’t have to deal with or explain failure in the short term, but in the long term we destroy quality and undermine excellence. Accountability is the same. We know how every dollar is spent, and what work is done every hour of the day, but that money no longer supports serendipity or the professional discretion and judgment that underpinned the monumental achievements of the past. One cannot plan or determine a priori breakthroughs and discoveries that have not yet been made. Now we simply have low-risk and fully accountable Laboratories that can only be described as mediocre.

Such mediocrity provides the Nation with no short-term issues to explain or manage, but it clearly orients our long-term prospects toward a decline in the Nation’s fortunes. We seem to uniformly lack the courage and the ethical regard for our children to stop this headlong march away from excellence. Instead we embrace mediocrity because it is an easy and comfortable short-term choice.

We don’t get a chance to do that many things, and every one should be really excellent. Because this is our life.

― Steve Jobs

The only sin is mediocrity.

― Martha Graham
