Science is not about making predictions or performing experiments. Science is about explaining.
― Bill Gaede
We would be far better off removing the word “predictive” as a focus for science. If we replaced the emphasis on prediction with a focus on explanation and understanding, our science would improve overnight. The sense that our science must predict carries connotations that are unrelentingly counter-productive to the conduct of science. The side-effects of predictivity undermine the scientific method at every turn. The goal of understanding nature and explaining what happens in the natural world is consistent with the conduct of high-quality science. In many respects large swaths of the natural world are unpredictable in highly predictable ways. Our weather is a canonical example: we find the weather to be unpredictable in a bounded manner as time scales become longer. Science focused on understanding and explanation has revealed these truths. Attempting prediction under some circumstances is both foolhardy and technically impossible. As such, prediction needs to be undertaken carefully and thoughtfully under well-chosen circumstances. We also need the freedom to find out that we are wrong and incapable of prediction. Ultimately, we need to find the limits on prediction and work to improve or accept them.
“Predictive Science” is mostly just a buzzword. We put it in our proposals to improve the chances of hitting funding. A slightly less cynical take would treat predictivity as a completely aspirational objective for science. In the context of our current world, we strive for predictive science as a means of confirming our mastery over a scientific subject. In this context the word predictive implies that we understand the science well enough to foresee outcomes. We should also practice some deep humility about what this means. Predictivity is always a limited statement, and these limitations should always be firmly in mind. First, predictions are limited to some subset of what can be measured and fail for other quantities. The question is whether the predictions are correct for what matters. Second, the understanding is always waiting to be disproved by a reality that is more complex than we realize. Good science is acutely aware of these limitations and actively probes the boundary of our understanding.
In the modern world we constantly have new tools to help expand our understanding of science. Among the most important of these new tools is modeling and simulation. Modeling and simulation is simply an extension of the classical scientific approach. Computers allow us to solve our models in science more generally than classical means permit. This has increased the importance and role of models in science. With computational solutions we can envision more complex models having more general solutions. Part of this power comes with substantial responsibility; computational simulations are highly technical and difficult. They come with a host of potential flaws, errors and uncertainties that cloud results and need focused assessment. Getting the science of computation correct, and assessing it well enough to play a significant role in the scientific enterprise, requires a broad multidisciplinary approach with substantial rigor. Verification and validation (V&V) plays the broad integrating role in predictive science. In a nutshell V&V is the scientific method as applied to modeling and simulation. Its outcomes are essential for making any claims regarding how predictive your science is.
Experiment is the sole source of truth. It alone can teach us something new; it alone can give us certainty.
― Henri Poincaré
We can take a moment to articulate the scientific method and then restate it in a modern context using computational simulation. The scientific method involves making hypotheses about the universe and testing those hypotheses against observations of the natural world. One of the key ways to make observations is experiments, where the measurements of reality are controlled and focused to elucidate nature more clearly. These hypotheses or theories usually produce models of reality, which take the form of mathematical statements. These models can be used to make predictions about what an observation will be; agreement then supports the hypothesis. If the observations are in conflict with the model’s predictions, the hypothesis and model need to be discarded or modified. Over time observations become more accurate, often exposing the flaws in models. This usually means a model needs to be refined rather than thrown out. This process is the source of progress in science. In a sense it is a competition between what we observe and how well we observe it, and the quality of our models of reality. Predictions are the crucible where this tension is realized.
The quest for absolute certainty is an immature, if not infantile, trait of thinking.
― Herbert Feigl
One of the best ways to understand how to do predictive science in the context of modeling and simulation is a simple realization: V&V is basically a methodology that encodes the scientific method into modeling and simulation. All of the content of V&V is assuring that science is being done with a simulation and we aren’t fooling ourselves. Verification comes in two halves. The first half is making sure the implementation of the model and its solution are credible and correct. The second half is estimating the errors in the numerical solution of the model. We need to assess the numerical uncertainty and the degree to which it clouds the model’s solution.
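A minimal sketch of that second half, under invented assumptions: solve a problem with a known exact solution at two resolutions and check that the observed order of convergence matches the method’s design order. Forward Euler on a toy ODE stands in for a real code; all names here are illustrative.

```python
import math

# Solution verification sketch (toy example): solve dy/dt = -y, y(0) = 1,
# with forward Euler and measure the observed order of accuracy by
# comparing errors at two step sizes. Forward Euler is first order,
# so the observed order should come out close to 1.
def euler_error(h, t_end=1.0):
    y = 1.0
    for _ in range(round(t_end / h)):
        y += h * (-y)                 # forward Euler step for y' = -y
    return abs(y - math.exp(-t_end))  # error against the exact solution

e_coarse = euler_error(0.1)
e_fine = euler_error(0.05)
# Error behaves like C*h^p, so halving h gives p = log2(e_coarse / e_fine).
observed_order = math.log(e_coarse / e_fine, 2)
print(f"observed order of accuracy: {observed_order:.3f}")
```

If the observed order falls well below the design order, something is wrong with the implementation or the problem setup, which is exactly the evidence verification is meant to produce.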
Validation is then the structured comparison of the simulated model’s solution with observations. Validation is not something that is completed; rather it is an ongoing assessment. At the end of the validation process evidence has been accumulated as to the state of the model. Is the model consistent with the observations? The uncertainties in the modeling and simulation process, along with the uncertainties in the observations, determine whether we can conclude the model is correct enough to be used. In many cases the model is found to be inadequate for the purpose and needs to be modified or changed completely. This process is simply the hypothesis testing so central to the conduct of science.
Since all models are wrong the scientist cannot obtain a “correct” one by excessive elaboration. On the contrary following William of Occam he should seek an economical description of natural phenomena. Just as the ability to devise simple but evocative models is the signature of the great scientist so overelaboration and overparameterization is often the mark of mediocrity.
― George Box
Now it would be very remarkable if any system existing in the real world could be exactly represented by any simple model. However, cunningly chosen parsimonious models often do provide remarkably useful approximations. For example, the law PV = RT relating pressure P, volume V and temperature T of an “ideal” gas via a constant R is not exactly true for any real gas, but it frequently provides a useful approximation and furthermore its structure is informative since it springs from a physical view of the behavior of gas molecules.
― George Box
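Box’s point can be made concrete by comparing the ideal-gas pressure with a van der Waals correction for a real gas. The constants below are approximate textbook values for CO2 and the state point is arbitrary; treat this as an illustrative sketch, not a reference calculation.

```python
# Ideal gas law P = RT/V versus the van der Waals equation
# P = RT/(V - b) - a/V^2, for one mole of CO2 (approximate constants).
R = 0.08206            # gas constant, L·atm/(mol·K)
a, b = 3.592, 0.04267  # van der Waals a (L^2·atm/mol^2) and b (L/mol) for CO2

T = 300.0  # temperature, K (arbitrary state point)
V = 1.0    # molar volume, L/mol

p_ideal = R * T / V                 # the "wrong but useful" model
p_vdw = R * T / (V - b) - a / V**2  # a less-wrong model of the same gas

rel_diff = abs(p_ideal - p_vdw) / p_vdw
print(f"ideal: {p_ideal:.2f} atm, van der Waals: {p_vdw:.2f} atm, "
      f"relative difference: {rel_diff:.1%}")
```

At this fairly dense state the two models disagree by roughly ten percent: the ideal gas law is neither exact nor useless, which is precisely Box’s point.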
The George Box maxim that all models are wrong, but some are useful is key in the conduct of V&V. It is also central to modeling and simulation’s most important perspective, the constant necessity for improvement. Every model is a mathematical abstraction that has limited capacity for explaining nature. At the same time the model may have a utility sufficient for explaining everything we can measure. This does not mean that the model is right, or perfect; it means the model is adequate. The creative tension in science is the narrative arc of refining hypotheses and models of reality, or improving measurements and experiments to more acutely test the models. V&V is a process for achieving this end in computational simulations. Our goal should always be to find inadequacy in models and so define the demand for improvement. If we do not have the measurements to demonstrate a model’s incorrectness, the experiments and measurements need to improve. All of this serves progress in science in a clear manner.
The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
–Sir Arthur Stanley Eddington
Let’s take a well-regarded and highly accepted model, the incompressible Navier-Stokes equations. This model is thought to largely contain the proper physics of fluid mechanics, most notably turbulence. Perhaps this is true, although our lack of progress in turbulence might indicate that something is amiss. I will state without doubt that the incompressible Navier-Stokes equations are wrong in some clear and unambiguous ways. The deepest problem with the model is incompressibility itself. Incompressible fluids do not exist, and the form of the mass equation, a divergence-free velocity field, implies several deeply unphysical things. All materials in the universe are compressible and support sound waves; this constraint denies that truth. Incompressible flow is largely divorced from thermodynamics, and materials are thermodynamic. The system of equations violates causality rather severely: pressure signals travel at infinite speed. All of this is true, but at the same time this system of equations is undeniably useful. There are large categories of fluid physics that it explains quite remarkably. Nonetheless the equations are also obviously unphysical. Whether or not this unphysical character is consequential is something people should keep in mind.
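For reference, the system in question is, in standard notation with velocity u, pressure p, density ρ and viscosity μ:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\rho\left(\frac{\partial \mathbf{u}}{\partial t}
  + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u}.
```

Note that the divergence-free constraint carries no time derivative: the pressure acts as a Lagrange multiplier enforcing it instantaneously over the whole domain, which is the infinite signal speed and the break with causality described above.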
It is impossible to trap modern physics into predicting anything with perfect determinism because it deals with probabilities from the outset.
― Arthur Stanley Eddington
In conducting predictive science one of the most important things you can do is make a prediction. While you might start with something where you expect the prediction to be correct (or correct enough), the real learning comes from making predictions that turn out to be wrong. It is wrong predictions that will teach you something. Sometimes the thing you learn is something about your measurement or experiment that needs to be refined. At other times the wrong prediction can be traced back to the model itself. This is your demand and opportunity to improve the model. Is the difference due to something fundamental in the model’s assumptions? Or is it simply something that can be fixed by adjusting the closure of the model? Too often we view failed predictions as problems when instead they are opportunities to improve the state of affairs. I might posit that even if you succeed with a prediction, it is a call to improvement: either improve the measurement and experiment, or the model. Experiments should set out to expose flaws in the models. When flaws are exposed, the model needs to be improved. Successful predictions are simply not vehicles for improving scientific knowledge; they tell us we need to pose harder tests.
When the number of factors coming into play in a phenomenological complex is too large scientific method in most cases fails. One need only think of the weather, in which case the prediction even for a few days ahead is impossible.
― Albert Einstein
In this context we can view predictions as things that, at some level, we want to fail at. If the prediction is too easy, the experiment is not sufficiently challenging. Success and failure exist on a continuum. For simple enough predictions our models will always work, and for complex enough predictions, the models will always fail. The trick is finding the spot where the predictions are on the edge of credibility, and progress is needed and ripe. Too often the mindset is taken that predictions need to be successful. An experiment that is easy to predict is not a success, it is a waste. I would rather see predictions focused at the edge of success and failure. If we are interested in making progress, predictions need to fail so that models can improve. By the same token a successful prediction indicates that the experiment and measurement need to be improved to more properly challenge the models. The real art of predictive science is working at the edge of our predictive modeling capability.
A healthy focus on predictive science with a taste for failure produces a strong driver lubricating the scientific method and successfully integrating modeling and simulation as a valuable tool. Prediction requires two sides of science to work in concert: the experiment and observation of the natural world, and the modeling of the natural world via mathematical abstraction. The better the observations and experiments, the greater the challenge to models. Conversely, the better the model, the greater the challenge to observations. We need to tee up the tension between how we sense and perceive the natural world, and how we understand that world through modeling. It is important to examine where the ascendancy in science lies. Are the observations too good for the models? Or can no observation challenge the models? The answer tells us clearly where we should prioritize.
We need to understand where progress is needed to advance science. We need to take advantage of technology in moving ahead in either vein. If observations are already quite refined, but new technology exists to improve them, it behooves us to take advantage of it. By the same token modeling can be improved via new technology such as solution methods, algorithmic improvements and faster computers. What is lacking from the current dialog is a clear focus on where the imperative for progress exists. Part of integrating predictive science well is determining where progress is most needed. We can bias our efforts toward where progress is most needed while keeping other opportunities for improvement in mind.
The important word I haven’t mentioned yet is “uncertainty.” We cannot have predictive science without dealing with uncertainty and its sources. In general, we systematically, perhaps even pathologically, underestimate how uncertain our knowledge is. We like to believe our experiments and models are more certain than they actually are. This is really easy to do in practice. For many categories of experiments, we ignore sources of uncertainty and simply get away with an estimate of zero for that uncertainty. If we do a single experiment, we never have to explicitly confront the fact that the experiment isn’t completely reproducible. On the modeling side we treat the particular experiment as something to be modeled precisely even if the phenomena of interest are highly variable. This is common and a source of willful cognitive dissonance. Rather than confront this rather fundamental uncertainty, we willfully ignore it. We do not run replicate experiments and measure the variation in results. We do not subject the modeling to reasonable variations in the experimental conditions and check the variation in the results. We pretend that the experiment is completely well-posed, and the model is too. In doing this we fail at the scientific method rather profoundly.
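The replicate point can be made concrete with a toy sketch (every number below is invented): a single measurement carries an implicit uncertainty of zero, while a handful of replicates immediately reveals a nonzero spread.

```python
import statistics

# Hypothetical replicate measurements of the same nominal experiment
# (numbers invented purely for illustration).
replicates = [9.7, 10.4, 9.9, 10.8, 10.1]

single = replicates[0]  # what we see if the experiment is done exactly once

# One sample gives no way to estimate spread at all; with replicates the
# scatter is immediately visible and quantifiable.
mean = statistics.mean(replicates)
stdev = statistics.stdev(replicates)     # sample standard deviation
stderr = stdev / len(replicates) ** 0.5  # standard error of the mean

print(f"mean = {mean:.2f}, stdev = {stdev:.2f}, standard error = {stderr:.2f}")
print(f"a single experiment reports {single} with an implicit uncertainty of zero")
```

The moment the second replicate exists, the reported uncertainty jumps from zero to something honest, which is exactly why replicates are avoided.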
Another key source of uncertainty is numerical error. It is still common to present results without any sense of the numerical error. Typically, the mesh used for the calculation is asserted to be fine enough without any evidence. More commonly the results are simply given without any comment at all. At the same time the nation is investing huge amounts of money in faster computers, an investment that implicitly assumes, a priori, that faster computers yield better solutions. This entire dialog often proceeds without any support from evidence. It is 100% assumption. When one examines these issues directly there is often a large amount of numerical error being ignored. Numerical error is small in simple problems without complications. For real problems with real geometry, real boundary conditions and real constitutive models, the numerical errors are invariably significant. One should expect some evidence to be presented regarding their magnitude, and you should be suspicious if it’s not there. Too often we simply give simulations a pass on this detail and fail at due diligence.
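One standard way to produce that evidence is Richardson extrapolation: compute the same quantity on two meshes and use the known order of the method to estimate the remaining error. A minimal sketch, with the trapezoidal rule standing in for a full simulation; the integrand and interval are arbitrary stand-ins.

```python
import math

# Richardson-style error estimate (sketch). The trapezoidal rule is second
# order, so for results f_h and f_h2 on "meshes" of spacing h and h/2 the
# error in f_h2 is approximately (f_h2 - f_h) / (2**p - 1) with p = 2.
def trapezoid(f, lo, hi, n):
    h = (hi - lo) / n
    total = 0.5 * (f(lo) + f(hi))
    for i in range(1, n):
        total += f(lo + i * h)
    return h * total

f_h = trapezoid(math.sin, 0.0, math.pi, 8)    # coarse "mesh"
f_h2 = trapezoid(math.sin, 0.0, math.pi, 16)  # refined "mesh"

p = 2  # formal order of accuracy of the method
error_estimate = abs(f_h2 - f_h) / (2**p - 1)
true_error = abs(f_h2 - 2.0)  # exact integral of sin on [0, pi] is 2

print(f"estimated error: {error_estimate:.5f}, true error: {true_error:.5f}")
```

In a real simulation the exact answer is unavailable, but the two-mesh estimate is still computable, and presenting it is precisely the due diligence argued for above.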
Truth has nothing to do with the conclusion, and everything to do with the methodology.
― Stefan Molyneux
In this sense the entirety of V&V is a set of processes for collecting evidence about credibility and uncertainty. In one respect verification is mostly an exercise in collecting evidence of credibility and due diligence for quality in computational tools. Are the models, codes and methods implemented in a credible and high-quality manner? Has the code development been conducted carefully, with the developers checking their work and doing a reasonable job of producing code without obvious bugs? Validation can be characterized as collecting uncertainties. We find upon examination that many uncertainties are ignored in both computational and experimental work. Without these uncertainties and the evidence surrounding them, the entire practice of validation is untethered from reality. We are left to investigate through assumption and supposition. This sort of validation practice has a tendency to simply regress to commonly accepted notions. In such an environment models are usually accepted as valid and evidence is often skewed toward that preordained conclusion. Without care and evidence, the engine of progress for science is disconnected.
In this light we can see that V&V is simply a structured way of collecting the evidence necessary for the scientific method. Collecting this evidence is difficult and requires assumptions to be challenged. Challenging assumptions is courting failure. Making progress requires failure and the invalidation of models. It requires doing experiments that we fail to predict with existing models. We need to assure that the model is the problem, and that the failure isn’t due to numerical error. Determining these predictive failures requires a good understanding of uncertainty in both experiments and computational modeling. The higher the genuine quality of the experimental work, the more stringently the validation tests the model. We can then collect evidence about the correctness of the model and set clear standards for judging improvements to it. The same goes for the uncertainty in computations, which needs evidence so that progress can be measured.
It doesn’t matter how beautiful your theory is … If it doesn’t agree with experiment, it’s wrong.
― Richard Feynman
Now we get to the rub in the context of modeling and simulation in modern predictive science. To make progress we need to fail to be predictive. In other words, we need to fail in order to succeed. Success should be denoted by making progress in becoming more predictive. We should take the perspective that predictivity is a continuum, not a state. One of the fundamental precepts of stockpile stewardship is predictive modeling and simulation. We want confident and credible evidence that we are capable of faithfully predicting certain essential aspects of reality. The only way to succeed at this mission is to continually challenge and push ourselves at the limit of our capability. This means that failure should be an almost constant state of being. The problem is projecting the sense of success that society demands while continually failing. We do not do this well. Instead we feel compelled to project a sense that we continually succeed at everything we promise.
In the process we create conditions where the larger goal of prediction is undermined at every turn. Rather than define success in terms of real progress, we produce artificial measures of success. A key to improving this state of affairs is an honest assessment of all of our uncertainties, both experimental and computational. There are genuine challenges to this honesty. Generally, the more work we do, the more uncertainty we unveil. This is true of experiments and computations. Think about examining replicate uncertainty in complex experiments. In most cases the experiment is done exactly once, and the prospect of reproducing the experiment is completely avoided. As soon as replicate experiments are conducted the uncertainty becomes larger. Before the replicates, this uncertainty was simply zero and no one challenged this assertion. If we go back and adjust our past assessments based on current knowledge, we run the very real risk of looking like we are moving backwards. The answer is not to continue this willful ignorance but to offer a mea culpa and admit our former shortcomings. These mea culpas are similarly avoided, thus backing the forces of progress into an ever-tighter corner.
The core of the issue is relentlessly psychological. People are uncomfortable with uncertainty and want to believe things are certain. They are uncomfortable with random events, and a sense of determinism is comforting. As such, modeling reflects these desires and beliefs. Experiments are similarly biased toward these beliefs. When we allow these beliefs to go unchallenged, the entire basis of scientific progress becomes unhinged. Confronting and challenging these comforting implicit assumptions may be the single most difficult task for predictive science. We are governed by assumptions that limit our actual capacity to predict nature. Admitting flaws in these assumptions and measuring how much we don’t know is essential for creating the environment necessary for progress. The fear of saying, “I don’t know” is our biggest challenge. In many respects we are managed to never give that response. We need to admit what we don’t know and challenge ourselves to seek those answers.
Only a few centuries ago, a mere second in cosmic time, we knew nothing of where or when we were. Oblivious to the rest of the cosmos, we inhabited a kind of prison, a tiny universe bounded by a nutshell.
How did we escape from the prison? It was the work of generations of searchers who took five simple rules to heart:
- Question authority. No idea is true just because someone says so, including me.
- Think for yourself. Question yourself. Don’t believe anything just because you want to. Believing something doesn’t make it so.
- Test ideas by the evidence gained from observation and experiment. If a favorite idea fails a well-designed test, it’s wrong. Get over it.
- Follow the evidence wherever it leads. If you have no evidence, reserve judgment.
And perhaps the most important rule of all…
- Remember: you could be wrong. Even the best scientists have been wrong about some things. Newton, Einstein, and every other great scientist in history — they all made mistakes. Of course they did. They were human.
Science is a way to keep from fooling ourselves, and each other.
― Neil deGrasse Tyson
In today’s world, from work to private life to public discourse, experts are receding in importance. They used to be respected voices who added deep knowledge to any discussion; not any more. Time and time again experts are being rejected by the current flow of events. Experts are messy and bring painful reality into focus. With the Internet, Facebook and the manufactured reality they allow, it’s just easier to dispense with the expert. One can replace the expert with a more comforting and simpler narrative. One can provide a politically tuned narrative that is framed to support an objective. One can simply take a page from the legal world and hire their own expert. The expert is a pain to control, and expertise is expensive. Today we can just make shit up and it’s just as credible as the truth, and much less trouble to manage. Today our management culture with its marketing focus has no time for facts and experts to cloud matters. Why deal with the difficulties that reality offers when you can wish them away? The pitch for money is much cleaner without objective reality to make things hard. Since quality really doesn’t matter anyway, no one knows the difference. We live in the age of bullshit and lies. The expert is obsolete.
All of these horrors have been slowly dawning on me while seeing our broader world begin to go up in flames. The evening news is a cascade of ever more surreal and unbelievable events. The news has become absolutely painful to watch. A big part of this horrible discourse is the chant of “fake news” and the reality of it. The problems with fake news permeate the discourse across society. Science and scientific experts are no different. A lack of confidence and credibility in the sources of information is a broad problem. Unless the system values integrity, quality and truth, they will fade from view. Increasingly, the system values none of these things and we are getting their opposites. Experts act as gatekeepers of integrity, quality and truth. As such they are to be pushed out of the way as impediments to success. The simple, politically crafted message that comforts those with a certain point of view is welcomed by the masses. The messy objective reality with its subtle shadings and complexity is something people would rather not examine.
One of the clearest characteristics of our current research environment is the dominance of money. This merely mirrors the role of money in society at large. Money has become the one-size-fits-all measuring stick for science. This includes the view of the quality of science. If something gets a lot of money, it must be good. Quality is defined by budget. This shallow mindset is incredibly corrupting, all the way from the sort of Labs where I work to universities and everything in between. Among the corrupting influences is the tendency for the promotion of science to morph into pure marketing. Science is increasingly managed as a marketing problem, and quality is equated with its potential for being flashy. In the wake of this attitude is a loss of focus on the basics and fundamentals of managing research quality.
Money is a tool, just like a screwdriver, or a pencil, or a gun. We have lost sight of this fact. Money has become a thing unto itself and replaced the value it represents as an objective. Along the way the principles that should be attached to the money have also been scuttled. This entire ethos has infected society from top to bottom, with the moneyed interests at the top lording over those without money. Our research institutions are naturally a focused reflection of these societal trends. They have a similar social stratification and a general loss of collective purpose and identity. Managers have become the most important thing, superseding science or mission in priority. Our staff are simply necessary details and utterly replaceable, especially when quality is an exercise in messaging. Expertise is a nuisance, and expert knowledge something that only creates problems. This environment is tailored to a recession of science, knowledge and intellect from public life. This is exactly what we see in every corner of our society. In its place reign managers and the money they control. Quality and excellence are meaningless unless they come with dollars attached. This is our value system; everything is for sale.
The result of the system we have created is research quality in virtual freefall. The technical class has become part of the general underclass whose well-being is not the priority of this social order. Part of the rise of the management elite as the identity of organizations is driven by this focus on money. Managers look down into organizations for glitzy marketing ammo to help the money flow. The actual quality and meaning of the research is without value unless it comes with lots of money. Send us your slide decks, and especially those beautiful colorful graphics and movies. Those things sell the program and get the money in the door. That is what we are all about, selling to the customer. The customer is always right, even when they are wrong, as long as they have the cash. The program’s value is measured in dollars. Truth is measured in dollars, and available for purchase. We are obsessed with metrics, and organizations far and wide work hard to massage them to look good. Things like peer review are to be managed and generally can be politicked into something that makes organizations look good. In the process every bit of ethics and integrity can be squeezed out. These managers have rewritten the rules to make this all kosher. They are clueless about how corrosive and damaging all of this is to the research culture.
Nothing reflects these trends more starkly than our highest leadership today. We are not led by people with integrity, ethics or basic competence. The United States has installed a rampant symptom of corruption and incompetence in its highest office. Trump is not the problem; he is the symptom of the issue. He may become a bigger problem if allowed to reign too long; he can become a secondary infection. He exemplifies every single issue we have with ethics, integrity and competence to an almost cartoonish magnitude. Donald Trump is the embodiment of every horrible boss you’ve ever had, amplified to an unimaginable degree. He is completely and utterly unfit for the job of President, whether measured by intellect, demeanor, ethics, integrity or philosophy. He is pathologically incurious. He is a rampant narcissist whose only concern is himself. He is lazy and incompetent. He is likely a career white collar criminal who has used money and privilege to escape legal consequences. He is a gifted grifter and conman (whose greatest con is getting this office). He has no governing philosophy or moral compass. He is a racist, bigot and serial abuser of women.
In a nutshell, Donald Trump is someone you never want to meet and someone who should never wield the power of his current office. You don’t want him as your boss; he will make your life miserable and throw you under the bus if it suits him. He is a threat to our future both physically and morally. In the context of this discussion he is the exemplar of what ails the United States, including the organizations that conduct research. He stands as the symbol of what the management class represents. He is decay. He is incompetence. He is a pathological liar. He is worthy of no respect or admiration save his ability to fool millions. He is the supremacy of marketing over substance. He has no idea how completely his mantra “make America great again” is undermined by his every breath. His rise to power is the clearest example of how our greatness as a nation has been lost, and his every action accelerates our decline. People across the world have lost faith in the United States for good reason. Any country that elected this moronic, unethical conman as leader is hard to trust. No one symbolizes our fall from greatness more completely than Donald Trump as President.
The long-term impact could well be catastrophic. We can only fake it for so long before it catches up with us. We can only allow our leadership to demonstrate such radical disregard for those they lead for so long. The lack of integrity, ethics and morality from our leadership, even when approved by society, will create damage that our culture cannot sustain. Even measured through the faulty lens of money, the problems are obvious. Money has been flowing steadily into the pockets of the very rich and the management class, and away from societal investment. We have been starving our infrastructure for decades. Our roads are awful, and bridges will collapse. 21st Century infrastructure is a pipe dream. Our investments in research and development have been declining in the same time frame, sacrificed for short-term profit. At the same time the wealth of the rich has grown, and inequality has become profound and historically unprecedented. These trends are completely correlated. This correlation is not incidental; it reflects a change in the priorities of society to favor wealth accumulation. The decline of research is simply another symptom.
After monotonicity-preserving methods came along and revolutionized the numerical solution of hyperbolic conservation laws, people began pursuing follow-on breakthroughs. So far nothing has appeared that constitutes a real breakthrough, although progress has been made. There are some very good reasons for this, and understanding them helps us see how and where progress might be made. As I noted several weeks ago in the blog post about Total Variation Diminishing methods, the breakthrough with monotonicity-preserving methods came in several stages. The methods were invented by practitioners who were solving difficult practical problems. This process drove the innovation in the methods. Once the methods received significant notice as a breakthrough, the math came along to bring the methodology into rigor and explanation. The math produced a series of wonderful connections to theory that gave the results legitimacy, and the theory also connected the methods to the earlier methods dominating the codes at that time. People were very confident about the methods once math theory was present to provide structural explanations. With essential non-oscillatory (ENO) methods, the math came first. This is the very heart of the problem.
As I noted, the monotonicity-preserving methods came along first, and total variation theory followed to make them rigorous and tie them to solid mathematical expectations. Before this the monotonicity-preserving methods felt sort of magical and unreliable. The math solidified the hold of these methods and allowed people to trust the results they were seeing. With ENO, the math came first, with a specific mathematical intent expressed by the methods. The methods were not created to solve hard problems, although they had some advantages for some hard problems. This created a number of issues that these methods could not overcome. First and foremost was fragility, followed by a lack of genuine efficacy. The methods would tend to fail when confronted with real problems and didn’t give better results for the same cost. More deeply, the methods didn’t have the pedigree of doing something amazing that no one had seen before. ENO methods had no pull.
ENO methods were devised to move the field ahead. ENO took the adaptive discrete representation to new heights. Aside from the “adaptive” aspect, the new method was a radical departure from those that preceded it. The math itself was mostly notional and fuzzy, lacking a firm connection to the preceding work. If you had invested in TVD methods, the basic machinery you used had to be completely overhauled for ENO. The method also came with very few guarantees of success. Finally, it was expensive, and suffered from numerous frailties. It was a postulated exploration of interesting ideas, but in the mathematical frame, not the application frame. Its development also happened at the time when applied mathematics began to abandon applications in favor of a more abstract and remote connection via packaged software.
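The “adaptive” core of ENO is stencil selection: at each cell the scheme measures local smoothness with divided differences and builds its interpolant from the smoothest available stencil. Here is a minimal second-order sketch of that idea, purely illustrative and not any particular production implementation:

```python
def eno2_slope(u, i):
    """Second-order ENO slope for cell i: pick whichever one-sided
    difference is smaller in magnitude, i.e. the smoother stencil.
    This avoids differencing across a discontinuity."""
    dl = u[i] - u[i - 1]      # left-biased difference
    dr = u[i + 1] - u[i]      # right-biased difference
    return dl if abs(dl) < abs(dr) else dr

# Near a jump, the flat side is selected and the slope stays zero,
# so the reconstruction does not overshoot.
u = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(eno2_slope(u, 2))
```

Higher-order ENO repeats this choice hierarchically over nested stencils, and that data-dependent branching is exactly the machinery that made the methods expensive and, in practice, fragile.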
ENO and WENO methods were advantageous for a narrow class of problems, usually those having a great deal of fine-scale structure. At the same time, they were not a significant (or any) improvement over the second-order accurate methods that dominate the production codes for the broadest class of important application problems. It’s reasonable to ask what might have been done differently to produce a more effective outcome. One of the things that hurt the broader adoption of ENO and WENO methods is the increasingly impenetrable code base, where large modifications are nearly impossible, as we create a new generation of legacy codes (retaining the code base).
When I got my first job out of school it was at Los Alamos, home of one of the greatest scientific institutions in the World. This Lab birthed the Atomic Age and changed the World. I went there to work, but also to learn and grow in a place where science reigned supreme and technical credibility really and truly mattered. Los Alamos did not disappoint at all. The place lived and breathed science, and I was bathed in knowledge and expertise. I can’t think of a better place to be a young scientist. Little did I know that the era of great science and technical superiority was drawing to a close. The place that welcomed me with so much generosity of spirit was dying. Today it is a mere shell of its former self, along with Laboratories strewn across the country whose former greatness has been replaced by rampant mediocrity, pathetic leadership and a management class that rules this decline. Money has replaced achievement, integrity and quality as the lifeblood of science. Starting with a quote by Feynman is apt because the spirit he represents so well is the very thing we have completely beat out of the system.
The days of technical competence and scientific accomplishment are over. This foundation for American greatness has been overrun by risk aversion, fear and compliance with a spirit of commonness. I use the word “greatness” with gritted teeth because of the perversion of its meaning by the current President. This perversion is acute in the context of science because he represents everything that is destroying the greatness of the United States. Rather than “making America great again” he is accelerating every trend that has been eroding the foundation of American achievement. The management he epitomizes is the very thing that is the blunt tool bludgeoning American greatness into a bloody pulp. Trump’s pervasive incompetence masquerading as management expertise will surely push numerous American institutions further over the edge into mediocrity. His brand of management is all too prevalent today and utterly toxic to quality and integrity.
“butthead cowboys” and keep them from fucking up. Put differently, the management is there to destroy any individuality and make sure no one ever achieves anything great because no one can take a risk sufficient to achieve something miraculous. Anyone expressing individuality is a threat and needs to be chained up. We replaced stunning World class technical achievement with controlled staff, copious reporting, milestone setting, project management and compliance all delivered with mediocrity. This is bad enough by itself, but for an institution responsible for maintaining our nuclear weapons stockpile, the consequences are dire. Los Alamos isn’t remotely alone. Everything in the United States is being assaulted by the arrayed forces of mediocrity. It is reasonable to ask whether the responsibilities the Labs are charged with continue to be competently achieved.
All of this is now blazoned across the political landscape with an inescapable sense that America’s best days are behind us. The deeply perverse outcome of the latest National election is a president who is a cartoonish version of a successful manager. We have put our abuser and a representative of the class that has undermined our Nation’s true greatness in the position of restoring that greatness. What a grand farce! Every day produces evidence that the current efforts toward restoring greatness are using the very things undermining it. The level of irony is so great as to defy credulity. The current administration’s efforts are the end point of a process that started over 20 years ago, obliterating professional government service and hollowing out technical expertise in every corner. The management class that has arisen in their place cannot achieve anything but moving money and people. Their ability to create the new and wonderful foundation of technical achievement is absent.
from the beginning. In a very real way the bullshit science of Star Wars was a trail blazer for today’s rampant scientific charlatans. Rather than give science free rein to seek breakthroughs along with the inevitable failures, society suddenly sought guaranteed achievement at a reduced cost. In reality it got neither achievement nor economized results. With the flow of money being equated to quality as opposed to results, the combination has poisoned science.
the dominant factor in every decision. Since the managers are the gatekeepers for funding, they have uprooted technical achievement and progress as the core of organizational identity. It is no exaggeration to say that the dominance of financial concerns is tied to the ascendency of management and the decline of technical work. At the same time the desire for assured results produced a legion of charlatans who began to infest the research establishment. This combination has produced the corrosive effect of reducing the integrity of the entire system, where money rules and results can be finessed or outright fabricated. Standards are so low now that it doesn’t really matter.
One of the key trends impacting our government funded Labs and research is the languid approach to science by the government. Spearheading this systematic decline in support is the long-term Republican approach to starving government that really took the stage in 1994 with the “Contract with America”. Since that time the funding for science has declined in real dollars along with a decrease in the support for professionalism by those in government. Over time salaries and the level of professional management have been under siege as part of an overall assault on governing. A compounding effect has been an ever-present squeeze on the rules related to conducting science. On the one hand we are told that the best business practices will be utilized to make science more efficient. Simultaneously, best practices in support for science have been denied us. The result is no efficiency along with no best practices and simply a decline in overall professionalism for the Labs. All of this deeply compounds the overall decline in support for research.
Meetings. Meetings. Meetings. Meetings suck. Meetings are awful. Meetings are soul-sucking time wasters. Meetings are a good way to “work” without actually working. Meetings absolutely deserve the bad rap they get. Most people think that meetings should be abolished. One of the most dreaded workplace events is a day that is completely full of meetings. These days invariably feel like complete losses, draining all productive energy from what ought to be a day full of promise. I say this as an unabashed extrovert, knowing that the introvert is going to feel overwhelmed by the prospect.
If there is one thing that unifies people at work, it is meetings, and how much we despise them. Workplace culture is full of meetings and most of them are genuinely awful. Poorly run meetings are a veritable plague in the workplace. Meetings are also an essential human element in work, and work is a completely human and social endeavor. A large part of the problem is the relative difficulty of running a meeting well, which exceeds the talent and will of most people (managers). It is actually very hard to do this well. We have now gotten to the point where all of us almost reflexively expect a meeting to be awful and plan accordingly. For my own part, I take something to read, or my computer to do actual work, or the old stand-by of passing time (i.e., fucking off) on my handy dandy iPhone. I’ve even resorted to the newest meeting pastime of texting another meeting attendee to talk about how shitty the meeting is. All of this can be avoided by taking meetings more seriously and crafting time that is well spent. If this can’t be done, the meeting should be cancelled until the time can be well spent.


Conferences, talks and symposia. This is a form of meeting that generally works pretty well. The conference has a huge advantage as a form of meeting. Time spent at a conference is almost always time well spent. Even at their worst, a conference should be a banquet of new information and exposure to new ideas. Of course, they can be done very poorly, and the benefits can be undermined by poor execution and lack of attention to detail.
Conversely, a conference’s benefits can be magnified by careful and professional planning and execution. One way to augment a conference significantly is to find really great keynote speakers to set the tone, provide energy and engage the audience. A thoughtful and thought-provoking talk delivered by an expert who is a great speaker can propel a conference to new heights and send people away with renewed energy. Conferences can also go to greater lengths to make the format and approach welcoming to greater audience participation, especially getting the audience to ask questions and stay awake and aware. It’s too easy to tune out these days with a phone or laptop. Good time keeping and attention to the schedule is another way of making a conference work to the greatest benefit. This means staying on time and on schedule. It means paying attention to scheduling so that the best talks don’t compete with each other if there are multiple sessions. It means not letting speakers filibuster through the Q&A period. All of these maxims hold for a talk given in the work hours, just on a smaller and specific scale. There the setting, the time of the talk and the time keeping all help to make the experience better. Another hugely beneficial aspect of meetings is food and drink. Sharing food or drink at a meeting is a wonderful way for people to bond and seek greater depth of connection. This sort of engagement can help to foster collaboration and greater information exchange. It engages the innate human social element that meetings should foster (I will note that my workplace has mostly outlawed food and drink, helping to make our meetings suck more uniformly). Too often the aspects of a talk or conference that would make the great expense of people’s time worthwhile are skimped on, undermining and diminishing the value.
Reviews. A review meeting is akin to a project meeting, but has an edge that makes it worse. Reviews often teem with political context and fear. A common form involves a project team, reviewers and then stakeholders. The project team presents work to the reviewers, and if things are working well, the reviewers ask lots of questions. The stakeholders sit nervously and watch, rarely participating. The spirit of the review is the thing that determines whether the engagement is positive and productive. The core values around which a review revolves are honesty and trust. If honesty and trust are high, those being reviewed are forthcoming and their work is presented in a way where everyone learns and benefits. If the reviewers are confident in their charge and role, they can ask probing questions and provide value to the project and the stakeholders. Under the best of circumstances, the audience of stakeholders can be profitably engaged in deepening the discussion, and themselves learn greater context for the work. Too often, the environment is so charged that honesty is not encouraged, and the project team tends to hide unpleasant things. If reviewers do not trust the reception for a truly probing and critical review, they will pull their punches and the engagement will be needlessly and harmfully moderated. A sign that neither trust nor honesty is present comes from an anxious and uninvolved audience.

Better meetings are a mechanism where our workplaces have an immense ability to improve. A broad principle is that a meeting needs to have a purpose and desired outcome that is well known and communicated to all participants. The meeting should engage everyone attending, and no one should be a potted plant, or otherwise engaged. Everyone’s time is valuable and expensive; the meeting should be structured and executed in a manner fitting its costs. A simple way of testing the waters is people’s attitudes toward the meeting, whether positive or negative. Do they want to go? Are they looking forward to it? Do they know why the meeting is happening? Is there an outcome that they are invested in? If these questions are answered honestly, those calling the meeting will know a lot, and they should act accordingly.
It is time to return to great papers of the past. The past has clear lessons about how progress can be achieved. Here, I will discuss a trio of papers that came at a critical juncture in the history of numerically solving hyperbolic conservation laws. In a sense, these papers were nothing new, but provided a systematic explanation and skillful articulation of the progress at that time. In a deep sense these papers represent applied math at its zenith, providing a structural explanation along with proof to accompany progress made by others. These papers helped mark the transition of modern methods from heuristic ideas to broad adoption and common use. Interestingly, the depth of applied mathematics ended up paving the way for broader adoption in the engineering world. This episode also provides a cautionary lesson about what holds higher order methods back from broader acceptance, and the relatively limited progress since.
What Sweby did was provide a wonderful narrative description of TVD methods, and a graphical manner to depict them. In the form that Sweby described, TVD methods were a nonlinear combination of classical methods: upwind, Lax-Wendroff and Beam-Warming. The limiter was drawn out of the formulation and parameterized by the ratio of local finite differences. The limiter is a way to take an upwind method and modify it with some part of the selection of second-order methods and satisfy the inequalities needed to be TVD. This technical specification took the following form, $ C_{j-1/2} = \nu \left( 1 + \frac{1}{2}(1-\nu) \phi\left(r_{j-1/2}\right) \right) $ and
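That flux-limited construction is easy to sketch in code. Below is a minimal, illustrative advection step in this spirit, upwind plus a limited Lax-Wendroff antidiffusive correction, using the minmod limiter as one admissible choice of $\phi$. It is a sketch under assumed conditions (periodic grid, constant unit wave speed, Courant number $0 < \nu \le 1$), not any particular production scheme:

```python
def minmod(r):
    """Minmod limiter phi(r): the most diffusive limiter lying in
    Sweby's second-order TVD region (others include superbee, van Leer)."""
    return max(0.0, min(1.0, r))

def limited_update(u, nu):
    """One TVD step for linear advection u_t + a u_x = 0 with a = 1,
    in flux-limited form: upwind flux plus a limited Lax-Wendroff
    antidiffusive correction. Assumes a periodic grid and Courant
    number nu in (0, 1]."""
    n = len(u)
    unew = list(u)
    for j in range(n):
        jmm, jm, jp = (j - 2) % n, (j - 1) % n, (j + 1) % n
        # smoothness ratios r at faces j-1/2 and j+1/2
        dm, dp = u[j] - u[jm], u[jp] - u[j]
        r_m = (u[jm] - u[jmm]) / dm if dm != 0.0 else 0.0
        r_p = dm / dp if dp != 0.0 else 0.0
        # limited numerical fluxes at the two faces
        f_m = u[jm] + 0.5 * (1.0 - nu) * minmod(r_m) * dm
        f_p = u[j] + 0.5 * (1.0 - nu) * minmod(r_p) * dp
        unew[j] = u[j] - nu * (f_p - f_m)
    return unew
```

With $\phi \equiv 0$ this reduces to first-order upwind, and with $\phi \equiv 1$ to Lax-Wendroff; the limiter blends the two so the update stays within the TVD region Sweby diagrammed.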
My own connection to this work is a nice way of rounding out this discussion. When I started looking at modern numerical methods, I started to look at the selection of approaches. FCT was the first thing I hit upon and tried. Compared to the classical methods I was using, it was clearly better, but its lack of theory was deeply unsatisfying. FCT would occasionally do weird things. TVD methods had the theory, and this made them far more appealing to my technically immature mind. After the fact, I tried to project FCT methods onto the TVD theory. I wrote a paper documenting this effort. It was my first paper in the field. Unknowingly, I walked into a veritable minefield and complete shit show. All three of my reviewers were very well-known contributors to the field (I know it is supposed to be anonymous, but the shit show that unveiled itself unveiled the reviewers too).
exaggeration to say that getting funding for science has replaced the conduct and value of that science today. This is broadly true, and particularly true in scientific computing, where getting something funded has replaced funding what is needed or wise. The truth of the benefit of pursuing computer power above all else is decided a priori. The belief was that this sort of program could “make it rain” and produce funding because this sort of marketing had worked in the past. All results in the
program must bow to this maxim, and support its premise. All evidence to the contrary is rejected because it is politically incorrect and threatens the attainment of the cargo, the funding, the money. A large part of this utterly rotten core of modern science is the ascendency of the science manager as the apex of the enterprise. The accomplished scientist and expert is merely now a useful and necessary detail, the manager reigns as the peak of achievement.
In this putrid environment, faster computers seem an obvious benefit to science. They are a benefit and a pathway to progress; this is utterly undeniable. Unfortunately, it is an expensive and inefficient path to progress, and an incredibly bad investment in comparison to the alternatives. The numerous problems with the exascale program are subtle, nuanced, highly technical and pathological. As I’ve pointed out before, the modern age is no place for subtlety or nuance; we live in an age of brutish simplicity where bullshit reigns and facts are optional. In such an age, exascale is an exemplar; it is a brutally simple approach tailor made for the ignorant and witless. If one is willing to cast away the cloak of ignorance and embrace subtlety and nuance, a host of investments can be described that would benefit scientific computing vastly more than the current program. If we followed a better balance of research, computing would contribute to science far more greatly and scale far greater heights than the current path provides.
Today supercomputing is completely at odds with the commercial industry. After decades of first pacing advances in computing hardware, then riding along with increases in computing power, supercomputing has become separate. The separation occurred when Moore’s law died at the chip level (in about 2007). The supercomputing world has become increasingly desperate to continue the free lunch, and remains tied to an outdated model for delivering results. Basically, supercomputing is still tied to the mainframe model of computing that died in the business World long ago. Supercomputing has failed to embrace modern computing with its pervasive and multiscale nature moving all the way from mobile to cloud.
Expansive uncertainty quantification – too many uncertainties are ignored rather than considered and addressed. Uncertainty is a big part of V&V, a genuinely hot topic in computational circles, and practiced quite incompletely. Many view uncertainty quantification as only a small set of activities that address a small piece of the uncertainty question. Too much benefit is achieved by simply ignoring a real uncertainty because the value of zero that is implicitly assumed is not challenged. This is exacerbated significantly by a half-funded and deemphasized V&V effort in scientific computing. Significant progress was made several decades ago, but the signs now point to regression. The result of this often willful ignorance is a lessening of the impact of computing and a limiting of its true benefits.
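To make the incompleteness concrete: even the narrow slice of UQ that does get practiced, forward propagation of parameter uncertainty, is easy to sketch. The example below is a hypothetical Monte Carlo propagation (the model, parameters and distributions are all invented for illustration). Note what it says nothing about: model-form error and numerical error, the uncertainties most often assigned an implicit value of zero:

```python
import random
import statistics

def propagate(model, samplers, n=20000, seed=1):
    """Forward Monte Carlo propagation of input uncertainty:
    sample the uncertain inputs, run the model, and summarize
    the spread of the output. Only one small piece of UQ."""
    rng = random.Random(seed)
    out = [model(*(draw(rng) for draw in samplers)) for _ in range(n)]
    return statistics.mean(out), statistics.stdev(out)

# Hypothetical model: drag force F = 0.5 * rho * Cd * A * v**2
# with an uncertain drag coefficient Cd and velocity v.
mean_F, std_F = propagate(
    lambda cd, v: 0.5 * 1.2 * cd * 1.0 * v**2,
    [lambda r: r.gauss(0.9, 0.05),   # Cd: assumed distribution
     lambda r: r.gauss(10.0, 0.5)],  # v [m/s]: assumed distribution
)
print(mean_F, std_F)
```

The mean and spread reported here quantify only the parametric piece; treating that number as “the” uncertainty is exactly the implicit-zero problem described above.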
progress are the computer codes. Old computer codes are still being used, and most of them use operator splitting. Back in the 1990s a big deal was made about replacing legacy codes with new codes. The codes developed then are still in use, and no one is replacing them. The methods in these old codes are still being used, and now we are told that the codes need to be preserved. The codes, the models, the methods and the algorithms all come along for the ride. We end up having no practical route to advancing the methods.
Complete code refresh – we have produced, and now we are maintaining, a new generation of legacy codes. A code is a repository for vast stores of knowledge in modeling, numerical methods, algorithms, computer science and problem solving. When we fail to replace codes, we fail to replace knowledge. The knowledge comes directly from those who write the code and create the ability to solve useful problems with that code. Much of the methodology for problem solving is complex and problem specific. Ultimately a useful code becomes something that many people are deeply invested in. In addition, the people who originally wrote the code move on, taking their expertise, history and knowledge with them. The code becomes an artifact for this knowledge, but it is also a deeply imperfect reflection of that knowledge. The code usually contains some techniques that are magical, and unexplained. These magic bits of code are often essential for success. If they get changed the code ceases to be useful. The result of this process is a deep loss of expertise and knowledge that arises from the process of creating a code that can solve real problems. If a legacy code continues to be used it also acts to block progress on all the things it contains, starting with the model and its fundamental assumptions. As a result, progress stops because even when there are research advances, they have no practical outlet. This is where we are today.
Democratization of expertise – the manner in which codes are applied has a very large impact on solutions. The overall process is often called a workflow, encapsulating activities starting with problem conception, meshing, modeling choices, code input, code execution, data analysis, visualization. One of the problems that has arisen is the use of codes by non-experts. Increasingly code users are simply not sophisticated and treat codes like black boxes. Many refer to this as the democratization of the simulation capability, which is generally beneficial. On the other hand, we increasingly see calculations conducted by novices who are generally ignorant of vast swaths of the underlying science. This characteristic is keenly related to a lack of V&V focus and loose standards of acceptance for calculations. Calibration is becoming more prevalent again, and distinctions between calibration and validation are vanishing anew. The creation of broadly available simulation tools must be coupled to first rate practices and appropriate professional education. In both of these veins the current trends are completely in the wrong direction. V&V practices are in decline and recession. Professional education is systematically getting worse as the educational mission of universities is attacked, and diminished along with the role of elites in society. 
Last week I tried to envision a better path forward for scientific computing. Unfortunately, a true better path flows invariably through a better path for science itself and the Nation as a whole. Ultimately scientific computing, and science more broadly, is dependent on the health of society in the broadest sense. It also depends on leadership and courage, two other attributes we are lacking in almost every respect. Our society is not well; the problems we are confronting are deep and perhaps the most serious crisis since the Civil War. I believe that historians will look back on 2016-2018 and perhaps longer as the darkest period in American history since the Civil War. We can’t build anything great when the Nation is tearing itself apart. I hope and pray that it will be resolved before we plunge deeper into the abyss in which we find ourselves. We see the forces opposed to knowledge, progress and reason emboldened and running amok. The Nation is presently moving backward and embracing a deeply disturbing and abhorrent philosophy. In such an environment science cannot flourish, it can only survive. We all hope the darkness will lift and we can again move forward toward a better future; one with purpose and meaning where science can be a force for the betterment of society as a whole.
The march of science in the 20th Century was deeply impacted by international events, several World Wars and a Cold (non) War that spurred National interests in supporting science and technology. The twin projects of the atom bomb and the nuclear arms race along with space exploration drove the creation of much of the science and technology of today. These conflicts steeled resolve, gave purpose and granted the resources needed for success. They were important enough that efforts were earnest. Risks were taken because risk is necessary for achievement. Today we don’t take risks because nothing important is at stake. We can basically fake results and market progress where little or none exists. Since nothing is really that essential, bullshit reigns supreme.
resistance was not real. Ironically the Soviets were ultimately defeated by bullshit. The Strategic Defense Initiative, or Star Wars, bankrupted the Soviets. It was complete bullshit and never had a chance to succeed. This was a brutal harbinger of today’s World where reality is optional, and marketing is the coin of the realm. Today American power seems unassailable. This is partially true and partially over-confidence. We are not on our game at all, and far too much of our power is based on bullshit. As a result, we can basically just pretend to try, and actually not execute anything with substance and competence. This is where we are today; we are doing nothing important, and wasting lots of time and money in the process.
The result of the current model is a research establishment that only goes through the motions and does little or nothing. We make lots of noise and produce little substance. Our nation deeply needs a purpose that is greater. There are plenty of worthier National goals. If war-making is needed, Russia and China are still worthy adversaries. For some reason, we have chosen to capitulate to Putin’s Russia simply because they are an ally against the non-viable threat of Islamic fundamentalism. This is a completely insane choice that is only rhetorically useful. If we want peaceful goals, there are challenges aplenty. Climate change and weather are worthy problems to tackle requiring both scientific understanding and societal transformation to conquer. Creating clean and renewable energy that does not create horrible environmental side-effects remains unsolved. Solving the international needs for food and prosperity for mankind is always there. Scientific exploration and particularly space remain unconquered frontiers. Medicine and genetics offer new vistas for scientific exploration. All of these areas could transform the Nation in broad ways socially and economically. All of these could meet broad societal needs. More to the point of my post, all need scientific computing in one form or another to fully succeed. Computing always works best as a useful tool employed to help achieve objectives in the real World. The real-World problems provide constraints and objectives that spur innovation and keep the enterprise honest.
Instead our scientific computing is being applied as a shallow marketing ploy to shore up a vacuous program. Nothing really important or impactful is at stake. The applications for computing are mostly make believe and amount to nothing of significance. The marketing will tell you otherwise, but the lack of gravity for the work is clear and poisons the work. The result of this lack of gravity is phony goals and objectives that have the look and feel of impact, but contribute nothing toward an objective reality. This lack of contribution comes from the deeper malaise of purpose as a Nation, and science’s role as an engine of progress. With little or nothing at stake the tools used for success suffer, and scientific computing is no different. The standards of success simply are not real, and lack teeth. Even stockpile stewardship is drifting into the realm of bullshit. It started as a worthy program, but over time it has been allowed to lose its substance. Political and financial goals have replaced science and fact, with the goals of the program losing connection to objective reality.
We would still be chasing faster computers, but the faster computers would not be the primary focus. We would focus on using computing to solve problems that were important. We would focus on making computers that were useful first and foremost. We would want computers that were faster as long as they enabled progress on problem solving. As a result, efforts would be streamlined toward utility. We would not throw vast amounts of effort into making computers faster just to make them faster (this is what is happening today; there is no rhyme or reason to exascale other than “faster is better, duh!”). Utility means that we would honestly look at what is limiting problem solving and put our efforts into removing those limits. The effects of this dose of reality on our current efforts would be stunning; we would see a wholesale change in our emphasis and focus away from hardware. Computing hardware would take its proper role as an important tool for scientific computing and no longer be the driving force. The fact that hardware is a driving force for scientific computing is one of the clearest indicators of how unhealthy the field is today.
The current computing focus is only porting old codes to new computers, a process that keeps old models, methods and algorithms in place. This is one of the most corrosive elements in the current mix. The porting of old codes is the utter abdication of intellectual ownership. These old codes are scientific dinosaurs and act to freeze antiquated models, methods and algorithms in place while acting to squash progress. Worse yet, the skillsets necessary for improving the most valuable and important parts of modeling and simulation are allowed to languish. This is worse than simply choosing a less efficient road; this is going backwards. When we need to turn our attention to serious real work, our scientists will not be ready. These choices are dooming an entire generation that could have been making breakthroughs to simply become caretakers. To be proper stewards of our science we need to write new codes containing new models using new methods and algorithms. Porting codes turns our scientists into mindless monks simply transcribing sacred texts without any depth of understanding. It is a recipe for transforming our science into magic. It is the recipe for defeat and the passage away from the greatness we once had.
My work day is full of useless bullshit. There is so much bullshit that it has choked out the room for inspiration and value. We are not so much managed as controlled. This control comes from a fundamental distrust of each other, to the degree that any independent idea is viewed as dangerous. This realization has come upon me in the past few years. It has occurred to me that this could simply be a mid-life crisis manifesting itself, but the evidence seems to indicate something more significant (look at the bigger picture of the constant crisis my Nation is in). My mid-life attitudes are simply much less tolerant of time-wasting activities with little or no redeeming value. Once you realize that your time and energy are limited, why waste them on useless things?
I read a book that had a big impact on my thinking, “The Subtle Art of Not Giving a Fuck” by Mark Manson. In a nutshell, the book says that you have a finite number of fucks to give in life, and you should optimize your life by mindfully not giving a fuck about unimportant things. This frees the time and energy to actually give a fuck about things that matter. The book isn’t about not caring; it is about caring about the right things and dismissing the wrong things. What I realized is that, increasingly, my work isn’t competing for my fucks; it simply assumes that I will spend my limited fucks on complete bullshit out of duty. That is extremely disrespectful of me and my limited time and effort. One conclusion is that the “bosses” (the Lab, the Department of Energy) do not give enough of a fuck about me to treat my limited time and energy with respect and make sure my fucks actually matter.

If we look at work, it might seem that an inspired workforce would be a benefit worth creating. People would work hard and create wonderful things because of the depth of their commitment to a deeper purpose. An employer would benefit mightily from such an environment, and the employees could flourish, brimming with satisfaction and growth. With all these benefits, we should expect the workplace to naturally create the conditions for inspiration. Yet this is not happening; the conditions are the complete opposite. The reason is that inspired employees are not entirely controlled. Creative people do things that are unexpected and unplanned, and managing a workplace like this is much harder. In addition, mistakes and bad things happen too: failure is an inevitable consequence of hard-working, inspired people. This is the thing our workplaces cannot tolerate. The lack of control and the unintended consequences are unacceptable. Fundamentally this stems from a complete lack of trust. Our employers do not trust their employees at all, and in turn, the employees do not trust the workplace. It is a vicious cycle that drags inspiration under and smothers it. The entire environment is overflowing with micromanagement, control, suspicion, and doubt.
If we can’t say NO to all this useless stuff, we can’t say YES to anything either. My work and time budget is completely booked up with non-optional things that I should say NO to. They are largely useless and produce no value. Because I can’t say NO, I can’t say YES to something better. My employer is sending me a message with very clear emphasis: we don’t trust you to make decisions. Your ideas are not worth working on. You are expected to implement other people’s ideas no matter how bad they are. You have no ability to steer those ideas to be better. Your expertise has absolutely no value. A huge part of this problem is the ascendancy of the management class as the core of organizational value. We are living in the era of the manager; the employee is a cog and not valued. Organizations voice platitudes toward employees, but they are hollow. The actions of the organization spell out its true intent: employees are not to be trusted; they are to be controlled, and they need to do what they are told. Inspired employees would do unintended things, taking organizations in new directions and focusing them on new things. This would mean losing control and changing plans. More importantly, the value of the organization would shift away from the managers and toward the employees. Managers are much happier with employees who are “seen and not heard.”
As Mark Manson writes, we only have so many fucks to give, and my work is doing precious little to earn mine. I have always focused on personal growth, and increasingly that growth is resisted by work instead of resonating with it. It has become quite obvious that being the best “me” is not remotely a priority. The priority at work is to be compliant, take no risks, fail at nothing, and help produce marketing material for success and achievement. We aren’t doing great work anymore, but we pretend we are. My work could simply be awesome, but that would require giving me the freedom to set priorities, take risks, fail often, learn continually, and actually produce wonderful things. If this happened, the results would speak for themselves and the marketing would take care of itself. When the Labs I’ve worked at were actually great, this is how it happened: the Labs were great because they achieved great things. The Labs said NO to a lot of things so they could say YES to the right things. Today, we simply don’t have this freedom.
If we could say NO to the bullshit and give our limited fucks a powerful YES, we might be able to achieve great things. Our Labs could stop trying to convince everyone that they are doing great things and actually do great things. The missing element at work today is trust. If the trust were there, we could produce inspiring work that would generate genuine pride and accomplishment. Computing is a wonderful example of these principles in action. Scientific computing became a force in science and engineering by contributing to genuine endeavors with massive societal goals. Computing helped win the Cold War and put a man on the moon. Weather and climate have been modeled successfully. More broadly, computers have reshaped business and now society at large. In all of these endeavors, computing contributed to solutions; computing focused on computers was not the endeavor itself, as it is today. The modern computing emphasis was originally part of a bigger program of using science to support the nuclear stockpile without testing. It was part of a focused scientific enterprise with a clear objective. Today it is a goal unto itself, unmoored from anything larger. If we want to progress and advance science, we should focus on great things for society, not superficially pour our effort into mere tools.