Time is an illusion.
― Albert Einstein
Time is relentless. As an opponent it is unbeatable and can only be temporarily held at bay. We all lose to it, with death being the inevitable outcome. Science treats the second law of thermodynamics as the lord of time: it establishes a direction defined by the creation of greater disorder. In many ways the second law stands apart from other physical laws in its fundamental nature. It describes the basic character of change, but not its details.
But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
— Sir Arthur Stanley Eddington
Change is constant and must be responded to. The challenge posed by the continual flow of events provides the key distinguishing character of that response. On the one hand, conservatives resist the flow and attempt to retain the shape of the World. Liberals and progressives work to shape the change so that the World changes for the better. Where the conservative sees the best future in the past, the progressive sees the best future as a new beginning.
Change isn’t made by asking permission. Change is made by asking forgiveness, later.
― Seth Godin
These tendencies are seen in differing tastes in the arts. Take music, where oldies are the staple of conservatives who don’t warm to newer ideas. The old standards of their childhood and teen years make for a calming influence and sentimental listening. The progressive ear looks for new combinations rather than the familiar. Invention and improvisation are greeted warmly as a new challenge to one’s tastes. For example, rap is viewed as not being music of any sort by the conservative ear, and greeted as stunningly original by the liberal ear. On the one hand the past is viewed as a template for the future, and on the other changes are seen as an opportunity for improvement. This tension is at the core of humanity’s struggle for mastery over time.
Our time is marked by certain emblematic moments such as 9/11, Nixon’s resignation or the fall of the Berlin Wall. Each of these moments clearly defines a transition from everything before it, to everything after it. Some of these moments are simply climaxes to events preceding them.
The horror of 9/11 started with the rise of Islam 1400 years ago, continuing with the Crusades, European colonialism, the oil crisis of the 70’s, American support for the Shah and his fall with the rise of Islamic fundamentalism, the Soviet invasion of Afghanistan, the American invasion of Iraq and the constancy of Arab tension over Israel. Our response has assured that the war will continue and has only inflamed more terrorism. Rather than short-circuit the cycle of violence we have amplified it, and assured its continuation for another generation. We have learned nothing from the history leading up to the events of September 11, 2001.
Tradition becomes our security, and when the mind is secure it is in decay.
― Jiddu Krishnamurti
These developments highlight some of the key differences between conservative and liberal responses to crisis. The conservative response usually takes little note of history, and applies power as the strategy. Power usually suits the powerful, being arguably simple and usually effective. Liberals and progressives, on the other hand, are eager to take a different path and try something new, but often encounter paralysis from the analysis of the past. The different approach is often a failure, but when it succeeds the results are transformative. Power’s success only reinforces the power applying it. Ultimately, when power encounters the right challenge, it fails and upsets the balance. In the end the power is reset and eventually the balance is restored with a new structure at the helm.
Societies in decline have no use for visionaries.
― Anaïs Nin
In science, the same dynamic holds: conventional theories and approaches work almost all the time, but when they are overturned it is monumental. Even there, conservative approaches are the workhorse and the obvious choice. Every so often they are exposed by something progressive and new that produces results the old approaches could not. This is the sort of thing that Kuhn wrote about with revolutions in science. As with other human endeavors, the liberal and progressive wing leads science’s advance. The day-in, day-out work of science is left to the conservative side of things.
“Normal science” means research firmly based upon one or more past scientific achievements, achievements that some particular scientific community acknowledges for a time as supplying the foundation for its further practice.
— Thomas Kuhn
So we are left with a balance to achieve. How do we handle the inevitability of change from the remorseless march of time? Are we interested in the conservative approach leading to uninspired productivity? Or progressive breakthroughs that push us forward, but most often end in failure?
All the effort in the world won’t matter if you’re not inspired.
― Chuck Palahniuk
The types of adaptivity most commonly seen are associated with adaptive grids (or “h” refinement). Grids lend themselves to straightforward understanding and impressive visualization. Yet even this, the most common form of adaptivity, is seen far less than one might have expected looking forward twenty years ago. Adaptivity takes other forms far less common than h-refinement, such as p-adaptivity, where the order of an algorithm is adjusted locally. A third classical form is r-adaptivity, where the mesh is moved locally to improve solutions. This is the second most common approach in the guise of remesh-remap methods (or ALE codes). I’d like to chat about a handful of other approaches that could be big winners in the future, especially if combined with the classical approaches.
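To make the h-refinement idea concrete, here is a minimal sketch in Python of the kind of flagging step an adaptive grid code performs. The jump-based indicator and the threshold are illustrative choices of mine, not a canonical criterion from any particular code.

```python
import numpy as np

def flag_cells_for_refinement(u, rel_threshold=0.1):
    """Flag 1-D cells whose jump indicator exceeds a fraction of the
    largest indicator in the mesh.  Indicator and threshold here are
    illustrative choices, not a standard criterion."""
    jumps = np.abs(np.diff(u))            # jumps between neighboring cell values
    indicator = np.zeros_like(u)
    indicator[:-1] = np.maximum(indicator[:-1], jumps)
    indicator[1:] = np.maximum(indicator[1:], jumps)
    return indicator > rel_threshold * indicator.max()

# Example: a smooth profile with an embedded steep front
x = np.linspace(0.0, 1.0, 201)
u = np.tanh((x - 0.5) / 0.01)
flags = flag_cells_for_refinement(u)
print(f"{flags.sum()} of {flags.size} cells flagged for refinement")
```

A real adaptive code would then subdivide the flagged cells and coarsen elsewhere, but the decision step looks much like this.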
These are relatively simple ideas. More complex adaptation in algorithms can be associated with methods that use nonlinear stencils, usually defined by limiters. These methods use a solution quality principle (typically monotonicity or positivity) to define how a computational stencil is chosen (FCT, MUSCL, and TVD are good examples). More advanced methods such as essentially non-oscillatory (ENO) schemes or the elegant weighted ENO (WENO) method take this adaptivity up a notch. While algorithms like FCT and TVD are common in codes, ENO hasn’t caught on in serious codes, largely due to complexity and a lack of overall robustness. The robustness problems are probably due to the focus on accuracy over robustness as the key principle in stencil selection.
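The simplest instance of this nonlinear stencil selection is the minmod-limited slope used in MUSCL-type reconstructions. The sketch below is my own minimal version, not lifted from any particular code: it switches between the one-sided differences and drops to a first-order stencil at extrema, which is exactly the monotonicity-driven adaptivity described above.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: returns the smaller-magnitude slope when the two
    arguments agree in sign, and zero (first-order stencil) otherwise."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_slopes(u):
    """Limited cell slopes for a MUSCL-type reconstruction on a periodic
    1-D grid.  The nonlinear switch between one-sided differences is the
    stencil adaptivity in its plainest form."""
    backward = u - np.roll(u, 1)    # backward difference
    forward = np.roll(u, -1) - u    # forward difference
    return minmod(backward, forward)

u = np.array([0.0, 0.0, 1.0, 1.0, 0.5, 0.0])
print(muscl_slopes(u))   # slopes vanish at the jump and at extrema
```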
One area where this adaptivity may be extremely useful is the construction of composite algorithms. The stencil selection in ENO or TVD is a good example, as each individual stencil is a consistent discretization in itself. It is made more effective and higher quality through the nonlinear procedure used to select among them. Another good example of this principle is the compositing of multigrid methods with Krylov iterations. Neither method is as effective on its own; they suffer from either a lack of robustness (multigrid) or suboptimal scaling (Krylov). Together the methods have become the standard. Part of the key to a good composite is the complementarity of the properties. In the above case multigrid provides optimal scaling and Krylov offers stability. This isn’t entirely unlike TVD methods, where upwinding offers the stability and one of the candidate stencils offers optimal accuracy.
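As a sketch of that multigrid/Krylov composite, the snippet below uses pyamg’s smoothed-aggregation multigrid as a preconditioner for SciPy’s conjugate gradient on a model Poisson problem. It assumes pyamg and SciPy are installed; the grid size and right-hand side are arbitrary choices for illustration.

```python
import numpy as np
from scipy.sparse.linalg import cg
import pyamg  # assumed available: pip install pyamg

# 2-D Poisson model problem on a 200x200 grid
A = pyamg.gallery.poisson((200, 200), format='csr')
b = np.random.default_rng(0).standard_normal(A.shape[0])

# Multigrid supplies the (near) optimal scaling with grid size...
ml = pyamg.smoothed_aggregation_solver(A)
M = ml.aspreconditioner(cycle='V')

# ...while the outer Krylov iteration supplies the stability.
iters = []
x, info = cg(A, b, M=M, callback=lambda xk: iters.append(1))
print("converged" if info == 0 else "failed", "in", len(iters), "iterations")
```

Either piece alone works less well: the V-cycle by itself can stall on hard problems, and unpreconditioned CG needs many more iterations as the grid is refined.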


My job. All in all I’m pretty lucky. Beyond having enough money for a comfortable life with food to eat, comfortable shelter and a few luxuries, I get to do what I love a little bit each week. I’ll save my concerns that the Labs where I work are a shadow of their former selves; compared to the rest of the World, I’m doing great.
Modeling and simulation. The use of computers to solve problems in physics and engineering has become commonplace. Its commonness shouldn’t detract from the wonder we should feel. Our ability to construct virtual versions of reality is wonderful for exploration, discovery and utility. The only thing that gives me pause is a bit of hubris regarding the scope of our mastery.
Algorithms. Systematic ways of solving problems that are amenable to computing fill me with wonder. The only regret is that we don’t rely upon this approach enough. An accurate, elegant and efficient algorithm is a thing of beauty. Couple the algorithm with mathematical theory and it is breathtaking.
The end of Moore’s law. This is a great opportunity for science to quit being lazy. If we had relied upon more than raw power for improving computing, our ability to use computers today would be so much more awesome. Perhaps now, we will focus on thinking about how we use computers rather than simply focus on building bigger ones.
The Internet and the World Wide Web. We are living through a great transformation in human society. The Internet is changing our society, our governance, our entertainment, and almost anything else you can imagine. At its core, it changes the way we talk, and the way we get and share information. It makes each day interesting and is the spoon that stirs the proverbial pot.
Nuclear weapons. We owe the relative peace that the World has experienced since World War II to this horrible weapon. As long as they aren’t used they save lives and keep the great powers in check.

Big data and statistics. Computers, sensors, drones and the internet of things are helping to drive the acquisition of data at levels unimaginable only a few years ago. With computers and software that can do something with it, we have a revolution in science. Statistics has become sexy; add statistics to sports and you combine two things that I love.
Genetics. The wonders of our knowledge of the genome seem boundless and shape knowledge gathering across many fields. Its impact on social science, archeology, and paleontology, to name a few, is stunning. We have made incredible discoveries that expand the knowledge of humanity and provide wonder for all.
Albuquerque sunsets. Through the coming together of optics, meteorology, and astronomy, the sunsets here are epically good. Add the color over the mountains opposite the setting sun and inspiration is never more than the end of the day away.
sunset, it looks like home.

For supercomputing to provide the value it promises for simulating phenomena, the methods in the codes must be convergent. The metric of weak scaling is utterly predicated on this being true. Despite its intrinsic importance to the actual relevance of high performance computing, relatively little effort has been applied to making sure convergence is actually achieved by codes. The work on supercomputing simply assumes that it happens, but does little to assure it. Actual convergence is largely an afterthought and receives little attention or work.
Thus the necessary and sufficient conditions are basically ignored. This is one of the simplest examples of the lack of balance I experience every day. In modern computational science the belief that faster supercomputers are better and valuable has become closer to an article of religious faith than a well-crafted scientific endeavor. The sort of balanced, well-rounded efforts that brought scientific computing to maturity have been sacrificed for an orgy of self-importance. China has the world’s fastest computer and reflexively we think there is a problem.
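To be concrete about what checking convergence looks like, here is a minimal sketch: solve the same problem on successively refined meshes and compute the observed order of accuracy from the error ratios, p = log(e_coarse/e_fine)/log 2. The tiny finite-difference “code” below is just a stand-in for a real application; the point is the refinement study wrapped around it.

```python
import numpy as np

def solve(n):
    """Stand-in for a real code: second-order central-difference solution
    of -u'' = pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0.
    Returns the max-norm error against the exact solution sin(pi x)."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)[1:-1]            # interior points
    main = 2.0 * np.ones(n - 1)
    off = -1.0 * np.ones(n - 2)
    A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    return np.max(np.abs(u - np.sin(np.pi * x)))

errors = [solve(n) for n in (16, 32, 64, 128)]
orders = [np.log(e0 / e1) / np.log(2.0) for e0, e1 in zip(errors, errors[1:])]
print("observed orders of accuracy:", [round(p, 2) for p in orders])
```

If the observed orders don’t settle toward the design order of the method, something is wrong, and no amount of weak scaling will fix it.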

While necessary, applied math isn’t sufficient. Sufficiency is achieved when the elements are applied together with science. The science of computing cannot remain fixed, because computing is changing the physical scales we can access and the fundamental nature of the questions we ask. The codes of twenty years ago can’t simply be used in the same way. It is much more than rewriting them or just refining a mesh. The physics in the codes needs to change to reflect the differences.
A chief culprit is the combination of the industry and its government partners, who have remained tied to the same stale model for two or three decades. At its core the cost has been intellectual vitality. The implicit assumption of convergence and the lack of deeper intellectual investment in new ideas have conspired to strand the community in the past. The annual Supercomputing conference is a monument to this self-imposed mediocrity. It’s a trade show through and through, and in terms of technical content a truly terrible meeting (I remember pissing the Livermore CTO off when pointing this out).
One of the big issues is the proper role of math in computational projects. The more applied the project gets, the less capacity math has to impact it. Things simply shouldn’t be this way. Math should always be able to complement a project.
A proof that is explanatory gives conditions that describe the results achieved in computation. Convergence rates observed in computations are often well described by mathematical theory. When a code gives results of a certain convergence rate, a mathematical proof that explains why is welcome and beneficial. It is even better if it gives conditions where things break down, or get better. The key is we see something in actual computations, and math provides a structured, logical and defensible explanation of what we see.
Too often mathematics is done that simply assumes others are “smart” enough to squeeze utility from the work. A darker interpretation of this attitude is that such people don’t care whether it is useful, or used. I can’t tolerate that attitude. This isn’t to say that math without application shouldn’t be done, but rather that it shouldn’t seek support from computational science.
None of these priorities can be ignored. For example, if efficiency becomes too poor, the code won’t be used because time is money. A code that is too inaccurate won’t be used no matter how robust it is (these go together, with accurate and robust being a sort of “holy grail”).
Robust. A robust code runs to completion. Robustness in both its most refined and its crudest sense is stability. The refined sense of robustness is the numerical stability that is so keenly important, but it is so much more. A robust code gives an answer come hell or high water, even if that answer is complete crap. Nothing upsets your users more than no answer; a bad answer is better than none at all. Making a code robust is hard, difficult work, especially if you have morals and standards. It is an imperative.
Efficiency. The code runs fast and uses the computers well. This is always hard to do: a beautiful piece of code that clearly describes an algorithm turns into a giant plate of spaghetti, but runs like the wind. To get performance you end up throwing out that wonderful inheritance hierarchy you were so proud of. To get performance you get rid of all those options you put into the code. This requirement is in conflict with everything else. It is also the focus of the funding agencies. Almost no one is thinking productively about how all of this (doesn’t) fit together. We just assume that faster supercomputers are awesome and better.
It isn’t a secret that the United States has engaged in a veritable orgy of classification since 9/11. What is less well known is the massive implied classification through other data categories such as “official use only” (OUO). This designation is itself largely unregulated and, as such, quite prone to abuse.
