Gratitude makes blessings permanent.
It is the uniquely wonderful holiday of Thanksgiving in the United States. It’s one of the few holidays that haven’t been completely sullied by commercialism, although “Black Friday” is making a run at it. I’m thankful that my employer honors the holiday, unlike too many. It might be nice to list things to be thankful for in science and technology.
As we express our gratitude, we must never forget that the highest appreciation is not to utter words, but to live by them.
―John F. Kennedy
My job. All in all I’m pretty lucky. Beyond having enough money for a comfortable life with food to eat, comfortable shelter, and a few luxuries, I get to do what I love a little bit each week. I’ll save my concerns that the Labs where I work are a shadow of their former selves for another day; compared to the rest of the world, I’m doing great.
What a computer is to me is the most remarkable tool that we have ever come up with. It’s the equivalent of a bicycle for our minds.
― Steve Jobs
Modeling and simulation. The use of computers to solve problems in physics and engineering has become commonplace. Its commonness shouldn’t detract from the wonder we should feel. Our ability to construct virtual versions of reality is wonderful for exploration, discovery, and utility. The only thing that gives me pause is a bit of hubris regarding the scope of our mastery.
Algorithms. Systematic ways of solving problems that are amenable to computing fill me with wonder. My only regret is that we don’t rely upon this approach enough. An accurate, elegant, and efficient algorithm is a thing of beauty. Couple the algorithm with mathematical theory and it is breathtaking.
Useful applied math. Mathematics is a wonderful tool if used properly. So much useful theory has been developed for so many aspects of science. Whenever mathematics is applied to bring order to an area of science I celebrate. It should be happening a lot more than it does, but when it does it’s great.
Technology is anything invented after you were born.
― Alan Kay
The end of Moore’s law. This is a great opportunity for science to quit being lazy. If we had relied upon more than raw power for improving computing, our ability to use computers today would be so much more awesome. Perhaps now, we will focus on thinking about how we use computers rather than simply focus on building bigger ones.
Innovative software. This is what makes the current computing revolution interesting. Perhaps science will start to understand that the key is software and not raw computing power. Software keeps evolving computing, even with Moore’s law on life support, because it constantly changes how we use computers.
When we change the way we communicate, we change society
― Clay Shirky
The Internet and the World Wide Web. We are living through a great transformation in human society. The Internet is changing our society, our governance, our entertainment, and almost anything else you can imagine. The core is it changes the way we talk, and the way we get and share information. It makes each day interesting and is the spoon that stirs the proverbial pot.
Nuclear weapons. We owe the relative peace that the world has experienced since World War II to this horrible weapon. As long as they aren’t used, they save lives and keep the great powers in check.

Turbulence. This is the gift that keeps giving. Turbulence is basically unsolved. It is also beautiful, important and pervasive. A breakthrough in understanding this bit of physics would be transformative.

Shock physics. Shocks are super cool. Energetic and destructive, they drove early computing during World War II as we raced to understand them, and few realize the debt we owe to them. We understand far less about shock physics than we might admit, and it would be great to get back to focusing on discovery again.
Big data and statistics. Computers, sensors, drones, and the Internet of things are helping to drive the acquisition of data at levels unimaginable only a few years ago. With computers and software that can do something with it, we have a revolution in science. Statistics has become sexy; add statistics to sports and you combine two things that I love.
Genetics. The wonders of our knowledge of the genome seem boundless and shape knowledge gathering across many fields. Its impact on social science, archaeology, and paleontology, to name a few, is stunning. We have made incredible discoveries that expand the knowledge of humanity and provide wonder for all.
Our technology forces us to live mythically
― Marshall McLuhan
Modern medicine. Today we have all sorts of medicines and treatments that allow us to live and be productive with ailments that would have destroyed our lives a few generations ago.
Albuquerque sunsets. A coming together of optics, meteorology, and astronomy, the sunsets here are epically good. Add the color over the mountains opposite the setting sun and inspiration is never more than the end of the day away.
Sandia Mountain. A tribute to geology, the great shield, or half of a watermelon at sunset, it looks like home.
No duty is more urgent than that of returning thanks.
―James Allen

For supercomputing to provide the value it promises for simulating phenomena, the methods in the codes must be convergent. The metric of weak scaling is utterly predicated on this being true. Despite its intrinsic importance to the actual relevance of high performance computing, relatively little effort has been applied to making sure codes achieve convergence. The work on supercomputing simply assumes that it happens, but does little to assure it; actual convergence is largely an afterthought that receives little attention or work.
Thus the necessary and sufficient conditions are basically ignored. This is one of the simplest examples of the lack of balance I experience every day. In modern computational science the belief that faster supercomputers are better and valuable has become closer to an article of religious faith than a well-crafted scientific endeavor. The sort of balanced, well-rounded efforts that brought scientific computing to maturity have been sacrificed for an orgy of self-importance. China has the world’s fastest computer and reflexively we think there is a problem.

While necessary, applied math isn’t sufficient. Sufficiency is achieved when the elements are applied together with science. The science of computing cannot remain fixed because computing is changing the physical scales we can access and the fundamental nature of the questions we ask. The codes of twenty years ago can’t simply be used in the same way. It is much more than rewriting them or just refining a mesh; the physics in the codes needs to change to reflect the differences.
A chief culprit is the combination of the industry and its government partners who remain tied to the same stale model for two or three decades. At the core the cost has been intellectual vitality. The implicit assumption of convergence, and the lack of deeper intellectual investment in new ideas has conspired to strand the community in the past. The annual Supercomputing conference is a monument to this self-imposed mediocrity. It’s a trade show through and through, and in terms of technical content a truly terrible meeting (I remember pissing the Livermore CTO off when pointing this out).
One of the big issues is the proper role of math in computational projects. The more applied the project gets, the less capacity math has to impact it. Things simply shouldn’t be this way. Math should always be able to complement a project.
A proof that is explanatory gives conditions that describe the results achieved in computation. Convergence rates observed in computations are often well described by mathematical theory. When a code gives results of a certain convergence rate, a mathematical proof that explains why is welcome and beneficial. It is even better if it gives conditions where things break down, or get better. The key is we see something in actual computations, and math provides a structured, logical and defensible explanation of what we see.
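The observed convergence rate described above can be measured directly from a sequence of refined computations. Here is a minimal sketch, not drawn from any production code, using the composite trapezoidal rule on a problem with a known answer as a stand-in: halve the mesh spacing, record the errors, and compute the observed order p = log2(e_h / e_h/2), which theory predicts should approach 2 for this method.

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule with n equal intervals on [a, b].
    h = (b - a) / n
    interior = sum(f(a + i * h) for i in range(1, n))
    return h * (0.5 * (f(a) + f(b)) + interior)

# Stand-in problem with a known exact answer: integral of sin(x) on [0, pi] = 2.
exact = 2.0

# Errors on a sequence of uniformly refined "meshes".
errors = []
for n in (32, 64, 128):
    approx = trapezoid(math.sin, 0.0, math.pi, n)
    errors.append(abs(approx - exact))

# Observed order of accuracy between successive refinements:
# p = log2(e_h / e_{h/2}); theory says p -> 2 for the trapezoidal rule.
orders = [math.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
print(orders)
```

When the observed orders match the theoretical rate, the proof explains the computation; when they don’t, that mismatch is exactly the breakdown condition worth investigating.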
Too often mathematics is done that simply assumes others are “smart” enough to squeeze utility from the work. A darker interpretation of this attitude is that the people doing it don’t care whether it is useful, or used. I can’t tolerate that attitude. This isn’t to say that math without application shouldn’t be done, but rather that it shouldn’t seek support from computational science.
None of these priorities can be ignored. For example, if the efficiency becomes too poor, the code won’t be used because time is money. A code that is too inaccurate won’t be used no matter how robust it is (these go together, with accurate and robust being a sort of “holy grail”).
Robust. A robust code runs to completion. Robustness in both its crudest and most refined senses is stability. The refined sense is the numerical stability that is so keenly important, but robustness is much more: the code gives an answer come hell or high water, even if that answer is complete crap. Nothing upsets your users more than no answer; a bad answer is better than none at all. Making a code robust is hard, difficult work, especially if you have morals and standards. It is an imperative.
Efficiency. The code runs fast and uses the computers well. This is always hard to do: a beautiful piece of code that clearly describes an algorithm turns into a giant plate of spaghetti, but runs like the wind. To get performance you end up throwing out that wonderful inheritance hierarchy you were so proud of. To get performance you get rid of all those options you put into the code. This requirement is also in conflict with everything else. It is also the focus of the funding agencies. Almost no one is thinking productively about how all of this (doesn’t) fit together. We just assume that faster supercomputers are awesome and better.
It isn’t a secret that the United States has engaged in a veritable orgy of classification since 9/11. What is less well known is the massive implied classification through other data categories such as “official use only” (OUO). This designation is largely unregulated and, as such, quite prone to abuse.

Again something at work has inspired me to write. The fresh start (blank page, empty canvas, original idea) is a persistent theme among authors, artists, and scientists. I think it’s worth considering how truly “fresh” these really are. The idea came up during a technical planning meeting where one of the participants viewed a new project as being offered a blank page.
Once we stepped over that threshold, conflict erupted over the choices available with little conclusion. A large part of the issue was the axioms each person was working with. Across the board we all took a different set of decisions to be axiomatic. At some time in the past these “axioms” were choices, and became axiomatic through success. Someone’s past success becomes the model for future success, and the choices that led to that success become unstated decisions we are generally completely unaware of. These form the foundation of future work and often become culturally iconic in nature.
Take the basic framework for discretization as an operative example: at Sandia this is the finite element method; at Los Alamos it is finite volumes. At Sandia we talk “elements”; at Los Alamos it is “cells”. From there we continued further down the proverbial rabbit hole to discuss what sort of elements (tets or hexes). Sandia is a hex shop, which causes all sorts of headaches but enables other things, or is simply the way a difficult problem was tackled. Tets would improve some things, but produce other problems. For some, the decisions are flexible; for others there isn’t a choice, and the use of a certain type of element is virtually axiomatic. None of this allows a blank slate; all of it is deeply informed and biased by specific decisions made, in some cases, decades ago.
The other day I risked a lot by comparing the choices we’ve collectively made in the past to “original sin”. In other words, what is computing’s original sin? Of course this is a dangerous path to tread, but the concept is important. We don’t have a blank slate; our choices are shaped, if not made, by decisions of the past. We are living with, if not suffering from, decisions made years or decades ago. This is true in computing as much as any other area.
In case you’re wondering about my writing habit and blog, I can explain a bit more. If you aren’t, stop reading. In the sense of authorship, I force myself to face the blank page every day as an exercise in self-improvement. I read Charles Duhigg’s book “The Power of Habit” and realized that I needed better habits. I thought about what would make me better and set about building them up. I have a list of things to do every day: “write”, “exercise”, “walk”, “meditate”, “read”, and so on.
The blog is a concrete way of putting the writing to work. Usually, I have an idea the night before, and draft most of the thoughts during my morning dog walk (dogs make good motivators for walks). I still need to craft (hopefully) coherent words and sentences forming the theme. The blog allows me to publish the writing with a minimal effort, and forces me to take editing a bit more seriously. The whole thing is an effort to improve my writing both in style and ease of production.
For some reason I’m having more “WTF” moments at work lately. Perhaps something is up, or I’m just paying attention to things. Yesterday we had a discussion about reviews, and people’s intense desire to avoid them. The topic came up because there have been numerous efforts to encourage and amplify technical review recently. There are a couple of reasons for this, mostly positive, but a tinge of negativity lies just below the surface. It might be useful to peel back the layers a bit and look at the dark underbelly.
I touched on this topic a couple of weeks ago.
The biggest problem with peer reviews is “bullshit reviews”. These are reviews mandated by organizations, for the organization. These always get graded, and the grades have consequences. The review teams know this, thus the reviews are always on a curve, a very generous curve. Any and all criticism is completely muted and softened because of the repercussions. Any harsh critique, even if warranted, puts the reviewers (and their compensation for the review) at risk. As a result of this dynamic, these reviews are quite close to a complete waste of time.
Because of the risk associated with the entire process, the organizations approach the review in an overly risk-averse manner, and control the whole thing. It ends up being all spin, and little content. Together with the dynamic created with the reviewers, the whole thing spirals into a wasteful mess that does no one any good. Even worse, the whole process has a corrosive impact on the perception of reviews. They end up having no up side; it is all down side and nothing useful comes out of them. All of this even though the risk from the reviews has been removed through a thoroughly incestuous process.
An element in the overall dynamic is the societal image of external review as a sideshow meant to embarrass. The congressional hearing is emblematic of the worst sort of review. The whole point is grandstanding and/or destroying those being reviewed. Given this societal model, it is no wonder that reviews have a bad name. No one likes to be invited to their own execution.
There are ways to improve the review environment we find ourselves in. First, people should be trained or educated in conducting, accepting, and responding to reviews. Despite its importance to the scientific process, we are never trained how to conduct, accept, or respond to a review (responding is covered a bit in a typical graduate education). Today, it is a purely experiential process. Next, we should stop including the scoring of reviews in any organizational “score”. Instead, the quality of the review, including the production of hard-hitting critique, should be expected as a normal part of organizational functioning.