Today the United States is the predominant power in the World, with its technological advantage leading the way. American technological superiority expresses itself in both economic and military power; whether through drones or the Internet, the United States sits on top of the heap. The technology that drives this supremacy is largely the product of military research conducted during the more than fifty years of the Cold War. The Internet, for instance, was born from a defense-related research project designed to enable communication during and after a nuclear conflict. The United States appears to be smugly holding its lead, almost as if it were part of the natural order. While little of this is arguable, the situation is not so rosy that the United States can lie back and assume it will persist indefinitely. Yet that is exactly what is happening, and it is an absolute risk to the Country.
Several factors contribute directly to the risk the USA is taking. A number of other nations are right behind the United States, and they are acting like it, aggressively investing in technology. The technology the United States depends upon is old, mature and far from the cutting edge. Most of it reflects the investments, risks and philosophy of 40 or 50 years ago, when the Cold War was at its height. With the Cold War fading from sight and victory at hand, the United States took the proverbial victory lap while pulling back from the basics that provided the supremacy.
A large part of this is a lack of aggressive pursuit of R&D and a remarkably passive, fear-based approach to investment and management. The R&D goals of excellence, innovation and risk have given way to acceptable mediocrity, incrementalism and safe bets. We have seen a wholesale change in the federal approach to supporting science. Almost without exception these changes have made the USA less competitive and have actively worked toward destroying the systems that once led the World. This is true for research institutions such as federal laboratories and universities. Rather than improving the efficiency or effectiveness of our R&D foundation, we have weakened it across the board. It is arguable that our political system has grown to take the USA's supremacy completely for granted.
Without returning to a fresh set of investments and a forward-looking philosophy, the United States can expect its superiority to fade in the next 20 years. It doesn't have to happen, but it will if something doesn't change. The issues have been brewing and building for my entire adult life. Americans have become literally and metaphorically fat and lazy, with a sense of entitlement that will be overthrown in a manner likely to range from profoundly disturbing to catastrophic. We have no one to blame other than ourselves. The best analogy for what is happening is a team trying to preserve its victory by sitting on the lead. We have gone into the prevent defense, which, as the saying goes, "only prevents you from winning" (if you like soccer, we have gotten a lead and decided to "park the bus," hoping our opponents won't score!).
The signs are everywhere: we don't invest in risky, far-out research; our old infrastructure (roads, bridges, power plants) is crumbling; and our new infrastructure is non-existent. Most other first-World nations are investing (massively) in modern, efficient Internet and telecommunications while we allow greedy, self-interested monopolies to starve our population of data. Our economy, and ultimately our National defense, will suffer from this oversight. All of these failures point to the same outcome: a weaker economy, weaker incomes, poorer citizens, and an unreliable and increasingly inferior defense. If things don't change we will fall from the summit and lose our place in the World.
To maintain a lead in technology and economic growth the Nation must aggressively fund research. This needs to happen in a wide range of fields and entail significant risk. Risk in research has been decreasing with each passing year. Perhaps the beginning of the decline can be traced to the attitude expressed by Senator William Proxmire, who went to great lengths, via his Golden Fleece Awards, to embarrass scientific research he didn't understand or value. In doing so he did an immense disservice to the Nation. Proxmire is gone, but his attitude is stronger than ever. The same things are true for investing in our National infrastructure: we need aggressive maintenance and far-sighted development of new capabilities. Our current political process does not value our future and will not invest in it. Because of this our future is at risk.
Another key sign of our anxiety about holding onto our lead is the expansion of government secrecy and classification. This expansion is a direct result of the post-9/11 World, but also of fears of losing our advantage. Where science and technology are concerned, the approach depends upon the belief that hiding the secrets can keep an adversary from solving the same problems we have. In cases where elements of the secret make it unique, this is a completely reasonable approach; where the knowledge is more basic, however, the whole approach is foolhardy. Beyond basic classification, there is an entire category of classification that is "off the books": the designation of documents as "Official Use Only," which removes them from consideration under the Freedom of Information Act. This designation is exploding in use. While it has a reasonable purpose, it is quite often used as another de facto classification. It lacks the structure and accountability that formal classification has. It is unregulated and potentially very dangerous.
The place where this poses the greatest danger is "export control," which is a form of "Official Use Only." In most cases standard classification is well controlled and highly technically prescribed; export control has almost no guidance whatsoever. The information falling under export control is much less dangerous than classified information, yet the penalties for violating the regulations are much worse. Along with the more severe penalties comes almost no technical guidance for determining what is export controlled. Together this is a recipe for disaster. It is yet another area where our lawmakers are utterly failing the Nation.
Ultimately the worst thing the United States does is allow extreme over-confidence to infect its decision-making. Just because the United States has overwhelming technological superiority today does not guarantee it for the future. As I noted above, the superiority of today is based on the research of decades ago. If the research is not happening today, the superiority of the future will fade away. This is where the devotion to secrecy comes in. There is the misbegotten belief that we can simply hide the source of our supremacy, which is the equivalent of sitting on a lead and playing "prevent" defense. As we know, the outcome of that strategy is often the opposite of what was intended. We are priming ourselves to be overtaken and surprised; we can only pray that the consequences will not be catastrophic and deadly.
The way to hold onto a lead is to continue doing the things that provided the advantage in the first place. Aggressive, risk-taking research blending open-ended objectives with real-world problems is the recipe we followed in the past. It is time to return to that approach, and drop the risk-averse, cautious, over- and micro-managed, backward-looking approach we have taken over the past quarter century. The path to maintaining supremacy is crystal clear; it is only a matter of following it.

Traveling gives a fresh perspective on how others live. Omaha offers me the chance to see a real sunrise that the Sandia Mountains deny me. Omaha also seems to eschew the practice of supplying sidewalks for its citizens. This is irritating given my newfound habit of walking every morning, and might explain part of the (big) red state propensity toward obesity.
Everything depends on a strong workforce in science and technology. General Klotz described the NNSA support for re-capitalizing the facilities as central to this, and he reiterated the importance of the workforce several times. From my perspective we are failing at this goal, and failing badly. The science that the United States depends on is in virtual free fall. Our military supremacy rests on the science of 20-40 years ago, and the pipeline is increasingly empty. We have fallen behind Europe, and may fall behind China in the not-too-distant future. The entire scientific establishment is receding from prominence, in large part due to a complete lack of leadership and a compelling mission as a Nation. It is a crisis. It is a massive threat to National security. The concept of deterrence by capability used to be important. It is now something we cannot defend because our capabilities are in such massive decline. It needs to come back; it needs to be addressed with an eye towards recapturing its importance. Facilities are no replacement for a vibrant scientific elite doing cutting-edge work, yet today, for some reason, we seem to accept them as such.
In practice, computational experiments often precede the math. The math provides rigor, explanation and bounds for applying techniques. This reflects upon our considerations of where the balance of effort should be placed in driving innovative solutions. Generally speaking, I would posit that computational experimentation should come first, followed by mathematical rigor, followed by more experimentation, and so on. This structure is often hidden by the manner in which mathematics is presented in the literature.
In developing the history of CFD I am trying to express a broader perspective than currently exists on the topic. Part of that perspective is defining the foundation that existed before computational science was even a conceptual leap in Von Neumann's mind. I knew that a number of numerical methods already existed, including the integration of ODEs (the work of Runge, Kutta, Adams, Bashforth, etc.). One of Von Neumann's great contributions to numerical methods was stability analysis, and now I'm convinced it was even greater than I had imagined.
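To make Von Neumann's idea concrete, here is a minimal sketch of such a stability analysis in Mathematica, applied to the forward-time centered-space (FTCS) discretization of the advection equation u_t + a u_x = 0 (the symbol names nu = a dt/dx and th = k dx are mine, chosen for illustration):

    (* Von Neumann stability analysis of FTCS for u_t + a u_x = 0. *)
    (* Substitute the Fourier mode u_j^n -> g^n Exp[I j th]; nu = a dt/dx, th = k dx. *)
    g = 1 - nu/2 (Exp[I th] - Exp[-I th]);   (* amplification factor, g = 1 - I nu Sin[th] *)
    ampSq = FullSimplify[ComplexExpand[g Conjugate[g]]]
    (* Output: 1 + nu^2 Sin[th]^2, which exceeds 1 for any nu != 0, *)
    (* recovering the classic result that FTCS is unconditionally unstable. *)

A few lines of symbolic computation recover what once took laborious hand analysis, which is precisely why this machinery is so useful for studying schemes.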
For those who died early (e.g., Von Neumann), no such retrospective is available.
What is lost from the literary record is profound. Often the greatest discoveries in applied math come from trying a well-crafted heuristic on a difficult problem and finding that it works far better than could be expected. The math then comes in to provide an ordered, structural explanation for the empirical observation. Lost in the fray is the fact that the device was heuristic, perhaps a leap or an inspiration from some other source. In other cases progress comes from a failure or problem with something that should work; we then explain why it doesn't in a rigorous fashion with a barrier theorem. These barrier theorems are essential to progress. The math then forms the foundation for the next leap. The problem is that the process is undocumented, and this ill-prepares the uninitiated for how to make the next leap. Experimentation and heuristics are key, and often the math only follows.
Too often we confine ourselves to the things we already know how to do. We need methods that work, and we invent math that explains the things that work. A more fruitful path would involve working hard to solve problems we don't know how to attack, finding some fruitful avenues for progress, and then trying to systematically explain that progress. Along the way we might try being a bit more honest about how the work was accomplished.
Sometimes this blog is about working out the things that bug me, in a hopefully articulate way. I've spent most of the last month going to scientific meetings and seeing a lot of technical talks, and one of the things that bugs me the most is finite element methods (FEM), or more specifically the way FEM is presented. There really isn't a lot wrong with FEM per se; it's a fine methodology that might even be optimal for some problems. I can't really say, because its proponents so often do such an abysmal job of explaining what they are doing and why. That is the crux of the matter.
Scientific talks on the finite element method tend to be completely opaque, and I walk out of them knowing less than I walked in. The talks are often given in a manner that seems to intentionally obscure the topic, with the apparent objective of making the speaker seem much smarter than they actually are. I'm not fooled. The effect they actually achieve is to piss me off and cause me to think less of them. Presenting a simple problem in an intentionally abstract and obtuse way is simply a disservice to science. It serves no purpose but to make the simple grandiose and distant. It ultimately hurts the field, deeply.
Instead FEM research is increasingly focused on elliptic PDEs, which are probably the easiest thing to solve in the PDE world. In other words, if you can solve an elliptic PDE well, I have learned very little about your methodology's capacity to attack the really hard, important problems. It is nice, but not very interesting (the very definition of necessary and insufficient). Frankly, the desire and interest in taking a method designed for solving hyperbolic PDEs, such as discontinuous Galerkin, and applying it to elliptic PDEs is worthwhile, but it is not important enough to deserve the copious attention it is getting.
Where FEM excels is in abstracting geometry from the method and including geometric detail in the simulation within a unified framework. This is extremely useful and explains the popularity of FEM for engineering analysis, where geometric detail is important, or assumed to be important. Quite often, though, innovative methodology is shoehorned into FEM after having been invented and perfected in the finite volume (or finite difference) world. Frequently the innovative devices have to be severely modified to fit FEM's dictates, and these modifications usually diminish the overall effectiveness of the innovations relative to their finite volume or finite difference forebears. These innovative devices are necessary to solve the hard multiphysics problems often governed by highly nonlinear hyperbolic (conservation or evolution) equations. I personally would be more convinced by FEM if some of the innovation happened within the FEM framework instead of continually being imported.
In a sense the divide is defined by whether you don't assume regularity and add it back, or assume it is there and take measures to deal with it when it's not. Another good example comes from the use of FEM for hyperbolic PDEs, where conservation form is important. Conservation is essential, and the weak form of the PDE should give conservation naturally. Instead, with the most common Galerkin FEM, a careless implementation can destroy conservation. This should not happen; conservation should be a constraint, an invariant that comes for free. It does with FVM, it doesn't with FEM, and that is a problem. Simple mistakes should not cause conservation errors. In FVM such a mistake would be structurally impossible because of how the method is coded: the conservation form is built in. In FEM conservation is a special property, which is odd for something built on the weak form of the PDE. This goes directly to the continuous basis selected in the construction of the scheme.
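To illustrate the structural point, here is a small Mathematica sketch (the cell count and the names F, dt, dx are mine, for illustration): summing a flux-form FVM update over a strip of cells telescopes the interior fluxes, so the total can only change through the boundary terms, no matter how the interface flux is computed.

    (* Flux-form update: u_j^{n+1} = u_j^n - dt/dx (F[j+1/2] - F[j-1/2]). *)
    (* Summing the change over cells j = 1..n telescopes the interior fluxes. *)
    n = 10;
    change = Sum[-(dt/dx) (F[j + 1/2] - F[j - 1/2]), {j, 1, n}];
    Simplify[change]
    (* Output: -((dt (F[21/2] - F[1/2]))/dx) -- only the boundary fluxes *)
    (* survive, so conservation holds for any interface flux function F. *)

No such identity holds automatically for a Galerkin discretization unless the basis and implementation are chosen with conservation in mind.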
Conferences are great opportunities to learn about what is going on around the World, get lots of new ideas, meet old friends and make new ones. It is exactly what I wrote about a few weeks ago: giving a talk is second, third or even fourth on the list of reasons to attend such a meeting.

Their economy is an actual disaster. Frankly, the USA looks much worse by comparison with a supposedly recovering economy. There are private security guards everywhere, and the amount of security at the meeting was actually a bit distressing. In contrast, in a week at a hotel across the street from the hospital, I heard exactly one siren. Amazing. As usual, getting away from my standard environment is thought-provoking, which is always a great thing.
We can now apply the same machinery to more complex schemes. Our first example is the time-space coupled version of Fromm's scheme, which is a second-order method. Conducting the analysis is largely a matter of writing the numerical scheme in Mathematica, much as we would write the method into a computer code.











This is readily achieved by using symbolic or numerical packages such as Mathematica. Below I've included the Mathematica code used for the analyses given above.
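A minimal sketch of such an analysis, assuming Fromm's scheme written as the average of the Lax-Wendroff and Beam-Warming updates (the names s, nu and th are mine, for illustration):

    (* Von Neumann analysis of Fromm's scheme for u_t + a u_x = 0; nu = a dt/dx. *)
    (* s[m] is the Fourier symbol of u_{j+m}^n; the common g^n Exp[I j th] factor cancels. *)
    s[m_] := Exp[I m th];
    lw = s[0] - nu/2 (s[1] - s[-1]) + nu^2/2 (s[1] - 2 s[0] + s[-1]);
    bw = s[0] - nu/2 (3 s[0] - 4 s[-1] + s[-2]) + nu^2/2 (s[0] - 2 s[-1] + s[-2]);
    g = Simplify[(lw + bw)/2];                   (* amplification factor g(th) *)
    ampSq = FullSimplify[ComplexExpand[g Conjugate[g]]];
    Plot[Sqrt[ampSq] /. nu -> 1/2, {th, 0, Pi}]  (* |g| <= 1 for 0 <= nu <= 1 *)

The same pattern extends to any linear scheme: write the stencil, substitute the Fourier mode, and simplify the amplification factor.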
The meeting was held at the Palmer House, an absolutely stunning venue swimming in old-fashioned style and grandeur. It is right around the corner from Millennium Park, one of the greatest urban green spaces in existence, which is itself across the street from the Art Institute. What an inspiring setting for a meeting. Chicago itself is one of the great American cities, with a vibrant downtown and numerous World-class sights.
Science is declining in importance to the overall National enterprise, and applied mathematics is suffering likewise. This isn't merely an issue of funding, which is relatively dismal, but of overall direction and priority. In total, we aren't asking nearly enough from science, and mathematics is no different. The fear of failure is keeping us from collectively attacking society's most important problems. The distressing part of all of this is the importance and power of applied mathematics and the rigor it brings to science as a whole. We desperately need some vision moving forward.
Consider the work of Peter Lax on hyperbolic conservation laws. He laid the groundwork for stunning progress in modeling and simulating with confidence and rigor. There are other examples, such as the mathematical order and confidence of Harten's total variation diminishing theory, which powered the penetration of high-resolution methods into broad usage for solving hyperbolic PDEs. Another is the power and confidence brought to the solution of ordinary differential equations, or to numerical linear algebra, by the mathematical rigor underlying the development of software. These are examples where the presence of applied mathematics makes a consequential and significant difference in the delivery of results with confidence and rigor. Each is an example of how mathematics can unleash a capability in truly "game-changing" ways. A real concern is why this isn't happening more broadly or in a targeted manner.



This may sound rather hyperbolic, and it is. The issue is that the leadership of the Nation is constantly stoking the fires of irrational fear as a tool to drive political goals. By failing to aspire toward a spirit of shared sacrifice and duty, we are creating a society that looks to avoid anything remotely dangerous or risky. The consequences of this cynical form of gamesmanship are slowly ravaging the United States' ability to be a dynamic force for anything good. In the process we are sapping the vitality that once brought the Nation to the head of the international order. In some ways this trend is symptomatic of our complacency as the sole military and economic superpower of the last half of the 20th Century. The fear is drawn from the societal memory of our fading role in the World, and the evolution away from the unipolar power we once represented.
Our actions in the region have compounded this. Supposedly ISIS is worse than Al Qaeda, and we should be afraid; the idea is that you will be so afraid that you will demand action. In fact, that hamburger you are stuffing into your face is a much larger danger to your well-being than ISIS will ever be. Worse yet, we put up with the fear-mongers, whose fear-baiting is aided and abetted by the news media because they see ratings. When we add up the costs, this chorus of fear is savaging us and hurting our Country deeply.
Entering the United States is now more arduous than entering the former Soviet Union (Russia). This fact ought to be absolutely appalling to the American psyche. Meanwhile, numerous bigger threats go completely untouched by any action or effort to mitigate their impact.
When did all this start? I tend to think the tipping point was the mid-1970s. This era was extremely important for the United States, with a number of psychically jarring events taking center stage. The upheaval of the 1960s had turned society on its head with deep changes in racial and sexual politics. The Vietnam War had undermined the Nation's innate sense of supremacy while scandal ripped through the government. Faith and trust in the United States took a major hit. At the same time the era marked the apex of economic equality, and the beginning of the trends that have undermined it ever since. This underlying lack of faith and trust in institutions has played a key role in powering our decline. The anti-tax movement, which set in motion the public policy driving the growing inequality in income and wealth, began then, arising from these very forces. These forces coupled with insecurities about national defense, gender and race to form the foundation of the modern conservative movement. These fears have been used over and over to drive money and power into the military-intelligence-industrial complex at a completely irrational rate.


There is a gap, but it isn't measured in terms of FLOPS, CPUs or memory; it is measured in terms of our practice. Our supercomputers have lost touch with reality. Supercomputing needs to be connected to a real, tangible activity where the modeling assists experiments, observations and design in producing something that serves a societal need. These needs could be anything from national defense, cyber-security and space exploration to designing more fuel-efficient aircraft or safer, more efficient energy production. The reality we are seeing is that each of these has become secondary to the need for the fastest supercomputer.

The hardware itself has become the focus. This has led to a diminished focus on algorithms and methods, which actually have a better track record than Moore's law for improving computational problem-solving capability. The consequence of this misguided focus is a real diminishment in our actual capability to solve problems with supercomputers. In other words, our quest for the fastest computer is ironically undermining our ability to use computers as effectively as possible.
We should work steadfastly to restore the balance and perspective necessary for success. We need to allow risk to enter our research agenda and set more aggressive goals. Commensurate with this risk, we should provide greater freedom and autonomy to those striving for the goals. Supercomputing should recognize that the core of its utility is computing as a problem-solving approach that relies upon hardware for success. There is an unfortunate tendency to simply declare supercomputing a national security resource regardless of the actual utility of the computer for problem solving. These claims border on being unethical. We need computers that are primarily designed to solve important problems. Problems don't become important because a computer can solve them.