• About The Regularized Singularity

The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

The Regularized Singularity

Category Archives: Uncategorized

Peter Lax’s Philosophy About Mathematics

25 Thursday Jun 2015

Posted by Bill Rider in Uncategorized

≈ 4 Comments

 

Linearity Breeds Contempt

—Peter Lax

A few weeks ago I went into my office and found a book waiting for me. It was one of the most pleasant surprises I’ve had at work in a long while: a biography of Peter Lax written by Reuben Hersh. Hersh is an emeritus professor of mathematics at the University of New Mexico (my alma mater) and a student of Lax at NYU. The book was a gift from my friend Tim Trucano, who knew of my high regard and deep appreciation for the work of Lax. I believe that Lax is one of the most important mathematicians of the 20th Century, and he embodies a spirit that is all too lacking in current mathematical work. It is Lax’s steadfast commitment to great pure math as a vehicle for producing great applied math that the book explicitly reports and implicitly advertises. Lax never saw a divide between the two; he saw complete compatibility between them.

The publisher is the American Mathematical Society (AMS) and the book is a wonderfully technical and personal account of the fascinating and influential life of Peter Lax. Hersh’s account goes far beyond the obvious public and professional impact of Lax into his personal life and family, although these are colored greatly by the greatest events of the 20th Century. Lax also has a deep connection to three themes in my own life: scientific computing, hyperbolic conservation laws and Los Alamos. He was a contributing member of the Manhattan Project despite being a corporal in the US Army and only 18 years old! Los Alamos, and John von Neumann in particular, had an immense influence on his life’s work, with the fingerprints of that influence all over his greatest professional achievements.

In 1945 scientific computing was just being born, having provided an early example in a simulation of the plutonium bomb the previous year. Von Neumann was a visionary in scientific computing, having already created the first shock capturing method and realized the necessity of tackling the solution of shock waves through numerical investigations. The first real computers were a twinkle in Von Neumann’s eye. Lax was exposed to these ideas and, along with his mentors at New York University (NYU), Courant and Friedrichs, soon set out making his own contributions to the field. It is easily defensible to credit Lax as one of the primary creators of the field of Computational Fluid Dynamics (CFD), along with Von Neumann and Frank Harlow. All of these men had a direct association with Los Alamos and access to the computers, resources and applications that drove the creation of this area of study.

Lax’s work started with his thesis at NYU, and continued with a year on staff at Los Alamos from 1949 to 1951. It is remarkable that upon leaving Los Alamos to take a professorship at NYU his vision of the future technical work in the area of shock waves and CFD had already achieved remarkable clarity of purpose and direction. He spent the next 20 years filling in the details and laying the foundation for CFD for hyperbolic conservation laws across the world. He returned to Los Alamos every summer for a while and encouraged his students to do the same. He always felt that the applied environment should provide inspiration for mathematics, and the problems studied at Los Alamos were weighty and important. Moreover he was a firm believer in the cause of the defense of the Country and its ideals. Surely this was a product of being driven from his native Hungary by the Nazis and their allies.

Lax also comes from a Hungarian heritage that provided some of the greatest minds of the 20th Century, with Von Neumann and Teller being standouts. Their immense intellectual gifts were driven westward to America by the incalculable hatred and violence of the Nazis and their allies in World War 2. Ultimately, the United States benefited by providing these refugees sanctuary against the forces of hate and intolerance. This among other things led to the Nazis’ defeat and should provide an ample lesson regarding the values of tolerance and openness.

The book closes with an overview of Lax’s major areas of technical achievement in a series of short essays. Lax received the Abel Prize in 2005 because of the depth and breadth of his work in these areas. While hyperbolic conservation laws and CFD were foremost in his resume, he produced great mathematics in a number of other areas. In addition he provided continuous service to NYU and the United States in broader scientific leadership positions.

Before laying out these topics the book makes a special effort to describe Lax’s devotion to the creation of mathematics that is both pure and applied. In other words, beautiful mathematics that stands toe to toe with any other pure math, but also has application to problems in the real world. He has an unwavering commitment to the idea that applied math should be good pure math too. The two are not in any way incompatible. Today too many mathematicians are keen to dismiss applied math as a lesser topic, beneath pure math as a discipline.

This attitude is harmful to all of mathematics and the root of many deep problems in the field today. Mathematicians far and wide would be well-served to look to Lax as a shining example of how they should think, solve problems, be of service and contribute to a better World.

…who may regard using finite differences as the last resort of a scoundrel that the theory of difference equations is a rather sophisticated affair, more sophisticated than the corresponding theory of partial differential equations.

—Peter Lax

13 Things that produce a mythical or legendary code

19 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

After all, I believe that legends and myths are largely made of ‘truth’.

― J.R.R. Tolkien

For the purposes of this post, “Code” = “Modeling & Simulation tool” rather than simply the set of instructions in a programming language. Some codes are viewed as being better and more worthy of trust than others. The reasons for such distinctions are many and varied, but most often vague and clouded in mystery. I hope to shed some light on the topic.

Legend does not contradict history. It preserves the fundamental fact but magnifies and embellishes it.

― Adrien Rouquette

Often codes become legendary simply by being the first to achieve success with a difficult and important problem. In other cases the authors of the code are responsible for the code’s mythic status. Certain authors of codes bring with them a pedigree of achievement that breeds confidence in the product of their work. That product is computer instructions, or code, which comprises an executable tool. It is a combination of a model of physical reality, a method to solve that model including algorithms developed to optimize the method, and auxiliary software that connects the code to the computer itself. Together with the practices of the users of the code, and options enabled by the code itself, the modeling capability is defined. This capability is then applied to problems of interest, and success occurs if the comparisons with observations of reality are judged to be high quality.

Computers are good at following instructions, but not at reading your mind.

—Donald Knuth

What sorts of things produce a code of legend and myth?

Myth must be kept alive. The people who can keep it alive are the artists of one kind or another.

― Joseph Campbell

  1. The code allows new problems to be solved, or solved correctly. Often a new interface or setup capability is key to this capacity as well as new models of reality. This has the same dynamic as discovery does in other areas. Being first is an incredibly empowering aspect of work and often provides a de facto basis for success.
  2. The code allows problems to be solved better than before by whatever standard is used by the community. Sometimes being first is not enough because the quality of solution isn’t good enough. The discovery ends up being delayed until the results are good enough to be useful. As such, success is related to the quality of results and the expectations or standards of a technical community.
  3. The code solves a standing problem that hasn’t been solved before, or solves it to a degree that instills confidence. Sometimes problems are widely acknowledged and become standing challenges. When someone creates a tool that provides an effective solution to this sort of problem, it creates a “buzz” and gives the code the push it needs for broader adoption.
  4. The code is strongly associated with success in application space (quality by association). If the code is strongly associated with a successful application product, the code can inherit its virtue. Usually this sort of success will be strongly associated with an institution or national program (like ICF, inertial confinement fusion). The code’s success can persist for as long as the application’s success, or in some cases outlast it.
  5. The code is reliable (robust) and produces useful results as a matter of course. In some areas of modeling and simulation codes are fragile, or too fragile to solve problems of interest. In such cases a code will make a breakthrough when it simply runs problems to completion and the results are physically or conceptually plausible. Depending on the situation, this lowly standard will then transition to other forms of success as the standards for solution improve.
  6. The code produces physically reasonable solutions under difficult circumstances. This is similar to the robustness virtue, but a bit better. Sometimes robustness is achieved by producing really terrible solutions (often very heavily diffused, or smeared out). This often destroys significant aspects of the solution. A better answer without such heavy-handed methods will yield the code new followers who evangelize its use, or perhaps embarrass those holding onto the past.
  7. The code is associated with someone with a pedigree, such as an acknowledged trailblazer in a field seminal to the application or to the code’s specialties. This is praise by association. Someone who is a giant in a field will produce a wake of competence, which is almost completely associated with a cult of personality (or personal achievement).

    Frank Harlow with Jacob Fromm

  8. The code’s methods are uniquely focused on the application problem area and not generalized beyond it. Sometimes a code is so focused in an important niche area that it dominates the analysis like no general-purpose code can. Often this means that the code caters to the basic needs of the analysis specifically and provides a basis of solution of application-specific problems that no general-purpose code can compete with.
  9. The code solves a model of reality that no other code can. In other cases, the code has models no other code provides. These models can be enabling because standard models are not sufficient to explain reality (i.e., fail validation). The new model may require some unique methodology for its solution, which together with the model provide a distinct advantage.
  10. The code is really fast compared to alternatives. For a lot of analysis questions it is important to be able to run the code many times. Analysts like getting answers faster rather than slower, and a quick turn-around time is viewed as a great benefit. If a code takes too long to get an answer, the ability to fully explore problems via parameter or scenario variation can be negatively impacted.
  11. The code’s solutions are amenable to analysis, or comparisons to observations are enabled. This has a lot more to do with the auxiliary analysis than the code itself. A code that has good visualization or data analysis built into its analysis system can provide significant encouragement for the use of the code itself.
  12. The code produces results that are comfortable to the community, or defines the standard for the community. Sometimes the code simply either meets or sets the expectations for the community using it for analysis. If it confirms what they tend to believe already, the analysts have greater comfort using the code.
  13. The code’s methodology is comfortable to the community (and its intrinsic bias). For example the model and its solution are solved in a Lagrangian frame of reference, and the community only trusts Lagrangian frame solutions.

Storytellers seldom let facts get in the way of perpetuating a legend, although a few facts add seasoning and make the legend more believable.

― John H. Alexander

Sometimes a code has one or more of these items going for it. Once the code becomes used and trusted, it is the incumbent and very difficult to displace from usage. This is true even when unambiguously better methods exist. This is just a fact of life.

Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.

— Rich Cook

Why do we do this stuff?

12 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

The road to Hell is paved with the best of conscious intentions.

― Elizabeth F. Howell

Let me get to one of the key punch lines for this post: no amount of mesh refinement, accuracy or computer speed can rescue an incorrect model. The entire reason for doing modeling and simulation is impacting our understanding of, or response to, the reality of the Universe. The only fix for a bad model is a better model. Better models are not something we are investing much effort in. This gets to a fundamental imbalance in high performance computing, where progress is now expected to come almost purely through improvements in the performance of hardware.
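
As a toy illustration of this point (my sketch, not from the post): integrate dy/dt = -k*y with forward Euler, but give the model a deliberately wrong rate constant. The discretization error shrinks as the time step is refined, while the error against the true answer saturates at the model error.

import numpy as np

def euler_decay(k, dt, t_end=1.0, y0=1.0):
    """Forward Euler for dy/dt = -k*y; returns y(t_end)."""
    n = int(round(t_end / dt))
    y = y0
    for _ in range(n):
        y += -k * y * dt
    return y

k_true, k_model = 1.0, 0.9         # hypothetical: the model uses the wrong rate constant
y_exact = np.exp(-k_true * 1.0)    # the true answer for the true model

for dt in [0.1, 0.05, 0.025, 0.0125]:
    y_h = euler_decay(k_model, dt)
    discretization_err = abs(y_h - np.exp(-k_model * 1.0))   # shrinks like O(dt)
    total_err = abs(y_h - y_exact)                           # saturates at the model error
    print(f"dt={dt:7.4f}  discretization={discretization_err:.2e}  total={total_err:.2e}")

No amount of refinement pushes the total error below the gap between the model and reality.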

Success doesn’t come to you; you go to it.

― T. Scott McLeod

If the field were functioning in a healthy manner, the dynamic would be fluid and flexible. Sometimes a new model would spur developments in methods and algorithms for its solution. This would ultimately spill down to software and hardware developments. A healthy dynamic would also manifest itself in the need for improvements in software and hardware to allow for solutions of meaningful models. The issue at hand today is the 20-year history of emphasis on hardware and its inability to yield progress as promised. It is time to recognize that the current trajectory is imbalanced and needs significant alteration to achieve progress commensurate with what has been marketed to society at large.

There can be no ultimate statements in science: there can be no statements in science which cannot be tested, and therefore none which cannot in principle be refuted, by falsifying some of the conclusions which can be deduced from them.

― Karl Popper

Modeling and simulation has become an end unto itself and lost some of its connection to its real reason for being done. The reason we conduct modeling and simulation is to understand, explain or influence reality. All of science has the objective of both uncovering the truth of the Universe and allowing man to apply some measure of control to it. As with most things practiced as an art, modeling and simulation is a deep field, combining many disparate disciplines toward its accomplishment. This depth allows practitioners to lose track of the real purpose, and to focus on the conduct of science to the exclusion of its application.

Science has an unfortunate habit of discovering information politicians don’t want to hear, largely because it has some bearing on reality.

― Stephen L. Burns

Why would any of this make a difference?

Losing sight of the reason for conducting an activity causes a loss of the capacity to best utilize the field to make a difference. Science has a method, and its manner of conduct is important to keep in mind. Computational science is a bridge between the reality of physics and engineering and the computers that enable it. The biggest issue is the loss of perspective on what really determines the quality of modeling and simulation. Our current trajectory is focused almost exclusively on the speed of the computer as the route to quality. We have lost the important perspective that no computer can save a lousy model. It just assures a more expensive, higher fidelity wrong solution.

The quest for absolute certainty is an immature, if not infantile, trait of thinking.

― Herbert Feigl

The wrong solutions we are getting are not terrible, just limited. Science works properly when there is a creative tension between experiments and theory. Theory can be powered by computing, allowing the solution of models impossible without it. Experiments must test these theories, either by being utterly new or by employing better diagnostics. Without the experiment to test, confirm or deny, theory can rot from within, essentially losing connection with reality. This fate is befalling our theories today by fiat. Our models are almost assumed to be correct and not subject to rigorous testing. More powerful computers are simply assumed to yield better answers. No impetus is present to refine or develop better models even where all evidence points toward their utter necessity.

…if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

― Richard P. Feynman

Applied mathematics is a closely related field where the same slippage from reality is present. Again the utility of applied mathematics is distinctly like that of computing; it is utterly predicated upon the model’s quality vis-à-vis reality. In the period from World War 2 until around 1990, applied mathematics eagerly brought order and rigor to modeling, simulation and related activities. It became an able and honored partner in the advance and practice of science. Then it changed. It began to desire a deeper sense of mathematical honor, as pure mathematics had in its eyes. In doing so applied math turned away from being applied and toward being governed by purely mathematical qualities. The lack of balance has emptied applied math’s capacity to advance science. The same has happened with computing. We are all poorer for it.

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

All of this may be overcome and the balance may be resurrected. All that is needed is to reconnect these fields with application. Application is a Gordian knot whose very nature powers science. Without the riddle and difficulty of application, the fields lose their vigor. The vigor is powered by attempting the solution of seemingly intractable problems. Without the continual injection of new ideas, the science cannot prosper. Such prosperity is currently being denied by a lack of connectivity to the very reality the fields discussed here could help to master. Such mastery is being denied by the lack of faith in our ability to take risks.

The intention (of an artist) is (the same as a scientist)…to discover and reveal what is unsuspected but significant in life.

― H W Leggett

Bad models abound in use today. A lot of them should be modified or discarded, but in today’s direction for scientific computing, we are simply claiming that a faster computer will open the door to solution. Many idealized equation sets used in modeling yield intrinsically unphysical solutions. The Euler equations without dissipation are a prime example. Plasma physics is yet another place where unphysical models are used because the dissipation mechanisms are small. In macroscopic models dissipation is omnipresent, and it leads to satisfaction of the second law of thermodynamics. Ideal equations remove this essential aspect of modeling by fiat.
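
A minimal sketch of the role dissipation plays, using the inviscid Burgers equation as a stand-in for the ideal equation sets discussed above (my choice of example, not the post’s): a dissipation-free central-difference scheme oscillates and blows up at a shock, while a first-order upwind scheme, whose numerical dissipation mimics the missing physical mechanism, produces the physically sensible shock.

import numpy as np

nx, nt = 200, 150
dx, dt = 1.0 / nx, 0.4 / nx            # CFL ~ 0.4 for max |u| = 1
x = np.linspace(0.0, 1.0, nx)
u0 = np.where(x < 0.3, 1.0, 0.0)       # step data: a right-moving shock

def flux(u):
    return 0.5 * u * u                 # Burgers flux f(u) = u^2/2

u_up = u0.copy()   # first-order upwind: numerical dissipation stands in for physics
u_c = u0.copy()    # central differences: no dissipation at all

for _ in range(nt):
    # conservative upwind update (valid here because u >= 0 everywhere)
    u_up[1:] -= dt / dx * (flux(u_up[1:]) - flux(u_up[:-1]))
    # dissipation-free central update (forward Euler in time)
    u_c[1:-1] -= dt / (2 * dx) * (flux(u_c[2:]) - flux(u_c[:-2]))

print("upwind  min/max:", u_up.min(), u_up.max())   # stays in [0, 1], shock smeared
print("central min/max:", u_c.min(), u_c.max())     # oscillates and grows (unstable)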

The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

—Sir Arthur Stanley Eddington

In no place is this more heavily loaded with context than turbulence. There is a misbegotten belief that solving the incompressible Navier-Stokes equations will unveil the secrets of turbulence. Incompressibility is fundamentally unphysical and may remove fundamental aspects of turbulence through its invocation. Incompressibility implies an infinite speed of sound and a lack of thermodynamics. Connections between the incompressible and compressible equations only exist for adiabatic (dissipation-free) flows. Turbulence is characterized by dissipation even in the limit of vanishing viscosity, which implies derivative singularities in the flow. Compressible fluids have this character, and its nature is highly associated with the details of thermodynamics. Incompressible flows have not been clearly associated with this character, and the lack of thermodynamics is a likely source of this failing.

The plural of anecdote is not data.

― Marc Bekoff

Another aspect of our continuum modeling codes is the manner of describing material response. We tend to describe materials in a homogeneous manner; that is, we “paint” them into a physical region of the problem. All the aluminum in a problem will be described by the same constitutive laws without regard to the separation between the scales of the computational mesh and the physical scales in the material. This approach has been around for over 50 years and shows no signs of changing. It is long past time for this to have changed.

It is more important to be of pure intention than of perfect action.

― Ilyas Kassam

The key is to apply the scientific method with rigor and vigor. Right now scientific computing has vacated its responsibility to apply the scientific method appropriately. Too often modeling and simulation are touted as being the third leg of science, equal to theory and experiment. Modeling should always be beholden to experimental and field observation, and should the model be found to be in opposition to the observation, it must be found faulty. Modeling is rather an approach to more generally and broadly find solutions to theory. Thus theory can be extended to more nonlinear and complex models of reality. This should aid the ability of theory to describe the physical universe. Often simulation can act as a laboratory for theory, where suppositional theory can be tested for congruence with observation (computational astrophysics is a prime example of this).

Intention is one of the most powerful forces there is. What you mean when you do a thing will always determine the outcome. The law creates the world.

― Brenna Yovanoff

The bottom-line question is whether we are presently on a path that allows modeling and simulation to take its proper place in impacting reality, or in explaining reality as part of the scientific method. I think the answer today is a clear and unequivocal no. A combination of modern-day political correctness regarding the power of computational hardware, over-selling of computing, fear and risk avoidance leads to this. Each of these factors needs to be overcome to place us on the road to progress.

The tiniest of actions is always better than the boldest of intentions

― Robin Sharma

What needs to happen to make things better?

  • Always connect the work in modeling and simulation to something in the real world,
  • Balance effort with the benefit to the real world,
  • Find a way to give up on determinism to an appropriate degree and model the degree of variability seen in reality,
  • Do not overemphasize the capacity of computational power to simply solve problems by fiat,
  • Take risks, especially risks that have a high chance of failure but large payoffs,
  • Allow glorious failure and reward risk-taking if done in a technically appropriate manner,
  • Recognize that new methods and algorithms provide the potential for quantum improvements in efficiency and accuracy as well as the promise of new uses for computational models,
  • Do not starve any single aspect of modeling and simulation of attention, as every part of this ecosystem must be healthy to achieve progress in predictive science,
  • Stop settling for legacy models, methods and codes just because they are “good enough”; focus on quality and excellence.

In the republic of mediocrity, genius is dangerous.

― Robert G. Ingersoll

The Best Computer

05 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ 2 Comments


What’s the “best” computer? By what criteria should a computer be judged? Best for what? Is it the fastest? Or the easiest to use? Or the most useful?

The most honest answer is probably the most useful, or impactful computer in how I live my life or work, so I’ll answer in that vein.

Have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

― Steve Jobs

Details matter, it’s worth waiting to get it right.

― Steve Jobs

If I had to answer honestly, it’s probably the latest computer I bought, my new iPhone. It’s an absolute marvel. It is easy to use and useful all at once. I have a vast array of applications to use, plus I can communicate with the entire World and access an entire World’s worth of information. I can access maps, find a place to eat lunch, take notes, access notes, find out the answers to questions, keep up with friends, and make new ones. It also allows me to listen to music either stored or via “radio”. It is so good that I am rarely without it. It helps me work out at the gym with an interval timer that I can program to develop unique tailored workouts. Anything that links to the “cloud” for data is even better because the data on the iPhone is the same as on the other platforms I use. The productivity and efficiency that I can work with is now simply stunning. The word awesome doesn’t quite do it justice. If you gave it to me ten years ago, I’d have thought aliens delivered the technology to humans.

We don’t get a chance to do that many things, and every one should be really excellent. Because this is our life.

― Steve Jobs

The fastest computer I have access to isn’t very good, or useful. It is just fast and really hard to use. In all honesty it is a complete horror show. For the most part this really fast computer is only good for crunching a lot of numbers in a terribly inefficient manner. It isn’t merely not a multi-purpose computer; it is a single-purpose computer that is quite poor at delivering that single purpose. Except for its speed it compares poorly to the supercomputers I used over 20 years ago. I say this noting that I am not prone to nostalgia at all. Generally I favor the modern over the past by a wide margin. This makes the assessment of modern supercomputing all the more damning.

Don’t be trapped by dogma — which is living with the results of other people’s thinking.

― Steve Jobs

Your time is limited, so don’t waste it living someone else’s life.

― Steve Jobs

Unlike the iPhone with its teeming modernity, the modern supercomputer is an ever more monstrous proposition with each passing year. Plans for future supercomputers are sure to create a new breed of monsters (think Godzilla, a good name for one of the machines!) that promise to consume energy like American consumers drunk on demonstrating their God-given right to excess. They also promise to be harder to use, less reliable, and nearly impossible to program. They might just be truly evil monsters in the making. The evil being done is primarily the loss of opportunity to make modeling and simulation match the hype.

Anything worth doing, is worth doing right.

― Hunter S. Thompson

It isn’t that the hyped vision of modeling and simulation as a third way for science is so flawed; it is our approach to achieving this vision that is so counter-productive. The vision is generally sound, provided that the steps we take actually lead to such an outcome. The overbearing emphasis on computing speed as the key path to producing a predictive modeling capability is fatally flawed. It is a path that lacks the sort of checks and balances that science needs to succeed. A faulty model cannot predict reality regardless of how fast it executes on a computer, or how refined the computational “mesh” is. Algorithmic improvements can provide new applications, solve unsolved problems, and provide greater efficiency that pure computational speed cannot deliver.

It’s not like I’m all into nostalgia and history, it’s just that I can’t stand the way things are now

― Novala Takemoto

The current fastest computer certainly isn’t the best supercomputer ever built. That crown lies on the head of the Crays of the 70’s, 80’s and 90’s built by that genius Seymour Cray. In the form of the X-MP, Y-MP, C90 or Cray 2, the supercomputer reached its zenith. In relative terms these Crays were joys to use and program. They were veritable iPhones compared to the rotary phones we produce today. At that apex in functionality and utility for supercomputing, massively parallel computing was born (i.e., the attack of the killer micros), and the measure of a supercomputer became speed above all else. Utility and usefulness be damned. The fully integrated software-hardware solution found in a Cray Y-MP became a relic in the wake of the “need for speed”.

Study the past if you would define the future.

― Confucius

In a sense the modern trajectory of supercomputing is quintessentially American: bigger and faster is better by fiat. Excess and waste are virtues rather than flaws. Except the modern supercomputer is not better, and not just because it doesn’t hold a candle to the old Crays. These computers just suck in so many ways; they are soulless and devoid of character. Moreover they are already a massive pain in the ass to use, and plans are afoot to make them even worse. The unrelenting priority of speed over utility is crushing. Terrible is the only path to speed, and terrible comes with a tremendous cost too. When a colleague recently quipped that she would like to see us get a computer we actually wanted to use, I’m convinced that she had the older generation of Crays firmly in mind.

The future is already here – it’s just not evenly distributed.

― William Gibson

So, who are the geniuses that created this mess?

We have to go back to the mid-1990’s and the combination of computing and geopolitical issues that existed then. The path taken by the classic Cray supercomputers appeared to be running out of steam insofar as improving performance. The attack of the killer micros was defined as the path to continued growth in performance. Overall hardware functionality was effectively abandoned in favor of pure performance. The pure performance was only achieved on benchmark problems that had little in common with actual applications. Performance on real applications took a nosedive; a nosedive that the benchmarks conveniently covered up. We still haven’t woken up to this reality.

Remembrance of things past is not necessarily the remembrance of things as they were.

― Marcel Proust

Geopolitically we saw the end of the Cold War, including the cessation of nuclear weapons’ testing. In the United States a program including high performance computing was sold as the alternative to nuclear testing (the ASCI program, now the ASC program). This program focused on computing power as the sole determinant of success. Every other aspect of computing became a veritable afterthought and was supported on a shoestring budget (modeling, methods, algorithms, and V&V). The result has been fast, unusable computers that deliver a pittance of their promised performance and a generation of codes with antiquated models and algorithms (written mostly in C++). We’ve been on this foolish path ever since, to the extent that it’s become the politically correct and viable path going forward. We have lost a generation of potential scientific progress at the altar of this vacuous model for progress.

It shocks me how I wish for…what is lost and cannot come back.

― Sue Monk Kidd

Why do we choose this path when other more useful and rational approaches are available?

Risk aversion.

In the past forty-some-odd years we have as a society lost the ability to take risks even when the opportunity available is huge. The consequence of failure has become greater than the opportunity for success. In computing this trend has been powered by Moore’s law, the exponential growth in computing power over the course of the last 50 years (it’s not a law, just an observation). Under Moore’s law you just have to let time pass and computer performance will grow. It is a low-risk path to success.

When did the future switch from being a promise to being a threat?

― Chuck Palahniuk

Every other aspect of modeling and simulation entails far greater risk and opportunity to either fail, or fail to deliver in a predictable manner. Innovation in many areas critical to modeling and simulation is prone to episodic or quantum leaps in capability (especially modeling and algorithms). These areas of potential innovation are also prone to failures where ideas simply don’t pan out. Without the failures you don’t have the breakthroughs; hence the fatal nature of risk aversion. Integrated over decades of timid low-risk behavior, we have the makings of a crisis. Our low-risk behavior has already created a vast, immeasurable gulf between what we can do today and what we should be doing today.

You realize that our mistrust of the future makes it hard to give up the past.

― Chuck Palahniuk

An aspirational goal for high performance computing would be the creation of a computing environment that meant as much for scientific work as my iPhone means for how I live my life. Today we are very far from that ideal. The key to the environment isn’t the speed of the hardware, but rather the utility of how the hardware is integrated with the needs of the user. In high performance computing the user needs to produce scientific results, which depend far more on the modeling’s fundamental character than the speed of the computer.

The future depends on what you do today.

― Mahatma Gandhi

Focusing on the “Right” Scaling is Essential

29 Friday May 2015

Posted by Bill Rider in Uncategorized

≈ 3 Comments

 

I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.

― Abraham Maslow

High performance computing is a big deal these days and may become a bigger deal very soon. It has become a new battleground for national supremacy. The United States will very likely soon commit to a new program for achieving progress in computing. This program by all accounts will be focused primarily on the computing hardware first, and then the system software that directly connects to this hardware. The goal will be the creation of a new generation of supercomputers that attempt to continue the growth of computing power into the next decade, and provide a path to “exascale”. I think it is past time to ask, “do we have the right priorities?” “Is this goal important and worthy of achieving?”

Lack of direction, not lack of time, is the problem. We all have twenty-four hour days.

― Zig Ziglar

I’ll return to these two questions at the end, but first I’d like to touch on an essential concept in high performance computing: scaling. Scaling is a big deal; it measures success in computing and, in a nutshell, describes the efficiency of solving problems, particularly with respect to changing problem size or computing resources. In scientific computing one of the primary assumptions is that bigger, faster computers yield better, more accurate results that have greater relevance to the real world. The success of computing depends on scaling, and breakthroughs in achieving it define the sorts of problems that can be solved.

Nothing is less productive than to make more efficient what should not be done at all.

― Peter Drucker

There are several types of scaling with distinctly different character. Lately the dominant scaling in computing has been associated with parallel computing performance. Originally the focus was on strong scaling, which is defined by the ability of greater computing resources to solve a problem of fixed size faster. In other words, perfect strong scaling would result from solving a problem twice as fast with two CPUs as with one CPU.

Lately this has been replaced by weak scaling, where the problem size is adjusted along with the resource. The goal is to solve a problem that is twice as big with two CPUs just as fast as the original problem is solved with one CPU. These scaling results depend both on the software implementation and the quality of the hardware. They are the stock in trade of success in the currently envisioned national high performance computing program. They are also both relatively unimportant and poor measures of the power of computing to solve scientific problems.
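
A small sketch of the two measurements, with made-up timing numbers purely for illustration (these are not measurements from any real machine):

# Hypothetical wall-clock times in seconds; illustrative only.
procs    = [1, 2, 4, 8]
t_strong = [100.0, 52.0, 28.0, 16.0]     # fixed problem size (strong scaling)
t_weak   = [100.0, 104.0, 112.0, 125.0]  # problem size grows with processor count (weak scaling)

for p, ts, tw in zip(procs, t_strong, t_weak):
    speedup    = t_strong[0] / ts   # ideal strong scaling: speedup equals p
    strong_eff = speedup / p        # ideal: 1.0
    weak_eff   = t_weak[0] / tw     # ideal weak scaling: constant time, efficiency 1.0
    print(f"p={p}:  speedup={speedup:4.2f}  strong eff={strong_eff:4.2f}  weak eff={weak_eff:4.2f}")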

Two things are infinite: the universe and human stupidity; and I’m not sure about the universe.

― Albert Einstein

Algorithmic scaling is another form of scaling, and it is massive in its power. We are failing to measure it, invest in it and utilize it in moving forward in computing nationally. The gains to be made through algorithmic scaling will almost certainly lay waste to anything that computing hardware will deliver. It isn’t that hardware investments aren’t necessary; they are simply grossly over-emphasized to a harmful degree.

The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.

― Isaac Asimov

The archetype of algorithmic scaling is sorting a list, which is an amazingly common and important function for a computer program. Common sorting algorithms are things like insertion sort or quicksort, and each comes with a scaling for the memory required and the number of operations needed to run to completion. For general comparison-based sorting the best that can be done is order L log L operations for a list of L items; specialized methods like counting or radix sorts can reach linear scaling, meaning that for a sufficiently large list the cost is proportional to some constant times the length of the list, C L. If one chooses very poorly, say insertion sort on a large random list, the sorting scales like L^2. There are also aspects of an algorithm and its scaling that speak to the memory and storage needed and the complexity of the algorithm’s implementation. These themes carry over to a discussion of more esoteric computational science algorithms next.
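
A rough sketch of how this scaling shows up in practice, timing a quadratic insertion sort against Python’s built-in Timsort (an order L log L comparison sort) as the list length doubles; the absolute timings are machine-dependent, and only the growth rates matter:

import random, time

def insertion_sort(a):
    """Classic O(L^2) insertion sort, in place."""
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

for L in [500, 1000, 2000, 4000]:
    data = [random.random() for _ in range(L)]
    t0 = time.perf_counter(); insertion_sort(data[:]); t1 = time.perf_counter()
    t2 = time.perf_counter(); sorted(data);            t3 = time.perf_counter()
    # insertion sort roughly quadruples per doubling; the built-in sort grows far more slowly
    print(f"L={L:5d}  insertion={t1 - t0:8.4f}s  builtin={t3 - t2:8.6f}s")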

In scientific computing two categories of algorithm loom large over substantial swaths of the field: numerical linear algebra, and discretization-methods. Both of these categories have important scaling relations associated with their use that have a huge impact on the efficiency of solution. We have not been paying much attention at all to the efficiencies possible from these areas. Improvements in both areas could yield improvements in performance that would put any new computer to shame.

For numerical linear algebra the issue is the cost of solving the matrix problem with respect to the number of equations. For the simplest view of the problem one uses a naïve method like Gaussian elimination (or LU decomposition), which scales like N^3 where N is the number of equations to be solved. This method is designed to solve a dense matrix, where most entries are non-zero. In scientific computing the matrices are typically “sparse”, meaning most entries are zero. An algorithm designed specifically for sparse matrices lowers the scaling to N^2. These methods both produce “exact” solutions to the system (modulo poorly conditioned problems).

If an approximate solution is desired or useful, one can use lower cost iterative methods. The simplest methods, like the Jacobi or Gauss-Seidel iterations, also scale as N^2. Modern iterative methods are based on Krylov subspaces, with the conjugate gradient method being the classical example. Run to exact convergence these methods scale as N^2, but used as iterative methods for approximate solutions the scaling lowers to N^(3/2). One can do even better with multigrid methods, lowering the scaling to N.
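
A minimal sketch of the contrast between a dense direct solve and a sparse iterative one, using the standard 1D Poisson (tridiagonal) matrix as an assumed model problem; the size and tolerances are illustrative, not a benchmark:

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 2000
# Tridiagonal 1D Poisson matrix: sparse, symmetric positive definite.
A_sparse = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
A_dense = A_sparse.toarray()
b = np.ones(n)

x_direct = np.linalg.solve(A_dense, b)   # dense LU factorization: O(n^3) work, O(n^2) storage
x_cg, info = cg(A_sparse, b)             # conjugate gradient: O(n) work per iteration on a sparse matrix

print("CG converged:", info == 0)
rel_diff = np.max(np.abs(x_cg - x_direct)) / np.max(np.abs(x_direct))
print("relative difference vs direct solve:", rel_diff)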

Each of these methods has a constant in front of the scaling, and the constant gets larger as the scaling gets better. Nonetheless it is easy to see that if you’re solving for a billion unknowns the difference between N^3 and N is immense, a billion billion. The difference in constants between the two methods is several thousand. In the long run multigrid wins. One might even do better than multigrid, with current research in data analysis producing sublinear algorithms for large-scale data analysis. Another issue is the difficulty of making multigrid work in parallel, as the method is inherently NOT parallel in important parts. Multigrid performance is also not robust, and Krylov subspace methods still dominate actual use.

Learn from yesterday, live for today, hope for tomorrow. The important thing is to not stop questioning.

― Albert Einstein

Discretization can provide even greater wins. If a problem is amenable to high-order accuracy, a higher order method will unequivocally win over a low-order method. The problem is that most practical problems you can get paid to solve don’t have this property. In almost every case the solution will converge at first-order accuracy. This is the nature of the world. The knee-jerk response is that this means high-order methods are not useful. This shows a lack of understanding of what they bring to the table and how they scale. High-order methods produce lower errors than low-order methods even when high-order accuracy cannot be achieved.
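
The observed convergence rate is something verification can measure directly. A small sketch of the standard estimate, fed with hypothetical error norms from three successively refined meshes (not real data): the observed order p comes from the ratio of errors under uniform refinement.

import math

# Hypothetical error norms on meshes refined by a factor r = 2 (illustrative numbers only).
r = 2.0
errors = [4.0e-2, 2.1e-2, 1.08e-2]   # roughly halving with each refinement => roughly first order

for fine, coarse in zip(errors[1:], errors[:-1]):
    p = math.log(coarse / fine) / math.log(r)   # observed order of accuracy
    print(f"observed order p = {p:.2f}")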

As a simple example, take a high-order method that delivers half the error of a low-order method. To get equivalent results the high-order method would need half the mesh resolution, defined by the number of “cells” or “elements” per dimension, M. If one is interested in time-dependent problems, the number of time steps is usually proportional to M. Hence a one-dimensional time-dependent problem requires on the order of M^2 degrees of freedom. For equivalent accuracy the high-order method would require M/2 cells and one-fourth of the degrees of freedom. It breaks even at four times the cost per degree of freedom. In three-dimensional time-dependent problems, the scaling is M^4 and the break-even point is 16 in cost. This is eminently doable. Even larger improvements in accuracy would provide an even more insurmountable advantage.
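
The break-even arithmetic above can be written out directly; the factor-of-two error reduction is the illustrative assumption carried over from the paragraph.

# If the high-order method needs half the resolution per dimension for equal error,
# a time-dependent problem in d space dimensions costs ~ M**(d+1), so the allowable
# cost penalty per degree of freedom before losing to the low-order method is 2**(d+1).
for d in [1, 2, 3]:
    dof_ratio = 2 ** (d + 1)   # low-order DOF count / high-order DOF count
    print(f"{d}D + time: high-order breaks even at {dof_ratio}x the cost per degree of freedom")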

The counterpoint to these methods is their computational cost and complexity. A second issue is their fragility, which can be recast as their robustness or stability in the face of real problems. Still, their performance gains are sufficient to amortize the costs given the vast magnitude of the accuracy gains and effective scaling.

An expert is a person who has made all the mistakes that can be made in a very narrow field.

― Niels Bohr

The last issue to touch upon is the need to make algorithms robust, which is just another word for stable. Work on the stability of algorithms is simply not happening these days. Part of the consequence is a lack of progress. For example, one way to view multigrid’s failure to dominate numerical linear algebra is its lack of robustness (stability). The same thing holds for high-order discretizations, which are typically not as robust or stable as low-order ones. As a result low-order methods dominate scientific computing. For algorithms to prosper, work on stability and robustness needs to be part of the recipe.

If we knew what it was we were doing, it would not be called research, would it?

― Albert Einstein

Performance is a monotonically sucking function of time. Our current approach to HPC will not help matters, and effectively ignores the ability of algorithms to make things better. So “do we have the right priorities?” and “is this goal (of computing supremacy) important and worthy of achieving?” The answers are an unqualified NO and a qualified YES. The goal of computing dominance and supremacy is certainly worth achieving, but having the fastest computer will absolutely not get us there. It is neither necessary, nor sufficient for success.

This gets to the issue of priorities directly. Our current program is so intellectually bankrupt as to be comical, and reflects a starkly superficial thinking that ignores the facts staring us directly in the face, such as the evidence from commercial computing. Computing matters because of how it impacts the real world we live in. This means the applications of computing matter most of all. In the approach to computing taken today the applications are taken completely for granted, and reality is a mere afterthought.

Any sufficiently advanced technology is indistinguishable from magic.

― Arthur C. Clarke

 

Computational Science as a Commodity

23 Saturday May 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

 

[E]xceptional claims demand exceptional evidence.

What can be asserted without evidence can also be dismissed without evidence.

― Christopher Hitchens

Commercial CFD and mechanics codes are an increasingly big deal. They would have you believe that the only things people should concern themselves with are meshing, graphical user interfaces, and computer power. The innards of the code, with its models and methods, are basically in the bag, and no big deal. The quality of the solutions is assured because it’s a solved problem. Input your problem in a point-and-click manner, mesh it, run it on a big enough computer, point and click through the visuals, and you’re done.

Marketing is what you do when your product is no good.

― Edwin H. Land

Of course this is not the case, not even close. One might understand why these vendors might prefer to sell their product with this mindset. The truth is that the methods in the codes are old and generally not very good by modern standards. If we had a healthy research agenda for developing improved methods, the methods in these codes would look appalling by comparison. On the other hand they are well understood and highly reliable or robust (meaning they will run to completion without undue user intervention). This doesn’t mean that they are correct or accurate. The problem is that they represent a very low bar of success. The codes and the methods they utilize are far below what might be possible with a healthy computational science program.

Another huge issue is the targeted audience of users for these codes. If we go back in time we could rely upon the codes being used only by people with PhDs. Nowadays the codes are targeted at people with Bachelor’s degrees with little or no expertise or interest in numerical methods or advanced models in the context of partial differential equations. As a result, these aspects of a code’s makeup and behavior have been systematically reduced in importance to the point of being practically ignored. Of course, working on these very details provides me with the knowledge and evidence that these aspects are paramount in importance. All the meshing, graphics and gee-whiz interfaces can’t overcome a bad model or method in a code.

Reality is one of the possibilities I cannot afford to ignore

― Leonard Cohen

One way to get to the truth is verification and validation (V&V). While V&V has become an important technical endeavor, it usually is applied more as a buzzword than an actual technical competence. The result is usually a set of activities that have more of the look and feel of V&V than the actual proper practice of V&V. Those marketing the codes tend to trumpet their commitment to V&V while actually espousing the cutting of V&V corners. Part of the problem is that rigorous V&V would in large part undercut many of their marketing premises.

We all die. The goal isn’t to live forever, the goal is to create something that will.

― Chuck Palahniuk

What is truly terrifying about the state of affairs today is that this attitude has gone beyond the commercial code vendors and increasingly defines the attitude at the National Labs and in Academia, the places where innovation should be coming from. Money for developing new methods and models has dried up. The emphasis in computational science has shifted to parallel computing and the acquisition of massive new computer platforms.

Reality is that which, when you stop believing in it, doesn’t go away.

― Philip K. Dick

The unifying theme in all of this is the perception being floated that modeling and numerical methods are a solved area of investigation, and that we simply await a powerful enough computer to unveil the secrets of the universe. This sort of mindset is more appropriate for some sort of cultish religion than for science. It is actually antithetical to science, and the result is a lack of real scientific progress.

Don’t try to follow trends. Create them.

― Simon Zingerman

So, what is needed?

  • Computational science needs to acknowledge and play by the scientific method. Increasingly, today it does not. It acts on articles of faith and politically correct low risk paths to “progress”.
  • We need to cease believing that all our problems will be solved by a faster computer.
  • The needs of computational science should balance the benefits of models, methods, algorithms, implementation, software and hardware instead of the articles of faith taken today.
  • Embrace risks needed for breakthroughs in all of these areas especially models, methods and algorithms, which require creative work and generally need inspired results for progress.
  • Acknowledge that the impact of computational science on reality is most greatly improved by modeling improvements. Next in impact are methods and algorithms, which provide greater efficiency. Instead our focus on implementation, software and hardware actually produces less impact on reality.
  • Practice V&V with rigor and depth in a way that provides unambiguous evidence that calculations are trustworthy in a well-defined and supportable manner.
  • Acknowledge the absolute need for experimental and observational science in providing the window into reality.
  • Stop overselling modeling and simulation as an absolute replacement for experiments; present it instead as a guide for intuition and exploration to be used in association with other scientific methods.

Reality is frequently inaccurate.

― Douglas Adams

 

Opportunity Knocks, Will We Answer?

15 Friday May 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

A life spent making mistakes is not only more honorable, but more useful than a life spent doing nothing.

― George Bernard Shaw

High performance computing is a hot topic these days. All sorts of promises have been made regarding its transformative potential. Computational modeling is viewed as the cure for our inability to do expensive, dangerous or even illegal experiments. All sorts of benefits are supposed to rain down upon society from this driver of a faster, better and cheaper future. If we were collectively doing everything that should be done these promises might have a chance of coming true, but we’re not and they won’t, unless we start doing things differently.

The Chinese use two brush strokes to write the word ‘crisis.’ One brush stroke stands for danger; the other for opportunity. In a crisis, be aware of the danger–but recognize the opportunity.

― John F. Kennedy

 So, what the hell?

Computing’s ability to deliver on these promises is at risk, ironically due to a lack of risk taking. The scientific computing community seems to have rallied around the safe path of looking toward faster computing hardware as the route toward enhanced performance. High payoff activities such as new model or algorithm development are risky and likely to fail. But the relatively small number of successful projects in these areas results in massive payoffs in terms of performance.

Despite a strong historical track record of providing greater benefits for computational simulation than hardware, efforts to improve modeling, methods and algorithms are starved for support. This will kill the proverbial goose that is about to lay a golden egg. We are figuratively strangling the baby in the crib by failing to feed the core of creative value in simulation. We have prematurely declared that computational simulation is mature and ready for prime time. In the process we are stunting its growth and throwing money away on developing monstrous computers to feed computational power to a “petulant teen”. Instead we need to develop the field of simulation further and take some key steps toward providing society with a mature and vibrant scientific enterprise. Policy makers have defined a future where the only thing that determines computational simulation capability is the computing power of the computer it runs on.

This mindset has allowed the focus to shift almost entirely toward computing hardware. Growth in computing power is commonly used as an advertisement for the accessibility and ease of utilizing computational modeling. An increasing number of options exist for simply buying simulation capability in the form of computational codes. The user interfaces for the codes allow relatively broad access to modeling and definitely take the capability out of the hands of the experts. For those selling capability this democratization is a benefit because it increases the size of the market. Describing this area as a mature, solved problem is another marketing benefit.

The question of whether this is a good thing still needs to be asked. How true are these marketing pitches?

It is relatively easy to solve problems today. Computer power allows the definition of seemingly highly detailed models and fine computational grids as well as stunning visual representations. All of these characteristics provide users with the feeling of simulation quality. The rise of verification and validation should allow users to actually determine whether these feelings are justified. Generally V&V undermines one’s belief in how good the results are. On the other hand, people like to feel that their analysis is good. This means that much of the negative evidence is discounted or even dismissed when conducting V&V. The real effect of slipshod V&V is to avoid the sort of deep feedback that the quality of results should have on the codes.

When you fail, that is when you get closer to success.

― Stephen Richards

At this juncture it’s important to talk about current codes and the models and methods contained in them. The core philosophy of code-based modeling goes all the way back to the 1960’s and has not changed much since. This is a problem. In many cases the methods used in the codes to solve the models are nearly as old. In many cases the methods were largely perfected during the 1970’s and 1980’s. Little or no effort is presently being put forth to advance the solution techniques. In summary, most effort is being applied to simply implementing the existing solution techniques on the next generation of computers.

Remember the two benefits of failure. First, if you do fail, you learn what doesn’t work; and second, the failure gives you the opportunity to try a new approach.

― Roger von Oech

Almost certainly the models themselves are even more deeply ensconced and effectively permanent. No one even considers changing the governing equations being solved. Models, of course, have a couple of components: the basic governing equations, which are generally quite classical, and their closure, which is the part that slowly evolves. These equations were the product of a 17th- to 19th-Century scientific and philosophical mindset that would be questioned if science itself were healthy. If one thinks about the approach we take today, the ability to resolve new length and time scales has changed monumentally. We should be able to solve vastly nonlinear systems of equations (we really can’t in a practical, robust manner). Is it even appropriate to use the same equations? Or should the nature of the equations change as a function of the characteristic scales of resolution? Closure modeling evolves more readily, but only within the philosophical confines defined by the governing equations. Again, we are woefully static, and the lack of risk taking is undermining any prospect of actual progress.

Take the way material properties are applied to a problem as a key example. The standard practice is to “paint” the properties into regions containing a material. For example, if aluminum exists in the problem, a model defines its properties and its response to forces, and the aluminum is defined as being the same everywhere there is aluminum. As the scale gets smaller, aluminum (or any material) gets less and less homogeneous. There begin to be significant differences in structure, typically defined by the grain structure of the material and any imperfections. The model systematically ignores these heterogeneous features. Usually their collective effects are incorporated in an average way in the model, but the local effects of these details are ignored. Modern application questions are more and more focused on the sort of unusual effects that happen because of these local defects.
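A toy illustration of the difference, using made-up numbers: a “painted” model assigns one nominal yield strength everywhere there is aluminum, while a heterogeneous stand-in scatters the per-cell strength about the same mean as a crude proxy for grain-to-grain variation and small defects (the 5% Gaussian scatter is purely an assumption).

```python
import numpy as np

rng = np.random.default_rng(1)

ncells = 10_000
nominal_yield = 270.0                            # MPa, a hypothetical nominal value for an aluminum alloy

# "Painted" model: the same yield strength everywhere there is aluminum.
painted = np.full(ncells, nominal_yield)

# Heterogeneous stand-in: per-cell strength scattered about the same mean, a crude proxy
# for grain-to-grain variation and small defects (the 5% Gaussian scatter is an assumption).
heterogeneous = rng.normal(loc=nominal_yield, scale=0.05 * nominal_yield, size=ncells)

for name, field in [("painted", painted), ("heterogeneous", heterogeneous)]:
    print(f"{name:13s}  mean = {field.mean():6.1f} MPa   weakest cell = {field.min():6.1f} MPa")
```

The averages agree, but any question driven by the weakest material, such as where a crack initiates or where deformation localizes, sees a very different problem in the heterogeneous field.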

Experimental observations are only experience carefully planned in advance, and designed to form a secure basis of new knowledge.

― Sir Ronald Fisher

Let’s be perfectly blunt and clear about the topic of modeling. The model we solve in simulation is the single most important and valuable aspect of a computation. A change in the model that opens new physical vistas is more valuable than any computer including one with limitless power. A computer is no better than the model it solves. This point of view is utterly lost today.

More dangerously, we are continuing to write codes for the future in the same manner today. In other words we have had the same philosophy in computational modeling for the last 50 years or more. The same governing equations and closure philosophy are being used today. How much longer will we continue to do the same thing? I believe we should have changed a while ago. We could begin to study the impact of material and solution heterogeneity already, but the models and methods to do so are not being given any priority.

The reason is that it would be disruptive and risky. It would require changing our codes and practices significantly. It would undermine the narrative of computer power as the tonic for what ails us. It would be a messy and difficult path. It would also be consistent with the scientific method, instead of following a poorly thought through, intellectually empty article of faith. Because risk taking is so out of favor today, this path has been avoided.

Our most significant opportunities will be found in times of greatest difficulty.

― Thomas S. Monson

The investments in faster computers are valuable and beneficial, but only if they are balanced with other investments. Modeling is the aspect of computation that is closest to reality and holds the greatest leverage and value. Methods for solving models and the associated algorithms are next closest and have the next highest leverage. Neither of these areas is being invested in at a healthy level. Implementing these algorithms and models is next most important; here there is a little more effort, because existing models need to work on the new computers. The two areas with the highest level of effort are system software and hardware. Ironically, these two areas have the least value in terms of affecting reality. No one in a position of power seems to recognize how antithetical to progress this state of affairs is.

Sometimes it’s the mistakes that turn out to be the best parts of life

― Carrie Ryan

How much does the user of a code impact its results?

08 Friday May 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Hypocrites get offended by the truth.

― Jess C. Scott

 

The overall quality of computational modeling depends on a lot of things, and one very big one isn’t generally acknowledged: whoever is using the code. How much does it matter? A lot, much more than almost anyone would admit, and the effect becomes greater as problem complexity grows.

 

The most important property of a program is whether it accomplishes the intention of its user.

― C.A.R. Hoare

 

The computer, the code, the computational resolution (i.e., mesh), the data, the models, and the theory all get acute and continuous attention from verification and validation. When the human element in quality is raised as an issue, people become immensely defensive. At the same time it is generally acknowledged by knowledgeable people that the impact of the user of the code (or modeler) is huge. In many cases it may be the single greatest source of uncertainty.

 

We don’t see things as they are, we see them as we are.

― Anaïs Nin

 

This isn’t a matter of simple mistakes made in the modeling process; it is associated with reasonable choices made in representing complex problems. Different modelers make different decisions about how to deal with circumstances and how to represent all the “gray” areas. In many cases these choices live in the space where variability in results should be. For example, the boundary or initial conditions are common sources of the changes. Reality is rarely fully reproducible, and details that are generally fuzzy result in subtle changes in outcomes. In this space, the user of a code can make different, but equally reasonable, choices about how to model a problem. These can result in very large changes in results.
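The effect is easy to mimic with a toy surrogate. In the sketch below, which is pure illustration and not any real code or data, different analysts make equally defensible choices for an under-specified boundary heat flux and a poorly known contact conductance; sampling over those choices produces a spread in the answer that has nothing to do with the mesh, the solver, or the physics models.

```python
import numpy as np

rng = np.random.default_rng(7)

def toy_simulation(heat_flux, contact_conductance):
    """Stand-in for a full code run: returns a 'peak temperature' from a made-up algebraic surrogate.
    The physics here is deliberately fake; only the spread in the output matters."""
    return 300.0 + 8.0 * heat_flux / (1.0 + 0.02 * contact_conductance)

# Two equally defensible analyst choices for an under-specified boundary heat flux:
# the reported nominal value, or nominal plus the quoted instrument bias (hypothetical numbers).
flux_choices = [50.0, 55.0]                      # kW/m^2
# A poorly known contact conductance picked from a handbook range (also hypothetical).
conductance_range = (200.0, 1000.0)              # W/m^2/K

results = np.array([
    toy_simulation(rng.choice(flux_choices), rng.uniform(*conductance_range))
    for _ in range(200)
])
print(f"peak temperature: mean {results.mean():.1f} K, std {results.std():.1f} K, "
      f"range {results.min():.1f} to {results.max():.1f} K")
```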

 

Despite this, the whole area of uncertainty quantification for this effect is largely missing, because it is such an uncomfortable source of variation in results. Only a few areas readily acknowledge or account for it, such as nuclear reactor safety work, the Sandia “Fracture Challenge” and a handful of other isolated cases. It is something that needs much greater attention, but only if we are courageous enough to attack the problem.

 

It’s funny how humans can wrap their mind around things and fit them into their version of reality.

― Rick Riordan

 

The capacity to acknowledge this effect and measure it is largely resisted by the community. We are supposed to live in an age where everything is automatic and the computer will magically unveil the truths of the universe to us. This is magical thinking, but it is the commonly accepted dogma of modernity. Instead the core of value is fundamentally connected to the human element, and this truth seems to be beyond our ability to admit.

 

We do not need magic to transform our world. We carry all of the power we need inside ourselves already.

― J.K. Rowling

 

The Plague of Black Box Mentality

01 Friday May 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Physics is becoming so unbelievably complex that it is taking longer and longer to train a physicist. It is taking so long, in fact, to train a physicist to the place where he understands the nature of physical problems that he is already too old to solve them.

— Eugene Paul Wigner

The past decade has seen the rise of commercial modeling and simulation tools with seemingly great capabilities. Computer power has opened vistas of simulation to the common engineer and scientist. Advances in other related technologies like visualization have provided an increasingly “turn-key” experience to users who can do seemingly credible cutting-edge work on their laptops. These advances also carry with them some real dangers, most acutely summarized as a “black box” mentality toward the entire modeling and simulation enterprise.

artificial intelligence is no match for natural stupidity

—Albert Einstein

Black box thinking causes problems because people get answers without understanding how those answers were arrived at. When problems are simple and straightforward this can work, but as soon as the problems become difficult, issues arise. The models and methods in a code can do funny things that only make sense if you know the inner workings of the code. The black box point of view usually comes with too much trust in what the code is doing. This can cause people to accept solutions that really should have been subjected to far more scrutiny.

The missing element in the black box mentality is the sense of collaboration needed to make modeling and simulation work. The best work is always collaborative including elements of computation and modeling, but also experiments, mathematics and its practical applications. This degree of multi-disciplinary work is strongly discouraged today. Ironically, the demands of cost accounting often work steadfastly to undermine the quality of work by dividing people and their efforts into tidy bins. The drive to make everything accountable discourages the ability to conduct work in the best way possible. Instead our current system of management encourages the black box mentality.

Another force pushing black box thinking is education. Students now run codes whose interfaces are easy enough to bear some resemblance to video games. Of course, with a generation of scientists and engineers raised on video games this could be quite powerful. At the same time the details of the codes are not generally emphasized, and instead they tend to be viewed as black boxes. In classes, when the details of the codes are unveiled, eyes glaze over and it becomes clear that the only thing most students are really interested in is getting results, not knowing how those results were arrived at.

One way this current trend is being blunted is the adoption of verification and validation (V&V) in modeling and simulation. V&V encourages a distinctly multidisciplinary point of view in its execution, particularly when coupled to uncertainty quantification. To do V&V correctly requires deep knowledge of many technical areas, and this is really difficult. Engaging deeply in the technical work necessary for good V&V is simply beyond most people’s capabilities and tolerance for effort. People paying for modeling and simulation are, for the most part, unwilling to pay for good V&V. They would rather have V&V that is cheap and fools people into confidence.

Computers are incredibly fast, accurate, and stupid: humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.

― Albert Einstein

Two elements are leading to this problem. No one is willing to pay for high-quality technical work in either the development or the use of simulation codes. Additionally, no one is willing to pay for the developers of the code and the users to work together. The funding, environment and tolerance needed to support the sort of multi-disciplinary activities that produce good modeling and simulation (and, by virtue of that, good V&V) are shrinking with each passing year. Developing professionals who do this sort of work well is really expensive and time-consuming. When the edict is to simply crank out calculations with a turnkey technology, the appetite for running issues to ground, which quality demands, simply doesn’t exist.

A couple of issues have really “poisoned the well” of modeling and simulation. The belief that the technology is completely trustworthy and mature enough for novices to use is an illusion. Commercial codes are certainly useful, but they need to be used with skill and care by curious, doubtful users. These codes often place a serious premium on robustness over accuracy, and cut lots of corners to keep their users happy. A happy user is usually, first and foremost, someone with a completed calculation, regardless of the credibility of that calculation. We also believe that everything is deeply enabled by almost limitless computing power.
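A small, self-contained illustration of robustness winning out over accuracy: first-order upwind advection of a square pulse on a periodic grid (a textbook toy, not any commercial code). The calculation always completes and never blows up at this CFL number, but after one period, when the exact answer is simply the initial pulse back where it started, the computed pulse is badly smeared by numerical diffusion.

```python
import numpy as np

# Linear advection u_t + a u_x = 0 on a periodic unit interval: the exact solution just
# translates, so after one full period it returns to the initial condition.
nx, a, cfl = 200, 1.0, 0.5
dx = 1.0 / nx
dt = cfl * dx / a
x = (np.arange(nx) + 0.5) * dx

u0 = np.where((x > 0.4) & (x < 0.5), 1.0, 0.0)   # a narrow square pulse
u = u0.copy()

nsteps = int(round(1.0 / dt))                    # advance exactly one period
for _ in range(nsteps):
    u = u - cfl * (u - np.roll(u, 1))            # first-order upwind: stable at CFL <= 1, heavily diffusive

print(f"peak of exact answer     : {u0.max():.2f}")
print(f"peak after one period    : {u.max():.2f}   (numerical diffusion has smeared the pulse)")
print(f"L1 error after one period: {np.sum(np.abs(u - u0)) * dx:.3f}")
```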

Think? Why think! We have computers to do that for us.

— Jean Rostand

Computing power doesn’t relieve us of the responsibility to think about what we are doing. We should stop believing that computational tools can be used like magic, black magic in black boxes that we don’t understand. If you don’t understand how you got your answers, you probably shouldn’t trust them until you do.

A computer lets you make more mistakes faster than any other invention with the possible exceptions of handguns and Tequila.

— Mitch Ratcliffe

 

 

Progress should be Mandatory

24 Friday Apr 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.

― George Bernard Shaw

We appear to be living in a golden age of progress. I’ve come increasingly to the view that this is false. We are living in an age that is enjoying the fruits, and coasting on the inertia, of an earlier scientific golden age. The forces powering the “progress” we enjoy are not being replenished for future generations. So, what are we going to do when we run out of the gains made by our forebears?

Progress is a tremendous bounty to all. We can all benefit from wealth, longer and healthier lives, greater knowledge and general well-being. The forces arrayed against progress are small-minded and petty, yet somehow these small-minded and petty interests have swamped the forces for good and beneficial effort. Another way of saying this is that the forces of the status quo are working to keep change from happening. The status quo forces are powerful and well-served by keeping things as they are. Income inequality and this conservatism are closely related, because change favors those who have something to gain from it, and the people at the top favor keeping things just as they are.

 Those who do not move, do not notice their chains.

― Rosa Luxemburg

Most of the technology that powers today’s world was actually developed a long time ago; today it is simply being brought to “market”. Technology at a commercial level has a very long lead-time. The breakthroughs in science that surrounded the effort to fight the Cold War provide the basis of most of our modern society. Cell phones, computers, cars, planes, etc. are all associated with science done decades ago. The road to commercial success is long, and today’s economic supremacy is based on yesterday’s investments.

Without deviation from the norm, progress is not possible.

― Frank Zappa

Since the amount of long-term investment today is virtually zero, we can expect virtually zero return down the road. We aren’t effectively putting resources into basic or applied research, much as we aren’t keeping up with roads and bridges. Our low-risk approach to everything is sapping the vitality from research. We compound this by failing to keep our 20th Century infrastructure healthy, and completely failing to provide a 21st Century one (just look at our pathetic internet speeds). Even where we spend lots of money on things like science, little real investment is happening because the system is dysfunctional. One of the big things hurting any march toward progress is the inability to take risks. Because failure is so heavily penalized, people won’t take the risks necessary for success. If you can’t fail, you can’t succeed either. It is an utterly vicious cycle that highlights the nearly complete lack of leadership. The lack of action by national leadership is simply destroying the quality of our future.

Restlessness is discontent — and discontent is the first necessity of progress. Show me a thoroughly satisfied man — and I will show you a failure.

― Thomas A. Edison

Take high performance computing as an example. In many respects the breakthroughs in algorithms have been as important as the computers themselves. The lack of risk taking has elevated the computers as the source of progress because of Moore’s law. Algorithmic work is more speculative: payoffs are huge but infrequent, and effort might be expended that yields nothing at all. There shouldn’t be anything wrong with that! Yet because such efforts are risky, they are not favored.

We can only see a short distance ahead, but we can see plenty there that needs to be done.

― Alan Turing

A secondary impact of the focus on computers is that the newer computing approaches are really hard to use. It is a very hard problem simply to get the old algorithmic approaches to work at all. With so much effort going into implementation, and siphoned away from new algorithmic research, the end product is stagnation. Numerical linear algebra is a good example of this terrible cycle in action. The last real algorithmic breakthrough was multigrid, about 30 years ago. Since then work has focused on making the algorithms work on massively parallel computers.
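To see why an algorithmic breakthrough like multigrid dwarfs raw hardware gains, here is a bare-bones two-grid sketch for the 1D Poisson problem, a teaching toy rather than a production solver: damped-Jacobi smoothing plus an exact coarse-grid correction. The number of cycles needed to hit a fixed tolerance stays essentially flat as the grid is refined, while a smoother used alone needs a number of sweeps that grows roughly like N squared.

```python
import numpy as np

def apply_A(u, h):
    """Matrix-free 1D Poisson operator: (A u)_i = (2 u_i - u_{i-1} - u_{i+1}) / h^2, zero Dirichlet BCs."""
    up = np.pad(u, 1)                                          # pad with the zero boundary values
    return (2.0 * up[1:-1] - up[:-2] - up[2:]) / h**2

def weighted_jacobi(u, f, h, nsweeps, omega=2.0 / 3.0):
    """A few sweeps of damped Jacobi: good at killing high-frequency error, hopeless on its own."""
    for _ in range(nsweeps):
        u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))
    return u

def two_grid_cycle(u, f, h, A_coarse):
    """Smooth, restrict the residual, solve the coarse problem exactly, correct, smooth again."""
    u = weighted_jacobi(u, f, h, nsweeps=3)
    r = f - apply_A(u, h)
    rc = 0.25 * r[:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]    # full-weighting restriction
    ec = np.linalg.solve(A_coarse, rc)                         # exact coarse-grid solve
    e = np.zeros_like(u)
    e[1::2] = ec                                               # coarse points coincide with odd fine points
    ec_pad = np.concatenate(([0.0], ec, [0.0]))
    e[0::2] = 0.5 * (ec_pad[:-1] + ec_pad[1:])                 # linear interpolation at the other fine points
    return weighted_jacobi(u + e, f, h, nsweeps=3)

for N in [31, 63, 127, 255, 511]:                              # fine-grid interior points (coarse grid nests)
    h = 1.0 / (N + 1)
    x = (np.arange(N) + 1) * h
    f = np.pi**2 * np.sin(np.pi * x)                           # manufactured right-hand side, exact solution sin(pi x)
    Nc = (N - 1) // 2
    hc = 2.0 * h
    A_coarse = (2.0 * np.eye(Nc) - np.eye(Nc, k=1) - np.eye(Nc, k=-1)) / hc**2
    u, cycles = np.zeros(N), 0
    while np.linalg.norm(f - apply_A(u, h)) > 1e-8 * np.linalg.norm(f):
        u = two_grid_cycle(u, f, h, A_coarse)
        cycles += 1
    print(f"N = {N:4d}: {cycles} two-grid cycles")             # cycle count stays roughly constant as N grows
```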

Progress always involves risk; you can’t steal second base and keep your foot on first

― F.W. Dupee

The net result is a lack of progress. Our leaders are seemingly oblivious to the depth of the problem. They are too caught up in trying to justify the funding for the path they are already taking. The damage done to long-term progress is accumulating with each passing year. Our leadership will not put significant resources into things that pay off far into the future (what good will that do them?). We have missed a number of potentially massive breakthroughs by chasing progress from computers alone. The lack of perspective and balance in the course set for progress shows a stunning lack of knowledge of the history of computing. The entire strategy is remarkably bankrupt philosophically; it is playing to the lowest intellectual denominator. An analogy that does the strategy too much justice would compare it to rating cars solely on the basis of horsepower.

A person who makes few mistakes makes little progress.

― Bryant McGill

The end product of our current strategy will ultimately starve the World of an avenue for progress. Our children will be those most acutely impacted by our mistakes. Of course we could chart another path that balanced computing emphasis with algorithms, methods and models. Improvements in our grasp of physics and engineering should probably be in the driver’s seat. This would require a significant shift in the focus, but the benefits would be profound.

One of the most moral acts is to create a space in which life can move forward.

― Robert M. Pirsig

What we lack is the concept of stewardship to combine with leadership. Our leaders are stewards of the future, or they should be. Instead they focus almost exclusively on the present with the future left to fend for itself.

 

Human progress is neither automatic nor inevitable… Every step toward the goal of justice requires sacrifice, suffering, and struggle; the tireless exertions and passionate concern of dedicated individuals.

― Martin Luther King Jr.

 

 
