The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: June 2015

Peter Lax’s Philosophy About Mathematics

25 Thursday Jun 2015

Posted by Bill Rider in Uncategorized

≈ 4 Comments

 

Linearity Breeds Contempt

—Peter Lax

A few weeks ago I went into my office and found a book waiting for me. It was one of the most pleasant surprises I’ve had at work for a long while, a biography of Peter Lax written by Reuben Hersh. Hersh is an emeritus professor of mathematics at the University of New Mexico (my alma mater), and a student of Lax at NYU. The book was a gift from my friend, Tim Trucano, who knew of my high regard and depth of appreciation for the work of Lax. I believe that Lax is one of the most important mathematicians of the 20th Century, and he embodies a spirit that is all too lacking in current mathematical work. It is Lax’s steadfast commitment to and execution of great pure math as a vehicle for producing great applied math that the book explicitly reports and implicitly advertises. Lax never saw a divide in math between the two; he saw complete compatibility between them.

The publisher is the American Mathematical Society (AMS), and the book is a wonderfully technical and personal account of the fascinating and influential life of Peter Lax. Hersh’s account goes far beyond the obvious public and professional impact of Lax into his personal life and family, although these are colored greatly by the greatest events of the 20th Century. Lax also has a deep connection to three themes in my own life: scientific computing, hyperbolic conservation laws and Los Alamos. He was a contributing member of the Manhattan Project despite being a corporal in the US Army and only 18 years old! Los Alamos, and John von Neumann in particular, had an immense influence on his life’s work, with the fingerprints of that influence all over his greatest professional achievements.

In 1945 scientific computing was just being born, an early example having been provided by a simulation of the plutonium bomb the previous year. Von Neumann was a visionary in scientific computing, having already created the first shock capturing method and realized the necessity of tackling the solution of shock waves through numerical investigations. The first real computers were a twinkle in von Neumann’s eye. Lax was exposed to these ideas and, along with his mentors at New York University (NYU), Courant and Friedrichs, soon set out making his own contributions to the field. It is easily defensible to credit Lax as one of the primary creators of the field of Computational Fluid Dynamics (CFD), along with von Neumann and Frank Harlow. All of these men had a direct association with Los Alamos and access to the computers, resources and applications that drove the creation of this area of study.

Lax’s work started with his thesis at NYU, and continued with a year on staff at Los Alamos from 1949 to 1951. It is remarkable that upon leaving Los Alamos to take a professorship at NYU, his vision of the future technical work in the area of shock waves and CFD had already achieved remarkable clarity of purpose and direction. He spent the next 20 years filling in the details and laying the foundation for CFD for hyperbolic conservation laws across the world. He returned to Los Alamos every summer for a while and encouraged his students to do the same. He always felt that the applied environment should provide inspiration for mathematics, and the problems studied by Los Alamos were weighty and important. Moreover he was a firm believer in the cause of the defense of the country and its ideals. Surely this was a product of being driven from his native Hungary by the Nazis and their allies.
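
As an aside for readers who have not met it, the scheme that bears his name gives the flavor of this work: the Lax-Friedrichs method advances a conservation law u_t + f(u)_x = 0 with a single centered, stabilized update. Here is a minimal sketch in Python; the grid size, Burgers’ flux and initial data are my illustrative choices, not anything from the book.

    import numpy as np

    def lax_friedrichs_step(u, flux, dx, dt):
        # One Lax-Friedrichs update on a periodic grid:
        # u_j^{n+1} = (u_{j+1} + u_{j-1})/2 - dt/(2 dx) * (f(u_{j+1}) - f(u_{j-1}))
        up = np.roll(u, -1)   # u_{j+1}
        um = np.roll(u, +1)   # u_{j-1}
        return 0.5 * (up + um) - dt / (2.0 * dx) * (flux(up) - flux(um))

    # Burgers' equation, f(u) = u^2/2: a smooth wave steepens into a shock
    # that the scheme captures without oscillation (at the price of smearing).
    nx = 400
    dx = 1.0 / nx
    x = (np.arange(nx) + 0.5) * dx
    u = np.sin(2.0 * np.pi * x)
    dt = 0.5 * dx / np.abs(u).max()   # CFL-limited time step
    for _ in range(200):
        u = lax_friedrichs_step(u, lambda v: 0.5 * v * v, dx, dt)

The built-in numerical dissipation of the update is exactly the kind of regularization a shock needs, which is why the scheme remains a standard first example in CFD.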

Lax also comes from a Hungarian heritage that provided some of the greatest minds of the 20th Century, with von Neumann and Teller being standouts. Their immense intellectual gifts were driven westward to America by the incalculable hatred and violence of the Nazis and their allies in World War 2. Ultimately, the United States benefited by providing these refugees sanctuary against the forces of hate and intolerance. This among other things led to the Nazis’ defeat, and should provide an ample lesson regarding the value of tolerance and openness.

The book closes with an overview of Lax’s major areas of technical achievement in a series of short essays. Lax received the Abel Prize in 2005 because of the depth and breadth of his work in these areas. While hyperbolic conservation laws and CFD were foremost in his resume, he produced great mathematics in a number of other areas. In addition he provided continuous service to NYU and the United States in broader scientific leadership positions.

Before laying out these topics the book makes a special effort to describe Lax’s devotion to the creation of mathematics that is both pure and applied. In other words, beautiful mathematics that stands toe to toe with any other pure math, but also has application to problems in the real world. He has an unwavering commitment to the idea that applied math should be good pure math too. The two are not in any way incompatible. Today too many mathematicians are keen to dismiss applied math as a lesser topic, beneath pure math as a discipline.

This attitude is harmful to all of mathematics and the root of many deep problems in the field today. Mathematicians far and wide would be well-served to look to Lax as a shining example of how to think, solve problems, be of service and contribute to a better world.

…who may regard using finite differences as the last resort of a scoundrel that the theory of difference equations is a rather sophisticated affair, more sophisticated than the corresponding theory of partial differential equations.

—Peter Lax

13 Things that produce a mythical or legendary code

19 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

After all, I believe that legends and myths are largely made of ‘truth’.

― J.R.R. Tolkien

For the purposes of this post, “code” = “modeling & simulation tool,” rather than merely the set of instructions in a programming language. Some codes are viewed as being better and more worthy of trust than others. The reasons for such distinctions are many and varied, but most often vague and clouded in mystery. I hope to shed some light on the topic.

Legend does not contradict history. It preserves the fundamental fact but magnifies and embellishes it.

― Adrien Rouquette

Often codes become useful by simply being the first to achieve success with a difficult and important problem. In other cases the authors of the code are responsible for the code’s mythic status. Certain authors of codes bring with them a pedigree of achievement that breeds confidence in the product of their work. The code itself is a set of computer instructions comprising an executable tool. It is a combination of a model of physical reality, a method to solve that model including algorithms developed to optimize the method, and auxiliary software that connects the code to the computer itself. Together with the practices of the users of the code, and options enabled by the code itself, the modeling capability is defined. This capability is then applied to problems of interest, and success occurs if the comparisons with observations of reality are judged to be high quality.
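
To make that anatomy concrete, here is a schematic sketch in Python of the layering just described; the names and structure are mine, purely illustrative, and not drawn from any actual simulation code.

    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class SimulationCode:
        # A "code" in the sense used here: a model of reality, a method
        # (plus its algorithmic options) to solve it, and auxiliary
        # software connecting it to the machine and the user.
        model: Callable                 # discrete model of physical reality
        method: Callable                # numerical method advancing the model one step
        options: dict = field(default_factory=dict)   # options enabled by the code/user
        postprocess: Callable = lambda state: state   # auxiliary analysis/visualization

        def run(self, state, steps):
            for _ in range(steps):
                state = self.method(self.model, state, **self.options)
            return self.postprocess(state)

The point of the sketch is that the modeling capability is the whole stack, not any single layer; judgments of quality attach to the ensemble.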

Computers are good at following instructions, but not at reading your mind.

—Donald Knuth

What sorts of things produce a code of legend and myth?

Myth must be kept alive. The people who can keep it alive are the artists of one kind or another.

― Joseph Campbell

  1. The code allows new problems to be solved, or solved correctly. Often a new interface or setup capability is key to this capacity as well as new models of reality. This has the same dynamic as discovery does in other areas. Being first is an incredibly empowering aspect of work and often provides a de facto basis for success.
  2. The code allows problems to be solved better than before by whatever standard is used by the community. Sometimes being first is not enough because the quality of solution isn’t good enough. The discovery ends up being delayed until the results are good enough to be useful. As such, success is related to the quality of results and the expectations or standards of a technical community.
  3. The code solves a standing problem that hasn’t been solved before, or solves it to a degree that instills confidence. Sometimes problems are acknowledged and become a standing challenge. When someone creates a tool that provides an effective solution to this sort of problem, it creates a “buzz” and provides the push the code needs for broader adoption.
  4. The code is strongly associated with success in application space (quality by association). If the code is strongly associated with a successful application product, the code can inherit its virtue. Usually this sort of success will be strongly associated with an institution or national program (like ICF, inertial confinement fusion). The code’s success can persist for as long as the application’s success, or in some cases outlast it.
  5. The code is reliable (robust) and produces useful results as a matter of course. In some areas of modeling and simulation codes are fragile, or too fragile to solve problems of interest. In such cases a code will make a breakthrough when it simply runs problems to completion and the results are physically or conceptually plausible. Depending on the situation, this lowly standard will then transition to other forms of success as the standards for solutions improve.
  6. The code produces physically reasonable solutions under difficult circumstances. This is similar to the robustness virtue, but a bit better. Sometimes robustness is achieved by producing really terrible solutions (often very heavily diffused, or smeared out). This often destroys significant aspects of the solution. A better answer without such heavy-handed methods will win the code new followers who evangelize its use, or perhaps embarrass those holding onto the past.
  7. The code is associated with someone with a pedigree, such as an acknowledged trailblazer in a field seminal to the application or code specialties. This is praise by association. Someone who is a giant in a field will produce a wake of competence, which is almost completely associated with a cult of personality (or personal achievement).

    Frank Harlow with Jacob Fromm

  8. The code’s methods are uniquely focused on the application problem area and not generalized beyond it. Sometimes a code is so focused in an important niche area that it dominates the analysis like no general-purpose code can. Often this means that the code caters to the basic needs of the analysis specifically and provides a basis of solution of application-specific problems that no general-purpose code can compete with.
  9. The code solves a model of reality that no other code can. In other cases, the code has models no other code provides. These models can be enabling because standard models are not sufficient to explain reality (i.e., fail validation). The new model may require some unique methodology for its solution, which together with the model provide a distinct advantage.
  10. The code is really fast compared to alternatives. For a lot of analysis questions it is important to be able to run the code many times. Analysts like getting answers faster rather than slower, and a quick turn-around time is viewed as a great benefit. If a code takes too long to get an answer, the ability to fully explore problems via parameter or scenario variation can be negatively impacted.
  11. The code’s solutions are amenable to analysis, or it enables comparisons to observations. This has a lot more to do with the auxiliary analysis than the code itself. A code that has good visualization or data analysis built into its analysis system can provide significant encouragement for the use of the code itself.
  12. The code produces results that are comfortable to the community, or define the standard for the community. Sometimes the code simply either meets or sets the expectations of the community using it for analysis. If it confirms what they tend to believe already, the analysts have greater comfort using the code.
  13. The code’s methodology is comfortable to the community (and its intrinsic bias). For example, the model and its solution are solved in a Lagrangian frame of reference, and the community only trusts Lagrangian frame solutions.

Storytellers seldom let facts get in the way of perpetuating a legend, although a few facts add seasoning and make the legend more believable.

― John H. Alexander

Sometimes a code has one or more of these items going for it. Once the code becomes used and trusted, it is the incumbent, and it is very difficult to displace from usage. This is even true when unambiguously better methods are available. This is just a fact of life.

Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.

— Rich Cook


Why do we do this stuff?

12 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

The road to Hell is paved with the best of conscious intentions.

― Elizabeth F. Howell

Let me get to one of the key punch lines for this post: no amount of mesh refinement, accuracy or computer speed can rescue an incorrect model. The entire reason for doing modeling and simulation is impacting our understanding of, or response to, the reality of the Universe. The only fix for a bad model is a better model. Better models are not something we are investing much effort in. This gets to a fundamental imbalance in high performance computing, where progress is now expected to come almost purely through improvements in the performance of hardware.
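
A tiny numerical experiment makes the punch line concrete. Suppose reality is a wave moving at speed 1.0 but our model uses speed 0.9. Refining the mesh drives the discretization error to zero while the error against reality stalls at the model error. The speeds, profile and upwind scheme in this Python sketch are my illustrative choices.

    import numpy as np

    a, b, t_end = 1.0, 0.9, 1.0    # reality's wave speed, the model's (wrong) speed
    g = lambda s: np.exp(-100.0 * (s - 0.5) ** 2)   # initial profile, periodic on [0, 1)

    for nx in (50, 100, 200, 400, 800):
        dx = 1.0 / nx
        x = (np.arange(nx) + 0.5) * dx
        nsteps = int(np.ceil(t_end * b / (0.5 * dx)))
        dt = t_end / nsteps          # keeps the CFL number b*dt/dx <= 0.5
        u = g(x)
        for _ in range(nsteps):      # first-order upwind solve of the *wrong* model
            u = u - b * dt / dx * (u - np.roll(u, 1))
        err_numerical = np.abs(u - g((x - b * t_end) % 1.0)).max()  # -> 0 with refinement
        err_reality = np.abs(u - g((x - a * t_end) % 1.0)).max()    # plateaus at model error
        print(f"nx={nx:4d}  error vs model={err_numerical:.3f}  error vs reality={err_reality:.3f}")

No amount of refinement moves the error against reality below the gap between the two models; only fixing the speed, that is the model, can.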

Success doesn’t come to you; you go to it.

― T. Scott McLeod

If the field were functioning in a healthy manner, the dynamic would be fluid and flexible. Sometimes a new model would spur developments in methods and algorithms for its solution. This would ultimately spill down to software and hardware developments. A healthy dynamic would also manifest itself in the need for improvements in software and hardware to allow for solutions of meaningful models. The issue at hand today is the 20-year history of emphasis on hardware and its inability to yield progress as promised. It is time to recognize that the current trajectory is imbalanced and needs significant alteration to achieve progress commensurate with what has been marketed to society at large.

There can be no ultimate statements in science: there can be no statements in science which cannot be tested, and therefore none which cannot in principle be refuted, by falsifying some of the conclusions which can be deduced from them.

― Karl Popper

Modeling and simulation has become an end unto itself and lost some of its connection to its real reason for being done. The reason we conduct modeling and simulation is to understand, explain or influence reality. All of science has the objective of both uncovering the truth of the Universe and allowing man to apply some measure of control to it. As with most things practiced as an art, modeling and simulation is a deep field, combining many disparate fields toward its accomplishment. This depth allows practitioners to lose track of the real purpose, and to focus on the conduct of science to the exclusion of its application.

Science has an unfortunate habit of discovering information politicians don’t want to hear, largely because it has some bearing on reality.

― Stephen L. Burns

Why would any of this make a difference?

Losing sight of the reason for conducting an activity costs us the capacity to best utilize the field to make a difference. Science has a method, and the manner of its conduct is important to keep in mind. Computational science is a bridge between the reality of physics and engineering and the computers that enable its study. The biggest issue is the loss of perspective on what really determines the quality of modeling and simulation. Our current trajectory is focused almost exclusively on the speed of the computer as the route to quality. We have lost the important perspective that no computer can save a lousy model. It just assures a more expensive, high-fidelity wrong solution.

The quest for absolute certainty is an immature, if not infantile, trait of thinking.

― Herbert Feigl

The wrong solutions we are getting are not terrible, just limited. Science works properly when there is a creative tension between experiments and theory. Theory can be powered by computing, allowing the solution of models impossible without it. Experiments must test these theories, either by being utterly new or by employing better diagnostics. Without the experiment to test, confirm or deny, theory can rot from within, essentially losing connection with reality. This fate is befalling our theories today by fiat. Our models are almost assumed to be correct and not subject to rigorous testing. More powerful computers are simply assumed to yield better answers. No impetus is present to refine or develop better models where all evidence points toward their utter necessity.

…if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

― Richard P. Feynman

Applied mathematics is a closely related field where the same slippage from reality is present. Again, the utility of applied mathematics is distinctly like that of computing; it is utterly predicated upon the model’s quality vis-à-vis reality. In the period from World War 2 until around 1990, applied mathematics eagerly brought order and rigor to modeling, simulation and related activities. It became an able and honored partner for the advance and practice of science. Then it changed. It began to desire a deeper sense of mathematical honor, as pure mathematics had in its eyes. In doing so applied math turned away from being applied and toward being governed by purely mathematical qualities. The lack of balance has emptied applied math’s capacity to advance science. The same has happened with computing. We are all poorer for it.

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

All of this may be overcome and the balance may be resurrected. All that is needed is to reconnect these fields with application. Application is a Gordian knot whose very nature powers science. Without the riddle and difficulty of application, the fields lose their vigor. That vigor is powered by attempting the solution of seemingly intractable problems. Without the continual injection of new ideas, the science cannot prosper. Such prosperity is currently being denied by a lack of connectivity to the very reality these fields could help to master. Such mastery is being denied by a lack of faith in our ability to take risks.

The intention (of an artist) is (the same as a scientist)…to discover and reveal what is unsuspected but significant in life.

― H W Leggett

Bad models abound in use today. A lot of them should be modified or discarded, but in today’s direction for scientific computing, we simply claim that a faster computer will open the door to solution. Many idealized equation sets used in modeling yield intrinsically unphysical solutions. The Euler equations without dissipation are a prime example. Plasma physics is yet another place where unphysical models are used because dissipation mechanisms are small. In macroscopic models dissipation is omnipresent, and leads to satisfaction of the second law of thermodynamics. Ideal equations remove this essential aspect of modeling by fiat.
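
A classic small example of the trouble is Burgers’ equation with expansion data. The entropy-satisfying answer is a rarefaction fan, but a scheme carrying no dissipation at the sonic point preserves an unphysical expansion shock forever, while the dissipative Lax-Friedrichs flux spreads it into the correct fan. The grid, time step and Murman-Roe-type flux in this Python sketch are my illustrative choices.

    import numpy as np

    # Burgers' equation u_t + (u^2/2)_x = 0 with u = -1 for x < 0, u = +1 for x > 0.
    nx = 200
    dx = 2.0 / nx
    x = -1.0 + (np.arange(nx) + 0.5) * dx
    f = lambda u: 0.5 * u * u
    dt = 0.4 * dx

    def murman_flux(ul, ur):
        # The Roe/Murman speed for Burgers is (ul + ur)/2; it vanishes at the
        # sonic point, so the flux carries no dissipation exactly where needed.
        speed = 0.5 * (ul + ur)
        return np.where(speed >= 0.0, f(ul), f(ur))

    def lax_friedrichs_flux(ul, ur):
        # Centered flux plus explicit dissipation proportional to the jump.
        return 0.5 * (f(ul) + f(ur)) - 0.5 * (dx / dt) * (ur - ul)

    def evolve(flux, steps):
        u = np.where(x < 0.0, -1.0, 1.0)
        for _ in range(steps):
            F = flux(u, np.roll(u, -1))           # interface fluxes F_{j+1/2}
            u = u - dt / dx * (F - np.roll(F, 1))
        return u

    u_frozen = evolve(murman_flux, 100)           # stays a step: entropy-violating
    u_fan = evolve(lax_friedrichs_flux, 100)      # spreads into the physical fan

The dissipation-free scheme is perfectly conservative and perfectly wrong; the second law has to be put back in, here by numerical dissipation.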

The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

—Sir Arthur Stanley Eddington

In no place is this more greatly overloaded with context than turbulence. There is a misbegotten belief that solving the incompressible Navier-Stokes equations will unveil the secrets of turbulence. Incompressibility is fundamentally unphysical and may remove fundamental aspects of turbulence through its invocation. Incompressibility implies an infinite speed of sound and a lack of thermodynamics. Connections between the incompressible and compressible equations only exist for adiabatic (dissipation-free) flows. Turbulence is characterized by dissipation in the limit of vanishing viscosity, which implies derivative singularities in the flow. Compressible fluids have this character, and its nature is highly associated with the details of thermodynamics. Incompressible flows have not been clearly associated with this character, and the lack of thermodynamics is a likely source of this failing.
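
One line of thermodynamics makes the infinite-sound-speed claim concrete (a standard relation, not anything specific to this argument): the adiabatic sound speed is

    c^2 = (\partial p / \partial \rho)_s

and imposing incompressibility, \rho = const, forces \partial\rho/\partial p = 0, so c diverges. Pressure then adjusts instantaneously everywhere, and the equation of state, the thermodynamics, drops out of the dynamics entirely.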

The plural of anecdote is not data.

― Marc Bekoff

Another aspect of our continuum modeling codes is the manner of describing material response. We tend to describe materials in a homogeneous manner; that is, we “paint” them into a physical region of the problem. All the aluminum in a problem will be described by the same constitutive laws, without regard to the separation between the scale of the computational mesh and the physical scales in the material. This approach has been around for over 50 years and shows no signs of changing. It is long past time for this to change.
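
A sketch of what “painting” looks like in practice, and the kind of subscale variability it ignores, follows in Python; the nominal strength and the 5% lognormal spread are invented for illustration, not measured values.

    import numpy as np

    rng = np.random.default_rng(1)
    ncells = 1000

    # Status quo: every cell painted "aluminum" gets one identical constant,
    # regardless of grain structure, porosity, or any other subscale physics.
    nominal_yield_strength = 0.27e9   # Pa, an illustrative nominal value
    painted = np.full(ncells, nominal_yield_strength)

    # One alternative: represent unresolved heterogeneity as cell-to-cell
    # variability, here a ~5% lognormal spread about the nominal value.
    heterogeneous = nominal_yield_strength * rng.lognormal(0.0, 0.05, size=ncells)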

It is more important to be of pure intention than of perfect action.

― Ilyas Kassam

The key is to apply the scientific method with rigor and vigor. Right now scientific computing has vacated its responsibility to apply the scientific method appropriately. Too often modeling and simulation are touted as being the third leg of science, equal to theory and experiment. Modeling should always be beholden to experimental and field observation, and should the model be found to be in opposition to the observation, it must be found faulty. Modeling is rather an approach to finding solutions to theory more generally and broadly. Thus theory can be extended to more nonlinear and complex models of reality. This should aid the ability of theory to describe the physical universe. Often simulation can act as a laboratory for theory, where suppositional theory can be tested for congruence with observation (computational astrophysics is a prime example of this).

Intention is one of the most powerful forces there is. What you mean when you do a thing will always determine the outcome. The law creates the world.

― Brenna Yovanoff

The bottom line is whether we are presently on a path that allows modeling and simulation to take its proper place in impacting or explaining reality as part of the scientific method. I think the answer today is a clear and unequivocal no. A combination of modern-day political correctness regarding the power of computational hardware, over-selling of computing, fear and risk avoidance all lead to this. Each of these factors needs to be overcome to place us on the road to progress.

The tiniest of actions is always better than the boldest of intentions

― Robin Sharma

What needs to happen to make things better?

  • Always connect the work in modeling and simulation to something in the real world;
  • Balance effort with its benefit to the real world;
  • Find a way to give up on determinism to an appropriate degree, and model the degree of variability seen in reality;
  • Do not overemphasize the capacity of computational power to simply solve problems by fiat;
  • Take risks, especially risks that have a high chance of failure but large payoffs;
  • Allow glorious failure, and reward risk-taking done in a technically appropriate manner;
  • Pursue new methods and algorithms, which provide the potential for quantum improvements in efficiency and accuracy as well as the promise of new uses for computational models;
  • Starve no single aspect of modeling and simulation of attention, as every part of this ecosystem must be healthy to achieve progress in predictive science;
  • Stop settling for legacy models, methods and codes just because they are “good enough”; focus on quality and excellence.

In the republic of mediocrity, genius is dangerous.

― Robert G. Ingersoll

The Best Computer

05 Friday Jun 2015

Posted by Bill Rider in Uncategorized

≈ 2 Comments


What’s the “best” computer? By what criteria should a computer be judged? Best for what? Is it the fastest? Or the easiest to use? Or the most useful?

The most honest answer is probably the most useful, or most impactful, computer in how I live my life and work, so I’ll answer in that vein.

Have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

― Steve Jobs

Details matter, it’s worth waiting to get it right.

― Steve Jobs

If I had to answer honestly, it’s probably the latest computer I bought, my new iPhone. It’s an absolute marvel. It is easy to use and useful all at once. I have a vast array of applications to use, plus I can communicate with the entire world and access an entire world’s worth of information. I can access maps, find a place to eat lunch, take notes, access notes, find out the answers to questions, keep up with friends, and make new ones. It also allows me to listen to music, either stored or via “radio”. It is so good that I am rarely without it. It helps me work out at the gym with an interval timer that I can program to develop unique tailored workouts. Anything that links to the “cloud” for data is even better because the data on the iPhone is the same as on the other platforms I use. The productivity and efficiency that I can work with is now simply stunning. The word awesome doesn’t quite do it justice. If you gave it to me ten years ago, I’d have thought aliens delivered the technology to humans.

We don’t get a chance to do that many things, and every one should be really excellent. Because this is our life.

― Steve Jobs

The fastest computer I have access to isn’t very good, or useful. It is just fast and really hard to use. In all honesty it is a complete horror show. For the most part this really fast computer is only good for crunching a lot of numbers in a terribly inefficient manner. It isn’t merely not a multi-purpose computer; it is a single-purpose computer that is quite poor at delivering that single purpose. Except for its speed, it compares poorly to the supercomputers I used over 20 years ago. I say this noting that I am not prone to nostalgia at all. Generally I favor the modern over the past by a wide margin. This makes the assessment of modern supercomputing all the more damning.

Don’t be trapped by dogma — which is living with the results of other people’s thinking.

― Steve Jobs

Your time is limited, so don’t waste it living someone else’s life.

― Steve Jobs

Unlike the iPhone with its teeming modernity, the modern supercomputer is an ever more monstrous proposition with each passing year. Plans for future supercomputers are sure to create a new breed of monsters (think Godzilla, a good name for one of the machines!) that promise to consume energy like American consumers drunk on demonstrating their God-given right to excess. They also promise to be harder to use, less reliable, and nearly impossible to program. They might just be truly evil monsters in the making. The evil being done is primarily the loss of opportunity to make modeling and simulation match the hype.

Anything worth doing, is worth doing right.

― Hunter S. Thompson

It isn’t that the hyped vision of modeling and simulation as a third way for science is so flawed; it is our approach to achieving this vision that is so counter-productive. The vision is generally sound, provided that the steps we take actually lead to such an outcome. The overbearing emphasis on computing speed as the key path to producing a predictive modeling capability is fatally flawed. It is a path that lacks the sort of checks and balances that science needs to succeed. A faulty model cannot predict reality regardless of how fast it executes on a computer, or how refined the computational “mesh” is. Algorithmic improvements can provide new applications, solve unsolved problems, and provide greater efficiency that pure computational speed cannot deliver.

It’s not like I’m all into nostalgia and history, it’s just that I can’t stand the way things are now

― Novala Takemoto

The current fastest computer certainly isn’t the best supercomputer ever built. That crown lies on the head of the Crays of the 70’s, 80’s and 90’s built by that genius Seymour Cray. In the form of the X-MP, Y-MP, C90 or Cray 2, the supercomputer reached its zenith. In relative terms these Crays were joys to use and program. They were veritable iPhones compared to the rotary phones we produce today. At that apex in functionality and utility for supercomputing, massively parallel computing was born (i.e., the attack of the killer micros), and the measure of a supercomputer became speed above all else. Utility and usefulness be damned. The fully integrated software-hardware solution found in a Cray Y-MP became a relic in the wake of the “need for speed”.

Study the past if you would define the future.

― Confucius

In a sense the modern trajectory of supercomputing is quintessentially American: bigger and faster is better by fiat. Excess and waste are virtues rather than flaws. Except the modern supercomputer is not better, and not just because it doesn’t hold a candle to the old Crays. These computers just suck in so many ways; they are soulless and devoid of character. Moreover they are already a massive pain in the ass to use, and plans are afoot to make them even worse. The unrelenting priority of speed over utility is crushing. Terrible is the only path to speed, and terrible is coming with a tremendous cost too. When a colleague recently quipped that she would like to see us get a computer we actually wanted to use, I’m convinced that she had the older generation of Crays firmly in mind.

The future is already here – it’s just not evenly distributed.

― William Gibson

So, who are the geniuses that created this mess?

We have to go back to the mid-1990’s and the combination of computing and geopolitical issues that existed then. The path taken by the classic Cray supercomputers appeared to be running out of steam insofar as improving performance. The attack of the killer micros was defined as the path to continued growth in performance. Overall hardware functionality was effectively abandoned in favor of pure performance. The pure performance was only achieved on benchmark problems that had little in common with actual applications. Performance on real applications took a nosedive; a nosedive that the benchmarks conveniently covered up. We still haven’t woken up to the reality.

Remembrance of things past is not necessarily the remembrance of things as they were.

― Marcel Proust

Geopolitically we saw the end of the Cold War, including the cessation of nuclear weapons testing. In the United States a program including high performance computing was sold as the alternative to nuclear testing (the ASCI program, now the ASC program). This program focused on computing power as the sole determinant of success. Every other aspect of computing (modeling, methods, algorithms, and V&V) became a veritable afterthought and was supported on a shoestring budget. The result has been fast, unusable computers that deliver a pittance of their promised performance, and a generation of codes with antiquated models and algorithms (written mostly in C++). We’ve been on this foolish path ever since, to the extent that it’s become the politically correct and viable path going forward. We have lost a generation of potential scientific progress at the altar of this vacuous model for progress.

It shocks me how I wish for…what is lost and cannot come back.

― Sue Monk Kidd

Why do we choose this path when other more useful and rational approaches are available?

Risk aversion.

In the past forty-some-odd years we have as a society lost the ability to take risks, even when the opportunity available is huge. The consequence of failure has become greater than the opportunity for success. In computing this trend has been powered by Moore’s law, the exponential growth in computing power over the course of the last 50 years (it’s not a law, just an observation). Under Moore’s law you just have to let time pass and computer performance will grow. It is a low-risk path to success.
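
The arithmetic of the low-risk path is simple. Using the usual rough statement of a doubling every two years, patience alone buys a predictable factor; a one-line sketch in Python:

    def moores_law_factor(years, doubling_period=2.0):
        # Projected performance growth from waiting alone; an observation,
        # not a law, and the two-year period is the usual rough figure.
        return 2.0 ** (years / doubling_period)

    print(moores_law_factor(10.0))   # ~32x from a decade of waiting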

When did the future switch from being a promise to being a threat?

― Chuck Palahniuk

Every other aspect of modeling and simulation entails far greater risk and the opportunity either to fail, or to fail to deliver in a predictable manner. Innovation in many areas critical to modeling and simulation is prone to episodic or quantum leaps in capability (especially modeling and algorithms). These areas of potential innovation are also prone to failures, where ideas simply don’t pan out. Without the failures you don’t have the breakthroughs; hence the fatal nature of risk aversion. Integrated over decades of timid, low-risk behavior, we have the makings of a crisis. Our low-risk behavior has already created a vast, immeasurable gulf between what we can do today and what we should be doing today.

You realize that our mistrust of the future makes it hard to give up the past.

― Chuck Palahniuk

An aspirational goal for high performance computing would be the creation of a computing environment that means as much for scientific work as my iPhone means for how I live my life. Today we are very far from that ideal. The key to the environment isn’t the speed of the hardware, but rather the utility of how the hardware is integrated with the needs of the user. In high performance computing the user needs to produce scientific results, which depend far more on the modeling’s fundamental character than on the speed of the computer.

The future depends on what you do today.

― Mahatma Gandhi
