The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: January 2015

Verification, you’re doing it wrong.

29 Thursday Jan 2015

Posted by Bill Rider in Uncategorized

≈ 2 Comments

…Next time you’re faced with a choice, do the right thing. It hurts everyone less in the long run.
― Wendelin Van Draanen

As the best practices in scientific computing continue to improve, verification is more frequently being seen in papers and reports. The progress over the past decade has been fantastic to see. Despite this progress there are some underlying problems that are pervasive in the community’s practice, and whose impact will ultimately reduce progress. These poorly executed practices are inhibiting the characterization of methods and their impact (positive or negative) on solutions.

Firstly, code verification is almost always applied to problems that bear little resemblance to the problems that are intended to be solved in the application of a method. Code verification usually only reports order of accuracy for the purposes of matching the theoretical expectations. This is meeting the minimal requirements of code verification as a practice. Often ignored is the capability to report the precise numerical error for the problem being computed. Both the rate of convergence and the error contain important and useful information for the developers and users of a numerical method. Both should be systematically reported rather than just the minimum requirement.
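Reporting both numbers is cheap once the errors are in hand. Given errors measured against an exact solution on a sequence of refined grids, the observed order of accuracy follows from the ratio of successive errors; a small sketch in Python, with made-up error values purely for illustration:

```python
import math

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grids
    related by a uniform refinement ratio."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Made-up errors versus an exact solution on grids h, h/2, h/4
errors = [4.0e-3, 1.1e-3, 2.9e-4]
for e_coarse, e_fine in zip(errors, errors[1:]):
    print(f"error {e_fine:.2e}, observed order {observed_order(e_coarse, e_fine):.2f}")
```

The order tells you whether the implementation matches theory; the error tells you how accurate the computation actually is. Both belong in the report.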

For solution verification the problem is much worse. Even when solution verification is done we are missing important details. The biggest problem is the lack of solution verification in applications of scientific computing. Usually the problem is simply computed, graphs are overlaid, and success is declared because the comparison looks good enough. No quantitative sense of whether the solution is accurate is given. An error estimate for the solution shown, or better yet a convergence study, would provide much greater faith in the results. In addition to the numerical error, the rate of convergence would also provide information on the tangible expectations for the solution of practical problems. Today such expectations are largely left to be guessed by the reader.
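One standard way to produce such an error estimate without an exact solution is Richardson extrapolation: solutions on three systematically refined grids yield both an observed convergence rate and an estimate of the error in the finest solution. A hedged sketch, with invented solution values standing in for a real convergence study:

```python
import math

def richardson_estimate(f_h, f_2h, f_4h, r=2.0):
    """Estimate the convergence rate and the error magnitude of the
    finest-grid solution f_h from three grids with refinement ratio r."""
    p = math.log((f_4h - f_2h) / (f_2h - f_h)) / math.log(r)
    error = abs(f_h - f_2h) / (r**p - 1.0)
    return p, error

# Invented values from a hypothetical second-order computation
p, err = richardson_estimate(f_h=1.001, f_2h=1.004, f_4h=1.016)
print(f"observed rate {p:.2f}, estimated error {err:.1e} in the finest solution")
```

Reporting the rate and the error estimate alongside the overlaid graphs would give the reader exactly the quantitative footing argued for here.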

In any moment of decision, the best thing you can do is the right thing. The worst thing you can do is nothing.
― Theodore Roosevelt

I believe one of the deeper issues is the belief that the rate of convergence and numerical error only matter for problems with analytical results. They matter for code verification purposes, but also matter greatly for practical problems. In fact they are probably more important for practical problems, yet they are rarely reported. To get things working better we need to move to a practice where both convergence and error are reported as a matter of course. It would be a great service to the community.


Sustainable Success Depends on Foundations

27 Tuesday Jan 2015

Posted by Bill Rider in Uncategorized

≈ 2 Comments

If you have built castles in the air, your work need not be lost; that is where they should be. Now put the foundations under them.

― Henry David Thoreau

In all endeavors we desire success, and the best success endures. The endurance of success is predicated on the foundations upon which that success is grounded. If foundations are systematically deprived of support, they will crumble and induce a crisis. Another way of saying this is that success is dependent on balance. If short-term success is continually rewarded, long-term success will be undermined. These principles apply broadly, including to the conduct of computational science and scientific computing.

To apply this principle it is important to understand the nature of the foundation, and how the inter-linking areas of focus come together to provide a broad base for success. I see “computing” as a general stream of activities running from an impact on the reality of people’s lives to the method of achieving this on a computer (models with methods and algorithms). These methods and algorithms need to be expressed to the computer in useful form through computer code, and ultimately have a computing platform adequate to the purpose. Every single step in the chain is important, but the relative value and priority of each is different. A lot depends on what the pacing requirements for progress are, but the focus on the value proposition should be an imperative.

Insanity is doing the same thing, over and over again, but expecting different results

― Narcotics Anonymous

Let’s explore.

I made the argument that the thing that has set apart computers in recent times is the ability to make things matter to our daily lives, in and out of work. Computers can now have a huge impact on every aspect of living. When this happened the value of the entire computing enterprise exploded to a level unimaginable before. Every other aspect (the model, algorithm, code and computer) needed to be competently executed and adequate, but the connection to reality was the enabler for unprecedented growth.

Observing and understanding are two different things.

― Mary E. Pearson

The secondary fuel for this revolution is the model of interaction and the algorithms to efficiently deliver the value. The actual code and computing needs of this delivery must be competently executed, but beyond that offer nothing distinguishing. This is a massive lesson sitting right in front of the scientific community, which, judging by its actions, seems not to have understood these observations. Today’s computing emphasis has completely inverted the value stream revolutionizing computing in the rest of the World.

The computing hardware has taken center stage in scientific computing, followed by computer code. The methods and algorithms have greatly diminished importance in charting the path forward. More troublingly, the methods and algorithm work is typically focused upon the effective implementation on new, exotic computing hardware, not establishing fundamentally new capabilities. It is important to get the most out of expensive computers, but we fail to harness the power of algorithms; the greatest power of algorithms is to transform what is possible to do with a model of reality. They can change what is even conceivable to solve, and open new vistas of fidelity in solution. A prime example is Google’s search: the value is putting the right information in people’s hands, the model is the connectivity of the Internet, and the PageRank algorithm makes it happen well enough. The code and computers putting it together are necessary, but not innovative.
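The PageRank idea mentioned above is compact enough to sketch: ranks are the stationary distribution of a random surfer who mostly follows links and occasionally jumps to a random page. A minimal power-iteration sketch in Python; the four-page link graph is an invented toy, and this is the textbook form of the algorithm, not Google’s production system:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power iteration for PageRank on a dict mapping
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Everyone gets the random-jump share, then link shares
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# Toy graph: every page links somewhere; "c" is the most linked-to
toy = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(toy)
print(max(ranks, key=ranks.get))
```

The mathematics is a single eigenvector computation; the point of the example is how little machinery the idea itself needs compared with the infrastructure built around it.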

But better to get hurt by the truth than comforted with a lie.
― Khaled Hosseini

The models of reality are important as the interface between reality and algorithms for solution. Without the model, all the algorithm work is for naught. Without an algorithm, all the beautiful code and powerful computers are useless. Without the model you don’t have a connection to reality. Thus the lack of focus on modeling in scientific computing is perhaps even worse.

Current work almost assumes that the available modeling is adequate for the purposes. It is most assuredly not presently adequate, and it will almost as assuredly never be completely adequate. Modeling must always be improving. If we are doing our computing correctly, the models we use should continually be coming up short. Instead, the models seem to be completely frozen in time. They aren’t advancing. For example, I believe we should be undoing the chains of determinism in simulation, but even today deterministic simulations are virtually all of the workload.

Instead of seeing a need for improvement of the underlying models, and the way these models are solved, we have a program that tries to solve the same models, with the same algorithms on massive computers only changing the fidelity of the discretization. This assumes that everything in this chain is already at its ultimate state. This implicit assumption should be rejected out of principle.

To acquire knowledge, one must study;
but to acquire wisdom, one must observe.

― Marilyn vos Savant

These concepts should be almost self-evident, but in practice we continually trade long-term success for short-term gains. We have adopted practices that lower the short-term risk by raising the long-term risk. Ultimately the entire enterprise is lurching toward a crisis in sustainability. The key to this crisis is starving the foundation of value in scientific computing that is found foremost in models and their solution via algorithms and methods. The other aspect that has been systematically shortchanged is the value of the people who provide the ideas that form models, methods and algorithms. Ultimately, the innovation in scientific computing is the intellectual labor of talented individuals.

The scientific man does not aim at an immediate result. He does not expect that his advanced ideas will be readily taken up. His work is like that of the planter—for the future. His duty is to lay the foundation for those who are to come, and point the way.

― Nikola Tesla


What is the Real Value in Code?

23 Friday Jan 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

A computer lets you make more mistakes faster than any other invention with the possible exceptions of handguns and Tequila.

― Mitch Ratcliffe

Computers have never been that important, but really it’s software that is important, but actually it’s algorithms that matter, which really isn’t true either. It is the sum total of these things that matters to our daily lives. This is true for iPhone apps and modeling & simulation in the sciences. The real value in each of these things is how they connect to real things like grocery shopping or airflow over an airplane’s wing.

Therein lies the real truth: the value of computers is the code they run; the value in code is the algorithms it implements; the value in algorithms is the problem solving of the people devising them; and the value of that problem solving is found in the real World. All of this revolves around a single unassailable truth: this is a fundamentally human activity at every level, expressed using tools that make rote calculations trivial and power connections between people.

This truth is valid whether the human activity is the search on your phone or laptop, purchasing through Amazon, predicting tomorrow’s weather, solving the airflow over an aircraft wing, or the flow of neutrinos in a supernova using the Boltzmann transport equation. The real revolution in computing is the ability of computing to matter to how we live our daily lives, whatever the activity. Given that the value in all of this is the added capacity to achieve our goals, it might be worth considering whether our priorities actually reflect this. Where these values are present in computing, the importance and value of computing has swelled. Given my personal focus on the scientific use of computing, my assessment would be that we have lost our way. The values in computing programs are horribly distorted and out of balance. A key to this is the loss of perspective on what really matters.

It’s useful to discuss what a code actually is. I’ll focus on a scientific code in particular. I will argue that the part of it that is considered the “code” is actually the least important aspect of the entire activity. The classically viewed code is a set of instructions that the computer can understand to take a sequence of steps. Usually this code is defined to solve a problem, or better yet a class or type of problem. The code is a collection of ideas whose solution the computer assists. A code is no better than the ideas it expresses and the skill of the programmer in making these instructions work. Despite this obvious aspect of computing, the quality of the ideas in code has shrunk in importance. They are merely assumed to be important. Clever, crafty and innovative aspects of the problem solving define algorithms, methods and heuristics that make the computed solution better or faster, or both.

Code itself is an algorithm. The machine actually understands a horribly opaque and obscure language that defines basic operations and moves data around. The code is a way of expressing these basic ideas using human-comprehensible language and abstractions that collect basic operations together into units. Fortran was the first of these languages, and its optimizing compiler is considered one of the most important algorithms of the 20th Century. It allowed the expression of more complex ideas to computers and greatly aided the advance of computing. Other languages have come into existence, but always with the same intent as Fortran originally had. The code is the key to unleashing the power of the computer.

 Let us remember that the automatic machine is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic consequences of slave labor.

― Norbert Wiener

In scientific computing the key connection to reality is models. The most basic models are the governing equations such as the Euler, Navier-Stokes or Boltzmann equations. These models are augmented by other models of subprocesses (often called subgrid models), and constitutive data that are typically experimentally measured and define the mean behavior of materials (accumulating the effects that would otherwise be statistical). These descriptions are the essential element in the value of computing to human activity. Their value transcends any of the other aspects: the algorithm, the code, and the computer itself. If the basic models are inadequate or faulty everything else is basically for naught. If the model is good, the rest of the components need to get it right: the algorithm or method needs to correctly or accurately solve the model, the implementation in code needs to be correct, and the computer needs to be capable of solving the problem. It is an exercise in balance and perspective. Our key issue today is a lack of balance caused by a faulty perspective.

That’s the thing about people who think they hate computers. What they really hate is lousy programmers.

― Larry Niven

Another key element is the impact and connection of the code to human activity. The use of calculated results for science or engineering is one outcome. Another is the intersection of computing with the development and continuity of talent. In all cases the human intellect and talent is the core of the value stream; this aspect cannot be overlooked or the core of the value is lost.

The computer focuses ruthlessly on things that can be represented in numbers. In so doing, it seduces people into thinking that other aspects of knowledge are either unreal or unimportant. The computer treats reason as an instrument for achieving things, not for contemplating things. It narrows dramatically what we know and intended by reason.

― George Friedman


Algorithms Have Hit the Big Time

20 Tuesday Jan 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

One of the things that seems intriguing is the appearance of the algorithm in the broader cultural milieu. Despite its inherently esoteric and abstract character, the algorithm is becoming a bit of a celebrity these days. Popular press articles have started to examine the impact of the algorithm on our daily lives and explore the power and dangers of relying upon algorithms.

 When we change the way we communicate, we change society

― Clay Shirky

Why? What is happening?

Moore’s law is fading and approaching the end of its wonderful reign (for most of the computing world it’s already effectively pushing up daisies). Gone are the halcyon days when we could be assured of waiting a couple of years and purchasing a new computer that offered double or more the performance of the old one. Given this extra power, the software on the older computer rapidly became equally or more obsolete. This is still happening today, but for different reasons; the software now has new ideas in it, new algorithms with new capabilities. From 1975 to 2005 Moore’s law produced a growth in computer power that fueled a rise in computing from a scientific backwater or corporate niche to the centerpiece of the World’s economy. Halfway through this great lurch forward, the Internet became the tie that bound all that power into a whole that was greater than any of its parts. Computing power around the world was connected together, along with all the people, whose numbers recently swelled to all of humanity as cell phones became magical handheld computers.

As the massive gains from computer power wound down, and simultaneously the Internet transitioned into a huge web of human connectivity, the value proposition for computing changed. Suddenly the greatest value in all of this power switched to connection, access and sorting information. There were some fitful starts at attacking this key problem, but one solution rose above the rest: Google. Based on the work of a couple of Stanford graduate students and some really cool mathematics, Google took the world by storm. In a decade it had transformed itself into the World’s most powerful company. An algorithm that solved the data and connectivity access problem better than anything before it fundamentally powered Google.

Communications tools don’t get socially interesting until they get technologically boring.

― Clay Shirky

Google replaced Microsoft, a computer software company, as the World’s most powerful company. In both cases computer programming was the engineering vehicle for these companies. Programming is a technique where intellectual labor is committed to a form where a computer can automatically execute a method, or algorithm, to solve a problem. Usually the computer program is actually a large collection of methods, algorithms, and heuristics that are uniquely composed together to solve problems. As these problems become more difficult and elaborate, the software gains more value.

The bottom line is that all of a sudden the algorithm and its software manifestation had eclipsed the computer hardware as a source of value. This transformation began when Microsoft rushed past IBM. IBM failed to see that software’s importance was about to eclipse hardware, and paid for it. Google put the algorithm together with the ability to give people access to information and connectivity to eclipse Microsoft. The algorithm had moved from being a topic of nerdish academic interest to one of the most powerful things in the World. The world’s economy spun on an axis determined by a handful of algorithms.

Change almost never fails because it’s too early. It almost always fails because it’s too late.

― Seth Godin

Meanwhile scientific computing has lost its mind and chosen the very path that led IBM toward disaster. The end of Moore’s law has resulted in a collective insanity of spending vast sums of money supporting the hardware path in the face of looming disaster. At the same time they have turned their backs on algorithms. Effort and focus flows into obtaining and building massive computers that are increasingly useless for real science while ignoring the value that algorithms bring. The infatuation with the biggest and fastest computer, measured by an increasingly meaningless benchmark, only grows with time. This continues while the key to progress stares them in the eye every time they do an Internet search: the power of the algorithm.

The easiest way to solve a problem is to deny it exists.

― Isaac Asimov

What the hell is going on?

Part of the problem is the ability to artificially breathe life into the corpse of Moore’s law through increasingly massively parallel computers. This has been done by moving the goalposts significantly. The LINPACK benchmark never had much to do with the core of scientific computing, and this distance has only grown over the past three decades. This benchmark papers over the myriad of vexing issues with the new computers. What once was a gulf of irritating width has widened into a chasm of dangerous proportions. Disaster looms in the not too distant future as a result.

A secondary goalpost moving is the adoption of “weak scaling”. Scaling is the metric of how well an algorithm or code uses parallel computing to solve problems faster. True (strong) scaling asks how much faster one can solve a fixed problem with more processors. Perfect scaling means that with “N” processors the problem is solved “N” times faster. Weak scaling changes this reasonable measure by making the problem “N” times bigger at the same time as the number of processors grows. If the performance of a code is poor on a single processor, weak scaling will successfully hide this fact (most scientific codes in fact suck on single processors, and suck more on many processors). In fact, our codes are performing worse and worse on single processors, and little or nothing has been done about it; weak scaling carries some of the blame by hiding the problem.
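The distinction between the two metrics can be made concrete with a few lines of code. The timings below are invented for illustration; they show how a run can look healthy under weak scaling while strong scaling exposes poor performance:

```python
def strong_efficiency(t1, tn, n):
    """Fixed problem size: perfect scaling means tn == t1 / n."""
    return t1 / (n * tn)

def weak_efficiency(t1, tn):
    """Problem grows with n: perfect scaling means tn == t1."""
    return t1 / tn

# Invented timings: a 100 s single-processor run, scaled to 64 processors
n = 64
print(strong_efficiency(100.0, 4.0, n))   # fixed-size run took 4 s
print(weak_efficiency(100.0, 110.0))      # 64x-larger problem took 110 s
```

With these invented numbers the weak-scaling efficiency looks respectable while the strong-scaling efficiency is poor, which is exactly the hiding effect described above.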

Scientific computing is fundamentally about problem solving with computers, not computers unto themselves. The field is being perverted into a fetish where the focus is computers, and problem solving is secondary. This is where we come back to a necessary focus on algorithms. Algorithms are fundamentally about solving problems, and algorithm research is about better, faster, more efficient problem solving. Everything we do in scientific computing runs through an algorithm instantiated in software. Without the algorithms and software the computers are worthless. Without the model being solved and its connection to physics and engineering the value to society is questionable. The combination of algorithm and model is an expression of human intellect and problem solving. It needs a capable computer to allow the solution, but the essence is all-human. We have lost the context of the place of the computer as a tool; it should never be an end unto itself. Yet that is what it’s become.

Any sufficiently advanced technology is indistinguishable from magic.

― Arthur C. Clarke

At the end of the 20th Century a list of the top algorithms was published (Dongarra, Jack, and Francis Sullivan. “Guest editors’ introduction: The top 10 algorithms.” Computing in Science & Engineering 2.1 (2000): 22-23.):

1. 1946: The Metropolis Algorithm for Monte Carlo.
2. 1947: Simplex Method for Linear Programming.
3. 1950: Krylov Subspace Iteration Method.
4. 1951: The Decompositional Approach to Matrix Computations.
5. 1957: The Fortran Optimizing Compiler.
6. 1959: QR Algorithm for Computing Eigenvalues.
7. 1962: Quicksort Algorithms for Sorting.
8. 1965: Fast Fourier Transform.
9. 1977: Integer Relation Detection.
10. 1987: Fast Multipole Method.

One can argue for a few differences (finite elements, shock capturing, multigrid, cryptography, …) in the list, but the bottom line is that scientific computing dominates the list. What about since the turn of the 21st Century? The algorithmic heavy hitters are Google, Facebook, Netflix, encryption, iPhone apps, … If that top ten list were redone now, Google’s PageRank would almost certainly take one of the places. Scientific computing has shrunk from the algorithmic limelight, and commercial interests have leapt to the fore. The intellectual core of scientific computing has committed to utilizing these massive computers instead of solving problems better, or smarter. It is a truly tragic loss of leadership and immensely short sighted.

…invention is a somewhat erratic thing.

— J. Robert Oppenheimer

The key to progress is balance coupled with faith in the human intellect and its power to create. These creations are wonders, this includes computers, but they are machines that are merely tools. As tools they are only as good as what controls them, the algorithms and the software. I am convinced that breakthroughs are still possible. All that is needed is the focus and resources so that great minds will prevail. The modern world of computing offers vast opportunity for science that remains unexplored. Current leadership only seems to see the same path as we have taken in the past. It seems like a low risk path, but in fact represents the highest risk possible, the loss of potential. The lessons from commercial computing are there to be seen plain as day, algorithms rule. All we need to do is pay attention to what is sitting right in front of us.


Know where the value in work resides

16 Friday Jan 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

We all die. The goal isn’t to live forever, the goal is to create something that will.

― Chuck Palahniuk

When we achieve a modicum of success professionally it usually stems from a large degree of expertise or achievement in a fairly narrow realm. At the same time this expertise or achievement has a price; it was gained through a great degree of focus, luck and specialization. Over time this causes a lack of perspective on the importance of your profession in the broader world. It is often difficult to understand why others can’t see the intrinsic value in what you’re doing. There is a good reason for this: you have probably lost sight of the reason why what you do is valuable.

Ultimately, the value of an activity is measured in terms of its impact on the broader world. Oftentimes these days economic activity is used to imply value fairly directly. This isn’t perfect by any means, but it is useful nonetheless. For some areas of necessary achievement this can be a jarring realization, but a vital one. Many monumental achievements actually have distinctly little value in reality, or the value comes far after the discovery. In many cases the discoverer lacks the perspective or skill to translate the work into practical value. Some of these are necessary to achieve things of greater value. Achieving the necessary balance in these cases is quite difficult, and rarely, if ever, achieved.

It’s always important to keep the most important things in mind, and along with quality, the value of the work is always a top priority. In thinking about computing, the place where computers change how reality is engaged is where value resides. Computers’ original uses were confined to business, science and engineering. Historically, computers were mostly the purview of business operations such as accounting, payroll and personnel management. They were important, but not very important. People could easily go through life without ever encountering a computer, and their impact was indirect.

As computing was democratized via the personal computer, the decentralization of access to computer power allowed it to grow to an unprecedented scale, but an even greater transformation lay ahead. Even this change made an enormous impact because people almost invariably had direct contact with computers. The functions that were once centralized were at the fingertips of the masses. At the same time the scope of computers’ impact on people’s lives began to grow. More and more of people’s daily activities were being modified by what computing did. This coincided with the reign of Moore’s law and its massive growth in the power and/or decrease in the cost of computing capability. Now computing has become the most dominant force in the World’s economy.

Why? It wasn’t Moore’s law although it helped. The reason was simply that computing began to matter to everyone in a deep, visceral way.

Nothing is more damaging to a new truth than an old error.

— Johann Wolfgang von Goethe

The combination of the Internet with telecommunications and super-portable personal computers allowed computing to obtain massive value in people’s lives. The combination of ubiquity and applicability to day-to-day life made computing valuable. The value came from defining a set of applications that impact people’s lives directly and are always within arm’s reach. Once these computers became the principal vehicle of communication and the way to get directions, find a place to eat, catch up with old friends, and answer almost any question at will, the money started to flow. The key to the explosion of value wasn’t the way the applications were written, or coded, or run on computers; it was their impact on our lives. The way the applications work, their implementation in computer code, or the computers themselves just needed to be adequate. Their characteristics had very little to do with the success.

It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.

― Richard P. Feynman

Scientific computing is no different; the true value lies in its impact on reality. How can it impact our lives, the products we have or the decisions we make? The impact of climate modeling is found in its influence on policy, politics and various economic factors. Computational fluid dynamics can impact a wide range of products through better engineering. Other computer simulation and modeling disciplines can impact military choices, or provide decision makers with ideas about consequences of actions. In every case the ability of these things to influence reality is predicated on a model of reality. If the model is flawed, the advice is flawed. If the model is good, the advice is good. No amount of algorithmic efficiency, software professionalism or raw computer power can save a bad model from itself. When a model is good, the solution algorithms and methods found in computer code, running on computers, enable its outcomes. Each of these activities needs to be competently and professionally executed. Each of these activities adds value, but without the path to reality and utility its value is at risk.

Despite this bulletproof assertion about the core of value in scientific computing, the amount of effort focusing on improving modeling is scant. Our current scientific computing program is predicated on the proposition that the modeling is good enough already. It is not. If the scientific process were working, our models would be improving from feedback. Instead they are stagnant and the entire enterprise is focused almost exclusively on computer hardware. The false proposition is that the computers simply need to get faster and the reality will yield to modeling and simulation.

So we have a national program that is focused on the least valuable thing in the process, and ignores the most valuable piece. What is the likely outcome? Failure, or worse than that, abject failure. The most stunning thing about the entire program is that the focus is absolutely orthogonal to the value of the activities. Software is the next largest focus after hardware. Methods and algorithms are the next highest focus. If one breaks this area of work into its two pieces, new breakthroughs and computational implementation work, the trend continues. The less valuable implementation work has the lion’s share of the focus, while the groundbreaking type of algorithmic work is virtually absent. Finally, modeling is nearly a complete absentee. No wonder the application case for exascale computing is so pathetically lacking!

 It is sometimes an appropriate response to reality to go insane.

― Philip K. Dick

Alas, we are going down this road whether it is a good idea or not. Ultimately this is a complete failure of the scientific leadership of our nation. No one has taken the time or effort to think this shit through. As a result the program will not be worth a shit. You’ve been warned.

The difference between genius and stupidity is; genius has its limits.

― Alexandre Dumas-fils

The Holiday Movie Club

11 Sunday Jan 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

One of the things that Winter holiday means to me is movies, and good ones at that. It is something my wife and I love to do, enjoy and argue about. My son noted that we are different than other families in that we see good movies while his friends see crap all the time. To sort of normalize things I’ll say that I really enjoy movies with an edge, generally favoring movies with a bit of noir in their soul. A recent example would be “Drive” from 2011; going a bit further back, some of my favorite films are “Pulp Fiction,” “Fight Club,” “American History X” and “Full Metal Jacket.” I also love some of the more majestic movies of days gone by, such as “Lawrence of Arabia” and “2001: A Space Odyssey” in particular.

Here are my holiday movie observations for the current season. I’ll assign each a letter grade, with an Academy Award-winning film usually getting an “A”. I’d give all of the above-mentioned movies this grade, and a few an “A+”.

I’ll note that some of the quotes below contain some very bad language, which you would hear in the theatre. Be forewarned.

Whiplash, A

An absolute stunner of a movie, with one of the best acting performances I can remember seeing in a long time by J.K. Simmons as Terence Fletcher. It is a student-teacher story set in a conservatory. The kid is a young, talented jazz drummer (played with skill by Miles Teller) looking to catch the eye of the famous teacher. He does, and then the fireworks start. The filming and acting produce the sort of tension that usually comes from action flicks. This is literally edge-of-your-seat stuff; the tension arising from the interplay between the teacher and student is incredible.

Terence Fletcher: There are no two words in the English language more harmful than “good job.”

Selma, B+

I really wanted to like this better. It was a finely acted and crafted historical drama based on a key moment in the civil rights movement. It is stunning to see the kind of things that used to happen in the United States. We’ve made progress as a country, but shockingly little, as the events of the last year show. There is action in the deeply racist Alabama of 1964 and 1965, and tension between MLK and LBJ. Other figures like J. Edgar Hoover and George Wallace come across like the villainous humans they were. Overall an important movie that was competently executed, but not the brilliant movie I had hoped it would be.

“People out there actually say they’re gonna kill our children, they’re trying to get into your head.” – Coretta Scott King

The Imitation Game, A-

This is a good movie, though the weakest of the ones I gave an “A-”. It is a very Hollywood version of Alan Turing’s life. Benedict Cumberbatch takes the material and produces a wonderful performance. The storytelling is unique, running three timelines from Turing’s life in parallel, with great lessons relevant to today’s problems. The upshot is that Turing’s life was immensely tragic, and his service to England and the World was never paid what it was due.

Joan Clarke: Sometimes it is the people who no one imagines anything of who do the things that no one can imagine.

Hobbit, C

I would sum this movie up as thoroughly disappointing. I am guessing that the problem is that no one can tell Peter Jackson “no” any more, and he is reverting to his roots. Some of the filmmaking decisions are simply ludicrous and remind me strongly of Jackson’s earlier films like “Dead Alive”. The choices are almost always comical, and someone should have told him, “this is a bad idea”.

Bilbo Baggins: One day I’ll remember. Remember everything that happened: the good, the bad, those who survived… and those that did not.

Wild, A-

I didn’t actually see it; my wife and daughter did while my son and I saw the Hobbit. She said it was great (she got the good end of the deal, to be sure).

Cheryl: What if I forgave myself? I thought. What if I forgave myself even though I’d done something I shouldn’t have?

Boyhood, A

Along with Whiplash, this is my choice for the best picture of the year. The movie has a massive gimmick: it was filmed a week or two a year over 12 years. It chronicles the childhood of a boy whose parents divorce, and how he develops from a small boy entering school to an adult entering college. Using the same actors throughout to show the passage of time is remarkable. The film is wonderful beyond the gimmick and delivers a wonderful tale of personal growth for all the characters. It is both simple and immensely rich.

Nicole: You know how everyone’s always saying seize the moment? I don’t know, I’m kind of thinking it’s the other way around, you know, like the moment seizes us.

Snowpiercer, A-

This movie is a wonderful pay-per-view surprise and quite enjoyable on the whole. Some aspects of the movie are odd, but it is filled with great performances, including surprising depth from Chris Evans. He is a much better actor than people realize. The movie has action, tension and deep commentary on our modern world and its problems. The film requires a degree of suspension of disbelief regarding the basic premise, but if you can manage that it is a real gem.

Curtis: You know what I hate about myself? I know what people taste like. I know babies taste the best.

Nightcrawler, A-

This was a marvelously dark movie and a portrait of a true sociopath. Jake Gyllenhaal is wonderfully creepy in the role and manages to make himself genuinely unlikable. He is driven and relentless in achieving fame and success without a hint of morality. At the same time the film succeeds in providing a tremendously insightful commentary on our modern society and our appetite for news that titillates much more than informs.

Lou Bloom: That’s my job, that’s what I do, I’d like to think if you’re seeing me you’re having the worst day of your life.

Interstellar, B

This is a film that divides opinions for good reasons. It is a wonderfully majestic movie that is horribly flawed. Good, but not great, performances can be found working on an uninspired script. The concept and arc of time, with an innovative narrative structure, make the story watchable. In the end it produces a watchable film that won’t be remembered 10 years from now.

I never ask for the science parts of movies to be too realistic, simply not too cartoonish. It succeeds in being realistic enough not to offend. I tend to believe that we don’t understand the universe well enough to know what isn’t possible. It offers a Hollywood-friendly version of relatively forward-looking science. Having habitable planets in the vicinity of a black hole is not the most realistic thing. I would have been much happier with a wormhole alone, leaving the black hole out.

Cooper: We used to look up at the sky and wonder at our place in the stars, now we just look down and worry about our place in the dirt.

Cooper: We’ve always defined ourselves by the ability to overcome the impossible. And we count these moments. These moments when we dare to aim higher, to break barriers, to reach for the stars, to make the unknown known.

The Interview, C-

This film is the controversy of the season, with the hacking of Sony and their capitulation to terrorism in initially declining to release the movie before coming to their senses. We saw it on pay-per-view. The hackers did a better job promoting the movie than it deserved. This was easily the worst movie we saw all season. It was amusing and thoughtlessly entertaining, but a cinematic turd. It was a couple hours of my life I can’t get back.

Dave Skylark: [admires a war tank] Holy fuckamole. Is that real?

Kim Jong-un: It was a gift to my grandfather from Stalin

Dave Skylark: In my country it’s pronounced Stallone.

Kim Jong-un: You’re so funny, Dave.

Gone Girl, A-

This was an absolute gem of a movie. We loved it. It was thrilling, dark and relentless. It was a wonderfully grim look into a romance and marriage with an edge. So much is happening out of view and hidden from the viewer, adding to the tension in the theatre. Rosamund Pike is simply wonderful and creates a character of great depth who ultimately generates a deep emotional response. It is never clear who is the biggest villain. It is one movie where no one is a hero.

Nick Dunne: You fucking cunt!

Amy Dunne: I’m the cunt you married. The only time you liked yourself was when you were trying to be someone this cunt might like. I’m not a quitter, I’m that cunt. I killed for you; who else can say that? You think you’d be happy with a nice Midwestern girl? No way, baby! I’m it.

Nick Dunne: Fuck. You’re delusional. I mean, you’re insane, why would you even want this? Yes, I loved you and then all we did was resent each other, try to control each other. We caused each other pain.

Amy Dunne: That’s marriage.

Here are the movies that I haven’t seen yet, but want to see (A Most Violent Year, Birdman, The Theory of Everything, American Sniper, Citizenfour, Foxcatcher).

Why is Greatness Passing Us By?

09 Friday Jan 2015

Posted by Bill Rider in Uncategorized

≈ 2 Comments

It has been one of the worst weeks I can remember. Every day I go home from work frustrated, angry, demotivated and despondent. While I recognized that going back to work after vacation would be bad, it has been so much worse than I could have imagined.

Why?

By the way, my vacation was outstanding. It was one of the best ones I can remember. Here is a brief observation about vacations: the French know what they are doing. One of the best things about this vacation was that everyone was off work, so there was no worry about catching up on email or other things going on; it was all break for two weeks.

Nothing is a mistake. There’s no win and no fail. There’s only make.

― Corita Kent

Now back to the issue that has ruined my week.

I have worked very hard in the last year to instill some really good habits into my daily life. It has worked, and I really believe that this has been an immense success. My year at work was great, and I was looking forward to refining these habits. As part of the good habits, I’ve started keeping better track of my thoughts, ideas and reading. There are some absolutely incredible tools out there to enhance your productivity. You can really see how technology can improve productivity in ways that are hard to articulate.

Today my goal is to be more productive than I was yesterday, and tomorrow more productive than today.

― Noel DeJesus

When I returned to work after the vacation, I found that my employer had killed one of these tools, perhaps the single most important one I had adopted in the past year.

Good habits are worth being fanatical about.

― John Irving

Here is the point of this post. We live in a world where the slightest potential downside will cause something to be avoided or outlawed with no regard whatsoever for the upside potential (even if it is demonstrated). In my case with this tool we are honestly talking about a 10 or 20 percent productivity effect (for me this is actually worth real money!). It would be like taking my cell phone away and making me use a rotary landline! Seriously. The change is that profound. You wouldn’t stand for the rotary phone. It would be catastrophic.

So, I’m not sure what to do. My problem is that I’m an early adopter of technology, and the system doesn’t know what sort of upside potential we are talking about. They only care about the potential dangers, regardless of how remote they are. Of course this is associated with the reward system we work under. We are never rewarded for being more productive, at least at the level of the institution, and the price of a mistake is brutal, expensive and embarrassing. The consequence is that risks are avoided at all costs, and benefits are not gained if they have a risk associated with their enabling factors.

Habits are patterns, and even the smallest ones tell a lot about who you are as a person.

― Jarod Kintz

This is one of the key reasons we are losing greatness as a nation. Any danger, regardless of how remote or obscure, will trigger a massive effort to thwart its possibility. Any potential positive outcomes, no matter how large, cannot overcome the reaction to the minimal danger. Our response to terrorism is a perfect societal example. We have instituted the TSA and its idiotic security measures, which offer no actual safety, but only the perception of it. We are literally wasting lifetimes of time instituting this useless measure. Then there is the overreach of the NSA, which is threatening to undermine our economy by destroying trust in American companies. All to guard against risks that are actually far less than a host of common threats to our health.

We become what we repeatedly do.

― Sean Covey

As a result it is we who make terrorism work, through our fear. It is a force that is killing any greatness we have as a nation. It is destroying our ability to do great things. It is the biggest threat to our future.

Stop Blaming. Take responsibility for your thoughts and your actions.

― Dee Dee Artner

What is the essence of computational science?

05 Monday Jan 2015

Posted by Bill Rider in Uncategorized

≈ 1 Comment

Crisis is Good. Crisis is a Messenger.

― Bryant McGill

Computational What? Science? Engineering? Physics? Mathematics?

My last post was about computational science’s impending multi-crisis: the loss of Moore’s law, exploding software complexity, and the failure to invest in its intellectual foundation. A reasonable question is how broadly the issues described there apply to subsets of the field. What about computational physics? What about computational engineering? What about computer science? What about applied mathematics? What are the differences between these notions of computation’s broader role in the scientific world? What are the key characteristics that make up the essence of scientific computing?

One of the most important of disciplines is the one that never assumed its logical name: mathematical engineering.

—Nick Trefethen

Computational science is an umbrella for a variety of things that have gradations of difference and don’t form a terribly coherent whole. Instead it is a continuum, with one end being held down by computer science and the other end by computational engineering; or perhaps by the fields that birthed computing, physics and mathematics. The differences between engineering, mathematics and physics show themselves in computation as they do in other fields, but scientific computing should really be an amalgam of all of these areas.

We are not creators; only combiners of the created. Invention isn’t about new ingredients, but new recipes. And innovations taste the best.

― Ryan Lilly

The Origin: Physics and Mathematics
JohnvonNeumann-LosAlamosTo start our discussion it is worth taking a look at the origins of computing when mathematics and physics combined to create the field. This combination is embodied in John von Neumann whose vision largely produced the initial instantiation of scientific computing. Scientific computing began in earnest under the aegis of the development of the atomic bomb. The application of computing was engineering analysis done by some of the greatest physicists in the world most notably Hans Bethe and Richard Feynman using methods devised by John von Neumann and Rudolf Peierls. Engineering was limited to the computer itself. Feynman_and_Oppenheimer_at_Los_AlamosMathematicians played key roles in more properly using computers notably through the efforts of Robert Richtmyer, Nicholas Metropolis and Richard Hamming. As a rule, the overall effort was conducted by a host of geniuses for an application of monumental international impact and importance. Practically speaking, they were exquisitely talented scientists who were also immensely motivated and had every resource available to them.

From this august origin, computing began to grow outside the womb of nuclear weapons’ work. Again, it was John von Neumann who provided the vision and leadership, this time from the Institute for Advanced Study in Princeton, focused on weather and the development of better computers. Again, the application was largely in the realm of physics, with the engineering being applied to the computers. Meanwhile computing was broadening in its appeal, drawing attention from the successes in Los Alamos and Princeton along with colleagues at universities. Other Labs in the United States and the Soviet Union also began exploring the topic. It still remained immature and speculative, especially in a world that scarcely comprehended what a computer was or could do.

Computers are incredibly fast, accurate, and stupid: humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.

― Albert Einstein

Engineering Joins the Revolution

It wasn’t until the 1960s that engineering activities began to include computation. Part of the reason for this was that the initial set of methods had been developed by physicists and mathematicians, along with the unavailability of computing in general, and specifically of computing power sufficient to contemplate engineering problems. At first, the engineering uses of computing were exploratory and relegated to research activities, more like the physical or mathematical sciences than engineering. By the 1970s this ended, led by a handful of pioneers in aerospace, mechanical and civil engineering. The growth of engineering use of computing also led to some bad things, like the hubris of the awful “numerical wind tunnel” affair. In the late 1970s, talk of replacing wind tunnel testing with numerical simulations became an embarrassing setback (a mistake we have naively made all over again). It represented a massive technical overreach, ultimately driving a wedge between computing and experiments.

Civil engineering made progress by utilizing the finite element method, which was ideally suited to that field’s intellectual basis. In mechanical engineering, heat transfer and fluid flow problems, dominated by heat exchanger design, led the way. Together with aerospace engineering, these produced the important topic of computational fluid dynamics (CFD), which is the archetype of computational science in general. Nuclear engineering was birthed from physics and had computing at its heart almost from the beginning, especially for the problem of reactor core design. These methods were born directly from the nuclear weapons’ program as a natural outgrowth of the peaceful exploration of nuclear power.

Science is the extraction of underlying principles from given systems, and engineering is the design of systems on the basis of underlying principles.

—Nick Trefethen

Mathematics Shrinks from View

Computer Science is no more about computers than astronomy is about telescopes

― Edsger Wybe Dijkstra

All of this was a positive outgrowth of the combination of physics and mathematics. During the same period the mathematical contributions to scientific computing went in several directions, with pure mathematics birthing computer science and applied mathematics carrying the load. Computer science has become increasingly divorced from scientific computing over time and has failed to provide the sort of inspirational impetus mathematics had previously provided. For several decades applied mathematics filled this vacuum with great contributions to progress. In more recent times applied mathematics has withdrawn from this vital role. The consequence of these twin developments has taken a terrible toll, depriving scientific computing of a strong pipeline of mathematical innovation. I will admit that statistics has made recent strides in connecting to scientific computing. While this is a positive development, it hardly makes up for the broader diminishing role of the rest of mathematics in computing.

We see that computation was born from physics and mathematics, with engineering joining after the field had been shaped by those disciplines. Over the past thirty or forty years engineering has come to play an ever larger part in scientific computing, the physical sciences have continued their part, but mathematics has withdrawn from centrality. Computer science has taken up the mantle of pure mathematics’ detachment from utility. Applied mathematics leapt to fill this void, but has since withdrawn from providing the full measure of much needed intellectual vitality.

Computer science is one of the worst things that ever happened to either computers or to science.

—Neil Gershenfeld

Computing Becomes Something Monstrous

Part of the reason for this is a change in the cultural consciousness regarding computing. In the beginning there was physics and mathematics combining in the imagination of John von Neumann to produce something new, something wonderful and era-defining. It gestated in the scientific community for several decades until computing exploded into the public consciousness. It was a combination of maturity in the use of computing and sufficient computing power available to the masses that triggered the transition. Computing was no longer the purview of nerds and geeks; it was now owned by all of humanity. As such computing became somewhat pedestrian in nature and lost its sheen. This also explains the rise of engineering as an outlet for computing, and the loss of mathematics. In the absence of innovation we substituted raw power. Rather than continue to improve through better thinking we came to rely upon Moore’s law for progress. Where we used to out-smart problems, they now would be overpowered by an unparalleled force.

While scientists and big business owned computing until about 1995, all of a sudden it became public property. Soon it grew to be something that dominated the global economy. Powered by Moore’s law, computing became ubiquitous and, ironically, ceased being about computing; computers became about communication. Now everything valuable about computers is communication, not computation. Computation is an essential, but minor, element in the value proposition. A big part of the reason is that the power of computers is so great that the computational load has become trivial. The Internet gives access to information and data, and connects people in ways never imaginable. As such the business possibilities are staggering. Computing is no longer so much about computers as it is about people and their money.

Moore’s law also became a cipher for technology and progress. It has infected computational science with its pervasively superficial nature. It isn’t that Moore’s law is superficial per se; it is the societal interpretation of its implications. Moore’s law is at death’s door, if it hasn’t already passed. Its death does not mean progress will die, it just means progress’s path will change.

You can’t solve a problem with the management of technology with more technology.

— Bill Joy

Where do we go from here?

What we can conclude from this discussion is a fundamental change in the character of scientific computing. Where the use of computing for engineering work should have added to form a more complete whole, the withdrawal of mathematics has cheated us of that brighter future. Engineering is an essential human activity and the natural outgrowth of our scientific achievements, but it can lack creativity at times. Such creativity is always beneficial, and better when combining disciplines. The structure and rigor of mathematics is essential for putting this creativity on the strongest footing. To make progress in the future it is essential to include engineering, the physical sciences and mathematics with some degree of equality. The rather weak-minded approach of simply utilizing Moore’s law to drive scientific computing forward must end, both because it is a poor strategy and because this source of progress is dying.

Postscript

I’ve decided to get off the daily writing thing, or more accurately the daily posting. I still need to write every single day, so that’s not changing. I’m a lot better off thinking a bit more about the topic to write about, and working it out over several days. My new goal is two or three blog posts a week.

The Future is Already Here, Everyday.

The future is already here – it’s just not evenly distributed.

― William Gibson

2015: Time for a New Era in Scientific Computing?

01 Thursday Jan 2015

Posted by Bill Rider in Uncategorized

≈ 1 Comment

Societies in decline have no use for visionaries.

― Anaïs Nin

So it’s a new year, with all the requisite reflective looks forward and backwards. I’ll do both here and posit that perhaps an era is drawing to a close and it’s time for a big change in scientific computing. Even more, I’ll argue that a big change is being thrust upon us, and it’s time to get ahead of it. I’ve taken the history of scientific computing and laid it out in a series of eras, each 15-20 years long. These eras are defined by a combination of ideas, algorithms, methods, hardware and software. Changes in the composition of all of these define each era and trigger the changes.

 A man’s shortcomings are taken from his epoch; his virtues and greatness belong to himself.

― Johann Wolfgang von Goethe

I believe that a combination of crises will trigger the change that is upon us. One of these crises has all the headlines, one is barely hidden from view, and a third is silent, but each has a huge role to play. The key visible crisis is hardware driven and revolves around the viability of Moore’s law in computing and computational performance. We seem to have taken the approach that maintaining Moore’s law is essential, and we are willing to expend vast amounts of money to achieve it. This money could be spent more profitably elsewhere in the enterprise. The second crisis is software driven and associated with the complexity of scientific software and the ponderous nature it has taken on. Software is becoming increasingly unsustainable and expensive, threatening to swallow all of the available resources. The third, silent crisis is the dearth of new ideas in scientific computing and their inability to impact progress. This third crisis is primarily driven by the combination of hard-to-impossible-to-use hardware with software complexity exploding to strangle any new ideas in their proverbial cribs. Even when the ideas can be breathed to life, they are starved of the sort of resources and focus necessary to bring them to fruition. Dealing with the first two problems is simply taking all the resources available, leaving nothing for the third.

History is a Rorschach test, people. What you see when you look at it tells you as much about yourself as it does about the past.

― Jennifer Donnelly

Yesterday I tweeted, “scientific computing was once the grand avenue of computing, now it is a dark alley in a bad neighborhood.” The scientific community once drove computing as a vanguard, and now has to adapt to whatever the market does. It has become a niche activity, economically enslaved to a colossus of global reach. A huge marketplace, which might benefit computing, now drives hardware and software innovation, but its direction is not optimal. We must react to directions that benefit the marketplace rather than determine the direction for the market.

Let us study things that are no more. It is necessary to understand them, if only to avoid them.

― Victor Hugo

The politics of the time have an enormous impact on focus and resource availability. Scientific computing was born in the crucible of a World War and matured in the urgency of the Cold War. Nothing exists today to focus the mind and open the pocketbook like that. On the other hand computing has never been as important as it is today. Never have more of society’s resources gone in its direction. How can we harness this massive creative force for our benefit?

The greatest and most powerful revolutions often start very quietly, hidden in the shadows. Remember that.

― Richelle Mead

I’ve taken the history of scientific computing and broken it up into five distinct eras, and made the leap of defining 2015 as the beginning of a new one. I’ll grant that the dates are rounded up or down by a few years, so maybe we’re already in a new era, or it’s a few years off.

The farther backward you can look, the farther forward you are likely to see.

― Winston S. Churchill

  • Pre-1945 (prehistory): There was no scientific computing because computers were in their infancy, and their use for science was not yet envisioned. In this time the foundations of mathematics and physics were laid by a host of greats, along with numerical methods crafted for hand computation. The combination of computers with the vision of John von Neumann and the necessity of World War 2 brought scientific computing out of this womb and into practice.
  • 1945-1960 (creation): In this time scientific computing was largely taking place in the most important Labs on the most important topic, with access to high priority and huge resources. Great innovations were taking place in computers and the practice of computing. Along with refinements in the engineering of computers, the practice of programming began to take shape. The invention of Fortran and its capacity to express methods and algorithms in code was one of the developments that brought this era to a close. In this time, the development of mathematical theory and numerical analysis was key. The invention of stability and convergence theory for numerical methods was one of the great achievements. These provided a platform for systematic development in the 1960s.
  • 1960-1975 (foundations): During this period scientific computing emerged from the shadows into the light. Computers became increasingly available outside the top-secret environment, and computing began to be a valid academic endeavor. With this democratization of computing came extensive application to an ever-widening set of problems. Many of the key methods and algorithms for scientific computing were created in this period. The field of computational fluid dynamics (CFD) came into being, and was then viewed as an enormous boon to aerospace science. By the time the period drew to a close there was great optimism and hope. Computers were becoming quite capable and more generally available, and were beginning to be indispensable tools for business. The Labs still led the world, especially because they always had access to better hardware and software than anyone else. Moore’s law was defined, and the beginning of a massive growth in computing power had started.
  • 1975-1995 (glory years): I’ve described this time as the “golden age” of computational science. For the first time the computers and software were balanced with the methods and models. In many ways Seymour Cray defined the era, first with the CDC 6600 and 7600 computers, then with the machines bearing his name. The vision set forth by von Neumann came into force. Academic scientific computing became completely respectable, with mathematics, physics and engineering all taking part. The first hints of extreme hubris were witnessed; the “numerical wind tunnel” debacle unfolded in aerospace. The claim that CFD could displace physical wind tunnel testing in design and qualification was a massive overreach in capability. Great damage was done in the process, and no one seems to have learned from the experience. It foreshadows the developments of the current time with ASC, when the creation of “virtual underground testing” was proposed to make up for a ban on actual underground testing.
  • 1995-2015 (mid-life): Then the glory days ended with a bang through a combination of events. The Cold War ended and soon nuclear testing ceased. The Labs would have their own “numerical wind tunnel” moment, but no actual wind tunnel would be available to test it. At the same time the capacity of the supercomputers of the golden era to maintain Moore’s law came to an end. The entire ASC program hinged upon the premise that advances in computational performance would pave the way for predictive simulation. We had the attack of the killer micros and the birth of massively parallel computation to keep hardware performance on the increase. Getting the methods and models of old to work on these computers became an imperative, as did access to more computing power via Moore’s law. At the same time the complexity of the codes was growing by leaps and bounds. New programming paradigms were being ushered into use, with C++ leading the way; its object-oriented principles were thought to be a way to handle the seemingly overwhelming complexity. With more resources flowing into hardware and software, the amount of energy going into methods and models waned. Where efforts in these endeavors had previously yielded gains larger than Moore’s law, such gains simply evaporated during this era.
  • 2015- (mid-life crisis): Now we get to today, and the elements of revolution are falling into place. We have three crises at hand, each having brewed during the era now ending. Hardware is in crisis, with Moore’s law either already dead or on death’s door. The complexity of the software is beginning to threaten progress. Lack of innovation in methods, algorithms and modeling is killing other sources of improved performance. Let’s look at each crisis in turn and its threat to scientific computing’s progress.

All revolutions are, until they happen, then they are historical inevitabilities.

― David Mitchell

The most evident crisis is the demise of Moore’s law. Given the devotion to computing power as the route to predictive computational science, the loss of growth in computing power would be fatal. There are two worrying signs: the growth in computing power at the processor level has slipped to a crawl, and the ability to use all the power of the massively parallel computers for real problems is missing. At the low end of computing nothing will save Moore’s law, especially as the computing industry has moved on to other priorities. It is just accepted. At the high end we grasp at flattering metrics like weak scaling or LINPACK to hide the problems, but the immensity of the issues becomes clearer every day. In the middle of this Moore’s law is clinging to life, but the two sides are converging on the middle, and when they do Moore’s law will be dead. There are a host of hopes for life, but the laws of physics are arrayed against the continuation of this trend. With all the effort going into using Moore’s law, what will be left to pick up the pieces?
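Why can weak scaling hide the problems? A rough sketch (my illustration, not from the post, using the standard Amdahl and Gustafson models with an assumed 95% parallel fraction): strong scaling of a fixed problem is capped by the serial fraction, while weak scaling lets the problem grow with the machine and so reports speedups that track the processor count almost linearly.

```python
def amdahl(p, n):
    """Strong-scaling speedup for a fixed problem size:
    parallel fraction p, n processors. Capped near 1/(1-p)."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson(p, n):
    """Weak-scaling (scaled) speedup: the problem grows with n,
    so the serial fraction stops being a hard ceiling."""
    return (1.0 - p) + p * n

p = 0.95        # assume 95% of the work parallelizes
n = 10_000      # processor count typical of a massively parallel machine

print(f"strong scaling: {amdahl(p, n):9.1f}x")   # stuck near 1/(1-p) = 20x
print(f"weak scaling:   {gustafson(p, n):9.1f}x")  # looks nearly ideal
```

The same machine and the same code yield a ~20x figure under one metric and a ~9500x figure under the other, which is how a weak-scaling headline can mask the fact that a fixed real problem barely speeds up.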

 

A second crisis simmering just below the surface is software complexity. The combination of trying to make codes work on cutting edge computers and the increasing desire for capability in codes is creating a software monstrosity. Earlier in the week I learned from readers of the blog about Gall’s law, which says that a complex system that works evolves from a simple system that works, and a complex system designed from scratch will not work. We run a great risk that we will be stuck with massive code bases that are eroded by mountains of technical debt. These debts threaten to choke all progress and commit us to the fundamental methodology defined by the increasingly unwieldy code. The issue of software, and of how we translate our intellectual labor to silicon, has to be dealt with soon or it will strangle progress more surely than the death of Moore’s law.

 

The third and most shadowy crisis is the lack of impact from methods, models and algorithms in the most modern era of scientific computing. As I said earlier, part of the problem is the twin crises of declining hardware gains and software bloat sapping the energy from the system. Before our infatuation with Moore’s law as the heartbeat of progress, innovation in algorithms, numerical methods and modeling produced more progress than hardware gains. These gains are harder to measure and far subtler than raw computational performance, but just as real. As hardware fades away as a source of progress, they are the natural place to turn to for advances. The problem is that we have starved this side of scientific computing for nearly 20 years, and major changes are needed to reinvigorate this approach. As I’ve come to realize, the software languages are themselves a massive algorithmic achievement (Fortran is listed among the 10 greatest algorithms of the 20th century!). This is to say that intellectual labor toward figuring out how to program computers in the future is part of this issue and a necessary element in fixing two of the crises.

 

But I suppose the most revolutionary act one can engage in is… to tell the truth.

― Howard Zinn

 

The question is whether we will answer the call to action that the current day’s developments should demand. The situation is so critical that the current state of affairs cannot continue for much longer. My greatest concern is the lack of leadership and the lack of appetite for the risk-taking necessary to take on the challenge of the day. If we can find the courage to step forward, new vistas await; it’s simply a matter of coming to terms with realities. If we don’t, the next era of scientific computing could be marked by decline and obsolescence. It need not be this way, but some significant changes are needed in attitudes and approaches to leading the field. Are we up to the challenge? Do we have the leadership we need? It is time to get ahead of the crises now, before they become overwhelming.

 

Those who make peaceful revolution impossible will make violent revolution inevitable.

― John F. Kennedy

 

A revolution is not a bed of roses. A revolution is a struggle between the future and the past.

― Fidel Castro

 
