In algorithms, as in life, persistence usually pays off.
― Steven S. Skiena
Over the past year I've opined that algorithm (method) research is under-supported and under-appreciated as a source of progress in computing. I'm not going to backtrack one single inch on this. We are not putting enough effort into using computers better, and we are putting too much effort into building bigger, less useful, and very hard to use computers. Without the smarts to use these machines wisely, this effort will end up being a massive misapplication of resources.
The problem is that the issues with algorithm research are even worse than this. The algorithm research we do support is mostly misdirected in the same way, focused in a manner that does even more damage to our prospects for success. In other words, even within the spectrum of algorithm research there isn't an equality of impact.
The fundamental law of computer science: As machines become more powerful, the efficiency of algorithms grows more important, not less.
— Nick Trefethen
There are two fundamental flavors of algorithm research with intrinsically different value to the capability of computing. One flavor involves the development of new algorithms with improved properties compared to existing algorithms. The most impactful algorithmic research focuses on solving previously unsolved problems. This research is groundbreaking and almost limitless in impact; whole new fields of work can erupt from these discoveries. Not surprisingly, this sort of research is the most poorly supported. Despite its ability to have enormous and far-reaching impact, it is quite risky and prone to failure.
If failure is not an option, then neither is success.
― Seth Godin
It is the epitome of the risk-reward dichotomy. If you want a big reward, you need to take a big risk, or really lots of big risks. We as a society completely suck at taking risks. Algorithm research is just one of innumerable examples. Today we don't do risk and we don't do long-term. Today we do low-risk and short-term payoff.
Redesigning your application to run multithreaded on a multicore machine is a little like learning to swim by jumping into the deep end.
—Herb Sutter
A second and kindred version of this research is the development of improved solutions. These improvements can provide a lower cost of solution through better scaling of operation count, or better accuracy. Such innovations can provide new vistas for computing and enable the solution of new problems by virtue of efficiency. This sort of research can be groundbreaking when it enables something that had been out of reach due to inefficiency.
This form of algorithm research has been a greater boon to the efficiency of computing than Moore's law. A sterling example comes from numerical linear algebra, where costly methods have been replaced by methods that put solving billions of equations simultaneously well within reach of existing computers. Another really good example is the breakthroughs in the 1970s by Jay Boris and Bram Van Leer, whose discretization methods allowed an important class of problems to be solved effectively. This powered a massive explosion in the capacity of computational fluid dynamics (CFD) to produce meaningful results. Without their algorithmic advances CFD might still be ineffective for most engineering and science problems.
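To put the scaling point in concrete terms, here is a minimal sketch of the operation-count argument. It is my own illustration, not taken from any particular solver: the cubic cost stands in for classical dense elimination, the linear cost for an optimal method such as multigrid, and the constant is a made-up placeholder.

```python
# Rough, illustrative operation counts (not a benchmark).
# Dense Gaussian elimination on an n x n system scales like n**3,
# while optimal multilevel (multigrid-like) methods scale roughly
# like C * n. The constant C is an assumption for illustration only.

def dense_solve_ops(n):
    return n ** 3


def multigrid_solve_ops(n, c=100):
    return c * n


for n in (10**3, 10**6, 10**9):
    speedup = dense_solve_ops(n) / multigrid_solve_ops(n)
    print(f"n = {n:>13,}  algorithmic speedup ~ {speedup:,.0f}x")
```

For a billion unknowns the algorithmic gain dwarfs anything hardware alone could ever have delivered, which is the whole point of the comparison with Moore's law.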
The third kind of algorithm research is focused on the computational implementation of existing algorithms. Typically these days this involves making an algorithm work on parallel computers, and more and more it focuses on GPU implementations. This research certainly adds value and improves efficiency, but its impact pales in comparison to the other kinds of research. Not that it isn't important or useful; it simply doesn't carry the same "bang for the buck" as the other two.
In the long run, our large-scale computations must inevitably be carried out in parallel.
—Nick Trefethen
Care to guess where we’ve been focusing for the past 25 years?
The last kind of research gets the lion's share of the attention. One key reason for this focus is the relatively low-risk nature of implementation research. It needs to be done, and generally it succeeds. Progress is almost guaranteed because of the non-conceptual nature of the work. This doesn't imply that it isn't hard or that it requires less expertise; it just can't compete with the level of impact of the more fundamental work. The change in computing due to the demise of Moore's law has brought parallelism, and we need to make stuff work on these computers.
Both are necessary and valuable to conduct, but a proper balance between the two is a necessity. The lack of tolerance for risk is one of the key factors contributing to this entire problem. Low-risk attitudes contribute to the dominance of focus on computing hardware and the appetite for the continued reign of Moore's law. They also compound and contribute to the dearth of focus on more fundamental and impactful algorithm research. We are buying massively parallel computers, and our codes need to run on them. Therefore the algorithms that comprise our codes need to work on these computers. QED.
The problem with this point of view is its absolute disconnect from the true value of computing. Computing's true value comes from the ability to solve models of reality. We solve those models with algorithms (or methods). These algorithms are then represented in code for the computer to understand, and then we run them on a computer. The computer is the most distant thing from the value of computing (ironic, but true). The models are the most important thing, followed by how we solve the model using methods and algorithms.
Our current view and the national "exascale" initiative represent a horribly distorted and simplistic view of how scientific value is derived from computing, and as such make for a poor investment strategy for the future. The computer, the thing at the greatest distance from value, is the focus of the program. In fact the emphasis in the national program sits at the opposite end of the spectrum from the value.
I only hope we get some better leadership before this simple-minded mentality savages our future.
Extraordinary benefits also accrue to the tiny majority with the guts to quit early and refocus their efforts on something new.
― Seth Godin

In watching the ongoing discussions regarding the National Exascale initiative many observations can be made. I happen to think the program is woefully out of balance, and focused on the wrong side of the value proposition for computing. In a nutshell it is stuck in the past.
The program's focus is sharpest closest to hardware. As the software gets closer to the application, the focus starts to drift. As the application gets closer and modeling is approached, the focus is non-existent. It is simply assumed that the modeling just needs a really huge computer, the waters will magically part, and the path to the promised land of predictive simulation will just appear. Science doesn't work this way, or more correctly, well-functioning science doesn't work like this. Science works with a push-pull relationship between theory, experiment and tools. Sometimes theory is pushing experiments to catch up. Sometimes tools are finding new things for theory to answer. Computing is such a tool, but it isn't being allowed to push theory; more properly, theory should be changing to accommodate what the tools show us.
The question is whether there is some way to learn from everyone else. How can this centralized supercomputing be broken down in a way that helps the productivity of the scientist? One of the things that happened when mainframes went away was an explosion of productivity. Centralized computing is quite unproductive and constrained. Computing today is the opposite, unconstrained and completely productive. It is completely integrated into the very fabric of our lives. Work and play are integrated too. Everything happens all the time at the same time. Instead of maintaining the old-fashioned model we should be looking into harvesting the best of modern computing to overthrow the old model.
We are drowning in data, whether we are talking about the Internet in general, the coming "Internet of things" or the scientific use of computing. The future is going to be much worse and we are already overwhelmed. If we try to deal with every single detail, we are destined to fail.
We need methods that can find what matters in all the noise and represent this importance compactly and optimally. This class of ideas will be important in managing the tsunami of data that awaits us.
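As one hedged illustration of what "representing importance compactly and optimally" can mean, the sketch below uses a truncated singular value decomposition, which is the optimal low-rank approximation of a data matrix in the least-squares sense. The matrix sizes, noise level and retained rank are arbitrary choices for the example, not anything from a real dataset.

```python
import numpy as np

# Sketch: compress a noisy data matrix by keeping only its k dominant
# singular vectors. The truncated SVD is the optimal rank-k
# approximation in the Frobenius (least-squares) sense.
rng = np.random.default_rng(0)
signal = np.outer(rng.standard_normal(500), rng.standard_normal(200))  # rank-1 "signal"
data = signal + 0.01 * rng.standard_normal((500, 200))                 # buried in noise

k = 5                                            # retained rank (an assumption)
U, s, Vt = np.linalg.svd(data, full_matrices=False)
compressed = (U[:, :k] * s[:k]) @ Vt[:k]

stored = k * (data.shape[0] + data.shape[1] + 1)
print("compression ratio:", round(data.size / stored, 1))
print("relative error   :", np.linalg.norm(data - compressed) / np.linalg.norm(data))
```

The point isn't the SVD itself; it's that a small number of well-chosen numbers can carry nearly all of the information in a flood of raw data.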
The models of reality used in scientific computing can rarely be solved analytically; they must be solved by exotic methods and algorithms. Ultimately, these methods and algorithms must be expressed as computer code before the computers can be turned loose on their approximate solution. These models are relics. The whole enterprise of describing the real world through them arose from the efforts of intellectual giants starting with Newton and continuing with Leibniz, Euler, and a host of brilliant 17th, 18th and 19th Century scientists. Eventually, if not almost immediately, the models became virtually impossible to solve via available (analytical) methods except for a handful of special cases.
When computing came into use in the middle of the 20th Century some of these limitations could be lifted. As computing matured fewer and fewer limitations remained, and the models of the past 300 years became accessible to solution, albeit through approximate means. The success has been stunning, as the combination of intellectual labor on methods and algorithms, computer code, and massive gains in hardware capability has transformed our view of these models. Along the way new phenomena have been recognized, including chaotic dynamical systems, opening doors to understanding the World. Despite the progress I believe we have much more to achieve.
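Chaos is the classic example of a phenomenon computation made visible. Here is a tiny sketch of sensitive dependence on initial conditions in the logistic map; the parameter value and the size of the initial perturbation are just illustrative choices on my part.

```python
# Sketch: two trajectories of the logistic map x -> r*x*(1-x) that start
# almost identically and then diverge completely -- the hallmark of chaos.
r = 4.0                        # a parameter value in the chaotic regime
x, y = 0.2, 0.2 + 1e-12        # nearly identical initial conditions

for step in range(1, 51):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.3e}")
```

A difference of one part in a trillion grows to order one within a few dozen iterations, which is exactly the sort of behavior analytical methods alone never exposed.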
Today we are largely holding to models of reality developed prior to the advent of computing as a means of solution. The availability of solution has not yielded a balanced re-examination of the models themselves. This gets to the core of studying uncertainty in physical systems. We need to overhaul our approach to modeling reality to really come to grips with this. Computers, code and algorithms are probably at or beyond the point where this can be tackled.
Here is the problem. Despite the need for this sort of modeling, the efforts in computing are focused at the opposite end of the spectrum. Current funding and focus are aimed at the computing hardware and code, with little effort applied to algorithms, methods and models. The entire enterprise needs a serious injection of intellectual energy on the proper side of the value proposition.
For solution verification the problem is much worse. Even when solution verification is done we are missing important details. The biggest problem is the lack of solution verification in the application of scientific computing to problems. Usually the problem is simply computed, graphs are overlaid, and success is declared because the comparison looks good enough. No quantitative sense of whether the solution is accurate is given. An error estimate for the solution shown, or better yet a convergence study, would provide much greater faith in the results. In addition to the numerical error, the rate of convergence would also provide information on the tangible expectations for the solution of practical problems. Today such expectations are largely left to be guessed by the reader.
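To show how little is actually required, here is a minimal sketch of the kind of quantitative statement I mean, using standard Richardson-style formulas for the observed order of convergence and an error estimate from three systematically refined grids. The sample numbers are made up purely for illustration.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed convergence rate from solutions on three grids refined by factor r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def error_estimate(f_medium, f_fine, p, r):
    """Richardson-style estimate of the remaining error in the fine-grid solution."""
    return (f_fine - f_medium) / (r ** p - 1.0)

# Hypothetical solution functionals on grids each refined by a factor of 2.
f_coarse, f_medium, f_fine, r = 0.9700, 0.9925, 0.9981, 2.0

p = observed_order(f_coarse, f_medium, f_fine, r)
print(f"observed order of convergence ~ {p:.2f}")
print(f"estimated fine-grid error     ~ {error_estimate(f_medium, f_fine, p, r):.2e}")
```

Reporting even these two numbers alongside the overlaid graphs would move a result from "looks good enough" to something a reader can actually trust.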


The relative value and priority of each is different. A lot depends on what the pacing requirements for progress are, but keeping the focus on the value proposition should be an imperative.
The secondary fuel for this revolution is the model of interaction and the algorithms that efficiently deliver the value. The actual code and computing needed for this delivery must be competently executed, but beyond that offer nothing distinguishing. This is a massive lesson sitting right in front of the scientific community, which, judging by its actions, seems not to have understood it. Today's computing-for-science emphasis has completely inverted the value stream that is revolutionizing computing in the rest of the World.
Modeling must always be improving. If we are doing our computing correctly the models we use should continually be coming up short. Instead, the models seem to be completely frozen in time. They aren’t advancing. For example, I believe we should be undoing the chains of determinism in simulation, but even today deterministic simulations are virtually all of the workload.
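To make "undoing the chains of determinism" a little more concrete, here is a hedged sketch of the simplest possible step in that direction: instead of one run at a nominal input, sample the uncertain input and report a distribution of outcomes. The toy model and the assumed input distribution are placeholders of mine, not anything from a real code.

```python
import random
import statistics

def toy_model(drag_coefficient):
    """Stand-in for an expensive deterministic simulation (an assumption)."""
    return 100.0 / (1.0 + drag_coefficient)

# Deterministic practice: a single run at the nominal input value.
nominal = toy_model(0.30)

# Non-deterministic practice: propagate input uncertainty by Monte Carlo sampling.
samples = [toy_model(random.gauss(0.30, 0.05)) for _ in range(10_000)]

print(f"single deterministic answer : {nominal:.2f}")
print(f"mean over uncertain inputs  : {statistics.mean(samples):.2f}")
print(f"spread (standard deviation) : {statistics.stdev(samples):.2f}")
```

Even this crude treatment gives decision makers a spread rather than a single number pretending to be the truth.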
One aspect that has been systematically shortchanged is the foundation of value in scientific computing, which is found foremost in models and their solution via algorithms and methods. The other aspect that has been systematically shortchanged is the value of the people who provide the ideas that form the models, methods and algorithms. Ultimately, the innovation in scientific computing is the intellectual labor of talented individuals.
This truth is valid whether the human activity is the search on your phone or laptop, purchasing through Amazon, predicting tomorrow's weather, solving the airflow over an aircraft wing, or the flow of neutrinos in a supernova using the Boltzmann transport equation. The real revolution in computing is the ability of computing to matter to how we live our daily lives, whatever the activity. Given that the value in all of this is the added capacity to achieve our goals, it might be worth considering whether our priorities actually reflect this. Where these values are present in computing the
importance and value of computing has swelled. Given my personal focus on the scientific use of computing my assessment would be that we have lost our way. The values in computing programs are horribly distorted and out of balance. A key to this is the loss of perspective on what really matters.
In scientific computing the key connection to reality is models. The most basic models are the governing equations such as the Euler, Navier-Stokes or Boltzmann equations. These models are augmented by other models of subprocesses (often called subgrid models), and by constitutive data that are typically experimentally measured and define the mean behavior of materials (accumulating effects that would otherwise be statistical). These descriptions are the essential element in the value of computing to human activity. Their value transcends any of the other aspects: the algorithm, the code, and the computer itself. If the basic models are inadequate or faulty, everything else is basically for naught. If the model is good, the rest of the components need to get it right: the algorithm or method needs to solve the model correctly and accurately, the implementation in code needs to be correct, and the computer needs to be capable of solving the problem. It is an exercise in balance and perspective. Our key issue today is that this balance and perspective have been lost.
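As one concrete instance of the governing models mentioned above, the one-dimensional compressible Euler equations can be written in conservation-law form (quoted here as a reminder, not a derivation):

```latex
\frac{\partial}{\partial t}
\begin{pmatrix} \rho \\ \rho u \\ E \end{pmatrix}
+
\frac{\partial}{\partial x}
\begin{pmatrix} \rho u \\ \rho u^{2} + p \\ u\,(E + p) \end{pmatrix}
= 0,
\qquad
p = (\gamma - 1)\left(E - \tfrac{1}{2}\rho u^{2}\right).
```

The ideal-gas closure for the pressure p is exactly the kind of constitutive relation described above: without it, the governing equations alone say nothing about any particular material.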

One of the things that seems intriguing is the appearance of the algorithm in the broader cultural milieu. Despite its inherently esoteric and abstract character, the algorithm is becoming a bit of a celebrity these days. Popular press articles have started to examine the impact of algorithms on our daily lives and explore the power and dangers of relying upon them.
As the massive gains from computer power wound down, and simultaneously the Internet transitioned into a huge web of human connectivity, the value proposition for computing changed. Suddenly the greatest value in all of this power switched to connection, access and sorting information. There were some fitful starts at attacking this key problem, but one solution rose above the rest: Google. Based on the work of a couple of Stanford graduate students and some really cool mathematics, Google took the world by storm, and in a decade it had transformed itself into the World's most powerful company. An algorithm that solved the data and connectivity access problem better than anything before it fundamentally powered Google.
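That "really cool mathematics" is widely identified with the PageRank algorithm. Below is a minimal power-iteration sketch of the idea; the tiny link graph and the damping value are illustrative assumptions of mine, not anything resembling Google's actual implementation.

```python
import numpy as np

# Minimal PageRank-style power iteration on a tiny, made-up link graph.
# links[i] lists the pages that page i points to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, damping = len(links), 0.85

# Column-stochastic transition matrix for the "random surfer".
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1.0 / len(outs)

rank = np.full(n, 1.0 / n)
for _ in range(100):                 # power iteration toward the fixed point
    rank = (1 - damping) / n + damping * M @ rank

print("importance scores:", np.round(rank / rank.sum(), 3))
```

The heart of that edifice is an eigenvector calculation a few lines long; the surrounding hardware and code, however impressive, were never the differentiator.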
Google replaced a computer software company, Microsoft, as the World's most powerful company. In both cases computer programming was the engineering vehicle. Programming is a technique whereby intellectual labor is committed to a form a computer can automatically execute, applying a method or algorithm to solve a problem. Usually the computer program is actually a large collection of methods and algorithms.
Google used the ability to give people access to information and connectivity to eclipse Microsoft. The algorithm had moved from being a topic of nerdish academic interest to one of the most powerful things in the World. The world's economy spun on an axis determined by a handful of algorithms.
Meanwhile scientific computing has lost its mind and chosen the very path that led IBM toward disaster. The end of Moore's law has resulted in a collective insanity of spending vast sums of money supporting the hardware path in the face of looming disaster. At the same time they have turned their backs on algorithms. Effort and focus flow into obtaining and building massive computers that are increasingly useless for real science, while the value that algorithms bring is ignored. The infatuation with the biggest and fastest computer, measured by an increasingly meaningless benchmark, only grows with time. This continues while the key to progress stares them in the face every time they do an Internet search: the power of the algorithm.
Expertise in any field is built from long effort, luck and specialization. Over time this causes a lack of perspective on the importance of your profession in the broader world. It is often difficult to understand why others can't see the intrinsic value in what you're doing. There is a good reason for this: you have probably lost sight of the reason why what you do is valuable.
It's always important to keep the most important things in mind, and along with quality, the value of the work is always a top priority. In thinking about computing, the place where computers change how reality is engaged is where value resides. Computers' original uses were confined to business, science and engineering. Historically, computers were mostly the purview of business operations such as accounting, payroll and personnel management. They were important, but not very important. People could easily go through life without ever encountering a computer, and their impact was indirect.
Personal access to computer power allowed computing to grow to an unprecedented scale, but an even greater transformation lay ahead. Even this change made an enormous impact because people almost invariably had direct contact with computers. The functions that were once centralized were at the fingertips of the masses. At the same time the scope of the computer's impact on people's lives began to grow. More and more of people's daily activities were being modified by what computing did. This coincided with the reign of Moore's law and its massive growth in the power and/or decrease in the cost of computing capability. Now computing has become the most dominant force in the World's economy.
Mobile computers allowed computing to obtain massive value in people's lives. The combination of ubiquity and applicability to day-to-day life made computing valuable. The value came from defining a set of applications that impact people's lives directly and are always within arm's reach. Once these computers became the principal vehicle of communication and the way to get directions, find a place to eat, catch up with old friends, and answer almost any question at will, the money started to flow. The key to the explosion of value wasn't the way the applications were written, coded or run on computers; it was their impact on our lives. The way the applications work, their implementation in computer code, and the computers themselves just needed to be adequate. Their characteristics had very little to do with the success.
Scientific computing is no different; the true value lies in its impact on reality. How can it impact our lives, the products we have or the decisions we make? The impact of climate modeling is found in its influence on policy, politics and various economic factors. Computational fluid dynamics can impact a wide range of products through better engineering. Other computer simulation and modeling disciplines can impact military choices, or provide decision makers with ideas about the consequences of actions. In every case the ability of these things to influence reality is predicated on a model of reality. If the model is flawed, the advice is flawed. If the model is good, the advice is good. No amount of algorithmic efficiency, software professionalism or raw computer power can save a bad model from itself. When a model is good, the solution algorithms and methods found in computer code, running on computers, enable its outcomes. Each of these activities needs to be competently and professionally executed. Each of these activities adds value, but without the path to reality and utility its value is at risk.
So we have a national program that is focused on the least valuable thing in the process and ignores the most valuable piece. What is the likely outcome? Failure, or worse than that, abject failure. The most stunning thing about the entire program is that the focus is absolutely orthogonal to the value of the activities. Software is the next largest focus after hardware. Methods and algorithms are the next highest focus after that. If one breaks this area of work into its two pieces, new breakthroughs versus computational implementation, the trend continues: the less valuable implementation work has the lion's share of the focus, while the groundbreaking type of algorithmic work is virtually absent. Finally, modeling is almost completely absent. No wonder the application case for exascale computing is so pathetically lacking!
Alas, we are going down this road whether it is a good idea or not. Ultimately this is a complete failure of the scientific leadership of our nation. No one has taken the time or effort to think this shit through. As a result the program will not be worth a shit. You’ve been warned.
One of the things that the Winter holiday means to me is movies, and good ones at that. It is something my wife and I love to do, enjoy and argue about.
Here are my holiday movie observations for the current season. I'll assign each a letter grade, with an Academy Award-winning film usually getting an "A". I'd give all of the above-mentioned movies this grade, and a few an "A+".
An absolute stunner of a movie, with one of the best acting performances I can remember seeing in a long time by J.K. Simmons as Terence Fletcher. It is a student-teacher story set in a conservatory. The kid is a talented young jazz drummer (played with skill by Miles Teller) looking to catch the eye of the famous teacher. He does, and then the fireworks start. The filming and acting produce the sort of tension that usually comes from action flicks. This is literally edge-of-your-seat stuff.
I really wanted to like this better. It was a finely acted and crafted historical drama based on a key moment in the civil rights movement. It is stunning to see the kind of things that used to happen in the United States. We've made progress as a country, but shockingly little, as the events of the last year show. The action takes place in the deeply racist Alabama of 1964 and 1965, with tension between MLK and LBJ. Other figures like J. Edgar Hoover and George Wallace come across like the villainous humans they were. Overall an important movie that was competently executed, but not the brilliant movie I had hoped it would be.
This is a good movie, the weakest of the ones I gave an "A-". It is a very Hollywood version of Alan Turing's life. Benedict Cumberbatch takes the material and produces a wonderful performance. The storytelling is unique, running three timelines from Turing's life in parallel, with great lessons relevant to today's problems. The upshot is that Turing's life was immensely tragic, and his service to England and the World was never paid what it was due.
I would sum this movie up as being thoroughly disappointing. I am guessing that the problem is that no one can tell Peter Jackson "no" any more and he is reverting to his roots. Some of the filmmaking decisions are simply ludicrous and remind me strongly of Jackson's earlier films like "Dead Alive". The choices are almost always comical, and someone should have told him, "this is a bad idea".
Along with Whiplash this is my choice for the best picture of the year. The movie has a massive gimmick: it was filmed a week or two a year over 12 years. It chronicles the childhood of a boy whose parents divorce, following him from a small boy entering school to an adult entering college. The gimmick is remarkable, using the same actors throughout to show the passage of time. The film is wonderful beyond the gimmick, delivering a moving tale of personal growth for all the characters. It is both simple and immensely rich.
This movie is a wonderful bit of pay-per-view surprise and quite enjoyable on the whole. Some aspects of the movie are odd, but it is filled with great performances, including surprising depth from Chris Evans. He is a much better actor than people realize. The movie has action, tension and deep commentary on our modern world and its problems. The film requires a degree of suspension of disbelief regarding the basic premise, but if you can manage that it is a real gem.
This was a marvelously dark movie and portrait of a true sociopath. Jake Gyllenhaal is wonderfully creepy in the role and manages to make himself genuinely unlikable. He is driven and relentless in achieving fame and success without a hint of morality. At the same time the film succeeds in providing a tremendously insightful commentary on our modern society and our appetite for news that titillates much more than informs.
This is a film that divides opinions for good reasons. It is a wonderfully majestic movie that is horribly flawed. Good but not great performances work against an uninspired script. The grand concept, the arc of time and an innovative narrative structure make the story watchable. In the end it is a film worth seeing once, but one that won't be remembered 10 years from now.
This film is the controversy of the season, with the hacking of Sony and the capitulation to terrorism: the studio initially declined to release the movie, then came to its senses. We saw it on pay-per-view. The hackers did a better job promoting the movie than it deserved. This was easily the worst movie we saw all season. It was amusing and thoughtlessly entertaining, but a cinematic turd. It was a couple hours of my life I can't get back.
Gone Girl, A-