Crisis is Good. Crisis is a Messenger.
― Bryant McGill
Computational What? Science? Engineering? Physics? Mathematics?
My last post was about computational science’s impending multi-crisis: the loss of Moore’s law, exploding software complexity, and the failure to invest in the field’s intellectual foundation. A reasonable question is how broadly the issues described there apply to subsets of the field. What about computational physics? What about computational engineering? What about computer science? What about applied mathematics? What are the differences between these notions of computation’s broader role in the scientific world? What are the key characteristics that make up the essence of scientific computing?
One of the most important of disciplines is the one that never assumed its logical name: mathematical engineering.
—Nick Trefethen
Computational science is an umbrella for a variety of things that have gradations of difference and don’t form a terribly coherent whole. Instead it is a continuum, with one end held down by computer science and the other by computational engineering, or perhaps by the fields that birthed computing, physics and mathematics. The differences between engineering, mathematics and physics show themselves in computation as they do in other fields, but scientific computing should really be something of an amalgam of all of these areas.
We are not creators; only combiners of the created. Invention isn’t about new ingredients, but new recipes. And innovations taste the best.
― Ryan Lilly
The Origin: Physics and Mathematics
To start our discussion it is worth taking a look at the origins of computing, when mathematics and physics combined to create the field. This combination is embodied in John von Neumann, whose vision largely produced the initial instantiation of scientific computing. Scientific computing began in earnest under the aegis of the development of the atomic bomb. The application of computing was engineering analysis done by some of the greatest physicists in the world, most notably Hans Bethe and Richard Feynman, using methods devised by John von Neumann and Rudolf Peierls. Engineering was limited to the computer itself.
Mathematicians played key roles in putting computers to proper use, notably through the efforts of Robert Richtmyer, Nicholas Metropolis and Richard Hamming. As a rule, the overall effort was conducted by a host of geniuses on an application of monumental international impact and importance. Practically speaking, they were exquisitely talented scientists who were also immensely motivated and had every resource available to them.
From this august origin, computing began to grow outside the womb of nuclear weapons’ work. Again, it was John von Neumann who provided the vision and leadership, this time from the Institute for Advanced Study in Princeton, focused on weather prediction and the development of better computers. Again, the application was largely in the realm of physics, with the engineering being applied to the computers themselves. Meanwhile computing was broadening in its appeal, drawing attention from the successes at Los Alamos and Princeton and from colleagues at universities. Other Labs in the United States and the Soviet Union also began exploring the topic. It still remained immature and speculative, especially in a world that scarcely comprehended what a computer was or could do.
Computers are incredibly fast, accurate, and stupid: humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination.
― Albert Einstein
Engineering Joins the Revolution
It wasn’t until the 1960’s that engineering activities began to include computation. Part of the reason was that the initial set of methods had been developed by physicists and mathematicians, and computing in general, and computing power sufficient to contemplate engineering in particular, was simply unavailable. At first, the engineering uses of computing were exploratory and relegated to research activities, more like the physical or mathematical sciences than engineering. By the 1970’s this had ended, led by a handful of pioneers in aerospace, mechanical and civil engineering. The growth of engineering’s use of computing also led to some bad things, like the hubris of the awful “numerical wind tunnel” affair. In the late 1970’s, talk of replacing wind tunnel testing with numerical simulations became an embarrassing setback (a mistake we have naively made all over again). It represented a massive technical overreach and ultimately set the field back by driving a wedge between computing and experiments.
Civil engineering made progress by utilizing the finite element method, which was ideally suited to that field’s intellectual basis. In mechanical engineering, heat transfer and fluid flow problems, dominated by heat exchanger design, led the way. Together with aerospace engineering, these efforts built the important topic of computational fluid dynamics (CFD), which is the archetype of computational science in general. Nuclear engineering was birthed from physics and had computing at its heart almost from the beginning, especially for the problem of reactor core design. Its methods were born directly from the nuclear weapons’ program as a natural outgrowth of the peaceful exploration of nuclear power.
Science is the extraction of underlying principles from given systems, and engineering is the design of systems on the basis of underlying principles.
—Nick Trefethen
Mathematics Shrinks from View
Computer Science is no more about computers than astronomy is about telescopes
― Edsger Wybe Dijkstra
All of this was a positive outgrowth of the combination of physics and mathematics. During the same period the mathematical contributions to scientific computing went in several directions, with pure mathematics birthing computer science and applied mathematics going its own way. Computer science has become increasingly divorced from scientific computing over time and has failed to provide the sort of inspirational impetus mathematics had previously provided. For several decades applied mathematics filled this vacuum with great contributions to progress. In more recent times applied mathematics has withdrawn from this vital role. The consequence of these twin developments has been a terrible toll, depriving scientific computing of a strong pipeline of mathematical innovation. I will admit that statistics has made recent strides in connecting to scientific computing. While this is a positive development, it hardly makes up for the broader diminishing role of the rest of mathematics in computing.
We see that computation was born from physics and mathematics, with engineering joining after the field had been shaped by its parents. Over the past thirty or forty years engineering has come to play an ever larger part in scientific computing, the physical sciences have continued their part, but mathematics has withdrawn from centrality. Computer science has inherited pure mathematics’ indifference to utility. Applied mathematics leapt to fill this void, but has since withdrawn from providing the full measure of much-needed intellectual vitality.
Computer science is one of the worst things that ever happened to either computers or to science.
—Neil Gershenfeld
Computing Becomes Something Monstrous
Part of the reason for this is a change in the cultural consciousness regarding computing. In the beginning there was physics and mathematics combining in the imagination of John von Neumann to produce something new, something wonderful and era-defining. It gestated in the scientific community for several decades until computing exploded into the public consciousness. A combination of maturity in the use of computing and sufficient computing power available to the masses triggered the transition. Computing was no longer the purview of nerds and geeks; it was now owned by all of humanity. As such computing became somewhat pedestrian in nature and lost its sheen. This also explains the rise of engineering as an outlet for computing, and the loss of mathematics. In the absence of innovation we substituted raw power. Rather than continue to improve through better thinking we came to rely upon Moore’s law for progress. Where we used to out-smart problems, we now overpower them with unparalleled force.
While scientists and big business owned computing until about 1995, all of a sudden it became public property. Soon it grew to be something that dominated the global economy. Powered by Moore’s law, computing became ubiquitous and ironically ceased being about computing; computers became about communication. Now everything valuable about computers is communication, not computation. Computation is an essential, but minor, element in the value proposition. A big part of the reason is that the power of computers is so great that the computational load has become trivial. The Internet gives access to information and data and connects people in ways never before imaginable. As such the business possibilities are staggering. Computing is no longer so much about computers as it is about people and their money.
Moore’s law also became a cipher for technology and progress. It has infected computational science with its pervasively superficial nature. It isn’t that Moore’s law is superficial per se; it is the societal interpretation of its implications. Moore’s law is at death’s door, if it hasn’t already passed. Its death does not mean progress will die, it just means progress’s path will change.
You can’t solve a problem with the management of technology with more technology.
— Bill Joy
Where do we go from here?
What we can conclude from this discussion is that the character of scientific computing has fundamentally changed. Where the use of computing for engineering work should have added to form a more complete whole, the withdrawal of mathematics has cheated us of that brighter future. Engineering is an essential human activity and the natural outgrowth of our scientific achievements, but it can lack creativity at times. Such creativity is always beneficial, and better when it combines disciplines. The structure and rigor of mathematics is essential for putting this creativity on the strongest footing. To make progress in the future it is essential to include engineering, the physical sciences and mathematics with some degree of equality. The rather weak-minded approach of simply riding Moore’s law to drive scientific computing forward must end, both because it is fundamentally inadequate and because that source of progress is dying.
Postscript
I’ve decided to get off the daily writing thing, or more accurately the daily posting. I still need to write every single day, so that’s not changing. I’m a lot better off thinking a bit more about the topic to write about, and working it out over several days. My new goal is two or three blog posts a week.
The Future is Already Here, Everyday.
The future is already here – it’s just not evenly distributed.
― William Gibson
So it’s a new year with all the requisite reflective looks forward and backward. I’ll do both here and posit that perhaps an era is drawing to a close and it’s time for a big change in scientific computing. Even more, I’ll argue that a big change is being thrust upon us, and it’s time to get ahead of it. I’ve taken the history of scientific computing and laid it out in a series of eras, each 15-20 years long. These eras are defined by a combination of ideas, algorithms, methods, hardware and software. Changes in the composition of all of these define each era and trigger the transitions between them.
The politics of the time have an enormous impact on focus and resource availability. Scientific computing was born in the crucible of a World War and matured in the urgency of the Cold War. Nothing exists today to focus the mind and open the pocketbook in the same way. On the other hand, computing has never been as important as it is today. Never have more of society’s resources gone in its direction. How can we harness this massive creative force for our benefit?
1945-1960 (creation): In this time scientific computing was largely taking place in the most important Labs, on the most important topic, with high priority and huge resources. Great innovations were taking place in computers and the practice of computing. Along with refinements in the engineering of computers, the practice of programming began to take shape. The invention of Fortran and its capacity to express methods and algorithms in code was one of the developments that brought this era to a close. In this time, the development of mathematical theory and numerical analysis was key. The invention of the concepts of stability and convergence for numerical methods was one of the great achievements. These provided a platform for systematic development in the 1960’s.
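To make the stability-and-convergence idea concrete, here is a brief sketch in my own notation (not drawn from the original development) of the classic von Neumann analysis for the first-order upwind scheme applied to linear advection:

$$u_t + a\,u_x = 0,\quad a>0, \qquad u_j^{n+1} = u_j^n - \nu\,(u_j^n - u_{j-1}^n), \qquad \nu = \frac{a\,\Delta t}{\Delta x}.$$

Substituting a Fourier mode $u_j^n = g^n e^{i k j \Delta x}$ gives the amplification factor

$$g = 1 - \nu\,(1 - e^{-i k \Delta x}), \qquad |g|^2 = 1 - 2\,\nu\,(1-\nu)\,(1 - \cos k\Delta x),$$

so every mode stays bounded ($|g| \le 1$) exactly when $0 \le \nu \le 1$, the CFL stability condition. The Lax equivalence theorem ties the two concepts together: for a consistent scheme applied to a well-posed linear problem, stability is equivalent to convergence.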
The golden era followed, when the pieces came together into a true computational science. For the first time the computers and software were balanced with the methods and models. In many ways Seymour Cray defined the era, first with the CDC 6600 and 7600 computers and then with the machines bearing his name. The vision set forth by von Neumann came into force. Academic scientific computing became completely respectable, with mathematics, physics and engineering all taking part. The first hints of extreme hubris were also witnessed: the “numerical wind tunnel” debacle unfolded in aerospace. The claim that CFD could displace physical wind tunnel testing in design and qualification was a massive over-reach in capability. Great damage was done in the process, and no one seems to have learned from the experience. It foreshadows the developments of the current time with ASC, when the creation of “virtual underground testing” was proposed to make up for a ban on actual underground testing.
1995-2015 (mid-life): Then the glory days ended with a bang through a combination of events. The Cold War ended and soon nuclear testing ceased. The Labs would have their own “numerical wind tunnel” moment, but no actual wind tunnel would be available to test it. At the same time the capacity of the supercomputers of the golden era to maintain Moore’s law came to an end. The entire ASC program hinged upon the premise that advances in computational performance would pave the way for predictive simulation. We had the attack of the killer micros and the birth of massively parallel computation to keep hardware performance on the increase.
Getting the methods and models of old to work on these computers became an imperative; so did access to more computing power via Moore’s law. At the same time the complexity of the codes was growing by leaps and bounds. New programming paradigms were being ushered into use, with C++ leading the way. Its object-oriented principles were thought to be a way to handle the seemingly overwhelming complexity. With more resources flowing into hardware and software, the amount of energy going into methods and models waned. Where efforts in these endeavors had previously yielded gains larger than Moore’s law, such gains simply evaporated during this era.
The most evident crisis is the demise of Moore’s law. Given the devotion to computing power as the route to predictive computational science, the loss of growth in computing power would be fatal. There are two worrying signs: the growth in computing power at the processor level has slipped to a crawl, and the ability to use all the power of the massively parallel computers for real problems is missing. At the low end of computing nothing will save Moore’s law, especially as the computing industry has moved on to other priorities. Its loss is simply accepted. At the high end we grasp at terrible metrics like weak scaling or LINPACK to hide the problems, but the immensity of the issues becomes clearer every day. In the middle, Moore’s law is clinging to life, but the two ends are converging on the middle, and when they meet Moore’s law will be dead. There is a host of hopes for continued life, but the laws of physics are arrayed against the continuation of this trend. With all the effort going into exploiting Moore’s law, what will be left to pick up the pieces?
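As an illustration of why weak scaling is such a flattering metric, here is a minimal sketch of my own (the 5% serial fraction is an assumed number, not a measurement from any real code) comparing the strong-scaling speedup predicted by Amdahl’s law with the weak-scaling figure given by Gustafson’s law:

```python
# Minimal illustration: strong scaling (Amdahl) vs. weak scaling (Gustafson)
# for a code with an assumed 5% serial fraction.

def amdahl_speedup(p, serial_fraction):
    """Strong scaling: a fixed problem split across p processors."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

def gustafson_speedup(p, serial_fraction):
    """Weak scaling: per-processor work held fixed, so the problem grows with p."""
    return serial_fraction + (1.0 - serial_fraction) * p

if __name__ == "__main__":
    s = 0.05  # assumed serial fraction
    for p in (1, 16, 256, 4096, 65536):
        print(f"p={p:6d}  strong={amdahl_speedup(p, s):8.1f}x  weak={gustafson_speedup(p, s):10.1f}x")
    # The strong-scaling speedup saturates near 1/s = 20x no matter how big the
    # machine gets, while the weak-scaling number climbs without bound -- which
    # is exactly why it is the figure that gets quoted.
```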
The third and most shadowy crisis is the lack of impact from methods, models and algorithms in the most modern era of scientific computing. As I said earlier, part of the problem is the twin crises of declining hardware gains and software bloat sapping the energy from the system. Before our infatuation with Moore’s law as the heartbeat of progress, innovation in algorithms, numerical methods and modeling produced more progress than hardware gains. These gains are harder to measure and far subtler than raw computational performance, but just as real. As hardware fades away as a source of progress they are the natural place to turn for advances. The problem is that we have starved this side of scientific computing for nearly 20 years. Major changes are needed to reinvigorate this approach. As I’ve come to realize, the software languages are themselves a massive algorithmic achievement (the Fortran optimizing compiler is listed among the 10 greatest algorithms of the 20th century!). This is to say that intellectual labor toward figuring out how to program computers in the future is part of this issue and a necessary element in fixing two of the crises.
The problem with the money spent on computational science is one of lack of balance and aggression. There is plenty of money; the problem is what that money is spent on. Too much focus has been placed on hardware, with a reliance on Moore’s law for progress. Hardware improvement has been a sure thing for half a century, and thus a low risk. Recent investments in scientific computing have largely forgotten that history. A lot of issues remain unaddressed by current work. This is a theme I’ve touched on before.
There is a significant practical problem with maintaining the progress provided by Moore’s law: it has become very hard to do. This is because Moore’s law is already dead, at least in any practical sense. For a tiny, relatively unimportant and impractical class of applications we can still make it work, but at a ridiculous cost. For most of the things we really use computers to do, Moore’s law died about 10 years ago. To keep it alive we build computers that are incredibly hard to use, build and maintain. They use too much power and cost far too much money. The resources going into this “fool’s errand” are starving all the work that actually makes the computers useful. This downward spiral needs to end. The commercial computing world has already divested from Moore’s law and now looks to software and communication capability for value. The hardware is still improving, but modestly compared to the past.
Our software routinely lives on well beyond the date it should have been rewritten. It becomes less useful and more expensive to maintain. These costs amplify over the long term, but in the short term the patch-and-kick-the-can-down-the-road approach is viable. The problem simply gets worse every year. We are unwilling to deal with rewriting the software because of the investment it represents. The problem is quite analogous to the Nation’s physical infrastructure problems.
We also have issues regarding methods and models. As a code becomes more capable, it becomes harder to develop new codes because it is so expensive to measure up. The old codes can do marvelous things for practical problems. Testing new methods and models becomes nearly intractable. Ideas to make methods and models abstract with “components” have largely failed to provide a path forward. Part of the issue is the inability of component-based methods to solve “real” applications, which requires a lot of dirty work (robustness, efficiency, reality). As a result the older methods and models have become ingrained. As this happens the community working on methods and models becomes estranged from computing. Additionally, the effort to put these older codes on new computers has become extremely difficult and expensive. This is compounded by the size of the code base. Together these make a recipe for disaster.
We are left with machines that are outrageously expensive, but less useful every year. It simply cannot be sustained. The short-term thinking and the lack of tolerance for risk keep us from solving any of these problems. We end up settling for mediocrity as a result.
Without Moore’s law we won’t have the security of doing nothing and still making progress. The problem is that people don’t recognize the exorbitant costs of propping up Moore’s law for the last decade, or the cost of what has been sacrificed. The terrible thing is that the costs and risks of the path we’ve taken are far higher. We are moving toward a collapse of astounding proportions. Instead of building a sustainable future we are building on the past while losing sight of how we actually got here. For decades the mathematics and physics were miles ahead of the computers. During the late 70’s the computers caught up, and for fifteen or twenty years there was a glorious balance of computing hardware and intellectual capital. We have lost sight of what made all of this possible, and we are taking a huge risk moving forward.
At some level it all stems from fear of failing, which, ironically, leads to actual failing, or at least success so modest that it seems indistinguishable from failure to success-minded folk. I simply don’t see an appetite for progress that can overwhelm the desire to never appear to fail. This outcome is assured by the belief that we can manage our way to success (and manage away failure), and by the short-term focus applied to everything.
The ideas hiding in the shadows will never see the light. They are viewed as bad because they don’t fit conventions. Conventions are safe, and lead to the form of mediocrity masquerading as success today.
I already have happiness in my private life, so I mean the best gift for my professional life. It would lead to a much more vibrant and rewarding research life than the zombie-like march toward mediocrity we’ve been on for several decades. The mediocrity is fueled by the tendency to celebrate false achievement, removing the need for unvarnished success. So what would I like to find under the tree this morning?
Computational science opens doors to understanding unavailable to experimental and theoretical science; it is a complement to each, but not a replacement. A large part of the problem is the extent to which it has been sold as a replacement for traditional science. This is at the core of the problem, but there is much more wrong with the direction.
The attack of the killer micros swallowed the high performance computing world whole. It is being taken over today by an attack of legions of portables. The entire computing industry has become enormous, dominated by cell phones and the impending “Internet of things.” Before the killer micros, we had custom machines and custom chips tailored to the scientific computing world, typified by Crays.
Our situation with computing hardware has gotten far worse over the last twenty years. Sure, the computers are much faster, but they are terrible to use. From a user’s perspective, the systems are worse. We have accepted the commodity bargain and a “worse is better” approach instead of demanding something focused on solving our problems. I would hypothesize that we would have been better off with somewhat slower computers that were more useful and better suited to our tasks. In relative terms we have accepted crap in pursuit of a politically correct, but technically corrupt, vision of computing. In pursuit of the fastest, biggest computer we have accepted worse actual performance.
Research in scientific computing has not had enough impact over this time. In too many cases new numerical methods are still not being used in codes. The methods embedded in codes are largely and significantly the same ones we used twenty years ago. The models too are twenty years old. The older models should be invalid in many cases simply due to the refinement of the mesh and the requisite change in time and length scales. New capabilities in uncertainty estimation and coupled physics are still research rather than deployed and producing results. In many cases the codes are two, three or four generations of methods and models overdue for a complete change. Back in the days of Crays, and a bit more courage in research, new codes would be spawned every few years. Now we hold on to our old codes for decades. New codes are the vehicle for new ideas, new methods and new models to be put to useful, important work.
Perhaps a useful analogy would be to think of high performance computing as a car. In this view the computer is the engine, and the code is the steering, interior, stereo, and other features. What kind of car have we been driving? Basically we are driving a car from the 1980’s with a series of new engines. The steering is the same, the interior has at most been reupholstered, and all the original equipment is still in place. Instead of being able to hook up your iPhone to the stereo, we are playing our old 8-tracks. No built-in navigation either, so make sure you buy a map at the gas station. This works fine, but you won’t get any warning about the major construction zone ahead. This is the bargain we’ve entered into; it is an absurd and unbalanced approach. If it were a car we wouldn’t stand for it, so why do we put up with it in computing?
Doug makes a number of good points in his commentary, but I think he misses the single biggest issue. The main problem with the list is the degree of political correctness embedded in it. He still hails from the point of view that Moore’s law must be pursued. It is the tail that is wagging the dog, and it has been in control of our mindset for too long. Doug also hits upon some of the same issues that Greg touches on, and again software engineering done professionally is a necessity. Finally, the increasing emphasis on V&V is touched upon, but as I will discuss, our current ideas about it miss something essential. V&V should be the discipline that points to where progress has been made, and where it has been lacking.

Consider the degree to which human ingenuity plays a role in progress. The program that funds a lot of what I work on, the ASC program, is twenty years old. It was part of a larger American effort toward science-based stockpile stewardship, envisioned to provide confidence in nuclear weapons when they are not being tested.
For example, think about weather or climate modeling and how to improve it. If we model the Earth with a grid of 100 kilometers on a side (so about 25 mesh cells would cover New Mexico), we would assume that a grid of 10 kilometers on a side would be better because it now uses 2500 cells for New Mexico. The problem is that a lot else needs to change in the model to take advantage of the finer grid, such as the way clouds, wind, sunlight, plant life, and a whole bunch of other things are represented. This is true much more broadly than just weather or climate: almost every single model that connects a simulation to reality needs to be significantly reworked as the grid is refined. Right now, insufficient work is being funded to do this. This is a big reason why the benefit of the faster computers is not being realized. There’s more.
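Here is a rough back-of-the-envelope sketch of that grid arithmetic. The only outside number I’m supplying is an approximate area for New Mexico, so treat the cell counts as order-of-magnitude illustrations of the hundredfold growth rather than exact figures:

```python
# Back-of-the-envelope grid refinement arithmetic for the New Mexico example.

NM_AREA_KM2 = 315_000  # assumed approximate area of New Mexico in km^2

def cells_covering(area_km2, cell_edge_km):
    """Number of square cells of the given edge length needed to tile the area."""
    return area_km2 / cell_edge_km**2

coarse = cells_covering(NM_AREA_KM2, 100.0)  # order of the ~25 cells quoted above
fine = cells_covering(NM_AREA_KM2, 10.0)     # 100x more cells for a 10x finer grid

print(f"100 km grid: ~{coarse:.0f} cells over New Mexico")
print(f" 10 km grid: ~{fine:.0f} cells over New Mexico")
# The cell count by itself says nothing about clouds, wind, sunlight or plant
# life: every sub-grid model tuned for 100 km cells has to be reworked before
# the 10 km grid actually buys better answers.
```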
One of the problems of modern supercomputing is the lack of effort to improve the solution of balance laws. We need to create methods with smaller errors, and when errors are made, bias them toward physical errors. There’s more.
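For reference, and in my own notation rather than anything quoted from elsewhere, a balance law has the generic form

$$\frac{\partial u}{\partial t} + \nabla \cdot f(u) = s(u),$$

stating that the quantity $u$ changes in a region only through the flux $f(u)$ across its boundary and the source $s(u)$ inside it. A finite-volume discretization preserves that cell-by-cell balance exactly, so the numerical error lives in how the fluxes are approximated. One reading of “biasing errors toward physical errors” is preferring schemes whose leading error behaves like a mechanism the physics itself admits, such as dissipation, rather than an unphysical one, such as oscillations that violate known bounds.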
For special sorts of linear systems associated with balance laws we can do a lot better. This has been a major part of the advance of computing, and the best we can do is for the amount of work to scale exactly like the number of equations (i.e., linearly). As the number of equations grows large, the difference between cubic and linear growth is astounding. These linear algorithms were enabled by multigrid, or multilevel, methods invented by Achi Brandt almost 40 years ago, which came into widespread use 25 or 30 years ago.
We can’t really do any better today. The efforts of the intervening three decades of supercomputing have focused on making multilevel methods work on modern parallel computers, with no algorithmic improvement. Perhaps linear is the best that can be done, although I doubt this. Work with big data is spurring the development of methods that scale at less than linear cost. Perhaps these ideas can improve on multigrid’s performance. The key would be to allow inventiveness to flourish. In addition, risky and speculative work would need to be encouraged instead of the safe and dull work of porting methods to new computers.
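To make the linear-scaling point concrete, here is a small, self-contained sketch of a geometric multigrid V-cycle for the 1D Poisson problem. It is entirely my own illustration with textbook choices of smoother and transfer operators, not code from Brandt or any production solver; each cycle costs O(n) work and cuts the residual by a roughly constant factor, which is the sense in which the solver scales linearly:

```python
import numpy as np

def apply_A(u, h):
    """Matrix-free 1D Poisson operator (-u'') on interior points, zero Dirichlet BCs."""
    Au = 2.0 * u
    Au[1:] -= u[:-1]
    Au[:-1] -= u[1:]
    return Au / h**2

def smooth(u, f, h, sweeps=3, omega=2.0 / 3.0):
    """Weighted-Jacobi smoothing sweeps."""
    diag = 2.0 / h**2
    for _ in range(sweeps):
        u = u + omega * (f - apply_A(u, h)) / diag
    return u

def restrict(r):
    """Full-weighting restriction of a fine-grid residual to the coarse grid."""
    return 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])

def prolong(ec, n_fine):
    """Linear-interpolation prolongation of a coarse correction to the fine grid."""
    e = np.zeros(n_fine)
    e[1::2] = ec                          # coarse points coincide with odd fine points
    e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])  # even interior points average their neighbors
    e[0] = 0.5 * ec[0]
    e[-1] = 0.5 * ec[-1]
    return e

def v_cycle(u, f, h):
    """One multigrid V-cycle for -u'' = f with homogeneous Dirichlet BCs."""
    u = smooth(u, f, h)
    if u.size <= 3:                       # coarsest grid: extra sweeps stand in for a direct solve
        return smooth(u, f, h, sweeps=50)
    r = f - apply_A(u, h)
    ec = v_cycle(np.zeros((u.size - 1) // 2), restrict(r), 2.0 * h)
    u = u + prolong(ec, u.size)
    return smooth(u, f, h)

if __name__ == "__main__":
    n = 2**10 - 1                         # interior points; 2^k - 1 coarsens cleanly
    h = 1.0 / (n + 1)
    x = np.arange(1, n + 1) * h
    f = np.pi**2 * np.sin(np.pi * x)      # manufactured so that u(x) = sin(pi x)
    u = np.zeros(n)
    for cycle in range(8):
        u = v_cycle(u, f, h)
        res = np.sqrt(h) * np.linalg.norm(f - apply_A(u, h))
        print(f"cycle {cycle + 1}: residual = {res:.3e}")
```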

Like most Americans I speak primarily with people who are like me. In my case that is those educated in a scientific or technical field, working at a Lab or university, doing research. I don’t have a lot of contact with people of a different background. I do have a handful of conservative friends, and the difference in their worldview is as understandable as it is stunning. What is really breathtaking is the difference in what we think is true. This is attributable to where our information comes from.
In large part the Internet has allowed everyone to draw from sources of information that suit their preconceptions. To a large extent we are fed information that drives us apart as a public.
The traditional news media is dominated by corporate interests, with Fox News leading the way, but the old three, ABC, CBS and NBC, are no different. MSNBC is viewed as the liberal vanguard, but again it’s no different either. Once this dynamic was primarily associated with newspapers, but as they die it has been replaced by TV, and as TV begins to die, by the Internet. Big money is running the show in every case, finding a niche that provides it profit and power.
Sometimes it’s semi-innocuous such as advertising for a TV show or movie. The dangerous aspect is the continuous avoidance of issues that the big moneyed interests don’t want portrayed, discussed or explored. This lack of coverage for a class of issues associated with money and class is poisoning democracy, tilting the discussion, and ultimately serving only the short-term needs of the businesses themselves.
A more innocuous aspect is the slanting of the political dynamic, which is happening pervasively via Fox News and its use of a message firmly associated with a single political agenda. In the UK it’s Sky that does the same thing. Across the board the impact has been to turn up the heat on partisan bickering and diminish the ability of the democratic process to function. Part of the problem, as becomes clear if you make the mistake of talking about such things, is that people no longer operate with the same facts. Each side of the debate simply cherry-picks facts to suit its aims and avoids facts that undermine its chosen message. As a result no one really gets the full story, and the truth is always slanted one way or another. Compromise and movement forward on vexing problems have become impossible.
The situation today favors those who have power, and today power stems from wealth. Money is paying for information, and this information is paving the way toward an even greater accumulation of power. The Internet is not providing the democratization of knowledge, but rather giving those already in power unparalleled capability to issue effective propaganda. Those in power are assisted by a relatively weak government whose ability to counter their stranglehold on society is stymied by inaction.
One of the things that continually bothers me the most about the changes where I work is the sunset of the individual’s importance. The single person is gradually losing power to a nameless and increasingly faceless management. Increasingly everyone is viewed as a commodity, and each of us is interchangeable. Another scientist or engineer can be slotted into my place without any loss of function. My innate knowledge, experience, creativity and passion are each worthless when compared to the financial imperatives of my employer. We are encouraged, if not commanded, to be obedient sheep. Along the way we have lost the capability to foster deep, sustained careers; the current regime encourages the opposite. The reason given for sacrificing this aspect of work is financial: no one can pay the cost of the social contract necessary to enable it. I think that is BS; the real reason is power and control.
In the old model, loyalty and a career’s worth of expertise were exchanged for long-term security with a single employer. This model was the cornerstone of the National Laboratory system. The Nation benefited greatly from the model, both in terms of security and National defense and in the scientific and engineering greatness it engendered.
The success of the scientists during the Manhattan Project provided the catalyst to extend this model more broadly. Their achievements fueled its continued support into the late 70’s. Then it was deemed too expensive to maintain. The management of the Labs is choking this culture to death. If it isn’t already gone, it soon will be.
A deep part of the model’s power was its enabling of individual achievement and independent thought. Perhaps more than the cost of the social contract, the Nation has allowed the forces of conformity, lack of trust and fear of intellect to undermine this model. The financial costs have escalated, largely due to systematic mismanagement and the absence of political courage and leadership, but cost has merely been the excuse for the changes. While the details are different at the Labs, the forces at work go hand-in-hand with the broader destruction of the middle class, which was offered a similar social contract in the post-war years. That contract has been replaced by cheap or outsourced labor, always with cost as the excuse.
I recently listened to a talk by Professor Stanley arguing that the pursuit of objectives actually undermined the achievement of innovation. It was a fascinating and provocative idea. I knew it would be the best thing I did all day (it was), and it keeps coming back to my thoughts. I don’t think it was the complete answer to innovation, but Professor Stanley was onto a key aspect of what is needed to innovate.
In other words, the system we have today is harmful. We are committed to continuing down the current path, results be damned. We have to plan and have milestones, just as business theory tells us, with no failure accepted. In fact we seem to act as if failure can simply be managed away. Instead of recognizing that failure is essential to progress, and that failure is actually healthy, we attempt to remove it from the mix. Of course, failure is especially unlikely if you don’t try to do anything difficult and choose your objectives with the sort of mediocrity accepted today, where a lack of failure is greeted as success.
The courage that once described Americans in the last century has been replaced by a fear of any change. Worries about a variety of risks have spurred Americans to accept a host of horrible decreases in freedom for a modest-to-negligible increase in safety. The costs to society have been massive, including the gutting of science and research vitality. Of course fear is the clearest way to pave the road toward the very outcomes you sought to avoid. We eschew risk, attempting to manage the smallest detail as if that might matter. This is combined with a telling lack of trust, one that should prompt some deep self-reflection: people don’t trust because they are not trustworthy themselves, and they project that character onto others. The combination of risk avoidance and lack of trust produces a toxic recipe for decline, the opposite of an environment for innovation.
We collectively believe that running everything like a business is the path to success. This means applying business management principles to areas where they have no business being applied. It means applying principles that have been bad even for business. Their application has destroyed entire industries in the name of short-term gains for a small number of shareholders. The problem is that these “principles” have been extremely good for a small cadre of people who happen to be in power. Despite the obvious damage they do, their wide application makes perfect sense to the chief beneficiaries. It is both an utterly reasonable and a completely depressing conclusion.
Returning to the theme of how best to spur innovation, and its antithetical relation to objectives, I become a bit annoyed. I can’t help but believe that before we can build the conditions for innovation we must destroy the false belief that business principles are the way to manage everything. This probably means that we have to see a fundamental shift in which business principles we favor. We should trade the dictum of shareholder benefit for a broader social contract that benefits the company’s long-term health, its employees and its communities as well. Additionally, we need to recover our courage to take risks and let failure happen. We need to learn to trust each other again and stop trying to manage everything.
I’m sure that most people would take one look at the sort of things I read professionally and collectively gasp. The technical papers I read are usually greeted by questions like “do you really understand that?” It’s usually a private thing, but occasionally, on a plane ride, I’ll give a variety of responses based on the paper, from “yeah, it actually makes sense” to “not really, this paper is terrible, but I think the work might be important.”
Some areas of the literature turn ever inward and become increasingly unapproachable to anyone else. This tendency should mark the death knell of the area, but instead the current system seems to do a great deal to encourage this pathology.
Other areas seem to be so devoid of the human element of science that the work has no contextual basis. In every case science is an intrinsically human endeavor, yet scientists often work to divorce humanity from the work. A great deal of mathematics works this way and leads to a gap in the understanding of the flow of ideas. The source and inspiration for key ideas and work is usually missing from the writing. This leads to a lack of comprehension of the creative process. A foolhardy commitment to including only the technical detail in the writing loses the history. Part and parcel of this problem are horrific literature reviews.
In some fields the number of citations provided in a paper is appallingly small. The author ends up providing no map for the uninitiated reader to figure out what they are talking about. Again this works to hide information and context while making the author seem smarter than they really are.