
The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent


Category Archives: Uncategorized

A response to criticism: Are we modernizing our codes?

14 Thursday Jan 2016

Posted by Bill Rider in Uncategorized


To avoid criticism say nothing, do nothing, be nothing.

― Aristotle

One warning: there will be some bad words in the post today. If you don’t like it, don’t go on. It is the way most of us really talk about this stuff!

Earlier this week I gave a talk on modernizing codes to a rather large audience at work. The abstract for the talk was based on the very first draft of my Christmas blog post. It was pointed and fiery enough to almost guarantee me a great audience. I can only hope that the talk didn’t disappoint. A valid critique of the talk was my general lack of solutions to the problems I articulate. I countered that the solutions are dramatically more controversial than the statement of the problems. Nonetheless the critique is valid and I will attempt to provide the start of a response here.

Here are the solutions I would propose at a high level:

  1. Constantly question ourselves on whether we are going in the right direction. Today we follow directions mindlessly, and refuse to make reasonable course corrections. (This point was a part of the talk! I think it is the vital starting point.)
  2. Micromanagement is killing us; we need to macromanage and focus on large objectives.
  3. Focus on improving reality and stop focusing on large-scale projects that produce no impact in the real world.
  4. Destroy the mediocrity, obedience and compliance cultures that have arisen because of fear-based management and decision making.
  5. Allow failure, encourage risk, and celebrate mistakes.
  6. If something is really a failure, allow it, say it and be all right with it. No more bullshit statements of success where reality is a train wreck.
  7. Create a learning environment where failure and mistakes create opportunity for growth.
  8. Focus on excellence rather than subservience and mediocrity.
  9. Stop using money as a measure of success.
  10. Rid us of short-term thinking and measurement of success. Start looking toward long-term success.
  11. Rid ourselves of the vast amount of wasted effort going into menial tasks that serve no purpose (obedience and compliance related).
  12. Get rid of project management of science. This concept is utterly and completely destructive for the conduct of science.
  13. Prepare for serendipitous outcomes and the change of direction they should provide; today we never are prepared.
  14. Stop being ruled by fear, and stop using this fear as the reason to avoid risk and fail to reach highly for achievements.
  15. Quit allowing our system to turn failure into success. Let failure happen. Celebrate. Learn and move forward. Allowing failure to be rebranded as success lets failure serve the wrong purpose.

If failure is not an option, then neither is success.

― Seth Godin

ASC is a prime example of failing to label and learn from failures. As a result we make the same mistakes over and over again. We are currently doing this in ASC in the march toward exascale. The national exascale initiative is doing the same thing. This tendency to relabel failures as success was the topic of my recent “bullshit” post. We need failure to be seen as such so that we do better things instead of repeating our mistakes. Today the mistakes simply go unacknowledged and become the foundation of a lie. Such lies then become the truth and we lose all contact with reality. Loss of contact with reality is the hallmark of today’s programs.

Remember the two benefits of failure. First, if you do fail, you learn what doesn’t work; and second, the failure gives you the opportunity to try a new approach.

—Roger Von Oech

One of the serious problems for the science programs is their capacity to masquerade as applied programs. For example ASC is sold as an applied program doing stockpile stewardship. It is not. It is a computer hardware program. Ditto for the exascale initiative, which is just a computing hardware program too. Science and the stockpile stewardship missions are mere afterthoughts. The hardware focus persists regardless of any actual need for the hardware. Other activities that do not resonate with the hardware focus simply get shortchanged even when they have the greatest leverage in the real world.

[Warning foul language ahead!]

In other words, when things are fucked up, someone needs to say so, and the fuck up needs to be acknowledged. If we don’t, we can’t learn and we end up fucking up again. The general bullshit way of reporting success for things that are not successful creates an environment where fuck ups are not viewed as such. Good fuck ups should not end careers; they should be highlights, as long as you learn from them. If you don’t learn from your mistakes then you have a problem. Today we have the bigger problem of never allowing ourselves to ever learn from a genuine mistake, which only plants the seeds of ever greater fuck ups in the future.


The ubiquity of the humor of the Dilbert comics lends perspective to all of this. Dilbert is dominated by issues from the corporate world and seems to indicate that the issues we are having are pervasive society-wide. Corporate governance is as messed up as government funded research. I’m almost certain that academia is in similar or worse shape. Some places like NASA are envious of the screwed-up state of affairs we have in DOE. I probably don’t court too much controversy by saying that money is at the root of many problems largely because it’s become the universal way of keeping score of success.

Judge a man by his questions rather than by his answers.

― Voltaire


What is important to work on?

08 Friday Jan 2016

Posted by Bill Rider in Uncategorized


Dare to think for yourself.

― Voltaire

The beginning of the year is a prime time for such a discussion. Too often the question of importance is simply ignored in favor of simple and thoughtless subservience to others’ judgment. If I listen to my training at work, the guidance is simple. Do what you’re paid to do as directed by your customer. This is an ethos of obedience, and your particular judgment and prioritization is not really a guide. This is a rather depressing state of affairs for someone trained to do independent research; let someone else decide for you what is important, what is a priority. This seems to be what the government wants to do to the Labs: destroy them as independent entities, and replace them with an obedient workforce doing whatever it is directed to do.

This is one of the innovator’s dilemmas: Blindly following the maxim that good managers should keep close to their customers can sometimes be a fatal mistake.

― Clayton M. Christensen

When you take a long critical look at your customers’ credentials and knowledge, the choice to seek their guidance on importance and priority becomes laughable. They are usually radically less competent to make such choices than most of you. Perhaps this is just my good old-fashioned Los Alamos arrogance showing. My customers don’t actually know best! The problem with the current standard of priority setting is that the customer’s guidance is destroying research, and slowly suffocating formerly great sources of innovation and excellence. Where independent and competent scientific and engineering leadership once underpinned the Nation’s security, forced compliance and obedience have bred cultures of mediocrity and pathological avoidance of risk driven by fear of any failure.

An important, but depressing observation about my work currently is that I do what I am supposed to be doing, but it isn’t what is important to be doing. Instead of some degree of autonomy and judgment being regularly exercised in my choice of daily activities, I do what I’m supposed to do. Part of the current milieu at work is the concept of accountability to customers. If a customer pays you to do something, you’re supposed to do it. Even if the thing you’re being paid for is a complete waste of time. The truth is most of what we are tasked to do at the Labs these days is wasteful and nigh on useless. It’s the rule of the day, so we just chug along doing our useless work, collecting a regular paycheck, and following the rules.

Actually that isn’t completely true: I try to carve out time each day to write something. It’s a priority to myself that I exercise almost without fail. I have endeavored to make it a habit. It is actually several habits in one. For example, I write a personal journal each day to capture my innermost thoughts about my life, and it’s an invaluable sounding board for myself. It is writing and that is good in and of itself, but it is completely private. Being private, I don’t share it, or expect anyone else to read it. As such it doesn’t have to be clear or make sense to anyone but me. On the other hand, the blog is intended to be read, although that isn’t its purpose. It serves several purposes: the habit of writing, venting off steam, and working on concepts that are occupying my thoughts. All three things are very important to me, and I make them a priority for each and every day.

I wish that the same concept and discipline could be applied more broadly. Sadly it isn’t, and most of us spend our time doing useless stuff. Most of what all of us are paid to work on is a complete waste of time. I end up spending most of my time communicating between groups of people, negotiating agreements and troubleshooting. I do far too little of the creative technical work that I love. Most of my writing is focused on raging against the forces of mediocrity eating away at any focus on doing something useful, important and high in quality. I recently came to the realization that the first ten years of my career were easily characterized by having my generally high expectations over-delivered upon at work, while the last fifteen years have seen my ever-lowering expectations under-delivered upon in a seemingly endless search for the bottom. I wonder just how much lower my expectations need to drop to match the reality of today.

So, what might be more important than the crap they pay us to do? There are a lot of things in my personal life that qualify as more important than anything at work. This is rather obvious, but sad as the importance of work ought to put up a better fight. The problem with important work is that it is risky and difficult. We might fail at it, and failure is something we can’t bear to do these days. Ultimately we can’t muster the will to do important things because that would make our work real, consequential and meaningful, but also risky.

What’s important?

Judge a man by his questions rather than by his answers.

― Voltaire

The real world is important. Things in the real world are important. This is an important thing to keep in mind at all times with modeling and simulation. We are supposed to be modeling the real world for the purpose of solving real world problems. Too often in the programs I work on this seemingly obvious maxim gets lost. Sometimes it is completely absent from the modeling and simulation narrative. Its lack of presence is palpable in today’s efforts in high performance computing. All the energy is going into producing the “fastest” computers. The United States must have the fastest computer in the World, and if it doesn’t it is a calamity. That this fastest computer will allow us to simulate reality better is treated as a foregone conclusion.

This is a rather faulty assumption. Not just a little bit faulty, but deeply and completely flawed. It holds only under a set of conditions that are increasingly under threat. If the model of reality is flawed, no computer, no matter how fast, can rescue the model. A whole bunch of other activities can provide an equal or greater impact on the efficiency of modeling than a faster computer. Moreover the title of fastest computer has less and less to do with having the fastest simulation. The benchmark that measures the fastest computer is becoming ever less relevant to measuring the speed of simulations. So in summary, efforts geared toward the fastest computer are not very important. Nonetheless they are the priority for my customer.

So what do I do? Being a good little trooper, I work on the fastest computer stuff because it pays the bills.

So what should we be doing instead?

I might suggest that we take a page from the success of the wider computing industry, and figure out how to make computing more ubiquitous in the daily life of a scientist. Well that’s already happened by fiat as scientists are functioning (just barely in many cases) members of society. So, let me be a bit more specific; make modeling and simulation part of the daily life of a scientist or engineer. This is something we as a community are failing at. Progress is being made by pure inertia, but that progress is dismal compared to what it could be.

The reason for the lack of progress is simple: high performance computing is still acting as if it were in the mainframe era. We still have the same sort of painful IT departments that typified that era. High performance computing is more Mad Men than Star Trek. The control of computing resources, the policy-based use and the culture of specialization all contribute to this community-wide failing. We still rely upon centralized massive computing resources as the principal delivery mechanism. Instead we should focus energy on getting computing for modeling and simulation to run seamlessly from the mobile computer all the way to the supercomputer without all the barriers we self-impose. We are doing far too little to simply put it at our collective fingertips and make it a simple and ubiquitous part of what gets done.

Mobile computing is the key, and mobile computing is not embraced by high performance computing. The challenge is to get mobile computing to figure into the workflow of the average scientist. This is to do scientific work, not check Facebook, or email, or right swipe (they’ll continue to do this). The work should seamlessly move between the big iron and the mobile as appropriate. Remember, if Linpack is to be believed, my iPhone is what the Cray-2 was, and certainly capable of doing some real work. If we can create this environment, modeling and simulation would take off and truly meet its potential. Without the change it will continue to be a niche activity, and not fulfill its potential.
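The Cray-2 comparison above is easy to sanity-check for yourself. Below is a rough back-of-the-envelope sketch of my own, not the official Linpack benchmark: time a dense double-precision matrix multiply with NumPy and compare the achieved rate against the widely quoted ~1.9 GFLOPS peak of the Cray-2. The numbers are illustrative, not a formal measurement.

```python
import time
import numpy as np

CRAY2_PEAK_GFLOPS = 1.9  # widely quoted peak for the 1985 Cray-2

def measured_gflops(n=1024, trials=3):
    """Time an n x n double-precision matmul and report achieved GFLOP/s."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(trials):
        t0 = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - t0)
    flops = 2.0 * n**3  # a dense matmul does ~n^3 multiply-adds
    return flops / best / 1e9

if __name__ == "__main__":
    g = measured_gflops()
    print(f"measured: {g:.1f} GFLOP/s "
          f"({g / CRAY2_PEAK_GFLOPS:.0f}x the Cray-2's quoted peak)")
```

On any recent laptop or phone-class chip this reports a rate orders of magnitude past the Cray-2, which is the point: yesterday's supercomputer-class arithmetic now fits in a pocket.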

So what needs to change to accomplish this sort of transformation and success? First, the emphasis on big iron needs to diminish because it takes all the energy available. The trick is to make improvements in modeling, methods and algorithms and enable the more correct and efficient use of computing. Part of the modeling is the practice of modeling and simulation. Today we have generally poor practices, and a deep skewing of modeling by the tendency to adopt hero calculations. These calculations are focused on using the most massive computing available rather than being fitted to the purpose and suitable for assessment. The methods and algorithms can provide fundamental efficiency and scaling that will enable more to be done at the mobile level of computing.
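To make the claim about algorithmic leverage concrete, here is the textbook example (my illustration, not part of the original post): the FFT's O(N log N) work versus the naive DFT's O(N²). No hardware refresh delivers that kind of factor at large N.

```python
import numpy as np

def naive_dft(x):
    """Direct O(N^2) discrete Fourier transform, for comparison only."""
    n = len(x)
    k = np.arange(n)
    w = np.exp(-2j * np.pi * np.outer(k, k) / n)
    return w @ x

# Same answer, vastly different work: check against NumPy's O(N log N) FFT.
x = np.random.rand(256)
assert np.allclose(naive_dft(x), np.fft.fft(x))

# Rough operation-count ratio at a hero-calculation size of N = 1e9:
n = 1e9
print(f"algorithmic speedup ~ N / log2(N) = {n / np.log2(n):.1e}")
```

At a billion unknowns the ratio is roughly 3×10⁷, far beyond what any single generation of hardware can offer, and it runs on whatever machine you already have.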

People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.

― Clayton M. Christensen

One of the modern mantras of business is the “disruptive innovation”. These innovations are sought out as the route to marketplace supremacy, creating a product that changes the business landscape and overwhelms the competition. These disruptive innovations are viewed as only being good things. I will challenge that assertion with the belief that massively parallel computing was a disruptive innovation (a common assertion), but not a positive one. It has been a negative innovation that successfully undermined the broader pipeline of innovation and discovery in high performance computing, replacing it with a slavish devotion to hardware as the route to progress.

Putting increasingly outdated codes on these computers has taken all the available energy and robbed us of key innovations in models, methods and algorithms that would have been more beneficial than all the hardware gains. In addition these innovations would have created bona fide intellectual products that the current scientists would own, understand and benefit from mastering. The computing hardware has become increasingly ill-suited for scientific computing as well as being difficult to use. The next generation of computing promises to be worse in almost every regard. The focus on the massively parallel aspects of implementations has also starved the effort of creating methods that are well suited to modern CPUs. As a result we utilize a tiny fraction of the available computing power of each CPU. In sum the current and future hardware focus has been destructive, wasteful and counter-productive. It may be laying waste to an entire generation of computational scientists.

The high performance computing community would be well advised to choose a different path toward progress before it’s too late. The path should acknowledge the holistic route to progress and the role of prioritizing work that impacts the world beyond computing. It would be inspiring to take a page from the broader computing industry’s embrace of mobile computing as the route to ubiquitous computing that impacts the lives of those who use it in immutable ways. I will take note that the death of Moore’s law for microprocessors and the ascent of mobile computing happened in the same time frame. Worth thinking about: are these events correlated? We should realize that code is simply a computer-recognizable expression of intellectual ideas. It should be understood if it is to be used for anything serious.

All of us should do some serious thought about what is important to work on. I think most of the current research emphasis in high performance computing is focusing on the trivial as the essential, and effectively trivializing the essential. A future where real progress and impact is made depends on changing this dynamic around.

‘Controversial,’ as we all know, is often a euphemism for ‘interesting and intelligent.’

― Kevin Smith

Are we really modernizing our codes?

01 Friday Jan 2016

Posted by Bill Rider in Uncategorized


Real generosity towards the future lies in giving all to the present.

― Albert Camus

It goes without saying that we want to have modern things. A modern car is generally better functionally than its predecessors. Classic cars primarily provide the benefit of nostalgia rather than performance, safety or functionality. Modern things are certainly even more favored in computing. We see computers, cell phones and tablets replaced on an approximately annual basis with hardware having far greater capability. Software (or apps) gets replaced even more frequently. Research programs are supposed to be the epitome of modernity and pave the road to the future. In high end computing no program has applied more resources (i.e., lots of money!! $$) to scientific computing than the DOE’s Advanced Simulation & Computing (ASC) program and its original incarnation, ASCI. This program is part of a broader set of science campaigns to support the USA’s nuclear weapons stockpile in the absence of full scale testing. It is referred to as “science-based” stockpile stewardship, and is generally a commendable idea. It’s been going on for nearly 25 years now, and perhaps the time is ripe (over-ripe?) for assessing our progress.

So, has ASC succeeded?

My judgment is that ASC has succeeded in replacing the old generation of legacy codes with a new generation of legacy codes. This is now marketed to the unwitting masses as “preserving the code base”. This is a terrible reason to spend a lot of money, and it fails to recognize the real role of code, which is to encode the expertise and knowledge of the scientists into a working recipe. Legacy codes simply make this an intellectually empty exercise, making the intellect of the current scientists subservient to the past. The codes of today have the same intellectual core as the codes of a quarter of a century ago. The lack of progress in developing new ideas into working code is palpable and hangs heavy around the entire modeling and simulation program like a noose.

It’s not technology that limits us. We’re the limitation. Our technology is an expression of our intelligence and creativity, so the limitations of our technology are a reflection of our own limitations. We can’t fundamentally advance technology until we fundamentally advance ourselves.

― Christian Cantrell

A modern version of a legacy code is not modernizing; it is surrender. We have surrendered to fear and risk aversion. We have surrendered to the belief that we already know enough. We have surrendered to the belief that the current scientists aren’t good enough to create something better than what already exists. As I will outline, this modernization is more of an effort to avoid engaging in risky or innovative work. It places all of the innovation in an inevitable change in computing platforms. The complexity of these new platforms makes programming so difficult that it swallows every bit of effort that could be going into more useful endeavors.

The prevailing excuse for the modernization program we see today is the new computers we are buying. These computers are the embodiment of the death rattle of Moore’s law. These computers are still the echoes of the mainframe era that died so long ago, but lives on in scientific computing. The whole model of scientific computing is anything but modern; it is a throwback to a bygone era that needs to die. Mobile computing drives computing today, and the true power of computing is connectivity and mobility, or perhaps ubiquity. These characteristics have not been harnessed by scientific computing.

The future is already here – it’s just not evenly distributed.

― William Gibson

Is a code modern if it executes on the newest computing platforms? Is a code modern if it is implemented using a new computer language? Is a code modern if it utilizes new software libraries in its construction and execution? Is a code modern if it has embedded uncertainty quantification? Is a code modern if it does not solve today’s problems? Is a code modern if it uses methods developed decades ago? Is a code modern if it runs on my iPhone?

What makes a code, or anything else for that matter, modern?

For the most part the core of our simulation codes is not changing in any substantive manner. Our codes will be solving the same models, with the same methods and algorithms, using the same meshing approach and the same analysis procedures. The things that will be changing are the coding and implementation of these models, methods and algorithms. The operating systems, system software, low-level libraries and degree of parallelization will all change substantially. The computers we run the codes on will change dramatically too. So at the end of this process will our codes be modern?

The conventional wisdom would have us believe that we are presently modernizing our codes in preparation for the next generation of supercomputers. This is certainly a positive take on the current efforts in code development, but not a terribly accurate characterization either. The modernization program is certainly limited to the aspects of the code that have the least impact on the results, and avoids modernizing the aspects of a code most responsible for its utility. To understand this rather bold statement requires a detailed explanation.

Ultimately if our masters are to be believed, the point of ASC, SBSS and our codes is the proper stewardship of the nuclear weapons stockpile. The stockpile exists in the real, physical world and consists of a decreasing number of complex engineered systems we are charged with understanding. Part of that understanding involves the process of modeling and simulation, which needs a chain of activities to succeed. Closest to reality is our model of reality, which is then solved by a combination of methods and algorithms, which in turn are implemented in code to run on a computer. All of this requires a set of lower-level libraries and software that effectively interface the coded implementation with the computer. Finally we have the computer that runs the code.

Each one of these steps is essential and must work properly to succeed; the needs of each step need to be balanced against the others in a holistic fashion. For example no amount of computer power, computer science, or scaling will ever rescue a code whose models are flawed. If you believe that our models as presently stated are inappropriate to answer the questions facing the stockpile today (and I do), the current program does nothing to alleviate this problem. I believe we have failed to properly balance our efforts, and allowed ourselves to create a new generation of legacy codes to replace the previous one. A legacy code is the opposite of a modern code, but it’s exactly what we have made.

The goal of a life purpose is not what you will create, but what it will make you into for creating it.

― Shannon L. Alder

A major problem with the approach we have taken to computing is the impact on the careers of our staff. Instead of producing a cadre of professionals spanning the full spectrum of the necessary knowledge and skills, we have a skewed base. With the bias toward stewardship by massive computer power, and without emphasis on modeling, methods or algorithms, the development of our scientists and engineers is similarly and unhealthily skewed as well. By not embracing a holistic path with an emphasis on creation and innovation, the development of the current generation of scientists and engineers is stunted. We see our current path perpetuating an unbalanced approach that amplifies its harmful impact by eschewing risky research and avoiding both innovation and discovery in the process. This produces the knock-on effect of killing the development of our staff.

It is notable that this is a New Year’s Day post; so the future is here. Given this, and upon some reflection, a research program isn’t really good enough if it is merely modern; it must be futuristic. Research should be creating the future, not simply living in the present. If research is stuck in the past, the future really can’t be accessed. My concern is that the view of low risk endeavors is severely shaped by what has succeeded in the past. The best way to be successful, at least superficially, is to do what has worked in the past. This seems to be what we are doing in high performance computing. We build the codes that worked in the past and put them on our big mainframes. The truth is we can’t be modern if we are in the past, and we will never create the future.

So this is where we are: stuck in the past, trapped by our own cowardice and lack of imagination. Instead of simply creating modern codes, we should be creating the codes of the future, applications for tomorrow. We should be trailblazers, but this requires risk and taking bold chances. Our current system cannot tolerate risk because it entails the distinct chance of failure, or unintended consequence. If we had a functioning research program there is the distinct chance that we would create something unintended and unplanned. It would be disruptive in a wonderful way, but it would require the sort of courage that is in woefully short supply today. Instead we want to have certain outcomes and control, which means that our chance of discovering anything unintended disappears from the realm of the possible.

With relative ease this situation could be rescued. Balance could be restored and progress could proceed. We simply need to place a greater focus and proper importance on the issues associated with modeling, methods and basic algorithms (along with appropriate doses of experiments, physics and real applied math). Each of these areas is greatly in need of an injection of real vitality and modernity, and offers far greater benefits than our present focus on computing hardware. It is arguable that we have evolved to the point where the emphasis on hardware is undermining more valuable efforts. This would require a requisite reduction in some of the computer science and hardware focus, which is useless without better models anyway.

The core of the issue is the difficulty of using the next generation of computers. These machines are literally monstrous in character. They raise parallelism to a level that makes the implementation of codes incredibly difficult. We are already in a massive deficit in terms of performance on computers. For the last 25 years we have steadily lost ground in accessing the potential performance of our computers. Our lack of evolution in algorithms and methods plays a clear role here. By choosing to follow our legacy code path we are locked into methods and algorithms that are suboptimal in terms of performance, accuracy and utility on modern and future computing architectures. The amount of technical debt is mounting and magnified by acute technical inflation.

I’ll posit an even more controversial idea about massively parallel computers. These machines were a bona fide disruptive innovation, but instead of disrupting positively as this concept usually implies, parallel computing has been destructive. The implementation of standard scientific computing models and methods has been so difficult that more valuable efforts have been decimated in the process. For example numerical linear algebra has been completely static algorithmically for thirty years. The effort to merely implement multigrid on parallel computers has swallowed all of the innovation. The problem is that innovative algorithmic progress could crush the impact of implementations with a single breakthrough. Have we been denied a breakthrough because our effort is all focused on implementation?
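For readers who haven't met multigrid, the algorithm at the center of all that implementation effort is itself remarkably compact; the difficulty described above lives in the parallel implementation, not in the idea. Below is a minimal serial 1D V-cycle for -u'' = f of my own construction, a sketch of what the O(N) algorithm actually is, not anyone's production code.

```python
import numpy as np

def apply_A(u, h):
    """Matrix-free 1D Poisson operator (-u'') with zero Dirichlet boundaries."""
    au = 2.0 * u
    au[1:] -= u[:-1]
    au[:-1] -= u[1:]
    return au / h**2

def smooth(u, f, h, sweeps=3, w=2.0 / 3.0):
    """Weighted-Jacobi smoothing sweeps (damps the high-frequency error)."""
    for _ in range(sweeps):
        u = u + w * (h**2 / 2.0) * (f - apply_A(u, h))
    return u

def v_cycle(u, f, h):
    """One multigrid V-cycle; grid sizes must be of the form 2**k - 1."""
    n = len(u)
    if n == 1:                           # coarsest grid: solve exactly
        return f * h**2 / 2.0
    u = smooth(u, f, h)                  # pre-smooth
    r = f - apply_A(u, h)                # residual
    rc = 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])   # restrict
    ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)          # coarse correction
    e = np.zeros(n)                      # prolong by linear interpolation
    e[1::2] = ec
    e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    e[0], e[-1] = 0.5 * ec[0], 0.5 * ec[-1]
    return smooth(u + e, f, h)           # post-smooth

# Solve -u'' = pi^2 sin(pi x) on (0, 1); the exact solution is sin(pi x).
n = 63
h = 1.0 / (n + 1)
x = h * np.arange(1, n + 1)
f = np.pi**2 * np.sin(np.pi * x)
u = np.zeros(n)
for _ in range(8):
    u = v_cycle(u, f, h)
res = np.linalg.norm(f - apply_A(u, h)) / np.linalg.norm(f)
print(f"relative residual after 8 V-cycles: {res:.2e}")
```

Each V-cycle touches O(N) points, so the whole solve is O(N) work, exactly the kind of algorithmic property that outlives any particular machine.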

The clincher is that the next generation of computing may be even more catastrophically disruptive than the previous one.

To accomplish this we would have to turn our back on the mantra of the last quarter century: we just need a really fast computer (preferably the fastest one on Earth) and the stockpile will be OK. This mindset is so incredibly vacuous as to astound, but the true epitome of modernity is superficiality. The view that a super fast computer is all we need to make modeling and simulation work effectively is simplistic in the extreme. In modern America simplistic is what sells. Americans don’t do subtlety, and the current failures in high performance computing can all be linked to the difference between the simplistic messaging that gets funding and the subtle messaging of what would be effective. Our leaders have consistently chosen to focus on what would get funded over what would be effective. We cannot continue to make these choices and be successful; the deficit in intellectual depth will come due soon. Instead of allowing this to become a crisis we have the opportunity to get ahead of the problem and change course.

The best way to predict your future is to create it.

― Abraham Lincoln

The real goal should not be modernizing our codes; it should be creating the codes of the future. First, we must throw off the shackles of the past and refuse to perpetuate the creation of a new generation of legacy codes. The codes of the future should solve the problems of the future using futuristic models, methods and algorithms. If we keep our attention on the past, promoting the continued preservation of an antiquated code base, the future will never arrive. Simply implementing the codes of the past so that they work on new computers is merely a useful exercise for proving out concepts. These computers purchased at great cost should be looking forward, not back, with fresh eyes and new ideas for solving the problems ahead of us, not yesterday’s.

Today’s science is tomorrow’s technology.

― Edward Teller

We owe the future nothing less. The future is in our hands; we can make it into what we want.

You realize that our mistrust of the future makes it hard to give up the past.

― Chuck Palahniuk

The Unfortunate Myth of the Hero Calculation

25 Friday Dec 2015

Posted by Bill Rider in Uncategorized

≈ 3 Comments

Legends don’t have to make sense. They just have to be beautiful. Or at least interesting.

― Terry Pratchett

Along with the purchase of a big computer and the press releases trumpeting its massive power comes another form of systemic bullshit, the heroic calculation. Such heroic calculations provide massive insight into the World through this profoundly important scientific instrument, or so we are told. It is the largest simulation ever of these particular exotic phenomena and provides new perspectives as surely as the expensive new microscope or powerful telescope. The problem with these massive calculations is that this profound capability is a myth and largely complete and utter bullshit. The heroic calculation is an enormous pain-in-the-ass to perform, which justifies the heroism, but the utility of the calculation is completely over-stated. It is a stunt to “show” that the computer works, and sell its importance through propagating bullshit.

The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.

― Bertrand Russell

The calculation is usually characterized by how many computational elements it contains, especially if it sounds like a lot (billions, so it must be great!), and how many floating-point operations it took (trillions and trillions, so of course it's pretty awesome). These big numbers are completely meaningless and simply exist to sell the massiveness of the whole thing to unwitting lay people. Worse, they say nothing whatsoever about the quality or utility of the calculation. These massive hero calculations are rarely assessed in any systematic way that defines any sense of uncertainty or accuracy. They are simply great because they are so fucking massive and huge. They are great because they are computed on this massive computer that cost a shitload of money too.

These calculations do have a utility in their trailblazing nature, but it is limited. These calculations shake out the kinks from using a new computing platform, or working at the scale that these calculations demand. As vehicles for doing science they are generically terrible, and the press releases touting the scientific achievements are bullshit. Instead the honest view of what is achieved is a voyage into the wilds of computing. The computation is done with shaky code, shaky compilers, and shaky operating systems on a shaky computer. They are achievements of will and patience too, but not science or engineering. Some day, five years down the road, calculations of the scale performed by these heroic efforts will become common. This is their achievement and it is important, just not to the level we promote today.

Perseverance is the act of true role models and heroes.

― Liza M. Wiemer

These calculations are largely just marketing for the expensive computers. As vehicles for science these calculations are largely useless. Science needs to be done properly. Proper science does not mean one big massive calculation; proper science means a whole bunch of smaller calculations whose purpose is to carefully study a problem and determine the quality of the simulation. One calculation is never the answer; it is the ensemble of many calculations that makes for good computational science. If you see a massive calculation touted as a scientific achievement your mind should immediately ask the question, how do I know it's any good? Are those conducting said big calculations careful and self-critical?
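The ensemble idea can be sketched in a few lines. This is a toy illustration, not any particular production workflow: the model, its uncertain input range, and the ensemble size are all hypothetical stand-ins. The point is that many cheap runs yield a quantity of interest *with a spread*, which one giant run never does.

```python
import random
import statistics

def toy_model(rate, steps=100, dt=0.01):
    """A deliberately cheap stand-in for a simulation: exponential decay
    integrated with forward Euler. Returns the final state."""
    y = 1.0
    for _ in range(steps):
        y += dt * (-rate * y)
    return y

def run_ensemble(n_runs=200, seed=42):
    """Run many cheap calculations with the uncertain input (the decay
    rate) sampled from its assumed distribution, then summarize."""
    rng = random.Random(seed)
    outputs = [toy_model(rng.uniform(0.8, 1.2)) for _ in range(n_runs)]
    return statistics.mean(outputs), statistics.stdev(outputs)

mean, spread = run_ensemble()
print(f"QoI = {mean:.3f} +/- {spread:.3f}")
```

The single hero run would report one number; the ensemble reports a best estimate and an uncertainty band, which is what makes the result usable.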

The bigger the dream, the better the story.

― Richelle E. Goodrich

A worse side effect of the hero calculation is the distortion of the scientific process in the eyes of both the public and the scientific community at large. Far too much modeling and simulation is conducted in this hero mode. Unfortunately, the hero calculation's standard of quality has become the common standard for how computational science proceeds. It makes for lousy science and worse engineering. It is common to see the heroic stunt calculation lauded by management as a stunning achievement despite it being an intrinsically poor way to conduct simulation and modeling. These stunts rarely have any verification, validation or uncertainty quantification done on them. They are simply supposed to be accurate by virtue of their massiveness. They are simply the money shot for the massive computer we just spent so much on.

The problem is the nature of quantifying the uncertainties and accuracy of a modeling exercise conducted through simulation. These hero calculations are one-off exercises. They are not part of simulation campaigns that produce results that might be utilized effectively as part of quality work. They can be useful “what if” explorations of phenomena or hypotheses, but whatever the hero calculations find needs to be scrutinized heavily. A full spectrum of verification, validation and uncertainty quantification is needed to bring the hero calculation's findings into repute. This process is science or engineering done right, and it's too boring for the sort of bullshit press releases we market our programs through.

A good example comes in the creation of models for complex systems. The common approach is to hand craft the model to utilize every bit of available resource and include every detail possible. This creates models that can only be run once or maybe a handful of times. The modeling approach does not lend itself to the creation of consistent models that have a coarser representation. This coarser representation could be used to assess accuracy (through solution verification), or conduct detailed uncertainty estimation. The detailed model simply can’t be used this way. It uses so much of the available computing resource that it can’t be refined either.
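To make "solution verification" concrete: with results from coarser, consistent representations on systematically refined grids, one can estimate the observed order of convergence and extrapolate toward the grid-converged answer. The sketch below uses made-up numbers for a hypothetical second-order method; it is the standard Richardson extrapolation procedure, not output from any real code.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Estimate the observed order of convergence p from solutions on
    three systematically refined grids (refinement ratio r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Richardson extrapolation: estimate the grid-converged value."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Hypothetical results from three resolutions of the same model
# (coarse, medium, fine); a second-order method should give p close to 2.
f_h4, f_h2, f_h1 = 1.48, 1.495, 1.49875
p = observed_order(f_h4, f_h2, f_h1)
f_exact_est = richardson_extrapolate(f_h2, f_h1, p)
discretization_error = abs(f_exact_est - f_h1)
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact_est:.5f}")
```

This is exactly the assessment the all-consuming hero calculation forecloses: if the model can only run once at one resolution, neither p nor the discretization error can be estimated.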

Don’t try to follow trends. Create them.

― Simon Zingerman

It may be the best single calculation that might be conducted, but its quality is unknown. The key question is what is important: the unknown quality of the single calculation (a presumed-to-be-good one) or a sequence of presumably lower quality, but fully assessed calculations? The scientific method would come down firmly on the side of the assessed lower quality simulations. Their assessment makes them useful and reliable, while the presumably superior single calculation is an unknown entity.

In other words, scientists don’t concentrate on what they know, which is considerable but also miniscule, but rather on what they don’t know. The one big fact is that science traffics in ignorance, cultivates it, and is driven by it.

― Stuart Firestein

In the end we need to wean ourselves from our addiction to the fiction of the hero calculation. These calculations make for good press releases, but lousy science and even worse engineering. They distort modeling and simulation into something it is not, and certainly something it shouldn’t be. It’s high time to kill the hero, reject the legend and replace them with hardworking foot soldiers. We need to recognize that the hero calculation has a limited value, which is primarily associated with its trailblazing character in computing, but not its scientific value.

Superficiality is the curse of the modern world.

― Matthew Kelly

ISIS and the American Right have the same Goals

18 Friday Dec 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Or

A Social Revolution is taking us to the Edge of Civil War

We’re collectively living through 1500, when it’s easier to see what’s broken than what will replace it.

― Clay Shirky

I was talking to some people a week ago about the times we live in. I wondered aloud if this is what the 1960's felt like, with societal upheaval seemingly everywhere. Are we in the midst of a similar level of upheaval that will leave a much different World in its wake? Looking at the World, the signs of major change and the inevitable reaction to it are everywhere. The ubiquity of the Internet through mobile computing is changing society in a myriad of ways. Demographics are approaching tipping points while the inequality of wealth is at or over an economic tipping point. Open bigotry, racism and class struggle are erupting across the World. Fear mongering and demagoguery are the stock in trade of politicians seeking to benefit from the under-currents of discomfort. The populace is afraid and the political opportunists are poised to take advantage.

Terrorism and ISIS are simply symptoms of the times rather than driving forces. The same goes for the rise of right wing extremism in the United States (and Europe); it is a symptom of the times. They seem to be driving the issues and debate, but instead should be viewed as the most glaring evidence of the conflict poised to erupt. Both of these major problems are simply rather natural reactions to the upset of the numerous equilibria we once relied upon. Numerous forces have destabilized our society and reactionary movements are to be expected as a result.

Change is not what we expect from religious people. They tend to love the past more than the present or the future.

― Richard Rohr

My conclusion is that today’s World is in the midst of changes and conflict that may be larger, more dangerous and more pervasive than anything we have seen for centuries. The 1960’s were tranquil by comparison to the storm we are living through.

The essence of Conservatism is fear: fear of what is not understood, fear of change, fear of the new, fear of what’s different, fear of the other.

― Gene Pozniak

While the two symptoms would seem to be at odds with each other, they are actually allies: forces of conservatism trying to overthrow the vast changes pushing forth out of demographics, economics and technology. Both ISIS and right wing extremism have strong religious elements combined with a disdain for the new liberalism arising everywhere. The new societal structures emerging are terrifying to the traditional elements in society. Nothing spurs a conservative response like feminism, sexuality and gender. Both have elements of violence in their approach, pervasive with ISIS and still on the fringe with right wing extremists, but becoming more normative. While both use technology to varying degrees in furthering their cause, ultimately technology is one of the elements driving these forces toward reactionary extremist views.

Through the open warfare conducted by the fearful right wing in the West and terrorism from the Islamic World, we see effective combat against the forces of change in society across the World. Whether it is directly through violence as with ISIS, or the rise of the National Security state in the West, the open, pluralistic and secular society is driven back into the shadows. In many respects the impact of terrorism is similar to the military and law enforcement response to it; the freedom of the populations exposed to it is reduced. It acts to squash the liberalization of society by violence or the response to it. Fear of terrorism is the greatest impact of terrorism and its most powerful weapon. The right wing reactionary movements in the West act to empower terrorists by stoking the elements of fear in society. These right wing movements use fear to further their own goals for pushing back against secular elements.

When we change the way we communicate, we change society

― Clay Shirky

In this way ISIS and the right are allies. They both seek to drive social progress backwards and revert the rules of life back to the Middle Ages. They hate each other because of bigoted religious views held on both sides, but socially their aims are quite similar. Both seek to impose religious rule at home and across the world, largely in reaction to the discomfort arising from the vast changes in society that make them uncomfortable. Religion in this case is simply a tool of the fearful to drive away the changes that drive their discomfort. Perhaps nothing stokes the fear of the reactionary elements like sex. Whether it is the empowerment of women in society toward equality, freedom of sexual expression, normalization of homosexuality, broader changes in the definition of marriage, or gender, these conservative elements respond with anger, violence and fear.

Our social tools are not an improvement to modern society, they are a challenge to it.

― Clay Shirky

Of course this societal Molotov cocktail is fueled by more than just sexuality; it has an economic element to push it over the top. We have an economic system that is near crisis. In the United States the inequality of wealth is near an all-time high. In the past this level of inequality has preceded massive upheavals. Society stands on the brink of tumbling into despotic governance or rampant corruption, if it isn't already there. The rich know this and work actively to control the levers of power to hold onto and increase their ill-gotten wealth.

[N]ew technology enables new kinds of group-forming.

― Clay Shirky

In the Islamic World this sort of inequality is epidemic and provides a fertile breeding ground for the sort of political-religious extremism that ISIS is. Nothing would defuse this ticking time bomb better than economic equality and education. Ignorance and poverty are the fuel for the violent extremism we see. Moreover, the United States in particular and the World in general are well on their way to creating the same elements within themselves due to corrupt and incompetent governance. This volatile mixture is itself the recipe for revolution or societal free-fall, but even more problems combine to make these times uniquely dangerous.

How is the United States at once the most conservative and commercial AND the most revolutionary society on Earth?

― Christopher Hitchens

Technology allows new things to happen: new ways of communicating, new ways of living, and new ways of working. Each of these threatens the status quo and stokes discomfort, especially among those not ready for change. In many respects technology has the impact of enabling inequality to grow, but also of democratizing the masses and allowing the liberalization of many social norms. The rich and powerful have the advantage of training, education and the means to buy technology, but the disadvantage of being conservative and not accepting change. The key to the discussion is the ubiquity of technology's capacity to create the environment for change and the threat it poses to the foundation of our lives.

One of the biggest changes in our society is the shift from prevention to reaction…

― Clay Shirky

This has played out in the Arab Spring. Social media played a role in powering a popular uprising, only to see much of the progress undone by reactionary counter-movements. To a large extent the cancer that is ISIS arose from this environment. Just as ISIS uses social media to spread its message, the populist movements similarly utilize this means to boost their power base. While the use of social media by ISIS is disquieting, the power of social media is strongly slanted towards being a force for liberalizing social structures across the World. To win the fight against forces like ISIS we must simply have faith that the forces of light and good are stronger than the dark. Social media is like everything in humanity; it can be used for good or evil in equal measure. Those who would destroy its power because of fear act in a manner that makes them evil. The fear used by and powering the right wing reactionary movement is itself an evil that must be defeated.

Collaboration is not an absolute good.

― Clay Shirky

The final element in this recipe for crisis is environmental danger. There is a combination of issues that will produce tremendous opportunities for warfare, violence and chaos. Climate change is an active element in producing and exacerbating economic and political problems Worldwide. Shortages of food, water and energy create wars and fuel elements of extremist movements. Every aspect of society that reacts poorly to social change is driven further by environmental crises. Together these elements produce a deep well of danger for much of the World. At the same time we see the right wing reactionary apparatus geared to denying the impact of environmental change. To a large degree this reaction is dominated by the business element's desire to not kill the goose that is laying golden eggs.

What is changing in society? We are in the midst of massive changes in how our social structures work.

At the same time as this massive social change, we see economic conditions that are primed for conflict. The levels of inequality both nationally and internationally have evolved to a level that threatens the stability of society by themselves. We already dodged a bullet seven years ago when our financial system almost collapsed. We have done far too little to assure that the sort of bubble that created that problem cannot happen again. Instead we are primed for another crisis. In the United States the steady decline of the middle class is producing an increasingly stratified country. Levels of poverty are growing and the educational system that serves to buoy citizens against the downward forces is falling apart. These economic conditions produce a real source of fear and are useful for collecting people into fear-driven angry mobs.

The increasing level of corpulent wealth among the rich is working to remove resources from society as a whole, and pitting the rich against the poor. The rich are growing in their ability to control the levers of political power, primarily to attempt to stabilize the economic and legal systems in a manner that benefits the rich. The fear of change, terrorism and race is a perfect vehicle for the rich to exploit. All of these things create a recipe for an angry mob under the control of the rich and powerful. Right now the angry mob is supporting Donald Trump, and isn't exactly acting as the rich (aside from Trump) would like. The rich may get the last laugh and put one of their own puppets in control of the mob. Overall it is simply another element pointing toward a chaotic outcome.

The social fabric of society is also changing in ways that stoke the fears of the populace. Technology is changing how we interact with one another in a myriad of ways. Facebook is the simplest and perhaps most innocuous of these forces. Other aspects of the Internet and mobile apps have a greater degree of impact on society. How relationships form, develop and proceed is undergoing fundamental changes. The nature of the family unit, marriage and our basic identity as adults is evolving. Your tribe no longer depends upon locality, but now can form remotely and far more specifically around a person's interests and desired social niche.

The deepest penetration of this change has been seen in the societal adoption of gay marriage, but it is the tip of a societal iceberg. It is driven by a massive change in societal comfort with homosexuality. Almost everyone now knows an openly gay person, and sees them as someone whose rights need to be respected. Beyond gay marriage, movements are afoot to normalize polyamorous relationships, open marriage, and other forms of family and relationship dynamics. Online and mobile communication allows new communities to form without intersecting with traditional mediums of social structure.

The sexual freedom that provided one of the key elements of the 1960's is undergoing a new renaissance revolving around mobile computing. Mobile apps and new forms of dating like Tinder and OKCupid are the most innocuous forms of these changes in the relationship landscape. The forces of conservatism are horrified and fearful of these changes, and are mobilizing to push back against them. All of this is ripe for deep and wrenching conflict on new social battlefields. Just as the forces of liberalization utilize modern technology, the forces arrayed against change organize themselves with modern technology too (ISIS is an example!).

All of this comes together to challenge the traditional views of what constitutes moral behavior. Discussions of morality become the fuel for the right wing reactionary movements. This produces a deep societal tension between those who are freed from personal prisons through the liberalization of morality and those whose power is magnified by the older moral norms. LGBT issues are a common lightning rod for polarization of the two sides of the equation. While LGBT rights are the epitome of the tensions around morality, they are the tip of the iceberg. All the elements in the modern World conspire to create quandaries to be solved and challenges to traditional morality.

In the United States and Europe right wing extremism is a reaction to all of the change. There is a pervasive social, religious and racial basis to these reactionary movements. They are responding to all of the changes seen across society at large. The combination of quickly growing minority populations in Europe and the impending minority majority in the United States has the traditional white majorities terrified. The combination of gender, sexual and social changes in the structure of families and relationships are furthering the depth of fear of change. All of this is driven by a combination of demographic shifts, technology and generational changes that are inevitable. This does not mean that the forces of status quo will not fight back. The resistance to change will be mighty and the potential for violence is rather extreme.

ISIS is another reactionary force. It is a reaction to broad-based economic deprivation and widespread corruption in the Islamic World. Of course the West has had a hand in creating the elements in Islamic society that have produced ISIS. The form of extreme religious views is simply a detail, not the core of the problem. ISIS is not an Islamic movement; it is driven by large scale poverty, lack of opportunity and powerless masses. It derives its power and manpower base from the millions of poor, uneducated and desperate masses in the Islamic World. The elements of religious extremism play to a lack of economic power and an uneducated populace, whether it is Islamic or Christian.

The irony of ISIS is how alike it and the American and European right wing movements are. They should be allies, and thankfully they cannot be. The truth is that the right wing movements in the West help ISIS with every bomb they drop and every right they suppress. The liberalization of the West is the single greatest weapon against ISIS. Feminism, secular society, mass media and sexuality all work to undermine the right wing, whether it is Christian-based or Islamic. Our movies, television, Facebook and yes, our porn all work to hurt ISIS. When the right wing religious zealots in the West attack any of these liberal forces, they help ISIS. The right wing in the West would like to unplug everything that would act to undermine ISIS.

Change is here; change is upon us. There is no going back, but there is going to be deep conflict around all these beachheads. The forces of inequality and traditional society will not go away quietly. Traditional religion, gender, family and inequality will not relinquish their power without a fight. The changes in society are giving power to the disenfranchised, and they will not go back into the shadows without a fight either. This is a recipe for conflict, warfare and perhaps even revolution. We may be set for a period of change in society that will make the 1960's look like a peaceful serene episode in history.

This creates the recipe for war.

Real generosity towards the future lies in giving all to the present.

― Albert Camus

Gallery

Bullshit is Corrosive

10 Thursday Dec 2015

Posted by Bill Rider in Uncategorized

≈ 1 Comment

This gallery contains 2 photos.

  A lie that is half-truth is the darkest of all lies. ― Alfred Lord Tennyson Note: Today’s post contains a …

Continue reading →

Science today is timid, short-term and utterly unimaginative

04 Friday Dec 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Judge a man by his questions rather than by his answers.

― Voltaire

It probably isn’t much of a stretch to say that science takes on the characteristics of the era in which it is conducted. When I look at science today, it seems to reflect the basic character of the time we live in: timidity born of baseless fear, short-term thinking and goals with a general lack of imagination. Science like many things in society is swept up in the overall sea of culture and experience. Perhaps it might be too much to ask for science to have a differing tone than other aspects of our technicaldebtWorld. On the other hand, the downsides to the current milieu with regard to science are stark and obvious. Science is in complete disarray and we are to blame.

For example, I have worked for two decades largely on a National project with enormously aggressive goals and sweep: Science-based Stockpile Stewardship (SBSS). This is the answer of the United States to the cessation of nuclear weapons testing and an important cornerstone of the effective test ban. To succeed it must be executed in a bold and fearless fashion with an unerring long-term vision aided by expressive creativity. The reality is so different. Our national leaders declare success without any basis in truth. It is part of the overall problem where the actual definition of success sows the seeds of failure. The actual program falls so far short of its vision as to be nearly comical if it weren't so important.

It is hard to fail, but it is worse never to have tried to succeed.

― Theodore Roosevelt

Why then is the reality so different from this expression of the character of success? For all of its vision and importance, SBSS is caught up in the realities of today and the political nature of science today. As our political climate has become a festering and poisonous environment, the conduct of science has taken on a similar air. Part of this issue comes from the public funding of science, which is an absolute necessity considering the utter wasting away of corporate science. The imposition of short-term thinking as the principal organizing framework for industry has obliterated basic research at the corporate level. Unfortunately the corporate mantra has been adopted by the government as a way of improving the quality of governance. This has simply multiplied the harm done by short-term thinking and its ability to ravage any long-term accomplishments.

As a nation we can't even take care of our roads, bridges and airports in a reasonable fashion because their payoff is not realized immediately. The undoing of the Nation's infrastructure and the failure to invest in its 21st Century equivalents is a clear and present danger to the economic future of the United States. Yet inaction rules the political response to the obvious need for investment. Science, whose benefits are far more ephemeral, is far more difficult to manage in a similarly long-term fashion. Like infrastructure, the investment in science is utterly inadequate, and the destructive management environment imposed on it then compounds this inadequacy. Everything from politics to business operates in a completely short-term, immediate-payoff fashion. Almost everything must show progress and payoff quarterly, and all long-term interests are sacrificed at this altar.

The truth is that short-term thinking is bad for business, bad for science, bad for careers, bad for everyone except those at the top. It only benefits activities like finance as a way of powering their moneymaking shell game. It benefits the very rich and their rent-seeking behavior. We get sold a complete line of bullshit in calling all this finance “investment” when it is simply moving money around to make more money. The middle class has been sold on this strategy through their retirement accounts, but this is the equivalent of a bribe as it only buys their acceptance of a system that harms the middle class in the long term. The only interest that truly benefits is the status quo that is locked in by this focus. For science, the short-term thinking is simply destructive and a recipe for mediocrity and lack of progress. Again, the problems will only be seen in the long term, when the lack of scientific progress will harm our descendants.

Despite this obvious downside of defining all benefit in a short-term fashion, everyone has bought this idea hook, line and sinker. The quarterly-report-driven attitude is driven by a false belief that it is good for business. All it is good for is driving the stock market and moving money around in the giant shell game that finance is today. In the process we have systematically destroyed many business interests and undermined the economic well-being of the entire World. As bad as this is, the damage to the conduct of science is greater. It has destroyed the scientific vitality of corporate laboratories. Discoveries that could have been the basis of the future economy are now being driven out of existence. Our World will be poorer, and lives will be shorter in the long run, due to the shortsighted, risk-averse and fear-driven policies of today.

Only those who dare to fail greatly can ever achieve greatly.

― Robert F. Kennedy

For me, I see my life and career playing out in this shitty time. I could have been part of something bold and wonderful, with science providing the backbone of an important societal endeavor. Instead we are destroying research institutions through utter neglect. We are wasting careers and lives in pushing down risks due to irrational fears. All of this is done in service to short-term thinking that benefits the rich and powerful. No one else benefits from the short-term thinking, no one; it is simply a vehicle for empowering the status quo and assuring that the identities of those on top do not change. It is the straitjacket through which lack of social mobility arises.

How did all this happen?

All one has to do is look at the political environment today. Americans, and perhaps the entire Western world, have never been more fearful and afraid. At the same time the World has never been more peaceful. Our society and so-called leaders amp up every little fear and problem into a giant boogieman when the actual reality is completely opposite. We have never been safer than today. This is true even with the orgy of gun violence in the United States. The powers that be use the tiny danger of terrorism to drive the forces of the status quo while utterly ignoring larger dangers (like firearms). The truth is that we have never had less to fear. Yet fear is the driving force politically, used to sell fear-spewing candidates to a weak, shivering populace. Fear sells products to people, whether it is drugs, cars, media content, guns, or almost anything else. Our mass media is simply a tool of its corporate overlords, with the ultimate design to enslave us to the status quo. Our society runs on fear, and fear is used to enslave the populace. Science is simply an innocent bystander slain in the societal drive-by.

If you’re not prepared to be wrong, you’ll never come up with anything original.

― Ken Robinson

The consequence of this constant fear-mongering and risk avoidance is a societal timidity. This timidity is a slow-motion surrender to the forces of stasis and decay. In almost all endeavors fortune favors the bold and aggressive, yet everything in the system discourages such attitudes. We have a socially imposed lack of confidence born of fear and enforced by systematic cowardice. The result is a broad-based diminishment of accomplishment. All the while the forces of the establishment openly crow about their successes and achievements publicly while privately undermining every attempt to actually produce progress. We have reaped a World that makes Orwell's vision in 1984 seem remarkably prescient.

To produce the sort of progress we are capable of the system needs to support it. Fear must be tamped down and reduced, and risks should be actively encouraged. We should seek aggressive and bold paths forward perhaps approached with some degree of reckless abandon, or the sort of over-confidence that allows the impossible to be achieved. Rather than squash progress at every opportunity, it needs to be fertilized and nurtured. Today’s systems effectively strangle the infant progress in its crib, aborting virtually every attempt to do anything that breaks the mold. You never hear a proposal being criticized for not being risky enough, only being too risky.

In the process we stay within the realm of the known and keep the unknown at bay. The unknown is where progress lives along with risk and fear. Without courting failure we cannot produce anything new. Today’s management looks for everything that could go wrong with a bold research path, and rarely looks for what could go right. We seek that safe and obvious incremental path because it will almost surely succeed with a little bit of competent effort.

This thinking infests the approach to high performance computing, where a tried and true path of relying upon Moore's law has powered modest improvements for decades. At the same time we have avoided progress in other areas of computing with greater benefits, but also greater risks and higher probabilities of failure. A prime example of this disservice can be found in numerical linear algebra, where the solution of sparse systems has stagnated for decades. All the effort has been consumed by moving the existing methods to the new computing platforms, and little or nothing on improving the methods themselves. Orders of magnitude in performance improvements have been sacrificed to fear and risk avoidance. Let's not forget that the principal beneficiaries of the current supercomputing program are computer vendors, who will receive great sums of money to produce the monstrous computers being contemplated. These horrible machines will sap the resources left over to actually use them and simply compound the stasis already evident in the field.
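The scale of what better algorithms buy can be sketched with a toy comparison of my own construction (not drawn from any particular code): solving the same one-dimensional Poisson system with the classic Jacobi iteration versus conjugate gradients. The iteration-count gap is already enormous at this tiny size, and it widens further with problem size or with multigrid.

```python
import numpy as np

def poisson1d(n):
    # 1-D Poisson (tridiagonal) matrix: 2 on the diagonal, -1 off-diagonal
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def jacobi(A, b, tol=1e-8, maxit=200000):
    # Classic Jacobi iteration: x <- D^{-1} (b - R x)
    D = np.diag(A)
    R = A - np.diag(D)
    x = np.zeros_like(b)
    bnorm = np.linalg.norm(b)
    for k in range(maxit):
        x = (b - R @ x) / D
        if np.linalg.norm(b - A @ x) < tol * bnorm:
            return x, k + 1
    return x, maxit

def conjugate_gradient(A, b, tol=1e-8, maxit=10000):
    # Unpreconditioned conjugate gradient for a symmetric positive-definite A
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    bnorm = np.linalg.norm(b)
    for k in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * bnorm:
            return x, k + 1
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, maxit

n = 64
A = poisson1d(n)
b = np.ones(n)
_, it_jacobi = jacobi(A, b)
_, it_cg = conjugate_gradient(A, b)
print(f"Jacobi: {it_jacobi} iterations, CG: {it_cg} iterations")
```

Jacobi here needs thousands of sweeps where CG needs dozens, and that is exactly the sort of multiplier that algorithmic research, rather than hardware, delivers.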

Perhaps no area of modeling and simulation has been more neglected than the physical models used. We are on a decades-long trajectory in which the basic assumptions of the physical models used in simulation have been fixed. Despite rapidly accumulating evidence of the utter lack of applicability and appropriateness of many models forming the basis of codes, the models remain fixed. In a number of cases key assumptions, such as separation of scales, have broken down, yet no attempt has been made to overhaul the models. If a model is flawed no amount of raw computing power can save it, yet we are devoting massive resources and political will to increasing raw computing power.

It is quite evident that the key to successful modeling and simulation is not found in computer power, but rather in a variety of other activities. Rather than pursue a path that leads to greater success, but requires greater risk and more opportunity for failure, we pursue the path that seems safe. We need to focus on models, methods and algorithms along with innovative uses of computing. Instead we hold all of these aspects fixed while pursuing a host of rather safe and pedestrian activities that will do little to improve science. The lack of bold visionary leadership in scientific computing is somewhere between depressing and pathetic.

In a very clear way we are taking enormous risks with our future. We are accumulating massive long-term risk by consistently taking the low-risk short-term path. This is clearest when examining the state of careers in science. Once we allowed people to aggressively pursue research with a high chance of failure, but the possibility of massive payoffs. Today we timidly pursue incremental progress, yet view this as enormously risky. The greatest risk of the continued pursuit of a computing-hardware-driven path in high performance computing is the destruction of promising scientific careers, and the destruction of a balanced program for advancing modeling and simulation. Make no mistake, the current approach to modeling and simulation is completely unbalanced. It is timid. It lacks creativity. It lacks vitality. It is not science-based; it is fear-based. It is the result of an unhealthy fixation on short-term thinking about progress.

I work in computing, so I see the field rather fully. I am fairly certain that the attributes seen in computing are broadly applicable to science in general. In areas close to computing and essential for progress, the same symptoms are there. By all accounts, experimental science is in even worse shape than computational science; theoretical physics even more so. The failings of experimental and theoretical science are profoundly evident in the lack of any vibrancy or vitality in modeling for computing, as they are the source of change. In both cases the state of these fields should roundly condemn SBSS to utter and complete failure. Those in power have declared success in SBSS, but the evidence is all to the contrary.

Perhaps my greatest concern is that these issues are all embedded within a societal environment that shows no sign of changing without great upheaval. Such an upheaval would be enormously painful, but perhaps greatly overdue. The last upheaval of such a magnitude was the 1960s, and it probably created the environment we have today. The level of income and societal inequality we have today is unsustainable, and probably creates a social instability that will sooner or later explode. Perhaps we are seeing the beginnings of this, and it might be the best thing in the end.

The scientist is not a person who gives the right answers, he’s one who asks the right questions.

― Claude Lévi-Strauss

Calibration, Knobs and Uncertainty

27 Friday Nov 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

When the number of factors coming into play in a phenomenological complex is too large scientific method in most cases fails. One need only think of the weather, in which case the prediction even for a few days ahead is impossible.

― Albert Einstein

One of the dirty little secrets of computing in the scientific and engineering worlds is the fact that the vast majority of serious calculations are highly calibrated (and that's the nice way to say it). In many important cases, the quality of the "prediction" is highly dependent upon models being calibrated against data. In some cases calling these calibrated constructs "models" does modeling a great disservice; the calibration instruments are simply knobs used to tune the calculation. The tuning accounts for serious modeling shortcomings and often allows the simulation to produce results that approximate the fundamental balances of the physical system. Often, without the calibrated or knobbed "modeling" the entire simulation is of little use and bears no resemblance to reality. In all cases this essential simulation practice creates a huge issue for proper and accurate uncertainty estimation.

Confidence is ignorance. If you’re feeling cocky, it’s because there’s something you don’t know.

― Eoin Colfer

At some deep level the practice of calibrating simulations against data is entirely unavoidable. Behind this unavoidable reality is a more troubling conclusion: our knowledge of the World is substantially less than we might like to freely admit to ourselves. By the same token, the actual uncertainty in our knowledge is far larger than we are willing to admit. The sort of uncertainty that is present cannot be meaningfully addressed through a focus on more computing hardware (its assessment could be helped, but not solved). This uncertainty can only be addressed through a systematic effort to improve models and engage in broad experimental and observational science and engineering. If we work hard to actively understand reality better, the knobs can be reduced or even removed as knowledge grows. This sort of work is exactly the sort of risky thing our current research culture eschews as a matter of course.

Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.

― Bertrand Russell

This area of modeling and simulation is essential, to varying degrees, to many fields. If we are to advance our use and utility of modeling and simulation with confidence, it must be dealt with in a better and more honest way. It is useful to point to a number of important applications where calibration or knobs are essential to success. For airflow over airplanes or automobiles, turbulence modeling is essential, and turbulence is one of the key areas for calibrated results. Climate and weather modeling is another area where knobs are utterly essential. Plasma physics is yet another area where the modeling is so poor that calibration is absolutely necessary. Inertial and magnetically confined fusion both require knobs to allow simulations to be useful. In addition to turbulence and mixing, various magnetic or laser physics add to the problems with simulation quality, which can only be dealt with effectively through calibration and knobs.

You couldn’t predict what was going to happen for one simple reason: people.

― Sara Sheridan

The conclusion that I've come to is that the uncertainty in calibrated or knobbed calculations has two distinct faces, each of which should be fully articulated by those conducting simulations. One is the best-case scenario of the simulated uncertainty, which depends on the modeling and its calibration being rather complete and accurate in capturing reality. The second is the pessimistic case where the uncertainty comes from the lack of knowledge that led to the need for calibration in the first place. If the simulation is calibrated, the truth is that the calibration is highly dependent upon the data used, and guarantees of validity depend on matching the conditions closely associated with the data. Outside the range where the data was collected, the calibration should carry with it greater uncertainty. The further we move outside the range defined by the data, the greater the uncertainty.

This is most commonly seen in curve fitting using regression. The curve and the data are closely correlated, and the standard uncertainties are relatively small. When the uncertainty is taken outside the range of the data, it grows much larger. In the assessment of uncertainty in calculations this is rarely taken into account. Generally those using calculations like to remain blithely unaware of whether the calibrations they are using are well within the range of validity. Calibration is also imperfect and carries with it errors intrinsic to the determination of the settings. The uncertainty associated with the data itself is always an issue when taking either the optimistic or the more pessimistic face of uncertainty.
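The growth of uncertainty outside the calibration data can be made concrete with the standard prediction-interval formula for a least-squares line. The sketch below uses synthetic data of my own choosing (a line plus noise on [0, 1]) and evaluates the prediction standard error both inside and well outside that range.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)                 # calibration data lives on [0, 1]
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)

slope, intercept = np.polyfit(x, y, 1)        # ordinary least-squares line
resid = y - (slope * x + intercept)
s = np.sqrt(resid @ resid / (x.size - 2))     # residual standard error
xbar, Sxx = x.mean(), np.sum((x - x.mean()) ** 2)

def prediction_se(x0):
    # Standard error of a new prediction at x0; the (x0 - xbar)^2 term
    # makes the uncertainty grow as we leave the data behind
    return s * np.sqrt(1.0 + 1.0 / x.size + (x0 - xbar) ** 2 / Sxx)

se_inside, se_outside = prediction_se(0.5), prediction_se(5.0)
print(f"SE at x=0.5 (inside the data):  {se_inside:.4f}")
print(f"SE at x=5.0 (extrapolating):    {se_outside:.4f}")
```

Even for this perfectly well-behaved linear model, the extrapolated standard error is several times the interpolated one, and this formula assumes the model form itself is right, which is exactly what calibration cannot guarantee.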

A potentially more problematic aspect of calibration is using the knobs to account for multiple effects (turbulence, mixing, plasma physics, radiation and numerical resolution are common). In these cases the knobs may account for a multitude of poorly understood physical phenomena, mystery physics and lack of numerical resolution. This creates a massive opportunity for severe cognitive dissonance, which is reflected in an over-confidence in simulation quality. Scientists using simulations like to provide those funding their work with greater confidence than the work should carry, because the actual uncertainty would trouble those paying for it. Moreover the range of validity for such calculations is not well understood or explicitly stated. One of the key aspects of the calibration being necessary is that the calculation cannot reflect a real World situation without it. The model simply misses key aspects of reality without the knobs (climate modeling is an essential example).

In the cases of the knobs accounting for numerical resolution, the effect is usually crystal clear when the calibration of the knob settings needs to be redone whenever the numerical resolution changes because a new faster computer becomes available. The problem is that those conducting the calculations rarely make a careful accounting of this effect. They simply recalibrate the calculations and go on without ever making much of it. This often reflects a cavalier attitude toward computational simulation that rarely intersects with high quality. This lack of transparency can border on delusional. At best this is simply intellectually sloppy, at worst it reflects a core of intellectual dishonesty. In either case a better path is available to us.
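A minimal sketch of a knob absorbing numerical error, using a toy decay problem of my own construction rather than any production code: a rate constant is calibrated so a coarse forward-Euler run matches the "data," and the calibrated value drifts toward the true one only as the resolution improves, exposing that the knob was compensating for truncation error all along.

```python
import numpy as np

K_TRUE, T = 1.0, 1.0          # hypothetical "true" decay rate and final time

def simulate(k, dt):
    # Coarse forward-Euler integration of du/dt = -k u, u(0) = 1
    u = 1.0
    for _ in range(round(T / dt)):
        u -= dt * k * u
    return u

data = np.exp(-K_TRUE * T)    # the "experiment": the exact solution at t = T

def calibrate(dt, lo=0.1, hi=3.0, iters=60):
    # Bisection on the knob k until the coarse run reproduces the data
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if simulate(mid, dt) > data:
            lo = mid              # decay too weak -> turn the knob up
        else:
            hi = mid
    return 0.5 * (lo + hi)

for dt in (0.2, 0.05, 0.0125):
    print(f"dt = {dt:<7} calibrated k = {calibrate(dt):.4f} (true k = {K_TRUE})")
```

Each change of resolution demands a recalibration, just as described above, and the calibrated "physics constant" was never the physical value in the first place.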

Science is not about making predictions or performing experiments. Science is about explaining.

― Bill Gaede

In essence there are two uncertainties that matter: the calibrated uncertainty, where data is keeping the model reasonable, and the actual predictive uncertainty, which is much larger and reflects the lack of knowledge that makes the calibration necessary in the first place. Another aspect of the modeling in the calibrated setting is the proper use of the model for computing quantities. If the quantity coming from the simulation can be tied to the data used for calibration, the calibrated uncertainty is a reasonable thing to use. If the quantity from the simulation is inferred and not directly calibrated, the larger uncertainty is appropriate. Thus we see that the calibrated model has intrinsic limitations, and cannot be used for predictions that go beyond the data's physical implications. For example, climate modeling is certainly reasonable for examining the mean temperature of the Earth. On the other hand, the data associated with extreme weather events like flooding rains are not calibrated, and the uncertainty regarding their prediction under climate change is more problematic.

In modeling and simulation nothing comes for free. If the model needs to be calibrated to accurately simulate a system, the modeling is limited in an essential way. The limitations in the model are uncertainties about aspects of the system tied to the modeling inadequacies. Any predictions of the details associated with these aspects of the model are intrinsically uncertain. The key is the acknowledgement of the limitations associated with calibration. Calibration is needed to deal with uncertainty about modeling, and the lack of knowledge limits the applicability of simulation. One applies such models cautiously, if one is being rational. Unfortunately people are not rational and tend to put far too much faith in these calibrated models. In these cases they engage in wishful thinking, and fail to account for the uncertainty in applying the simulations for prediction.

It is impossible to trap modern physics into predicting anything with perfect determinism because it deals with probabilities from the outset.

― Arthur Stanley Eddington

If we are to improve the science associated with modeling and simulation, the key is uncertainty. We should charter work that addresses the most important uncertainties through well-designed scientific investigations. Many of these mysteries cannot be addressed without adventurous experimentation. Current modeling approaches (e.g., the pervasive mean-field models of today) need to be overthrown and replaced with approaches free of their limitations. No amount of raw computing power can solve any of these problems. Our current research programs in high performance computing are operating in complete ignorance of the approach necessary for progress.

All you need in this life is ignorance and confidence, and then success is sure.

– Mark Twain

Supercomputing Today is Big Money Chasing Small Ideas

19 Thursday Nov 2015

Posted by Bill Rider in Uncategorized

≈ 1 Comment

We adhere to the saying, “if it ain’t broke, don’t fix it,” while not really questioning whether “it” is “broke.”

― Clayton M. Christensen

Supercomputing is a trade show masquerading as a scientific conference, and at its core big money chasing small ideas. It takes place this week in Austin, and features the slogan "HPC Transforms". The small idea is that all we need to do for modeling & simulation (and big data) to succeed is build faster computers. This isn't a wrong idea per se, but rather a naïve and simplistic strategy that is suboptimal in the extreme. It's what we are doing despite the vacuous thinking behind it. Unfortunately we and other countries are prepared to spend big money on this strategy while overlooking the rather obvious and more balanced path to success. The balanced path is more difficult, challenging and risky, which is part of our unwillingness to pursue it. The tragedy that is unfolding is one of lost opportunity for true sustainable progress and massive societal impact.

“HPC Transforms” isn’t a bad or wrong idea either. The problem is that the basic transformation happened decades ago, and today HPC works on the pure inertia of that generation-old progress. It was the 1980s that marked the birth of HPC and its transformative power on science. If we look at HPC today we see a shell left over, with only massive computing hardware being the focus. The elements of progress and success that fed the original transformative power of HPC have been allowed to wither. The heart and soul of HPC is withering due to lack of care and feeding. A once balanced and important effort has become a dismal shell of its former self. We have allowed shallow slogans to replace a once magnificent scientific field's path to change.

This week marked some insightful commentary about Clayton Christensen's theory of disruptive innovation (https://hbr.org/2015/12/what-is-disruptive-innovation or the reader's digest version http://www.businessinsider.com/clay-christensen-defends-his-theory-of-disruption-2015-11), which has become a bit of a hollow mantra and buzzword in many places. For many, like those in HPC, it has become a bit of a shallow offering about the nature of Supercomputing. Instead I'll submit that the last twenty years have been marked by a disruptive disinnovation. The parallel computing "revolution" has ripped the heart and soul from supercomputing and left a rotting husk behind. The next generation of computing will only offer an acceleration of the process that has lobotomized supercomputing and left a veritable zombie behind. The lobotomy is the removal of attention and focus from the two pieces of computing that are most responsible for impacting reality, which I am going to refer to as the heart and soul of HPC. It doesn't need to be this way; instead the path we are taking is a conscious choice driven by naivety and risk-aversion.

If you defer investing your time and energy until you see that you need to, chances are it will already be too late.

― Clayton M. Christensen

So what is this opposing concept of disruptive disinnovation that I'm proposing? It is a new technology that you are forced into using that undermines other important technologies. For supercomputing the concept is relatively easy to see. Computing has transformed quickly into a global economic colossus, but one focused on the mobile market, which derives its value primarily through mobility, connectivity and innovation in applications.

Traditional mainframe-style computing has changed, with a distinct lack of drive for raw computing power. Low power that allows long battery life became the design mantra for computer chips, and the easy performance improvements of Moore's law ended last decade. At the same time we have a mantra that we must have the most powerful computer (measured by some stupid benchmark that is meaningless!). This demand for the fastest computer became some sort of empty national security issue used to sell it, without a scintilla of comprehension for what makes these computers useful in the first place. The speed of the computer is one of the least important aspects of the real transformative power of supercomputing, and the most distant from its capacity to influence the real world.

All of this lets us claim to have the fastest computer, which naively is taken to mean we have the best science. In the process of using these new computers we undermine our modeling, methods and algorithmic work, because just using these new computers is so hard. The quality of the science done with computers is completely and utterly predicated on the modeling used.

There are quarters that like to say that parallel computing was a disruptive innovation, except it made things worse. In the process we undermined the most important aspect of supercomputing to enable meaningless boasting. The concept is really simple to understand and communicate: it's the apps, stupid. The true value of computers is the applications, not the hardware. If anything should be obvious about the mobile computing era, it is that the software determines the value of computing, and we have systematically undermined the value, content and quality of our software. When I say this it is not an SQE question, but a question of the application's utility to impact reality.

What is this heart and soul of HPC?

Modeling is the heart of high performance computing. Modeling is the process of producing a mathematical model of the real world. HPC provided a path to solving a far greater variety and complexity of models scientifically, and opened new vistas for scientific exploration and engineering creation. When modeling is a living, breathing entity, it grows when it is critically compared with the reality it is supposed to represent. Some models die and others are born to replace them. Models breed, their genetic material mixing to produce better and more powerful offspring.

Today we have created walls that keep our models from breeding, growing and extending to become better and more relevant to the issues that society is depending upon them to address. The whole modeling aspect of HPC is rather static and simply reflects a fixed point of view toward what we should be modeling. More than anything, the current slogan-based approach to HPC simply promulgates models from the past into the future by fiat rather than by explicit choice.

You view the world from within a model.

― Nassim Nicholas Taleb

Perhaps the worst thing about the lack of attention being paid to modeling is the extreme unmet needs and the degree of opportunity being lost. The degree of societal impact that supercomputing could be having is being horrendously shortchanged. The leadership is fixated on hardware primarily as a low-risk path to seeming progress (a gravy train that is about to end). A higher-risk path would be the support of work that evolves the utility of supercomputing into the modern world. The risk is higher, but the payoff would be potentially immense and truly transformative. We have deep scientific, engineering and societal questions that will be unanswered, or answered poorly, due to our risk aversion. For example, how does climate change impact the prevalence of extreme weather events? Our existing models can only infer this rather than simulate it directly. Other questions related to material failure, extremes of response for engineered systems, and numerous scientific challenges will remain beyond our collective grasp. All of this opportunity is missed because we are unwilling to robustly fund risky research that would challenge existing modeling approaches.

Risks must be taken because the greatest hazard in life is to risk nothing.

― Leo Buscaglia

The soul of HPC is methods and algorithms, which together power the results that the computer can produce. We used to invest a great deal in improving methods and algorithms to amplify the good that the computer does. Today we simply use what we have already developed and re-implement it to fit onto the modern monstrosities we call supercomputers. The drive to continually improve and extend existing methods and algorithms to new heights of quality and performance is gone. We have replaced this with the attitude that these areas are mature and well developed, needing no attention. Again, we can honestly assess this as a lost opportunity. In the past, methods and algorithms have produced as much gain in performance as the machines. In effect they have been a powerful multiplier to the advances in hardware. Today we deny ourselves this effect to the detriment of the transformation this conference is touting to the World.

All of this reflects a rather fundamental misunderstanding of what HPC is and could be. It is not a fully matured topic, nor is it ready to simply go into this station-keeping mode of operation. It still requires the extreme intellectual efforts and labors that put it in this transformative place societally. If HPC were more mature we might reasonably be more confident in its results. Instead HPC relies upon the bravado of boastful claims that hardly match what capability it truly has. Any focused attention on the credibility of computed results reveals that HPC has a great deal of work to do, and the focus on hardware does little to solve it. The greatest depth of work is found in modeling, closely followed by issues associated with methods and algorithms.

Instead of basing a program for making HPC transformative on empirical evidence, we have a program based on unsupported suppositions. Hardware is easily understood by the naïve masses, which includes politicians and paper pushers. They see big computers making noise and lots of blinking lights. Models, methods and algorithms don’t have that appeal, yet without them the hardware is completely useless. With an investment in them we could make the hardware vastly more powerful and useful. The problem at hand isn’t that the new hardware is a bad investment; it is a good investment. The problem is how much better the new hardware could be with an appropriately balanced development program that systematically invested in modeling, methods and algorithms too.

People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.

― Clayton M. Christensen

Despite this we have systematically disinvested in the heart and soul of HPC. It is arguable that our actual capacity for solving problems has been harmed by this lack of investment to the tune of 10, 100 or even 1000 times. We could have HPC that is 1000 times more powerful today if we had simply put our resources into a path that had already been proven for decades. If we had a bolder and more nuanced view of supercomputing, the machines we are buying today could be vastly more powerful and impactful. Instead we clunk along and crow about a transformative capability that largely already happened. There are stunning potential societal payoffs that we are denying ourselves.

Modeling defines what a computer can do, and methods/algorithms define how well they can do it. What our leadership does not seem to realize is that no amount of computing power can do anything to improve a model that is not correct. The only answer that improves the ability to impact reality is a newer, better model of reality. The second aspect of supercomputing we miss is the degree to which methods and algorithms provide benefit.

Our computing power today owes more to the quality and efficiency of methods and algorithms than to hardware. Despite the clear evidence of their importance, we are shunning progress in methods and algorithms in order to focus on hardware. This is a complete and utter abdication of leadership. We are taking a naïve path simply because it is politically saleable and seemingly lower in obvious risk. The risk we push aside is short term; in the long term the risks we are taking on are massive and potentially fatal. Unfortunately we live in a World where our so-called leaders can make these choices without consequence.

This is an absolute and complete failure of our leadership. It is a tragedy of epic proportions. It reflects poorly on the intellectual integrity of the field. The choices made today reflect a mindset that erupted at the end of the Cold War and was successful then in keeping the DOE's national labs alive. We have gotten into a mode of confusing survival with success. Instead of building from this survival strategy into something sustainable, the survival strategy has become the only strategy. If science were actually working properly, the lack of balance in HPC would have become evident. The Supercomputing meeting this week is an annual monument to the folly of our choices in investment in HPC.

I can only hope that saner, more intelligent and braver choices will be made in the not too distant future. If we do we can look forward to a smarter, less naïve and far bolder future with high performance computing that brings the transformative power of modeling and simulation to life. The tragedy of HPC today isn’t what it is doing; it is what isn’t being done and the immense opportunities squandered.

We all die. The goal isn’t to live forever, the goal is to create something that will.

― Chuck Palahniuk

 

Today’s “Accountability” Destroys Quality in Science

13 Friday Nov 2015

Posted by Bill Rider in Uncategorized

≈ Leave a comment

Accountability is generally a good thing. We are at our best when we are held accountable to our colleagues, our efforts and ourselves. So how can accountability ever be a bad thing? The way it’s done today is a vehicle of unparalleled destructive power.

There is nothing so useless as doing efficiently that which should not be done at all.

― Peter Drucker

Avoiding accountability is never a good thing. On the other hand, too much overbearing accountability starts to look like pervasive trust issues. The concomitant effects of working in a low-trust environment are corrosive to everything held accountable. As with most things, the key is balance between accountability and freedom; too much of either lowers performance. Today we have too little freedom and far too much accountability in a destructive form. For the sake of progress and quality a better balance must be restored. Today's research environment is being held accountable in ways that reflect a lack of trust and a complete lack of faith in the people doing the work, and perhaps most importantly produce a dramatic lack of quality in the work (https://williamjrider.wordpress.com/2014/10/23/excellence-and-accountability/).
Accountability can be implemented in many ways, and today in science it looks like micromanagement. How can we make accountability (a generally good thing!) destructive? We define work that should be innovative and creative in terms of well-defined deliverables and milestones (https://williamjrider.wordpress.com/2014/12/04/the-scheduled-breakthrough/), which must never fail to be executed. An important thing that comes from research is finding out which ideas are distinctly bad. The right thing to do is stop when you discover something is a bad idea and find a new one. Today we continue to plow along a path even when we know it's the wrong one because of the sort of contracts we are accountable to. Perhaps most importantly, the quality of the work rarely if ever enters into the accountability. We live in an environment where quality is simply assumed to be in place, and no one seems to have a direct and unbreakable commitment to it. In today's accountability culture, quality is simply not part of the expectations.

It shows in everything we do.

We divvy up the work into smaller and smaller bins with well-defined deliverables and quarterly progress reports. The same principles that are corrupting our business world are being applied to science (https://williamjrider.wordpress.com/2014/10/10/corporate-principles-do-not-equal-good-management/). While these principles are arguably appropriate for business (the whole shareholder-value concept as the point of business), they are unremittingly damaging to science. Yet apply them we do, gleefully and wantonly. It strangles the quality of the work being made accountable as surely as it wastes precious resources. Time and money are interchangeable, but the most unforgivable aspect of this is the waste of careers, talent, and human potential on a cause that undermines more than it builds.

Small minds just like small stones can never create giant waves.

― Mehmet Murat ildan

The accountability we see today is destroying the ability to define, think about, and execute big ideas. We live in an era of small minds, small ideas, and a general lack of accomplishment of anything that matters. People are encouraged to work very prescriptively and narrowly within their prescriptively and narrowly defined scope of work. Success often (if not always) depends on things outside the scope of the work we are accountably doing. How can we do something “out of the box” if we are driven to always stay in “the box”? We then say that since it is outside our scope of work, it is outside what we are responsible for, and we come to feel that ignoring things outside the scope of our responsibilities is a duty we are accountable for. The present form of accountability allows one to ignore the big picture and execute the body of work promised whether it matters or not, whether it is useful or not, and whether it is quality work or not. It almost assures that the work done is not well integrated or adaptive to deeper understanding.

…If there is no risk, there is no reward.

― Christy Raedeke

Another impact of this small-minded thinking is a complete lack of ownership of anything bigger than what you are directly accountable for. You are encouraged to focus only on what you are directly being paid to focus on. Coupled with naïve, intellectually shallow management, you have a recipe for systematic mediocrity. Just as damning is the extreme risk aversion of management and, increasingly, of rank-and-file scientists. This pervasive risk aversion almost assures that nothing of significance will be accomplished. One can work hard on meaningless tasks and feel successful, reinforcing an ever-diminishing quality standard for all the work accountability touches. It assures that we will never accomplish anything big or important. In many cases this sort of approach is appropriate: building bridges, repaving roads, or putting up a skyscraper. For research, science, or high-end engineering it is harmful, damaging, and ultimately a giant waste of money. We follow plans that do not stand the test of time, and we fail to adjust to what we learn.

Our accounting systems are out of control. They spawn an ever-growing set of rules and accounts to manage the work. All of this is little more than a feel-good exercise for managers who mostly want to show “due diligence” and that they “manage risk”. No money is ever wasted doing anything (except that, increasingly, all the money is wasted). Instead we are squeezing the life out of our science, which manifests itself as low-quality work. In a very real way, low-quality science is easier to manage, far more predictable, and easier to make accountable. One can easily argue that really great science, with discovery and surprise, completely undermines accountability, so we implicitly try to remove it from the realm of possibility. Without discovery, serendipity, and surprise, the whole enterprise is much better suited to tried-and-true business principles. In light of where the accountability has come from, it might be good to take a hard look at these business principles and the consequences they have reaped.

We exist in an increasingly risk-averse (https://williamjrider.wordpress.com/2015/10/23/we-want-no-risk-and-complete-safety-we-get-mediocrity-and-decline/) and fearful society beset by massive inequality of income, wealth, and opportunity. Many of these terrible outcomes can be traced directly to the sorts of business principles being applied to science. Such principles are completely oriented toward driving outcomes preferentially toward the “haves” and away from the “have-nots”. Ultimately, the biggest threat to the rich and powerful is change in the status quo. The sorts of management and accountability used today mostly work to undermine any real progress, which favors the status quo. Science is one of the major societal engines of progress and change. The rich and powerful are fearful of progress, and work to kill it. We are tremendously successful at killing progress, and modern accountability is one of the best tools for doing it.

Creativity requires the courage to let go of certainties.

― Erich Fromm

Quality suffers from the loss of breadth of perspective and of the injection of ideas from divergent points of view. Creativity and innovation (i.e., discovery) are driven by broad and divergent perspectives. Most discoveries in science are simply repurposed ideas from another field. Discovery is the thing we need for progress in science and society, and it is the very thing that our current accountability climate is destroying. Accountability helps to drive away any thoughts from outside the prescribed boundaries of the work. Another maxim of today is that the customer is always right. For us, the customers are working under similar accountability standards. Since they are “right” and just as starved for perspective, the customer works to narrow the focus. We get a negative flywheel effect where narrowing focus and narrowing perspective reinforce each other.

Never attribute to malice that which can be adequately explained by stupidity.

― Robert Heinlein

This has manifested itself as the loss of the Labs as honest brokers. The Labs today are simply sycophants who work on what they are paid to work on: a large-scale extension of the customer-is-always-right principle. They never provide even a scintilla of feedback to government programs for fear of having their funding cut. Instead they pile onto poorly constructed and intellectually shallow programs because those programs promise funding. Thus we get programs that are phenomenally shallow and intellectually empty, but are managed at a level that provides no freedom or innovation to rescue them from their mediocrity. The accountability means that the empty intellectual goals are executed to a T, and any value that might have arisen from the resources is sacrificed on the altar of doing what you're told to do.

When programs of the sort the government is funding are integrated over decades, you see an immense decline in the institutions due to the loss of autonomy of the staff. Our national leadership in science simply corrodes, and younger scientists do not develop in any coherent way. Careers are starved of the sorts of efforts needed to build them. We have created a generation of mediocre scientists who excel at obedience and simply grinding through projects. They are distinguished by their ability to produce the deliverables they promised and little else. Once-great institutions are steaming cauldrons of mediocrity and mostly just pork-barrel spending (I often joke that the execution of the Lab mission is best achieved by going out and buying a car).

An inappropriate focus on money is the root of many of these problems. These days we will do almost anything for money, and money is the primary measure of everything (https://williamjrider.wordpress.com/2014/08/29/money-makes-for-terrible-priorities/). In particular, accountability for what money is spent on provides the standard measure of success. Did we do what the money was supposed to pay for? If so, success is declared. Never mind that the money has been subdivided into ever-smaller bins that effectively destroy the ability to achieve anything bigger and more coherent. The big ideas that would really make a huge difference to everyone never happen because we can never produce a body of work coherent enough to succeed. We are always doing work “in the box”.

The end result of our current form of accountability is small-minded success. In other words, we succeed at many small, unimportant things, but fail at large, important things. The management can claim that everything is being done properly, yet never produce anything that really succeeds in a big way. From the viewpoint of accountability, nothing is wrong: all deliverables are met and on time. True success would arise from attempting to succeed at bigger things, and sometimes failing. The big successes are the root of progress and the principal benefit of dreaming big and attempting to achieve big. In order to succeed big, one must be willing to fail big too. Today, big failure surely brings congressional hearings and the all-too-familiar witch hunt. Without the risks of failure, we are left with small-minded success as the best we can do.

Big goals, trust, and leadership are the cures. We need to prioritize progress and discovery by producing an environment tailored to produce them. Hand in hand with this goes a level of faith in the human spirit and ingenuity. Let people believe that their work matters, with proof that they are contributing to a meaningful goal. Daniel Pink wrote a book called “Drive” that describes a workplace that is the utter antithesis of the sort of accountability science labors under today (http://www.amazon.com/Drive-Surprising-Truth-About-Motivates/dp/1594484805/ref=sr_1_1?ie=UTF8&qid=1447431195&sr=8-1&keywords=drive). I was stunned by how empowering his description of work could be, and how far from this vision I work under today. I might simply suggest that my management read that book and implement everything in it. The scary thing is that they did read it, and nothing came of it. The current system seems to be completely impervious to good ideas (or perhaps following the book would have been too empowering to the employees!). Of course the book suggests a large number of practices that are completely impossible under current rules and opposed by the whole concept of accountability we are under today.

It is completely ironic that the very forces that are pushing accountability down our throats are completely free of any accountability themselves. Our current political class is virtually invulnerable to any accountability from the voters. The rich and powerful overlords rule the masses with impunity. Their degree of wealth makes them completely resistant to accountability. The accountability thrust upon the rest of us is simply a tool to maintain and magnify their power through killing progress and assuring that the status quo that favors them is never threatened. Accountability is simply a way of crushing progress, and making sure that the current societal order is maintained.

I worry that only some external force and/or event will be able to dismantle the current system, and it will not be pretty or pleasant for anyone. The forces in power today are quite entrenched and resist any move that might reduce their stranglehold on the World.

The best way to find out if you can trust somebody is to trust them.

― Ernest Hemingway
