The Regularized Singularity

~ The Eyes of a citizen; the voice of the silent

Monthly Archives: June 2014

Keeping it real in high performance computing

Friday, June 27, 2014

Posted by Bill Rider in Uncategorized


“Theories might inspire you, but experiments will advance you.” ― Amit Kalantri

This week I have a couple of opportunities to speak directly with my upper management. At one level this is nothing more than an enormous pain in the ass, but that is my short-sighted monkey-self speaking. I have to prepare two talks and spend time vetting them with others. It is enormously disruptive to getting “work” done.

On the other hand, a lot of my “work” is actually a complete waste of time. Really. Most of what I get paid for is literally a complete waste of a very precious resource, time. So it might be worthwhile making good use of these opportunities. Maybe something can be done to give my work more meaning, or perhaps I need to quit feeling such a duty to waste my precious time on stupid, meaningless stuff some idiot calls work. Most of the time-wasting crap feeds the limitless maw of the bureaucracy that infests our society.

Now we can return to the task at hand. The venues for both engagements are somewhat artificial and neither is ideal, but it’s what I have to work with. At the same time, each is a chance to say things that might influence change for the better. Making this happen to the extent possible has occupied my thoughts. If I do it well, the whole thing will be worth the hassle. So with hope firmly in my grasp, I’ll charge ahead.

I always believe that things can get better, which could be interpreted as whining, but I prefer to think of it as a combination of the optimism of continuous improvement and the quest for excellence. I firmly believe that actual excellence is in starkly short supply. Part of the reason is the endless stream of crap that gets in the way of doing things of value. I’m reminded of the phenomenon of “bullshit jobs” that has recently been described (http://www.salon.com/2014/06/01/help_us_thomas_piketty_the_1s_sick_and_twisted_new_scheme/). The problem with bullshit jobs is that they have to create more work to stay in business, and their bullshit creeps into everyone’s life as a result. Thus, we have created a system that works steadfastly to keep excellence at bay. Nonetheless, in keeping with this firmly progressive approach, I need to craft a clear narrative arc that points the way to a brighter, more productive future.


High performance computing is one clear over-arching aspect of what I work on. Every single project I work on connects to it. The problem is that to a large extent HPC is becoming increasingly disconnected from reality. Originally computing was an important element in various applied programs, starting with the Manhattan Project. Computing grew in prominence and capability through the (first) nuclear age, supporting weapons and reactors alike. NASA also relied heavily on contributions from computing, and the impact of computational modeling improved the efficiency of delivering science and engineering. Throughout this period computing was never the prime focus, but rather a tool for effective delivery of a physical product. In other words, there was always something at stake that was grounded in the physical “real” world. Today, more and more, there seems to have been a transition to a World where the computers themselves became the reality.

More and more, the lack of support for the next supercomputer is taking on the tone and language of the past, as if we have a “supercomputer gap” with other countries. The tone and approach are reminiscent of the “missile gap” of a generation ago, or the “bomber gap” two generations ago. Both of those gaps were BS to a very large degree, and I firmly believe the supercomputer gap is too. These gaps are effective marketing ploys to garner support for building more of our high performance computers. Instead we should focus on the good high performance computing can do for real problem solving capability, and let the computing chips fall where they may.

There is a gap, but it isn’t measured in terms of FLOPS, CPUs, or memory; it is measured in terms of our practice. Our supercomputers have lost touch with reality. Supercomputing needs to be connected to a real, tangible activity where the modeling assists experiments, observations and design in producing something that serves a societal need. These societal needs could be anything from national defense, cyber-security, and space exploration to designing more fuel-efficient aircraft, or safer, more efficient energy production. The reality we are seeing is that each of these has become secondary to the need for the fastest supercomputer.

A problem is that the supercomputing efforts are horribly imbalanced, having become primarily a quest for hardware capable of running the LINPACK benchmark the fastest. LINPACK does not reflect the true computational character of the real applications that run on supercomputers. In many ways it is almost ideally suited to demonstrating high operation counts. Ironically, it is nearly optimal in its lack of correspondence to applications. The dynamic that has emerged is that real application performance has become a secondary, optional element in our thinking about supercomputing.
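To make the distinction concrete, here is a minimal sketch (in Python with NumPy; it is not the LINPACK benchmark itself, and the problem sizes are arbitrary) contrasting a flop-rich dense kernel of the kind LINPACK rewards with a memory-bandwidth-bound stencil sweep of the kind many real applications resemble. On most machines the two reported flop rates differ by an order of magnitude or more.

# A minimal sketch (not the LINPACK benchmark itself; sizes are arbitrary)
# contrasting a flop-rich dense kernel, the regime LINPACK rewards, with a
# memory-bandwidth-bound stencil sweep closer to many real applications.
import time
import numpy as np

n = 2000
A = np.random.rand(n, n)
B = np.random.rand(n, n)

# Dense matrix multiply: ~2*n^3 flops on ~n^2 data.
t0 = time.perf_counter()
C = A @ B
t_dense = time.perf_counter() - t0
print(f"dense matmul:  {2 * n**3 / t_dense / 1e9:6.1f} GFLOP/s")

# Five-point stencil sweep: ~4 flops per point on the same data volume.
u = np.random.rand(n, n)
t0 = time.perf_counter()
v = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1))
t_stencil = time.perf_counter() - t0
print(f"stencil sweep: {4 * n**2 / t_stencil / 1e9:6.1f} GFLOP/s")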

These developments highlight our disconnect from reality. In the past, the reality of the objective was the guiding element in computing. If the computing program got out of balance, reality would intercede to slay any hubris that developed. This formed a virtuous cycle where experimental data would push theory, computed predictions would drive theorists to explain, or experiments would be designed to provide evidence.

In fact, we have maimed this virtuous cycle by taking reality out of the picture.

The Stockpile Stewardship program was founded as the alternative to the underground testing of nuclear weapons, and supercomputing was its flagship. We even had a certain official say that a computer could be “Nevada* in a box” and that pushing the return key would be akin to pressing the button on a nuclear test. It was a foolish and offensive thing to say, and almost everyone else in the room knew it; yet this point of view has taken root, and continues to wreak havoc. Then and now, the computer hardware has become nearly the sole motivation, with a loss of the purpose of the entire activity far too common. Everything else needed to be successful has been short-changed in the process. With the removal of the fully integrated experiments of nuclear tests from the process, the balance in everything else needed to be carefully guarded. Instead, this balance was undermined almost from the start. We have not put together a computing program with sufficient balance, support and connections to theory and experiment to succeed as the Country should demand.

“The real world is where the monsters are.” ― Rick Riordan


I have come to understand that there is something essential in building something new. In the nuclear reactor business, the United States continues to operate old reactors, and fails to build new ones. Given the maturity of the technology, the tendency in high performance computing is to rely on highly calibrated models. These models are focused on working within a parameter space that is well trodden and continues to be the focus. If the United States were building new reactors with new designs, the modeling would be taxed by changes in the parameter space. The same is true for nuclear weapons. In the past there were new designs and tests that either confirmed existing models, or delivered a swift kick to the head with an unexplained result. It is the continued existence of the inexplicable that jars models and modeling out of an intellectual slumber. Without this we push ourselves into realms of unreasonable confidence in our ability to model things. Worse yet, we allow ourselves to pile all our uncertainty into calibration, and then declare confidently that we understand the technology.
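A toy illustration of this calibration trap, offered as a sketch: below, a polynomial “model” is tuned to synthetic data only on a well-trodden parameter range and then asked about a “new design” outside it. Everything here (the stand-in physics, the ranges, the polynomial degree) is invented for illustration.

# A toy illustration (all synthetic) of the calibration trap: a polynomial
# "model" tuned on a well-trodden parameter range [0, 1] looks excellent
# there and typically diverges quickly outside it.
import numpy as np

rng = np.random.default_rng(1)

def physics(x):
    # Stand-in for the true (unknown) response.
    return np.exp(-x) * np.sin(3 * x)

x_cal = np.linspace(0.0, 1.0, 20)                  # the well-trodden range
y_cal = physics(x_cal) + rng.normal(0, 0.01, 20)   # calibration data
coeffs = np.polyfit(x_cal, y_cal, deg=6)           # the calibrated "model"

for x in (0.5, 1.0, 1.5, 2.0):  # the last two are "new designs"
    print(f"x={x:3.1f}  model={np.polyval(coeffs, x):8.3f}  truth={physics(x):8.3f}")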


At the core of the problem is the simple, easy and incorrect view that bigger, faster supercomputers are the key. The key is deep thought and a problem-solving approach devised by brilliant scientists exercising the full breadth of scientific tools available. The computer in many ways is the least important element in successful stewardship; it is necessary, but woefully insufficient to provide success.

“Never confuse movement with action.” ― Ernest Hemingway

Supercomputing was originally defined as the use of powerful computers to solve problems. Problem solving was the essence of the activity. Today this is only true by fiat. Supercomputing has become almost completely about the machines, and the successful demonstration of the machines’ power on stunt applications or largely irrelevant benchmarks. Instead of defining the power of computing by the problems being solved, the raw power of the computer has become the focus. This has led to a diminishment in the focus on algorithms and methods, which actually have a better track record than Moore’s law for improving computational problem solving capability. The consequence of this misguided focus is a real diminishment in our actual capability to solve problems with supercomputers. In other words, our quest for the fastest computer is ironically undermining our ability to use computers as effectively as possible.

The figure below shows how improvements in numerical linear algebra have competed with Moore’s law over a period of nearly forty years. This figure was created in 2004 as part of a DOE study (the Scales workshop; URL?). The figure has several distinct problems: the dates are not included, and the algorithm curve is smooth. Adding texture to this is very illuminating, because the last big algorithmic breakthrough occurred in the mid-1980s (twenty years prior to the report). Previous breakthroughs occurred on an even more frequent time scale, every 7–10 years. Therefore in 2004 we were already overdue for a new breakthrough, which has not come yet. On the other hand, one might conclude that multigrid is the ultimate linear algebra algorithm for computing (I for one don’t believe this). Another plausible theory is that our attention was drawn away from improving the fundamental algorithms toward making these algorithms work on massively parallel supercomputers. Perhaps improving on multigrid is a difficult problem, and we have already snatched all the low-hanging fruit. I’d even grudgingly admit that multigrid might be the ultimate linear algebra method, but my faith is that something better is out there waiting to be discovered. New ideas and differing perspectives are needed to advance. Today, we are a full decade further along without a breakthrough, and even more overdue for one. The problem is that we aren’t thinking along the lines of driving for algorithmic advances.

[Figure: improvements in numerical linear algebra algorithms versus Moore’s law, from the 2004 DOE study]

I believe in progress; I think there are discoveries to be made. The problem is we are putting all of our effort into moving our old algorithms to the massively parallel computers of the past decade. Part of the reason for this is the increasingly perilous nature of Moore’s law. We have had to increase the level of parallelism in our codes by immense degrees to continue following Moore’s law. Around 2005 the clock speeds in microprocessors stopped their steady climb. For Moore’s law this is the harbinger of doom. The end is near: the combination of microprocessor limits and parallelism limits is conspiring to make computers amazingly power intensive, and the rise cannot continue as it has in the past. At the same time, we are suffering from the failure to continue supporting the improvements in problem solving capability from investments in algorithms and methods, which had provided more than a Moore’s-law’s-worth of increased capability.

A second problematic piece of this figure is the smooth curve of advances in algorithmic power. This is not how it happens. Algorithms have breakthroughs, and in the case of numerical linear algebra the breakthrough is in how the solution time scales with the number of unknowns. This results in quantum leaps in performance when a method allows us to access a new scaling. In between these leaps we have small improvements as the new method is made more efficient or procedural improvements are made. This is characteristically different from Moore’s law in a key way. Moore’s law is akin to a safe bond investment that provides steady returns in a predictable manner. Program managers and politicians love this because it is safe, whereas algorithmic breakthroughs are like tech stocks: sometimes the payoff is huge, but most of the time the return is small. This dynamic is beginning to fall apart; Moore’s law will soon fail (or maybe it won’t).
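To see what a scaling breakthrough buys, here is a rough sketch of the classic work-complexity ladder for solving a model 2D Poisson problem with N unknowns. The exponents follow the standard textbook comparison; the constants are deliberately ignored, so only the ratios between methods are meaningful.

# Rough relative work to solve a model 2D Poisson problem with N unknowns.
# The exponents follow the standard textbook comparison; constants are
# ignored, so only the ratios between methods are meaningful.
import math

N = 1_000_000  # a million unknowns

work = {
    "banded Gaussian elimination": N**2,
    "Jacobi iteration":            N**2,
    "optimal SOR":                 N**1.5,
    "conjugate gradients":         N**1.5,
    "FFT-based fast solver":       N * math.log2(N),
    "multigrid":                   float(N),
}

for name, w in work.items():
    print(f"{name:28s} ~{w / N:>11,.0f}x the work of multigrid")

Each rung down this ladder was a breakthrough of the kind the smooth curve hides: a change in the exponent, not a polish of the constant.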

I might even forecast that the demise of Moore’s law, if only for a while, would be good for us. Instead of relying on power to grow endlessly, we might have to think a bit harder about how we solve problems. We won’t have an enormously powerful computer that will simply crush problems into submission. This doesn’t happen in reality, but listening to supercomputing proponents you’d think it is common. Did I mention bullshit jobs earlier?

The truth of the matter is that computing might benefit from a discovery that allows the continuation of the massive progress of the past 70 years, but there is no reason to believe that some new technology will bail us out. The deeper issue regards the overall balance of the efforts. The hardware and software technologies have always worked together in a sort of tug-of-war that bears similarity to the tension between theoretical and experimental science. One field drives the other depending on the question and the availability of emergent ideas or technologies that open new vistas. Insofar as computing is concerned, my concern is plain: hardware has had preeminence for twenty or thirty years while the focus on algorithms and methods has waned. The balance has been severely compromised. Enormous value has been lost to this lack of balance.

This gets to the core of what computing is about. Computing is a tool. It is a different way to solve problems, manage or discover information, and communicate. For some, computing has become an end unto itself rather than a tool for modern society. We have allowed this perspective to infect scientific computing as a discipline because the utility of acquiring new supercomputers outweighs that of using them effectively. This is the root of the problem and the cause of the lack of balance we see at present. It is coupled to a host of other issues in society, not the least of which is a boundless superficiality that drives a short-term focus and disallows real achievement because the risk of failure has been deemed unacceptable.

We should work steadfastly to restore the balance and perspective necessary for success. We need to allow risk to enter into our research agenda and set more aggressive goals. Commensurate with this risk, we should provide greater freedom and autonomy to those striving for the goals. Supercomputing should recognize that the core of its utility is computing as a problem solving approach that relies upon computing hardware for success. There is an unfortunate tendency to simply declare supercomputing a national security resource regardless of the actual utility of the computer for problem solving. These claims border on being unethical. We need computers that are primarily designed to solve important problems. Problems don’t become important because a computer can solve them.

* Nevada is the site the United States used for underground nuclear testing.

The Absolute Necessity of Honest and Critical Peer Review

Friday, June 20, 2014

Posted by Bill Rider in Uncategorized


“To avoid criticism say nothing, do nothing, be nothing.” ― Aristotle

Peer review is undergoing a bit of a crisis these days. As with peer review itself, a hard, sharp look at a topic is a good thing. Peer review is a key professional responsibility that is done without pay and with little explicit appreciation. With the cost of academic journals skyrocketing, people are rightly asking tough questions about a system where institutions pay for journals twice: once for access to the journals, and a second time through the labor of their employees in conducting the peer review. Over the past twenty years I have also seen the standard of peer review for organizations up close. It is backsliding for the simple reason that the powers that be do not understand or appreciate peer review’s basic role. Because the consequences of a negative review have become so severe, real hard-hitting peer review is rarely done, and never on the record. The powers that be have become completely intolerant of anything that looks like a mistake or failure.

“If failure is not an option, then neither is success.” ― Seth Godin

Critical, hard-hitting peer review is necessary for the successful conduct of science. This is almost never questioned until one starts to peel away the most superficial layers of the scientific enterprise. In its purest form peer review works, but reality soon intrudes as the issue is examined. It is broadly acknowledged that delivering a positive but critical review is enormously challenging. It is also utterly essential to self-improvement. Too often the delivery of reviews is off the mark and either comes off as parochial and mean-spirited, or perhaps worse yet, completely without depth and integrity.


In the cold light of day we should greet a well-done critical review as a gift. A well-done review won’t just tell you how great you are, or how good your ideas are, but will point out the weaknesses in your work and suggest how to improve. If you’re just hearing how great you are, in all likelihood the review isn’t being done well and the praise you are getting is bullshit. Moreover, this bullshit is doing you a great disservice.

I open any email containing the review of a paper with dread. It always feels personal; at the same time, it usually isn’t nearly as bad as I fear. In the long run, my own work is greatly improved by peer review, and to be honest it would suffer greatly without it. Ultimately it is a great force for good; it keeps me sharp, honest and exacting in the quality of my work. Nonetheless it is a difficult thing to deal with, and the weaker part of me would gladly avoid it at times. My thinking self intervenes and takes it as an important part of self-improvement. Occasionally there is a deep disagreement and the reviewer is wrong, but these points are usually contentious and not fully decided by the community. A good review is always an opportunity to learn and grow.

There are other issues with peer review worth taking note of. For example, certain people become quite renowned and unfortunately get immunized from peer review. A friend of mine had two fairly famous advisors and submitted a paper with them. The paper received no real review; the reviews just said “great work” and “publish”. It was a good paper, but in this case the process was broken. These non-reviews were a severe disservice to the community, my friend and even his famous advisors. Because the reviewers were anonymous we don’t know who they were, but they did no one a favor. Even this paper could stand to be improved. I can say that my reviews are never that glib. On the other hand, there are times I could do better as a reviewer, but then I hold the view that I could always do better.

The places I’ve worked are themselves reviewed, in keeping with the basic scientific attitude that peer review is a necessity. The idea is good, and could be just as valuable for my employers as peer review is for me as a scientist. Like many good ideas, the execution is flawed. Over time the flaws have grown until it is fair to say that the system is broken. The technical work is highly scripted, shows the best the organization has, and never receives more than a smattering of critique. The review only hits around the edges, and the marks given to the organization are always “World Class”. What criticism is received doesn’t really need to be addressed anyway. Rarely, if ever, does the review lead to anything except a declaration that “we are great again this year”.

How did we get this way? This whole attitude has two sources: a lack of understanding of how science works, and an unwillingness to accept failure of any sort. People fail to understand that mistakes happen when people or organizations stretch and challenge themselves. If you aren’t making mistakes, you probably aren’t applying yourself or trying very hard. We also systematically lowball all our expectations for achievement to avoid the possibility of mistakes or failure. This is a chronic condition that is slowly sucking the vitality from our research institutions. It is literally a crisis.

As most educators note, mistakes are necessary for success; they are the foundation of learning. If one isn’t making mistakes, or outright failing, one isn’t pushing one’s limits. We are increasingly defining a system that takes mistakes and the possibility of failure off the table. The consequences of this are grave, in that these mistakes and failures are the engine of success. This is not to say that we should allow malpractice or a lack of seriousness of effort to creep into our work. I am saying that we need to allow for failure and mistakes arising from honest, earnest efforts, without the current threats of repercussions. By driving mistakes and failures from the system we guarantee mediocrity. If I look around at the system we have created, I see boundless mediocrity. We are becoming a milquetoast Nation. Gone are the bold initiatives that made our country proud. Now everyone is afraid of screwing up, which is precipitating the biggest screw-up of them all.

What needs to be done? How do we get out of this horrible bind?

We need to start by meaningfully differentiating mistakes and failures due to incompetence from those born of ambition. If we continue to punish ambitious efforts that fail in the manner we do, we will kill ambition. In many cases one glorious success is worth 100 noble failures. Our current attitude instead makes sure that we have 100 mediocre successes that can be spun into seeming competence. We need to demand high standards, ask hard questions and focus on doing our best. We need to recognize that the absence of mistakes and failures is actually a bad thing and a sign that things are not working.

We need to encourage our critics to find our flaws, demand we fix them and tell us where we can do better. We are not good enough, not by a long shot.

Why do scientists need to attend conferences?

Friday, June 13, 2014

Posted by Bill Rider in Uncategorized


What comes to mind when you think of a scientist at a conference? What about when the conference is being held somewhere nice like Hawaii, a ski resort, Italy, or France? Does this make it a boondoggle? Should the government severely regulate or stand in the way of scientists attending these meetings?


Well, they do, and it is harming the quality of science in the United States. In addition to harming science, it isn’t saving any money; it is costing more. The scandal isn’t scientists attending conferences, but rather the government’s mismanagement of the scientific enterprise to such a massive degree.

I attend a number of conferences each year (on the order of five to eight). As a working scientist this is an absolute necessity for success. It is a responsibility as a researcher to actively participate in the presentation of new research findings and as part of the peer audience. Additionally, it is an essential form of continuing professional education. As I’ve matured, the organization of conferences and associated sub-meetings or mini-symposia has become a staple of my professional work. It is important work, and the challenges have become excessive lately.

The attendance at conferences of those who either work for, or are funded by, the federal government is being heavily scrutinized. The reason was a General Services Administration (GSA) conference in 2010 that was quite a boondoggle. The GSA management showed exceedingly poor judgment in organizing and structuring the meeting.* They probably should have lost their jobs, which would have been a rational response. As most people know, the governmental response is far from rational. Like most scandals, the over-reaction has been worse and more expensive than the original scandal itself. The costs incurred by the administration of conference attendance, the extra costs from delays, and the unnecessary management attention on the topic make it clear that money is not the issue. Avoiding the appearance of impropriety is the goal. The system is succeeding in producing an environment that is increasingly hostile to scientific research, and undermines the advance and practice of science. A more poetic way of describing the approach would be to dub it a “circular firing squad.”

So why do scientists need to attend conferences?

We can start by talking about what a conference is and what purpose it serves. Typically a conference is associated with a defined technical field such as “Compressed Sensing,” a professional organization such as the “American Physical Society,” or a combination of the two. Conferences come in all shapes and sizes, from enormous meetings (think societies such as the American Geophysical Union) to small topical workshops on emerging fields with 20 or 30 scientists. Each has immense importance to science’s progress. The key aspect of the conference is the exchange of information, with people taking a number of distinct roles: presenter, audience, critic, connector, teacher, student… A conference is enormously important to the conduct of science. The exchange of ideas and subsequent debate, the sharing of common experience, and friendships all play a key role in successful research.


Judging by how conference attendance is managed, the main goal of attending a conference is giving a talk. Everything else is secondary. This is where the damage crosses the line into outright malpractice. When a young scientist joins a new community, sometimes the best thing to do is have them attend a conference and absorb the breadth and depth of the field. It also provides an avenue to meet their new colleagues, and to learn the culture by immersing themselves in it. This is almost impossible today.

The benefits of attending conferences go well beyond the purely technical aspects of the profession. Conferences are where new ideas are presented and different ideas are debated in open forum. Sometimes different points of view can be engaged directly, leading to breakthroughs that wouldn’t be possible otherwise. There is something special about human beings sharing a meal together that cannot be replicated in other ways. Conferences are key in developing vibrant technical communities that empower the advance of science and technology. My government’s response to a stupid GSA scandal is putting all of these benefits at risk.*

I’ve quipped that we should have a special conference center in some awful place where no one would want to go. That way the Congress and public would know that we go to conferences to engage in technical work. On the other hand, part of going to conferences involves getting inspired to do better work. Why not go to some place that is inspiring? Why not go to some place that has great restaurants, so that the sharing of a meal can be memorable on multiple levels? Why not make the entire event memorable, worthwhile and enriching at a personal level? At the core of the attitude of many in government is a sense that life should be suffered, with work being the most unpleasant aspect of all. It is a rather pathetic point of view that leads to nothing positive. We shouldn’t be punished for working in the public sphere, yet punishment seems to be the objective.

Let me get to the point of attending conferences in foreign countries. Science is international, now more than ever. Thanks to lousy funding, lousy education and lousy management (with the topic here being the latest example), a lot of the best science happens in other countries. It always has, but the balance has tipped ever more toward Europe, China, India… The mismanagement of conference attendance is in some ways completely consistent with the mindset that is overturning the United States’ supremacy in science. One can argue that, like the health of the American middle class, we are already second rate in many regards. The mismanagement of science is simply driving this outcome ever more strongly. Politicians, the citizens who put them in office, and the vested interests funding campaigns care little about the state of science in the United States. We are working to undo the sort of advantage the United States had during most of the 20th Century. Corporations seem to care little, especially considering that they don’t really abide by borders; science in Europe or China can benefit them as well. It is the rank and file citizens of the United States who will suffer the economic price for the lack of scientific discovery and technological innovation precipitated by the systematic mismanagement we see today.

Scientists are people, and we respond to the same things as everyone else. The attendance of conferences is an essential aspect of doing science, and the current approach and attitude toward conferences is undermining the quality and effectiveness of science. This should deeply concern every citizen because the quality of science has a direct impact on society as a whole. Whether your concerns are grounded in the health of the economy, or National security, or our role as World leaders, science plays a key role in success. In our systematic mismanagement of the scientific enterprise we are failing each of these.

* In an earlier version of the post I incorrectly identified the Internal Revenue Service (IRS) as the government agency responsible for the scandalous conference in 2010.

Why climate science fails to convince

Friday, June 6, 2014

Posted by Bill Rider in Uncategorized


“In other words, it’s a huge shit sandwich, and we’re all gonna have to take a bite.”–Lt. Lockhart, Full Metal Jacket

[Figure: projected change in annual mean surface air temperature from the late 20th century to the middle 21st century, based on SRES emissions scenario A1B]

When I started to write this, the thought occurred to me: “do I really want to do this? This topic is a lightning rod and it’s sure to piss everyone off!” Of course, this is exactly the reason it needs to be discussed. Climate is an enormous scientific and societal problem that has become a horrific “shit sandwich” that we all get to share. It is starting to infect the entirety of society’s engagement with science in a profoundly negative way. A thoughtful and open discussion about the quality and reliability of the underlying science cannot be had. This serves two terrible purposes: it energizes the climate “deniers,” who have de facto won the argument by making it completely toxic, and it savages the public image of science, damaging not just the perception of science but its practice. We are facing the prospect that the only outcome for humanity is bad. So I’m going to grab the proverbial lightning rod with both hands.

Let’s get to one of the elephants in the room right away; the issue of deniers, skeptics and critics needs immediate attention. My contention is that these labels, and the distinctions between them, are important. First, the positive side of the coin: the critics are essential to progress and an honest dialog on the subject. The current circumstances are drowning out the ability to be critical of climate science. This is dangerous. The science is good, but not good enough; it is never good enough. Because criticism is so muted by the polarized political atmosphere surrounding climate, the skeptics are rightly energized. Some degree of skepticism is warranted, especially considering how the scientific community is characterized. The problem is that as one moves along the spectrum of skepticism, one approaches the third category, the denier. The denier cannot be defended; denial is simply the outright rejection of facts, of science, but we are creating a situation where the facts are muddied by both sides of the argument. A modern society should rightly repudiate the deniers; instead cynical and greedy forces are empowering them. To the extent that the scientific community misbehaves, it also makes denial look reasonable.

Science is being horribly politicized these days. No subject is more politicized than climate science on the topic of climate change, or the more pointed and accurate term, global warming. Scientists are not the core of the problem, but they do add to the toxic mix by responding to the environment in a manner that undermines the credibility of science. We are confronted with a situation where the scientific results threaten the ability of greedy people to make lots of money. The greedy people who stand to have their earning power diminished are fighting back. Some of their weapons are scientists who side with them for largely ideological reasons (or outright financial gain). There are always scientists willing to sow doubt as hired guns of the greedy, just as both sides of a court case can get their own experts if the price is right. The truly damaging part of this dialog is the damage the credible scientific community is doing to itself in joining the battle.

[Image: hurricane seen from space, NASA, 2010]

Climate science is the archetype of this dynamic, much like tobacco was a generation ago. The amazing thing is that some of the same hired guns attacking climate scientists today attacked the idea that tobacco was causing cancer then. The majority of these scientists have absolutely zero credibility. Or to put it another way, they have the same credibility I have as a climate expert. My views here concern how the science is being conducted, the atmosphere for improvement, and how criticism of the quality of the science can proceed without playing into the hands of the denial industry. As a scientist this is completely within the frame of valid expertise on my part.

I’m not engaging in any sort of false equivalence, although one might see that as the conclusion of this piece. The level of damage to science and society done by the misbehavior on each side is vastly different. The side of denial is almost completely without merit. Whatever merit it does have is associated with the critical side of skepticism, and none of it comes from the self-interested parties providing most of the resources for the “movement.” The sins of the climate science community are basically at the margins, but they lead to a loss of effectiveness. First and foremost, the World needs to realize that an enormous problem exists and needs to be addressed quickly and forthrightly. The trouble begins when the climate community starts to address the actions to be taken to deal with global warming. There are many potential ways of addressing the problem, and all of them are expensive and controversial. Some of the ways that would be most effective do not suit the “green” agenda. The problem is that too many who are trying to sound the alarm are also pushing specific solutions, and in particular solutions from the left. This undermines the ability to convince the World that there is a problem that must be solved. Moreover, it makes science itself look like a partisan activity with a given political point of view.

Before I go further, I will state up front that my judgment, for what it’s worth, is that man is the prime element causing the observed global warming, through our collective industrial and agricultural activities. The evidence of the warming is rock solid, and the hypothesis that this warming is dominantly anthropogenic is very likely true. It is still a hypothesis, and most of my quibbling rests upon discomfort with the level of credibility of climate models. We must be careful with regard to the level of uncertainty of these models, which is likely to be larger than commonly characterized because of the methodology used. This care must include the proviso that the amount of warming could well be much larger than predicted, which would prove catastrophic for humanity.

I have to admit that this issue is personal at some level. My parents are deniers born and bred through watching the propaganda machine known as Fox News. This has led to them being exposed to the above-mentioned hired guns as though they were appropriate experts in climate science. I’ve read one of their books (by Fred Singer, which my dad had been reading), and found it to be complete crap, but well enough written to fool an educated person without the appropriate technical background. My advice was to pay attention to the author’s conflicts of interest and past associations. For example, previous funding by the tobacco lobby should be a clear red flag. My key point is that there are more scientifically credible skeptics who make valid critiques. Those critiques are more scientific and not meant to be digested like another source of propaganda. As such, they are much more difficult to bend to the biased purposes of the deniers’ funding sources.

The climate skeptics are primarily driven and funded by interests that have massive financial stakes in continued (or accelerated) use of carbon-based fuel. Others have a conservative world-view associated with Manifest Destiny (or put more bluntly, God put the Earth here for man to rape to his heart’s content). Fortunately these attitudes are countered by evangelicals who believe stewardship of the Earth is their divine responsibility. Here in the USA, the old school rape-and-pillage-the-Earth types still have the edge. These people aren’t skeptics; they are simply greedy or delusional self-centered people who have absolutely no credible argument against the science. Their only goal is to seed doubt in the minds of the untrained masses that form the majority of our society.

[Image: Senator Jim Inhofe, photo by Gage Skidmore]

This is not to say that every skeptic is simply the willing tool of greedy corporate interests, or of ideological zealots devoted to clear cutting as a God-given right. It is a spectrum: one end offers honest and meaningful critical assessments, the other near-total denial of all evidence contrary to its opinion. Climate science has a lot of problems today that need to be solved. The honest skeptic is an important voice of criticism that ultimately drives the science to be better. There are some skeptics who have reasonable scientific arguments, but the charlatans drown them out. Moreover, too many of the skeptics fail to call out the charlatans for what they are. Others are worse and figuratively get in bed with them. Unfortunately, the climate community isn’t behaving itself either, and has helped produce an appallingly poisonous environment for improving the science. Worse yet, the climate community’s defense of its conclusions is assisting in poisoning the entire societal dialog on science and its proper role.

Let me be clear: the vast majority of the blame for the current state of affairs lies with greedy corporate interests that want to preserve the status quo that enriches them, the future of humanity be damned. They fund scientific skeptics to undermine science and ally themselves with conservative ideological interests who agree with the outcome they want. Their goals are fundamentally unethical and immoral. They are the worst kind of societal scum. If science doesn’t help line their pockets, they will oppose it. In opposing these forces, the scientific community has lowered its standards by embracing concepts that are unscientific. Key among these is the concept of consensus as de facto proof.

Consensus is not proof, but it plays a thorny role in the scientific process (i.e., peer review). Sometimes the consensus view is not correct. Most of the time it is correct, but on occasion it is wrong. Consensus is a reflection of agreement within the community and a statement about what the best technical judgment is (at the moment), but not proof. The problem is that the public is being sold on the idea that consensus is proof, or at least when it comes off that way, no one corrects them (aside from the skeptics). The awful thing about this is that the scientific community is then handing the moral high ground over to the skeptics, even the scummy hired guns. Consensus is simply the critical judgment of the community that a given line of reasoning is favored given all of the evidence. All in all, I would add my name to that consensus while remaining critical of the science. It really doesn’t matter whether the consensus is 97%, 92% or 99%. These numbers are meaningless insofar as proof is concerned. It is agreement, and nothing more.

[Image: schematic of a global climate model]

Let’s talk about the scientific method and where climate science sits with respect to it. Proof comes through agreement with observation or experiment. This is the rub. Anthropogenic (man-made) climate change (global warming) is a hypothesis. Observations seem to be unambiguous regarding warming: it is occurring, and its magnitude and rate of increase are unprecedented. Arguments about the recent pause in warming are largely irrelevant to this aspect of the discussion. The observation is also highly correlated with man’s industrial activity, primarily seen through increased CO2 concentrations in the atmosphere. It is worth noting that correlation does not imply causation; however, we do know unequivocally that CO2 causes warming via the greenhouse effect. Thus the warming is correlated with a known causal effect.
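A back-of-the-envelope number makes the causal link concrete (a standard result worth noting here): the widely used simplified expression for the radiative forcing from CO2, due to Myhre et al. (1998), is ΔF = 5.35 ln(C/C0) W/m², where C is the CO2 concentration and C0 a pre-industrial reference. A doubling of CO2 therefore yields about 5.35 × ln 2 ≈ 3.7 W/m² of forcing before any feedbacks act.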

Nonetheless, the hypothesis testing for the man-made basis of the warming comes via modeling the climate on supercomputers. These models consistently show that the warming is anthropogenic, but the proof via modeling is far from definitive despite being rather convincing. The relatively larger amount of warming in the Polar Regions is important because the models predict it. The loss of polar ice and the melting of permafrost are other important observations that back up the credibility of the models. It is the texture of this discussion that we are missing. The modeling needs to be better, especially with regard to studying the sensitivity and uncertainty of the models. To put it bluntly, this work is not up to scratch, and the effort, dialog and discussion of these matters is not presently productive. Part of the problem is a general unwillingness to admit the flaws in the work publicly because of how it would empower skeptics. The scumbag side of the skeptics has no interest in improving the science, and would use this honesty against climate science. This is the start of the toxic spiral, because an important aspect of science has been short-circuited by the nature of the dialog. Self-criticism in the climate community is not as sharp, nor as open, as it needs to be. The quality of the science will suffer, or has already suffered, as a direct consequence.

[Image: schematic of a global atmospheric model]

The modeling community needs to be acutely focused on doing better. It is true that there is vigorous debate inside the climate science community on many aspects of the modeling, but some topics are less open. The big issue is the nature of the projections into the future: are they predictive? And if so, how predictive? How does one grapple with the question? Key to answering these big questions is how calibration enters into the models. What is this calibration doing from a modeling point of view? Generally, the modeling does not produce a useful result without gross calibration, but at what cost?

Again, we get to the heart of the problem. Critics and skeptics are essential to progress, and the climate community seems interested in silencing skeptics, at least publicly. They are systematically overemphasizing the surety of their work. They are not willing to admit the imperfections in their work openly. For example, the uncertainty in the projections of the global climate is derived from the trajectories of a host of climate models. This is not uncertainty; it is simply a model-based voting scheme, and none of the models has any assurance of correctness. Each of these models has an innate uncertainty associated with the model itself, its numerical solution, and other modeling imperfections. The issue not explored is the nature of these models’ intrinsic bias, and its impact on the projections. This entire topic needs a substantially better scientific treatment. The current practice is an old-fashioned way of exploring the issue rather than a reflection of modern computational science. By not providing a path to better methods we are not doing our best. This is basically ceding the debate to the deniers, and empowering the status quo for decades.
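A toy sketch of why ensemble “voting” is not uncertainty: if every model in an ensemble shares a common bias, the spread of the ensemble says nothing about that bias. The numbers below are invented for illustration, not climate data.

# A toy sketch (invented numbers, not climate data) of why ensemble spread
# is not uncertainty: a bias shared by every model is invisible to the spread.
import numpy as np

rng = np.random.default_rng(0)
truth = 3.0        # the (unknown) true quantity
shared_bias = 0.8  # model-form error common to all models
n_models = 20

# Each "model" = truth + shared bias + its own idiosyncratic error.
projections = truth + shared_bias + rng.normal(0.0, 0.3, n_models)

spread = projections.std()                    # what model "voting" reports
true_error = abs(projections.mean() - truth)  # the actual ensemble bias

print(f"ensemble spread (reported uncertainty): {spread:.2f}")
print(f"actual error of the ensemble mean:      {true_error:.2f}")

The spread (roughly 0.3 here) is silent about the shared bias (0.8); only independent characterization of model-form error, numerical error and calibration can expose it.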

[Image: solar power plant, Brookhaven National Laboratory, 2012]

Let’s get to the heart of where climate scientists really start to damage their work’s significance. Whenever climate science aligns itself with the left wing of the environmental movement, the general public acceptance of the issue is harmed. Given the science, it is reasonable to suggest that a carbon-neutral policy be pursued; however, when nuclear power is rejected, the community goes too far. The scientific evidence clearly points toward reducing the production of energy via carbon-based sources as much as possible. Nuclear power is probably the single greatest hope for greatly reducing carbon emissions without wrecking the economy. Solar, wind and other energy sources have their place, but they cannot replace base capacity (today). Nuclear power can do this right now. Solving the energy production issue without carbon is an immense political and technical problem, but it is out of scope for the climate community. Advocating an energy path directly hurts their ability to provide the impact their science needs.

When the advocacy begins to tread into the areas of economics and equality (inequality), it has definitively crossed over into the political realm. People who use global warming to push these issues are a direct threat to the legitimacy of the whole field. This isn’t to say that inequality isn’t a legitimate issue; it just isn’t at the core of climate change. By coupling the two issues so closely, they simply equate themselves with the skeptics who ally themselves with carbon-spewing oligarchs. Neither extreme has a place in the debate over whether global warming is occurring, whether it is due to man-made effects, or whether it is a threat. The issues are important, but decoupled. Coupling them creates the toxic blend we have today.

[Image: nuclear power plant, Germany, photo by Björn Schwarz, 2010]

The consensus issue is a problem for science in general. Science is about truth, and we don’t vote for what is right. If 97 percent of scientists believe something, it doesn’t mean a damn thing. They could be completely wrong. Science does not work through consensus; it works through evidence.

Let’s talk about what the evidence says. The Earth is warming, and warming at a rate that is unprecedented in the natural climate record. Something very dangerous to every inhabitant of the planet is happening. Why it is happening is the issue. The working hypothesis is greenhouse gases, and that is where modeling comes in. Just because all the models seem to agree with the hypothesis does not make it the truth. This focus on consensus as proof is hurting the scientific community in every field because it poisons the public perception of science.

[Image: wind farm, photo by Jeff Hayes, 2008]

To be clear, I’m not saying that I don’t believe in anthropogenic global warming; I do. I believe that the combined effect of man’s activity in burning carbon-based fuel, agriculture and deforestation is driving climate change. This is a hypothesis. It is a compelling scientific argument that makes logical sense and fits the observations. I just cannot prove it. I don’t think climate science has proven it either. The greenhouse effect, along with other human activity, is the leading contender for the observed warming. While not proven, the evidence is strong enough that National and International policy should be directed toward mitigating the activities that are most likely causing it.

What is proven is that the Earth is warming up a lot, and that is almost certainly a very bad thing. By very bad I mean that millions if not billions of people will lose their lives as a result. This ought to spur action. The people paying the majority of the skeptics don’t care; there is too much money to be made.

Something I am an expert on is modeling. I’m also an expert in modeling credibility. The approach that the climate community (IPCC) has taken to demonstrating credibility is very problematic. Basically the models are voting for outcomes, and in this way it perversely shadows the consensus argument. They are not actually providing any credible view of their uncertainty or accuracy. In the end this fails to provide the sort of clear guidance needed to improve the modeling. The whole of their approach does not reflect the best in computational science. Despite this criticism, I’d take their models as being a reasonable reflection of the Earth’s response. I’m questioning the overall quality of their approach and evidence.

[Image: James Hansen, 2012]

The end result is a climate science community that has played to the lowest common denominator. We need great science to study this issue. Instead we have devolved into a mindless shouting match that basically hands victory to those who would have us do nothing. This is a tragedy because climate change is an existential threat to our species.

Ultimately, science and technology must be healthy if we hope to deal with the consequences of global warming. Transportation will need a dramatic overhaul. We need to produce new options for generating energy in an economically viable manner. Geoengineering may be needed to mitigate the dumping of carbon into the atmosphere. Biological and agricultural sciences are needed to provide relief from the impacts of the warming. If the ability to progress via science is damaged by our collective dysfunction, the ability to respond in a healthy way to the warming will be harmed.

What would make the whole situation better? If you believe global warming is happening because of the evidence, and that the hypothesis of human causation is the best explanation, then work to make the underlying science better by being critical of every weakness. If someone supports this belief because of their worldview and not the science, treat them with suspicion. Some people support climate change as an issue because it empowers their political goals (like radical environmentalism, Marxism, etc.), not because the science says it is a problem. People with this view hurt the ability of society to deal forthrightly with the issue. If, on the other hand, you don’t believe global warming is happening and/or that humans are causing it, then work to make the science better by being purposefully critical of what undermines your belief. As a skeptic, also be critical of people who reject these things because of their worldview. If you don’t, then you’ll be put into the same camp as the religious zealots and greedy oligarchs who work to undermine the legitimacy of any skepticism of the science. In the end, the people who choose their answer to global warming based on worldview aren’t interested in the truth; they are interested in winning, everyone else be damned.

The quote from “Full Metal Jacket” applies to both the issue of climate change and the harm done through the nature of the public dialog. Science is being damaged deeply by that dialog. Those trying to undermine any response to climate change aren’t the only ones doing the damage; those who rightly call themselves scientists are causing harm too. As a direct result, no one on either side is going to win; we are all going to lose.
