GIGO, for "garbage in, garbage out," is a basic principle of computing and decision-making which holds that the validity or integrity of the output can be no better than the validity or integrity of the input. That is why first-year computer science students are taught to check and recheck their input data and assumptions. It is not unreasonable, therefore, to expect the same of seasoned scientists with multiple letters after their names, utilizing some of the most sophisticated and expensive computers and operating out of prestigious universities and laboratories. It is especially reasonable when taxpayers are underwriting their work and the studies produced by their computer models are the basis for far-reaching public policies that will dramatically impact those taxpayers, as well as all of society.
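The "check your input" habit those first-year students learn can be illustrated with a toy sketch: a routine that refuses garbage before it can contaminate a result. The function name and the plausibility bounds here are hypothetical, chosen only to make the point.

```python
def mean_temperature(readings):
    """Average a list of surface temperature readings (degrees C),
    rejecting garbage input before it can propagate into the output.
    Hypothetical example; bounds roughly span recorded surface extremes."""
    if not readings:
        raise ValueError("no data supplied")
    for r in readings:
        if not isinstance(r, (int, float)):
            raise TypeError(f"non-numeric reading: {r!r}")
        if not -90.0 <= r <= 60.0:  # implausible for a surface station
            raise ValueError(f"implausible reading: {r}")
    return sum(readings) / len(readings)
```

Feed it `[10.0, 20.0]` and it returns `15.0`; feed it a sensor glitch like `999.0` and it raises an error instead of silently skewing the average, which is the whole of the GIGO lesson.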
However, when it comes to the theory of anthropogenic (human-caused) global warming, or AGW, the GIGO principle appears to be the norm. The so-called mainstream media (MSM) never seem to tire of headlining scary scenarios of climate catastrophe brought on by AGW, based on the latest projections generated by computer modeling of atmospheric temperatures, ocean temperatures, sea levels, glaciers, rainfall, extreme storms, and the like. The same media organs, however, rarely report on the many scientific studies that regularly debunk the schlocky, and often outright fraudulent, computer models.
The Hockey Schtick blogspot reported on December 10 that a new paper published in the Journal of Climate finds there has been "little to no improvement" in simulating clouds by state-of-the-art climate models. The authors note the "poor performance of current global climate models in simulating realistic [clouds]," and that the models show "quite large biases ... as well as a remarkable degree of variation" with the differences between models remaining "large."
But once the storm made landfall, the outage rate jumped to 0.43 percent and took about four days to return to normal, according to a new report by scientists at the Information Sciences Institute (ISI) at the University of Southern California Viterbi School of Engineering.
“On a national scale, the amount of outage is small, showing how robust the Internet is. However, this significant increase in outages shows the large impact Sandy had on our national infrastructure,” says John Heidemann, who led the team that tracked and analyzed the data. Heidemann is a research professor of computer science and project leader in the ISI’s computer networks division.
Heidemann worked with graduate student Lin Quan and research staff member Yuri Pradkin, sending tiny packets of data known as “pings” to networks and waiting for “echoes,” or responses. Though some networks—those with a firewall—will not respond to pings, this method has been shown to provide a statistically reasonable picture of when parts of the Internet are active or down.
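The sampling idea behind that method can be sketched in a few lines: probe a set of addresses, count the silent ones, and report the unresponsive fraction. This is a toy illustration of the statistical approach described above, not ISI's actual tooling; the address names and the stubbed `probe` function are hypothetical stand-ins for real ICMP echo requests.

```python
def probe(address, responsive):
    """Stand-in for an ICMP echo ("ping"): True if the address answers.
    A real probe would send a packet and wait for the echo; firewalled
    hosts never answer, which is why the measure is only statistical."""
    return address in responsive

def outage_rate(addresses, responsive):
    """Fraction of probed addresses that did not respond."""
    down = sum(1 for a in addresses if not probe(a, responsive))
    return down / len(addresses)

# Hypothetical survey: 1,000 networks, 4 of which fell silent post-storm.
addresses = [f"net-{i}" for i in range(1000)]
silent_after_storm = {"net-7", "net-42", "net-300", "net-999"}
responsive = set(addresses) - silent_after_storm

print(outage_rate(addresses, responsive))  # 0.004
```

Repeating such a sweep at intervals, and geolocating the addresses that stop echoing, is in essence how a spike in outages can be both measured and mapped.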
The team, which was also able to pinpoint where the outages were occurring, noted a spike in outages in New Jersey and New York after Sandy made landfall.
Their research is published as a technical report on the ISI webpage, and the raw data will be made available to other scientists who would like to analyze it.
The data is not yet specific enough to say exactly how many individuals were affected by the outage, but it does provide solid information about the scale and location of outages, which could inform Internet service providers on how best to allocate resources to respond to natural disasters.
“Our work measures the virtual world to peer into the physical,” Heidemann says. “We are working to improve the coverage of our techniques to provide a nearly real-time view of outages across the entire Internet. We hope that our approach can help first responders quickly understand the scope of evolving natural disasters.”
The results of Dr. Mörner's research are especially relevant to assessing the claims of climate modelers that the survival of island nations such as the Maldives and Tuvalu, and of low-lying coastal areas in developing nations such as India and Bangladesh, is being threatened by rising sea levels due to AGW from emissions of the "rich countries." The phony climate models projecting catastrophic sea-level rises are then used at UN climate summits, such as those at Copenhagen, Cancun, Durban, Rio, and the recently concluded Doha summit, to call for carbon taxes and "loss and damages" payments to the "threatened" nations, in the interest of "climate justice."
As Prof. Mörner charges, "sea-level gate" is indeed a grave scandal, showing widespread unethical practices and serious perversion of science. However, "sea-level gate" is just one of a multitude of scandals, collectively known as Climategate (see here, here, and here), nearly all of which employ computer-modeling chicanery to craft wild scenarios (which invariably are contradicted by real-world observations and verifiable historical data) to promote an agenda of empowering governments at the local, national, and international levels to deal with the fabricated "crises."
In a July 10, 2012 op-ed column for the Australian journal Quadrant, Professor Cliff Ollier of the School of Earth and Environment at the University of Western Australia took aim at the dangerous practice of allowing unvetted and unreviewed computer models to determine policies in the name of “science.”