I'm doing some research for a paper on "Property Rights and Contract Enforcement in the Post-Zombie Apocalypse." Seriously.
In the process of trying to learn more about zombies, I ran across this interview with Max Brooks, in which he says,
I'm old enough to have experienced major paradigm shifts in many areas. And so I wonder, which theoretical models make predictions most in line with future events?
Climate Change Models?
And for each model, there have been people who claim "It's settled science!"
Authoritarianism, always latent in progressivism, is becoming explicit. Progressivism’s determination to regulate thought by regulating speech is apparent in the campaign by 16 states’ attorneys general and those of the District of Columbia and the Virgin Islands, none Republican, to criminalize skepticism about the supposedly “settled” conclusions of climate science. ...
“The debate is settled,” says Obama. “Climate change is a fact.” Indeed. The epithet “climate change deniers,” obviously coined to stigmatize skeptics as akin to Holocaust deniers, is designed to obscure something obvious: Of course the climate is changing; it never is not changing — neither before nor after the Medieval Warm Period (end of the 9th century to the 13th century) and the Little Ice Age (1640s to 1690s), neither of which was caused by fossil fuels. ...
And of course, it's all for our own good.
According to the rezoning proposals rubber-stamped by London City Council, the proposed 8-story and 28-story buildings at 50 King Street would have little or no impact on the heritage Middlesex County Courthouse.
I took some photos today at 11am, showing the shadows cast by the current 3-story building at 50 King. I also took some photos of the shadows cast by the Renaissance Tower on King Street. The shadow from the Renaissance Tower stretches all the way from the south side of King Street north to the north side of Dundas, more than a block in length. Here is a photo of the north Renaissance Tower, 11am, Feb 18, 2016, casting its shadow onto the Budweiser Gardens.
And here is a photo of that shadow across the Bud Gardens (located directly east of the heritage Middlesex County Courthouse):
And here is a photo taken at the same time, showing the reach of the shadow from The Renaissance all the way up to and onto Dundas Street, more than a block north of the 20-story Renaissance Tower.
Quite clearly, a 28-story building located on the north side of King Street, slightly to the west of the current building at 50 King, would cover the heritage Middlesex County Courthouse in shadows much of the day, even in the summer when shadows are shorter.
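For anyone who wants to check the geometry rather than rely on photos: a building of height H casts a shadow of length H / tan(solar elevation). Here is a minimal sketch, assuming roughly 3.5 metres per storey and typical noon solar elevations for London, Ontario (about 24 degrees at the winter solstice, about 69 degrees at the summer solstice). Both the storey height and the elevation angles are my illustrative assumptions, not figures from the proposal:

```python
import math

def shadow_length(height_m, solar_elevation_deg):
    """Length of the shadow cast by a vertical structure of the given height."""
    return height_m / math.tan(math.radians(solar_elevation_deg))

STOREY_M = 3.5              # assumed metres per storey (illustrative)
tower_height = 28 * STOREY_M  # 98 m for the proposed tower

# Approximate noon solar elevations for ~43 degrees N (London, ON)
print(round(shadow_length(tower_height, 24)))  # winter noon: ~220 m
print(round(shadow_length(tower_height, 69)))  # summer noon: ~38 m
```

Even the summer-noon figure of roughly 38 metres is longer than the distance across many downtown lots, which is consistent with the shadows in the photos above.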
But that's not all.
The other portion of the proposal includes an 8-story building where there is presently a 3-story building. Even this lower building will cast shadows that reach the heritage Middlesex County Courthouse much of the time. Here are some photos I took at the same time. These show the shadow cast by the present 3-story building. You can see the shadows reach halfway or more to the heritage Middlesex County Courthouse.
An 8-story building, especially the planned building which would be even closer to the heritage Middlesex County Courthouse, would cast shadows that would reach the heritage building some of the time and would cover the space between the two buildings most of the time. Putting an atrium between the 8-story building and the 28-story tower would not provide much relief from these shadows.
This overshadowing will have a strong, negative effect on the heritage value of the Middlesex County Courthouse, but it will also greatly darken the space between the proposed building at 50 King and the heritage Middlesex County Courthouse. The entire heritage value of the property associated with the heritage block will be severely diminished.
These heritage issues need to be addressed by the London Heritage Advisory Committee and need to be considered by the Ontario Municipal Board.
Happy Valentine's Day!!!
Previous examples of my snow stomp art:
Bryan Caplan has a very interesting and very provocative post at Econlog challenging the standard, typical medical classifications relating to mental illnesses in general and to ADHD in particular. I have come to respect Caplan's work, and so I never dismiss anything he writes without giving it careful consideration. His material in this post seems generally right to me. Two telling paragraphs about ADHD:
Overall, the most natural way to formalize ADHD in economic terms is as a high disutility of work combined with a strong taste for variety. Undoubtedly, a person who dislikes working will be more likely to fail to 'finish school work, chores or duties in the workplace' and be 'reluctant to engage in tasks that require sustained mental effort'. [see chart below] Similarly, a person with a strong taste for variety will be 'easily distracted by extraneous stimuli' and fail to 'listen when spoken to directly', especially since the ignored voices demand attention out of proportion to their entertainment value. ...
As the DSM uses the term, a person who 'has difficulty' 'sustaining attention in tasks or play activities' could just as easily be described as 'disliking' sustaining attention. Similarly, while 'is often forgetful in daily activities' could be interpreted literally as impaired memory, in context it refers primarily to conveniently forgetting to do things you would rather avoid. No one accuses a boy diagnosed with ADHD of forgetting to play videogames.
Caplan re-presents a checklist to help professionals diagnose someone with ADHD. Here it is:
If this stuff had been around when I was young, I'd have been a drugged-out zombie. All nine of these applied to me.
When I was in Grade 2, the teacher wrote that I did good work when I did it, but that I rarely finished it. Also on behavioural items, I think I was given 13 minuses and only 3 pluses over one report-card period.
Also about that time, a woman who was visiting our home for dinner told my parents I should be put on drugs because I jiggled my legs so much.
So much of what is termed ADHD behaviour is better dealt with via behaviour training. Thank goodness my parents didn't put me on drugs. Instead, I had to learn to cope and adjust in some settings.
As one of my FB friends posted yesterday on a completely different (yet identical?) topic,
"You think too much because there's work that you don't want to do." - Andy Warhol's advice to Lou Reed.
I have railed relentlessly in the past about "Storm Porn" and about how forecasters and mediots so often focus on worst-case scenarios --- forecasters because they don't want to be held responsible if things turn out to be worse than forecast [someone called it CYA forecasting]; mediots because drama sells and pumps up ratings. [see this, this, and this]
For the storm this weekend that hit the eastern U.S., forecasters got it wrong on the low side, though. From the NYTimes, this is the map of how much snow was being forecast on Friday afternoon.
And this is how much actually fell over Friday and Saturday:
That storm was MUCH worse than anticipated. And now these places have to figure out how to remove so much snow and then where to put it all!
Suddenly, beginning about a month ago, I am no longer able to take videos that have a small file size. I used to be able to shoot a couple of minutes and the file would be only 5-10 MB. But now I can't seem to find any way to take videos that are anything less than one MB per second! What happened? How can I reset something on my frickn iPhone 6+ to get back to taking smaller video files?
I don't want to take big ones and compress them. I don't want to email large video files. I just want to shoot and upload brief (one-two minutes) videos that are small.
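For what it's worth, the file sizes described above line up with simple bitrate arithmetic: file size is roughly bitrate times duration. A quick sketch (the bitrates below are illustrative guesses, not Apple's published figures):

```python
def video_size_mb(bitrate_mbps, seconds):
    """Approximate file size in megabytes: bitrate (megabits/s) * duration / 8."""
    return bitrate_mbps * seconds / 8

# Two minutes at a high-quality bitrate of ~8 Mbit/s (illustrative):
print(video_size_mb(8, 120))    # 120.0 MB -- about 1 MB per second
# Two minutes at a modest ~0.5 Mbit/s (illustrative):
print(video_size_mb(0.5, 120))  # 7.5 MB -- in the old 5-10 MB range
```

So a jump from 5-10 MB to 1 MB per second suggests the phone's recording bitrate went up by an order of magnitude, perhaps via a resolution or frame-rate setting.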
Helpful suggestions will be greatly appreciated. Thanks.
According to TWN [The Weather Network, also referred to as Trono Weather Network by friends in the west], the temperature will immediately rise by two degrees tomorrow night at midnight here in London, ON:
The forecast high for tomorrow is 4C, and the forecast low for Wednesday is 6C. These numbers imply a sudden two-degree increase in temperature during the hour at midnight Tuesday. I suppose that's possible, but I wonder whether their weather models may need to be adjusted to allow for consistency and trends.
from Popular Science:
#8, if true, would please many. Please let it be correct.
In 1907, famed psychologist William James claimed, “We are making use of only a small part of our possible mental and physical resources.” A journalist later misquoted him as saying the average person develops only 10 percent of his mental capacity. Scans, however, show that we use every part of our brain, though not all regions are active at once. (Sorry, Morgan.) That’s why damage to any area of the brain—such as the aftermath of a stroke—usually results in mental and behavioral effects.
The state of Georgia began distributing classical-music CDs to the families of newborns in 1998. Each CD included a message from the governor: “I hope both you and your baby enjoy it—and that your little one will get off to a smart start.” While the sentiment is appealing, the so-called Mozart Effect is dubious. The idea sprang from a 1993 study at the University of California at Irvine, which showed that 36 college students performed better on an IQ test after listening to Mozart than after relaxation exercises or silence. No one has been able to replicate those results. In fact, a 1999 Harvard University review of 16 similar studies concluded the Mozart Effect isn’t real.
Adult rats, rabbits, and even birds can grow new neurons, but for 130 years, scientists failed to identify new brain-cell growth in adult humans. That all changed in 1998, when a Swedish team showed that new brain cells form in the hippocampus, a structure involved in storing memories. Then, in 2014, a team at the Karolinska Institute in Sweden measured traces of carbon-14 in DNA as a way to date the age of cells, and confirmed that the striatum, a region involved in motor control and cognition, also produces new neurons throughout life. While our brains aren’t exactly an orgy of wildly replicating cells, they do constantly regenerate.
There are small anatomical differences between male and female brains, this much is certain. The hippocampus, involved in memory, is usually larger in women, while the amygdala, involved in emotion, is larger in men. (The opposite of what you’d expect from this myth.) But evidence suggests gender disparities are due to cultural expectations, not biology. For example, in 1999, social psychologists at the University of Waterloo in Ontario gave women and men a difficult math test. Women—even those with strong math backgrounds—scored lower than men, unless told the test had revealed no gender differences in the past. Then the women performed equally well as the men.
In the movies, comas look harmless: A well-groomed patient lies in bed for a few months and wakes fully articulate, seemingly unscathed by his or her ordeal. In real life, those emerging from comas often suffer disabilities and need rehabilitation. Brain scans point to why. Scientists at the French National Center for Scientific Research, in 2012, found that high-traffic brain regions—normally bright hubs of activity, even during sleep—are eerily dark in coma patients (while other areas inexplicably light up). Most comas also don’t last more than two to four weeks. So don’t believe everything (or anything) you see on Grey’s Anatomy.
If you’ve ever despaired at the Sunday crossword, here’s good news: Neuroscientists have found that doing crossword puzzles makes you very good at—drumroll, please—doing crossword puzzles. A 2011 study, led by researchers at the Albert Einstein College of Medicine, found that solving crossword puzzles initially delayed the onset of memory decline in individuals between the ages of 75 and 85, but sped the decline (for reasons unknown) once a person showed signs of dementia. Today, most neuroscientists agree there is no harm in the activity. But don’t expect it to make you any better at finding your keys come Monday morning.
Ever asserted that you need lessons delivered visually or verbally? We hate to break it to you, but there’s just no support for that. In 2006, psychologists at the University of California at Santa Barbara found that students didn’t perform any better on a test when given instructions in their preferred style. And a 2009 review paper found no studies upholding the claim—popular among both educators and students—that teaching and learning styles should match. That said, there are broad principles under which everyone seems to learn better, such as through repetition, testing, and by spacing out learning sessions.
That woozy feeling you get after three or four glasses of wine isn’t from brain cells expiring. When scientists at the Bartholin Institute in Denmark compared the brains of deceased alcoholics and nonalcoholics, they found the total number of neurons to be the same. Alcohol, like other substances, can kill brain cells at high doses (especially the sensitive brain cells of developing fetuses), but moderate alcohol use does not. It does interfere with how neurons communicate, affecting one’s ability to perform tasks like walking, speaking, and making decisions. But you already knew that.
Extrasensory perception (ESP), the so-called sixth sense, can be traced back to an experiment in the 1930s. Joseph Banks Rhine, a botanist at Duke University, claimed that individuals who were shown the blank face of a card could correctly guess a shape printed on the back (supposedly by reading the mind of the person administering the test). Although no other type of test has produced evidence for ESP, the myth lives on—thanks in part to the CIA, which employed psychic spies during the Cold War. The spymasters shut down their psychic network in 1995, when they finally concluded ESP isn’t a weapon—or even a thing.
In the 1960s, Roger Sperry, a neuropsychologist at the California Institute of Technology, cut fibers connecting the brain’s two hemispheres in a handful of epilepsy patients to reduce or eliminate their seizures. He then ran an experiment, flashing images—of letters, lights, and other stimuli—into either the left or right eye of the patients. Sperry found that the brain’s left hemisphere better processed verbal information and the right hemisphere, visual and spatial. Over decades, those findings became misinterpreted as dominance, particularly in self-help books. There is no evidence to support personality types based on dominant hemispheres, but there’s plenty of evidence to refute it: In 2012, for example, psychologists at the University of British Columbia found that creative thinking activates a widespread neural network without favoring either side of the brain.
For a good time, listen to radio AM980 from London at 1pm today. Here is the link to their netcast.
I gather there were some interesting confrontations during the interview Lawton conducted with Suzuki. It was supposed to be a press conference but only Andrew showed up!! ;-)
What if they gave a press conference and nobody came?
[disclaimer: I have known Andrew, though we are not close friends, for several years.]
I have been thinking about posting about this for some time, but have been reluctant. I may lose some FB friends over this, but here goes:
For those who don't know, I was born and raised in the USA.
When the US changed its pledge of allegiance to the flag in 1954 to add the prepositional phrase "under God", as a very young student I was confused and I think more than a bit disappointed. I saw no reason to add that phrase. I was, at the time, being raised in a solid Christian family, but it was in the Congregational church, which had a somewhat liberal (?) view of theology. I had no idea what I believed or didn't believe theologically, but believe me I revealed these doubts very rarely.
I saw then, and I see now, absolutely no reason for that phrase ("Under God") to have been added to the Pledge of Allegiance. I had a somewhat negative reaction when it was added, even at the tender age I was (grade school?) at the time. My feeling then, as I recall, somewhat vaguely, was "Why add that? It doesn't matter, and (believe it or not, I think I had this thought) I thought we had separation of church and state in the United States."
And only recently did I realize the US had replaced "E pluribus unum" on its coins. Good grief. "E pluribus unum" is a wonderful statement about the history of the USA. The replacement "In God we trust" seems so different. It denies the history of the US (albeit indirectly) and borders on turning the US into a theocracy (heaven forbid! [incongruity intended]).
What prompted this post? Earlier today I read yet another Facebook posting about the pledge of allegiance to the US flag (which strikes me as idolatry that Moses would have discouraged). Here it is:
My reaction? Yea Pepsi! If this is correct, I may have to switch from Coke Zero to whatever Pepsi sells.
Discovering that some 16-ounce glass mugs we had recently purchased were not quite as big as we would like, Ms Eclectic and I were delighted to discover these 17-ounce glass mugs on Amazon.
It turns out they aren't really much, if any, bigger than the 16-ounce mugs we already had.
Q. How could 17-ounces and 16-ounces be approximately the same thing?
A. If the 17 ounces are imperial fluid ounces, and the 16-ounces are US fluid ounces.
Roughly, 17 imperial fluid ounces equal 16.333 US fluid ounces.
No foolin'. They are different. See this from Wikipaedia.
An imperial fluid ounce is ... approximately 28.4 ml.
A US fluid ounce is ... approximately 29.6 ml. ...
1 imperial ounce =~ 0.960759940 US fluid ounces
In fact I once knew this difference, after winning an argument over 40 years ago about the comparative sizes of US vs Imperial gallons. I just didn't expect the difference to appear here, and I guess I should have.
From now on, I'll try to stick to millilitres.
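The arithmetic is easy to check. A quick sketch using the millilitre definitions quoted above from Wikipaedia:

```python
IMP_FL_OZ_ML = 28.4130625     # imperial fluid ounce in millilitres
US_FL_OZ_ML = 29.5735295625   # US fluid ounce in millilitres

mug_17_imp = 17 * IMP_FL_OZ_ML  # ~483.0 mL
mug_16_us = 16 * US_FL_OZ_ML    # ~473.2 mL

# The "17-ounce" mug expressed in US fluid ounces:
print(round(mug_17_imp / US_FL_OZ_ML, 3))  # 16.333
# The actual difference between the two mugs:
print(round(mug_17_imp - mug_16_us, 1))    # 9.8 mL -- about 2 percent
```

A difference of under 10 mL is roughly two teaspoons, which explains why the new mugs looked no bigger.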
According to this article, the universe as we know it is dying [h/t Jack]. Stars are burning out and energy is being dispersed.
JR (my favourite drug dealer) added (with less whimsy than it might initially seem),
[Our] universe is expanding, communicating with other universes ([via] black holes), and who knows, it might even procreate by fission or budding or by exchanging universal fluids with another universe one day: this sounds like living more than dying.

What would Jonathan Livingston Seagull do?
First, traditionally "organic" meant chemical compounds with carbon in them. All living things, plants and animals, have carbon in them, and so all plant and animal food is "organic" in that sense.
So what about health? The main issue tends to focus on the ‘evils’ of pesticide residues. The problem here is that although pesticides can harm in large doses, there is no evidence that they harm at the minute quantities left on foods. As Dick Taverne points out in his book,
In fact every mouthful of food contains some poison, as does every sip of water. ‘Carcinogenic’ substances are routinely consumed by all of us in the form of natural chemicals made by plants to repel predators, but at amounts so low they do not harm us. … There are some dioxins in every breath of air we take...
If there is little basis in fact for the claims made by the organic movement then it looks like the word organic is just one more advertising word used to push expensive, unnecessary products on us. Furthermore, and more damning, by focusing on organic production, our society pays less attention to farming methods and technology advances that really could improve health, protect wildlife and ensure a consistent quality and quantity of food supply. Rather than securing our health, the illogical worship of the word ‘organic’ could be damaging us all.
CBC has some information on how to maximize your chances of seeing falling stars during the Perseid meteor shower over the next few nights.
Also, see this.
Despite the fact that the odds are stacked against us, several years ago Ms. Eclectic and I sat on the balcony for a couple of hours one evening, and we both saw several shooting stars. I was thrilled because I had never seen any before.
So maybe if it isn't cloudy, rainy, or too cold on the night of the 12th, I'll consider sleeping on the balcony.
This week, warnings of an impending “mini ice age,” set to hit in the 2030s, have been circulating in the media. ...
The ice age idea got rolling last week when researcher Valentina Zharkova, a professor of mathematics at Northumbria University in England, presented some of her recent research into solar variations at the Royal Astronomical Society’s National Astronomy Meeting in Wales. The presentation was based on a study ... which presented a technique for understanding variations in solar radiation and made some predictions about how this radiation will change in the near future. Most notably, the research predicts that between 2030 and 2040, solar activity should drop significantly, leading to a condition known as a “solar minimum.” ...
According to the research, solar activity at this time should resemble conditions last seen in the mid-1700s during a period of low solar radiation known as the “Maunder Minimum.” The interesting thing about this period was that it coincided with a “little ice age” in Europe and North America — a time marked by unusually cold temperatures and bitter winters. Now that Zharkova and her colleagues are predicting another solar minimum coming up, media coverage has jumped on the idea that a modern “mini ice age” is in store.
Michael Mann, a leading proponent of concern about AGW (and whose work has been seriously criticized [see this]), doesn't buy it, but his only apparent explanation is
As far as the solar variations go, “The effect is a drop in the bucket, a barely detectable blip, on the overall warming trajectory we can expect over the next several decades from greenhouse warming,” said Michael Mann, distinguished professor of meteorology at Pennsylvania State University, in an e-mail to The Washington Post.
The article points out that Zharkova refuses to go on the record scientifically as to whether her predicted mini-ice-age would have much of an impact on the earth's temperature. It does add, however, that
On the one hand, Zharkova maintains that her research was not intended to make assumptions about the effects of solar variation on climate — only to lay out predictions about the solar activity itself. “What will happen in the modern Maunder Minimum we do not know yet and can only speculate,” she says. On the other hand, she adds, her gut assumption is that temperatures will drop as they did 370 years ago.
There is a conference about climate change going on in Trono these days. It is filled with people who have lots of rhetoric and considerable disdain for the scientific process. Matt Ridley explains why. His article is quite lengthy, but it is worth taking some time to read it.
[I]nch by inch, the huge green pressure groups have grown fat on a diet of constant but ever-changing alarm about the future. That these alarms—over population growth, pesticides, rain forests, acid rain, ozone holes, sperm counts, genetically modified crops—have often proved wildly exaggerated does not matter: the organisations that did the most exaggeration trousered the most money. In the case of climate, the alarm is always in the distant future, so can never be debunked.
These huge green multinationals, with budgets in the hundreds of millions of dollars, have now systematically infiltrated science, as well as industry and media, with the result that many high-profile climate scientists and the journalists who cover them have become one-sided cheerleaders for alarm, while a hit squad of increasingly vicious bloggers polices the debate to ensure that anybody who steps out of line is punished. They insist on stamping out all mention of the heresy that climate change might not be lethally dangerous.
Today’s climate science...is based on a “pre-ordained conclusion, huge bodies of evidence are ignored and analytical procedures are treated as evidence”. Funds are not available to investigate alternative theories. Those who express even the mildest doubts about dangerous climate change are ostracised, accused of being in the pay of fossil-fuel interests or starved of funds; those who take money from green pressure groups and make wildly exaggerated statements are showered with rewards and treated by the media as neutral.
Ridley goes on to provide numerous examples to support these assertions. He labels himself not as a denier and not as a skeptic; rather he refers to himself as a "lukewarmer", a label that probably suits Bjorn Lomborg as well:
This is the “lukewarmer” school, and I am happy to put myself in this category. Lukewarmers do not think dangerous climate change is impossible; but they think it is unlikely.
Barack Obama says that 97 per cent of scientists agree that climate change is “real, man-made and dangerous”. That’s just a lie (or a very ignorant remark): as I point out above, there is no consensus that it’s dangerous.
So where’s the outrage from scientists at this presidential distortion? It’s worse than that, actually. The 97 per cent figure is derived from two pieces of pseudoscience that would have embarrassed a homeopath. The first was a poll that found that 97 per cent of just seventy-nine scientists thought climate change was man-made—not that it was dangerous. A more recent poll of 1854 members of the American Meteorological Society found the true number is 52 per cent.
The second source of the 97 per cent number was a survey of scientific papers, which has now been comprehensively demolished by Professor Richard Tol of Sussex University, who is probably the world’s leading climate economist. As the Australian blogger Joanne Nova summarised Tol’s findings, John Cook of the University of Queensland and his team used an unrepresentative sample, left out much useful data, used biased observers who disagreed with the authors of the papers they were classifying nearly two-thirds of the time, and collected and analysed the data in such a way as to allow the authors to adjust their preliminary conclusions as they went along, a scientific no-no if ever there was one. The data could not be replicated, and Cook himself threatened legal action to hide them. Yet neither the journal nor the university where Cook works has retracted the paper, and the scientific establishment refuses to stop citing it, let alone blow the whistle on it. Its conclusion is too useful.
None of this would matter if it was just scientific inquiry, though that rarely comes cheap in itself. The big difference is that these scientists who insist that we take their word for it, and who get cross if we don’t, are also asking us to make huge, expensive and risky changes to the world economy and to people’s livelihoods. They want us to spend a fortune getting emissions down as soon as possible. And they want us to do that even if it hurts poor people today, because, they say, their grandchildren (who, as Nigel Lawson points out, in The Facts, and their models assume, are going to be very wealthy) matter more.
Yet they are not prepared to debate the science behind their concern. That seems wrong to me.
As I said at the outset, Ridley's article is quite lengthy. The above quotes represent no more than about 5% of it. I've left out his examples, his more detailed arguments, and his references.
If you uncritically accept the pronouncements from the politicians at the Climate Conference in Trono, please, PLEASE read Ridley's piece. Even if you don't like it, even if you think he omits important information, even if you have a different set of subjective estimates of the probabilities he discusses, what he says in the article merits your attention.
Yes, the clocks are going to be reset Tuesday night to realign our clocks with sun time. This reset, which occurs now and then, according to this article, is called a "leap second".
But will the clocks be set ahead by one second? or will they be set back by one second?
The CBC headline reads,
Leap seconds: Why our clocks are being set ever-so-slightly ahead
But the subheadline and the article imply the clocks will be set back by one second:
Leap second being added around the world on June 30
and from the article,
Just before midnight Greenwich mean time on June 30, official timekeeping bodies around the world will add a single second — the so-called leap second — to the clock.
While the time-shift seems too infinitesimal to matter to the average person, there are very good reasons for it.
A leap second is an extra second that is added to an agreed-upon day every few years in order to keep Co-ordinated Universal Time (or UTC, the modern replacement for Greenwich mean time), the world standard for regulating clocks, in sync with Mean Solar Time, which marks the passage of time based on the sun's position in the sky.
It means that the last minute UTC of June 30 will actually be 61 seconds long.
Wouldn't setting the clock ahead subtract a second from our time? And wouldn't adding a second require that the clocks be set back, not forward, by one second?
My take on this is that to get 61 seconds into that minute, when the clock hits midnight, it will have to be set back one second so that it will strike midnight again one second later.
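Another way to see how the 61-second minute works: UTC doesn't really move anything backward or forward; it inserts an extra label, 23:59:60, between 23:59:59 and 00:00:00. A toy sketch of the labeling:

```python
def last_utc_minute(leap_second=True):
    """Second labels for the final UTC minute of June 30,
    with or without an inserted leap second."""
    seconds = list(range(60)) + ([60] if leap_second else [])
    return ["23:59:%02d" % s for s in seconds]

minute = last_utc_minute()
print(len(minute))  # 61 -- the minute really does contain 61 seconds
print(minute[-1])   # 23:59:60 -- the inserted leap second
```

On this reading, both "set ahead" and "set back" are slightly misleading: ordinary clocks that cannot display 23:59:60 typically repeat or stall a second instead, which is why it looks like being set back.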
I confess to having some confusion, and the aphorism, "spring forward, fall back" doesn't help here.
CBC: our tax dollars at work. The other media seem to understand this (see links below).
Update: CBC has changed the headline to read "back" now.
A few years ago in Canada, The Weather Network decided to stop reporting the Humidex under that name. Instead, they call it "Feels like", which seems in keeping with the general drift toward touchy-feely-ness of the times.
But from what I can tell, it is just the frickn humidex, showing a formulaic combination of humidity and temperature. [see this in Wikipaedia].
The humidex formula is as follows:
The humidity adjustment effectively amounts to one Fahrenheit degree for every millibar by which the partial pressure of water in the atmosphere exceeds 10 millibars.
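As I understand Environment Canada's method, the humidex combines the air temperature with the dew point via the vapour pressure. A sketch of the standard formula (treat the constants as my transcription, not gospel):

```python
import math

def humidex(temp_c, dewpoint_c):
    """Humidex = temperature + 0.5555 * (vapour pressure in hPa - 10)."""
    dewpoint_k = dewpoint_c + 273.15
    # Vapour pressure (hPa) computed from the dew point
    e = 6.11 * math.exp(5417.7530 * (1 / 273.16 - 1 / dewpoint_k))
    return temp_c + 0.5555 * (e - 10.0)

# 22C with a dew point around 20.5C gives a humidex near 30
print(round(humidex(22.0, 20.5)))  # 30
```

Note that wind appears nowhere in the formula, which is part of why a breezy 22C day can carry a humidex of 30 and yet feel nothing like 30.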
Yeah, ok. Whatever. I spent a few minutes this morning trying to figure this stuff out (superficially, of course) after seeing that the current temperature in London, Ontario, is 22C*, but according to The Weather Network [TWN]** it feels like 30C.
No way this feels like 30C. There is a good, strong wind, and if anything it "feels like" 18C.
So I went to the Environment Canada website. That site tells me the current temperature is 22C, with a humidex reading of 30C. That makes more sense. After all, the "feels like" that TWN uses is just Environment Canada's humidex, so far as I can tell. The humidex name makes it clear that humidity is involved and that it is a computed index number. "Feels like" misleadingly tells us what we should expect the weather to feel like.
I'd hate to go out today, expecting "feels like" 30C and dressed for that, only to find that it actually feels like 18C or even cooler in the shade.
But, as is in keeping with their condescending paternal/maternal-isms, TWN gives us a number with a name that is far less informative than "Humidex".
Note that the same thing happens in the winter when temperature and wind are combined to give us "wind chill", which (to the best of my ability to discern) omits humidity from its calculation. The term "wind chill" at least tells us that wind and temperature are involved, unlike "feels like", for (as everyone on the prairies alleges) a dry cold doesn't feel as cold as a more humid cold.
Part of the problem is trying to devise an index number that is useful to many people. We see this in economics all the time: GDP, CPI, etc. are all indices that try to measure, combine, and reflect useful information, just as "humidex" and "wind chill" do with temperature, humidity, and wind.
But my complaint here is not with the index. Rather it is with the term, "feels like".
*TWN, according to my friends in Saskatchewan, means Toronto Weather Network, just as TSN means Toronto Sports Network. They have the sense that these networks focus far too much attention on Trono and not enough on the west.
** For new visitors to EclectEcon, a reminder that C=Canadian; F=Foreign
Apologies to TWN if I am mistaken. I doubt if I am, though.
To all my friends (and others) who worship David Suzuki...
or maybe don't worship him but think he is right on environmental and climate issues...
or maybe think he is sometimes worth listening to:
David Suzuki wants to prosecute those who disagree with him. (see this)
David Suzuki is at it again: Calling for "climate change deniers" and other people who disagree with him -- including Prime Minister Stephen Harper -- to be jailed.
So much for freedom of inquiry and freedom of expression.
So much for intellectual challenges with open discussion.
About 25 years ago, I watched Suzuki (a noted geneticist) debate the infamous Phil Rushton (a schlock psychologist) about the relationship between race and I.Q. Suzuki's only point was, essentially, the morally superior attitude of, "How dare you even think about studying this topic?" He certainly contributed nothing to the debate. And he definitely disagreed with the academic process of scientific inquiry.
This man is no scientist. He is a demagogue.
Before you comment, please read the entire article. It's long, it's detailed, and it raises some serious doubts about the NFL's position. From the conclusion,
Because the NFL had little or no experience with measuring psi in the heat (or cold) of a championship game, it is not surprising that the initial readings, from either gauge brought by Walt Anderson, suggested that the Patriots had been cheating. But a careful review of the measurements should have led them to conclude that the entire process of measuring and complying with the psi regulation was much more complicated than had been previously understood. ...
Instead, the NFL decided to tarnish the reputation of a future Hall-of-Famer who some would argue is the greatest player in the history of the NFL. That player is known to even the casual fan as a very intense competitor. I would not be surprised if under the pressure of an impending championship game, he encouraged or allowed staffers to break a rule. It’s a shame that the hard evidence that would make that conclusion definitive is not provided by the Wells Report.
How about buttered coffee? Sounds good to me, but even before reading this article, I had developed a taste for having my coffee with whipping cream (unwhipped) in it.
It's a lengthy article, and so here are a couple of snippets:
[Asprey] completely dismantled the food pyramid—the 1992 chart that advised people to eat a carbohydrate-rich diet and very few fats—and argues that the proper diet should consist of as much as 70 percent fat. It’s similar to the paleo diet, the regimen that forbids any food not available to prehistoric man, with some modifications, like allowing white rice. “Your hormones are made of saturated fat, your brain is made of fat, and the membrane of every cell in your body is made of fat,” Asprey says. “When you go on a low-fat diet, you limit the performance of so many key systems in your body that it’s no wonder you have cravings and feel tired.” ...
“I used to weigh 300 pounds,” Asprey tells Gotzler. “I worked out six days a week, and I cut my calories to around 1,800 calories per day for almost two years. And I was still fat. I’m eating salads and my friends are eating onion rings, and they’re still thin. I said, ‘This isn’t working.’” ...
Asprey found some low-mold beans from Guatemala and blended them with the coconut oil and grass-fed butter, which is higher in omega-3 fatty acid than regular butter or cream. It was delicious. Bulletproof coffee was born. Asprey envisioned the beverage as a 450-calorie breakfast alternative that would suppress hunger and provide mental clarity.
Sounds like a lot of the evidence we read that convinced us to move toward Atkins-type low-carb, high-fat diets. If only I could stop eating the cheap-carb, refined wheat, refined sugar things I find so tasty.
There's a math problem raging on Facebook that depends on the order of operations.
Too many smart people have memorized the mnemonic BEDMAS and misapply it.
These mnemonics may be misleading when written this way, especially if the user is not aware that multiplication and division are of equal precedence, as are addition and subtraction. Using any of the above rules in the order "addition first, subtraction afterward" would give the wrong answer to a problem like 10 - 3 + 2.
The correct answer is 9 (not 5, which is what we get when we do the addition first and then the subtraction). The best way to understand a combination of addition and subtraction is to think of the subtraction as addition of a negative number. In this case, we see the problem as the sum of positive ten, negative three, and positive two: 10 + (-3) + 2 = 9.
A different perspective that might help clear things up: Within the multiplication and division groups, start at the left and work right. Similarly, within the addition and subtraction groups, start at the left and work to the right.
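The groups-then-left-to-right rule is easy to check directly. A quick sketch (in Python, my choice of illustration) evaluating the disputed expression of ten, minus three, plus two, both the correct way and the mistaken way:

```python
# Within equal-precedence groups, evaluate left to right.
left_to_right = (10 - 3) + 2    # subtraction first, working left to right
addition_first = 10 - (3 + 2)   # the common mistake: addition before subtraction
as_signed_sum = 10 + (-3) + 2   # subtraction treated as adding a negative

print(left_to_right)    # 9  (correct)
print(addition_first)   # 5  (wrong)
print(as_signed_sum)    # 9  (same as left-to-right)
```

Programming languages bake the left-to-right convention into their grammar, which is why the interpreter and the mnemonic-misappliers disagree.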
But I doubt if this will stop or slow the battles on Facebook.
If only people would Google things and look at Wikipaedia...
The inestimable Steve Horwitz writes on his FB page,
Ronald Bailey's prediction 15 years ago about what Earth Day will be like in 30 years continues to be on target. The world has never been cleaner and healthier, yet we are, according to the professional purveyors of doom, always on the edge of catastrophe. Or at least so says this screaming CNN Headline. http://www.cnn.com/…/…/sutter-climate-two-degrees/index.html
Instead, celebrate Earth Day by going to Cato's Human Progress website and get the real state of the planet.
Steve Horwitz posted this on Facebook. It is a wonderful suggestion:
[W]hen our environmentalist friends DO make predictions that can be falsified, they are often very wrong. I still maintain that Reason or some other libertarian organization should give an annual Paul Ehrlich Award to scholars whose predictions have turned out to be spectacularly wrong. (Ehrlich is ineligible as he'd win it every year.)
"Finally, think about this question, posed by Ronald Bailey in 2000: What will Earth look like when Earth Day 60 rolls around in 2030? Bailey predicts a much cleaner, and much richer future world, with less hunger and malnutrition, less poverty, and longer life expectancy, and with lower mineral and metal prices. But he makes one final prediction about Earth Day 2030: “There will be a disproportionately influential group of doomsters predicting that the future–and the present–never looked so bleak.” In other words, the hype, hysteria and spectacularly wrong apocalyptic predictions will continue, promoted by the “environmental grievance hustlers.”"
We are halfway there and Ron's predictions, unlike those of the doomsayers, have largely come to pass.
Steve then linked to this article.
18 spectacularly wrong apocalyptic predictions made around the time of the first Earth Day in 1970, expect more this year
In a recent posting, I argued that OPS [On-base-percentage Plus Slugging-average] is an excellent comparatively easy and comparatively good statistic to use for assessing the performance of batters in baseball.
For the same reasons, I think OOPS [Opponents' OPS] is a comparatively easy and comparatively good statistic for assessing baseball pitchers. The statistic is readily available via the MLB website, and it measures how well a pitcher avoids letting batters reach base and how well the pitcher avoids letting opposing batters hit for power.
I have noticed that baseball sportscasters are moving toward telling us about opponents' batting average [which tells us nothing about walks given up nor about extra-base hits] or about WHIP, which is Walks plus Hits per Inning Pitched [a bizarre measure telling us nothing more than (and really not as much as) "Opponents' On-Base Percentage"].
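To make the comparison concrete, here is a minimal sketch of the standard definitions behind these statistics (the sample stat line is invented for illustration):

```python
def obp(hits, walks, hbp, at_bats, sac_flies):
    """On-base percentage: how often a batter reaches base."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slg(total_bases, at_bats):
    """Slugging average: total bases per at-bat (a power measure)."""
    return total_bases / at_bats

def whip(walks, hits, innings_pitched):
    """Walks plus hits per inning pitched."""
    return (walks + hits) / innings_pitched

# OPS is just OBP + SLG. OOPS applies the very same arithmetic to what
# opposing batters did against a particular pitcher.
ops = obp(150, 50, 5, 500, 5) + slg(230, 500)
print(round(ops, 3))  # 0.826
```

Note that WHIP uses the same ingredients as opponents' OBP (walks and hits allowed) but divides by innings instead of plate appearances, and ignores extra-base power entirely; OOPS captures both.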
Maybe in another ten years' time they will start using OOPS as well as OPS.
May I live to see the day.
Ever since I read about the possibility of multiverses, I have been intrigued. Until now, I had imagined that multiverses would exist because there might really be 11 dimensions in the universe, but this article [via Jack] presents a different possibility: there are other universes out there, all within our given 3- or 4-dimensional space, but we don't see them because they are so far away that their light could not have reached us yet.
Our definition of "the universe" has been changing since the invention of the first telescope when we peered out into the cosmos and learned that the Earth is not the totality of existence.
But the universe is a lot bigger than what we could ever see with a telescope.... Our universe is just the spherical amount of light that has had time to reach us. If we wait another billion years for more light to reach us, our definition of the universe would change... [emphasis added].
Someone standing on a planet trillions of lightyears away would have a completely different picture of "the universe" based on how much light has reached their planet.
By definition there's no way to get to these other bubble universes because we'd have to travel faster than the speed of light. [emphasis in the original].
What a neat perspective!
No, this post is not about the weather. It is to announce that finally, after many ups and downs, I have reached my goal weight. Over 5 years ago I weighed nearly 205 lbs. I knew I was overweight and out of shape. I set my goal at 160 (I had weighed only 155 just 8 years previously).
I told myself I would not eat a Dairy Queen Blizzard (one of my favourite treats!) until I reached the goal. Today I reached that goal. I won't be able to get to a Dairy Queen for several days, but believe me, I'm going for a Skor Blizzard sometime soon.
Here is a graph of my weight for the past 61 months.
You can see all the bouts of lack of will power in the graph. The big start came from using weight-watcher/point-counting/calorie-counting. But I was hungry all the time on that diet and kept cheating and regained lots of weight.
The second half of the graph shows what happened under our modified version of a low-carb diet. We went on this diet in July, 2012.
I have not been nearly so hungry on this diet; I eat lots of fat, protein, and vegetables. I don't count calories, and I certainly don't try to avoid fat anymore.
I generally eat cheese or pepperettes as snacks. In restaurants, I sometimes order a pasta dish - hold the pasta, or burgers - no bun. I really have enjoyed this diet much more than any other diet I have been on.
Yes, I go off this diet frequently (as my Facebook friends know, one of my favourite hashtags is #carbsbedamned), but the neat thing about the low-carb diet is that when I go back on it, my weight goes right back down fairly quickly.
Exercise? I try to walk some, and I do some exercises now and then for my back, but overall I know I exercise less now than I used to. The weight loss is due to the diet change, not exercise.
The weight chart is from a smartphone app called "Lose It!". It's a good app: it's free of charge, and it stores your data in the cloud so you can keep your history as you change phones and platforms. I started with it on an iPhone 3 and have kept it through all my various phone changes, including an Android for two years.
For my earlier postings about this diet, see: