- 1 week ago
A Hitch logic bomb for any climate change skeptics.
An environmental archaeology professor of mine once paraphrased a palaeoclimatologist, so I’m paraphrasing my prof here: “The climate is a gigantic, scary and unpredictable monster that looms over us, whose behaviour is erratic and violent, and about whom, as a whole, we know not nearly enough. And we’re poking it with a sharp stick.”
Yes. Hmmm
Source: astrogasmic
- 3 weeks ago
Dear Science Communication Professionals: We have a problem.
Earlier this month, the Bill Nye vs. Ken Ham creationism “debate” received a disproportionate amount of press coverage. Considering that there really is no debate to be had when it comes to the science of evolution, Nye, for better or for worse, faced a hostile audience at the Creation Museum in Kentucky. He hoped to score some scientific points against Ham’s literal interpretation of the Bible and its absurd assertions that the world was created in six days and that the universe is 6,000 years old.
In my opinion (one shared by other science communicators), the Nye vs. Ham debate did little for science outreach: it was all about who sounded more convincing, and it only gave creationists some free advertising.
And then, today, the National Science Foundation (NSF) delivered news of a pretty shocking poll result: around one in four Americans (yes, that’s 25 percent) are unaware that the Earth orbits the sun. Let’s repeat that: one in four Americans — a quarter of the population — when asked probably the most basic question in science (except, perhaps, “Is the Earth flat?” Hint: no), got the answer wrong. Suddenly I realized why the Nye vs. Ham debate was so popular.
But wait, I hear you cry: perhaps the NSF poll was flawed? Perhaps the sample was too small? Sadly not. The poll, which is used to gauge U.S. scientific literacy every year, surveyed 2,200 people, each asked 10 questions about the physical and biological sciences. The average score was 6.5 out of 10 — barely a passing grade. But the fact that 26 percent of respondents were unaware the Earth revolves around the sun shocked me to the core.
Perhaps I’m expecting too much of the U.S. education system? Perhaps this is just an anomaly, a statistical blip? But then, like the endless deluge of snow currently choking the East Coast, another result from the same poll appeared on the foggy horizon of scientific illiteracy: the majority of young Americans think astrology is a science.
What the what? Have I been transported back to the Dark Ages? Astrology, of course, is not a science; it is a spiritual belief system at best and, at worst, a pseudoscience driven by charlatans and the tabloid press. The positions of the stars and planets in the sky do not affect my mood, and my horoscope has little bearing on the kind of person I am. Even in China, one of the birthplaces of astrology, 92 percent of people know that astrology is bunk. Really, America, get your act together.
Unfortunately, if we use the question “Is astrology a science?” as a litmus test for scientific literacy, things are looking grim. In 2004, 66 percent of the American public said astrology was bunk. Every year since then, that majority has slipped; by 2012, only 55 percent of Americans considered astrology “not at all scientific.” Most concerning of all, only 42 percent of young respondents aged 18 to 24 said astrology is “not at all scientific.”
But there is a small glimmer of hope. According to the same NSF poll, the vast majority of Americans seem to love science. Although they returned woeful test results, America appears hungry to learn about science and considers science funding essential to the well-being of the nation. Still, I’m now concerned about what America thinks science really is, especially in light of that astrology result. And even if the U.S. public wants to learn, can it find institutions that will actually teach real science?
Schools across the nation currently face the unthinkable prospect of teaching creationism alongside evolution in science classrooms. Giving religion the same standing as science is not only absurd, it is a fundamental institutional failure: children who may be excited to learn about science will grow up with a second-rate education, with decades of scientific knowledge neglected in favor of pseudo-scientific religious agendas.
For a nation that prides itself on science and discovery, it will be a tragedy on a national scale if fundamental science is undercut by superstition and the bad policies it inspires.
You can read detailed results of the NSF poll here (PDF).
- - - - - - - - - - - - - - - - - - - - - - -
…so. There’s this. However, we do have much working in our (humanity’s) favor: the very technology we depend on for information and communication is being used to evaluate, compare, and verify claims through a self-correcting process called science.
I know this report is extremely grim, but my fellow curious human family…this is precisely why we delight in sharing information, educating others, communicating across these artificial boundaries set up before us, and encouraging alternative means to pay it forward for the next generation. We’re in the midst of a grand transition regarding how we inspire, create, and contribute to the world.
If ever there were a time in our society when a massive transition away from long-held beliefs, superstitions, and traditions was needed, now is that time. Let’s keep doing what we’re doing, with as much patience as possible. We have resources and access to information on a scale never before witnessed in any society throughout history. Not even the Library of Alexandria could compete with the amount of knowledge we have and the means by which we can communicate it to others.
Oh… Dear me.
Source: news.discovery.com
- 4 weeks ago
Your Brain On Beer vs. Coffee
America-based Japanese coffee lover Ryoko Iwata of I Love Coffee came across “Coffee vs. Beer: Which drink makes you more creative?”, an article by Ooomf cofounder Mikael Cho. She found it interesting and decided to create an infographic based on it.
This sums up yesterday. The coffee and beer part, not the ideas part.
Great. Now to come up with an idea and then go through with it.
(via 23pairsofchromosomes)
Source: moshita
- 1 month ago
Boston University professor and editorial contributor Andrew J. Bacevich has a harsh evaluation of the U.S. military’s actions in recent years:
“The U.S. military is like the highly skilled, gadget-toting contractor who promises to give your kitchen a nifty makeover in no time whatsoever…
Yet by the time he drives off months later, the kitchen’s a shambles and you’re stuck with a bill several times larger than the initial estimate.”
Image: Edel Rodriguez / For The Times
Yeah. Well. I was gonna say -
- 1 month ago
Many thinkers have approached consciousness from a first-person vantage point, the kind of philosophical perspective according to which other people’s minds seem essentially unknowable. And yet, we spend a lot of mental energy attributing consciousness to other things. We can’t help it, and the fact that we can’t help it ought to tell us something about what consciousness is and what it might be used for. If we evolved to recognise it in others - and to mistakenly attribute it to puppets, characters in stories, and cartoons on a screen - then, despite appearances, it really can’t be sealed up within the privacy of our own heads.
Lately, the problem of consciousness has begun to catch on in neuroscience. How does a brain generate consciousness? In the computer age, it is not hard to imagine how a computing machine might construct, store and spit out the information that ‘I am alive, I am a person, I have memories, the wind is cold’ and so on. But how does a brain become aware of those propositions?
In a period of rapid evolutionary expansion called the Cambrian Explosion, animal nervous systems acquired the ability to boost the most urgent incoming signals. Too much information comes in from the outside world to process it all equally, and it is useful to select the most salient data for deeper processing. Over time, this signal boosting came under a more sophisticated kind of control - what is now called attention. Attention is a data-handling method, the brain’s way of rationing its processing resources. Mammals and birds both have it, and they diverged from a common ancestor about 350 million years ago, so attention is probably at least that old.
Attention requires control. In the modern study of robotics there is something called control theory, and it teaches us that, if a machine such as a brain is to control something, it helps to have an internal model of that thing. Think of a military general with his model armies arrayed on a map: they provide a simple but useful representation - not perfectly accurate, but close enough to help formulate strategy. Likewise, to control its own state of attention, the brain needs a constantly updated simulation or model of that state. The brain will attribute a property to itself, and that property will be a simplified proxy for attention. What exactly is that property? When it is paying attention to thing X, we know that the brain usually attributes an experience of X to itself - the property of being conscious, or aware, of something. Why? Because that attribution helps to keep track of the ever-changing focus of attention.
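(A quick aside from me, not from the essay: the control-theory point is easy to mock up in code. The sketch below is a toy under invented assumptions - the names, constants, and noise model are all mine - showing a controller that steers a noisy quantity using only a cheap internal model of it, the general’s map rather than the army itself.)

```python
import random

# Toy illustration (mine, not the essay's): a controller that regulates
# a noisy quantity using only a simplified internal model of it, the way
# the attention schema theory says the brain regulates attention.
# All names and constants below are invented for the sketch.

TARGET = 1.0        # desired focus level (arbitrary units)
GAIN = 0.5          # how strongly the controller corrects deviations
MODEL_RATE = 0.3    # how quickly the internal model tracks reality

true_attention = 0.0   # the real, noisy state (the army in the field)
model_attention = 0.0  # the simplified proxy (the map on the table)

for step in range(20):
    # The real state drifts unpredictably.
    true_attention += random.gauss(0.0, 0.1)

    # The internal model is nudged toward noisy reality: cheaper and
    # less accurate than the real thing, but good enough to steer by.
    model_attention += MODEL_RATE * (true_attention - model_attention)

    # Crucially, the control decision is based on the MODEL, not on
    # direct access to the true state.
    correction = GAIN * (TARGET - model_attention)
    true_attention += correction

    print(f"step {step:2d}: true={true_attention:+.2f}  model={model_attention:+.2f}")
```

The point of the toy is that the controller never touches the true state directly; everything it does is mediated by its simplified self-description.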
We humans are continually ascribing complex mental states - emotions, ideas, beliefs, action plans - to one another. But it is hard to credit Matt with a fear of something, or a belief in something, or an intention to do something, unless we can first ascribe an awareness of something to him. Awareness, especially an ability to attribute awareness to others, seems fundamental to any sort of social capability. We paint the world with perceived consciousness. Family, friends, pets, gods, ventriloquist’s puppets - all appear before us suffused with sentience.
But what about the inside view, that mysterious light of awareness accessible only to our innermost selves? How do neurons produce a magic internal experience? How does the magic emerge from the neurons?
One way to think about the relationship between brain and consciousness is to break it down into two mysteries. I call them Arrow A and Arrow B. Arrow A is the mysterious route from neurons to consciousness. If I am looking at a blue sky, my brain doesn’t merely register blue. I am aware of the blue. Did my neurons create that feeling?
Arrow B is the mysterious route from consciousness back to the neurons. The most basic, measurable, quantifiable truth about consciousness is simply this: we humans can say that we have it. We can conclude that we have it, couch that conclusion in language, and then report it to someone else. Speech is controlled by muscles, which are controlled by neurons. Whatever consciousness is, it must have a specific, physical effect on neurons, or else we wouldn’t be able to communicate anything about it.
Any workable theory of consciousness must be able to account for both Arrow A and B. Most accounts, however, fail miserably at both. Suppose that consciousness is a non-physical feeling, an inner essence that arises somehow from a brain or from a special circuit in the brain. The ‘emergent consciousness’ theory is the most common assumption in the literature. But how does a brain produce the emergent, non-physical essence? And even more puzzling, once you have that essence, how can it physically alter the behaviour of neurons, such that you can say that you have it? ‘Emergent consciousness’ theories generally stake everything on Arrow A and ignore Arrow B completely.
The attention schema theory does not suffer from these difficulties. It can handle both Arrow A and Arrow B. Consciousness isn’t a non-physical feeling that emerges. Instead, dedicated systems in the brain compute information. Cognitive machinery can access that information, formulate it as speech, and then report it. When a brain reports that it is conscious, it is reporting specific information computed within it. In short, Arrow A and B remain squarely in the domain of signal-processing. There is no need for anything to be transmuted into ghost material, thought about, and then transmuted back to the world of cause and effect.
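(Another aside of my own: the essay’s ‘Arrow B’, the route from consciousness back to a spoken report, can be mimicked in a few hypothetical lines. Nothing below is Graziano’s actual model; the functions and stimuli are invented for illustration.)

```python
# Hypothetical sketch of 'Arrow B': a verbal report produced entirely
# from computed information, with no non-physical step in the chain.
# The functions and stimuli are invented; this is not Graziano's model.

def build_attention_schema(stimuli):
    """Compute a simplified self-description: which stimulus is the
    current focus of attention (here, simply the most salient one)."""
    focus = max(stimuli, key=stimuli.get)
    return {"focus": focus, "salience": stimuli[focus]}

def verbal_report(schema):
    """Cognitive machinery reads the schema and renders it as speech.
    The claim 'I am aware of X' is just the schema put into words."""
    return f"I am aware of {schema['focus']}."

stimuli = {"the blue sky": 0.9, "traffic noise": 0.4, "an itch": 0.2}
schema = build_attention_schema(stimuli)
print(verbal_report(schema))  # -> I am aware of the blue sky.
```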
Some people might feel disturbed by the attention schema theory. It says that awareness is not something magical that emerges from the functioning of the brain. When you look at the colour blue, for example, your brain doesn’t generate a subjective experience of blue. Instead, it acts as a computational device; it computes a description, then attributes an experience of blue to itself. Subjective experience, in the theory, is something like a myth that the brain tells itself.
The heart of the theory is that awareness is a model of attention, like the general’s model of his army laid out on a map. The real army isn’t made of plastic, isn’t quite so small, and has rather more moving parts. In these respects, the model is totally unrealistic. And yet, without such simplifications, it would be impractical to use.
In reality, attention is a data-handling method used by neurons. It isn’t a substance and it doesn’t flow. But modelling attention as if it were a flowing substance is a neat accounting trick; it helps to keep track of who is attending to what. Science commonly regards ghost-ish intuitions as the result of ignorance, superstition, or faulty intelligence. In the attention schema theory, however, they are not simply ignorant mistakes. Those intuitions are ubiquitous across cultures because we humans come equipped with a handy, simplified model of attention. Many of our superstitions might emerge naturally from the simplifications and shortcuts the brain takes when representing itself and its world.
One of the long-standing questions about consciousness is whether it really does anything. Most of us intuitively understand it to be an active thing: it helps us to decide what to do and when. And yet, at least some of the scientific work on consciousness has proposed the opposite: that it doesn’t really do anything at all; that it is the brain’s after-the-fact story to explain itself. We act reflexively and then make up a rationalisation.
There is some evidence for this post-hoc notion. In countless psychology experiments, people are secretly manipulated into making certain choices. When asked why they made the choice, they confabulate. They make up reasons that have nothing to do with the truth, and they express great confidence in their bogus explanations. It seems, therefore, that at least some of our conscious choices are rationalisations after the fact. But if consciousness is a story we tell ourselves, why do we need it? Why are we aware of anything at all?
This idea that consciousness has no leverage in the world, that it’s just a rationalisation to make us feel better about ourselves, is terribly bleak. It runs against most people’s intuitions. Some people might confuse the attention schema theory with that nihilistic view. But the theory is almost exactly the opposite. It is not a theory about the uselessness or non-being of consciousness, but about its central importance.
Why did an awareness of stuff evolve in the first place? Because it had a practical benefit. The function of awareness is to model one’s own attentional focus and control one’s behaviour. In this respect, the attention schema theory is in agreement with the common intuition: consciousness plays an active role in guiding our behaviour.
To attribute awareness to oneself is the first step towards attributing it to others. That, in turn, leads to a remarkable evolutionary transition to social intelligence. We live embedded in a matrix of perceived consciousness. Most people experience a world crowded with other minds, constantly thinking and feeling and choosing. We intuit what might be going on inside those other minds. This allows us to work together: it gives us our culture and meaning, and makes us successful as a species. We are not, despite certain appearances, trapped alone inside our own heads.
(via thenewenlightenmentage)
Source: we-are-star-stuff
- 1 month ago
Neil deGrasse Tyson on Why We Shouldn’t Feel Small in the Universe
This week, Neil deGrasse Tyson, director of the Hayden Planetarium, talks with Bill Moyers about the return of Cosmos, the nature of the universe, and why science matters.
Watch the first show in the multi-part series.
No matter how many times he says this, no matter how many different ways, it’s still one of my favorite descriptions of our place in the universe.
That’s true.
Source: amnhnyc