Sunday, March 08, 2015

Being your whole self

The Upside of Your Dark Side: Why Being Your Whole Self — Not Just Your "Good" Self — Drives Success and Fulfillment, by Todd Kashdan and Robert Biswas-Diener, is not the sort of book I usually read. I'd call it pop psychology. Its tone was such that I was ever afraid of it veering toward self-help territory, but I think that's more an artefact of how it's marketed than the intent of the actual text.

It was given to me as a gift, because I've had a lot of dark looming up lately, in both my home and work lives.

Its starting point is the American obsession with positive thinking: the right to the pursuit of happiness has been confused with an obligation to be happy, all the time.

I've been obsessed with the notion of happiness since I was in grade 7. Not that I wanted a direct path to it. I just wanted the idea of it defined. (Were we reading Brave New World?) I wanted to establish the difference between thinking you're happy and really being happy. Cuz goddamit I know there's a difference. And real happiness is an elusive, if not altogether imaginary, beast. I've known all my life that it is not possible to live in a state of constant joy. Contentment is another matter. Which is where the problem of definition comes in. In fact, for some 30+ years, I've been a proponent of happiness being just one aspect of living a full life. Which is what this book is about. (Gosh, I was a smart kid.)

So the fact is: people get angry, or sad, or bored, or frustrated, all the time. When we experience physical pain, we tend to take it seriously. But we're generally pretty dismissive of emotional pain and view it simply as wrong, without ever listening to what that pain is telling us about ourselves or our environment.

The upshot: it's OK to feel angry, sad, mean, selfish, whatever, in certain contexts, and it's better for us to acknowledge those feelings, even indulge them, than to mask them because we're supposed to be happy all the time. These "negative" feelings are shown to fuel creativity, heighten awareness, and enhance performance.

But you already knew that, right?

Regarding one study conducted in a workplace setting:
The take-home lesson is simple: do not create a culture based on the assumption that positivity must reign supreme. Instead, create a culture where everyone knows that it's safe to be real, and that depending on the situation it's sometimes better to feel something other than happiness.
I worked for a company some years ago where the gung-ho, go-get-'em, go-go-go all-American attitude just didn't fly with us Canadian counterparts. So I can see how HR departments could learn from this book, to build the right corporate culture for the right skills to flourish.

In addition, the book is full of little insights on sprawling topics, like:
Love is about adopting another person's perspective of the world, and when overvaluing your happiness gets in the way, it leads to unfortunate by-products such as loneliness.
And:
Research suggests that you, like everyone else, think that you are better than other human beings. This so-called better-than-average effect shows that most people believe that they are above average, which, of course, is a mathematical impossibility. [...] The average person lives inside a narcissistic bubble, a self-serving bias that gives most of us the confidence we need to face a complex and uncertain day.
So I don't think I learned anything, but the book is full of interesting research studies, and it's nice to have my intuitive thinking validated. A pleasant-enough way to while away a train ride on a wintry afternoon.

Reviews
Huffington Post
Positive Psychology

Presentation.

Tuesday, February 10, 2015

And what does our purportedly decluttered mind now allow us to do?

Do you use an online dictionary? Have you ever "liked" a word, or shared it on social media? Have you ever commented on a dictionary entry? I mean, by leaving a comment that satisfies the prompt (What made you want to look up ___? Please tell us where you read or heard it.). I've often wondered why anyone would be motivated to do that, apart from students of English as a second language, who seem quite genuine in their queries regarding usage. But what if you could vote a word up or down? What if its definition were crowdsourced? What if its shape, meaning, sound, morphed as data was received?

What if it all happened with the aid of the technology of our very near future, with a kind of Google Glass or a chip integrated directly into our neural network? Among its other functions for day-to-day living (hailing cabs, making payments, checking contact details, researching background info — with less than a blink of the eye), it would fulfill linguistic services, not only looking up unknown words and supplying their meanings but suggesting entire conversational tacks. What if you could devise a business model that earned you money for every look-up, while dumbing down the culture and creating a dependence on your service? You might need a monopoly on the dictionary industry first, of course.
It is comforting to believe that consigning small decisions to a device frees up our brains for more important things. But that begs the question, which things have been deemed more important? And what does our purportedly decluttered mind now allow us to do? Express ourselves? Concentrate? Think? Or have we simply carved out more time for entertainment? Anxiety? Dread?

We fear that Memes may have a paradoxical effect — that indeed, contrary to Synchronic's claims, they tend to narrow rather than expand consciousness, to the point where our most basic sense of self — our interior I — has started to be eclipsed. Our facility for reflection has dimmed, taking with it our skill for deep and unfettered thinking. And another change is taking place: our capacity for communication is fading.

In the most extreme cases, Meme users have been losing language. Not esoteric bits of linguistic debris but everyday words: ambivalence, paradox, naïve. The more they forget, the more dependent on the device they become, a frightening cycle that only amplifies and that has grown to engulf another of Synchronic's innovations, the Word Exchange.
The Word Exchange, by Alena Graedon, is on its surface a mystery story — a search for a missing person. But soon enough it takes on thriller-like aspects, with corporate intrigue on an international stage. But it's also a linguistic nerd's dream. It covers synchronic versus diachronic approaches to language study, the basics of lexicography, Hegel's philosophy of language (Graedon acknowledges guidance from Jim Vernon), the theory of universal grammar, book burnings, Jabberwocky-type nonsense and countless references to Lewis Carroll's wonderland ("When I use a word, [...] it means just what I choose it to mean.").

Also, secret libraries and pneumatic tubes!

What if the Word Exchange were hacked, and everyone who used the device were infected with Word Flu, effectively losing language?
Maybe Hegel had it wrong: laber there's no mystical link between the speaker of a word and the recipient of its sound. Maybe language isn't unity but domination. Unilateral. Unkind.
Fantastic premise, wonderful vocabulary usage. Mostly interesting characters. Somewhat uneven pacing, but it's Graedon's first novel.

Interview
Bustle: Q&A: Alena Graedon on 'The Word Exchange': The Influence and Influenza of Words

Reviews
New York Times: World Wide Web
Slate: When Smartphones Attack
Tor.com: Science Fiction Saves the Dictionary: The Word Exchange by Alena Graedon
Toronto Star: The Word Exchange by Alena Graedon: review

Monday, September 29, 2014

We must obey the forces we want to command

For a recent MOOC, On Strategy: What Managers Can Learn from Great Philosophers, the final exam asked us to respond to Francis Bacon's assertion that "we must obey the forces we want to command," presenting two arguments, with a quotation and an example for each.

*************************

Francis Bacon famously wrote that we must obey the forces we want to command in reference to the laws of nature. One can readily transpose this dictum to other domains: market forces, military forces, cultural forces, psychological forces, etc. — each being subject to the same rigor and scrutiny we demand when performing natural science.

Literature is one such domain. Although it is steeped in tradition – the rules of language (from grammar to semantics), conventions of genre, formal narrative structures, as well as cultural expectations – truly original work emerges only once these elements are firmly understood [Argument 1]. The rules are acknowledged and assimilated, and subverted to new ends. This is especially true in the example of Oulipo – a formally defined literary movement. Cofounder Raymond Queneau described Oulipians as "rats who construct the labyrinth from which they plan to escape"[1].

Consider for a moment, though, how it (or any other constraint, for that matter) works. It places a restriction on the expressions and phrases that can be used in a poem, and it determines to some extent what the poet is able to say. It makes the process of writing both more difficult — by short-circuiting habitual modes of self expression — and, paradoxical as it may seem, easier: certain decisions have already been made for the writer. A constraint confronts the writer with a puzzle to solve, not a blank page, and this can be strangely comforting. Finally, a constraint will almost always force a writer to be creative, to seek out new means of self expression.[2]

Clearly, Oulipians fully obey the forces of language precisely so that they can bend them to their will.

Science has evolved since Bacon’s time, and its ambitions have become more complex and its progress more nebulous. The pursuit of artificial intelligence is limited in exactly the way Bacon’s dictum would suggest [Argument 2]: "How do you make a search engine that understands if you don't know how you understand?"[3].

Douglas Hofstadter is a cognitive scientist who has become disillusioned with the common approach: "Sometimes it seems as though each new step towards AI, rather than producing something which everyone agrees is real intelligence, merely reveals what real intelligence is not"[4].

While advances have been made in data processing, and a form of "intelligence" has grown out of this capability, we have not yet achieved true artificial intelligence. We cannot master this domain until we fully understand the workings of the mind and can obey the algorithms that are in play.

Bacon's assertion is thus borne out in both successes and failures across domains.

1. Raymond Queneau. Definition provided at Oulipo meeting. Apr 5, 1961.
2. Paul Kane. Review of Oulipo Compendium. Oct 2006.
3. James Somers. "The Man Who Would Teach Machines to Think." The Atlantic. Oct 23, 2013.
4. Douglas Hofstadter. Gödel, Escher, Bach: An Eternal Golden Braid. 1979.

Friday, September 05, 2014

All kinds of thinking

I recently completed a MOOC on neuroeconomics. Basically, the study of decision-making. It's a little more exacting than other courses I've taken, but sometimes I think if I just let things play in the background I'll become smarter through osmosis.

The course actually covered some fairly familiar concepts regarding intuitive thinking versus rational thinking, how responses are affected by whether a situation is framed positively or negatively, how those responses can be primed by unrelated factors in our environment, etc. And the material referenced the work of scientists I'm actually conversant with, like Antonio Damasio and Steven Pinker.

And then: Nobel laureate Daniel Kahneman. And I realized I have a book of his lying around somewhere unread that I received for Christmas a few years ago. Well, Thinking, Fast and Slow: no longer unread.

Kahneman summarizes the book in his conclusion:
I began this book by introducing two fictitious characters, spent some time discussing two species, and ended with two selves. The two characters were the intuitive System 1, which does the fast thinking, and the effortful and slower System 2, which does the slow thinking, monitors System 1, and maintains control as best it can within its limited resources. The two species were the fictitious Econs, who live in the land of theory, and the Humans, who act in the real world. The two selves are the experiencing self, which does the living, and the remembering self, which keeps score and makes the choices. In this final chapter I consider some applications of the three distinctions.

I read the whole book word for word, cover to cover (is that how people read nonfiction?). The writing's a bit tedious at times; basically it's a retrospective of Kahneman's career, touching on all his research and the discoveries he made along the way. Anyone with an interest in cognitive processes will find examples and case studies to entertain or to puzzle over, some resonating more than others.

What Kahneman saves for the conclusion, though, caused a lightbulb moment. Individuals make intuitive but irrational decisions, and contradictory judgments, and "objective" assessments imbued with external influences — and these issues hold at the societal level of decision making as well. Kahneman makes a soft appeal for libertarian paternalism, where, given the known workings and weaknesses of systems of reasoning, policymakers should judiciously guide individuals using the principles of behavioral science (with experts to advise the policymakers, of course). Examples of applications of these principles include opt-out enrolment in social plans (like health care) and regulations regarding the labeling on food packaging and the framing of disclosures regarding fuel consumption. So, not just brain theory stuff.

Reviews and Insight
Two Brains Running — Jim Holt in the New York Times
And frowning — as one learns on Page 152 of this book — activates the skeptic within us: what Kahneman calls "System 2." Just putting on a frown, experiments show, works to reduce overconfidence; it causes us to be more analytical, more vigilant in our thinking; to question stories that we would otherwise unreflectively accept as true because they are facile and coherent. And that is why I frowningly gave this extraordinarily interesting book the most skeptical reading I could.

How to Dispel Your Illusions — Freeman Dyson in the New York Review of Books
There are huge differences between Freud and Kahneman, as one would expect for thinkers separated by a century. The deepest difference is that Freud is literary while Kahneman is scientific. The great contribution of Kahneman was to make psychology an experimental science, with experimental results that could be repeated and verified. Freud, in my view, made psychology a branch of literature, with stories and myths that appeal to the heart rather than to the mind. The central dogma of Freudian psychology was the Oedipus complex, a story borrowed from Greek mythology and enacted in the tragedies of Sophocles. Freud claimed that he had identified from his clinical practice the emotions children feel toward their parents that he called the Oedipus complex. His critics have rejected that claim. So Freud became to his admirers a prophet of spiritual and psychological wisdom, and to his detractors a quack doctor pretending to cure imaginary diseases. Kahneman took psychology in a diametrically opposite direction, not pretending to cure ailments but only trying to dispel illusions.

The King of Human Error — Michael Lewis in Vanity Fair
There's a quality both impish and joyous to Kahneman's work, and it is most on display in his collaboration with Amos Tversky. They had a rule of thumb, he explains: they would study no specific example of human idiocy or irrationality unless they first detected it in themselves. "People thought we were studying stupidity," says Kahneman. "But we were not. We were studying ourselves." Kahneman has a phrase to describe what they did: "Ironic research."

Sunday, May 11, 2014

The future of reading

Today's episode of Spark on CBC Radio focused on the future of reading in the internet age. How, generally, the rhythm of our lives and the rhythm of deep reading no longer intersect.

Listen online:

Robotics in work and life: Margaret Atwood on robots and AI.

Bite-sized reading: Rooster, an app that breaks down novels into easily digestible bite-sized chunks.

Scanning and skimming: It turns out we do read differently depending on whether text is on screen or on paper. A conversation with Maryanne Wolf, neuroscientist and author of Proust and the Squid. One of the points made is that reading is not a natural thing — it's totally learned. So internet reading feeds our predatory/preservationist instincts for watching, searching, jumping, quick processing. "The quality of our attention is a mirror of the quality of our thought."

Social reading: The idea of "social reading" is not for me — I read alone, I don't want to be intruded upon. But this segment is about a story-sharing app, Wattpad. Of course stories are a social phenomenon, and this app seems to be about engaging with people who want to tell them.

With commentary by bookfuturist Tim Carmody.

The 53-minute podcast is worth a listen... or you may wish to spend that time reading instead.

Saturday, March 01, 2014

Mind reading: cognitive poetics

How to read... a mind is a two-week online course starting March 17, offered by the University of Nottingham.

The journey from new student to advanced study is really very short. Over two weeks, you will become fairly expert in cognitive poetics. You will understand in quite a profound way what it is to read and model the minds of other people, both real and fictional. You don't need any preparation other than your curiosity and your own experience of reading literary fiction or viewing film and television drama.

Although I haven't figured out how to read the title of this course, it sounds fascinating, and far too short. I'm going to be an expert in cognitive poetics!

I am positively addicted to MOOCs. (And I love saying "MOOC.") I just finished my fourth MOOC last week, and I start another next week. Only one of them has been purely for personal interest — the rest were for, as they say, professional development. "How to read" (MOOC number six) is a return to feeding my non-working life.

Thursday, June 20, 2013

A word is a recipe

"You went to school," Lee said. "I mean, at some point. And it didn't suit you very well. They wanted too teach you things you didn't care about. Dates and math and trivia about dead presidents. They didn't teach persuasion. Your ability to persuade is the single most important determinant of your quality of life, and they didn't cover that at all. Well, we do. And we're looking for students with natural aptitude."

Lexicon, by Max Barry, was a helluva read, and bears several noteworthy distinctions:
  • Starts with a needle stuck in an eyeball.
  • Made me twice almost miss my metro stop (as in, reading, reading, and as the warning chime sounds realizing, holy shit, this is my stop, and dashing through the closing doors in the nick of time).
  • Includes as characters T.S. Eliot, Sylvia Plath, Emily Dickinson, Kathleen Raine, Isaac Rosenberg, Goethe, etc. (Well, they're code names, but still.)
  • Secret society.
  • Neurolinguistics! (Which rocks my world, but maybe that's just me.)
  • Babel myths and brain hacking.
  • Made me cry.
  • Comes with personality quiz.
It's a thriller with a driving pace. The joy of reading it comes also with the dismay that you will eventually run out of book to read.

The story cuts between two main narrative threads, essentially running in opposite directions. (It's a little bit Time Traveller's Wife meets Snow Crash.) We follow Wil back across the chain of events that led to his eyeball being threatened in an airport bathroom. And there's Emily, whose story is told chronologically forward — a hustler running a three-card Monte scam who is recruited by a secret society to train as a "poet."

"What's a word?"

"Huh?

"You're feeling clever — tell me what a word is."

"It's a unit of meaning."

"What's meaning?"

"Uh... meaning is an abstraction of characteristics common to the class of objects to which it applies. The meaning of ball is the set of characteristics common to balls, i.e. round and bouncy and often see around guys in shorts."

Jeremy returned to the free throw line, saying nothing. She figured she must have that wrong, or at least not right enough.

"You mean from a neurological perspective? Okay. A word is a recipe. A recipe for a particular neurochemical reaction. When I say ball, your brain converts the word into meaning, and that's a physical action. You can see it happening on an EEG. What we're doing, or, I should say, what you're doing, since no one has taught me any good words, is dropping recipes into people's brains to cause a neurochemical reaction to knock out the filters. Tie them up just long enough to slip an instruction past. And you do that by speaking a string of words crafted for the person's psychographic segment. Probably words that were crafted decades ago and have been strengthened ever since. And it's a string of words because the brain has layers of defenses, and for the instruction to get through, they all have to be disabled at once."

So that's the neurolinguistic principle behind the brain hacking: essentially exerting a kind of mind control via a hypnotic-like suggestion. Once you've identified the segment to which a person belongs, the right string of words is easy. The ultimate purpose is, of course, something like world domination by this society (although this was never entirely clear to me), or serving the aims of one individual corrupted by absolute power, something like that.

Lexicon is an idea book — in my view, a highly original one. I love the linguistic angle, but there's plenty of action and conspiracy to satisfy readers who aren't gaga for language processing theory.

There are some interesting discussions also about digital media and social media, how user data is gathered, and how that data can be used to generate content, so that every user has a customized user experience. A website can achieve the same end (from the point of view of a site owner) but through different, highly individualized means. (See this video about capturing data: "The global Internet becomes the personal Internet.")

At heart though, Lexicon is a love story and about the search for meaning, digging around in the thin space — the disconnect — between words, or whatever other symbols we choose to use, and the meaning they're meant to convey.

Wednesday, January 23, 2008

Change your mind

Something to think about for the rest of the year...

The Edge Annual Question — 2008:

When thinking changes your mind, that's philosophy.
When God changes your mind, that's faith.
When facts change your mind, that's science.

WHAT HAVE YOU CHANGED YOUR MIND ABOUT? WHY?

Science is based on evidence. What happens when the data change? How have scientific findings or arguments changed your mind?


I won't pretend to have read all the responses; I've skimmed them at random. There's enough there to keep you reading for days, and thinking for weeks.

Kevin Kelly, editor at Wired, has this to say:

Everything I knew about the structure of information convinced me that knowledge would not spontaneously emerge from data, without a lot of energy and intelligence deliberately directed to transforming it. All the attempts at headless collective writing I had been involved with in the past only generated forgettable trash. Why would anything online be any different?


The success of Wikipedia has changed his mind.

Things other thinkers have changed their mind about: Nuclear energy — these days it's much easier to see that the benefits outweigh the risks. There are some rambling entries regarding God. The nature of the differences between the sexes. How dinosaurs came to be extinct (asteroid!). Brian Eno changed his mind about Maoism.

For Alison Gopnik, imagination is real, and I'm citing her response in full because I think it's super interesting, and I see evidence surrounding me every day that bears this out:

Recently, I've had to change my mind about the very nature of knowledge because of an obvious, but extremely weird fact about children — they pretend all the time. Walk into any preschool and you'll be surrounded by small princesses and superheroes in overalls — three-year-olds literally spend more waking hours in imaginary worlds than in the real one. Why? Learning about the real world has obvious evolutionary advantages and kids do it better than anyone else. But why spend so much time thinking about wildly, flagrantly unreal worlds? The mystery about pretend play is connected to a mystery about adult humans — especially vivid for an English professor's daughter like me. Why do we love obviously false plays and novels and movies?

The greatest success of cognitive science has been our account of the visual system. There's a world out there sending information to our eyes, and our brains are beautifully designed to recover the nature of that world from that information. I've always thought that science, and children's learning, worked the same way. Fundamental capacities for causal inference and learning let scientists, and children, get an accurate picture of the world around them — a theory. Cognition was the way we got the world into our minds.

But fiction doesn't fit that picture — it's easy to see why we want the truth, but why do we work so hard telling lies? I thought that kids' pretend play, and grown-up fiction, must be a sort of spandrel, a side-effect of some other more functional ability. I said as much in a review in Science and got floods of e-mail back from distinguished novel-reading scientists. They were all sure fiction was a Good Thing — me too, of course — but didn't seem any closer than I was to figuring out why.

So the anomaly of pretend play has been bugging me all this time. But finally, trying to figure it out has made me change my mind about the very nature of cognition itself.

I still think that we're designed to find out about the world, but that's not our most important gift. For human beings the really important evolutionary advantage is our ability to create new worlds. Look around the room you're sitting in. Every object in that room — the right angle table, the book, the paper, the computer screen, the ceramic cup — was once imaginary. Not a thing in the room existed in the Pleistocene. Every one of them started out as an imaginary fantasy in someone's mind. And that's even more true of people — all the things I am, a scientist, a philosopher, an atheist, a feminist, all those kinds of people started out as imaginary ideas too. I'm not making some relativist post-modern point here; right now the computer and the cup and the scientist and the feminist are as real as anything can be. But that's just what our human minds do best — take the imaginary and make it real. I think now that cognition is also a way we impose our minds on the world.

In fact, I think now that the two abilities — finding the truth about the world and creating new worlds — are two sides of the same coin. Theories, in science or childhood, don't just tell us what's true — they tell us what's possible, and they tell us how to get to those possibilities from where we are now. When children learn and when they pretend, they use their knowledge of the world to create new possibilities. So do we, whether we are doing science or writing novels. I don't think anymore that Science and Fiction are just both Good Things that complement each other. I think they are, quite literally, the same thing.


I have changed my mind about relatively little, but then that's mostly because I've always been so slow to make up my mind one way or the other at all. That's something motherhood changed about me: suddenly, I had opinions, dammit! but least of all regarding the rearing of my child. Suddenly, I saw the relevance of the price of tea in China, and it mattered that I took a stance. In this way I have changed my mind: better to know something, believe something, however fleetingly, and change one's mind as new data become available, than to withhold opinions while waiting for a perfect analysis.

What have you changed your mind about?