I've written a couple of times about "gamifying" life:
In 2011 I wrote a review of Jane McGonigal's book "Reality is Broken," which encourages us to approach life as a game in order to change our lives and the world. I was on the fence, but kind of excited by the idea.
Later that year I wrote about Barbara Ehrenreich's "Bright-Sided," and decided that McGonigal's gamifying idea was pop psychology nonsense.
Well, the New Yorker has reviewed McGonigal's new book, "SuperBetter," and eloquently summed up the idea I had settled on: we can't game our way through life, and pop psychology, while appealing, is really not great science. I highly recommend the review if you've given any thought to gamification. It's a short article for the New Yorker, and a long one for a bad book review.
The review also makes me think long and hard about adopting the first thing we read on a new-to-us concept as truth, even if it sounds science-y and is published by a reputable publisher- I'm talking about anchoring, not gullibility or poor information literacy. This may sound obvious, but be honest with yourself: I'm guessing very few of us take the time to read an alternate perspective on a new concept. We've picked our sources and are already busy teaching ourselves complicated new things- do we really always need to go there? I just happened to stumble on this review in my intellectual magazine of choice- I didn't seek it out. But if we don't question our new concepts, we run into trouble.
Here's where I'm at: I'm a smart person with a degree in information science- I'm pretty good at knowing what is reliable information. But when we're introduced to a novel concept and it's shrouded in the trappings of reliability (here we can hark back to "Galileo's Middle Finger"), it can be hard not only to question the new information, but even to think about questioning it. Why would we? It sounds good, it merges easily into our core knowledge, and it comes from a reliable source (and may even, as in the case of McGonigal's work, be backed up by what look like scientific studies). If the information falls far outside the spectrum of what we'd consider, maybe we put the book or article down and stop reading. But if it fits, the information can quickly and dangerously become part of what we think and "know." We can start telling others about it as if it were truth (and it may be truth, but it may also be untrue, or at best half-truth). Think Gwyneth's green juice.
I don't have an answer, just thoughts about how knowledge is formed and how we can challenge ourselves to think before we know. That awful bumper sticker comes to mind (are all bumper stickers awful, or is it just me?)- "Don't believe everything that you think."