
By Emma Young

Loneliness is a “disease”, associated with an increased risk of death equivalent to smoking 15 cigarettes a day. Strides have been made in understanding what form of loneliness is damaging (a lack of close relationships with other people, rather than a lack of relationships per se), but ways to tackle loneliness are badly needed. Now a new study, available as a preprint on PsyArXiv, reveals a way in which loneliness seems to be maintained and, therefore, a potential route to an intervention. 

A popular model of loneliness holds that it is maintained by abnormal processing of the social signals – such as smiles and eye contact – that underlie positive social interactions. One consequence of this abnormal processing could be a failure to automatically mimic other people’s facial expressions – a phenomenon that occurs naturally during most social interactions. To investigate for the first time whether this is the case, Andrew Arnold and Piotr Winkielman at the University of California, San Diego conducted a small, preliminary study on 35 student volunteers. 

The students first completed three scales that measured their loneliness, depression and personality. Based on the loneliness results, they were classed as either lonely or not lonely. Next they had electrodes attached to two pairs of their own facial muscles important for generating emotional expressions – regions of the zygomaticus major (smiling) muscles in the cheeks, and also the corrugator supercilii (frowning) muscles in the brow. They were then shown video clips of men and women making facial expressions of anger, fear, joy and sadness. 

Rating scales that the students completed showed that the lonely and non-lonely groups were equally good at distinguishing between facial expressions, and there were no group differences in the strength of “negative emotion” ratings for anger, fear and sadness, or of “positive” ratings for joy. So the lonely participants could recognise and understand emotional expressions just as well as the non-lonely group. However, there were important differences in how their own faces responded spontaneously to the video clips. 

When members of both groups saw videos of people displaying anger, for example, their own brows moved to automatically mimic this expression. But when the expression in the video was of joy, only the “non-lonely” group automatically smiled in response. The participants’ scores on depression and on extraversion bore no relation to this finding. It was loneliness that made the difference. 

The researchers checked that the lonely group could deliberately mimic smiles, as well as frowns (which they could). They also found that the lonely group smiled automatically while viewing “non-social” positive images (such as nature scenes) that also made the other group smile. These images didn’t include people (or if they did, their facial expressions weren’t obvious). 

The findings suggest that a failure to mimic other people’s smiles automatically could be playing a role in loneliness. A failure to mimic a smile might send an antisocial signal to others, the researchers note, undermining social connections and deepening social disconnection. “Indeed, this could be one behavioural mechanism that maintains chronic loneliness,” they add. 

This is a small study, and it can’t speak to the causal direction: does loneliness lead to a problem with smile mimicry or is it the other way around? But it does suggest a new target for addressing loneliness, and for further research into the role it might potentially play in real-world encounters. 

“Given the serious problem of loneliness in society and its danger to health, more research on how it presents in everyday social interactions is useful for greater understanding of and treatment of the condition, ourselves and each other,” the researchers write. 

Smile (but only deliberately) though your heart is aching: Loneliness is associated with impaired spontaneous smile mimicry [this study is a preprint meaning that it has yet to be subjected to peer review and the final published version may differ from the version on which this report was based]

Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest

Corrected press releases led to more accurate news, without any dip in quantity of coverage; via Adams et al, 2019

By Jesse Singal

There are many reasons why media outlets report scientifically misleading information. But one key site at which this sort of misunderstanding takes root is in the press releases that universities issue when one of their researchers has published something that has a chance of garnering some attention. A new open-access study in BMC Medicine attempts to change this by intervening in the process directly.

Press releases are often misleading in many different ways, but a common flaw is their tendency to confuse causal and correlational claims. That is, an observational study might find (to take a hypothetical example) that the more wine people drink, the more likely they are to be diagnosed with cancer over a given period. A study like this, reported accurately, doesn’t show that drinking wine causes cancer – it shows that wine consumption is associated with cancer diagnoses. Some other factor or factors could be responsible for the link: perhaps people who drink more wine also engage in other behaviours that themselves increase the risk of cancer. 

Experimental studies, on the other hand, allow for the more confident drawing of causal inferences. If (again hypothetically) you took two otherwise equivalent groups, assigned one to make no lifestyle changes other than beginning to drink more wine, and then tracked differences in long-term cancer diagnoses, it’s more likely any observed group differences were caused by the experimental intervention.
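To make that contrast concrete, here is a minimal toy simulation – all names and numbers are invented for illustration and have nothing to do with the study itself – in which a hidden “lifestyle” factor drives both wine drinking and cancer risk. In the observational version the two measures correlate even though wine has no causal effect in this toy world; randomly assigning wine consumption, as an experiment would, makes the correlation vanish.

```python
# Toy illustration of confounding: a hidden "lifestyle" factor drives both
# wine drinking and cancer risk, so the two correlate in observational data
# even though wine has no effect here. Random assignment breaks the link.
# (Hypothetical numbers only; statistics.correlation needs Python 3.10+.)
import random
from statistics import correlation

random.seed(1)

def simulate_person(randomise_wine=False):
    lifestyle = random.gauss(0, 1)                 # hidden confounder
    if randomise_wine:
        wine = random.gauss(0, 1)                  # assigned independently of lifestyle
    else:
        wine = lifestyle + random.gauss(0, 1)      # lifestyle nudges wine drinking...
    cancer_risk = lifestyle + random.gauss(0, 1)   # ...and, separately, cancer risk
    return wine, cancer_risk

observational = [simulate_person() for _ in range(10_000)]
randomised = [simulate_person(randomise_wine=True) for _ in range(10_000)]

print("observational r:", round(correlation(*zip(*observational)), 2))  # roughly 0.5
print("randomised r:   ", round(correlation(*zip(*randomised)), 2))     # roughly 0.0
```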

The way that health press releases often present purely correlational evidence as though it is causal is, to an extent, understandable: “Wine Causes Cancer” is more eye-catching than “Researchers Uncover A Correlation Between Wine Consumption And Cancer That May Or May Not Be Causal.” And because journalists often write stories based entirely on press releases, the end result is that news coverage often lacks nuance.

For the new research, a team of psychologists led by Rachel Adams at Cardiff University – and including her colleague Christopher Chambers, author of “The Seven Deadly Sins of Psychology: A Manifesto for Reforming the Culture of Scientific Practice” – asked a number of university press offices to participate. The press offices sent the team hundreds of their biomedical and health-related press releases before they went out to the public, and the researchers randomly assigned them to different conditions: some they didn’t touch regardless of their contents or accuracy (the control group), whereas for the others they proposed edits that “aligned” the press release’s headline and content with the nature of the evidence (with experimental evidence allowing stronger causal claims, and purely correlational evidence presented in cautious language). Then they watched the press releases go out into the wild, evaluating how often their more careful approach was carried over into any ensuing national and international press stories – and whether the toned-down releases attracted less media coverage.

The most important takeaways are that news headlines were more accurate when they were based on more accurate press releases (which suggests that journalists really are relying heavily on press releases rather than reading the studies themselves), and that, judged by the amount of media coverage each press release generated, there was “no evidence of reduced news uptake for press releases whose headlines and main claims aligned to evidence.” Press releases should of course be accurate for accuracy’s own sake, but this offers some evidence that honest press-release writers won’t be punished, in terms of reduced media coverage, for doing the right thing. This suggests, as the researchers write in their abstract, that “[c]autious claims and explicit caveats about correlational findings may penetrate into news without harming news interest.”

As the research team further point out, what this study can’t answer is the actual effect of misleading versus appropriately hedged media coverage on news consumers themselves – that is, will their behaviour change on the basis of whether or not they are reading accurate health coverage? That’s a question for future research.

It would be fascinating and important to conduct a study like this on press releases related to psychological findings – another area where it’s been fairly standard for a while for weak or conflicting findings to be presented in a much stronger, more attention-getting manner in press releases, potentially causing harm to readers.

Claims of causality in health news: a randomised trial

Post written by Jesse Singal (@JesseSingal) for the BPS Research Digest. Jesse is a contributing writer at BPS Research Digest and New York Magazine, and he publishes his own newsletter featuring behavioral-science-talk. He is also working on a book about why shoddy behavioral-science claims sometimes go viral for Farrar, Straus and Giroux.


By Emma Young

If you have healthy vision, there will be a specific region of your brain (in the visual cortex) that responds most strongly whenever you look at faces, and similar regions that are especially responsive to the sight of words or natural scenes. What’s more, in any two people, these face, word and scene regions are located in pretty much the same spot in the brain. However, there is not a specific region for every possible category of visible stimulus – there are no “car” or “shoe” regions, for example (at least, not that have been identified to date). Is that because childhood experience is critical for training the visual cortex – we spend a lot of time looking at faces, say, but not cars? And, if so, in theory, could a lot of childhood time spent looking at a different type of object generate its own dedicated, individual category region? 

The answer is “yes”, at least according to an ingenious study, published in Nature Human Behaviour, of people who played a Pokémon game for years of their childhood. 

Jesse Gomez led the new study while a graduate student at Stanford University. He was looking for a way to test whether there’s a critical developmental window for the formation of dedicated category regions in the human visual cortex, just as there is in macaque monkeys. He needed a kind of visual stimulus that some adults had been exposed to intensively in childhood but others had not. He thought of how, from about the age of six, he, like many other kids he knew, used to spend countless hours playing a game on his Nintendo Game Boy called Pokémon Red and Blue. It involved identifying hundreds of different Pokémon characters, which look a bit like animals or mythical beasts. 

Gomez realised that if he could find other people who had also started playing the game intensively at about the same age, using the same device, he could explore whether this had influenced the organisation of their visual cortex. 

He managed to recruit 11 such adults (including himself), and scanned their brains while they were shown images of Pokémon characters, as well as other things, such as faces, corridors and cartoons. Gomez and his colleagues found that within the visual cortex (in the ventral temporal region) of the Pokémon experts, there was a discrete area that was most active when looking at the Pokémon characters. There was no such region in a control group of non-players. 

Other work has found that the brains of people who become experts at recognising a type of object (like cars) as an adult respond differently to those objects than the brains of novices. But these differences are not in the visual cortex; they’re more often in the prefrontal cortex, which is involved in attention and decision-making rather than basic visual processing. 

Building on the work showing the plasticity of the visual cortex in young macaques, “the current finding of a Pokémon-preferring brain region really drives home just how amazing the plasticity of our developing visual system is,” write Daniel Janini and Talia Konkle of Harvard University in a news comment on the paper, published in the same issue. 

Gomez and his colleagues also found that – as with face-processing or word-processing regions – the “Pokémon region” shared a similar location in all of the experts’ brains. They think that the physical size of an object’s image on the retina is important in determining where, in the brain, the category region forms. The image size of a Pokémon character viewed on an old Game Boy screen by children is consistently smaller than that of someone’s face – and a lot smaller than that of a landscape, for instance – which could have a lasting effect on the way visual representations are handled in the adult brain, the researchers think.

As well as being fascinating, the study has potentially important practical implications. “Our data raise the possibility that if people do not share common visual experiences of a stimulus during childhood, either from disease, as is the case in cataracts, or cultural differences in viewing patterns, then an atypical or unique representation of that stimulus may result in adulthood, which has important implications for learning disabilities and social disabilities,” the researchers write.  

Consider autism, for example, which is associated with difficulties recognising faces and an aversion to eye contact. If kids with autism grow up looking at faces differently from how most children do, perhaps this explains the observed deficits in the function of the face-sensitive region of their visual cortex, and in turn this could contribute to the social difficulties that autistic children experience. If this account is correct, then finding out how long the window of visual cortical plasticity lasts will be critical for designing effective interventions for autism and other neurodevelopmental conditions. 

Extensive childhood experience with Pokémon suggests eccentricity drives organization of visual cortex

Image: Pokemon’s figures are on display during the International Tokyo Toy Show 2009 (Photo by Junko Kimura/Getty Images)

Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest


By Matthew Warren

You should take just under two-and-a-half minutes to finish reading this blog post. That’s going by the findings of a new review, which has looked at almost 200 studies of reading rates published over the past century to come up with an overall estimate for how quickly we read. And it turns out that that rate is considerably slower than commonly thought.

Of the various estimates of average reading speed bandied around over the years, one of the most commonly cited is 300 words per minute (wpm). However, a number of findings of slower reading rates challenge that statistic, notes Marc Brysbaert from Ghent University in Belgium in his new paper, released as a preprint on PsyArXiv.

Brysbaert searched for all studies measuring reading rates in participants aged between 17 and 60 and in languages that use the Latin alphabet. The exact nature of the studies varied a lot: for example, in some, participants had to read a long passage before answering multiple choice questions about the text, while in others they read single sentences while their eye movements were measured. But Brysbaert included only those studies in which participants read for fun or comprehension, and excluded others that involved memorisation or other challenges. Altogether, he found 190 suitable studies conducted between 1901 and 2019, collectively involving 17,887 participants. 

The average reading rate across all these studies turned out to be just 238 wpm – much slower than the popular 300 wpm estimate. However, there was quite a lot of variability between studies, particularly those that used very short passages, where the slowest rate was just over 100 wpm and the fastest nearly 400 wpm. With longer texts, the rates fell more closely around the average, suggesting that longer reading tasks might be a more reliable measure.

Although the number of studies involving non-English languages was too small to draw any firm conclusions, there seemed to be a hint of differences between languages. For example, reading rate in the five Spanish studies was considerably faster than the average, at 278 wpm, while the average rate for the 144 English-only studies was 236 wpm. And while the meta-analysis only included participants under the age of 60, Brysbaert notes that other studies have found that reading rate declines in older age groups. 

Knowing that reading rates are closer to 240 than 300 wpm might seem fairly inconsequential. But it does have real-world implications. These kinds of thresholds are used by educators to determine whether someone is a slow reader and in need of remedial help – so homing in on a precise number is important. “Setting the target reading rate at 300 wpm is unrealistic for the majority of people and likely to result in disappointment of what can be achieved,” writes Brysbaert.

The meta-analysis of past reading-rate estimates is arguably the most interesting part of the new preprint – but in fact the manuscript encompasses a lot more. If you want to find out more about the long history of reading research or what the fastest possible reading speed is, go and have a look at the paper. But set aside some time: at more than 19,000 words long, it will take you about 80 minutes to get through. 
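As a rough sanity check on those figures, here is a trivial calculation. The 19,000 words and the 238 wpm come from the preprint; the ~570-word length of this post is an assumption inferred from the “just under two-and-a-half minutes” estimate above.

```python
# Back-of-the-envelope reading times at the review's average rate of 238 wpm.
def reading_minutes(word_count: int, words_per_minute: float = 238) -> float:
    """Estimated minutes needed to read `word_count` words at the given rate."""
    return word_count / words_per_minute

print(f"{reading_minutes(19_000):.0f} min")  # the ~19,000-word manuscript: about 80 minutes
print(f"{reading_minutes(570):.1f} min")     # a ~570-word post (assumed length): just under 2.5 minutes
```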

How many words do we read per minute? A review and meta-analysis of reading rate


Matthew Warren (@MattbWarren) is Staff Writer at BPS Research Digest

The anatomy of the right hand of one of the polydactyl volunteers, via Mehring et al, 2019

By Christian Jarrett

Picture in your mind a futuristic, technologically enhanced human. Perhaps you imagined them with a subcutaneous device in their arm for phone calls and browsing the internet. Maybe they are wearing smart glasses for augmented reality. What I’d wager you didn’t think of is the presence of an artificial sixth digit attached to each hand. However, a breakthrough open-access study in Nature Communications – the first to study the physiology and sensorimotor mechanics of polydactyl volunteers (people born with extra fingers) – shows the feasibility and practical advantages that would be gained from such an extra appendage. The results also have implications for the medical treatment of polydactyl people, who often have their extra finger removed at birth on the presumption that it will be of no benefit to them.

Carsten Mehring and his colleagues conducted various tests with two polydactyl volunteers, a 17-year-old boy and his mother, both born with an extra fully formed finger between their thumb and index finger (known as preaxial polydactyly). The researchers note that polydactyly is “not rare”, with an incidence of around 0.2 per cent in the population. However, fully formed preaxial polydactyly is a rarer subset of that group.

Using MRI of the volunteers’ hands, the researchers established that the extra finger has a saddle joint, similar to a typical thumb, and that it is innervated by its own dedicated nerves. Further tests established that the volunteers had independent control of their extra finger and that they were able to use it to perform a pinch grip with each of their other fingers.

An MRI of the volunteers’ brains further showed that the extra finger was represented in the brain independently of the other fingers. Another test, which involved concealing the extra finger and asking the volunteers to identify landmarks on it, showed that they had an accurate mental representation of their extra digit.

Next, the researchers used video motion capture to observe the volunteers manipulating various objects. This showed that the volunteers engaged in a “rich ensemble of movement patterns” and that they frequently used their extra finger in coordination with both their thumb and index finger (it was not simply used as a substitute for these digits). “Taken together these results demonstrate that the movements of the six fingers of our two subjects had increased complexity relative to common five-fingered hands,” the researchers said.

But do these extra movement capabilities provide any functional advantage? Mehring and his team devised a video game that required coordinating key presses to respond to six boxes oscillating progressively faster up and down onscreen. A different key press was required to respond to each of the six boxes, so people with normal five-fingered hands would need two hands to succeed at the task, the researchers note. Critically, the polydactyl volunteers were able to achieve the same impressive game performance with one hand as with two.

Neuroscience and psychology have extensively studied the profound neural consequences for humans of losing a limb or other appendage, including documenting the pain caused by the phantom limb effect (usually explained as due to reorganisation of the brain’s representation of the missing part). However, this new study represents the first neuroscientific exploration of having an extra body part, finding “… that the human nervous system is able to develop, embody and control multiple extra degrees of freedom and integrate them into coordinated movements with the other limbs, without any apparent deficits or conflicts in the sensorimotor or mental representations.”

This has immediate implications for the medical response to polydactyly, suggesting the need to “…thoroughly evaluate the functionality of [the extra digit] in polydactyl infants before deciding whether to remove it.” Also, from a cyborg perspective, the results “…suggest that it may be of value to augment normal five-fingered hands with an artificial supernumerary finger,” the researchers said. In fact this new research paves the way for an entirely new research endeavour. “Polydactyl individuals with functional [extra fingers] offer a unique opportunity to investigate the neural control of supernumerary limbs, analyse internal representations of body and the limits of sensorimotor capabilities in humans.”

Augmented manipulation ability in humans with six-fingered hands

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest


By Matthew Warren

Psychologists have noticed that aspiring leaders generally pursue one of two different approaches for getting to the top of the social food-chain. Some people exert influence by building up skills or knowledge that command respect and deference from their peers – known as the prestige strategy. Others prefer to rule by fear instead, forcing others to fall into line – the dominance strategy. This dichotomy has even been suggested to account for the vastly different leadership styles of Barack Obama and Donald Trump. 

But many of the studies that have looked at the dynamics of prestige and dominance have done so in artificial social situations, examining groups of strangers brought together for a short time in the lab. So in a new study published open-access in Royal Society Open Science, Charlotte Brand and Alex Mesoudi went out into the world and looked at how hierarchies based on prestige and dominance affected the behaviour of real social groups. 

The researchers recruited 30 community groups from Cornwall, each made up of 5 individuals and ranging from choirs to chess clubs (my favourites include a band known as Falmouth Fish Sea Shanty Collective, and a group of board game creators called Pirates of Penryn).

Each participant completed a 40-item quiz covering topics like art and geography, first individually and then together with their group. The groups were told they had to come to an agreement on each answer, and the group that scored the highest overall would win £500. Finally, each individual cast an anonymous vote for the group member they wanted to represent them in a set of bonus quiz questions which could win them even more money.

Participants also rated each member of their group on prestige (e.g. “Members of your group respect and admire them”), dominance (e.g. “They are willing to use aggressive tactics to get their way”), their likability, and how much influence they had during the quiz and in the group generally. Thankfully for the enduring survival of the groups, these ratings were all anonymous.

Within groups, individuals who had more influence were more likely to be rated as highly prestigious or highly dominant, consistent with previous research suggesting that both strategies can be used to gain status in social groups. Dominance and prestige ratings were not related to each other, again in line with findings that the two strategies are quite distinct.

But intriguingly, neither dominance nor prestige ratings determined whether someone would be elected to take the bonus quiz – even though prestige in particular is thought to be closely tied to having superior knowledge. Instead, elected representatives tended to be those who had scored highly during the individual quiz, suggesting that the other group members had picked up on their expertise in this specific context, and made a pragmatic decision based on this, rather than being swayed by people’s status in the group more broadly.

These findings contrast with previous studies that suggested prestige has a greater influence over group behaviour. They perhaps illustrate the pitfalls of performing social psychology studies solely in the lab: in natural groups, where individuals have had the chance to interact and establish hierarchies over long periods of time, group dynamics may be quite different to those among strangers who have just met. Of course, a selection of community groups from Cornwall is still not necessarily representative of the wider population, and it would be interesting to explore the dynamics of prestige and dominance in more diverse groups.

Nevertheless, when it comes to groups in the real world, the authors say, “prestige and dominance may be more domain-specific, or more fixed, than we had anticipated”. For example, group members may have gained their prestige because of a specific skillset – perhaps they were particularly good at singing or making board games – which was less useful for the general knowledge quiz. 

Prestige and dominance-based hierarchies exist in naturally occurring human groups, but are unrelated to task-specific knowledge

Image: Members of a Cornish male voice choir entertain the crowds near the harbour on August 19, 2013 in Padstow, England (Photo by Matt Cardy/Getty Images).

Matthew Warren (@MattbWarren) is Staff Writer at BPS Research Digest

via McCoy and Ullman (2019)

By Christian Jarrett

In a world with magic, how much effort do you think it would take to cast a spell to make a frog appear out of nowhere? What about to turn a frog invisible? Or make it levitate? And would it be easier to levitate a frog than a cow?

The researchers John McCoy and Tomer Ullman recently put such questions to hundreds of participants across three studies and found they were in remarkable agreement. The findings, published in PLOS One, suggest that we invoke our intuitive understanding of the physical world – our “folk physics” – to make sense of imaginary worlds. And they help explain why fantasy TV shows and books can lose their magic as soon as it feels like anything goes. “Superman leaps tall buildings in a single bound, but a building takes more sweat than an ant-hill,” the researchers said. “And even for Superman, leaping to Alpha Centauri is simply silly.”

In the first of three studies, McCoy and Ullman asked over 200 online participants (aged 18 to 83) to imagine a world in which wizards cast spells and to rank 10 spells in order of how much effort they would require. There was striking agreement across the participants that the easiest spell would be one that changed a frog’s colour and the hardest would be to conjure a frog into existence. In between, from easiest to hardest, the participants ranked the other spells as follows: levitate; teleport; make bigger; turn invisible; turn to stone; split into two frogs; transform to a mouse; and make cease to exist.

A second study with nearly 400 participants was similar, except that this time some of the participants considered the same spells applied to a cow, or involving a greater distance than before (for instance, levitating a frog 100 feet off the ground rather than one foot). Also, this time the participants specified how many magic points would be required for each spell (providing a continuous measure of perceived effort), and the researchers converted these estimates to rank the different spells in order of difficulty.
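The paper’s analysis is more involved, but the basic conversion described above – from raw magic-point estimates to a difficulty ranking – amounts to something like the following sketch (the spell names echo the study, but the numbers are invented for illustration).

```python
# A minimal sketch of turning per-participant "magic point" estimates into a
# difficulty ranking: average the points for each spell, then sort ascending.
# (Invented numbers for illustration, not data from the study.)
from statistics import mean

estimates = {                     # hypothetical estimates from three participants
    "change colour": [5, 10, 8],
    "levitate": [12, 15, 10],
    "conjure into existence": [60, 45, 55],
}

mean_effort = {spell: mean(points) for spell, points in estimates.items()}

for rank, spell in enumerate(sorted(mean_effort, key=mean_effort.get), start=1):
    print(rank, spell, round(mean_effort[spell], 1))
```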

Once again, there was striking agreement among participants in the relative effort required for the different spell types. For instance, the conjure spell was seen as about four times more taxing than the levitate spell. And there was agreement that the same spell type would require more effort when applied to a cow than a frog, and when greater distances were involved.

Across these two studies and a third (a direct replication of the second involving 600 participants), the researchers also assessed the participants’ familiarity with fantasy and magic in books, TV, movies and games. There was no evidence that exposure to fantasy and magic made any difference to participants’ estimates of the difficulty of the spells, ruling out “cultural learning” as an influence on the judgments. “We suggest that the media does not primarily affect what spells are seen as more difficult, but rather that people bring their intuitive physics to bear when they engage with fiction,” McCoy and Ullman said.

Indeed, the pair note that the spells that were consistently judged most effortful – conjure, and cause to cease to exist – “… violate object permanence and cohesion, which are the earliest developing principles at the core of object understanding.” That is, even babies would have the intuition to recognise that something magical was going on if a frog suddenly appeared or disappeared out of nowhere. At the other extreme, the spells judged easiest, such as changing a frog’s colour or levitating it, “…change only accidental object properties such as location and colour.”

This research was conducted with US participants – it will be interesting to see if and how these findings differ in other cultures. For now, though, the results help explain why fantasy is most enjoyable when it is rooted in reality. The researchers conclude by citing the novelist George MacDonald: “The natural world has its laws, and no man must interfere with them … but they themselves may suggest laws of other kinds, and man may, if he pleases, invent a little world of his own” – to which McCoy and Ullman add “It seems people’s little worlds do not stray far from home.”

Judgments of effort for magical violations of intuitive physics

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

“My project proposes that the field can engage with non-English ideas and practices in a much more inclusive and systematic way”

By guest blogger Tim Lomas

The novelist David Foster Wallace famously told a story of two young fish swimming in the sea, in which an older fish glides by and asks, “how’s the water?”, to which they look at each other in puzzlement and say, “What’s water?” The central point of the parable is that we are constantly immersed in contexts to which we give little thought or consideration, but which nevertheless influence us profoundly. Among the most powerful of such contexts is language. A century of research on the linguistic relativity hypothesis (LRH; also known as the Sapir-Whorf hypothesis) has shown that the language we speak profoundly affects our experience and understanding of life, impacting everything from our perception of time and space to the construction of our self-identity.

What might the implications of the LRH be for psychology itself? As a science, the field generally aims to be neutral and objective, and to discover universal truths about the human mind. Yet it is surely consequential that the field mostly conducts its business in English, this being the default language in international journals and conferences. For instance, if a phenomenon has not been identified in English – even if it has in other languages – it is unlikely to be a topic of concern, and may not even “exist” for English-speaking scholars at all.

One way that the field has sought to address this limitation is by “borrowing” words from other languages and cultures. To ascertain the extent of this cross-cultural borrowing, I analysed a sample of words in psychology and recently published my results in the Journal of Positive Psychology.

I focused on my own specialism of wellbeing and in particular on a seminal article from positive psychology, published in American Psychologist in 2000 by Martin Seligman and Mihaly Csikszentmihalyi, which inaugurated this emergent field. My approach was to identify the etymology of every word in the main text of the article using the online etymology dictionary www.etymonline.com.

My findings reveal the diverse etymological roots of psychology, and of English more broadly. Of the 1333 distinct lexemes (words and their variants) in the article, ‘native’ English words – belonging either to the Germanic languages from which English emerged, or originating as neologisms in English itself – comprise only 39.4 per cent of the sample. Thus, over 60 per cent of the article’s words are loanwords, borrowed from other languages at some point in the development of English. This is a higher proportion of borrowed words than analyses have found in English for other categories of phenomena, such as religion and belief (41 per cent), clothing and grooming (39), the body (14), spatial relations (14) and sense perception (11), and in English as a whole (estimated at between 32 and 41 per cent).

In the American Psychologist text, the largest contributor of loan words is Latin (44.5 per cent) – which frequently arrived via French following the Norman conquest of 1066 – followed by French itself (7 per cent) and Greek (7 per cent), with the remainder provided by modern German (0.7 per cent), Old Norse (0.5 per cent), Italian (0.4 per cent), and Arabic, Dutch, and Scottish (all 0.1 per cent). Moreover, of the words treated as English in origin, 52.1 per cent are neologisms created from other languages (mainly Latin and Greek). If such words were also deemed loanwords (or at least, loan adaptations), the number of borrowed words rises to 70 per cent.

One may wonder why psychology has borrowed so many words. Sometimes borrowing reflects the importation of new psychological theories and practices. One example is “psychoanalysis” (coined by Freud as psychische analyse, before being rendered in French as psychoanalyse then Anglicised in 1906). Other borrowed words articulate phenomena of which English speakers may already have known but not yet named or conceptualised, hence the ready adoption of terms to allow such vocalisation. For instance, behaviours we would identify as altruistic presumably occurred throughout the centuries. However, the term “altruism” was not coined until the 1830s – in French as altruisme by the philosopher Auguste Comte, based on autrui, meaning “of or to others” – and soon after entered English. 

By borrowing words from other languages, psychology and our understanding of life become more nuanced and enriched. In that respect, psychology would surely do well to go further, and more consciously and actively engage with non-English languages and cultures. Indeed, this is one aim of my own lexicographic project, which involves collecting “untranslatable” words relating to wellbeing (i.e., words without an exact equivalent in English). This is an evolving and collaborative work-in-progress, which currently includes nearly 1200 words, around half of which are crowd-sourced suggestions to my website.

A key premise of the project is that the augmentation of English over the centuries has been a haphazard and arbitrary process – shaped especially by conceptual innovation in the “classical” world (particularly Greece around the 5th and 4th centuries BCE, and the Roman empire between the 1st and 5th centuries), and by the vicissitudes of geopolitical power (notably the invasion by Germanic tribes in the 5th century, and the Norman conquest in the 11th century). By contrast, English – and psychology too therefore – has largely overlooked the conceptual and lexical innovations made in more distant cultures. There are exceptions though, such as the fruitful engagement by psychology with mindfulness, derived from a Buddhist concept and practice known in Pāli as sati, which illustrates the great value of this kind of cross-cultural engagement. 

My project therefore proposes that the field can engage with non-English ideas and practices in a much more inclusive and systematic way (including, of course, through collaboration and co-production with scholars from the cultures in question). Through this and other such endeavours, we can continue to add to the melting pot of ideas, helping the field to continue to develop over the years ahead.

Etymologies of well-being: Exploring the non-English roots of English words used in positive psychology

Post written by Dr Tim Lomas (@drtimlomas) for the BPS Research Digest. Tim is a lecturer in positive psychology at the University of East London, trying to drive the field forward into new, uncharted territory … His previous books include Translating Happiness: A Cross-Cultural Lexicon of Wellbeing, and The Happiness Dictionary: Words From Around The World To Help Us Live a Richer Life.


By Christian Jarrett

The question is an old favourite – if you could travel back in time, what advice would you give to your younger self? Yet despite the popularity of this thought experiment, no one has, until now, actually studied what people would tell themselves.

Reporting their findings in The Journal of Social Psychology, Robin Kowalski and Annie McCord at Clemson University have done just that, in two surveys of hundreds of participants on Amazon’s Mechanical Turk website. Their findings show that people’s advice to their younger selves is overwhelmingly focused on prior relationships, educational opportunities and personal worth, echoing similar results derived from research into people’s most common regrets in life. Moreover, participants who said they had followed the advice they would give to their younger selves were more likely to say that they had become the kind of person that their younger self would admire. “…[W]e should consult ourselves for advice we would offer to our younger selves,” the researchers said. “The data indicate that there is much to be learned that can facilitate wellbeing and bring us more in line with the person that we would like to be should we follow that advice.”

The two studies followed a similar format, with the participants (selected to be aged at least 30 years) asked: to provide either three pieces or one piece of advice to their younger selves; to reflect on whether following this advice would help them become more like the person they aspire to be or ought to be; to say whether they had actually followed the advice later in life; to consider a pivotal event that had shaped them in life, especially in light of the advice they’d chosen to give their younger selves; and to reflect on what their younger self would make of their current self.

Participants mostly gave themselves advice around relationships (“Don’t marry her. Do. Not. Marry. Her.”), education (“Go to college”), selfhood (“Be yourself”), direction and goals (“Keep moving, keep taking chances, and keep bettering yourself”), and money (“Save more, spend less”). These topics closely match the most common topics mentioned in research on people’s regrets.

Most participants said that the advice they offered was tied to a pivotal event in their past, such as a time they were bullied, a relationship breakup, or an incident involving drink or drugs, and about half the time they expressed regret about what had happened. The timing of these pivotal events was most commonly between the ages of 10 and 30 (consistent with research into the reminiscence bump – the way that we tend to recall more autobiographical memories from our teens and early adulthood).

Well over half the participants in both studies said that they had since followed the advice that they would offer to their younger selves. The majority of participants also said that following the advice would have brought their younger self closer to the kind of person they aspired to be, rather than making them more like their “ought self” (that is, the kind of person that other people or society said they should be). Finally, as mentioned, participants who said they’d followed their own advice (to their younger selves) were more likely to say that their younger high-school self would have respect for the person they had now grown into.

This is preliminary research on an unexplored topic and it’s possible the results might differ in other cultures and using other methods of collecting people’s reflections. However, the work lays the foundation for further questions, such as: how the advice we give our past selves might vary in quantity and kind as we get older; and how following the advice might affect our emotions and hope for the future. “This initial foray into advice to one’s younger self clearly raises so many interesting research questions, many of which will hopefully be examined in months and years to come,” the researchers said.

If I knew then what I know now: Advice to my younger self

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest


By Christian Jarrett

It’s well-established in psychology that intense emotion and physiological arousal interfere with people’s ability to think straight. Most theories explain this in terms of anxiety consuming mental resources and focusing attention on potential threats. Although it’s tricky to study this topic in the psych lab, a handful of field studies involving parachutists and emergency simulations have largely supported this picture. However, a team at the Autonomous University of Barcelona believe that not enough consideration has so far been given to what they call the “valence” of intense situations – whether the person sees the intense experience as positive or negative. To find out whether this makes a difference, Judit Castellà and her colleagues tested dozens of bungee jumpers (most of them first-timers) three times: 30 minutes before a 15-metre free-fall jump; immediately afterwards; and again eight minutes after that. 

The surprising findings, reported in Cognition and Emotion, suggest that when an intensely arousing experience is perceived positively, it may actually enhance cognition rather than be impairing. “Although we expected some degree of moderation, that is, an attenuation of the negative impact of high arousal reported in the literature, we did not predict an actual improvement or a total lack of impairment,” the researchers said. 

At the three testing stages, conducted on a 30-metre-high bridge in Catalonia, the bungee jumpers rated how positive or negative they were feeling, and the intensity of those feelings. They also completed tests of their working memory (the ability to recall strings of digits); their ability to concentrate and pay attention (using what’s known as the Go/No Go Task); and their decision making (their ability to identify which of four packs of cards was the most financially rewarding over time). Their performance was compared with an age-matched group of control participants who completed all the same tests in a similar environment but who were not performing a jump. 

As expected, the jumpers reported far more intense emotions than the control group. Importantly, the jumpers rated these feelings as highly positive before and especially after the jump. However, the main finding is that working memory actually improved in the jumpers after their jump (but not in the controls), and there was a hint that the jumpers’ decision making might have improved too. Meanwhile, the jumpers’ attention performance was unaffected. In short, bungee jumping, although perceived as an intense emotional experience, was not found to impair cognition, and in fact enhanced aspects of it. 

Castellà and her colleagues interpreted their findings in terms of the “Broaden and Build Theory” – the idea that positive emotions can make cognitive functions more flexible and can counteract the narrowing influence of negative emotions. This is just one small study and, as always, the results need replicating and extending. However, the researchers added that their findings could have practical relevance for the training of emergency responders or any professionals who need to make rapid decisions in intensely arousing situations. “Training these professionals to cope with emergency situations by enhancing and focusing on the positive emotions derived from their actions could improve – or at least not impair – their cognitive performance when facing threats.”

Jump and free fall! Memory, attention, and decision-making processes in an extreme sport

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest
