Science is constantly changing, and although we’ve come a long way since the days when it was widely believed that older people couldn’t learn new things, a number of learning myths have stood the test of time despite having no grounding in scientific research.

Tom Bennett—teacher, author, and director of ResearchED—points out that there are still too many unproven theories about learning that are taken as fact. He founded ResearchED to tackle these myths and contribute to greater research literacy in the education community.

“We have had all kinds of rubbish thrown at us over the last 10 to 20 years,” he says. “We’ve been told that kids only learn properly in groups. We’ve had people claiming that children learn using brain gym, people saying that kids only learn if you appeal to their learning style. There’s not a scrap of research that substantiates this, and, unfortunately, it is indicative of the really, really dysfunctional state of social science research that exists today.”

In fact, according to research from the Organisation for Economic Co-operation and Development (OECD), although trillions of dollars are spent on education policies around the world, only one in ten of those policies is actually evaluated.

So with this in mind, we thought we’d line up some of the most common learning myths of the 21st century and take a look at why they don’t hold water.

1. Re-reading and highlighting

When students prepare for a test, two of the most common ways they try to commit the relevant information to memory are re-reading it and highlighting whatever they consider to be important.

However, a report published in the journal Psychological Science in the Public Interest showed that both of these study strategies are relatively ineffective. Passively reading the same text over and over again does little for comprehension or recall unless it’s spaced out over time, and highlighting or underlining can even be detrimental if the wrong information is selected.

So if you want to make your study time count, check out these smart study tactics that are based on the latest brain research.

2. Students have different learning styles

You’ve probably heard about “learning styles” and how everyone has their preferred or ideal learning style, whether it’s visual, auditory, or kinesthetic. The theory is that some people learn better when they take in information by listening to it, while others learn more effectively when information is presented visually, and still others learn best through hands-on practice.

The theory is so popular that a recent poll of head teachers at independent schools showed that over 85 percent believe in learning styles, 66 percent use them in their schools, and some have even sent teachers on courses or paid for external consultants.

But academics from the worlds of neuroscience, education, and psychology have been voicing their concerns about the popularity of this approach to teaching and learning. Systematic studies of learning styles have consistently found no evidence or very weak evidence to support the idea that matching the material to a student’s learning style is more effective.

3. You are either right or left brained

The idea that some people are right brained while others are left brained has been around for a while now. According to the theory, left-brained people are more logical, analytical and methodical, whereas right-brained people are more creative and artistic.

But a 2013 study by scientists from the University of Utah analysed over 1,000 brains and found no evidence that people preferentially use the left or right hemisphere.

Of course, certain functions are processed more by one region of the brain than others, and this is known as lateralization. But we all use our entire brain equally, and the fact that our brain regions are all connected is the very thing that allows us to think both creatively and analytically.

4. The 10,000 hour rule

Journalist and author Malcolm Gladwell popularised the 10,000 hour rule, which is based on research from psychologist Anders Ericsson and says that 10,000 hours of deliberate practice is enough to become world-class in your chosen field.

But although practice is certainly essential when you’re learning a new skill or studying a new topic, there’s no magic number of hours that will turn you into an expert or bring you to the proficiency level of a professional athlete or musician.

A Princeton study found that deliberate practice can only predict success in fields with stable structures where the rules never change, such as tennis, chess, or classical music. In less stable fields, mastery requires more than just practice.

So what’s the takeaway?

“There is no doubt that deliberate practice is important, from both a statistical and a theoretical perspective,” explains study co-author Brooke Macnamara. “It is just less important than has been argued. For scientists, the important question now is, what else matters?”

5. You should always stick with your first answer

Have you ever been advised not to change an answer on a multiple choice test or exam once you’ve put it down? This advice is common in school and even college, and one study found that 75 percent of college students and 55 percent of instructors believe that changing their initial answer would lower their score overall.

Despite the popularity of this theory, research shows that reconsidering your answers isn’t such a bad idea. A review of 33 studies found that, on average, people who change their answers score higher on tests than those who don’t. So if you’ve got extra time and are having doubts about one of your answers, don’t be afraid to give it a second look.

6. Intelligence is fixed at birth

We tend to think of intelligence as something that you either have or don’t have, and this is known as a fixed mindset. However, a growing body of research shows that our IQ can increase over time, and in fact, research on growth mindset by Stanford psychologist Carol Dweck shows that our beliefs about intelligence can actually affect our effort, and in turn, our performance.

So what can you do if you don’t have a growth mindset? Not to worry, it’s an area we could all stand to improve in, so check out these tips for developing a growth mindset.

7. Praising intelligence will motivate students

When we want to motivate our kids, students, or even employees, we often praise their ability and intelligence by saying things like “Wow, that’s so smart” or “You’re really good at this.” However, the same research on the growth mindset by Stanford psychologist Carol Dweck found that this kind of praise can actually be counterproductive and discourage people from taking risks.

So what should we be praising if not ability or intelligence? Dweck’s research shows that praising effort and persistence is a much better way to motivate people to work hard and keep improving. This is because praising effort rather than ability helps promote the idea that intelligence is malleable, and that trying and failing is all part of the learning process.

So instead of being afraid to make mistakes and seem dumb, students come to see that their brain is like a muscle that needs to be strengthened, and that mistakes can actually help them reach their full potential.

8. We only use 10 percent of our brain

The popular theory that we use only 10 to 20 percent of our brain has been around for years, and was even promoted by recent Hollywood movies like Lucy and Limitless, in which the protagonists uncover a way to unlock the rest of their brain and end up with superhuman powers.

Unfortunately, as appealing as it is to imagine that we have untapped potential, this theory is nothing more than an urban legend. It seems to have originated from the 1930s self-help book “How to Win Friends and Influence People,” in which a Harvard University professor was misquoted.

Even so, neuroscience has uncovered some things we can do to study smarter and retain more of what we learn, so it is possible to make your learning more efficient.

9. The learning pyramid

Although the learning pyramid myth was debunked a long time ago, it still lingers and is taken as fact by many teachers and students. The theory says that people remember 10 percent of what they read, 20 percent of what they hear, 30 percent of what they see, 50 percent of what they see and hear, 70 percent of what they say and write, and 90 percent of what they do or teach others.

But while this pyramid would be a useful tool if it were true, the problem with it is that it’s never actually been proven and the percentages given are pure fiction.

It’s unclear where the pyramid and numbers originated, but researcher, learning expert and instructional designer Will Thalheimer points out that if someone uses scientific verbiage, we’re more likely to believe it, which is probably why the learning pyramid is still widely accepted as fact.

“People do not necessarily remember more of what they hear than what they read. They do not necessarily remember more of what they see and hear than what they see,” he says. “The numbers are nonsense and the order of potency is incorrect.”

10. There are shortcuts to better learning

This is probably the biggest learning myth of all time, because every learning myth we’ve covered so far is tied to the idea that there’s a quicker way to commit new information to memory.

It’s understandable, of course, since learning is hard work and we’d all love to take a shortcut if we could. But despite all the learning fads that have come and gone, from mindfulness to brain training games and exercises, learning is and will always be a process. It requires time and effort, and is bound to feel difficult and uncomfortable at times.

So while an understanding of how the brain works can certainly help us study and learn more effectively, the bottom line is that there are no shortcuts. The next time someone tells you about an app or learning method that sounds too good to be true, take it with a pinch of salt and remember to view claims critically and look for the evidence behind them.

Want to learn how to read critically, check facts, and gain a more balanced perspective? Check out these tips for honing your fact-checking skills.

The post 10 Common Learning Myths That Might Be Holding You Back appeared first on InformED.

Teachers are always on the lookout for ways to improve student engagement, and learners themselves can benefit from trying out different strategies. A new study, published in Frontiers in Psychology, has found that young students are twice as engaged and attentive in class following an outdoor learning session. The researchers suggest that including more nature in formal education could boost overall concentration, thereby improving academic performance, and the study also shows promise for independent learners seeking their own solutions to problems surrounding focus and sustained interest.

While scientists have known for some time that nature benefits the brain in a range of ways, from improving creativity to reducing stress, they had not yet demonstrated a clear connection between engagement levels and outdoor learning. Teachers, for their part, also tend to hesitate when it comes to holding outdoor lessons, worrying that students might get overexcited and distracted by what’s going on around them, or that once they return to an indoor setting they won’t be able to concentrate on the material at hand. This study lays the foundation for that connection and shows that teachers have nothing to worry about.

For the study, the researchers spent ten weeks observing third-grade students at a school in the Midwestern U.S., asking two different teachers, one hopeful and one skeptical, to hold one lesson a week outdoors and a similar lesson indoors. The outdoor learning setting was a grassy spot with a view of a wooded area, just outside the school. After each outdoor or indoor lesson, the researchers measured how engaged the students were, counting “the number of times the teacher needed to redirect the attention of distracted students back to their schoolwork during the observation, using phrases such as ‘sit down’ and ‘you need to be working.’” An outside observer also examined photos taken of the class over this ten-week period and scored the level of class engagement, without knowing whether the photos were taken after an indoor or outdoor lesson. The teachers scored perceived engagement levels as well.

“We wanted to see if we could put the nature effect to work in a school setting,” says Ming Kuo, who led the study along with her colleagues at the University of Illinois at Urbana-Champaign. “If you took a bunch of squirmy third-graders outdoors for lessons, would they show a benefit of having a lesson in nature, or would they just be bouncing off the walls afterward?”

Results showed that students were more engaged after the outdoor learning sessions.

“Far from leaving students too keyed up to concentrate afterward,” Kuo says, “lessons in nature actually leave students more able to engage in the next lesson, even as students are also learning the material at hand.”

Kuo says this “nature effect” allowed instructors to teach for significantly longer during a subsequent indoor lesson.

“Our teachers were able to teach uninterrupted for almost twice as long at a time after the outdoor lesson,” Kuo says, “and we saw the nature effect with our skeptical teacher as well.”

Further research is planned to test the effect in other schools and for teachers of different experience levels. For the moment, regular outdoor learning sessions appear to be an inexpensive and convenient way for schools to boost student engagement and academic performance. That’s one more very good reason (as if we needed another) to protect the natural world.

The post Outdoor Learning Boosts Student Engagement appeared first on InformED.

You’ve likely heard that exercise is good for the brain, but could you explain why exactly? An exciting piece of research from the National Institutes of Health (NIH) points to a very specific kind of exercise that directly affects brain health, spurring the creation of new cells. For those of us wondering whether it’s worth it to push ourselves and go the extra mile, this research offers a convincing argument for picking up the pace and extending our training a little longer than we might think necessary.

For the study, published in Cell Metabolism, NIH researchers isolated muscle cells from mice in petri dishes and added a compound that boosted cell metabolism, mimicking aerobic exercise and making the cells “think they were running.” Then the researchers tracked which proteins were released during the “exercise,” specifically looking for the ones that crossed the blood-brain barrier. One particular protein, called cathepsin B, spurred neurogenesis (brain cell creation) once it reached the brain. Cathepsin B plays an important role in helping sore muscles recover, clearing away cellular debris, but scientists had not, until now, considered it related to brain health.

During further testing with colleagues in Germany, the NIH researchers measured levels of cathepsin B in the bloodstream of mice running regularly for several weeks and humans (young men and women) running regularly for four months, exercising vigorously around three times a week for an hour or longer.

Concentrations of cathepsin B rose steadily in all subjects, as predicted, but here’s the interesting bit: All of the runners began performing better on memory and thinking tests.

Most striking, says Gretchen Reynolds for the New York Times Wellness blog, is that in the human volunteers, “the men and women whose fitness had increased the most—suggesting that they had run particularly intensely—not only had the highest levels of cathepsin B in their blood but also the most-improved test scores.”

Just to make sure it was really cathepsin B responsible for these improvements, the scientists bred mice that could not produce it and tested their ability to retain information after exercising. Those mice “learned haltingly and soon forgot their new skills,” which suggests that if we want to gain the biggest brain boost from exercise, it needs to be exercise that spurs the creation of this protein.

High-intensity, long-term endurance training appears to be correlated with the highest cathepsin B levels.

That’s not to say light exercise isn’t beneficial. “Any amount of exercise is going to be better than none,” says Henriette van Praag, the NIH investigator who oversaw the study, but the lesson of these experiments is that “our brains appear to function better when they are awash in cathepsin B.”

For all the hype surrounding brain training games and the like, the fact appears to be that physical exercise aids brain power just as much as, if not more than, mental exercise. Reynolds puts it this way: “These experiments strongly suggest that while mental stimulation is important for brain health, physical stimulation is even more potent.”

Next time you’re wondering whether you should train for a longer race, or move your legs a little faster, or head out the door at all—think of these findings and push yourself to do it. You’ll be helping your brain out as well.

The post Here’s Why Exercise Improves Brain Function appeared first on InformED.

It’s no secret that our emotional state can affect our learning abilities, and although a bit of anxiety about an exam or upcoming assignment is normal, when that stress builds up too much, it can hinder our ability to take in, process, and store new information. Unfortunately, anxiety is on the rise among students, and according to one study, the number of students declaring a mental health problem has doubled in the last five years. But how and why do stress and anxiety impair our ability to learn?

Research shows that when we’re under stress, the brain simply stops forming new connections. This is because stress and anxiety activate the body’s fight-or-flight response, bringing on physiological and psychological changes that enhance our ability to react to danger: our adrenaline levels rise, our heart rate and breathing speed up, blood is diverted to the limbs, and our body temperature may increase.

If this happens while we’re trying to study, however, the brain essentially blocks access to higher processing, which makes it difficult if not impossible to retain new information.

So if you frequently find yourself dealing with stress and anxiety that’s interfering with your studying, here are a few tried and proven tips for managing your learning anxiety.

1. Identify the source of your anxiety

If you’ve been feeling unusually anxious about your learning, it’s important to identify the underlying cause so you can tackle the problem head-on. If you’re unable to identify the source of your anxiety, start keeping a daily journal where you write down the events of the day along with your thoughts and feelings about them. This can help you identify unhealthy patterns and avoid specific things that trigger your anxiety, whether it’s a lack of sleep, unrealistic expectations, or even too much caffeine.

2. Try mindfulness training

Mindfulness is all about being aware of and paying attention to our thoughts and emotions, and research shows that mindfulness training can reduce anxiety and depression. One study from researchers at the University of Cambridge found that mindfulness training can be particularly useful in supporting students who are at risk of developing mental health problems and helping them develop preventative coping strategies.

So what did this mindfulness training entail?

For the study, students were offered eight face-to-face group-based sessions and were also encouraged to practice 15-25 minutes of mindfulness meditation at home, in addition to mindfulness practices like mindful eating and mindful walking. Students who received mindfulness training had lower distress scores after the course and during exam term. In fact, distress scores for the mindfulness group fell below baseline levels even during exam time, whereas students who had received standard support became increasingly stressed as the year progressed.

3. Seek support

Research shows that getting adequate social support is one of the best ways to cope with major life stresses, and people with good social support networks live longer and are healthier than those with few close relationships.

With this in mind, if a specific class or subject is causing you anxiety, don’t be afraid to approach your teachers, counsellors, or fellow students to ask for extra support. If you’re studying abroad or in another city, it’s also a good idea to allocate time to socialising and building networks in your new location, in addition to keeping in touch with your loved ones back home.

4. Prioritise your physical health

Our mind and body are closely connected, so if you’ve been feeling overly anxious, simply making an effort to eat the right foods, exercise regularly, and get plenty of sleep can already make a big difference to your state of mind.

Regular physical activity has been shown to reduce stress and anxiety, and even resistance workouts such as weightlifting are linked to lower anxiety levels.

Research also shows that certain dietary choices can relieve anxiety. Complex carbohydrates such as legumes, whole wheat bread or pasta, and starchy vegetables, for example, are metabolised more slowly, which can reduce the feelings of anxiety caused by dips in your blood sugar level.

5. Plan and organise

Another way to manage feelings of anxiety that are related to your learning is to get properly organised by breaking coursework into smaller chunks and setting personal goals and deadlines.

Oftentimes, our feelings of anxiety are caused by a feeling of powerlessness, so getting properly organised will help you to regain a sense of control and feel calmer about what needs to be done. If you need some help, check out these tips for scheduling your study time.

6. Distance yourself

Researchers have identified a new strategy to tackle stress and anxiety known as “self-distancing.” It involves talking to ourselves in the third person, which can help us distance ourselves from stressful situations and gain some outside perspective.

Previous research from Michigan State University also shows that talking to yourself in the third person during stressful times can help you control your emotions without any additional mental effort. So instead of asking “Why am I feeling anxious?” you can simply replace the first person pronoun and ask “Why is John feeling anxious?” It’s a subtle change, but one that can make a big difference in how we perceive our situation and emotions.

7. Emphasise positive self-talk

Since we all engage in self-talk whether we realise it or not, one thing that can have a powerful impact on our state of mind is focusing on our inner monologue and becoming more aware of how we’re talking to ourselves.

Research shows that while destructive self-talk can cause us to question ourselves, positive self-talk can actually boost our productivity. So when something goes wrong, don’t let your first reaction be to chide yourself with thoughts like “How could you be so stupid?” Instead, try to focus on more positive or constructive thoughts like “I’m glad I tried, even if it didn’t go exactly as planned.”

8. Focus on your breathing

Breathing isn’t something we normally pay much attention to, but when we’re stressed or anxious, we can actually forget to breathe properly, holding our breath for too long or breathing too quickly, which can cause us to tense up even more and increase our anxious feelings.

So whenever you feel yourself getting tense or anxious, focus on breathing in slowly for five counts, and then breathing out for five counts. By the time you’ve done this simple breathing exercise a few times, you’ll already be noticeably calmer and more composed.

9. Procrastinate productively

Most students are no strangers to procrastination, and according to one survey, between 80 and 95 percent of students procrastinate. But while it’s true that procrastination can be a student’s worst enemy, some experts believe it’s possible to put this tendency to delay things to good use.

In his book Wait: The Art and Science of Delay, University of San Diego professor Frank Partnoy points out that there are two types of procrastination: active and passive. Passive procrastination is a decidedly negative thing because it prevents you from getting things done. Active procrastination, on the other hand, can be a positive thing, because it involves delaying one task while you work on another important task instead.

So if you’re feeling anxious about tackling a particular learning task, you can temporarily put it off and still remain productive by crossing other important tasks off your to-do list. Not sure how to make it work for you? Check out this article for tips on how to procrastinate more productively.

10. Schedule your downtime

Sometimes, when our stress and anxiety build up too much, what we really need is some downtime to recharge our batteries and relax. So if you’ve been feeling unusually tense whenever it’s time to study, you might need to start scheduling downtime the same way you would any of the other important responsibilities in your life.

Even if you feel you can’t afford to take a whole day off, make a point of scheduling at least an hour each day where you can turn off your phone, put away your laptop, and do something that totally relaxes you, whether it’s taking a nap, listening to music, or going for a nature walk.

Have you ever struggled with stress or anxiety that made it difficult to focus on your learning? If so, what strategies did you use to relax and get refocused?

The post Learning Anxiety: 10 Ways to Calm Your Mind appeared first on InformED.

Blockchain technology, a data management tool forecast to disrupt a wide range of industries, has taken off for two main reasons: 1) it makes sensitive data simultaneously more shareable and more secure; 2) it takes sensitive data out of the hands of third party authorities and puts it back into the hands of its users. In the realm of education, this means everything from taking ownership of one’s learning credentials, to legitimizing one’s informal learning achievements, to expediting one’s knowledge transfer, to streamlining the job application process—all more easily and more securely. Educators and policymakers are in the earliest stages of applying blockchain tech to teaching and learning in these ways, but the possibilities point to a long-overdue disruption of the current education system.

First things first—how does blockchain work, exactly, and what are its main benefits?

What Is Blockchain Technology?

A blockchain is a database that stores permanent blocks of information, such as a transaction history, to be shared within a particular community. The best known example of blockchain technology is Bitcoin, a cryptocurrency that allows users to make and receive payments without going through a central banking authority. Crucially, blockchain makes it possible to send and receive only the most relevant bits of information to specified parties, and because each party receives a copy of the information, shared accountability keeps the information accurate and secure. Here’s an example of why that’s important:

“When you give a bartender your driver’s license, all that person needs to know is your age,” explains Zach Church, a writer for MIT’s Business Management blog. “But you’re revealing so much more—your address, your height, whether you’re an organ donor, etc. The same thing happens in commercial transactions.”

In other words, organizations don’t need to know everything about us, and it’s becoming increasingly critical that they don’t.

“Information disclosure is increasingly becoming a cost because of data breaches,” Church says. “We can’t keep our data private and it’s becoming increasingly complex to do so within large organizations. So imagine a model where you can verify certain attributes are true or false, potentially using a decentralized infrastructure, but you don’t have to reveal all these attributes all the time.”

That’s where blockchain comes in.

In their 2017 Joint Research Centre report, the European Commission explains why this technology could be so powerful:

  • Each member maintains his or her own copy of the information and all members must validate any updates collectively.
  • The information could represent transactions, contracts, assets, identities, or practically anything else that can be described in digital form.
  • Entries are permanent, transparent, and searchable, which makes it possible for community members to view transaction histories in their entirety.
  • Each update is a new ‘block’ added to the end of a ‘chain.’ A protocol manages how new edits or entries are initiated, validated, recorded, and distributed.
  • Cryptology replaces third-party intermediaries as the keeper of trust, with all blockchain participants running complex algorithms to certify the integrity of the whole.

Information on a blockchain can be thought of as a kind of public ledger for a specific group, designed to keep information secure and in the hands of only those to whom it is most relevant.
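
To make the mechanics concrete, here is a minimal sketch in Python of the structure the Commission describes: each new entry is a block that stores a cryptographic hash of the previous block, so participants can re-validate the whole chain and detect tampering. This is an illustrative toy, not a production design, and all names and fields are invented for the example.

```python
import hashlib
import json
import time

def block_hash(block):
    # Serialise the block deterministically (sorted keys) and hash it.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    def __init__(self):
        # A fixed 'genesis' block anchors the chain.
        self.chain = [{"index": 0, "timestamp": 0, "data": "genesis",
                       "prev_hash": "0" * 64}]

    def add_block(self, data):
        # Each update is a new block linked to the hash of the previous one.
        prev = self.chain[-1]
        self.chain.append({"index": prev["index"] + 1,
                           "timestamp": time.time(),
                           "data": data,
                           "prev_hash": block_hash(prev)})

    def is_valid(self):
        # Any participant can recompute the hashes to certify integrity.
        return all(self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add_block({"student": "alice", "credential": "BSc Biology"})
ledger.add_block({"student": "bob", "credential": "Dip. Data Science"})
assert ledger.is_valid()

ledger.chain[1]["data"]["credential"] = "PhD Physics"  # tamper with an old block...
assert not ledger.is_valid()  # ...and the link to the following block breaks
```

A real blockchain adds a consensus protocol and digital signatures on top of this hash-chaining, but the tamper-evidence shown here is the core property the bullet points above rely on.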

“The responsibility of keeping accurate ledgers has traditionally been assigned to a variety of institutions,” the Commission writes. “Governments control ownership of land by controlling ledgers of property; banks control the world’s monetary system by holding the ledgers for currency; while stock exchanges control large shares of the business world by holding ledgers for business-ownership.”

But as data breaches become more frequent, trust in these third party intermediaries is waning.

“The corollary is that these institutions may individually or collectively cause significant harm or even social chaos by abusing the trust placed in them to accurately keep and maintain these ledgers. The inference is that these institutions have the power to use or abuse their control over the ledgers and exert significant control over individuals and societies within their immediate remit.”

In this sense, the main advantages of blockchain tech include:

  • Self-sovereignty: users can identify themselves while maintaining control over the storage and management of their personal data;
  • Trust: users can have more confidence in an infrastructure that securely offers transactions such as payments or the issue of certificates;
  • Transparency: users can conduct transactions knowing that each party has the capacity to enter into that transaction;
  • Immutability: users can rest assured that records will be written and stored permanently, without the possibility of modification or loss;
  • Disintermediation: the removal of the need for a central controlling authority to manage transactions or keep records;
  • Collaboration: the ability of parties to transact directly with each other without the need for mediating third parties

In the job sector, blockchain technology could help employers, job seekers, and organizations alike. Employers could better validate the accuracy of information supplied by job applicants, which could make the hiring process more efficient for recruiters and human resource managers. Job seekers could benefit from more personalized information management that matches their skills with job offers. Organizations could use the technology to automate payments, contracts, and other documents and procedures.

But as we’re here to discuss the future of education, let’s take a look at how blockchain technology could benefit learning.

How Could Blockchain Change Education?

1. Disrupting the Current Education Model

Today’s students receive an education in both formal and informal learning settings, not just traditional brick-and-mortar university lecture halls. There are MOOCs and other online courses, workshops and conferences, co-learning spaces and boot camps.

“The centralised model of present-day learning is no longer sustainable,” the authors of the report write. “Blockchain technology allows for a total disintermediation and disaggregation of higher education.”

Micro-accreditation could take place through a blockchain, allowing for the easy validation and transfer of skills and credentials.

2. Storing Permanent Records

Since records are stored permanently on a blockchain, documents like degree and course certificates can be secured and verified regardless of whether a user has access to an institution’s record-keeping system.

“Even if the institutions that issued the certificates were to close down, or if the entire system of education collapses (as, for instance, happened in Syria), those certificates are still verifiable against the records stored in a blockchain,” the European Commission writes. “Furthermore, once institutions issue a certificate, they do not need to spend any further resources to confirm the validity of that certificate to third parties, since these will be able to verify the certificates directly themselves on a blockchain.”
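
As a rough sketch of how that verification could work, reusing the toy Ledger class from the sketch earlier in this article: the institution publishes only a hash (a digital fingerprint) of the certificate on the chain, and anyone holding the document can later recompute the hash and look for it, without ever contacting the issuer. Real systems such as MIT’s Blockcerts (discussed below) add digital signatures and revocation handling; everything here is simplified for illustration.

```python
import hashlib

def fingerprint(document_bytes):
    # Only this hash goes on the chain; the document itself stays private.
    return hashlib.sha256(document_bytes).hexdigest()

# Issuance: the institution records the certificate's fingerprint on-chain.
certificate = b"Jane Doe, BSc Chemistry, University X, 2018"
registry = Ledger()  # the toy hash-chained ledger from the earlier sketch
registry.add_block({"type": "certificate", "hash": fingerprint(certificate)})

def verify(document_bytes, chain):
    # A third party recomputes the hash of the document they were handed
    # and checks that it appears somewhere in the chain.
    h = fingerprint(document_bytes)
    return any(isinstance(b["data"], dict) and b["data"].get("hash") == h
               for b in chain[1:])  # skip the genesis block

assert verify(certificate, registry.chain)                            # genuine
assert not verify(b"Jane Doe, PhD Chemistry, 2018", registry.chain)   # forged
```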

3. Identity Verification & Information Security

Using blockchain, students and job candidates can identify themselves online while maintaining control over the storage and management of their personal data. Currently, this is not so easy to do, as the Commission explains:

“Within larger organisations, students need to regularly identify themselves with different parts of the organisation. In such cases, either each part of the organisation will collect the student data for itself, or the organisation will use single sign-on, whereby one shared copy of the student data is used by all parties within the organisation. Under both these models, tens if not hundreds of people might have access to a student’s personal information. Keeping that data safe requires managing access rights for all those people, and ensuring that their devices are also secure and hack-proof—a mammoth undertaking.”

With blockchain, only a select few—namely the parties responsible for verifying a student’s identity—can have access to the data. Other than that, it’s in the student’s hands.

“This means that the organisation no longer needs to manage the complex systems for access rights, and only needs to secure the device or network where the initial verification is taking place. This would save significant resources spent in hardening the network against data breaches, staff training on data-protection and in managing access rights.”

4. Student Ownership of Learning

Blockchain allows personal data to be just that—personal.

Students gain control and ownership of all their education data, including accreditation and portfolios of work, “in a secure place that is accessible to anyone who needs to verify it—and for their entire lifetime.”

Drawing on the research of Au (2017) and Lewis (2017), the Commission explains:

“Public blockchains facilitate self-sovereignty by giving individuals the ability to be the final arbiter of who can access and use their data and personal information. Within an educational context, the term is on its way to becoming synonymous with the empowerment of individual learners to own, manage and share details of their credentials, without the need to call upon the education institution as a trusted intermediary.”

5. Interactive Learning & Analytics

“Imagine a scenario where every learning activity is registered on the Blockchain, including informal learning – together with informal feedback,” the authors write. “All assignment test scores will be mapped on learning environments across Europe. Europe-wide analytics could then be developed from the ground up. The best lecturers in Europe by subject could be easily identified. Learning would become that much more interactive – and reputations built on more tangible matrices.”

The success of blockchain pilots in one country could then be used to encourage knowledge-transfer across nations.

6. Automatic Transfer of Credits

Credit transfer is a thorny process, often leaving students at a disadvantage when they find, for example, that they must repeat courses to fulfill a new institution’s requirements.

“Currently, credit transfer depends on institutions to negotiate agreements to recognise each other’s credits subject to certain conditions—but students often report that these agreements are not recognised. Using a blockchain, these agreements could be written as smart-contracts whereby upon fulfilment of the conditions of the contract, the credits would automatically be transferred.”
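
As a hedged illustration of what such a smart contract might look like, here is a toy version in plain Python: the inter-institutional agreement is encoded as data plus a rule, and the transfer executes automatically once the rule is satisfied. On a real blockchain this logic would run as an on-chain contract (for example, written in a language like Solidity); the field names and grade threshold below are invented for the example.

```python
# A toy 'smart contract' for credit transfer. The agreement is negotiated
# once, encoded as code, and then enforced automatically.
AGREEMENT = {
    "from_institution": "University A",
    "to_institution": "University B",
    "course": "CHEM101",
    "min_grade": 65,        # hypothetical condition of the agreement
    "credits_awarded": 6,
}

def execute_transfer(record, agreement, transcript):
    """Append transferred credits to the student's transcript if and only
    if the verified record fulfils the agreement's conditions."""
    if (record["institution"] == agreement["from_institution"]
            and record["course"] == agreement["course"]
            and record["grade"] >= agreement["min_grade"]):
        transcript.append({"institution": agreement["to_institution"],
                           "course": agreement["course"],
                           "credits": agreement["credits_awarded"],
                           "via": "smart-contract transfer"})
        return True
    return False  # conditions not met: no transfer, no negotiation needed

transcript = []
record = {"institution": "University A", "course": "CHEM101", "grade": 72}
assert execute_transfer(record, AGREEMENT, transcript)
```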

7. The “Lifelong Learning Passport”

There are a few existing resources, e.g. social networks and e-portfolio companies, that provide users with a way to record their learning during and beyond schooling. But they don’t take advantage of digitisation the way blockchain does.

“Except for Open Badges, none of these [resources] provide ways to verify the experience and credentials described and included within these systems – therefore these systems operate as a digital counterpart to a box full of paper certificates – deriving little to no additional benefits or efficiencies from the process of digitisation.”

With blockchain tech, learners could store their own evidence of formal or informal learning, share it with a desired audience, and ensure instant verification.

“This means students have a CV that updates itself and can be shared with employers. Employers, on their part, can reduce their workload since they won’t have to verify CVs and can simply search instantly to see whether candidates have the skills they require.”

Finally, a way to record lifelong learning.

8. Copyright for Educational Content

In theory, blockchain could allow educators to publish content openly, while keeping track of re-use, without putting limitations on the source material.

“Were such a system introduced, it would allow for teachers to be rewarded based on the level of actual use and reuse of their teaching materials, similar to how they are rewarded based on citations to research papers.”

Students and institutions could then make metrics-based decisions on which teaching materials to use.

Teachers could announce the publication of their resources and link to them, or announce which other resources they used in creating the material. Coins could then be awarded to educators according to the level of reuse of their respective resources.

“In an open-scenario, coins would not be spendable—and would be used to determine the prominence of an author. In a closed-scenario, coins would have monetary value and would result in monetary compensation. A more advanced implementation might automatically scan resources to identify what percentage of other resources were re-used and automatically award accordingly.”

For example, a smart-contract could distribute payment to authors based on how often their material is cited or used. Authors would no longer have to go through intermediaries such as research journals, which often limit use by charging high fees for access.
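
As a rough sketch of the closed-scenario payout logic described above, assuming hypothetical reuse counts that a real system would derive from on-chain usage records:

```python
# Toy payout rule: distribute a funding pool to authors in proportion to
# how often their teaching materials were reused. All figures are invented.
reuse_counts = {"author_a": 40, "author_b": 10}
POOL = 500.0  # monetary units to distribute this period

total = sum(reuse_counts.values())
payouts = {author: POOL * count / total
           for author, count in reuse_counts.items()}
print(payouts)  # {'author_a': 400.0, 'author_b': 100.0}
```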

9. Multi-Step Accreditation

Validating credentials across education systems is not a simple business. In Europe, for instance, there are hundreds of accreditation pathways, through both public and private institutions, and employers therefore need to verify not only the issuer of the credential but the quality of the institution issuing the credential. Currently, the verification process involves consulting the institution itself, determining the quality of the accreditation the institution claims to have, confirming this issuance with the accrediting body itself, and checking with the governing authority to see if the accrediting body is authorized to operate in its particular capacity.

“This is an extremely time-consuming and technical process which requires experts in accreditation to manage,” the Commission says, noting how entire networks of agencies with offices in every EU member state currently exist to manage the process.

But blockchain offers a more efficient alternative:

“Using a blockchain, rather than researching these connections, institutions needing to check the ‘pedigree’ of a degree could easily do so with a single click. A fully automated process would then be able to visualize the accreditation chain and verify that certificates had indeed been issued, and (critically) that they were still valid for each step of the chain.”
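
The “single click” the Commission imagines amounts to walking a chain of trust: verify the certificate against its issuer, the issuer against its accreditor, and so on up to a recognised root authority. A minimal sketch of that walk, with hypothetical names:

```python
# Each body records who accredited it; verification walks upward until it
# reaches a trusted root authority (or fails). Names are hypothetical.
ACCREDITATIONS = {
    "University X": "National Accreditation Agency",
    "National Accreditation Agency": "EU Governing Authority",
}
TRUSTED_ROOTS = {"EU Governing Authority"}

def pedigree(institution):
    """Return the full accreditation chain, or None if it does not
    terminate at a trusted root (or contains a cycle)."""
    chain, current = [institution], institution
    while current not in TRUSTED_ROOTS:
        parent = ACCREDITATIONS.get(current)
        if parent is None or parent in chain:
            return None  # unaccredited issuer, or a circular claim
        chain.append(parent)
        current = parent
    return chain

print(pedigree("University X"))
# ['University X', 'National Accreditation Agency', 'EU Governing Authority']
print(pedigree("Diploma Mill Ltd"))  # None: pedigree cannot be established
```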

10. Payment and Funding

Blockchain could allow students to pay for their education via cryptocurrency, which would eliminate barriers such as restricted access to bank accounts or credit cards depending on country of origin.

“Especially for cross-border studies, and also in response to legislation, many organizations only accept payment made through electronic means.”

Governments and organizations could also provide students with funding for tuition in the form of blockchain “vouchers” to be “spent” at universities, which could be programmed to “release tranches of funding to either the student or the educational organization, based on certain performance criteria such as grades.”

Several universities around the world are already implementing blockchain technology and reaping the benefits.

Open University UK is currently using blockchain to improve access to higher education and transparency of qualifications through MOOCs, open badges, and e-Portfolios. Their strategy is holistic, with “researchers encouraged to explore the full potential of technology, as opposed to one particular aspect (such as cryptography).”

The University of Nicosia in Cyprus claims several “world firsts” in using blockchain for education, including accepting Bitcoin for tuition for any degree program; offering courses on cryptocurrency and degrees in digital currency; and issuing academic certificates onto the Bitcoin blockchain, using its own in-house software. The cryptocurrency course alone has attracted students from over 80 different countries since its launch in 2014.

In 2015, the Massachusetts Institute of Technology’s Media Lab began using cryptology and blockchain to develop Blockcerts for issuing digital certificates to groups of people in its broader community. In 2017, MIT issued its first round of Learning Machine (LM) Certificates, commercial versions of Blockcerts, at the Media Lab and the Sloan School of Management. This was the first example of recipient-owned diplomas.

Will your school be next?

The post Blockchain Technology: Can It Change Education? appeared first on InformED.

Attention-Deficit/Hyperactivity Disorder (ADHD) is a neurodevelopmental condition marked by a limited ability to focus, control behavioural impulses, and manage one’s energy. It affects around 7.2 percent of children and 3.4 percent of adults worldwide. Normally viewed as a disorder that stands in the way of efficient learning, ADHD may actually offer several cognitive benefits to those affected by it, according to experts.

“Children with ADHD can be highly creative and can spend a long time doing activities they love,” Australia’s Raising Children Network states. “Some children might enjoy using their energy on sport or dancing. They might also be more open to trying new things than other children. Finding positive ways for your child to use her energy can be good for her self-esteem and help protect her against mental health problems.”

Researchers have found that adults can benefit, too. Many high-profile CEOs, such as JetBlue founder David Neeleman, cite having ADHD as one of the reasons for their success, reporting higher levels of creativity and motivation.

In both children and adults, there are cognitive benefits to having “too much energy” and “being easily distracted.” Let’s celebrate a few of these below.

1. Entrepreneurial Mindset

Johan Wiklund of Syracuse University has found a positive correlation between ADHD and entrepreneurial success. Hyperactivity and impulsivity, in particular, appear to contribute to success as an entrepreneur.

Wiklund has been studying entrepreneurship for over twenty years. So far he has conducted a case study of 16 entrepreneurs with ADHD diagnoses, a survey of MBA alumni, and a survey of successful entrepreneurs. He has found that “ADHD symptoms are directly linked to behaving more entrepreneurially within [people’s] organisations, and positively linked to growth and performance.”

2. Flexible Thinking

Mayra Mendez, PhD, of Providence Saint John’s Child and Family Development Center, suggests that, in some cases, having trouble focusing on one thing can also be seen as “flexible thinking.” People with ADHD appear to be better multi-taskers. Although focusing on a single task is crucial to learning, it’s worth noting that those with ADHD may have a leg up when it comes to cognitive flexibility.

3. Quick-Starter Attitude

People with ADHD are known to be more impulsive than most, making quick decisions without considering the consequences. Wiklund says this might be a good thing, however.

“Impulsivity is particularly interesting because it is such a negatively loaded word,” he says. “But it is impulsivity that triggers people with ADHD to act and take risks where other people would wait and see. They also tend to look at the potential gains rather than fear the potential losses, which helps them keep going and to keep coming back.”

Mendez adds: “Quick reactions lead to action. People who are impulsive don’t sit around and feel helpless.”

4. Creativity

Recent studies suggest that people with ADHD come up with more creative ideas more quickly than those without ADHD. In one test, wherein children were tasked with generating toy design ideas, those with ADHD “came up with a far more diverse array of different types of toys than those without ADHD.” In another test, wherein adults were asked to come up with as many uses as possible for a common object like a cup or a brick, “those with ADHD outperformed those without it.”

The creativity advantage seems only to apply to idea generation, though, and not to pattern recognition:

“When adults were given other tasks to test creativity, such as one in which they had to find something in common amongst three seemingly unrelated items (such as the words mines, lick, and sprinkle) those with ADHD performed worse than those without it.”

5. Emotional Processing

People with ADHD tend to have trouble controlling their emotions as well, often reacting according to how they feel without filtering it first. Researchers say this could be a good thing in some cases, as “allowing ourselves to feel emotions as they happen helps us process them and prepare for the future.” Having what’s called emotional dysregulation, then, could be beneficial if those with ADHD learn to use it to their advantage.

None of this is to say that living with ADHD is not a challenge, but it may be time to reduce the stigma surrounding it. Focusing on the positive aspects of the condition may help us move toward solutions that strengthen skills rather than treat disabilities. Findings like these offer hope to parents, educators, and students whose lives are affected by ADHD.

The post 5 Cognitive Advantages of People With ADHD appeared first on InformED.

Researchers at the University of Bath have found that self-oriented and socially prescribed levels of perfectionism have increased in American, Canadian, and British college students over the past 27 years due to cultures of competitive individualism. Compared to students in the 1980s, recent generations of young adults perceive that “others are more demanding of them, are more demanding of others, and are more demanding of themselves.” Thomas Curran, who led the study, believes educators should help curb this trend, citing numerous psychological disorders that can be traced back to a perfectionist mentality.

In the first study to examine generational differences in perfectionism at a cohort level, Curran and his colleagues at the Center for Motivation and Health Behaviour Change conducted an extensive literature review of findings on perfectionism from 1989 to 2016. Data was collected from 41,641 students in the United States, Canada, and the United Kingdom who had completed the Multidimensional Perfectionism Scale, a widely used measure of perfectionism, allowing the team to track generational changes at the cohort level.

The team was looking for changes in three types of perfectionism among students: self-oriented perfectionism, wherein individuals “attach irrational importance to being perfect, hold unrealistic expectations of themselves, and are punitive in their self-evaluations”; socially prescribed perfectionism, wherein individuals “believe their social context is excessively demanding, that others judge them harshly, and that they must display perfection to secure approval”; and other-oriented perfectionism, wherein “individuals impose unrealistic standards on those around them and evaluate others critically.”

The research team found that levels of perfectionism increased over the past 27 years for students from all nations. Self-oriented, socially prescribed, and other-oriented perfectionism scores increased by 10 percent, 33 percent, and 16 percent, respectively.

“We speculate that this may be because, generally, American, Canadian, and British cultures have become more individualistic, materialistic, and socially antagonistic over this period, with young people now facing more competitive environments, more unrealistic expectations, and more anxious and controlling parents than generations before.”

Self-oriented and other-oriented perfectionism, in particular, were highest among American college students.

“Since the 1980s and the Reagan era, communal values in the United States have waned in favor of an individualised notion of liberty, in which the uninhibited pursuit of self-gain is prized more than anything else,” the researchers explain. “The especially strong individualistic and meritocratic culture in the United States may explain why self-oriented perfectionism is seemingly especially high among American college students.”

But the most important finding of the study, according to Curran, is that today’s college students are reporting higher levels of socially prescribed perfectionism than previous generations. These levels were nearly twice as high as the levels of other types of perfectionism.

“This finding suggests that young people are perceiving that their social context is increasingly demanding, that others judge them more harshly, and that they are increasingly inclined to display perfection as a means of securing approval.”

Socially prescribed perfectionism has been correlated with a range of psychological disorders, including depression, anxiety, social disconnection, social phobia, body dissatisfaction, bulimia nervosa, and suicide.

“It is likely to be the most important in terms of explaining recent increases in mental health difficulties among young people,” Curran writes.

Interestingly, socially prescribed perfectionism was found to be higher in Canadian and British college students than in American college students, suggesting that cultures with a greater sense of communal responsibility and social concern produce environments in which young adults feel more pressure to conform to social standards.

“The United States has been the fastest of the industrialised nations to shrink its communal investments,” Curran writes. “This contrasts with Canada and the United Kingdom, which, despite substantial reductions, still have sizable components of a welfare state (e.g., nationalised health services) and, possibly, a greater sense of communal responsibility and pressure.”

The post Perfectionism Is Increasing, and It’s Taking a Toll On Mental Health appeared first on InformED.

There’s a lot of talk these days about getting more girls interested in STEM, and for good reason: In the U.S., only 25 percent of STEM jobs belong to women; in Australia, just 15 percent of women end up in technology or engineering. The gender gap in tech persists for multiple reasons, from societal gender norms to parenting to perceptions of STEM itself. But few seem to be considering a psychological or neuroscientific explanation for the STEM gap. In our fierce pursuit of gender equality, we could be ignoring some very real, very important gender differences that might help explain why this imbalance exists in the first place—differences that could, ironically, help us close the gap.

A new study from the University of Zurich finds that women and men process prosocial and selfish behavior differently. Whereas selfish behavior activates the reward center in men’s brains more strongly, women receive a bigger dopamine boost from prosocial behavior. Previous studies have revealed similar differences between male and female brain activity.

If our cultural image of a scientist is a solitary man or woman in a lab coat, spending hours peering through a microscope, or a lonely data analyst staring at a computer screen all day, it’s little wonder that girls tend to choose more prosocial fields like education, healthcare, and the arts.

In fact, a new study found that the STEM gender gap itself is unequal: Women in the United States earn more than 50 percent of undergraduate degrees in biology, chemistry, and mathematics, but only 20 percent of degrees in computer science, engineering, and physics. Possible explanations, according to the researchers, could include a “masculine culture that signals a lower sense of belonging to women than men, lack of sufficient early experience with these fields, and gender gaps in self-efficacy.”

But there’s an additional pattern to be seen here: the perceived nature of the jobs in these fields. While biology and chemistry conjure up images of social-minded medical causes, computer science and engineering are not traditionally viewed as prosocial careers.

Cultural gender norms notwithstanding, there are significant differences in the male and female brain, and findings like these are crucial to understanding why there aren’t more girls in STEM.

Are Females Really More Prosocial?

For the University of Zurich study, participants performed an interpersonal decision task in which they made choices between a monetary reward only for themselves (selfish reward option) and sharing money with others (prosocial reward option).

Females not only chose to share more often than males; they also received a dopamine rush while doing so.

“We find that in females the dopaminergic reward system is more sensitive to shared than to selfish rewards, while the opposite is true for males,” the researchers write. “This conclusion is evidenced both by the pharmacological and neuroimaging data.”

What’s more, when researchers blocked the effects of dopamine on participants’ brains, essentially eliminating the need for reward-seeking behavior, they found reduced prosociality in women and reduced selfishness in men. In other words, women are highly motivated by prosocial rewards, while men are highly motivated by self-serving rewards.

The researchers conclude: “Sharing tends to be more preferable than acting selfishly for women, whereas, for men, maximising self-reward tends to carry more value.”

But that’s not to say society and culture can’t influence the brain as well.

“From an early age, women may receive more positive feedback for prosocial behaviour than men, which may lead to an internalisation of cultural norms and make prosocial behaviour more valuable and predictive of rewarding feedback.”

The researchers also acknowledge differences among women and men from different cultures—for example, a more independent social orientation among European Americans and a more interdependent social orientation among Asians.

While this study focused on adults, complementary research has confirmed that young girls are motivated by social rewards too—and this preference develops right around the time they start exploring their academic identity.

When Do Girls Lose Interest In STEM?

The latest findings indicate that girls are just as interested in STEM as boys until around the time they enter high school, when engagement drops sharply and doesn’t recover.

A Microsoft survey asked young women in Europe between the ages of 11 and 30 about their views on STEM, and revealed that by age 15, girls have lost the interest they had at age 11. Another survey, led by Accenture and Girls Who Code, found that girls are more likely to be interested in computer coding in middle school but lose this interest in high school.

It’s not a mystery: This is the time when girls are experiencing significant hormonal changes that affect social behaviour. During puberty, girls become more empathetic, better at perspective-taking, and more motivated by social rewards.

In her best-selling book The Female Brain, Louann Brizendine, M.D., writes about prosocial behavior among teenage girls:

“Connecting [with others] activates the pleasure center in a girl’s brain… we’re not talking about a small amount of pleasure. This is huge. It’s a major dopamine and oxytocin rush, which is the biggest, fattest neurological reward you can get outside of an orgasm… Estrogen at puberty increases dopamine and oxytocin production in girls. Oxytocin is a neurohormone that triggers and is triggered by intimacy. When estrogen is on the rise, a teen girl’s brain is pushed to make even more oxytocin—and to get even more reinforcement for social bonding.”

If socialising is so important to girls at the very age when they are expected to choose a career path, why should we expect them to choose paths that appear to lead to isolation—like being a computer scientist, physicist, or engineer?

It’s time to change the way these professions are perceived.

How Is STEM Perceived?

“Research has shown that because of cultural beliefs and expectations, girls tend to be drawn to fields and careers where there is a clear link to helping people or making the world a better place,” says Jill Williams, director of Women in Science and Engineering at the University of Arizona. “This helps explain why medical-related STEM careers have greater female participation. In these fields, the link between science, math and technology and helping people is more obvious than in non-medical fields.”

But STEM is often marketed as “let’s blow something up or destroy something” when targeted at boys, says DaNel Hogan, director of the STEMAZing Project. That needs to change, she says: “Highlighting the way STEM careers can make people’s lives easier and make the world a better place are enticing reasons girls would pursue a STEM career.”

But it’s not just about making STEM a prosocial field; it’s about making the job itself, and the day-to-day activities associated with it, more social too. If you’re female, chances are that becoming a computer programmer or mechanical engineer sounds about as appealing as solitary confinement, even if it’s for a good cause. The perceived nature of the work itself, as well as of the academic path to it, needs to change.

Can We Make STEM More Prosocial?

One way to make STEM more appealing to girls is to tie it to real-life scenarios, where it’s possible to imagine interacting with people to solve real-world problems.

This is supported by 2016 research showing that framing spatial tasks as social can eliminate the gender gap in performance.

Margaret Tarampi and her colleagues at the University of California, Santa Barbara found that women performed worse than men on spatial tests involving only objects, but performed just as well as their male peers when the tests included human-like figures.

In one experiment, the researchers gave 135 college students (65 men, 70 women) two timed perspective-taking tests.

In one test, the students saw a picture that included an array of objects, such as a house, a stop sign, a cat, a tree, and a car. They might be told, for example, to imagine standing at the cat, facing in the direction of the tree, while pointing to the car. Below that image, they saw a diagram of a circle with “cat” at the center and an arrow pointing to “tree”—they were asked to draw a second arrow to indicate the direction of the car.

Importantly, some of the students received a “social” version of the task, in which the starting point was a person instead of an object.

In the second test, the students were shown a map, with a path marked on it. They were told to imagine walking along that path and write an “R” or “L” at each turn, to indicate the direction they would be turning in. Again, some students received a social version of the test, in which a little person was shown at every corner.

In line with spatial stereotypes, women performed worse than men on the unmodified, “spatial” versions of the tests. But the gap in performance was completely eliminated when the tests were framed as social in nature.
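To make the geometry of these tasks concrete, here is a minimal sketch in Python (the coordinates and layout are invented for illustration and are not the study’s actual stimuli). The correct answer to an item is simply the clockwise angle of the target relative to an observer standing at one point and facing another; swapping a person in as the starting point is, in effect, all the “social” version changed:

```python
import math

# Hypothetical 2-D positions for the pictured objects (invented for illustration).
positions = {"cat": (0.0, 0.0), "tree": (0.0, 5.0), "car": (4.0, 3.0)}

def pointing_angle(stand_at, face_toward, point_to, positions):
    """Clockwise angle, in degrees, at which to draw the arrow on the circle
    diagram: 0 means straight ahead (toward the faced object), 90 means
    directly to the observer's right, and so on."""
    sx, sy = positions[stand_at]
    fx, fy = positions[face_toward]
    tx, ty = positions[point_to]
    heading = math.atan2(fy - sy, fx - sx)   # direction the observer faces
    target = math.atan2(ty - sy, tx - sx)    # direction of the pointed-to object
    return math.degrees(heading - target) % 360  # clockwise offset in [0, 360)

# "Imagine standing at the cat, facing the tree; point to the car."
print(round(pointing_angle("cat", "tree", "car", positions), 1))  # 53.1
```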

Also in 2016, girls outperformed boys in a new national STEM assessment in the United States (the National Assessment of Educational Progress), which moved beyond multiple-choice questions and focused instead on troubleshooting in real-world scenarios.

Although just 43 percent of the U.S. eighth graders tested met or exceeded the benchmark for proficiency on the exam, girls averaged 151 points (out of a possible 300), three points higher than boys. Measured another way, 45 percent of females met or exceeded the proficient level, compared with 42 percent of males.

The nature of the test itself—more closely tied to real-world scenarios where other people might be involved, making the challenge appear more prosocial—may have helped girls perform better.

One way to start spinning STEM as prosocial is for educators to bust some of the anti-social myths surrounding it. The University of California at Berkeley’s Understanding Science blog is a great place to start. One of their posts on “The Social Side of Science” explains that science isn’t a solitary pursuit:

“In opposition to its stereotype, science much more typically works something like this: After reading up on the recent work of other scientists studying animal behavior, a scientist in Brazil gets an idea for a new bird song experiment while playing a word game with her kids. She calls a colleague in Canada to discuss the idea and to find out where she can get the recording software she will need for the experiment. She then recruits a few students and a visiting researcher from China to work on the project, and they apply for funding. After they complete the study, the team writes up the work and submits it to a journal for publication. The journal sends it out to three different scientists for review: one in Japan, one in the U.S., and one in the U.K. The reviewers like the study but suggest some changes to improve the statistical analysis. The team makes the changes and the paper is published several months later. A graduate student in France reads the paper with his lab group, emails the Brazilian researcher to learn more about her experimental procedures, and comes up with a follow-up experiment. He recruits another graduate student and a professor to work on the project with him… and so on. Compared to its stereotype, real science is more complex—but also more human.”

Cultural and societal gender norms may be holding women back in STEM, but biological differences can help push them forward. Historian Mary Beard writes: “If women aren’t perceived to be within the structure of power, isn’t it power itself we need to redefine?” Let’s draw on her example to redefine what STEM looks like too.

The post Can We Keep Girls Interested In STEM If We Make It Prosocial? appeared first on InformED.



It’s been an exciting year in education with big advances in educational technology and instructional design, from new implementations of virtual reality to fascinating insights into everything from motivation and creativity to tried-and-true learning strategies.

Each year we like to keep track of all the latest research that confirms some of the things we already suspected about learning, uncovers new and exciting evidence about how we acquire and retain new information, and, on occasion, even throws into question some of our firmly held beliefs about the most effective ways to learn.

So as another busy year of learning comes to a close, we thought we’d line up some important research studies from 2017 that parents, teachers, and learners everywhere should know about.

1. Learning styles probably don’t exist

The idea that there are different “learning styles” and that each learner should receive instruction in his or her preferred style of learning—be it visual, auditory, or tactile—has gained popularity in recent years. Despite this, there’s no scientific evidence to support the theory that teaching students according to their individual learning styles achieves better results.

Earlier this year, 30 leading academics in neuroscience, education, and psychology signed a letter to the Guardian voicing concerns about the popularity of this approach in schools around the world. In the letter, “learning styles” are described as a common neuromyth that does nothing to enhance education. The scientists note that this approach to learning is not only ineffective but may even be damaging as it can discourage learners from applying or adapting to different ways of learning.

2. Robots increase engagement in online learning

In the first study of its kind, researchers from Michigan State University used innovative robots with mounted video screens to help online students connect with their instructors and fellow students.

They found that online students who used robot learning felt more engaged and connected to the instructor and other students in the classroom. Unlike with regular video conferencing, where multiple students are displayed on a single screen, robot learning enabled students to move virtually through the room and even maintain eye contact with their instructors and fellow students.

With blended-learning classrooms expected to eventually make up 80 percent or more of all university classes, these findings are encouraging and demonstrate how technology can make it possible for online students to join discussions and participate fully in the classroom.

3. Listening is one of the best ways to learn a foreign language

Whether you’ve just moved to a new country and need to learn the language or simply want to expand your skills, learning a new language is no easy task. But research reported in Scientific American recently shed light on how we learn languages, as well as a simple trick to make it easier on ourselves.

One study in particular found that silently listening to new sounds can help you learn a new language faster than listening to them and practising saying them at the same time.

In the study, Spanish speakers learning to distinguish between different sounds in the Basque language performed worse when they were asked to repeat the sounds during their training. So even if you’re not fully paying attention to it, turning on the radio or a podcast in the language you’re trying to learn may help you to learn it faster.

4. Sleep helps us use our memory in the most flexible way possible

We already know that sleep is good for us, and numerous studies have shown that getting enough sleep is vital to healthy brain function. Now a new study from the University of York has highlighted yet another reason to make sure we’re getting plenty of sleep each night.

The researchers from York’s Sleep, Language and Memory (SLAM) laboratory found that sleep makes our memory more flexible and adaptable because it strengthens both new and old versions of the same memory. When we retrieve a memory, it’s updated with any new information that may be present at the time, but rather than rewriting that memory, the brain stores multiple versions of it.

So sleep helps us use our memories more adaptively and efficiently, because it enables us to update our understanding of the world and adapt for the future.

5. Brain training exercises aren’t worth your time

In recent years, so-called “brain training” programs and apps have increased in popularity, and you’ve probably seen claims about how they can boost everything from your memory and attention span to cognitive flexibility.

Unfortunately, if you’re using one of these apps or games in the hopes that it will improve your cognitive function, you might be sorely disappointed. A recent study from the University of Pennsylvania found that brain training programs have no effect on decision-making or cognitive function beyond the practice effects on the training tasks.

What does this mean? Basically, while brain training programs will probably help you get better at the specific tasks you’re practicing, your general cognitive abilities won’t improve and the benefits aren’t easily transferred to other areas of learning.

6. Practice really does make perfect

Ever heard the saying “practice makes perfect”? A study from Tel Aviv University recently showed that it’s more than just a nice thing parents and teachers say. Better still, you might not need as much practice as you think to reap the benefits.

The study suggests that instead of blasting our brain with repeated practice, it’s possible to use shorter but more efficient reactivations of a memory to learn.

In the study, participants were shown five computer-based visual recognition tasks that lasted just a few milliseconds each. These brief periods of performing a task helped to create and encode a memory of the tasks in subjects’ brains. They then participated in three additional practice sessions spread out over three days, during which the memory of the initial task was briefly reactivated.

The researchers point out that these brief reactivations of memory can yield a full, typical learning curve, suggesting we may be able to leverage a new form of learning known as “reactivation-induced learning.”
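As a purely illustrative sketch, loosely modelled on the study’s description rather than taken from it, the structure of such a protocol is easy to express: one full encoding session followed by very brief reactivations on later days. All durations below are invented placeholders:

```python
from datetime import date, timedelta

def reactivation_schedule(start, n_reactivations=3, gap_days=1):
    """Toy planner for a brief-reactivation protocol: one full encoding
    session, then short reactivation sessions on subsequent days.
    Exposure durations are invented placeholders, not the study's values."""
    plan = [{"day": start, "kind": "encode", "exposure_ms": 500}]
    for i in range(1, n_reactivations + 1):
        plan.append({"day": start + timedelta(days=i * gap_days),
                     "kind": "reactivate", "exposure_ms": 50})
    return plan

for session in reactivation_schedule(date(2017, 12, 1)):
    print(session["day"], session["kind"], session["exposure_ms"], "ms")
```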

7. Cognitive cross-training enhances learning

Although brain-training games on their own may not be the way to go, there are certainly things you can do to enhance your cognitive skills. A study from the University of Illinois at Urbana-Champaign found that in the same way athletes use cross-training to enhance their physical abilities, we can enhance our cognitive skills by exercising our brains in multiple ways.

The study, carried out over 18 weeks, looked at 318 healthy young adults and used a combination of physical exercise, computer-based cognitive training, and electric brain stimulation to promote skill learning. The researchers measured specific cognitive skills like memory, attention, and switching between tasks.

They found that the group that received cognitive training along with physical fitness training and brain stimulation performed significantly and consistently better than the group that received only cognitive training.

8. Students don’t use self-regulated learning strategies effectively

Most university students are familiar with self-regulated learning strategies, but new research published in the journal Frontiers in Psychology found that, despite knowing about these strategies and their effectiveness, many students still don’t take full advantage of them.

Self-regulated learning (SRL) strategies include things like goal-setting and structuring learning content, self-evaluation, putting rewards in place, group reflection, and note taking.

The study found that although most students can correctly identify common SRL strategies, they don’t know how to put them into practice or when to use specific techniques. In fact, only a third of the students who could correctly identify a learning technique as beneficial reported actually using that technique in their own learning.

With this in mind, the researchers note that it’s important for learners to receive more hands-on training on how to use SRL strategies and understand that these strategies could enhance their learning outcomes and even save them time.

9. Learning with music changes our brain structure

Although listening to music while you study can sometimes be distracting, there are still reasons to consider it. A recent study from the University of Edinburgh found that using musical cues to learn a physical task may actually help to develop an important part of the brain.

To study this, the researchers had one group of volunteers learn a new task with musical cues, and another group without. By using MRI scans, they were able to demonstrate that the music group had increased structural connectivity between the parts of the brain that process sound and control movement, whereas the no-music group’s brain scans showed no changes.

It’s the first study of its kind to provide experimental evidence that using music for learning can actually lead to changes in the white matter structure in the brain.

10. Exercise can help you learn a foreign language

It’s been a good year for language learners. Another recent study found that the process of learning a new language as an adult can be facilitated by physical exercise.

The study, which was recently published in PLOS One, looked at college-aged Chinese men and women who were learning English. Some students were given rote memorisation tasks to complete while riding exercise bikes at a gentle pace, while others completed the same tasks without exercise.

After each lesson, both groups completed a vocabulary quiz, and it soon became clear that the students who had ridden bikes during their lessons were consistently outperforming those who simply sat still. So if you feel like you’ve recently hit a wall with your language learning, a leisurely stroll or cycle in the park could be just what you need.

Of course, this is just a small selection of all the important research carried out this year, so if there are any particular learning studies you’ve come across this year that fascinated you or challenged what you thought you knew about learning and memory, let us know about them in the comments.

The post 10 Important Learning Studies From 2017 appeared first on InformED.



As we consider our resolutions and goals for 2018, many of us will vow to cut down on screen time. We will have different reasons for doing so, citing a waning ability to focus or a desire to free ourselves from information overload. But how necessary is this? In other words, does screen time have a measurable impact on the brain? And if it does, how much do we need to cut down in order to maintain a healthy relationship with our devices? Researchers have some important new findings to share with us on the matter.

But first—what do we mean by “screen time,” exactly?

“The cognitive resources required to watch television versus play a first-person shooter game on a PlayStation 3 are quite different,” writes Kayt Sukel for the Dana Institute. “Every day, virtual technologies are becoming more vivid, involving more senses and offering more interactivity for users. That makes the concept of ‘screen time’ difficult to properly define, and makes it difficult for researchers to design studies that keep up with the latest technology.”

Plus, studies suggest that many of us can benefit from screen activities like video games, which have been shown to increase brain volume.

The kind of screen time we’re addressing in this article is the kind most of us engage in during our day-to-day lives: namely, reading. We read the news, our social media feeds, our friends’ texts. It could even be argued that we “read” images. So how do these activities affect the brain, and how can we change our behaviour to achieve a healthy balance of on-and-off-screen engagement?

Let’s start by taking a look at some of the latest research.

How Does Screen Time Impact the Brain?

In a brand new study published in Acta Paediatrica, researchers found that brain connectivity in children is increased by the time they spend reading books and decreased by the length of their exposure to screen-based media.

Researchers advertised the study to parents of private school children in Cincinnati, USA, and had volunteers complete surveys on “how many hours their children spent on independent reading and screen-based media time, including smartphones, tablets, desktop or laptop computers, and television.”

Magnetic resonance imaging was then used to analyse the resting-state connectivity of nineteen of the children—that is, their brain activity while at rest, rather than while reading or using a device. Researchers were looking to see how well connected the left visual word form area, a region critical for reading, was with other brain regions. What they found is that screen time was associated with lower connectivity.

“Time spent reading was positively correlated with higher functional connectivity between the seed area and left-sided language, visual and cognitive control regions,” the authors write. “In contrast, screen time was related to lower connectivity between the seed area and regions related to language and cognitive control.”
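For readers wondering what “connectivity between the seed area” and other regions means operationally: seed-based functional connectivity is typically just the correlation between the seed region’s activity over time and each other region’s activity. Here is a minimal sketch with synthetic data; the numbers are invented, and this illustrates the general technique rather than the study’s actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic resting-state data: a seed region (e.g. the visual word form
# area) and five other regions, each observed over 200 timepoints.
T = 200
seed = rng.standard_normal(T)
regions = rng.standard_normal((5, T)) + 0.4 * seed  # regions partly coupled to the seed

# Seed-based functional connectivity: Pearson correlation between the
# seed's time series and each region's time series.
connectivity = np.array([np.corrcoef(seed, r)[0, 1] for r in regions])
print(np.round(connectivity, 2))
```

In a study like this one, each child’s seed-to-region correlations would then be related to their reported hours of reading and screen time.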

Not only do these results suggest we need to limit our screen time; they also “underscore the importance of children reading to support healthy brain development and literacy.” Books are here to stay.

Screen time also affects our emotions, making us unhappier the more we tap, scroll, and swipe.

Monitoring the Future, a nationally representative survey funded by the National Institute on Drug Abuse, has been tracking high schoolers every year since 1975. It records happiness levels and the amount of leisure time spent on nonscreen activities, like in-person social interaction and exercise, versus, more recently, screen activities like texting, social media, and browsing the web.

“The results could not be clearer,” writes Jean M. Twenge, professor of psychology at San Diego State University. “Teens who spend more time than average on screen activities are more likely to be unhappy, and those who spend more time than average on nonscreen activities are more likely to be happy. There’s not a single exception. All screen activities are linked to less happiness, and all nonscreen activities are linked to more happiness.”

Social media use, in particular, correlates with the most unhappiness. Twenge describes one study where college students with a Facebook page took short surveys on their phones over the course of two weeks. Five times a day they would receive a text with a link asking them to report on their mood and how much they’d used Facebook so far that day. “The more they’d used Facebook,” she reports, “the unhappier they felt.”

Heavy screen time also reduces our chances of getting enough sleep, which is essential for good brain function.

“I asked my undergraduate students at San Diego State University what they do with their phone while they sleep,” says Twenge. “Their answers were a profile in obsession. Nearly all slept with their phone, putting it under their pillow, on the mattress, or at the very least within arm’s reach of the bed. They checked social media right before they went to sleep, and reached for their phone as soon as they woke up in the morning (they had to—all of them used it as their alarm clock).”

It’s no coincidence, she says, that fifty-seven percent more teens were sleep deprived in 2015 than in 1991, and that twenty-two percent more teens failed to get seven hours of sleep in 2015 than in 2012. That’s around the time when most teens got a smartphone.

It remains to be seen how increased screen time affects social skills, but anyone can see it takes up time that might otherwise be spent engaging in real human contact.

“In the next decade,” Twenge warns, “we may see more adults who know just the right emoji for a situation, but not the right facial expression.”

Strategies to Limit Screen Time

The following are a few recommendations we’ve cooked up to help you (and your kids) cut down on screen time in the new year. Feel free to suggest your own in the comments section below.

1. Use a time management app.

This one might seem paradoxical since it gives you one more reason to look at your device, but if it reduces overall screen time, then it’s well worth it. Try Onward, an app that breaks down your daily device use by category, to get an idea of how you currently spend your time and set new goals.
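If you’re curious what such an app is doing under the hood, the core is simple aggregation: bucket usage minutes by category and compare the totals against your goals. A minimal sketch, with made-up log entries, categories, and budgets (no relation to Onward’s actual data model):

```python
from collections import defaultdict

# Hypothetical one-day usage log: (app, minutes) entries.
usage_log = [("Instagram", 42), ("Slack", 75), ("Kindle", 30),
             ("Instagram", 18), ("Maps", 10)]

# Hypothetical app-to-category mapping and daily budgets, in minutes.
categories = {"Instagram": "social", "Slack": "work",
              "Kindle": "reading", "Maps": "utility"}
budgets = {"social": 45, "work": 120, "reading": 60, "utility": 30}

totals = defaultdict(int)
for app, minutes in usage_log:
    totals[categories.get(app, "other")] += minutes

for category, minutes in sorted(totals.items()):
    over = budgets.get(category) is not None and minutes > budgets[category]
    print(f"{category}: {minutes} min" + (" (over budget)" if over else ""))
```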

2. Make access harder.

Turn your phone off or keep it in a part of the house you have to walk over to. Take social media apps off your phone so that you can only check your Facebook or Instagram when you use your laptop. Set your accounts to “Don’t save password” so that you have to enter your log-in details each time you want to check your feed.

3. Set parameters for daily use.

If you spend all day in front of a screen for work, make sure you balance that screen time with nonscreen time after you punch out. If you spend all day doing nonscreen activities, set rules for yourself so that you don’t binge once you get off work. Whatever parameters you set, keep them consistent; the brain likes predictability.

4. Replace screen activities with nonscreen activities.

Don’t just give up one activity without a plan for replacing it with something else. Have an idea of which nonscreen activities you enjoy and would like to incorporate into your schedule when you’ve suddenly got oh-so-much more time because you’re on Facebook for twenty minutes instead of an hour.

5. Involve your friends, family, and colleagues.

We’re products of our immediate environment. If the people around us are limiting their screen time as well, it will be easier for us all to achieve our goals. Remember: Over 2 billion people are facing the same challenge; let’s make these changes together.

The post Setting Limits On Screen Time: What Does the Research Say? appeared first on InformED.
