
On Tuesday, President Donald Trump’s school-safety commission, which was established following the school shooting in Parkland, Florida, released its much-anticipated recommendations “to advance safety” in schools, including one that would scrap a federal policy urging schools not to punish minority students at a higher rate than white students.

The commission’s recommendation to roll back the Obama administration’s school-discipline guidance does not come as a surprise. Republicans have decried the policy as government overreach since it was released in 2014. The policy advocated “constructive approaches” to school discipline, such as victim-offender mediation, as opposed to harsher penalties such as suspensions or expulsions.

The Trump administration’s discipline recommendation comes alongside several bipartisan common-sense measures in the report, including encouraging teachers, administrators, and parents to be vigilant about reporting information to the FBI; improving access to school mental-health services and counseling; and implementing best practices to curb cyberbullying. The report also advocates that districts create a “media plan” to disseminate information in the event of a shooting, alongside a suggestion to follow “No Notoriety” guidelines to keep the focus in the aftermath of an incident on the victims rather than on the shooter.

The school-safety commission’s recommendations are just that: recommendations. As the report notes, “Implementation of the practices identified in this guide is purely voluntary, and no federal agency will take any action against schools that do not adopt them.” School districts have been slow to respond to such nonbinding recommendations in the past—including school-safety recommendations in the wake of school shootings during both the George W. Bush and Barack Obama administrations.

Given the broad mandate of developing recommendations to address school violence, the administration moved to address the school-discipline guidelines remarkably quickly. The commission argues that the guidance left schools unable, or at least afraid, to take action against potentially dangerous students. “Policy guidance issued under the Obama Administration placed an emphasis on tracking school disciplinary actions by race,” the report says. “The Guidance sent the unfortunate message that the federal government, rather than teachers and local administrators, best handles school discipline.” The commission argued that the emphasis on avoiding disparities in which students are disciplined may lead school leaders to let their discipline policies be driven by numbers rather than by teacher input.

[Read: The Trump administration’s approach to school violence is more style than substance]

Education Secretary Betsy DeVos, who chairs the commission, argued that a one-size-fits-all approach to school safety would not work. “Through the Commission’s work, it has become even clearer there is no single policy that will make our schools safer,” she said in a statement. “What will work for schools in Montana will be different than what will work for schools in Manhattan.”

Still, the administration’s focus on school discipline has been highly contested—primarily because it seems disconnected from the broader issue of preventing the next school shooting. “It is unconscionable to use the very real horror of the shooting at Parkland to advance a preexisting agenda that encourages the criminalization of children and undermines their civil rights,” Vanita Gupta, the president of the Leadership Conference on Civil and Human Rights, said in a statement following the report’s release. Gupta’s statement tracks with months of criticism of the proposal to scrap the school-discipline guidance. Supporters of the Obama guidance argue that it is necessary to counteract the effects of the inequitable doling out of discipline.

Both Obama-era education secretaries, Arne Duncan and John B. King Jr., released a joint statement on the commission’s recommendation on Tuesday. “We put this guidance in place to start a conversation about these harmful practices and encourage advocates and policymakers to look more deeply into why these disparities exist and to intervene when necessary,” they said. In April, the Department of Education released its annual Civil Rights Data Collection report, which showed that black students made up 15 percent of K–12 enrollments nationwide, but 31 percent of expulsions.

Bobby Scott, the top Democrat on the U.S. House Committee on Education and the Workforce—who will become the committee chair next year—put plainly the sentiment of those who prefer to keep the guidance. “Rather than confronting the role of guns in gun violence, the Trump administration blames school shootings on civil rights enforcement,” he said in a statement. “This guidance has no connection to school shootings.”

While advocates and experts of all political stripes are likely to agree with several of the recommendations of the report, the recommendation on school discipline delves, perhaps unnecessarily, into one of the most politically contentious issues in education. As my colleague Alia Wong wrote in March, “For Washington policymakers to give outsized attention to student-discipline reform is to succumb to ideological precepts that lack empirical support. It is to waste the lessons gleaned from the growing tally of school shootings while reinforcing racial disparities.”


There was shock, then backlash. Last week, officials at the University of North Carolina at Chapel Hill announced their plan for the infamous Confederate statue known as Silent Sam, and it hardly satisfied anyone.

On Friday, protesters converged outside the Center for Leadership Development at Chapel Hill as, inside, the UNC system’s Board of Governors met to deliberate on the university’s plan to erect a $5.3 million building on campus to house the monument. The building would also serve as a university history center. Ahead of the meeting, some board members expressed reservations about the proposal. Was there not a building already on campus that could house Silent Sam without having to spend $5.3 million on a new facility? Others argued that moving the statue to any building, but especially the proposed history center—which has all the hallmarks of a museum—might be illegal under a state law that says prominently displayed monuments can’t be moved to museums.

[Read: Silent Sam survives]

In the end, the Board of Governors decided to punt. Harry Smith, the chair of the board, announced that it would deny the plan presented by UNC Chapel Hill’s chancellor, Carol Folt, last week. Citing public-safety concerns, alongside the sheer cost of the recommendation, Smith said the board could not support the plan. The board charged five of its members with meeting with Folt to review other options for the statue. The deadline for the new recommendations is March 15, 2019.

The vote by the Board of Governors caps a chaotic two weeks at the flagship institution. Late last week, more than 80 teaching assistants pledged to withhold final-exam grades for the fall semester unless the proposal was withdrawn, though they would make exceptions for students who needed final grades for graduation, job, or immigration purposes. Administrators responded forcefully: There would be “serious consequences” for such a strike, they said. One member of the Board of Governors—equating the action with “violence”—said that he would push for the expulsion of those who participated in such a protest.

Then there was the striking incident at a faculty-council meeting, when a black student, Angum Check, who had earned a Martin Luther King Jr. scholarship, confronted Folt. “I want to tell you, you are a disgrace,” Check said.

On Thursday, members of perhaps the most lauded organization on the University of North Carolina at Chapel Hill’s campus, the men’s basketball team—including the alumni and NBA stars Vince Carter, Jerry Stackhouse, and Harrison Barnes, alongside several other black former athletes—sent a statement to university officials expressing “deep concern” about the proposal. “We love UNC but now also feel a disconnect from an institution that was unwilling to listen to students and faculty who asked for Silent Sam to be permanently removed from campus,” they wrote in a letter first reported by Spectacular Magazine, and confirmed by The Washington Post. “The recommendation is embarrassing to us who proudly promote UNC.”

[Read: The dramatic fall of Silent Sam, UNC’s Confederate monument]

Ultimately, the decision to punt is not surprising given the consternation the plan has stirred over the past several days. So, for now, officials will go back to the drawing board, and both Silent Sam’s supporters and those who would prefer him consigned to history will prepare for another battle in March.


In the late 1800s, one of the most enduring fictional characters of all time first appeared on the scene. No, I am not talking about Sherlock Holmes or Oliver Twist, but a less well-known though arguably more influential individual: Homo economicus.

The term literally means “economic man”; its origins are somewhat obscure—early references can be traced to the Oxford economist C. S. Devas in 1883—but his characteristics have become all too familiar. He is infinitely rational, possessing both unlimited cognitive capacity and access to information, but with the persona of the Marlboro Man: ruggedly self-centered, relentlessly materialistic, and a complete lone ranger. Homo economicus, created to personify the supposedly rational way humans behave in markets, quickly came to dominate economic theory.

But then in the 1970s, the psychologists Daniel Kahneman and Amos Tversky made a big discovery. The academics drew on psychological evidence to show that the actions of human beings deviate from the ironclad rationality of Homo economicus in all sorts of ways: People make systematic errors of judgment, such as being excessively attached to what they own, and yet are also more generous and cooperative than they’re given credit for. These insights led to the founding of a new field, behavioral economics, which became a household name 10 years ago, after Richard Thaler and Cass Sunstein published the best-selling book Nudge and showed how this new understanding of human behavior could have major policy consequences. Last year, Thaler won the Nobel Prize in Economics, and promised to spend the $1.1 million in prize money “as irrationally as possible.”

[Read: Richard Thaler wins the Nobel in Economics for killing Homo economicus.]

But despite the fanfare, Homo economicus remains a stubbornly persistent part of the economics curriculum. While it is fashionable for most economics departments to have courses on behavioral economics, the core requirements in economics at many colleges are usually limited to only two substantive courses—one in microeconomics, which looks at how individuals optimize economic decisions, and another in macroeconomics, which focuses on national or regional markets as a whole. Not only is the study of behavioral economics largely optional, but the standard textbooks used by many college students make limited references to behavioral breakthroughs. Hal Varian’s Intermediate Microeconomics devotes only 16 of its 758 pages to behavioral economics, dismissing it as a blip in the grand scheme of things, an “optical illusion” that would disappear “if people took the time to consider choices carefully—applying the measuring stick of dispassionate rationality.” The staple textbook on macroeconomics, written by Gregory Mankiw, gives behavioral approaches even shorter shrift by scarcely mentioning them at all.

Instead, the overwhelming majority of courses that students take in economics are heavily focused on statistics and econometrics. In 2010, the Institute for New Economic Thinking convened a task force to study the undergraduate economics curriculum, following up on a report from 1991. What changed in the intervening years, it found, was “an increase in mathematical and technical sophistication” that was “not sufficient to foster habits of intellectual inquiry.” In other words, Homo economicus is going strong in lecture halls and textbooks across the country.

Economists’ resistance to incorporating the wisdom of behavioral approaches may seem like a frivolous concern confined to the ivory tower, but it has serious consequences. What students are taught in their economics classes can perversely turn models and charts that are meant to approximate reality into aspirational ideals for it. Most economics majors are first introduced to Homo economicus as impressionable college freshmen and internalize its values: Studies show, for instance, that taking economics courses can make people actively more selfish. The consequences are only made more acute by the fact that business, a more preprofessional version of economics, is the single most popular major for college students in the United States—some 40 percent of undergraduates take at least one course in economics. That behavioral economics has been minimized and treated as an aberration by the mainstream has a major bearing on how students make sense of markets and the world.

[Read: The curse of Econ 101]

What is so surprising about the hesitancy of economists today to absorb the lessons of behavioral economics is that until the appearance of Homo economicus, invoking psychology in the teaching of economics was standard. At the University of Cambridge, for instance, before a stand-alone department was established in 1903, economics was taught alongside psychology and philosophy. Only after World War II, when the center of gravity of the discipline shifted across the Atlantic, did the rupture become so stark. The dawn of the American era in economics marked a more intense commitment to mathematical analysis, to the exclusion of all else.

This profound change in the economics curriculum has resulted in a discipline that is sterile, tone-deaf, and lacking an emotional pulse—but also one that has proved ineffective in its explanatory and predictive capacities. Economists don’t exactly have a great track record at anticipating the pertinent developments of late: The discipline as a whole was caught off guard by the Great Recession in 2008 and has been late to recognize the skyrocketing rise of inequality. It is even more ill-equipped to deal with looming seismic shifts on the horizon, such as the accelerating effects of climate change or how advances in artificial intelligence will affect workers. Given the greatly amplified role of professional economists at every level of policy making, the extent to which economics is disconnected from reality is becoming more alarming.

Making behavioral economics compulsory isn’t a cure-all for the ills of the economics discipline, but doing so would go a long way in encouraging students to think about building economic models around actual human beings rather than around the caricature that is Homo economicus. If there’s a deeper lesson to come out of the behavioral revolution, it’s that the vagaries of human behavior make the discipline very difficult to model as a pure science, and economists have a lot to learn from other disciplines, including other social sciences and the humanities. This may mean a dose of humility for economists, but it would enrich both the education that their students receive and their prospects of making positive change in the real world.

Because rumors of the demise of Homo economicus have been greatly exaggerated, economics professors today still have the chance to cast aside this antiquated character once and for all.


For many years, Wisconsin had one of the finest public-university systems in the country. It was built on an idea: that the university’s influence should not end at the campus’s borders, that professors—and the students they taught—should “search for truth” to help state legislators write laws, aid the community with technical skills, and generally improve the quality of life across the state.

Many people attribute the Wisconsin Idea, as it is known, to Charles Van Hise, the president of the University of Wisconsin from 1903 to 1918. “I shall never be content until the beneficent influence of the University reaches every family of the state,” Van Hise said in an address in 1905. “If our beloved institution reaches this ideal it will be the first perfect state university.” His idea was written into the mission of the state’s university system, and over time that system became a model for what public higher education could be.

But the backbone of the idea almost went away in 2015, when Governor Scott Walker released his administration’s budget proposal, which included a change to the university’s mission. The Wisconsin Idea would be tweaked. The “search for truth” would be cut in favor of a charge to “meet the state’s workforce needs.”

To those outside Wisconsin, the proposed change might have seemed small. After all, what’s so bad about an educational system that propels people into a high-tech economy? But to many Wisconsinites, the change struck at the heart of the state’s identity. They argued that the idea—with its core tenets of truth, public service, and “improving the human condition”—is what makes Wisconsin, Wisconsin.

Walker ultimately scrapped his attempt to alter the Wisconsin Idea, claiming that his administration hadn’t meant to change it, that it was just a “drafting error.” And so the Wisconsin Idea was preserved—at least in an official sense. But though the words survived intact, many Wisconsinites believe that in the years since, the change Walker had proposed has taken place nevertheless. And one of the state’s institutions, the University of Wisconsin at Stevens Point, is the epicenter of that change.

In mid-November, the university announced its plans to stop offering six liberal-arts majors: geography, geology, French, German, two- and three-dimensional art, and history. The plan stunned observers, many of whom argued that at a time when Nazism is resurgent, society needs people to know history, even if the economy might not. But the university said it was just not possible: After decades of budget cuts, the most extreme of which came under Walker, Stevens Point no longer had the resources to sustain these six majors.

[Read: How education factored into the gubernatorial election in Wisconsin]

The state’s recent gubernatorial election showed that millions of Wisconsinites want to preserve the values the university system has long embodied. The governor-elect, Tony Evers, won his race in part because he campaigned on reclaiming Wisconsin’s status as a standard-bearer for higher education. But Evers has a fight ahead of him: The state’s Republican-controlled legislature has made it abundantly clear that it is not about to cooperate with his agenda. Despite Evers’s election, the Wisconsin Idea may not live much longer. How Wisconsin got here—and whether it will get out—is a story that reveals the tensions of a state that feels it must choose between the needs of a tech-hungry economy and those of an informed civil society, and is going to somehow try to satisfy both.

In November 2017, Lee Willis wasn’t terribly concerned. The chairs of each department at the University of Wisconsin at Stevens Point were assessing their programs ahead of a biweekly meeting with the dean; Willis, the chair of the history department, may not have been feeling great, but he was at least upbeat. “I felt like the department had really diversified its curriculum in a way to shore us up,” he told me. The department offered a history major, and much of the coursework for an interdisciplinary major in international relations and a social-science major for those pursuing a teaching certificate.

The college, alongside the rest of the UW system, was facing major budget cuts. The state was spending nearly 23 percent fewer inflation-adjusted dollars per student than it had a decade earlier, according to the Center on Budget and Policy Priorities (CBPP). On top of that, a tuition freeze was in place—which was good for students, but tough for an institution that somehow needed to fund its programs.

Colleges in this situation have little choice but to start cutting, Michael Mitchell, a policy analyst at CBPP, told me. Many institutions have to consolidate programs, restrict course offerings, stop hiring, furlough staff, transition some faculty from tenure track to adjunct positions, and reduce campus services that students rely on, such as mental-health services or library hours, Mitchell said. Flagship campuses, such as the University of Wisconsin at Madison, are usually safe, but not the Stevens Points of the world.

That’s the position the university found itself in during the 2017 chairs’ meeting. By that point, administrators had already broken down the number of students enrolled as majors in each department—at least those outside the STEM fields, Willis said. There may have been earlier signs of trouble for his program that should have raised his concern, he conceded, but that meeting was when he thought, for sure: “They’ve got us marked.”

In March 2018, the school’s administration offered a proposal to deal with the deficit. Cuts were necessary, the administration said. Liberal-arts staples such as English, philosophy, political science, and history would have to be eliminated. All told, the university planned to get rid of 13 majors. Not enough students were enrolled in them to make them worth the cost, the university argued. “We’re facing some changing enrollment behaviors,” Greg Summers, the provost and vice chancellor at Stevens Point, told me. “And students are far more cost-conscious than they used to be.”

Instead, administrators wanted to focus the school’s limited resources on the academic areas that students were flocking to and that the state’s economy could use straightaway—though they maintained that the liberal arts more generally would remain central to the curriculum, even if these specific majors were gone. “We remain committed to ensuring every student who graduates from UW-Stevens Point is thoroughly grounded in the liberal arts, as well as prepared for a successful career path,” Bernie Patterson, the institution’s chancellor, said in a message to the campus. The changes would reflect “a national move among students towards career pathways,” administrators argued. The proposal planned to add majors in chemical engineering, computer-information systems, conservation-law enforcement, finance, fire science, graphic design, management, and marketing. By focusing more on fields that led directly to careers, the school could better provide what businesses wanted—and students, in theory, would have an easier time finding jobs and career success.

[Read: The decline of the Midwest’s public universities threatens to wreck its most vibrant economies.]

Faculty members had been expecting cuts, but nothing nearly that severe. “I was personally surprised about the radicalness of the change,” Ed Miller, a political-science professor at the university, told Wisconsin Public Radio. “We do live in a democracy, and universities are supposed to be preparing people to participate in a democracy, besides participate in the workforce, although that’s certainly important.”

But beyond that, Thomas O’Guinn, a professor at the University of Wisconsin at Madison, told me, the changes flew smack in the face of the Wisconsin Idea. “The idea that we would just forsake everything for [career-focused schools] is not a Wisconsin Idea thing at all,” he said.

Fierce backlash to the proposal from students, faculty, and alumni pushed the administration to reconsider its original plan. By the time the final proposal was released in mid-November 2018, it was less expansive, though still forceful. Six programs would be cut, including the history major. The university seemed to be eyeing degree programs with low numbers of graduates, and nationally, the number of graduates from bachelor’s programs in history has had the steepest decline of any major in recent years, according to the National Center for Education Statistics.

If the proposal, which is now in the middle of a public-comment period, is finalized, history classes will still be offered, but Willis said that cutting the major may ultimately lead to a reduction in staff and upper-level courses, such as the spring seminar on the Holocaust and the major’s emphasis on race and ethnicity. To Willis, this isn’t just an educational loss, but a societal one as well. “You never know when a historical metaphor is going to arise,” he quipped, pointing to the recent incident in Baraboo, Wisconsin, where high-school students appeared to give a Nazi salute in a photo.

The words of the Wisconsin Idea haven’t changed, but the administration is gutting it in favor of career education, Willis said. “I feel like the liberal arts are sort of being asked to line up behind job preparation,” he told me, “rather than studying the liberal arts for the liberal arts’ sake as a public good.”

Greg Summers is an environmental historian by trade. He received his doctorate at Madison, and he has worked at Stevens Point his entire career, starting in 2001. But it wasn’t his own department—the history department—that led him to the university.

Instead, Stevens Point’s renowned College of Natural Resources drew him in. The college produces many of the state’s foresters, wildlife managers, and environmental engineers. “I came here wanting to make sure that every one of those folks that we trained in the College of Natural Resources were going to go out with an understanding that history mattered,” Summers said, “and that it was relevant to their professional work.” Many colleges, he argues, fail to give their STEM grads that broader understanding, due in large part to the general-education curriculum.

“We hear a lot from employers that they don’t want to choose between graduates who have some technical ability versus a graduate who has a liberal-arts major,” Summers said. “They really want both of those things.” He said he’s working to position Stevens Point to provide that balance. “We’re trying to search for ways to better integrate the liberal-arts education that we have always provided into all of our majors,” he said, so that students don’t have to choose between a major in the liberal arts and “a major that doesn’t really engage them meaningfully.” Essentially, he said, selecting a major in the hopes of getting a job shouldn’t prevent a student from receiving a basic liberal-arts education.

[Read: The unexpected value of the liberal arts.]

But he’s had a hard time getting faculty on board with the administration’s way of achieving that goal. Late last month, more than 100 current and former faculty and staff at the university called for the removal of Summers as well as Chancellor Bernie Patterson. “While Provost Summers was hiring more faculty than he now thinks we can afford, UWSP undertook a lengthy strategic planning process,” they wrote. “But excellent adjectives, no matter how elegantly arrayed, do not constitute a strategic plan. Instead of being guided by a consistent vision, UWSP’s leadership has instead been erratic, misguided, and in some cases even incompetent.”

Summers, however, argues that the strategy isn’t reactive—or inconsistent, for that matter. Instead, he said, the university is trying to think “10 years down the road,” to what the students and the state will need. The demand, Summers said, “is coming from working adults on college campuses who cannot come to campus on Tuesday morning at 10 to attend a three-credit class, and who may not be looking for a full-fledged baccalaureate degree.” Rather, “they might be in need of knowledge and learning and professional advancement,” he told me. Besides, he argued, “too often when people have these conversations, they tend to conflate the value of the major with the value of the discipline.” This isn’t an attack on the liberal arts, he said, but a push to open up liberal-arts courses to more students in significant ways.

Summers, too, lays claim to the Wisconsin Idea, and has proposed an “Institute for the Wisconsin Idea” that will serve as the hub that generates a revamped general-education curriculum. “We want to really give [the institute] first priority at defining the curriculum in the most meaningful way we can for our students and making sure it’s integrated with their chosen professional pathway,” he said.

“We need to make sure that knowledge is relevant, and it’s applied,” he said. The university’s competitiveness in the future higher-education market depends on it.

Both Willis and Summers agree on one critical point: What happens in Wisconsin could become a model for higher education across the country. What divides them is whether that would be a good thing.

One thing is sure, however: Financial realities such as those facing Stevens Point are not far off for many regional institutions. “The reality is that we just can’t be everything to everyone, regardless of the public-good value of some of the coursework,” Summers said. “Those constraints are very real.” There are few encouraging signs—if any—that states will once again pump dollars into state colleges to get them back to 2008 levels and, as Mitchell of the Center on Budget and Policy Priorities notes, those levels still were not enough to make college affordable for most students.

But as much as this is a tale of a resource-strapped institution, it’s also a tale of something else—something that has nothing to do with the school’s budget and everything to do with the state of higher education in the 21st century. And that’s because even if the state were to miraculously open the coffers for state institutions, Summers said he would likely still eliminate the history major and others in favor of more focus on STEM fields bolstered by a broader general-education curriculum.

[Read: The humanities are in crisis.]

“If we suddenly received more dollars,” he told me, “we’d have to ask some pretty hard questions about where we’d want to invest those dollars—and again, I’d point to enrollment figures.” If the demand is for the fields with clearly prescribed career pathways, that’s where the resources should flow, he said. “We are obligated to make sure that we’re serving those students in the best way possible, and for that money, we need to focus on the liberal-arts core curriculum of the university,” he told me.

To Willis, this sounds like the manifestation of Scott Walker’s 2015 plan, shirking the “search for truth” in favor of meeting economic needs. “I think that’s what our administration is thinking about,” he said, “that our role here in central Wisconsin is to anticipate what jobs are going to be needed and to develop programs accordingly.” The problem, he fears, is that that alone will never be enough.

The national conversation around higher education is shifting, raising doubts about whether the liberal arts—as we have come to know them—are built to survive a tech-hungry economy.


In recent years, many of America’s urban schools have improved significantly. A 2016 report from the Urban Institute found that while all the country’s public-school students improved in the decade starting in 2005, the gain for those in large cities was double that of the U.S. average; the advances are especially pronounced in kids’ reading scores. With these strides, the achievement gap between city districts and their suburban and rural counterparts closed by roughly a third during that same period.

In some cases, the gap is all but nonexistent. Take, for example, Chicago, which in the late 1980s was notoriously deemed the country’s “worst school system” by then-Education Secretary William J. Bennett. A number of recent studies have shown that while standardized-test scores across Illinois have been flattening for the past decade or so, achievement in Chicago’s public-school district (CPS) has been steadily rising.

In fact, data from 11,000 school districts studied by Stanford researchers last year suggest that CPS ranks first in the nation for academic growth, and state statistics show that its students’ college-attendance rates are steadily improving, too: Sixty-five percent of the district’s 2018 graduates enrolled in college within a year after getting their diploma, compared with an average of 75 percent across the state. CPS students’ college-going prospects still fall toward the bottom when compared with those in most nearby districts, but they’re far from the worst—and the Stanford researchers’ findings around future growth in CPS indicate that its students’ postsecondary-achievement levels are poised to continue improving.

But middle-class, white parents tend to make assumptions otherwise—and research suggests that those assumptions are the result of racial biases. A recent study in the journal City & Community based on survey data out of eight metropolitan areas in the U.S. suggests that residents—including, presumably, parents—frequently harbor negative associations with the term urban and, by extension, “inner-city” communities and institutions, such as schools. To them, these words may connote scenes of educational dysfunction—rows of decrepit classrooms, for example, each stocked with an overworked teacher and a cluster of indignant teens, almost all of them poor students of color.

By contrast, the study pointed to evidence that the term suburban tends to elicit images of productivity and well-being among white parents. Of course, these stereotypes that white, middle-class parents harbor aren’t simply about population density, but about race, with urban standing in for predominantly black or Latino. A number of studies have shown that white parents tend to select schools with lower proportions of black students, regardless of school quality.

“We know that these terms, which might seem like they are neutral descriptions of physical spaces, are not neutral,” says Shelley Kimelberg, a sociologist at the University at Buffalo who co-authored the study with the Wichita State University sociology professor Chase Billingham. “They reflect people’s lived experiences and the social environment.” According to Kimelberg, the influence an individual’s personal experiences have in shaping how she defines the term urban contributes to a feedback loop, cementing “the idea that urban equals bad school and suburban equals good school.”

Read: The urban-school stigma

In their study, Kimelberg and Billingham analyzed data from a survey of residents in metropolitan areas across the U.S. When controlling for other factors, every one-point increase in whites’ perception of their neighborhood’s school quality was associated with a 15 percent decrease in the odds that they would describe their area as “urban.” The same effect was not evident among people of color.

Jack Schneider, a historian who studies education, has described this as “a gap in perceptions,” pointing as one example to public-opinion polls finding that while parents consistently give high marks to their own neighborhood schools, they also tend to report a lack of confidence in U.S. public education as a whole.

One of the most glaring manifestations of this gap, Schneider has argued, is the stigma against urban schools. Not only do stereotypes fail to acknowledge the variation within these districts, as Kimelberg’s study highlights, but they also place too much emphasis on test-score data, which, as Schneider has shown, provide a flawed illustration of school quality. For example, one 2006 study found that a majority (60 percent) of the variance in students’ test scores is attributable to kids’ lives outside of the classroom—where they live and with whom. The quality of instruction, including things like teacher characteristics, had little bearing on exam performance. Other research has found that the quality of one’s schooling plays a very limited role in determining whether she climbs up the economic ladder later in life.

Yet the stigma persists, and the tragedy of all this is that the stigma itself is a key reason educational inequality remains. Despite signs of a reversal in the white flight that crippled urban school districts following desegregation orders tracing back to the late 1960s and ’70s, research suggests that the country is seeing a new iteration of income-based housing segregation driven almost exclusively by affluent families with children. By moving to certain neighborhoods in pursuit of what they perceive to be good schools and to flee what they perceive to be bad ones, they contribute to school-funding inequalities by taking resources and social capital with them.

Read: Reviving a hollowed-out school

Chicago Public Schools, where close to nine in 10 students are black or Latino, offers a case study for these trends. The district has in recent years engaged in earnest efforts to attract middle-class families—launching International Baccalaureate programs at a slew of high schools, for example, and building new schools in white neighborhoods.

And, perhaps in part as a result, a body of scholarship corroborates the turnaround narrative that district officials have—sometimes suspiciously—long been touting. For example: CPS students, no matter the demographic subgroup, generally perform better than their peers in other Illinois school districts. These results are partly attributable to the district’s rising graduation rates and scores on the ACT college-entrance exam, but they also owe themselves to growing poverty and racial diversity in suburban school districts—a trend that Kimelberg highlighted when reflecting on the outdated or otherwise flawed assumptions that seem to inform people’s mental associations with the words suburban and urban.

Despite these changes, CPS has struggled to generate a critical mass of middle-class parents interested in its public schools—at least beyond those schools where students need a certain test score to get in. While reporting by WBEZ shows that the rate of families in Chicago who choose to send their children to their neighborhood school has declined, the trend is particularly evident among white families. Just half of Chicago’s white, school-age children attend the city’s public schools, compared with about 80 percent of their black counterparts, according to a 2014 report by WBEZ; the remainder attend other types of schools, like charters, magnets, or private institutions.

This dynamic, which is seen in urban areas across the country that give parents significant choice over where to send their kids to school, has been found to exacerbate educational stratification and racial segregation. The result is that even when urban districts improve a little, they struggle to improve a lot. And yet another generation passes through an education system defined by its unevenness and its racial divides.


Highly selective colleges have long struggled with racial and economic diversity. At 38 such institutions in the United States, more students come from households in the top 1 percent than from those in the bottom 60 percent. That is in part due to who applies to the universities: Many high-achieving students from a low-income or minority background don’t think they can get in to a prestigious institution, let alone pay for it—despite the fact that many such colleges have generous financial-aid packages—so they end up not applying.

A new study, however, found that a few extra dollars on a university’s part might go a long way in terms of changing that calculus for low-income students. The working paper, published by the National Bureau of Economic Research, examined the effects of a targeted-outreach campaign for low-income students at the University of Michigan.

[Read: The missing black students at elite American universities]

The campaign, known as the High Achieving Involved Leader (HAIL) Scholarship, encourages highly qualified, low-income students to apply to the university, promising them four years of education free of tuition and fees. Students are sent a personalized mailing with all of the information, which costs the university less than $10 each to produce and send out; the students’ parents and school principals are also contacted separately. And the offer of free tuition isn’t contingent upon filling out financial-aid forms such as the Free Application for Federal Student Aid (FAFSA).

The researchers, led by the University of Michigan economist Susan Dynarski, found “very large effects of the HAIL scholarship offer on application and enrollment rates at the University of Michigan and more generally on college choice.” Students who received the mailing were more than twice as likely to apply to the University of Michigan compared with a control group. The percentage of low-income students enrolling at the university more than doubled as well—from 13 percent in the control group to 28 percent in the group of students who received the mailer.

The HAIL Scholarship is a new program, but even without it the students would likely have been able to attend the University of Michigan free of charge—90 percent of similarly situated high-achieving, low-income students receive full-tuition scholarships. But HAIL makes that fact explicit: It isn’t that students can apply and have the chance to afford the college—if they apply and are accepted, it is guaranteed.

The study shows one way to tackle the phenomenon known as “undermatching,” which is when high-achieving students don’t attend the most selective college they could get into. It’s something researchers have studied and worried about for several years now, since it tends to occur most frequently among low-income students. While it has been argued that there’s too much attention being focused on getting low-income students into a small number of elite colleges, as I’ve previously written, students who undermatch are less likely to graduate than their peers who don’t, and they forgo a range of social benefits accrued from attending an elite college.

[Read: When disadvantaged students overlook elite colleges]

In some cases, the students enrolling at Michigan wouldn’t have gone to college at all had they not been contacted. “One-quarter of the enrollment effect (four percentage points) is driven by students who would not have attended any college in the absence of the treatment,” the authors of the report wrote. “The balance would have attended a community college or a less selective four-year college in the absence of the treatment.”

For the researchers, the next step in evaluating the program is to track its effects on students’ choice of major, graduation rates, and, in the long term, lifetime earnings. But for now, the results “show that a low-cost, low-touch intervention can strongly affect student application and enrollment at selective colleges.”

This is the second study in the past week showing the positive effects of a guarantee for low-income and minority students. A study published by the American Educational Research Association found that undermatching is reduced when low-income students know that their admission is ensured through state policy. The study examined the University of Texas system and its “top 10 percent plan,” which guarantees admission to students in the top 10 percent of their high-school class.

In both the Michigan and Texas studies, the students were given clear information that going to college—and to an elite college, at that—was a real possibility. As Kalena Cortes, an associate professor at Texas A&M and one of the Texas study’s authors, said, “Demystifying college-admissions policy is a pathway to greater inclusion.”


Jasmine Lee had finally found something she was happy with and wanted to pursue. She had been working as a medical assistant at an orthopedic center, and she was enjoying it. But she wanted more. So she figured she should check out the certificate program at Virginia College in Birmingham, where she works.

“I’m just trying to better myself, and provide a better life for my kids,” Lee, the 23-year-old mother of two, told me. Last summer, she decided to enroll. She applied to the medical-assistant certificate program, got accepted, and found out she was eligible for financial aid. Everything happened very fast, she said.

“They quickly had me signing up, they quickly had me trying to take a test,” she told me. She told her enrollment adviser that she wanted to start in August—it would be best for her job—but he advised her that the tuition would go up a significant amount soon, and if she started in July, she could “lock in” the current rate. According to the Virginia College website, tuition and fees for the nine-month certificate program were $16,750. A comparable program at the local community college in Birmingham costs roughly 10 percent of that.

[Read: The Coded Language of For-Profit Colleges]

Once she started, Lee enjoyed the program. The people—teachers and staff included—seemed to actually care about the students. “I was getting good grades, I had a good GPA, I found out I was on the dean’s list,” she said. And then, on Wednesday, the school’s parent company, Education Corporation of America, announced that on Friday it would close all of its campuses nationwide. And Lee, five months into her nine-month program, will likely never receive the certificate she has sunk both time and money into.

Lee is one of the more than 19,000 students affected by the abrupt closure of Education Corporation of America, one of the country’s largest private for-profit college operators, which runs Virginia College, Brightwood College, Brightwood Career Institute, Ecotech Institute, and Golf Academy of America. The for-profit operator had been in a precarious position for some time, given a pending loss of accreditation and access to federal financial-aid funds. In a letter to students on Wednesday, Stu Reed, the CEO of Education Corporation of America, said, “It is with extreme regret that this series of recent circumstances has forced us to discontinue the operations of our schools.” The company had tried to “dramatically restructure” itself, including campus closures over time. But the outside pressures, it argued, were too powerful, and it had to accelerate the process of closing schools. But until then, students, by and large, were unaware that the problems were that severe. ECA did not immediately respond to a request for comment from The Atlantic.

There have been a handful of closures of major for-profit operators over the past few years, most notably the closure of Corinthian Colleges in 2015 and ITT Technical Institute in 2016. In each of these cases, there were thousands of real people, like Lee, who were left wondering how to pick up the pieces after they’d invested valuable resources into an education that suddenly disappeared.

[Read: The Lifelong Cost of Getting a For-Profit Education]

When a college closes abruptly, students can often have their federal student loans discharged, Toby Merrill, the director of the Project on Predatory Student Lending at Harvard, told me. But that doesn’t happen automatically, she says, and students have to apply to receive the funds. Typically, after they apply, the discharge takes roughly a month or two. Historically, only a fraction of students who were eligible for such discharges have ever received them. However, the Department of Education recently changed its policies to provide automatic loan cancellation to all eligible students, as long as they do not enroll in another program that uses federal financial aid within three years.

But even though students may be eligible to get their loans discharged, Lee says, they are unlikely to get any credit for the work they’ve already done, and that doesn’t account for the money they spent out of their own pocket. “Signing up for school, they were telling us that a lot of schools do not take their credits,” Lee told me. But she figured it would be worth it because if she finished school there, the credential would follow her for life. Now that the school is closed, though, the consequences of this are more readily apparent: She can’t transfer into another program and would likely have to start from scratch. “I’ve missed out on five months of things with my kids to be in school,” she says. “I only had four more months to go.” And to have that rug pulled out from under her is “discouraging.”

Local community colleges and technical colleges will often try to step in to help students who have been affected by the closures, but their hands are tied in terms of what credits they can allow to be transferred over, Trace Urdan, a managing director at Tyton Partners, a higher-education consulting firm, told me. Since many of the credentials—including certificates—offered by the colleges are essentially one long course broken apart, most regionally accredited colleges can’t use the credits. It’s like taking half of a calculus course, he said.

Lee and other students in the medical-assistant program at Virginia College were still encouraged to take their final exam on Thursday evening. Lee went to class, hoping to learn more about what would happen to the students and their credits. On Friday, she told me that the school had suggested two institutions that may accept their credits. “But the classes are different,” she said, “so the ones who were at the end [of the program] may have to take another class or two.”

All of this is happening to a student body not well equipped to weather major setbacks. Statistically, students at for-profit colleges are more likely to be low-income than those at other institutions, and are less likely to have the resources to draw on to be able to come up with a good Plan B. And so they’ll end this year a little older, maybe a little wiser, and with even fewer options than they had when they started.


Editor’s Note: In the next five years, most of America’s most experienced teachers will retire. The Baby Boomers are leaving behind a nation of novice educators. In 1988, a teacher most commonly had 15 years of experience; less than three decades later, the most common level of experience was just five years. The Atlantic’s “On Teaching” project is crisscrossing the country to talk to veteran educators. This story is the fourth in our series. Read the first one here, the second here, and the third here.

Robert Gleed was only 17 when, a few years before the start of the Civil War, he escaped from a Virginia slave owner. He was caught soon after near Columbus, Mississippi, and sold at an auction, and he didn’t gain his freedom until Union troops arrived in 1865. In the 10 years that followed, Gleed opened a general store; acquired 295 acres of farmland, three city lots, and a home; and became one of the first black state senators in Mississippi.

On May 8 of this year, more than 150 years after 437,000 black Mississippians—then the majority of the state’s population—gained their freedom, Dairian Bowles, a junior at the Mississippi School for Mathematics and Science, told Gleed’s story. Dressed in a black waistcoat and a white shirt with a high collar, Bowles stood in front of Gleed’s marble tombstone in Sandfield Cemetery, Columbus’s historic burial ground for African Americans.

In front of about 200 visitors, Bowles told of how, a little more than a decade after emancipation, Gleed lost his political power, his store, and his home. In 1875, after his term as a state senator, Gleed ran for the position of sheriff. The day before the election, a mob of torch-carrying whites surged through downtown, killing four black men. Gleed survived only because a white friend helped him hide in his well. Soon after, white townspeople claimed that Gleed owed them money, auctioned off his store, and pocketed the profits.

Bowles’s performance was part of an African American history class taught by Chuck Yarborough, a 25-year veteran teacher at the school. Each year, Yarborough gives the students in his African American and U.S. history classes a list of people buried in Columbus’s two historic cemeteries—Sandfield and Friendship, the latter the resting place of many Confederate soldiers. Most of the people on the list have never been researched before, so students spend months poring over primary records in the town’s archives. Their final project is a performance written and directed by the students, and anywhere from 100 to 2,000 people from all over the state show up to see them.

Read: What kids are really learning about slavery

Among the moss-covered tombstones, students give voice to white, black, Jewish, and immigrant Mississippians, who more than a century ago—much as Americans do now—argued about who deserves the right to citizenship. But rather than prioritizing the debates of powerful leaders and the outcomes of bloody battles, which is common in history curricula across the United States, these students share stories that explore how the small, daily choices and actions of Columbus residents made Mississippi—and by extension, the country—what it is today.

The question of what students should learn about the Civil War, the role that slavery played in it, and the history of Reconstruction—the period from 1865 to 1876 when African Americans claimed their rights to freedom and voting, followed by a violent backlash by white Southerners—causes contentious disputes among educators, historians, and the American public. One outcome of these disputes is that ideologies often masquerade as historic facts. Texas’s 2010 standards, for instance, listed states’ rights and tariffs, alongside slavery, as the main causes of the Civil War—even though historians overwhelmingly agree that slavery was the central issue.

Another common problem is omissions: A 2017 survey of 10 commonly used textbooks and 15 sets of state standards found that textbooks treated slavery in superficial ways, and state standards focused more on the “feel-good” stories of abolitionists than on the brutal realities of slavery. When the same study surveyed 1,000 high-school seniors across the country, it found that among 12th graders, only 8 percent could identify slavery as the cause of the Civil War, and fewer than four in 10 students surveyed understood how slavery “shaped the fundamental beliefs of Americans about race and whiteness.”

Of course, students aren’t students forever, and the views of American adults are influenced by what they learn as children. When one 2015 poll asked American adults whether slavery was the main reason for the Civil War, 52 percent said that it was, while 41 percent said that it was not. In the same survey, 38 percent of adults insisted that slavery should not be taught as the main cause of the Civil War. That the country is divided on how to deal with Confederate statues and the Confederate flag follows naturally.

Read: What Trump’s generation learned about the Civil War

All this has motivated Yarborough to help his students explore the historical record; focus on primary sources, not textbooks; internalize, through performance, the stories of the people who lived through these times; and share their research with the community. He likes to paraphrase his favorite quote, by Elizabeth Cady Stanton, in his classes: “When women and men start to think, the first step in progress is taken.”

A sixth-generation Mississippian, Yarborough spent most of his childhood in Pass Christian, a small beach town on the Gulf Coast. Yarborough’s father drove to New Orleans every morning to work for Texaco as a geophysicist, while Yarborough’s mother raised five kids at home. In Yarborough’s telling, it was an idyllic childhood; he and his siblings went sailing, rode their bikes downtown to eat po’ boy sandwiches and play pinball, and spent hours reading books.

In elementary school, Yarborough became best friends with Otis Gates, who lived a few blocks away and was one of two African American students at the majority-white Catholic school both boys attended. In 1973, when Yarborough was a first grader, Gates invited him to his birthday party. A few moments after arriving at Gates’s house, Yarborough realized that he was the only white kid in the large crowd of black children—even though he knew that Gates had invited all their white classmates.

Yarborough recalled this as one of the most formative moments of his youth, one that he often shares with his students. “I stayed the night at Otis’s all of the time, and he stayed the night at our house all of the time,” Yarborough told me. “But that day, it was highlighted to me that there was this big divide in our community. My entire career, I’ve been trying to step across that divide.”

While Yarborough and Gates were growing up, Mississippi became the center of one of the largest anti-integration movements in the country. In response to the Supreme Court’s Brown v. Board of Education decision ordering the integration of public schools, white segregationist organizations opened what became known as “segregation academies”: private schools, funded in part by the government through vouchers, that opened between 1964 and 1972 for the children of white parents opposed to desegregation. Hundreds of such academies were established, and at least 35 survive in Mississippi, including the Heritage Academy, on Confederate Drive in Columbus. The schools briefly drew national attention when the Jackson Free Press reported that Mississippi Senator Cindy Hyde-Smith had attended one.

As Yarborough tried to understand the factors that fueled deep racial divisions in his state, he immersed himself in African American history. He majored in English at Vanderbilt University and received his master’s degree in Southern studies, with a focus on black history and culture, from the University of Mississippi.

Since Yarborough first started working at the Mississippi School for Mathematics and Science, in 1994, he has taught students from all types of schools: former segregation academies, religious private schools, public schools, and charter schools. A public boarding school, the Mississippi School for Mathematics and Science teaches 248 students from all over the state, who come to spend their last two years of high school studying accelerated sciences, math, computer courses, the arts, and humanities. In a state with the second-highest rate of black poverty in the country, and a public-school system in which nearly a third of all districts have resegregated in recent decades, all the students I talked to at the school cited their highly integrated classrooms as one of the most valuable parts of their academic experience. (Eighteen percent of the students are black, 11 percent are Asian, 11 percent are multiracial, and 66 percent are white.)

Each year, Yarborough surveys his students on what they know about the Civil War and Reconstruction. The feedback from the more than 1,400 students he has taught has been consistent: Out of a class of 18 to 20 students, about five come in with some basic knowledge of the Civil War, and a few have studied Reconstruction. “You can’t understand American history without understanding Reconstruction,” Yarborough argued. “Students have to understand the steps forward, racially and socioeconomically, that Reconstruction presents, and then steps backward that are taken with the violent reestablishment of white supremacy and the planter class being in control.”

Even though the U.S. history course for juniors in Mississippi is supposed to cover the period from 1877 on, Yarborough begins each year with the Civil War and Reconstruction. Students read Mississippi’s 1861 Ordinance of Secession, which, in Yarborough’s view, leaves little doubt that slavery played the key role in the Civil War. “Our position is thoroughly identified with the institution of slavery—the greatest material interest of the world,” the document states. “Its labor supplies the product which constitutes by far the largest and most important portions of commerce of the earth. These products are peculiar to the climate verging on the tropical regions, and by an imperious law of nature, none but the black race can bear exposure to the tropical sun. These products have become necessities of the world, and a blow at slavery is a blow at commerce and civilization.”

[Read: How history classes helped create a “post-truth America”]

Meanwhile, students start their research for what is known as the Tales From the Crypt project with primary sources—court and census records, diaries, and family and business files, among others—using them to write a paper about an individual buried in Friendship or Sandfield Cemetery.

This year, Erin Williams, a student from Hattiesburg, Mississippi, investigated and performed the life of Susan Casement Maer, a native of Australia who, in 1881, became one of Mississippi’s first women newspaper owners—the “editress” of the Columbus Dispatch, as records of that time identify her. Kaelon McNeece, a student from Brandon, a suburb of Jackson, researched J. G. Parsons, a Confederate soldier, which led to an exploration of undiagnosed PTSD following the war. Dairian Bowles—a student from Byhalia, a rural town in northern Mississippi, who performed the story of state Senator Gleed—also investigated the life of a progressive doctor, John H. Hand. Hand ended many cruel, ineffective medical practices in 19th-century Columbus—and then, as his practice earned him a sizable fortune, purchased nine enslaved women and men.

The Tales From the Crypt project, which added research and community performances to the U.S. history curriculum, was started in 1990 by Yarborough’s late colleague and mentor, Carl Butler. In 2007, Yarborough created the African American history course, for which students also research those buried in Sandfield and perform their stories as part of the Eighth of May Celebration of Emancipation—a yearly tribute that had fizzled out in Columbus in the 1970s and was later revived by Yarborough’s course.

As part of the African American history class this year, Edith Marie Green, a student from Oxford, a town in northern Mississippi, investigated the life of Allen L. Rabb, the owner of Rabb’s Meat Market, which was started by Allen’s father in the post–Civil War years. Other students researched historical records for William Isaac Mitchell, the president of Penny Savings Bank, the first African American–owned bank in Columbus, and Richard Littlejohn, the publisher of a local black newspaper in the 1880s, among others.

Green told me that this class was the first time she’d learned anything about Reconstruction, and she certainly hadn’t learned about black life during that period. “In my history classes, we covered slavery and Martin Luther King. That’s it,” she said.

Green, who is white, said she especially appreciated hearing the perspectives of her black classmates during frequent discussions that made connections between Reconstruction, Jim Crow, and contemporary issues of racism. “Black students encouraged white students to go beyond outrage over great injustices to think about what we can do to change things now,” she said. Helping more Americans recognize black history as part of U.S. history is a priority for Green. She plans to become a high-school history teacher someday, she told me.

Bowles described his performances of state Senator Gleed—along with the process of researching Dr. Hand’s life—as the highlights of his high-school experience. Spending time in the archives, Bowles told me, made him feel like a private investigator, caught up in the excitement as one court record led to another and small puzzle pieces of his research subject’s life fell into place. The contradiction Bowles uncovered—an innovative, progressive doctor who made medicine more humane, and a person who failed to see the inhumanity and cruelty of slavery—became the central question Bowles explored during his performance for a crowd of about 1,000 mostly white Columbus residents at Friendship Cemetery. “It was important for me to understand how common his views were and where his mindset came from,” Bowles reflected. “Developing an understanding doesn’t mean justifying or excusing someone’s actions.”

Yarborough told me that Bowles did what many Americans struggle to do when they consider the past—recognize the contributions of individuals without whitewashing their flaws and inconsistencies. “Human beings like there to be no shades of gray,” he said. “We want simplicity in history. We want either good or bad, just or unjust, right or wrong. And while that’s very satisfactory to us individually, any project in history that is going to reflect our world, and teach kids how to operate in our world, has to explore that complexity.”

Yarborough argues that relying on textbooks, which compress complex events and individuals into a paragraph or a page, is not an effective way to teach key moments in American history. Instead, students should have opportunities to research primary sources in the context of other historical accounts of the events.

Before performing the stories of Hand and Gleed, Bowles told me, he thought he didn’t like being onstage, and he tried to find every excuse to get out of performing. He ended up enjoying it more than anything else he’s done in high school, and is now considering majoring in screenwriting and film in college. “I didn’t go into this project thinking, I’ll be helping the community understand something, but seeing people engage and react to my characters was really satisfying. It was a really important moment in my life.”
