
The Vocal CORD | Council of Residency Directors


We are excited to introduce a new tool developed by the CORD Application Process Improvement Committee. EMATCH was designed to help applicants identify at-risk characteristics that may call for specific strategies to maximize their application in EM.

This tool has two overarching goals: 1) to reassure applicants who are not at risk (the vast majority) and 2) to push best practice resources and strategies to all applicants. NRMP and AAMC data were used to derive the tool, and best practice advising and strategies include resources from the CORD Advising Students Committee in EM (ASCEM), AAMC Applying Smart and EMRA Match. The CORD BOD and EMRA BOD have both reviewed and approved the tool and IRB approval was obtained to perform a validation this summer with 4th year EM applicants. Please consider sharing this with any 4th year applicants in EM or their advisors!

If you are interested in using the tool please read the consent below and here: EMATCH.

Stay tuned next month for information on another amazing tool, the Residency Navigator.

*    *    *

Dear Emergency Medicine Applicant:

You are being invited to take part in a research study seeking to understand perceptions of competitiveness in Emergency Medicine.  As part of this study, we invite you to complete the online Emergency Medicine Application Tool for Common Hang-ups (EMATCH) questionnaire. 

Your responses will help inform and improve our understanding of current applicant perceptions of competitiveness. Your responses are anonymous. The demographic data we are collecting includes your Medical School year, gender, ethnicity, and race. These will be used to ensure we have a representative sample. No identifying information is included in the survey. 

Taking part in this research study is entirely your choice. You can decide not to participate or to stop taking part in this research study at any time for any reason by stopping the survey. Doing so will not affect how you are treated at Baystate Medical Center and will not affect your educational standing or applicant status.

We hope to recruit at least 250 subjects to complete the questionnaire.

Risks and Costs: There are no risks or costs associated with this study. Though unlikely, you may feel uncomfortable answering some questions on the survey, so you may choose to skip questions. However, your participation in this study benefits your education and the education of future applicants by contributing to a better understanding of perceptions of competitiveness and giving you direct access to evidence based advising. 

If you agree to take part in this research study, your personal information will not be linked back to you. Data will be kept in a password-protected database, accessible only by the PI, Dr. Lucienne Lutfy-Clayton.

Who to Contact: For questions about the study or if you believe you have experienced a complication or injury as a result of participating in this study, please contact the PI, Lucienne Lutfy-Clayton, by phone or email (see below). If you have questions about your rights as a research study subject, call the Baystate Medical Center Institutional Review Board (IRB) at (413) 794-4356.

Next Steps:  If you choose to participate, please complete the EMATCH Questionnaire. Your completed survey will serve as your consent to participate in the study. 

If you choose not to participate in the study, you may keep this consent sheet for your future information. 

This research study has been reviewed and approved by the IRB of Baystate Medical Center. 

PRINCIPAL INVESTIGATOR (PI) CONTACT INFORMATION:

Lucienne Lutfy-Clayton MD, Assistant Professor UMMS Baystate

Email: lucienne.lutfy-clayton@baystatehealth.org; Phone number 413-794-5999

Department of Emergency Medicine, 759 Chestnut St, Springfield MA 01199


Authors: Mark Olaf, DO and Liza Smith, MD, on behalf of the Advising Students Committee in Emergency Medicine (ASC-EM)


The Standard Video Interview (SVI) was developed by the Association of American Medical Colleges (AAMC) in response to the perceived need to provide information which would speak to the professionalism and communication skills of an applicant distinct from the academic metrics already available in the Electronic Residency Application Service (ERAS) application. The SVI was added as a required component of the ERAS application for students applying to Emergency Medicine (EM) in the 2019 Match and will continue to be required for the current 2020 Match cycle.

Over the year since our last SVI update, two papers have been published in Academic Medicine addressing the validity of the SVI as an independent evaluative tool as well as the reactions of residency program leaders to its implementation.  

The first paper, Innovation in Residency Selection by Steven Bird et al., methodically describes the rigorous development and creation process of the SVI for use as an evaluative tool and makes the case that it does appear to represent a unique facet of the ERAS application.  

The authors describe the implementation of the initial institutional review board (IRB)-approved research pilot study which took place from June to December 2016 as well as the operational launch of the SVI for the 2018 ERAS application cycle which collected data from 3,532 emergency medicine applicants.

Analysis of the data from both the pilot and operational launch showed distributions that were nearly normal. Only small correlations were noted between the SVI score and USMLE Step 1 score, Step 2 CK score, Step 2 CS score, Alpha Omega Alpha Honor Medical Society membership, and Gold Humanism Honor Society membership. The pilot study results showed small differences in score means when compared among races, suggesting racial or ethnic influence on SVI scores. Additional training was provided to the operational phase SVI raters, none of whom were the same as the raters for the pilot study. On review of the data from the operational launch, comparisons between the SVI and race/ethnicity yielded no differences, suggesting that the additional rater training mitigated this influence. In addition, in the pilot study, score mean differences were medium or large when comparing applicant type (United States MD, Osteopathic, Foreign Medical Graduate, and International Medical Graduate), and these differences persisted into the operational phase.

The authors offer that although early measures of validity for the SVI demonstrate promise of its utility as a unique evaluative tool in the application process, their conclusions are not without reservation.

They also bring up the issue of cost and resource utilization required to expand or even maintain this large-scale operational assessment which might necessitate transitioning to computer-based scoring of video interviews.  Computer-scoring is widely adopted in high-stakes writing assessments, but in order to even begin to accept the premise of professionalism and communication competencies being scored by a computer, it will need to be demonstrated that this method can be reliable, fair, and valid.  They also comment on the potential pitfalls related to the variability with which the SVI scores might be incorporated into the residency selection process and caution that these scores should be used in the broad context of holistic application review and not as another filterable data point.  

Program response to the operational phase of the SVI is addressed in the second paper, The AAMC Standardized Video Interview: Reactions and Use by Residency Programs During the 2018 Application Cycle by Fiona Gallahue, et al. An initial study was performed in November 2017 and utilized a program director survey to evaluate reactions to the SVI during the operational phase, followed by an additional study done in January 2018 which analyzed each program’s usage of SVI video responses.

Survey results were available from 125 programs, while video usage analysis was available from all 175 programs. Survey data indicated that program directors were cautious regarding their use of the SVI when evaluating applicants for interviews or ranking. Most program directors did not use the SVI in the selection process, and the most common reason cited for watching a video was curiosity. Programs were more likely to view videos of applicants with higher USMLE scores and of United States MD applicants. More than half of programs indicated that they would be at least somewhat likely to use the SVI scores or videos in the future, demonstrating a willingness to add the SVI into the already complicated process of evaluating residency candidates.

Among those programs that utilized scores in evaluating candidates for interviews or ranking, mixed reactions and divided opinions were apparent. Slightly more than half (54%) of programs utilized the SVI as part of the applicant selection process, while at the same time 70% of programs reported that the SVI was not important in deciding whom to invite for interviews. The most common use of the SVI was as a “tie-breaker” between applicants with similar profiles. The methods of discerning meaning from the SVI varied as well: a third of programs watched videos to infer meaning, while fewer used the SVI score distribution and percentile rank tables or performed a direct comparison to other objective measures on the ERAS application.

On the other hand, 46% of programs did not consider the SVI at all, most often citing uncertainty about its validity as the reason for not using the score.

These mixed reactions prevent us from drawing clear conclusions about adoption of the SVI. However, it may simply be that the SVI will follow the usual pattern of adoption of novel ideas or technology, with relatively few early adopters followed by a gradual increase.

Uncertainty regarding whether the AAMC will continue to conduct the SVI beyond the 2020 Match cycle remains, as do questions about potential costs to students and/or programs should it be incorporated in the long term. The AAMC continues to partner closely with representatives from the Society for Academic Emergency Medicine (SAEM), Association of Academic Chairs of Emergency Medicine (AACEM), Clerkship Directors in Emergency Medicine (CDEM), Council of Emergency Medicine Residency Directors (CORD), Emergency Medicine Residents' Association (EMRA), American Academy of Emergency Medicine Resident and Student Association (AAEM-RSA), and the AAMC Group on Student Affairs (GSA) to address this issue and others, in addition to ongoing research regarding the SVI’s incorporation and utility. Future research into the SVI is ongoing: seventeen EM programs have partnered with the AAMC to investigate the relationship between SVI score and performance outcomes during residency. Undoubtedly, the AAMC hopes to demonstrate that the SVI is measuring attributes that translate into an evaluation of professional and interpersonal skills that are evident in residency training.

All that being said, students applying to emergency medicine for the ERAS 2020 Match are expected to participate in the SVI between early June and July 16, 2019. They should be encouraged to access the free, online preparation materials provided by the AAMC and reassured that expensive and exhaustive commercial preparation has not been shown to offer any benefit over these free resources. The CORD Advising Students Committee hopes to contribute to future progress on understanding the validity, impact, and use of the SVI through ongoing research with the AAMC and anticipated surveys of applicants and advisors regarding student preparation for the SVI. We’ll keep you updated on any new developments.


Authors: James Willis (SUNY Downstate), Jordan Specter (Boston Medical Center), Kerry McCabe (Boston Medical Center), Megan Fix (University of Utah), and Eric Shappell (Harvard University) on behalf of the CORD Education Committee
*NEW* CORD Education Consult Service

CORD’s Education Committee is excited to announce the Education Consult Service, aka the ECS!

The ECS is designed to help program leaders take advantage of the collective mind and experience of their peers within the CORD community. There is no reason to agonize over residency administration issues and reinvent the wheel when CORD includes hundreds of colleagues who have dealt with similar concerns and are able to provide guidance. The ECS is intended to connect residency program members with expert consultants in a variety of residency-related content areas in an effort to relay best practices and expert-recommended solutions to common residency program issues. This service is free to all CORD members.

The content areas for the ECS include:

  • Program Design and Management
  • Leadership Skills
  • Managing Accreditation
  • Faculty Skills Development
  • Curriculum
  • Teaching Methods
  • Resident Interface

We have a diverse, qualified group of consultants ready to help with your needs. Given the broad geographic diversity of ECS consultants, most consultations will occur remotely. Communication and recommendations will be provided digitally and, with the consent of the consulting program, archived (with or without de-identification) for future reference.

Click here to request a consult: Request A Consult

Don’t have any pressing issues now but want to offer your services as a consultant? Click here to become a CORD Education Consultant: Become a Consultant


Authors: Liza Smith, MD; Emily Hillman, MD; Jamie Hess, MD; Seth Kelly, MD; Katelyn Harris, MD; Alexis Pelletier-Bui, MD; and Adam Kellogg, MD on behalf of the CORD Advising Students Committee in EM (ASC-EM)

This applying guide is intended for students interested in applying to emergency medicine (EM) but who have had academic struggles, professionalism concerns, or other potential red flags that may affect their ability to match. A printable version of this guide can be found here.


General Overview
Each year there are more applicants for emergency medicine (EM) residency training than available positions. Predicting which applicants are unlikely to find a match is an ongoing challenge in EM advising. The at-risk applicant is one who, for a variety of reasons, may fall into the less competitive end of the applicant pool and may not be able to have a successful match in EM. This concern is often due to a red flag in his/her application. When the term “red flag” is used in medicine, it indicates a warning sign suggesting more serious pathology, such as the red flags for spinal cord compression in back pain. This terminology has been adopted by application reviewers to refer to signs in an application that raise concerns about an applicant.

Knowing if you are an applicant who will raise red flags in the mind of a program director (PD) is really important for planning your application strategy. If you do have one or more of these warning signs, you are at risk of not matching in EM. You will need to do everything you can to minimize the impact on your application and be proactive about considering a non-EM backup application strategy. Finding an EM position after an unsuccessful match is very unlikely. The best alternative training opportunities, including those that allow for re-application to EM, are going to be those planned in advance of the match with a parallel application to another specialty.

“Red flags” tend to fall into one of three categories: academic struggles, professionalism concerns, and unexplained gaps in the CV. These are not all weighted equally, but any one of them can negatively impact your chance of matching.

Academic Struggles

1. Failure of the USMLE or COMLEX exam
Residency programs are evaluated on the rate at which their graduates pass the boards when they finish residency. It has been demonstrated for many specialties, including EM, that not passing the USMLE or COMLEX is a strong predictor of struggling to pass later exams.(1,2) This correlation leads program directors to worry about applicants who struggle on these types of knowledge assessments. In a survey of EM education faculty conducted by this committee, approximately half of programs will not consider an applicant who failed USMLE Step 1; however, almost all do consider applicants with below-average scores.(3)

What to do? If you have failed a portion of the USMLE or COMLEX, it is critical to retake and pass as soon as possible. These marathon testing scenarios are challenging. In addition to continuing to bolster your knowledge base, taking a course in test-taking strategy can be extremely helpful for many students. In addition, students who perform poorly on or fail USMLE Step 1 should plan to take Step 2 CK early in order to have scores available when submitting the ERAS application in mid-September. An improved performance on Step 2, even just raising your score to average, will reassure programs and increase the likelihood of an interview.(3) Failing USMLE Step 1 almost always warrants a non-EM backup plan, though below-average scores do not. Because USMLE scores are often used as a filter for programs when reviewing applications, students with below-average scores will need to be strategic in selecting programs that are less likely to screen out their applications based on this factor alone (see Figure 1).

2. Failure of a preclinical course or repeating a preclinical year
Failing a pre-clinical course or repeating a year of study typically indicates a struggle with accumulating a strong knowledge base and translating it into testing scenarios. Approximately 70 percent of programs will ‘rarely or never’ (<3 applicants/year) interview an applicant with a preclinical course failure on their transcript or MSPE.(3) However, the impact of a successfully remediated course that does not appear as a failure on the final transcript is less clear.

What to do? Successfully retaking a course is absolutely necessary to mitigate any concerns. If a failing grade will remain on the transcript, a non-EM backup plan must be considered.

3. Failure of a clerkship
Failing a clerkship or other clinical experience is even more worrisome than failing a preclinical course. These can be deal breakers to a program director due to concerns over potential professionalism issues. Nearly all programs reported ‘rarely or never’ interviewing applicants with a clinical course failure.(3,4) Again, the impact is less clear for a remediated course that no longer appears as a failure on a transcript or MSPE.
What to do? In addition to successfully repeating the clerkship, the circumstances around the failure need to be explained in the personal statement and/or MSPE and a non-EM backup plan should be pursued.

4. Negative feedback on the Medical Student Performance Evaluation (MSPE; Dean’s Letter)
The MSPE usually includes feedback given on your clerkship evaluations and occasionally can include constructive feedback that paints the applicant in a negative light, such as lack of interest, multiple absences or consistent tardiness, not paying attention, etc. When such constructive feedback is present in the MSPE, it is a source of concern for programs.
What to do? It is important to fully review your MSPE so you can address it in your personal statement and take ownership of any potentially negative feedback. The impact of the presence of negative feedback on your application varies by the situation and your ability to explain it. If negative comments are associated with a failed or repeated clerkship, a non-EM backup plan should be strongly considered.

Professionalism Concerns

1. Academic Misconduct

Academic dishonesty speaks to the character of the applicant and raises concerns about how the applicant will meet the legal, ethical, and professional obligations of a physician. All programs report ‘rarely or never’ interviewing candidates with a history of academic misconduct.(3)
What to do? If you have been involved in proceedings related to academic misconduct during your medical school tenure but are still on track to graduate, you must have convinced your school that there was a misunderstanding or that you have been rehabilitated. You can certainly try to restate your case for application reviewers in your personal statement, but in a specialty as competitive as EM, it is unlikely you will be offered enough interviews to match. If you move forward with applying to EM, a non-EM backup strategy must also be pursued.

2. Misdemeanor or felony history

There are two types of people in the world: those who learn from their mistakes and those who don’t. For instance, if your response is to blame others, make excuses, and continue to make the same mistakes, your past is likely to drag your application down. Approximately 70 percent of programs ‘rarely or never’ interview candidates with legal trouble on their record, such as DUI or drug possession.(3)
What to do? Take some time to truly reflect on your experience, identify how you could have handled the situation differently, and be able to articulate what you learned from it. ERAS has a text box where applicants provide narrative comments regarding a misdemeanor or felony. If you accept responsibility, take ownership of your mistakes, and can demonstrate making conscious changes for the better, some program directors may look past this blemish. A non-EM backup plan should be considered.

3. Unexplained gaps in your CV

If you have taken time off during medical school or if there are long periods of time unaccounted for on your CV, these gaps need to be addressed in your application. PDs may become concerned if an applicant demonstrates a history of not being able to complete a curriculum or course requirements in the usual time provided. Approximately 75 percent of programs ‘rarely or never’ interview candidates with unexplained gaps in their CV. (3)
What to do? There can be good reasons these gaps happen, and you are best off explaining up front in your personal statement or MSPE. Do not rely on the hope that they go unnoticed or that you can get away without explanation. If you leave these gaps to the imaginations of applicant reviewers, they will assume academic struggle or a professionalism issue.

The Best Defense is a Good Offense

In 2016, the AAMC recommended a new format of the MSPE with the goal of offering a more accurate and objective summary of student performance. The new format more directly compares your performance with your peers and highlights adverse parts of your application, such as professionalism deficiencies. For more information, visit this website.

Most advisors recommend addressing red flags in your personal statement. This is the first place that someone reviewing your application is going to look for an explanation. If they do not find one, there is little incentive for them to go any further in considering you for an interview.

You should explain mitigating circumstances that led to your failure of a USMLE or COMLEX exam, or failure of a clerkship, but be careful not to make excuses. In other words: Take responsibility for what happened. Describe the steps you have taken to remedy the issue and how you emerged from these challenges better prepared for a career in EM.

Have an advisor review your personal statement and give feedback. They should be a useful resource with insight on how your explanation will be interpreted. Things happen, life is complicated, and reviewers can understand this—if you give them the chance.

Applicants need to recognize the limitations of any of these strategies for managing red flags. Every effort should be made to explain the circumstances to better inform the application reviewer. However, many times the application will not be reviewed because of the use of ERAS filters by programs. The table below shows the results of a survey of EM residency program directors on the use of filters. (4)

Resources such as EMRA Match can be helpful in determining which programs are likely to use some of these filters. Looking for programs that report considering applicants with Step 1 failures or that acknowledge using certain Step 1 cutoffs can help an applicant target his/her applications to programs that are more likely to fully consider their application. For other red flags, it is hard to predict how programs will react. These applicants are best served with a broader application strategy and early, proactive discussions with their advisor about a non-EM backup plan.

Key Points

1. What does it mean to have red flags in your application?
“Red flags” refer to signs in an application that raise concerns about an applicant. They tend to fall into three categories:

  • Academic struggle (such as failing the USMLE or repeating a preclinical course or year)
  • Professionalism concerns (such as academic misconduct or having a misdemeanor/felony history)
  • Unexplained gaps on your CV

2. How should I address a red flag?
It may be tempting to hope it will go unnoticed by all of the experienced reviewers who will be looking at your application. In almost all cases, it is a good idea to use your personal statement as a vehicle to address any red flag by explaining what you have learned and how you have grown from the associated experience. Early, proactive discussions with an advisor familiar with EM residency applications are invariably a good idea. The need for a non-EM parallel or backup plan depends on the red flag present and on how effectively it can be addressed and mitigated. Using resources such as EMRA Match can help an applicant be strategic about targeting programs that are more likely to be open to considering their application.

References

  1. Caffery T, Jones G, Musso M. Predicting Initial ABEM Board Passage Rates Using USMLE Scores. West J Emerg Med. 2016;17(4.1).
  2. Harmouche E, Goyal N, Pinawin A, Nagarwala J, Bhat R. USMLE Scores Predict Success in ABEM Initial Certification: A Multicenter Study. West J Emerg Med. 2017 Apr;18(3):544-549
  3. Council of Emergency Medicine Residency Directors Advising Students Committee in Emergency Medicine. (2018). [CORD ASC-EM Advising Addenda Study]. Unpublished raw data.
  4. Jarou Z, Kellogg A. Diagnosing the Match: Trends in the Applicant Selection Process. EM Resident. 2018. (Accessed May 25, 2018 at https://www.emra.org/emresident/article/diagnosing-the-match-trends-in-theapplicant-selection-process/)

Author: Linda Katirji, MD, Assistant Clerkship Director at University Hospitals Cleveland Medical Center, Cleveland, Ohio, on behalf of the CORD Advising Students Committee in Emergency Medicine (ASC-EM)


This past week, the CORD Advising Students Committee in Emergency Medicine met during CORD Academic Assembly 2019 in lovely Seattle, Washington. The committee has had a busy and exciting year, with efforts directed towards improving and developing advising resources for medical students applying to emergency medicine.

This post serves as a brief update from the committee after our meeting at CORD.

CORD/EMRA Student Applying Guide

This past year has been an extremely productive time for ASC-EM. Our committee continues to grow in size, and with that comes fresh ideas. Our group was lucky enough to have the opportunity to collaborate with EMRA on a handbook for applying to EM. It is currently in press and should be in the hands of every medical student EMRA member in the next few months!

New Special Population Applying Guides!

We also developed many new applying guides including the Underrepresented Applying Guide, Orphan Applicant Applying Guide, and Dual Accreditation (EM/IM, EM/FM) applying guide (which can all be found here). In the next year, our plan is to continue to update our previously published applying guides as well as add even more, including a “latecomers” applying guide for students who decide to apply to emergency medicine late in their medical school career, and for those who have to enter into the SOAP.

EM Stud Podcast Collaboration

ASC-EM was also lucky enough to collaborate with EM Stud Podcast for the first time to produce a podcast about our applying guides. Listen here!

ASCEM looks forward to collaborating with Drs. Lewis and Wieters in the future.

Distance Advising Service

Currently in the works is the development of a distance advising service. We plan to build a service of faculty members and residents from ASC-EM to help facilitate advising specifically for students who do not have “home” EM programs at their medical schools (aka Orphan Applicants). Stay on the lookout for this!

If you are interested in joining ASC-EM or have any other ideas on ways to improve medical student advising in emergency medicine, comment below or email me on behalf of ASC-EM at lindakatirji@gmail.com.

Authors: Zhang XC, Abrams M, Papanagnou D, Thomas Jefferson University, on behalf of the Advising Students Committee in Emergency Medicine

Providing timely feedback is a critical component of medical education, promoting learning and self-reflection as learners acquire new skills and knowledge in the ever-changing field of medicine. Feedback should be provided in a safe space, be based on direct observation, and include specific suggestions for future improvement. While there are innumerable publications on how to provide appropriate, well-received feedback, many instructors and faculty members receive little training on writing a constructive and explorative student evaluation as part of a formative assessment process. Abbreviated commentary such as ‘good job’ or ‘read more’ provides little insight into the mindset and behavior of the student and can result in unexplained discrepancies between students’ objective scores and their written narrative evaluations.

As a food enthusiast and Yelp (www.yelp.com) reviewer, I am met with many similar challenges shared by my academic colleagues in writing a constructive summary (or evaluations) of a memorable experience. While the stakeholders of a restaurant review and student evaluation may differ, both pieces share common literary constructs and conceptual frameworks for a review that accurately and eloquently portrays every aspect of the encounter.

In order to guide evaluators in writing formative and constructive student evaluations, we have collected, reviewed, and revised five lessons that transcend both culinary and pedagogical dimensions.

1.       Take a [mental] picture

Just as food enthusiasts photograph their culinary adventures as keepsakes, clinician evaluators can take mental pictures of a student’s performance simply by empowering the learner to recall the clinical experience for you. While a few memorable patients may invoke powerful recall of the specific nuances of an encounter, remembering the details of a student’s performance after a hectic day is trying and unreliable. As an evaluator, you can ask the student to send a formal evaluation request with the following information: 1) the student’s professional photo, 2) his/her patient encounters of the day, 3) learning points, 4) areas of strength, and 5) ways to improve for the next day. This simple act allows the learner to actively recap their clinical experience while jogging the evaluator’s memory of the day’s events.

2.       Document every detail

Just as each description, from the impeccable selection of avant-garde décor, to the artistically plated legumes, to the amicable staff, transports the reader into the reviewer’s culinary experience, clear and detailed documentation is equally crucial when recording student behavior for evaluation. Each specific record of a student’s behavior provides a formative marker to track their growth as students and, in some cases, flags concerning habits that program leadership should address during training.

3.       Use a rubric

Similar to a food critic’s reviews, medical students in the United States are evaluated against a set of rubrics or criteria. The Association of American Medical Colleges (AAMC) lists 15 core competencies for entering medical students; the Accreditation Council for Graduate Medical Education (ACGME) uses a similar set of six core competencies. Depending on the learner and institution, the evaluator should adhere to the exact phrasing for the level of competence within each gradable competency to minimize subjective grading and ensure fair comparison between the student and his or her peers.

4.       Leave room for improvement

Even the best restaurants have flaws, and it falls upon the reviewer to illustrate areas of improvement to make the dining experience even more memorable with each visit. In medicine, students work under high stakes and even higher pressure, where misinformed actions or unidentified knowledge gaps may cause inadvertent harm to patients. As such, clinical evaluators must strive to provide active, constructive feedback based on direct observation after every student encounter, and to document these elements on formal evaluations to be reviewed later by program leadership stakeholders, such as clerkship directors or program directors. These documented areas of improvement can be instrumental in providing targeted remediation and mentorship.

5.       Check-in before you check out

While many commercial review apps let end-users ‘check in’ to a restaurant and send periodic reminders soliciting their opinions and evaluations of the experience, the formal medical evaluation process often requires learners to send individualized evaluation requests through a password-protected hospital server, making it difficult for evaluators to input reviews without student prompts. In the absence of a commercial smartphone app for evaluation check-ins, evaluators can circumvent this by asking the student to send an evaluation request at the beginning of the shift as a ‘check-in’ reminder. This check-in reminds the evaluator to complete the evaluation and signals to the student that his/her performance will be critically assessed during the shift.

In conclusion, constructive, written narrative feedback is an integral part of formative and summative evaluation in health professions education. We hope these insights from the food-blogging world prove helpful to evaluators: this five-step guide to developing a 5-star review will support them as they aim to provide written, constructive feedback to their students.

Authors: James Ahn, MD, MHPE (University of Chicago), Christine Babcock, MD, MSc (University of Chicago), and Navneet Cheema, MD (University of Chicago)

Introduction:

In 2016, three years after the Self-Study was introduced through the Next Accreditation System (NAS), the Accreditation Council for Graduate Medical Education (ACGME) offered select programs the opportunity to enroll in a pilot of the Self-Study program. There is limited literature outlining the optimal implementation of the Self-Study in EM, forcing any program embarking on the process to develop de novo best practices. That methodology informs this article, which is meant to serve as a template for the Self-Study process in EM programs.

Timeline:

As a program prepares for its Self-Study, it is important to have a projected timeline for organizational purposes. Guralnick et al. propose a 20-week timeline for a typical large Internal Medicine program. Our process, from formation of the Self-Study Committee (SSC) to submission of the summary, spanned 12 months. During this time, we held seven SSC meetings and three stakeholder meetings, conducted an electronic survey, and communicated regularly by email. Our pilot Self-Study visit occurred three months after our document was submitted, for a total 15-month process.

Committee Formation:

The formation of an SSC is central to a successful Self-Study process. The ACGME suggests the Program Evaluation Committee (PEC) as an ideal starting point, given the substantial overlap in data analysis and goals. Alternatively, the program could assemble a smaller, targeted committee for more efficient meetings and agile decision-making. Broader representation that includes additional stakeholders, such as recent graduates, nurses, and technicians, can be beneficial, as they contribute significantly to the environment and outcomes of the program. We chose a smaller committee comprising the program leadership, chief residents, program coordinators, and selected core faculty. This helped facilitate productive discussions, meeting coordination, and task completion. We engaged broader stakeholders through supplementary discussions and surveys.

Developing Aims:

An essential goal of the Self-Study is to develop program aims that describe the culture, product, and vision of the program. They are, in essence, the core of what the program is and aspires to be, answering the question: “What type of physicians do we produce?” First, we held a facilitated idea-generating session at a residency-wide meeting. We developed a word cloud from comments during that session, which helped us focus our aims on recurring themes. Next, we conducted a brief focus group during a faculty meeting. Lastly, we performed a qualitative analysis on results from these focus groups and an alumni survey to triangulate our results. This multi-pronged approach allowed for an iterative process of aim derivation that was reflective of our program.
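At its core, the word-cloud step is a term-frequency count over free-text comments. A minimal sketch of that idea follows; the comments, stopword list, and function name are invented for illustration, and the committee's actual tooling is not specified in the article:

```python
from collections import Counter
import re

# Hypothetical session comments; actual transcripts are not public.
comments = [
    "We produce clinically excellent physicians who serve the underserved",
    "Graduates are excellent educators and community advocates",
    "Our residents become leaders in community emergency medicine",
]

# A small, illustrative stopword list (real analyses use larger ones).
STOPWORDS = {"we", "are", "and", "our", "the", "who", "in", "become"}

def top_themes(texts, n=5):
    """Tokenize comments, drop stopwords, and return the n most common terms."""
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

print(top_themes(comments))
```

The most frequent surviving terms ("excellent", "community" in this toy data) are the recurring themes a word cloud would render largest, which is exactly what the committee used to focus its aims.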

Strengths, Weaknesses, Opportunities, and Threats (SWOT Analysis):

While the SWOT analysis has been a successful tool in the business world for decades, its use in education is newer. It encompasses identifying the strengths, weaknesses, opportunities, and threats to the program. To gather data for our SWOT analysis, the chief residents held a resident-only meeting to facilitate candid feedback about the program. Similarly, senior faculty members served as objective facilitators, leading in-person and email discussions with the entire faculty to compile parallel data. These two processes served as the basis for the SSC to finalize our SWOT analysis, which took place iteratively over several months.

 

Longitudinal Data Review:

One of the final critical components of the Self-Study is longitudinal data review.  This is an in-depth assessment of the yearly program evaluation data since the previous accreditation review.  This includes the ACGME resident and faculty surveys and a yearly program specific internal evaluation completed by both residents and faculty.  It explores which issues have been identified in the past and the action plans that have been implemented to address the shortcomings, as programs are expected to track their outcomes and improvements related to identified issues.

Self-Study Summary:

The SSC should aggregate its findings into a succinct Self-Study Summary (SSS). This document is the only deliverable to the ACGME and has a 2,500-word limit. As the ACGME requires transparency for the entire process, the program must be careful to document each step of the Self-Study contemporaneously. Notably absent from the SSS is any information about the program’s strengths and areas for improvement; the ACGME specifically excludes these items so that programs can perform and report on these assessments during the 10-year visit.

Tips and Pitfalls:

After following the Self-Study process described above, we successfully participated in the Self-Study Pilot Visit. We have incorporated the site visitors’ feedback and that of our key stakeholders to identify the following tips for developing and implementing the Self-Study process.

  • Engage all stakeholders: The aims define what makes your program unique and will be the basic platform for the rest of the Self-Study.  As such, it is imperative to have robust involvement from all stakeholders including current residents, faculty, alumni, and administrative staff.
  • Develop measurable aims: It is important to have measurable aims with the ability to generate deliverables on an annual basis.  This will ensure that your program is still represented annually by the description generated by the Self-Study process.
  • Establish a timeline: We recommend a longitudinal ten to twelve month timeline to tackle each portion of the Self-Study process on a scheduled basis.
  • Set an agenda: A clearly defined agenda for each meeting, circulated to the committee in advance along with the tasks to be completed, will increase productivity. This reserves the actual meeting time for activities that require group participation, such as consensus building, brainstorming, and discussion.
  • Assign asynchronous tasks: Drafting final text, editing, and detailed work regarding data analysis and reporting should be assigned at the end of each meeting as tasks to be completed asynchronously. This model will provide a roadmap that dedicates appropriate time and discussion to each portion of the process while respecting time demands of busy residents and faculty.
  • Approach the SWOT with transparency and embrace the opportunity to evolve: An honest analysis of both internal and external stressors affecting the program is essential. Throughout this process, we learned that every threat can conceal an opportunity. On initial review, stressors may be identified that threaten the program in various capacities. On secondary review, however, the committee may find the silver lining that offers an opportunity in response to the threat.
  • Have a balanced approach to change: When identifying weaknesses, we discovered it is important to report on the “low-hanging fruit” on an annual basis. We recommend addressing fixable weaknesses via the Annual Program Evaluation (APE) each year while slowly chipping away at larger institutional or programmatic issues over time. The key is to demonstrate dedication to continually evolving the program at both small and large scales while generating positive momentum.
Conclusion:

Since the implementation of the ACGME Self-Study in 2013, no EM programs have formally participated in the Self-Study Site Visit. Our program opted into the Self-Study Pilot Visit and learned some valuable lessons regarding optimizing the Self-Study process. This commentary provides a description of our processes as well as valuable tips based on feedback from stakeholders and the Pilot Visit ACGME representatives.

References:
  1. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system – rationale and benefits. New England Journal of Medicine 2012;366:1051-6.
  2. Philibert I, Lieh-Lai M. A Practical Guide to the ACGME Self-Study. Journal of Graduate Medical Education 2014;6:612-4.
  3. Philibert I, Nasca TJ. The Program Self-Study and the 10-Year Site Visit: Rationale for a New Approach. Journal of Graduate Medical Education 2015;7:310-2.
  4. Coyle J, Martinez S, Robertson WW Jr, Philibert I. Testing a site visit approach for the next accreditation system. Journal of Graduate Medical Education 2013;5:349-51.
  5. Self-Study: Eight Steps for Conducting the ACGME Program Self-Study. (Accessed May 30, 2017, at http://www.acgme.org/What-We-Do/Accreditation/Self-Study.)
  6. Guralnick S, Hernandez T, Corapi M, et al. The ACGME Self-Study – An Opportunity, Not a Burden. Journal of Graduate Medical Education 2015;7:502-5.
  7. Robbins JB, Sarkany D. Self-Study: Practical Tips for a Successful and Rewarding Experience. Academic Radiology 2017;24:721-4.
Author: Linda Katirji MD, Assistant Director of Medical Student Education, University Hospitals Cleveland Medical Center, Ohio on behalf of the Advising Students Committee in Emergency Medicine (ASC-EM)

As applications for Emergency Medicine away rotations begin to open, many students have been asking what programs are looking for when they request a letter of interest.

The letter of interest is essentially a way for clerkship directors to see who is genuinely interested in the program. It is meant to ensure that visiting-student rotation slots go to EM-bound applicants who are genuinely interested in a program or area. Some programs view it as the single most important piece of data they use to select which students to invite for rotations.

These statements are brief, but have the potential to set students apart from the crowd. This post gives a few basic guidelines for students writing letters of interest to apply for EM away rotations.

1. Follow directions!

This is possibly the simplest yet most important piece of advice for letters of interest. If a program has asked for certain questions to be answered, make sure the student’s letter answers them. If there is a word limit or guideline for length, stick to it.

2. Show interest

Students will have the opportunity to write a personal statement in their ERAS application; the letter of interest, by contrast, should be a true statement of interest in the program. It is an opportunity for students to say why they want to rotate at a program, in the area, or both.

Some questions students may consider are:

  • What are they hoping to gain from the rotation experience?
  • Are they looking for more exposure to urban EM vs. community EM?
  • Do they want to be near family/friends/ support system?
  • What is their potential connection to the rotation site?
  • Are they interested in sub-specialties a program may offer (toxicology, ultrasound, EMS, etc.)?

The letter of interest allows some excellent candidates to rise above the objective grading system through their sheer interest in the program. It is helpful for programs to see who genuinely has an idea of why they would want to spend time rotating there – so make sure students use this section of their application to full advantage.

3. Highlight yourself

Have the student briefly highlight anything of interest about themselves, whether prior accomplishments or future goals. Based on what they identify about themselves, the letter of interest can also provide a sense of “fit” for the program.

4. Show commitment

Students can describe “why EM” and touch on the variety of patients, procedures, leadership opportunities, and personal qualifications. It is important and okay to touch on this; however, programs understand that many students applying early in their MS3 year may not yet have had an EM rotation. Some have had only limited exposure through their EMIG, shadowing, or prehospital experience and are looking to the rotations to solidify their decision. Have students be prepared to delve further into this in their personal statement.

5. Keep it brief – less is more

Clerkship directors and program administrators will have many applications to read through, and ultimately things blend together. The student should emphasize quality over quantity!

If you have any more tips for students writing letters of interest for away rotations, leave them below!


Author: Ash Rider, MD, Highland Emergency Medicine R4, on behalf of the Advising Students Committee in Emergency Medicine (ASC-EM)

Congratulations! You have chosen the best specialty, filled with incredible individuals and a rewarding career. No matter where you end up for training, you will graduate an ED doc prepared to handle anything, and that is SO special. The question in front of you—where do you go to meet that end?

You just interviewed at many different training programs, perhaps all over the country. You’ve chatted with hundreds of residents and given your most heartfelt responses in interview after interview. You’ve sat in airports for hours and spent more money than you’d like to admit on travel costs. Hopefully along the way you enjoyed visiting new cities or catching up with friends you haven’t seen in years. It was fast-paced and exhausting while you were at it, but now the more challenging task is at hand: reflecting on your experiences and crafting your rank list. I write this as a fellow trainee in the hope it helps you sift through your thoughts and impressions of which programs are your “best fits.” Part of that process is defining your own fit, as well as your personal and professional priorities for the next 3 to 4 years. If you are having difficulty creating the layers of your rank list, I hope that answering these three simple Reflection Questions (3RQ) will shed some light on where you will best flourish as an emergency medicine resident.

RQ #1: Where do you want to be?

Are you someone who prefers to be near family and longstanding friends? What would it be like to go months to years without having these people near you? If that is not a future you can envision, then look no further – you’re meant to stay put, so emphasize local programs. For those with a partner obligated to a particular city, this is also an easy question to answer. Perhaps you’ve built a strong professional community locally that provides many opportunities during residency and beyond. These are some of the many reasons a nearby program may be important to you.

If you are not committed to a particular place, is there somewhere NEW you’ve always wanted to live? I would argue that residency is one of the best opportunities to try living somewhere different. You will be instantly blessed with a group of friends who share a similar mindset, and will cherish becoming close with your class during intern year.

If you are committed to going somewhere new, think about the differences in personality between the East Coast and the West Coast, the Midwest and the South. With each, consider the weather and access to the things you enjoy. If you’ve never lived in the Midwest or on the East Coast, just know that scraping ice off a windshield at -10 degrees in falling snow may be new to you. If you can’t live without mountains or skiing, Texas may not be your best fit. Love the ocean, an avid surfer? Maybe an inland state doesn’t match your profile. Do you adore the changing seasons, fall colors, and snowfall? California’s temperate climate may not keep you entertained.

Down to the nitty gritty within an area—do you need to be in a city? How big? Houston and NYC are both big cities, but the living and commuting experience is quite different. Or do you prefer a rural experience because that is the setting in which you anticipate working some day?

Most likely, many graduates from your program will stay in the area for jobs afterwards. Your alumni network connections will be important for your career after residency. If you left Maryland when you were 18 but plan on moving back to raise a family, residency might be a good springboard for building professional relationships. Asking yourself where is a good first question, but is often the easiest to answer. Ready for more?

RQ#2: What about your clinical experience matters to you?

There are many elements to consider. While it is true that all residency programs provide excellent emergency training, each program is distinct with unique strengths.

Take a look at the curriculum. How many off-service rotations would you do, and when in your training do they occur? If you are invested in Women’s Health, having more than 1 week of OBGYN will be important. If you are critical-care bound, is there adequate ICU exposure and will you be the senior resident in the ICU? Is there enough longitudinal pediatric exposure for you to see manifestations of illness in any season? If you are planning on working at a community site without orthopedic, ENT, or plastic surgery support, do you have ample opportunity to rotate on these services, spend elective time in specialty clinics, and practice in clinical settings without heavy support from these subspecialties? Likewise, if you foresee yourself doing a fellowship in administration, global health, EMS, wilderness, or sports medicine, is there flexibility to pursue these electives? Are there faculty mentors or recent grads to guide you in projects related to your interests? If a fellowship, a particular work setting, or a career trajectory is important to you, take a look at the recent alumni and ask yourself if they mirror the type of practicing physician you want to be.

Residency is in part about your training, but is more importantly about the patients. Your job is to learn, provide excellent patient care, and advocate when necessary. Ask yourself, what is the patient population like at the hospital? Is it diverse enough for you? Insured or uninsured? Do you see the entire spectrum of ages? Will you have the opportunity to take care of critically ill, stroke, STEMI, and trauma patients and be in a role where you’ll be taking a high level of responsibility? Moreover, is the residency program at just one site or do you have the opportunity to explore different practice patterns within emergency medicine? A breakdown that includes county, academic, and community settings is ideal. The VA, Kaiser, and military hospitals are other potential settings, while a children’s hospital is a must for every EM trainee. The thoughtful inclusion of multiple institutions is something that will pay dividends when you graduate, by not only priming you to adapt to multiple settings but also informing your future career.

Finally, in a gestalt sense, where would you not need convincing of the excellent clinical experience? Ultimately residency is about becoming the best physician that you can be. Are grads prepared to work in any setting after 3-4 years of training?  You only get to be a trainee for a finite period of time, and while we are all life-long learners, you want to graduate feeling prepared to handle anything that comes your way.

RQ #3 – What residency personality did you jibe with?

This is the hardest to put into words. Everyone in EM is awesome, right? It might be the most important of the 3RQ, as it is directly related to your happiness and feeling of connection with a group of people. Each program might attract a slightly different personality. Who stands out?

Think back to the socials and mixers throughout the year. Where did you laugh, have the most fun or feel the most “at home” with the residents? Where did you have the most stimulating conversations? How did residents interact with each other? Those people will be your seniors, teachers, and lifelong friends.  Culture is perpetuated.

In their free time, do residents seem to enjoy hanging out with each other and do they have the schedule flexibility to do so? Ask about the projects and research that residents and attendings are involved in. Are they excited about it or are they begrudgingly completing a scholarly requirement? Do they leave the type of legacy that you care about? From critical care to social emergency medicine, there is a breadth of areas of interest and intellectual pursuits, but particular programs may attract individuals aligned with their areas of excellence, or have the resources to better support certain types of projects.

How about the interview day? What were your conversations about? Did the interviewers take interest in you? The questions you were asked might in part reflect values that the program holds dear, so think back to those conversations that struck a chord. Did the residency leadership seem present and fully committed to the program? Leadership is everything, so be sure the program directors prioritize the residents and are personally invested in each individual’s success.

After the interviews, did you stay for conference or spend time in the ED? Was conference engaging and relevant? Did residents and faculty seem like they value that protected time to learn, or did it strike you as a chore? If there were M&M presentations by residents, did the audience support the speaker or attack their actions? Did the learning objectives seem to challenge and push listeners in a new way? The point here is to identify the culture of education. An intellectually curious group invested in being excellent clinicians is going to propel you into being a great ED doc.

On shift in the ED, how much autonomy do the residents have over their patients? You want more independence than handholding, but the right amount of oversight. Are senior residents pushed to see many patients per hour and are they thinking about flow, or is that left for the attending? In codes or airway management, how often does the attending take over? What about handoffs? Sign-out should not be a dumping ground for unfinished tasks punted to the next provider. Within reason, a good sign-out reflects respect for your fellow ED colleague. Also think about how the residents interact with each other on shift, with the nursing staff, and with their attendings. Is there a general sense of camaraderie or is the structure hierarchical?

Finally, and most importantly, how do the residents treat their patients? On one end of the spectrum are shotgunned orders and 30 seconds of face time with no bedside manner. On the other is the compassionate, thorough, yet efficient physician who treats every patient with the utmost respect, addresses all needs, and practices evidence-based medicine. The culture needs to hold patient care and quality of training in the highest esteem.

Every applicant approaches his or her rank list differently. Some kept a running rank list while interviewing, others created a spreadsheet of objective data, and some go simply on gut feeling. No matter your approach, if there is doubt about the order of your residency programs, ask yourself what is important to you and use this 3RQ framework. If you’ve gotten this far in this blog post and still feel it is abstract, consider this strategy: rate each program out of 10 – rate each of the 3RQ out of 3, then leave the final 1 point for overall “I would be thrilled to open this envelope on Match Day.” Use your totals to help you with the final rank list. It will come soon enough. Best of luck and welcome to EM!
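The out-of-10 scoring strategy is simple enough to sketch in a few lines. The program names, ratings, and function below are hypothetical, purely to show the arithmetic (three 3RQ ratings out of 3 each, plus a 1-point "thrilled on Match Day" bonus):

```python
def score(where, clinical, culture, thrilled):
    """Total a program's 3RQ ratings (0-3 each) plus a 1-point overall bonus, max 10."""
    for rating in (where, clinical, culture):
        assert 0 <= rating <= 3, "each 3RQ rating is out of 3"
    return where + clinical + culture + (1 if thrilled else 0)

# Hypothetical programs and ratings, for illustration only.
programs = {
    "Program A": score(3, 2, 3, True),
    "Program B": score(2, 3, 2, False),
    "Program C": score(3, 3, 2, True),
}

# Sort into a draft rank list, highest total first.
rank_list = sorted(programs, key=programs.get, reverse=True)
print(rank_list)
```

The totals are only a tiebreaker for your own reflection, not a substitute for it: when two programs score the same, the sort leaves them in the order you entered them, and that residual ambiguity is exactly where gut feeling should decide.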

Author: Jaron Soulek, MD (University of Oklahoma)

“Music is the Shorthand of Emotion”

This phrase, attributed to Leo Tolstoy, may not mean much to most people, but it surely does to the residents of the University of Tennessee HSC in Nashville. With growing emphasis nationwide on training residents in wellness and resiliency, the program leadership took advantage of a local “natural resource”: they hired a songwriter for a wellness activity.

This artist came to the program and spent several hours writing a song with the residents. A title was chosen and a chorus written, then residents broke into smaller groups to write verses. Note by note, they would arrange and rewrite. By the end of the day, a full song was prepared, which the artist then recorded and sent back to the program for their enjoyment.

Prior to the event, residents had mixed feelings about spending time writing a song. “There was admittedly a lot of apprehension when the email first came out… I can remember talking with the other residents and being worried about having to sing and perform in front of other people,” said Benjamin Rezny, DO. But those feelings dissipated when the artist played a few lines of a famous song he had written, surprising the residents, who quickly opened up and started having fun with it. Residents were soon relaxed, composing witty and humorous lyrics that eventually morphed into a more refined, serious song.

Resident wellness is a central part of the UT-Nashville EM program– it says so right on the front page of their website! While they do many of the mainstream wellness activities such as an intern welcome event, an annual resident retreat, and a resident wellness lecture series, they also have some more novel programs, including resident peer support group meetings, medical missions at home, and other frequent activities like the one detailed above. Zack Olson, MD, recalls a wellness-focused journal club held at an attending’s house. “There was very good discussion that occurred where attendings gave personal stories and anecdotes.”

Mary Jane Brown, MD, is an integral part of the wellness curriculum at this program. Olson states, “She is a true wellness champion. She is constantly trialing, fundraising, and asking for feedback. We are so lucky to have her.” This highlights the importance of a core faculty member becoming the wellness champion at their respective university. Should you be this champion?

“Wellness isn’t just going to the golf course” is a favorite line of the chairman of my department, Bo Burns, DO. As an avid golfer, at first I vehemently disagreed. But as I learned more, I realized that resiliency training means providing resources residents can employ when they inevitably face hardship or burnout. The songwriting activity does just that – it gives residents a creative outlet to decompress, whether from the daily grind of our profession or after a case that did not go as planned.
