Exploring Student and Faculty Perceptions of Clinical Simulation: A Q-Sort Study
Abstract
Background:
Simulation learning has become a widely accepted and valuable methodology within nursing education. This study assessed whether student and faculty perceptions regarding simulation learning have changed since curricular integration of simulation activities within an undergraduate nursing program.
Method:
Q-methodology was used to identify unique and similar perspectives of 12 faculty and 21 students. Participants completed a brief demographic questionnaire and sorted statements related to beliefs about simulation-based learning.
Results:
Faculty perceptions were captured within one viewpoint—positive enthusiasts. Three student viewpoints were identified: challenge seekers, realistic embracers, and support seekers. Both students and faculty believed that simulation improved critical thinking.
Conclusion:
The findings suggest that faculty should be aware of the range of student perceptions and tailor their teaching approaches accordingly to maximize student learning. [J Nurs Educ. 2015;54(9):485–491.]
Introduction
In the past two decades, exponential growth has been seen in the integration of simulation-based learning to support teaching and learning in many health science programs. In nursing, simulation is now recognized as a widely accepted teaching methodology for effective clinical learning, supported by many documented learning outcomes (Bambini, Washburn, & Perkins, 2009; Decker, Sportsman, Puetz, & Billings, 2008; Lasater, 2007; O’Donnell, Decker, Howard, Levett-Jones, & Miller, 2014). To achieve such outcomes, nursing programs across North America, and indeed around the world, have invested substantial initial and ongoing resources to build and sustain simulation programs. This investment has included varying levels of simulation technology, notably high-fidelity manikins used to support clinical learning. Several studies have explored the use of high-fidelity manikins as an educational approach to enhancing competency related to patient safety (Burns, O’Donnell, & Artman, 2010; Durham & Aiden, 2008; Ironside, Jeffries, & Martin, 2009; Shearer, 2013). Additional studies have recommended best practices in the use of this teaching methodology (Harder, 2010; Nehring & Lashley, 2009). Although the evidence is not conclusive, there is some indication that simulation-based learning may enhance knowledge acquisition, critical thinking (Lapkin, Levett-Jones, Bellchambers, & Fernandez, 2010; Lasater, 2007), and student confidence (Bantz, Dancer, Hodson-Carlton, & Van, 2007; Reilly & Spratt, 2007).
Between 2003 and 2005, a substantial financial commitment was made to support the introduction of simulation learning in nursing education. Specifically, in Ontario, Canada, the Ministry of Health and Long-Term Care invested more than $2 million (Canadian) toward the use of this technology in programs of nursing. In late 2009, at the time when faculty and students were in the early stages of exploring this teaching–learning methodology, a province-wide evaluation of clinical simulation was conducted (Akhtar-Danesh, Baxter, Valaitis, Stanyon, & Sproul, 2009; Baxter, Akhtar-Danesh, Landeen, & Norman, 2012; Baxter, Akhtar-Danesh, Valaitis, Stanyon, & Sproul, 2009). In the ensuing years, nursing programs developed faculty expertise and pedagogical knowledge for the effective integration of high-fidelity simulation learning in nursing curricula. At the time of the current study, students were regularly immersed in high-fidelity simulation activities at all levels of the baccalaureate nursing program. As a research team, the authors thought it was important to understand what had changed since simulation was first implemented in the nursing program.
A review of the literature suggests that the perceptions and attitudes of faculty can make a difference in the implementation of simulation learning activities in nursing curricula (Lewis & Watson, 1997; Nehring & Lashley, 2004). Students’ perceptions of their engagement in the learning process have also been shown to influence student outcomes (Popkess & McDaniel, 2011). In addition, students have been found to view simulation as an opportunity to practice nursing interventions in a safe environment (Issenberg, McGaghie, Petrusa, Lee Gordon, & Scalese, 2005) and as a source of immediate feedback on their nursing practice (Dreifuerst, 2012). Simulation has also been reported to contribute to student anxiety during the learning process (Bantz et al., 2007; Cato, 2013; Ganley & Linnard-Palmer, 2012; McCaughey & Traynor, 2010; Pierazzo, 2014); thus, not all findings to date highlight positive perceptions of this technology. Given that the perceptions of faculty and students are important in the learning process and that simulation technology has become an essential component of the nursing curriculum at the authors’ institution, it was important to explore the impact of fully integrating simulation in the nursing program and to determine whether faculty and student perceptions had changed or remained the same. The purpose of the current study was to explore patterns of perceptions (commonly called viewpoints in Q-methodology research) about high-fidelity clinical simulation among students and faculty that contribute to, or inhibit, its applicability and implementation in undergraduate nursing education.
Background
At the time of the Ontario government’s financial investment in simulation for nursing education, initial province-wide studies of faculty and student perceptions of simulation were conducted using Q-methodology (Akhtar-Danesh et al., 2009; Baxter et al., 2009). In one of the studies, 28 faculty respondents from 17 institutions identified four major viewpoints (Akhtar-Danesh et al., 2009). Positive enthusiasts believed that simulation had great potential to support learning and, along with supporters, believed that it made learning in the clinical setting much more valuable. In addition, supporters believed that simulation provided opportunities for faculty to facilitate student critical thinking. They agreed that students viewed manikins as clients and disagreed that scheduling was a nightmare. Traditionalists believed that simulation enhanced, but did not replace, clinical experiences and stated that they would never support less clinical time. They did not think there were enough resources to develop realistic scenarios or that simulation helped students learn the nursing role. The final viewpoint was identified as help seekers, who expressed the need for more education for faculty on simulation and for a provincial repository of creative simulation applications. They also shared with traditionalists the view that their nursing programs lacked the human resources to fully integrate simulation within the curriculum and that simulation is time intensive for faculty because it is not built into their teaching time (Akhtar-Danesh et al., 2009).
In the second study, 24 students from 17 institutions participated in a Q-methodology study exploring their perceptions of simulation (Baxter et al., 2009). Overall, students were positive about simulation, and four major viewpoints were identified. Reflectors believed that simulation increased their awareness of their actual ability before working with live patients and assisted in minimizing anxiety. Reality skeptics, on the other hand, were concerned with what they perceived as limited contact with live patients when they had increased access to simulation. Comfort seekers found simulation to be stressful, especially when used for assessment. Neither the reality skeptics nor the comfort seekers perceived the manikins as realistic, and neither group felt that simulation increased their independence in the clinical setting. Finally, technology savvies were comfortable using the equipment, believed they could create their own scenarios, and thought that the capabilities of the manikins were not being fully utilized.
Since the completion of the Akhtar-Danesh et al. (2009) and Baxter et al. (2009) studies, there has been a substantial investment in faculty development and in trials of different approaches to integrating high-fidelity simulation in baccalaureate nursing curricula (Blum & Parcells, 2012). At McMaster University, a curricular implementation grant was obtained to further assist with this development and to identify best practice approaches to using high-fidelity simulation. In essence, the goal was to enhance the student learning experience with targeted high-fidelity simulation activities that complemented learning in real practice settings. The grant was used to support one faculty member (J.P.) to guide the pedagogical process for using simulation technology in the program and to hire one project coordinator to assist in implementing and evaluating the learning activities. Several faculty development sessions were held throughout the course of the grant. In addition, the McMaster School of Nursing released three faculty members from some of their teaching responsibilities to further develop expertise in using the technology in specialty areas such as maternal child and pediatrics. With educational research into the use of clinical simulation increasing exponentially, the team used this rich body of evidence as a platform to enhance learning in the undergraduate program. Thus, it was time to reexamine student and faculty perceptions of high-fidelity simulation learning.
Method
Q-methodology was used to identify the different, unique, and commonly shared viewpoints of students and faculty. In a Q-methodology study, the goal is to uncover distinct patterns of perception rather than their numerical distribution in the larger population. Q-methodology is particularly useful in research that explores human perceptions and interpersonal relationships (Chinnis, Summers, Doerr, Paulson, & Davis, 2001). The approach also allows the researcher to ascertain similarities and differences between groups. Furthermore, it provides a way of systematically examining, and gaining a greater understanding of, the connections between subjective statements (Akhtar-Danesh, Baumann, & Cordingley, 2008). Thus, it is an appropriate research methodology for exploring perceptions about clinical simulation.
In every Q-study, a representative list of statements, called the Q-sample, is developed. The current study developed two Q-samples based on the perceptions described by faculty and students in the two earlier studies (Akhtar-Danesh et al., 2009; Baxter et al., 2009). The statements were reviewed for relevancy against the current literature (Adamson, 2010; Berkowitz, Peyre, & Johnson, 2011; Bray, Schwartz, Weeks, & Kardong-Edgren, 2009; Crouch, 2010; Ganley & Linnard-Palmer, 2012; Guhde, 2011; Healey et al., 2010; Howard, Englert, Kameg, & Perozzi, 2011; Howard, Ross, Mitchell, & Nelson, 2010; Leonard, Shuhaibar, & Chen, 2010; Nehring & Lashley, 2009; Nguyen, Nierler, & Nguyen, 2011; Partin, Payne, & Slemmons, 2011; Reese, Jeffries, & Engum, 2010; Wotton, Davis, Button, & Kelton, 2010). In addition, an analysis of graduating students’ feedback about their experiences with simulation was included in the review of the currency of the statements. The Q-samples were then vetted for currency and validity by four students and four faculty members who had previously experienced high-fidelity simulation. The final Q-samples included 42 statements for students and 40 statements for faculty (Tables A–B; available in the online version of this article).
The Q-Sort Table
In a Q-methodology study, once the Q-sample is finalized, a Q-sort table (grid) with a quasi-normal distribution is developed, containing as many cells as there are statements in the Q-sample. For the current study, the authors developed two Q-sort tables: one for the faculty participants, with 40 cells, and one for the student participants, with 42 cells (Figure). Each Q-sort table had nine columns of differing lengths. Anchors of −4 (most disagree, or least agree) and +4 (most agree) were assigned to the extreme ends of the tables, and the columns were numbered sequentially from −4 to +4.

Figure. Q-sort tables for students and faculty.
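To make the grid structure concrete, the following is a minimal sketch in Python; no such code was part of the study. The column heights are hypothetical assumptions, not the exact distributions used for the faculty and student grids. The only constraints taken from the text are the nine columns anchored at −4 and +4 and a total cell count equal to the number of statements in each Q-sample.

```python
# Hypothetical sketch of nine-column, quasi-normal Q-sort grids.
# Column heights are illustrative assumptions; they must sum to the Q-sample size.

FACULTY_GRID = {-4: 2, -3: 3, -2: 5, -1: 6, 0: 8, 1: 6, 2: 5, 3: 3, 4: 2}   # 40 cells
STUDENT_GRID = {-4: 2, -3: 4, -2: 5, -1: 6, 0: 8, 1: 6, 2: 5, 3: 4, 4: 2}   # 42 cells

def validate_grid(grid, n_statements):
    """Check that a Q-sort grid runs from -4 to +4 and has one cell per statement."""
    assert sorted(grid) == list(range(-4, 5)), "columns must be numbered -4 to +4"
    assert sum(grid.values()) == n_statements, "cell count must equal Q-sample size"

validate_grid(FACULTY_GRID, 40)   # faculty Q-sample: 40 statements
validate_grid(STUDENT_GRID, 42)   # student Q-sample: 42 statements
```

In practice, each participant places exactly one statement card in each cell, which forces the completed Q-sort into the quasi-normal shape of the grid.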
After ethics approval for the study was obtained from the Hamilton Integrated Research Ethics Board, student and faculty participants were recruited using similar approaches. A confidential e-mail address was used to correspond with potential student and faculty participants who had responded to mass e-mail requests and to recruitment flyers posted specifically for each group. This confidential e-mail address was monitored by undergraduate nursing students who were completing a research practicum that required them to act as research assistants on existing research projects. The use of student research assistants was purposeful, to address any potential perceived power relationships, as the authors were colleagues of the faculty participants. In addition, the investigators did not communicate with any student participants with whom they had a current or previous teaching relationship, nor were they present during data collection. Participants were asked to sort the statements from strongly agree to strongly disagree, using the Q-sort tables and written instructions. They also completed open-ended statements explaining their rationale for their strongly agree and strongly disagree ratings, as well as a brief demographic questionnaire to facilitate description of the participant group.
Data Analysis
The PQMethod program, version 2.35 (Schmolck, 2014), was used for analysis. The completed Q-sorts were analyzed using a by-person factor analysis technique, in which factors are extracted across persons (represented by their Q-sorts) rather than across variables. Using this form of factor analysis, the authors were able to statistically identify groups (factors) of participants with similar viewpoints; each group therefore represents participants who have similar perceptions of, or experiences with, simulation. Next, the cumulative perception, or viewpoint, of each identified group was interpreted based on the statements specific to that group. These distinguishing statements define the uniqueness of each factor compared with the others and are used to describe the factor qualitatively (Akhtar-Danesh et al., 2008). In addition, statements with extreme scores (−4, −3 and +3, +4) were used in the interpretations. The specific distinguishing statements, labeled by their numbers in Table A (faculty statements) or Table B (student statements), and their corresponding extreme scores are reported in parentheses within the Results section.
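The analysis itself was run in PQMethod, but the core idea of by-person factor analysis can be sketched in a few lines. The example below is a simplified illustration under stated assumptions (principal components extraction from the person-by-person correlation matrix, an arbitrary loading cutoff of 0.5, and no factor rotation); it is not the exact PQMethod procedure, which typically adds steps such as varimax or judgmental rotation and significance tests for defining sorts.

```python
# Simplified sketch of by-person ("Q") factor analysis.
# sorts: array of shape (n_statements, n_participants); each column is one
# completed Q-sort with values from -4 to +4.
import numpy as np

def q_factor_analysis(sorts, n_factors=3, load_cutoff=0.5):
    # 1. Correlate persons with persons: columns (participants) are the variables.
    corr = np.corrcoef(sorts, rowvar=False)
    # 2. Extract factors from the correlation matrix (principal components here;
    #    rotation is omitted in this sketch).
    eigvals, eigvecs = np.linalg.eigh(corr)
    top = np.argsort(eigvals)[::-1][:n_factors]
    loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
    # 3. Group participants: a participant is assigned to the factor on which
    #    they load most highly, provided the loading exceeds the cutoff.
    groups = {f: [] for f in range(n_factors)}
    for person, row in enumerate(np.abs(loadings)):
        best = int(np.argmax(row))
        if row[best] >= load_cutoff:
            groups[best].append(person)
    return loadings, groups
```

Distinguishing and consensus statements are then identified by comparing the factor scores each group assigns to each statement, which is the information reported in the Results section and in the Table below.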
Table. Consensus Statements Across Student Viewpoints (Factor Scores)
No. | Statement | Factor 1 | Factor 2 | Factor 3
---|---|---|---|---
3 | After learning on a simulator, instead of saying “I think,” I said, “I know.” I was confident and felt like a nurse. | −3 | −1 | −3 |
7 | It’s a safe environment to learn and make mistakes in a situation that is fairly realistic. | 1 | 2 | 0 |
8 | If you’re exactly on the right spot, you hear it loud and clear. But if you’re off [the exact spot], you don’t hear anything at all. | 0 | 0 | 1 |
9 | It helps you recognize clinical abnormalities. | −1 | 0 | −1 |
17 | On the simulator, you can’t tell if you’re actually hurting them. You don’t get that realistic point of view about how that patient really feels. | 1 | 0 | 1 |
18 | It doesn’t make me that independent to the point where I could do this skill for the first time in a hospital by myself. | 2 | 1 | 2 |
26 | When you’re in the simulation laboratory, you’re taking it seriously to an extent, but it’s a dummy. You’re more careful with a live person. | 2 | 1 | 1 |
28 | Repeating the simulation by seeing it again on the videotape was not helpful. | −2 | −3 | −2 |
35 | We need more academic preparation before using the manikins. | 0 | −1 | −1 |
36 | With the manikin, there’s no critical thinking whatsoever; we don’t actually use our brains, in a way. We’re just kind of listening. | −4 | −4 | −4 |
39 | You have to get into a certain mind set to get comfortable in the simulation environment. | 0 | −1 | 0 |
Table A. Faculty Statements
No. | Statement
---|---
1 | The high-fidelity manikins are pretty close to reality.
2 | Even though students say it was very useful to them, we may not necessarily have seen that translate in the clinical setting. |
3 | It’s a scheduling nightmare. |
4 | It is not hard to think of an appropriate scenario, but then it takes time to make it believable in the laboratory. |
5 | We need more education for faculty on simulation. |
6 | We need to learn more about resources we can tap into to help us attain scenarios. |
7 | The novelty of the simulation environment, especially high-fidelity (SimMan®), is captivating. |
8 | Some of the physical aspects of the manikin (skin tone, range of motion) are hard to believe. |
9 | After using simulations, students are able to adapt to the clinical setting much better than coming in cold. |
10 | Simulations help students learn psychomotor skills. |
11 | Simulation is an enhancer, not a replacement to clinical. I would never support less clinical over these laboratories or scenarios. |
12 | It makes learning in the clinical setting much more valuable. If the students get used to seeing the simulators and related equipment and touching things and working with things in the laboratory, it will decrease their anxiety when they’re in the clinical setting. |
13 | Simulations can help students learn how to collaborate with members of an interprofessional team. |
14 | Simulations can help students learn how to collaborate with other nurses. |
15 | The learning curve to use simulation technology is a huge stumbling block. You have to be motivated to use it. |
16 | Limitations on space and equipment in the laboratory make it very difficult to fully simulate the clinical experience. |
17 | We need to better understand how to use the space and equipment in the laboratory. |
18 | Using simulations does build students’ confidence; when they are in clinical they’re not afraid to speak out about their findings. |
19 | Because of simulation, we can make better use of the learning time students have in the clinical setting. |
20 | Simulation provides a safe environment for learning. |
21 | Simulations help students learn about communication. |
22 | Simulations help students in learning clinical decision making. |
23 | I think the purpose of the manikins is for complex skills like resuscitation, not just going in to practice. We have other resources to practice basic skills. |
24 | Simulations can help prepare students for community health placements. |
25 | We need enough clinical instructors who have received appropriate training for simulation teaching. |
26 | I think simulation can help with remediation. |
27 | Simulations help students get more comfortable with the role of the nurse. |
28 | Simulation fills the gap because you are not always going to have a chance to perform it in clinical. If we can simulate it, the students can at least get a feel for it. |
29 | Going out to clinical weeds out the people who shouldn’t be nurses. You can’t get that from simulation. |
30 | On the simulator, students can’t tell if they’re actually hurting them. Students don’t get that realistic point of view about how that patient really feels. |
31 | Overall, the more students are using simulation, the more they’re starting to accept manikins as real people. |
32 | There are not enough human resources in our program to do this in a fully integrated way within the entire curriculum. |
33 | Simulations have great potential to support learning, and we’re looking toward using them to their fullest capacity. |
34 | Knowledge learned during simulation can be transferred to the clinical setting. |
35 | Simulation can assist learners to better understand concepts. |
36 | I have a more positive attitude about the use of simulation, having received training or become comfortable with its use. |
37 | Integrating simulation into curriculum requires a greater workload for my teaching. |
38 | Once I experienced my initial exposure to simulation, I was able to see the benefits of this teaching approach. |
39 | Learning experiences using simulation should be standardized for all students. |
40 | We need appropriate technological support in order for simulation to be successful in our program. |
Table B. Student Statements
No. | Statement
---|---
1 | It is incredibly helpful to have someone who works in the field debrief with us after completing a scenario.
2 | I find that it’s nice to have simulation experiences, because the nurses are not too helpful, especially during the first few weeks of clinical. |
3 | After learning on a simulator, instead of saying “I think,” I said, “I know.” I was confident and felt like a nurse. |
4 | It’s nice to practice your first time using simulators without patients looking at you. |
5 | We didn’t have much time on the simulators; we had to wait for our group to participate, even though we were in the laboratory for hours. |
6 | It would be nice to feel more welcomed when coming to use the simulation laboratory. |
7 | It’s a safe environment to learn and make mistakes in a situation that is fairly realistic. |
8 | If you’re exactly on the right spot, you hear it loud and clear. But if you’re off [the exact spot], you don’t hear anything at all. |
9 | It helps you recognize clinical abnormalities. |
10 | If you had a bad day in the actual clinical setting, you could always go back to the simulation laboratory to regain your confidence. |
11 | It increases your awareness in terms of your actual ability and makes you realize your strengths and weaknesses and shows you where you need to improve before actually working on real live patients. |
12 | Simulations can help me prepare for community health placements.
13 | Simulations help me get more comfortable with the role of the nurse. |
14 | You don’t really learn many interpersonal skills from playing with a doll. |
15 | You don’t get a chance to come back and review what you’ve learned and try to integrate what you’ve learned because of a lack of time and accessibility. |
16 | The videotaping of the simulation was very stressful for me. |
17 | On the simulator, you can’t tell if you’re actually hurting them. You don’t get that realistic point of view about how that patient really feels. |
18 | It doesn’t make me that independent to the point where I could do this skill for the first time in a hospital by myself. |
19 | The simulation environment reinforces the importance of organization; although the manikins are not realistic, you can envision yourself being in a hospital setting. |
20 | I think it’s very stressful and overwhelming. |
21 | Having more variation and unpredictability would make simulations more realistic. |
22 | I am much more confident, because I practiced all the basic skills, so I am more prepared for the real world. |
23 | It gives you a chance to see things that you won’t see in clinical setting. |
24 | The novelty of the simulation environment, especially high-fidelity (SimMan®), is captivating.
25 | Complex scenarios provide us with amazing opportunities to be able to critically think and apply all that we know. |
26 | When you’re in the simulation laboratory, you’re taking it seriously to an extent, but it’s a dummy. You’re more careful with a live person. |
27 | It shouldn’t replace clinical; it should be in addition to it. |
28 | Repeating the simulation by seeing it again on the videotape was not helpful. |
29 | We need to manipulate the manikins and feel comfortable with them before we actually begin the scenario to maximize learning time. |
30 | Simulations can help students learn how to work in multidisciplinary teams. |
31 | You can’t replace the real world. Nursing students are not having sufficient access to real people; we need that real contact. |
32 | Faculty could relinquish some control over the equipment. We’re supposed to be independent learners, yet they don’t trust us to use the equipment in the laboratory. |
33 | I don’t see the manikins as mimicking a patient. That’s not what I’m using them for; I’m using them to practice my skills. |
34 | If you are failing, it gives you the opportunity to practice further and strengthen your skills. |
35 | We need more academic preparation before using the manikins. |
36 | With the manikin, there’s no critical thinking whatsoever; we don’t actually use our brains in a way. We’re just kind of listening. |
37 | The simulators continuously stop working, so that the experience is less significant than if they were working “correctly,” and the loud noise of the generators is irritating. |
38 | It’s a very good opportunity to learn in an environment where there are no risks to a living patient. |
39 | You have to get into a certain mind set to get comfortable in the simulation environment. |
40 | It helps to minimize the anxiety when you’re going to practicum, because you know what comes next. |
41 | I think it’s just making a habit of simulation; we realize that it is an option, but we just don’t make it a reality. |
42 | With the manikin I can take my time and really feel all the parts. |
Results
Participants (P-Set)
In total, 21 undergraduate nursing students (19 women and two men) from years two to four of one Ontario Bachelor of Science in Nursing program and 12 faculty members from the same program (11 women and one man) participated in this study. Student participants’ mean age was 22 years, with a standard deviation of 3.3 years (range = 19 to 31 years). First-year students were excluded from the study to ensure that participants had some experience with simulation learning. All but one student participant rated themselves as moderately or highly comfortable with high-fidelity clinical simulation. Eighteen students had experienced simulation as an adjunct to their clinical courses, and five students had also experienced high-fidelity simulation as an adjunct to theory classes.
The mean age of the faculty participants was 38.8 years, with a standard deviation of 10.9 years (range = 27 to 56 years). All faculty participants rated themselves as moderately or highly comfortable with using clinical simulation as a teaching modality. Their teaching experience ranged from 1 year to 20 years, with simulation teaching experience ranging from 6 hours to 5 years.
Faculty Perceptions
For faculty, only one major viewpoint emerged—positive enthusiasts. Faculty endorsed clinical simulation for providing a safe learning environment, which promotes deep learning and clinical decision making (20: +4). Although faculty acknowledged there were limits to the amount of realism possible with high-fidelity simulation (7: −3), they also saw its potential to enhance student learning (22: +4). In addition, they thought that knowledge learned during simulation is transferable to the clinical setting (34: +3) and promotes learning of complex and routine clinical skills (23: −4). Participants stated the following in their supplementary statements:
In simulation, teamwork and communication are essential to handling the situation presented. Simulation forces students to communicate and work together in order to provide quality care to the patient/mannequin.
I believe translation of knowledge from simulation to clinical practice is one of the main benefits.
Faculty were mindful of the need to support students during simulation-based learning, stating that “learners should be provided an environment where they feel comfortable to make mistakes.” One participant summarized an important faculty role:
A safe environment is created by the instructor. Students must treat sim[ulation] as a “real life” experience but they also need to be assured by the instructor that in sim[ulation], mistakes are expected and [are] okay. We can learn from our mistakes in sim[ulation]—at no cost or harm to real patients.
Student Perceptions
Based on a by-person factor analysis, three salient viewpoints emerged regarding the use of clinical simulation in learning during the Bachelor of Science in Nursing program. These three viewpoints included 20 students. One student did not load on any of these viewpoints and was excluded from any further comparative analysis. Viewpoints were named based on their distinguishing statements as challenge seekers, realistic embracers, and support seekers. No statistically significant association was found between these viewpoints and the demographic variables.
Viewpoint 1: Challenge Seekers. Ten students shared this viewpoint (two in year 2, one in year 3, and seven in year 4). They were distinguished by seeing the greatest potential for high-fidelity simulation. Not only did they value their simulation experiences, but they also wanted more variability and unpredictability in scenarios (21: +4) and had the greatest comfort level with computer technology in general. They envisioned additional learning opportunities with expanded roles and greater complexity in the scenarios (21: +4; 5: +3). One student expressed, “More variation and unpredictability would be more stressful but…it would be more realistic…. Sometimes it is just too obvious, we are waiting for something to happen.” For these students, additional, complex scenarios would be a welcome addition to their learning. The clinical simulation experiences were mostly acute care based, and scenarios involved other students taking on different interprofessional roles. However, students who were challenge seekers perceived that clinical simulation could not enhance their practice in community-based settings (12: −3) or their abilities in interprofessional practice (30: −4). They provided examples of how simulation could be modified or different scenarios could be developed:
The only other health care team member we contact via telephone is the “physician.” We don’t have the opportunity to work with physical therapy/occupational therapy, chaplain, or other members. If we could incorporate a sim[ulation] with various team members, it would be fantastic.
Viewpoint 2: Realistic Embracers. This group consisted of six students (three in year 2 and three in year 4). They perceived the value of simulation particularly as it related to hands-on learning and the ability to practice and make mistakes in a safe environment (38: +4). They did not have challenges with suspending disbelief, and they viewed simulation as enhancing their clinical capabilities (11: +4). They were not distracted by the technology (37: −3), and they disagreed with the statement that interpersonal skills could not be learned in simulation (14: −3). Of note, among the participants in this group were students who had experience in using clinical simulation as remediation for faculty-identified learning challenges. In addition, those students viewed simulation as an opportunity to rehearse the role of the nurse, as one student stated, “I think going through clinical scenarios allows students to really pull together all the information they’ve learned and be able to relate that to the clinical scenarios.” One student captured the main ideas of this viewpoint: “You don’t get the clinical skills by just reading the textbook. You need that hands-on experience. You can build confidence when working simulation lab[oratories] for benefit in your clinical setting.”
Viewpoint 3: Support Seekers. The four students who loaded on this viewpoint (three in year 2 and one in year 4) perceived that simulation was useful to their learning, but they valued human connections in learning. They were the most critical of the context of the learning, including the realism of the simulation, the environment, and the staff. They experienced anxiety during simulation-based learning and wanted additional faculty and staff support. They did not believe that experience with simulation decreased their anxiety in the clinical unit (40: −3) or that it helped with learning organizational skills (19: −3). They were highly critical of simulation learning if they did not receive the level of support they desired (1: +4; 6: +4). As one participant wrote, “I was unable to get a pulse from the patient because the pulse mechanics were on the other side. This is embarrassing and made me feel incompetent.” Students sharing this viewpoint were the only group concerned about the realism portrayed in simulation. One student expressed, “the manikins will never embody a human with emotions. I don’t see them as a person. These manikins are for empirical knowledge and are most valued as such.” These perceptions were echoed by another student: “Manikins don’t feel or move in the same way as real people. If you can’t hear what the manikins is saying through the speaker, you have no other clues as to how they’re feeling because their face and body language don’t change.”
Consensus Statements. Finally, a set of statements was rated similarly, in agreement or disagreement, across all student viewpoints (Table). For instance, a high degree of agreement existed among students that high-fidelity clinical simulation assisted in improving their critical thinking (statement 36) and that it was a safe learning environment in which they could make mistakes (statement 7). Further, participants concurred that videotaping their sessions was helpful to their learning (statement 28). Students commented that simulation “allows us to recognize [our] strengths and weaknesses” and that “there is no direct risk to a living patient; you have an opportunity to revise your practice before doing skills on a REAL patient.” One student expanded, “Manikins play a huge role in critical thinking. When you are finding abnormalities, you have the chance to think about the situation and what you, as a nurse, are going to do about it.” Simulation was also viewed as an opportunity to rehearse the role of the nurse, as this student commented, “Because it was very helpful. You saw yourself as a nurse, or what you perceive a nurse to look like and act. Your voice changes, you lean on bedrails. It’s really eye opening to see how you act. And what you want to change.”
Discussion
The faculty and student perceptions found in this study have shifted slightly from those found in similar studies conducted previously at the same institution, albeit with different participants (Akhtar-Danesh et al., 2009; Baxter et al., 2009). All faculty members in the current study were positive about the learning opportunities supported by clinical simulation, similar to the previously identified supporters. Although a larger sample size might have contributed to the emergence of additional faculty viewpoints, it is interesting that the viewpoints of the current participants did not include any traditionalists or help seekers, who were more critical of this teaching methodology, as found in the previous study (Akhtar-Danesh et al., 2009). The faculty participants in the current study had taken part in numerous faculty development activities and had considerable experience in teaching with simulation. They saw the educational possibilities and were far less concerned about the need to suspend disbelief, as they were more realistic about what could and could not be accomplished with the teaching methodology. For faculty, the passage of time and growing expertise may have led to a greater understanding of the strengths and limitations of using simulation as a teaching strategy, based on their understanding of the context and the amount of support actually required to address technical issues. They had come to appreciate the differences in learning that occur using simulated scenarios compared with live patients and saw time spent in simulation contexts as complementary to learning in real contexts.
Student perceptions of clinical simulation in the current study were more similar to those found in the previous Ontario-based study (Baxter et al., 2009) and to a recently published Q-sort study of student perceptions in Korea (Yeun, Bang, Ryoo, & Ha, 2014). In all of these studies, students were positive about simulation-based learning. However, subtle shifts or minor differences in perceptions were found across the student groups. In the current study, all students believed that simulation promoted critical thinking, whereas only some of the students—the reflectors (Baxter et al., 2009) and the adventurous immersion group (Yeun et al., 2014)—held that view in previously published research. In addition, students in the current study were not concerned about the use of videotaping, in contrast to the previous Ontario-based study (Baxter et al., 2009). These changes may have occurred as faculty developed more expertise in prompting critical thinking during debriefing sessions and in helping students feel safe while viewing videotapes of the experiences.
Some students indicated that they wanted more from simulation. In the current study, challenge seekers had additional creative ideas on how to expand simulation, similar to the previously identified technology savvies, who wanted to create their own scenarios (Baxter et al., 2009). This viewpoint was not reported in the study by Yeun et al. (2014). The consistency of this finding across the previous study and the current study suggests that there is a group of students who could help expand the boundaries of simulation learning if they were given the opportunity.
In all of these studies, some students found simulation-based learning to be highly stressful: the support seekers (current study), the comfort seekers (Baxter et al., 2009), and the constructive criticism group (Yeun et al., 2014). These students require additional support to decrease their anxiety in a hands-on learning experience. Faculty must be attuned to these students’ needs to help them move beyond the machinery and see the learning opportunities. In the current study, most of the students who held this viewpoint (three of four) were in the second year of their program and had only minimal exposure to high-fidelity simulation. Setting the tone of the environment, the experience, and the debriefing sessions is an important prerequisite so that these students can learn, regardless of their level in the program.
Limitations
Some limitations of this study are noted. The number of participants was limited, especially in the faculty group. Recruitment was a challenge, despite the incentive of entry into a draw for an Apple® iPad, and data collection sessions scheduled immediately before or after faculty meetings or student classes yielded disappointing response rates. With more faculty participants, additional faculty viewpoints might have been captured. Because participation in this study was necessarily voluntary, the sample may have overrepresented individuals who held strong views about high-fidelity simulation. Alternate methodologies, such as questionnaires, may have captured the general perceptions of a larger proportion of students and faculty. Finally, repeating this study across different institutions, with a range of experiences in faculty training and support for simulation-based learning, may provide additional insights.
Conclusion
Since the introduction of additional resources and faculty development, faculty perceptions of simulation learning have changed. Faculty are more positive about using high-fidelity simulation to support clinical learning. In addition, awareness of the importance of training and support for faculty who are less experienced with simulation learning has increased. Student perceptions remained largely positive toward clinical simulation, including groups embracing the challenge, realism, and critical thinking that the technology allows. However, some students remain skeptical and nervous about simulation learning. With this in mind, it is important for faculty to be aware of student variations in perceptions and provide learning environments that are safe and supportive, yet challenging.
References
- Adamson K. (2010). Integrating human patient simulation into associate degree nursing curricula: Faculty experiences, barriers, and facilitators. Clinical Simulation in Nursing, 6(3), e75–e81. doi:10.1016/j.ecns.2009.06.002
- Akhtar-Danesh N., Baumann A., Cordingley L. (2008). Q-methodology in nursing research: A promising method for the study of subjectivity. Western Journal of Nursing Research, 30, 759–773. doi:10.1177/0193945907312979
- Akhtar-Danesh N., Baxter P., Valaitis R.K., Stanyon W., Sproul S. (2009). Nurse faculty perceptions of simulation use in nursing education. Western Journal of Nursing Research, 31, 312–329. doi:10.1177/0193945908328264
- Bambini D., Washburn J., Perkins R. (2009). Outcomes of clinical simulation for novice nursing students: Communication, confidence and clinical judgment. Nursing Education Research, 30, 79–83.
- Bantz D., Dancer M.M., Hodson-Carlton K., Van H.S. (2007). A day-long clinical laboratory: From gaming to high-fidelity simulators. Nurse Educator, 32, 274–277. doi:10.1097/01.NNE.0000299476.57185.f3
- Baxter P., Akhtar-Danesh N., Landeen J., Norman G. (2012). Teaching critical management skills to senior nursing students: Videotaped or interactive hands-on instruction. Nursing Education Perspectives, 33, 106–110. doi:10.5480/1536-5026-33.2.106
- Baxter P., Akhtar-Danesh N., Valaitis R., Stanyon W., Sproul S. (2009). Simulated experiences: Nursing students share their perspectives. Nurse Education Today, 29, 859–866. doi:10.1016/j.nedt.2009.05.003
- Berkowitz L.R., Peyre S.E., Johnson N.R. (2011). Mobilizing faculty for simulation. Obstetrics & Gynecology, 118, 161–163. doi:10.1097/AOG.0b013e31821fd34d
- Blum C.A., Parcells D.A. (2012). Relationship between high-fidelity simulation and patient safety in prelicensure nursing education: A comprehensive review. Journal of Nursing Education, 51, 429–435. doi:10.3928/01484834-20120523-01
- Bray B., Schwartz C.R., Weeks D.L., Kardong-Edgren S. (2009). Human patient simulation technology: Perceptions from a multidisciplinary sample of health care educators. Clinical Simulation in Nursing, 5, e145–e150. doi:10.1016/j.ecns.2009.02.002
- Burns H.K., O’Donnell J.M., Artman (2010). High-fidelity simulation in teaching problem-solving to 1st-year nursing students: A novel use of the nursing process. Clinical Simulation in Nursing, 6, e87–95. doi:10.1016/j.ecns.2009.07.005
- Cato M.L. (2013). Nursing student anxiety in simulation settings: A mixed methods study (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (UMI No. 3568883)
- Chinnis A.S., Summers D.E., Doerr C., Paulson D.J., Davis S.M. (2001). Q methodology—A new way of assessing employee satisfaction. Journal of Nursing Administration, 31, 252–259. doi:10.1097/00005110-200105000-00005
- Crouch L. (2010). High-fidelity human patient simulation experiences and baccalaureate nursing students’ perceptions. Clinical Simulation in Nursing, 6, e111. doi:10.1016/j.ecns.2010.03.020
- Decker S., Sportsman S., Puetz L., Billings (2008). The evolution of simulation and its contribution to competency. The Journal of Continuing Education in Nursing, 39, 74–80. doi:10.3928/00220124-20080201-06
- Dreifuerst K.T. (2012). Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. Journal of Nursing Education, 51, 326–333. doi:10.3928/01484834-20120409-02
- Durham C., Aiden K. (2008). Enhancing patient safety in nursing education through patient simulation. In Patient safety and quality: An evidence-based handbook for nurses (Vol. 3, pp. 221–260). Rockville, MD: Agency for Healthcare Research and Quality.
- Ganley B.J., Linnard-Palmer L. (2012). Academic safety during nursing simulation: Perceptions of nursing students and faculty. Clinical Simulation in Nursing, 8, e49–e57. doi:10.1016/j.ecns.2010.06.004
- Guhde J. (2011). Nursing students’ perceptions of the effect on critical thinking, assessment, and learner satisfaction in simple versus complex high-fidelity simulation scenarios. Journal of Nursing Education, 50, 73–78. doi:10.3928/01484834-20101130-03
- Harder B.N. (2010). Use of simulation in teaching and learning in health sciences: A systematic review. Journal of Nursing Education, 49, 23–28. doi:10.3928/01484834-20090828-08
- Healey A., Sherbino J., Fan J., Mensour M., Upadhye S., Wasi P. (2010). A low-fidelity simulation curriculum addresses needs identified by faculty and improves the comfort level of senior internal medicine resident physicians with inhospital resuscitation. Critical Care Medicine, 38, 1899–1903. doi:10.1097/CCM.0b013e3181eb3ca9
- Howard V.M., Englert N., Kameg K., Perozzi K. (2011). Integration of simulation across the undergraduate curriculum: Student and faculty perspectives. Clinical Simulation in Nursing, 7, e1–e10. doi:10.1016/j.ecns.2009.10.004
- Howard V.M., Ross C., Mitchell A.M., Nelson G.M. (2010). Human patient simulators and interactive case studies: A comparative analysis of learning outcomes and student perceptions. CIN: Computers, Informatics, Nursing, 28, 42–48.
- Ironside P.M., Jeffries P.R., Martin A. (2009). Fostering patient safety competencies using multiple-patient simulation experiences. Nursing Outlook, 57, 332–337. doi:10.1016/j.outlook.2009.07.010
- Issenberg S.B., McGaghie W.C., Petrusa E.R., Lee Gordon D., Scalese R.J. (2005). Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher, 27, 10–28. doi:10.1080/01421590500046924
- Lapkin S., Levett-Jones T., Bellchambers H., Fernandez R. (2010). Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: A systematic review. Clinical Simulation in Nursing, 6, e207–e222. doi:10.1016/j.ecns.2010.05.005
- Lasater K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46, 496–503.
- Leonard B., Shuhaibar E.L., Chen R. (2010). Nursing student perceptions of intraprofessional team education using high-fidelity simulation. Journal of Nursing Education, 49, 628–631. doi:10.3928/01484834-20100730-06
- Lewis D., Watson J.E. (1997). Nursing faculty concerns regarding the adoption of computer technology. Computers in Nursing, 15, 71–76.
- McCaughey C.S., Traynor M.K. (2010). The role of simulation in nurse education. Nurse Education Today, 30, 827–832. doi:10.1016/j.nedt.2010.03.005
- Ministry of Health and Long-Term Care. (2005). Investing in clinical simulation equipment. Retrieved July 4, 2008, from http://www.health.gov.on.ca/english/media/news_releases/archives/nr_05/bg_111605.pdf
- Nehring W.M., Lashley F.R. (2004). Current use and opinions regarding human patient simulators in nursing education: An international survey. Nursing Education Perspectives, 25, 244–248.
- Nehring W.M., Lashley F.R. (2009). Nursing simulation: A review of the past 40 years. Simulation & Gaming, 40, 528–552. doi:10.1177/1046878109332282
- Nguyen D.N., Nierler B., Nguyen H.O. (2011). A survey of nursing faculty needs for training in use of new technologies for education and practice. Journal of Nursing Education, 50, 181–189. doi:10.3928/01484834-20101130-06
- O’Donnell J.M., Decker S., Howard V., Levett-Jones T., Miller C.W. (2014). NLN/Jeffries simulation framework state of the science project: Simulation learning outcomes. Clinical Simulation in Nursing, 10, 373–382. doi:10.1016/j.ecns.2014.06.004
- Partin J.L., Payne T.A., Slemmons M.F. (2011). Students’ perceptions of their learning experiences using high-fidelity simulation to teach concepts relative to obstetrics. Nursing Education Perspectives, 32, 186–188. doi:10.5480/1536-5026-32.3.186
- Pierazzo J. (2014). Learner anxiety and professional practice self-efficacy in nursing education. Electronic Thesis and Dissertation Repository. Paper 2367. http://ir.lib.uwo.ca/etd/2367
- Popkess A.M., McDaniel A. (2011). Are nursing students engaged in learning? A secondary analysis of data from the National Survey of Student Engagement. Nursing Education Perspectives, 32, 89–94. doi:10.5480/1536-5026-32.2.89
- Reese C.E., Jeffries P.R., Engum S.A. (2010). Learning together: Using simulations to develop nursing and medical student collaboration. Nursing Education Perspectives, 31, 33–37.
- Reilly A., Spratt C. (2007). The perceptions of undergraduate student nurses of high-fidelity simulation-based learning: A case report from the University of Tasmania. Nurse Education Today, 27, 542–550. doi:10.1016/j.nedt.2006.08.015
- Schmolck P. (2014). PQMethod (Version 2.35) [Computer software]. Neubiberg, Germany: University of the Bundeswehr Munich. Retrieved from http://schmolck.userweb.mwn.de/qmethod/#PQMethod
- Shearer J. (2013). High-fidelity simulation and safety: An integrative review. Journal of Nursing Education, 52, 39–45. doi:10.3928/01484834-20121121-01
- Wotton K., Davis J., Button D., Kelton M. (2010). Third-year undergraduate nursing students’ perceptions of high-fidelity simulation. Journal of Nursing Education, 49, 632–639. doi:10.3928/01484834-20100831-01
- Yeun E.J., Bang H.Y., Ryoo E.N., Ha E. (2014). Attitudes toward simulation-based learning in nursing students: An application of Q methodology. Nurse Education Today, 34, 1062–1068. doi:10.1016/j.nedt.2014.02.008