1 Leeds Beckett University, School of Health and Community Studies, Leeds LS1 3HE, UK
2 SimSupport, York, UK
Simulation-based education (SBE) has become commonplace in healthcare education within hospitals, higher education institutions, the private healthcare sector, and private education providers. The standards and quality of delivery vary across the UK, leading to differing degrees of learning for healthcare professionals. This variance in standards makes research into the impact of SBE on the end user (the patient) difficult to measure.
The delivery of SBE needs to be of a high standard if learning via this pedagogy is to be maximised and benefits to patients are to be accurately assessed. This article aims to summarise the importance of quality within clinical SBE and how it can be achieved and maintained to produce a measurable impact on patient care. The current progress of the implementation of UK national standards for SBE is included to highlight the need for standardisation and guidance to support simulation centres and individuals to benchmark practice and work towards accreditation through quality measurement and monitoring processes. Suggestions are made on how such standards will affect the future of SBE and all those involved.
There is a clear need for the development of national standards for SBE delivery and for a stepped approach (i.e. minimum, intermediate, and advanced standards) depending on the size, capacity, and frequency of SBE delivery. Considerable financial outlay will be required to monitor standards effectively. The enhanced use of current and future technologies should be considered with regard to monitoring standards, as well as for data collection to support future research opportunities.
open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
* Address correspondence to this author at: Director of Clinical Skills and Simulation, Leeds Beckett University, School of Health and Community Studies, Leeds LS1 3HE, UK; Tel: +44 113 812 4484; E-mail: firstname.lastname@example.org
1. INTRODUCTION
Simulation can be defined simply as, “A tool, device and/or environment that mimics an aspect of clinical care” [2Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. Simul Healthc 2016; 11(4): 238-48. [http://dx.doi.org/10.1097/SIH.0000000000000150] [PMID: 27465839] ]. Its concept is not new and while its roots are firmly planted in the aviation industry, it has become an embedded pedagogy within healthcare education over the last few decades. This is primarily due to the published evidence supporting its effectiveness in a learning environment [3Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27(1): 10-28. [http://dx.doi.org/10.1080/01421590500046924] [PMID: 16147767] , 4Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today 2016; 46: 99-108. [http://dx.doi.org/10.1016/j.nedt.2016.08.023] [PMID: 27621199] ]. There is no doubt that if delivered effectively, it is of clear benefit to clinicians as far as performance is concerned [5McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010; 44(1): 50-63. [http://dx.doi.org/10.1111/j.1365-2923.2009.03547.x] [PMID: 20078756] ]. Appealing to a number of learning styles, simulation-based education (SBE) offers targeted learning experiences where knowledge, skills, and attitudes can be learned and refined within a safe and supportive environment [6Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care 2010; 19(Suppl 2): i34-43. [http://dx.doi.org/10.1136/qshc.2009.038562] [PMID: 20693215] ]. 
The ability to replicate specific clinical scenarios with immersive and interactive participation from learners (both individuals and teams) is a powerful tool with which technical and non-technical skills can be enhanced, clinical performance assessed, and care pathways and clinical processes tested and refined [7Corvetto MA, Taekman JM. To die or not to die? A review of simulated death. Simul Healthc 2013; 8(1): 8-12. [http://dx.doi.org/10.1097/SIH.0b013e3182689aff] [PMID: 22960702] , 8Huddy JR, Weldon S-M, Ralhan S, et al. Sequential simulation (SqS) of clinical pathways: A tool for public and patient engagement in point-of-care diagnostics. BMJ Open 2016; 6(9): e011043. [http://dx.doi.org/10.1136/bmjopen-2016-011043] [PMID: 27625053] ]. The literature views SBE positively [9Alinier G, Hunt WB, Gordon R. Determining the value of simulation in nurse education: Study design and initial results. Nurse Educ Pract 2004; 4(3): 200-7. [http://dx.doi.org/10.1016/S1471-5953(03)00066-0] [PMID: 19038158] -18Wong TK, Chung JW. Diagnostic reasoning processes using patient simulation in different learning environments. J Clin Nurs 2002; 11(1): 65-72. [http://dx.doi.org/10.1046/j.1365-2702.2002.00580.x] [PMID: 11845757] ], claiming benefits of this pedagogy that include the following:
Increasing patient safety
Developing critical thinking, diagnostic reasoning, and decision making
Enhancing teaching of non-technical skills
Increasing participants’ satisfaction with the learning experience
Potentially reducing demands on clinical placement providers for undergraduate students
While one would assume that the ripple of success of SBE in clinical education would continue downstream to benefit the quality of patient care, there is limited published evidence to support this. McGaghie et al. [19McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ 2014; 48(4): 375-85. [http://dx.doi.org/10.1111/medu.12391] [PMID: 24606621] ] suggest that research to date has focused on measuring learner feedback on the SBE activity itself and measuring the impact of SBE on learners’ knowledge and skills. Research should now focus more on determining the impact on patient outcomes and the wider public health agenda, as well as on skill and knowledge retention over time. This pattern of research is likely due to SBE being a relatively new pedagogy compared with others, and follows translational learning and research models [20Rubio DM, Schoenbaum EE, Lee LS, Schteingart DE, Marantz PR, Anderson KE, et al. Defining translational research: Implications for training. Acad Med 2010; 85(3): 470-5.]. A review of the literature supports the above claim. In contrast to the amount of evidence available supporting the impact of simulated practice on healthcare professionals’ education, there is relatively little research demonstrating that this learning translates into improvements in patient outcomes. The few studies that have been published focus on secondary care with an emphasis on the medical workforce [21Barelli A, Biasucci DG, Barelli R. Is simulation efficient to improve anesthetists’ performance and patient outcome? Minerva Anestesiol 2012; 78(5): 628. [PMID: 22318403] -25Shear TD, Greenberg SB, Tokarczyk A. Does training with human patient simulation translate to improved patient safety and outcome? Curr Opin Anaesthesiol 2013; 26(2): 159-63. [http://dx.doi.org/10.1097/ACO.0b013e32835dc0af] [PMID: 23339975] ].
Findings range from unequivocal improvements to small, statistically non-significant positive changes in patient outcomes, and focus on detecting latent error as well as driving forward quality improvement processes. This positive correlation with improved patient outcomes appears to strengthen when team training is utilised. Riley et al. [26Riley W, Davis S, Miller K, Hansen H, Sainfort F, Sweet R. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Jt Comm J Qual Patient Saf 2011; 37(8): 357-64. [http://dx.doi.org/10.1016/S1553-7250(11)37046-8] [PMID: 21874971] ] implemented team simulation training with the intention of reducing birth trauma within a community hospital; their results showed a 37% drop in trauma following the training. Smith et al. [27Smith A, Siassakos D, Crofts J, Draycott T. Simulation: Improving patient outcomes. Semin Perinatol 2013; 37(3): 151-6. [http://dx.doi.org/10.1053/j.semperi.2013.02.005] [PMID: 23721770] ] report that team training using simulation has improved perinatal care and outcomes, decreased litigation claims, and reduced midwifery sick leave. Statistically significant improvements in the quality of clinical care delivered were also demonstrated following advanced cardiac life support training for medical residents [28Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case-control study. Chest 2008; 133(1): 56-61. [http://dx.doi.org/10.1378/chest.07-0131] [PMID: 17573509] ].
Published systematic reviews [29Merchant DC. Does high-fidelity simulation improve clinical outcomes? J Nurses Staff Dev 2012; 28(1): E1-8. [http://dx.doi.org/10.1097/NND.0b013e318240a728] [PMID: 22261910] -31Braga MS, Tyler MD, Rhoads JM, et al. Effect of just-in-time simulation training on provider performance and patient outcomes for clinical procedures: A systematic review. BMJ Simulation and Technology Enhanced Learning 2015.] support the above findings. Zendejas et al. [32Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med 2013; 28(8): 1078-89. [http://dx.doi.org/10.1007/s11606-012-2264-5] [PMID: 23595919] ] looked at 50 studies comparing the outcomes of simulated practice with no intervention or non-simulated instruction; patient outcomes were enhanced but the improvement did not reach statistical significance. Surprisingly, studies demonstrated that using SBE for assessments related to patient outcomes works better in the early career years or for experienced clinicians but does not appear to be as effective for those in mid-training [33Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: A systematic review and meta-analysis. Acad Med 2015; 90(2): 246-56. [http://dx.doi.org/10.1097/ACM.0000000000000549] [PMID: 25374041] ]. One potential explanation for this is the pressure to perform well: early-career clinicians are not yet expected to know everything, while experienced clinicians will have gained the required skills and knowledge over time and feel more comfortable in their role. This would fit with Benner’s [34Benner P. From novice to expert: Excellence and power in clinical nursing practice. Addison-Wesley, Menlo Park, California 1984.] concept of moving from novice to expert, where the competent practitioner in mid-career becomes more aware of their long-term goals and gaps in knowledge, thereby intensifying the pressure to achieve. Burrell and Bienstock [35Burrell DA, Bienstock JL. Improving patient outcomes through supervision and simulation. Med Educ 2015; 49(7): 647-9. [http://dx.doi.org/10.1111/medu.12762] [PMID: 26077210] ] make a valid point that we should not forget: competence is an individual characteristic. As such, learners should be treated as individuals, with recognition given to the fact that acquisition of skills will take differing lengths of time.
Braga et al. [31Braga MS, Tyler MD, Rhoads JM, et al. Effect of just-in-time simulation training on provider performance and patient outcomes for clinical procedures: A systematic review. BMJ Simulation and Technology Enhanced Learning 2015.] focused on just-in-time simulation (i.e. simulated training that took place shortly before the procedure was performed in the clinical setting). While learner performance was enhanced, there was no published evidence of a reduction in patient complications. While anecdotal evidence suggests that the pedagogy of SBE does in fact have a wide-reaching impact on patient outcomes, to prove and measure this, quality needs to be achieved and maintained in two key areas: the SBE activity itself and the simulation-based research (SBR) processes utilised.
While the research mentioned above focuses on actual patient outcomes, these are often difficult to measure for healthcare educational establishments that are not associated with teaching hospitals. Interestingly, Brydges et al. [33Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: A systematic review and meta-analysis. Acad Med 2015; 90(2): 246-56. [http://dx.doi.org/10.1097/ACM.0000000000000549] [PMID: 25374041] ] undertook a systematic review and meta-analysis focusing on simulation-based assessments as surrogates for patient-related outcomes. They suggest that, if valid and reliable tools are used to measure these outcomes, this format of measuring SBE impact may become common practice in the future. This approach would certainly remove some barriers within this realm of SBE research.
This article aims to summarise the importance of quality within clinical SBE and how it can be achieved and maintained to produce a measurable impact on patient care, but to achieve “quality,” it must first be defined. Definitions abound, but all affirm that to measure quality, a benchmark or standard must be set against which to measure activity. The Oxford English Dictionary [36Oxford English Dictionary. Quality 2016. Available from: (https://en.oxforddictionaries.com/definition/quality) Accessed: 24 October 2016.] defines quality as, “the standard of something as measured against other things of a similar kind; the degree of excellence of something.” The GMC definition includes, “all the policies, standards, systems and processes that are in place to maintain and improve the quality of medical education” [37General Medical Council. Promoting excellence: standards for medical education and training. Available from: (http://www.gmc-uk.org/Promoting_excellence_standards_for_medical_education_and_training_0715.pdf_61939165.pdf) Accessed: 25 October 2016.]. Quality frameworks for SBE developed by regional networks/groups refer to a narrative of what good quality looks like, recognition of best practice, and a level of excellence to act as guidance for simulation and clinical skills providers and to drive quality improvement [38Health Education England North Region Quality Guidance Document 2016., 39North West Simulation Education Network Quality Assurance Framework 2011.]. Health Education England, in its latest Quality Framework document, refers to “a national and local ambition for quality in education and training” [40Health Education England. HEE Quality Framework 2016/17 2016. Available from: (https://hee.nhs.uk/sites/default/files/documents/Quality Framework.pdf) Accessed: 25 October 2016.]. In the words of Lord Kelvin, “If you cannot measure it, you cannot improve it” [41Kelvin L. Lord Kelvin Quotations n.d. Available from: (http://zapatopi.net/kelvin/quotes/) Accessed: 24 October 2016.].
2. QUALITY STANDARDS FOR SIMULATION-BASED EDUCATION
In 2012-13, the Association for Simulated Practice in Healthcare (ASPiH) [42Association for Simulated Practice in Healthcare. Association for Simulated Practice in Healthcare 2017. Available from: (http://www.aspih.org.uk/). Accessed: 1 May 2017.] conducted a National Simulation Development Project [1Anderson A, Baxendale B, Scott L, Mossley D, Glover I. The National simulation development project: Summary report 2014. Available from: (http://www.aspih.org.uk/static/aspihdjango/uploads/documents/general/national-scoping-project-summary-report.pdf) Accessed: 12 October 2016.], supported by Health Education England [43Health Education England. Health Education England 2016. Available from: (https://hee.nhs.uk/) Accessed: 26 October 2016.] and the Higher Education Academy [44Higher Education Academy. Higher Education Academy 2016. Available from: (https://www.heacademy.ac.uk). Accessed: 26 October 2016.]. The aim was to map the resources available and the application of SBE and technology-enhanced learning (TEL) across the United Kingdom. A key concern identified in this report was the need for national guidance on quality indicators and SBE standards of practice. Such guidance would need to be relevant, valuable, and easily accessible to an increasing number and breadth of organisations, departments, and individuals designing and delivering SBE.
As a direct result of the National Project, ASPiH established a standards committee which, consulting with educationalists, professionals, and experts in the field, developed draft standards for SBE [45Simulation Based Education in Healthcare: Standards for Practitioners. Draft 2016. Available from: (https://worldspanmedia.s3.amazonaws.com/media/aspihdjango/uploads/documents/standards-consultation/draft-standards-for-sbe.pdf). Accessed: 30 September 2016.]. Both the first and second consultations, supported by Health Education England, confirmed that for SBE to achieve its full potential, an agreed quality standard framework is required. The majority of UK simulation centres, educational institutions, and practitioners support this requirement for national standards [46Association for Simulated Practice in Healthcare Outcomes of the Second Consultation: National Standards for Simulation Based Education 2016.]. In our opinion, there is no doubt that their adoption and application would support and enhance the delivery of SBE, allowing for a more rigorous, consistent standard of practice and providing a benchmark to strive towards in order to achieve and maintain quality, parity, and inclusiveness. It is noteworthy that their development has triggered lengthy discussion around the use of the word “standard” and the mandatory connotations it may carry when compared with the standards of professional bodies, which define requirements for education, training, and patient safety, such as the General Medical Council’s Promoting excellence: standards for medical education and training [37General Medical Council. Promoting excellence: standards for medical education and training. Available from: (http://www.gmc-uk.org/Promoting_excellence_standards_for_medical_education_and_training_0715.pdf_61939165.pdf) Accessed: 25 October 2016.] and the Nursing and Midwifery Council’s Quality Assurance Framework Part Three: Assuring the Safety and Effectiveness of Practice Learning [47Nursing and Midwifery Council. Quality assurance framework for nursing and midwifery education and local supervising authorities 2015. Available from: (https://www.nmc.org.uk/globalassets/sitedocuments/edandqa/nmc-quality-assurance-framework.pdf). Accessed: 6 November 2016.]. In comparison, the Resuscitation Council is very clear with regard to terminology within its standards and compliance, using the terms must, should, and recommends, making it clear which elements are mandatory [48Resuscitation Council (UK). Quality standards for cardiopulmonary resuscitation practice and training 2016. Available from: (https://www.resus.org.uk/quality-standards/community-hospitals-care-quality-standards-for-cpr/) Accessed: 6 November 2016.]. ASPiH needs to be mindful of this in the context of its framework. If the mandatory implications are removed, the standards may take on a much greater aspirational and best-practice significance. A number of organisations and individuals have already expressed concern about the levels of attainment and the challenges and impact that working towards certain elements of the standards may have on their staffing, resources, and finances [49Gopal A, Baxendale B. ASPiH Simulation Based Education in Healthcare Standards for Practitioners Consultation summary report of round table discussion. ASPiH Annual Conference: Brighton. 2015.]. Interestingly, others counter this argument, feeling that introduction of the standards may in fact provide leverage for funding and for more adequate and appropriate resourcing. Hopefully, the latter will prevail.
The latest version of the ASPiH Standards Framework includes four themes: faculty, activity, resources, and technical personnel, with an overall aim to provide the “opportunity to associate high quality SBE with improvement in care quality outcomes and system improvement” [50Association for Simulated Practice in Healthcare and Health Education England. Simulation Based Education in Healthcare: Standards Framework. Available from: (https://worldspanmedia.s3.amazonaws.com/media/aspihdjango/uploads/documents/standards-consultation/standards-framework.pdf) Accessed: 27 November 2016.]. ASPiH is very cognisant of the development and use of regional frameworks [51Health Education Yorkshire and the Humber Quality Management of Clinical Skills and Simulation Training 2013. Available from: (http://login.qaclinicalskills.co.uk/files/QACSS%20Final%20Version%20May%202013.pdf). Accessed: 25 October 2016., 52North West Simulation Education Network. Accreditation of Education Using Simulation Based Learning 2016. Available from: (http://www.northwestsimulation.org.uk/mod/page/view.php?id=285) Accessed: 26 October 2016.] and of the availability of standards and processes for accreditation. In the United States there are currently two organisations that have developed standards, namely the International Nursing Association for Clinical Simulation and Learning (INACSL) Standards of Best Practice [53International Nursing Association for Clinical Simulation and Learning. Standards of Best Practice: Simulation 2015. Available from: (http://www.inacsl.org/i4a/pages/index.cfm?pageid=3407) Accessed: 26 October 2015.] and the Society for Simulation in Healthcare (SSH) Accreditation Standards [54Society for Simulation in Healthcare. Full Accreditation 2016. Available from: (http://www.ssih.org/Accreditation/Full-Accreditation) Accessed: 26 October 2016.].
Despite some UK organisations using these standards for reference, guidance, and partial adoption, the INACSL Standards of Best Practice do not include any of the environmental aspects of creating a simulated scenario or relevant quality assurance frameworks [55Parle J. West midlands central health innovation and education cluster: Simulation topic - final report. University of Birmingham 2013.]. There is currently no simulation accreditation process widely used in the UK. The SSH accreditation standards have substantial cost implications and no UK organisation has yet gone through the process [56SSH Accredited Programs. International 2016. Available from: (http://www.ssih.org/Accreditation/Programs-International). [Accessed 25 October 2016].].
There is no doubt that such ambition or aspiration for quality necessitates standards for SBE, but evidencing achievement and progressing to recognition through accreditation requires additional commitment and is regarded as the final step in most quality assurance processes [1Anderson A, Baxendale B, Scott L, Mossley D, Glover I. The National simulation development project: Summary report 2014. Available from: (http://www.aspih.org.uk/static/aspihdjango/uploads/documents/general/national-scoping-project-summary-report.pdf) Accessed: 12 October 2016.]. For most simulation centres and individuals, such an accolade will be preceded by a period of working towards the standards: improving practice, putting processes in place, and providing the evidence (i.e. measurement against the standards of SBE). In the long term, the measurement and monitoring activities will aim to drive quality improvement of SBE; however, compliance and delivering on such activities could be time-consuming and arduous. The second consultation exercise identified a variety of voluntary accreditation processes for the UK national standards in SBE, including benchmarking practices, online reporting, self- and peer review, and periodic face-to-face audit [46Association for Simulated Practice in Healthcare Outcomes of the Second Consultation: National Standards for Simulation Based Education 2016.]. Raising the standards of SBE delivery would, however, allow for a more robust research strategy to be implemented, enabling definitive outcome measures to be addressed.
3. QUALITY STANDARDS FOR SIMULATION-BASED RESEARCH
Despite the plethora of evidence supporting the use of SBE, Cheng et al. [2Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. Simul Healthc 2016; 11(4): 238-48. [http://dx.doi.org/10.1097/SIH.0000000000000150] [PMID: 27465839] ] claim that, in the health professions, educational research is often poorly designed and its findings inconsistently or poorly documented. They argue that many researchers utilise methodologies that reflect traditional educational research, whereas simulation-based research (SBR) has unique features that are often not considered in the designs or methodologies described. Research specific to SBE and healthcare has found that studies often omit aspects such as instructional design, setting the context, and outcomes [57Cook DA, Levinson AJ, Garside S. Method and reporting quality in health professions education research: A systematic review. Med Educ 2011; 45(3): 227-38. [http://dx.doi.org/10.1111/j.1365-2923.2010.03890.x] [PMID: 21299598] ]. A further review identified that only 3% of studies utilising a debriefing following SBE (a key element) documented the essential elements required [58Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ 2014; 48(7): 657-66. [http://dx.doi.org/10.1111/medu.12432] [PMID: 24909527] ]. For the appraiser of such research, these missing parts of the process can be a source of frustration. It could be argued that the inconsistent approach to the many elements of SBR reflects the inconsistent way that SBE is carried out in both NHS trusts and higher education institutions (HEIs). The introduction of the standards may encourage academics and researchers alike to consider the unique methodological challenges faced when carrying out SBR.
Cheng et al. [2Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. Simul Healthc 2016; 11(4): 238-48. [http://dx.doi.org/10.1097/SIH.0000000000000150] [PMID: 27465839] ] have contributed to solving this issue by suggesting additions to existing reporting guidelines that reflect the unique qualities of SBR.
It has been commented that SBE in the health setting has sprung up out of necessity rather than from a robust evidence base, with factors such as the changing face of the NHS and the increased demand for placements cited as potential drivers [59McKenna L, French J, Cross W. Prepare Nurses for the Future: Identify Use of Simulation, and More Appropriate and Timely Clinical Placement to Increase Clinical Competence and Undergraduate Positions: Final Report of Key Activities for Department of Human Services Nurse Policy Branch. Melbourne, Australia: Monash University 2007.]. Others argue that the main driving factor for SBE is patient safety [12Guimond ME, Sole ML, Salas E. Getting ready for simulation-based training: A checklist for nurse educators. Nurs Educ Perspect 2011; 32(3): 179-85. [http://dx.doi.org/10.5480/1536-5026-32.3.179] [PMID: 21834380] , 60Bradley P. The history of simulation in medical education and possible future directions. Med Educ 2006; 40(3): 254-62. [http://dx.doi.org/10.1111/j.1365-2929.2006.02394.x] [PMID: 16483328] -67Agency for Healthcare Research and Quality. Surgical simulation training improves speed and confidence in residents learning endoscopic sinus surgery. AHRQ Research Activities 2010; (360): 12.]. The drivers may differ depending on the clinical speciality. Whatever the drivers or motivations, the general consensus within the field appears to be that a consistent approach to this pedagogy needs to be adopted to ultimately ensure its quality. The standards are an attempt to develop this pedagogy and provide the consistent approach (along with an evidence base) that is currently missing. Historically, SBE has been developed by pockets of simulation enthusiasts, sometimes with very basic equipment and training. Despite the continued investment in SBE, equity of access to specialised centres is still recognised as a potential barrier to the development of this technique [68Al-Ghareeb AZ, Cooper SJ. Barriers and enablers to the use of high-fidelity patient simulation manikins in nurse education: An integrative review. Nurse Educ Today 2016; 36: 281-6. [http://dx.doi.org/10.1016/j.nedt.2015.08.005] [PMID: 26323885] ]. Perhaps there is a real risk that the standards could heighten this problem in the short term. They are a benchmark for what quality SBE should look like, and achieving some of them will inevitably require investment, not only in buildings and equipment but in ensuring that the facilitators (clinicians, educationalists, and learning technologists) delivering SBE are appropriately trained and supervised. Even those centres with the infrastructure to cope with the new demands will find adopting the standards a challenge. Careful consideration needs to be given to those centres that do not have the resilience to achieve the benchmark in the short term. The risk is that they carry on delivering SBE while (through no fault of their own) compromising some of the standards. A further risk is that these centres would not engage in future developments, potentially leading to independent SBE providers whose quality (in terms of the ASPiH standards) could not be assured.
In support of the patient safety agenda, Deutsch et al. [69Deutsch ES, Dong Y, Halamek LP, Rosen MA, Taekman JM, Rice J. Leveraging health care simulation technology for human factors research: closing the gap between lab and bedside. Hum Factors 2016; 58(7): 1082-95. [http://dx.doi.org/10.1177/0018720816650781] [PMID: 27268996] ] argue that SBE allows a unique opportunity to carry out research into human factors (HF) within healthcare. Human factors have been defined by Catchpole [70National Quality Board. Department of Health Human Factors Reference Group Interim Report, 1 March 2012. Available from: (https://www.england.nhs.uk/wp-content/uploads/2013/11/DH-rep.pdf) 2016.] as “Enhancing clinical performance through an understanding of the effects of teamwork, tasks, equipment, workspace, culture, and organisation on human behaviour and abilities and application of that knowledge in clinical settings.”
One of the most complex hurdles academics and researchers must overcome when undertaking research is gaining ethical approval, especially when dealing with patients. SBE provides the opportunity to undertake research in simulated clinical environments with members of the interdisciplinary team but without exposing real patients to any direct risks [71Deutsch ES. Simulation in otolaryngology: Smart dummies and more. Otolaryngol Head Neck Surg 2011; 145(6): 899-903. [http://dx.doi.org/10.1177/0194599811424862] [PMID: 21965444] , 72Kneebone R. Simulation in surgical training: Educational issues and practical implications. Med Educ 2003; 37(3): 267-77. [http://dx.doi.org/10.1046/j.1365-2923.2003.01440.x] [PMID: 12603766] ]. Deutsch et al. [69] argue that SBE offers the HF researcher several unique opportunities. At an organisational level, simulations can be used to observe how leaders at different levels respond to patient safety issues, how they apply policy and procedure, and how they assess risk [69]. SBE also allows potential risks to be identified and acted on before they cause harm, often referred to as latent risks [73Lok A, Peirce E, Shore H, Clark SJ. A proactive approach to harm prevention: Identifying latent risks through in situ simulation training. Infant 2015; 11(5): 160-3.]. SBE also provides the opportunity to develop innovative ways of working and problem solving, especially within complex teams [69Deutsch ES, Dong Y, Halamek LP, Rosen MA, Taekman JM, Rice J. Leveraging health care simulation technology for human factors research: closing the gap between lab and bedside. Hum Factors 2016; 58(7): 1082-95. [http://dx.doi.org/10.1177/0018720816650781] [PMID: 27268996] ].
4. CONCLUSION
In conclusion, SBE is here to stay, with professional organisations encouraging its use in their curricula and in clinical and educational practice [74-77]. In some areas, the investment in equipment, dedicated facilities, and personnel to support SBE has been significant. However, the drivers and standards guiding SBE have been focussed on those who have the infrastructure to support it rather than on robust methodologies and evidence. For SBE to be delivered in a quality-assured way (whatever the definition of quality), a benchmark standard is required against which a baseline can be achieved. Without such standards, a baseline will never be established and SBE will continue to be delivered by enthusiasts who, whatever their motivation or resources, may miss the bigger picture: providing high-quality education in an effective manner so that the benefits of SBE are maximised for the learners, their current and future employers, and the simulation centre or programme. One approach that could be adopted in the short term is a stepped approach, whereby individuals and centres delivering SBE must first meet a minimum set of criteria documented in the standards, with progression to higher levels of approval achievable as individuals and centres develop and investment increases. The difficulty lies in deciding what the minimum standards are; set them too high and those delivering SBE could be set up to fail before they begin.
Curran, cited in Riley [78], writes in reference to SBE that “the capability of the trainer as an educator limits or expands the effectiveness of the teaching; the more versatile and competent the trainer, the more likely they are to be effective”. This statement supports the notion that developing the faculty may be a sensible starting point. Ensuring that all faculty (educators, researchers, clinicians, and learning technologists) are aware of the learning theories that underpin not just traditional education but also the elements unique to simulation will help to provide a better learning experience, and may lead to new theories that have not yet been explored within education. A greater understanding of the pedagogy will give the researcher the insight to develop new or adapted methodologies to capture the unique data that SBE may provide. With continued advancement in technology, including system integration such as electronic medical records and programmes enabling the measurement of simulated patient and manikin parameters (proxy patient outcomes), education, training, and research within SBE are ideally placed to address the areas of practice where clinical errors are most prevalent (e.g. prescribing, patient monitoring) [79].
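To illustrate the idea of manikin parameters serving as proxy patient outcomes, the following is a minimal, hypothetical sketch in Python. All class and field names are assumptions for illustration only, not drawn from any real manikin vendor API: a scenario log records time-stamped vital signs during a simulation, from which a simple proxy outcome (time until a deteriorating heart rate was first detectable) can be derived for research analysis.

```python
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class VitalSample:
    """One time-stamped reading of simulated manikin parameters."""
    t: float        # seconds into the scenario
    heart_rate: int  # beats per minute
    sbp: int         # systolic blood pressure, mmHg


@dataclass
class ScenarioLog:
    """Collects manikin parameters during a simulated scenario."""
    samples: list = field(default_factory=list)

    def record(self, sample: VitalSample) -> None:
        self.samples.append(sample)

    def time_to_threshold(self, hr_limit: int):
        """Proxy outcome: seconds until heart rate first exceeded hr_limit,
        or None if it never did."""
        for s in self.samples:
            if s.heart_rate > hr_limit:
                return s.t
        return None


# Example: a deterioration scenario sampled every 30 seconds.
log = ScenarioLog()
log.record(VitalSample(t=0, heart_rate=82, sbp=120))
log.record(VitalSample(t=30, heart_rate=118, sbp=95))
log.record(VitalSample(t=60, heart_rate=131, sbp=88))

print(log.time_to_threshold(100))            # 30
print(mean(s.sbp for s in log.samples))      # mean systolic pressure
```

In practice such logs would be exported alongside debriefing and performance data, allowing researchers to correlate team behaviour with the simulated patient's trajectory.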
As in the aviation industry, simulation is rapidly becoming the industry standard for education and training. The key catalyst for its adoption in aviation was the clear link to enhanced pilot and passenger safety [80]. In healthcare, if such a link between SBE and improved patient outcomes can be established through robust research incorporating both the standards for SBE [45] and an enhanced research framework [2], its development is likely to be supported for years to come.
ETHICS APPROVAL AND CONSENT TO PARTICIPATE
Not applicable.
HUMAN AND ANIMAL RIGHTS
No animals or humans were used in the studies that form the basis of this research.
CONSENT FOR PUBLICATION
Not applicable.
CONFLICT OF INTEREST
The authors declare no conflict of interest, financial or otherwise.
Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27(1): 10-28. [http://dx.doi.org/10.1080/01421590500046924] [PMID: 16147767]
Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today 2016; 46: 99-108. [http://dx.doi.org/10.1016/j.nedt.2016.08.023] [PMID: 27621199]
Hope A, Garside J, Prescott S. Rethinking theory and practice: Pre-registration student nurses experiences of simulation teaching and learning in the acquisition of clinical skills in preparation for practice. Nurse Educ Today 2011; 31(7): 711-5. [http://dx.doi.org/10.1016/j.nedt.2010.12.011] [PMID: 21237536]
Issenberg SB, Pringle S, Harden RM, Khogali S, Gordon MS. Adoption and integration of simulation-based learning technologies into the curriculum of a UK undergraduate education programme. Med Educ 2003; 37(Suppl. 1): 42-9. [http://dx.doi.org/10.1046/j.1365-2923.37.s1.10.x] [PMID: 14641638]
Rubio DM, Schoenbaum EE, Lee LS, Schteingart DE, Marantz PR, Anderson KE, et al. Defining translational research: Implications for training. Acad Med 2010; 85(3): 470-5.
Barelli A, Biasucci DG, Barelli R. Is simulation efficient to improve anesthetists’ performance and patient outcome? Minerva Anestesiol 2012; 78(5): 628. [PMID: 22318403]
Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc 2010; 5(2): 98-102. [http://dx.doi.org/10.1097/SIH.0b013e3181bc8304] [PMID: 20389233]
Riley W, Davis S, Miller K, Hansen H, Sainfort F, Sweet R. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Jt Comm J Qual Patient Saf 2011; 37(8): 357-64. [http://dx.doi.org/10.1016/S1553-7250(11)37046-8] [PMID: 21874971]
Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case-control study. Chest 2008; 133(1): 56-61. [http://dx.doi.org/10.1378/chest.07-0131] [PMID: 17573509]
Braga MS, Tyler MD, Rhoads JM, et al. Effect of just-in-time simulation training on provider performance and patient outcomes for clinical procedures: A systematic review. BMJ Simulation and Technology Enhanced Learning 2015.
Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ 2014; 48(7): 657-66. [http://dx.doi.org/10.1111/medu.12432] [PMID: 24909527]
McKenna L, French J, Cross W. Prepare Nurses for the Future: Identify Use of Simulation, and More Appropriate and Timely Clinical Placement to Increase Clinical Competence and Undergraduate Positions: Final Report of Key Activities for Department of Human Services Nurse Policy Branch. Melbourne, Australia: Monash University 2007.
Gordon CJ, Buckley T. The effect of high-fidelity simulation training on medical-surgical graduate nurses’ perceived ability to respond to patient clinical emergencies. J Contin Educ Nurs 2009; 40(11): 491-8. [http://dx.doi.org/10.3928/00220124-20091023-06] [PMID: 19904861]
Deutsch ES, Dong Y, Halamek LP, Rosen MA, Taekman JM, Rice J. Leveraging health care simulation technology for human factors research: closing the gap between lab and bedside. Hum Factors 2016; 58(7): 1082-95. [http://dx.doi.org/10.1177/0018720816650781] [PMID: 27268996]