REVIEW ARTICLE


Simulation and Quality in Clinical Education



Ann Sunderland1, *, Jane Nicklin2, Andrew Martin1
1 Leeds Beckett University, School of Health and Community Studies, Leeds LS1 3HE, UK
2 SimSupport, York, UK


© 2017 Sunderland et al.

open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

* Address correspondence to this author (Director of Clinical Skills and Simulation) at Leeds Beckett University, School of Health and Community Studies, Leeds LS1 3HE, UK; Tel: +44 113 812 4484; E-mail: a.sunderland@leedsbeckett.ac.uk


Abstract

Background:

Simulation-based education (SBE) has become commonplace in healthcare education within hospitals, higher education institutions, the private healthcare sector, and private education providers. The standards and quality of delivery vary across the UK [1], leading to differing degrees of learning for healthcare professionals. This variance in standards makes the impact of SBE on the end user (the patient) difficult to measure.

Review:

The delivery of SBE needs to be of a high standard if learning via this pedagogy is to be maximised and benefits to patients are to be accurately assessed. This article aims to summarise the importance of quality within clinical SBE and how it can be achieved and maintained to produce a measurable impact on patient care. The current progress of the implementation of UK national standards for SBE is included to highlight the need for standardisation and guidance to support simulation centres and individuals to benchmark practice and work towards accreditation through quality measurement and monitoring processes. Suggestions are made on how such standards will affect the future of SBE and all those involved.

Conclusion:

There is a clear need for the development of national standards for SBE delivery, and for a stepped approach (i.e. minimum, intermediate, and advanced standards) depending on the size, capacity, and frequency of SBE delivery. Considerable financial outlay will be required to monitor standards effectively. The enhanced use of current and future technologies should be considered with regard to monitoring standards, as well as data collection for future research opportunities.

Keywords: Simulation, Quality improvement, Educational standards, Patient safety, Diagnostic reasoning, SBE activity.



1. BACKGROUND

Simulation can be defined simply as, “A tool, device and/or environment that mimics an aspect of clinical care” [2]. The concept is not new and, while its roots are firmly planted in the aviation industry, it has become an embedded pedagogy within healthcare education over the last few decades, primarily due to the published evidence supporting its effectiveness in a learning environment [3, 4]. There is no doubt that, if delivered effectively, it is of clear benefit to clinicians as far as performance is concerned [5]. Appealing to a range of learning styles, simulation-based education (SBE) offers targeted learning experiences where knowledge, skills, and attitudes can be learned and refined within a safe and supportive environment [6]. The ability to replicate specific clinical scenarios with immersive and interactive participation from learners (both individuals and teams) is a powerful tool with which technical and non-technical skills can be enhanced, clinical performance assessed, and care pathways and clinical processes tested and refined [7, 8]. SBE is viewed positively in the literature [9-18], with claimed benefits including:

  • Increasing patient safety
  • Developing critical thinking, diagnostic reasoning, and decision making
  • Enhancing teaching of non-technical skills
  • Increasing participants’ satisfaction with the learning experience
  • Potentially reducing demands on clinical placement providers for undergraduate students

While one would assume that the ripple of success of SBE in clinical education would continue downstream to benefit the quality of patient care, there is limited published evidence to support this. McGaghie et al. [19] suggest that research to date has focused on measuring learner feedback on the SBE activity itself and measuring the impact of SBE on learners’ knowledge and skills. Research should now focus more on determining the impact on patient outcomes and the wider public health agenda, as well as skill and knowledge retention over time. This pattern of research is likely due to SBE being a relatively new concept in relation to other pedagogies, and it follows translational learning and research models [20]. A review of the literature supports the above claim. In contrast to the amount of evidence available supporting the impact of simulated practice on healthcare professionals’ education, there is relatively little research demonstrating that this learning translates into improvements in patient outcomes. The few studies that have been published focus on secondary care, with an emphasis on the medical workforce [21-25]. Findings range from unequivocal improvements to small, statistically non-significant positive changes in patient outcomes, and focus on detecting latent error as well as driving forward quality improvement processes. This positive correlation with improved patient outcomes appears to strengthen when team training is utilised. Riley et al. [26] implemented team simulation training with the intention of reducing birth trauma within a community hospital; their results showed a 37% drop in trauma following the training. Smith, Siassakos, Crofts and Draycott [27] report that team training using simulation has improved perinatal care and outcomes, decreased litigation claims, and reduced midwifery sick leave. Statistically significant improvements in the quality of clinical care delivered were also demonstrated following advanced cardiac life support training for medical residents [28].

Published systematic reviews [29-31] support the above findings. Zendejas, Brydges, Wang and Cook [32] looked at 50 studies comparing the outcomes of simulated practice with no intervention or non-simulated instruction; patient outcomes were enhanced but the difference did not reach statistical significance. Surprisingly, studies demonstrated that using SBE for assessments related to patient outcomes works better in early career years or for experienced clinicians, but does not appear to be as effective for those in mid-training [33]. One potential explanation for this is the pressure to perform well: early-career clinicians would not yet be expected to know everything, while experienced clinicians will have gained the required skills and knowledge over time and feel more comfortable in their role. This would fit with Benner’s [34] concept of moving from novice to expert, where the competent practitioner in mid-career becomes more aware of their long-term goals and gaps in knowledge, thereby intensifying the pressure to achieve. Burrell and Bienstock [35] make a valid point that we should not forget: competence is an individual characteristic. As such, learners should be treated as individuals, with recognition that acquisition of skills will take differing lengths of time.

Braga et al. [31] focused on just-in-time simulation (i.e. simulated training that took place shortly before the procedure was performed in the clinical setting). While learner performance was enhanced, there was no published evidence of reduced rates of patient complications. While anecdotal evidence suggests that SBE does in fact have a wide-reaching impact on patient outcomes, to prove and measure this, quality needs to be achieved and maintained in two key areas: the SBE activity itself and the simulation-based research (SBR) processes utilised.

While the research mentioned above focuses on actual patient outcomes, these are often difficult to measure for healthcare educational establishments that are not associated with teaching hospitals. Interestingly, Brydges, Hatala, Zendejas, Erwin and Cook [33] undertook a systematic review and meta-analysis focusing on simulation-based assessments as surrogates for patient-related outcomes. They suggest that, if valid and reliable tools are used to measure these outcomes, this format of measuring SBE impact may become common practice in the future. This approach would certainly remove some barriers within this realm of SBE research.

This article aims to summarise the importance of quality within clinical SBE and how it can be achieved and maintained to produce a measurable impact on patient care; but to achieve “quality,” it must first be defined. Definitions abound, but all affirm that to measure quality, a benchmark or standard must be set against which to measure activity. The Oxford English Dictionary [36] defines quality as, “the standard of something as measured against other things of a similar kind; the degree of excellence of something.” The GMC definition includes, “all the policies, standards, systems and processes that are in place to maintain and improve the quality of medical education” [37]. Quality frameworks for SBE developed by regional networks/groups refer to a narrative of what good quality looks like, recognition of best practice, and a level of excellence to act as guidance for simulation and clinical skills providers and drive quality improvement [38, 39]. Health Education England, in its latest Quality Framework document, refers to “a national and local ambition for quality in education and training” [40]. In the words of Lord Kelvin, “If you cannot measure it, you cannot improve it” [41].

2. QUALITY STANDARDS FOR SIMULATION-BASED EDUCATION

In 2012-13, the Association for Simulated Practice in Healthcare (ASPiH) [42] conducted a National Simulation Development Project [1], supported by Health Education England [43] and the Higher Education Academy [44]. The aim was to map the resources available and the application of SBE and technology-enhanced learning (TEL) across the United Kingdom. A key concern identified in this report was the need for national guidance on quality indicators and SBE standards of practice. Such guidance would need to be relevant, valuable, and easily accessible to the increasing number and breadth of organisations, departments, and individuals designing and delivering SBE.

As a direct result of the National Project, ASPiH established a standards committee which, consulting with educationalists, professionals, and experts in the field, developed draft standards for SBE [45]. Both the first and second consultations, supported by Health Education England, confirmed that for SBE to achieve its full potential, an agreed quality standards framework is required. The majority of UK simulation centres, educational institutions, and practitioners support this requirement for national standards [46]. In our opinion, there is no doubt that their adoption and application would support and enhance the delivery of SBE, allowing for a more rigorous, consistent standard of practice and providing a benchmark to strive towards in order to achieve and maintain quality, parity, and inclusiveness. It is noteworthy that their development has triggered lengthy discussion around the use of the word “standard” and the mandatory connotation that may be perceived if compared to the professional bodies’ requirements for education, training, and patient safety, such as the General Medical Council’s Promoting excellence: standards for medical education and training [37] and the Nursing and Midwifery Council’s Quality Assurance Framework Part Three: Assuring the Safety and Effectiveness of Practice Learning [47]. In comparison, the Resuscitation Council is very clear with regard to terminology within its standards and compliance, using the terms “must”, “should”, and “recommends” to make clear which elements are mandatory [48]. ASPiH needs to be mindful of this in the context of its framework: if the mandatory implications are removed, the standards may take on a much greater aspirational and best-practice significance. A number of organisations and individuals have already expressed concern about the levels of attainment and the challenges and impact that working towards certain elements of the standards may have on their staffing, resources, and finances [49]. Interestingly, others counter this argument and feel that introduction of the standards may in fact provide leverage for funding and more adequate and appropriate resourcing. Hopefully, the latter will prevail.

The latest version of the ASPiH Standards Framework includes four themes: faculty, activity, resources, and technical personnel, with an overall aim to provide the “opportunity to associate high quality SBE with improvement in care quality outcomes and system improvement” [50]. ASPiH is very cognisant of the development and use of regional frameworks [51, 52] and the availability of standards and processes for accreditation. In the United States, two organisations have developed standards: the International Nursing Association for Clinical Simulation and Learning (INACSL) Standards of Best Practice [53] and the Society for Simulation in Healthcare (SSH) Accreditation Standards [54]. Although some UK organisations use these standards for reference, guidance, and partial adoption, the INACSL Standards of Best Practice do not include the environmental aspects of creating a simulated scenario or relevant quality assurance frameworks [55]. There is currently no simulation accreditation process widely used in the UK; the SSH accreditation standards have substantial cost implications, and no UK organisation has yet gone through the process [56].

There is no doubt that such ambition or aspiration for quality necessitates standards for SBE, but evidencing achievement and progressing to recognition through accreditation requires additional commitment and is regarded as the final step in most quality assurance processes [1]. For most simulation centres and individuals, such an accolade will be preceded by a period of working towards the standards: improving, putting processes in place, and providing the evidence (i.e. measurement against the standards of SBE). In the long term, the measurement and monitoring activities will aim to drive quality improvement of SBE; however, compliance and delivering on such activities could be time-consuming and arduous. The second consultation exercise identified a variety of voluntary accreditation processes for the UK national standards in SBE, including benchmarking practices, online reporting, self and peer review, and periodic face-to-face audit [46]. Raising the standards of SBE delivery would, however, allow a more robust research strategy to be implemented, enabling definitive outcome measures to be addressed.
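
To make the idea of measurement against the standards concrete, the short Python sketch below records a centre’s self-assessment against the four themes of the ASPiH Standards Framework (faculty, activity, resources, and technical personnel [50]) and produces a simple benchmark summary. It is a hypothetical illustration only: the three-level rating scale and the reporting format are our own assumptions, not part of the published framework or of any of the consultation’s proposed accreditation processes.

  # Hypothetical self-assessment record against the four ASPiH Standards
  # Framework themes; the three-level rating scale and the summary format
  # are illustrative assumptions, not part of the published framework.
  THEMES = ["faculty", "activity", "resources", "technical personnel"]
  LEVELS = {0: "not yet met", 1: "working towards", 2: "met"}

  def summarise(assessment: dict) -> str:
      """Return a short benchmark summary for one centre's self-assessment."""
      lines = [f"{theme}: {LEVELS[assessment.get(theme, 0)]}" for theme in THEMES]
      met = sum(1 for theme in THEMES if assessment.get(theme, 0) == 2)
      lines.append(f"themes fully met: {met}/{len(THEMES)}")
      return "\n".join(lines)

  # Example: a centre with trained faculty that is still developing its
  # technical support provision.
  print(summarise({"faculty": 2, "activity": 1, "resources": 1,
                   "technical personnel": 0}))

Even a record as simple as this would allow a centre to evidence a period of “working towards” and to track progress between reviews.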

3. QUALITY STANDARDS FOR SIMULATION-BASED RESEARCH

Despite the plethora of evidence supporting the use of SBE, Cheng et al. [2] claim that in the health professions, educational research is often poorly designed and the findings are inconsistently or poorly documented. They argue that many researchers utilise methodologies that reflect traditional educational research, whereas simulation-based research (SBR) has unique features that are often not considered in the designs or methodologies described. Research specific to SBE and healthcare has found that studies have not included aspects such as instructional design, setting the context, and outcomes [57]. A further review identified that only 3% of studies utilising a debriefing following SBE (a key element) documented the essential elements required [58]. For the appraiser of the research, parts of the process are often missing, which can lead to frustration. It could be argued that the inconsistent approach to the many elements of SBR reflects the inconsistent way that SBE is carried out in both NHS trusts and higher education institutions (HEIs). The introduction of the standards may encourage academics and researchers alike to consider the unique methodological challenges faced when carrying out SBR. Cheng et al. [2] have contributed to solving this issue by suggesting additions to existing reporting guidelines that reflect the unique qualities of SBR.

It has been commented that SBE in the health setting has sprung up out of necessity rather than from a robust evidence base, with factors such as the changing face of the NHS and increased demand for placements cited as potential drivers [59]. Others argue that the main driving factor for SBE is patient safety [12, 60-67]; the drivers may differ depending on the clinical speciality. Whatever the drivers or motivations, the general consensus within the field appears to be that a consistent approach to this pedagogy needs to be adopted to ultimately ensure its quality. The standards are an attempt to develop this pedagogy and provide a consistent approach (along with an evidence base) that is currently missing. Historically, SBE has been developed by pockets of simulation enthusiasts, sometimes with very basic equipment and training. Despite the continued investment in SBE, equity of access to specialised centres is still recognised as a potential barrier to the development of this technique [68]. Perhaps there is a real risk that the standards could heighten this problem in the short term. They are a benchmark for what quality SBE should look like, and achieving some of them will inevitably require investment, not only in buildings and equipment, but in ensuring that the facilitators (clinicians, educationalists, and learning technologists) delivering SBE are appropriately trained and supervised. Even those centres with the infrastructure to cope with the new demands will find adopting the standards a challenge. Careful consideration needs to be shown to those centres that do not have the resilience to achieve the benchmark in the short term. The risk is that they carry on delivering SBE but do so (through no fault of their own) while compromising some of the standards. A further risk is that these centres would not engage in future developments, potentially leading to independent SBE providers whose quality (in terms of the ASPiH standards) could not be assured.

In support of the patient safety agenda, Deutsch et al. [69] argue that SBE offers a unique opportunity to carry out research into human factors (HF) within healthcare. Human factors have been defined by Catchpole [70] as “Enhancing clinical performance through an understanding of the effects of teamwork, tasks, equipment, workspace, culture, and organisation on human behaviour and abilities and application of that knowledge in clinical settings.”

One of the most complex hurdles academics and researchers must overcome when undertaking research is gaining ethical approval, especially when dealing with patients. SBE provides the opportunity to undertake research in simulated clinical environments with members of the interdisciplinary team but without exposing real patients to any direct risks [71, 72]. Deutsch et al. [69] argue that SBE offers the HF researcher several unique opportunities. At an organisational level, simulations can be used to observe how leaders at different levels respond to patient safety issues, how they apply policy and procedure, and how they assess risk [69]. Simulation also allows potential risks to be identified and acted on before they cause harm; these are often referred to as latent risks [73]. SBE also provides the opportunity to develop innovative ways of working and problem solving, especially within complex teams [69].

4. DISCUSSION

SBE is here to stay, with professional organisations encouraging its use in their curricula and in clinical and educational practice [74-77]. In some areas, the investment in equipment, dedicated facilities, and personnel to support SBE has been significant. However, the drivers and standards guiding SBE have been focused on those who have the infrastructure to support it rather than on robust methodologies and evidence. For SBE to be delivered in a quality-assured way (whatever the definition of quality), a benchmark standard is required so that a baseline can be achieved. Without such standards, a baseline will never be achieved and SBE will continue to be delivered by enthusiasts who, whatever their motivations or resources, could miss the bigger picture: providing high-quality education in an effective manner to maximise the benefits of SBE for learners, their current and future employers, and the simulation centre or programme. One approach that could be adopted in the short term is a stepped approach, whereby individuals and centres delivering SBE must meet a minimum set of criteria documented by the standards, with progression to higher levels of approval achievable as individuals and centres develop and investment increases. The difficulty lies in deciding what the minimum standards are; set too high, they could set providers up to fail before they begin.

Curran, cited in Riley [78], writes in reference to SBE that “the capability of the trainer as an educator limits or expands the effectiveness of the teaching; the more versatile and competent the trainer, the more likely they are to be effective”. This statement supports the notion that developing the faculty may be a sensible starting point. Ensuring that all faculty (educators, researchers, clinicians, and learning technologists) are aware of the learning theories that underpin not just traditional education but also the elements unique to simulation will help to provide a better learning experience, and may lead to new theories that have not yet been explored within education. A greater understanding of the pedagogy will also give researchers the insight to develop new or adapted methodologies to capture the unique data that SBE may provide. With continued advancement in technology, including system integration such as electronic medical records and programmes enabling the measurement of simulated patient and manikin parameters (proxy patient outcomes), education, training, and research within SBE are ideally situated to address areas of practice where clinical errors are most prevalent (e.g. prescribing, patient monitoring) [79].
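
As a concrete illustration of the data capture described above, the Python sketch below logs timestamped simulated vital-sign parameters during an SBE scenario and exports them for later analysis as proxy patient outcomes. It is a minimal sketch under stated assumptions: no specific manikin interface or electronic record system is described in this article, so every name and parameter here is hypothetical rather than any vendor’s API; in practice the readings would come from the simulator’s own data feed.

  # Hypothetical data-capture sketch: logging simulated vital-sign parameters
  # during an SBE scenario so that proxy patient outcomes can be exported for
  # research analysis. No real manikin vendor API is assumed.
  import csv
  import time
  from dataclasses import dataclass, field

  @dataclass
  class VitalSignReading:
      """One timestamped observation taken from the simulator."""
      elapsed_s: float
      heart_rate: int
      systolic_bp: int
      spo2: int

  @dataclass
  class ScenarioLog:
      """Accumulates readings for one simulated scenario."""
      scenario_id: str
      readings: list = field(default_factory=list)
      _start: float = field(default_factory=time.monotonic)

      def record(self, heart_rate: int, systolic_bp: int, spo2: int) -> None:
          """Store one reading; real values would come from the simulator feed."""
          self.readings.append(VitalSignReading(
              time.monotonic() - self._start, heart_rate, systolic_bp, spo2))

      def export_csv(self, path: str) -> None:
          """Write all readings to CSV for later research analysis."""
          with open(path, "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["elapsed_s", "heart_rate", "systolic_bp", "spo2"])
              for r in self.readings:
                  writer.writerow([f"{r.elapsed_s:.1f}", r.heart_rate,
                                   r.systolic_bp, r.spo2])

  # Example use with hard-coded illustrative values.
  log = ScenarioLog(scenario_id="cardiac-arrest-team-01")
  log.record(heart_rate=110, systolic_bp=88, spo2=91)
  log.record(heart_rate=96, systolic_bp=102, spo2=95)
  log.export_csv("scenario-cardiac-arrest-team-01.csv")

Data captured in this way could, in principle, be pooled across scenarios and centres, supporting the research strategies discussed above.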

CONCLUSION

As with the aviation industry, simulation is rapidly becoming the industry standard for education and training in healthcare. The key catalyst for its adoption in aviation was the clear link to enhanced pilot and passenger safety [80]. In healthcare, if such a link between SBE and improved patient outcomes can be established through robust research, incorporating both the standards for SBE [45] and an enhanced research framework [2], its development is likely to be continually supported in years to come.

ETHICS APPROVAL AND CONSENT TO PARTICIPATE

Not applicable.

HUMAN AND ANIMAL RIGHTS

No animals/humans were used for studies that are the basis of this research.

CONSENT FOR PUBLICATION

Not applicable.

CONFLICT OF INTEREST

The authors declare no conflict of interest, financial or otherwise.

ACKNOWLEDGEMENTS

Declared none.

REFERENCES

[1] Anderson A, Baxendale B, Scott L, Mossley D, Glover I. The National Simulation Development Project: Summary report 2014. Available from: (http://www.aspih.org.uk/static/aspihdjango/uploads/documents/general/national-scoping-project-summary-report.pdf). Accessed: 12 October 2016.
[2] Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. Simul Healthc 2016; 11(4): 238-48.
[3] Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27(1): 10-28.
[4] Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today 2016; 46: 99-108.
[5] McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010; 44(1): 50-63.
[6] Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care 2010; 19(Suppl 2): i34-43.
[7] Corvetto MA, Taekman JM. To die or not to die? A review of simulated death. Simul Healthc 2013; 8(1): 8-12.
[8] Huddy JR, Weldon S-M, Ralhan S, et al. Sequential simulation (SqS) of clinical pathways: A tool for public and patient engagement in point-of-care diagnostics. BMJ Open 2016; 6(9): e011043.
[9] Alinier G, Hunt WB, Gordon R. Determining the value of simulation in nurse education: Study design and initial results. Nurse Educ Pract 2004; 4(3): 200-7.
[10] Berrigan L. Simulation: An effective pedagogical approach for nursing? 2011. Available from: (http://www.ncbi.nlm.nih.gov/pubmed/21334797). Accessed: 11 November 2016.
[11] Cant RP, Cooper SJ. Simulation-based learning in nurse education: Systematic review. J Adv Nurs 2010; 66(1): 3-15.
[12] Guimond ME, Sole ML, Salas E. Getting ready for simulation-based training: A checklist for nurse educators. Nurs Educ Perspect 2011; 32(3): 179-85.
[13] Hope A, Garside J, Prescott S. Rethinking theory and practice: Pre-registration student nurses experiences of simulation teaching and learning in the acquisition of clinical skills in preparation for practice. Nurse Educ Today 2011; 31(7): 711-5.
[14] Issenberg SB, Pringle S, Harden RM, Khogali S, Gordon MS. Adoption and integration of simulation-based learning technologies into the curriculum of a UK undergraduate education programme. Med Educ 2003; 37(Suppl. 1): 42-9.
[15] McCallum J. The debate in favour of using simulation education in pre-registration adult nursing. Nurse Educ Today 2007; 27(8): 825-31.
[16] Murray C, Grant MJ, Howarth ML, Leigh J. The use of simulation as a teaching and learning approach to support practice learning. Nurse Educ Pract 2008; 8(1): 5-8.
[17] Shepherd CK, McCunnis M, Brown L, Hair M. Investigating the use of simulation as a teaching strategy. Nurs Stand 2010; 24(35): 42-8.
[18] Wong TK, Chung JW. Diagnostic reasoning processes using patient simulation in different learning environments. J Clin Nurs 2002; 11(1): 65-72.
[19] McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ 2014; 48(4): 375-85.
[20] Rubio DM, Schoenbaum EE, Lee LS, Schteingart DE, Marantz PR, Anderson KE, et al. Defining translational research: Implications for training. Acad Med 2010; 85(3): 470-5.
[21] Barelli A, Biasucci DG, Barelli R. Is simulation efficient to improve anesthetists’ performance and patient outcome? Minerva Anestesiol 2012; 78(5): 628.
[22] Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc 2010; 5(2): 98-102.
[23] Draycott T, Sibanda T, Owen L, et al. Does training in obstetric emergencies improve neonatal outcome? BJOG 2006; 113(2): 177-82.
[24] McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc 2011; 6(Suppl): S42-7.
[25] Shear TD, Greenberg SB, Tokarczyk A. Does training with human patient simulation translate to improved patient safety and outcome? Curr Opin Anaesthesiol 2013; 26(2): 159-63.
[26] Riley W, Davis S, Miller K, Hansen H, Sainfort F, Sweet R. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Jt Comm J Qual Patient Saf 2011; 37(8): 357-64.
[27] Smith A, Siassakos D, Crofts J, Draycott T. Simulation: Improving patient outcomes. Semin Perinatol 2013; 37(3): 151-6.
[28] Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case-control study. Chest 2008; 133(1): 56-61.
[29] Merchant DC. Does high-fidelity simulation improve clinical outcomes? J Nurses Staff Dev 2012; 28(1): E1-8.
[30] Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med 2013; 28(8): 1078-89.
[31] Braga MS, Tyler MD, Rhoads JM, et al. Effect of just-in-time simulation training on provider performance and patient outcomes for clinical procedures: A systematic review. BMJ Simulation and Technology Enhanced Learning 2015.
[32] Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med 2013; 28(8): 1078-89.
[33] Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: A systematic review and meta-analysis. Acad Med 2015; 90(2): 246-56.
[34] Benner P. From novice to expert: Excellence and power in clinical nursing practice. Addison-Wesley, Menlo Park, California 1984.
[35] Burrell DA, Bienstock JL. Improving patient outcomes through supervision and simulation. Med Educ 2015; 49(7): 647-9.
[36] Oxford English Dictionary. Quality 2016. Available from: (https://en.oxforddictionaries.com/definition/quality). Accessed: 24 October 2016.
[37] General Medical Council. Promoting excellence: standards for medical education and training. Available from: (http://www.gmc-uk.org/Promoting_excellence_standards_for_medical_education_and_training_0715.pdf_61939165.pdf). Accessed: 25 October 2016.
[38] Health Education England North Region Quality Guidance Document 2016.
[39] North West Simulation Education Network Quality Assurance Framework 2011.
[40] Health Education England. HEE Quality Framework 2016/17. 2016. Available from: (https://hee.nhs.uk/sites/default/files/documents/Quality Framework.pdf). Accessed: 25 October 2016.
[41] Lord Kelvin Quotations. n.d. Available from: (http://zapatopi.net/kelvin/quotes/). Accessed: 24 October 2016.
[42] Association for Simulated Practice in Healthcare. Association for Simulated Practice in Healthcare 2017. Available from: (http://www.aspih.org.uk/). Accessed: 1 May 2017.
[43] Health Education England. Health Education England 2016. Available from: (https://hee.nhs.uk/) Accessed: 26 October 2016.
[44] Higher Education Academy. Higher Education Academy 2016. Available from: (https://www.heacademy.ac.uk). Accessed: 26 October 2016.
[45] Association for Simulated Practice in Healthcare. Simulation Based Education in Healthcare: Standards for Practitioners. Draft 2016. Available from: (https://worldspanmedia.s3.amazonaws.com/media/aspihdjango/uploads/documents/standards-consultation/draft-standards-for-sbe.pdf). Accessed: 30 September 2016.
[46] Association for Simulated Practice in Healthcare. Outcomes of the Second Consultation: National Standards for Simulation Based Education 2016.
[47] Nursing and Midwifery Council. Quality assurance framework for nursing and midwifery education and local supervising authorities 2015. Available from: (https://www.nmc.org.uk/globalassets/sitedocuments/edandqa/nmc-quality-assurance-framework.pdf). Accessed: 6 November 2016.
[48] Resuscitation Council (UK). Quality standards for cardiopulmonary resuscitation practice and training 2016. Available from: (https://www.resus.org.uk/quality-standards/community-hospitals-care-quality-standards-for-cpr/). Accessed: 6 November 2016.
[49] Gopal A, Baxendale B. ASPiH Simulation Based Education in Healthcare Standards for Practitioners Consultation summary report of round table discussion. ASPiH Annual Conference: Brighton. 2015.
[50] Association for Simulated Practice in Healthcare and Health Education England. Simulation Based Education in Healthcare: Standards Framework. Available from: (https://worldspanmedia.s3.amazonaws.com/media/aspihdjango/uploads/documents/standards-consultation/standards-framework.pdf). Accessed: 27 November 2016.
[51] Health Education Yorkshire and the Humber Quality Management of Clinical Skills and Simulation Training 2013. Available from: (http://login.qaclinicalskills.co.uk/files/QACSS%20Final%20Version%20May%202013.pdf). Accessed 25 October 2016.
[52] North West Simulation Education Network. Accreditation of Education Using Simulation Based Learning 2016. Available from: (http://www.northwestsimulation.org.uk/mod/page/view.php?id=285). Accessed: 26 October 2016.
[53] International Nursing Association for Clinical Simulation and Learning. Standards of Best Practice: Simulation 2015. Available from: (http://www.inacsl.org/i4a/pages/index.cfm?pageid=3407). Accessed: 26 October 2015.
[54] Society for Simulation in Healthcare. Full Accreditation 2016. Available from: (http://www.ssih.org/Accreditation/Full-Accreditation.) Accessed: 26 October 2016.
[55] Parle J. West midlands central health innovation and education cluster: Simulation topic - final report. University of Birmingham 2013.
[56] Society for Simulation in Healthcare. SSH Accredited Programs: International 2016. Available from: (http://www.ssih.org/Accreditation/Programs-International). Accessed: 25 October 2016.
[57] Cook DA, Levinson AJ, Garside S. Method and reporting quality in health professions education research: A systematic review. Med Educ 2011; 45(3): 227-38.
[58] Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ 2014; 48(7): 657-66.
[59] McKenna L, French J, Cross W. Prepare Nurses for the Future: Identify Use of Simulation, and More Appropriate and Timely Clinical Placement to Increase Clinical Competence and Undergraduate Positions: Final Report of Key Activities for Department of Human Services Nurse Policy Branch. Melbourne, Australia: Monash University 2007.
[60] Bradley P. The history of simulation in medical education and possible future directions. Med Educ 2006; 40(3): 254-62.
[61] Gordon CJ, Buckley T. The effect of high-fidelity simulation training on medical-surgical graduate nurses’ perceived ability to respond to patient clinical emergencies. J Contin Educ Nurs 2009; 40(11): 491-8.
[62] Killam LA, Montgomery P, Luhanga FL, Adamic P, Carter LM. Views on unsafe nursing students in clinical learning. Int J Nurs Educ Scholarsh 2010; 7(1): e36.
[63] Leonard M, Graham S, Bonacum D. The human factor: The critical importance of effective teamwork and communication in providing safe care. Qual Saf Health Care 2004; 13(Suppl 1): 85-90.
[64] Melnyk BM. Evidence to support the use of patient simulation to enhance clinical practice skills and competency in health care 2008.
[65] Valler-Jones T, Meechan R, Jones H. Simulated practice--a panacea for health education? Br J Nurs 2011; 20(10): 628-31.
[66] Agency for Healthcare Research and Quality. Improving Patient Safety in Hospitals: A resource list for the users of the AHRQ hospital survey on patient safety culture 2010. Available from: (http://www.abdn.ac.uk/~wmm069/uploads/files/Improving_Patient_Safety_in_Hospitals_Resource_List_4-9-10.pdf). Accessed: 11 November 2016.
[67] Agency for Healthcare Research and Quality. Surgical simulation training improves speed and confidence in residents learning endoscopic sinus surgery AHRQ Research Activities 2010; (360): 12.
[68] Al-Ghareeb AZ, Cooper SJ. Barriers and enablers to the use of high-fidelity patient simulation manikins in nurse education: An integrative review. Nurse Educ Today 2016; 36: 281-6.
[69] Deutsch ES, Dong Y, Halamek LP, Rosen MA, Taekman JM, Rice J. Leveraging health care simulation technology for human factors research: closing the gap between lab and bedside. Hum Factors 2016; 58(7): 1082-95.
[70] National Quality Board. Department of Health Human Factors Reference Group Interim Report, 1 March 2012. Available from: (https://www.england.nhs.uk/wp-content/uploads/2013/11/DH-rep.pdf). Accessed: 2016.
[71] Deutsch ES. Simulation in otolaryngology: Smart dummies and more. Otolaryngol Head Neck Surg 2011; 145(6): 899-903.
[72] Kneebone R. Simulation in surgical training: Educational issues and practical implications. Med Educ 2003; 37(3): 267-77.
[73] Lok A, Peirce E, Shore H, Clark SJ. A proactive approach to harm prevention: Identifying latent risks through in situ simulation training. Infant 2015; 11(5): 160-3.
[74] National Health Service. The UK Foundation Programme Curriculum 2010. Available from: (http://www.foundationprogramme.nhs.uk/download.asp?file=Foundation_Curriculum_2010_WEB_Final.PDF). Accessed: 11 November 2016.
[75] Nursing and Midwifery Council. Essential skills clusters for pre-registration nursing programmes. Nursing and Midwifery Council London 2007.
[76] Nursing and Midwifery Council. Supporting direct care through simulated practice learning in the pre-registration nursing programme 2007. Available from: (http://www.nmc-uk.org/Documents/Circulars/2007circulars/NMCcircular36_2007.pdf). Accessed: 11 November 2016.
[77] Nursing and Midwifery Council. Standards for pre-registration nursing education 2010. Available from: (https://www.nmc.org.uk/standards/additional-standards/standards-for-pre-registration-nursing-education/). Accessed: 30 October 2016.
[78] Riley RH. Manual of Simulation in Healthcare. Oxford: Oxford University Press 2008.
[79] NHS Improvement. Organisation patient safety incident reports: 28 September 2016. Available from: (https://improvement.nhs.uk/resources/organisation-patient-safety-incident-reports-28-september-2016/). Accessed: 9 November 2016.
[80] Aebersold M. The History of Simulation and Its Impact on the Future. AACN Adv Crit Care 2016; 27(1): 56-61.