Browsing by Author "Field, Megan"
Now showing 1 - 4 of 4
Item Open Access
Do you trust me? (Cranfield University, 2017-11-15) Field, Megan
Poster presented at the 2017 Defence and Security Doctoral Symposium.

Automation of technology and systems across domains such as defence, nuclear, transportation and healthcare is forecast to increase dramatically in the coming decades, and with that, levels of automation (LOA) are set to change the role of operators. However, the shift from working directly with and within a system to a role characterised by supervision and (sometimes remote) surveillance brings a range of human-centred issues and limitations. These issues are not solely about how the operator can cope with huge amounts of real-time data and information; they also concern how individuals react and behave towards computerised teammates. This is especially critical in military environments, such as static and mobile Command and Control (C2) centres. These facilities must accurately and appropriately analyse, fuse and display considerable amounts of C3I (Communications, Command, Control and Intelligence) material. The ability to trust (or mistrust) a system is, therefore, vital for human safety and mission success.

Nonetheless, human actions and behaviours are not formed in a 'cognitive vacuum': they are influenced by the context of tasks, environments, prior experiences and memories. Trust formation with technology and automation is affected by many antecedents, in a process similar to that by which humans endow others with levels of trust and confidence. These include prior knowledge, experiences with similar technology (or people), and the ways in which expectations, lack of transparency and failures can lead to mistrust.

This research seeks to explore the behaviours and attitudes of human operators, and how military culture shapes operator heuristics and naturalistic decision making. The qualitative inquiry will also probe whether these circumstances foster maladaptive behaviours that differ or deviate from those of civilian and defence personnel.

Item Open Access
Facilitation of Trust in Automation: A Qualitative Study of Behaviour and Attitudes Towards Emerging Technology in Military Culture (Cranfield University, 2018-11-15) Field, Megan
Poster presented at the 2018 Defence and Security Doctoral Symposium.

New technologies, increased levels of automation and artificial intelligence are emerging and integrating into our lives at an ever-quickening pace; however, how we respond to these changes is not as immediate. Furthermore, in high-criticality domains where the integration of new technologies is mission and life critical, identifying the underlying obstacles behind mistrust, under-reliance and apprehension in adapting to them is critically important.

To aid the facilitation of new technologies in the military domain, the research seeks to explore attitudes and behaviours through narrative analysis of underlying expressions of trust among personnel associated with different echelons of the Forces, alongside civilians. This is to inquire into differing attitudes and whether the unique culture and subcultures of the military colour narratives towards emerging technology.

Item Open Access
The Frankenstein Syndrome (Cranfield University, 2017-11-15) Field, Megan
Digital image presented at the 2017 Defence and Security Doctoral Symposium.

Automation of technology and systems across domains such as defence, nuclear, transportation and healthcare is forecast to increase dramatically in the coming decades, and with that, levels of automation (LOA) are set to change the role of operators. However, the shift from working directly with and within a system to a role characterised by supervision and (sometimes remote) surveillance brings a range of human-centred issues and limitations. These issues are not solely about how the operator can cope with huge amounts of real-time data and information; they also concern how individuals react and behave towards computerised teammates. This is especially critical in military environments, such as static and mobile Command and Control (C2) centres. These facilities must accurately and appropriately analyse, fuse and display considerable amounts of C3I (Communications, Command, Control and Intelligence) material. The ability to trust (or mistrust) a system is, therefore, vital for human safety and mission success.

Nonetheless, human actions and behaviours are not formed in a 'cognitive vacuum': they are influenced by the context of tasks, environments, prior experiences and memories. Trust formation with technology and automation is affected by many antecedents, in a process similar to that by which humans endow others with levels of trust and confidence. These include prior knowledge, experiences with similar technology (or people), and the ways in which expectations, lack of transparency and failures can lead to mistrust.

This research seeks to explore the behaviours and attitudes of human operators, and how military culture shapes operator heuristics and naturalistic decision making. The qualitative inquiry will also probe whether these circumstances foster maladaptive behaviours that differ or deviate from those of civilian and defence personnel.

Item Open Access
Trust in Automation: A Qualitative Study of Behaviour and Attitudes Towards Emerging Technology in Military Culture (Cranfield University, 2018-11-27) Field, Megan
Technical paper presented at the 2018 Defence and Security Doctoral Symposium.

Trust is often explored as a determinant of appropriate automation usage and reliance. Despite the wealth of research into the antecedents, decision-making and cognitive factors that facilitate human-automation interaction, the internal factors that influence dispositional trust are often underrepresented. High-speciality, high-criticality domains characterise the most researched areas in this field; however, there are few studies exploring organisational culture, such as that of the military, and its effect on trust in automation. The research seeks to explore the dominant narratives of differing echelons of the military (ground, air, surface and subsurface) through responsive interviewing, examining the unique culture borne of strong hierarchical order, regulations and training in parallel with civilians. Furthermore, within the larger scope, submarine culture is psychosocially distinctive due to the environmental constraints of active duty, such as the isolation and restrictions incurred by lengthy operational deployment.
Due to this seclusion, submarine life is often distinct from other strata owing to the weight of human-human trust and kinship placed on personnel over automated teammates (e.g., decision-making software). The research plans to delve into the experiences of this idiosyncratic workforce and others to explore how service potentially alters their views and experience of human-automation/system interactions, and whether underlying scepticism, expertise or training plays a part in their worldview.