Prof Harold Thimbleby …
Talking & speaking
I am developing a new web site for 2024! This old site is just to say something to keep people happy while I work on the new one. Meanwhile, please have a look at our new booklet on patient safety and digital health.
Harold believes strongly that public understanding and awareness of technology and the science behind it is crucial for us to benefit from it to the full. He has been an ACM Distinguished Speaker.
He gives regular high-level talks on digital healthcare; for instance, he gave one at the World Health Organization Global Ministerial Summit on Patient Safety. Harold was the keynote speaker at the “Next Generation Tour,” a research workshop that toured New Zealand universities, inspiring undergraduates to take up research. He regularly runs conferences and workshops. He has given over 80 conference keynotes and over 500 seminars and presentations (including at Cambridge, MIT, Oxford, the Royal Institution, Stanford and the House of Lords) in 31 different countries. Harold has spoken at ten Edinburgh International Science Festivals, the British Association annual science festival, the Spoleto Festival, TECHFEST, Mumbai, Welsh Eisteddfods, and numerous Science Cafés. Harold and Will Thimbleby exhibited an amazing new calculator at the Royal Society Summer Science Exhibition and at many other exhibitions. Harold has written memorable advice for giving excellent presentations, which he calls Pirate Talks.

A selection of recent talks:

Computers are involved in all aspects of patient care, from booking appointments through to computers in systems that deliver care, such as ventilators, infusion pumps and pacemakers, as well as in computerised decision support systems supporting clinicians. Computers are used in diagnosis and assessment, in MRI scanners, and in weighing machines. They control sterilisation, security, and ambulance dispatch. Everybody has mobile phones, email, calculators and medical apps.
It is likely that computer-related preventable error, including cybersecurity exploits, is significant, but more research is needed to quantify its impact. Our own very conservative estimate is that 1,000 deaths per year are caused in the English NHS by unnecessary bugs in computer systems. Regardless of an accurate assessment of numerical impact, though, we should be striving to minimise it, and enabling procurement to choose safer systems. We show that manufacturers appear to be unaware of bugs. If they — the most technical people involved — are unaware of bugs, then neither clinicians nor incident investigators will be properly aware of computer-related causes of patient harm. The aim of this paper is to show that computer-related error is frequently overlooked by programmers and manufacturers. In turn, it is overlooked by regulators, by procurement, and by clinicians. It is ubiquitous and remains highly problematic. We show that there are ways in which computer-related harm can be reduced. We provide 14 specific suggestions for improvement. Improvement will require tighter regulation as well as improved software engineering.
- Keeping patients safe — Trust me I’m a computer?
Computers, IT, digitisation, apps — whatever we call it — is everywhere in healthcare, and it is also racing ahead of healthcare and creating dreams and exciting opportunities for quality improvement and transformation. We want a paperless NHS. Yet we have to be careful what we wish for. We are most familiar with consumer IT, our own personal phones and tablets, but our enthusiasm for this must not be confused with what might be best for healthcare.
This talk has been written up for the Future Hospital Journal
- Turning into Effective HCI Researchers & Saving Lives Through Research in Healthcare, Computer Science and HCI
- Research is a rewarding lifetime career, and arguably the best way to make the world a better place while having fun at the same time. You meet people and make friends all over the world, and governments give you pots of money to do what you want to do. But it is a very competitive world, and to become successful — even just for the first steps of a PhD — means taking research strategy seriously. This talk will help everyone plan their own successful strategies as well as cope with the inevitable failures.
- Healthcare is a surprisingly dangerous place, and it is full of computers. The worldwide fiasco with the WannaCry malware merely made some of the problems very visible. Hospitals stopped working. What else is going wrong? What are the soluble research problems and what can we do about it? The talk is of interest to any computer scientists, HCI and human factors specialists, as well as to clinicians and especially to patients!
- A Crisis To Be Avoided
We want to provide the best patient care, and IT is a large part of that. But as much as IT is exciting, it can also cause serious problems. This interactive workshop concludes with a hands-on review of some of the latest thinking on patient safety.
- Human error is not the problem
Good HCI improves everything, but it has some unspoken assumptions: we need mature HCI and we need market forces to notice and demand good HCI. (That’s why iPhones are so successful.) In healthcare, we need more maturity and more awareness. This talk exposes the problems, and shows how to move healthcare to a more mature view of HCI — which eventually will save many lives.
- IT is the problem with healthcare
It seems obvious that modern healthcare needs more and more modern IT. For instance, it is obvious that most hospitals are years behind patients’ and doctors’ use of apps and social media, and the gap is getting worse. Yet this popular view is fundamentally muddled. In fact IT plays a critical part in hospital inefficiency and avoidable patient harm. Merely having more IT will be counter-productive. A recent UK court case over the corruption of patient data serves as a good example of our widespread inability to understand, provide, use or develop dependable IT — and we blame the wrong things. This talk will explain key problems with IT and go some way to explaining why hospital error has become Europe’s unacknowledged third biggest killer (close after cancer and heart disease) — and that thinking more clearly about IT is essential. The problems with IT also beset every other organisation, especially public services, including universities, although these organisations rarely kill their customers and students! This talk will be of interest to all thinkers about the future uses of computers, and especially patients, potential patients (that’s all of us), and particularly programmers who could deliver the improved IT that is needed. The talk itself aims to be interesting to general audiences, but experts in formal methods and human factors will recognise the underlying science.
- Human factors failings in the German Enigma design
The German World War II Enigma suffered from design and use weaknesses that facilitated its large-scale decryption first by the Polish and then continued in Britain throughout the war. The main technical weaknesses (self-coding and reciprocal coding) could have been avoided using simple contemporary technology, and therefore the true cause of the weaknesses is not technological but must be sought elsewhere. We will show that human factors issues resulted in the persistent failure to notice and seek out more effective designs. Similar limitations beset the historical literature, which misunderstands the Enigma weaknesses and therefore inhibits broader thinking about design and the critical role of human factors engineering in cryptography.
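One of the weaknesses mentioned above, self-coding (the Enigma never encrypted a letter to itself), had a directly exploitable consequence: a guessed stretch of plaintext (a “crib”) could be slid along an intercept, and any alignment where a plaintext letter coincided with the ciphertext letter was impossible. A minimal sketch of that elimination step, with a made-up ciphertext purely for illustration:

```python
# Consequence of Enigma's self-coding weakness: no letter ever encrypts to
# itself, so a crib can only sit at alignments with no letter coincidence.
def possible_positions(ciphertext, crib):
    """Alignments of the crib NOT ruled out by the no-self-coding property."""
    ok = []
    for i in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[i:i + len(crib)]
        # Any position where plaintext and ciphertext letters agree is impossible.
        if all(c != p for c, p in zip(window, crib)):
            ok.append(i)
    return ok

# Toy example: crib "BC" cannot sit at offset 1, where B would meet B.
print(possible_positions("ABCD", "BC"))  # → [0, 2]
```

Each excluded alignment narrows the search dramatically, which is why a design choice made for engineering convenience (the reflector) became a cryptographic liability — the human-factors point the talk develops.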
- Your invitation to fix healthcare IT
After heart disease and cancer, the third most likely cause of death is preventable medical error, and in almost every case IT is involved. Computer scientists should unite in fixing healthcare IT: it requires input from HCI, formal methods, and more, and there is everything to gain. This talk features a wide range of easily avoidable IT problems that have led to unnecessary harm and death.
- IT: help or hindrance?
Healthcare and IT do not fit comfortably together, and there has been a history of visions followed by failure. What is going on behind this, and what do we need to do?
- Creativity, innovation and risk
Encouraging international researchers to consciously think about their research strategy so they are effective and productive in a highly competitive world.
- How to put a winning proposal together
What really matters when you put a winning research proposal together? A stimulating review of the etiquette, assumptions and opportunities. Workshop includes hands-on development and formative review of proposals.
- Human factors and missed solutions to WWII Enigma design weaknesses
The German World War II Enigma suffered from design weaknesses that facilitated its large-scale decryption by the British throughout the war. The main technical weaknesses (self-coding and reciprocal coding) could have been avoided using simple contemporary technology, and therefore the true cause of the weaknesses is not technological but must be sought elsewhere: we argue that human factors issues resulted in the persistent failure to seek out more effective designs. Similar limitations beset the historical literature, which misunderstands the Enigma weaknesses and therefore inhibits broader thinking about design and the critical role of human factors engineering in cryptography.
Bio: Harold Thimbleby is professor of computer science at Swansea University, Wales, and Emeritus Professor of Geometry, Gresham College, London. He built an electromechanical Enigma in 2002 to illustrate a Gresham College lecture on cryptography, and he has been fascinated by the topic ever since. Harold’s research interest is human error, particularly in complex healthcare systems, but he became interested in the Enigma because its design failures make a provocative analogue to healthcare IT design failures.
- Creativity, innovation and risk in your research
Most people just ‘do’ research without thinking strategically about their work, their interests and how to do better. What is their plan when a paper or a funding application gets rejected? What is their plan for their plans? How can they be luckier next time? What really matters, and how can we prioritise this to best effect, when all around us are distractions from our priorities?
- Unsafe in any bed
“Unsafe At Any Speed” was the title of Ralph Nader’s damning critique of the 1960s car industry. We are in a similar position with today’s healthcare: terrible, but quite able to improve. Ross Koppel (University of Pennsylvania, USA) will present examples of healthcare IT and explore why so much of it fails to respond to the needs of clinicians and patients. Harold Thimbleby will show how many of these problems arise from design failings that remain invisible until it is too late. How can these problems be avoided, so patients are safer? Together Ross and Harold will debate with the audience to respond to ideas for improved healthcare in our increasingly computer-dominated hospitals.
- Unsafe healthcare devices, and how to improve them
“Unsafe At Any Speed” was the title of Ralph Nader’s damning critique of the 1960s car industry. We are in a very similar position with today’s healthcare: unseen problems with devices and health IT cause and contribute to error. Too often investigations fail to explore the impact of system design on error. To improve, we suggest two obvious ideas: black boxes and a public safety scoring system. First, design-induced error needs to be visible to enable learning and finding better solutions to the problem; secondly, safety scores will enable all stakeholders (regulators, procurers, clinicians, incident investigators, journalists, and of course patients) to compare solutions and hence choose better systems.
- Improving safety in medical devices and systems
We need to improve healthcare technologies — electronic patient records, medical devices — by reducing use error and, in particular, unnoticed errors, since unnoticed errors cannot be managed by clinicians to reduce patient harm. Every system we have examined has multiple opportunities for safer design, suggesting a safety scoring system. Making safety scores visible will enable all stakeholders (regulators, procurers, clinicians, incident investigators, journalists, and of course patients) to be better informed, and hence put pressure on manufacturers to improve design safety. In the longer run, safety scores will need to evolve, both to accommodate manufacturers improving device safety and to accommodate insights from further research in design-induced error.
- Human error is not the problem
When something bad happens to a patient, then surely somebody must have done something bad? Although it’s a simple story, it’s usually quite wrong. This talk argues, with lots of surprising examples, that the correct view is you do not want to avoid error — you want to avoid patient harm. Drawing on human factors and computer science, this talk shows the astonishing ways that systems conspire to cause and hide the causes of error. We will then show that better design can reduce harm significantly. We explain why industry is reluctant to improve, and how new policies could help improve technology.
- Social network analysis and interactive device design analysis
All interactive systems respond to what users do by changing what they are doing — though sometimes maybe not in the way you intended! What they were doing before and after any particular user action defines a network. These networks have many interesting properties, which can be readily related to social and other sorts of more familiar networks. For example, hubs are well-connected, and off is usually easy to get to from anywhere, so off is usually — but not always — a hub. It turns out that many interactive systems have quite quirky designs, and network analysis can pinpoint design problems. For example, anaesthetists often switch devices (ventilators etc) off-and-on-again to adjust patient parameters; it turns out that off is often the most between state and therefore usually the best route to get a device to do what you want. It would have been preferable, instead, to design systems so standby was the most between state, as that would allow anaesthetists to adjust some patient values without having to reset all of them, which is what off does. Such design problems may be critical in an operation. In short, this talk will discuss usability problems (particularly medical devices, where design really matters) from the perspective of network analysis.
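The “most between state” idea from the abstract above is just betweenness centrality applied to a device’s state network. A minimal sketch of the analysis, using a hypothetical infusion-pump state machine (the state names and transitions are invented for illustration, not taken from any real device):

```python
from collections import defaultdict, deque

# Hypothetical device state machine: the power button works from every state,
# but the only route from "infusing" back to other modes is via "off" —
# the bad design pattern the talk describes.
EDGES = {
    "off":      ["standby", "settings"],
    "standby":  ["off", "infusing"],
    "infusing": ["off"],
    "settings": ["off"],
}

def shortest_paths(graph, src, dst):
    """Enumerate all shortest src→dst paths by breadth-first search."""
    paths, best = [], None
    queue = deque([[src]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break                        # all shortest paths already found
        node = path[-1]
        if node == dst:
            best = len(path)
            paths.append(path)
            continue
        for nxt in graph[node]:
            if nxt not in path:          # keep paths simple (no revisits)
                queue.append(path + [nxt])
    return paths

def betweenness(graph):
    """Fraction of all-pairs shortest paths passing through each state."""
    score = defaultdict(float)
    for s in graph:
        for t in graph:
            if s == t:
                continue
            paths = shortest_paths(graph, s, t)
            for p in paths:
                for mid in p[1:-1]:      # interior states only
                    score[mid] += 1 / len(paths)
    return dict(score)

scores = betweenness(EDGES)
print(max(scores, key=scores.get))       # → off
```

In this toy network, off lies on more shortest paths between other states than any other state — exactly the “off is the best route to get the device to do what you want” property the talk identifies, and the kind of design problem such an analysis can pinpoint automatically.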
Here’s a selection of other recent talks and lectures:
- Welsh computer science. Science Advisory Council for Wales
- Improving medicines safety with new technology. Royal Pharmaceutical Society Medicines Safety Symposium
- Widespread errors are fixable by better design. National Biomedical and Clinical Engineering Conference
- Dependable user interfaces — avoiding computer-provoked human error in medical systems. IEEE International Symposium on Computer-Based Medical Systems
- Design out harm — don’t design out error. Stanford Research Institute
- Safer Health IT. Aspiring to clinical excellence conference, Guild of Healthcare Pharmacists/UK Clinical Pharmacy Association
- Blindspots and safer medical systems. DesignMed, Stuttgart
- Numbers, numbers everywhere, and none you can trust — not yet! Cardiff Scientific Society
- Secrets of research success. Scottish Informatics and Computer Science Alliance
- Saving lives with science. Royal Society
- Looking beyond human error to its causes. Patient Safety and Clinical Decision Making, Keynote, Royal College of Physicians, Edinburgh
- Looking beyond human error to its causes and prevention. Scottish Intensive Care Society Annual Scientific Meeting
- Moving from user interface design to interaction programming. MIT
- Avoiding designed-in errors in interactive medical devices. University of Cambridge
- Thinking out of the Computer Science cargo cult box. Distinguished Lecture Series, St Andrews University
- A new sort of calculator. University of California at Berkeley
- Mud and maths. Royal Institution
- Interaction technology and its impact on science. Department for Environment, Food and Rural Affairs (DEFRA)
- Avoiding death by computer. Swansea Science Café
- School talks …
See also Harold’s views on undergraduate teaching.
He has been widely interviewed and reported in the media. Harold gives numerous public lectures and school talks. He had the largest post-bag ever for a New Scientist feature article he wrote about video recorders.
Lecture at the Royal Institution
Getting to grips with maths using a Land Rover differential
— getting the next generation excited