I was on a work trip to the UK recently, which involved a train journey from the airport. The flight landed a little later than scheduled, so I missed the train I had pre-booked tickets for. I was pretty confident I had booked fully changeable tickets, but I checked the app while we were waiting to be let off the plane. Helpfully, the app in question had an ‘assistant’ (aka AI chatbot) that I could link to my tickets and ask whether I could get the next train. It confirmed that I could.
So far so good.
My journey required two changes, so once on the train, I wanted to double-check whether they were the same on this later journey as they had been on the one I originally booked. I asked the ‘assistant’ and was told I had a two-hour wait in Reading for the first change. It may help to explain my horror at reading this to know that it was nearly 9pm by then.
I asked the ‘assistant’ to check the timings as they didn’t look right. The reply came back, ‘You are right. I got that wrong. Sorry about that. You need to change at [another station] instead, and there is a 14-minute wait for the next train.’ The other station wasn’t really an obvious place for me to change, so again, I asked it to check the change details.
Again, it came back, ‘You are right. I got that wrong. Sorry about that. You need to get off at the next station and go back to where you started, as you are on the wrong train to get to your location.’
By this time, I was starting to feel a rising sense of panic. Was I on the wrong train?
Just then, my mobile signal dropped off, and the train wi-fi was woeful. All the time I was thinking, Am I heading somewhere miles from where I need to be?
Once my signal was back, I tried the ‘assistant’ once more, but it had forgotten my previous chat and gave me new times for changes that made no sense, since they were in the past! My panic was now mixed with mounting annoyance and frustration. I decided to go to the booking website, try to book a ticket for the train I was already on, and see what changes came up. Thankfully, I was on the right train, the changes were the same as on my original journey, and the wait times were minimal. Phew!
First takeaway from that story? Please do not roll out AI across any part of your business unless and until you have stress-tested it… properly. It has real-world consequences. I could’ve been sent as a lone traveller, at night, to a place I didn’t want to be and didn’t know.
The funny thing is, at the point where I was desperately trying to get some sense from the AI chatbot, the one thing I was most hoping to see in the train carriage was a human being in the form of train staff.
Second takeaway? Sometimes, only humans will do – especially at times of stress or difficulty.
On my way home a couple of days later, I was checking into a hotel at the airport for the night. The check-in was on a computer screen, and I noticed an elderly gentleman hovering by the unmanned desk as I checked in.
When a staff member saw him, they checked him in, and he asked whether he could have a wake-up call at 5:30am, as he had an early flight to catch. The abrupt answer was, ‘We don’t do that. Use your phone.’ He replied that he didn’t have a phone. He was then told to use his watch, to which he answered that it had no alarm, as it was a hand-wound watch.
As I was in the lift to my room, I was left pondering the situation this man found himself in, and it made me feel more than a little sad. Firstly, because I simply hoped that he did manage to wake in time. Perhaps I should’ve offered to help? Secondly, it brought home to me how much the world has changed in recent years, and how reliant we have become on all things digital.
Even as an older Gen X, I’m pretty comfortable around most elements of the digital world and rarely give it a second thought: I booked and downloaded my train tickets on my phone; I booked the hotel room online; I checked in online; I didn’t need the hotel to give me an alarm call as I can set that on my phone – a very small snapshot of life in 2026.
But what about the elderly gentleman I saw? How is he to navigate this brave new world? The answer is, with great difficulty. And I was struck by how unfair it seemed that those people most in need of extra help and support are at risk of being left behind.
My takeaway from that story? In data protection legislation, there is a legal obligation on organisations to consider the risk to individuals where new processing of data is planned. Whilst it may not be legally explicit, it’s surely morally incumbent upon us to also consider those who may be excluded when we roll out new technologies. These are people who rarely have a voice or a platform, but their lives are affected just as much (if not more), so we must do all we can to ensure they’re considered.
Of course, that will sometimes require additional time and money, but in the case of the elderly man I saw, would it really have been a difficult endeavour to ask the person on shift that next morning to knock on the door to ensure he was awake in time to catch his flight? I do hope we don’t find ourselves sacrificing simple acts of human kindness at the altar of technological progress.
Andrew Grill, an author and AI commentator, recently wrote:
“The real risk is not that AI takes your job in 18 months, it’s that your organisation bolts AI onto broken workflows designed for a different era and calls it transformation. Nearly 90% of companies have invested in AI. Fewer than 40% are seeing measurable gains. That’s not a technology problem. That’s a leadership problem.”
Everything seems to be happening at breakneck speed these days, and it’s easy to get swept up in it, which is why it’s more important than ever to take the time to:
- Identify the problem you want AI to solve
- Avoid the hype
- Look critically at the quality and source of the training data (garbage in, garbage out; bias in, bias out; etc.)
- Build in accountability, governance, oversight, and improvement from the beginning, and keep them there
- Keep a ‘human in the loop’ – and make sure it’s the right human, with the relevant skills and experience
- Consider everyone who may be impacted, particularly those who may be vulnerable or excluded
- Never forget what Andrew Grill said – this isn’t about technology, it’s about leadership