Before the events of 3 December 2024, Lee Jae-myung's path to South Korea's presidency was littered with obstacles. Ongoing legal cases, corruption investigations and allegations of abuse of power all looked set to derail the former opposition leader's second presidential bid.
Then a constitutional crisis changed everything. On that night, former president Yoon Suk Yeol's abortive attempt to invoke martial law set in motion a series of events that appears to have cleared the path for Lee. Now, as the Democratic Party candidate, he is the frontrunner to win South Korea's election on 3 June.
It's a dramatic reversal of fortunes for the 61-year-old, who at the time of Yoon's martial law declaration stood convicted of making false statements during his last presidential campaign in 2022. Those charges still cast a long shadow over Lee, and could yet threaten his years-long pursuit of the top job. But they are also just the latest in a string of controversies that have dogged him throughout his political career.
A rags-to-riches origin story combined with a bullish political style has made Lee into a divisive figure in South Korea.
"Lee Jae-myung's life has been full of ups and downs, and he often takes actions that stir controversy," Dr Lee Jun-han, professor of political science and international studies at Incheon National University, tells the BBC.

Experts have raised concerns about chatbots, including potential biases and limitations, a lack of safeguarding and the security of users' information. But some believe that when specialist human help is not easily available, chatbots can help. So with NHS mental health waiting lists at record highs, are chatbots a possible solution?
Character.ai and other bots such as ChatGPT are based on "large language models" of artificial intelligence. These are trained on vast amounts of data – whether that's websites, articles, books or blog posts – to predict the next word in a sequence. From here, they predict and generate human-like text and interactions.

The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.
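The core idea of "predicting the next word in a sequence" can be sketched in a few lines of code. The toy model below is an illustration only – a simple word-pair count over a made-up sentence, vastly simpler than any real large language model, and not how Character.ai or ChatGPT actually work internally:

```python
from collections import defaultdict, Counter

# A tiny made-up "corpus" for illustration purposes only.
corpus = "i feel low today and i feel tired today".split()

# Count which word follows each word (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("i"))  # prints "feel" – the word that follows "i" in the corpus
```

Real large language models replace these simple counts with neural networks trained on billions of words, which is what lets them generate fluent, human-like replies rather than just echoing a small corpus.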
Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that humans with decades of experience will be able to engage and "read" their patient based on many things, while bots are forced to go on text alone. "They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."