
Chatbots and body image: How AI is changing the treatment of eating disorders

27 minutes

Contributors

Gemma Sharp

Associate Professor
Monash University

Sam Ikin

Journalist

Episode Notes

If you were feeling low and needed to talk, would you turn to a robot? With the rise of AI chatbot technology, many Australians are doing just that – and they’re seeing the benefits. In this episode, Associate Professor Gemma Sharp, head of the Body Image and Eating Disorders Research Group at Monash University, explains why chatbots can successfully bridge the gap between people living with an eating disorder and access to in-person treatment, while Sam Ikin, host of the Butterfly: Let’s Talk podcast, shares how having an eating disorder that doesn’t fit the stereotype can be a barrier to seeking help.

Transcript

Sam Ikin: Given this is an audio podcast I should probably describe myself to start with because it is related. I am a 45-year-old Caucasian man and I’m overweight. I have an eating disorder and I think a lot of people can’t get their head around the fact that that’s the case.

Ginger Gorman: Sam Ikin hosts the podcast Butterfly: Let’s Talk, which is all about body image and eating disorders. It’s a condition often associated with teenage girls, which only adds to the stigma for the many other types of people who suffer from disordered eating.

Sam Ikin: For me, it started with negative body image as a child because I was a bigger kid.

Ginger Gorman: Before we go on, I want to let you know that this episode might be a tough one if you’re triggered by discussions around body shape and eating disorders. It’s all about new ways to support you, but I really do get it if you just don’t want to listen. If you need support, head straight to our show notes for some links to organisations like Lifeline and the Butterfly Foundation. Alternatively, you can call the Butterfly Foundation’s National Helpline on 1800 334 673 or Lifeline on 13 11 14. Okay, back to Sam.

Sam Ikin: I remember quite clearly being in grade four or five, sitting on the ground on the concrete listening to the PE teacher who everyone thought was cool. His name was Mr. E and he played baseball and he did Frisbee golf on the weekends. We all thought he was great. We all wanted to be his friend. At one point he said, “If you can grab more than a handful of fat from your belly, then you’re overweight. You need to do something about it.” Of course, I sat there just trying to not be noticed because I knew that that was me. Then he said, “And I’m talking to you and you and you.” He pointed to three kids and I was one of them and I just wanted to dig a hole in the concrete and disappear.

Ginger Gorman: That sucks. I’d like to think we’ve come a long way that we don’t body-shame people, let alone children. Technology has given a much-needed megaphone to positive body messages. But in all honesty, they’ll probably never be as loud or as popular as all the beauty and fitness influencers that still tell us good looks only come in small packages. Psychologist Gemma Sharp talks to her patients a lot about how social media triggers their struggles with body dysmorphia. In those sessions she does her best to guide them.

A/ Professor Gemma Sharp: I thought we were having these really great discussions in session. I thought we’d come up with some good plans for how to navigate social media. Then they would come back for the next appointment and say, “I was on social media and forgot everything we discussed in session.” I was like, “Ah, that’s a shame.”

Ginger Gorman: That’s where Gemma realised she could fight the technology with more technology. Stay with me here because I am about to use a word that’s been burdened with the touch of moral panic: Chatbots.

A/ Professor Gemma Sharp: They needed that in-the-moment support while they were online. It wasn’t enough to have these discussions in sessions when they weren’t actually on social media. So I wanted to give them that 24/7 voice of evidence-based support exactly when they needed it and a chatbot seemed like a good solution for that.

Ginger Gorman: This is Seriously Social. I’m Ginger Gorman. I want you to forget all of your preconceived notions about AI. It’s already helping people with depression and anxiety. Today on the podcast, how is artificial intelligence reaching people on the front line of eating disorders?

A/ Professor Gemma Sharp: I’ll be honest. Back in 2018, I actually didn’t even know what a chatbot was. I’d been using them and not realising the technology behind them.

Ginger Gorman: Associate Professor Gemma Sharp leads the Body Image and Eating Disorders Research Group at Monash University as well as being a Senior Clinical Psychologist at the new Statewide Women’s Mental Health Service at Alfred Health.

A/ Professor Gemma Sharp: But it was actually conversations with my own patients that prompted me to think of the idea of a chatbot for eating disorders.

Ginger Gorman: What’s the connection there with being on social media and feeling bad about your body or feeling perhaps a lack of control and then triggering your eating disorder?

A/ Professor Gemma Sharp: I think there’s many ways that that can occur. It might be someone comparing their body to an influencer or comparing their life to the shiny glittery people online. There’s also really unfortunate dietary advice. There’s a lot of, quote, unquote, “wellness content” on social media that can be very unhelpful and make people very unwell sadly. I think it can be very hard to not encounter this kind of content, particularly younger people who get fed a lot of pro-ana, pro-mia, pro-ED content out there. As much as social media platforms will try and eliminate this content or make it very hard to access, there is always more being generated that can get around the safeguards sadly. And you’re right. It’s basically a recipe for an eating disorder on social media and encouraging each other to undertake really unhelpful behaviours.

Ginger Gorman: Traditional support systems like doctors and counsellors can’t always be there when a person is falling down the social media rabbit hole. But there are other reasons people won’t or even can’t get the help they need from a living, breathing human expert. Access is a big one. Services can be expensive, overbooked and maybe not even available if you live outside of a capital city. Then there’s the stigma.

A/ Professor Gemma Sharp: Particularly getting help with eating disorders can be a very difficult step. There’s a lot of shame and stigma associated with eating disorders. Why can’t I just get over it myself? It’s just eating. But of course it’s far more complicated than that, and so a chatbot can provide a bit of a stepping stone for seeking in-person support, if that is available. Certainly in our research we found that people do find it easier to speak with a non-human first up, even though it sounds like a human-style interaction. The chatbot is just that nice first step for people seeking more support, and particularly for parents, carers and loved ones it can be, “I’m worried about my young person, should I be getting them more help? What signs should I look out for?”

Ginger Gorman: Who are those people that never seek help?

A/ Professor Gemma Sharp: We certainly know that the people who do seek help tend to fit a particular demographic: younger girls and women, usually of European descent and from higher socioeconomic backgrounds. Generally only about one in four people who experience an eating disorder will ever get that professional support in their lifetime, meaning that 75% of people do not. These might be people in larger bodies, people who identify as another gender, boys, men, gender-diverse people, people from culturally and linguistically diverse backgrounds, people from lower socioeconomic backgrounds. We think from the data we have that they’re the people who are not getting the service they deserve.

Ginger Gorman: Sam Ikin agrees that associating eating disorders with white teenage girls is particularly damaging for people like him who have a binge-eating disorder.

Sam Ikin: I think that stereotype is pretty harmful. Definitely there are lots of young wealthy white women with anorexia nervosa who need help. I mean, anorexia nervosa is the mental disorder with the highest mortality rate. I’m certainly not saying that we shouldn’t be giving that particular diagnosis a huge amount of attention and resources to help people who have it, but at the same time, that is three to four percent of the people who have eating disorders. Binge-eating disorder is easily the most prolific, and we don’t even really know how bad it is, because it’s so difficult, firstly, to get people to look for help, to say, “I have a problem.” Then there’s the stigma: this person could have binge-eating disorder, but they’ve been told that they need to do all the things we’re told you need to do to lose weight, rather than look for psychological help, look at the mental health aspect to try and get better. I think it’s hugely damaging.

There’s a lot of shame involved, like there’s so much shame involved. When you binge, there is nothing enjoyable about it. It’s not a fun kind of decadent little moment where you just let yourself go and enjoy whatever you’re doing. It’s really an uncomfortable period where you have to try and make yourself feel a certain way or try and push out some particular feeling and you’re using food to try and do that. It’s the sense of shame that prevents us from looking for help. It’s the sense of shame that keeps these cycles continuing. People don’t know that they could have a problem and therefore they don’t go and look for help.

Ginger Gorman: Sam says around 90% of people with binge-eating disorders present as overweight or obese. When they do go to their GP, they’re often given treatments that exacerbate the problem.

Sam Ikin: I’m not having a go at GPs, but usually doctors won’t suggest to somebody that perhaps you have an eating disorder if you present and you’re overweight. You’re like, “I need to do something.” They’ll tell you, “Well, why don’t you try all of these other weight loss things and you can lose weight.” I mean this is the thing. A lot of people who have these conditions aren’t lazy, but they think they are. They’ve gone to massive lengths to try and lose weight, but it comes back. Unless you treat the eating disorder, you’re not going to get anywhere.

Ginger Gorman: So what exactly is the bot and how does it work?

A/ Professor Gemma Sharp: It’s the same design for all of our tech tools. It’s called Double Diamond, that’s the technical name. But basically it just means four phases where you start from the very beginning, where you’re just going, “Hey, we think there’s a bit of an issue here. What do you think we could do to potentially solve it?” You explore the issue in depth with people who have that lived experience, whether they have it themselves directly or it’s a loved one who does. Then you go through the four stages, basically narrowing down on the solution. All of the dialogue in the chatbot was co-designed with people with lived experience. There’s not a word of it in there that didn’t go past our wonderful consortium. Then of course there’s the actual tech part of it too; our latest iteration, Jem, is built in a platform called Google Dialogflow.

Basically, it’s just a text messaging platform like you would use on your phone, but we’ve taught it how to respond to the more common queries. Of course, if someone’s really experiencing distress and putting that into the chatbot, it goes straight to the crisis line. Again, we’ve taught the chatbot to do that. I suppose it’s multifaceted. It’s definitely multidisciplinary. It has mental health professionals. It has tech experts. It has marketing, business. I don’t think there’s an area of Monash University that hasn’t been involved in this project, quite frankly. It’s been a really worthwhile experience in the end, seeing it all come together and help the people who were right there at the start, helping us come up with the ideas and the first lines of dialogue.
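To make that crisis-routing behaviour concrete, here is a minimal sketch in Python. It is not the actual Jem implementation and not real Google Dialogflow code; it simply assumes a keyword-based distress check that runs before any other matching, plus a handful of quick-reply buttons, with all keywords, replies and names invented for illustration (only the helpline numbers come from the episode’s content warning).

```python
# A minimal, illustrative sketch only – NOT the actual Jem / Google Dialogflow code.
# It assumes a simple keyword-based distress check plus a few quick-reply buttons;
# the keywords, replies and function names below are hypothetical placeholders.
# The helpline numbers are the ones given in the episode's content warning.

DISTRESS_KEYWORDS = {"hurt myself", "can't go on", "want to die", "emergency"}

CRISIS_MESSAGE = (
    "It sounds like you're going through a really hard time. Please contact the "
    "Butterfly Foundation National Helpline on 1800 334 673 or Lifeline on 13 11 14. "
    "If you are in immediate danger, call 000."
)

# Canned responses for the quick-reply buttons shown in the chat window.
QUICK_REPLIES = {
    "I'm worried about my own body image": "Thanks for telling me. Would you like some "
        "coping strategies, or information about getting support?",
    "I'm worried about someone I love": "That can be really hard. I can share some signs "
        "to look out for and ways to start the conversation.",
    "I just want some quick coping skills": "Okay! Here's a short video with a few "
        "strategies you can try right now.",
}

FALLBACK = "Tell me a bit more about what's on your mind 😊"


def respond(user_message: str) -> str:
    """Return a reply, escalating to crisis contacts if distress is detected."""
    text = user_message.lower()

    # Safety check runs before any other routing, so distress is never missed.
    if any(keyword in text for keyword in DISTRESS_KEYWORDS):
        return CRISIS_MESSAGE

    # Otherwise match a quick-reply button, falling back to a gentle prompt.
    return QUICK_REPLIES.get(user_message, FALLBACK)


if __name__ == "__main__":
    print(respond("I just want some quick coping skills"))
    print(respond("I feel like I can't go on"))
```

The one design point this sketch is meant to show, echoing Gemma’s description, is that the safety check sits ahead of ordinary dialogue matching, so a message signalling distress always routes to crisis contacts first.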

Ginger Gorman: I’ve had a look at your videos of how the chatbot works. I want to say she’s very friendly and I don’t know if that’s my own bias. I saw some of the comments by your young people saying it seemed non-binary, your chatbot, so maybe you’ve done a really good job and it’s accessible to anyone. But it was very accessible. The chatbot seemed to be friendly. It started off asking the person’s intention. What was the person concerned about? Then went into symptoms collection and research and even showing little videos. So how do people respond to that? What are they saying about its usability?

A/ Professor Gemma Sharp: You’re right that people will sometimes give it a gendered pronoun, which is absolutely fine. It’s meant to be non-binary and non-human, in fact. We went to great pains to design a character that didn’t have a body, because people experiencing eating disorders will often compare their own body to any other body available, and we didn’t want our chatbot to be that point of comparison. That’s why we did that. In terms of the dialogue and the little videos, people, at least in our testing of the latest chatbot so far, have found it very engaging. We’ve made the videos really short; the feedback was to make them TikTok style. They’re one to three minutes in length. They have infographics rather than just crusty old academics talking about eating disorders. We really want it to be an engaging and immersive experience, particularly because in person you have that wonderful dynamic between the therapist and the person seeking support, and it’s very hard to recapitulate that. But if you have engaging dialogue with emojis, with a multimedia experience like videos, it can do a reasonable job of replicating it.

Ginger Gorman: What do you say to those people who might argue that a health professional is always better and this is a risk essentially?

A/ Professor Gemma Sharp: I think we’re not trying to replace it. That’s just it. That came through really early in all of our design phases that you’ll never be able to reproduce that empathy. You’ll never be able to give those very natural responses. That is correct as the technology stands right now, as of 2023, maybe in the future, who knows? But that’s not the goal. I think it is to be a tool to support in-person help, be that first point of contact, to direct people to further services. It’s not meant to replace anyone. I would love to retire and have a chatbot take my place, by all means. There are other ways I can spend my time. That’s not going to happen right now. But the chatbot is only as good as the data it’s trained on, which comes from humans and our conversations. It’s always a step behind. However, when there is no human to help, no human to answer the helpline phone, I would rather something be there than not. There’s so much safeguarding built into all of our tech tools that it picks up on any word of distress and gives those helpline contacts.

Ginger Gorman: In some instances, there might be an advantage that it’s not a person. I know I asked one of my children at one point, did they want to call Kids Helpline? But they opted to talk to a bot instead because they didn’t want the stigma of talking to a person, which I found really interesting.

A/ Professor Gemma Sharp: I think that’s great that you had that conversation and you had those options available for your little one, because in the past that wasn’t the case. It had to be a person or nothing, and wouldn’t you rather they speak to something than nothing?

Ginger Gorman: The older you are, the harder it might be to grasp the idea that texting AI is preferable to talking to a counsellor. Then again, if you’ve spent any time with someone under 25, you might get why this suits them.

A/ Professor Gemma Sharp: We hear this quite a bit. Young people just really detest talking on the phone. There seems to be this kind of generational divide now between people who will not speak on the phone and people who actually prefer it.

Ginger Gorman: What do the young people using it say about the actual experience of it to you?

A/ Professor Gemma Sharp: They find it easy to use. They quite like how in all of our chatbots we have a lot of quick replies or buttons. Sometimes they don’t know what to ask, so it’s “thank you for giving us those kinds of prompts”, but we can also take free-text questions as well. They quite like that there’s a dynamic approach there. They like the character popping up to sort of go, “Hey, I’m still here. I’m still with you.” They like the emojis for sure. That’s something that we’ve thought a lot about, including emojis in our text messages, because that’s how people correspond anyway.

Ginger Gorman: The big question is does it work? Does it actually lead young people to get more support?

A/ Professor Gemma Sharp:  We haven’t interviewed every single person who’s used it. I would love to do that though. But certainly, from the people we have done the more in-depth interviews with it’s been a mixture. Some have said, “Actually, I got the support I needed from the bot. I didn’t need further.” But then others are like, “No, no. I rang the helpline service and then went on to in-person support.” I think it speaks to the intention the person came with to use the bot in the first place because the quick distress tolerance or quick coping skills are often all people want at that time.

Ginger Gorman: Now they’re working on adding mobile sensing technology that could look at a user’s social media activity and monitor for triggers or signs of distress. For a person at risk, particularly a young person, that would probably be better than having a parent watch over them online. The bot can also take some of the burden off an overwhelmed parent.

A/ Professor Gemma Sharp: The bot does not get exhausted. The bot always has more resources for the young person seeking support. In that way it doesn’t have that human frustration to it. We’ve been definitely hearing that from parents and carers that the more tech support, the better because it can be really tiring to support a young person experiencing these issues. A chatbot has infinite patience.

Ginger Gorman: What risks are there? Because I’ve got personal experience of someone close to me with a drastic eating disorder and sometimes the person can be at risk of death. It’s not necessarily that the person’s sitting there chatting. They might be in a very, very dire situation.

A/ Professor Gemma Sharp: Yes. They might not be speaking with their loved ones at all, quite frankly. The eating disorder is so all-consuming. Sometimes it’s hard to judge where the eating disorder starts and the personality finishes. It’s such a horrible existence. I think a chatbot cannot manage that kind of situation. Of course, we have trained our chatbot to monitor for that, or I suppose to give the advice that if any medical symptoms are occurring, go to the ER straight away. But whether the person actually does that or not, we cannot be certain. I suppose at least if the person is completely withdrawn from their family and the chatbot has said, “You should really go to hospital with these symptoms,” at least they have received that advice. I’m pretty sure their loved ones would’ve said that to them as well.

Ginger Gorman: But also sometimes it’s better not coming from a loved one because of the family dynamics. It may be better coming from an anonymous party almost.

A/ Professor Gemma Sharp: Exactly.

Ginger Gorman: While Jem, the chatbot, can’t replace in-person medical care, it can give a person information about the risks and symptoms of eating disorders – information they just might not believe if it’s coming from a family member.

A/ Professor Gemma Sharp: We have this picture of a body, and it shows where lots of things can go wrong: irregular heartbeat, periods stopping, gastrointestinal problems. You can click all around the body and see, and people often go, “Oh, I’m experiencing all of these. This is serious.” It’s one of the few moments, particularly in therapy, where people go, “This is a really big problem, because I’m having a lot of these symptoms.”

Ginger Gorman: The science is helping them understand it as opposed to a very emotional parent perhaps.

A/ Professor Gemma Sharp: Exactly. Just in my experience, and potentially there is a bias in the types of patients I see, they actually love the research and they go, “Can I please read more?” I send them more papers on these types of things. I think it can be a really strong strategy for people experiencing eating disorders.

Ginger Gorman: Well, sometimes understanding exactly what is happening to you is really useful in terms of where you go next.

A/ Professor Gemma Sharp: Indeed. They know, I hope, and this has been the case sometimes, that it takes some of the blame away. That it’s not some kind of personality defect. That this would actually happen to anyone’s body experiencing starvation symptoms in an eating disorder context.

Ginger Gorman: It’s interesting though, isn’t it? Because we think of eating disorders mainly as happening to people with small bodies and people having anorexia. But there are lots of different kinds of people that have eating disorders and they present very differently.

A/ Professor Gemma Sharp: Indeed. I’m just thinking of a diagnosis that is unfortunately going through the roof at the moment, atypical anorexia. The reason it’s called atypical is that the person is not in a small body. But because they have so heavily restricted their eating, and usually the people around them have been praising them for their drastic weight loss, they are actually having all of the same physical issues that someone underweight is having. They are at risk of, unfortunately, dying. But everyone around them is going, “Well done, you,” while those of us in eating disorders are going, “This person is in a lot of trouble.”

Ginger Gorman: Is there any research that maps body image issues or perhaps even use of the chatbot for Aboriginal and Torres Strait Islander people?

A/ Professor Gemma Sharp: We did collect that data for our pilot chatbot, and fewer than 2% of users identified as Aboriginal or Torres Strait Islander. But we know from the limited research that has been done, and there’s so much more that needs to be done, that Aboriginal and Torres Strait Islander peoples are actually more at risk of eating disorders. They need even more help. But I think, and again, we need more research, potentially it’s not on their radar. Potentially there are other issues going on that might be clouding the eating disorder symptoms. But that doesn’t mean it’s not there.

Ginger Gorman: And also I would be concerned about whether those resources were culturally sensitive or culturally appropriate.

A/ Professor Gemma Sharp: Absolutely. I mean, all of our eating disorder treatments seemingly have been designed with that younger white girl in mind who is at a low weight. But we are always encouraging more people of Aboriginal and Torres Strait Islander backgrounds to join the mental health professions. There are some fantastic clinicians out there, but we definitely need more, and a pathway for them to be able to flourish as mental health professionals.

Ginger Gorman: If we look at the history though of eating disorders, where do you think your innovation comes in? What place does it have?

A/ Professor Gemma Sharp: In my unbiased opinion, hopefully a really good place. I think it’s an interesting field. I think digital health in other areas of mental health has progressed further. If we think of anxiety and depression, mobile apps have been around for, gosh, probably a decade or more. But eating disorder-focused online tools are far more recent. I think it’s a great shame that we didn’t get into this earlier, because clearly people with eating disorders are using their phones and devices just as much as everyone else, maybe even more so. There was that opportunity for intervention earlier.

However, I think now is a great time of acceleration in this area. The fact that the government, particularly the National Health and Medical Research Council, has invested in me with two fellowships – thank you so much, NHMRC – shows an investment in this space in particular. They’re going, “Please run with this. Please help people sooner by using these online platforms.” It has been a slow start, absolutely, where we were just like, “Nope, it has to be in-person treatment. It has to be with a multidisciplinary team, with a psychologist delivering cognitive behavioural therapy, that talk therapy.” Now we’re like, “Actually, we can really do a lot more for people outside of traditional treatment settings.”

Ginger Gorman: Sam Ikin particularly likes the idea that AI takes the very human emotion of judgment out of the equation.

Sam Ikin: Absolutely. Because you’re used to bearing the judgment of other people. This is a thing that you can remove from the whole situation, so that might help a few people to look at it. Of course, then there is the stigma already there that, “Well, what if I do have it? Then I have to tell people.” That, “Oh, then I have to go and actually look for it.” That is another barrier. But I think this definitely makes it a little more accessible. I hope that is something that a lot of people start using as these things become more and more available.

Ginger Gorman: Thanks for listening to Seriously Social. This podcast is produced on Ngunnawal, Ngambri, Yuggerah, and Turrbal Land, and we pay our respects to Elders past and present. Seriously Social is produced by Kim Lester, engineered by Mark Gageldonk aka Baldy, and our executive producers are Bonnie Johnson and Clare McHugh. It’s an initiative of the Academy of the Social Sciences in Australia. Next time, trust. What happens when organisations lose our trust and how do they regain it? See you then.
