- According to a 2017 study, 71% of people reported feeling better (rediscovery of self and positive emotions) about 11 weeks after a breakup. But social media complicates this healing process.
- Even if you "unfriend", block, or unfollow, social media algorithms can create upsetting encounters with your ex-partner or reminders of the relationship that once was.
- Researchers at the University of Colorado Boulder suggest that a "human-centered approach" to creating algorithms can help the system better understand the complex social interactions we have with people online and prevent potentially upsetting encounters.
Social media complicates the natural healing process of breakups
According to a 2017 study published in the Journal of Positive Psychology, most people are able to heal from a breakup within about three months of the relationship ending.
This study examined 155 participants who had gone through a breakup within the previous six months. The sample included relationships of varying durations, and both people who had been broken up with and people who had initiated the breakup.
71% of people in this study described feeling better (reporting rediscovery of self and more positive emotions) around 11 weeks after the relationship had ended.
“Offline, breakups can range from awkward to awful, inspiring a gamut of emotions for former partners and people in their networks. Typically these feelings fade with time and distance as ex-partners grow apart emotionally and physically…”
Social media complicates this process, according to a 2019 study conducted by a team in the Department of Information Science at the University of Colorado Boulder.
Social media can make grieving the end of a relationship even more difficult, which is why many people unfriend, unfollow and even block their ex-partners to gain some sense of control and erase any reminder of their lost love.
However, according to that study, even if you unfollow, unfriend and block your ex-partner, social media platforms are still likely to serve you reminders of the relationship through their algorithms.
Figure 1 from 2019 study on Facebook algorithms
Even if you “unfollow” and block, social media algorithms can make breaking up even more painful
This study investigated the unexpected encounters people face with social media content (relating to an ex romantic partner or relationship that has ended) as a direct result of that platform’s curation algorithm.
Through 3 sets of interviews conducted with 19 adult Facebook account holders (within the United States), the team characterized the kinds of social media encounters participants in the study had experienced and how that experience affected their ability to heal from the breakup.
The participants of this study varied in age and sexual orientation, and the length of their romantic relationships also varied (this data can be found in Table 1 of this document):
- Participants ranged in age from 18-46 (with a median age of 30.56)
- Participants included 12 females and 7 males
- Relationship duration varied from 2 months to 15 years
- Relationship statuses (while together) varied from dating to cohabitating to married
- Sexual orientations of the participants varied from straight to bisexual to lesbian
The “time since encounter” (of the unexpected social media encounters) ranged from ongoing to over 2 years ago. Each participant of this study self-identified as having experienced an unexpected and upsetting experience with content about an ex-partner on Facebook.
According to this study, there are three places on Facebook where “upsetting algorithmic encounters” frequently happen:
- News Feed – which, according to Facebook, shows you “stories that matter most to you,” based on metrics such as the type of content you post and the interactions you have with other posts.
- “On this Day” or “Memories” – a place where pictures or interactions with posts are shown to you as happening “a year ago today” or “five years ago today.”
- Shared Spaces and Friend Suggestions – where upsetting encounters can happen through mutual friends’ activity, such as seeing a blocked person’s comment on a post by a friend of yours.
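To see why blocking alone doesn’t prevent these encounters, here is a minimal sketch – a hypothetical illustration, not Facebook’s actual ranking system – of an engagement-only feed ranker. It scores posts purely on popularity and recency, so a mutual friend’s post tagging a blocked ex-partner still rises to the top, because nothing in the scoring ever asks who appears in the post:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    likes: int
    hours_old: float
    tagged: list = field(default_factory=list)  # people appearing in the post

def engagement_score(post: Post) -> float:
    # Score rises with likes and decays with age; nothing here asks
    # *who* is in the post or how the viewer relates to them.
    return post.likes / (1.0 + post.hours_old)

def rank_feed(posts: list) -> list:
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post(author="mutual_friend", likes=120, hours_old=2.0, tagged=["ex_partner"]),
    Post(author="coworker", likes=15, hours_old=1.0),
]

# The popular post tagging the ex-partner wins the top slot --
# the ranker never consults the viewer's block list.
top = rank_feed(feed)[0]
```

This is the structural gap the study describes: the viewer’s social context (a block, a breakup) lives outside the signals the ranker optimizes.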
Who is at fault for these upsetting encounters?
In one instance, person 15 (as participants are labeled in the study) indicated that she had blocked her ex-husband, their mutual friends, and his family. Even so, she still encountered an upsetting “friend suggestion” on the sidebar of her Facebook screen.
“Around the time of the divorce, I was getting ‘people you may know’ suggestions of his [new] girlfriend’s relatives, which was bizarre…”
Not only was person 15 upset with these friend recommendations, but she was also very confused: she assumed unfriending her ex-partner, as well as any mutual friends they had, would create enough “virtual distance” between her and her ex-partner that the system would no longer recommend overlapping connections between the two of them.
Across the range of these interviews, some of the participants did blame themselves for not changing their privacy settings or maintaining their social media to help avoid these encounters.
A minority of people in the study held others accountable – for example, blaming an ex-partner for “not deleting photos with the two of us in it.”
However, most of the participants held the social media platform accountable.
“I clicked the Facebook app and at the top, the very top item of my News Feed is ‘so and so is in a relationship with someone else’ and I’m like, ‘why are you putting that at the top of my feed?’” – person 9 in the study.
The problem is clear…is the solution also clear?
The real problem with the algorithms on social media platforms, according to the study, is that these systems do not understand the (at times, quite complex) social context of the data they are processing.
The unpredictable outcomes of these algorithms can cause extremely upsetting experiences for social media users.
Going beyond the scope of breakups for a moment, consider how traumatic it was for Eric Meyer to see his deceased daughter in Facebook’s “Year in Review” video – an experience he describes in his article about inadvertent algorithmic cruelty:
“I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for that.”
“Yes, my year looked like that” explained Meyer in his emotional article, “true enough. My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully.”
This is just one instance of the potentially devastating effects of social media algorithms that take nothing into account beyond how many “likes” a photo received or whether you are connected to someone through a friend of a friend.
The solution: human-centered algorithms
The algorithm is made to simply show you “a friend of a friend” in the “mutual friends” section – not knowing that this “friend of a friend” just happens to be your ex-boyfriend or girlfriend’s new partner. Or in the case of Eric Meyer, the algorithm showed his most “liked” photo, which happened to be of his daughter before her passing earlier that year.
This can create a very triggering response, as you can imagine. But is there a solution to this? The research team suggests that “human-centered approaches” to algorithms could help.
A simplistic approach might also cut off online interactions that people do value. Instead, the study suggests there are signals social media algorithms could take into account to detect potentially upsetting triggers and redesign how these encounters occur.
An example given in the study: for a Facebook event that both you and your ex-partner are attending, the algorithm could choose how (and when) to make your ex-partner’s interactions with that event visible to you.
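One way to picture such a human-centered layer is as a context-aware filtering pass before items reach the feed. The sketch below is an illustrative assumption, not the study’s or any platform’s implementation: it checks each candidate item against accounts the user has marked as emotionally sensitive, suppressing content that those accounts authored or appear in – including indirect reminders surfaced through mutual friends:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    author: str
    tagged: list = field(default_factory=list)  # people appearing in the item

def human_centered_filter(items: list, distanced: list) -> list:
    """Drop items that originate from, or depict, 'distanced' accounts.

    'distanced' is a hypothetical user-declared list of accounts the
    person wants virtual distance from (e.g. an ex-partner).
    """
    distanced_set = set(distanced)
    return [
        item for item in items
        if item.author not in distanced_set
        and distanced_set.isdisjoint(item.tagged)
    ]

candidates = [
    Item(author="mutual_friend", tagged=["ex_partner"]),  # indirect reminder
    Item(author="ex_partner"),                            # direct content
    Item(author="coworker"),                              # unrelated
]

safe_feed = human_centered_filter(candidates, distanced=["ex_partner"])
```

The design point is that the filter consults the user’s declared social context rather than engagement metrics, which is exactly the information today’s curation algorithms ignore.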
“As the work of content curation on social media continues to shift from people to algorithms, understanding how people experience what those algorithms make visible is critical to the design of human-centered systems, especially when the results are upsetting or harmful.”