AI Incident Database

Report 2012

Related Incidents

Incident 326 · 3 Reports
Facebook Automated Year-in-Review Highlights Showed Users Painful Memories

Facebook apologises over 'cruel' Year in Review clips
theguardian.com · 2014

Facebook has apologised after learning, yet again, that not everything can be done algorithmically. Some things, it seems, need the human touch.

The company’s latest blunder stems from a seemingly innocuous feature it rolls out to its users shortly before Christmas every year, called the Year in Review.

It automatically selects photos, wall posts and other content from a user’s past year, offering those that gained the most likes or comments as “highlights”. Users can then piece together a scrapbook of the past year and experience instant nostalgia.

This year, to go one step further, Facebook automatically picked one particularly well-engaged photo to present to users, under the banner: “Here’s what your year looked like!” For many users, that will have been a happy memory, such as a graduation, wedding, or the birth of a child.

But for some users, the algorithm forced painful memories back to the surface.

Web designer Eric Meyer wrote on his blog that Facebook had shown him a picture of his daughter, Rebecca, who died in 2014.

“Yes, my year looked like that. True enough. My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully,” he wrote in a post titled “Inadvertent Algorithmic Cruelty”.

“And I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.”

In a follow-up post, Meyer said the product manager for Year in Review, Jonathan Gheller, had personally apologised to him for the blunder. The Washington Post said Gheller described the app as “awesome for a lot of people, but clearly in this case we brought him [Meyer] grief rather than joy … We can do better — I’m very grateful he took the time in his grief to write the blog post.”

The feature has already been tweaked following feedback: it initially ended the slideshow with the words “It’s been a great year! Thanks for being a part of it.” It now uses the more neutral language “See you next year!”

Writer Julieanne Smolinski shared another story: her ex-boyfriend’s year in review, which framed a picture of his house on fire.

And many other users had similar experiences:

I'm so glad that Facebook made my 'Year in Review' image a picture of my now dead dog. I totally wanted to sob uncontrollably this Xmas Eve.

— Sarah-Jane (@isloveSJ) December 24, 2014

Facebook "year in review" thing is kind of awful as it chose 2 pictures of my dogs that died this year & uses poor graphic design elements.

— Travis Louie (@travislouie) December 27, 2014

Won't be sharing my Facebook Year in Review, which "highlights" a post on a friend's death in May despite words like "killed" and "sad day"

— Andrew Katz (@katz) December 29, 2014

As for Meyer, he suggests two things Facebook, and firms like it, can do to avoid this sort of inadvertent cruelty. “First, don’t pre-fill a picture until you’re sure the user actually wants to see pictures from their year. And second, instead of pushing the app at people, maybe ask them if they’d like to try a preview—just a simple yes or no. If they say no, ask if they want to be asked again later, or never again. And then, of course, honour their choices.

“It may not be possible to reliably pre-detect whether a person wants to see their year in review, but it’s not at all hard to ask politely—empathetically—if it’s something they want. That’s an easily-solvable problem. Had the app been designed with worst-case scenarios in mind, it probably would have been.”
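Meyer’s two suggestions amount to a small consent flow: never pre-fill content, offer a simple yes/no preview prompt, and persist a “later” or “never” answer. A minimal sketch of that logic, using a hypothetical preference dictionary (the field names and values here are illustrative assumptions, not Facebook’s actual implementation):

```python
def record_choice(prefs: dict, wants_review: bool, ask_again: str = "later") -> dict:
    """Persist the user's answer to a polite opt-in prompt.

    wants_review: the user's yes/no to "Would you like a preview?"
    ask_again: if they declined, "later" or "never" (hypothetical values).
    Returns a new prefs dict so the stored choice can be honoured.
    """
    updated = dict(prefs)
    if wants_review:
        updated["year_in_review"] = "yes"
    elif ask_again == "never":
        updated["year_in_review"] = "never"   # honour "never ask again"
    else:
        updated["year_in_review"] = "ask_later"
    return updated


def should_show_review(prefs: dict) -> bool:
    """Only pre-fill pictures once the user has explicitly opted in."""
    return prefs.get("year_in_review") == "yes"
```

The key design point is the default: `should_show_review` returns `False` for an empty preference record, so the worst-case user is never shown an unrequested memory.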

