Monday, October 25, 2021

Facebook shares under pressure after whistleblower documents released

 https://www.cnbc.com/2021/10/25/facebook-whistleblower-documents-released-shares-under-pressure.html


KEY POINTS
  • The Facebook Papers include stories from 17 U.S. news outlets with access to internal documents provided by former employee Frances Haugen.
  • The documents shed light on Facebook’s handling of Jan. 6 and hate speech in languages outside of English.
  • A Facebook spokesperson said the company does not put profits over people’s well-being.

The Facebook Papers, a series of articles published by a consortium of 17 U.S. news outlets beginning Friday, shed new light on the company’s thinking behind its actions leading up to the Capitol insurrection on Jan. 6 and its ability to fend off hate speech in languages outside of English.

Facebook shares were slightly negative in early trading Monday after the news outlets published their stories based on the leaked documents. The company is also scheduled to report quarterly earnings after markets close Monday.

The documents were provided to the news outlets by Frances Haugen, a former Facebook employee who took tens of thousands of pages of internal research with her before she left. She’s since provided those documents to Congress and the Securities and Exchange Commission, seeking whistleblower status.


“At the heart of these stories is a premise which is false,” a Facebook spokesperson said in a statement in response to the flood of reporting. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

Here are some of the major themes the Facebook Papers have explored so far:

Jan. 6

The documents revealed frustration among Facebook’s ranks about the company’s ability to get the spread of content that potentially incites violence under control.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” an employee wrote on an internal message board during the riot outside the U.S. Capitol on Jan. 6, according to The Associated Press. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

Facebook had put additional emergency measures in place ahead of the 2020 election to stem the spread of violent or dangerous content if needed. But as many as 22 of those measures were set aside after the election and before Jan. 6, internal documents reviewed by AP showed.

A Facebook spokesperson told the outlet its use of those measures followed signals from its own platform and law enforcement.

Language barriers

Some of the reports showed how Facebook’s content moderation systems can fall flat when faced with languages besides English.

AP reported that Arabic poses a particularly difficult challenge for content moderators. Arabic-speaking users have learned to use symbols or extra spaces in words thought to set off flags in Facebook’s systems, like the names of militant groups.

While some users adopt these methods to get around an overzealous content moderation system, AP reported that the same tactics have allowed hateful content to slip past Facebook’s hate speech censors.

“We were incorrectly enforcing counterterrorism content in Arabic,” an internal Facebook document said, according to AP. Meanwhile, it said, the system “limits users from participating in political speech, impeding their right to freedom of expression.”

Facebook told AP it’s put more resources into recruiting local dialect and topic experts, and has researched ways to improve its systems.

India

Other reports show that some Facebook employees were dismayed by the company’s handling of misinformation in India, believing leadership made decisions to avoid angering the Indian government.

Hate speech concerns in the region were amplified by similar language barrier issues as in the Middle East. According to AP, Facebook added hate speech classifiers in Hindi and Bengali in 2018 and 2020, respectively.

One researcher who set up an account as a user in India in 2019 found that by following Facebook’s algorithm recommendations, they saw “more images of dead people in the past three weeks than I’ve seen in my entire life total,” in the News Feed, according to The New York Times.

A Facebook spokesperson told the Times that hate speech against marginalized groups in India and elsewhere has been growing, and it’s “committed to updating our policies as hate speech evolves online.”

Retaining users

Other reports showed the existential issues facing the company if it failed to hold onto enough young users.

The platform is already experiencing a dip in engagement among teens, The Verge reported based on the internal documents.

“Most young adults perceive Facebook as a place for people in their 40s and 50s,” a March presentation from a team of data scientists said, according to The Verge. “Young adults perceive content as boring, misleading, and negative. They often have to get past irrelevant content to get to what matters.”

The documents showed that Facebook plans to test several ideas to increase teen engagement, like asking young users to update their connections and tweaking the News Feed algorithm to show users posts from outside their own network.

A Facebook spokesperson told The Verge that the platform is “no different” from any social media site that wants teens to use its services.

Competition

Facebook has spent the past few years fighting the label of a monopoly, which many lawmakers and academics say is appropriate for a platform of its scale.

But among its ranks, Facebook employees acknowledge the vast power of the platform with details that could fuel ongoing and future antitrust lawsuits. The FTC recently filed an amended complaint alleging Facebook illegally maintained monopoly power in personal social networking services after a judge threw out its initial claims.

According to a report from Politico, 78% of American adults and nearly all teens in the U.S. use Facebook’s services. Even though competitors like TikTok and Snap have made progress with teen users, Facebook and Instagram continue to maintain a stronghold on activities like connecting with others on common interests and sharing photos and videos, according to a survey of users last year.

And once they sign up, few actually leave the platforms, Facebook’s own research reportedly shows.

In a 2018 presentation reviewed by Politico, employees wrote that despite “Facebook-the-company” doing only “okay” with teens around the world, “we do have one of the top social products — with growing market share — almost everywhere.”

Facebook spokesperson Christopher Sgro told Politico that, “Far from supporting the government’s case, the documents presented to Facebook firmly reinforce what Facebook has always said: We compete with a broad range of services for people’s time and attention, including apps that offer social, community, video, news and messaging features.”


This story is developing. Check back for updates.