News and notes

US Supreme Court seal
The United States Supreme Court is hearing two cases that will be crucial for Wikimedia

Cautious optimism from "the dolphin inadvertently caught in the net" at US Supreme Court hearings "crucial for Wikimedia"

For prior Signpost coverage, see Section 230 report (February 2023)

On the Wikimedia-l mailing list, two members of the Wikimedia Foundation's "Global Advocacy" team drew attention to important hearings happening this week at the United States Supreme Court:

"The hearings on two cases that will be crucial for Wikimedia have just started: NetChoice, LLC v. Paxton and Moody v. NetChoice, LLC. Both cases are challenges to state laws in Texas and Florida, which impact content moderation on social media websites. [...] As they are written, these laws prohibit website operators from banning users or removing speech and would generally risk Wikipedia's volunteer-led systems of content moderation. That's because these laws were designed to prevent social media platforms from engaging in politically motivated content moderation, but were drafted so broadly that they would also impact Wikipedia. The case is also important beyond the impact it might have on our projects. It represents a scenario that is part of a trend globally, where governments introduce legislation to address harms from big tech actors, yet Wikimedia ends up as the dolphin inadvertently caught in the net."

The Foundation has previously weighed in on these cases with an amicus brief and several blog posts, and is present at the current hearings "in person talking to stakeholders and observing the proceedings. We expect the Court to rule this year and will be providing updates as we know more."

Asked about the worst-case scenario (from a Wikimedia perspective), Stan Adams of the Global Advocacy team elaborated:

"Perhaps the worst long-term outcome would be if several other states or even the US Congress replicated the Texas or Florida laws. If those laws were enforced against Wikipedia editors or the Foundation – say, for editors' regular work of removing content that is inaccurate, unsourced, or that violates NPOV policies – it could become increasingly difficult to operate and maintain Wikipedia."

However,

"based on what I observed at the Court yesterday [February 26, mentioning comments by justice Brett Kavanaugh in particular], I think most of the Justices would be reluctant to uphold the Texas and Florida laws. That said, these cases won't be the end of legislative attempts to regulate social media and other venues for expression online – I expect to see the Court considering more cases like these as states continue to enact laws that raise First Amendment questions in the online context."

– H

U4C Charter vote

The Wikimedia Foundation announced on Meta-Wiki that:

A vote to ratify the charter for the Universal Code of Conduct Coordinating Committee (U4C) was held from 19 January until 2 February 2024 via SecurePoll. Voting is now closed. Thank you to all who voted. The result was 1249 voters in support and 420 voters opposed. 69 voters did not choose an option. Voter statistics and a summary of voter comments will be published soon.

You can find more information on the U4C's purpose and scope here. – AK

2024 Requests for adminship review

Cleanup on aisle 24...

Are more changes afoot for the Requests for adminship process? The following proposals from Phase I remain open (some others have already been closed as unsuccessful).

  • Proposal 2: Add a reminder of civility norms at RfA
  • Proposal 3: Add three days of discussion before voting (trial)
  • Proposal 3b: Make the first two days discussion-only (trial)
  • Proposal 4: Prohibit threaded discussion (trial)
  • Proposal 5: Add option for header to support limited-time adminship (trial)
  • Proposal 6: Provisional adminship via sortition
  • Proposal 6b: Trial adminship
  • Proposal 7: Threaded General Comments
  • Proposal 8: Straight vote (trial)
  • Proposal 10: Unbundling 90% of blocks
  • Proposal 12: Abolish the discretionary zone and crat chats
  • Proposal 12b: Abolish crat chats and allow discretionary relisting
  • Proposal 13: Admin elections
  • Proposal 14: Suffrage requirements
  • Proposal 16: Allow the community to initiate recall RfAs
  • Proposal 16b: Require a reconfirmation RfA after X years
  • Proposal 16c: Community recall process based on dewiki
  • Proposal 16d: Community recall process initiated by consensus
  • Proposal 17: Have named Admins/crats to monitor infractions
  • Proposal 18: Normalize the RfB consensus requirements
  • Proposal 20: Make RFA an internal non public process
  • Proposal 21: Reduce threshold of consensus at RfA
  • Proposal 22: Change the name from RFA to "Nominations For Adminship"

Phase I is still open, and you may weigh in with your thoughts here: Wikipedia:Requests for adminship/2024 review. – B

Stylized black, white, and blue picture of a human brain made up of PCB-type connections
AI is changing the way people use the Internet ...

WMF publishes draft "research agenda on the implications of artificial intelligence (AI) for the knowledge commons"

From February 19 to February 23, 2024, "a group of 21 Wikimedians, academics, and practitioners" met at the Rockefeller Foundation's Bellagio Center in Northern Italy "to draft an initial research agenda on the implications of artificial intelligence (AI) for the knowledge commons." The aim is "to focus attention (and therefore resources) on the vital questions volunteer contributors have raised, including the promise, as well as risks and negative impacts, of AI systems on the open Internet." The agenda is available on Meta-Wiki, together with a brief report on the meeting.

Members of the "Wikimedia AI" Telegram group expressed surprise at hearing about this effort first from organizations outside the Wikimedia movement, and at the fact that the term "open source" is not mentioned in the document (despite open-source AI being an important topic of current debate, and the WMF's general commitment to the use of open source software). While the announcement appears to speak on behalf of "volunteer contributors", the "Wikimedians" involved in drafting the document appear, according to the attendee list, to have been exclusively Wikimedia Foundation staff (largely from its Research department). Wikimedia Foundation CEO Maryana Iskander subsequently clarified that this "effort to contribute to a shared research agenda on AI [...] was created by a small group working in the open who rushed to publish a 'bad first draft' that will benefit from more input."

In other AI-related news, the Wikimedia Foundation recently received a $2.2 million grant from the Sloan Foundation (a longtime supporter) for the purpose of "leverag[ing] AI for the benefit of Wikipedia's readers and contributors, including tools to address vandalism" over the next three years. (These funds come on top of a $950,000 grant announced in April 2023 by WMF's own Wikimedia Endowment for "building and strengthening AI and machine learning infrastructure on Wikipedia and Wikimedia projects", similarly highlighting "the development of algorithms to measure the quality of Wikipedia articles and machine learning models that help catch incidents of vandalism on Wikimedia projects.")

– AK, H


Brief notes

Group photo from the EduWiki Conference 2023 in Belgrade, Serbia