Wikipedia Is Biased in Favor of Liberals, Study Finds

Wikipedia entries are more likely to paint public figures on the right in a negative light than those on the left, a Manhattan Institute study released Thursday found.

The study analyzed the sentiment of 1,628 words used in reference to political topics and found that Wikipedia generally applies more negative terms to right-leaning public figures and fewer to left-leaning ones. The results suggest that Wikipedia is contradicting its “neutral-point-of-view” policy, according to the study.


Commentary: OpenAI and Political Bias in Silicon Valley

AI-powered image generators were back in the news earlier this year, this time for their propensity to create historically inaccurate and ethically questionable imagery. These recent missteps reinforced that, far from being the independent thinking machines of science fiction, AI models merely mimic what they’ve seen on the web, and the heavy hand of their creators artificially steers them toward certain kinds of representations. What can we learn from how OpenAI’s image generator created a series of images about Democratic and Republican causes and voters last December?

OpenAI’s ChatGPT 4 service, with its built-in image generator DALL-E, was asked to create an image representative of the Democratic Party (shown below). Asked to explain the image and its underlying details, ChatGPT explained that the scene is set in a “bustling urban environment [that] symbolizes progress and innovation . . . cities are often seen as hubs of cultural diversity and technological advancement, aligning with the Democratic Party’s focus on forward-thinking policies and modernization.” The image, ChatGPT continued, “features a diverse group of individuals of various ages, ethnicities, and genders. This diversity represents inclusivity and unity, key values of the Democratic Party,” along with the themes of “social justice, civil rights, and addressing climate change.”


Eight Newspapers Sue OpenAI and Microsoft for Copyright Infringement

The lawsuit comes after the New York Times filed its own suit against both companies in December. Authors such as Game of Thrones creator George R. R. Martin, John Grisham, and Jodi Picoult have also sued the companies for copyright infringement.

Eight American newspapers sued OpenAI and Microsoft on Tuesday for alleged copyright infringement, claiming the companies used millions of copyrighted articles without permission to train their chatbots.


Tech Companies Plan to Combat Use of Fake AI in Elections

With fake images and videos generated by artificial intelligence (AI) threatening to play a role in the 2024 elections and beyond, several tech companies have pledged to use their resources to combat the misinformation such technology can produce.

According to Politico, multiple companies are planning to cooperate through a so-called “Tech Accord” setting out key goals and methods for fighting deceptive AI content. The companies intend to expose and debunk AI-produced “deepfake” images and videos through tactics such as watermarking and automatic detection technology.


U.S. Firms Worked Covertly with Chinese Experts to Brainstorm AI Policy: Report

Leading American artificial intelligence (AI) companies have been secretly discussing how to regulate the advanced technology with Chinese experts, The Financial Times reported on Thursday.

U.S. companies OpenAI and Anthropic have taken part in these covert diplomatic discussions, which centered on the risks of the technology, including so-called misinformation and threats to social cohesion, the FT reported. Two meetings took place in Geneva in July and October of 2023, bringing together scientists and policy experts from U.S. and Canadian AI organizations with counterparts from CCP-backed Tsinghua University and other state-supported institutions.
