Tuesday, June 15, 2021

Fearful of bias, Google blocks gender-based pronouns from new AI tool



SAN FRANCISCO (Reuters) – In May, Alphabet Inc's (GOOGL.O) Google introduced a Gmail feature that automatically completes sentences as users type. Tap out "I love" and Gmail might propose "you" or "it."

FILE PHOTO: The Google name is displayed outside the company's office in London, Britain, November 1, 2018. REUTERS/Toby Melville/File Photo

But users are out of luck if the object of their affection is "him" or "her."

Google's technology will not suggest gender-based pronouns because the risk is too high that its "Smart Compose" feature might predict someone's sex or gender identity incorrectly and offend users, product leaders revealed in interviews.

Gmail product manager Paul Lambert said a company research scientist discovered the problem in January when he typed "I am meeting an investor next week," and Smart Compose suggested the follow-up question "Do you want to meet him?" instead of "her."

Consumers have become accustomed to embarrassing gaffes from autocorrect on smartphones. But Google declined to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential bias in artificial intelligence as never before.

"Not all" converters "are the same," said Lambert. The genre "big, big thing" is getting worse.

Getting Smart Compose right could be good for business. Demonstrating that Google understands the nuances of AI better than its competitors is part of the company's strategy to build affinity for its brand and attract customers to its AI-powered cloud computing tools, advertising services and hardware.

Gmail has 1.5 billion users, and Lambert said Smart Compose assists on 11 percent of messages sent from Gmail.com.

Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages.

A system shown billions of human sentences becomes adept at completing common phrases but is limited to generalities. Because men have long dominated fields such as finance and science, for example, the technology would conclude from the data that an "investor" or "engineer" is "he" or "him." The issue trips up nearly every major technology company.
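The mechanism is easy to see in miniature. Below is a hypothetical sketch (not Google's code) of the simplest kind of next-word predictor, a bigram model trained on a deliberately skewed toy corpus; because male pronouns follow "said" more often in the training data, the model proposes the male pronoun.

```python
from collections import Counter, defaultdict

# Illustrative toy corpus standing in for the billions of sentences a real
# NLG system trains on. The skew is deliberate: male pronouns dominate.
corpus = [
    "the investor said he would return",
    "the investor said he was pleased",
    "the investor said she would invest",
    "the engineer said he fixed it",
]

# Count which word follows each word (a simple bigram model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def suggest(prev_word):
    """Propose the continuation seen most often in the training data."""
    counts = follows[prev_word]
    return counts.most_common(1)[0][0] if counts else None

# "he" follows "said" three times vs. once for "she", so the model
# reproduces the imbalance in its data.
print(suggest("said"))  # -> he
```

Production systems use far richer models than bigram counts, but the failure mode is the same: the suggestion simply mirrors whatever imbalance the training text contains.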

Lambert said the Smart Compose team of about 15 engineers and designers tried several workarounds, but none proved bias-free or worthwhile. They decided the best solution was also the strictest: limit coverage. The ban on gendered pronouns affects fewer than 1 percent of cases where Smart Compose would otherwise propose something, Lambert said.
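In spirit, limiting coverage amounts to a filter between the model and the user: any suggestion containing a gendered pronoun is suppressed rather than risked. A minimal sketch, assuming a simple word-list check (the function and list here are illustrative, not Google's implementation):

```python
# Hypothetical coverage-limiting filter: drop any suggestion that
# contains a gendered pronoun instead of guessing and possibly offending.
GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestion(suggestion):
    """Return the suggestion only if it is free of gendered pronouns."""
    words = {w.strip(".,?!").lower() for w in suggestion.split()}
    if words & GENDERED_PRONOUNS:
        return None  # suppress rather than risk misgendering the subject
    return suggestion

print(filter_suggestion("Do you want to meet him?"))  # -> None
print(filter_suggestion("Sounds good"))               # -> Sounds good
```

The trade-off matches the article's "less than 1 percent" figure: the filter costs a small fraction of otherwise-valid suggestions in exchange for never guessing a pronoun wrong.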

"The only trusted technique we have had is to be conservative," said Prabhakar Raghavan, Gmail's engineering and other services to the final promotion.

NEW POLICY

Google's decision to play it safe on gender follows some high-profile embarrassments for the company's predictive technologies.

The company apologized in 2015 when the image-recognition feature of its photo service labeled a black couple as gorillas. In 2016, Google altered its search engine's autocomplete function after it suggested an anti-Semitic query when users sought information about Jews.

Google has banned expletives and racial slurs from its predictive technologies, as well as mentions of its business rivals or tragic events.

The new policy banning gendered pronouns also affected Google's Smart Reply, a service that lets users respond instantly to text messages and emails with short phrases such as "sounds good."

Google uses tests developed by its AI ethics team to uncover new biases. A spam and abuse team pokes at its systems, trying to find "juicy" gaffes the way hackers or journalists might, Lambert said.

Workers outside the United States watch for local cultural issues. Smart Compose will soon work in four other languages: Spanish, Portuguese, Italian and French.

"You need a great human oversight," said Raghavan's chief engineer, "in each language, the development of inadequate is something different."

WIDESPREAD CHALLENGE

Google is not the only technology company wrestling with the gender-based pronoun problem.

Agolo, a New York startup that has received investment from Thomson Reuters, uses AI to summarize business documents.

Its technology cannot reliably determine in some documents which name a pronoun refers to. So its summaries pull in several sentences to give users more context, said Mohamed AlTantawy, Agolo's chief technology officer.

He said longer copy beats missing details. "The smallest mistakes will make people lose confidence," AlTantawy said. "People want 100 percent correct."

Yet imperfections remain. Predictive keyboard tools developed by Google and Apple Inc (AAPL.O) propose the gendered "policeman" to complete "police" and "salesman" for "sales."

Type the gender-neutral Turkish phrase for "one is a soldier" into Google Translate and it spits out "he's a soldier" in English. So do translation tools from Alibaba (BABA.N) and Microsoft Corp (MSFT.O). Amazon.com Inc (AMZN.O) opts for "she" for the same phrase on its translation service for cloud computing customers.

AI experts have called on the companies to display a disclaimer and multiple possible translations.

Microsoft's LinkedIn said it avoids gendered pronouns in its year-old predictive messaging tool, Smart Replies, to ward off potential blunders.

Alibaba and Amazon did not respond to requests for comment.

Warnings and limitations like those in Smart Compose remain the most-used countermeasures in complex systems, said John Hegele, integration engineer at Automated Insights Inc, a Durham, North Carolina-based company that generates news articles from statistics.

"The end goal is a machine-generated system, which magically writes," said Hegel. "It's not a ton of progress, but we're not there yet."

Reporting by Paresh Dave; Editing by Greg Mitchell and Marla Dickerson

Our Standards: The Thomson Reuters Trust Principles.
