Quick Telecast
Expect News First

Google plans use of AI called BERT to cut racy search results



OAKLAND, CALIF. —
When U.S. actress Natalie Morales carried out a Google search for “Latina teen” in 2019, she said in a tweet that all she encountered was pornography.

Her experience may be different now.

The Alphabet Inc unit has cut explicit results by 30 per cent over the past year in searches for “latina teenager” and others related to ethnicity, sexual preference and gender, Tulsee Doshi, head of product for Google’s responsible AI team, told Reuters on Wednesday.

Doshi said Google had rolled out new artificial intelligence software, known as BERT, to better interpret when someone was seeking racy results or more general ones.

Besides “latina teenager,” other queries now showing different results include “la chef lesbienne,” “college dorm room,” “latina yoga instructor” and “lesbienne bus,” according to Google.

“It’s all been a set of over-sexualized results,” Doshi said, adding that those historically suggestive search results were potentially shocking to many users.

Morales did not immediately respond to a request for comment through a representative. Her 2019 tweet said she had been seeking images for a presentation, and had noticed a contrast in results for “teen” by itself, which she described as “all the normal teenager stuff,” and called on Google to investigate.

The search giant has spent years addressing feedback about offensive content in its advertising tools and in results from searches for “hot” and “ceo.” It also cut sexualized results for “Black girls” after a 2013 journal article by author Safiya Noble raised concerns about the harmful representations.

Google on Wednesday added that in the coming weeks it would use AI called MUM to better detect when to show support resources related to suicide, domestic violence, sexual assault and substance abuse.

MUM should recognize “Sydney suicide hot spots” as a query for jumping locations, not travel, and aid with longer questions, including “why did he attack me when i said i dont love him” and “most common ways suicide is completed,” Google said.

(Reporting by Paresh Dave; Editing by Karishma Singh)
