
Meredith Whittaker, president of Signal, named Future Perfect 50 finalist

New technologies can help connect us, but too often they come at a steep price: our privacy. The cost of constant interconnection is the threat of unwanted surveillance.

Meredith Whittaker knows that all too well. A little over a year ago, she became the president of Signal, which runs a messaging app with end-to-end encryption.

To appreciate how important that is, consider what can happen when someone uses another platform, like WhatsApp or Facebook Messenger, both of which are owned by Meta. This summer, a Nebraska woman pleaded guilty to helping her teenage daughter acquire pills for an abortion that is illegal in post-Roe America. How did police gather evidence against the woman? Meta turned over the family’s Facebook Messenger chat history to the cops.

Whittaker is staunchly opposed to this kind of digital surveillance, and she’s been fighting battles on many fronts to keep Signal truly private. As the global tech publication Rest of World recently explained:

Governments in China, Egypt, Cuba, Uzbekistan and, most recently, Iran have banned Signal outright. In the U.K., recently passed legislation could target messenger services and require an app like Signal to moderate harmful content such as terrorist content or child abuse imagery. To find that content, Signal would need access to user conversations, which would mean breaking the service’s end-to-end encryption. Similar bills have already been passed in India and proposed in Brazil.

No matter what any government says, Whittaker has stated that Signal will never weaken its encryption on her watch.

Whittaker’s contributions to the tech industry don’t stop with Signal, however. She has a long history with AI and a strong track record of dissenting. A particular specialty of hers is calling out fake fixes she sees in the AI world, sacred cows be damned. In a recent paper, she and her co-authors explained why “open source” AI isn’t all it’s cracked up to be. And when, in March, AI leaders (and Elon Musk) signed an open letter urging all labs to press pause on powerful AI, she withheld her signature.

Explaining why, she told the Guardian, “These are the people who could actually pause it if they wanted to. They could unplug the data centers. They could whistleblow. These are some of the most powerful people when it comes to having the levers to actually change this, so it’s a bit like the president issuing a statement saying somebody needs to issue an executive order. It’s disingenuous.”

Whittaker’s experience as an internal dissenter stretches back to her time working at Google. In 2018, she wrote an open letter urging the company not to build warfare technology for the Department of Defense. Some 3,000 employees signed and Google pulled out of the project.

Later, as a co-founder of the nonprofit AI Now Institute and senior adviser at the Federal Trade Commission, she repeatedly emphasized the social implications of AI and dared to mount what was then practically a taboo argument: Some AI just shouldn’t exist. No matter how much technologists try to make certain systems unbiased, those systems will still do too much harm.

“We need to look beyond technical fixes for social problems,” Whittaker said in 2019. “We need to ask: Who has power? Who is harmed? Who benefits? And ultimately, who gets to decide how these tools are built and which purposes they serve?”

Whittaker is asking the right moral questions and leading with integrity in a field that often seems bereft of it. We’ll all be a lot better off if more technologists and policymakers can learn to keep these questions in the forefront of their minds.

Correction, December 4, 10 am ET: Due to an editing error, a previous version of this story, originally published November 29, misstated Meredith Whittaker’s title. She is president of Signal, not CEO.

