In the Era of AI and Web3, We Need Deep Authenticity

Intermediate · Jun 28, 2024
This article explores the impact of current AI developments on the digital world and introduces the concept of deep authenticity. It suggests leveraging the characteristics of blockchain technology to address issues of falsification, thereby meeting humanity's need for authentic information.

Original title: ‘Deep reals’

TL;DR

It’s never been more difficult to tell whether what we see online is real or fake. AI bots are being paid to leave fake Reddit comments, Drake resurrected Tupac in a rap verse, and Morgan Freeman is still not Morgan Freeman. Tony Blinken playing guitar in Kyiv turned out to be real, though.

Digital forgeries are not new; they have been around as long as the internet has. Plain old forgeries have been around even longer. Petroglyphs, ancient pottery, and stained glass windows have all been used as media to misrepresent the truth or cover it up entirely. And for as long as forgeries have existed, it has been practically impossible to tell whether something is real or fake with absolute certainty. The same will probably go for generative AI.

That’s why we’ve been thinking about something called ‘deep reals.’ A deep real is an inversion of a deep fake. Instead of assuming everything is real and that we only need to demonstrate something’s fakeness, it assumes everything is fake and that we need to sufficiently demonstrate something’s realness.

Historically, we determined realness by turning to and trusting in institutions. But this model broke down in an age of pervasive media and information overload. The flaws and biases within these institutions were exposed, leading to a profound crisis of trust. Without a centralized authority to rely on, we now need to demonstrate the realness (or lack thereof) of an object in a manner that embraces a decentralized architecture.

Another way to say this is that we believe deep reals should not so much be “trustworthy” as they should be “trustless.” Instead of placing trust in a single authority, trust should be decentralized across a network of digital signatures, cryptographic algorithms, community notes, and immutable blockchain technology. This has the benefits of avoiding a single point of failure, enhancing transparency, and giving individuals more control over the rules by which they’re judged. AI and web3 are two sides of the same coin, and it’s web3 that has the answer to the AI attribution problem.
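
To make the trustless model concrete, here is a minimal sketch of our own (not any particular protocol’s implementation) showing how anyone holding a creator’s public key can check that a piece of media carries her valid signature, with no central authority in the loop. It assumes Python with the `cryptography` package and uses placeholder bytes in place of a real photo.

```python
# Minimal sketch: "trustless" verification of a piece of media via a digital
# signature. Anyone with the creator's public key can check authenticity;
# no central authority is consulted. Assumes the `cryptography` package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Creator side: generate an identity keypair and sign the raw media bytes.
creator_key = Ed25519PrivateKey.generate()
creator_pub = creator_key.public_key()

photo_bytes = b"<raw image bytes>"  # placeholder for a real photo
signature = creator_key.sign(photo_bytes)


# Consumer side: verify the signature against the creator's published key.
def is_authentic(content: bytes, sig: bytes, pub: Ed25519PublicKey) -> bool:
    """Return True only if `sig` is a valid signature over `content` by `pub`."""
    try:
        pub.verify(sig, content)
        return True
    except InvalidSignature:
        return False


print(is_authentic(photo_bytes, signature, creator_pub))                 # True
print(is_authentic(photo_bytes + b" tampered", signature, creator_pub))  # False
```

The point is the shape of the trust model: the verifier needs only the content, the signature, and a public key, and that key can be published anywhere, including on-chain.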

How will this work? Andy said it best when describing our investment in the Mediachain protocol back in 2015:

“The protocol allows anyone to attach information to creative works, make it persistent and discoverable in a blockchain-based database. … The data is maintained by participants of the network and no permission is required to contribute or access it, making it an ideal place for collaboration between creators, developers, platforms, and media organizations. It is applicable to any form of media – images, gifs, videos, written works, and also music.”

For a long time, USV has believed that the contextual information of the media we consume should be made more open. A creator should be given the choice to attach her identity to a photo she took, where she took it, and when. And her audience should be able to send a micropayment back to her to thank her for her work. But protocol-based media has continued to feel more like a vitamin than a painkiller.
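
As a rough illustration of what such open contextual information could look like (a hypothetical record format, not Mediachain’s actual schema), a creator might bundle a hash of the photo with her identity, capture time, and location, then sign the whole record before publishing it to an open database:

```python
# Hedged sketch of a signed provenance record a creator might attach to a
# photo: a content hash plus who took it, where, and when. The field names
# and identifier format are illustrative, not a real protocol's schema.
import hashlib
import json
from datetime import datetime, timezone

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

photo_bytes = b"<raw image bytes>"          # placeholder content
creator_key = Ed25519PrivateKey.generate()  # in practice, a long-lived identity key

record = {
    "content_sha256": hashlib.sha256(photo_bytes).hexdigest(),
    "creator": "did:example:alice",                    # hypothetical identifier
    "captured_at": datetime.now(timezone.utc).isoformat(),
    "location": {"lat": 50.4501, "lon": 30.5234},      # illustrative coordinates
}

# Canonical serialization so that the signer and any later verifier hash the
# exact same bytes before checking the signature.
payload = json.dumps(record, sort_keys=True, separators=(",", ":")).encode()
record_signature = creator_key.sign(payload)

# The signed record could then be published to an open, permissionless
# database; anyone can later re-verify it against the creator's public key.
print(json.dumps(record, indent=2))
print(record_signature.hex())
```

A payment address could be attached to the same record, which is one natural place for the micropayment piece to plug in.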

Not anymore. 2024 will be the biggest election year in history. Countries representing more than half the world’s population – that’s 4 billion people – will send their citizens to the polls this year. A system that gives users more information about the media they’re consuming is badly needed. And not just in politics, but in spaces like dating apps, second-hand fashion marketplaces, or even vacation rentals.

But it would be a mistake to only see deep reals as tools for fighting disinformation. We think they could become a new media primitive in their own right. One that is just as fun and addictive as the reels you see on IG and TikTok. Don’t get me wrong, AI-generated media is magic, but we believe there will always be a craving for content that is authentic, human-generated, and IRL. And we’re excited about new platforms being built to capture and express this.

What will it take to get distribution for this kind of technology? There are interesting efforts underway, like the Content Authenticity Initiative and C2PA, which enable existing media platforms like TikTok and the New York Times to retrofit cryptographic “credentials” into their content. However, we wonder whether the breakout solution will end up looking more web3-native and full-stack. That’s why we’re curious to learn about approaches that integrate the creation, signing, and sharing of content into one single platform, reducing the risk of contamination between each step. Paragraph’s blogging platform is an excellent example of this.

Deep fakes aren’t new; they’re just another mirage. And as the technology underpinning how we consume, share, and believe information changes, ‘deep reals’ will emerge to reinforce our collective sense of connection and trust in one another.

Disclaimer:

  1. This article is reprinted from [Union Square Ventures] under the original title ‘Deep reals’. All copyrights belong to the original author [Grace Carney]. If there are objections to this reprint, please contact the Gate Learn team, and they will handle it promptly.
  2. Liability Disclaimer: The views and opinions expressed in this article are solely those of the author and do not constitute any investment advice.
  3. Translations of the article into other languages are done by the Gate Learn team. Unless otherwise stated, copying, distributing, or plagiarizing the translated articles is prohibited.
