Exploring the Convergence and Innovation of Artificial Intelligence and Web3

Beginner · 6/4/2024, 10:33:59 AM
This article explores the potential and practice of converging AI and Web3 technologies, showing how the decentralized internet paradigm can provide new impetus for AI development and, in turn, how AI can empower the Web3 ecosystem. It covers data-driven foundations, privacy protection, the computing power revolution, DePIN, IMO, and AI agents, and discusses their roles and development prospects in Web3.

As a new paradigm of the decentralized, open, and transparent Internet, Web3 has a natural synergy with artificial intelligence. Under the traditional centralized architecture, AI computing and data resources are tightly controlled, facing many challenges such as computing bottlenecks, privacy leaks, and algorithm black boxes. Web3, on the other hand, is based on distributed technology and injects new vitality into AI development through shared computing networks, open data markets, and privacy-preserving computing. At the same time, AI can empower the construction of the Web3 ecosystem by optimizing capabilities such as smart contracts and anti-cheat algorithms. Therefore, exploring the convergence of Web3 and AI is critical to building the next generation of internet infrastructure and unlocking the value of data and computing power.

Data-driven: A solid foundation for AI and Web3

Data is the core driving force behind AI, much as fuel powers an engine. AI models need to ingest large amounts of high-quality data to develop deep understanding and strong reasoning capabilities. Data not only provides the basis for training machine learning models but also determines their accuracy and reliability.

In the traditional centralized AI data acquisition and utilization model, several key problems have emerged:

  1. Data acquisition is costly, making it difficult for SMEs to participate.
  2. Data resources are monopolized by technology giants, forming data silos.
  3. Personal data privacy is at risk of leakage and misuse.

Web3 provides a new decentralized data paradigm to address the pain points of traditional models:

  1. Through projects like Grass, users can sell idle network capacity to AI companies, enabling decentralized web data scraping, cleaning, and transformation, providing real, high-quality data for AI model training.
  2. Public AI uses a “mark-to-earn” model to motivate global workers to participate in data annotation, aggregate global wisdom, and enhance data analysis capabilities.
  3. Blockchain data trading platforms such as Ocean Protocol and Streamr provide an open and transparent trading environment for both data supply and demand, promoting data innovation and sharing.

In these ways, Web3 not only reduces the cost of data acquisition, but also enhances the openness and transparency of data, providing more diverse and high-quality data sources for AI model training. At the same time, through decentralized privacy-preserving computing, Web3 can also better protect the privacy of personal data and improve the security and reliability of data use.

Continuing to explore and practice the integration of AI and Web3 will provide a solid foundation for building a new generation of Internet infrastructure and unlock new value in data and computing power.

Nonetheless, real-world data collection also faces challenges such as uneven data quality, high processing complexity, and insufficient data diversity and representation. In the Web3 data space, synthetic data could be a rising star. Based on generative AI technology and simulation, synthetic data can simulate the attributes of real data, effectively supplementing and improving the efficiency of data use. In areas such as autonomous driving, financial market trading, and game development, synthetic data has demonstrated its potential for mature applications.
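To make the idea concrete, here is a minimal, hypothetical sketch of synthetic data generation in Python: it fits a simple distribution to a stand-in "real" dataset and samples new records with similar statistics. Production pipelines typically rely on generative models (GANs, diffusion models) or domain simulators; this only illustrates the principle.

```python
# Minimal synthetic-data sketch (illustrative only): fit a simple distribution
# to a stand-in "real" dataset, then sample synthetic records that mimic its
# statistics. Real pipelines use generative models or domain simulators.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for real financial trade sizes (hypothetical data).
real_trades = rng.lognormal(mean=4.0, sigma=0.8, size=10_000)

# "Learn" the distribution from the real data, then sample new records.
mu, sigma = np.log(real_trades).mean(), np.log(real_trades).std()
synthetic_trades = rng.lognormal(mean=mu, sigma=sigma, size=10_000)

print(f"real mean:      {real_trades.mean():.1f}")
print(f"synthetic mean: {synthetic_trades.mean():.1f}")
```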

Privacy Protection: The Role of FHE in Web3

In a data-driven era, privacy protection has become a global focus, and the enactment of the European Union’s General Data Protection Regulation (GDPR) reflects the strict protection of individual privacy. However, this also poses challenges: some sensitive data cannot be fully utilized due to privacy risks, which undoubtedly limits the potential and inference capabilities of AI models.

Fully Homomorphic Encryption (FHE) allows computations to be performed directly on encrypted data without decrypting it, and the decrypted result matches the result of performing the same operations on the plaintext. FHE provides strong protection for privacy-preserving AI computation, enabling GPU computing power to run model training and inference without ever accessing the raw data. This is a significant advantage for AI companies: they can open up API services securely while protecting their trade secrets.
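As a rough illustration of computing on ciphertexts, the toy example below uses a Paillier-style additively homomorphic scheme rather than full FHE, and it is deliberately insecure (tiny hard-coded primes). It only demonstrates the core property: two values are encrypted, the ciphertexts are combined, and decryption yields their sum without the inputs ever being decrypted.

```python
# Toy additively homomorphic encryption (Paillier-style). NOT full FHE and NOT
# secure; it only shows that Dec(Enc(a) * Enc(b)) == a + b, i.e. computation on
# ciphertexts without decrypting the inputs.
import random
from math import gcd

p, q = 293, 433                 # demo-sized primes (insecure)
n = p * q
n_sq = n * n
g = n + 1                       # standard simple choice of generator
lam = (p - 1) * (q - 1)         # multiple of lcm(p-1, q-1); valid decryption exponent
mu = pow(lam, -1, n)            # since g = n+1, L(g^lam mod n^2) = lam, so mu = lam^-1 mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    l = (pow(c, lam, n_sq) - 1) // n      # L(x) = (x - 1) / n
    return (l * mu) % n

a, b = 42, 58
ct_sum = (encrypt(a) * encrypt(b)) % n_sq  # multiplying ciphertexts adds plaintexts
assert decrypt(ct_sum) == a + b
print("decrypted sum:", decrypt(ct_sum))   # 100, computed without seeing a or b
```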

Fully Homomorphic Encryption Machine Learning (FHEML) supports the encryption of data and models throughout the machine learning lifecycle, ensuring the security of sensitive information and preventing data leakage. In this way, FHEML strengthens data privacy and provides a secure computing framework for AI applications.

FHEML complements ZKML (Zero-Knowledge Machine Learning): ZKML proves that a machine learning computation was executed correctly, while FHEML keeps the underlying data encrypted throughout the computation to preserve privacy.

The Computing Revolution: AI Computing in the Decentralized Web

The computational complexity of current AI systems is doubling roughly every three months, driving demand for computing power that far exceeds the supply of existing resources. For example, training OpenAI’s GPT-3 required enormous computing power, equivalent to roughly 355 years of training on a single device. This shortage of computing power not only limits the advancement of AI technology but also puts advanced AI models out of reach for most researchers and developers.
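A quick back-of-the-envelope calculation, assuming the three-month doubling rate quoted above actually holds, puts the growth in perspective:

```python
# If compute demand doubles every 3 months, it grows 2**4 = 16x per year and
# about 65,000x over four years (assumption: the doubling rate holds).
for months in (12, 24, 48):
    print(f"{months:>2} months -> {2 ** (months / 3):,.0f}x")
```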

In addition, global GPU utilization sits below 40%, and slowing microprocessor performance gains, supply chain issues, and chip shortages driven by geopolitical factors further exacerbate the shortfall in computing supply. AI practitioners face a dilemma: buy hardware outright or rent cloud resources. They urgently need an on-demand, cost-effective computing service model.

IO.net is a Solana-based decentralized AI computing network that aggregates idle GPU resources worldwide and provides an affordable computing power marketplace for AI companies. Entities that need computing power publish tasks on the network, and smart contracts assign them to contributing miner nodes. Miners execute the tasks, submit results, and receive rewards once the results are verified. IO.net’s approach improves resource efficiency and helps alleviate computing power bottlenecks in the AI space.
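The paragraph above describes a publish, assign, execute, verify, reward loop. The sketch below is a hypothetical, highly simplified model of that lifecycle in Python; none of the names or logic correspond to IO.net’s actual contracts or APIs.

```python
# Hypothetical, highly simplified model of a decentralized compute market's
# task lifecycle (publish -> assign -> execute -> verify -> reward).
# Names and logic are illustrative and do NOT reflect IO.net's real contracts.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ComputeTask:
    task_id: str
    workload: str                     # e.g. a container image for a training job
    reward: float                     # tokens escrowed by the requester
    assigned_to: Optional[str] = None
    result_hash: Optional[str] = None
    paid: bool = False

@dataclass
class ComputeMarket:
    tasks: dict = field(default_factory=dict)
    balances: dict = field(default_factory=dict)

    def publish(self, task: ComputeTask) -> None:
        self.tasks[task.task_id] = task               # requester escrows the reward

    def assign(self, task_id: str, miner: str) -> None:
        self.tasks[task_id].assigned_to = miner       # match task to an idle GPU node

    def submit(self, task_id: str, miner: str, result_hash: str) -> None:
        task = self.tasks[task_id]
        if task.assigned_to == miner:
            task.result_hash = result_hash            # miner reports its output

    def verify_and_pay(self, task_id: str, expected_hash: str) -> None:
        task = self.tasks[task_id]
        if task.result_hash == expected_hash and not task.paid:
            task.paid = True                          # release the escrowed reward
            self.balances[task.assigned_to] = (
                self.balances.get(task.assigned_to, 0.0) + task.reward
            )

market = ComputeMarket()
market.publish(ComputeTask("t1", "train-llm:latest", reward=10.0))
market.assign("t1", "gpu-node-42")
market.submit("t1", "gpu-node-42", result_hash="abc123")
market.verify_and_pay("t1", expected_hash="abc123")
print(market.balances)   # {'gpu-node-42': 10.0}
```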

In addition to general-purpose decentralized computing power networks, there are platforms focused on AI training, such as Gensyn and Flock.io, as well as specialized computing power networks focused on AI inference, such as Ritual and Fetch.ai.

Decentralized computing power networks provide a fair and transparent computing marketplace, break monopolies, lower barriers to entry, and improve utilization. In the Web3 ecosystem, they will play a key role in attracting more innovative dApps and jointly advancing the development and application of AI technology.

DePIN: Web3 empowers AI at the edge

Imagine your smartphone, smartwatch, or even smart home device having the ability to run AI – that’s the beauty of AI at the edge. Edge AI brings computation to the source of the data, enabling low-latency, real-time processing while protecting user privacy. Edge AI technology is already being applied in key areas such as autonomous driving.

In the Web3 space, we have a more familiar name – DePIN. Web3 emphasizes decentralization and user data sovereignty, while DePIN enhances user privacy protection by processing data locally, reducing the risk of data breaches. Web3’s native token economy can incentivize DePIN nodes to provide computing resources and build a sustainable ecosystem.
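To illustrate the privacy and incentive points, here is a hypothetical sketch in which an edge node processes raw data locally, shares only an aggregate with the network, and earns tokens for the verified work. It is not modeled on any specific DePIN protocol; all names are made up.

```python
# Illustrative sketch (not any specific DePIN protocol): an edge node processes
# raw sensor data locally and reports only an aggregate, earning tokens per
# contributed reading. Raw data never leaves the device.
import statistics

class EdgeNode:
    def __init__(self, node_id: str):
        self.node_id = node_id
        self.tokens = 0.0

    def process_locally(self, raw_readings: list) -> dict:
        # Only this summary is shared with the network; raw_readings stay on-device.
        return {
            "node": self.node_id,
            "count": len(raw_readings),
            "mean": statistics.fmean(raw_readings),
        }

def reward(node: EdgeNode, report: dict, rate_per_reading: float = 0.01) -> None:
    node.tokens += report["count"] * rate_per_reading   # token incentive for contributed work

node = EdgeNode("phone-01")
report = node.process_locally([21.5, 22.0, 21.8, 22.3])   # hypothetical sensor data
reward(node, report)
print(report, node.tokens)
```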

At present, DePIN is developing rapidly in the Solana ecosystem, which has become one of the preferred public chains for DePIN project deployment. Solana’s high throughput, low transaction fees, and technological innovation provide strong support for DePIN projects. The market capitalization of DePIN projects on Solana has already exceeded $10 billion, and notable projects such as Render Network and Helium Network have made significant progress.

IMO: A New Paradigm for AI Model Publishing

The concept of an IMO (Initial Model Offering) was first proposed by the ORA protocol as a way to tokenize AI models.

In the traditional model, due to the lack of a revenue sharing mechanism, it is often difficult for developers to obtain continuous benefits from the subsequent use of the AI model once it is developed and put on the market. Especially when the model is integrated into other products and services, it is difficult for the original developer to track its usage and generate revenue. In addition, there is often a lack of transparency about the performance and effectiveness of AI models, making it difficult for potential investors and users to assess their true value, limiting market acceptance and business potential.

IMO provides a new approach to financing and value sharing for open-source AI models. Investors can buy IMO tokens and receive a share of the revenue generated by the model. The ORA protocol leverages the ERC-7641 and ERC-7007 standards, combined with its Onchain AI Oracle and OPML technologies, to ensure the authenticity of AI models and enable token holders to share in the revenue.
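As a rough illustration of the revenue-sharing idea only (this is not the actual ERC-7641 interface), the sketch below distributes model usage revenue to token holders pro rata to their holdings.

```python
# Illustrative revenue-sharing sketch (NOT the actual ERC-7641 interface):
# usage revenue from a tokenized model is distributed to holders pro rata.
def distribute_revenue(holdings: dict, revenue: float) -> dict:
    total_supply = sum(holdings.values())
    return {holder: revenue * amount / total_supply for holder, amount in holdings.items()}

# Hypothetical token holders of an IMO-launched model.
holdings = {"alice": 600, "bob": 300, "carol": 100}
print(distribute_revenue(holdings, revenue=1_000.0))
# {'alice': 600.0, 'bob': 300.0, 'carol': 100.0}
```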

The IMO model enhances transparency and trust, encourages open-source collaboration, is in line with crypto market trends, and injects momentum into the sustainable development of AI technology. Although IMO is still in the early experimental stage, its innovation and potential value are worth looking forward to as market acceptance and participation expand.

AI Agent: A New Era of Interactive Experiences

AI agents can perceive the environment, think independently, and take appropriate actions to achieve predetermined goals. Powered by large language models, AI agents can not only understand natural language, but also plan, decide, and execute complex tasks. They can act as virtual assistants that learn user preferences and provide personalized solutions through interaction. Even without explicit instructions, AI agents can autonomously solve problems, improve efficiency, and create new value.
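The perceive, plan, act cycle described above can be sketched as a simple loop. The code below is generic structure rather than any particular agent framework, and `call_llm` is a placeholder standing in for a real language-model API call.

```python
# Generic perceive -> plan -> act agent loop (illustrative; not a specific
# framework). call_llm() is a stand-in for a real language-model API call.
def call_llm(prompt: str) -> str:
    return "no_action"          # placeholder: a real system would query an LLM here

def perceive(environment: dict) -> str:
    return f"state: {environment}"

def act(action: str, environment: dict) -> None:
    environment["last_action"] = action

def agent_loop(goal: str, environment: dict, max_steps: int = 3) -> None:
    for _ in range(max_steps):
        observation = perceive(environment)                                   # 1. perceive
        action = call_llm(f"Goal: {goal}\nObservation: {observation}\nNext?") # 2. plan / decide
        act(action, environment)                                              # 3. act
        if action == "done":
            break

agent_loop("book a table for two", {"time": "19:00"})
```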

Myshell is an open AI-native application platform that provides a comprehensive, user-friendly toolset for configuring a bot’s functionality, appearance, and voice, and for connecting it to external knowledge bases. It is committed to creating a fair and open AI content ecosystem, using generative AI technology to empower individuals to become super creators. Myshell has trained specialized large language models to make role-playing more human-like, and its voice cloning technology accelerates interaction with personalized AI products, reducing the cost of speech synthesis by 99%, with voice cloning taking only one minute. Custom AI agents created with Myshell can currently be applied in a variety of areas, including video chat, language learning, and image generation.

In the convergence of Web3 and AI, the current focus is mainly on exploring the infrastructure layer to solve key problems such as obtaining high-quality data, protecting data privacy, putting models on-chain, improving the effective use of decentralized computing power, and validating large language models. As these infrastructure components mature, there is reason to believe that the convergence of Web3 and AI will give rise to a range of innovative business models and services.

Statement:

  1. This article is reprinted from [mirror]. All copyrights belong to the original author [BadBot]. If you have any objections to the reprint, please contact the Gate Learn team and they will deal with it promptly.
  2. Disclaimer: The views and opinions expressed in this article are solely those of the author and do not constitute any investment advice.
  3. The translation of this article was done by the Gate Learn team. Unless otherwise specified, reproduction, distribution, or plagiarism of translated articles is prohibited.
