# Meta’s New Megabyte System: A Breakthrough in Overcoming Roadblocks for GPTs

2023-06-07, 00:51
![](https://gimg2.gateimg.com/image/article/1686098682RDZZ.jpeg)

Key takeaways:

- GPTs can translate texts, summarize data and create content suitable for various purposes, such as marketing.
- Meta’s Megabyte aims to overcome the roadblocks that other GPT systems, such as OpenAI’s GPT-4 and ChatGPT, face.
- Megabyte differs from other GPT models because it does not use tokenization.
- The Megabyte model comprises a local transformer, a patch embedder and a global transformer.

## Introduction

Technological innovation has revolutionized the way human beings interact and carry out tasks, both personal and professional. Artificial intelligence, driven largely by machine learning, can now perform activities such as writing essays or drafting financial plans. In this article we discuss the importance of the Generative Pre-trained Transformer (GPT) in natural language processing and its applications. We also focus on Meta’s Megabyte system, which overcomes several roadblocks for GPTs.

## Significance of GPTs in natural language processing

Generative Pre-trained Transformers (GPTs) benefit many sectors of the economy, as they enhance productivity and increase social awareness. First, it is important to know that GPTs create human-like text on a wide range of subjects. They use vast numbers of learned parameters to process data and present it in ways that are easy to understand.

Many applications use GPTs to create value for individuals and society in general. GPTs are core components of AI-driven applications that translate information from one language to another. They also generate and summarize large volumes of data into easy-to-understand information. In some cases, GPTs generate content for different purposes, such as poems, blog posts, academic essays, marketing material and memes. Businesses can also use GPTs to power chatbots and virtual assistants that interact with real people in a conversational manner, helping them understand different business or social topics. For business purposes, GPTs can produce sentiment analysis on any topic or field of interest. For example, there are AI-driven protocols that generate crypto market sentiment, enabling traders and other investors to make informed investment decisions. Other use cases of GPTs in natural language processing and AI applications include content creation for marketing, customer service, analysis of financial information, and data extraction and reporting.

## Limitations of traditional GPT models

Although there are various [types of GPTs](https://www.gate.io/live/video/90b1e91dd5e7fb207b1509a809e5b444 "types of GPTs") created by different providers, most of them have serious limitations. The current best generative AI models, [including OpenAI's GPT-4 and ChatGPT](https://www.gate.io/blog_detail/2064/chatgpt-ai-impacts-healthcare-rising-prices-hit-americans-chinas-redistributive-policies-affect-property-developers "including OpenAI's GPT-4 and ChatGPT"), use the Transformer architecture introduced by Google researchers. Its self-attention cost grows quadratically with the length of the inputs and outputs, because every token must attend to every other token. The architecture therefore works well for short inputs but becomes expensive as sequences grow.
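To make that scaling problem concrete, here is a minimal Python sketch (illustrative only, not code from Meta or OpenAI) that counts the pairwise attention scores a standard transformer layer must compute; the count grows quadratically with sequence length:

```python
# Every token attends to every other token, so the attention score
# matrix of a standard transformer has n * n entries per layer.
def attention_score_entries(n_tokens: int) -> int:
    """Number of pairwise attention scores for a sequence of n tokens."""
    return n_tokens * n_tokens

for n in (512, 4_000, 32_000, 750_000):
    print(f"{n:>9,} tokens -> {attention_score_entries(n):>17,} scores")
```

Going from roughly 4,000 to 750,000 positions multiplies the work by more than 35,000x, which is why simply enlarging the context window does not scale.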
The Megabyte method, however, uses a different architecture that divides sequences of inputs and outputs into patches rather than tokens. As such, it can handle many more words than current models. Meta’s approach also addresses the scalability problem common among most models currently on the market. The Megabyte model enables a single feedforward network to act on a patch consisting of multiple tokens, so the system operates in parallel rather than serially. This increases its efficiency even when the base model has many parameters.

Read also: [The Meta Metaverse: What is the company working on?](https://www.gate.io/blog_detail/729/the-meta-metaverse-what-is-the-company-working-on "The Meta Metaverse: What is the company working on?")

Some models, such as deep neural networks, are complex to understand and explain, which can reduce trust and accountability and raise ethical concerns. There is therefore a need for models that are easier to explain, because most users want to know how a system works before they place their trust in it. Another issue is that some of these models require large amounts of data for training and validation; when such data is unavailable, their effectiveness suffers. In addition, issues related to privacy, bias, noise, security and data incompleteness negatively affect the robustness and performance of most GPT models.

Most traditional AI models are also expensive to run and consume considerable energy, because they are computationally intensive. As such, they consume many resources and increase environmental costs. Additionally, most of these models have low interoperability as a result of differences in standardization: it is difficult to integrate them since they use different languages, frameworks and formats, although open formats like ONNX or universal compilers can improve their communication. Meta’s architecture is designed in a manner that overcomes most of these problems.

## Meta’s Megabyte System

Meta AI has developed a new [GPT system called the Megabyte](https://encord.com/blog/meta-ai-megabyte-model-architecture-explained/ "GPT system called the Megabyte") with the aim of getting around the tokenization that most GPT models use. This generative pre-trained transformer (GPT) system processes large volumes of data, such as videos and texts as long as novels, without tokenization.

Tokenization functions in a manner similar to file compression, converting large amounts of data into tokens. The transformer processes these tokens to create output tokens, which the system then decodes. Tokenization thus enables AI models to convert long strings of data into numbers. For instance, a system might convert a phrase like “My favourite color is red” into a token string such as “3666, 4004, 3124, 318, 2266, 13”, which is then processed. With this method, however, there is a limit on the amount of data that can be processed: GPT-3.5’s limit is between 3,000 and 4,000 words, while GPT-4’s is between 24,000 and 32,000.

In contrast, [Meta](https://www.gate.io/ja/blog_detail/308/why-is-meta-previously-facebook-betting-big-on-metaverse "Meta") has discarded tokenization in favour of a new multiscale prediction architecture that depends on end-to-end modeling of more than one million bytes of data.
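The contrast between the two encodings is easy to see in code. This Python sketch reuses the illustrative token IDs quoted above (they are not the output of any particular tokenizer) and compares them with the raw UTF-8 bytes a byte-level model such as Megabyte would consume instead:

```python
phrase = "My favourite color is red"

# Token-based view: the illustrative IDs from the article, not the
# output of any specific tokenizer.
token_ids = [3666, 4004, 3124, 318, 2266, 13]

# Byte-based view: a byte-level model skips tokenization and reads the
# raw UTF-8 bytes, so anything encodable as bytes is representable.
byte_values = list(phrase.encode("utf-8"))

print(len(token_ids), "tokens vs", len(byte_values), "bytes")
print(byte_values[:10])  # [77, 121, 32, 102, 97, 118, 111, 117, 114, 105]
```

The byte sequence is longer than the token sequence, which is exactly why Megabyte groups bytes into patches before the expensive global modeling step.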
This is a great achievement considering that the system can process a document of up to 750,000 words, roughly the data contained in three average-sized novels. As noted, Megabyte overcomes the roadblocks of tokenization: hard data limits, the long time required to train systems and high energy consumption. Without tokenization, it also becomes possible to train AI models to support non-English languages, since such languages can be encoded in standard 8-bit bytes.

Meta’s AI work could likewise expand opportunities in crypto as it further democratizes various blockchain technologies. For instance, developers could introduce cryptocurrency trading bots in their native languages, such as Russian or French. More importantly, decentralized autonomous organizations (DAOs) could code their protocols in local languages as well.

## How Meta Megabyte system works

Megabyte, a multiscale decoder architecture, models sequences of more than one million bytes while maintaining end-to-end differentiability. It uses multiscale transformers that incorporate different levels within the architecture, thereby modeling both global and local patterns in data. The Megabyte model comprises three components: a local module, a patch embedder and a global module. The local module, also called the local transformer, predicts the bytes within every patch; the patch embedder encodes each patch by combining its byte embeddings; and the global module, also known as the global transformer, inputs and outputs the various patch representations. The following diagram shows an overview of Megabyte and its key components.

![](<https://gimg2.gateimg.com/image/article/1686098917Meta 1.png>)

A recent experiment showed that Megabyte can be 40% faster than the Transformer model, though it is worth noting that the Megabyte model used in the experiment had 1.5 billion parameters while the Transformer had 350 million.
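The following PyTorch sketch outlines how these three components could fit together. It is a simplified illustration based on the description above, not Meta’s implementation: the patch size, model widths and layer counts are arbitrary assumptions, and the causal masking and offsets an autoregressive model needs are omitted for brevity.

```python
import torch
import torch.nn as nn

PATCH = 8        # bytes per patch (assumption for illustration)
D_GLOBAL = 128   # global model width (assumption)
D_LOCAL = 64     # local model width (assumption)

class MegabyteSketch(nn.Module):
    def __init__(self, vocab: int = 256):
        super().__init__()
        # Patch embedder: embed each byte, then merge a patch of byte
        # embeddings into a single patch representation.
        self.byte_emb = nn.Embedding(vocab, D_GLOBAL // PATCH)
        # Global transformer: models interactions between patches.
        g_layer = nn.TransformerEncoderLayer(D_GLOBAL, nhead=4, batch_first=True)
        self.global_tf = nn.TransformerEncoder(g_layer, num_layers=2)
        # Local transformer: predicts the bytes inside each patch,
        # conditioned on that patch's global representation.
        self.to_local = nn.Linear(D_GLOBAL, D_LOCAL)
        l_layer = nn.TransformerEncoderLayer(D_LOCAL, nhead=4, batch_first=True)
        self.local_tf = nn.TransformerEncoder(l_layer, num_layers=2)
        self.local_in = nn.Embedding(vocab, D_LOCAL)
        self.head = nn.Linear(D_LOCAL, vocab)

    def forward(self, bytes_in: torch.Tensor) -> torch.Tensor:
        B, T = bytes_in.shape                     # T must be a multiple of PATCH
        n_patch = T // PATCH
        # 1) Patch embedder: concatenate byte embeddings within each patch.
        x = self.byte_emb(bytes_in).reshape(B, n_patch, D_GLOBAL)
        # 2) Global transformer runs over n_patch positions, not T bytes,
        #    so its self-attention scales with (T/PATCH)^2 instead of T^2.
        g = self.global_tf(x)
        # 3) Local transformer runs per patch, in parallel across patches.
        ctx = self.to_local(g).unsqueeze(2)       # (B, n_patch, 1, D_LOCAL)
        h = self.local_in(bytes_in).reshape(B, n_patch, PATCH, D_LOCAL) + ctx
        h = self.local_tf(h.reshape(B * n_patch, PATCH, D_LOCAL))
        return self.head(h).reshape(B, T, 256)    # per-byte logits

logits = MegabyteSketch()(torch.randint(0, 256, (2, 64)))
print(logits.shape)  # torch.Size([2, 64, 256])
```

Note the division of labor: the expensive global pass sees only one position per patch, while the cheap local passes handle individual bytes and can run in parallel.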
Overall, Megabyte has several advantages over traditional transformers. First, it reduces the computational cost of self-attention, which makes it possible to handle long sequences. Second, it uses feedforward layers per patch rather than per position, resulting in more efficient use of computational resources. It also allows greater parallelism during processing, which leads to faster sequence generation while maintaining high performance.

The Megabyte architecture improves scalability, reduces resource consumption and enables smooth communication with various GPT-based applications. It achieves some of these benefits by dividing one long sequence into two shorter ones, which minimizes self-attention costs; in addition, parameter sharing and compression algorithms reduce the resource requirements of GPTs.
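As a back-of-envelope illustration of that cost reduction (with an assumed patch size, not a benchmark), compare the pairwise-attention work for one transformer over a million bytes against a global pass over patches plus a local pass within each patch:

```python
# Illustrative operation counts only; real costs depend on widths,
# layer counts and implementation details.
T, P = 1_000_000, 8                          # sequence length, assumed patch size

full = T * T                                 # one transformer over all T bytes
patched = (T // P) ** 2 + (T // P) * P * P   # global pass + per-patch local passes

print(f"full attention  : {full:,}")
print(f"patched (P={P})  : {patched:,}  ({full / patched:,.1f}x fewer)")
```

Even this crude count shows roughly a 64x reduction for a patch size of 8, before any of Megabyte's other optimizations are considered.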
## Conclusion

Meta’s Megabyte uses the generative pre-trained transformer approach to process large volumes of data without tokenization. Instead, it relies on a multiscale prediction architecture that minimizes costs, enhances speed and improves efficiency while increasing scalability and interoperability.

Author: **Mashell C.**, Gate.io Researcher

\*This article represents only the views of the researcher and does not constitute any investment suggestions.

\*Gate.io reserves all rights to this article. Reposting of the article will be permitted provided Gate.io is referenced. In all other cases, legal action will be taken due to copyright infringement.