Human beings are remarkable creatures. While the pace of biological evolution is incredibly slow, the rate at which humans transform the world through science and technology is astonishingly rapid in comparison. Consider the contrast between our lives today and those of people a thousand years ago. Despite having similar appearances and not vastly different cognitive frameworks, the disparity in living standards is immense.
However, no matter how swiftly the world changes, humans are ultimately bound by their physical and genetic makeup, composed of organic and inorganic materials. The instinct-driven struggles for wealth and power, class conflicts, wars to re-establish international order, and cycles of wealth and debt have been persistent throughout history and will likely continue. The ways humans react and behave in response to these issues are unlikely to change significantly over time.
This perspective suggests that by examining historical human actions and responses to major events, we can anticipate future patterns. While we cannot predict the future with absolute certainty, unless there are dramatic changes in human biology or a radical shift in our collective mindset, such as a universal conversion to Buddhism achieving enlightenment, we can use the past to make educated guesses about future trends.
Numerous books have been published analyzing the unchanging aspects of human society and our consistent reactions to historical events. For instance, Morgan Housel’s “Same as Ever” offers insightful explanations about the persistent nature of human thought processes from a micro perspective. On the other hand, Ray Dalio’s “Principles for Dealing with the Changing World Order” provides a macro perspective, analyzing the repetitive history of empires. Both books are highly recommended for readers interested in understanding these enduring patterns.
In this context, this essay aims to explore the significant, unavoidable trends humanity currently faces and their potential impacts on society, drawing parallels with historical precedents. Among these trends, I focused on the wavering status of the US dollar and the rise of Artificial General Intelligence (AGI), noting their commonality in presenting significant risks due to centralization. Consequently, I believe that blockchain technology, which inherently promotes decentralization, will play a crucial role in the future of human society. Each section of this essay will delve into how the blockchain industry, led by Bitcoin, might ultimately shape our world.
Currency is a social contract established for the purpose of facilitating barter. The legitimacy of this contract relies on social factors such as the balance of power within the international order and the trust of its participants. Considering that there have been no significant changes in human thought and emotional systems over a long historical period, it is highly likely that future currency systems will follow historical precedents.
Most people living in the present day are already very familiar with the US dollar as the global reserve currency, using it in daily life without much question. The United States’ dominance in military, financial, scientific, and various other fields has solidified the dollar’s seemingly eternal status. However, humans have a tendency to be complacent about things they have not personally experienced. A brief exploration into the essence and history of money reveals that the tenure of a global reserve currency is often shorter than one might expect.
The US dollar has held its position as the sole global reserve currency only since the establishment of the Bretton Woods system in 1944, a span of merely about 80 years. Before assessing the current status of the dollar, it is instructive to briefly review the global reserve currencies that preceded it. Prior to the dollar, the British pound sterling served as the world’s reserve currency, and before that, the Dutch guilder held this role.
(The history of reserve currencies repeats itself)
The rise and fall of the Netherlands and Britain as great powers, and their tenure as holders of the global reserve currency, followed remarkably similar patterns. Both nations began their ascent by triumphing in wars against declining powers. This victory acted as a catalyst for their increasing national competitiveness, spurred by developments such as the growth of capitalism and the Industrial Revolution. These advancements laid the foundation for their status as reserve currency nations.
However, as history has repeatedly shown, the wealth and prosperity derived from holding the status of a global reserve currency often sow the seeds of decline. Increasing current account deficits and widening income inequality weaken national competitiveness and accelerate the accumulation of debt. Eventually, massive debts incurred through wars, along with the devaluation of their currencies, force these once-dominant nations to relinquish their reserve currency status to emerging powers.
(Mount Washington Hotel in Bretton Woods | Source: Wikipedia)
The United States, currently the world’s leading superpower, has followed a similar trajectory. After the Civil War, the nation enhanced its competitiveness through the Second Industrial Revolution, the development of capitalism, and its geopolitical advantages. Surpassing a declining Europe in wealth and prosperity during and after the First and Second World Wars, the U.S. reached new heights. As victory in World War II became certain, the United States convened a conference to restructure the post-war financial order, adopting the Bretton Woods system, which established the dollar as the reserve currency under the gold standard.
However, a reserve currency economy based on hard currency, like the gold standard, presents a dilemma. To use the dollar as the primary currency for international trade, there must be a sufficient supply of dollars, requiring the reserve currency nation to maintain a deficit. While gold reserves remained constant, the increasing issuance of dollars inevitably led to the currency’s devaluation and eroded international trust in the reserve currency. This issue is known as the Triffin Dilemma.
The Cold War with the Soviet Union, the Vietnam War, and the Oil Shock exacerbated trade deficits and inflation. When the U.S. could no longer meet the demand for gold redemption, President Richard Nixon ended the gold convertibility of the dollar in 1971. This led to a dramatic rise in the price of gold from the fixed $35 per ounce to $850 per ounce by 1980, marking the beginning of the fiat currency era and an age of high inflation.
Fortunately, due to the unprecedented high-interest-rate policies implemented by Paul Volcker, which reached annual rates of 20%, and the successful establishment of the petrodollar system, the dollar regained its value. This recovery ushered in a period of economic prosperity for the United States during the 1990s.
(Source: FRED)
However, the dynamics of dollar issuance underwent a complete transformation after the end of the Bretton Woods system. Whenever funds were needed, the government began issuing Treasury bonds, and the Federal Reserve printed money to purchase these bonds, leading to a rapid increase in the money supply. Government debt soared from $391 billion (34% of GDP) in 1971 to $34 trillion (120% of GDP) by the end of 2023. During the financial crises of 2008 and 2020, the government amassed significant debt through this mechanism, resulting in the continuous depreciation of the dollar’s value.
How long can such massive government debt be sustained? This question opens the door to various scenarios. One possibility is the emergence of another inflation fighter like Paul Volcker, who might take drastic measures to reduce debt, even at the cost of severe economic recession. Alternatively, disruptive innovations like artificial intelligence could boost supply and production, exerting continuous deflationary pressure on the economy and thereby extending the lifespan of the dollar.
(Political polarization | Source: Pew Research)
However, as previously mentioned, currency is a social contract. Thus, the decline of the dollar will commence when the international community begins to lose faith in the United States and its currency. The inevitable inflation associated with being a reserve currency can exacerbate social issues such as income inequality and political polarization, both domestically and internationally, further eroding trust in the dollar. Although there are no definitive signs of the dollar’s demise yet, accumulating issues suggest that such a scenario is increasingly plausible.
(China loves gold | Source: Investing.com)
Geopolitical issues, not just inflation, can also undermine the dollar’s status. In response to Russia’s invasion of Ukraine, Western nations excluded Russia from the SWIFT banking system, preventing it from settling trade in euros or dollars. They also froze half of Russia’s foreign exchange reserves held in dollars. Such actions can diminish other nations’ trust in the dollar. For instance, China has been steadily selling off U.S. Treasury bonds and accumulating gold since the onset of the Russia-Ukraine conflict, thereby reducing its dependence on the U.S.
History proves that the dynamics of power surrounding currency remain constant. Unless an unprecedented perfect monetary policy emerges, any reserve currency will eventually lose its status. Although no one can predict the exact timing, the dollar will someday face its end. I can only hope that this moment comes as late and as smoothly as possible.
As the dollar gradually loses its credibility, assets like gold will naturally garner attention. Gold has been valued from ancient times to the modern era due to its scarcity and immutable physical properties. During major conflicts, gold has been the ultimate asset recognized for its value internationally. Consequently, central banks around the world always maintain a certain reserve of gold.
(Russians line up at the bank during the war | Source: AP)
Today, individuals can invest in gold through various means such as mining company stocks, gold futures, and gold ETFs. These investment methods are generally effective in developed countries with accessible financial markets. However, if you reside in a nation with less developed financial markets or one directly involved in war or revolution, investing in gold can be highly restrictive. These investment avenues do not involve direct ownership of gold, introducing counterparty risk during international turmoil. Additionally, purchasing and storing physical gold is not an easy task.
(Source: Kaiko)
In such scenarios, Bitcoin can serve as an excellent hard asset similar to gold. Its supply is limited, it is not controlled by any single entity, and it is exceptionally easy to store and transfer, even in dire situations like wartime. For example, during Russia’s invasion of Ukraine on February 24, 2022, the trading volume and price of BTC/UAH surged, trading at a 6% premium over the international rate. Even in less extreme cases, demand for Bitcoin is high in countries with unstable national currencies. In Turkey, where the annual inflation rate is around 70%, Bitcoin trades at a premium similar to gold. These examples demonstrate that Bitcoin can indeed fulfill the role of a hard asset.
(Source: BlockScholes, Yahoo)
Given the above examples, it is evident that Bitcoin holds significant potential to serve as a hard currency in the future. But does this mean that citizens of developed countries, currently protected by stable monetary systems, have no need to include Bitcoin in their portfolios? Even outside of crisis situations, allocating a portion of one’s portfolio to Bitcoin can offer substantial benefits in terms of diversification. As illustrated in the graph, although Bitcoin’s correlation with other assets like gold, stocks, and the dollar can be volatile over time, it generally exhibits distinct price movements. This unique characteristic alone makes it advantageous to hold a portion of assets in cryptocurrencies like Bitcoin.
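As a rough illustration of how such a diversification claim can be checked, the sketch below computes a Pearson correlation between two synthetic daily-return series. The numbers are random stand-ins for BTC and an equity index, not real market data; the point is only the mechanics of measuring co-movement.

```python
# Diversification sketch: correlation between two SYNTHETIC return series.
# The data below is randomly generated, standing in for BTC and an index.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
btc = [random.gauss(0, 0.04) for _ in range(252)]  # volatile asset, 1 trading year
spx = [random.gauss(0, 0.01) for _ in range(252)]  # calmer asset

# Independent series typically show a correlation near zero,
# which is exactly the property that makes an asset useful for diversification.
print(round(pearson(btc, spx), 3))
```

In practice one would run this over rolling windows of real return data, since, as the graph referenced above shows, Bitcoin's correlation with other assets drifts over time.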
(Source: K33 Research)
Indeed, many financial institutions in the United States have recently added BTC ETFs to their portfolios. According to K33 Research, in the first quarter of 2024, 937 institutions reported holding Bitcoin ETFs in their 13F filings. Among them were notable names like JP Morgan, UBS, and Wells Fargo, as well as the Wisconsin Investment Board, which acquired BTC ETFs worth approximately $160 million. This trend indicates that Bitcoin is increasingly being recognized as a store of value.
(Fast food to the moon)
Even before the inflationary effects of the COVID-19 era’s quantitative easing have fully dissipated, the United States is increasing liquidity again in anticipation of the upcoming presidential election. The Treasury Department is expanding fiscal spending, and starting May 29, plans to conduct bond buybacks for the first time in over twenty years. Simultaneously, the Federal Reserve is slowing the pace of quantitative tightening.
Consequently, the dollar will continue to face inflationary pressures and will be issued in large quantities during major economic downturns. Unless the United States maintains its leadership through continual innovation in military, scientific, and industrial fields, the value of the dollar is bound to decline over time. Conversely, this will naturally increase the attention and value of Bitcoin.
However, to achieve the same status as gold as a hard asset, Bitcoin faces a critical challenge: the security scale and profitability of its network. The essential element for maintaining Bitcoin’s value is the security level of its network. The more miners there are to mine Bitcoin, the more secure the network becomes, thereby solidifying Bitcoin’s value.
Bitcoin miners earn revenue in two primary ways: block rewards and transaction fees. Block rewards are the Bitcoins awarded for successfully mining a block, with the amount being fixed and halved every four years. Transaction fees, on the other hand, are the fees paid by users for conducting transactions on the Bitcoin network, separate from block rewards.
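The reward schedule described above can be sketched in a few lines, using the commonly cited parameters of a 50 BTC initial subsidy and a 210,000-block halving interval (real nodes compute this in integer satoshis with a bit shift; this float version is a simplified sketch):

```python
# Sketch of Bitcoin's block-subsidy schedule.
HALVING_INTERVAL = 210_000   # blocks, roughly four years at 10-minute blocks
INITIAL_SUBSIDY = 50.0       # BTC per block at genesis

def block_subsidy(height: int) -> float:
    """Return the BTC subsidy for a block at the given height."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:       # the subsidy eventually rounds down to zero
        return 0.0
    return INITIAL_SUBSIDY / (2 ** halvings)

# Successive halving epochs: 50 -> 25 -> 12.5 -> 6.25 -> 3.125 ...
for epoch in range(5):
    print(epoch, block_subsidy(epoch * HALVING_INTERVAL))
```

This geometric decay is why, over the long run, transaction fees must grow into the dominant share of miner revenue.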
(Fees should be higher to achieve sustainability | Source: dune, @21co)
For miners to continue participating in the Bitcoin network, their mining revenue must exceed their costs. Due to the halving that occurs every four years, block rewards diminish over time, necessitating an increase in transaction fee revenue to make up the difference. However, unlike networks such as Ethereum and Solana, the Bitcoin network has limited applications and low scalability, leading to fewer transactions and consequently lower transaction fee revenue. Recently, new token standards like Ordinals and Runes have momentarily increased activity on the Bitcoin network, but there is no long-term guarantee that these will significantly contribute to transaction fee revenue.
(Source: MacroMicro)
Up to now, mining revenue has generally surpassed mining costs. However, as block rewards continue to decrease due to future halvings, unless 1) Bitcoin’s price rises substantially or 2) network activity increases to boost transaction fee revenue, there is a risk that miners will exit the network. This would lower the security level of the Bitcoin network, diminishing its intrinsic value and potentially leading to a vicious cycle of further miner departures and decreased security.
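A back-of-the-envelope model makes this halving pressure concrete. Every input below (hashrates, fee levels, operating cost) is invented purely for illustration; the takeaway is only that halving the subsidy roughly doubles the break-even price when fees and costs are held fixed.

```python
# Hypothetical miner-economics sketch; all figures are illustrative assumptions.
def breakeven_price(subsidy_btc: float, fees_btc: float,
                    network_hashrate: float, miner_hashrate: float,
                    blocks_per_day: int, daily_cost_usd: float) -> float:
    """USD/BTC price at which a miner's expected daily revenue equals its cost."""
    share = miner_hashrate / network_hashrate          # expected share of blocks won
    btc_per_day = share * blocks_per_day * (subsidy_btc + fees_btc)
    return daily_cost_usd / btc_per_day

# One miner with 1 PH/s on a 600 EH/s network, 144 blocks/day, $300/day in costs.
before = breakeven_price(6.25, 0.2, 600e18, 1e15, 144, 300.0)
after = breakeven_price(3.125, 0.2, 600e18, 1e15, 144, 300.0)

# Break-even scales with 1/(subsidy + fees), so the ratio is 6.45/3.325 ~= 1.94.
print(round(after / before, 2))
```

Unless price or fee revenue rises accordingly after each halving, miners operating near this threshold drop off the network, which is the vicious cycle described above.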
This highlights the primary difference between gold and Bitcoin. Gold’s intrinsic value is not tied to profitability, whereas Bitcoin’s intrinsic value is directly linked to it. Therefore, ensuring profitability is a long-term challenge that the Bitcoin network must address. While there is currently no definitive solution within the Bitcoin community, the emergence of applications such as Ordinals, Runes, and innovations like OP_CAT suggest a potential increase in transaction fee revenue in the long term.
(Is this truly the future of humanity? | Source: The Matrix)
Historically, unlike currency, innovative technologies such as AI have always brought significant changes to society. The steam engine, electricity, and the internet revolution transformed the global industrial landscape, profoundly impacting human jobs and lifestyles. While these technological revolutions brought about various social issues during their transitional periods, they ultimately provided humans with much more prosperous lives. Steam engines and electricity liberated humans from most physical labor, while digital and internet technologies freed them from simple forms of mental labor.
(Fun fact: Illia is the person you know, iykyk)
AI technology has been studied since the mid-20th century, but meaningful results were slow to emerge. However, the pace of AI development accelerated dramatically after the publication of the “Attention Is All You Need” paper in 2017, which introduced the transformer architecture. This breakthrough made it easier to develop large language models (LLMs), bringing humanity a step closer to artificial general intelligence (AGI). Like previous industrial revolutions, the development of AGI is expected to lead to a significant increase in productivity and have a substantial societal impact. However, I believe the implications will differ significantly for several reasons.
First, AGI will free humans from almost all forms of labor. Previous industrial revolutions liberated humans from physical and simple mental labor, leading to a higher proportion of the population engaged in more sophisticated tasks. However, AGI can handle advanced mental labor, including artistic endeavors such as art and music. Coupled with advanced robotics, this means the areas in which humans can contribute to productivity will diminish significantly.
(The modern day Luddite movement?)
Of course, this does not mean all jobs will disappear. Even in the 21st century, a portion of the population is engaged in agriculture and fisheries, although the proportion is much lower than in the past. While most job types will remain with the advent of AGI, the number of people needed to perform them will drastically decrease. For instance, tasks that ten people currently handle could be managed by one person in the future, leading to a significant increase in the population unable to find employment. Notably, leading figures in AI, such as Elon Musk and Sam Altman, have argued that AI and robots will handle global productivity, resulting in widespread job loss for humans.
Some argue that efficiency could be maximized while maintaining the current employment levels, but this is a misconception. For this to happen, demand would need to increase proportionally with the significant boost in supply (productivity) provided by AGI. However, in most fields, this is not feasible. Job creation would have to occur in new areas beyond AGI’s reach, but as mentioned earlier, AGI’s capabilities extend beyond physical to mental tasks, making this unlikely.
Secondly, AI is inherently a highly centralizable technology. Even before achieving AGI, the AI industry has already become heavily centralized around big tech companies. This is due to the rapid advancement of AI technology. Since the introduction of the transformer architecture, the size of language models has increased by a factor of 10^4 between 2018 and 2022. Consequently, there are significant technological disparities in the essential industries that constitute AI technology.
(Source: @EricFlaningam)
(Source: Counterpoint)
In summary, centralization is inevitable in the AI industry, where achieving economies of scale is essential. As the AI industry becomes more centralized, several micro-level issues can arise, such as excessive corporate profit-seeking, unethical data use, single points of failure like server downtimes, and the opacity of AI models. On a macro level, we may face societal chaos as the line between humans and AI blurs, and many people lose their jobs. I believe that blockchain technology, which inherently pursues decentralization, can serve as an antithesis to AI, addressing the challenges associated with AI centralization. Let’s explore how blockchain can be applied to the AI industry.
Just as Satoshi Nakamoto introduced Bitcoin in 2008, advocating decentralization in response to the unchecked issuance of currency by central banks, blockchain technology can be utilized in various ways in the AI industry, where centralization trends are driven by economies of scale.
Among the five highly centralized elements mentioned earlier, semiconductor design and production require concentrated expertise and substantial manufacturing facilities, leaving little room for blockchain solutions. However, blockchain can be effectively applied in the fields of ‘computing power,’ ‘AI models,’ and ‘data.’ Additionally, it can address issues such as the proliferation of fake information, including deepfakes, and support basic income policies for a populace facing mass unemployment. Let’s explore the potential applications of blockchain technology within the AI pipeline.
Decentralized Computing
Training and inferring AI models require immense computing power and hardware. Big tech companies continuously purchase GPUs like NVIDIA’s H100 for their model training, exacerbating the global hardware supply shortage. While services like AWS and Azure provide data centers for cloud-based AI model training and inference, they operate as oligopolies, imposing high margins on users. In response to these challenges, new services leveraging blockchain technology to offer decentralized computing power have emerged.
Examples include Akash and io.net, where users can contribute their hardware’s computing power to the platform in exchange for incentives. There are also protocols specialized in niche services. For instance, Gensyn is optimized for training AI models. General decentralized computing services can reduce costs by utilizing idle hardware, but it is challenging to perform state-dependent computations, such as AI model training, in a decentralized manner. Gensyn addresses this with concepts like probabilistic proof-of-learning and a graph-based pinpoint protocol. While Gensyn is specialized in training AI models, Bittensor focuses on AI model inference: users can submit tasks, and Bittensor’s decentralized nodes compete to provide the optimal results.
zkML
zkML, a fusion of zero-knowledge (zk) cryptography and machine learning (ML), promises to enhance the privacy and transparency of AI models. Many AI models currently operate as closed-source, leaving users uncertain whether these models are using the correct weights and performing inference honestly. By applying cryptographic techniques like ZK-SNARKs (Zero-Knowledge Succinct Non-Interactive Argument of Knowledge) to ML models, it becomes possible to prove that an AI model has executed its inference process correctly without revealing its weights, thus achieving both privacy and computational integrity.
(Source: Polygon ID)
ZK-SNARKs are a powerful cryptographic technology that allows the validity of arbitrary computations to be proven without revealing the input data. To illustrate this, consider a real-world example: proving one’s age online. Typically, this requires complex KYC verification, involving the disclosure of personal information such as name and ID. With ZK technology, this process can be simplified and made more private. Once a user has verified their age with an official entity, they can generate and submit a ZK proof whenever they need to prove they are over 18. This proof contains no personal information but still assures the verifier of the user’s age, making the identity verification process safer and simpler.
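To make the flow concrete, here is a toy commit-and-open sketch in Python. Note that this is only an analogy for the commitment phase: a hash commitment still reveals the birth year at verification time, whereas a real ZK-SNARK proves the “over 18” relation without revealing it. Every name and value here is hypothetical.

```python
# Toy commitment sketch of the "prove you're over 18" flow.
# NOT a real ZK-SNARK: opening a hash commitment reveals the committed value,
# while a SNARK would prove the age relation without any reveal step.
import hashlib
import os

def commit(birth_year: int) -> tuple[bytes, bytes]:
    """Issuer commits to the user's birth year with a random blinding salt."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + str(birth_year).encode()).digest()
    return digest, salt

def verify_over_18(digest: bytes, salt: bytes, birth_year: int,
                   current_year: int = 2024) -> bool:
    """Opening check: the commitment matches AND the age claim holds.
    (A SNARK circuit would prove this same relation in zero knowledge.)"""
    recomputed = hashlib.sha256(salt + str(birth_year).encode()).digest()
    return recomputed == digest and current_year - birth_year >= 18

digest, salt = commit(1990)
print(verify_over_18(digest, salt, 1990))   # True: commitment opens and age holds
```

The salt prevents a verifier from brute-forcing the committed year from the digest alone, which hints at why blinding is essential even in the full zero-knowledge setting.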
(Top: Standard ML, Bottom: zkML | Source: @danieldkang Medium)
Applying the same concept to ML models, a consumer using a closed-source ML model cannot be sure whether the model performed the computation honestly on the given input. By incorporating ZK-SNARKs, an ML provider can assure the consumer that the computation was carried out correctly without revealing the input or weights. A ZKP (Zero-Knowledge Proof) of the ML inference process can be generated and verified by a smart contract on a neutral blockchain protocol, ensuring that anyone can trust the results.
(Source: Modulus Labs)
While the concept of zkML is highly attractive, significant challenges remain. Verifying ZKPs for specific computations is straightforward, but generating these proofs requires more computational power than performing the actual computation. According to Modulus Labs, generating a Plonky2-based ZKP for an ML model with 18 million parameters takes about one minute. Given that GPT-3 has 175 billion parameters and GPT-4 has 1.76 trillion parameters, substantial advancements are needed before zkML can be adopted meaningfully.
Data Sovereignty
As the AI industry continues to evolve, the significance of data grows exponentially. However, this surge has led to increasing instances of data sovereignty infringements. By leveraging blockchain technology, individuals can manage their identity-related information through self-custody, providing data only when necessary via digital signatures. Moreover, blockchain enables transparent data provision or sale through incentive systems or marketplaces accessible to all. Perhaps the most blockchain-like approach to data sovereignty has been exhibited by Reddit, which offered long-time users the chance to participate in its IPO, while contracting to provide data to Google. This move exemplifies a novel path in data sovereignty.
While slightly tangential to data sovereignty, blockchain also holds the potential to address issues in the data labeling industry. Data labeling is essential for enhancing the accuracy and ethics of AI models. Currently, this task often falls to low-wage workers, emerging as a new social issue. For instance, China’s AI industry exploits vocational school students, and OpenAI has outsourced this work to low-wage workers in Kenya. Integrating blockchain into data labeling could democratize participation and ensure fair compensation.
Proof of Personhood
Decentralized computing, zkML, and data sovereignty may solve some AI industry challenges. Yet, proof of personhood and universal basic income could safeguard human sovereignty in a society drastically altered by AGI. Let us explore how blockchain might support human sovereignty amidst such profound social transformation.
As AI models advance, the production of various content forms—text, images, videos—by AI becomes increasingly prevalent. Distinguishing whether these outputs are human-made is becoming more challenging. The acceleration of digitalization is inevitable, and as AI-generated content proliferates, the associated social problems will undoubtedly surge.
(Did Caitlyn Jenner really launch memecoin?)
These issues are not merely speculative; they are already occurring. Fraud through deepfakes, which mimic the faces and voices of individuals, has become alarmingly frequent, resulting in substantial financial losses. The authenticity of videos is now often hotly debated online due to the existence of deepfakes.
A recent incident involving Caitlyn Jenner illustrates this point vividly. She announced the launch of a meme coin on the Solana network via the platform X. Given the unusual nature of the announcement, many suspected her account had been hacked. Despite Caitlyn posting a video herself, there was significant controversy over whether it was a deepfake. This debate persisted until Caitlyn’s manager also released a video, helping to somewhat settle the matter.
(proof of personhood | Source: Worldcoin)
As we advance into the AI age, one of the most critical challenges will be proving one’s humanity in the digital realm. This concept, known as “proof of personhood,” aims to prevent sybil attacks and disinformation in the digital world. Currently, most applications rely on government-issued identity systems like passports or credit cards to verify personhood. However, these methods pose privacy risks and the potential for single points of failure. Thus, a truly digital identity system is essential. Blockchain technology offers a solution, allowing individuals to prove their humanity and the authenticity of their created content, potentially mitigating issues like deepfakes.
(Scanning iris through Orb | Source: Sam Altman)
The most commonly used method for digital identity verification is biometrics, which authenticates individuals through unique physical traits. OpenAI’s CEO Sam Altman is pioneering a project called Worldcoin, combining blockchain technology with iris scanning. Users install an application on their mobile devices, receiving a private key (account) on the blockchain. By using an iris scanning device called the Orb, users can authenticate their humanity in the digital world. The Orb ensures that the user is indeed a person and that the iris has not been previously registered, securely granting digital identity.
The Orb transmits only the hash value of the iris data to the server, destroying the actual iris data afterward. Users can later prove their personhood without revealing their account address, thanks to ZK-SNARKs, addressing privacy concerns. However, potential issues like hardware backdoors still need to be resolved. The importance of proof of personhood extends beyond content authenticity. It plays a crucial role in the concept of universal basic income, which we will explore next.
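The uniqueness check described above can be sketched as a minimal registry that stores only hashes of biometric templates and rejects duplicates. The “iris code” here is a placeholder byte string (an assumption for illustration); a real deployment additionally needs liveness detection, fuzzy matching of noisy biometric data, and the ZK layer for privacy.

```python
# Minimal sybil-resistance sketch: keep only a hash of each biometric
# template, discard the raw data, and reject repeat registrations.
import hashlib

class PersonhoodRegistry:
    def __init__(self):
        self._seen: set[bytes] = set()

    def register(self, iris_code: bytes) -> bool:
        """Return True for a new person, False for a duplicate registration."""
        digest = hashlib.sha256(iris_code).digest()  # raw template is not stored
        if digest in self._seen:
            return False
        self._seen.add(digest)
        return True

registry = PersonhoodRegistry()
print(registry.register(b"alice-iris-template"))   # True: first registration
print(registry.register(b"alice-iris-template"))   # False: duplicate blocked
```

This one-person-one-account property is precisely what a UBI distribution mechanism needs, which is where the next section picks up.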
Universal Basic Income
(Source: Scott Santens)
As previously mentioned, the advent of AGI is poised to bring about an unprecedented leap in productivity in human history. However, this revolutionary progress will inevitably result in significant job displacement. To sustain societal stability, the concept and necessity of Universal Basic Income (UBI) are gaining increasing attention. The idea of UBI predates AGI, tracing its origins back to Thomas More’s “Utopia” in the 16th century. UBI entails providing regular, unconditional financial support to all members of society. An existing example of UBI can be found in Alaska, where the Alaska Permanent Fund Dividend offers a form of UBI, demonstrating positive outcomes across various dimensions such as poverty, employment, and health.
The focus here, however, is not on a UBI that merely enhances quality of life, but on a UBI substantial enough to support individuals who lose their jobs due to AGI, ensuring they can live adequately without employment. Elon Musk refers to this as “universal high income.” Similarly, Sam Altman has shown considerable interest in UBI, conducting research through OpenResearch. He has proposed innovative ideas such as providing UBI in the form of assets and means of production like equity or computing power, rather than just cash.
Sam Altman’s Worldcoin, discussed in the “Proof of Personhood” section, is also closely linked to UBI. A critical aspect of UBI distribution is ensuring that only genuine individuals receive it and preventing multiple claims by the same person. Thus, preventing Sybil attacks is crucial for implementing UBI. Worldcoin aims to achieve this through iris recognition for proof of personhood. Currently, users verified via iris recognition on the Worldcoin app receive WLD tokens periodically, a form of UBI. Although I resonate with Worldcoin’s vision, I harbor some reservations about the distribution of WLD tokens.
Even beyond Sam Altman’s Worldcoin, blockchain technology will be indispensable for establishing a complete UBI system. Blockchain can enhance transparency and efficiency not only in selecting recipients through proof of personhood but also in the distribution process, ensuring a more effective and transparent UBI delivery.
Despite the unprecedented crises marked by the collapses of Terra and FTX, the blockchain market has swiftly regained its scale. However, reflecting on both the previous and current market booms, a distinct shift in the industry’s vision is evident. In 2021, numerous protocols were driven by the grand vision of decentralization, capturing the imagination and excitement of many. Now, despite the market’s similar scale, there seems to be a pervasive uncertainty within the industry and community about the direction blockchain should take. This is not due to any failure on our part or a deficiency in blockchain technology itself; rather, it is simply that the current era has not yet created a pressing need for blockchain technology.
While it is intriguing to observe blockchain’s application in niche markets, the industry must set its sights higher. As the long history of humanity has shown, we will continue to experience cyclical monetary systems and revolutionary technological innovations. Within these vast movements, blockchain will stand as a crucial technology that will safeguard human sovereignty.
Human beings are remarkable creatures. While the pace of biological evolution is incredibly slow, the rate at which humans transform the world through science and technology is astonishingly rapid in comparison. Consider the contrast between our lives today and those of people a thousand years ago. Despite having similar appearances and not vastly different cognitive frameworks, the disparity in living standards is immense.
However, no matter how swiftly the world changes, humans are ultimately bound by their physical and genetic makeup, composed of organic and inorganic materials. The instinct-driven struggles for wealth and power, class conflicts, wars to re-establish international order, and cycles of wealth and debt have been persistent throughout history and will likely continue. The ways humans react and behave in response to these issues are unlikely to change significantly over time.
This perspective suggests that by examining historical human actions and responses to major events, we can anticipate future patterns. While we cannot predict the future with absolute certainty, unless there are dramatic changes in human biology or a radical shift in our collective mindset, such as a universal conversion to Buddhism achieving enlightenment, we can use the past to make educated guesses about future trends.
Numerous books have been published analyzing the unchanging aspects of human society and our consistent reactions to historical events. For instance, Morgan Housel’s “Same as Ever” offers insightful explanations about the persistent nature of human thought processes from a micro perspective. On the other hand, Ray Dalio’s “Principles for Dealing with the Changing World Order” provides a macro perspective, analyzing the repetitive history of empires. Both books are highly recommended for readers interested in understanding these enduring patterns.
In this context, this essay aims to explore the significant, unavoidable trends humanity currently faces and their potential impacts on society, drawing parallels with historical precedents. Among these trends, I focused on the wavering status of the US dollar and the rise of Artificial General Intelligence (AGI), noting their commonality in presenting significant risks due to centralization. Consequently, I believe that blockchain technology, which inherently promotes decentralization, will play a crucial role in the future of human society. Each section of this essay will delve into how the blockchain industry, led by Bitcoin, might ultimately shape our world.
Currency is a social contract established to facilitate exchange beyond barter. The legitimacy of this contract relies on social factors such as the balance of power within the international order and the trust of its participants. Considering that there have been no significant changes in human thought and emotional systems over a long historical period, it is highly likely that future currency systems will follow historical precedents.
Most people living in the present day are already very familiar with the US dollar as the global reserve currency, using it in daily life without much question. The United States’ dominance in military, financial, scientific, and various other fields has solidified the dollar’s seemingly eternal status. However, humans have a tendency to be complacent about things they have not personally experienced. A brief exploration into the essence and history of money reveals that the tenure of a global reserve currency is often shorter than one might expect.
The US dollar has held its position as the sole global reserve currency only since the establishment of the Bretton Woods system in 1944, a span of merely about 80 years. Before assessing the current status of the dollar, it is instructive to briefly review the global reserve currencies that preceded it. Prior to the dollar, the British pound sterling served as the world’s reserve currency, and before that, the Dutch guilder held this role.
(The history of reserve currencies repeats itself)
The rise and fall of the Netherlands and Britain as great powers, and their tenure as holders of the global reserve currency, followed remarkably similar patterns. Both nations began their ascent by triumphing in wars against declining powers. This victory acted as a catalyst for their increasing national competitiveness, spurred by developments such as the growth of capitalism and the Industrial Revolution. These advancements laid the foundation for their status as reserve currency nations.
However, as history has repeatedly shown, the wealth and prosperity derived from holding the status of a global reserve currency often sow the seeds of decline. Increasing current account deficits and widening income inequality weaken national competitiveness and accelerate the accumulation of debt. Eventually, massive debts incurred through wars, along with the devaluation of their currencies, force these once-dominant nations to relinquish their reserve currency status to emerging powers.
(Mount Washington Hotel in Bretton Woods | Source: Wikipedia)
The United States, currently the world’s leading superpower, has followed a similar trajectory. After the Civil War, the nation enhanced its competitiveness through the Second Industrial Revolution, the development of capitalism, and its geopolitical advantages. Surpassing a declining Europe in wealth and prosperity during and after the First and Second World Wars, the U.S. reached new heights. As victory in World War II became certain, the United States convened a conference to restructure the post-war financial order, adopting the Bretton Woods system, which established the dollar as the reserve currency under the gold standard.
However, a reserve currency economy based on hard currency, like the gold standard, presents a dilemma. To use the dollar as the primary currency for international trade, there must be a sufficient supply of dollars, requiring the reserve currency nation to maintain a deficit. While gold reserves remained constant, the increasing issuance of dollars inevitably led to the currency’s devaluation and eroded international trust in the reserve currency. This issue is known as the Triffin Dilemma.
The Cold War with the Soviet Union, the Vietnam War, and the Oil Shock exacerbated trade deficits and inflation. When the U.S. could no longer meet the demand for gold redemption, President Richard Nixon ended the gold convertibility of the dollar in 1971. This led to a dramatic rise in the price of gold from the fixed $35 per ounce to $850 per ounce by 1980, marking the beginning of the fiat currency era and an age of high inflation.
Fortunately, due to the unprecedented high-interest-rate policies implemented by Paul Volcker, which reached annual rates of 20%, and the successful establishment of the petrodollar system, the dollar regained its value. This recovery ushered in a period of economic prosperity for the United States during the 1990s.
(Source: FRED)
However, the dynamics of dollar issuance were completely transformed after the end of the Bretton Woods system. Whenever funds were needed, the government issued Treasury bonds, and the Federal Reserve printed money to purchase them, rapidly increasing the money supply. Government debt soared from $391 billion (34% of GDP) in 1971 to $34 trillion (120% of GDP) by the end of 2023. During the financial crises of 2008 and 2020, the government amassed significant debt through this mechanism, resulting in the continuous depreciation of the dollar’s value.
How long can such massive government debt be sustained? This question opens the door to various scenarios. One possibility is the emergence of another inflation fighter like Paul Volcker, who might take drastic measures to reduce debt, even at the cost of severe economic recession. Alternatively, disruptive innovations like artificial intelligence could boost supply and production, exerting continuous deflationary pressure on the economy and thereby extending the lifespan of the dollar.
(Political polarization | Source: Pew Research)
However, as previously mentioned, currency is a social contract. Thus, the decline of the dollar will commence when the international community begins to lose faith in the United States and its currency. The inevitable inflation associated with being a reserve currency can exacerbate social issues such as income inequality and political polarization, both domestically and internationally, further eroding trust in the dollar. Although there are no definitive signs of the dollar’s demise yet, accumulating issues suggest that such a scenario is increasingly plausible.
(China loves gold | Source: Investing.com)
Geopolitical issues, not just inflation, can also undermine the dollar’s status. In response to Russia’s invasion of Ukraine, Western nations excluded Russia from the SWIFT banking system, preventing it from settling trade in euros or dollars. They also froze half of Russia’s foreign exchange reserves held in dollars. Such actions can diminish other nations’ trust in the dollar. For instance, China has been steadily selling off U.S. Treasury bonds and accumulating gold since the onset of the Russia-Ukraine conflict, thereby reducing its dependence on the U.S.
History proves that the dynamics of power surrounding currency remain constant. Unless an unprecedented perfect monetary policy emerges, any reserve currency will eventually lose its status. Although no one can predict the exact timing, the dollar will someday face its end. I can only hope that this moment comes as late and as smoothly as possible.
As the dollar gradually loses its credibility, assets like gold will naturally garner attention. Gold has been valued from ancient times to the modern era due to its scarcity and immutable physical properties. During major conflicts, gold has been the ultimate asset recognized for its value internationally. Consequently, central banks around the world always maintain a certain reserve of gold.
(Russians line up at the bank during the war | Source: AP)
Today, individuals can invest in gold through various means such as mining company stocks, gold futures, and gold ETFs. These investment methods are generally effective in developed countries with accessible financial markets. However, if you reside in a nation with less developed financial markets or one directly involved in war or revolution, investing in gold can be highly restrictive. These investment avenues do not involve direct ownership of gold, introducing counterparty risk during international turmoil. Additionally, purchasing and storing physical gold is not an easy task.
(Source: Kaiko)
In such scenarios, Bitcoin can serve as an excellent hard asset similar to gold. Its supply is limited, it is not controlled by any single entity, and it is exceptionally easy to store and transfer, even in dire situations like wartime. For example, during Russia’s invasion of Ukraine on February 24, 2022, the trading volume and price of BTC/UAH surged, trading at a 6% premium over the international rate. Even in less extreme cases, demand for Bitcoin is high in countries with unstable national currencies. In Turkey, where the annual inflation rate is around 70%, Bitcoin trades at a premium similar to gold. These examples demonstrate that Bitcoin can indeed fulfill the role of a hard asset.
(Source: BlockScholes, Yahoo)
Given the above examples, it is evident that Bitcoin holds significant potential to serve as a hard currency in the future. But does this mean that citizens of developed countries, currently protected by stable monetary systems, have no need to include Bitcoin in their portfolios? Even outside of crisis situations, allocating a portion of one’s portfolio to Bitcoin can offer substantial benefits in terms of diversification. As illustrated in the graph, although Bitcoin’s correlation with other assets like gold, stocks, and the dollar can be volatile over time, it generally exhibits distinct price movements. This unique characteristic alone makes it advantageous to hold a portion of assets in cryptocurrencies like Bitcoin.
(Source: K33 Research)
Indeed, many financial institutions in the United States have recently added BTC ETFs to their portfolios. According to K33 Research, in the first quarter of 2024, 937 institutions reported holding Bitcoin ETFs in their 13F filings. Among them were notable names like JP Morgan, UBS, and Wells Fargo, as well as the Wisconsin Investment Board, which acquired BTC ETFs worth approximately $160 million. This trend indicates that Bitcoin is increasingly being recognized as a store of value.
(Fast food to the moon)
Even before the inflationary effects of the COVID-19 era’s quantitative easing have fully dissipated, the United States is increasing liquidity again in anticipation of the upcoming presidential election. The Treasury Department is expanding fiscal spending, and starting May 29, plans to conduct bond buybacks for the first time in over twenty years. Simultaneously, the Federal Reserve is slowing the pace of quantitative tightening.
Consequently, the dollar will continue to face inflationary pressures and will be issued in large quantities during major economic downturns. Unless the United States maintains its leadership through continual innovation in military, scientific, and industrial fields, the value of the dollar is bound to decline over time. Conversely, this will naturally increase the attention and value of Bitcoin.
However, to achieve the same status as gold as a hard asset, Bitcoin faces a critical challenge: the security scale and profitability of its network. The essential element for maintaining Bitcoin’s value is the security level of its network. The more miners there are to mine Bitcoin, the more secure the network becomes, thereby solidifying Bitcoin’s value.
Bitcoin miners earn revenue in two primary ways: block rewards and transaction fees. Block rewards are the Bitcoins awarded for successfully mining a block, with the amount being fixed and halved every four years. Transaction fees, on the other hand, are the fees paid by users for conducting transactions on the Bitcoin network, separate from block rewards.
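The halving schedule described above is fixed by the protocol and can be sketched in a few lines (the constants come from Bitcoin itself; the function name is my own):

```python
# Bitcoin's block-subsidy schedule: 50 BTC at genesis, halved every
# 210,000 blocks (roughly every four years).
HALVING_INTERVAL = 210_000
INITIAL_SUBSIDY_SATS = 50 * 100_000_000  # 50 BTC, expressed in satoshis

def block_subsidy_sats(height: int) -> int:
    """Subsidy (in satoshis) paid to the miner of the block at `height`."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:  # after 64 halvings the subsidy is exactly zero
        return 0
    return INITIAL_SUBSIDY_SATS >> halvings

print(block_subsidy_sats(0))        # 5_000_000_000 sats = 50 BTC
print(block_subsidy_sats(840_000))  # 312_500_000 sats = 3.125 BTC (2024 halving)
```

The right shift mirrors Bitcoin Core's own integer-arithmetic implementation, which is why the subsidy eventually rounds down to zero rather than shrinking forever.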
(Fees should be higher to achieve sustainability | Source: dune, @21co)
For miners to continue participating in the Bitcoin network, their mining revenue must exceed their costs. Due to the halving that occurs every four years, block rewards diminish over time, necessitating an increase in transaction fee revenue to make up the difference. However, unlike networks such as Ethereum and Solana, the Bitcoin network has limited applications and low scalability, leading to fewer transactions and consequently lower transaction fee revenue. Recently, new token standards like Ordinals and Runes have momentarily increased activity on the Bitcoin network, but there is no long-term guarantee that these will significantly contribute to transaction fee revenue.
(Source: MacroMicro)
Up to now, mining revenue has generally surpassed mining costs. However, as block rewards continue to decrease due to future halvings, unless 1) Bitcoin’s price rises substantially or 2) network activity increases to boost transaction fee revenue, there is a risk that miners will exit the network. This would lower the security level of the Bitcoin network, diminishing its intrinsic value and potentially leading to a vicious cycle of further miner departures and decreased security.
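The fee gap miners face can be made concrete with a quick calculation (illustrative only; the target revenue level is a hypothetical figure, not a protocol parameter):

```python
def required_fees_btc(halvings: int, target_revenue_btc: float) -> float:
    """Fee revenue per block needed to keep total miner revenue at a target,
    given the block subsidy after `halvings` halvings (50 BTC at epoch 0)."""
    subsidy = 50.0 / (2 ** halvings)
    return max(0.0, target_revenue_btc - subsidy)

# Suppose miners needed 6.25 BTC of revenue per block to remain profitable:
for h in range(4, 8):
    print(f"after {h} halvings, fees must cover {required_fees_btc(h, 6.25)} BTC")
```

With each halving, an ever-larger share of that fixed target must come from fees, which is exactly why low on-chain activity becomes a long-term security concern.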
This highlights the primary difference between gold and Bitcoin. Gold’s intrinsic value is not tied to profitability, whereas Bitcoin’s intrinsic value is directly linked to it. Therefore, ensuring profitability is a long-term challenge that the Bitcoin network must address. While there is currently no definitive solution within the Bitcoin community, the emergence of applications such as Ordinals, Runes, and innovations like OP_CAT suggest a potential increase in transaction fee revenue in the long term.
(Is this truly the future of humanity? | Source: The Matrix)
Historically, unlike currency, innovative technologies such as AI have always brought significant changes to society. The steam engine, electricity, and the internet revolution transformed the global industrial landscape, profoundly impacting human jobs and lifestyles. While these technological revolutions brought about various social issues during their transitional periods, they ultimately provided humans with much more prosperous lives. Steam engines and electricity liberated humans from most physical labor, while digital and internet technologies freed them from simple forms of mental labor.
Unlike currency, innovative technologies such as AI have historically brought lasting changes to society. The steam engine, electricity, and the internet transformed the global industrial landscape, profoundly impacting human jobs and lifestyles. While these technological revolutions created various social problems during their transitional periods, they ultimately gave humans far more prosperous lives. Steam engines and electricity liberated humans from most physical labor, while digital and internet technologies freed them from simple forms of mental labor.
(Fun fact: Illia is the person you know, iykyk)
AI has been studied since the mid-twentieth century, but meaningful results were slow to emerge. The pace of development accelerated dramatically after the 2017 paper “Attention Is All You Need” introduced the Transformer architecture. This breakthrough made it far easier to build large language models (LLMs), bringing humanity a step closer to artificial general intelligence (AGI). Like previous industrial revolutions, the development of AGI is expected to bring a significant increase in productivity and a substantial societal impact. However, I believe the implications will differ significantly for several reasons.
First, AGI will free humans from almost all forms of labor. Previous industrial revolutions liberated humans from physical and simple mental labor, leading to a higher proportion of the population engaged in more sophisticated tasks. However, AGI can handle advanced mental labor, including artistic endeavors such as art and music. Coupled with advanced robotics, this means the areas in which humans can contribute to productivity will diminish significantly.
(The modern day Luddite movement?)
Of course, this does not mean all jobs will disappear. Even in the 21st century, a portion of the population is engaged in agriculture and fisheries, although the proportion is much lower than in the past. While most job types will remain with the advent of AGI, the number of people needed to perform them will drastically decrease. For instance, tasks that ten people currently handle could be managed by one person in the future, leading to a significant increase in the population unable to find employment. Notably, leading figures in AI, such as Elon Musk and Sam Altman, have argued that AI and robots will handle global productivity, resulting in widespread job loss for humans.
Some argue that efficiency could be maximized while maintaining the current employment levels, but this is a misconception. For this to happen, demand would need to increase proportionally with the significant boost in supply (productivity) provided by AGI. However, in most fields, this is not feasible. Job creation would have to occur in new areas beyond AGI’s reach, but as mentioned earlier, AGI’s capabilities extend beyond physical to mental tasks, making this unlikely.
Secondly, AI is an inherently centralizing technology. Even before achieving AGI, the AI industry has already become heavily centralized around big tech companies, driven by the rapid pace of AI progress. Since the introduction of the Transformer architecture, the size of language models grew by a factor of roughly 10^4 between 2018 and 2022. Consequently, there are significant technological disparities across the essential industries that constitute the AI stack.
(Source: @EricFlaningam)
(Source: Counterpoint)
In summary, centralization is inevitable in the AI industry, where achieving economies of scale is essential. As the AI industry becomes more centralized, several micro-level issues can arise, such as excessive corporate profit-seeking, unethical data use, single points of failure like server downtimes, and the opacity of AI models. On a macro level, we may face societal chaos as the line between humans and AI blurs, and many people lose their jobs. I believe that blockchain technology, which inherently pursues decentralization, can serve as an antithesis to AI, addressing the challenges associated with AI centralization. Let’s explore how blockchain can be applied to the AI industry.
Just as Satoshi Nakamoto introduced Bitcoin in 2008, advocating decentralization in response to the unchecked issuance of currency by central banks, blockchain technology can be utilized in various ways in the AI industry, where centralization trends are driven by economies of scale.
Among the five highly centralized elements mentioned earlier, semiconductor design and production require concentrated expertise and substantial manufacturing facilities, leaving little room for blockchain solutions. However, blockchain can be effectively applied in the fields of ‘computing power,’ ‘AI models,’ and ‘data.’ Additionally, it can address issues such as the proliferation of fake information, including deepfakes, and support basic income policies for a populace facing mass unemployment. Let’s explore the potential applications of blockchain technology within the AI pipeline.
Decentralized Computing
Training and inferring AI models require immense computing power and hardware. Big tech companies continuously purchase GPUs like NVIDIA’s H100 for their model training, exacerbating the global hardware supply shortage. While services like AWS and Azure provide data centers for cloud-based AI model training and inference, they operate as oligopolies, imposing high margins on users. In response to these challenges, new services leveraging blockchain technology to offer decentralized computing power have emerged.
Examples include Akash and io.net, where users can contribute their hardware’s computing power to the platform in exchange for incentives. There are also protocols specialized in niche services. For instance, Gensyn is optimized for training AI models. General decentralized computing services can reduce costs by utilizing idle hardware, but it is challenging to perform state-dependent computations, such as AI model training, in a decentralized manner. Gensyn addresses this with concepts like probabilistic proof-of-learning and graph-based pinpoint protocol. While Gensyn is specialized in training AI models, Bittensor focuses on AI model inference. Users can submit tasks, and Bittensor’s decentralized nodes compete to provide the optimal results.
zkML
zkML, a fusion of zero-knowledge (zk) cryptography and machine learning (ML), promises to enhance the privacy and transparency of AI models. Many AI models currently operate as closed-source, leaving users uncertain whether these models are using the correct weights and performing inference honestly. By applying cryptographic techniques like ZK-SNARKs (Zero-Knowledge Succinct Non-Interactive Argument of Knowledge) to ML models, it becomes possible to prove that an AI model has executed its inference process correctly without revealing its weights, thus achieving both privacy and computational integrity.
(Source: Polygon ID)
ZK-SNARKs are a powerful cryptographic technology that allows the validity of arbitrary computations to be proven without revealing the input data. To illustrate this, consider a real-world example: proving one’s age online. Typically, this requires complex KYC verification, involving the disclosure of personal information such as name and ID. With ZK technology, this process can be simplified and made more private. Once a user has verified their age with an official entity, they can generate and submit a ZK proof whenever they need to prove they are over 18. This proof contains no personal information but still assures the verifier of the user’s age, making the identity verification process safer and simpler.
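The underlying idea can be illustrated with a far simpler zero-knowledge protocol than a SNARK: a Fiat–Shamir Schnorr proof, in which a prover demonstrates knowledge of a secret exponent x satisfying y = g^x mod p without revealing x. This is a toy sketch with a deliberately tiny group; real deployments use standardized large groups or elliptic curves:

```python
import hashlib
import secrets

# Toy group: p = 2q + 1 with p, q prime; g = 4 generates the order-q subgroup.
# (Deliberately tiny for illustration -- never use parameters this small.)
p, q, g = 467, 233, 4

def prove(x: int):
    """Prove knowledge of x such that y = g^x mod p, without revealing x."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)              # random nonce
    t = pow(g, r, p)                      # commitment
    c = int.from_bytes(hashlib.sha256(f"{g}|{y}|{t}".encode()).digest(), "big") % q
    s = (r + c * x) % q                   # response binds nonce, challenge, secret
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    # Recompute the Fiat-Shamir challenge and check g^s == t * y^c (mod p).
    c = int.from_bytes(hashlib.sha256(f"{g}|{y}|{t}".encode()).digest(), "big") % q
    return pow(g, s, p) == (t * pow(y, c, p)) % p

y, t, s = prove(secrets.randbelow(q))
print(verify(y, t, s))  # True: the verifier learns nothing about x beyond y
```

The check works because g^s = g^(r + cx) = g^r · (g^x)^c = t · y^c (mod p). zkML applies the same principle, with vastly heavier machinery, to an entire model-inference computation.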
(Top: Standard ML, Bottom: zkML | Source: @danieldkang Medium)
Applying the same concept to ML models, a consumer using a closed-source ML model cannot be sure whether the model performed the computation honestly on the given input. By incorporating ZK-SNARKs, an ML provider can assure the consumer that the computation was carried out correctly without revealing the input or weights. A ZKP (Zero-Knowledge Proof) of the ML inference process can be generated and verified by a smart contract on a neutral blockchain protocol, ensuring that anyone can trust the results.
(Source: Modulus Labs)
While the concept of zkML is highly attractive, significant challenges remain. Verifying ZKPs for specific computations is straightforward, but generating these proofs requires more computational power than performing the actual computation. According to Modulus Labs, generating a Plonky2-based ZKP for an ML model with 18 million parameters takes about one minute. Given that GPT-3 has 175 billion parameters and GPT-4 has 1.76 trillion parameters, substantial advancements are needed before zkML can be adopted meaningfully.
Data Sovereignty
As the AI industry continues to evolve, the significance of data grows exponentially. However, this surge has led to increasing instances of data sovereignty infringements. By leveraging blockchain technology, individuals can manage their identity-related information through self-custody, providing data only when necessary via digital signatures. Moreover, blockchain enables transparent data provision or sale through incentive systems or marketplaces accessible to all. Perhaps the most blockchain-like approach to data sovereignty so far has come from Reddit, which offered long-time users the chance to participate in its IPO while contracting to provide data to Google. This move exemplifies a novel path in data sovereignty.
While slightly tangential to data sovereignty, blockchain also holds the potential to address issues in the data labeling industry. Data labeling is essential for enhancing the accuracy and ethics of AI models. Currently, this task often falls to low-wage workers, emerging as a new social issue. For instance, China’s AI industry exploits vocational school students, and OpenAI has outsourced this work to low-wage workers in Kenya. Integrating blockchain into data labeling could democratize participation and ensure fair compensation.
Proof of Personhood
Decentralized computing, zkML, and data sovereignty may solve some AI industry challenges. Yet, proof of personhood and universal basic income could safeguard human sovereignty in a society drastically altered by AGI. Let us explore how blockchain might support human sovereignty amidst such profound social transformation.
As AI models advance, the production of various content forms—text, images, videos—by AI becomes increasingly prevalent. Distinguishing whether these outputs are human-made is becoming more challenging. The acceleration of digitalization is inevitable, and as AI-generated content proliferates, the associated social problems will undoubtedly surge.
(Did Caitlyn Jenner really launch a memecoin?)
These issues are not merely speculative; they are already occurring. Fraud through deepfakes, which mimic the faces and voices of individuals, has become alarmingly frequent, resulting in substantial financial losses. The authenticity of videos is now often hotly debated online due to the existence of deepfakes.
A recent incident involving Caitlyn Jenner illustrates this point vividly. She announced the launch of a meme coin on the Solana network via the platform X. Given the unusual nature of the announcement, many suspected her account had been hacked. Despite Caitlyn posting a video herself, there was significant controversy over whether it was a deepfake. This debate persisted until Caitlyn’s manager also released a video, helping to somewhat settle the matter.
(proof of personhood | Source: Worldcoin)
As we advance into the AI age, one of the most critical challenges will be proving one’s humanity in the digital realm. This concept, known as “proof of personhood,” aims to prevent Sybil attacks and disinformation in the digital world. Currently, most applications rely on government-issued identity systems like passports or credit cards to verify personhood. However, these methods pose privacy risks and the potential for single points of failure. Thus, a truly digital identity system is essential. Blockchain technology offers a solution, allowing individuals to prove their humanity and the authenticity of their created content, potentially mitigating issues like deepfakes.
(Scanning iris through Orb | Source: Sam Altman)
The most common approach to digital identity verification is biometrics, which authenticates individuals by their unique physical traits. OpenAI’s CEO Sam Altman is pioneering a project called Worldcoin, which combines blockchain technology with iris scanning. Users install an application on their mobile devices and receive a private key (account) on the blockchain. By using an iris-scanning device called the Orb, users can authenticate their humanity in the digital world. The Orb verifies that the user is indeed a person and that the iris has not been previously registered, securely granting a digital identity.
The Orb transmits only the hash value of the iris data to the server, destroying the actual iris data afterward. Users can later prove their personhood without revealing their account address, thanks to ZK-SNARKs, addressing privacy concerns. However, potential issues like hardware backdoors still need to be resolved. The importance of proof of personhood extends beyond content authenticity. It plays a crucial role in the concept of universal basic income, which we will explore next.
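The deduplication step can be sketched as a simple hash registry (a hypothetical sketch with my own names; the real Orb pipeline encodes irises as specialized iris codes and uses fuzzy matching rather than exact hashes, since biometric readings are noisy):

```python
import hashlib

class OrbRegistry:
    """Toy sketch: store only digests of iris templates, reject duplicates."""

    def __init__(self) -> None:
        self._seen: set = set()

    def register(self, iris_template: bytes) -> bool:
        """Return True if this iris is new; False if it was already registered."""
        digest = hashlib.sha256(iris_template).hexdigest()  # raw template never stored
        if digest in self._seen:
            return False  # Sybil attempt: the same person registering twice
        self._seen.add(digest)
        return True

registry = OrbRegistry()
print(registry.register(b"alice-iris-template"))  # True: first registration
print(registry.register(b"alice-iris-template"))  # False: duplicate rejected
```

Keeping only the digest captures the privacy property described above: the registry can detect repeats without ever holding the biometric data itself.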
Universal Basic Income
(Source: Scott Santens)
As previously mentioned, the advent of AGI is poised to bring about an unprecedented leap in productivity in human history. However, this revolutionary progress will inevitably result in significant job displacement. To sustain societal stability, the concept and necessity of Universal Basic Income (UBI) are gaining increasing attention. The idea of UBI predates AGI, tracing its origins back to Thomas More’s “Utopia” in the 16th century. UBI entails providing regular, unconditional financial support to all members of society. An existing example of UBI can be found in Alaska, where the Alaska Permanent Fund Dividend offers a form of UBI, demonstrating positive outcomes across various dimensions such as poverty, employment, and health.
The focus here, however, is not on a UBI that merely enhances quality of life, but on a UBI substantial enough to support individuals who lose their jobs due to AGI, ensuring they can live adequately without employment. Elon Musk refers to this as “universal high income.” Similarly, Sam Altman has shown considerable interest in UBI, conducting research through OpenResearch. He has proposed innovative ideas such as providing UBI in the form of assets and means of production like equity or computing power, rather than just cash.
Sam Altman’s Worldcoin, discussed in the “Proof of Personhood” section, is also closely linked to UBI. A critical aspect of UBI distribution is ensuring that only genuine individuals receive it and preventing multiple claims by the same person. Thus, preventing Sybil attacks is crucial for implementing UBI. Worldcoin aims to achieve this through iris recognition for proof of personhood. Currently, users verified via iris recognition on the Worldcoin app receive WLD tokens periodically, a form of UBI. Although I resonate with Worldcoin’s vision, I harbor some reservations about the distribution of WLD tokens.
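Combining proof of personhood with periodic distribution reduces, in essence, to "one claim per verified person per epoch." A hypothetical sketch (names and amounts are mine, not Worldcoin's actual contract logic):

```python
class UBIDistributor:
    """Toy sketch: distribute a fixed grant once per verified person per epoch."""

    def __init__(self, grant_per_epoch: int) -> None:
        self.grant = grant_per_epoch
        self._claimed: set = set()        # (person_id, epoch) pairs already paid
        self.balances: dict = {}

    def claim(self, person_id: str, epoch: int, is_verified_person: bool) -> bool:
        if not is_verified_person:        # proof-of-personhood gate blocks bots
            return False
        if (person_id, epoch) in self._claimed:
            return False                  # double-claim within the same epoch
        self._claimed.add((person_id, epoch))
        self.balances[person_id] = self.balances.get(person_id, 0) + self.grant
        return True

ubi = UBIDistributor(grant_per_epoch=10)
print(ubi.claim("alice", epoch=1, is_verified_person=True))   # True
print(ubi.claim("alice", epoch=1, is_verified_person=True))   # False: duplicate
print(ubi.claim("alice", epoch=2, is_verified_person=True))   # True: new epoch
```

On a blockchain, the claimed-set and balances would live in a public contract, which is what makes the distribution auditable by anyone.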
Even beyond Sam Altman’s Worldcoin, blockchain technology will be indispensable for establishing a complete UBI system. Blockchain can enhance transparency and efficiency not only in selecting recipients through proof of personhood but also in the distribution process, ensuring a more effective and transparent UBI delivery.
Despite the unprecedented crises marked by the collapses of Terra and FTX, the blockchain market has swiftly regained its scale. However, reflecting on both the previous and current market booms, a distinct shift in the industry’s vision is evident. In 2021, numerous protocols were driven by the grand vision of decentralization, capturing the imagination and excitement of many. Now, despite the market’s similar scale, there seems to be a pervasive uncertainty within the industry and community about the direction blockchain should take. This is not due to any failure on our part or a deficiency in blockchain technology itself; rather, it is simply that the current era has not yet created a pressing need for blockchain technology.
While it is intriguing to observe blockchain’s application in niche markets, the industry must set its sights higher. As the long history of humanity has shown, we will continue to experience cyclical monetary systems and revolutionary technological innovations. Within these vast movements, blockchain will stand as a crucial technology that will safeguard human sovereignty.