Author: Token Engineering Commons; Translator: Sissi@TEDAO
Token engineering is an emerging and rapidly developing field responsible for building and connecting invisible pathways used by humans, machines, and artificial intelligence in transactions, commerce, communication, and coordination.
Since the advent of Bitcoin in 2009, we have witnessed significant progress in the field of cryptocurrency. The launch of Ethereum in 2015 sparked a wave of cryptographic experiments. Before the launch of Ethereum smart contracts, the design space for token-based economies on production-level blockchains was very limited. It was not until 2018, as these cryptographic experiments matured and patterns within the ecosystem took shape, that the term "token engineering" was formally proposed.
The initial landscape of this emerging field is shaped by blockchain infrastructure, governance mechanisms, privacy solutions, decentralized finance (DeFi), and related subtopics. The permissionless nature of Web3, rooted in open-source principles, has grown complex enough to prompt the development of social graphs that address challenges such as coordination. These social infrastructures integrate reputation and reward systems, metrics for assessing stakeholder contributions, and identity solutions that preserve a degree of privacy while preventing attacks. Token engineering thus acts as a mediator among the multidisciplinary fields of Web3, whose common denominator is the creative use of tokens.
Many token-based systems have been built in recent years as blockchain technology has advanced. The idea of decentralization has penetrated many organizations, promoting collaboration across backgrounds. Efforts centered on the regenerative economy have matured and drawn attention to public goods funding. Decentralized Science (DeSci) is transforming academic infrastructure, co-op-based models are being introduced to Web3, and holistic network algorithms are improving operational efficiency and deepening transparency, while also increasing the complexity of interactions in many successful projects. These are just some of the developments that attract more people to build Web3 solutions.
Yet, despite many positive examples, the broader Web3 community has also experienced unprecedented disruption and reputational damage: the failure of centralized exchanges like FTX, the collapse of token experiments like LUNA, smart contract exploits like The DAO hack and bridge hacks, phishing scams, and a proliferation of pump-and-dump and Ponzi schemes. Many of these failures can be attributed to a lack of maturity or insufficient due diligence.
Despite these high-profile setbacks and regulatory uncertainty in crypto markets, innovation continues. Individuals and institutions in the blockchain space are increasingly demanding a common language and framework, seeking trusted ways to advance their work.
Tokens, as tools for decentralizing ownership, asserting rights, and enhancing coordination strategies, represent the ambitions that token systems aspire to achieve, as well as consumers’ growing demand for greater privacy and autonomy. While the overall concept of token engineering seems clear at a high level, the details of its practice and precise definitions of terminology are abstract, motivating this research. Interactions with token engineering practitioners seek to answer foundational questions to enhance rigor and legitimacy in the way token systems are conceived, researched, designed, implemented, validated, and maintained.
Thus, the main question of this study is: What is token engineering?
In this report, we explore this question through the lens of people working in the field—their practices, challenges, and needs. We define the boundaries of this emerging field, share participants’ perspectives on the future, and provide a discussion guide to further explore these topics.
We hope that these findings will highlight the edges of the field and inspire continued experimentation and imagination about how it can shape our future world. This not only provides deep insights into the theory and practice of token engineering, but also provides a valuable reference for practitioners and researchers to better understand and advance this field.
To answer the key question of this study—“What is token engineering?”—it is first necessary to clarify the meaning of each keyword:
Participants generally view tokens as stores of value or assets on a blockchain. One respondent noted, “A token is a cryptographic representation of information and/or value that can flow freely between systems like atoms.” In chemistry and physics, atoms are the fundamental units of matter and the building blocks of all chemical elements. Similarly, tokens are the building blocks of blockchain-based value exchange systems. Our participants expressed a variety of perspectives. Some saw tokens as symbols of information and tools for building social systems; others saw tokens as tools for experimenting with financial innovation. Tokens can carry one or more value functions at the same time and can be programmed to represent credit, reputation, vouchers, shares, or rights. They were described as a medium for optimizing toward a goal, opening up new forms of value exchange, ownership, and authentication. The programmability and traceability of tokens are at the core of token engineering.
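The programmability noted above can be made concrete with a small sketch. The snippet below is our own illustration (the `Ledger` and `Account` names are hypothetical, not from any participant’s system): a single ledger in which the same account carries both a transferable balance and a non-transferable reputation score, i.e., a token carrying more than one value function at once.

```python
from dataclasses import dataclass


@dataclass
class Account:
    """Hypothetical account holding two token functions at once."""
    balance: int = 0      # transferable value (e.g. a payment token)
    reputation: int = 0   # non-transferable record of contributions


class Ledger:
    """Minimal, illustrative sketch of a programmable token ledger."""

    def __init__(self):
        self.accounts: dict[str, Account] = {}

    def account(self, addr: str) -> Account:
        return self.accounts.setdefault(addr, Account())

    def mint(self, addr: str, amount: int) -> None:
        self.account(addr).balance += amount

    def transfer(self, src: str, dst: str, amount: int) -> None:
        # Value flows freely between accounts...
        if self.account(src).balance < amount:
            raise ValueError("insufficient balance")
        self.accounts[src].balance -= amount
        self.account(dst).balance += amount
        # ...while reputation is earned through activity, never transferred.
        self.accounts[src].reputation += 1
```

The point of the sketch is only that the same primitive can encode several of the roles participants named (asset, reputation, record of behavior) depending on how its rules are programmed.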
Why emphasize “engineering” instead of terms such as token economics, tokenomics, token design, or mechanism design?
Two experienced participants cited the definition of engineering on Wikipedia as an appropriate starting point. At the time of this research, the definition cited was as follows:
“The American Engineers’ Council for Professional Development (as cited in 2022–2023) defines ‘engineering’ as:
the creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them singly or in combination; or to construct or operate the same with full cognizance of their design; or to forecast their behavior under specific operating conditions; all as respects an intended function, economics of operation and safety to life and property.”
Of the two participants, one emphasized that design is part of engineering, while the other reiterated that it is a process of designing, validating, deploying, and maintaining artifacts. Using the word “engineering” is an acknowledgement of the entire process. Discussing only token design may underestimate the importance of these other practices, which are directly related to the ethics and safety of what is created. Safety is a word often used to describe the importance of engineering. One participant shared, “You design with the reasonable expectation that people can use these [artifacts/public infrastructure] without having to individually and independently verify their security or integrity.” When people cross a bridge, they don’t need to first test the bridge’s safety, but assume that this has been taken care of in the engineering process, otherwise the bridge wouldn’t exist. The term “engineering” seems to extend the same rigor to Web3 systems. Of course, the word “engineering” alone is not enough to guarantee safe and ethical implementations. But it does suggest a Hippocratic Oath-like principle of “do no harm” that many participants believe is necessary for successful progress in the field. As with the term “design,” many participants felt that terms like “economics” were too restrictive for the many uses and potentials of tokens that extend far beyond the traditional decentralized finance or fiscal allocation scenarios that are often associated with using tokens in Web3.
Summary: Tokens in this report are defined as blockchain assets, and engineering is viewed as a scientific approach to building these token-based systems, an approach that emphasizes rigor, security, and ethics.
Some participants expressed discomfort with the term “engineering,” arguing that a lack of formal engineering education should not preclude someone with the relevant knowledge and expertise from becoming a token engineer. The role of a token engineer encompasses the ability to design, implement, and validate Web3 artifacts. Some argued that because each stage involves different specialized complexities, the work is better handled by a team than by an individual.
Many believe that as the field moves toward specialization, companies should hire teams rather than individuals to complete these tasks, especially when faced with complex or high-impact products. One participant likened token engineers to product managers, driving projects through the stages to ensure high-quality results; another argued that they are more like system auditors, focusing on assessing the security and usability of complex systems.
While the exact responsibilities of token engineers have not been uniformly defined, there is broad agreement on clarifying the responsibilities of clients and practitioners. This role includes not only engineering skills, but also analysis, meeting organization, compliance, and entrepreneurship, and is part of the entire ecosystem.
Summary: The role of a token engineer is not just a single position, but a complex process that requires collaboration among multiple roles. As the field develops, it would be very beneficial to borrow standards and practices from traditional engineering.
“Breathing life into a digital soul.” - Ataberk Casur
Are there clear steps in token engineering? When engineers design bridges or roads, they do not build them themselves, but are responsible for developing blueprints - these carefully crafted plans that others use to actually build and serve the public. Similarly, token engineers create a set of blueprints that may then be handed off to software developers or other personnel for implementation. Creating precise blueprints and guiding projects through various stages constitutes the core process of token engineering.
Participant feedback indicated that most practitioners do not have a clear picture of the complete end-to-end process, even though such clarity is considered beneficial. Not all participants agreed that a fixed process exists, but their responses revealed some key distinctions and stages. Five basic phases of token engineering are outlined here: Discovery, Design, Implementation, Verification and Validation, and Maintenance.
These phases are iterative and do not necessarily occur in sequence. It is common that the end of a phase may return to the previous phase rather than proceed directly to the next phase. This may be due to changes in requirements, the revelation of new information, or revisions to technical and regulatory options. As these systems contain many elements, iteration is part of the process.
Participants described in detail their work and how they ensured the quality of their solutions. The following sections further detail the phases of the token engineering process.
Description: The discovery phase is typically the first contact with the client and the analysis of the project requirements. During this phase, the conversation evolves to transform the problem into a clearer project scope, which can be in the form of a written document, diagram, or roadmap. During this phase, the client and token engineer will determine the appropriate engineering approach for the project and expectations for the future life cycle of the project.
Challenges: Challenges in this phase include unclear requirements from stakeholders and difficulty communicating complex concepts and processes.
Highlights: The highlight of the discovery phase is to identify assumptions from the early stages and to identify and map as many unknown variables as possible.
Description: The design phase is the stage where the requirements and expectations obtained in the discovery phase are further refined. Here, the goals of the system are established, and indicators for evaluating these goals are proposed. The subject behavior is analyzed more deeply, and incentive mechanisms are designed to guide the behavior of stakeholders within the system. This stage also involves technical decisions about the selection of underlying technologies, primitives, platforms, and tools. The design phase will also identify and prepare the parameters and conditions required for simulation in the implementation phase, including subject-based modeling and algorithm design.
Challenges: This stage often requires resolving conflicting goals. Before simulation can begin, the ultimate goals of the economy and their relative importance must be determined. Sometimes people agree on the long-term goals but disagree on how to achieve them. A further challenge of this stage is to develop incentive mechanisms that reinforce the values and goals of the system without introducing undesirable consequences.
Highlights: Participants clearly distinguished between creating new tokens and handling third-party token flows as possible behavioral options in the system. David Sisson emphasized that the main focus of token engineering is the interweaving of mechanisms and stakeholders, not merely the creation of tokens. He added that making tokens the main focus can, depending on the situation, become a distraction, though this does not mean tokens play no role in the process.
The immutability of blockchain is an important feature, but some parts of the system are easier to change than others. When setting initial conditions, it is crucial to think about all the steps and foresee how specific components may be parameterized in the future without redesigning the entire system. Especially in terms of governance, the rapid evolution of governance mechanisms and the choice of some projects to gradually decentralize their decision-making power may bring many changes. Considering these potential changes before setting initial conditions can help create a more resilient system.
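One way to act on this advice is to separate, at the specification stage, the constants fixed at deployment from the parameters a future governance process may tune. The sketch below is a hypothetical illustration of that separation (all names, fields, and bounds are our own assumptions, not a design from the study):

```python
import dataclasses
from dataclasses import dataclass


@dataclass(frozen=True)
class FixedConstants:
    """Settings that cannot change without redeploying the system."""
    max_supply: int
    decimals: int


@dataclass
class GovernableParams:
    """Settings a governance process may tune after launch."""
    issuance_rate: float   # tokens minted per epoch
    fee_bps: int           # protocol fee in basis points


def apply_governance_vote(params: GovernableParams, new_fee_bps: int) -> None:
    """Governance can adjust a parameter, but only within preset bounds."""
    if not 0 <= new_fee_bps <= 1000:
        raise ValueError("fee outside allowed range")
    params.fee_bps = new_fee_bps
```

Deciding up front which values belong in the frozen set and which belong in the governable set is exactly the kind of parameterization the participants described: it lets the system evolve without a redesign.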
Description: Computational implementation is a term in computer science that refers to the faithful preservation of the basic properties and relationships of a system when it is converted from a theoretical model (such as a conceptual idea) to a computational representation (such as software). At this stage, the initial conditions and parameters determined in the design phase are transformed into formal specifications, which may be mathematical equations, algorithm descriptions, or pseudocodes to generate simulations. The role of simulations and models is to facilitate understanding of the designed system and potentially verify or disprove certain assumptions so that adjustments and iterations can be made as needed.
Challenges: The implementation phase may require fine-tuning of parameters and specifications to ensure that computational visualizations accurately reflect the model and stakeholder objectives. When simulation results are ready to be shared with stakeholders, ensuring that they are clearly understood is also a challenge.
Highlights: Implementation methods are not isolated and sequential. They can occur during the design, verification, and maintenance phases because implementation centers on transformations between model representations and optimization through observation and iteration of system behavior. Implementation approaches described by our participants included:
Algorithm design: involves defining inputs, specifying desired outputs, and developing a clear set of steps to transform inputs into desired outputs
Agent-based modeling and simulation: simulating complex systems by representing individual agents and their interactions according to predefined rules for agent behavior and environmental conditions
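The two approaches above can be combined in a toy example. The simulation below is an illustrative sketch of agent-based modeling (the agent count, behavioral rule, and parameters are all our own assumptions, not from any participant’s work): each step, agents stake half of their token balance with a fixed probability, following a predefined behavioral rule, and total token supply is conserved by construction.

```python
import random


def step(balances: dict, staked: dict, stake_rate: float) -> None:
    """One simulation step: each agent stakes half of its balance
    with probability stake_rate (the predefined behavioral rule)."""
    for agent in balances:
        if random.random() < stake_rate and balances[agent] > 0:
            amount = balances[agent] // 2
            balances[agent] -= amount
            staked[agent] += amount


def simulate(n_agents: int = 10, n_steps: int = 20,
             stake_rate: float = 0.3, seed: int = 42):
    """Run the toy agent-based simulation and return final state."""
    random.seed(seed)  # fixed seed makes runs reproducible and shareable
    balances = {i: 100 for i in range(n_agents)}
    staked = {i: 0 for i in range(n_agents)}
    for _ in range(n_steps):
        step(balances, staked, stake_rate)
    return balances, staked
```

Real tools such as cadCAD structure the same idea (state, behavioral rules, parameter sweeps) far more rigorously; the sketch only shows the shape of the method: defined inputs, a clear stepwise transformation, and observable outputs.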
Description: Verification and validation are the cornerstones of token engineering, distinguishing a simple design process from an engineering discipline that builds systems with rigor and protects stakeholders from potential risks and weaknesses.
There was considerable disagreement among participants regarding the use of the terms “verification” and “validation”. Upon further review, we found that this is a common debate in the engineering community, so we selected the most consistent instances in the dataset and explained our use of these words.
The “validation” process ensures that design choices are technically feasible and effectively meet stakeholder needs and expectations, including considerations such as product-market fit. Validation focuses on consistency with user needs and external adaptability, ensuring that the final result meets the original intention and goals.
The “verification” process checks the security, integrity, and reliability of the system, including rigorous audits of the code, the incentive structure, and the system as a whole. In contrast to validation, verification ensures that the final result conforms to its description and specification.
The forms of verification and validation mentioned in this study include digital verification, analog verification, and token engineering audits:
Digital: Smart contract audit
Analog: Incentive structure analysis
System audit or token engineering audit: Overall risk assessment, verifying the accuracy of the system, including the procedures and rules for how tokens should and should not be used in the protocol. This considers interoperability, token economics review, system performance evaluation, product-market fit, and regulatory compliance.
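As a hedged illustration of what an “analog” check of an incentive structure might look like in practice, the property test below (a toy of our own devising, not a tool any participant described) asserts that a pro-rata staking reward scheme can never pay out more than the epoch budget, for any randomly generated stake distribution:

```python
import random


def epoch_rewards(stakes: list, budget: int) -> list:
    """Pro-rata reward distribution for one epoch (illustrative).
    Integer floor division deliberately rounds in the budget's favor."""
    total = sum(stakes)
    if total == 0:
        return [0] * len(stakes)
    return [budget * s // total for s in stakes]


def check_budget_invariant(trials: int = 1000, seed: int = 7) -> bool:
    """Property check: total payout never exceeds the epoch budget,
    regardless of how stake happens to be distributed."""
    random.seed(seed)
    for _ in range(trials):
        n = random.randint(1, 20)
        stakes = [random.randint(0, 10**6) for _ in range(n)]
        budget = random.randint(0, 10**6)
        if sum(epoch_rewards(stakes, budget)) > budget:
            return False
    return True
```

Checking incentive rules as properties over many randomized states, rather than only auditing code line by line, is the kind of analog verification the participants argued is still underdeveloped.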
Highlights: Trent McConaghy emphasized that we can draw theory and practice from analog, digital and mixed signal verification in electrical engineering and apply it to token engineering. Digital verification is relatively mature in token engineering, but existing tools often ignore analog elements such as incentive design and extreme cases. Analog verification is still in its early stages, using tools such as cadCAD and TokenSPICE. There is a huge opportunity to develop more analog and mixed-signal verification tools. Griff Green proposed the concept of "token engineering audit", advocating the establishment of targeted security standards for Web3, covering liquidity strategies and regulatory compliance.
Challenges: External pressure to “ship yesterday” often leads to accelerated due-diligence processes, which has produced multiple adverse outcomes in the market and brought reputational damage to the entire field. Slowing down in order to speed up matters here: these stages create necessary pauses for a project, because once a system is on-chain and used by multiple stakeholders, it is difficult to change.
Not only can code go wrong, but human behavior guided by design incentives can also go wrong. This is a moment when the complexity of the system needs to be carefully considered. Therefore, professionals with a deep understanding of the entire process are best suited to perform token engineering audits, although research respondents pointed out that such talents are still scarce in this field.
Description: The maintenance phase focuses on the long-term operation of the system. Whether it is one month, three months, a year or longer after the system goes online, it is necessary to ensure that everything is running as planned. The main tasks of the maintenance phase include collecting, analyzing and monitoring data and feedback, verifying the user's acceptance of the design assumptions over time, adjusting the evaluation indicators, and comprehensively optimizing the system.
Challenges: Keeping attention on system maintenance, and ensuring it is treated as important, is itself a challenge. Participants rarely shared methods for maintaining systems effectively and sustainably. Who is responsible for continuously updating and maintaining the simulations? What analysis and monitoring methods can be used? How can existing metrics be improved? How long should token engineers remain involved in project maintenance? How can an effective feedback loop with system users be established? Answering these questions is a challenge that token engineers and projects must face.
Highlights: Economic systems are constantly changing and extremely complex. There are always some aspects that need to be adjusted and improved. Token engineers should also focus on how to guide stakeholders to participate in the subsequent maintenance of the system.
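As a minimal sketch of the monitoring work described above (the metric, window, and threshold are our own assumptions, not a method reported by participants), a maintenance process might flag metric drift when a recent window’s mean diverges from the long-run mean:

```python
def drift_alert(history: list, window: int = 7,
                threshold: float = 0.2) -> bool:
    """Return True when the mean of the most recent `window` observations
    deviates from the overall mean by more than `threshold` (a fraction).
    A deliberately simple heuristic for ongoing system monitoring."""
    if len(history) < window:
        return False  # not enough data to compare yet
    overall = sum(history) / len(history)
    if overall == 0:
        return False  # avoid dividing by a zero baseline
    recent = sum(history[-window:]) / window
    return abs(recent - overall) / abs(overall) > threshold
```

In practice a team would monitor several such metrics (participation, token velocity, treasury flows) and treat an alert as a prompt to revisit design assumptions, not as an automatic intervention.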
“As an engineer, you always want to design the most robust system that works in the best interests of all participants, but this is often not easy to achieve.” - Danilo Lessa Bernardineli
Participants faced many challenges, with regulatory uncertainty being a particular concern that has significantly slowed the industry’s growth and innovation. Individuals also face substantial challenges, such as the need to continuously improve their skills, handle diverse expertise demands, balance work cycles, and avoid burnout in a high-pressure environment. Communicating the field’s importance within the industry and to a wider audience adds further difficulty. Finding the right balance between simplicity and complexity in token systems is another key challenge explored in this section. While there is no direct path to solving these challenges, addressing them is essential for the field to move forward.
Participants clearly pointed out the lack of standardization as a major problem. Specifically, there is a lack of standards in the following areas:
Standardization of terminology
Standards for assessing the quality of work
Educational certification
Documentation of learning and case studies
An up-to-date, easily searchable and accessible knowledge base
Twenty-five of the 41 participants stressed the importance of standardization. Both junior and senior token engineers face challenges in obtaining high-quality insights such as benchmarks, standards, guidelines, and frameworks from the successes and failures of various projects. The lack of standards is seen as a major barrier to learning efficiency: in the process of finding the information they need, people may have to conduct endless searches, constantly sifting through large amounts of data that include irrelevant and outdated information. The lack of standards is also seen as a major bottleneck for token systems in gaining recognition from regulators and formal organizations.
While there are some existing resources and activities around token engineering, it is clear that they fall short of meeting the current needs of practitioners, highlighting the need for more robust and accessible standards to support and expand the field.
Participants elaborated on key challenges encountered in communicating this multifaceted concept. Specific challenges include:
Converting qualitative narratives into quantitative mathematical functions
Narrowing broad goals into specific metrics
Lack of a shared understanding of what token engineering is
Clear understanding of what clients need
Problems with peer communication throughout the process
Accurately communicating the nuances of design choices to clients
Evangelizing and raising awareness of what is being built to the broader Web3 community
In the field of token engineering, language considerations go beyond the distinction between human-readable and code-based languages. Token engineers need to deal with various forms of information exchange and design effective communication systems. Participants are challenged to build effective interaction paths using a variety of “languages” including mathematics, data analysis, and visual representations, each of which is a unique way of communicating and expressing themselves. Building consensus and communication was further challenged by a lack of understanding of the field of token engineering, the role of token engineers, and how to translate broad goals into measurable metrics. Participants also reported challenges in communicating details of their work and the capabilities of their models and simulations. Effective communication was seen as essential to producing high-quality results.
Accessibility and rapidly spreading knowledge are significant challenges in advancing the field of token engineering. Specifically, participants highlighted the following issues:
Overwhelmed by the multidisciplinary learning experience
Inadequate educational resources for token engineering
Lack of education or career paths for token engineers
Frequent switching of tools and tracking of new versions
Limited career opportunities and lack of mentors for junior token engineers
Challenges in staying engaged and patient during the learning process
Resources for token engineering have evolved significantly since the concept was first proposed in 2018. For example, the use of tools such as cadCAD has made the practice more accessible. The Token Engineering Academy’s Token Engineering Foundations course has trained hundreds of students to enter the industry, and the Token Engineering Commons is democratizing the industry by providing funding opportunities for projects at all stages. These are just some of the many initiatives advancing the field. However, education and accessibility remain major challenges. Tools, concepts, and use cases are constantly changing in this rapidly evolving, interdisciplinary field. As the diversity of the ecosystem increases, the required knowledge base expands with it. The mechanisms that provide junior practitioners with a path to specialization have not yet been able to meet the huge demand for professionals in this field.
While there is a lot of excellent educational work done by TE Academy, TEC and other institutions, there are still many limitations, namely the lack of a path for people with no background to truly develop the knowledge and experience to master this field.
In summary, while some progress has been made in the field of token engineering, the issue of access to education and resources remains a major challenge to the continued development of this field. Here are some key conclusions and recommendations:
Strengthen the construction of educational resources: In order to address the problem of insufficient educational resources, the industry should develop and share educational materials more widely, especially training on professional knowledge and tools for token engineering.
Innovate educational models: In view of the limitations of traditional educational methods, more flexible and fast learning methods can be explored, such as online courses, short-term workshops, and direct interaction with industry experts to better adapt to the fast pace of industry development.
Establish career development paths: Clarify and optimize the career path of token engineers, provide a multi-stage career development plan from entry level to advanced, and help junior engineers gradually grow into the senior experts the industry needs.
Promote industry cooperation: Encourage cooperation within the industry, including cross-company and cross-disciplinary projects, which not only improves the efficiency of resource utilization but also aids the exchange and transfer of knowledge and skills.
Expand the mentor network: Increase investment in mentorship and establish a more complete mentor system, in particular giving junior practitioners more opportunities to work with experienced professionals.
Through these measures, the token engineering field can not only solve the existing education and resource problems, but also provide practitioners with a wider range of career development opportunities and learning resources, and ultimately promote the healthy and sustainable development of the entire industry.
The lack of a funding model for open source development has prompted token engineers to turn to selling consulting services and developing closed tools. Due to the competition and scarcity of resources, a new culture of confidentiality has formed around knowledge sources. Specific issues include:
Limited resources to develop complex projects
Stakeholders are often focused on short-term returns
Often only projects that are seen as Ponzi schemes can get funding to hire token engineers
Economic vulnerability due to high reliance on venture capital
The capabilities of a single token engineer are limited, and hiring a full team to handle all aspects of token engineering is often beyond budget
Lack of funding for infrastructure and tooling
Ensuring that the value created by a project or community is accurately mapped to its token, thereby opening up more innovative funding mechanisms, is a major challenge facing the field. Design feasibility adds further complexity, since workflows vary across stages of development: some features and tools may be mature while aspects such as user experience and accessibility are still evolving, depending on the specific use case. The overall state of Web3 directly affects the working environment of token engineering. Developing many of the field’s tools is like building a bridge while simultaneously building the machinery needed to construct it. In addition to securing funding for the token engineering process in client projects, participants also face the resource challenge of developing infrastructure during this complex phase.
The token engineering field faces the following challenges due to its complexity and nascent nature:
Multiple levels of innovation are happening simultaneously, making it difficult to track inputs and outputs
The existence of many variables makes it difficult to determine the specific reasons why things succeed or fail
Poor user experience and inherent risks indicate that these systems are not suitable for all users
It is easy to get distracted, and continuous project follow-up is often lacking
Even with models and simulations, there is still uncertainty about what people actually do after the system is activated
Token engineering is still in its infancy, especially in the field of incentive design, and has huge development potential. This field is gradually moving towards academicization, and the capabilities of Web3 are also expanding rapidly. Facing the challenges of an emerging field can be daunting. Token engineering is multidisciplinary and focuses on socio-technical complex systems, which increases the challenges and complexity of its early stages.
The task of token engineering involves dealing with complex systems, which often consist of interconnected and dependent components, whose overall behavior and characteristics cannot be directly inferred from the behavior of individual parts. The interaction of components within the system leads to nonlinearities and emergent phenomena that are not explicitly presupposed when the system is designed. This emphasizes the complexity and unpredictability of the task.
In the face of these challenges, we need to acknowledge and accept the difficulties and complexities of the emerging field, while taking proactive measures to promote the maturity and development of the field. Through continued education and cross-disciplinary collaboration, as well as strengthening the combination of practice and theory, the field of token engineering can better cope with its inherent complexity and changing technical requirements. Over time, the accumulation of practical experience and academic research will support the maturity of the field, ultimately achieving more efficient and fair token systems, and bringing deep understanding and broad opportunities to practitioners and stakeholders.
When discussing the issues that keep them up at night, the regulatory environment is the most pressing concern expressed by participants. The main reasons include:
Unclear risks and consequences of token experimentation
Liability issues, particularly regarding who may be accountable for multi-stakeholder systems
Regulatory constraints slowing down industry development
Risks are too high, as not all projects can afford to innovate with tokens while ensuring regulatory compliance
Uncertainty during the design phase leading to unexpected or delayed costs
Limited access to legal expertise grounded in blockchain experimentation
Navigating the complex regulatory compliance landscape presents a major challenge for token engineers, which impacts critical decisions at the intersection of innovation and legal frameworks. This delicate balance often translates into practical challenges, where well-conceived token engineering projects can encounter delays and increased risks due to regulatory uncertainty. There are many examples of rigorously developed models encountering roadblocks and stalling development due to legal uncertainty.
Regulatory uncertainty and communication barriers have hampered the development of the field. Lack of standardized practices and limited educational resources have exacerbated overhead, while funding models have posed additional challenges to innovation and industry growth. Despite its potential, token engineering is still in its early stages and faces numerous challenges.