ForkLog

Web3 will look different


For the past several years the crypto community has awaited the advent of Web3, where data belong to users, everything runs on the blockchain, and a crypto wallet is needed to access information.

The standard vision of the next-generation internet rests on decentralisation. Data are not stored on centralised servers and are not exploited for personalised advertising.

But what if Web3 turns out somewhat different? Consider a future in which decentralisation is achieved through locally running, user-built applications created with AI agents.

What many expect of Web3

With the emergence of cryptocurrencies and the development of blockchain technology, many began to see them as the foundation for the internet’s next iteration. For example, the author of the Web 2.0 concept, Tim O’Reilly, argues that Web3 will be a meaningful stage of progress if it can connect the crypto-economy to the real world—including legal systems, property, payments, identity, applied services and production.

The crypto community sees the chief distinction from Web 2.0 in deeper decentralisation at every layer, including data storage and application execution. Ideally, product development is handled not by an owner but by a distributed community governing the project via a DAO.

Decentralisation is viewed as a foundational principle that allowed cryptocurrencies and smart contracts to find their place in the economy: they reduce reliance on intermediaries and centralised structures.

Be your own programmer

The rise of large language models (LLMs), AI agents and vibe-coding invites a different take on Web3. ForkLog is not calling for ditching decentralisation and blockchains—only for widening the vision of the internet to come.

What if every user were their own programmer? Such a user can write applications not for general consumption but for personal tasks, run them locally on their own computer or a remote server, and depend on no central provider.

Take decentralised exchanges such as PancakeSwap or Uniswap. They are collections of smart contracts operating on Ethereum, BNB Chain and several other blockchains. The networks are decentralised, and access to services is via non-custodial wallets.

That may look like Web3 already. Yet a single point of failure remains: the frontend. The official websites through which users access these exchanges are centralised; they can delist certain tokens or block users by IP or wallet address.

Direct access to smart contracts without the official site is possible, but hard. One can use third-party frontends—again introducing centralised choke points—or open a contract in a blockchain explorer like Etherscan and call functions via Write Contract. This is awkward, complex and requires technical skills. Not everyone will manage it.
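The "hard" route above can be sketched in a few lines. The snippet below is a minimal illustration, not a production tool: it builds a raw JSON-RPC `eth_call` request of the kind any Ethereum node accepts, using the well-known 4-byte selector `0x0902f1ac` for Uniswap V2's `getReserves()` (the first four bytes of `keccak256("getReserves()")`). The pair address is a placeholder, not a real contract.

```python
import json

# Placeholder pair contract address -- substitute a real, verified one.
PAIR_ADDRESS = "0x" + "ab" * 20

def eth_call_payload(to: str, selector: str, request_id: int = 1) -> str:
    """Serialise an eth_call request for a read-only contract function."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "eth_call",
        "params": [{"to": to, "data": selector}, "latest"],
    })

# 0x0902f1ac = selector for Uniswap V2 getReserves()
payload = eth_call_payload(PAIR_ADDRESS, "0x0902f1ac")
# This string could then be POSTed to any RPC endpoint, e.g. a local node
# at http://127.0.0.1:8545, with no website in between.
```

Sending the request and ABI-decoding the response take a few more lines, but the point stands: nothing in this path touches a centralised frontend.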

AI offers a third route: write an application via vibe-coding and run it locally on your PC. We tried building such a product using Zed, OmniRoute and LLMs from Anthropic and OpenAI.

Screenshot: ForkLog.

The frontend was created through Lovable. When run locally the app still looks rough and needs interface polish, but it performs all functions.

Screenshot: ForkLog.

The application was built in a few hours of vibe-coding without any programming knowledge. In the future, AI will be smarter and able to generate complete tools without dozens of prompts and constant corrections. It may be enough to request: “Create and deploy an application for providing liquidity on Uniswap.”

The idea of running local applications can be extended as far as imagination allows.

Mobile apps are fair game too. AI can produce not only a website but also an Android APK for the same purposes, connecting directly to smart contracts on a blockchain.

Imagine you learn that the Spark service offers 12% APY on DAI stablecoins. You visit the site, but your IP is blocked. A VPN does not help. In the Web3 future described here, that is no problem. You open Claude Code and write a prompt:

“Create an application to earn using the Spark protocol on Ethereum. It should support adding DAI and withdrawing them, as well as a dashboard to track investment performance.”

The AI builds a service that connects directly to smart contracts, bypassing frontend blocks. It runs locally on a PC—no centralised components.
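As a toy illustration of the numbers such a dashboard would track, here is the arithmetic behind the quoted 12% APY, assuming simple annual compounding (real DeFi yields vary continuously; this is only the headline figure):

```python
def projected_balance(principal_dai: float, apy: float, years: float) -> float:
    """Compound growth at a fixed APY: principal * (1 + apy) ** years."""
    return principal_dai * (1 + apy) ** years

# 1,000 DAI at 12% APY:
print(round(projected_balance(1000.0, 0.12, 1.0), 2))  # 1120.0 after one year
print(round(projected_balance(1000.0, 0.12, 3.0), 2))  # 1404.93 after three years
```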

Local AI

In such a Web3 the single point of failure could be AI itself, specifically centralised language models. ChatGPT, Gemini and similar systems run on the servers of OpenAI, Google and other labs, which can filter traffic and impose censorship and restrictions.

There is, however, a remedy: open-source LLMs that you can run on your own machine or a remote server.

For example, you can assemble a configuration from a local model runtime, an open-source LLM and an AI-capable code editor.
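One plausible setup, sketched under the assumption that Ollama serves the models mentioned below (the exact Zed settings keys vary by editor version):

```shell
# Install Ollama, then pull an open-source coding model to run locally.
ollama pull qwen2.5-coder:7b

# Ollama exposes an HTTP API on localhost:11434 by default.
ollama serve

# In Zed's settings.json, point the assistant at the local endpoint
# (sketch only -- check your Zed version's documentation):
#   "language_models": {
#     "ollama": { "api_url": "http://localhost:11434" }
#   }
```

With this in place, no prompt or generated code ever leaves the machine.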

You then chat in Zed as with a regular bot; it writes code and launches applications, while the LLMs run locally.

An example of working with AI in Zed to create your own application. Screenshot: ForkLog.

The model you choose depends on your machine. For example, on a MacBook Air with 16GB of RAM, qwen2.5-coder:7b, qwen3:8b, llama3.2:3b and deepseek-r1:8b are suitable. On a local server you can run something more powerful, though that will no longer be free.

There are many powerful open-source models, but most are Chinese: DeepSeek, Alibaba’s Qwen3.5, Kimi K2 / K2.5 / K2.6. Among American firms only Meta has moved in this direction, and even its latest LLM was released closed-source. Google has the Gemma line, which is not a flagship family but is well suited to local deployment.

In May 2025 Tether announced a new platform for developing “infinite and ubiquitous intelligence,” envisaging the “launch and evolution” of AI agents on users’ devices instead of the data centres of big firms.

QuantumVerse Automatic Computer (QVAC) removes the need for cloud connectivity and offers greater privacy, autonomy and resilience. Its modular architecture lets developers build and extend applications using small composable elements.

A peer-to-peer network ensures direct device-to-device communication and collaboration without dependence on centralised servers.

Apple is building AI with an emphasis on on-device operation—Apple Intelligence. Some tasks are performed directly on iPhone, iPad or Mac to account for a user’s personal context without collecting personal data. More complex tasks still use the cloud, but Apple’s own—Private Cloud Compute. Apple says only relevant data are sent, deleted after processing, with the system designed around privacy.

Open projects

Beyond writing your own code from scratch, you can turn to ready-made open-source projects. GitHub abounds with implemented ideas.

Among them are several projects for liquidity management.

When scouting interesting repositories, analyse the code, verify contract addresses and test functionality in a sandbox or with small sums. No one guarantees quality.
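The "small sums" advice can be enforced mechanically. The snippet below is a hypothetical guardrail, with invented names (`VERIFIED_CONTRACTS`, `check_interaction`): it refuses any interaction with a contract address you have not verified by hand, and caps amounts while testing.

```python
# Addresses checked manually against the project's docs and a block explorer.
# The value here is a placeholder, not a real contract.
VERIFIED_CONTRACTS = {
    "0x" + "11" * 20,
}

MAX_TEST_AMOUNT_DAI = 25.0  # never risk more than this while testing a repo

def check_interaction(contract: str, amount_dai: float) -> bool:
    """Allow only verified contracts and small test amounts."""
    if contract.lower() not in VERIFIED_CONTRACTS:
        return False
    return 0 < amount_dai <= MAX_TEST_AMOUNT_DAI

print(check_interaction("0x" + "11" * 20, 10.0))   # True: verified, small sum
print(check_interaction("0x" + "22" * 20, 10.0))   # False: unknown contract
print(check_interaction("0x" + "11" * 20, 500.0))  # False: over the test cap
```

A check like this belongs in front of every transaction-signing call an AI-generated tool makes.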

You can use ready-made solutions as-is or adapt them to your needs. Often you need not write code manually—ask an AI agent to make the necessary changes to an existing project.

Screenshot: ForkLog.

Drawbacks

The chief obstacle to the Web3 future sketched here is the shortage of convenient, ready-made tools and the technical complexity of implementation. You can already build a frontend for decentralised Web3 projects with AI, but it remains challenging for the average user: without guidance or hours of research, vibe-coding, installing tools such as Zed or Antigravity, running local LLMs and wiring them together through OmniRoute are hard to master.

One option is to use ready-made applications from OpenAI (Codex) or Anthropic (Claude Code). But that undermines decentralisation, and token usage could become costly. Theoretically, the do-it-yourself route can be entirely free: link several Google accounts to services that grant free tokens.

This, then, is one possible direction for Web3.

It is too early to say whether technology will evolve precisely this way. Web3 may turn out more familiar—free of single points of control and less dependent on large platforms, yet with decentralisation delivered by relatively small groups of developers. And most users will still rely on ready-made solutions.
