(Comments)

Original link: https://news.ycombinator.com/item?id=40974112

The proposed small web browser aims to provide an alternative to traditional HTTP/HTML-based web browsing. It supports multiple protocols, including HTTP(S), Gopher, Gemini, Spartan, Scorpion, and local files, with optional NNTP support for newsgroup reading. Out of privacy and complexity concerns, it also excludes certain HTTP/1.1 features such as cookies, user agent, referer, etag, and cross-origin requests. The browser plans to adopt a subset of HTTP/1.1, supporting GET/POST requests while limiting support for HTTP headers. In addition, it will integrate minimal HTML5 support, enabling basic multimedia embedding, though display of embedded media can be selectively disabled. The browser will use modern CSS, prioritizing ease of use and performance. It will neither include nor depend on JavaScript, to preserve simplicity and security. TLS-encrypted connections are encouraged but not mandated, allowing unencrypted connections where appropriate. Integration with SOCKS5 proxies (e.g. Tor) is supported to enhance online anonymity, while passwordless authentication via client certificates provides a seamless user experience. The browser will include a locally stored index of compliant sites, so sites can be discovered without relying on external search engines. Third-party updates to the index are allowed, similar to package managers such as APT. Custom provider options keep the network decentralised, and the browser may also try alternative methods (such as simplified versions of Twitter or Mastodon) to access unsupported services. Sites remain accessible through traditional web browsers, ensuring compatibility with existing infrastructure. The browser additionally provides guarantees on a subset of modern web features intended to protect user privacy and improve overall usability. Finally, the availability of open-source code is emphasized, fostering transparency and community collaboration; however, licensing terms beyond AGPLv3 are accepted to accommodate different development practices. Hardware access requests are minimized except for explicitly requested capabilities such as audio input. A design using "proxy capabilities" could address hardware access in a future operating system implementation.

Comments about gemini://xavi.privatedns.org/small-web-browser.gmi :

I do not believe that just using this existing HTTP/HTML is the way to do it (and other people also agree with me about this), although it is one way to do it, and can be combined with others.

Such a "small web" browser could be designed to support multiple protocols and file formats. So, in addition to HTTP(S), also Gopher, Gemini, Spartan, Scorpion, Nex, local files, and possibly NNTP (this would not be as good as dedicated news reader software, but it would at least allow reading articles from an NNTP server without needing to set up a dedicated NNTP client; Lynx also supports NNTP).

> While I do think HTTP/1.1 is good enough for most tasks [...] there are several aspects that I do not particularly like: Cookies, User agent, Referer, Etag, Cross-origin requests

I do not like these features much either. HTTP/1.1 is still good enough for many tasks, even though it is messy in some ways and more complicated than it could be; for the purpose the article describes, accessing services that use HTTP, it is good enough. (One useful HTTP feature that Gemini, Spartan, and Gopher lack, but Scorpion does not, is Range requests, although that is less useful for a browser than for a download manager, including command-line programs such as curl. Multiple ranges in a single request seem an unnecessary complexity to me, though.)
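As a minimal sketch of the single-range requests mentioned above: a client asks for a byte span via the `Range` header and a compliant server answers `206 Partial Content` with a `Content-Range` header (per RFC 9110). The two helper functions below are illustrative, not part of any existing library:

```python
def range_header(start, end=None):
    """Build a single-range HTTP Range header value, e.g. 'bytes=0-1023'."""
    if end is None:
        return f"bytes={start}-"       # open-ended: from `start` to end of file
    return f"bytes={start}-{end}"      # inclusive byte range

def parse_content_range(value):
    """Parse a 'bytes start-end/total' Content-Range value into integers."""
    unit, _, rest = value.partition(" ")
    if unit != "bytes":
        raise ValueError(f"unsupported range unit: {unit!r}")
    span, _, total = rest.partition("/")
    start, _, end = span.partition("-")
    return int(start), int(end), int(total)
```

A download manager resuming an interrupted transfer would send `range_header(bytes_already_received)` and append the `206` body to the partial file.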

> Support a small subset of HTTP/1.1, supporting GET/POST, while effectively removing support for most HTTP headers.

Agree. (You could also support adding arbitrary extra headers by user configuration; e.g. the user could specify that they want to add an "Accept-Language" header, a "DNT" header, or whatever other arbitrary headers they might want.)
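One way such user-configured headers could work, sketched here with an invented `build_headers` helper (the config mapping is hypothetical, e.g. loaded from the browser's settings file):

```python
def build_headers(host, user_headers=None):
    """Merge user-configured extra headers into an outgoing request.

    `user_headers` maps header names to values (e.g. {"DNT": "1"});
    names and values are sent verbatim.
    """
    headers = {"Host": host}
    for name, value in (user_headers or {}).items():
        # User additions must not override headers the browser itself sets.
        headers.setdefault(name, value)
    return headers
```

The `setdefault` call is the design choice that matters: arbitrary user headers are allowed, but they cannot clobber the few headers the protocol subset itself requires.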

> Support a subset of HTML5, so that embedded images, audio and video are possible.

Mostly agree. It would be useful to let the user switch embedded images on or off; when off, they would appear as links. Embedded audio/video is probably not useful at all.

> Support modern CSS, possibly leaving deprecated or complex features out.

I would probably leave out most of the features, although you do not necessarily have to. More important would be to allow disabling CSS, and to ensure that "complying with the requirements above" (see below) guarantees a site works correctly if the user chooses to disable CSS.

> Support NO JavaScript at all, as JavaScript is one of the main sources of complexity behind a modern web browser, and is typically abused for user fingerprinting.

Agree.

> Mandate the use of TLS-encrypted connections.

Disagree. Encrypted and unencrypted connections are both useful, and the URI scheme would distinguish them; this lets end users easily filter sites that do not support encryption out of their local index.
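The scheme-based filtering suggested above could be a one-liner over the local index; this sketch assumes a simple list of URL strings, and the set of schemes treated as encrypted is an illustrative choice (Gemini mandates TLS, plain `http`/`gopher` do not):

```python
from urllib.parse import urlsplit

ENCRYPTED_SCHEMES = {"https", "gemini"}

def encrypted_only(urls):
    """Keep only index entries whose URI scheme implies an encrypted connection."""
    return [u for u in urls if urlsplit(u).scheme in ENCRYPTED_SCHEMES]
```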

> Allow integration with SOCKS5 proxies e.g.: Tor.

Agree, although in addition to this, it is also sometimes useful to be able to use local programs as proxies and to have the proxy handle TLS (although handling client certificates becomes somewhat complicated in that case).

> Provide passwordless authentication via client certificates, and always ask for user authorization beforehand.

Agree, with both parts. (Passwords might still be implemented too, although if you don't want to, then you don't have to; HTTP has an "Authorization" header for this purpose, and Scorpion also supports something similar, in addition to supporting client certificates if the connection is encrypted.) It will be necessary to ensure that the user can tell the browser to log out at any time, with both passwords and client certificates.

> Provide a local index of sites complying with the requirements above, so that sites can be found without the use of an external search engine. [...] Such index can be updated from third-parties, similarly to package managers like APT.

I think it is a good idea.
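To make the APT analogy concrete, here is a sketch of how a local site index from multiple providers might be parsed and merged. The file format is invented for illustration: one `url<TAB>title` entry per line, `#` for comments:

```python
def parse_index(text):
    """Parse a provider's index file into a {url: title} mapping."""
    entries = {}
    for line in text.splitlines():
        if not line.strip() or line.startswith("#"):
            continue  # skip blank lines and comments
        url, _, title = line.partition("\t")
        entries[url] = title
    return entries

def merge_indexes(*indexes):
    """Merge provider indexes; later providers win on conflicting URLs,
    so the user's local edits can simply be merged last."""
    merged = {}
    for index in indexes:
        merged.update(index)
    return merged
```

As with APT sources, the ordering rule is the interesting part: putting the user's own index last means a manual entry always overrides what any third-party provider ships.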

> Custom providers can be easily added by users, so the network remains decentralised.

This is important if you are doing the above. (Being able to manually adjust the index is also helpful; see the next paragraph for why this is helpful.)

In addition to this, there is another possibility: an alternate-service index. When a link points to an unsupported service (i.e. one not in the index), the browser can interpret it using an alternate service (e.g. a plain-HTML version of Twitter or Mastodon, or a Gemini service that displays a proxied news article). In some cases it may be able to detect this from the retrieved HTML or the HTTP response headers, e.g. whether the site is a Mastodon instance; other times the user might specify the alternate service manually when viewing it.
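Such detection would necessarily be heuristic. The sketch below guesses whether a response came from a Mastodon instance; both markers are illustrative assumptions about what a response might contain, not a guaranteed detection method:

```python
def looks_like_mastodon(headers, html):
    """Heuristic guess from response headers and retrieved HTML.

    Both checks are illustrative: a Server header mentioning Mastodon,
    or a page element the Mastodon web UI is assumed to emit.
    """
    if "mastodon" in headers.get("Server", "").lower():
        return True
    return 'id="mastodon"' in html
```

A browser would treat a positive guess only as a suggestion, offering the user the matching alternate service rather than switching silently.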

> Sites accessible from it can still be accessed from traditional web browsers.

OK. (If you follow my multi-protocol suggestion above, then this is not always the case; I think it is useful to have multiple ways, and this is one of them.)

> It provides guarantees on a subset of features from the modern web that do not harm users.

OK.

> Users do no longer have to worry on inspecting which websites can be trusted, as such guarantees would be provided by the browser.

This is very helpful.

> It allows reusing existing tools, both web browsers and servers.

Yes, although it is not always desirable for several reasons, e.g. for testing compatibility. (Sometimes it is desirable, though.)

> Because of the smaller set of features, it also leads to simpler code, allowing more implementations to flourish over time.

This is also helped by my suggestion to require that it works correctly if the user chooses to disable CSS.

It additionally links to a "Native Web" document. I disagree with those ideas. It is not necessary to allow only AGPLv3, since source code can be made available in other ways that are compatible with AGPLv3 (e.g. public-domain source code without patent restrictions). I would use uxn/varvara, which is much simpler to implement, more portable, and avoids the other disadvantages listed there, although it is also a less "powerful" system and not native code, so it trades one disadvantage for another. About hardware access, I think a program should not request hardware access directly; instead, e.g. if it requests audio input, the user can supply a microphone, another program, an existing audio file, etc. (This can also be solved in my way of designing a new operating system with "proxy capabilities"; such a system could run inside other systems as well as stand-alone, and can run native code as well as emulate non-native instruction sets, so that is another way to solve it, although it is more complicated than using uxn/varvara.)