Curl on 100 Operating Systems

原始链接: https://daniel.haxx.se/blog/2023/11/14/curl-on-100-operating-systems/

This article, "curl on 100 Operating Systems", originally appeared on Daniel Stenberg's blog, where he writes about open source software maintenance and governance as well as personal productivity. The original post can still be read at the link above.

The article is a reminder that maintaining code across many hardware and platform configurations is challenging and takes careful consideration and planning. Legacy systems often need extra attention and resources to keep functionality, performance and stability intact, so they must be factored in when software is designed or updated. That curl aims to run consistently across more than 100 operating system types illustrates how much cross-platform compatibility and interoperability matter, and also how much development effort full cross-platform consistency demands. Whether a project focuses primarily on older legacy environments or exclusively on modern ones ultimately depends on its goals and its target audience; either way, cross-platform compatibility remains a valuable asset in software design and maintenance today.

As an example of compiling programs for other environments, I became familiar with this kind of approach by building Docker images for Go programs. It lets one draw on more libraries and modules and can simplify deployment and management of Go software. As with other approaches that compile in an external environment, though, limitations and problems around module compatibility or dependencies can appear, and additional setup and configuration steps may be needed to ensure correct functioning. Still, the technique is a practical way to combine the strengths of two different programming ecosystems, at the cost of some extra effort to implement correctly.

Overall, software design and development involves balancing innovation and progress against the needs of diverse audiences. For maintainability and reliability, design choices have to be weighed carefully across the software's whole life cycle, especially in light of continuous technical advancement and changing operational requirements. These are complex, open-ended questions that have to be resolved case by case. Ultimately, maintaining legacy software environments alongside modern ones takes careful judgment to ensure consistent functionality, good performance, reliable scalability and high availability across all supported platform configurations.

Original

In a recent pull-request for curl, I clarified to the contributor that their change would only be accepted and merged into curl’s git code repository if they made sure that the change was done in a way so that it did not break (testing) for and on legacy platforms.

In that thread, I could almost feel how the contributor squirmed as this requirement made their work harder. Not by much, but harder no less.

I insisted that since curl at that point (and still does) already supports 32 bit time_t types, changes in this area should maintain that functionality. Even if 32 bit time_t is of limited use already and will be even more limited as we rush toward the year 2038. Quite a large number of legacy platforms are still stuck on the 32 bit version.

Why do I care so much about old legacy crap?

Nobody asked me exactly that using those words. I am paraphrasing what I suspect some contributors think at times when I ask them to do additional changes to pull requests. To make their changes complete.

It is not so much about the legacy systems. It is much more about sticking to our promises and not breaking things if we don’t have to.

Partly stability and promises

In the curl project we work relentlessly to maintain ABI and API stability and compatibility. You can upgrade your libcurl-using application from the mid 2000s to the latest libcurl – without recompiling the application – and it still works the same. You can run unmodified scripts you wrote in the early 2000s with the latest curl release today – and it is almost guaranteed to work exactly the same way as it did back then.

This is more than a party trick and a snappy line to use in the sales brochures.

This is the very core of curl and libcurl and a foundational principle of what we ship: you can trust us. You can lean on us. Your application’s Internet transfer needs are in safe hands and you can be sure that even if we occasionally ship bugs, we provide updates that you can switch over to without the normal kinds of upgrade pains software so often comes with. In a never-ending fashion.

Also of course. Why break something that is already working fine?

Partly user numbers don’t matter

Users do matter, but what I mean in this subtitle is that the number of users on a particular platform is rarely a reason or motivator for working on supporting it and making things work there. That is not how things tend to work.

What matters is who is doing the work and whether the work is getting done. If we have contributors around who keep making sure curl works on a certain platform, then curl will keep running on that platform even if it is said to have very few users. Those users don’t maintain the curl code. Maintainers do.

A platform does not truly die in curl land until necessary code for it is no longer maintained – and in many cases the unmaintained code can remain functional for years. It might also take a long time until we actually find out that curl no longer works on a particular platform.

On the opposite side, it can be hard to maintain a platform even if it has a large number of users, if there are not enough knowledgeable maintainers around who are willing to work on issues specific to that platform.

Partly this is how curl can be everywhere

Precisely because we keep this strong focus on building, working and running everywhere, sometimes even with rather funny and weird configurations, curl and libcurl have ended up in so many different operating systems, running on so many CPU architectures and installed in so many things. We make sure it builds and runs. And keeps doing so.

And really. Countless users and companies insist on sticking to ancient, niche or legacy platforms and there is nothing we can do about that. If we don’t have to break functionality for them, having them stick to relying on curl for transfers is oftentimes much better security-wise than almost all other (often homegrown) alternatives.

We still deprecate things

In spite of the fancy words I just used above, we do remove support for things every now and then in curl. Mostly in terms of dropping support for specific 3rd party libraries as they dwindle away and fall off like leaves in the fall, but also in other areas.

The key is to deprecate things slowly, with care and with an open communication. This ensures that everyone (who wants to know) is aware that it is happening and can prepare, or object if the proposal seems unreasonable.

If no user can detect a changed behavior, then it is not changed.

curl is made for its users. If users want it to keep doing something, then it shall do so.

The world changes

Internet protocols and versions come and go over time.

If you bring up your curl command lines from 2002, most of them probably fail to work. Not because of curl, but because the host names and the URLs used back then no longer work.

A huge reason why a curl command line written in 2002 will not work today exactly as it was written back then is the transition from HTTP to HTTPS that has happened since then. If the site actually used TLS (or SSL) back in 2002 (which certainly was not the norm), it used a TLS protocol version that nowadays is deemed insecure and modern TLS libraries (and curl) will refuse to connect to it if it has not been updated.

That is also the reason that if you actually have a saved curl executable from 2002 somewhere and manage to run that today, it will fail to connect to modern HTTPS sites. Because of changes in the transport protocol layers, not because of changes in curl.

Credits

Top image by Sepp from Pixabay

Discussion

Hacker news
