For algorithms, a little memory outweighs a lot of time

原始链接: https://www.quantamagazine.org/for-algorithms-a-little-memory-outweighs-a-lot-of-time-20250521/

Understanding the relationship between P (problems solvable in a reasonable amount of time) and PSPACE (problems solvable in a reasonable amount of space) is a central challenge of complexity theory. Although P is a subset of PSPACE, PSPACE is widely believed to be much larger, which would mean that space is a more powerful computational resource than time, because memory can be reused. Researchers have tried to establish this by showing that algorithms with a limited space budget can solve problems that would otherwise require a larger time budget, using a technique called "simulation." Hopcroft, Paul and Valiant made the first breakthrough by devising a space-saving universal simulation procedure. Researchers then ran into a fundamental barrier, resting on the seemingly obvious assumption that different pieces of data cannot occupy the same memory location at the same time, and progress stalled. For fifty years that barrier held, until Ryan Williams — who had gone to Cornell University, a historic center of complexity theory, intent on the P versus PSPACE problem — finally broke the logjam. Williams' early single-minded devotion to the field laid the groundwork for his later breakthrough work.

A Hacker News discussion centers on one principle: exploiting memory can improve algorithmic efficiency dramatically, sometimes more than optimizing for time. The paper in question shows that multitape Turing machines can be simulated with less space than previously expected. Commenters discuss how precomputation and caching — lookup tables, hash-based storage deduplication — embody this principle in practice. The thread extends to community-scale caching (such as precompiled software) and the limits of O(1) random memory access in real data centers, where access complexity is closer to O(n^(1/2)). It also touches on large language models (LLMs) as, in essence, enormous precomputed models — a sophisticated form of caching that approximates complex solutions. The core takeaway is that storing intermediate results or precomputed values acts as a kind of "time compression": reusing stored data avoids redundant computation and makes algorithms more efficient.
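To make the caching theme concrete, here is a minimal Python sketch (my own illustration, not code from the article or the discussion) of memoization: a recursive computation stores intermediate results so that repeated subproblems are answered from memory instead of being recomputed, trading a modest amount of space for a large amount of time.

```python
from functools import lru_cache
import time

def fib_slow(n: int) -> int:
    """Naive recursion: exponential time, constant extra memory."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n: int) -> int:
    """Memoized recursion: linear time, linear extra memory.
    Stored intermediate results are reused instead of recomputed."""
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

if __name__ == "__main__":
    start = time.perf_counter()
    fib_slow(32)
    print(f"naive:    {time.perf_counter() - start:.3f}s")

    start = time.perf_counter()
    fib_cached(32)
    print(f"memoized: {time.perf_counter() - start:.6f}s")
```

The cached version keeps one entry per subproblem, which is exactly the "time compression" the commenters describe: memory spent once saves the same computation from being repeated many times.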

Original article

One of the most important classes goes by the humble name “P.” Roughly speaking, it encompasses all problems that can be solved in a reasonable amount of time. An analogous complexity class for space is dubbed “PSPACE.”

The relationship between these two classes is one of the central questions of complexity theory. Every problem in P is also in PSPACE, because fast algorithms just don’t have enough time to fill up much space in a computer’s memory. If the reverse statement were also true, the two classes would be equivalent: Space and time would have comparable computational power. But complexity theorists suspect that PSPACE is a much larger class, containing many problems that aren’t in P. In other words, they believe that space is a far more powerful computational resource than time. This belief stems from the fact that algorithms can use the same small chunk of memory over and over, while time isn’t as forgiving — once it passes, you can’t get it back.
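In standard notation (mine, not the article's), the known containment and the widely believed conjecture are:

$$\mathsf{P} \subseteq \mathsf{PSPACE}, \qquad \text{conjectured:}\quad \mathsf{P} \subsetneq \mathsf{PSPACE}.$$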

“The intuition is just so simple,” Williams said. “You can reuse space, but you can’t reuse time.”

But intuition isn’t good enough for complexity theorists: They want rigorous proof. To prove that PSPACE is larger than P, researchers would have to show that for some problems in PSPACE, fast algorithms are categorically impossible. Where would they even start?

A Space Odyssey

As it happened, they started at Cornell University, where Hartmanis moved in 1965 to head the newly established computer science department. Under his leadership it quickly became a center of research in complexity theory, and in the early 1970s, a pair of researchers there, John Hopcroft and Wolfgang Paul, set out to establish a precise link between time and space.

Hopcroft and Paul knew that to resolve the P versus PSPACE problem, they’d have to prove that you can’t do certain computations in a limited amount of time. But it’s hard to prove a negative. Instead, they decided to flip the problem on its head and explore what you can do with limited space. They hoped to prove that algorithms given a certain space budget can solve all the same problems as algorithms with a slightly larger time budget. That would indicate that space is at least slightly more powerful than time — a small but necessary step toward showing that PSPACE is larger than P.

With that goal in mind, they turned to a method that complexity theorists call simulation, which involves transforming existing algorithms into new ones that solve the same problems, but with different amounts of space and time. To understand the basic idea, imagine you’re given a fast algorithm for alphabetizing your bookshelf, but it requires you to lay out your books in dozens of small piles. You might prefer an approach that takes up less space in your apartment, even if it takes longer. A simulation is a mathematical procedure you could use to get a more suitable algorithm: Feed it the original, and it’ll give you a new algorithm that saves space at the expense of time.
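As a toy illustration of the same trade-off in code (my example, not the article's), here are two ways to detect a duplicate in a list: one uses extra memory to finish in roughly linear time, the other uses essentially no extra memory but takes quadratic time.

```python
def has_duplicate_fast(items: list) -> bool:
    """O(n) time, O(n) extra space: remember every item seen so far.
    Assumes the items are hashable."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

def has_duplicate_small(items: list) -> bool:
    """O(n^2) time, O(1) extra space: re-scan instead of remembering."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Loosely speaking, the second function is a "space-saving simulation" of the first for this one problem; what Hopcroft and Paul wanted was a procedure that performs this kind of conversion for every algorithm at once.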

Algorithm designers have long studied these space-time trade-offs for specific tasks like sorting. But to establish a general relationship between time and space, Hopcroft and Paul needed something more comprehensive: a space-saving simulation procedure that works for every algorithm, no matter what problem it solves. They expected this generality to come at a cost. A universal simulation can’t exploit the details of any specific problem, so it probably won’t save as much space as a specialized simulation. But when Hopcroft and Paul started their work, there were no known universal methods for saving space at all. Even saving a small amount would be progress.

The breakthrough came in 1975, after Hopcroft and Paul teamed up with a young researcher named Leslie Valiant. The trio devised a universal simulation procedure that always saves a bit of space. No matter what algorithm you give it, you’ll get an equivalent one whose space budget is slightly smaller than the original algorithm’s time budget.

“Anything you can do in so much time, you can also do in slightly less space,” Valiant said. It was the first major step in the quest to connect space and time.
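Stated formally (the excerpt itself gives no bound), the Hopcroft-Paul-Valiant result for multitape Turing machines is

$$\mathrm{DTIME}\big(t(n)\big) \subseteq \mathrm{DSPACE}\!\left(\frac{t(n)}{\log t(n)}\right),$$

so the simulation's space budget undercuts the original time budget by a logarithmic factor — "slightly less space," in Valiant's phrase.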

In 1975, Leslie Valiant helped prove that space is a slightly more powerful computational resource than time.

But then progress stalled, and complexity theorists began to suspect that they’d hit a fundamental barrier. The problem was precisely the universal character of Hopcroft, Paul and Valiant’s simulation. While many problems can be solved with much less space than time, some intuitively seemed like they’d need nearly as much space as time. If so, more space-efficient universal simulations would be impossible. Paul and two other researchers soon proved that they are indeed impossible, provided you make one seemingly obvious assumption: Different chunks of data can’t occupy the same space in memory at the same time.

“Everybody took it for granted that you cannot do better,” Wigderson said.

Paul’s result suggested that resolving the P versus PSPACE problem would require abandoning simulation altogether in favor of a different approach, but nobody had any good ideas. That was where the problem stood for 50 years — until Williams finally broke the logjam.

First, he had to get through college.

Complexity Classes

In 1996, the time came for Williams to apply to colleges. He knew that pursuing complexity theory would take him far from home, but his parents made it clear that the West Coast and Canada were out of the question. Among his remaining options, Cornell stood out to him for its prominent place in the history of his favorite discipline.

“This guy Hartmanis defined the time and space complexity classes,” he recalled thinking. “That was important for me.”

Williams was admitted to Cornell with generous financial aid and arrived in the fall of 1997, planning to do whatever it took to become a complexity theorist himself. His single-mindedness stuck out to his fellow students.

“He was just super-duper into complexity theory,” said Scott Aaronson, a computer scientist at the University of Texas, Austin, who overlapped with Williams at Cornell.
