SETI@home Flags 100 Signals After Sorting Through 12 Billion Others

Original link: https://news.berkeley.edu/2026/01/12/for-21-years-enthusiasts-used-their-home-computers-to-search-for-et-uc-berkeley-scientists-are-homing-in-on-100-signals-they-found/

## SETI@home: 21 Years of Searching and Lessons Learned

For two decades (1999-2020), the SETI@home project harnessed the computing power of millions of volunteers' computers to analyze radio data from the Arecibo Observatory in search of signs of extraterrestrial intelligence. The project produced 12 billion detections, which were ultimately narrowed to 100 candidate signals now being re-examined with China's FAST telescope.

Although no conclusive evidence of alien life was found, the project was not a failure. The researchers inserted fake signals to test their system, revealing the limitations of current SETI search methods, in particular the risk of filtering real signals out along with radio interference. They found that their original approach was not ideal and identified improvements for future sky surveys.

SETI@home demonstrated the power of distributed computing, and its search achieved a sensitivity beyond expectations. While evidence of extraterrestrial life remains elusive, the project's legacy lies in its scientific contributions and in the potential for similar crowdsourced efforts to use today's faster computers and internet speeds to analyze the vast data volumes produced by modern telescopes. The team believes that reanalyzing existing data with new insights could still bear fruit.

## SETI@home Reports Potential Signals, Sparking AI Discussion

After analyzing more than 12 billion signals, the SETI@home search for extraterrestrial intelligence has flagged 100 signals of interest. The researchers acknowledge, however, that manually investigating every potential detection is a huge challenge, raising concerns that valuable data could be discarded.

A key question raised in a Hacker News discussion is whether artificial intelligence could be used effectively to automate this process: sifting through decades of signals and identifying the candidates most worth human review.

Some of the flagged signals are now being re-examined with China's FAST radio telescope. Alongside the excitement, the discussion also humorously acknowledged the possible risks of actively broadcasting our existence, referencing the hostile aliens of the science fiction series The Three-Body Problem. The core question remains: how to analyze enormous datasets efficiently without missing a genuine extraterrestrial message.

## Original Article

For 21 years, between 1999 and 2020, millions of people worldwide loaned UC Berkeley scientists their computers to search for signs of advanced civilizations in our galaxy.

The project — called SETI@home, after the Search for Extraterrestrial Intelligence (SETI) — generated a loyal following eager to participate in one of the most popular crowd-sourced projects in the early days of the internet. They downloaded the SETI@home software to their home computers and allowed it to analyze data recorded at the now-defunct Arecibo Observatory in Puerto Rico to find unusual radio signals from space. All told, these computations produced 12 billion detections — “momentary blips of energy at a particular frequency coming from a particular point in the sky,” according to computer scientist and project co-founder David Anderson.

After 10 years of work, the SETI@home team has now finished analyzing those detections, winnowing them down to about a million “candidate” signals and then to 100 that are worth a second look. They have been pointing China’s Five-hundred-meter Aperture Spherical Telescope, a radio telescope referred to as FAST, at these targets since July, hoping to see the signals again.

Though the FAST data are not yet analyzed, Anderson admits he doesn’t expect to find a signal from ET. But the results of the SETI@home project — presented in two papers published last year in The Astronomical Journal — provide lessons for future searches and point to potential flaws in ongoing searches.

“If we don’t find ET, what we can say is that we established a new sensitivity level. If there were a signal above a certain power, we would have found it,” he said. “Some of our conclusions are that the project didn’t completely work the way we thought it was going to. And we have a long list of things that we would have done differently and that future sky survey projects should do differently.”

According to astronomer and SETI@home project director Eric Korpela, searches like SETI@home will inevitably turn up billions of possible signals. The challenge for researchers is to develop algorithms to cull the spurious signals caused by noise or radio interference without eliminating actual beacons from a distant civilization. Radio frequency interference, or RFI, comes not only from satellites orbiting Earth and scattered throughout the solar system, but from radio and TV broadcasts and even microwave ovens.
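
One standard discriminator, sketched below, exploits the fact that a genuine sky source should appear only when the dish points at one spot, while terrestrial interference recurs at the same frequency across many unrelated pointings. This is a minimal illustration in Python; the field names, tolerances, and threshold are assumptions for the sketch, not SETI@home's actual filter.

```python
from collections import defaultdict

def drop_likely_rfi(detections, freq_tol_hz=1.0, max_positions=3):
    """Drop detections whose frequency recurs at many sky positions.

    A celestial transmitter is fixed on the sky, so its detections
    cluster at one pointing; terrestrial RFI shows up at the same
    frequency no matter where the dish points. Each detection is a
    dict with 'freq_hz', 'ra', and 'dec' keys (illustrative schema).
    """
    positions_per_freq = defaultdict(set)
    for d in detections:
        freq_bin = round(d["freq_hz"] / freq_tol_hz)
        positions_per_freq[freq_bin].add((round(d["ra"], 1), round(d["dec"], 1)))

    # Keep only detections whose frequency was seen at few sky positions.
    return [d for d in detections
            if len(positions_per_freq[round(d["freq_hz"] / freq_tol_hz)]) <= max_positions]
```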

“There’s no way that you can do a full investigation of every possible signal that you detect, because doing that still requires a person and eyeballs,” he said. “We have to do a better job of measuring what we’re excluding. Are we throwing out the baby with the bath water? I don’t think we know for most SETI searches, and that is really a lesson for SETI searches everywhere.”

*An early photo of some of the SETI@home team, with David Anderson seated in the front. Standing, left to right, are Jeff Cobb, Matt Lebofsky, Eric Korpela and Dan Werthimer. (Photo: SETI@home)*

Anderson and Korpela addressed that issue by inserting some 3,000 fake signals — called birdies — into their data pipeline before combing through it to eliminate the RFI and noise. They blinded themselves to the nature of these fake signals, and calculated their sensitivity based on the signal power of the birdies they were able to detect.
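
In effect, the birdies turn sensitivity into something measurable: bin the injected signals by power and ask what fraction of each bin the pipeline recovered. A minimal sketch of that bookkeeping, assuming hypothetical `injected` (birdie id to signal power) and `recovered_ids` inputs; the names and binning are illustrative, not the project's calibration code.

```python
import numpy as np

def recovery_by_power(injected, recovered_ids, n_bins=10):
    """Print the fraction of injected test signals recovered per power bin.

    The power level at which the recovery fraction approaches 100%
    is the pipeline's effective sensitivity limit. Powers are assumed
    positive; this is an illustration, not SETI@home's actual analysis.
    """
    powers = np.array(list(injected.values()))
    found = np.array([bid in recovered_ids for bid in injected])
    # Log-spaced bin edges; nudge the top edge so the maximum power is included.
    edges = np.logspace(np.log10(powers.min()),
                        np.log10(powers.max()) + 1e-9, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (powers >= lo) & (powers < hi)
        if mask.any():
            print(f"power {lo:.3g}-{hi:.3g}: recovered {found[mask].mean():.0%}")
```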

Korpela pointed out that nearly all searches today assume a civilization would put lots of power into a narrow frequency band to get the attention of other civilizations, then send information or data through an adjacent broadband frequency. To increase the chances of being detected, the beacon should be around a frequency at which astronomers would be observing the universe, Korpela said — most likely around the radio wavelength of 21 centimeters, which is used to map hydrogen gas in the galaxy.
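
For reference, the 21-centimeter line corresponds to a radio frequency near 1420 MHz:

$$
f = \frac{c}{\lambda} \approx \frac{2.998 \times 10^{8}\ \mathrm{m/s}}{0.211\ \mathrm{m}} \approx 1.42 \times 10^{9}\ \mathrm{Hz} = 1420\ \mathrm{MHz}
$$

so a beacon "around" this wavelength means a carrier in the neighborhood of 1.4 GHz, a band radio astronomers already monitor closely.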

“This powerful narrow-band beacon would be something that’s easy to detect. Then, once someone had detected that, they would dedicate more observing to try and find signals near it in frequency that might be lower power and wider band that contain information,” Korpela said. “If we saw an extraterrestrial narrowband signal somewhere, we would probably have every telescope, radio telescope and optical telescope available pointing at that point on the sky, searching in all frequencies for anything else. So far we haven’t had that. If we had, I think we would all know about it.”

Despite its failure to find ET, was SETI@home a success?

“I’d say it went way, way, way beyond our initial expectations,” Anderson said. “When we were designing SETI@home, we tried to decide whether it was worth doing, whether we’d get enough computing power to actually do new science. Our calculations were based on getting 50,000 volunteers. Pretty quickly, we had a million volunteers. It was kind of cool, and I would like to let that community and the world know that we actually did some science.”

Distributed computing

When Anderson first began working on SETI@home in the mid-1990s, he was teaching computer science at UC Berkeley and conducting research in distributed computing — breaking down large and complex problems into chunks that could be handled by smaller computers. This was a workaround for people who didn’t have access to a supercomputer. A UC Berkeley computer science graduate and former student of Anderson’s, David Gedye, suggested that the growing network of home computers could be tapped through distributed computing to analyze signals from radio telescopes in search of unusual patterns produced by an advanced civilization — what’s known today as a technosignature.
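
The idea is easiest to see as a splitter: cut one long recording into small, overlapping pieces, each sized for a home machine to process independently. A minimal sketch with made-up sizes (the real SETI@home work-unit format is not reproduced here):

```python
def make_work_units(samples, unit_len, overlap):
    """Split a long recording into overlapping work units.

    The overlap guarantees that a signal straddling a boundary is fully
    contained in at least one unit. Sizes here are arbitrary; the real
    SETI@home client received work units with its own fixed format.
    """
    step = unit_len - overlap
    return [samples[i:i + unit_len]
            for i in range(0, max(1, len(samples) - overlap), step)]

# Example: 1 million samples -> units of 10,000 samples with 1,000 overlap.
units = make_work_units(list(range(1_000_000)), unit_len=10_000, overlap=1_000)
```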

*David Anderson, co-founder of SETI@home, discusses the distributed computing project in 2003. (Photo: Robert Sanders/UC Berkeley)*

Anderson subsequently teamed up with Korpela and UC Berkeley electrical engineer and astronomer Dan Werthimer, and together they launched SETI@home in 1999. Within days, 200,000 people from more than 100 countries had downloaded the software. A year later, it had 2 million users.

The data came from the 305-meter Arecibo radio telescope. It was recorded passively as other astronomers pointed the radio dish — at the time, the world’s largest — at different regions of the sky for study. This so-called commensal observing turned out to be very effective. Over the course of the project, each area of the sky visible from Puerto Rico — a third of the entire sky — was observed 12 or more times, with some areas observed hundreds or even thousands of times.

“From Arecibo we covered most of the stars in the Milky Way, which is billions and billions,” Anderson said.

“We are, without doubt, the most sensitive narrow-band search of large portions of the sky, so we had the best chance of finding something,” Korpela added. “So yeah, there’s a little disappointment that we didn’t see anything.”

Most current SETI searches — including the 10-year-old Breakthrough Listen project — are targeted searches rather than all-sky scans. That is, they look for technosignatures from specific nearby stars or more distant stars that have been found to harbor planets. The radio telescopes used, such as the Green Bank Telescope in West Virginia and the MeerKAT array in South Africa, are still only capable of detecting an Arecibo-sized transmitter relatively nearby, in galactic terms.

“In order to probe farther distances, you need bigger telescopes and longer observing times,” Korpela said. “It’s always best if you are able to control the telescope for your project. We weren’t able to control what the telescope was doing.”

The final analysis

The software that Korpela developed for SETI@home took the radio data from Arecibo — frequency, intensity, position in the sky — and manipulated it mathematically in a process called a discrete Fourier transform, which breaks the frequencies into little bins. Since Earth is moving, as is any likely signal source, the software scanned the observations for frequency shifts, called Doppler drift.
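
For intuition, here is a toy version of such a drift search: for each trial drift rate, mix the time series with a chirp that cancels the assumed drift, take the transform, and look for a bin with unusually high power. Everything below (the schema, parameters, and selection rule) is illustrative, not the SETI@home client's actual algorithm.

```python
import numpy as np

def drift_search(x, fs, drift_rates):
    """Find the (drift rate, frequency) pair that best explains a tone.

    x is a complex baseband time series sampled at fs Hz. For each trial
    drift rate r (Hz/s), multiplying by exp(-i*pi*r*t^2) removes a linear
    chirp of that rate, so a genuinely drifting tone collapses into a
    single DFT bin. Returns the strongest (rate, frequency_hz, power).
    """
    t = np.arange(len(x)) / fs
    freqs = np.fft.fftfreq(len(x), d=1 / fs)
    best = (0.0, 0.0, -np.inf)
    for r in drift_rates:
        dedrifted = x * np.exp(-1j * np.pi * r * t**2)  # cancel the chirp
        power = np.abs(np.fft.fft(dedrifted)) ** 2
        k = int(np.argmax(power))
        if power[k] > best[2]:
            best = (r, float(freqs[k]), float(power[k]))
    return best
```

Each additional trial drift rate costs another full transform, which is why scanning tens of thousands of rates, as Anderson describes below, multiplied the computing requirement so dramatically.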

*A panoramic view of the Arecibo radio telescope in 2019. At the time the world’s largest radio telescope, it was 1,000 feet across and built into a natural sinkhole near Arecibo, Puerto Rico. Radio signals captured during its astronomical studies were analyzed by millions of volunteers who were part of the SETI@home project. The telescope collapsed in 2020 after its support cables failed. (Photo: Mario Roberto Durán Ortiz/Creative Commons)*

“We actually had to look at a whole range of possible drift rates — tens of thousands — just to make sure that we got all possibilities,” Anderson said. “That multiplies the amount of computing power we need by 10,000. The fact that we had a million home computers available to us let us do that. No other radio SETI project has been able to do that.”

The 12 billion interesting signals these home computers identified had to be vetted, however, and Anderson admits that in the early years of SETI@home, they had not thought much about how to do that.

“Until about 2016, we didn’t really know what we were going to do with these detections that we’d accumulated,” Anderson said. “We hadn’t figured out how to do the whole second part of the analysis.”

The winnowing required a computing cluster with a large amount of storage and memory, which was provided by the Max Planck Institute for Gravitational Physics in Hanover, Germany. The supercomputer allowed Anderson and Korpela to eliminate RFI and noise, reducing the billions of detections to a couple of million signal candidates — “sets of detections that come from more or less the same place in the sky and at more or less the same frequency but possibly a lot of them spread out over time,” Anderson said.
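
That grouping can be pictured as bucketing detections into coarse cells in sky position and frequency; whatever lands in the same cell, whenever it was recorded, becomes one candidate. A simplified sketch with made-up cell sizes (a real analysis would tie them to the telescope's beam width and frequency resolution):

```python
from collections import defaultdict

def group_into_candidates(detections, cell_deg=0.1, cell_hz=50.0):
    """Group detections sharing a sky cell and a frequency cell.

    Each detection is a dict with 'ra', 'dec' (degrees) and 'freq_hz'
    keys (illustrative schema). Detections in the same cell form one
    candidate regardless of how far apart in time they occurred.
    """
    cells = defaultdict(list)
    for d in detections:
        key = (round(d["ra"] / cell_deg), round(d["dec"] / cell_deg),
               round(d["freq_hz"] / cell_hz))
        cells[key].append(d)
    # More independent detections in a cell -> stronger candidate.
    return sorted(cells.values(), key=len, reverse=True)
```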

Once they had ranked these by likelihood of being real, the top thousand had to be reviewed manually. Korpela and Werthimer worked to review the candidates and narrow the field to about 100. These are being targeted by FAST, each spot on the sky recorded for about 15 minutes. FAST has about eight times the collecting area of Arecibo.

The final analysis of these signals is yet to come, Anderson said, but “these two papers are the important conclusions of SETI@home.”

*The SETI@home logo: “SETI at home” spelled out, with a human figure standing on the @ sign. (Image: SETI@home)*

Is a similar crowdsourced SETI project feasible today?

Korpela thinks the answer is yes. The FAST telescope is already conducting a commensal survey. That data could be chunked and distributed to citizen scientists for analysis. Home computers could process this data on a platform for volunteer computing called BOINC, which Anderson created and continues to develop. BOINC, funded by the National Science Foundation, is currently used by several crowd-sourced computing projects, including Rosetta@home, which calculates how proteins fold in 3D; Einstein@home, which analyzes data in search of pulsars; and LHC@home, which simulates particle collisions at CERN’s Large Hadron Collider. Faster computers and faster internet speeds could allow analysis of much larger chunks of data than could SETI@home, which started during the era of slow, dial-up modems that made it laborious to download large amounts of data.

“I think it still captures people’s imagination to look for extraterrestrial intelligence,” Korpela said. “I think that you could still get significantly more processing power than we used for SETI@home and process more data because of a wider internet bandwidth. The biggest issue with such a project is that it requires personnel, and personnel means salaries. It’s not the cheapest way to do SETI.”

SETI@home was once operated by six people, but Korpela is the only paid staff member now, and he is semi-retired. But he sees crowd-sourced computing as an opportunity to better analyze SETI radio data using lessons from SETI@home. That could include a second look at all the SETI@home data.

“In a world where I had the money, I would reanalyze it the right way, meaning I’d fix the mistakes that we made. And we did make some mistakes. These were conscious choices because of how fast computers were in 1999,” Korpela said. “There’s still the potential that ET is in that data and we missed it just by a hair.”

