Researchers develop a camera that can focus on different distances at once

Original link: https://engineering.cmu.edu/news-events/news/2025/12/19-perfect-shot.html

Researchers at Carnegie Mellon University have developed a breakthrough "computational lens" that can bring an entire scene into sharp focus at once, regardless of depth. Unlike conventional lenses, which are limited to a single focal plane, the system dynamically adjusts the image's focus to match the shape of the real world. The innovation combines a modified Lohmann lens, a spatial light modulator, and autofocus algorithms that use two methods: contrast detection and phase detection. This enables fast, accurate focusing, at up to 21 frames per second, without the drawbacks of a narrow aperture such as reduced brightness or diffraction. Presented at the 2025 International Conference on Computer Vision, the technology's potential extends well beyond photography: applications include improved microscopy for detailed biological imaging, sharper perception for autonomous vehicles, and more lifelike depth perception in AR/VR systems. It represents a novel category of optical design that could fundamentally change how cameras capture and perceive the world.


Original article

Imagine snapping a photo where every detail, near and far, is perfectly sharp—from the flower petal right in front of you to the distant trees on the horizon. For over a century, camera designers have dreamed of achieving that level of clarity. In a breakthrough that could transform photography, microscopy, and even smartphone cameras, researchers at Carnegie Mellon University have developed a new kind of lens that can bring an entire scene into sharp focus at once—no matter how far away or close different parts of the scene are.

The team, consisting of Yingsi Qin, an electrical and computer engineering Ph.D. student, Aswin Sankaranarayanan, professor of electrical and computer engineering, and Matthew O’Toole, associate professor of computer science and robotics, presented their findings at the 2025 International Conference on Computer Vision and received a Best Paper Honorable Mention.


Left: A conventional photo with a regular lens, where objects at a single focal plane appear sharp. Right: An all-in-focus photo captured through spatially-varying autofocusing. To achieve this, we combine (i) a programmable lens with spatially-varying control over focus, and (ii) a spatially-varying autofocus algorithm to drive the focus of this lens. Note that this is an optically-captured image of a real scene with no post-capture processing used.

Traditional camera lenses can only bring one flat layer of a scene into perfect focus at a time. Anything in front of or behind that layer turns soft and blurry. Narrowing the aperture can help, but it also dims the image and introduces new kinds of optical fuzziness caused by diffraction.
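The aperture trade-off above is easy to quantify with two textbook formulas: the hyperfocal distance (how much of the scene is acceptably sharp) and the Airy diffraction spot (the blur floor that stopping down imposes). The following sketch uses illustrative values (a 50 mm lens, a 0.03 mm circle of confusion) that are our own assumptions, not figures from the article.

```python
# Sketch of the classic aperture trade-off: a larger f-number N deepens the
# depth of field but enlarges the diffraction blur spot.

def hyperfocal_mm(f_mm, N, coc_mm):
    """Hyperfocal distance: focusing here keeps everything from half this
    distance out to infinity acceptably sharp."""
    return f_mm * f_mm / (N * coc_mm) + f_mm

def airy_spot_um(N, wavelength_um=0.55):
    """Approximate diameter of the diffraction (Airy) blur spot."""
    return 2.44 * wavelength_um * N

for N in (2.8, 8, 22):
    H = hyperfocal_mm(50, N, 0.03)
    print(f"f/{N}: hyperfocal {H / 1000:.1f} m, "
          f"diffraction spot {airy_spot_um(N):.1f} um")
```

Stopping down from f/2.8 to f/22 shrinks the hyperfocal distance from roughly 30 m to under 4 m (more of the scene is sharp), but the diffraction spot grows from about 3.8 µm to nearly 30 µm, larger than a typical sensor pixel, which is exactly the fuzziness the article mentions.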

“We’re asking the question, ‘What if a lens didn’t have to focus on just one plane at all?’” says Qin. “What if it could bend its focus to match the shape of the world in front of it?”

The researchers developed a “computational lens”—a hybrid of optics and algorithm—that can adjust its focus differently for every part of a scene. The system builds on a design known as a Lohmann lens, which uses two curved, cubic lenses that shift against each other to tune focus. By combining this setup with a phase-only spatial light modulator—a device that controls how light bends at each pixel—the researchers were able to make different parts of the image focus at different depths simultaneously.
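The Lohmann (Alvarez-style) lens principle can be checked with a few lines of algebra: two complementary cubic phase plates, counter-shifted laterally by a distance d, sum to a quadratic phase, i.e. a thin lens whose focal power grows linearly with the shift. The constants below are illustrative, not the paper's, and the sign convention depends on the shift direction.

```python
import numpy as np

A = 50.0        # cubic plate strength (rad / mm^3), illustrative
lam = 0.55e-3   # wavelength in mm (green light)

def cubic_phase(x, y, sign=+1):
    """One cubic plate: phi = sign * A * (x^3/3 + x*y^2)."""
    return sign * A * (x**3 / 3 + x * y**2)

def combined_phase(x, y, d):
    """Both plates, counter-shifted by d along x. The algebra reduces this
    to 2*A*d*(x^2 + y^2) plus a constant, i.e. a thin-lens phase."""
    return cubic_phase(x + d, y, +1) + cubic_phase(x - d, y, -1)

def focal_length_mm(d):
    """Thin-lens phase magnitude is (pi/lam)*r^2/f, so |f| = pi/(2*A*d*lam)."""
    return np.pi / (2 * A * d * lam)

# Verify the algebra numerically on a small grid.
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
d = 0.1
quad = combined_phase(x, y, d) - combined_phase(0, 0, d)
print(np.allclose(quad, 2 * A * d * (x**2 + y**2)))  # True
print(f"shift {d} mm -> |focal length| {focal_length_mm(d):.0f} mm")
```

Replacing one of the plates with a phase-only spatial light modulator, as the article describes, effectively lets the "shift", and hence the focal power, differ from one image region to the next.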

The system uses two autofocus methods. The first is Contrast-Detection Autofocus (CDAF), which divides the image into regions called superpixels. Each region independently finds the focus setting that maximizes its sharpness. The second is Phase-Detection Autofocus (PDAF), which uses a dual-pixel sensor to detect not just whether something is in focus, but which direction to adjust. This makes it faster and better suited for moving scenes—the team achieved 21 frames per second with their modified sensor.
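The per-superpixel contrast-detection step can be sketched as follows. This is a toy version assuming a precomputed focal stack (one image per focus setting); the tile size, the gradient-energy sharpness metric, and all names are our own choices, and the paper's superpixels and metric may differ.

```python
import numpy as np

def sharpness(tile):
    """Gradient energy: high when the tile is in focus."""
    gy, gx = np.gradient(tile.astype(float))
    return np.sum(gx**2 + gy**2)

def cdaf_focus_map(stack, tile=8):
    """stack: (n_focus, H, W). For each tile, pick the focus index that
    maximizes local sharpness, CDAF-style."""
    n, H, W = stack.shape
    fmap = np.zeros((H // tile, W // tile), dtype=int)
    for i in range(H // tile):
        for j in range(W // tile):
            region = stack[:, i*tile:(i+1)*tile, j*tile:(j+1)*tile]
            fmap[i, j] = max(range(n), key=lambda k: sharpness(region[k]))
    return fmap

# Synthetic focal stack: the left half of the scene is sharp (textured) at
# focus setting 0, the right half at setting 2; defocused tiles are flat.
rng = np.random.default_rng(0)
stack = np.zeros((3, 16, 16))
stack[0, :, :8] = rng.random((16, 8))   # near plane sharp at setting 0
stack[2, :, 8:] = rng.random((16, 8))   # far plane sharp at setting 2
fmap = cdaf_focus_map(stack, tile=8)
print(fmap)  # left tiles pick focus 0, right tiles pick focus 2
```

PDAF differs in that a dual-pixel sensor gives a signed defocus cue per region, so each superpixel knows which way to move without sweeping the whole stack, which is what makes the 21 fps operation feasible for moving scenes.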

“Together, they let the camera decide which parts of the image should be sharp—essentially giving each pixel its own tiny, adjustable lens,” explains O’Toole.

Beyond its obvious appeal to photographers, the technology could have sweeping applications. Microscopes could capture every layer of a biological sample in focus at once. Autonomous vehicles might see their surroundings with unprecedented clarity. Even augmented and virtual reality systems could benefit, using similar optics to create more lifelike depth perception.

“Our system represents a novel category of optical design,” says Sankaranarayanan. “One that could fundamentally change how cameras see the world.”
