Project SkyWatch (a.k.a. Wescam at Home)

Original link: https://ianservin.com/2026/01/13/project-skywatch-aka-wescam-at-home/

This project aims to replicate the stability of professional-grade, expensive EO/IR gimbals (used for aircraft tracking) with a consumer PTZ camera. The challenge is that the camera's motors are sluggish and heavily damped, unsuited to tracking fast-moving aircraft; the solution shifts control from mechanics to software. A custom control loop fuses visual tracking (OpenCV and CSRT locking onto the aircraft's features), prediction (a Kalman filter estimating future position while accounting for latency), and control (a PID loop combined with feed-forward based on predicted velocity). This lets the camera lead the target rather than react to its motion. A "virtual gimbal" additionally stabilizes the image digitally, compensating for mechanical shortcomings, and the system integrates ADS-B telemetry to correlate visual tracks with aircraft identities. The project is public on GitHub and explores "sousveillance" (adapting ISR technology for civilian use): improving awareness of the airspace above, and of the tools used to watch it.


Original article

Professional aviation surveillance relies on a specific piece of hardware: the EO/IR (Electro-Optical/Infra-Red) gimbal. These are the gyro-stabilized turrets you see on the nose of police helicopters or military drones, capable of keeping a rock-solid lock on a target regardless of how the aircraft maneuvers.

I wanted to replicate this capability to help track aircraft from the ground—building a tool that allows a consumer camera to lock onto and follow a target with similar stability, but without the defense-contractor budget.

The Hardware Constraint

The core of this build is a generic PTZ (Pan-Tilt-Zoom) camera, the kind typically used for streaming church services or campus lectures.

[Screenshot: Amazon product page for the AVKANS AI Auto Tracking NDI camera, priced at $389.00. The listing shows a matte black PTZ camera with an "NDI HX3" logo, advertises 20X zoom, HDMI/SDI/USB3.0 connectivity, and NDI HX2 & HX3 compatibility, and markets the unit for church worship, events, and social media livestreaming.]
The camera in question: an AVKANS LV20N, a knockoff of a 20x zoom PTZOptics unit

While cost-effective, these cameras present a major engineering challenge for tracking any object, let alone aircraft. Their motors are designed for slow, dampened pans across a stage, not for tracking a jet moving at 300 knots. The mechanical and electronic latency is significant; if you simply tell the camera to “follow that plane,” by the time the motors react, the target has often moved out of the frame.

To make this hardware viable, the heavy lifting has to move from mechanics to mathematics.

The Software Stack

I built a custom control loop to bridge the gap between the camera’s sluggish motors and the dynamic speed of the targets. The stack fuses three main concepts to help the system maintain a visual lock:

Visual Processing (OpenCV & CSRT)
Once the camera is pointed at a target, the backend initializes a CSRT (Discriminative Correlation Filter with Channel and Spatial Reliability) tracker. Unlike simple contrast-based detection which can be easily fooled by clouds or changing lighting, CSRT tracks the specific visual features and texture of the aircraft. It calculates the error—how far off-center the target is in pixels—frame by frame to drive the control loop.
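A minimal sketch of this per-frame error computation: the `cv2.TrackerCSRT_create` / `init` / `update` calls are standard OpenCV (from the `opencv-contrib` build; newer versions also expose it as `cv2.legacy.TrackerCSRT_create`), while the generator structure and function names are my own illustration, not the project's code.

```python
def pixel_error(bbox, frame_w, frame_h):
    """Signed (dx, dy) offset of the tracked box's center from the frame
    center, in pixels -- the error the control loop drives toward zero."""
    x, y, w, h = bbox
    return (x + w / 2.0 - frame_w / 2.0,
            y + h / 2.0 - frame_h / 2.0)


def track(frames, initial_bbox):
    """Run a CSRT tracker over an iterable of frames, yielding the pixel
    error each frame (None when the lock is lost)."""
    import cv2  # imported here so pixel_error stays dependency-free
    frames = iter(frames)
    first = next(frames)
    tracker = cv2.TrackerCSRT_create()
    tracker.init(first, initial_bbox)
    h, w = first.shape[:2]
    for frame in frames:
        ok, bbox = tracker.update(frame)
        yield pixel_error(bbox, w, h) if ok else None
```

For example, a 40×40 box at (300, 200) in a 640×480 frame is centered horizontally but 20 px above center, giving an error of `(0.0, -20.0)`.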

Prediction (Kalman Filter)
Raw visual data is noisy and processing introduces latency. To combat this, I implemented a Kalman Filter that models the aircraft’s state (position and velocity).

  • Smoothing: It filters out sensor noise and jitter from the tracker.
  • Prediction: Crucially, it predicts where the aircraft will be roughly 200ms in the future (accounting for system latency).
  • Feed-Forward Control: Instead of just reacting to the error (Feedback), the system feeds the predicted velocity directly to the motors (Feed-Forward). This allows the camera to “lead” the target, eliminating the drag/lag typical of reactive PID loops.
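A per-axis constant-velocity Kalman filter of the kind described above can be sketched in a few lines. This is a generic textbook formulation, not the project's filter; the noise values and the 200 ms horizon are illustrative placeholders.

```python
class AxisKalman:
    """1-D constant-velocity Kalman filter (one per pan/tilt axis).
    Smooths noisy position measurements and extrapolates ahead to
    cover system latency. Noise values are illustrative, not tuned."""

    def __init__(self, q=1.0, r=25.0):
        self.x, self.v = 0.0, 0.0          # state: position, velocity
        self.P = [[1e3, 0.0], [0.0, 1e3]]  # state covariance
        self.q, self.r = q, r              # process / measurement noise

    def step(self, z, dt):
        # Predict: advance the state under the constant-velocity model.
        self.x += self.v * dt
        P = self.P
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        # Update: blend in the measurement z via the Kalman gain.
        s = p00 + self.r
        k0, k1 = p00 / s, p10 / s
        y = z - self.x
        self.x += k0 * y
        self.v += k1 * y
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return self.x, self.v

    def predict_ahead(self, horizon=0.2):
        """Where the target will be ~200 ms from now."""
        return self.x + self.v * horizon
```

Fed a target moving at a constant rate, the velocity estimate converges to the true rate, and `predict_ahead` supplies both the lead point and the velocity term for feed-forward.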

Control (PID + Feed-Forward Loop)
The control logic combines a standard PID controller with the Kalman velocity vector.

  • Proportional: Corrects the immediate position error.
  • Integral: Corrects steady-state error (e.g., wind resistance or motor deadbands).
  • Derivative: Dampens the movement to prevent overshoot.
  • Dynamic Speed Limiting: I also implemented a dynamic velocity clamp that scales motor aggressiveness based on the distance to the target. This allows high-speed slews when acquiring a target, but precision micro-stepping when the target is centered.
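The combination above can be sketched as a single per-axis controller. Gains, the clamp curve, and the unit conversions are illustrative assumptions, not the project's tuned values.

```python
class PanAxisController:
    """PID feedback + Kalman feed-forward speed command for one PTZ axis.
    Gains and the dynamic clamp are placeholder values for illustration."""

    def __init__(self, kp=0.02, ki=0.001, kd=0.005, kff=1.0):
        self.kp, self.ki, self.kd, self.kff = kp, ki, kd, kff
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error_px, predicted_vel, dt, max_speed=1.0):
        # Feedback: classic PID on the pixel error.
        self.integral += error_px * dt                     # integral term
        derivative = (error_px - self.prev_error) / dt     # derivative term
        self.prev_error = error_px
        fb = (self.kp * error_px
              + self.ki * self.integral
              + self.kd * derivative)
        # Feed-forward: lead the target with the Kalman-predicted velocity.
        cmd = fb + self.kff * predicted_vel
        # Dynamic speed limit: fast slew when far off-center,
        # tight micro-stepping when nearly centered.
        limit = max_speed * min(1.0, abs(error_px) / 100.0 + 0.05)
        return max(-limit, min(limit, cmd))
```

A target 200 px off-center saturates the clamp and commands a full-speed slew; a centered target is held to 5% of maximum speed, so residual noise cannot jerk the mount.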

The “Virtual” Gimbal

Even with a tuned PID loop, the plastic gears in a consumer PTZ camera have physical limitations. There is always some mechanical play that causes jitter at high zoom levels, and the onboard control electronics (including their dampening/smoothing algorithms) introduce latency that prevents perfect mechanical stabilization.

To solve this, I implemented a digital stabilization layer—essentially a “virtual gimbal.” The software crops into the sensor slightly and shifts the image frame-by-frame to counteract the imperfect mechanical stabilization. The result is an incredibly stable image that mimics the expensive mechanical stabilization of professional EO/IR turrets.
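The crop-and-shift idea reduces to a few lines of array slicing. This sketch assumes the residual jitter has already been estimated elsewhere (e.g. from the tracker); the function name, margin, and sign convention are my own assumptions.

```python
import numpy as np

def stabilize(frame, jitter_dx, jitter_dy, margin=32):
    """'Virtual gimbal': sacrifice `margin` pixels on each edge, then
    slide the crop window by the measured residual jitter so the target
    stays fixed in the output. The jitter estimate (in pixels, sign
    convention is a choice) is assumed to come from the tracker."""
    h, w = frame.shape[:2]
    # Clamp the shift so the crop window stays on the sensor.
    dx = int(max(-margin, min(margin, round(jitter_dx))))
    dy = int(max(-margin, min(margin, round(jitter_dy))))
    top, left = margin + dy, margin + dx
    return frame[top:h - 2 * margin + top,
                 left:w - 2 * margin + left]
```

The output is always `2 * margin` pixels smaller in each dimension, which is why the method only costs a slight crop into the sensor.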

Data Fusion

An optical lock is useful, but context is better. Since the system knows the camera’s precise azimuth and elevation, I can correlate the visual data with live ADS-B telemetry. When tracking a target, the system queries local ADS-B traffic to find an aircraft at those specific coordinates. This data comes from a local ADS-B receiver with a script monitoring tar1090.
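The correlation step amounts to converting each ADS-B contact's position into an azimuth/elevation pair and picking the one nearest the camera's pointing angle. The sketch below uses a flat-earth approximation (reasonable at the short ranges a PTZ camera can resolve); the contact tuple format is an assumption, not the tar1090 schema.

```python
import math

def az_el_to(obs_lat, obs_lon, obs_alt_m, lat, lon, alt_m):
    """Approximate azimuth/elevation (degrees) from the observer to an
    aircraft, via a local flat-earth approximation."""
    # Meters per degree of latitude/longitude near the observer.
    dn = (lat - obs_lat) * 111_320.0
    de = (lon - obs_lon) * 111_320.0 * math.cos(math.radians(obs_lat))
    az = math.degrees(math.atan2(de, dn)) % 360.0
    ground = math.hypot(dn, de)
    el = math.degrees(math.atan2(alt_m - obs_alt_m, ground))
    return az, el

def best_match(cam_az, cam_el, contacts, tol_deg=3.0):
    """Pick the contact (icao, az, el) closest to the camera's pointing
    direction, or None if nothing is within tolerance."""
    def angdiff(a, b):  # shortest signed difference on a 360° circle
        return abs((a - b + 180.0) % 360.0 - 180.0)
    best, best_err = None, tol_deg
    for icao, az, el in contacts:
        err = math.hypot(angdiff(az, cam_az), el - cam_el)
        if err < best_err:
            best, best_err = icao, err
    return best
```

With the match in hand, the hex code can be looked up in the local receiver's traffic feed to put callsign and altitude on the OSD.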

Why Build This?

This project is an experiment in sousveillance—monitoring the monitors. It involves taking the technologies used for ISR (Intelligence, Surveillance, and Reconnaissance) and adapting them for civilian use. By understanding how these tracking systems work, we gain a better understanding of the airspace above us and the tools often used to watch it.

This project is available on GitHub under an MIT license.
