I've covered robots for years. This one is different.
Eka’s robotic claw feels like we're approaching a ChatGPT moment

原始链接: https://www.wired.com/story/when-robots-have-their-chatgpt-moment-remember-these-pincers/

Eka Robotics is developing robots with a surprising degree of "physical intelligence," a key step toward automating complex food handling and fine-manipulation tasks. Unlike traditional robots, which require extensive pre-programming for every scenario, Eka's robots learn through a combination of real-world interaction and simulation, recovering from mistakes instinctively, much as humans do. The author compares Eka's current stage to OpenAI's early GPT-1, calling it a nascent but promising form of intelligence. The robots demonstrate tactile perception, seeming to "feel" weight and inertia, which suggests a deeper understanding of their physical environment. While other approaches to robotics exist, Eka's focus on tactile intelligence appears essential for achieving human-level dexterity and could extend to tasks like iPhone assembly. Despite automation's potential, the author acknowledges the value of human interaction, even in roles robots may eventually master, prompting reflection on the future of work and the human element of everyday experience.

A recent WIRED article about a new robot is sparking debate on Hacker News over the feasibility of advanced humanoid robots. Some commenters are excited by recent progress, with one arguing that truly humanlike robots are close at hand, while others remain skeptical. A recurring theme is past disappointment with technology hyped in WIRED (such as BetterPlace). Roboticist Rodney Brooks, an investor in the featured company, believes this robot comes unusually close to solving the hard problem of human dexterity, which requires sophisticated sensing and modeling that most current humanoids lack. The discussion also highlights the surprising difficulty of seemingly simple actions, such as standing up from the floor, which humans perform intuitively but robots struggle with for lack of tactile and spatial awareness. Many commenters want to see concrete examples of real progress and are wary of overly optimistic claims, especially from figures like Elon Musk.

Original Article

Food handling is an area of work that still relies heavily on humans. Fruit, vegetables, meat, and other foods need to be handled quickly but gently. It is also hard to automate because no two pieces of fruit, vegetables, or chicken nuggets look exactly the same.

Eka’s demos suggest that the company may be onto something big. I found myself mentally comparing their robots to GPT-1, OpenAI’s first large language model, developed four years before ChatGPT. GPT-1 was often incoherent but showed glimmers of general linguistic intelligence.

The robots I saw seem to have a similar kind of nascent physical intelligence. When I watched a video of one reaching for a set of keys in slow motion, I noticed it did something that seemed remarkably human: It touched the tips of its grippers to the table and slid them along the surface before making contact with the keys and securing them between its digits. Eka’s algorithms seem to know instinctively how to recover from a fumble. This kind of thing is difficult for other robots to learn, unless the humans training them deliberately make a wide range of mistakes.

Unlike with any other robot I can think of, it’s almost possible to imagine what the world is like for the robot. Its sensors seem to feel the weight of its arm, the inertia as it sweeps toward the keys and slows down. Once it has the keys in its grasp, it seems to sense the weight of them dangling from its claw.

I don’t know if Eka’s approach really is the route to a ChatGPT-like breakthrough in robotics. Some very smart experts believe that mixing human demonstration with simulation will yield better results than simulation alone. Maybe some combination of the two approaches will ultimately be necessary? But it does seem clear that robots will eventually need to have the kind of tactile, physical intelligence that Eka is working on if they are to obtain humanlike dexterity.

Agrawal tells me that the same general approach should work for finer manipulation. The fiddly dexterity required to build an iPhone, for instance, could be achieved by building different actuators and sensors and practicing the task in simulation.

After spending a few hours at Eka, I decide to stop by the restaurant downstairs. I watch from the counter as the staff prepare food and make coffee. A descendant of the machine upstairs may be able to do these things just as well, if not better. But given how much I enjoy chatting with the people who work there, I think I would pay extra to keep humans around. Unless, that is, my hands get automated away too.


