AI doesn't replace white collar work

Original link: https://www.marble.onl/posts/ai_doesnt_replace_work.html

Andrew Marble argues that worries about AI replacing white-collar work are largely unfounded, because AI excels at *transactional* question answering (looking up facts) but falls short at *relationship-based* problem solving, which is at the heart of most jobs. He illustrates this with two examples: looking up a definition versus asking a chef for their take, and using an AI chatbot to fix code versus seeking strategic advice from a trusted advisor. While AI can supply answers efficiently, it cannot replicate the value of human relationships, shared understanding, and trusted judgment. Much work, especially in fields like strategy consulting, is not about finding the "right" answer but about talking through a problem, gaining perspective, and building trust. Businesses are fundamentally social organizations, and even highly procedural ones depend on human interaction. AI can *assist* with tasks, but it cannot replace the subtle, human factors that effective work depends on.

## AI and white collar work: Hacker News discussion summary

A recent article arguing that AI will not replace white-collar work sparked debate on Hacker News. The central claim is that much white-collar work rests on *relationships* and *trust*, not just factual accuracy. People often seek advice not for the "right" answer but for validation, perspective, and human connection, which AI cannot currently provide. Many commenters agreed with this distinction, emphasizing the value of human judgment and experience, especially in complex situations. There was skepticism too: some users argued that even if AI does not *fully* replace jobs, it will transform them, likely reducing headcount and increasing the workload of those who remain. One key point was that if AI can handle the transactional side of the work, companies may not need as many "relationship managers," leading to restructuring rather than outright job loss. Others pointed to the historical pattern of automation: jobs are not always eliminated, but they evolve, demanding new skills and adaptation. The discussion also touched on the potential for increased regulation, and on a shift toward valuing quality over raw productivity as AI becomes widespread.

## Original article

Andrew Marble
marble.onl
[email protected]
March 8, 2026

Two recent experiences where I had questions needing some external input: One, I saw a word (the word was Pareve but it’s unimportant) and I didn’t know what it meant. I thought it had something to do with food and religious practices and my first thought was to text a chef I knew to ask about it. Of course I quickly realized I could just look it up, which made me lose interest. I would have been interested in the experience and thoughts of someone whose opinions I respected, both as a social thing and to learn something. The dictionary definition I don’t really care about unless I have some acute need to know it.

The second (if you thought the first was boring) was a programming question about preventing None from being cast to NaN when adding a Python list containing integers interspersed with Nones to a pandas DataFrame (spoiler alert, the answer is df["A"] = pd.Series(a, dtype='object')). For this I asked an AI chatbot, got the answer, tested it, and moved on.
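A minimal sketch of the behavior being fixed (the column name "naive" and the sample list are illustrative, not from the original): assigning a mixed int/None list directly coerces the column to float64, while wrapping it in an object-dtype Series preserves the Nones.

```python
import math

import pandas as pd

a = [1, None, 3]
df = pd.DataFrame()

# Direct assignment: pandas infers float64, so None becomes NaN.
df["naive"] = a

# The fix from the article: force object dtype to keep None as-is.
df["A"] = pd.Series(a, dtype="object")

print(df["naive"].tolist())  # [1.0, nan, 3.0]
print(df["A"].tolist())      # [1, None, 3]
```

The object dtype trades away vectorized numeric operations for faithful storage, which is usually the right call when the Nones carry meaning (e.g. "not measured" rather than "not a number").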

These are examples of two different kinds of question answering, or problem solving, that often get conflated. The second is very transactional and there’s an imminent need for the answer. The first is relationship based and is about wanting someone’s opinion, without which the fact itself isn’t too consequential.

For transactional question answering, we have dictionaries, encyclopedias, Wikipedia, and now LLMs that can provide reference information. While those sources aren't interchangeable with opinions, they often get substituted in. How many conversations have been ruined by someone looking up a fact on Wikipedia when you actually wanted to discuss what people knew about it? In relationship-based question answering, the question is almost a pretense to be social, share views, and learn something. It's why we talk to other people, and it's also the basis of most white collar work.

The distinction between question types is becoming more relevant now that people are talking about AI (LLMs) replacing human work. A material, if not dominant, percentage of the "questions" we answer while working are type 1 human-interaction questions rather than type 2 transactional ones. An area where type 1 dominates is strategy.

For as long as there has been AI, there have been claims, often centered on AI being better at making PowerPoint presentations, that strategy consulting is about to become obsolete. I don't think many people involved in the industry (as buyers or sellers) take these too seriously, but strategy consulting is a useful study in why AI answers are often overrated. Setting aside the cynical "we hired consultants to provide cover for an unpopular decision" variations (which obviously don't have the same ring if we replace consultants with ChatGPT), consulting is trust- and relationship-based. Buyers aren't asking for a correct answer; they are asking for advice from someone whose opinion they respect. They also often, both for catharsis and to clarify their own thinking, want to explain their situation to somebody else and feel understood. While there is no harm in asking an AI, few rational people are going to give the same weight to what it says as to a trusted advisor; this is just as true for major strategy decisions as it is for personal advice.

Ultimately, most business tasks have a similar component to them. They rely on judgement, experience, and trust to set a plausible course and correct it when needed, and don’t hinge on determining a correct answer or providing facts. And businesses are organized as groups of people that communicate socially with each other. Perhaps unintuitively, human factors become even more important in procedural organizations like government and military because they don’t have market exposure to provide feedback, and for better or worse rely on human organization.

None of this is to say that people can't use AI for sub-processing tasks that require a type 2 answer. It is great for this. It just doesn't replace the social, human, and relationship-based aspects of work, whether that is trust or simply being interested in what someone else says. It doesn't really matter how good AI systems get; that's not going to change, and since most white collar work deals with these kinds of problems, there is little danger of it being replaced.
