Health NZ staff told to stop using ChatGPT to write clinical notes

Original link: https://www.rnz.co.nz/news/national/590645/health-nz-staff-told-to-stop-using-chatgpt-to-write-clinical-notes

Health NZ (HNZ) has warned staff not to use free AI tools such as ChatGPT and Gemini to draft clinical notes, citing serious data security, privacy and accountability risks. A recent memo to Mental Health and Addiction Services staff in the Rotorua district says unauthorised AI use has been detected and could lead to disciplinary action. HNZ policy requires any AI tool used for clinical purposes to be registered and approved. While HNZ is rolling out an approved AI scribe tool ("Heidi"), use of free alternatives is strictly prohibited, even with anonymised patient data. The Public Service Association says staff are turning to these tools because of "enormous pressure" and understaffing, criticises HNZ's threatening approach, and calls for investment in training and approved resources rather than disciplinary threats. HNZ declined to say how many incidents there have been or whether anyone has been disciplined.

Health NZ staff told to stop using ChatGPT to write clinical notes (rnz.co.nz) — posted by billybuckwheat | 1 comment

burnte: Yeah, there's no privacy or security there. There are tools specifically designed to help healthcare providers generate better notes faster, and some of them are excellent. I'm an AI pessimist — keenly aware of its shortcomings and careful in how I use it — but even as a skeptic, some of these tools really are great. I think using LLMs to create overviews and summaries is an excellent use of this technology.
Original article
[Image: three hospital beds flanked by drawn curtains, with a light green wash over the image. Photo: RNZ]

Health NZ (HNZ) says staff have been caught using free AI tools like ChatGPT and Gemini to write clinical notes, a move it says could result in formal disciplinary action.

A memo seen by RNZ was sent this week from a senior manager to all Mental Health and Addiction Services staff in the Rotorua Lakes district, reminding them not to use tools like ChatGPT, Claude or Gemini in their work.

"It has come to my attention that there has been instances where it appears that AI (artificial intelligence) drafting tools have been used to prepare clinical notes," it says.

"The use of free AI tools (e.g. ChatGPT, Claude, Gemini) for clinical purposes is strictly prohibited due to data security, privacy and accountability concerns. You are also not allowed to use AI tools to draft notes and then transcribing it to handwritten or typed notes, even if you anonymise the patient information."

Doing so could result in formal disciplinary action, it said.

According to the HNZ-wide AI policy, any AI tools must be registered with the Health NZ National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG) - this would include Heidi, an AI scribe tool being rolled out across EDs.

Sonny Taite, HNZ director of digital innovation and AI, said free AI tools presented risks to data security, privacy and accountability, and "any possible exemptions are assessed case by case".

"As with any new process in healthcare, we are working with our clinicians on new ways of working and this is an ongoing process."

HNZ did not answer questions about how many instances there had been of staff using unapproved AI software, or whether anyone had been disciplined.

Staff turning to AI tools under 'enormous pressure' - union

Fleur Fitzsimons, national secretary for the Public Service Association, which represents many health and addiction service workers, said clinical staff were turning to AI tools because of the "enormous pressure" they were under.

A memo which opened by threatening formal disciplinary action was the wrong approach, she said.

"It's a warning shot that will make staff afraid to ask questions or seek help."

HNZ should be investing in proper training and approved tools, she said.

"Let's not forget that HNZ has been cutting the very teams responsible for digital systems and IT support. If staff are improvising with free tools, HNZ needs to examine why that is the case, not simply threatening staff with a breach of the Code of Conduct."

