The US artificial intelligence (AI) firm Anthropic is looking to hire a chemical weapons and high-yield explosives expert to try to prevent catastrophic misuse of its software.
In other words, it fears that its AI tools might tell someone how to make chemical or radiological weapons, and it wants an expert to ensure its guardrails are sufficiently robust.
In a LinkedIn recruitment post, the firm states that applicants should have a minimum of five years of experience in chemical weapons and/or explosives defence, as well as knowledge of radiological dispersal devices, also known as dirty bombs.
The firm told the BBC that the role is similar to positions it has already created in other sensitive areas.
This strategy is not unique to Anthropic. ChatGPT developer OpenAI has advertised a similar position, listing a vacancy for a researcher in biological and chemical risks with a salary of up to $455,000, nearly double that offered by Anthropic.
However, some experts have expressed concern about the inherent risks of this approach, warning that it may give AI systems access to sensitive weapons information even if they are programmed not to use it. Dr Stephanie Hare, a technology researcher, asked: "Is it ever safe to use AI systems to handle sensitive chemicals and explosives information, including dirty bombs and other radiological weapons?"
She also pointed to the absence of international regulations or treaties governing the use of AI in this domain: "All of this is happening out of sight." The urgency of the issue is underscored by the US government's calls for AI firms to engage as military operations escalate globally.
Anthropic has also taken legal action against the US Department of Defense after being designated a supply chain risk, a designation that places the company alongside firms such as Huawei, which faces sanctions over national security concerns.
Despite ongoing discourse about AI's potential existential risks, there has been little appetite within the technology industry to slow its advancement.