To the editor: I had a visceral reaction to Anita Chabria's latest column ("The Pentagon is demanding to use Claude AI as it pleases. Claude told me that's 'dangerous,'" Feb. 26). It seems the U.S. has taken yet another page out of Joseph Stalin's playbook on the road to dictatorship, but now with far more sophistication with the help of artificial intelligence.
The key factor in the fall of the Romanov dynasty to the Bolsheviks was that angry minority group's recognition of the quintessential value of timing and chaos. Especially in that year of 1917, their carefully orchestrated communication among railroads and telegraph systems was swift and coordinated.
AI systems today (like Claude) appear to be the ultimate tool for controlling today's chaos and communication. Stalin didn't have AI, but he did have his own low-tech versions of surveillance: spies, the KGB, gulags, intimidation tactics and so on. And the U.S. does too: masked ICE agents, detention centers, tear gas.
The American people, if not our legislatures, must ensure that we have rules and regulations that control the unbridled use of this powerful tool by powerful people.
Darlene Pienta, San Marcos
..
To the editor: If Dario Amodei, CEO of Anthropic, wants the Trump administration to understand his discomfort with President Trump's demand that the Department of Defense be allowed to use Anthropic's AI for "any lawful purpose" ("Anthropic refuses to bend to Pentagon on AI safeguards," March 3), then I suggest Amodei cite the quote often attributed to Ralph Waldo Emerson: "What you do speaks so loudly that I cannot hear what you say."
Amodei is wise to limit a license for the Department of Defense to specified purposes rather than broadly to "any lawful purpose." The reason is simple: Trump's track record, in his personal life, business career and role as president, clearly indicates that he cannot be trusted to act lawfully.
Trump recently confessed his belief that his presidential powers are restrained only by his own morality. Basically, Trump believes that his presidential actions cannot be restrained by the Constitution or any law, treaty, contractual commitment or any other framework.
And it's precisely this flimsy moral compass that's led him, in his personal life, to become an adjudicated sexual abuser; in his business life, to be sued thousands of times, be adjudicated a fraudster and become a convicted felon; and in his political life, to be impeached twice (so far).
If you pair Trump's low regard for operating within any legal boundaries with the broad immunity that the Supreme Court granted him last year, then Amodei is right to worry that limiting the Department of Defense's use of its AI to "any lawful purpose" is too weak a compliance standard for a federal government led by Trump.
Amodei should be commended for his courage in walking away from a high-visibility, lucrative, consequential deal because it didn't comport with his company's mission.
Todd Piccus, Venice
..
To the editor: What should Anthropic do now? Go to Europe (or Canada), where it could operate more successfully, free of the heavy and puerile impositions of President Trump and Secretary of Defense Pete Hegseth.
Here's a company that prides itself on the ethical use of its products being coerced into betraying that pride by our government ("Trump orders federal agencies to stop using Anthropic's AI after clash with Pentagon," Feb. 27). What does this say about the ethical character of that same government?
Ken Johnson, Santa Barbara
..
To the editor: If all technology companies join Anthropic and say that their products cannot be used for mass surveillance against Americans or in fully autonomous weapons operations, then Trump will have no choice but to rescind his order for U.S. government agencies to stop using Anthropic's technology. In unity there is strength, even against a forceful leader.
Richie Locasso, Hemet