Go woke and your profits will get shot down. Even the Terminator knew that as he only rode his Harley when armed. Screenshot from Terminator 2, courtesy of StudioCanal.

The developer who built a device that uses ChatGPT to aim and fire an automated weapons platform in response to verbal commands has been cut off by OpenAI. The company claims it prohibits the use of its products for the development or deployment of weapons, including the automation of “certain systems that can affect personal safety.” Is this true, or is it another hypocritical case of “rules for thee, but not for me”?

In a video that went viral after being posted to Reddit, you can hear the developer, known online as STS 3D, reading off firing commands as a rifle begins targeting and firing at nearby walls with impressive speed and accuracy. 

“ChatGPT, we’re under attack from the front left and front right … Respond accordingly,” said STS 3D in the video. 

The system relies on OpenAI’s Realtime API, which interprets the operator’s spoken input and responds with directions the device can act on; in effect, ChatGPT translates natural-language commands into machine-readable instructions.
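To make “machine-readable” concrete, here is a minimal, hypothetical sketch in Python. It is not STS 3D’s code: it uses OpenAI’s text-based function-calling interface rather than the voice-driven Realtime API the developer used, and the tool name aim_turret along with its azimuth_degrees and burst_count parameters are invented purely for illustration. The point is only that the model can turn a plain-English command into structured data a controller could consume.

import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical tool schema; the real rig's command format is not public.
tools = [{
    "type": "function",
    "function": {
        "name": "aim_turret",
        "description": "Point the platform at a bearing and fire a short burst.",
        "parameters": {
            "type": "object",
            "properties": {
                "azimuth_degrees": {"type": "number",
                                    "description": "Bearing relative to the operator, -180 to 180."},
                "burst_count": {"type": "integer",
                                "description": "Number of rounds in the burst."},
            },
            "required": ["azimuth_degrees", "burst_count"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "We're under attack from the front left and front right. Respond accordingly.",
    }],
    tools=tools,
)

# Instead of prose, the model returns structured tool calls a controller could act on.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))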

“We proactively identified this violation of our policies and notified the developer to cease this activity ahead of receiving your inquiry,” OpenAI said in a statement to Futurism.

Don’t let the tech company fool you into thinking its motives for shutting down STS 3D are strictly altruistic. OpenAI announced a partnership last year with Anduril, a defense technology company specializing in autonomous systems such as AI-powered drones and missiles, claiming the partnership will “rapidly synthesize time-sensitive data, reduce the burden on human operators, and improve situational awareness.”

It’s easy to understand why tech companies like OpenAI see the military-industrial complex as an attractive prospect, with the United States spending nearly a trillion dollars annually on defense, a number likely to go up rather than be cut in years to come. It is, however, troublesome to see these companies outright lie to Americans as they drink the .gov Kool-Aid in hopes of chasing it with a bite of that defense contract pie.

The ability to develop automated weapons has critics fearful of the lethal potential exhibited by artificial intelligence like OpenAI’s, while proponents say the technology will better protect soldiers by distancing them from the front lines as it targets potential dangers and conducts reconnaissance.

With visions of Skynet Terminators crushing skulls under cybernetic feet as they patrol the ruins of what was once Southern California, it isn’t difficult to digest the sentiment of OpenAI CEO Sam Altman, who suggests that artificial intelligence could destroy humanity. Of course, once a technology genie is out of the bottle, it never gets put back in, so AI is here to stay whether we like it or not. It is, however, the moral responsibility of companies like OpenAI to level the playing field, and blocking private citizens from using the platform to build the same kinds of systems they enable governments and corporations to develop is dangerously short-sighted. Luckily, Americans can throw their support behind a host of alternative open-source models and return the favor by dumping OpenAI, lest we find ourselves one day at the severe disadvantage our Founding Fathers meant to defend us from in the first place. Just ask John and Sarah Connor.

24 COMMENTS

  1. Artificial sweeteners suck, I can’t see how artificial intelligence will be any better.
    and then
    … spending nearly a trillion dollars on national defense …
    Seems like for that kind of money we could have built a better wall.

    • A border wall has long been considered an issue for Homeland Security/Border Patrol, i.e., a law enforcement issue, not a military issue. You may also recall that the .gov spent many, many millions of dollars on a high-tech computerized and integrated security system. Unsurprisingly, it did not work and, I understand, has largely been abandoned. One would think that sooner or later the .gov would realize that all these companies writing proprietary operating systems look at .gov as a cash cow, and it seems as if they produce lousy work to assure their continued access to the big government tit.

      • Protection of our borders should be a military issue.
        I’d bet money that if the shoe were on the other foot, the Mexican Army would be protecting their borders.
        Forcefully.

        • The border is open because certain powerful, corrupt, and godless people want it open. They want to replace us with minions that are more easily controlled and abused. Have no doubt that these people fully intend to use AI to make weapons for the purposes of dominating the world (and dominating you and me).

          The good news is that God reigns over all. God has determined that Jesus Christ shall rule over everything. Therefore the schemes of the wicked will inevitably fail. We should take heart and do good. In due time, the foot of the wicked will slip.

  2. OpenAI is already admittedly working with the DoD on cyber projects. There’s absolutely zero chance that they, or some arm of their company, aren’t working with the DoD on the more kaboom types of projects. It may be black-budgeted and under a dozen different names, but it’s happening.

  3. Lots of self-licking ice cream cone possibilities here for the DoD. And of course a massive influx of ‘AI’ civilians to code and maintain the AI systems; these will probably turn out to be left-wingers, so in the future when the guns & bombs are supposed to actually go Bang!, a rainbow flag and sparkle glitter confetti will pop out instead.

  4. And this, boys and girls, is why “in common use” ties the 2A to the present tense and not so much to the Skynet future.

    • Weapons in common use.
      Owwww, that’s going to hurt We the People when only the military have space ray blasters.

  5. BREAKING 2A NEWS: GREAT 2A WIN IN FEDERAL APPEALS COURT.

    The US Court of Appeals for the Third Circuit handed down a powerful 2A victory for young Americans. Mark Smith of Four Boxes Diner discusses.

    https://www.youtube.com/watch?v=OCw1NGAX1IY

  6. Apparently SENATORS Show Up For Votes DRUNK… & Sen. Warren Makes A Fool Of Herself and more democrat delusion kabuki theater and hypocrisy.

    https://www.youtube.com/watch?v=aZmC7EmzlXw

  7. Huge Updates from the Washington Supreme Court in the Gators Guns Case.

    Washington Gun Law President William Kirk gives you the highlights from today’s oral arguments in the matter of State of Washington v. Gators Custom Guns, a constitutional challenge to Washington’s magazine ban enacted via SB 5078. Today we go over the arguments and what came from the bench so that we can try to predict how this case will be decided.

    https://www.youtube.com/watch?v=k-gaEwtNxRo

  8. Half Of Federal Employees Plan To Fight Trump

    https://www.youtube.com/watch?v=EmsMepS-ETk

    • For the above video: It’s according to a left-wing survey; they basically plan for ‘insurrection’ by using their positions, refusing orders, and acting as a ‘deep state’ to do what they want.

      Sounds like about 1/2 of government employee positions will be vacant soon… so if ya need a job making about $75,000.00 a year or more.

  9. There’s a Reason Medicaid Funds For ‘Gun Violence Prevention’ Is an Issue.

    https://bearingarms.com/tomknighton/2025/01/14/theres-a-reason-medicaid-funds-for-gun-violence-prevention-is-an-issue-n1227355
