Terminator 2 (courtesy fastcocreate.com)

I ran across a new “smart gun” proposal, AI Can Stop Mass Shootings, and More. Yes, these people — three superior intellects who toil away at the Rensselaer Polytechnic Institute — are proposing that “ethical” artificial intelligence be built into guns, which would make them really smart guns. Allegedly.

While we hope that such [conventional gun control] measures, which of late have thankfully been gaining some traction, will be put in place, our optimism is instead rooted in AI; specifically, in ethically correct AI; and even more specifically still: our hope is in ethically correct AI that guards guns.

Unless AI is harnessed in the manner we recommend, it seems inevitable that politicians (at least in the U.S.) will continue to battle each other, and it does not strike us as irrational to hold that even if some legislation emerges from their debates, which of late seems more likely, it will not prevent what can also be seen as a source of the problem in many cases: namely, that guns themselves have no ethical compass.

Guns themselves, you see, lack a “moral compass,” a conscience if you will. Therefore, in the brave new world that’s dawning, we must install ethics in our firearms so the gun can decide if it’s morally permissible for it to fire.

Mass shooter at a Walmart? That would be bad.

The shooter is driving to Walmart, an assault rifle, and a massive amount of ammunition, in his vehicle. The AI we envisage knows that this weapon is there, and that it can be used only for very specific purposes, in very specific environments (and of course it knows what those purposes and environments are).

At Walmart itself, in the parking lot, any attempt on the part of the would-be assailant to use his weapon, or even position it for use in any way, will result in it being locked out by the AI. In the particular case at hand, the AI knows that killing anyone with the gun, except perhaps e.g. for self-defense purposes, is unethical. Since the AI rules out self-defense, the gun is rendered useless, and locked out.

No trigger for you.

A cop taking down a mass shooter? That’s OK. Fire away.


But I think the three galaxy brains behind this bold idea have failed to comprehend that the world isn’t always quite so simple. You wake up in the middle of the night to sounds of ransacking somewhere in your house. You grab your gun…and it sits there pondering the possibilities.

Is it just your child clumsily looking for a late night snack? Is it actual home invaders? Maybe it’s a drunk who’s blundered into the wrong house. Or maybe it’s an actual burglar, but the gun thinks what it’s hearing is the sound of you abusing your spouse. By the time the gun makes up its mind, you might be dead.

You may be thinking, “Hey, wouldn’t bad guys just disable the AI?” Don’t worry, Doctors Howard, Fine and Howard have already thought of that.

The AI wouldn’t be just in the gun(s), but it would be “imbued” in other objects that would somehow (it’s not really specified) stop the shooter, too. Besides, you wouldn’t be able to hack it or turn it off because . . .

This is an objection that we have long anticipated in our work devoted to installing ethical controls in such things as robots, and we see no reason why our approach there, which is to bring machine ethics down to an immutable hardware level, cannot be pursued for weapons as well.

Oh. This is immutable hardware we’re talking about. Magically hack-proof. Artificial intelligence that’s instilled with its own (naturally superior) ethical reasoning.

The three researchers behind this scholarly work don’t seem to have considered the hundreds of millions of pre-existing firearms in the US alone. But that’s all beside the point. It turns out that we already have such a system, one that’s far more sophisticated than any AI: the human conscience.


I know that initiating force or preying on innocents is unethical. I know that not all force initiated against me necessarily calls for the use of deadly force in response. I’ve studied the ethics of, and the laws surrounding, armed self-defense for decades. I don’t need a machine to try to make decisions for me. And I don’t want someone else to program my gun’s brain to decide for me when I can and can’t use it.

The problem, of course, is that the world has a certain population of people — criminals, gang members, psychopaths, lunatics — who are out to harm innocents and commit crimes. These people lack any such sense of ethics. And they aren’t about to equip themselves with an artificial conscience.

The answer is not to adopt more machine control over our lives, but to allow more of us who are ethical — the vast majority of people — to freely access firearms to protect ourselves and our families. As is oft noted, an armed society is a polite society…and an increasingly ethical one.



  1. AI detects Democrat voter registration and disables all guns. Good idea. Most of these spree shooters are lifelong liberal democrats.

    • And this is why this will never be a “thing”. Unbiased ethics will always be biased against unethical people. Unbiased/blind hiring and promotion systems (even ones with humans removed from the analytical process) have been accused of racism, homophobia, and misogyny… something like this won’t be any different… unless it’s run by a Google algorithm.

  2. The people proposing AI for guns aren’t all that swooft. They fail to note that in case the AI should fail (regardless of reason) there needs to be a way to track the gun and its components: microstamping every part, screw, and roll pin.

    Ammunition must be microstamped to match all the other stamped parts. Manufacturers and FFLs must be liable for any AI failure, or unmarked/stamped part that fails, resulting in non-intelligent use of the firearm.

    An entire universe of sensors are needed to register the firearm as it moves from place to place in the home/residence, or anywhere outdoors, including movement across political boundaries.

    The AI controlled firearm must be rechargeable only via renewable energy sources, and a log be created and transmitted to law enforcement and government agencies charged with monitoring the impact to adding to, or subtracting from, climate stabilization.

    AI firearms must also be manufactured such that the firearm itself can be used as a data entry platform for voting, at every level.

  3. …times like this I’m glad the development of electrically ignited ammunition and capacitor trigger mechanisms for small arms never got past that one Ruger knock off rifle.

    What’s also funny is that electronic triggers would possibly be more widespread and make this silly smart gun nonsense reality… but the NFA, something I’m sure they don’t want to change regarding full autos, won’t allow it.

    • Wasn’t there the Remington Electron-X rifle? Only collectors are interested in those now. I don’t think even Ian has had one on his show.

        • Thanks. I’ll have to look that one up. I remember reading a “Guns&Ammo” review on them when they were released.

          I didn’t think they were a good idea because the trigger relied on batteries and the ammunition was a custom caseless round that seems to be no longer available. Interesting technology but ultimately two dead ends.

    • Electric ignition does offer the stepping stone to a “smart” gun – those optical devices which range, set up the hold over, interrogate the Friend or Foe link, then only fire the weapon if all the correct conditions exist. Basically, you aim, pull the trigger, it then fires with the proper solution and you scan further. AI being unlimited and perfect in the fantasy future, only the enemy soldiers are hit.

      Now apply that coming back at ya. With micro drones.

      I think we will be lucky enough to get a knock sensor to accurately count rounds fired and then maintain a weapon for its periodic services. Glock 4 slide past 20K? Change it before it cracks! If you know of the high round count thread on a firearms forum, you get that.

      : )

  4. How much recreational pharmaceuticals were consumed in developing that logic? They must have been so high they were observed on ATC radar.

  5. “Wasn’t there the Remington Electron-X rifle”

    I thought that was the thing that caused Victor Kiam to buy the whole company, back in ’79.

  6. “Immutable hardware that cannot be hacked” – Bingo! We are there – just leave your magic gizmo off my hardware, and it cannot be hacked!

  7. If the human race had reliable technology as described in this article, it would not need the technology described in this article.

    None of this actually deals with the real problem. None of these people even care that murderous criminals were purposely let loose from their cages by politicians.

    • Well, I was thinking: why not put a chip in the repeat offenders? The person could have a choice, in or out. It could be like Clockwork Orange, only better. Turn anybody loose, murderers, robbers, father rapers, dope pushers, dog molesters, anybody and anything. Turnem loose, and if they screw up the jolt hits dead.

  8. The geeks should develop a gun that requires the user to blow into the barrel to determine sobriety. Blowing into the barrel could also determine if the user is a gun control zealot and discharge to eliminate the threat:)

  9. “you’re not understanding what the term microstamping means in reference to firearms….”

    Just extending the technology in the most logical direction. The requirement is too rudimentary today. If a firing pin head can induce microstamping, why not go further?

    • There are already proposed laws, and some may be applied, that EVERY part have a unique serial number and it must be recorded for tracking. Imagine what your registration form would look like taking many pages for every nut, bolt, screw, etc.

  10. First Law
    A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    Second Law
    A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
    Third Law
    A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    Wouldn’t this make AI guns obsolete?

    • Heh. I was thinking about the sniper rifle in Borderlands that harangues you every time you take a shot: “You’re a monster!” “I bet he had a family!” I’ve been wondering if Skippy is a relation…?

  11. Here’s the deal. Regardless of the many faults and flaws in this research paper, I have no doubt that some politician or anti-gun fanatic will run with it and throw buckets of money at the researchers. This seems ready-made to appeal to Bloomberg or any number of liberal tech billionaires.

    • The problem with Heinlein’s 3 laws is defining what the criteria are for deciding what a human being is. Must it only be certain physical characteristics, such as 2 legs, 2 arms, and a head? Does skin color or facial characteristics enter into it? Is that an ape or a human? Would the rules also apply to military bots engaging an enemy force? Can a robot ‘kill’ another robot on the orders of a human?

      • Didn’t the Terminator see the human hearts beating? He also killed that girl Terminator. The Lost In Space robot had no trouble toasting other robots. This is silly talk.

        But I wonder what War is going to be like when it becomes Robotic.

        • Don’t know about robotic war, but we’ll probably find out in the next 10 years or so. It’s an active area of R&D in every major military program on the planet, and some not so major. About the only thing missing is true autonomy and that could be implemented with a couple mouse clicks.

  12. These “intelligent” people more than likely just watched the anime Psycho-Pass and are now pretending that the idea behind the guns in that anime is some new and fresh idea that they came up with.

  13. Does anyone believe gasoline could be priced beyond affordability in order to force the acquisition of a green energy vehicle?
    That green energy vehicle might just become a gunm.
    A $700 plastic pistol and $65 box of 20 for the bullets to go in it ain’t cheap.
    DebbieW throws race into it, these prices are putting the hurt to all us poh folk out here. Actually monetarily banning the poor from exercising a Bill of Rights.
    “Work hard, pull yourself up by your boot straps”, yeah well fck it, we ain’t got no shoes.
    But seriously

    • Look at the cost of electricity in Texas over last few weeks. Sometimes the market does not provide the best outcome.

      Welcome to the Green New Deal.

        • Electricity should NOT cost $20,000 to $50,000 for a week because the power companies did not invest in their infrastructure and didn’t plan for an aberrant weather event. And people should not have to choose between freezing to death or facing a power bill more than the value of their home.

        • That is true.

          Also true is that as bad as that is, it’s still better than living in a Marxist “utopia” with no functioning markets and not enough electricity to go around at the best of times. Those bills will get sorted out…no customer is going to have to pay that much, I can guarantee it.

  14. “Can a robot ‘kill’ another robot on the orders of a human?”

    Been awhile, so I looked it up. The robot laws were constructed by Isaac Asimov. In one book, or short story, there was a war where apparently all the humans were destroyed, and robots began to create a hierarchical society (apparently, some robots did serve as “soldiers”). In the end, a war-battered human shows up and says, “I’m hungry”, thus ending the “dreams” of the ruling robots as superior to all the other robots.

    There is quite an online trove of critiques of the robot laws, even discussions of why AI enabled machines might not obey Asimov’s laws.

      • Nah. I did too. It’s why I wasn’t finding anything, thinking Mr. Heinlein had constructed those. Then I saw Paul’s message above. I slapped my own forehead.

  15. Reading the paper my favorite parts are when they assume away any problems with their argument… assume an ethical system exists, assume an ethical AI could be built, assume the AI could discern the intent of any person, assume it can’t be hacked, assume the AI is all knowing.

    Some might say these things are impossible, but we disagree because we already assumed that they are possible.

    It is also disturbing how they repeatedly affirm that it is extremely difficult to state why the killing of human beings is intrinsically wrong… I think they need some more ethics lessons.

  16. By the time I was 16 I had read EVERY issue of Sci-Fi magazine and Science Fiction & Fantasy magazine that had been published to-date. My uncle was a collector and even wrote reviews. I was a Nerd. Notice the initial cap. Big time.

    NOTHING, in any of those stories seemed as unrealistic, preposterous, and dangerous as the leftist agenda and regime that has usurped our Constitution, heritage, and government.

    I do not live in my imagination any more (at least not during working hours, LOL); I have a living to earn and a family that depends upon me. But I am determined to preserve the principles upon which this country was founded. Not sure how, but we only lose when we stop fighting.

    I will wager you all feel the same way.

    • “Not sure how, but we only lose when we stop fighting.“

      This. Exactly this. This needs to be a rallying cry or a poster.

  17. Every time I see someone talk about ‘smart guns’ I’m reminded of a book from the ’50s called ‘The Weapon Shops of Isher’. It was a world where a group of renegade gunsmiths sell guns that have built-in shield generators that make you impervious to anything less than a directed atomic blast. But the counter is that the gun is a smart gun that can tell what you’re intending to use it for and won’t fire if you’re trying to murder someone.

    Personally, I’m more willing to believe that book makes sense than to believe that AI and ‘smart guns’ are going to do anything to prevent mass shootings.

  18. I wouldn’t even buy a car that would keep me from squishing a “protester,” much less a firearm that decides who does and doesn’t need to be shot!

  19. So the programmers who can’t write autonomous vehicle software that won’t run into the side of an 18-wheeler or the back of a stopped fire truck think they can write software to determine whether a situation qualifies as legal self defense. Dream on, guys. We’re a long way from The Weapon Shops of Isher or Isaac Asimov’s three laws of robotics.

  20. Not to defend smart gun laws or anything, but the argument presented in this article is flawed. AI will almost always be able to make decisions faster than humans. If that’s your objection, then you’re not making a good argument.

    The real argument should be about other potential failures. Anything except processing time. A little smartwatch can process decisions faster than any living human. AI in a gun could too. The question is, would it be the right decision?

    • Processing time is still a consideration because the data used by the AI will be visual (and maybe sound, too). That means computer vision, which means crunching a lot of pixels really fast (in real time), analyzing them, and THEN taking some kind of action. The hardware to do this takes up space; miniaturizing it has implications for heat, power, and, yes, performance. Solvable, I’m sure. Eventually. I do think you’re right to mostly key onto other criticisms, though: There are a lot.

  21. Of course, when they say “the gun” would decide, they mean “some tech mogul” would decide. That’s for two reasons: first, because AI must be designed and trained by someone at some point. But more importantly, the kind of AI needed to make situationally aware, ethical shoot/don’t-shoot decisions isn’t likely to fit into small arms in the foreseeable future. The piece in the gun would be a front end for an AI running on massive servers somewhere. Whoever owns that server effectively owns your gun.

  22. Maybe I’ve seen too many fantasy movies, but the idea that my heirloom will choose who is worthy to wield it after me is pretty friggin cool.

  23. Straight from the biased paper linked in the article:
    [pertaining to the El Paso shooter]

    “…any attempt on the part of the would-be assailant to use his weapon, or even position it for use in any way, will result in it being locked out by the AI.”

    AI to interfere with Smart gun usage. What could go wrong?

  24. I go to Walmart. All guns know not to arm themselves in a Walmart parking lot. The gun on my hip becomes an inert paperweight. A person with a knife, homemade bomb, homemade zipgun… decides it is worth it to him to end my life and take my money and my car. He is much bigger and stronger than my frail self. The AI just decided that my life is not worth saving and that some evil person is more worth having on this earth.

  25. Yo, this shit is just the plot of Psycho-Pass. Long story short: the last true man in Japan grabs a long-hidden and unregistered SP101 to solve the problem, because the AI can’t read the murderer’s intent through the special guns, meaning the AI is unable to make a judgement and the murderer can just slit someone’s throat and walk away scot-free multiple times.

  26. “You and your logical objections. Why can’t you just be quiet and let them have their sloppy science fiction?”


    (great silence follows)

  27. “Dang it. I thought getting my way was gonna be more fun.”

    “First you say you do, and then you don’t. Then you say you will, and then you won’t. You’re undecided now, so whatta you gonna do?”

  28. A.I…. for “smart guns” 🤔… 😂🤣 ONLY from (what passes for) the mind of a ‘progressive.’ 🤦🏽‍♂️🙄

  29. “…the power companies did not invest in their infrastructure and didn’t plan for an aberrant weather event. ”

    The Texas Interconnected System began around the beginning of WW2. The people who created and managed the grid were not ignorant of historical Texas weather patterns. What happened recently was an unprecedented, not “aberrant,” weather event. (And for just how “aberrant” an event must states prepare?)

    Although no longer a resident, I probably spent the majority of my life in Texas. By the late ’80s, I had never heard of snow in San Antonio. Yet a freak storm put down 18 inches. The city came to a complete halt. Was it the responsibility of the San Antonio civic leaders to store and maintain snow plows, indefinitely, just in case?

    I experienced the same thing in Colorado in the mid-80s. The municipalities of Colorado Springs and Denver declared that the snow removal plan was “three days of sunshine”. Eventually, both cities did invest in snow removal equipment, but given the difference between historical Colorado and Texas weather, one might say the snow problem in Denver and Colorado Springs was not unprecedented, nor unimaginable.

    Sometime in the early ’70s, ERCOT was established to prevent a Texas version of the great Northeast Blackout of ’65. Even then, the Texas grid was not capable of dealing with something as huge as what happened last week. Risk management is a thing, and while one might imagine a catastrophe such as in Texas, what is the likelihood, and how much resource should be allocated to mitigate? Again, how much aberration are people and cities supposed to prepare for? NOTE: nuclear weapons still exist, global hostility among nations with nuclear weapons still exists. Yet, we have no national plan (or capability) for dealing with a nuclear attack, or its aftermath. How much preparation for protection of the populace is required in case of an “aberrant event”?

    ‘Nuther NOTE: Texas was completely foolish, even under “normal” weather conditions, to copy the Californication model for “green energy”.

  30. The only new electronic component I would consider in firearms would be a rechargeable electronic discharge primer that could help alleviate the primer shortage for practice/training ammo that does not require as high a degree of reliability as a standard primer. Realistically that is not feasible as it would be a lot of resources for otherwise cheap ammo and would only be truly viable for reloaders. Also it would not appease those seeking to infringe so there is no realistic monetary drive for such a product to be developed.

  31. By the time you would tack all that crap on a gun, it would be the size and weight of a modern tank. These guys are prime examples of over-schooled idiots. They have to be over-schooled, because you sure can’t call them educated.

  32. Simple question: Why in the blue-hammered hell would I EVER want to subject MY judgment to being second-guessed by an AI program that I didn’t write and don’t fully understand???

    If I think I need to use my gun, and my gun disagrees, and my gun turns out to be wrong, do I get to sue the manufacturer or the idiot that wrote the program (assuming I survive at all)???

    Apparently, these days, even engineering profs are stupid. And I used to think engineering was a refreshing oasis in the academic desert.

  33. These idiots need to stop watching re-runs of ‘The Big Bang Theory’ and just keep surfing porn sites. They can be dangerous. The Democrats will try anything to get rid of guns.

  34. Once again the liberals miss the entire point of the Second Amendment. The sole point of the 2nd is for when the government turns tyrannical. When that happens, it is the right of the people to remove such a government and replace it with something else. If your gun is under the control of a tyrannical government (i.e., AI), then it would undermine the entire reason why guns must be owned by individual citizens.

    In other words, we neither want nor need such a device, even if it could be programmed properly. The dolts at RPI should really understand history first and engineer better products second.

  35. “Once again the liberals miss the entire point of the Second amendment. The sole point of the 2nd is if the government turns tyrannical.”

    Government, doing good things for the populace, for their own benefit, their own good, is not tyranny. It is the purpose of government; make sure the populace doesn’t harm itself, or others.

  36. “This new technology should be evaluated by Secret Service, FBI, state, county, and city LEOs for about 20 years until it is proven to be reliable.”

    That would fail the goal: burden gun ownership such that people decide it is not worth the effort.
