Current events can lead to story ideas

  • How Current Events can give rise to Science Fiction: Dangerous Ideas


    I write science fiction and spend a lot of time investigating new inventions and trends as part of my research process. My intent in this post is not to frighten or upset you, but first to give you some food for thought, and second to show you how easy it is to generate story ideas.

    I'm going to address a subject that we often ignore, but which is rapidly becoming a reality. I want to warn you before you proceed that I'm not a “safe” writer. I deal with subjects that make some people uneasy.

    By the end of the post, I'll give you an idea for a story. What you do with it is up to you. Feel free to use it, if you want.

    I recently read that the US Army was using a “ground-firing” robot for training exercises. Such a device qualifies as a LAWS (lethal autonomous weapons system). LAWS can have varying degrees of intelligence built into their programming. That's the autonomous part.

    There's currently an academic debate about whether such systems are ethical and whether we should build them. The debate is largely academic because real-world economics ensure they will be built (limited forms already exist). Some companies stand to make a great deal of money on these systems.

    There's an old book called “Computers in Battle.” It concludes that a computer should never be given the “kill switch” because it's impossible to write software without bugs. The industry average is between 10 and 50 errors per 1,000 lines of code. Considering that Windows has over 50 million lines of code, at a rate of, say, 20 errors per 1,000 lines there could be around a million errors in that software.

    There's a big difference between innocuous presentation mistakes and serious, result-impacting errors, so let's refine our million errors down to a tenth of that amount. That still leaves us with 100,000 errors that could have a severe effect. So here are some important questions: Would you want to release an autonomous killing machine that could make a mistake and kill the wrong person? What error rate would you find acceptable? Would shooting the wrong person be more acceptable on the battlefield than in civilian society?
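    The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The 50-million-line figure, the 20-defects-per-1,000-lines rate, and the one-in-ten "severe" fraction are all taken from the discussion above; the function name is my own illustrative choice:

    ```python
    # Back-of-the-envelope defect estimate for a large codebase,
    # using the figures discussed in the text (illustrative only).

    def estimated_defects(lines_of_code: int, defects_per_kloc: float) -> int:
        """Expected defects given a rate per 1,000 lines of code."""
        return int(lines_of_code / 1000 * defects_per_kloc)

    total = estimated_defects(50_000_000, 20)  # 50M lines at 20/KLOC -> 1,000,000
    severe = total // 10                       # keep 1 in 10 as "severe" -> 100,000

    print(total, severe)
    ```

    Even at the low end of the quoted range (10 per 1,000 lines), the same arithmetic still leaves tens of thousands of potentially severe defects.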

    I'm about to move into what some readers may consider sensitive territory. Be warned, but it's a line of thought that follows naturally from what came before.

    Let's pair that idea with a very sensitive topic: school shootings. Most teachers are not prepared to carry firearms to deter potential school shooters, so what if the government installed LAWS on every school campus? The intent would be to prevent anyone from bringing a weapon into a school.

    Take a deep breath and try to overcome your horror at the thought. I know how you feel about the idea. I hate it also, but it's an available solution with possible economic incentives. I'm not saying it will happen, but what if...?

    It's not possible to get rid of all weapons of every type. If guns don't exist, pipe bombs do; if explosives don't exist, simple chemistry can make them. Some cultures suffer devastating attacks by knife-wielding killers. If nothing else is available, it's a straightforward task to rent a truck and run down students waiting for rides. The threat here is asymmetric, like fighting terrorists, and only one solution is effective: make an attacker so certain of failure that they see no benefit in attacking. LAWS could do that.

    Let's return to food for thought. Is there something else we can do? How about societal changes? Students routinely brought weapons to school years ago in some areas of the country. They'd leave firearms in their vehicles with the intent of going hunting on the way home. There were few school shootings during those days because our society/culture was different.

    Wouldn't it be more desirable to change our culture in a way that teaches people to place more value on human life than to set up systems that merely react to violence? As I've shown above, banning any class of weapons does not solve the problem, so should we consider a different approach?

    Now here's your story idea: Postulate a built-in LAWS system that covers every entrance to a school and the outside grounds; assume that there is a bug in the code. It could result in multiple innocent deaths, or perhaps it somehow decides that bicycle tires are lethal weapons and targets them. It's your choice how you spin it.


    Eric S. Martell