How to keep human pilots instead of AI in a sci-fi future?

I'm trying to write a 'Gundam-like' series, though with space fighters instead of humanoid robots. I want a semi-hard science world: I'm okay with making up technology without fully explaining how it works, so long as it seems plausible, but I want to be as realistic as possible within my world and to stay consistent with whatever technology I do add. This makes short-range space fighters difficult to justify...

I'm using hand-waves, Minovsky Particles, and similar tricks to justify space fighters. Most notably: energy shields can only be pierced by short-range energy weapons; shields are more vulnerable to many weaker attacks from small fighters than to a few large attacks from capital ships; and the massive EM radiation from shields, coupled with active electronic countermeasures, prevents remote control of the fighters.

The biggest problem I have with justifying these space fighters is AI. It still seems like a good AI would be more viable for these fighters than human pilots. The cost of a space fighter goes up if you have to include life support systems, and the increased size needed to fit a human, controls, and life support makes it a bigger target. My 'Ace'-class fighters, rare fighters equipped with their own shields, would particularly suffer from the extra size making their shields noticeably less effective. Removing humans from the fighters makes them cheaper and smaller.

In addition, a future AI could presumably respond to attacks faster than humans, be less predictable, be more trustworthy (it won't betray you, retreat in fear, do something stupid, or try piloting drunk), handle a truly 3D fight that humans aren't used to thinking in, and withstand G-forces humans can't. And using AI means no human deaths when fighters are destroyed.

However, I want human pilots as my protagonists. I do not want AI-controlled fighters to exist. Thus I want to come up with the best justification(s) for why humans would still be piloting these vehicles.

Right now my best justification is to simply say that AI advancement stagnated in our future: while everything we can do with AI now still works, and some things AI does better, AI capable of handling the complexity of fighting in space simply has not been developed. However, this seems unlikely to me. I'm a programmer, and I feel like today's AI, with enough development (and I'm talking many years), could already almost handle controlling a space fighter. Given faster processing and better computers, which will exist in the future, it's hard to believe that AI would be less suited than humans.

Are there other approaches I can use to justify human pilots over AI? I will not have an "AI went crazy and tried to kill us all" backstory, or otherwise make people afraid of a "terminator scenario". When I say AI here I'm not talking about human-level intellect or actual 'learning' AI, so there is no danger of an AI being smart enough to revolt; I just don't consider that a realistic concern.


EDIT: I sort of implied it, but to be clear: I'm not talking about sapient-level, learning, strong AI, or anything that advanced. I'm talking mostly about weak AI in the sense we have now: it responds quickly to pre-programmed stimuli in a manner its programmers felt was best, with some randomness and game-theory strategies to avoid predictability. It doesn't need to learn or be capable of anything other than flying a fighter and shooting at things. Sapient AI will never appear in any of my stories; I think it's game-breaking and boring.

Final Decision: Wow, I can't thank everyone enough; a multitude of good reasons are listed below. I don't think any one of them fully solves the problem, at least not within my desired world and the limits on what technology I want to place in it, but luckily I have many reasons to draw on!

One of my characters is a pacifist and programmer who effectively writes basic weak AI to drive shields, and who is trying to find a way to remove pilots from fighters because he figures humans will always go to war; the best one can do is limit the deaths. Early on, I'm going to have him go on a tirade about how he would love to replace pilots with AI and, when questioned on it, launch into a bit of a geek rant on the numerous factors that limit AI, all of which work together to make it not yet viable, and unlikely to be viable for a while. I'm going to draw on many of the answers below to fill out his long list of reasons. I therefore feel bad about being able to award only one person the top answer; at least a half-dozen answers will be used.

Here is a short list of most of the things he will go on about, along with some minor points drawn from other answers.

  • AI techniques have not progressed much in the future. We have faster computers, but our approaches to learning AI and genetic algorithms still haven't panned out for large-scale tools, so we're still dependent on the Deep Blue approach of calculating all possibilities in an exponentially expanding tree, which simply doesn't scale. As a geek, I almost see him starting to explain big O, and how the steady increase in processing speed per year can't keep up with the exponential growth in search cost for every extra nanosecond of 'look ahead' these AI need, before he realizes he's talking way over his audience's heads. (See the search-cost sketch after this list.)
    • Limits in AI development mean that humans are better at making decisions in the heat of battle. With communication during firefights limited to occasional burst transmissions (a limit of my world: regular comms are all effectively blocked, and quasi-FTL comms exist but are limited in how they work), it's important to have someone who can make decisions even if communications go down entirely.
  • AI is expensive. Shields emit massive EM radiation and even EMP spikes, and working in a battleground with so much EM interference requires shielded hardware, which is more expensive; the cheaper non-shielded fighters are easier to mass-produce. Computers can still exist, but processing power is limited by the expense of building machines that function in space amid the EM radiation and other emissions of battle.
  • I've begrudgingly agreed to have pilots use a mind-machine interface to handle some of their piloting, though I'll have them use a combination of that and regular controls, under the claim that the MMI can only interface with the parts of the mind that are easiest to translate into actionable commands. Specifically, the MMI is used for movement and navigation only, and physical controls for everything else. I would prefer to avoid this one for storytelling reasons, but otherwise it's just too hard to justify pilots' reaction speeds being fast enough.
  • All this combines to mean that AI exists on fighters but is limited to certain functions. Humans are still used for the things we can't make AI do easily and cheaply.
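
To make the character's big-O rant concrete, here is a minimal sketch of the scaling argument he'd be making. The branching factor and nodes-per-second figures are invented purely for illustration; nothing here is fixed by the story.

```python
# Minimal sketch of why brute-force look-ahead search stops scaling.
# BRANCHING_FACTOR and NODES_PER_SECOND are invented, illustrative numbers.

BRANCHING_FACTOR = 30      # assumed options per decision point
NODES_PER_SECOND = 1e9     # assumed evaluation speed of the onboard computer

def nodes_to_search(depth: int, branching: int = BRANCHING_FACTOR) -> int:
    """Total nodes in a game tree searched exhaustively to `depth` plies."""
    return sum(branching ** d for d in range(1, depth + 1))

for depth in (2, 4, 6, 8, 10):
    seconds = nodes_to_search(depth) / NODES_PER_SECOND
    print(f"depth {depth:2d}: ~{seconds:.3g} s per decision")

# Doubling NODES_PER_SECOND buys well under one extra ply, because each
# ply multiplies the work by BRANCHING_FACTOR: exponential tree growth
# swamps steady hardware gains, which is the character's core point.
```

With these assumed numbers, a 10-ply exhaustive search already takes on the order of days per decision, so faster future computers alone don't rescue the approach.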

Other, more political factors also play a role, primarily by limiting funding for developing techniques to work around the above issues.

  • People don't trust AI with guns; everyone is afraid they'll go rogue and hurt people. He will likely point out that some of this is unreasonable bias picked up from watching too many unrealistic sci-fi stories, but nonetheless the bias is against it.
  • People distrust AI for fear of hacking. He flatly says this is nonsense, since locking down the system would require programming effort but is by no means impossible; but it's politicians, not programmers, who sign the checks for hardware purchases, and you can't convince them of that.
  • Political pressure exists to keep people as pilots: a combination of fear of AI in weapons, soldiers not wanting to be rendered unemployed by AI, a desire for accountability, and a belief that wars will grow more and more excessive without the human factor.
  • People want humans willing and able to say no if a general goes too far. There will be an infamous past example of a general who went against orders and fired off numerous automated weapons without regard to their equivalent of the Geneva Conventions, killing many civilians; the entire incident is considered an atrocity. It's agreed this was one crazy man, without support from anyone else, who was able to do it only because the automated systems had no check to prevent one person from firing all of them. Militaries now require multiple people to authorize automated weapons, as they should have then, but the incident is still remembered. One of the arguments against AI is that this sort of situation could recur if no pilot is present to refuse unlawful orders.
  • Assorted tweaks to my technology to make the limits of pilots less significant. For instance, the best propulsion systems for shielded craft have limited thrust and delta-V, because other propulsion systems are either too expensive for mass-produced (non-shielded) fighters or tend to destabilize the shields of shielded craft. This, in turn, limits the G-forces imposed on pilots. I'll also have a poor man's inertial dampeners to address G-force concerns. (A quick G-load sketch follows this list.)
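
As a rough check on that last bullet: acceleration is thrust divided by mass, and pilot G-load is that acceleration divided by 9.81 m/s². A quick sketch, with all fighter figures invented for illustration:

```python
# Rough G-load check: acceleration = thrust / mass; G-load = accel / 9.81.
# The fighter mass and thrust figures below are invented for illustration.

G = 9.81  # m/s^2, one Earth gravity

def g_load(thrust_newtons: float, mass_kg: float) -> float:
    """Pilot G-load for a fighter under constant thrust."""
    return (thrust_newtons / mass_kg) / G

print(f"{g_load(400e3, 8000):.1f} g")  # capped drive: ~5.1 g, survivable
print(f"{g_load(2e6, 8000):.1f} g")    # uncapped drive: ~25.5 g, lethal
```

So capping the drives (plus the poor man's inertial dampeners) keeps sustained accelerations inside the envelope a trained human can tolerate.
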
This post was sourced from https://worldbuilding.stackexchange.com/q/17043. It is licensed under CC BY-SA 3.0.
