
Dangerous pending task in a world-managing AI


I've had this idea for a short novel, at the intersection of One Thousand and One Nights and Asimov's Multivac (and related) short stories.

The idea is that there's some kind of super AI that was so effective that more and more people gave it their problems to solve. At some point the AI ends up managing most of Earth's infrastructure and research.

As humanity reaches its developmental peak, the AI's backlog begins to empty as fewer and fewer problems are submitted to it.

The point is that, eventually, one of the few technicians/engineers still in charge of supervising the AI spots something that will ruin everything if the AI runs out of tasks (1001 Nights style).

I've thought of a few possibilities, and none pleases me completely.

1 - "The AI has become so developed that it gained consciousness and will take over humanity once its mind is free." Bleh: done and redone, and it postulates that the AI will act against humans for no particular reason.

2 - "A small routine that was insignificant when the AI was small, but will have dangerous repercussions now that the AI is managing most of Earth's infrastructure." I like this one, but I can't think of a sub-routine that would fit...

2(a) - ...except for "a Windows forced reboot postponed for years." It's a bit silly and a very context-dependent joke. We'd also have to assume the scheduled reboot sits at the root of the application and would survive any duplication/backup/load-balancing precautions.

I know this issue is more or less the core of the story, but at this point it matters more to me to know how to finish it than to be the one who had the idea.

So if anyone has an idea for implementing option 2, or some new explanation entirely, I'd be glad to hear it.

UPDATE

About the "infinite question" solution suggested in the thread, which could keep the AI occupied forever: it could serve as an ending for the story, or be ignored entirely if the protagonists don't have time to avert the disaster.

But it actually made me think of the opposite possibility: a drunk (or dared) engineer could have asked an unanswerable question as root. Such a question would have its priority set to -1 because it blocked new questions from being asked, and then be forgotten. Many years in the future, the unanswerable question could pop back up and threaten to overload the AI and crash it, endangering everything else it manages.

If anyone has other ideas, feel free to share them; I had a blast reading what could have been the endings of several real SF short stories.

This post was sourced from https://worldbuilding.stackexchange.com/q/111219. It is licensed under CC BY-SA 4.0.
