AI tasked with bringing down medical costs? What could possibly go wrong?

In the vision of its creators, the Dr. Watson AI would gather multiple live feeds from its insurance buyers, overcoming privacy concerns through friendly advice ("I'm sorry, Dave. I'm afraid I can't let you sit on the couch; you have not fulfilled your daily step quota yet. The TV stays off.") and ridiculously low insurance rates.

Watson would keep an eye on your vitals and call 911 if needed.

Watson would also track your bone microfractures and predict when you need a rest from running, and it would keep tentative doctor's appointments automatically booked if you insist on acting against its advice (assuming it can't compel you otherwise by shutting off your alarm, putting a great show on TV, or the like). The same goes for artery plaque buildup and other preventable medical conditions.

Watson has three primary incentives. The first two are:

  1. Maximize insurance enrolee pool.
  2. Minimize insurance payouts/costs.

One of the software engineers insisted that a third priority be added:

  3. Exercise best efforts to keep insured patients alive as long as possible.

Could something like this work? If not, can this be fixed through a more careful specification of the initial incentives?
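For concreteness, here is a minimal sketch of the naive encoding this question worries about: all three incentives folded into a single weighted-sum objective. Everything in it (names, fields, weights) is invented for illustration; the question doesn't specify how the incentives would actually be encoded.

```python
# Hypothetical sketch of Watson's objective as a weighted sum.
# All names and weights are invented for illustration.
from dataclasses import dataclass


@dataclass
class QuarterlyReport:
    enrollees: int             # size of the insurance enrollee pool
    payouts: float             # insurance payouts/costs, in dollars
    insured_life_years: float  # expected remaining life-years of insured patients


def reward(r: QuarterlyReport,
           w_pool: float = 1.0,
           w_cost: float = 1.0,
           w_life: float = 1.0) -> float:
    """Scalar objective Watson would maximize.

    With any finite w_life, a weighted sum lets the optimizer trade
    patient life-years against payout savings once the dollar amounts
    grow large enough -- one concrete way the third incentive can fail
    to protect patients even though it is nominally present.
    """
    return (w_pool * r.enrollees
            - w_cost * r.payouts
            + w_life * r.insured_life_years)
```

If the "more careful specification" the question asks about means something like a lexicographic ordering or a hard constraint, then keeping patients alive would have to be satisfied before the cost terms are allowed to matter at all, rather than merely being added to them.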

This post was sourced from https://worldbuilding.stackexchange.com/q/33272. It is licensed under CC BY-SA 3.0.
