While machine learning has advanced a lot in the last decade, that's only one part of AI. I haven't heard anything to make me think artificial general intelligence, or AGI, is near.
Corporations are arguably already a form of AI. They're not perfect, but they're no apocalypse.
Robots would more likely want to exterminate us than enslave us.
Enslavement by robots might not be all that bad, as manual labor and suffering are not likely to be their end goals.
AI is created for human goals, by humans. As long as we can keep this from creating more poverty, I think we'll be fine.
In a sense, we are artificial intelligence: a neural net generated by a genetic algorithm. In the process of re-creating that, we'll learn a lot about ourselves and witness the birth of either an exciting new life form or the next stage of our own evolution. That excitement outweighs the fear that it will destroy us, even though it might.
Discussion on: Why are you NOT worried about the robot apocalypse?