In 2014, Stephen Hawking voiced grave warnings about the threats of artificial intelligence. His concerns were not based on any anticipated evil intent, though. Instead, they stemmed from the idea of AI achieving “singularity”: the point at which AI surpasses human intelligence and gains the capacity to evolve beyond its original programming, making it uncontrollable. As Hawking theorized, “a superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.” With rapid advances toward artificial general intelligence over the past few years, industry leaders and scientists have…