AI Becoming so Complex its Creators Don't Know How it Comes to a Decision

AI continues to evolve day by day, and it's becoming so complex that we don't know how it arrives at some of its decisions. Also, with AI now starting to design new AI, how trustworthy will that new AI be? Questions like these are going to continue to dog the AI community until they can be answered. Can we trust it?

“We don’t want to accept arbitrary decisions by entities, people or AIs, that we don’t understand,” said Uber AI researcher Jason Yosinski, co-organizer of the Interpretable AI workshop. “In order for machine learning models to be accepted by society, we’re going to need to know why they’re making the decisions they’re making.”

Discussion

Source: [H]ardOCP – AI Becoming so Complex its Creators Don’t Know How it Comes to a Decision