False and misleading information supercharged by cutting-edge artificial intelligence poses an imminent threat to global economic stability, warns the World Economic Forum in its latest Global Risks Report.
Released ahead of the annual Davos summit, scheduled for 15 to 19 January 2024, the report identifies misinformation and disinformation as the most severe short-term risk, citing rapid technological advances that exacerbate existing problems. The proliferation of generative AI chatbots such as ChatGPT raises concerns that sophisticated synthetic content capable of manipulating groups of people can now be created at scale, without specialized skills.
As the Davos meetings approach, the spotlight on AI intensifies, drawing figures from tech giants such as OpenAI's Sam Altman, Microsoft's Satya Nadella, and Meta's chief AI scientist, Yann LeCun. The report underscores the immediate risk posed by AI-driven misinformation and disinformation, which coincides with upcoming elections in several countries. The use of AI for deepfakes and large-scale misinformation campaigns could deepen societal polarization, call government legitimacy into question, and erode democratic processes.
Beyond the societal impact, the report highlights additional risks associated with AI, including its empowerment of malicious actors in cyberattacks, automation of phishing attempts, and creation of advanced malware. The poisoning of data used to train AI systems further compounds risks, embedding biases into models with potentially irreversible consequences.
While the report acknowledges climate change as a significant long-term concern, it ranks extreme weather as the second-most-pressing short-term risk. Over the next decade, irreversible climate tipping points loom, driven by critical changes to Earth systems, biodiversity loss, ecosystem collapse, and natural resource shortages. The intersection of AI and environmental risks underscores the multifaceted challenges demanding urgent attention on a global scale.