On September 11th, at the 2025 Inclusion·The Bund Conference, Yuval Noah Harari—historian, philosopher, and author of the “Sapiens” book series—pointed out that progress should not be measured solely by the speed of technological advancement, but rather by humanity’s ability to build cooperation, trust, and empathy.
“I am not opposed to technological change. Technology has given us healthier bodies, broader knowledge, light to dispel darkness, and unprecedented ways to connect with one another. But as a historian, I am concerned about the pace and manner of this change,” Harari stated. Throughout history, he noted, the greatest problem with transformation has rarely been the ultimate goal, but rather the process of getting there.
He argued that humans are highly adaptable creatures, yet we need time to adapt—and reliable mechanisms to support that adaptation. Throughout history, whenever powerful new technologies have emerged, societies have taken a long time to develop matching institutions and practices. The Industrial Revolution, for instance, was not just the story of the steam engine; it also required supporting mechanisms such as corporate law, labor unions, environmental regulations, and social safety nets.
“AI differs from all previous technologies in that it touches the central nervous system of society,” Harari emphasized. The danger of AI we face, he explained, “is not a bad person pressing a bad button.” Instead, the risk lies in the invisible processes quietly unfolding around us. Science fiction has conditioned us to fear “robot rebellions,” but the real danger is quieter—and far more terrifying: the transfer of decision-making power from accountable humans to invisible algorithms.

Speed Alone Is Not Progress
For Harari, the first requirement of true progress is cooperation. “Human strength has never come from isolation; it comes from collaborating with strangers and with the world around us. Cutting off all connections to others and relying solely on yourself does not make you stronger—it will eventually suffocate you. Carrying this lesson into the AI era means building verifiable global commitments, rather than just racing to ‘be faster.’”
Second, the real cause for concern is not technology itself, but deploying technology without regard for safety boundaries in pursuit of competitive advantage. “No system that truly reshapes human society should be ‘launched first, regulated later,’” he asserted. “History has repeatedly shown that speed and safety can coexist—but only if we build a closed-loop system of self-correction. An advanced technological society must have ways to identify and rectify its own errors and biases in a timely manner; this is how we ensure it operates both quickly and safely.”
He warned that if we rush to let AI “run” before we learn to identify and correct the inevitable flaws in the system, the costs of that speed will be borne by the most vulnerable groups.
Finally, Harari stressed that humanity must move forward with memory. “As AI begins to take over decision-making processes and shape narratives, we must safeguard humanity’s ability to remember and tell its own stories. If we entrust our memories to non-human intelligence, we will lose everything.”
Closing his speech, Harari urged that in the AI era we must leave enough time for people to preserve their memories and build trust and affection with one another. “Progress is measured not by the speed of our technology, but by the strength of our cooperation and the depth of our empathy.”