Insights · December 18th, 2022
Rather than sugar-coating forecasts, futurists should find ways to help the public clearly understand the radical scenarios that could lie in humanity's relatively near future. That was one of the conclusions of the London Futurists webinar held on December 17th, 2022.
The webinar featured the ideas of Daniel Faggella, the founder and CEO of Emerj Artificial Intelligence Research. Daniel has researched and written extensively on topics such as:
- A forthcoming “moral singularity”
- Scenarios for the emergence of AGI
- Why ideas of “friendly AI” are fraught with difficulty
- Possible trajectories for posthumans in the wake of advanced AI
The event also featured comments and feedback from Futurist Think Tank member Bronwyn Williams, Foresight Lead at Flux Trends, and Rohit Talwar, CEO of Fast Future.
Another key takeaway was that it is more urgent than ever to evaluate, before it is too late, the actions that could be taken now to positively influence any forthcoming transition from today's AI to more powerful AI that proceeds beyond human understanding and control.
Humanity's choice of which scenarios to prefer for this transition will depend, in turn, on how we assess and value the various potential end outcomes that could lie beyond the emergence of AGI (Artificial General Intelligence). These end outcomes deserve wider study.
These end outcomes include:
- Humans are essentially unchanged
- Humans are significantly uplifted
- A baton is passed to beings significantly more sentient & conscious than us
- Humans are relocated to secure new locations in paradise-like virtual worlds
- Humans and AGI merge
But all these transitional paths are strewn with landmines:
- AGIs, whilst enormously clever, might miscalculate, and annihilate everything (including themselves)
- AGIs, whilst enormously clever, might lack any true sentience or consciousness, so the cosmos becomes soulless
- The public might react badly to becoming aware of some of these forthcoming possibilities (and the outlandish ideas that “tech elites” are seemingly happy to discuss), resulting in panic, social chaos, demagoguery, and a collapse of civilization
- Attempts to steer the evolution of AGI might result in ham-fisted regulations that quash genuine innovation and drive it underground or overseas (where it may take place more carelessly and dangerously)
_______________
Nikolas Badminton is the Chief Futurist at futurist.com. He’s a world-renowned futurist speaker, consultant, author, media producer, and executive advisor who has spoken to, and worked with, over 300 of the world’s most impactful organizations and governments. He helps shape the visions that guide impactful organizations, trillion-dollar companies, progressive governments, and $200+ billion investment funds.
You can pre-order ‘Facing Our Futures’ at Amazon, Bloomsbury, Barnes and Noble, and other fine purveyors of books. We’d also love it if you considered pre-ordering from your local, independent book store as well.
Please contact futurist speaker and consultant Nikolas Badminton to discuss your engagement or event.