Artificial Superintelligence (ASI) is coming eventually. Many organizations are discussing the existential risk that ASI poses to humanity. Even if we only develop AGI, that AGI will eventually create ASI, and at some point we lose control.
Supporting the open sourcing of Collective Superintelligent systems is our best hope for keeping pace, and for moving forward before other technologies outstrip our ability to adapt. Check out the videos from the 2021 Conference.