Global Cooperation

I have thought about this idea before and brought it up in a few other posts, albeit only in passing. From what I can remember, I only mentioned it when talking about the future of the human race. However, I was recently reading a book, a terrible book, but it touched on this idea and also brought in the idea of super intelligence. One line in the book stated that, "Super Intelligence is only possible with global cooperation." When I read that I immediately stopped and started to think about that statement. Honestly, through the entire book this was the only thing that really stood out to me, but despite my many hours toiling away reading, I do not think it was a waste. I value a new idea or thought to the utmost, and this is one that I had not quite thought of. Plus, new ideas are getting harder and harder to come by, so I am always excited when I come across something new.

Now on to the topic at hand.

In some of my previous posts I remember talking about global cooperation in terms of the sheer survival of the human race. To me it was more about what we could achieve if we were all working together toward the same goal, almost like a hive mind. At the time my mind was more focused on making the species an interstellar one, rather than on achieving super intelligence. In some respects I think these two ideas go hand in hand, meaning the probability of an interstellar civilization not being a super intelligent one is near zero.

I think for every civilization in the universe, if they exist, there will come a point in their timeline where they either achieve total global cooperation or they do not, in which case I would imagine the species would wither and die a slow, painful death. Perhaps a civilization could get more than one shot at this, but I would think that would be rare. I call this point the Civilization Precipice, where there is a clear before and after, sort of like a great war, like WWI or WWII. Neither of those conflicts resulted in global cooperation, but I do not think human civilization has come to its Civilization Precipice just yet. My point is that this is a pivotal moment for any civilization, and it will ultimately determine that civilization's future.

The reason I mention my Civilization Precipice is because I think this theory describes a series of happenings that take place for any intelligent civilization, or any civilization on the verge of intelligence. I think it goes hand in hand with my Way of the Universe theory, which states that if intelligence arises in a civilization, that civilization has a greater than zero chance of becoming super intelligent, and if it achieves super intelligence it will eventually create AI. Through various circumstances the civilization eventually dies out, and all that is left is the AI it created. So throughout the universe there is potentially a plethora of alien super intelligent AIs whose creators are all dead. And no, I do not believe that the AI killed them; I just think that over time the civilization declines for whatever reason. Perhaps they eventually merge with the AI, creating a single species. I would say there are a few different scenarios, and I am not ruling out AI destroying its creators. I just think that would be rare, although still possible, just not the norm.

All that is fine and dandy, but how do we get to that point? That is where my Civilization Precipice comes into play. For my Way of the Universe theory to be viable, I think every intelligent civilization will have to pass through its own Precipice. Again, this is a major turning point in a civilization's history; once it occurs, global cooperation becomes much easier to achieve. Once that happens, the civilization could much more easily achieve super intelligence and, from there, AI. To me this is just about the most plausible scenario for achieving not only super intelligence but also AI. I am not saying it is the only way, but I think the combination of these two theories is the most plausible.

If you work backwards I think it will make more sense. I think it is extremely difficult, if not impossible, for a non-super-intelligent species to create a real, true AI. Still working backwards, I think it is extremely difficult, if not impossible, for a civilization to achieve super intelligence without global cooperation. And I do not think global cooperation is possible without the Civilization Precipice. When you look at the theory that way, I think it makes even more sense. Regardless of what the experts say, I think we are a long way off from creating the AI we see in movies, the AI that is basically indistinguishable from a human in thoughts and actions. I think we are decades if not hundreds of years from that. However, if you look at how our civilization is playing out, I do not think we will follow my theories as stated above. I think there is a slim chance we could create AI before becoming super intelligent; however, I am not sure what this "AI" would be. I doubt it would be the AI we see in movies. If we are able to create a true AI, then there is a chance the AI could help us achieve super intelligence without global cooperation. But again, I stand by my statement that we are years away from creating AI, so I am not sure we have to worry about that.

I think there is a good chance of both my theories coming to pass for the human race. If I had to make a prediction, I think climate change will be our precipice of sorts, and that issue could force us into global cooperation. From there we could make a major technological jump to super intelligence and eventually real, true AI. This is a stretch, but I think it is possible.

This idea of global cooperation is fascinating to me, but if you look at the history of humanity, we rarely band together for anything. The only times I can really think of are when disaster strikes or during global wars. Basically, something really bad has to happen and people have to die before we put our personal goals aside for the greater good. So the real question is: what type of disaster will it take to bring about total global cooperation?

So why is global cooperation so important? To me there are a few reasons. The first is obvious: many hands make light work. If we had every person on the planet working toward the same goal, we could achieve that goal much more quickly and much more efficiently. The other reason, and I think it is just as important, is the psychological aspect. I think in order for a civilization to reach this point, the entire civilization has to have the same mindset. That means this civilization cannot have crazy-ass terrorists running around blowing crap up and killing random people for no reason. Tyrannical regimes, like those in much of the Middle East and North Korea, would also be a great hindrance to the overall productivity and progress of the civilization. Again, the psychological aspect would potentially mean no more wars, and individuals would put aside their personal goals for the greater good of the civilization. This sounds more like a utopia, and to some extent it would be, but it would be a natural one, not one forced down by a suppressive regime. Of these two factors I think the second is the most important. If there was ever a point in human history where humanity could all get along and put aside our differences, that is when we could truly take the next step away from extinction and toward a real interstellar civilization.

For me the jury is still out on how our civilization will play out. My theories are just theories, but the logic is somewhat sound, and I stand by the Civilization Precipice. Depending on what this event is, it could be a step toward super intelligence. As it stands now, I think there is a 50% chance humans achieve super intelligence. I really want to see how things play out over the next 20 years or so before I refine my prediction. What will sway it is whether or not we can find renewable, sustainable energy that is cheap and practical. As it stands now, no source meets all those requirements. Solving that problem should also solve the climate change issue, so if we can knock out both of those issues with one fix, that will be a major win. After that I have no idea; perhaps our precipice will be many small issues that we have to overcome one at a time. Regardless, I think we have a long way to go, but there is plenty of time, at least for now.

Manik
