Billionaires seem to have a… unscientific view of a sci-fi future. Especially Musk, since he thinks he’s so transcendental, but apparently Bezos can’t help himself now either.
It doesn’t look like Star Trek.
It doesn’t look like a Cyberpunk movie.
I’d recommend diving into this for a more scientifically ‘thought out’ and optimistic extrapolation: https://www.orionsarm.com/
Interestingly, this is a neat idea waaay down the line, in the way a Dyson Swarm is interesting. But not anytime in the near future, not until humanity is very, very different (assuming we survive that long).
That’s because, due to network effects, the people making these projects are about as qualified as grocers. They just happened to be in the right place at the right time. They’re not engineers, not philosophers. But since they’ve read and seen in sci-fi that they have to show something engineer-philosopher-like, they do all this bullshit.
Our world’s problem is these monopolies, which should be busted. Once they are, we’ll see a lot of good come out of plain, normal competition.
And I also don’t think the sequence of events that led to the current state of things should be treated as some proof of “capitalism not working”, or “computer-driven futurism being a dead end”, or even “space travel never happening”, or anything that radical. Every time is different. It’s like living your whole life alone after one bad relationship.
We should dream, and we should make, and we should try, and we should tell those who think it’s their “vision” or none to go kick rocks.
The real world looks surprisingly like a cyberpunk movie already.
Man… if the Technopocalypse is what you consider optimistic, I’d hate to find out what you consider pessimism!
AI will turn all of us into paperclips before long unless the AI jihad succeeds.
Humanity survived though. Even with ‘humans’ dying out, I’d like some form of life to expand and go on.
My biggest fear is Earth ‘fizzling’ and never expanding before the Sun eats it, and the odds of that happening are pretty high.
Well, I’m not sure you’ve considered the time-frames involved in that concern. We have a whole lot of time before the sun goes out on us. It took Earth about 2 billion years to develop multicellular life, and another 2.5 billion before we got vertebrates. That was the hard part, though, and it’s done; I don’t think there’s any undoing it. There aren’t many things that could wipe out all forms of vertebrates on Earth, so I’m confident that’s as far back as the planet could reasonably be set back by any disaster.
Just 60 million years ago, mammals were not at all a dominant form of life, yet that’s all it took for early rodent-like mammals to evolve into human beings (as well as all the other mammals we know today). So based on that timeline, if all human life on the planet were wiped out tomorrow, I’d estimate (pessimistically) that it would take less than another 200 million years before another species gained a similar level of intelligence and began a new era of civilization (and perhaps as little as 10 million years, since some species are already quite intelligent). In fact, if the next species screws up and gets itself killed, I expect Earth will get another go at it in another 10–200 million years, over and over again.
On the other side of the equation, the sun will expand into a red giant and consume the Earth in about 5 billion years. That gives us a whole lot of tries to get it right.
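The “whole lot of tries” claim above is easy to sanity-check. Here is a quick back-of-envelope calculation using the rough figures quoted in this thread (5 billion years until the red giant phase, 10–200 million years per re-evolution of intelligence); these are the commenter’s ballpark numbers, not precise science:

```python
# Back-of-envelope: how many "retries" at civilization Earth gets
# before the sun becomes a red giant, using the thread's rough figures.

SUN_RED_GIANT_YEARS = 5e9    # years until the sun consumes Earth (thread's figure)
RETRY_PESSIMISTIC = 200e6    # slowest re-evolution of intelligence, years
RETRY_OPTIMISTIC = 10e6      # fastest, if smart species already exist, years

worst_case_tries = SUN_RED_GIANT_YEARS / RETRY_PESSIMISTIC
best_case_tries = SUN_RED_GIANT_YEARS / RETRY_OPTIMISTIC

print(f"Pessimistic: ~{worst_case_tries:.0f} tries")  # ~25 tries
print(f"Optimistic:  ~{best_case_tries:.0f} tries")   # ~500 tries
```

So even under the pessimistic assumption, the planet gets on the order of a couple dozen chances.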
I have, but I’m also concerned that humanity got “lucky” so far and that this won’t happen again. There are theories positing several blocking “gates” on the way to civilization, and humanity has already passed an exceptional number of them.
It’s reasonable to assert that’s a misleading, human-centric perspective; but I’d also point out that the Fermi Paradox supports it. Either:
1. The conditions that gave birth to our civilization are not exceptional, and widespread intelligent life is hiding from us (unlikely at this point, I think);
2. They are exceptional, and we just happened to have passed many unlikely hurdles so far (hence it is critical we don’t trip up at the end here); or
3. They are not that exceptional (e.g. more intelligent vertebrates will rise here, and would rise on other planets), but there is some gate we are not aware of yet (which I have heard called the Great Filter).
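The “many unlikely hurdles” idea compounds multiplicatively, which is what makes it so punishing. A toy illustration, with entirely made-up probabilities (the gate names follow the thread’s discussion; none of the numbers are claims about reality):

```python
# Toy model: if reaching civilization requires passing several
# independent low-probability "gates", the combined odds shrink
# multiplicatively. All probabilities here are made up for illustration.

gates = {
    "abiogenesis": 0.1,
    "eukaryotic cell": 0.01,
    "multicellular life": 0.1,
    "vertebrate-level intelligence": 0.1,
    "technological civilization": 0.1,
}

p = 1.0
for name, prob in gates.items():
    p *= prob

print(f"Combined probability per candidate planet: {p:.0e}")  # 1e-06
```

With these toy numbers a million candidate planets would yield roughly one civilization, so even modestly unlikely gates, stacked up, can make us look very alone.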
Another suspicious coincidence I’d point out is that we are, seemingly, the only advanced civilization to arise on Earth so far. If we died out soon, the next intelligent vertebrates to rise would find evidence of us, wouldn’t they? So if ‘vertebrates rising and then killing themselves off’ were a likely scenario, odds are we wouldn’t be the first, and we would have found precursors by now.
TL;DR: I suspect vertebrates -> our tech level is a difficult jump.
Well, that’s all true; we don’t actually know what the real filters are, whether we’re already past them or they’re still ahead of us. Certainly people have speculated about this for a long time, and I won’t pretend to have any more real answers than anyone else. But honestly, I’d have a hard time believing that the really rare event, the Great Filter, lies somewhere between the development of the brain and the development of the kind of intelligence humans have. It just seems like a relatively small jump (relative to all the other hurdles) between many of the smarter animals on Earth and human beings. For example, many species use tools, a whole lot actually. Only a few other species actually make tools or alter them to a large degree, but give it 10 million years and see if that changes. Likewise, many species have languages; some even give themselves names, so they can intentionally address other individuals in their social group.
If you don’t mind a bit of total speculation on my part: in my opinion, the explanation of the Fermi paradox is actually pretty simple; there really is no paradox. Intelligent life is probably relatively common in the universe, and the reason we don’t see aliens all over the place is that intelligent life thrives too well for that. Once a species is capable of traveling to other stars, it’s just a matter of time before it settles most of its galaxy, likely within a million years (which is very quick on evolutionary scales). We’re just the first intelligent life in this galaxy; we can assume this because if there were others, they’d already have colonies right here on Earth, because it’s a great planet.
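The “within a million years” figure checks out on the back of an envelope. A toy model: an expansion wave crossing the Milky Way’s disk (roughly 100,000 light-years across) at an assumed effective speed of 0.1c, slow enough to absorb stopovers for settling and building new ships; the speed is my assumption, not a figure from the comment:

```python
# Sanity check of the "settle the galaxy within a million years" claim.
# Toy model: a colonization wave crossing the Milky Way's disk.

GALAXY_DIAMETER_LY = 100_000   # Milky Way disk diameter, light-years (approximate)
WAVE_SPEED_C = 0.1             # assumed effective expansion speed, fraction of c
                               # (includes time spent settling and re-launching)

# At a fraction f of lightspeed, crossing D light-years takes D / f years.
years_to_cross = GALAXY_DIAMETER_LY / WAVE_SPEED_C

print(f"~{years_to_cross:,.0f} years to cross the galaxy")  # ~1,000,000 years
```

Even if the effective speed were ten times slower, the crossing would still take only ~10 million years, a blink on evolutionary timescales.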
To double back to the Great Filter, though: as for my best guess about which event might be truly rare, my money is on eukaryotic life and mitochondria. That feels like a real freak accident, as well as an absolutely vital requirement for complex life.
Yeah, I buy the filter (or at least a big filter) being early. That does seem like a freak accident, even with all that time for it.
But on the spread of civilization, this is why I love Orion’s Arm: it posits that if a civilization like ours makes it another few thousand years, it’ll expand in a bubble at a significant fraction of the speed of light and be extremely difficult to extinguish at that point, meaning civilization should have spread across galaxies by now:
https://www.orionsarm.com/eg-article/49333a6b7d29f
“Even with all the equipment available in the Civilized Galaxy and beyond the amount of the Universe which can be examined in detail is tiny. Imagine our own Galaxy as a deep sea fish, with very sharp but tiny eyes, peering at the other galaxies with trepidation.”
That makes a lot of sense to me.
And the fiction, even as wild as it is, gives the still somewhat unsolved Fermi Paradox a lot of thought:
https://www.orionsarm.com/eg-article/464d087672fe7
I particularly like the ‘Ginnungagap Theory’ that, perhaps, there’s some unknown barrier to expansion.
https://www.orionsarm.com/eg-article/464e942db2789