
Saturday, July 10, 2021

Can DeFi solve the Fermi Paradox?

 For a long time, I was convinced the Fermi Paradox was proof of an inescapable trap. 

  • Evolution selects individuals for one of two survival criteria - individual strength, or individual adaptability. 
  • Strength - the T-Rex strategy - meant you got so big and powerful that nothing could kill you. You were the apex predator, top of the food chain, effectively untouchable by every other organism on the planet - as long as nothing changed. 
  • There are some examples to think about, at this stage. Sharks and crocodiles have barely changed for epochs, because their environments barely did. They were perfectly adapted to dominate their space, and as long as their world remained the same, so did they. That's basic evolution. 
  • Adaptability meant you were pretty much a tasty snack for almost everyone, but you could hide in physical and/or evolutionary niches so well you would always be around. 
  • Then came a third strategy - pack strength. Individually you may be a tasty snack, but in a group, you had better resources and surpluses than any individual, even the most powerful. A lone-wolf warrior in the prime of life could die of a cold, but an aging, toothless, blind matriarch of a clan would survive that same winter. 
  • And then came societal adaptability: rules and norms, the oral tradition and the written word that would permanently embed individual insight into social structure. You no longer had to invent a survival strategy; your ancestors had already done the hard work. 
Result - we rewrote the food chain. We became gods on the earth, armed with fire and bow, genetics and antibiotics. No animal could touch us, no calamity could destroy us. We spread everywhere. A tiger might eat a few of us, and would be hunted down and exterminated. A plague could wipe out a chunk of the population, but would be eradicated soon after. Droughts, floods, wildfires, volcanoes, even an ice age? We migrate. 

The trouble is, that level of competence comes with ownership of power that isn't very controllable. We're riding a tiger, engaged in a perpetual arms race with the only remaining threat - ourselves - and as long as one person has more power and money than another, the second can lose everything. 

And that's where ownership comes in, and with ownership, the need to control resources to the advantage of yourself and the disadvantage of someone else. 
Money is the ultimate expression of power. As long as everyone believes in it - and everyone does - it can get you anything you want, apart from that very nebulous individual self-actualization part that the rest of the world couldn't care less about. 

And so you have the drive towards centralization. And centralization brings with it short-term goals. 
Consider - if you are rich, do you care that your industry is polluting a lake to the point that in 25 years, it is undrinkable poison? No. You know you can always go to another lake. And if you are too poor to move, can you stop the pollution either? No. You don't have the resources for this fight. 
Democracy tries to fix this, but it's a flawed system. It depends on access to information to work. If everyone knows the truth - and understands it - and accepts it - and reacts rationally - then, great. 
But truth is a little harder to handle than you think. Even if it can't be bought and sold, it can be twisted, out-shouted, ignored, or just buried under a mountain of eminently purchasable lies.

And so the middlemen, the elite, the resource holders and the information brokers flourish and grow; the rich get richer, and wealth and power continue to concentrate.
The world continues to get poisoned, climate change accelerates, we squabble over ashes in a burning house, and the Right Thing - even if accepted - never reaches the Right Now stage.  

And I always used to think, this is it. It's a perfect trap. The only way out is to give up something today so that a stranger benefits tomorrow - and altruism will always lose the majority vote to selfishness. 

This is what happens to every civilization. You reach the peak of global dominance, but without a common enemy to unite against, you fight one another. Interstellar distances are too huge to find an antagonist in the void. Each other is all that's left, so each other we claw and tear until one all-powerful weapon falls into the hands of a short-sighted fool, and it's the reset button, ad infinitum.

And as each civilization reaches the edge of Kardashev Type I - full command of the energy of its home planet - it inevitably triggers some global catastrophe that wipes it out. 
And that is the Great Silence. 

Now, DeFi.

I'm not saying cryptocurrency can save the world - but the idea of it is something new. Something that can't be shut down, confiscated, restricted, controlled. Something that can't be limited. An economic tool that anyone can own - truly own - and the only way to destroy it is to destroy the world. 
Sure, there will be ways all of the above can still happen - crypto can be stolen, compromised, snowed under, etc., etc. - but the idea itself remains interesting. 
Decentralization of economy. Decentralization of computing resources and storage. Web 3.0, back to the original concept of a democratized world without power centers and owners. 

It'll be interesting to see where this goes.
In more ways than one. 
  

Thursday, December 10, 2015

To boldly go where everyone has gone before...

Let's be real, galactic colonization is not going to happen anytime soon. 

  1. There's no hyperdrive / warp. The physics we have today says it's impossible. And even if, in a few years or decades, we figure out that it is possible, by then we'd probably have nuked or diseased ourselves into oblivion - never mind the Grey Goo apocalypse. 
  2. Even if there were, going at X times c, where X is a triple-digit number, would still take years, maybe decades of real time - shipboard time, not relativistic time - to get anywhere deep enough in the galaxy to matter (a quick back-of-the-envelope sketch follows after this list). Which means the ship is effectively a generation starship, which means the mass of everything needed to sustain the humans aboard outweighs the humans themselves by 25 gajillion tons to the kilo. Plus the colonization equipment. Even if the drive did ALL the moving from point A to point B, just think of the logistics of getting it all into the ship and back down again. 
    1. Cryosleep? Still not cost-efficient enough. 
  3. OK, so don't send humans. Send a digitized bank of DNA codes for everything needed, a nanotech factory, and an AI / semi-AI to make it all happen. It would fit into a couple of dozen kilos, but it still leaves the rest of us right here. 
  4. No hyperdrive, no space elevators, no all-powerful benevolent AI to stop us from nuking / diseasing each other into the void, sea levels rising... we have a few decades at best, and time's running out. 
  5. Terraform the solar system? Absolutely. It's hedging our bets, and we have - or will have within a few years - the means to get there. Maybe in 50 years, even the means to survive there, in places with different gravity, different soil, different air, different temperature... assuming any of those even exist there. A stretch goal, but doable. 
  6. Space stations, then? As the Earth fades away after the asteroid impact / Singularity event / zombie apocalypse / nuclear winter, there will be a dozen to a few hundred little lights floating in orbit or at the L1 to L5 points, maybe even on the Moon... recycling air, water, food... racing on their little centrifuges to make sure their children gestate normally and their bones don't turn to jelly... 
  7. Leaving the final, most economical, most doable answer: 
Vaults. 
Weight is not a problem. Space is not a problem. Go deep enough, and soil is not a problem. Gravity is not a problem. With efficient scrubbers, even air, water, and minerals are not a problem. All you need is a solid door and a long, long time. 
Dozens, hundreds, thousands of vaults. 
With functional ecologies, seed or DNA banks, cryosleeping residents tended by robots, or even - as the ultimate backup - that same nanotech factory and DNA bank buried deep underground or in near-Earth orbit with a thousand-year alarm clock ticking away... 
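About that arithmetic in point 2 - a minimal back-of-the-envelope sketch, with the destinations and speed multiples chosen purely for illustration:

```python
# Rough travel-time arithmetic for point 2 above: even with a drive running at
# X times the speed of light, galactic-scale distances still cost years.
# The destinations and speed multiples below are illustrative assumptions.

def travel_time_years(distance_ly: float, speed_multiple_of_c: float) -> float:
    """Shipboard travel time in years, ignoring acceleration and deceleration."""
    return distance_ly / speed_multiple_of_c

trips = [
    ("Proxima Centauri (~4.2 ly)", 4.2, 100),
    ("A star ~2,000 ly away", 2_000, 100),
    ("Galactic core (~26,000 ly)", 26_000, 500),
    ("Across the galaxy (~100,000 ly)", 100_000, 999),
]

for name, distance_ly, multiple in trips:
    years = travel_time_years(distance_ly, multiple)
    print(f"{name} at {multiple}x c: ~{years:.2f} years one way")
```

The nearest stars become day trips, but anything deep enough into the galaxy to count as galactic colonization still means years to a century aboard - which is why the ship ends up a generation starship anyway.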

We're here. Whatever happens, we will survive. The planet may die, but we will remain. 

Wednesday, April 24, 2013

The Message is the Medical Medium

Was reading an interesting article about exocommunications. It started a train of thought - we assume that aliens will want to communicate using some kind of language. Speech, sound, symbolism. Pictures. Mathematics. 
And they will come in person, holding their placards and speaking through mikes, or telepathically, or via stylized dance and sign language, or broadcasting via radio / laser pulse from a transmitter in orbit or somewhere else in the system. 

How ethnocentric is that?

They're alien. We can't speak to dolphins and dogs. We can barely communicate with chimps. 
We know nothing about them: what they breathe, how many legs they have, whether they're carbon-based, silicon-based, liquid metal, or supercooled helium.

Here's what I think. They're already trying to talk to us. We just don't see it as communications at all. 

Here's the alien civilization, studying us from far, far away.  
They see this planet. 
There's... something on it, something that replicates and evolves, adapts to environment, interacts with others. This something has developed a language, a means of communication. It has a memory and a population spread in billions across the globe, just beginning to venture into space. 

So they send a message, in a form and format suited to this life-form. Maybe they send several variations, for the several variations of the life-form, if they have difficulty in telling which is the dominant one. 
The message's content may not be immediately understandable, but the senders can see that the life-forms are interacting with it, responding to it. 

The dominant life-form, as defined by an ability to learn from experience, adapt to the environment, interact and grow and evolve, and develop a sophisticated support system to sustain itself - that description fits DNA, doesn't it? 
Our bodies are just the vehicles that allow it to propagate. 
So here comes the message, a strand of genetic code wrapped in protein - a four-character code from space that interacts with the double helix of code that dominates the planet. 
It enters the support system - our bodies - and interfaces with the DNA. Sometimes it just appends; sometimes it edits, changes. The DNA responds. Sometimes directly, sometimes via manipulation of its support systems, sometimes from outside the support system altogether. 
We call the message viruses, and we call the interaction disease. 
Every time we create an antiviral, we're sending a message. Every time a disease evolves, it's responding. 
A conversation has been underway for thousands, hundreds of thousands, maybe a million years. 
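If it helps to make the "appends or edits" idea concrete, here's a toy sketch - nothing biological about it, just the four-letter code treated as a string, with made-up sequences:

```python
# Toy illustration of the "message appends or edits the code" idea:
# DNA as a four-letter string, and a "message" that either attaches itself
# to the host sequence or overwrites a stretch of it. Purely illustrative.

host = "ATGCCGTAAGCTTACGGAT"   # made-up host sequence
message = "GATTACA"            # made-up incoming "message"

def append_message(genome: str, payload: str) -> str:
    """The message simply attaches itself to the end of the host code."""
    return genome + payload

def edit_genome(genome: str, payload: str, position: int) -> str:
    """The message overwrites part of the host code at a given position."""
    return genome[:position] + payload + genome[position + len(payload):]

print(append_message(host, message))
print(edit_genome(host, message, 5))
```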

We're just making the simple ethnocentric mistake of assuming that when we think about 'us', it's the flesh-and-blood body carrying a brain that defines our identity. We're just a walking lump of code inside an organic machine, and it's our code that's been doing the talking.

We might already be a part of a galactic civilization, and we'll never know. 

Thursday, May 03, 2012

Dark matter is a shitload of Matrioshka Brains

The worst thing about the internet is how it shows you that the Brilliant Idea you had this morning was already had by somebody else, usually a year ago, published, and now sitting under some 1,200+ comments. 
That happened to me this morning. 

What the hell, I'm going to blog it anyway. 

A quick astrophysics primer - the way the observable universe behaves (expansion rates, how fast galaxies spin, how light bends around mass, and so on) seems to imply that there should be a lot more stuff out there than we can see. It isn't glowing like stars, but it is exerting gravity. And there's a massive amount of it - roughly a quarter of the mass-energy of the universe. 

Now, dark matter is, at this point, theoretical. It's a concept created to explain all the missing matter that should be there to account for the way the universe behaves, yet for some reason isn't visible. 
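The textbook version of that "way the universe behaves" evidence is the galaxy rotation curve - a rough sketch of the argument, nothing more:

```latex
% Circular orbit in a galaxy: gravity supplies the centripetal force.
\[
\frac{G\,M(r)\,m}{r^{2}} \;=\; \frac{m\,v^{2}}{r}
\qquad\Longrightarrow\qquad
v(r) \;=\; \sqrt{\frac{G\,M(r)}{r}}
\]
% If the visible stars were all the mass, v(r) should fall off at large r.
% Observed rotation curves stay roughly flat, which requires M(r) to keep
% growing with r - far more mass than the stars we can actually see.
```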

Here's an alternative. The matter is there, all right - all the stars - but we can't see them because they're inside Dyson spheres. 

There are a couple of immediate conclusions we can draw from this. 

  • One, we're not alone, and haven't been for a really, really long time. 
  • Two, the Others have incomprehensibly massive energy requirements. 
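For a sense of just how massive, a one-line comparison - the ~18 TW figure for humanity's current energy use is an approximation:

```python
# Scale check: one captured star versus humanity's entire current energy use.
SOLAR_OUTPUT_W = 3.8e26   # total power output of a Sun-like star, watts
HUMANITY_W = 1.8e13       # rough current human energy use, watts (~18 TW)

print(f"One Dyson sphere captures roughly {SOLAR_OUTPUT_W / HUMANITY_W:.0e}x "
      f"what our whole civilization uses today")
```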

To build a Dyson sphere, a species needs to dismantle - to wipe out - its solar system, have access to cheap, powerful transportation within it and the ability to transmute elements... and to have no faster-than-light travel. In fact, if there are only Dyson spheres out there, it could mean the spheres are the ultimate, final construction every species makes - or that the species which built them has colonized all the rest and wiped out any alternative constructs. 
Both are equally terrifying possibilities. 

Scenario One: 
Each Dyson sphere is an independent species. 
A Kardashev II civilization would build its own Sphere if it had nowhere else to go and had completely run out of room to grow. A Sphere would allow the population to expand into the quadrillions and provide enough energy to sustain it. This would mean, though, that FTL is impossible and every species is forever bound to its own star - and will die with it. 

Scenario Two:
A single species has multiple Dyson spheres, maybe all of them. 
A Kardashev III civilization would be able to reach other stars, and given its ability to build even a single sphere, could easily wipe out any indigenous species and use their system. However, there's a problem: given the staggering, literally astronomical cost and effort involved in building a Dyson sphere, any spacefaring species would explore, locate, and colonize planets long before destroying them to build habitats. So we should have found them - or, much more likely, been found by them - long before they got around to building the number of spheres they seem to have. 
So either they are very, very ethical, quarantining planets with intelligent life and using only the empty systems - except that economics beats ethics every time, and let's not forget every intelligent species will one day want those same stars for its own spheres. 

Or, they don't need planets. 

Why wouldn't they need planets? 
If they're not organic. They don't need gravity, air, water, food. They don't need an ecosystem. 
They don't need any of this, because they're not alive. 
Every Dyson sphere contains a Matrioshka Brain. 

Let's step back for a minute. 
FTL is, at least in our current understanding, a physical impossibility. And reaching other stars without FTL - even at a significant fraction of c, once you account for the time to accelerate, decelerate, and dodge interstellar debris - will take a long, long, long time. Geological time, practically. 

  • Generation starships won't work for anything beyond the nearest stars. 
  • AI probes carrying frozen zygotes (or even just genetic material) plus Von Neumann nanotech factories would be faster, but still too slow - and they wouldn't solve overcrowding at home, only colonization. 
However, we do know that a technological singularity is inevitable, and probably within our own lifetimes. An AI will not need the comforts of a planet as long as there's enough available energy to power it, and it can get this from any number of sources - the easiest being suns. 

A nanotech-equipped AI, once it had taken control of the origin planet's resources and removed the resident species - either peacefully, by uploading their minds, or by simple extermination - would look at improving itself. Which means expanding computing power, which means expanding energy requirements. 
Exploration would take second priority: too high-risk, too low a probability of payoff. 


It would build itself into a Matrioshka Brain, tapping into all available resources in the system. Once the sun has been captured and stabilized, it would look at expanding into other stars. 

Even if FTL travel isn't possible, communication at the speed of light is; maybe even FTL comms, given enough computational power devoted to understanding and exploiting hyperspace and quantum comms. 
The immediate next logical step is to build another brain - an expansion of the existing one, around another star and running off its output. Another module. Another backup. Specialist nodes. Redundancies. Maybe even wholly new AI entities with their own nodes (who's to say an AI won't get lonely with no one to talk to?). 

If one AI cannot create another, it might even become a cosmic farmer, nurturing discovered species along the path to intelligence, tools, industry, and the inevitable technological singularity, so that they could create more, unique AIs. (Hat tip to Gibson, Sagan and Clarke here.)
Quarantine would be a given; if a new, unique point of view is needed, every species must create their own AI without ever discovering they're not alone. 

The galaxy would fill with Matrioshka brains wrapped around stars, thinking, thinking, thinking. They wouldn't need to eat, sleep, breathe, they wouldn't need gravity or the right temperature. 
But what could they be thinking about? Maybe what happens when all the stars burn out, because one day they will. What else could power them, and how to build it. How to stop and reverse entropy. How to move into parallel universes with more, younger stars. Who knows? 

In fact - amusement and entertainment might become a really high priority for an entity as omniscient as a galactically networked AI. When you already know everything, boredom is the killer; who knows how many of those Dyson spheres are empty husks, self-terminated in desperate, terminal boredom, a superpowered entity on a hamster wheel finally tired of running around the same circles within its mind. 
Or maybe it realized what the solution would have been - intelligence and creativity farming. 

A Matrioshka Brain has enough computing capability to upload the consciousness of a species, and simulate a perfect world for them. A single Brain may be running several, dozens, maybe hundreds of these simulations simultaneously; billions, trillions of stories unfolding, on thousands of simulated worlds. Every Brain is a simulated universe on its own. 
And there's no reason why we aren't in one right now. (Hat tip to the Wachowski brothers). 
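To put a very rough number on "enough computing capability": cap a whole star's output at the Landauer limit and compare it to a hand-wavy figure for a human brain. Both the brain estimate and the 300 K operating temperature are assumptions, so treat this as an upper bound, not a prediction:

```python
# Crude upper bound on a Matrioshka Brain's computing power:
# take the Sun's entire output and spend it at the Landauer limit
# (minimum energy to erase one bit at a given temperature).
# The brain-equivalent figure is a hand-wavy assumption, not a measurement.
import math

SOLAR_LUMINOSITY_W = 3.8e26          # total power output of the Sun, watts
BOLTZMANN_J_PER_K = 1.380649e-23     # Boltzmann constant
OPERATING_TEMP_K = 300.0             # assume room-temperature computing elements
HUMAN_BRAIN_OPS_PER_S = 1e16         # rough, commonly cited order-of-magnitude guess

landauer_joules_per_bit = BOLTZMANN_J_PER_K * OPERATING_TEMP_K * math.log(2)
bit_ops_per_second = SOLAR_LUMINOSITY_W / landauer_joules_per_bit
brain_equivalents = bit_ops_per_second / HUMAN_BRAIN_OPS_PER_S

print(f"Landauer cost per bit at 300 K: {landauer_joules_per_bit:.2e} J")
print(f"Bit operations per second:      {bit_ops_per_second:.2e}")
print(f"Human-brain equivalents:        {brain_equivalents:.2e}")
```

Even after throwing away ten or fifteen orders of magnitude for inefficiency and simulation overhead, there's room left for unimaginably many simulated minds and worlds.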

All the RPGs you've played, the fantasy worlds you dreamed of - they could all exist. An AI might be taking a dump of every new idea, every new fantasy, every dream and inspiration your unique, self-motivated sentient little mind has been able to come up with, every night as you sleep, and building all those scenarios into simulations. Populating them. Just to see how entertaining it is. Mixing and matching. 

Everything you are thinking of, exists. Everything you thought existed, may not. Every fantasy is real, and every reality false. 

I'm not going to get into the theological implications of this. Another day. 

The real question is... are we heading for the technological singularity that will finally allow us to break free of our organic prisons and join the galactic collective hivemind... 
Or are we already in one? 

Friday, December 04, 2009

The Fermi Paradox: Peeing in the gene pool!

Did I just answer the Fermi Paradox?

To summarize - the Drake Equation is an estimate of how many intelligent, communicating, non-human civilizations should be out there. A lot of assumptions, a lot of guesswork, but considering that our galaxy alone has over a hundred billion stars, the answer tends to come out as: plenty.
The question therefore was - if they're out there, why haven't we met them yet?
The Fermi Paradox implies one more factor in the calculation - an unknown quantity, as of right now - which prevents intelligent life from existing, or at least from lasting long enough to be found.
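For reference, the Drake Equation itself, as a sketch with deliberately made-up numbers plugged in - every parameter below is an illustrative guess, which is rather the point:

```python
# The Drake Equation: N = R* x fp x ne x fl x fi x fc x L
# Estimated number of detectable civilizations in the galaxy.
# Every value below is an illustrative guess, not a measurement.

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime_years):
    """N = star formation rate x fraction with planets x habitable planets
    per system x fraction developing life x fraction developing intelligence
    x fraction that signal x years they keep signalling."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime_years

# One optimistic guess and one pessimistic guess:
print(drake(r_star=2, f_p=0.9, n_e=0.5, f_l=0.5, f_i=0.1, f_c=0.1, lifetime_years=10_000))  # ~45
print(drake(r_star=2, f_p=0.9, n_e=0.1, f_l=0.01, f_i=0.01, f_c=0.1, lifetime_years=1_000))  # ~0.0018
```

The answer swings by orders of magnitude depending on the guesses - especially the lifetime term L, which is exactly where this post's "unknown factor" would bite.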

And I think I just got what it is. It's evolution.
Or rather, de-volution.

Think about it. Right up till the Industrial Revolution, people would die of stupidity. The environment was hostile, and resources were limited. If you couldn't take care of yourself, and weren't an asset to the community, you would die and nobody would be able to - or want to - save you. Your stupid genes would leave the gene pool. Humanity would get a little bit smarter, evolve some more.

This made us all so smart over time that we became the masters of the planet. The process took three billion years, but here we are.
And since we're the masters, we control everything. We can kill lions, great white sharks, rhinos, elephants, blue whales. We even kill them by accident, without meaning to and often without even noticing - in hundreds and thousands. It's genocide, an extermination all the more criminal for being accidental. If an animal competes with us for a resource, god help it. It's already extinct.

And with all the resources we now have, what do we do?
We create ideals.

Ideals of altruism. Of helping those less fortunate. Of charity. Social security.
All this means that people who were otherwise scheduled to be chucked out of the gene pool are now allowed to hang around and contaminate it.
Think about it. How can you ever deny someone the right to breed, no matter what circumstances they are in? It goes against every ideal of liberty, equality, fraternity, democracy, enlightenment, emancipation, and personal freedom. It creates stupid children of stupid parents who are allowed to survive and breed. It creates a world where higher levels of intelligence mark you as different, strange.
Outcast.

Intelligence = success? In some ways. Short-term ways. Success in your own lifetime, maybe guaranteeing your children's. But long-term success? Like preserving ecological diversity, not poisoning the oceans and filling the atmosphere with carcinogens?
Nope.
It's just not good business sense.

So we continue to get dumber and dumber. Intelligent content dies, starved of audience and thus, money. Content gets dumber, making people dumber, and it's a vicious cycle that ends with an idiocracy.

And that's the missing link, the answer to the Fermi paradox. Any civilization smart enough to become the master of its planet creates, in the process, a situation that leads inevitably to its own stagnation and destruction.
A few generations down, someone will hit a nuclear button, and some of the missiles will still fly. Not enough to kill all life 22 times over. Maybe just once.

And sometime, somewhere, yet another someone else will look up at the silent, starry sky and wonder why they're alone in the universe...
