Adlai: so they'll hire xenotribal mercenaries
mircea_popescu: Adlai you don't understand. there's nothing more readily cohesive of a group than hatred for old people.
mircea_popescu: incidentally, my favourite discussion of the topic is balzac. eugenie grandet.
Adlai: well, old people who don't keep supporting themselves enough to stay alive without support from other tribes... will die of old age
mircea_popescu: part of the reason there's going to be blood is that there's too many old people.
undata: mircea_popescu: and the lamb laid down with the lion
mircea_popescu: but see... because of the "nonviolent principle" or w/e, it doesn't HAVE TO adapt anymore.
undata: mircea_popescu: seems the hive would lose the ability to adapt without some mental youth elixir being invented alongside
mircea_popescu: only way to select people now is age, which explains disasters like nancy pelosi
mircea_popescu: the west did too, for... well... "fairness" i guess, same bs.
mircea_popescu: asciilifeform it's the normally sclerotic reaction of society to a dissolution of values
nubbins`: somewhere out there, a guy named langton is eating chips on the couch and watching seinfeld. again.
nubbins` ponders turmite / langton's ant steady states as analogies for steady states in human behaviour
mircea_popescu: something the us is getting more and more acquainted with.
mircea_popescu: in fairness, the commies were very well familiar with the problem of just-wont-fucking-die-already dinosaurs
mircea_popescu: undata i recall reading this east-side of cold war story (rdg it was i think ?), about the drink that bestowed immortality, and how obnoxious old people were, 700+ yo clinging on to life to "see who wins whatever games"
undata: and I question the "most"
undata: most of the things in my head, somebody/something else put there
Adlai: :-o "millions long for immortality who don't know what to do with themselves on a rainy Sunday afternoon" by the guy who died on a Sunday afternoon
nubbins`: i suppose if you want to get cheeky it's (1) wiggle (2) if bumped, wiggle less
nubbins`: undata the complete model has maybe 2 rules tops
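[The "maybe 2 rules tops" model nubbins` is alluding to is Langton's ant: on a white cell turn right, on a black cell turn left, flipping the cell and stepping forward each time. A minimal Python sketch, with the grid kept as a sparse set of black cells (step count and coordinate convention are arbitrary choices here):]

```python
# Langton's ant, the "maybe 2 rules tops" model:
#   white cell -> turn right, flip cell to black, step forward
#   black cell -> turn left,  flip cell to white, step forward
# The grid is sparse: `black` holds the coordinates of black cells.
def langtons_ant(steps):
    black = set()
    x, y = 0, 0
    dx, dy = 0, -1            # facing "up" (y grows downward)
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = dy, -dx  # turn left
            black.discard((x, y))
        else:
            dx, dy = -dy, dx  # turn right
            black.add((x, y))
        x, y = x + dx, y + dy
    return black, (x, y)
```

[From an empty grid the ant behaves chaotically for roughly 10,000 steps, then locks into the "highway": a 104-step cycle that displaces it two cells diagonally per repetition, forever — a steady state that no part of the two-rule table mentions.]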
mircea_popescu: Adlai i have no idea, but i suspect your conviction is more informed by a faint whiff of a personal desire to survive/fear of death than anything else.
undata: What if certain things are just out of reach? Maybe the complete model is simple, but the approximations leading to it are larger than brains can handle?
Adlai: you don't think this is possible?
Adlai: the exact timescale doesn't matter, it'll happen
nubbins`: i was going to guess you used differential equations
Adlai: well my initial random was "certain within decades", but then I toned it down a few orders of fartitude
mircea_popescu: as opposed to moderately possible within a few thousand years ?
Adlai: they could with external support. maybe not today, but it's easily possible within a few hundred years.
nubbins`: asciilifeform i fell into a black hole halfway through my cs/math joint major on points like these
Adlai: would the same happen to an ant colony if you kidnapped the queen?
Adlai: which is why I linked to clinical examples, rather than speculation
nubbins`: but i feel like the upper bound is significantly higher than people realize
assbot: Reticular formation - Wikipedia, the free encyclopedia
nubbins`: i know people claim that there's an upper bound on the complexity that can form in a cellular-automaton-type system with simple rules
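[On that claimed upper bound: Rule 110, an elementary one-dimensional cellular automaton whose entire rule table fits in the single byte 110, was proved Turing-complete by Matthew Cook, so no complexity ceiling exists for simple-rule automata in general. A minimal sketch (ring topology and cell encoding are choices made here for illustration):]

```python
# Rule 110: each cell's next state is a lookup on its 3-cell
# neighbourhood; the whole rule table is the byte 110 (0b01101110).
# Cook proved this one-byte rule Turing-complete.
RULE = 110

def step(cells):
    """One synchronous update on a ring of 0/1 cells."""
    n = len(cells)
    return [
        (RULE >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]
```

[Iterating `step` from a single live cell grows the familiar leftward-expanding triangle of interacting gliders — the structures Cook's universality construction is built from.]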
mircea_popescu: <asciilifeform> or with an entirely non-interactive and ultimately easily described physical system that you simply don't grasp yet. << incidentally, the brain seems to fit this quite exactly.
mircea_popescu: perhaps except for the queen, but that's an endless discussion.
assbot: Logged on 14-11-2014 21:44:39; mircea_popescu: a good example would be, a spontaneous determination to build itself a house,
nubbins`: one of those "my teacher is an alien" type ones
nubbins`: asciilifeform unfortunately the sea won't be seen as alive until after it's dead D:
undata: nubbins`: Greg Egan did one with intelligent turbulence
mircea_popescu: <mircea_popescu> any computer program of which identifiable components can be unambiguously named is not capable of displaying AI. << this obviously requires you to be allowed under the hood.
nubbins` recalls the (clarke?) short story about an intelligent electrical field
undata: when two of those ants hit each other, do they combine their "houses"?
mircea_popescu: when intelligences meet without that basis, superamazement ensues.
mircea_popescu: so we eliza-recognise what WE do on the grounds of culture and convention
nubbins`: recognize is an overly broad term for this discussion
mircea_popescu: this is counterintuitive, because we're very ethnically close, so to speak,
undata: seems a lot that passes for human intelligence is driven by autonomous processes evolution carved into us, just like the ant
mircea_popescu: well, this definition is, "when you recognise what's being done, but neither why nor how, you're confronted with intelligence"
mircea_popescu: say if you showed this process to 1k randomly selected 5 yos, would they say "it built itself a house" ?
mircea_popescu: so your argument is that ant meets my definition of ai ?
nubbins`: <+mircea_popescu> if the machine ends up housed within a house of its own making at the end of a process which was not either understood or its endpoint predictable by observers, well... it's intelligent. <<< my sides!!
mircea_popescu: see, that "intend to go" is exactly why the preoccupation with non identifiability.
mircea_popescu: if the machine ends up housed within a house of its own making at the end of a process which was not either understood or its endpoint predictable by observers, well... it's intelligent.
mircea_popescu: while no identifiable part of the code deals with housebuilding.
mircea_popescu: a good example would be, a spontaneous determination to build itself a house,
mircea_popescu: well, the only way to build one may be to first build a planet, then let cnc evolve.