mircea_popescu: rationality is, basically, the ability to distinguish homomorphisms from homologies.
mircea_popescu: rationality is a purely linguistic property, it has exactly nothing to do with action.
mircea_popescu: "What is rationality? It's the ability to make life plans which reach your goals."
mircea_popescu: can't figure out how to thank that physics dude that read what seems like > 500 pages of shit.
mircea_popescu: i've known many of these, they look like radiation. sooner or later start moving toward wall.
mircea_popescu: all sorts of fun could be had here. what if i get a simulation of me to impregnate the actual girl, and then i impregnate the simulation of her.
mircea_popescu: i suppose i got to ask lesswrong, they seem to know all sortsa shit about how shit that doesn't exist works.
mircea_popescu: asciilifeform what happens if i make two simulations of the same one slavegirl and order them both to fight to the death ?
mircea_popescu: i don't get the argument tho, why is superai of the future influenced by ruminations of humans today
mircea_popescu: poli_ so do they keep attacking wikipedia to take it off ?
mircea_popescu: poli_ hit the search button, select politics and all bets, and see what the cat drags.
mircea_popescu: i don't get it. why would anyone care about some simulation ?!
mircea_popescu: "Roko's basilisk is a thought experiment that assumes that an otherwise benevolent future artificial intelligence (AI) would torture the simulated selves of the people who did not help bring about the AI's existence. [...] The concept was proposed in 2010 by contributor Roko in a discussion on LessWrong. Yudkowsky deleted the posts regarding it and banned further discussion of Roko's basilisk on LessWrong after it had apparently caused several contributors who took it seriously considerable anguish."
mircea_popescu: decimation some other guy got a job as area supervisor of kfc, you know ? nothing wrong with it.
mircea_popescu: seriously eliezer baby : first off we know you're stupid, and second off it really doesn't make much difference. okay ?
mircea_popescu: way to let one's own wounded narcissism get hold of the whole cart.
mircea_popescu: asciilifeform why exactly do we suspect very intelligent machines would reproduce ?
mircea_popescu: wikipedia goes through an entire bio article without resolving her relationship to the max more dude.
mircea_popescu: apparently the future is with bad lipstick and weirdo muslim headwear.
mircea_popescu: i love these scientists that don't even bother putting magic numbers in the code anymore.
mircea_popescu: The concept of the technological singularity, or the ultra-rapid advent of superhuman intelligence, was first proposed by the British cryptologist I. J. Good in 1965: "Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever."
mircea_popescu: In the mid-1970s F.M. Esfandiary legally[5] changed his name to FM-2030 for two main reasons. Firstly, to reflect the hope and belief that he would live to celebrate his 100th birthday in 2030
mircea_popescu: "Extropy also attempts to describe the nature of the order, so a crystal, although highly ordered, would have low extropy, but an organism or machine would have high extropy because of the informational structure contained within it."
mircea_popescu: god im sick of fucktarded usians giving their name to random items.
mircea_popescu: "Extropy is a proposed opposing concept to entropy. As entropy decreases, signifying more order, so extropy would increase in the manner of negentropy."
mircea_popescu: Founder of the Extropy Institute, Max More has written many articles espousing the philosophy of transhumanism and the transhumanist philosophy of extropy,[6] most importantly his Principles of Extropy (currently version 3.11)
mircea_popescu: "Max More (born Max T. O'Connor, January 1964) is a philosopher and futurist who writes, speaks, and consults on advanced decision-making about emerging technologies."
mircea_popescu: "Influenced by seminal works of science fiction, the transhumanist vision of a transformed future humanity has attracted many supporters and detractors from a wide range of perspectives." aaahahahaha
mircea_popescu: basically this is scientology bahai ? do they speak esperanto ?
mircea_popescu: "This hypothesis would lay the intellectual groundwork for the British philosopher Max More to begin articulating the principles of transhumanism as a futurist philosophy in 1990 and organizing in California an intelligentsia that has since grown into the worldwide transhumanist movement."
mircea_popescu: maybe humanity could go to transyale and get a transphd ?
mircea_popescu: check it out lmao, an international cultural and intellectual "movement" that aspires to one day... label something!
mircea_popescu: "The most common thesis put forward is that human beings may eventually be able to transform themselves into beings with such greatly expanded abilities as to merit the label posthuman."
mircea_popescu: asciilifeform you don't understand how the world works. "nobody's using bitcoin, nobody could have foreseen their using a plane as a rocket and nobody's doing anything better nor anything better could exist than ... you know, raising awareness"