CONCEPTS
JOURNALS
EXPLORE
Academic Commons
Chronicle of Higher Education
SFC NYC 2011
Larry Lessig, Harvard Berkman Klein
Jonathon Richter, Immersive Learning Research Network
Doug Blandy, UO Folklore
Mark Johnson, UO Philosophy
Antonio Lopez, John Cabot Univ.
Victoria Vesna, UCLA ArtSci
Berkeley DMAX/BAMPFA
Berkman Center, danah boyd
Berkman Center Harvard Law
MediaBerkman Harvard Law
Bioneers Collective Heritage Institute
Cardozo Law, Susan Crawford
Complexity Digest
Cooperation Commons *
Digital Humanities UCLA
• welcome
Harvard Free Culture Computer Society
Santa Fe Institute
Intl. Society for Systems Sciences
New England Complex Systems Institute
Institute for Ethics and Emerging Tech
Kairos: Rhetoric, Tech, Pedagogy
MediaTropes
MIT CMS New Media Literacies
• NML Blog
MIT Center for Civic Media
Music Cognition Matters
New Media Consortium
Pressthink, New York University
On The Commons
Open Source Lab, Oregon State Univ.
Our (and Your) RISD
Regenerative & Permaculture Institutes
Creative Commons
Stanford Archaeolog
Stanford Encyclopedia of Philosophy
Stanford Humanities Lab
Stanford Metamedia
Stanford MetaverseU *
Stanford Open Source Lab
Stanford Philosophy Talk
Uplift Academy, Tom Munnecke
Tuesday, February 10, 2004  12:16 PM
Physicist and computer scientist Stephen Wolfram has made his seminal work, A New Kind of Science, available for free online. What is the Principle of Computational Equivalence? Almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication. More specifically, the principle of computational equivalence says that systems found in the natural world can perform computations up to a maximal ("universal") level of computational power, and that most systems do in fact attain this maximal level of computational power.
Consequently, most systems are computationally equivalent. For example, the workings of the human brain or the evolution of weather systems can, in principle, compute the same things as a computer. Computation is therefore simply a question of translating inputs and outputs from one system to another.
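The "simple programs" behind these claims are typically elementary cellular automata: a row of cells, each 0 or 1, updated in lockstep from a table of eight neighborhood rules. The sketch below is mine, not from the book; Rule 110 is a natural choice to evolve because Matthew Cook proved it capable of universal computation, which is the strongest concrete support for the equivalence principle.

```python
# Elementary cellular automaton: each cell's next state depends only on
# itself and its two neighbors, so a "rule" is just a table of 8 entries
# encoded in the bits of a number from 0 to 255.
def step(cells, rule):
    """Advance one generation of an elementary CA (wrap-around edges)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=31, steps=15):
    """Evolve a single black cell and render the history as text rows."""
    cells = [0] * width
    cells[width // 2] = 1          # single "on" cell in the middle
    rows = []
    for _ in range(steps):
        rows.append("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)
    return rows

if __name__ == "__main__":
    # Rule 110: proven capable of universal computation (Cook, 2004).
    for row in run(110):
        print(row)
```

Despite the eight-entry rule table fitting in a single byte, the resulting pattern is intricate; that gap between program size and behavioral richness is the book's recurring observation.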
What is Computational Irreducibility? While many computations admit shortcuts that allow them to be performed more rapidly, others cannot be sped up. Computations that cannot be sped up by means of any shortcut are called computationally irreducible. The principle of computational irreducibility says that the only way to determine the answer to a computationally irreducible question is to perform, or simulate, the computation. Some irreducible computations can be sped up by performing them on faster hardware, as the principle refers only to computation time. Israeli and Goldenfeld (2003) have shown that computationally irreducible physical processes can be predictable, and even computationally reducible, at a coarse-grained level of description. In particular, coarse-grained cellular automata can emulate the large-scale behavior of the original systems without accounting for small-scale details. Furthermore, at least one of these automata is irreducible and known to be a universal Turing machine.

Here's immediate access to the complete book. Starting from a collection of simple computer experiments, Wolfram shows how their unexpected results force a whole new way of looking at the operation of our universe, including the origin of the second law of thermodynamics, the development of complexity in biology, the computational limitations of mathematics, the possibility of a truly fundamental theory of physics, and the interplay between free will and determinism.

Biography of Stephen Wolfram: A well-known scientist and the creator of Mathematica, he is widely regarded as one of the world's most original scientists, as well as an important innovator in computing and software technology. Born in London in 1959, Wolfram was educated at Eton, Oxford, and Caltech. He published his first scientific paper at the age of 15, and had received his Ph.D. in theoretical physics from Caltech by the age of 20.
Wolfram's early scientific work was mainly in high-energy physics, quantum field theory, and cosmology, and included several now-classic results. Having started to use computers in 1973, Wolfram rapidly became a leader in the emerging field of scientific computing, and in 1979 he began the construction of SMP, the first modern computer algebra system, which he released commercially in 1981. In recognition of his early work in physics and computing, Wolfram became in 1981 the youngest recipient of a MacArthur Prize Fellowship. Late in 1981 Wolfram set out on an ambitious new direction in science aimed at understanding the origins of complexity in nature. Wolfram's first key idea was to use computer experiments to study the behavior of simple computer programs known as cellular automata, and starting in 1982 this allowed him to make a series of startling discoveries about the origins of complexity. The papers Wolfram published quickly had a major impact, and laid the groundwork for the emerging field that Wolfram called "complex systems research." Through the mid-1980s, Wolfram continued his work on complexity, discovering a number of fundamental connections between computation and nature, and inventing such concepts as computational irreducibility. Wolfram's work led to a wide range of applications, and provided the main scientific foundations for such initiatives as complexity theory and artificial life. Wolfram himself used his ideas to develop a new randomness generation system and a new approach to computational fluid dynamics, both of which are now in widespread use. Following his scientific work on complex systems research, in 1986 Wolfram founded the first research center and the first journal in the field. Then, after a highly successful career in academia, first at Caltech, then at the Institute for Advanced Study in Princeton, and finally as Professor of Physics, Mathematics, and Computer Science at the University of Illinois, Wolfram launched Wolfram Research, Inc.
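The randomness generation system mentioned above is based on Wolfram's Rule 30: the bits down the center column of this cellular automaton look statistically random, and no shortcut is known for obtaining the n-th bit other than running all n steps, which is exactly the computational irreducibility described earlier. A minimal, self-contained sketch (the function name and the set-based grid representation are mine):

```python
# Rule 30 on an unbounded grid, tracking only the "on" cells as a set.
# Its update rule can be written compactly as: new = left XOR (center OR right).
def rule30_center(n_steps):
    """Return the first n_steps center-column bits, grown from a single cell."""
    cells = {0}                      # positions of "on" cells
    bits = []
    for _ in range(n_steps):
        bits.append(1 if 0 in cells else 0)
        lo, hi = min(cells) - 1, max(cells) + 1
        cells = {
            i for i in range(lo, hi + 1)
            if ((i - 1) in cells) ^ ((i in cells) or ((i + 1) in cells))
        }
    return bits

if __name__ == "__main__":
    print(rule30_center(16))   # irregular-looking bits with no known closed form
```

Each additional bit requires widening the grid and rerunning the whole evolution; irreducibility says, conjecturally for this rule, that nothing fundamentally faster exists.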
Wolfram began the development of Mathematica in late 1986. The first version of Mathematica was released on June 23, 1988, and was immediately hailed as a major advance in computing. In the years that followed, the popularity of Mathematica grew rapidly, and Wolfram Research became established as a world leader in the software industry, widely recognized for excellence in both technology and business. Wolfram has been president and CEO of Wolfram Research since its inception, and continues to be personally responsible for the overall design of its core technology. Following the release of Mathematica Version 2 in 1991, Wolfram began to divide his time between Mathematica development and scientific research. Building on his work from the mid-1980s, and now with Mathematica as a tool, Wolfram made a rapid succession of major new discoveries. By the mid-1990s his discoveries led him to develop a fundamentally new conceptual framework, which he then spent the remainder of the 1990s applying not only to new kinds of questions, but also to many existing foundational problems in physics, biology, computer science, mathematics and several other fields.

GETTING STARTED
PUBLIC LEARNING
OPEN COURSEWARE
OPEN METAVERSE
• Blender [3D Suite]
OPEN FORGES
OPEN ACCESS ARCHIVES
OPEN WEBCASTS
OPEN SOURCE MOVIES
FREE CULTURE +
OPEN ACCESS TEXTS
Blog. Cliff Gerrish  Echovar
Blog. Solving For Pattern
Blog. PaulBHartzog
Blog. Dave Pollard
Blog. George Por
Electronic Frontier Foundation [EFF]
Free Software Foundation News
Future of the Book
High Fidelity Dreams Scott Draves
H+ magazine
IFTF Future Now
Kolabora Collaboration
Make Magazine & Craft Zine
Nation of Makers
Neurotechnology Zack Lynch
NextNow Collaborative
Unconference.net
Valley Zen
Visual Complexity
Wikinews
WorldChanging
