We are a commune of inquiring, skeptical, politically centrist, capitalist, anglophile, traditionalist New England Yankee humans, humanoids, and animals with many interests beyond and above politics. Each of us has had a high-school education (or GED), but all had ADD so didn't pay attention very well, especially the dogs. Each one of us does "try my best to be just like I am," and none of us enjoys working for others, including for Maggie, from whom we receive neither a nickel nor a dime. Freedom from nags, cranks, government, do-gooders, control-freaks and idiots is all that we ask for.
The problem -- or at least the change -- is that we humans cannot understand systems even as complex as that of a simple cell. It's not that we're awaiting some elegant theory that will snap all the details into place. The theory is well established already: cellular systems consist of a set of detailed interactions that can be thought of as signals and responses. But those interactions surpass in quantity and complexity the human brain's ability to comprehend them. The science of such systems requires computers to store all the details and to see how they interact. Systems biologists build computer models that replicate in software what happens when the millions of pieces interact. It's a bit like predicting the weather, but with far more dependency on particular events and fewer general principles.
Models this complex -- whether of cellular biology, the weather, the economy, even highway traffic -- often fail us, because the world is more complex than our models can capture. But sometimes they can predict accurately how the system will behave. At their most complex these are sciences of emergence and complexity, studying properties of systems that cannot be seen by looking only at the parts, and cannot be well predicted except by looking at what happens.
I was also reading some time ago that many researchers can replicate past conditions with their models with amazing accuracy. However, when they run their models into the future, the models diverge from the real world. I think it was John Derbyshire at The Corner who pointed that article out.
The Wisconsin Skier
I was for a time enamored of Chaos Theory, which has many aspects, but includes, relevant to this, the idea of sensitive dependence -- that small or seemingly unimportant events can change a system's behavior dramatically. Yet at other, indistinguishable times, a large event might have no effect whatsoever, swallowed up in the system chugging along.
I never knew much, but enough to witness how baffling it can be when you see it in action.
Thanks for the link.
Assistant Village Idiot
Weather prediction is much better today than it was a hundred years ago thanks to our understanding of complex systems and how to model them. The basic principles were discovered by Edward Lorenz, who was using early computers to simulate weather systems. One day he wanted to repeat a particular run, so he went to a printout and reentered the data. Unexpectedly, the simulation didn't come out the same way. That's because the printout used truncated numbers; even though they were close to the original numbers, they varied from the originals very slightly. These small differences meant that the simulation diverged over time until it no longer resembled the original run. This phenomenon is called chaos: the sensitivity of a complex system to initial conditions.
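Lorenz's accident is easy to reproduce on any modern machine. Here is a minimal sketch (plain Python with crude Euler stepping -- not Lorenz's actual weather model, just his famous three-variable system) that starts two runs from initial points differing only in the sixth decimal place, mimicking the truncated printout:

```python
# Two runs of the Lorenz system from nearly identical starting points.
# The tiny initial difference plays the role of the truncated printout.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations one small Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def separation(a, b):
    """Euclidean distance between two states."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.000001)  # "truncated" copy: off in the 6th decimal

early = None
for step in range(3000):
    a, b = lorenz_step(a), lorenz_step(b)
    if step == 100:
        early = separation(a, b)  # still microscopic after 100 steps
late = separation(a, b)           # after 3000 steps the runs disagree

print(f"separation after 100 steps:  {early:.2e}")
print(f"separation after 3000 steps: {late:.2e}")
```

Early on the two trajectories are indistinguishable; thousands of steps later they bear no resemblance to each other, their separation limited only by the overall size of the attractor. That is the whole story of why forecasts have a horizon.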
The solution to predicting the weather was to provide as much accurate data as possible and to make adjustments based on how closely the model followed the observed unfolding of the weather. This has allowed weather predictions to stretch from 1-2 days out to as many as 10, with statistical measures of certainty. It has also informed climate research, especially with regard to understanding how humans interact with and influence the climate.