Hi James_Van_Artsdalen
Unfortunately, no computer model prediction 100 years into the future has ever been tested. That's the essence of science: propose a theory and then back it up by testing its predictions.
The underlying science is pretty well understood: the physics and thermodynamics of the atmosphere (i.e. how the concentration of CO2 tends to warm a volume of nitrogen and oxygen when illuminated with IR radiation, etc.), as are finite element modelling and the numerical solution of complex non-linear differential equations.
What the BBC experiment did was start a computer model of the global climate in the 1920s, using the limited real-world record of atmospheric conditions from the 1920s as the starting point of the simulations. Because those recorded conditions were of course subject to recording error, the starting-condition parameters were varied slightly and a different set was assigned to each computer in the distributed network. Each computer then ran its own individual climate prediction, so hundreds of thousands of climate predictions were computed, each with a slightly different set of starting conditions.
So there was not just one hand-picked climate model run; there were over 250,000 of them. The runs were then grouped and averaged statistically, and the results produced an overall trend.
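As a rough illustration of the idea (this is not the climateprediction.net code, just a toy sketch with a made-up one-variable "climate" model and normally distributed perturbations of the starting temperature standing in for recording error):

```python
import random
import statistics

def toy_climate_run(start_temp, years=85, forcing_per_year=0.01, noise=0.05):
    """Hypothetical one-variable 'climate' model: a steady warming trend
    plus year-to-year variability. Stands in for a real simulation."""
    temp = start_temp
    trajectory = []
    for _ in range(years):
        temp += forcing_per_year + random.gauss(0, noise)
        trajectory.append(temp)
    return trajectory

# Each "computer" gets a slightly different starting condition, reflecting
# the uncertainty in the 1920s observations.
ensemble = [toy_climate_run(14.0 + random.gauss(0, 0.2)) for _ in range(10000)]

# Average across the ensemble year by year to get the overall trend.
trend = [statistics.mean(run[year] for run in ensemble)
         for year in range(len(ensemble[0]))]

print(f"Ensemble-mean warming over the run: {trend[-1] - trend[0]:.2f} °C")
```

The point of averaging over many perturbed runs is that no single starting condition is trusted; the trend that survives the averaging is the part that does not depend on the exact starting values.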
What is interesting is the results: even the slight cooling in the 1970s is reproduced. The ensemble follows the recorded global temperature trend from the 1920s up to the present day with remarkable accuracy. In effect, the model has been tested in accordance with the principles of science: the theoretical modelling used in the distributed climate modelling exercise is being compared against the hundreds of thousands of atmospheric recordings made from the 1920s to the present day.
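A hindcast test like this boils down to comparing the ensemble-mean trajectory against the observed record over the same period, for example with a simple error metric. Again a toy sketch, with both series as hypothetical stand-ins rather than real data:

```python
import math

def rmse(model, observed):
    """Root-mean-square error between a model trajectory and the observed record."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed))
                     / len(observed))

# Hypothetical stand-ins: in the real exercise 'model_mean' would be the
# ensemble-mean hindcast and 'observed' the recorded yearly global
# temperatures from the 1920s onwards.
model_mean = [14.0 + 0.010 * year for year in range(85)]
observed   = [14.0 + 0.009 * year for year in range(85)]

print(f"Hindcast RMSE: {rmse(model_mean, observed):.3f} °C")
```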
http://www.bbc.co.uk/sn/climateexperiment/theresult/graph1.shtml