"why and how the WTC 1 and 2 (the WTC towers) collapsed after the initial impact of the aircraft, and why and how WTC 7 collapsed."
NIST never fulfilled its stated objective to explain "how" the buildings collapsed. The report states clearly that it does not analyze the "structural behavior of the tower after the conditions for collapse initiation were reached." In this sense, the report is merely a pre-collapse analysis.
In their response to the request for correction (RFC) filed by members of the 9/11 Truth Movement, NIST conceded, "we are unable to provide a full explanation of the total collapse." Why is this? NIST stated that it carried the analysis "to the point where the buildings reached global instability. At this point, because of the magnitude of deflections and the number of failures occurring, the computer models are not able to converge on a solution."
In other words, the collapse of World Trade Towers 1 and 2 is too complicated to model. Individuals at the JREF forum sum up this position:
Dave Rogers: "NIST can't model the collapse to a degree of accuracy that specifies exactly where every perimeter column ended up, what exact proportion of the concrete was pulverised to what particle size distribution, what proportion of the debris fell within the original footprint, or all the other insignificant minutiae of the collapse that the truth movement obsesses about..."
Wildcat: "No, there is too little data. There is simply no way to model such a chaotic event, likely never will be."
The Doc: "There is no reason to determine what happened during the collapse."
These statements seem to imply that it is not possible to model the collapse of the World Trade Center towers, and that even if we could, it would not be important. These comments seem strange coming from the forum of the James Randi Educational Foundation, "a place to discuss skepticism, critical thinking, the paranormal and science in a lively and friendly manner." Are these people really saying that the collapse of two buildings will forever be beyond the ability of science to explain?
What types of problems, events and occurrences are computers modeling today? According to an article in Wired, "Scenario planning is not a waste; computer models can now game the behavior of millions of variables and render nuanced predictions of everything from bioterror attacks to massive earthquakes."
Mathematical software is also being used to solve complex linear algebra problems. "BCSLIB-EXT is used to solve these sparse linear algebra problems, which arise in many applications. It is used by Boeing application packages for circuit analysis, trajectory optimization, chemical process control, machine tool path definition, constrained data fitting, and finite element analysis (FEA) programs...BCSLIB-EXT solves problems arising out of static and frequency response analysis (which use Ax=b) and for buckling and vibration studies (using AX=BXΛ). Instead of taking days to solve problems of 50,000 variables, FEA programs using BCSLIB-EXT now solve problems in the range of 3 million variables in just hours!"
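Both problem classes named in the quote — static analysis via a sparse linear system (Ax=b) and buckling/vibration studies via a generalized eigenvalue problem (AX=BXΛ) — can be illustrated with freely available tools. The sketch below uses SciPy's open-source sparse solvers rather than the proprietary BCSLIB-EXT, and a toy tridiagonal matrix standing in for a real finite-element stiffness matrix:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 2000  # toy problem size; the quoted FEA runs reach millions of unknowns

# Illustrative sparse "stiffness" matrix K: tridiagonal, symmetric positive definite
K = sp.diags([-np.ones(n - 1), 2.0 * np.ones(n), -np.ones(n - 1)],
             [-1, 0, 1], format="csc")
M = sp.identity(n, format="csc")  # trivial "mass" matrix for this sketch

# Static analysis: solve K u = f (the "Ax = b" case)
f = np.ones(n)
u = spla.spsolve(K, f)

# Buckling/vibration: generalized eigenproblem K X = M X Lambda
# (the "AX = BXΛ" case); shift-invert around sigma=0 targets the
# smallest eigenvalues, which correspond to the lowest modes
vals, vecs = spla.eigsh(K, k=4, M=M, sigma=0.0)
```

The point of the sketch is only that these problem classes are routine: sparse direct solvers and shift-invert eigensolvers handle them on a laptop at this scale, and the quote describes the same mathematics scaled to millions of variables.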
An article in New Scientist from 2005 states, "An effort to create the first computer simulation of the entire human brain, right down to the molecular level, was launched on Monday."
According to CNET News, "The petaflop era has begun. IBM has devised a new Blue Gene supercomputer--the Blue Gene/P--that will be capable of processing more than 3 quadrillion operations a second, or 3 petaflops, a possible record. Blue Gene/P is designed to continuously operate at more than 1 petaflop in real-world situations."
Computer models are also being created to simulate nuclear explosions. "One of the first problems that scientists working on the ASCI project had to tackle was finding computers that could handle the large datasets necessary for simulating nuclear blasts. A typical model can be as large as tens of millions of elements, and over the next couple of years the simulations will grow to more than tens of billions of elements...the major focus of the ASCI program is terascale computational simulations, visualization is essential to understanding the terabytes of data produced."
In an interview with the Boston Globe, computer scientist Roscoe Giles was asked:
Boston Globe: "What are the most complex kinds of calculations made on high-performance machines?"
Roscoe Giles: "One is modeling weather and climate, which involves hundreds of millions of variables. In the microscopic realm, you get similarly big problems. I was involved in simulating patches of surface of the kind of magnetic material in disk drives, studying features on the scale of nanometers, which are billionths of a meter. The goal of the new Department of Energy program called [Accelerated Strategic Computing Initiative] is to replace a lot of experimentation and testing of nuclear weapons and materials by computer simulations. That probably is the largest single supercomputing effort in the world and is driving machines on the 30-teraflop scale."
We can see from these examples that NIST's claim that the collapse of the two towers is too complex to model simply does not stand up to scrutiny. Each collapse occurred in roughly 15 seconds. Could NIST model the first 3 seconds and check whether the model correlates closely with the observable reality of the actual collapse? Would the number of variables be too many in just the first three seconds?
NIST cannot provide a full explanation of the total collapse not because the event is too complex to model, but because its theories of how the towers collapsed contradict basic laws of physics.