Accidents are a normal part of life. However careful we try to be, sh-t happens (so to speak). Moreover, when a technology becomes complex, these unavoidable accidents cascade through the system in unpredictable ways.
This is all the more so when the technology is "tightly coupled". When we try to understand the causes of technological accidents, it often turns out to be very difficult to pinpoint exactly what went wrong. The reason is that technologies are intrinsically complex and depend on many things working closely together: materials and components of varying quality are structured into tightly engineered sub-systems, which are operated by error-prone humans within not-always-optimal organisational structures, which in turn are subject to production pressures and all kinds of managerial manoeuvring.
Accidents such as the one at Three Mile Island often begin with a mechanical or other technical mishap and then spin out of control through a series of technical cause-effect chains, because the operators involved could not stop the chain or unwittingly did things that made it worse. Trivial initial errors thus produce disastrous results.
Failure in just one part (material, sub-system, human, or organisation) may coincide with the failure of an entirely different part, revealing hidden connections, neutralised redundancies, bypassed firewalls, and random occurrences for which no engineer or manager could reasonably plan.
In conclusion, technologies with potentially catastrophic consequences, such as nuclear power, should therefore be abandoned because, in the end, accidents are unavoidable or "normal". Nothing is disaster-proof.
Sources / Further Reading:
Perrow, C. (1984/1999). Normal Accidents: Living with High-Risk Technologies. Princeton University Press. A detailed book summary/review is available on the web, as is a good set of PowerPoint slides from a NASA presentation.
Sagan, S. D. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press.