There are numerous findings in the last 100 years that have revolutionized modern science. There is the application of practical solid-state physics that led to the invention of the transistor, the microchip, and the microcomputer. The structure of DNA, whose discovery may eventually put an end to many diseases and extend life spans to an unprecedented degree, was determined in 1953. In 1990 Tim Berners-Lee, a consultant at CERN, the European laboratory for particle physics, along with his colleague Robert Cailliau created the software that gave birth to the World Wide Web, which enabled a profound shift in human society in terms of the availability of information and how we connect.
Nevertheless, one of the most important findings was in the field of physics: the ability to control nuclear fission and fusion. Both reactions release vast quantities of energy from relatively small amounts of matter, which has led to both nuclear energy and the atomic bomb. Furthermore, the atomic bomb ushered in "big science," in which research could be harnessed for geopolitical and economic gain. Suddenly massive laboratories, machines, and budgets were put forth so that scientists could test their theories and push into new frontiers—in all fields of science.
In the past few decades, the rise of the microcomputer has enabled stunning progress in many aspects of society. Cell phones and inexpensive computers are beginning to bring the Internet to even rural areas of developing countries, with major implications for distance learning and democratization. Building on the discovery of DNA's structure, the emergence of genetic engineering and biotechnology may be the most revolutionary development of the last 50 years. And in its wake comes a series of possibilities that link science and ethics more closely than ever before.