One of humanity's greatest achievements, developed and refined across the ages, is "The Scientific Method". In a nutshell, it is a step-by-step outline for arriving at "truth". Truth is the ultimate goal of science, and consequently of the scientific method.
The Scientific Method can be generally described as taking these steps:
1. Define the question
2. Gather information and resources (observe)
3. Form hypothesis
4. Perform experiment and collect data
5. Analyze data
6. Interpret data and draw conclusions that serve as a starting point for a new hypothesis
7. Publish results
8. Retest (frequently done by other scientists)
(Ripped right from Wikipedia)
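The steps above can be sketched as a feedback loop in code, since step 6 feeds a new hypothesis back into the cycle. This is only a toy sketch; the function and parameter names are hypothetical stand-ins for real-world work:

```python
def run_method(hypotheses, experiment, max_rounds=10):
    """Toy sketch of the hypothesize -> test -> revise cycle.
    `hypotheses` yields candidate explanations (step 3); `experiment`
    returns True when the collected data supports one (steps 4-5)."""
    for round_no, hypothesis in enumerate(hypotheses, start=1):
        if round_no > max_rounds:
            break                      # give up for now; retest later
        if experiment(hypothesis):
            return hypothesis          # step 6: a provisional conclusion
    return None                        # nothing survived testing yet

# Toy usage: which candidate does the "data" (here, h*h == 9) support?
best = run_method(iter([1, 2, 3, 4]), experiment=lambda h: h * h == 9)
print(best)  # 3
```

The point of the sketch is the loop itself: conclusions are always provisional starting points for the next round, which is exactly what step 8 (retesting) captures.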
I trust that you're all fairly familiar with this from middle school science class.
The Scientific Method as we know it has gone through many revisions and alterations throughout the ages, but the main emphasis remains the same:
-Gather a set of incomplete data
-Speculate as to what the data indicates
-Test your speculation
It seems so intuitive that we can hardly think of an alternative. Can there be an alternative to what we today call The Scientific Method, which will steer us toward truth?
As it turns out, there may be. Here is a good article which describes some of it. But here is the gist:
We may just be on the brink of a new age. The Information Age made the world smaller and more connected, but the next big development may change everything: Quantum Computing. Some major engineering hurdles have recently been cleared, which may make Quantum Computers a reality within a decade.
For certain problems, a Quantum Computer could run quadrillions of times faster than today's machines. This massive increase in efficiency would have profound effects on the entire world, but for us, it may just usher in the death of the Scientific Method.
Incomplete data may just be a thing of the past. Why model data when you can just enumerate? We won't need to create theoretical models for incomplete data because it may be possible to just get more data until the picture becomes clear.
A good example is language translation (as the article I posted mentions). Google is able to translate foreign languages by sheer enumeration. Rather than modeling the "meanings" of words and trying to infer context, it simply associates phrase with phrase and word with word. With enough data, the picture becomes clear. As a result, Google can translate languages it does not "understand" in the slightest.
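That kind of meaning-free association can be sketched as nothing more than a lookup table built from aligned phrase pairs. The entries below are illustrative stand-ins, not the statistics Google actually computes:

```python
# Toy phrase-table "translation": map source phrases to target phrases
# purely by association, with no model of meaning whatsoever.
phrase_table = {
    "bonjour": "hello",
    "le monde": "the world",
    "bonjour le monde": "hello the world",
}

def translate(sentence):
    """Greedy longest-match lookup; unknown phrases pass through unchanged."""
    words = sentence.split()
    out, i = [], 0
    while i < len(words):
        # Try the longest phrase starting at position i first.
        for j in range(len(words), i, -1):
            phrase = " ".join(words[i:j])
            if phrase in phrase_table:
                out.append(phrase_table[phrase])
                i = j
                break
        else:
            out.append(words[i])  # no association found: leave as-is
            i += 1
    return " ".join(out)

print(translate("bonjour le monde"))  # hello the world
```

Scale this table up to billions of aligned phrases and you get usable translations with zero "understanding" anywhere in the system.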
NEW: To be perfectly clear, one of the major things that Quantum Computing brings to the table is the reduction (or, in many cases, the elimination) of intractability. Data becomes "intractable" when there is so much of it that searching or processing it is no longer practical.
A common example of intractability is a reverse phone book lookup. Phone books are alphabetized by name, so finding the number for a given name is easy. But doing it backwards, finding the name that matches a given number, means scanning every entry; it would take far too long to do by hand. Now imagine that it wasn't just a local phone book, but a global one! That would be intractable data indeed.
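In code, the asymmetry looks like this: the "alphabetized phone book" is a dictionary keyed by name, so a forward lookup is instant, while a reverse lookup must scan every entry, unless we spend the time and memory to build a reverse index first. The entries here are hypothetical:

```python
# Hypothetical tiny phone book; real ones hold millions of entries.
phone_book = {"Alice": "555-0101", "Bob": "555-0102", "Carol": "555-0103"}

def reverse_lookup_scan(number):
    """Brute force: check every entry -- O(n) work per query."""
    for name, num in phone_book.items():
        if num == number:
            return name
    return None

# Building a reverse index once makes each later query O(1),
# trading memory for speed -- the classical answer to intractability.
reverse_index = {num: name for name, num in phone_book.items()}

print(reverse_lookup_scan("555-0102"))  # Bob
print(reverse_index["555-0103"])        # Carol
```

As an aside, the known quantum algorithm for unstructured search (Grover's) performs this kind of scan in roughly √N steps instead of N, which is the sort of speedup the next paragraph alludes to.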
Quantum Computers would be able to search through vast data sets with ease, freeing us from the confines of intractability (or at least pushing the envelope considerably).
Data from experiments, as well as mundane details, could be stored in vast volumes and later analyzed as a whole. It would be like analyzing and inspecting the entire internet collectively, something that is not, in principle, unreasonable for the future. Data would no longer be used and then discarded, as is essentially done today. It could all be recycled and placed into a single repository for later inspection.
Do you think this new method is a viable alternative to the "Scientific Method" as we know it? Or perhaps you have another idea for something else? As always, thanks for reading.