On July 16, 1969, Apollo 11 left Earth bound for the moon with the most sophisticated guidance computer of its time. Built by the finest engineering minds, it held the lives of three Americans and the hopes of the entire planet in its magnetic core memory.
All 32 kilobytes of it.
These days, one doesn’t have to leave the planet to find that kind of computing capacity. Heck, just reach into your front pocket; a 32 GB iPhone holds a million times the memory that navigated for Neil Armstrong and Co.
Quantifying this leap in computing power often comes down to an explanation of one law: Moore’s law. In 1965, Gordon Moore, who would go on to co-found Intel, noted that the number of transistors that could be placed on a chip was roughly doubling every year. Ever since, Moore’s law has held almost true (the doubling has settled in at closer to every two years, but still…).
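For readers inclined to check the math, here is a quick back-of-the-envelope sketch (not part of the original story) that ties the figures together. Counting in binary kilobytes and gigabytes, it shows that going from 32 KB to 32 GB is about 20 doublings, which at a two-year doubling cadence works out to roughly 40 years, about the span between Apollo 11 in 1969 and the first 32 GB iPhones four decades later.

```python
# Back-of-the-envelope check of the figures above (illustrative, not from the article).
import math

agc_memory_bytes = 32 * 1024          # Apollo Guidance Computer: roughly 32 KB
iphone_memory_bytes = 32 * 1024**3    # a 32 GB iPhone

# How many times more memory does the phone hold?
ratio = iphone_memory_bytes / agc_memory_bytes
print(f"iPhone / AGC memory ratio: {ratio:,.0f}x")   # ~1,048,576x, i.e. about a million

# How many doublings is that, and how long would they take at
# Moore's-law pace of one doubling roughly every two years?
doublings = math.log2(ratio)
print(f"Doublings needed: {doublings:.0f}")                      # 20
print(f"Years at two years per doubling: {doublings * 2:.0f}")   # ~40
```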
Moore’s law speaks to transistor counts, and with them processing power, but data storage has clearly kept pace. Entire genomes can now be stored on drives that fit in the palm of the hand, while galaxies might call for something a little bigger—like a drive the size of a cigar box.
That ability to collect and store data has infiltrated nearly every aspect of the human condition while captivating the human imagination. From exploring the heavens to researching repetitive motion injuries, if an action can be observed—whether it’s stargazing or typing—it stands a very good chance these days of being quantified and placed in a database.
Organizing all that data—and making it more easily accessible—is arguably the next great wave in computing, akin to the way the Dewey Decimal System brought order to what had been the chaos of referencing the printed word.
“I think of computing as having gone through three generations at this point,” says Greg Hager, chair and professor of computer science at the Whiting School. “We had the hardware generation, which was concerned with constructing the computer, the software generation where we were more concerned with what ran inside the computer, and now the data generation—which is about what computing can do to essentially take data and turn it into usable information.”
Information that for many Whiting School faculty is changing the way they do business and, in turn, impacting how each of us lives our lives…