A couple of entries from our buddy Harvey this week. First here’s Cher (with a little Sonny) to get us ready for the holidays…
I remember seeing this cartoon when it first aired a zillion years ago. It was pretty neat then and it’s still pretty neat.
Still, whenever I see Cher singing a Christmas song, I think of Paul Shaffer imitating her singing “What Child is This?” on Letterman. That used to be a holiday tradition on Letterman; don’t know if they still do it.
On a somewhat more Speculist note, here’s Fine Young Cannibals singing Don’t Look Back:
Researchers from Japan’s ATR Computational Neuroscience Laboratories have developed new brain analysis technology that can reconstruct the images inside a person’s mind and display them on a computer monitor, it was announced on December 11. According to the researchers, further development of the technology may soon make it possible to view other people’s dreams while they sleep.
The scientists were able to reconstruct various images viewed by a person by analyzing changes in their cerebral blood flow. Using a functional magnetic resonance imaging (fMRI) machine, the researchers first mapped the blood flow changes that occurred in the cerebral visual cortex as subjects viewed various images held in front of their eyes. Subjects were shown 400 random 10 x 10 pixel black-and-white images for a period of 12 seconds each. While the fMRI machine monitored the changes in brain activity, a computer crunched the data and learned to associate the various changes in brain activity with the different image designs.
Then, when the test subjects were shown a completely new set of images, such as the letters N-E-U-R-O-N, the system was able to reconstruct and display what the test subjects were viewing based solely on their brain activity.
Putting together entries for this blog means that I read an amazing story every other day — sometimes more frequently than that. We see so many huge developments that it’s hard to realize how impressive, how potentially world-changing some of them are.
If this is real, it is a world-changing development. Technology such as this could lead to a revolution in art and entertainment unlike anything that has come before. Such a development has the potential to unite the machine world and the human world in a completely new and powerful way.
But there’s a downside. As surveillance technology has continued to dig its way deeper and deeper into every level of our existence over the past few years, we could always take comfort that the human imagination is the one final refuge for someone seeking privacy.
Now that reassurance is gone. And that is pretty damn scary.
A Russian professor at an Ohio university has applied to patent a method for snuffing out hurricanes by flying jet fighters around the eye of the storm at supersonic speeds.
Professor Arkadii Leonov and his collaborator Atanas Gagov, both of the University of Akron, actually filed their patent application “Hurricane Suppression by Supersonic Boom” last year.
There is plenty to love about this idea:
1. It’s original.
2. It relies on existing technology.
3. If it works, it solves a huge existing problem.
But if it does work, I think it will ultimately fall to supersonic unmanned drones to carry out this task. I know we already send aircraft into storms for scientific observation, but something tells me that whipping around the perimeter of a hurricane at supersonic speeds opens up a whole new level of risk.
Using argon-argon dating—a technique that compares different isotopes of the element argon—researchers determined that the volcanic ash layers entombing the tools at Gademotta date back at least 276,000 years.
Many of the tools found are small blades, made using a technique that is thought to require complex cognitive abilities and nimble fingers, according to study co-author and Berkeley Geochronology Center director Paul Renne.
Some archaeologists believe that these tools and similar ones found elsewhere are associated with the emergence of the modern human species, Homo sapiens.
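The argon-argon method mentioned above boils down to one equation: the sample's age is t = (1/λ) ln(1 + J·R), where R is the measured ratio of radiogenic ⁴⁰Ar to ³⁹Ar, J is an irradiation parameter calibrated against a standard of known age, and λ is the total decay constant of ⁴⁰K. Here is a minimal sketch; the J and R values are made-up illustrative numbers chosen to land near the Gademotta date, not measurements from the study.

```python
import math

# Total decay constant of 40K, per year (standard value, ~5.543e-10)
LAMBDA_K40 = 5.543e-10

def ar_ar_age(ar40_star_over_ar39, J):
    """Argon-argon age equation: t = (1/lambda) * ln(1 + J * R).

    ar40_star_over_ar39 -- measured ratio of radiogenic 40Ar to 39Ar
    J -- irradiation parameter, calibrated against a standard of known age
    """
    return math.log(1.0 + J * ar40_star_over_ar39) / LAMBDA_K40

# Illustrative inputs: this combination yields an age of roughly 276,000 years
age = ar_ar_age(ar40_star_over_ar39=0.153, J=0.001)
```

The key point is that nothing about the tools themselves is dated; the ash layers sandwiching them are, which is why the "at least 276,000 years" phrasing matters.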
Bottom line: either that was us 275,000 years ago — 80,000 years earlier than the supposed emergence of Homo sapiens — or there was another species of human during that period capable of doing then what we would be doing a few dozen millennia later.
Way back then it could have been Neanderthals (or possibly their ancestors, Homo heidelbergensis, assuming either of these species was ever present in Africa, which I’m not sure about). Or it could have been Homo erectus, which would indicate that these early humans were more sophisticated than we’ve given them credit for. Or it could have been some dead-end offshoot from Homo ergaster — Africa would be the right place to look for that. Or, again, it could have been us.
The problem is that there are no human bones, just artifacts suggesting human beings more sophisticated than any humans that were supposed to be around at that early date.
It’s not often that Britain can claim a win in the space race. But these teddy bears drifting nearly 20 miles above Earth have become the first soft toys to take part in extra-vehicular activity (to use correct NASA jargon) at such an altitude.
The soft toys MAT and KMS were named after the first initials of the pupils who helped make their space suits.
Along with their two intrepid colleagues, they were strapped to a beam attached to a foam-padded box containing instrumentation and cameras on Monday.
After rising to an altitude of around 100,000ft, a webcam caught their ‘space-walk’ for posterity before the helium balloon burst.
They then fell to Earth before a parachute opened automatically to provide a soft landing.
I’m impressed that school kids could pull this off — albeit with some help. This is just more evidence of powerful capabilities finding their way into the hands of regular people.
We don’t usually think of weather balloons as spacecraft, but what these kids managed to create for the stuffed animals is a fairly good prototype for a manned sub-orbital mission. I would certainly like to take a balloon up and have a look at the view those bears were posed in front of. I bet others would, too. And I have a feeling that lighter-than-air missions to these heights can be done for a fraction of the cost of rocket-propelled missions. Sure, you won’t go fast, but if we’re talking sub-orbital flights with the rockets anyway, what difference does it make?
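The back-of-the-envelope physics here is friendly to amateurs, which is part of why school kids can pull it off. Net lift is just the density difference between air and helium times the balloon's volume. The numbers below are generic sea-level values, not the actual parameters of this flight:

```python
# Back-of-the-envelope helium lift, using standard sea-level densities.
RHO_AIR = 1.225      # kg/m^3, air at sea level
RHO_HELIUM = 0.1786  # kg/m^3, helium at sea level

def net_lift_kg(balloon_volume_m3):
    """Mass a helium volume can lift at launch (ignoring balloon and line mass)."""
    return (RHO_AIR - RHO_HELIUM) * balloon_volume_m3

# A modest ~3 m^3 fill lifts roughly 3 kg -- plausibly enough for a foam box,
# cameras, instruments, and a few teddy bears
payload = net_lift_kg(3.0)
```

As the balloon rises, the surrounding air thins while the helium expands, so the envelope keeps growing until it bursts near the float ceiling — hence the parachute for the ride back down.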
If space tourism really does take off as a business model, look for ballooning to provide the low end of the market. I have a feeling it will be quite popular.
A supernova explosion first seen from Earth 436 years ago has come back to life for astronomers in a time-travel-like astronomical twist.
By observing light from supernova SN 1572 that took a longer path to Earth after reflecting off dust particles, scientists can watch the outburst now as it would have looked originally.
When the explosion first appeared in the sky in 1572, Danish astronomer Tycho Brahe named it “Stella Nova” or “New Star” because it looked like an extremely bright star that hadn’t been there before. Astronomers today call it Tycho’s supernova.
So today astronomers get an up-close look at the cosmic phenomenon that Tycho Brahe observed hundreds of years ago. We’re only getting a glimmer, a reflection of the original. And yet I think we’re now seeing more than Tycho did…
Amazing.
A well-placed and highly reflective dust cloud bounced the image of the supernova back towards earth, giving us this latter-day shot at seeing the event. This video shows how that worked:
Of course, every time we look into the night sky, we are looking at either the recent past (e.g., the moon) or the very distant past (e.g., the Andromeda galaxy). But it’s one thing to see these objects just sitting there, as it were, and quite another to see something happen.
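The delay of a light echo is pure geometry: the bounced light travels supernova-to-dust-to-Earth instead of straight to Earth, and the extra path length, in light-years, is the delay in years. Here is a sketch with made-up positions (not the measured SN 1572 geometry), chosen so the delay comes out near the ~436 years in the story:

```python
import math

# Work in light-years; with c = 1 light-year per year, extra path length
# translates directly into delay in years. Supernova sits at the origin.
def echo_delay_years(earth, dust):
    """Extra travel time of light that bounces off the dust cloud."""
    d_direct = math.dist((0.0, 0.0), earth)                        # SN -> Earth
    d_bounce = math.dist((0.0, 0.0), dust) + math.dist(dust, earth)  # SN -> dust -> Earth
    return d_bounce - d_direct

earth = (7500.0, 0.0)    # illustrative distance to Earth
dust = (1250.0, 1000.0)  # hypothetical dust-cloud position

delay = echo_delay_years(earth, dust)  # roughly 430 years for this geometry
```

Which clouds (if any) happen to sit at the right spot is luck — the echo only exists where the geometry cooperates.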
This makes me wonder…are there other past events that we might get a second shot at observing? If a well-placed dust cloud can bring back an event from nearly half a millennium ago, what other options might exist for retrieving visual information on events long since past? I think we’d all have to agree that a dust cloud is a fairly low-tech approach to viewing the past, although clearly it was aided by a high-tech telescope and imaging technology. Still, it makes you wonder.
We discussed on a recent FastForward Radio whether the technology for traveling back in time is possible, and if so whether it is reasonable to expect that it will ever exist. Perhaps going back in time is not in the cards. But seeing the past is a real possibility, as the above image demonstrates.