North Korea’s 2009 nuclear test: A geospatial roundup

On May 25, 2009, North Korea performed a second underground test of a nuclear explosive. The first test had taken place on October 9, 2006, and resulted in several KML files pinpointing the suspected location of the test. What follows is a round-up of the public geospatial intelligence gathered this time around. Where relevant, I’ve collected the information into this KMZ file. All content is attributed to its original sources.


CTBTO

The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has posted the results of its remote sensing analysis of where precisely the test was conducted. In addition to its best guess for 2006 (the “2006 Reviewed Event Bulletin” (REB) ellipse), we are also given the first and second automated guesses (SEL1 and SEL2) as well as the organization’s latest best guess (the red “2009 REB Event” ellipse). The CTBTO is confident the 2009 test was conducted somewhere inside the red ellipse.

Unfortunately, the ellipses aren’t available as KML, so I’ve overlaid the JPG of the CTBTO’s conclusions on Google Earth. (As you zoom in, be sure to turn off this layer if you want to see the region at the highest possible resolution.)
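(For the curious, this is roughly what such an overlay looks like under the hood. The sketch below writes a bare-bones KML GroundOverlay with Python; the image file name and the bounding-box coordinates are placeholders, not the actual values used in my KMZ.)

```python
# A minimal sketch: wrap an image such as the CTBTO ellipse JPG in a KML
# GroundOverlay so that Google Earth drapes it over the terrain.
# The file name and the LatLonBox coordinates are placeholders, not the
# actual values used in the KMZ described above.

GROUND_OVERLAY_KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>CTBTO 2009 REB ellipses (placeholder extent)</name>
    <Icon>
      <href>ctbto_ellipses.jpg</href>
    </Icon>
    <LatLonBox>
      <!-- decimal-degree edges of the image; replace with the real extent -->
      <north>41.35</north>
      <south>41.20</south>
      <east>129.15</east>
      <west>128.95</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""

with open("ctbto_overlay.kml", "w", encoding="utf-8") as f:
    f.write(GROUND_OVERLAY_KML)
```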

IMINT & Analysis

Over on IMINT & Analysis, Sean O’Connor has a long and detailed post looking at the imagery in the region, in which he identifies two further candidate sites where the test may have been conducted, in addition to the northernmost site, which is usually mooted as the most likely candidate.

Sean hasn’t put these placemarks online (as far as I can tell), so I’ve annotated the three sites in the KMZ file.

ISIS

The Institute for Science and International Security (ISIS) published a PDF report on May 27 that contains a satellite image of the likely test site taken on May 14, 2009, just 11 days before the test itself.

The above KMZ file contains the image from the PDF, overlaid on top of the base imagery in Google Earth, which was taken on February 15, 2005. An additional overlay of the same location, commissioned by GlobalSecurity.org and taken a few days after the 2006 test, is also included in the file. (I found it in the exhaustive (and now well-known) North Korea Uncovered KMZ layer.)

Finally, for reference, I added POIs of the region annotated by ArmsControlWonk.com in 2006, based on a New York Times article of July 25, 2005 that cited US intelligence and brought the area to the wider public’s attention for the first time.

When you live in Africa or Europe or the Americas, what North Korea does can at times look bizarre but distant and abstract. Now that I live in Shanghai, a mere 800km away, the implications are a lot more tangible, and it appears that this time around, China’s leadership too has been shaken out of its complacency. To be continued, of course.


Augmented reality apps: Sky Map for Android is just the beginning

I experienced Android envy for the first time last week when Google released Sky Map for Android: envy, because in addition to using GPS and pitch & roll detection, the app also puts Android’s built-in magnetometer to work, and that is not something my iPhone has.

Rumor has it, however, that the next iPhone will come with a magnetometer, so we can assume that by this time next year all smartphones will avail themselves of the trio of GPS, pitch & roll detection and a compass. What other uses could this technology be put to, besides pointing you to objects in the sky?

Probably the biggest potential is for photographs. If the phone can tell at what angle you are holding it and in what direction you are pointing it when taking a photo, in addition to where exactly and when (and what the viewing angle of the lens is), and you upload all this metadata along with the photo to the cloud, then services like Panoramio and Photosynth will have all the information they need to start constructing a crowdsourced 3D simulacrum of the world, photo by photo. Photosynth already does some of this, estimating the roll, pitch, direction and viewing angle of a photo by comparing it to photos taken in the vicinity, but it should get a lot better if it has starting values for these variables, even if they are not completely accurate.
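To make that concrete, here is a sketch of the kind of per-photo pose record I have in mind, and how a service could turn the compass heading and pitch into an initial viewing direction. The field names are my own, for illustration only; they don’t correspond to any particular service’s API.

```python
import math
from dataclasses import dataclass


@dataclass
class PhotoPose:
    """Per-photo metadata as described above; field names are illustrative."""
    lat: float      # degrees
    lon: float      # degrees
    heading: float  # degrees clockwise from true north (compass)
    pitch: float    # degrees above (+) or below (-) the horizon
    roll: float     # rotation about the viewing axis, degrees
    hfov: float     # horizontal viewing angle of the lens, degrees


def view_direction(pose: PhotoPose):
    """Unit vector (east, north, up) the camera was pointing along.

    This is the sort of starting value a service like Photosynth could
    refine by matching the photo against its neighbours.
    """
    h = math.radians(pose.heading)
    p = math.radians(pose.pitch)
    return (math.cos(p) * math.sin(h),   # east
            math.cos(p) * math.cos(h),   # north
            math.sin(p))                 # up


# Example: a photo taken facing due east, tilted 10 degrees above the horizon.
pose = PhotoPose(lat=31.23, lon=121.47, heading=90.0, pitch=10.0,
                 roll=0.0, hfov=63.0)
print(view_direction(pose))  # approximately (0.985, 0.0, 0.174)
```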

I wonder how long it will take for DSLRs to incorporate this technology. There have been GPS modules for cameras for a while now, but (to me at least) the advantage of having coordinate metadata attached immediately to the image file by the camera is outweighed by the requirement that the GPS unit be physically attached to the camera, especially as a proper GPS unit can be kept in your rucksack and its track log used to geotag the photos when you download them to your computer.
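For reference, the timestamp-matching step behind that workflow is simple enough to sketch. The snippet below assumes the track log has already been parsed into (time, latitude, longitude) tuples; parsing the GPX file itself is left out, and the coordinates are placeholders.

```python
from datetime import datetime

# A track log from the GPS unit in your rucksack, already parsed into
# (timestamp, latitude, longitude) tuples; parsing the GPX file itself
# is omitted here.
track = [
    (datetime(2009, 5, 27, 10, 0, 0), 31.2304, 121.4737),
    (datetime(2009, 5, 27, 10, 5, 0), 31.2310, 121.4760),
    (datetime(2009, 5, 27, 10, 10, 0), 31.2322, 121.4791),
]


def position_at(photo_time, track):
    """Return the track point nearest in time to the photo's timestamp.

    Nearest-neighbour is enough to show the idea; a fancier version would
    interpolate between the two bracketing points.
    """
    nearest = min(track,
                  key=lambda pt: abs((pt[0] - photo_time).total_seconds()))
    return nearest[1], nearest[2]


# Tag a photo taken at 10:07 with the closest known position on the track.
print(position_at(datetime(2009, 5, 27, 10, 7, 0), track))  # (31.231, 121.476)
```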

Cheap pitch, roll and direction detection changes the game, however. Perhaps at first we’ll be taping our Androids and iPhones to the back of our DSLRs, but it can’t be long before camera manufacturers realize the benefits of having this information recorded by default. Then, when we upload our photos to the web, we’ll be able to automatically generate KML that lets us view the photos from the exact same vantage point in Google Earth.
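A sketch of what that KML could look like: KML’s <Camera> element accepts longitude, latitude, altitude, heading, tilt and roll, which is essentially the metadata described above (tilt uses a different zero point than a phone’s pitch, as noted in the comment). The values below are placeholders, not real photo data.

```python
def photo_placemark(name, lat, lon, alt_m, heading, pitch, roll):
    """KML Placemark whose <Camera> reproduces the photo's vantage point.

    KML measures tilt from straight down (0) up to the horizon (90), so a
    phone-style pitch above the horizon maps to tilt = 90 + pitch.
    """
    tilt = 90.0 + pitch
    return f"""  <Placemark>
    <name>{name}</name>
    <Camera>
      <longitude>{lon}</longitude>
      <latitude>{lat}</latitude>
      <altitude>{alt_m}</altitude>
      <heading>{heading}</heading>
      <tilt>{tilt}</tilt>
      <roll>{roll}</roll>
      <altitudeMode>relativeToGround</altitudeMode>
    </Camera>
    <Point>
      <coordinates>{lon},{lat},{alt_m}</coordinates>
    </Point>
  </Placemark>"""


# Placeholder values: a photo taken at eye level, facing east, tilted 10
# degrees up. Double-clicking the placemark in Google Earth flies the view
# to this camera position.
kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
       + photo_placemark("IMG_0001.jpg", 31.2304, 121.4737, 1.6,
                         heading=90.0, pitch=10.0, roll=0.0)
       + '\n</Document>\n</kml>\n')

with open("photos.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```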

Another use this new technology will be put to is augmented reality applications. Sky Map is already an excellent example of a genre that should explode; perhaps we’ll see apps for superimposing the names of distant mountain peaks on your screen (à la HeyWhatsThat), or an app that lets us “x-ray” the planet, so we finally know where precisely Buenos Aires is beneath our feet. Games will no doubt take advantage: any empty parking lot could become a virtual maze, with the phone as your HUD and you racing against your friends, whether they are close by in the same space as you or on the other side of the planet.
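The core calculation behind the peak-labelling idea is also straightforward: work out the compass bearing and elevation angle from where you stand to a peak with known coordinates and height, then draw the label whenever the phone’s heading and pitch point within the camera’s field of view of that direction. A rough sketch (the example peak is Mount Everest, seen from roughly its southern base camp):

```python
import math


def bearing_and_elevation(obs_lat, obs_lon, obs_alt_m,
                          peak_lat, peak_lon, peak_alt_m):
    """Compass bearing and elevation angle from the observer to a peak.

    The elevation angle ignores Earth curvature and refraction, which is
    fine for deciding where on the screen to draw a label.
    """
    R = 6371000.0  # mean Earth radius in metres
    lat1, lon1 = math.radians(obs_lat), math.radians(obs_lon)
    lat2, lon2 = math.radians(peak_lat), math.radians(peak_lon)
    dlon = lon2 - lon1

    # Initial great-circle bearing, degrees clockwise from north.
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    bearing = math.degrees(math.atan2(x, y)) % 360.0

    # Ground distance via the spherical law of cosines.
    cos_c = (math.sin(lat1) * math.sin(lat2)
             + math.cos(lat1) * math.cos(lat2) * math.cos(dlon))
    dist = R * math.acos(max(-1.0, min(1.0, cos_c)))
    elevation = math.degrees(math.atan2(peak_alt_m - obs_alt_m, dist))
    return bearing, elevation


# From roughly Everest base camp to the summit: the label is drawn when the
# phone's compass heading and pitch fall within the camera's field of view
# of these two angles.
print(bearing_and_elevation(28.00, 86.86, 5360, 27.9881, 86.9250, 8848))
```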

In sum, there are plenty of reasons to hanker after tomorrow’s gadgets, considering the possibilities…