I did something like this in DBPro, but it involved sampling every pixel with a camera render, so it was really slow to capture.
It's not practical to capture gameplay footage with the method I used, but it does work for say, capturing a screenshot that you can view from any angle, like a bubble snapshot of what is happening in 1 frame.
It's slow, but fairly easy - just rotate the camera by X then pitch it up by Y. If your render image is 1024x512 for example, you'd divide 360 by 1024 then multiply that by the X pixel position to get the camera Y angle. The camera's X angle only needs to cover 180 degrees, so 180/512 * Y gives the camera X angle. Sync the camera, store the mid-screen pixel colour and draw it onto your render image. In DBPro you can fudge around with the camera FOV and render area to improve performance, but I'm not sure that's something AppGameKit can do yet.
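To make the pixel-to-angle mapping concrete, here's a rough sketch of that loop in Python-style pseudocode (the function names are made up for illustration - the real thing would use DBPro/AGK camera and image commands, and I've assumed the pitch is centred on the horizon, i.e. offset by -90):

```python
W, H = 1024, 512  # equirectangular render image: width spans 360 deg, height 180 deg

def pixel_to_angles(x, y, w=W, h=H):
    """Map a render-image pixel to camera yaw/pitch in degrees."""
    yaw = 360.0 / w * x            # camera Y angle: full turn across the width
    pitch = 180.0 / h * y - 90.0   # camera X angle: -90 (straight up) .. +90 (straight down)
    return yaw, pitch

def capture_spheremap(sample_centre_pixel, w=W, h=H):
    """One camera sync per pixel: point the camera, grab the mid-screen
    colour, and write it into the output image. sample_centre_pixel is a
    stand-in for the engine's render + point-sample step."""
    image = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            yaw, pitch = pixel_to_angles(x, y, w, h)
            image[y][x] = sample_centre_pixel(yaw, pitch)
    return image
```

That's 1024 x 512 = 524,288 full camera syncs for a single bubble shot, which is why it's nowhere near real time.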
If you did this, then you could easily make an app to preview these spheremaps, might be kinda cool to make a tablet version so you can rotate the tablet to look around.
So it's certainly possible to capture a spheremap bubble screenshot, and you could capture several in sequence to make a video, but it would take so long that it's impractical. In DBPro it might be possible to render separate cameras and consolidate them into a cube map, but that's not really an option for AppGameKit right now.