So the obvious thing when working with HDR is that you can't render directly to your screen. Your screen only has 8 bits per channel. DirectX lets you render to an off-screen buffer called a render target, and a render target can be in any format your GPU supports, including floating-point ones.
After you tell the GPU to use a render target, you should be able to do whatever you want in HDR, right? Unfortunately that's not the experience I had. I set up my render target and drew an image, and what I got back out was only an 8-bit image. It didn't matter what format the render target was in or what I drew; it was always rendering at 8 bits.
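For context, the setup itself is the easy part. Here's a minimal sketch of what I mean, assuming Direct3D 9 and an existing IDirect3DDevice9* called device (the variable names here are mine, not from any SDK sample):

#include <d3d9.h>

// Assumes an existing IDirect3DDevice9* named device, plus a width and height.
IDirect3DTexture9* hdrTexture = NULL;
IDirect3DSurface9* hdrSurface = NULL;

// Ask for four 16-bit floating point channels instead of the usual 8-bit integer ones.
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT, &hdrTexture, NULL);
hdrTexture->GetSurfaceLevel(0, &hdrSurface);

// Everything drawn after this lands in the off-screen buffer, not on the screen.
device->SetRenderTarget(0, hdrSurface);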
Turns out you also need to use a pixel shader. You need nothing more than:
texld r1, t0, s0    ; fetch the texel at the incoming texture coordinates
mov oC0, r1         ; write it straight to the output color
All this does is fetch the value of the pixel from the texture and return it. This should be exactly what happens by default without a pixel shader, but it doesn't seem to be. With these 2 lines of pixel shader I was suddenly getting more than 8 bits out.
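If it helps anyone, here is how I'd wire that shader up from the application side. This is a sketch assuming ps_2_0 and the D3DX utility library; ps_2_0 needs the two dcl lines on top of the instructions above, and the variable names are again mine:

#include <d3dx9.h>   // D3DX utility library, for D3DXAssembleShader

// Assemble the passthrough shader from source and bind it to the device.
const char* src =
    "ps_2_0\n"
    "dcl t0.xy\n"          // texture coordinates coming in from the vertex stage
    "dcl_2d s0\n"          // the sampler the texture is bound to
    "texld r1, t0, s0\n"   // sample the texture
    "mov oC0, r1\n";       // return the sample untouched

LPD3DXBUFFER code = NULL;
LPD3DXBUFFER errors = NULL;
D3DXAssembleShader(src, (UINT)strlen(src), NULL, NULL, 0, &code, &errors);

IDirect3DPixelShader9* shader = NULL;
device->CreatePixelShader((const DWORD*)code->GetBufferPointer(), &shader);
device->SetPixelShader(shader);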
I'm not sure why it works this way. Maybe someone knows more than I do?