Reading from a staging 2D texture array in DirectX10

Posted by Don Reba on Game Development, 2012-04-19

I have a DX10 program in which I create a texture array of three 16x16 textures, then map, read, and unmap each subresource in turn. I use a single mip level, usage set to staging, and CPU access set to read. Now, here is the problem:

  • Subresource 0 contains 1024 bytes, pitch 64, as expected.
  • Subresource 1 contains 512 bytes, pitch 64.
  • Subresource 2 contains 256 bytes, pitch 64.

I expect all three subresources to be the same size. Debug output is enabled but reports no warnings or errors. Am I missing something, or might this be some sort of driver issue?
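For reference, with MipLevels = 1 the subresource index is simply arraySlice * MipLevels + mipSlice, so subresources 0, 1, and 2 should be the three full 16x16 slices. Each slice should therefore map to height x pitch = 16 x 64 = 1024 bytes, which is what subresource 0 reports but not the other two.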

Here is the code. The language is Nemerle, but C# and C++ would look almost the same. I have looked through the generated code, and am fairly confident the problem is not language-related.

def cpuTexture = Texture2D
    ( device
    , Texture2DDescription() <-
    {
        Width     = 16;
        Height    = 16;
        MipLevels = 1;                 // single mip level per array slice
        ArraySize = 3;                 // three 16x16 slices
        Format    = Format.R32_Float;
        Usage     = ResourceUsage.Staging;      // staging, so the CPU can map it
        CpuAccessFlags    = CpuAccessFlags.Read;
        SampleDescription = SampleDescription(count = 1, quality = 0);
    }
    );
foreach (subresource in [0 .. 2])
{
    // Map each array slice in turn and print what the runtime reports for it.
    def data = cpuTexture.Map(subresource, MapMode.Read, MapFlags.None);
    Console.WriteLine($"subresource $subresource");
    Console.WriteLine($"length = $(data.Data.Length)");
    Console.WriteLine($"pitch  = $(data.Pitch)");
    cpuTexture.Unmap(subresource);
}
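
For what it is worth, reading each slice row by row with the reported pitch sidesteps the question of what Data.Length means. Here is a minimal sketch of that workaround; it assumes the SlimDX-style DataStream used above exposes Position and a generic ReadRange, so treat those two calls as assumptions rather than a confirmed API.

def width  = 16;
def height = 16;
foreach (subresource in [0 .. 2])
{
    def data = cpuTexture.Map(subresource, MapMode.Read, MapFlags.None);
    mutable total = 0.0f;
    foreach (row in [0 .. height - 1])
    {
        // Rows are pitch bytes apart, regardless of how many bytes the mapping reports.
        data.Data.Position = row * data.Pitch;
        foreach (texel in data.Data.ReadRange.[float](width))
        {
            total += texel;
        }
    }
    cpuTexture.Unmap(subresource);
    Console.WriteLine($"subresource $subresource, sum of texels = $total");
}

If all three slices can be walked this way without running past the end of the mapped data, the odd Data.Length values would look more like a reporting quirk than half the slice actually being missing.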
