Since ResolveSubresource cannot be used with depth textures (and throws an error with the debug layer enabled), use a shader which selects the minimum depth value from all samples.
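A minimal sketch of the idea, not Dolphin's actual shader source: a hypothetical HLSL pixel shader (stored Dolphin-style as a C++ string) that resolves an MSAA depth texture by keeping the minimum depth across all samples. The SAMPLES define is assumed to be patched in by the backend.

```cpp
// Hypothetical min-depth resolve shader; reads every MSAA sample of the
// depth texture and writes out the smallest value as the resolved depth.
static const char DEPTH_RESOLVE_PS[] = R"(
Texture2DMS<float, SAMPLES> depth_tex : register(t0);

float main(float4 pos : SV_Position) : SV_Depth
{
    int2 coord = int2(pos.xy);
    float depth = depth_tex.Load(coord, 0);
    for (int i = 1; i < SAMPLES; ++i)
        depth = min(depth, depth_tex.Load(coord, i));
    return depth;
}
)";
```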
Changes the sampler used by XFBEncoder to a linear filter, rather than point, to match GL behavior.
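A minimal sketch of the change, with hypothetical variable names (device, linear_sampler); the point of the fix is the Filter field:

```cpp
#include <d3d11.h>

D3D11_SAMPLER_DESC desc = {};
desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;  // was D3D11_FILTER_MIN_MAG_MIP_POINT
desc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;
desc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
desc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
desc.MaxLOD = D3D11_FLOAT32_MAX;

ID3D11SamplerState* linear_sampler = nullptr;
HRESULT hr = device->CreateSamplerState(&desc, &linear_sampler);
```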
Instead of blindly using the expected width, clamp it to the stride of
the buffer that D3D11 returns. This prevents us from reading invalid
memory at the end of textures.
This doesn't solve the underlying issue of what to do when a game tries
to copy from outside the EFB. On real hardware it returns random noise
(biased toward all ones).
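A minimal sketch of the clamp, assuming hypothetical names (staging_tex, dst, expected_width, bytes_per_pixel, height): the row pitch reported by Map() is the authoritative stride.

```cpp
#include <algorithm>
#include <cstring>
#include <d3d11.h>

D3D11_MAPPED_SUBRESOURCE map;
if (SUCCEEDED(context->Map(staging_tex, 0, D3D11_MAP_READ, 0, &map)))
{
    // Never copy more bytes per row than D3D11 actually allocated.
    const UINT row_bytes = std::min(expected_width * bytes_per_pixel, map.RowPitch);
    for (UINT row = 0; row < height; ++row)
    {
        memcpy(dst + row * expected_width * bytes_per_pixel,
               static_cast<const unsigned char*>(map.pData) + row * map.RowPitch,
               row_bytes);
    }
    context->Unmap(staging_tex, 0);
}
```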
Added a few duplicated depth copy texture formats to the enum in
TextureDecoder.h. These texture formats were already implemented in
TextureCacheBase and the OGL/D3D11 texture cache implementations.
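Illustratively, the additions look something like this; the names and values below are hypothetical, while the _GX_TF_ZTF/_GX_TF_CTF flags mark Z-texture and copy-only formats in the existing header.

```cpp
enum
{
    // ... existing color and depth formats ...
    GX_CTF_Z8M  = 0x9 | _GX_TF_ZTF | _GX_TF_CTF,  // hypothetical: middle 8 bits of Z
    GX_CTF_Z8L  = 0xA | _GX_TF_ZTF | _GX_TF_CTF,  // hypothetical: lowest 8 bits of Z
    GX_CTF_Z16L = 0xC | _GX_TF_ZTF | _GX_TF_CTF,  // hypothetical: lower 16 bits of Z
};
```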
Rendering EFB textures currently crashes the D3D backend when MSAA is enabled, because the depth texture isn't correctly resolved. For example, starting Pokemon Snap with D3D and MSAA enabled triggers the crash.
A texture was still bound when OMSetRenderTargets was called; the state
manager's resource cache must be flushed to unbind it.
This fixes cut scene rendering in The Last Story.
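A minimal sketch with hypothetical names (stateman, context, rtv, dsv); the key point is that clearing the cached binding alone isn't enough, it has to be flushed to the D3D context.

```cpp
stateman->SetTexture(0, nullptr);           // drop the stale SRV binding from the cache
stateman->Apply();                          // flush the resource cache so D3D sees the unbind
context->OMSetRenderTargets(1, &rtv, dsv);  // now safe: no read/write hazard on the texture
```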
With std::string, we don't need to care about passing in a length, since the string stores it internally. So these functions no longer need a length parameter at all.
This also kills off some sprintf_s calls.
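A minimal sketch of the pattern; StringFromFormat is Dolphin's existing printf-style helper in Common/StringUtil.h, while Log and addr are hypothetical.

```cpp
#include "Common/StringUtil.h"

// Before: fixed buffer, explicit length, sprintf_s.
//   char msg[128];
//   sprintf_s(msg, sizeof(msg), "FIFO %08x", addr);
//   Log(msg, sizeof(msg));

// After: std::string knows its own length, so no length parameter is needed.
std::string msg = StringFromFormat("FIFO %08x", addr);
Log(msg);
```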
We need to explicitly round when converting colors from float to uint
because multiplying a normalized float by 255 might not result in a whole
number. (The exact result here may vary depending on your
drivers/hardware.)
Ideally, we shouldn't be using floating point here, but fixing that is a
much more complicated patch.
Fixes the gxtest TEV tests on Intel HD 4000.
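A minimal sketch of the rounding in C++ terms; the real fix is in the generated shader code, but the arithmetic issue is identical.

```cpp
#include <cmath>
#include <cstdint>

// Convert a normalized float channel in [0.0, 1.0] to an 8-bit channel.
uint8_t ColorFloatToU8(float c)
{
    // c * 255.0f can come back as e.g. 127.999992 on some drivers/hardware;
    // plain truncation would then give 127 instead of 128, so round explicitly.
    return static_cast<uint8_t>(std::round(c * 255.0f));
}
```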