* Add the full PMODE register to replace slbg/mmod (see the sketch after this list)
* Add the full EXTBUF register (will allow emulating write feedback)
* Add a third source (which will actually be the destination of the write feedback)
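A sketch of what a full PMODE model could look like (bit layout per the GS manual; the union-of-bitfields shape mirrors how GSdx declares its other GIF registers, so the type name is an assumption):

    #include <cstdint>

    // Full PMODE register instead of just slbg/mmod (name is hypothetical).
    union GSRegPMODE
    {
        uint64_t _u64;
        struct
        {
            uint64_t EN1   : 1; // enable read circuit 1
            uint64_t EN2   : 1; // enable read circuit 2
            uint64_t CRTMD : 3; // CRT mode, always 001
            uint64_t MMOD  : 1; // alpha for blending: circuit 1 alpha or the ALP field
            uint64_t AMOD  : 1; // where circuit 1's alpha goes on output
            uint64_t SLBG  : 1; // blend circuit 1 with circuit 2 or the background color
            uint64_t ALP   : 8; // fixed alpha used when MMOD = 1
            uint64_t       : 48;
        };
    };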
The purpose is to control the filtering used when the final image is displayed on the screen.
It could improve the sharpness of the output in some games (of course, it will be pixelated).
It might also save a couple of fps.
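A minimal sketch of the idea, assuming a plain OpenGL sampler object drives the display pass (helper name is illustrative, and the usual GL headers/loader are assumed):

    // Nearest keeps the output sharp/pixelated; linear smooths the upscale.
    GLuint CreateDisplaySampler(bool nearest)
    {
        GLuint sampler;
        glGenSamplers(1, &sampler);

        const GLint filter = nearest ? GL_NEAREST : GL_LINEAR;
        glSamplerParameteri(sampler, GL_TEXTURE_MIN_FILTER, filter);
        glSamplerParameteri(sampler, GL_TEXTURE_MAG_FILTER, filter);

        return sampler;
    }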
Add a define to test the performance when we keep only the blue channel. It breaks
the code in Prince Of Persia, which uses the red/green channels... Maybe as a
speed hack :( Or find a way to replace all the ifs with a lookup table (see the
sketch below).
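A generic sketch of the if-to-LUT transformation (the real candidate is shader code, but the shape is the same; names are illustrative):

    #include <cstdint>

    // Index a table with the channel id instead of branching per pixel.
    static const int s_channel_shift[3] = { 0, 8, 16 }; // R, G, B offsets in RGBA8

    uint8_t ExtractChannel(uint32_t rgba, int channel /* 0=R, 1=G, 2=B */)
    {
        // Replaces: if (channel == 0) ... else if (channel == 1) ... else ...
        return (rgba >> s_channel_shift[channel]) & 0xFF;
    }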
Note: it is only supported on OpenGL currently
Currently we try to infer the conversion shader from the output format.
That only works as long as the input data is RGBA8, which might not hold in the future.
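A sketch of the inference (the ShaderConvert values are hypothetical, and the usual GL headers are assumed): the output format alone picks the shader, which silently assumes the source is RGBA8.

    enum ShaderConvert { SHADER_COPY, SHADER_RGBA8_TO_16_BITS };

    ShaderConvert SelectConvertShader(GLenum dst_format)
    {
        switch (dst_format)
        {
            case GL_RGBA8: return SHADER_COPY;             // straight copy
            case GL_R16UI: return SHADER_RGBA8_TO_16_BITS; // repack 32 -> 16 bits
            default:       return SHADER_COPY;             // wrong if the input isn't RGBA8
        }
    }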
* Try to use more subroutines in the VS & PS; unfortunately hit a driver crash!
* Call Attach/DetachContext through GSDevice so I can unmap the currently mapped buffer
* Implement the GLSL part of GL_ARB_bindless_texture; again, another driver crash!
* Various fixes for GL_ARB_buffer_storage. A basic benchmark shows an improvement only in the 'cold' case; I guess it will improve smoothness (see the sketch after this list)
* Try to fix GL_ARB_clear_texture, no success so far. The extension seems limited to basic textures (i.e. no depth/stencil)
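A minimal sketch of the persistent-mapping pattern GL_ARB_buffer_storage enables (assumed usage, not the exact GSdx code):

    // Allocate an immutable buffer that can stay mapped while in use.
    const GLsizeiptr size = 8 << 20; // e.g. an 8 MB vertex heap
    const GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;

    GLuint vb;
    glGenBuffers(1, &vb);
    glBindBuffer(GL_ARRAY_BUFFER, vb);
    glBufferStorage(GL_ARRAY_BUFFER, size, nullptr, flags);

    // Map once; the pointer stays valid for the buffer's lifetime, so the
    // per-frame map/unmap (and the stall it can cause) disappears.
    void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);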
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@5752 96395faa-99c1-11dd-bbfe-3dabce05a288
Probably only of interest to testers (and me). Absolutely do NOT select the reference device even out of extreme morbid curiosity. It's not even very good at being a reference despite being slower than you can probably believe.
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@5358 96395faa-99c1-11dd-bbfe-3dabce05a288
- GSWnd is not implemented, no config dialogs either
- no output, just the null device
- threading classes were not tested (my first experience with pthread)
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@4315 96395faa-99c1-11dd-bbfe-3dabce05a288
GSDx: Reset device on save state load.
GSDx: Made GSRenderer::ResetDevice() actually do what its name says, and implemented InvalidateTextureCache(). If anything breaks, it's because of this change.
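A rough sketch of the intended flow (the interface and OnLoadState are assumptions; only ResetDevice()/InvalidateTextureCache() come from the commit):

    // On save state load the GPU-side caches no longer match the restored
    // GS memory, so drop them and rebuild from scratch.
    struct IRenderer
    {
        virtual void ResetDevice() = 0;            // recreate device-side resources
        virtual void InvalidateTextureCache() = 0; // cached textures came from stale vram
    };

    void OnLoadState(IRenderer& r)
    {
        // ... GS registers and local memory are restored first ...
        r.ResetDevice();
        r.InvalidateTextureCache();
    }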
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@3067 96395faa-99c1-11dd-bbfe-3dabce05a288
32-bit depth buffers for D3D9 users if available. Lots of code shuffling for reasons I don't even remember. Stuff. Pretty much just the 32-bit depth buffers. That's good though, you don't have to envy D3D10 users half as much now.
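The availability check presumably looks something like this (a sketch built on the standard D3D9 capability query; D3DFMT_D32 is illustrative, the commit may pick a different format):

    #include <d3d9.h>

    // Ask the adapter whether a 32-bit depth surface is supported,
    // otherwise keep the usual 24-bit format.
    D3DFORMAT PickDepthFormat(IDirect3D9* d3d)
    {
        if (SUCCEEDED(d3d->CheckDeviceFormat(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, D3DFMT_D32)))
        {
            return D3DFMT_D32; // 32-bit depth, no stencil
        }

        return D3DFMT_D24S8;   // common 24-bit depth + 8-bit stencil
    }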
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@3002 96395faa-99c1-11dd-bbfe-3dabce05a288
- trying the dx10.1-only "gather" shader instruction for palettized lookups ("8-bit texture" mode); it saves 4 instructions, which isn't much but still... (not tested, don't have an ati card; see the sketch after this list)
- may fix the intel gma "no output" bug (don't have a gma either :P)
- and the usual small code optimizations
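For context, gather fetches the four texels of a bilinear footprint in one instruction. A CPU-side analogue of the palettized lookup it accelerates (illustrative C++; the real code is an HLSL pixel shader):

    #include <cstdint>

    // What one gather replaces: four separate fetches of 8-bit palette
    // indices from the 2x2 footprint, each resolved through the CLUT.
    // Output order mirrors Gather4: (0,1), (1,1), (1,0), (0,0).
    void GatherPalette(const uint8_t* indices, int pitch, int x, int y,
                       const uint32_t* clut, uint32_t out[4])
    {
        out[0] = clut[indices[(y + 1) * pitch + (x    )]];
        out[1] = clut[indices[(y + 1) * pitch + (x + 1)]];
        out[2] = clut[indices[(y    ) * pitch + (x + 1)]];
        out[3] = clut[indices[(y    ) * pitch + (x    )]];
    }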
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@1549 96395faa-99c1-11dd-bbfe-3dabce05a288
This patch by matsuri makes playing with vsync a lot nicer, so thanks for that one ;)
Note: For now it's DirectX 10 only, since DX9 needs a swap chain reset.
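Why DX10 gets this for free: the sync interval is just a Present() argument, changeable per frame without touching the swap chain (helper name is illustrative):

    #include <dxgi.h>

    // SyncInterval 0 = present immediately, 1 = wait for one vertical blank.
    void PresentFrame(IDXGISwapChain* swapchain, bool vsync)
    {
        swapchain->Present(vsync ? 1 : 0, 0);
    }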
git-svn-id: http://pcsx2.googlecode.com/svn/trunk@1526 96395faa-99c1-11dd-bbfe-3dabce05a288