bsnes/higan/sfc/coprocessor/icd2/interface.cpp

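//Emulator::Interface callbacks for the ICD2, the bridge chip inside the
//Super Game Boy cartridge: it forwards the Game Boy core's video, audio,
//joypad, and command-packet traffic to the SNES host.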

auto ICD2::lcdScanline() -> void {
  if(GameBoy::ppu.status.ly > 143) return;  //Vblank
  if((GameBoy::ppu.status.ly & 7) == 0) {
    //advance to the next of four row buffers every eight scanlines
    writeBank = (writeBank + 1) & 3;
    writeAddress = 0;
  }
}
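
//each pixel written by the Game Boy PPU is repacked into the Game Boy's
//planar 2bpp tile format: 16 bytes per 8x8 tile, two bytes per tile row
//(low bitplane first, then high). worked example for bank 1, x=20, y=3:
//  addr = 1 * 512 + 3 * 2 + 20 / 8 * 16 = 512 + 6 + 32 = 550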
auto ICD2::lcdOutput(uint2 color) -> void {
  uint y = writeAddress / 160;
  uint x = writeAddress % 160;
  uint addr = writeBank * 512 + y * 2 + x / 8 * 16;
  output[addr + 0] = (output[addr + 0] << 1) | (bool)(color & 1);  //low bitplane
  output[addr + 1] = (output[addr + 1] << 1) | (bool)(color & 2);  //high bitplane
  writeAddress = (writeAddress + 1) % 1280;  //160x8 pixels per bank
}
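
//Super Game Boy command packets are bit-banged over the joypad select lines
//(JOYP writes on the Game Boy side):
//  p15:0 p14:0 = reset pulse (begins a new packet)
//  p15:1 p14:0 = data bit 0
//  p15:0 p14:1 = data bit 1
//  p15:1 p14:1 = idle; also steps joypID when polling multiple joypads
//a packet is 16 bytes (128 bits, sent LSB first) followed by one stop bit (0).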
auto ICD2::joypWrite(bool p15, bool p14) -> void {
  //joypad handling
  if(p15 == 1 && p14 == 1) {
    if(joyp15Lock == 0 && joyp14Lock == 0) {
      joyp15Lock = 1;
      joyp14Lock = 1;
      //advance once per strobe: both lines must have gone low since the last advance
      joypID = (joypID + 1) & 3;
    }
  }

  if(p15 == 0 && p14 == 1) joyp15Lock = 0;
  if(p15 == 1 && p14 == 0) joyp14Lock = 0;

  //packet handling
  if(p15 == 0 && p14 == 0) {  //pulse: reset the receiver state
    pulseLock = false;
    packetOffset = 0;
    bitOffset = 0;
    strobeLock = true;
    packetLock = false;
    return;
  }

  if(pulseLock) return;

  if(p15 == 1 && p14 == 1) {
    strobeLock = false;
    return;
  }

  if(strobeLock) {
    if(p15 == 1 || p14 == 1) {  //malformed packet
      packetLock = false;
      pulseLock = true;
      bitOffset = 0;
      packetOffset = 0;
    } else {
      return;
    }
  }

  //p15:1, p14:0 = 0
  //p15:0, p14:1 = 1
  bool bit = (p15 == 0);
  strobeLock = true;

  if(packetLock) {
    if(p15 == 1 && p14 == 0) {  //stop bit: commit the completed packet
      if((joypPacket[0] >> 3) == 0x11) {  //MLT_REQ
        mltReq = joypPacket[1] & 3;
        if(mltReq == 2) mltReq = 3;
        joypID = 0;
      }
      if(packetSize < 64) packet[packetSize++] = joypPacket;
      packetLock = false;
      pulseLock = true;
    }
    return;
  }

  bitData = (bit << 7) | (bitData >> 1);  //bits arrive LSB first
  if(++bitOffset < 8) return;

  bitOffset = 0;
  joypPacket[packetOffset] = bitData;
  if(++packetOffset < 16) return;

  packetLock = true;  //16 bytes received; wait for the stop bit
}
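
//a well-formed transfer, as this state machine accepts it, looks like:
//  joypWrite(0, 0);             //reset pulse: begin a new packet
//  //for each of the 128 data bits, LSB first:
//  joypWrite(1, 1);             //release strobe
//  joypWrite(!bit, bit);        //p15 low = 1, p14 low = 0
//  //...
//  joypWrite(1, 1);
//  joypWrite(1, 0);             //stop bit: the packet is committed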
auto ICD2::load(uint id, string name, string type, bool required) -> void {
}
/*
auto ICD2::loadRequest(uint id, string name, bool required) -> void {
  if(id == GameBoy::ID::SystemManifest) {
    interface->loadRequest(ID::SuperGameBoyManifest, name, required);
  }
  if(id == GameBoy::ID::SuperGameBoyBootROM) {
    interface->loadRequest(ID::SuperGameBoyBootROM, name, required);
  }
  if(id == GameBoy::ID::Manifest) {
    interface->loadRequest(ID::GameBoyManifest, name, required);
  }
  if(id == GameBoy::ID::ROM) {
    interface->loadRequest(ID::GameBoyROM, name, required);
  }
  if(id == GameBoy::ID::RAM) {
    interface->loadRequest(ID::GameBoyRAM, name, required);
  }
}

auto ICD2::saveRequest(uint id, string name) -> void {
  if(id == GameBoy::ID::RAM) {
    interface->saveRequest(ID::GameBoyRAM, name);
  }
}
*/
auto ICD2::videoRefresh(const uint32* data, uint pitch, uint width, uint height) -> void {
  //unused: the SNES side consumes video per-pixel through lcdOutput() above
}
auto ICD2::audioSample(const double* samples, uint channels) -> void {
  stream->write(samples);
}
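
//MLT_REQ (packet command 0x11) selects how many joypads are polled: mltReq
//is applied as a mask over joypID, so 0 = one pad, 1 = two pads, 3 = four.
//r6004-r6007 hold the SNES-side state of joypads 1-4 (active low, hence ~).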
auto ICD2::inputPoll(uint port, uint device, uint id) -> int16 {
  //tell the Game Boy core which joypad is currently selected
  GameBoy::cpu.status.mlt_req = joypID & mltReq;

  uint data = 0x00;
  switch(joypID & mltReq) {
  case 0: data = ~r6004; break;  //joypad 1
  case 1: data = ~r6005; break;  //joypad 2
  case 2: data = ~r6006; break;  //joypad 3
  case 3: data = ~r6007; break;  //joypad 4
  }

  switch((GameBoy::Input)id) {
  case GameBoy::Input::Start:  return (bool)(data & 0x80);
  case GameBoy::Input::Select: return (bool)(data & 0x40);
  case GameBoy::Input::B:      return (bool)(data & 0x20);
  case GameBoy::Input::A:      return (bool)(data & 0x10);
  case GameBoy::Input::Down:   return (bool)(data & 0x08);
  case GameBoy::Input::Up:     return (bool)(data & 0x04);
  case GameBoy::Input::Left:   return (bool)(data & 0x02);
  case GameBoy::Input::Right:  return (bool)(data & 0x01);
  }

  return 0;
}