//bsnes/higan/ws/system/system.hpp
struct System : IO {
  enum class Model : uint { WonderSwan, WonderSwanColor, SwanCrystal };
  auto loaded() const -> bool { return _loaded; }
  auto model() const -> Model { return _model; }
  auto color() const -> bool { return r.color; }
  auto planar() const -> bool { return r.format == 0; }
  auto packed() const -> bool { return r.format == 1; }
  auto depth() const -> bool { return r.color && r.depth == 1; }

  auto init() -> void;
  auto term() -> void;
  auto load(Emulator::Interface*, Model) -> bool;
  auto save() -> void;
  auto unload() -> void;
  auto power() -> void;
  auto run() -> void;
  auto runToSave() -> void;
  auto pollKeypad() -> void;
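
  //Lifecycle sketch (illustrative only; "emulatorInterface" and "running" are
  //hypothetical frontend-side names, and the real call order is driven by the
  //frontend through Emulator::Interface):
  //
  //  system.load(&emulatorInterface, System::Model::WonderSwanColor);
  //  system.power();
  //  while(running) system.run();   //emulate up to the next synchronization point
  //  system.save();                 //flush EEPROM / save data
  //  system.unload();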

  //io.cpp
  auto portRead(uint16 addr) -> uint8 override;
  auto portWrite(uint16 addr, uint8 data) -> void override;

  //video.cpp
  auto configureVideoPalette() -> void;
  auto configureVideoEffects() -> void;

  //serialization.cpp
  auto serializeInit() -> void;
  auto serialize() -> serializer;
  auto unserialize(serializer&) -> bool;
  auto serializeAll(serializer&) -> void;
  auto serialize(serializer&) -> void;
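
  //Save-state sketch (illustrative; assumes unserialize() returns false for states
  //written by an incompatible serializer version):
  //
  //  serializer s = system.serialize();   //capture the complete system state
  //  ...
  //  if(!system.unserialize(s)) {
  //    //incompatible or corrupt state: handle the failure here
  //  }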

  struct Information {
    string manifest;
  } information;

  EEPROM eeprom;
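
  //Keypad state sampled by pollKeypad(): Y1-Y4 and X1-X4 are the WonderSwan's two
  //four-button clusters, alongside B, A, and Start; "rotate" is the emulator's
  //screen-rotation binding rather than a physical button.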
  struct Keypad {
    bool y1, y2, y3, y4;
    bool x1, x2, x3, x4;
    bool b, a, start;
    bool rotate;
  } keypad;

private:
  Emulator::Interface* interface = nullptr;

  struct Registers {
    //$0060 DISP_MODE
    uint5 unknown;
    uint1 format;
    uint1 depth;
    uint1 color;
  } r;
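
  //Assumed DISP_MODE ($0060) bit order, inferred from the field sizes above; the
  //actual decode lives in portWrite() (io.cpp):
  //  bits 0-4  unknown
  //  bit  5    format  (0 = planar, 1 = packed)   -> planar()/packed()
  //  bit  6    depth   (tile bit-depth select)    -> depth(), honored only in color mode
  //  bit  7    color   (color-mode enable)        -> color()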

  bool _loaded = false;
  Model _model = Model::WonderSwan;
  uint _serializeSize = 0;
};

extern System system;

auto Model::WonderSwan() -> bool { return system.model() == System::Model::WonderSwan; }
auto Model::WonderSwanColor() -> bool { return system.model() == System::Model::WonderSwanColor; }
auto Model::SwanCrystal() -> bool { return system.model() == System::Model::SwanCrystal; }
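
//Usage sketch (illustrative only; hypothetical caller code, not part of the original
//header): code elsewhere in the core can branch on the running model or DISP_MODE state:
//
//  if(Model::WonderSwanColor() || Model::SwanCrystal()) {
//    //color-capable hardware
//    if(system.depth()) { /* deeper tile bit-depth selected */ }
//  }
//  if(system.color() && system.packed()) {
//    //packed pixel format while in color mode
//  }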