//bsnes/higan/target-higan/program/program.hpp

struct Program : Emulator::Platform {
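  //frontend controller for the higan user interface: it owns program flow, game loading,
  //save states, driver setup, and status reporting, and implements the Emulator::Platform
  //callbacks through which the emulation cores communicate with the UI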
  //program.cpp
  Program(vector<string> arguments);
  auto main() -> void;
  auto quit() -> void;
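  //note: main() appears to be the per-iteration frontend callback driven by hiro's
  //Application event loop, with quit() handling teardown (inferred from how program.cpp
  //is typically wired, not shown in this header)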
  //platform.cpp
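  //Emulator::Platform overrides: the loaded emulation core calls back into these to resolve
  //file paths, open and load game resources, hand off video frames and audio samples, and
  //poll or rumble input devices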
  auto path(uint id) -> string override;
  auto open(uint id, string name, vfs::file::mode mode, bool required) -> vfs::shared::file override;
  auto load(uint id, string name, string type, vector<string> options = {}) -> Emulator::Platform::Load override;
  auto videoRefresh(uint display, const uint32* data, uint pitch, uint width, uint height) -> void override;
  auto audioSample(const double* samples, uint channels) -> void override;
  auto inputPoll(uint port, uint device, uint input) -> int16 override;
  auto inputRumble(uint port, uint device, uint input, bool enable) -> void override;
  auto dipSettings(Markup::Node node) -> uint override;
  auto notify(string text) -> void override;
  //game.cpp
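  //presumably load() selects the next game (from gameQueue or a file dialog), while
  //load(Emulator::Interface&) attaches it to a specific core; unload() releases it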
  auto load() -> void;
  auto load(Emulator::Interface& interface) -> void;
  auto unload() -> void;
  //state.cpp
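  //the managed flag distinguishes quick-state slots (hotkeys, managed = false) from states
  //tracked by the state manager (managed = true)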
  auto stateName(uint slot, bool managed = false) -> string;
  auto loadState(uint slot, bool managed = false) -> bool;
  auto saveState(uint slot, bool managed = false) -> bool;
  //utility.cpp
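  //the initialize*Driver() helpers (re)create the ruby video/audio/input drivers from the
  //current settings; the audio path validates driver, device, frequency, and latency,
  //falling back to safe defaults (or "None") when validation fails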
  auto initializeVideoDriver() -> void;
  auto initializeAudioDriver() -> void;
  auto initializeInputDriver() -> void;
  auto softReset() -> void;
  auto powerCycle() -> void;
  auto togglePause() -> void;
  auto rotateDisplay() -> void;
  auto showMessage(const string& text) -> void;
  auto updateStatusText() -> void;
  auto updateVideoPalette() -> void;
  auto updateVideoShader() -> void;
  auto updateAudioDriver() -> void;
  auto updateAudioEffects() -> void;
  auto focused() -> bool;
  bool hasQuit = false;
  bool pause = false;
  vector<Emulator::Interface*> emulators;
  vector<string> gameQueue; //for command-line and drag-and-drop loading
  vector<string> gamePaths; //for keeping track of loaded folder locations
  time_t autoSaveTime = 0; //for automatically saving RAM periodically
  string statusText;
  string statusMessage;
  time_t statusTime = 0;
};
extern unique_pointer<Program> program;
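//Minimal usage sketch (hypothetical, for illustration only): the real entry point lives in
//higan's target code and wires Program::main() into hiro's Application loop, so the exact
//flow may differ.
//
//  program = new Program(arguments);          //construct the frontend, drivers, and emulator list
//  while(!program->hasQuit) program->main();  //main() runs one frontend iteration per call;
//                                             //quit() is invoked from the UI and sets hasQuit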