It's possible (but rare) for a WIA or RVZ file to support
this for some partitions but not all, and for the game and
the blob code to disagree on how large a partition is.
In particular, I wanted to change this in
AudioInterface::Init so that dumped GC audio doesn't need
to have a file split (changing from 32000 Hz to 32029 Hz)
when the emulated software initializes the AI registers.
I've also made the same change to DI's DTK code.
The heuristic was not allocating enough space for Metroid: Other M,
at least when using the default settings. (This didn't break the
file, it just caused some headers to be placed at the end of the
file instead of at the start and wasted a few hundred kilobytes.)
The new hash check catches essentially all desync problems
that VolumeVerifier can catch, so from the user's perspective,
such problems will result in Dolphin refusing to start the
game on netplay rather than actually getting a desync.
This moves the scan thread logic and variables into a separate
ScanThread class. By turning ScanThread instances into members of the
most derived class, this ensures that the scan thread is always
properly stopped when the most derived class is destructed and fixes
a race condition that could cause the scan thread to call virtual
member functions from a derived class whose members have already
been destructed.
A drawback of this approach is that the scan thread must be the last
member variable, so this commit also adds static assertions to ensure
that the assumption stays valid.
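As a rough illustration of this pattern (the class and member names here are made up, not Dolphin's actual ones): destruction runs in reverse declaration order, so keeping the ScanThread as the last member of the most derived class means the thread is stopped before anything it might touch is torn down.

```cpp
#include <cstddef>
#include <thread>

class ScanThread
{
public:
  ~ScanThread() { Stop(); }
  void Start() { m_thread = std::thread([this] { /* scan loop */ }); }
  void Stop()
  {
    if (m_thread.joinable())
      m_thread.join();
  }

private:
  std::thread m_thread;
};

struct UsbHostDevice  // stand-in for the "most derived" class
{
  int m_some_state = 0;
  ScanThread m_scan_thread;  // must stay the last member
};

// One way to encode the "last member" assumption; offsetof on non-trivial
// classes is only conditionally supported, so the actual assertions in
// Dolphin may take a different form.
static_assert(offsetof(UsbHostDevice, m_scan_thread) + sizeof(ScanThread) ==
                  sizeof(UsbHostDevice),
              "the scan thread must be the last member");
```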
The "FindLibLZMA.cmake" module in CMake versions prior to 3.14 do not
set an alias like how Externals/liblzma/CMakeLists.txt does, so builds
performed using one of those older CMake versions will fail if the
system LZMA library is detected. To fix this, we need to link against
"lzma" instead of "LibLZMA::LibLZMA".
Fixes: b59ef81a7e ("WIA: Implement bzip2, LZMA, and LZMA2 decompression")
When I first made VolumeVerifier, I figured that the distinction
between an unsigned ticket and an unsigned TMD was a technical
detail that users would have no reason to care about. However,
while this might be true for discs, it isn't equally true for
WADs, due to the widespread practice of fakesigning tickets to
set the console ID to 0. This practice does not require
fakesigning the TMD (though apparently people do it anyway,
at least sometimes...), and the presence of a correctly signed
TMD is a useful indicator that the contents have not been
tampered with, even if the ticket isn't correctly signed.
FCVT doesn't necessarily round to zero, so the result
might be inaccurate if we use it. To ensure correct
rounding, we use FCVTS from double FPR to 32-bit GPR.
Unfortunately, FCVTS can't do double FPR to single FPR.
This is now unused. It seems to have been an improper fix even back when
it was in use (there would have been a race if saving the screenshot took
longer than 2 seconds).
The intent here is to generate a more compact instruction if a 32-bit
immediate can be zero-extended to the desired 64-bit immediate.
Nowadays the emitter is smart enough to do this for us, so this logic is
redundant.
There's no need to load the 64-bit immediate into a temporary register.
x64 will sign-extend 32-bit immediates to 64 bits, giving us the exact
value we need in this case (see the sketch after the examples below).
Before:
48 C7 C0 00 00 FF FF mov rax,0FFFFFFFFFFFF0000h
48 21 C2 and rdx,rax
After:
48 81 E2 00 00 FF FF and rdx,0FFFFFFFFFFFF0000h
- LEA is a bit silly when the source and the destination are the same. A
simple ADD or SHL will do in those cases.
66 8D 04 45 00 00 00 00 lea ax,[rax*2]
66 03 C0 add ax,ax
48 8D 04 00 lea rax,[rax+rax]
48 03 C0 add rax,rax
66 8D 14 D5 00 00 00 00 lea dx,[rdx*8]
66 C1 E2 03 shl dx,3
- When scaling by 2, consider summing the register with itself instead.
The scaled-index form always needs a 32-bit displacement, so the sum is
more compact.
66 8D 14 45 00 00 00 00 lea dx,[rax*2]
66 8D 14 00 lea dx,[rax+rax]
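Returning to the immediate-operand points above, here is a minimal sketch (not Dolphin's actual emitter code) of the two checks an emitter can use to pick the compact encodings:

```cpp
#include <cstdint>

// The 64-bit value is the zero extension of a 32-bit immediate
// (e.g. `mov r32, imm32` clears the upper half of the destination).
bool FitsInZeroExtended32(uint64_t imm)
{
  return imm <= UINT32_MAX;
}

// The 64-bit value is the sign extension of a 32-bit immediate
// (usable by instructions such as `and r64, imm32`).
bool FitsInSignExtended32(uint64_t imm)
{
  const int64_t simm = static_cast<int64_t>(imm);
  return simm >= INT32_MIN && simm <= INT32_MAX;
}

// Example from the first listing above: 0xFFFFFFFFFFFF0000 is the sign
// extension of 0xFFFF0000, so `and rdx, 0FFFFFFFFFFFF0000h` can be emitted
// directly instead of loading the constant into a scratch register first.
```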
Other than the controller settings and JIT debug settings,
these are the only settings which were defined in Java code
but not defined in the new config system in C++. (There are
still a lot of settings that are defined in the new config
system but not yet saveable in the new config system, though.)
Instead of comparing the game ID, revision, disc number and name,
we can compare a hash of important parts of the disc including
all the aforementioned data as well as additional data such as the
FST. The primary reason why I'm making this change is to let us
catch more desyncs before they happen, but this should also fix
https://bugs.dolphin-emu.org/issues/12115. As a bonus, the UI can
now distinguish the case where a client doesn't have the game at
all from the case where a client has the wrong version of the game.
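As a rough illustration of the idea (this is not the hash function or the exact field set Dolphin uses), folding the identifying fields and the FST into a single digest means netplay only has to compare one value per client:

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <vector>

uint64_t ComputeSyncHash(const std::string& game_id, uint16_t revision,
                         uint8_t disc_number, const std::string& name,
                         const std::vector<uint8_t>& fst_bytes)
{
  // FNV-1a, purely for illustration; byte-order details are ignored.
  uint64_t hash = 0xcbf29ce484222325ull;
  const auto mix = [&hash](const uint8_t* data, size_t size) {
    for (size_t i = 0; i < size; ++i)
      hash = (hash ^ data[i]) * 0x100000001b3ull;
  };
  mix(reinterpret_cast<const uint8_t*>(game_id.data()), game_id.size());
  mix(reinterpret_cast<const uint8_t*>(&revision), sizeof(revision));
  mix(&disc_number, sizeof(disc_number));
  mix(reinterpret_cast<const uint8_t*>(name.data()), name.size());
  mix(fst_bytes.data(), fst_bytes.size());
  return hash;
}
```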
Turns out, GameCube games actually do check DILENGTH, and if DILENGTH is 0, they'll think the transfer completed successfully even if DEINT is used, since, after all, surely that means everything was sent. This caused all sorts of issues, from audio looping when a disc is removed (since the same buffer gets reused) to flat-out crashing instead of showing the disc-removed screen.
In particular:
- Trying to play audio in a non-ready state returns the state-specific error, not an audio buf error
- Audio status cannot be requested in non-ready states
- The audio buffer cannot be configured in states other than ReadyNoReadsMade
- Using the stop motor command while the motor is already stopped doesn't change states
Additionally, the internal state IDs (which distinguish ReadyNoReadsMade and Ready) are used, instead of the state IDs exposed in the request error. This makes some of the weird behavior a bit more obvious.
State and error behavior of the seek command was not implemented in this commit.
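An abstract sketch of the pattern in the list above; the state names follow the commit message, while the error values are placeholders rather than the real DVD error codes:

```cpp
enum class DriveState
{
  ReadyNoReadsMade,
  Ready,
  MotorStopped,
  CoverOpened,
  // ... other states omitted
};

enum class DriveError
{
  None,
  StateError,        // placeholder for the state-specific error
  AudioBufferError,  // placeholder for the "audio buf" error
};

DriveError OnPlayAudioCommand(DriveState state, bool audio_buffer_configured)
{
  // Trying to play audio in a non-ready state reports that state's error,
  // not an audio buffer error.
  if (state != DriveState::ReadyNoReadsMade && state != DriveState::Ready)
    return DriveError::StateError;
  if (!audio_buffer_configured)
    return DriveError::AudioBufferError;
  return DriveError::None;
}
```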
It is my opinion that nobody should use NKit disc images without
being aware of their drawbacks. Since it seems like almost
nobody who is using NKit disc images knows what NKit is (hmm, now
how could that have happened...?), I am adding a warning to Dolphin
so that you can't run NKit disc images without finding out about the
drawbacks. In case someone really does want to use NKit disc images,
the warning has a "Don't show this again" option. Unfortunately, I
can't retroactively add the warning where it's most needed:
in Dolphin 5.0, which does not support Wii NKit disc images.
Pretty much the same optimization we did for AVX, although slightly more
constrained because we're stuck with the two-operand instruction where
destination and source have to match.
We could also specialize the case where registers b, c, and d are all
distinct, but I decided against it since I couldn't find any game that
does this.
Before:
66 0F 57 C0 xorpd xmm0,xmm0
66 41 0F C2 C1 06 cmpnlepd xmm0,xmm9
41 0F 28 CE movaps xmm1,xmm14
66 41 0F 38 15 CC blendvpd xmm1,xmm12,xmm0
44 0F 28 F1 movaps xmm14,xmm1
After:
66 0F 57 C0 xorpd xmm0,xmm0
66 41 0F C2 C1 06 cmpnlepd xmm0,xmm9
66 45 0F 38 15 F4 blendvpd xmm14,xmm12,xmm0
AVX has a four-operand VBLENDVPD instruction, which allows for the first
input and the destination to be different. By taking advantage of this,
we no longer need to copy one of the inputs around and we can just
reference it directly, provided it's already in a register (I have yet
to see this not be the case).
Before:
66 0F 57 C0 xorpd xmm0,xmm0
F2 41 0F C2 C6 06 cmpnlesd xmm0,xmm14
41 0F 28 CE movaps xmm1,xmm14
66 41 0F 38 15 CA blendvpd xmm1,xmm10,xmm0
F2 44 0F 10 F1 movsd xmm14,xmm1
After:
66 0F 57 C0 xorpd xmm0,xmm0
F2 41 0F C2 C6 06 cmpnlesd xmm0,xmm14
C4 C3 09 4B CA 00 vblendvpd xmm1,xmm14,xmm10,xmm0
F2 44 0F 10 F1 movsd xmm14,xmm1
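For reference, here is a small intrinsics illustration of what the blendvpd in these listings computes; Dolphin's JIT emits the instructions directly, so this only shows the semantics:

```cpp
#include <smmintrin.h>  // SSE4.1: _mm_blendv_pd

// Per 64-bit lane: take b where c is negative (or NaN), otherwise keep a.
__m128d SelectWhereNegative(__m128d c, __m128d a, __m128d b)
{
  // mask lane = all ones where 0.0 is *not* <= c (the cmpnlepd step above)
  const __m128d mask = _mm_cmpnle_pd(_mm_setzero_pd(), c);
  // take b where the mask's sign bit is set, otherwise keep a (the blendvpd step)
  return _mm_blendv_pd(a, b, mask);
}
```

The C version hides the actual point of the two commits: SSE4.1's two-operand BLENDVPD forces the destination to alias one input (hence the extra movaps in the "Before" listings), while AVX's four-operand VBLENDVPD does not.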
Fixes a critical regression where 95945a0 made us unable to
start emulation on Android 10 and newer. Android is restricting
direct access to /dev/ashmem starting with the new SDK version,
but we can use the new (and simpler) ASharedMemory API instead.
We have to keep using the /dev/ashmem approach on old versions
of Android, though.
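A rough sketch of that runtime dispatch (Dolphin's actual implementation may differ); ASharedMemory_create is the NDK API referred to above, available since API level 26:

```cpp
#include <dlfcn.h>
#include <cstddef>

using ASharedMemory_create_t = int (*)(const char* name, size_t size);

// Returns a shared memory file descriptor, or -1 on failure.
int CreateSharedMemoryFd(const char* name, size_t size)
{
  // ASharedMemory_create is exported by libandroid.so on API 26+; looking it
  // up at runtime lets the same binary keep working on older releases.
  static const auto create = reinterpret_cast<ASharedMemory_create_t>(
      dlsym(RTLD_DEFAULT, "ASharedMemory_create"));
  if (create)
    return create(name, size);  // modern path, no direct /dev/ashmem access

  // On older Android versions, fall back to the classic open("/dev/ashmem")
  // plus ASHMEM_SET_SIZE ioctl approach (omitted here for brevity).
  return -1;
}
```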
On Linux, if shared zlib is present, zlib.h is always available and -lz
links to zlib, even if you don't run find_package(ZLIB).
For some reason I have zlib installed on Windows (possibly from vcpkg),
so find_package(ZLIB) succeeds and ZLIB_FOUND is true.
When Dolphin uses shared zlib on Windows, the problem is that zlib.h
is not in the default include path; the CMake target is called
ZLIB::ZLIB, and there's neither a target nor a library called z.
However, both find_package(ZLIB) and add_subdirectory(Externals/zlib)
create a target called ZLIB::ZLIB, so I'll switch to that instead.
Hopefully this change doesn't break anyone's build.
This modifies GCMemcard::TitlePresent() to match my findings of how the GC BIOS and various games behave when you alter the fields in the directory entry.
It looks like for a save to be recognized by a game, the following have to be true:
- Game code and maker code must exactly match what the game expects.
- Filename is only checked up to the first null byte. All bytes afterwards can be whatever.
The BIOS itself does a full compare of the filename when checking for whether it should allow copying a file from one card to another, but behaves oddly in some cases when there's non-null bytes after the first null. See the big comment in `HasSameIdentity()` for details.
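A simplified sketch of the matching rule described above; the structure is an illustrative subset of a directory entry, not the real GCMemcard types:

```cpp
#include <array>
#include <cstddef>
#include <cstdint>

struct DirectoryEntry  // illustrative subset of a memory card directory entry
{
  std::array<uint8_t, 4> gamecode;
  std::array<uint8_t, 2> makercode;
  std::array<char, 32> filename;
};

bool HasSameIdentity(const DirectoryEntry& a, const DirectoryEntry& b)
{
  // Game code and maker code must match exactly.
  if (a.gamecode != b.gamecode || a.makercode != b.makercode)
    return false;

  // The filename only matters up to the first null byte; whatever follows
  // that byte is ignored, matching how games recognize their own saves.
  for (size_t i = 0; i < a.filename.size(); ++i)
  {
    if (a.filename[i] != b.filename[i])
      return false;
    if (a.filename[i] == '\0')
      return true;
  }
  return true;
}
```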
This could cause read errors if chunks were laid out a certain
way in the file and the whole chunk wasn't being read at once.
Should fix https://bugs.dolphin-emu.org/issues/12184.
I believe the value returned by value() resets when we call
setValue() with the maximum (due to auto-reset). I have been
unable to test this because I can't reproduce the issue, which is
described at https://bugs.dolphin-emu.org/issues/12158#note-9.
The functions with "UTF" in the name use "modified UTF-8" rather
than the standard UTF-8 which Dolphin uses, at least according
to Oracle's documentation, so it is incorrect for us to use them.
This change fixes the problem by converting between UTF-8 and
UTF-16 manually instead of letting JNI do it for us.
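A minimal sketch of that manual conversion; the helper and function names here are illustrative, and UTF16ToUTF8/UTF8ToUTF16 stand in for Dolphin's own string conversion utilities:

```cpp
#include <jni.h>
#include <string>

// Assumed conversion helpers between standard UTF-8 and UTF-16
// (declarations only; Dolphin has its own implementations).
std::string UTF16ToUTF8(const std::u16string& str);
std::u16string UTF8ToUTF16(const std::string& str);

std::string GetJString(JNIEnv* env, jstring str)
{
  // GetStringChars hands back UTF-16 code units, so we avoid JNI's
  // "modified UTF-8" entirely and convert to standard UTF-8 ourselves.
  const jchar* chars = env->GetStringChars(str, nullptr);
  const jsize length = env->GetStringLength(str);
  const std::u16string utf16(reinterpret_cast<const char16_t*>(chars), length);
  env->ReleaseStringChars(str, chars);
  return UTF16ToUTF8(utf16);
}

jstring ToJString(JNIEnv* env, const std::string& str)
{
  const std::u16string utf16 = UTF8ToUTF16(str);
  return env->NewString(reinterpret_cast<const jchar*>(utf16.data()),
                        static_cast<jsize>(utf16.size()));
}
```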
This function does *not* always convert from UTF-16. It converts
from UTF-16 on Windows and UTF-32 on other operating systems.
Also renaming UTF8ToUTF16 for consistency, even though it
technically doesn't have the same problem, since it was only
implemented on Windows.
As a side effect of 9c5c3c0, Dolphin's frame counter was changed
to run at 60/50 Hz even if the game is running at a lower framerate
such as 30 fps. Since the TAS input turbo button functionality
toggled the state of a button every other frame as reported by
the frame counter, this change made the turbo button functionality
not work with 30/25 fps games.
I believe it would be hard to change the frame counter back to
how it used to work without undermining the point of 9c5c3c0,
and I'm not sure if doing so would be desirable or not anyway,
so what I'm doing instead is letting the user determine how long
turbo button presses should last. This lets users avoid the 30/25
fps game problem while also granting additional flexibility.
Perhaps there is some game where it is useful to mash at a speed
which is slower than frame perfect.
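A trivial sketch of the configurable timing (the parameter names are made up for illustration); with the press and release durations expressed in frames, turbo no longer assumes the game advances the frame counter at the full 60/50 Hz rate:

```cpp
#include <cstdint>

bool IsTurboPressed(uint64_t frame_count, uint32_t frames_pressed, uint32_t frames_released)
{
  const uint64_t period = static_cast<uint64_t>(frames_pressed) + frames_released;
  if (period == 0)
    return false;
  // Pressed for the first `frames_pressed` frames of every period, released for the rest.
  return (frame_count % period) < frames_pressed;
}
```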
Also added an IsRunning function, as it was impossible to know whether it had been started or not (I will use it in later PRs, but it should be there anyway).
Currently, the touch controller overlay uses a square gate for
sticks. This commit changes that so that it instead uses the
stick gate configured in the INI, which ensures that the values
sent to the core are appropriately scaled regardless of what
is configured in the INI and makes the overlay look nicer
if the INI is set to a stick gate that matches the graphics.
While manually capturing constexpr variables used in lambda
expressions does work, it's really easy to forget to do so since
we don't have a Windows CMake builder and the workaround isn't
necessary anywhere else. Fortunately, MSVC has a flag that fixes
the constexpr capture behavior, so let's use that instead.
Fixes https://bugs.dolphin-emu.org/issues/12066.
I must've only tested the frame counter with an earlier version
of the PR that broke this, not the final version...
HostIsInstructionRAMAddress uses XCheckTLBFlag::OpcodeNoException,
so we should also use XCheckTLBFlag::OpcodeNoException when reading,
to ensure that we use the IBAT (as opposed to the DBAT) for both.
Doesn't support triggering interrupts when the thermal threshold is
exceeded, but allows polling for temperature information.
The THRM[123] registers are documented in most PPC datasheets; see e.g.
this PPC750CX one: http://datasheets.chipdb.org/IBM/PowerPC/750/750cx_um3-17-05.pdf
PURGE isn't especially useful, yet it requires some annoying
special handling in the file format. If you want no compression,
use NONE. If you want fast compression, use Zstandard.
Gets rid of the need to seek to the end of the file
when opening a file.
The downside of this is that we waste a little space,
since we can't know in advance exactly how much
space the compressed parts of the headers will need.
This is useful for the way Dolphin scrubs Wii discs.
The encrypted data is what gets zeroed out, but this
zeroed out data then gets decrypted before being stored,
and the resulting data does not compress well.
However, each block of decrypted scrubbed data is
identical given the same encryption key, and there's
nothing stopping us from making multiple group entries
point to the same offset in the file, so we only have
to store one copy of this data per partition.
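A condensed sketch of that reuse (the names and structures are illustrative, not the actual WIA/RVZ writer code): identical decrypted scrubbed blocks produce the same key, so their group entries can point at one shared offset instead of storing the data again.

```cpp
#include <cstdint>
#include <map>
#include <vector>

struct GroupEntry
{
  uint64_t data_offset;  // where this group's data lives in the output file
};

// Simple FNV-1a, purely for illustration. A real implementation would also
// compare the actual data to guard against hash collisions.
uint64_t HashBytes(const std::vector<uint8_t>& data)
{
  uint64_t hash = 0xcbf29ce484222325ull;
  for (uint8_t byte : data)
    hash = (hash ^ byte) * 0x100000001b3ull;
  return hash;
}

GroupEntry StoreGroup(const std::vector<uint8_t>& decrypted_data,
                      std::map<uint64_t, uint64_t>* already_stored,  // hash -> offset
                      uint64_t* end_of_data)
{
  const uint64_t hash = HashBytes(decrypted_data);
  if (const auto it = already_stored->find(hash); it != already_stored->end())
    return {it->second};  // reuse the copy that's already in the file

  const uint64_t offset = *end_of_data;
  // ... write decrypted_data to the file at `offset` ...
  *end_of_data += decrypted_data.size();
  already_stored->emplace(hash, offset);
  return {offset};
}
```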
For reference, wit zeroes out the decrypted data,
but Dolphin's WIA writer can't do this because it currently
doesn't know which parts of the disc are scrubbed.
This is also useful for things such as storing Datel discs
full of 0x55 blocks (representing unreadable blocks)
without compression enabled.
Fixes https://bugs.dolphin-emu.org/issues/10654.
To quote the documentation file included with the program tgctogcm:
"TGC's are miniaturized .gcm images with a 32kB header.
The embedded gcm contains some bogus data, namely:
-FST Location (0x424 in gcm)
-DOL Location (0x420 in gcm)
-FST File offsets (all files are offset/spoofed by a certain amount)"
Dolphin has been handling the values at 0x420 and 0x424 by simply
overwriting them with a working value (just like tgctogcm does),
but it has used a different approach for the file offsets in the FST.
Instead of changing the offsets that are stored in the FST, Dolphin
changed where the files actually are placed on the virtual disc.
My hope was that this would make the loading times more accurate to
how they are when running a TGC file as part of a larger disc.
However, there are TGC files where we would need to move files
backwards on the disc in order to do this (this is what issue
10654 is about), so the approach we have been using is flawed.
This change makes Dolphin overwrite offsets in the FST instead, like
tgctogcm does. Other than making Dolphin handle the affected TGC files
correctly, this change also makes it so that unnecessary padding data
isn't written if you use Dolphin to convert a TGC file to an ISO file.
This feature is not actually implemented in Dolphin as of now, but I'm
planning to add it in the near future as part of a larger feature.
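A simplified sketch of the new approach (field names are illustrative): instead of relocating file data on the virtual disc, each file entry's offset in the FST is rewritten by the correction derived from the TGC header.

```cpp
#include <cstdint>
#include <vector>

struct FstEntry  // illustrative subset of an FST entry
{
  bool is_directory;
  uint32_t file_offset;  // where the file's data is located on the disc
  uint32_t file_size;
};

// offset_correction is the difference between where the file data really is in
// the TGC and where the embedded GCM's FST claims it is, derived from the TGC
// header (the "offset/spoofed by a certain amount" in the quote above).
void CorrectFstOffsets(std::vector<FstEntry>* fst, int64_t offset_correction)
{
  for (FstEntry& entry : *fst)
  {
    if (!entry.is_directory)
      entry.file_offset = static_cast<uint32_t>(entry.file_offset + offset_correction);
  }
}
```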