+4 dBu line inputs can handle a hotter signal and have more headroom. They are found on high-end audio gear and, increasingly, on modern consumer devices. -10 dBV line inputs are the standard on consumer devices.

For many, the question is what setting to use on a device’s line output, and that depends on what the destination line input is. The best case is to match them, and many devices have a switch to go between +4 dBu and -10 dBV.

If you must use hardware that doesn’t match, then try to match the gain:

If you have a device with a +4 dBu output going into a -10 dBV line input
Turn the gain down on the output device, or in software that controls the output, until the input device isn’t giving any indication of clipping (and your ears aren’t telling you it is clipping). Alternatively, if your -10 dBV mixer/interface/etc. has gain controls, you can try lowering the gain there. One solution could sound better than the other depending on the specific devices you are using, and the best solution might be a little attenuation on both sides. The key problem to mitigate here is preventing the input device from being overloaded and sounding bad.
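To put a rough number on “turn the gain down”: the two nominal levels differ by about 11.8 dB, which follows from the standard reference voltages (0.775 V for dBu, 1 V for dBV) given in the Nitty Gritty section below. A minimal back-of-envelope sketch in Python, using nothing device-specific:

    import math

    DBU_REF = 0.775   # 0 dBu reference voltage, volts RMS
    DBV_REF = 1.0     # 0 dBV reference voltage, volts RMS

    v_out = DBU_REF * 10 ** (4 / 20)     # +4 dBu nominal output, ~1.228 V
    v_in  = DBV_REF * 10 ** (-10 / 20)   # -10 dBV nominal input, ~0.316 V

    # How far the hot output sits above the input's nominal level,
    # i.e. roughly how much attenuation you would want to dial in.
    print(round(20 * math.log10(v_out / v_in), 2))   # ~11.79 dB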

If you have a device with a -10 dBV output going into a +4 dBu line input
The problem here is a raised noise floor, because the output sits significantly below the input’s optimal line level: you will receive a quiet signal. If the destination hardware has gain controls you can turn the gain up. This is the higher-fidelity problem to have compared with a signal that is too hot.

The reason fancier gear has inputs that can handle a voltage of +4 dBu is that the hotter the signal, the further it sits above the noise floor. If a device has a switch, the same principle most likely applies: the -10 dBV setting is usually an attenuation, which brings the signal closer to the noise floor. Switches in some hardware might instead use gain to reach +4 dBu, but that is both more expensive and more complicated circuitry (introducing another amp), and the input device may have better-sounding gain anyway.
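To illustrate the principle with made-up numbers: assume a purely hypothetical input stage with a fixed noise floor of -100 dBV. A signal arriving at +4 dBu nominal sits about 11.8 dB further above that floor than the same signal padded down to -10 dBV nominal:

    import math

    NOISE_FLOOR_DBV = -100.0   # hypothetical fixed input-stage noise floor, in dBV

    def volts_to_dbv(volts):
        return 20 * math.log10(volts / 1.0)   # dBV is referenced to 1 V RMS

    v_plus4   = 0.775 * 10 ** (4 / 20)    # +4 dBu nominal, ~1.228 V
    v_minus10 = 1.0 * 10 ** (-10 / 20)    # -10 dBV nominal, ~0.316 V

    snr_plus4   = volts_to_dbv(v_plus4) - NOISE_FLOOR_DBV     # ~101.8 dB above the floor
    snr_minus10 = volts_to_dbv(v_minus10) - NOISE_FLOOR_DBV   # ~90.0 dB above the floor
    print(round(snr_plus4 - snr_minus10, 1))   # padding down costs ~11.8 dB of signal-to-noise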

Conversely, -10 dBV is more for consumer/budget electronics because they may not be able to handle a hotter signal.

The Nitty Gritty

+4 dBu is 1.23 volts
0 dBu is 0.775 volts
-10 dBV is 0.3162 volts
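Those figures come straight from the definitions: dBu is referenced to 0.775 volts RMS, dBV to 1 volt RMS, and volts = reference × 10^(dB/20). A quick sanity check in Python:

    def dbu_to_volts(level_dbu):
        return 0.775 * 10 ** (level_dbu / 20)   # dBu is referenced to 0.775 V RMS

    def dbv_to_volts(level_dbv):
        return 1.0 * 10 ** (level_dbv / 20)     # dBV is referenced to 1 V RMS

    print(dbu_to_volts(4))     # ~1.228
    print(dbu_to_volts(0))     # 0.775
    print(dbv_to_volts(-10))   # ~0.3162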

+4 dBu is used as a reference because, historically (from way back in the tube days), 1.23 volts was far enough above the plate noise (about 20 dB for many tubes) to give a good signal-to-noise ratio while still leaving enough room above for peaks and generally normal audio operation.