Let the party continue!
This is a classic example of low-resolution digital--the reverb tails and quieter note decays (aka the "lingering notes or delay") sink below what the bottom bits can resolve, and without proper dither they come back as quantization distortion. I first realized how bad CD sound could be when I listened to the first A&M CD release of Synchronicity (The Police) over headphones, way back in the mid '80s. When "Tea In The Sahara" faded out, the last half of the ending dissolved into fuzzy distortion--there simply isn't enough resolution left at those levels to carry it. That is part of the reason I try to buy music in hi-res these days (and also why I had to spend a small fortune on a DAC that makes CD-resolution digital listenable to me)..... The digital tape recorder had a tendency to cut off all sounds very sharply, leaving little in the way of lingering notes or delay. This gave the recordings a very dry and cold sound ....
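To put a rough number on why a fade falls apart, here's a toy Python sketch (my own illustration with made-up test-tone levels, not anything measured from the actual disc): it snaps a 1 kHz tone to the 16-bit grid with no dither and counts how many distinct levels are left as the tone gets quieter.

```python
import math

BITS = 16
STEP = 1.0 / (2 ** (BITS - 1))   # one LSB, with full scale = 1.0

def quantize(x):
    """Snap a sample to the nearest 16-bit level -- no dither, worst case."""
    return round(x / STEP) * STEP

# A 1 kHz test tone at 48 kHz; the amplitudes mimic a deep fade-out.
sr, freq = 48000, 1000.0
for db in (0, -60, -84):
    amp = 10 ** (db / 20)
    levels = {quantize(amp * math.sin(2 * math.pi * freq * n / sr))
              for n in range(sr // 50)}          # 20 ms of samples
    print(f"{db:4d} dBFS: tone occupies {len(levels)} distinct 16-bit levels")
```

By -84 dBFS the "sine" is stepping through only five levels--a staircase, which is harmonic distortion rather than a quiet tone, and that is the fuzz you hear under an undithered fade. Proper dithering trades that distortion for benign noise, which is why mastering engineers now always dither when reducing to 16 bits.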
Studios these days use 24-bit/96kHz as a standard for a reason--not only the extra resolution for recording and playback, but also because every manipulation done in digital (even a simple level adjustment) introduces rounding errors, and those errors compound into audible distortion when many operations are chained in a row. Digital equipment was crude back then, and engineers knew it. The advantages of low noise and stable pitch were offset by the sonic limitations, and they had to find ways around them to make the end product listenable. I know of a few digital multitrack albums from the '80s that have always had a very harsh sound to them. Very typical of the era.
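The rounding-error point can be sketched the same way. This is a hypothetical two-stage gain chain in Python (my own toy numbers, not any particular console or DAW): drop a 16-bit signal by 12 dB, bring it back up, re-quantize to 16 bits after each stage, and measure the noise the round trip added.

```python
import math

BITS = 16
STEP = 1.0 / (2 ** (BITS - 1))   # one LSB, full scale = 1.0

def q(x):
    """Round back onto the 16-bit grid after each operation."""
    return round(x / STEP) * STEP

# A half-scale 997 Hz tone, one second at 48 kHz, already at 16 bits.
sr = 48000
src = [q(0.5 * math.sin(2 * math.pi * 997.0 * n / sr)) for n in range(sr)]

# Gain down 12 dB, then up 12 dB: a no-op in exact arithmetic, but each
# stage rounds its result back to 16 bits, so error sneaks in twice.
down, up = 10 ** (-12 / 20), 10 ** (12 / 20)
out = [q(q(s * down) * up) for s in src]

err = math.sqrt(sum((a - b) ** 2 for a, b in zip(src, out)) / sr)
sig = math.sqrt(sum(s * s for s in src) / sr)
print(f"added noise: {20 * math.log10(err / sig):.1f} dB below signal")
```

Two gain stages at fixed 16-bit precision already leave measurable grunge behind, and a real mix pass applies dozens of such operations per track. Working at 24-bit (or floating point) internally pushes each rounding error roughly 48 dB further down, which is exactly why modern studios keep the word length long until the final dither to delivery format.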