HDMI, being digital, is much less prone to degradation than an analogue signal, but there is still a limit to how long a cable can run before a repeater is needed. 50 feet seems to be a common figure bandied around.
VGA uses three wires for the Red, Green and Blue components of the picture (and others for synchronising: saying when to start a new screen or line). Each carries a voltage that swings between 0 V and 0.7 V (that's what makes it analogue) to define the intensity of Red, Green or Blue at any moment. That voltage can be affected by noise at any time, and long or crappy cables will change that voltage in a way that directly affects the picture.
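As a toy sketch of the problem (illustrative only, not the real VGA electrical spec): an analogue receiver just displays whatever voltage arrives, so any noise the cable adds shows up directly as a brightness change. The function name and 8-bit mapping here are my own for illustration.

```python
# Toy sketch (illustrative only, not the actual VGA spec): on an
# analogue link the receiver displays whatever voltage arrives, so
# noise added in the cable shows up directly in the picture.

def vga_intensity(voltage):
    """Map a 0..0.7 V colour voltage to an 8-bit intensity."""
    clamped = min(max(voltage, 0.0), 0.7)
    return round(clamped / 0.7 * 255)

clean = vga_intensity(0.35)          # a mid-level grey
noisy = vga_intensity(0.35 + 0.05)   # a little cable noise...
# ...and the displayed brightness has shifted: nothing can undo it.
```

The point is that the receiver has no way to tell "signal" from "signal plus noise"; every millivolt of corruption lands on screen.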
The encoding (and electrical characteristics) of a digital format like HDMI is a bit more complicated, but suffice it to say the colours are essentially sent as binary: numbers made up of lots of 1s and 0s. The difference between a 1 and a 0 is big enough that a bit of noise or attenuation might have no noticeable effect. (Not entirely accurate in this context, but for illustration: if the receiver gets "0.1" and "0.9" it knows the sender really meant "0" and "1".) That's why cheaper cables are generally fine for HDMI; super-special wires generally won't make the colours on the screen any better or the sound any clearer. With a long enough cable, though, the signals will eventually degrade to the point where content does get lost.
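The "0.1 means 0, 0.9 means 1" idea above can be sketched in a few lines. This is a toy threshold detector of my own invention, not the real TMDS encoding HDMI actually uses:

```python
# Toy sketch (illustrative only, not real TMDS): a digital receiver
# only has to decide "was that a 0 or a 1?", so it shrugs off noise
# right up until the noise crosses the decision point.

def hdmi_bit(voltage, threshold=0.5):
    """Threshold the received voltage back to a clean 0 or 1."""
    return 1 if voltage >= threshold else 0

assert hdmi_bit(0.9) == 1   # sender meant 1, slightly attenuated: fine
assert hdmi_bit(0.1) == 0   # sender meant 0, some noise added: fine
assert hdmi_bit(0.4) == 0   # too much loss on a long cable: the 1 is gone
```

The last line is the "eventually content gets lost" case: the signal degrades invisibly for a while, then falls off a cliff once it crosses the threshold.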
The way it's noticed is different too. With old-fashioned analogue T.V. over an aerial, noise showed up as snow on screen because the picture was directly affected; as it got worse, the whole picture could eventually be lost as even the sync signals were smashed. With a modern satellite decoder (where images are digital and encoded), a bit of noise won't be seen at all. But if the noise gets worse, entire blocks will vanish from the screen as chunks of the encoded picture can't be deciphered, and eventually the whole picture can be lost.
I don't know if DisplayPort is "the new top standard", but it's certainly very common in computing. Video cards I've bought recently have had a mix of DVI, HDMI and DisplayPort. The laptop I'm typing this post on sits in a docking station with two DisplayPort outputs (and these go through adapters to the DVI inputs on my external monitors). The laptop itself has VGA and DisplayPort (business focus). In consumer stuff like T.V.s, as far as I know HDMI still rules; a consumer-focused laptop would often have HDMI, not DisplayPort.
Edit: one more comment on cheap cables: where I've seen a difference is in the mechanical build, e.g. very cheap cables where the casing at one end breaks apart. So while I'm all for cheap HDMI cables, I'd steer clear of the $1 ones ...