What is quality? It isn't some magical property that exists on its own; you have to define what you mean by it. From a technical standpoint, no, the 8MP photo and the 2MP photo won't look the same, but the native 2MP photo will be the more accurate one. When you downscale the 8MP image to 2MP, the decisions about how to fit that data into 2MP (i.e., what data gets lost) are made by the algorithm, which has no notion of "quality" beyond its own math. That's why there are always compression artifacts, and that's the difference between compression algorithms and their settings.

For example, JPEG compression has settings related to "quality", but those settings essentially say, "how much data are you willing to lose?" If you're going down to exactly 2MB, you have already determined the "quality". Photoshop, for example, lets you choose high, medium, or low quality for JPEG compression. That setting determines nothing more than how aggressively the algorithm removes data: low-quality compression removes more data than high-quality, and the file sizes differ accordingly. You don't get high-quality compression and low-quality compression both producing the same file size, because in image compression "quality" means exactly that: how much data will be removed.
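To make that concrete, here's a minimal sketch using Pillow (my example, not from the post): the same image saved with two different JPEG quality settings. The quality number changes nothing except how much data the encoder is allowed to throw away, which shows up directly as file size.

```python
import io
import random

from PIL import Image  # Pillow; illustrative assumption, not from the post

# Build a noisy test image -- noise is hard to compress, so the
# effect of the quality setting on file size is easy to see.
random.seed(0)
img = Image.new("RGB", (256, 256))
img.putdata([(random.randrange(256),) * 3 for _ in range(256 * 256)])

buf_hi, buf_lo = io.BytesIO(), io.BytesIO()
img.save(buf_hi, "JPEG", quality=95)  # "high quality": discard less data
img.save(buf_lo, "JPEG", quality=20)  # "low quality": discard more data

# Same image, same algorithm -- only the amount of discarded data differs,
# so the high-quality file comes out larger than the low-quality one.
print(buf_hi.tell(), buf_lo.tell())
```

Running this, the quality-95 buffer is larger than the quality-20 buffer, which is the whole point: "quality" here is just a knob for how much data gets removed.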
So, by saying you're going from 8MP to 2MP, you've essentially stated the quality you want; in image compression, "quality" has no meaning outside of how much data you're comfortable having removed. Same with 1080p down to 900p. Sure, a 1080p image downscaled to 900p won't look exactly the same as a native 900p image. But it won't look better, just different: instead of rendering directly at 900p, you've rendered at 1080p and then scaled down to 900p, and the scaler removes data according to its own algorithm, which may actually end up looking worse.
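The "the algorithm decides what gets lost" point can be sketched the same way (again with Pillow, as an assumption): downscale the same stand-in 1080p frame to 900p with two different resampling filters. Each filter makes different decisions about which data to discard, so the two 900p results are not identical.

```python
import random

from PIL import Image  # Pillow; illustrative assumption, not from the post

# Stand-in for a rendered 1080p frame (noise makes differences obvious).
random.seed(0)
frame = Image.new("L", (1920, 1080))
frame.putdata([random.randrange(256) for _ in range(1920 * 1080)])

# Same source, same 900p target, two different resampling algorithms:
# each decides differently which data to throw away.
a = frame.resize((1600, 900), Image.LANCZOS)   # averages neighborhoods
b = frame.resize((1600, 900), Image.NEAREST)   # picks single source pixels

print(a.tobytes() == b.tobytes())  # the two downscales differ
```

Neither result is "the" correct 900p image; a natively rendered 900p frame would be a third, different set of pixels. That's the sense in which downscaled 1080p isn't better than native 900p, just different.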