My i7-7700 was dropped by Microsoft about 4.5 years after I built what I expected would be a top-of-the-line system that would last me 10 years with a few upgrades here and there. So I have two options: i) stick with Windows 10 for another 4 years (the ultimate insult to any enthusiast), or ii) buy something new I don't need (and at a time when Intel's lineup is weak). Basically, Microsoft has obsoleted what was supposed to be a nice high-end system halfway through its expected life. That's something they have never done before, as I'm sure the many people still running Windows 10 on Core 2 Duos/Quads would attest.
Oh, and just to add insult to injury, a Celeron N4050 from a year or two newer, with 4 gigs of RAM and 64 gigs of eMMC, is supported, but Microsoft tells me my i7-7700 with 64 gigs of RAM, an NVMe SSD, a shiny new RTX 3070, etc. is e-waste.
A Core 2 Duo system from 2006 may have started with XP, could run Vista (2007), 7 (2009), 8/8.1 (if you don't mind your desktop being turned into a tablet), and 10 (2015). So yes, you're looking at a machine able to run an OS released 9 years after it was built, with another 10 years of security and other updates on top of that. A C2D from 2006 could be perfectly supported and operational until 2025, i.e. 19 years, and possibly longer if it weren't for this Windows 11 cut-off. Would it have performed great? Who knows; I suspect with 8 gigs of RAM and an SSD it would have been fine for a lot of purposes.
When I built my i7-7700 in early 2017, it was replacing a C2Q Q8300 I bought in 2010, running on a motherboard I bought in 2008, that was getting very slow in part because it was limited to 8 gigs of RAM. So I was replacing a 9-year-old motherboard that was still capable of running the latest OS; why wouldn't I expect my i7-7700 to have a 10-year lifecycle? I even spent extra money on a motherboard with Thunderbolt 3 because I thought that might be good future-proofing. The idea that 4.5 years later my high-end system would be stuck on a particular version of Windows (with a final drop-dead date of 2025, i.e. 8 years from when it was built) while a $300 laptop a year newer can run the latest version would have been unimaginable.
When was the last time Microsoft put a hard cutoff on hardware? Frankly, I don't think they ever have, and if they did, the bar was laughably low, such that no reasonable person would want to try anything below it. I think Vista media couldn't boot on P3 machines, but those could only take 512 MB (i815) or 768 MB (440BX) of RAM anyway. I don't know that I ever tried Win7/10 on pre-C2D/C2Q hardware; frankly, unless you need a space heater, there has been no reason to run Pentium 4s/Ds for anything in over a decade.
And as I said in my earlier post, as an enthusiast, I consider "not able to run the latest OS" to be the equivalent of e-waste. Every Windows machine I have, including the C2Ds/C2Qs unplugged in my closet and a trio of nice 10-year-old quad-core Sandy Bridge laptops, is capable of running Windows 10 21H1 and doing so perfectly nicely. At 4.5 years, when they consider an N4050 Celeron supported, I view this as an insult. Especially when their OS runs just fine, and is fully supported, in a VM on a 4590 host with BIOS/MBR and no TPM.
Oh, and let me add one additional detail: the only supported system I have is a Dell Inspiron 3780 I should never have bought. That thing crashes every few days running Windows 10. It is a complete piece of junk. All my other aging systems are rock, rock solid under Windows 10 21H1. And Microsoft tells me "stability" and "reliability" are why only newer systems are supported?!?
This whole thing seems to be driven by money: Lenovo, Dell, and HP see Apple getting away with a 7-year hardware lifecycle, and they're annoyed that their own machines, especially anything other than laptops with built-in batteries, now have a much, much longer usable life. So here comes Microsoft, having decided that every pre-2018 system will be total e-waste in 2025 (unless you want to run Linux).