Can Windows make use of 16+ core processors?
From Microsoft - Windows 10 supports a maximum of two physical CPUs, but the number of logical processors or cores varies based on the processor architecture. A maximum of 32 cores is supported in 32-bit versions of Windows 8, whereas up to 256 cores are supported in the 64-bit versions.
Can it use them? Absolutely. Will the average person take advantage of this much CPU power? Not likely.
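As a quick sanity check, you can ask the OS how many logical processors it actually sees. A minimal sketch in Python (`GetActiveProcessorCount` is a real kernel32 call but Windows-only; `os.cpu_count()` works anywhere):

```
import os
import ctypes

# Cross-platform: logical processors visible to this process.
print("os.cpu_count():", os.cpu_count())

# Windows-only: total active logical processors across all processor
# groups (this matters on machines with more than 64 logical processors).
ALL_PROCESSOR_GROUPS = 0xFFFF
kernel32 = ctypes.windll.kernel32
print("GetActiveProcessorCount:",
      kernel32.GetActiveProcessorCount(ALL_PROCESSOR_GROUPS))
```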
These types of processors seem to be aimed at the gaming market
Not at all. While games do require some CPU horsepower, GPU power is often king in gaming. I doubt any game would come close to using this much CPU (unless there was a bug) anytime in the near future.
These types of CPUs are more for data analytics and number crunching, not for a home consumer. One could argue that "more" is better, but there are diminishing returns in standard use environments.
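Those diminishing returns are easy to put numbers on with Amdahl's law: if a fraction p of a workload can run in parallel, the best possible speedup on n cores is 1 / ((1 - p) + p / n). A small sketch (the parallel fractions below are illustrative assumptions, not measurements):

```
def amdahl_speedup(p: float, n: int) -> float:
    """Best-case speedup for parallel fraction p on n cores (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative parallel fractions: mostly-serial desktop work vs. heavy
# number crunching. These are assumptions, not measured values.
for p in (0.50, 0.90, 0.99):
    row = ", ".join(f"{n} cores: {amdahl_speedup(p, n):.1f}x"
                    for n in (4, 8, 16, 32))
    print(f"parallel fraction {p:.0%} -> {row}")

# parallel fraction 50% -> 4 cores: 1.6x, 8 cores: 1.8x, 16 cores: 1.9x, 32 cores: 1.9x
# parallel fraction 90% -> 4 cores: 3.1x, 8 cores: 4.7x, 16 cores: 6.4x, 32 cores: 7.8x
# parallel fraction 99% -> 4 cores: 3.9x, 8 cores: 7.5x, 16 cores: 13.9x, 32 cores: 24.4x
```

Even at a 90% parallel fraction, going from 16 to 32 cores buys only about 20% more speed, and typical desktop workloads sit well below 90%.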
There definitely are some applications that can take advantage of such a high number of cores. Video editing, 3D modeling, etc. However, that level of CPU power is not used by the average user.
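For a feel of how those applications scale, here is the embarrassingly parallel, CPU-bound pattern that video encoders and 3D renderers rely on. A rough sketch; `render_frame` is a made-up stand-in for real per-frame work:

```
import time
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_number: int) -> int:
    """Stand-in for CPU-heavy per-frame work (not a real renderer)."""
    total = 0
    for i in range(2_000_000):
        total += (i * frame_number) % 97
    return total

if __name__ == "__main__":
    frames = list(range(64))

    start = time.perf_counter()
    for frame in frames:
        render_frame(frame)
    serial = time.perf_counter() - start

    # One worker process per core by default; frames are independent,
    # so the work spreads across every core the OS will give us.
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        list(pool.map(render_frame, frames))
    parallel = time.perf_counter() - start

    print(f"serial {serial:.1f}s, parallel {parallel:.1f}s, "
          f"speedup {serial / parallel:.1f}x")
```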
I feel we have personal computers with masses of power that can't really be used and are hamstrung by the operating system?
Outside of specific high-performance workstations or top-end gaming rigs, "personal" computers with that sort of core count are still relatively rare in my experience. But as to the main point of your question: the processor isn't hamstrung by the operating system, because Windows 10 will happily use all 32 cores if you give them something to do.
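That claim is easy to verify yourself: start one CPU-bound process per logical core and Task Manager will show every core pegged. A throwaway sketch (it burns CPU on purpose):

```
import os
from multiprocessing import Process

def spin() -> None:
    # Deliberate busy loop: pure CPU-bound work with nothing to wait on.
    while True:
        pass

if __name__ == "__main__":
    workers = [Process(target=spin, daemon=True)
               for _ in range(os.cpu_count())]
    for worker in workers:
        worker.start()
    input("Every core should now be busy in Task Manager; "
          "press Enter to stop.")
```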
Historically there weren't many use cases where you would want to run 32 simultaneous CPU-bound processes for any significant length of time. As machines with large numbers of cores gradually become more widely available, developers are changing to take advantage of them by increasing the parallelism of their software; id Software, for example, specifically coded Quake Champions to take advantage of that sort of processor.
Outside of gaming, the ability to run multiple VMs, each with multiple dedicated cores, can be very handy.
Let's answer it part by part.
My question is, can Windows 10 even make use of this power?
Technically - yes. Based on the Windows specifications, it is ready to use those cores.
We know that it can support up to 32 cores but would it actually use them?
Yes, Windows theoretically can use them. No, practically there will be no significant performance boost for typical usage.
Are there programs out there that could?
Yes. From mining programs (which use CPU power for "making money") to multi-threaded compression programs, virtual machines, and so on. The performance gain will be visible when running multiple single-threaded and/or multi-threaded CPU-demanding applications.
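To illustrate the compression case: split the input into chunks and compress them on separate cores, which is roughly what 7-Zip's and zstd's multi-threaded modes do natively. A rough standard-library sketch (independently compressed zlib chunks are not a real archive format, so this only demonstrates the scaling):

```
import zlib
from concurrent.futures import ProcessPoolExecutor

CHUNK_SIZE = 1024 * 1024  # 1 MiB per chunk, one chunk per task

def compress_chunk(chunk: bytes) -> bytes:
    return zlib.compress(chunk, level=9)

def parallel_compress(data: bytes) -> list[bytes]:
    chunks = [data[i:i + CHUNK_SIZE]
              for i in range(0, len(data), CHUNK_SIZE)]
    # Each chunk is compressed in its own worker process, one per core.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(compress_chunk, chunks))

if __name__ == "__main__":
    data = bytes(range(256)) * (64 * 1024)  # 16 MiB of sample data
    compressed = parallel_compress(data)
    ratio = sum(len(c) for c in compressed) / len(data)
    print(f"{len(compressed)} chunks, compression ratio {ratio:.3f}")
```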
These types of processors seem to be aimed at the gaming market
That is marketing, targeted mostly at gamers who have money to burn. (Explanations are given below.)
So would games be able to use all of that power?
Theoretically - yes. Practically - not now. Right now there is no AAA game that can make use of 16 CPU cores and gain any significant performance improvement. What you may gain in the most demanding game is perhaps +1-2 FPS compared to a typical 8-core CPU.
Game Streaming:
The extra cores may be useful for game-streaming software, though it depends on the software: one program may use the CPU for encoding, another the GPU, and a third both.
Then again, dedicated low-lag streaming hardware can be bought to provide a constant, high streaming FPS without significant harm to the game's FPS. The cost of such hardware (around USD 200) is lower than the cost of the "extra" CPU cores (USD 1000 or more).
I feel we have personal computers with masses of power that can't really be used and are hamstrung by the operating system?
Yes. Currently there is a lot of CPU and GPU power in computers and mobiles that goes unused. That mostly applies to users who use 4+ core computers and smartphones mostly for browsing, leaving the cores almost idle most of the time.
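You can watch that idleness directly by sampling per-core load. A small sketch using the third-party psutil package (pip install psutil):

```
import psutil

# Sample each logical core's utilization over one second.
per_core = psutil.cpu_percent(interval=1, percpu=True)

for core, pct in enumerate(per_core):
    bar = "#" * int(pct / 5)
    print(f"core {core:2d}: {pct:5.1f}% {bar}")

busy = sum(1 for pct in per_core if pct > 50)
print(f"{busy} of {len(per_core)} cores above 50% load")
```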
Suggestion for gaming PCs:
Buying such CPUs for gaming is not recommended. Currently there is no game that can use this power, so it's a waste of money.
In a few years, when games can finally use it, this once-great CPU will lack newer technologies (such as the next generation of DDR RAM) that you would want for a next-generation top-tier gaming PC. So again, it turns into money wasted in the past.
Gaming computers should be upgraded approximately every three years to stay able to run the top games. If you buy a part (CPU, GPU, or RAM) that will "last longer" (e.g. five years), it's much better to save that extra money and upgrade the whole computer after three years, to avoid performance bottlenecks caused by old hardware.