Is a 50W LED driven at 10W equal to a 10W LED at full power?
All else being equal, yes. Of course, all things are never equal.
First, the lower-power unit will probably be physically smaller than the 50 watt unit, and this includes the light-emitting area. If you look directly at the LED while it operates, there is a good chance that the lower-power unit will appear brighter, since it is emitting light from a smaller area. However, the appearance is irrelevant: what counts is the total light emitted, not the intensity per unit area.
Second (again assuming a physically smaller LED), heat sinking is likely to be more effective on the higher-power LED. At the same power, the low-power LED will likely run hotter, and this will reduce its efficiency somewhat.
Finally, the power rating of an LED refers to the total input power, not the visible output. The two LEDs may not use the same process and may have different intrinsic efficiencies, so equal input power does not guarantee equal light output.
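To put rough numbers on that last point, here's a small sketch; the efficacy figures are assumed purely for illustration and aren't taken from any datasheet:

```python
# Illustrative only: the efficacies below are assumed, not taken from
# any datasheet. Rated power tells you what goes in; luminous efficacy
# (lm/W) determines what comes out, and it differs between LED processes.

input_power_w = 10.0             # both LEDs driven at 10 W input

efficacy_big_lm_per_w = 130.0    # assumed: 50 W part, run well under rating
efficacy_small_lm_per_w = 100.0  # assumed: 10 W part at its full rating

lumens_big = input_power_w * efficacy_big_lm_per_w      # 1300 lm
lumens_small = input_power_w * efficacy_small_lm_per_w  # 1000 lm

print(f"big LED at 10 W in:   {lumens_big:.0f} lm out")
print(f"small LED at 10 W in: {lumens_small:.0f} lm out")
```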
It'll be brighter by a fair bit, because you are running it under its rating. LEDs are more efficient when you do. Here's a typical efficiency curve.
Forward voltage changes only slightly with current, so you'd need to break out the sharp pencil for exact numbers, but it doesn't change enough to matter. In this example, suppose we run the LED at 400 mA and get 1.65x the reference light output. That's 1.65/0.400 = 4.125 units per amp. Run it at 80 mA instead (80% less current) and get 0.45x the reference output: 0.45/0.080 = 5.625 units per amp, which is about 36% better per-amp efficiency. The small forward-voltage difference means the per-watt figure won't match that exactly, but it will still be at least 25% better.
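The same arithmetic as a sketch (the relative-flux values are read off a typical datasheet-style curve, not from any specific part):

```python
# Relative flux values (1.65x at 400 mA, 0.45x at 80 mA) are illustrative,
# taken from a typical datasheet-style curve rather than a specific part.

def per_amp_efficiency(relative_flux, current_a):
    """Relative light output per amp of drive current."""
    return relative_flux / current_a

rated = per_amp_efficiency(1.65, 0.400)    # 4.125 units/A at 400 mA
derated = per_amp_efficiency(0.45, 0.080)  # 5.625 units/A at 80 mA

gain = derated / rated - 1
print(f"per-amp efficiency gain at 20% current: {gain:.0%}")   # ~36%

# Forward voltage shifts a little between the two operating points, so
# the per-watt gain won't be exactly 36%, but it comfortably clears 25%.
```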
Since you're holding power constant, you can expect in the neighborhood of 25% more usable brightness from the larger emitter.
This doesn't scale infinitely; running at 1% of capacity (a 1000W LED at 10W) won't give you even more efficiency. There are sane limits. But there is a "sweet spot", and running at 20% is pretty close to it.
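To illustrate the shape of that curve, here's a toy table of relative per-amp efficiency versus fraction of rated current; the numbers are invented for illustration, not measured:

```python
# Toy numbers (assumed, for illustration only): relative per-amp efficiency
# vs. fraction of rated current, normalised to 1.0 at the full rating.
# The gain peaks at a "sweet spot" and does not keep growing as you derate.

curve = {
    0.01: 1.30,  # 1% of rating: past the sweet spot, no further gain
    0.05: 1.38,
    0.20: 1.36,  # ~20% of rating: close to the sweet spot
    0.50: 1.20,
    1.00: 1.00,  # full rating: the reference point
}

for fraction, rel_eff in sorted(curve.items()):
    print(f"{fraction:>5.0%} of rated current -> {rel_eff - 1:+.0%} vs full power")
```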