Depends on a lot of things. Are you running it at 100%? Are they wired in parallel or series? Is there a fan cooling the LEDs? What are your average temps?
What amperage/voltage does each strip run at?
All these factors will affect the lifespan of the LEDs.
More voltage = more heat that those strips will have to deal with.
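To put a rough number on the voltage-to-heat relationship, here's a minimal sketch. The strip values (12 V nominal, current draw) are made-up illustration numbers, not specs for any real strip; the key point is that a small voltage bump pulls disproportionately more current through an LED, and nearly all the extra power ends up as heat.

```python
# Rough sketch: extra voltage on a strip mostly turns into extra heat.
# All numbers below are hypothetical, for illustration only.

def strip_power(voltage, current_a):
    """Electrical power drawn by one strip (watts)."""
    return voltage * current_a

# A nominal 12 V strip drawing 1.5 A vs the same strip overdriven at 13 V.
# LEDs behave like diodes: a small voltage bump causes a disproportionate
# current increase. Assume +1 V pushes the draw to ~1.9 A here.
nominal = strip_power(12.0, 1.5)     # 18.0 W
overdriven = strip_power(13.0, 1.9)  # 24.7 W

# Only a small fraction of that becomes light; the rest is heat the
# heatsink (or fan) has to shed.
extra_heat = overdriven - nominal
print(f"Extra power to dissipate: {extra_heat:.1f} W")
```

That extra handful of watts is what accelerates the degradation, not the voltage itself.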
It's safe to say that after a few years the light will be less bright than it used to be. But the only way of figuring out by how much is with a lux meter or an expensive PAR meter.
Basically it's comparable to computer processors: the increase in voltage won't kill the chip by itself, but the added heat that comes with it will degrade and possibly fry a CPU. If the processor stays cool, though, the extra voltage matters far less; that's why extreme overclockers cool chips with liquid nitrogen.
So there's no clear answer as to how degraded your light is. Obviously you didn't cool your LEDs with liquid nitrogen, so your light has probably experienced the same normal wear and tear as the rest of ours.
My DIY build is 4 or so years old and has degraded quite a lot brightness-wise.
The first 3-4 strips are reading 30% less lux than the strips after them, due to thermal degradation: wired in parallel, they take more current than the rest, and that's with the strips only running at half their maximum capacity.
So your light, even with proper heatsinks, burning bright at a higher capacity will definitely have lost brightness over the years.
As I said, there are variables which will affect how a light's brightness degrades, but nobody will be able to tell you by how much.