Flashlights: Watts versus Lumens?

Status
Not open for further replies.

Geronimo45

I picked up a new mini-maglite (which, as a matter of interest, is no longer mini: extra length, and an extra battery included) with an LED bulb. Not sure if old models worked this way, but the 'candle' option provides an incredible amount of light, making it a pretty darn great utility light, which is its main duty.
The box advertised a 3-watt LED, with no mention of lumens. Is there a correlation twixt lumens and watts? Maglite's webpage seems to make no mention of lumens either.
 
Dag, you know, I never noticed that.

Google turns this up.

By definition, at the peak sensitivity of the eye (green 555nm) 1 Watt equals 680 lumens.

It would make the most sense to talk about lumens with a flashlight, because what you really care about is the perceived "brightness", not the true optical power.
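For the curious, that watts-to-lumens conversion can be sketched in a few lines of Python. The 683 lm/W constant is the defined maximum luminous efficacy at 555 nm (the 680 figure quoted above is the same number rounded); the efficiency fraction is a made-up knob for illustration, not a measured value:

```python
# Sketch: lumens from radiant watts via luminous efficacy.
# 683 lm/W is the defined theoretical maximum, achieved only by
# monochromatic green light at 555 nm.

MAX_LUMINOUS_EFFICACY = 683.0  # lm per radiant watt at 555 nm

def lumens(radiant_watts, luminous_efficiency):
    """luminous_efficiency is the fraction (0..1) of the theoretical
    maximum actually achieved by the source's spectrum."""
    return radiant_watts * luminous_efficiency * MAX_LUMINOUS_EFFICACY

# A perfect monochromatic 555 nm source:
print(lumens(1.0, 1.0))   # 683 lm, the theoretical ceiling
# A white source achieving ~10% of the theoretical max:
print(lumens(1.0, 0.10))  # roughly 68 lm
```

Real white LEDs land well below the 683 lm/W ceiling because their output is spread across many wavelengths, most of which the eye is less sensitive to.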
 
The 1W:680lm thing is the maximum theoretical efficiency if all light output has a wavelength of 555nm. Because LEDs are not 100% efficient, and we're talking about a mixture of different wavelengths, one watt put into a white LED flashlight isn't going to put out anywhere near 680lm. I've seen 3W LED flashlights claim 45lm to 60lm, so I think it'll be somewhere in that ballpark.
 
Lumens and watts both measure power--the rate of energy coming out of the light--but lumens (as Mannix indicates) are calculated according to visible energy, whereas watts don't discriminate according to frequency. You can have a 40,000 watt radio transmitter, for example, that doesn't produce a single lumen.

Some brands of flashlight use lux instead of lumens. Lux is lumens per square meter--so if you focus your 1-lumen lamp into progressively narrower beams, the amount of lux produced goes up, even though the total power doesn't change.
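A quick sketch of the lux-versus-lumens distinction described above, using the hypothetical 1-lumen lamp and made-up spot areas:

```python
# Sketch: lux is lumens spread over area (lm / m^2). Focusing the same
# total flux onto a smaller spot raises the lux, while the total lumens
# (and total power) stay constant.

def lux(total_lumens, area_m2):
    return total_lumens / area_m2

beam_flux = 1.0  # lumens: the hypothetical 1-lumen lamp from the post

print(lux(beam_flux, 1.0))   # spread over 1 m^2 -> 1 lux
print(lux(beam_flux, 0.01))  # focused onto 0.01 m^2 -> 100 lux
```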
 
The wattage rating is the amount of electrical power put INTO the LED while the lumens rating measures the amount of light power that comes OUT as a result.

I don't know for certain what type of LED the flashlight in question is using but somewhere in the neighborhood of 100 lumens is a pretty safe bet.

You can't take that number and assume it's 30 lumens per watt in the general case, though. There is no direct way to convert watts input to lumens output unless you know the efficiency of the light-producing device. I made that guess based on what I remember from messing around with the current crop of LED-based lights.
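The watts-in versus lumens-out relationship is just a ratio, usually called luminous efficacy. A tiny sketch using the thread's own rough numbers (100 lm out of a 3 W LED), which are guesses, not datasheet values:

```python
# Sketch: efficacy (lm/W) = measured lumens out / electrical watts in.
# The 100 lm and 3 W figures are the thread's estimates for this
# flashlight, not manufacturer specs.

def efficacy(lumens_out, watts_in):
    return lumens_out / watts_in

print(round(efficacy(100, 3.0), 1))  # roughly 33 lm/W
```

As the post says, this ratio varies by device; knowing one flashlight's efficacy tells you nothing certain about another's.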
 
I wondered about that--input vs. output. Is it possible to put 1 watt "into" the light and get less than 1 watt of radiant flux "out" of the light? Where would the missing energy go?
 
Power at other wavelengths. Usually a decent amount of heat. Light energy that doesn't make it out of the LED because it's absorbed.

LEDs are a lot more efficient than incandescent bulbs: incandescents convert far more of their input electrical power to heat, and far less to light, than LEDs do.

Actually, I glossed over something else in my initial explanation. The wattage rating tells you the MAXIMUM amount of electrical power you can put into the LED without damaging it under the specified conditions of use. Without knowing more about the flashlight, it's not possible to say that's what's actually being fed into the LED. However, it's a pretty safe bet--no reason to pay more for the higher rated part if you're not going to make use of it.
 
It looks like 150 lumens per watt is the high end of LED efficiency. That's what the internet says, at least.

I'm sure JohnKSa is right about flashlight wattage. But if you are running 1 watt into the light, I'm pretty sure you're going to get 1 watt out of it--partly heat, partly other wavelengths, partly maybe mechanical work?
 
It looks like 150 lumens per watt is the high end of LED efficiency.
Right now, I'm not aware of any flashlights that will give you much above 30 lumens per watt.
But if you are running 1 watt into the light, I'm pretty sure you're going to get 1 watt out of it--partly heat, partly other wavelengths, partly maybe mechanical work?
Right, it all balances. The problem is that a lot of the power ends up doing stuff you don't care about or perhaps don't want. Usually heat is the biggest byproduct.
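The energy balance being described can be sketched like this; the fractions are invented purely for illustration, not measurements of any real LED:

```python
# Sketch: everything you put in comes out somewhere. Input electrical
# power splits into visible light, other wavelengths (e.g. infrared),
# and heat; the totals must balance.

watts_in = 1.0
visible_light = 0.15      # radiant power in the visible band (assumed)
other_wavelengths = 0.05  # infrared leakage etc. (assumed)
heat = watts_in - visible_light - other_wavelengths

print(heat)  # the remainder, dissipated as heat in the LED and driver
```

The "missing" energy the earlier post asked about is exactly this remainder: it isn't missing at all, it's just not in the form you wanted.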
 
Where would the missing energy go?

Just like all electronics, some of it is lost to heat due to resistance and such. No energy is missing, per se; it just doesn't all get converted to the intended form. That's where the term efficiency comes into play.
 