With all the F.U.D. floating around the internet, and seeing a lot of members trying their shiny new LED lamps and being disappointed/thrilled/critical or similar, I thought that I might contribute a little information for everyone to peruse, and discuss, in case I'm wrong, or if you just feel like arguing.
I'll start with some basics about lighting. It's all freely available on the interweb, somewhere, if you have the patience.
Correlated Colour Temperatures (CCTs). When white isn't white ...
Image Gratuitously linked from Wikipedia.
Keep in mind that, like everything human, no-one can agree on anything, and everyone blusters through the universe doing their own darn thing, so the Chinese/British/Americans all have their own naming schemes, and names can get mixed up - the only real measure is the numerical colour temperature.
Correlated Colour Temperature is measured in kelvin, and is a combination of every light frequency in the visible spectrum (red, orange, yellow, green, blue, indigo, violet ... and everything in-between).
Daylight, approx 6000degK - this colour temperature is comparable to an overcast midday, and light at this CCT (and anything higher) blasts through your eyelids and resets your circadian clock, no matter where yours is currently ticking through its cycle. When compared with lower colour temperatures, it's very blue, and will probably give you eye-strain if you spend too much time under it.
Cool White - nominally 5000degK - Also very blue, but not as much as Daylight. It'll keep you awake. Typically found in docks, seedy carparks, and value-conscious people's houses.
Neutral White - nominally 4000degK - Typically used in offices, medical facilities, shopping malls, deluxe retail stores and other unpleasant places.
Warm White - nominally 3000degK - This is what most luxury retail stores & museums use. It's very comfortable, and won't usually put you to sleep, unless you are very tired. It's basically yellow when compared to all the above.
Very Warm White / Interna (Osram) - nominally 2700degK - This is the colour temperature of standard incandescent light globes, and most halogen light globes, and also fire. You'll find these at home (hopefully on the ceiling, rather than next to a suspicious looking child with a box of matches), or installed in many luxury retail/department stores inside their fitting rooms. It makes you look tanned to fool you into buying whatever it is you are trying on. It also puts you to sleep. In terms of colour, it looks very orange.
Anything in between the above colours is basically non-standard. It's typified by poorly designed products and/or poor quality control.
So generally, what do you want?
Homes - 3000degK
Bedrooms - 2700degK
Reading lights - 3000/4000/5000degK, whatever you feel comfortable with.
Workplaces, shopping malls and other places where massive human suffering is the main aim - >4000degK
Colour Rendering Index (CRI) ... when two things the same colour, aren't the same colour.
The colour rendering index is a measure of how well a light source emits light with a uniform spectrum (ie, every red/blue/green & colour in between at the same level). It's really only applicable between 2700degK and 4000degK, as anything higher is heavier in the blue end of the spectrum and lighter in the red/orange end, resulting in lower CRIs.
Incandescent light globes, (most) halogens and natural burning fire emit light at a uniform level across the entire spectrum. They have a CRI of 100.
Other lighting technologies (Metal Halide, low pressure sodium, high pressure sodium, mercury vapour, LED's, fluorescents & compact fluorescents) don't. Depending on the technology, it can range between 0 and the high 90's. Low pressure sodium is highly efficient, but with a CRI of practically 0, everything is .. brown.
LEDs, metal halides and fluorescents typically range between 70 and 97. Current generation LEDs (not yet available in shops) have reached minimum CRIs of 90 by tweaking the phosphors to get rid of unwanted blues. You can still get cheap LEDs with CRI <70 from ebay, if you feel like living dangerously.
A CRI > 95 is typically used for film production & photography to ensure uniform colour reproduction on film. Colour uniformity is near-perfect.
A CRI > 90 is often used for museums, art galleries and other fancy places. Colour uniformity is excellent.
A CRI > 80 is what most people use, and is suitable for most tasks. Colour uniformity is good enough, but most people will have difficulty differentiating close shades.
A CRI > 70 is the minimum you really want to spend any time under. Many colours start to look wishy-washy.
CRI < 70 is very low, and it's like looking through foggy glasses.
My personal choice is CRI >85. Anything less, and yuck.
Illumination levels ... how much light do you actually need?
Lux is the international standard for measurement of light intensity. If you are from a backwards country that sizes shoes by an arbitrary person's foot length, you may use footcandles, which sounds as ridiculous as it is, and much confusion can be found between the two. If you still use footcandles, go sit in the corner and wear the dunce hat. A footcandle is the illuminance cast by a standard candle on a surface one foot away. It is equivalent to about 10.76 lux.
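If you ever need to convert between the two, it's a one-liner either way. A quick Python sketch (the function names are mine; the only fact used is the 10.76 conversion factor):

```python
FC_TO_LUX = 10.764  # 1 footcandle = 1 lumen/ft^2, which works out to ~10.76 lux

def footcandles_to_lux(fc):
    """Convert footcandles to lux."""
    return fc * FC_TO_LUX

def lux_to_footcandles(lux):
    """Convert lux back to footcandles, for the dunce-hat wearers."""
    return lux / FC_TO_LUX
```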
How much light do you need, to see?
Well, that's a complex question, and it depends on how fine a detail you require, and what you will be doing, and how good your eyesight is. You also don't need to illuminate everywhere to the same illumination level. The best advice is to just put the light where you need it.
The vast majority of people can see well enough to walk around (basically in the dark) at levels as low as 0.1 lux (around a hundredth of a footcandle, which is ridiculous, because at this light level you can't even see your feet, unless you are very short). At this level, the average person can make out walls & large obstacles. If you are in a factory that produces oil, banana peels, skates, or other slippery things, you are on your own. Determining colours is practically impossible at this illumination level.
Very Detailed tasks - >1600 lux - Jewellery makers, watch makers and other people with more patience than me.
Detailed tasks - 800-1600 lux - Inspecting detailed objects, jewellery or watches, working with tiny screws, or detailed colour matching.
Retail stores/Kitchen benches - 400-800 lux - Coarse operations, but with clear comparison between different colours.
Offices - 320-400 lux - Fine detail reading tasks. Colours are easily determined.
Storage areas - 160-320 lux - Occasional reading tasks, or regular reading of large text. Colours are very discernible.
Walkways - 40-160 lux - Very occasional reading of large text. Colours may be a problem.
Security lighting - 2-40 lux - You can clearly see shapes. Colours are difficult, except by close inspection.
Emergency lighting - 0.1-2 lux - You can see large shapes (walls, boxes, other people) but can't make out any real details. If in doubt, have a conversation with the box/wall/person.
My House (it's what I like):
Kitchen benches (playing with knives & hot toffee) - 800 lux
Kitchen general - 320-400 lux
Dining room - 240 lux
Lounge room - 120 lux with lights on (40 lux with side lamp on when watching TV/movies)
Bedrooms - 120 lux (40 lux with just bedside lamps on)
How to calculate how much light is needed?
In a professional scenario, you will end up using software packages to calculate illuminances, taking into account surface reflectance, the large quantity of sources, different reflector types and efficiencies, glare, uniformity gradients and urgh. Just do it the simple way.
It's basically very easy, unless you want to be professional.
Divide the number of lumens of the light source by the area being illuminated.
Ie, if you have a room 3 metres by 3 metres (total area of 9 square metres), and you put in a bare lamp with 1000 lumens output, you will get approximately 111 lux on average.
If you want to be fancy, add 30% to compensate for light reflected back off the floor and walls. That's about 145 lux. Perfect!
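The back-of-envelope method above, as a small Python sketch (the function name and the 30% bounce factor are my own illustrative choices, not an official formula):

```python
def average_lux(lumens, area_m2, reflectance_bonus=0.30):
    """Rough average illuminance: lumens spread evenly over the floor area,
    plus a fudge factor for light bounced back off floor and walls."""
    direct = lumens / area_m2
    return direct * (1 + reflectance_bonus)

# A bare 1000 lm lamp in a 3 m x 3 m room (9 m^2):
# direct is about 111 lux; with the 30% bounce, about 144 lux.
```

This deliberately ignores fitting losses, glare and uniformity; for anything serious, use proper lighting design software as noted above.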
Keep in mind that in a home situation, generally, any single lamp shouldn't service a radius larger than the height that it's mounted at, in order to maintain consistent light. If you put one 10,000 lumen lamp in a room that is 100 metres by 100 metres, then you end up with a very bright centre, and very dark edges, unless the room has a ludicrously high ceiling.
Also be aware that despite all the science with light curvature etc, etc ... light basically goes straight. Any object between the light source and you will cast a shadow - you might need more lights with lower lumen output to achieve a uniform illumination level.
Lighting Technologies .. the hard stuff.
Arc Lighting Technologies - Lightning in a glass tube.
Metal Halide, High pressure Sodium, low pressure sodium, mercury vapour, neon, krypton, xenon, argon, fluorescent tubes.
Basically gas in a tube with electrodes at opposite ends. A high voltage pulse is discharged between the electrodes at initial switch-on, and the arc is then sustained at a lower current & voltage. A ballast is required to initiate the pulse, and then limit the current in the operating state. If the power to these units is cut, they often need to cool down before the lamp can be re-struck.
The arc can exceed temperatures of 5000 degrees celsius. The glass around the arc can commonly reach 500 degrees celsius. Lights using these lamps typically enclose them behind a protective glass.
Typical efficacies (lumen output divided by power consumed) are 85-140 lm/W.
Ballast losses also need to be factored into the system efficacy.
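As a quick sketch of what factoring in ballast losses looks like (the wattage and lumen figures below are illustrative examples, not from any particular datasheet):

```python
def system_efficacy(lamp_lumens, lamp_watts, ballast_watts):
    """Whole-of-system efficacy: lumens out divided by total power
    drawn from the wall, ballast losses included."""
    return lamp_lumens / (lamp_watts + ballast_watts)

# Example: a 400 W lamp producing 40,000 lm through a ballast that
# burns an extra 30 W. The lamp alone is 100 lm/W, but the system
# only manages about 93 lm/W.
```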
Fluorescent & Compact Fluorescent Lamps .. the warm glow we all love ... with the mercury poisoning we all want (just kidding). The vast majority of lamps are either mercury free, or contain enough mercury to make a newt mildly ill for a few nanoseconds. Mercury poisoning could possibly be induced by smashing thousands of fluorescent lamps in a sealed room & inhaling deeply.
Basically an arc lamp, but the gas used in fluorescents causes the arc to emit light in the ultraviolet spectrum. Much like the old cathode ray tubes used in older TVs, coloured phosphors absorb the ultraviolet light and emit the visible light that we see. Lamp manufacturers, over the last 30 years, have tweaked the combination of red/yellow/blue phosphors to get the nice whites we have now. More expensive versions add filters to remove any stray unwanted light and increase the CRI, with a corresponding reduction in output.
Typically with CRIs >85, and CRI >95 for the fancy (expensive) versions.
Typical Efficacies are 40-105 lm/w.
Light Emitting Diodes - Ooooh shiny/shiny!
LEDs are the newcomers. Although they've been around for a long time as indicators & toys, it was only recently (about 10 years ago) that someone decided they might be worth pursuing as a general lighting source. LEDs are a little more complex than a bit of gas in a glass jar with electrodes.
LEDs are semiconductor diodes (not actually silicon - usually compounds like gallium nitride): force electrical current through the junction and the electrons and holes recombine, releasing their energy as light at particular wavelengths. The great thing about LEDs is that you run a current through them, they emit light. When you stop, it stops. Run current through again, and the light comes straight back. There is no cool-down period between on & off.
As LEDs age, they tend to lose output, and this has forced development of standards regarding maintained lumen output and hence operational life - the L70 (the age at which 70% of light output is retained).
Typical L70's exceed 50,000hrs of operation.
This sounds wonderful, but the well known average life expectancy of electronic drivers/power supplies is 25,000 hrs. Given that LEDs REQUIRE an electronic driver, the expected life of any particular LED fitting is therefore 25,000 hrs (approximately 5.7 years at 12 hrs per day), unless the driver/power supply is replaceable.
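The arithmetic behind that 5.7 years, if you want to plug in your own usage pattern (a trivial sketch; the function name is mine):

```python
def years_of_service(rated_hours, hours_per_day=12):
    """Convert a rated operating life in hours into calendar years
    at a given daily usage."""
    return rated_hours / (hours_per_day * 365)

# A 25,000 hr driver at 12 hrs/day is roughly 5.7 years.
# A 50,000 hr L70 LED at the same usage is roughly 11.4 years -
# hence the driver, not the LED, sets the fitting's life.
```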
Also, in the last 5 years, LED efficacy has increased from 25-32 lumens per watt up to >100 lumens per watt. This will become less of an issue as development approaches the theoretical efficacy limit of (approx) 250 lumens per watt.
Depending on the efficacy, LEDs generate enormous amounts of heat in an insanely small area (much like CPU's). This heat is retained inside the LED, unless removed. Very little heat, if any, is projected (down) with the light.
The Light output of LED's is directly proportional to the current that is flowing through the LED. Heat generation is directly proportional to the current flowing through the LED.
The light output of LEDs is negatively affected by heat, and LEDs have a non-linear current-voltage curve: increasing the voltage across an LED can disproportionately increase the current flowing through it, making it a balancing act to ensure the correct current is flowing.
For this reason, high quality LED's are typically CURRENT driven, rather than Voltage driven.
Voltage based LED units are typically a bunch of LEDs joined in series (end to end) with a resistor to limit the current that can flow through the chain (similar to a ballast). When selecting a constant voltage power supply for LEDs, care must be taken to ensure tight line & load regulation, to prevent unwanted over-voltages (and massive increases in current) being applied to the LEDs.
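Sizing that current-limiting resistor is just Ohm's law across the leftover voltage. A sketch (the 3.2 V forward drop and 20 mA are typical indicator-style values I've picked for illustration, not from any particular datasheet):

```python
def series_resistor(v_supply, v_forward, n_leds, current_a):
    """Resistor value (ohms) for a series LED chain: drop the leftover
    voltage across R at the design current (Ohm's law, R = V / I)."""
    v_drop = v_supply - n_leds * v_forward
    if v_drop <= 0:
        raise ValueError("supply voltage too low for this chain")
    return v_drop / current_a

# 12 V supply, three LEDs at 3.2 V forward drop each, 20 mA:
# leftover is 2.4 V, so R = 2.4 / 0.02 = 120 ohms.
```

Note this wastes the leftover voltage as heat in the resistor, which is exactly why serious fittings use constant current drivers instead.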
Originally LEDs were just red, then they made green and yellow and so-on and so forth - decent blue LEDs only arrived in the 1990s.
Most current generation LEDs used in lighting are blue or near-ultraviolet emitters, rather than the anaemic red/green/blue colour-mixing LEDs that were used previously. Manufacturers drop a glob of phosphor on top of the chip (much like a fluorescent tube), which absorbs the blue/ultraviolet light & emits the happy warm white visible light that we want. Modern LEDs emit practically zero ultraviolet light, a significantly lower percentage than natural sunlight.
The CRI of LEDs is improved by filtering out unwanted light, ie removing higher frequency blue/indigo/violet light, which is basically light loss.
Current Generation LED's (as at June 2013) have efficacies above 105 lumens per watt at 3000K with CRI >80, and 90 lumens per watt at CRI >90. Efficacies above 115 lumens per watt for 4000K, 130 lm/w for 5000K and higher are not uncommon.
The higher the colour temperature of an LED, the less light is filtered out, and hence the efficacy (efficiency) increases.
Another thing to note about LEDs is that they are only now becoming mainstream, and the colour temperature consistency between batches is improving, adhering to the 3-step MacAdam ellipse requirement (3 SDCM - Standard Deviation of Colour Matching. David MacAdam was an engineer/scientist in the 1940s who set out to determine the limits of colour variance that humans wouldn't notice under certain conditions).
Organic Light Emitting Diodes - Ooooh glowy/glowy, au naturel, and Environmentally Friendly!
Organic LEDs (OLEDs) are refinements of traditional LED technologies, and you'll find them being used in phones, TVs & etc. OLEDs have the advantage that they are very tiny, not complex, and can basically be "painted on" to a surface as a very thin layer. Apply a voltage or force a current through these, and they'll emit light.
What they aren't (currently) is efficient - they're floundering around 25-45 lumens per watt. Test labs are continually developing OLEDs, aiming at up to 160 lumens per watt, but development is still trailing normal LEDs. Anyone using OLEDs for serious illumination purposes should be sent to the Environmentally Friendly hall of shame. They're great for features only (at the moment).
The Current & Near future of Lighting
LED's used in lighting basically come in several handy packages, being
SMD (surface mount device) and COB (chip on board).
A third type exists: those stupid bubble-type signalling LEDs that are useless for lighting because they have an L70 of about 1 hour. They were designed as indicators, and to be easily replaced when required. Avoid any lighting with these.
SMDs are handy, because any heat generated by them tends to be distributed across a large area, meaning that heat removal is relatively trivial. This means that the technical barrier to entry for SMD based products is very low, and the resultant problem is that any Tom, Dick or Harry with a garden shed and a few spare dollars is now producing products based on SMD LEDs. You will typically find these SMD LEDs in low quality MR16 retrofit lamps, T8 retrofit lamps, low quality downlights, and LED strips/ribbons.
The problem is that each SMD LED only emits a very small amount of light, so any light fitting based on SMDs will have a problem with uniformity, because the light is now the sum of tens of SMD chips, rather than one. Luckily, light tends to add, so if you put a 3000K LED and a 4000K LED together, you'll get roughly 3500K. Put enough SMDs together, and eventually you'll have a product that adheres vaguely to a particular colour temperature.
COBs are one, or a few, quality-controlled miniature LED dies on a single circuit block. Quality control & uniformity are maintained, but they have the problem that all the heat is dissipated in a small surface area, meaning hefty heatsinking and heat removal systems (just like a CPU). Only competent manufacturers have managed to create decent LED systems at high power levels >10-16W over the last few years.
Vast improvements in LED efficacy (hence huge reductions in losses as heat) have greatly reduced this problem, and the technical bar for entry into the COB market is now getting lower. Major manufacturers are introducing/have introduced COB chips that get up to (and now exceed) 80W.
COB are closest to the traditional single point lamp sources that we've been using for a long time, and most people can identify with.
Constant current drivers with efficiency of >85% are common. Constant voltage power supplies with efficiency of >90% are common. Examples of both with low quality and low power factor are very common & should be avoided.
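If you want to know what a fitting actually draws at the wall, divide the LED load by the driver efficiency (a trivial sketch; the 17 W figure is just an example I've made up):

```python
def wall_power(led_watts, driver_efficiency):
    """Mains power drawn for a given LED load, accounting for
    losses in the driver/power supply."""
    return led_watts / driver_efficiency

# A 17 W LED load on an 85%-efficient constant current driver
# pulls about 20 W from the wall - the missing 3 W is driver heat.
```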
Since LEDs are the current buzzword, and smartphones are prolific and also a buzzword, some marketing geniuses (aka idiotic morons unsuitable for anything productive) are now dreaming up uses for both in the media & show industries.
Traditional systems were typically wired, and hence have their limitations, but are well regarded and understood, and are, quite frankly, common.
Since LEDs switch quickly, and already contain a large electronic component, manufacturers are building additional electronics into these units for novelty uses like "wifi enabled LEDs".
I won't tell everyone their business, but seriously? Proprietary interfaces and protocols - really? The world has been down this path multiple times, and it sucked, every blooming time.
The humble light switch is more than adequate for the vast majority of people, and if you want smart control, use a building management system (CBUS anyone?) that is built into the building, and is permanent.
Smart lamps are an additional cost for advanced wifi interface/protocols & logic electronics built into a disposable product. It's bad for the environment. It's bad for your pocket. It's just bad.
Just say no to wifi enabled lamps.