Saturday, March 3, 2012
Given that a specimen of Bose-Einstein Condensate can almost stop a beam of light, can this enormous refractive index be used to build an ultra-compact webcam?
By: Ringo Bones
Ever since Danish-born physicist Dr. Lene Vestergaard Hau and her Harvard team of physicists managed to slow a light beam from 300 million meters per second to 3 meters per second – roughly the pace of a little girl riding a bicycle – with a cigar-shaped specimen of Bose-Einstein Condensate back in 1999, one can only imagine the tremendous refractive index Bose-Einstein Condensate would exhibit in optical applications, given its ability to slow down light by a factor of 100 million.
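For readers who like to check the arithmetic, here is a quick back-of-the-envelope Python sketch of the figure quoted above. Strictly speaking, the "factor of 100 million" implied by Dr. Hau's numbers is a group index – a measure of how much the light pulse itself is slowed – rather than the ordinary refractive index a lens designer would plug into his formulas, but it gives a sense of the scale involved:

# Back-of-the-envelope check of the slow-light figure quoted above (illustrative only).
C = 299_792_458.0   # speed of light in vacuum, m/s
V_GROUP = 3.0       # reported group velocity of the pulse inside the BEC, m/s

n_group = C / V_GROUP
print(f"Implied group index: {n_group:.1e}")   # roughly 1e8, i.e. a factor of ~100 million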
At present, most optical front-ends of the CCD-type digital cameras that are de rigueur in mobile phone / cellular phone / webcam applications are made of low-cost – make that cheap – plastic lenses, never mind ophthalmology-grade glass with a 35% lead content to increase its refractive index, or those expensive and esoteric gemstone-grade diamonds shaped into optical lenses. But what advantages would a “practical” optical-grade Bose-Einstein Condensate lens offer over the current state of the art?
A higher refractive index in Bose-Einstein Condensate optics would make it possible to build webcams and mobile phone / cellular phone cameras far more compact than the ones currently available. This could mean microscopic surveillance cameras too small to be seen by the unaided human eye. Or how about equipping nano-sized robot surgeons with cameras? The applications of such a breakthrough are seemingly endless. Bose-Einstein Condensate digital cameras could be the next big thing at a CES show in Las Vegas in the not-too-distant future.
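To see why a higher refractive index translates into a more compact camera module, consider the thin-lens lensmaker's equation, 1/f = (n − 1)(1/R1 − 1/R2): for the same surface curvature, a larger n gives a shorter focal length, so the lens can sit closer to the sensor. The little Python sketch below uses assumed, purely illustrative numbers (a 5 mm radius of curvature and textbook index values for plastic, high-index glass and diamond):

# Illustrative sketch only – the radius and index values below are assumptions, not
# specifications of any real camera module.
def focal_length(n, r1_m, r2_m):
    """Thin-lens focal length for refractive index n and surface radii r1, r2 (metres)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1_m - 1.0 / r2_m))

R = 5e-3   # assumed 5 mm radius of curvature, symmetric biconvex lens (R1 = +R, R2 = -R)
for label, n in (("cheap plastic", 1.49), ("high-index glass", 1.74), ("diamond", 2.42)):
    print(f"{label:16s} n = {n:4.2f} -> f = {focal_length(n, R, -R) * 1e3:4.2f} mm")

The shorter the focal length for a given curvature, the thinner the whole optical stack can be – which is the entire appeal of an exotic high-index material.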
What about the problem of maintaining the near-absolute-zero temperature of the Bose-Einstein Condensate optical front-end? Given the current state of cost-effective aerogel insulation technology, near-absolute-zero temperatures could arguably be maintained by enclosing the Bose-Einstein Condensate lens in the kind of housings used for semiconductor chips, shifting the problem of maintaining near-absolute-zero temperatures to the manufacturing and fabrication stage. Expensive at first, but electronic hobbyists could one day be soldering packaged Bose-Einstein Condensate lenses onto their latest projects.
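As a rough sanity check on the insulation question, the steady-state conductive heat leak through an aerogel shell follows the familiar Q = k·A·ΔT/d. The figures below are my own assumptions (typical silica-aerogel conductivity, a roughly chip-sized package), not measurements of any actual device:

# Back-of-the-envelope estimate of conductive heat leak through an assumed aerogel shell.
K_AEROGEL = 0.015   # W/(m*K), typical silica aerogel (approximate)
AREA = 1e-4         # m^2, assumed ~1 cm^2 package face
THICKNESS = 5e-3    # m, assumed 5 mm of insulation
DELTA_T = 300.0     # K, room temperature down to near absolute zero

q_watts = K_AEROGEL * AREA * DELTA_T / THICKNESS
print(f"Conductive heat leak: {q_watts * 1e3:.0f} mW")   # ~90 mW for these assumed figures

Whatever cooling scheme ends up inside the package would have to soak up a thermal load of roughly that order.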
Thursday, August 25, 2011
Stanford Prison Experiment: Bad Science?
Forty years ago, a supposedly routine experiment went awry over the course of the study. Are we still ignoring the lessons learned from the Stanford Prison Experiment?
By: Ringo Bones
In August 2011, the notorious Stanford Prison Experiment "silently" celebrated its 40th anniversary. And yet the lessons supposedly learned from the experiment seem to be ignored these days, not just by the scientific community but also by society at large. The Abu Ghraib Prison Scandal is a case in point.
Back around August 1971, a seemingly routine academic study exploring human behavior in a "penal setting" slowly started to go awry as the experiment went on. The subjects playing the role of prisoners eventually adopted a submissive stance, while the subjects playing the role of prison guards became increasingly authoritarian and tyrannical. Shades of the Abu Ghraib Prison Scandal that caught the media's attention back in April 2004?
Even though 40 years have passed, the Stanford Prison Experiment is still oft cited as a prime example of "bad science" and has since established legal precedents on the controls required when conducting such experiments in behavioral psychology. In this day and age, a Stanford Prison Experiment-style study would never be allowed without those since-established controls. And yet the Bush Administration seemingly did something similar in prosecuting its "War on Terror", in the form of Extraordinary Renditions and Enhanced Interrogation.
Friday, August 5, 2011
Is Chemistry Still Relevant in the 21st Century?
As the world celebrates the UN-sponsored 2011 International Year of Chemistry, is the science of chemistry still of vital importance in the 21st Century?
By: Ringo Bones
Back in 1999, science writers raved about labeling the 21st Century the "Century of Biology" due to advances in the biological sciences and the much-touted progress of the human genome mapping project, started just a few years before, along with the discoveries it might usher in a few years down the road. The same science writers dubbed the 19th Century the "Century of Chemistry", probably because the discoveries made in that century ushered in much of the 20th Century's vital chemical industry. The 20th Century, in turn, was dubbed the "Century of Physics", not only because it arguably began in 1896 with various scientists around Europe discovering hitherto unknown facts about the natural phenomenon of radioactive decay, but also because nuclear weapons and the nuclear power generation industry became the primary geopolitical force for much of post-World War II history. But is it still fair to question the importance of the science of chemistry in the 21st Century?
As a chemistry buff, I think an overwhelming share of the landmark discoveries of the science of chemistry happened in the 19th Century, most of them in Victorian-era Europe. German chemist Friedrich Wöhler "repudiated" the barrier between organic and inorganic chemistry when he synthesized urea – an organic compound then believed to be producible only by biological processes – from two inorganic compounds, i.e. by mixing silver cyanate and ammonium chloride. Expecting to find ammonium cyanate – an inorganic salt – as the product, Wöhler instead found that he had produced urea, an organic compound then believed to be "impossible" to synthesize by inorganic means. Just one of the events that led the 19th Century to be dubbed the "Century of Chemistry".
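For reference, the standard textbook rendering of Wöhler's 1828 synthesis is a two-step scheme (this summary is mine, not part of the original anecdote):

AgOCN + NH4Cl → NH4OCN + AgCl (silver cyanate plus ammonium chloride gives ammonium cyanate and silver chloride)
NH4OCN → CO(NH2)2 (on heating, ammonium cyanate isomerizes into urea)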
Sadly, in the 20th Century – and the 21st Century too – lone chemists making landmark discoveries have become a rare event, since most chemists these days are salaried specialists at big industrial chemical companies like Dow and DuPont. Even those clever chemical remediation schemes funded by the American Environmental Protection Agency's Superfund program to clean up groundwater contaminated by toxic chemical wastes are more often than not a team effort by staff chemists.
Monday, November 22, 2010
God is not Necessary - Thus Spake Stephen Hawking
As the latest findings in his search for the “Theory of Everything” got press attention yet again, did Prof. Stephen Hawking get it right in concluding that God is not necessary for the creation of the Universe?
By: Ringo Bones
Back in September 2010, there was a huge uproar over Prof. Stephen Hawking’s latest book, The Grand Design, in which the eminent theoretical physics professor openly stated that God, or any other contrived deity, is not necessary to our Universe coming into being and has nothing whatsoever to do with its day-to-day operations. The remark may seem somewhat opinionated from the standpoint of “soft agnostics”, but even if Professor Hawking is right, many questioned whether he had the right to voice such a conclusion.
From a theoretical physicist’s point of view, Hawking’s The Grand Design offers a strong, proof-based implication that “God”, or any other dogmatically contrived deity, is not needed for the creation of the Universe – i.e. the Big Bang. In this day and age, the strong consensus of the scientific community’s collected evidence-based knowledge could easily trump theological dogma. Unfortunately – even in some affluent nations – religious dogma still reigns supreme and even reserves the right to deny the rights of ethnic minorities and women.
In stark contrast to scientific consensus, organized Christianity’s various denominations and sects can’t even agree on, let alone establish, an evidence-based consensus about the ontological empiricism defining “God”. Yet the Archbishop of Canterbury – who had the good fortune of establishing himself as the first and “loudest” critic of Professor Hawking’s latest opus – says that “physics” can neither prove nor disprove the existence of God. Even Brian May – guitarist of the premier British rock band Queen, who also has several science degrees – voiced his criticism of Professor Hawking’s decision to pit science against religion and spirituality.
Unfortunately, despite now being entrenched on the NY Times Bestseller List, Hawking’s The Grand Design – like his previous works – will more likely than not soon be forgotten, because the overwhelming majority of those who bought it did so as a conversation piece and for its novelty value, as opposed to any genuine fascination with cutting-edge theoretical physics. The science-versus-religion rift will probably never heal, because religion still has the bigger war chest, which enables it to achieve its ends no matter how devious the means.
Monday, January 26, 2009
Incandescent Bulb Phase-Out: A Giant Leap for the Environment?
The phasing out of incandescent bulbs – especially the 100-watt models – has already started in the EU. Will this measure really help our environment?
By: Vanessa Uy
Yep, it’s official. The phasing out of tungsten-filament incandescent light bulbs – especially the 100-watt models – was declared mandatory in Europe by the start of 2009, and so was the end of their manufacture, to be replaced by mercury-vapor compact fluorescent lamps, which – with their screw-in bases – can directly replace the older, less energy-efficient incandescent bulbs. If this were only a question of energy efficiency, why does a somewhat large majority remain skeptical of their use despite the energy-saving properties of compact fluorescent lamps? First, let us compare the two somewhat radically different illumination technologies.
Tungsten-filament incandescent bulbs convert 60 Hz, 220 V alternating-current electrical energy into light for illumination by heating the tungsten filament inside the bulb. The disadvantage of this technology is that only about 10% of the incoming electrical energy is converted into light, while the other 90% is given off as heat. What makes incandescent bulbs useful in poultry incubators makes them somewhat of a waste on the electric bill when it comes to domestic illumination.
Compact fluorescent lamps, or CFLs – since their commercial manufacture and promotion in the late 1980s – have been a “godsend” to those who want to lower their electric utility bills on the illumination front. Like ordinary fluorescent lamps, CFLs convert incoming electrical energy into light when the electricity excites the mercury vapor inside the tube into emitting ultraviolet radiation, which the bulb’s coating of phosphorescent materials – usually zinc sulfide – converts into more or less visible light. Fluorescent-type lamps usually convert 79 to 85% of the incoming electrical energy into light, which makes them easily 7 to 8 times more efficient than ordinary incandescent bulbs in energy-usage terms. The advantage of compact fluorescent lamps over ordinary fluorescent lamps is that, because of their screw-in base, they can be used as a direct replacement for “inefficient” incandescent bulbs. If this is all about lowering our energy consumption and reducing our carbon footprint, then why are there still people “seeing red” over the “green” potential of compact fluorescent lamps?
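To put those efficiency figures into household terms, here is a rough Python comparison of a 100-watt incandescent bulb against a nominally equivalent CFL. The usage hours, electricity price and 23-watt equivalence are my own assumptions, chosen only for illustration:

# Illustrative annual-cost comparison; all inputs below are assumptions.
HOURS_PER_DAY = 4        # assumed daily use of the lamp
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.20     # assumed electricity price per kilowatt-hour

def annual_cost(watts):
    kwh_per_year = watts / 1000.0 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

for label, watts in (("100 W incandescent", 100), ("~100 W-equivalent CFL (23 W)", 23)):
    print(f"{label:30s}: {annual_cost(watts):5.2f} per year")

Under these assumptions the CFL racks up roughly a quarter of the incandescent bulb's running cost – real savings depend, of course, on local electricity prices and how long the lights stay on.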
First, let’s start with aesthetic taste – which most environmentalists would dismiss as irrelevant when it comes to energy use – as the primary, make that the only, reason some people really hate compact fluorescent lamps. CFLs are notorious for their poor spectral output – i.e. the light they give off is utterly unnatural, even compared to fluorescent lamps of the previous generation. Honestly, I can only gain wisdom comparable to that of the newly elected US President Barack Obama when I’m working under northerly light akin to that frequently used by the Dutch painter Jan Vermeer – which, sadly, even the latest generation of compact fluorescent lamps still can’t provide. Lighting for 35 years of hiding in a nuclear-war fallout shelter they are not!
Then there’s the significant mercury content of compact fluorescent lamps. If these lamps end up in countries where the manufacturer has no recycling and proper-disposal operation, then compact fluorescent lamps will be more trouble than they are worth, with expired units contaminating the local biosphere with elemental mercury despite the carbon dioxide emissions these lamps happen to reduce. Then there’s the concern over the somewhat high ultraviolet radiation output of these lamps, especially when they are used as desk lamps – which could leave most of us concerned about possible skin-cancer effects.
Are compact fluorescent lamps – in spite of their energy efficiency – really more trouble than they are worth? In the short term, the answer is a big fat yes. Their spectral output can be an eyesore to a significant number of people. Plus, they need to be disposed of properly when they die – in spite of a lifespan usually five times longer than that of old-style tungsten-filament incandescent bulbs – because of their significant mercury content. Major manufacturers of compact fluorescent lamps should start looking into these problems as soon as possible. Maybe the best way to reduce our carbon footprint when it comes to lighting is to just turn off unnecessary lights, isn’t it?
Sunday, November 30, 2008
Lie Detectors: How Truthful Are These Devices?
The modern lie detector / polygraph test is said to have been used to establish guilt or innocence in offences as varied as petty theft and witnessing an “alleged” alien abduction. But are these machines truthful?
By: Vanessa Uy
The high-profile use of lie detectors / polygraph machines in the US justice system ranges from proving the guilt or innocence of rogue CIA agents to testing the credibility of alien abduction witnesses and victims. Most of us know that these devices are used to determine whether the subject being tested is telling the truth, but can the machine irrefutably determine the guilt or innocence of the “test subject”?
In reality, polygraph test devices – more famously known colloquially as lie detector machines – measure how the subject reacts physiologically to the set of questions being asked. Whether the subject is lying or not is usually determined by the person supervising the test, based on the resulting measurements. One of the few manufacturers of purpose-built polygraph devices is the Lafayette Instrument Company in Lafayette, Indiana, whose polygraph devices cost around 12,000 US dollars each. A typical polygraph – usually classified as a 4-pin device – has several modules that measure galvanic skin response (GSR), breathing rate via a pneumosensor, and heart rate and blood pressure.
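Just to make those four channels concrete, here is a minimal Python sketch of what one question's worth of polygraph data might look like. This is purely illustrative – it is not any vendor's file format or API:

# Illustrative data structure only – not Lafayette Instrument's (or anyone else's) actual format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PolygraphSample:
    time_s: float               # seconds since the question was asked
    gsr_microsiemens: float     # galvanic skin response
    breathing_rate_bpm: float   # breaths per minute, from the pneumosensor
    heart_rate_bpm: float       # beats per minute
    blood_pressure_mmhg: float  # relative cuff pressure

@dataclass
class QuestionRecord:
    question: str
    samples: List[PolygraphSample] = field(default_factory=list)

The point of laying it out this way is that nothing in the record says "lie" or "truth" – that judgment is still left to the examiner reading the traces.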
Newer digital / PC-based polygraph devices now exist (and are far cheaper), but they are not as accurate as a purpose-built polygraph device, though PC-based 4-pin polygraph devices do have the ability to store / save data digitally. Even though these types of polygraph do very well in their intended roles of measuring GSR, heart rate, blood pressure, breathing rate, etc., they cannot yet irrefutably determine the guilt or innocence of the person under test. That's why lie detector / polygraph test data are usually inadmissible in criminal court proceedings where the polygraph data would be used to determine the guilt or innocence of the accused.
The latest form of these “lie detector” devices is called Brain Fingerprinting, which is touted to be more accurate than the polygraph devices currently in use. Developed by Dr. Lawrence Farwell – a Seattle-based neuroscientist – Brain Fingerprinting is a radically new type of “lie detector” claimed to have a greater than 90% certainty rate in determining whether the subject is telling the truth. The newfangled system locks on to the brain's P300 (“MERMER”) response when the test subject is asked a well-selected roster of pertinent questions about the crime, with the test subject’s brain-wave patterns measured via a sensor cap. At present, Brain Fingerprinting test results are not yet admissible as evidence in the majority of US courts.
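The general idea behind P300-style tests can be sketched in a few lines of Python: average the stimulus-locked EEG epochs for "probe" items (details only the perpetrator should recognize) and for irrelevant items, then compare the mean amplitude in the window where the P300 is expected. This is a hedged illustration of the principle, not Dr. Farwell's actual, proprietary algorithm:

# Hedged sketch of the P300 comparison idea; sampling rate, window and the loader are assumptions.
import numpy as np

FS = 250                  # assumed EEG sampling rate, Hz
WINDOW_S = (0.3, 0.6)     # seconds after stimulus onset where the P300 typically peaks

def mean_p300_amplitude(epochs):
    """epochs: numpy array of shape (n_trials, n_samples), one channel, stimulus-locked."""
    erp = epochs.mean(axis=0)                     # average the trials into an event-related potential
    start, stop = int(WINDOW_S[0] * FS), int(WINDOW_S[1] * FS)
    return float(erp[start:stop].mean())

# probe_epochs, irrelevant_epochs = load_epochs(...)   # hypothetical loader, not a real API
# recognised = mean_p300_amplitude(probe_epochs) > mean_p300_amplitude(irrelevant_epochs)

A markedly larger probe response than irrelevant response is taken as evidence that the subject recognizes the crime-related detail.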
Despite the advances in lie-detection technology over the years, the US justice system is still wary of accepting polygraph / lie-detection data as evidence because the test results are open to interpretation, and lie-detection devices arguably violate the defendant’s constitutional right against self-incrimination when such devices are used in criminal trial proceedings. Plus, it has been proven that polygraph devices / lie detectors are not infallible. Rogue CIA officers Aldrich Ames and Edward Lee Howard “aced” their polygraph tests during the 1980s even though the other evidence against them proved their guilt. Edward Lee Howard even managed to escape behind the Iron Curtain more than 20 years ago, and his whereabouts remain unknown despite the Cold War having ended almost two decades ago.
Thursday, October 16, 2008
Blue LED Water Purification
They consume very little power for the amount of light they give off, but do blue light-emitting diodes, or LEDs, produce enough ultraviolet (UV) radiation to kill water-borne bacteria and make water safe to drink?
By: Ringo Bones
I was skeptical at first, given my first-hand experience and working knowledge of light-emitting diodes. But a research scientist at the Berlin Institute of Technology recently claimed that he had developed a set-up to purify water – i.e. to kill water-borne bacteria with ultraviolet radiation – using just an array of blue light-emitting diodes. If this works, it could start a revolution in how we obtain safe drinking water, given that blue LEDs are claimed to be hundreds of times – or more – more efficient than the mercury-vapor lamps currently used to produce the ultraviolet rays that kill water-borne bacteria and other pathogens as a way of making water safe to drink.
The blue LED water purification concept was aired on October 6, 2008 on a DW-TV science program titled Tomorrow Today. Michael Kneissl of the Berlin Institute of Technology demonstrated his blue LED water purification prototype, claiming that the blue light-emitting diode array produces enough UV radiation to “zap” harmful water-borne bacteria. If this is true, then Michael Kneissl has probably come up with a Nobel Prize-worthy concept, given that the mercury-vapor UV lamps currently used in this type of water purification are very power-hungry in comparison to the (claimed off-the-shelf) blue light-emitting diodes he used.
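The arithmetic behind any UV disinfection set-up is simple enough to sketch: the delivered dose is irradiance multiplied by exposure time, so the required exposure is dose divided by irradiance. The target dose and LED irradiance below are my own assumed figures, not numbers from Kneissl's demonstration:

# Rough exposure-time estimate; both input figures are assumptions for illustration only.
TARGET_DOSE_MJ_PER_CM2 = 40.0    # a commonly cited drinking-water UV disinfection dose (assumed)
IRRADIANCE_MW_PER_CM2 = 0.5      # assumed UV irradiance delivered by the LED array at the water

exposure_time_s = TARGET_DOSE_MJ_PER_CM2 / IRRADIANCE_MW_PER_CM2
print(f"Required exposure: {exposure_time_s:.0f} seconds")   # 80 s for these assumed figures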
Theoretically, light-emitting diodes can last thousands of years – up to 150,000 years, by some estimates – if used well below their rated current limits. Even if the Berlin Institute of Technology’s blue LED-based water purification system ran its diodes at current levels very near their rated limit, they would still last years compared to UV-generating mercury-vapor lamps. If the concept goes online, it will probably be the water purification method with the lowest carbon footprint, given the energy efficiency of light-emitting diodes.