09 December 2016

#FacesofPhotonics: Inspired

Among the #FacesofPhotonics: Student Leadership
Workshop participants at SPIE Optics + Photonics
Guest blogger: Emily Power is a Winter Quarter communications graduate of Western Washington University and most recently a social media intern for SPIE, the international society for optics and photonics. She is blogging on responses to the SPIE #FacesofPhotonics campaign, sharing the stories of SPIE students around the globe.

It is a commonly known fact: students are the future. Around the world, students with ideas, opinions, and innovative minds are preparing for their opportunities to conceptualize and create the next advances for the ever-changing world in which we live.

In the field of optics and photonics, students are making a difference even now, sharing their work and building their networks through conferences such as SPIE Photonics West, coming up next month in San Francisco.

The SPIE campaign #FacesofPhotonics was developed as a showcase across social media to connect students from SPIE Student Chapters around the world, highlighting similarities, celebrating differences, and fostering a space for conversation and community to thrive. Students were invited to share their perspectives and successes via SPIE’s social media channels.

The results were amazing, and we’ll be sharing some excerpts on these pages over the next few weeks.

This week, we feature students who described how they are inspired by their field.

Michael J. Williams
Michael J. Williams is a PhD student studying optics at Delaware State University. He earned his master’s degree in materials science from Fisk University and his bachelor of science from Morehouse College.

In #FacesofPhotonics, he tells of a moment during SPIE Optics + Photonics 2016 when he was inspired by SPIE CEO Eugene Arthurs.

“At a town hall meeting held during the conference, there was a question asked by a professor of how optics and photonics awareness can be spread to third-world countries for their benefit,” Michael wrote. “Dr. Arthurs responded by saying that before we even think about going to other countries, we need to reach the inner-city black and Latino communities in our cities first.

“That encouraged me so much because quite honestly, I was the only born-and-raised black American at the event, and I come from the inner-city where people have written those kids off as being too unintelligent or saying they do not have the propensity to learn complex science.

“I thanked Dr. Arthurs personally for redirecting the need for optics awareness to poor and low-income communities who may have the desire and intrinsic skill to create colorful innovation for a different point of view. They just need consistent encouragement and the opportunity.”

Laura Tobin is a postgraduate student at University College Dublin. She pursues her interests in optics and renewable energy by studying electrical and electronic engineering.

Laura said she found inspiration at SPIE Optics + Photonics 2010, when “I attended my first outreach workshop, ‘Optics Magic’ by Judy Donnelly and Nancy Magnani. This workshop inspired and motivated me to start doing #scicomm and outreach. I honestly don’t think I would have achieved or gone for half the things that I have done if I hadn’t attended that conference.”

Matt Posner
Born and raised in France, Matt Posner is a postgraduate student studying optoelectronics at the University of Southampton. He is currently president of his university’s SPIE Student Chapter. Matt wrote about the inspiration he found at Optics + Photonics in 2016, centered on connections he made there: “I had really rich and inspiring discussions with the people that came to see our experiments, and made lots of contacts with people from all around the world who are passionate about photonics.”

For full stories and more inspiration, follow @SPIEphotonics on Instagram, and look for the #FacesofPhotonics tag.

19 September 2016

Peer Review Week celebrates the 'unsung heroes'

The second annual Peer Review Week spans 19-25 September 2016. This global event celebrates the vital role that peer review plays in achieving exceptional scientific quality.

“Recognition for Review” is the theme for this year’s event, which is dedicated to recognizing contributions made by those participating in peer review activities ranging from conference submissions to publication and grant reviews.

“Reviewers are the unsung heroes of scholarly journals,” said Optical Engineering editor-in-chief Michael Eismann in his first editorial of 2016. “Generally operating in anonymity, they ensure that published articles meet the journal’s standards of originality, significance, scientific accuracy, and professional quality.”

In another editorial, "Four attributes of an excellent peer review", Eismann defined how to ensure that peer review results in quality publications.

The Peer Review Week event calendar includes several webinars on various topics, online Q&A sessions, workshops, and presentations.

SPIE, the international society for optics and photonics, publishes 10 peer-reviewed journals in the SPIE Digital Library, the largest collection of literature in the field of optics and photonics.

This year, several of the journal editors-in-chief joined Eismann in offering thanks to their top reviewers -- read more at the links below about the dedicated individuals who work to ensure quality publications!

29 August 2016

Big dreams and nanomedicine: optical nanotransformers

Guest blogger: Elizabeth Bernhardt, a physics research assistant in nonlinear optics at Washington State University, is blogging on presentations at SPIE Optics + Photonics in San Diego, California, 28 August through 1 September.

Dream big dreams, create amazing solutions:
Paras Prasad offered inspiration in a talk on
how nanomedicine can save lives
Treating diseases in the human body can be incredibly difficult, and certain cancers may even be inoperable.

In the opening all-symposium plenary at SPIE Optics + Photonics 2016, Paras Prasad, Executive Director of the Institute for Lasers, Photonics, and Biophotonics at the University at Buffalo, New York, told how he aims to bring treatment directly to the source of the disease, using light.

Inspired early on by the 1966 science-fiction film Fantastic Voyage, Dr. Prasad imagined sending something tiny into the human bloodstream to specifically target disease. He turned science fiction into reality via nanomedicine.

Nanomedicine uses incredibly small devices, such as multilayered nanotransducers, to treat human diseases from inside the body. The first layer absorbs a particular wavelength of light. The next layer converts this absorbed energy to a higher or lower energy, which is then re-radiated at a correspondingly shorter or longer wavelength.

The overarching idea is to take low-energy light, such as infrared, send it to a particular location in the body, then change the light to a different, more useful energy. IR light easily passes through the human body with very little damage. Nanotransducers absorb this light, turning it into useful, high-energy visible light, which is easily and readily absorbed by nearby cells. The cells are then destroyed, for an effective and potentially less dangerous way of treating cancer.
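As a rough illustration of the energy bookkeeping behind upconversion, the sketch below computes photon energies at assumed, typical wavelengths (not figures from the talk): a single visible photon carries more energy than a single IR photon, so a nanotransducer must pool the energy of several IR photons for each visible photon it emits.

```python
# Hedged sketch: photon-energy arithmetic behind upconversion.
# Wavelengths are assumed, typical values, not from Prasad's talk.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

e_ir = photon_energy_ev(980)   # near-IR excitation: penetrates tissue well
e_vis = photon_energy_ev(540)  # green emission: readily absorbed by cells

print(f"IR photon:      {e_ir:.2f} eV")   # 1.27 eV
print(f"visible photon: {e_vis:.2f} eV")  # 2.30 eV
print(f"IR photons pooled per visible photon: >= {e_vis / e_ir:.2f}")
```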

Dr. Prasad described another dream becoming reality, via the work of Nobel Laureate Maria Goeppert-Mayer, who developed the theory of two-photon absorption.

At the time, it was assumed that experimental verification would never be possible. With the development of the laser, however, such two-photon processes became commonplace: a typical green laser pointer combines pairs of infrared photons into single green photons.
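The bookkeeping for combining two photons is simple energy addition: energies add, so inverse wavelengths add. A minimal sketch with assumed wavelengths:

```python
# Hedged sketch: when two photons are combined into one, their energies add,
# so inverse wavelengths add: 1/l_out = 1/l_1 + 1/l_2. Wavelengths assumed.
def combined_wavelength_nm(l1_nm, l2_nm):
    """Wavelength of a single photon carrying the energy of two photons."""
    return 1.0 / (1.0 / l1_nm + 1.0 / l2_nm)

# Two 1064 nm infrared photons -> one 532 nm photon, the green of a
# typical laser pointer's frequency-doubling crystal.
print(f"{combined_wavelength_nm(1064, 1064):.1f} nm")  # 532.0 nm
```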

Moreover, two-photon absorption can be used for dental bonding, killing bacteria, two-photon microscopy, and more. Indeed, Dr. Prasad showed materials applicable to night vision, security, and friend-foe identification. These materials appear to be different colors based on the light they absorb.

He challenged the audience to turn their own imaginings into reality as well. Perhaps the next project in optogenetics (using light to control genetically modified cells) will cure or help people with neurological disorders, or even enhance capabilities ... maybe one day neurophotonics will help Superman jump from the pages of a comic book into real-life super-human capabilities.

Note: On Wednesday 31 August, Dr. Prasad will receive the SPIE Gold Medal, the highest award of the Society, in recognition of his work.

25 August 2016

Eight to anticipate: photonics technologies coming our way

Optics and photonics technologies are at work improving our lives in many ways.

These technologies provide sustainable lighting and energy-generation systems. Nanoparticles are used to rapidly diagnose disease or derive 3D images of living, functioning cells. Optical resonators detect counterfeit or pirated goods. Airborne telescopes probe deep into the Universe while optical fibers send messages instantly across the globe.

Engineers and scientists from around the world meet every August at SPIE Optics + Photonics in San Diego to advance research in several broad areas of optics and photonics. A few of the 3,000+ researchers who will present reports next week have provided previews via articles they have authored recently for the SPIE Newsroom.

“Multicolor rapid diagnostics for infectious disease,” Kimberly Hamad-Schifferli, Chunwan Yen, Helena de Puig, José Goméz-Marquéz, Irene Bosch and Lee Gehrke [ref. 9923-28, Tuesday 30 August, 9 a.m.]
Recent epidemic outbreaks have highlighted the need for a rapid point-of-care assay that can provide a diagnosis to enable treatment, proper quarantining, and disease surveillance. One promising diagnostic is the lateral flow test, i.e., the same type of assay used in pregnancy tests: a paper strip to which a biological fluid is added. These are attractive for diagnostics because they are inexpensive, easy to use, and do not require special reagents or experts to run them.

“Custom complex 3D microtubule networks for experimentation and engineering,” Michael Vershinin, Jared Bergman and Florence Doval [ref. 9930-4, Sunday 28 August, 10 a.m.]
Cargo logistics — driven by cytoskeletal motors and proceeding along actin and microtubule filaments — are an essential subsystem of the overall machinery of eukaryotic cells. It is no stretch to say that virtually every process in a living cell depends, directly or indirectly, on proper routing of cargoes in a timely fashion. Much progress has been made in the last few decades in understanding the structure and properties of individual filaments and motors, but clean experimental modeling of how these components add up to a functional cytoskeleton still poses many challenges.

“Anisotropic Fabry-Perot resonators for anticounterfeiting applications,” In-Ho Lee, Eui-Sang Yu, Se-Um Kim and Sin-Doo Lee [ref. 9940-2, Sunday 28 August, 8:55 a.m.]
The prevalence of counterfeited and pirated goods in modern society has increased the demand for anticounterfeiting technologies. Global trade of such items in 2015 was estimated to be worth $960 billion, and a danger to 2.5 million jobs. Much effort has been made in the development of smart security labels designed to hide information in normal conditions and reveal it in others.

“Organic LEDs with low power consumption and long lifetimes,” Satoshi Seo [ref. 9941-18, Sunday 28 August, 4:40 p.m.]
An LED with an emissive organic thin film sandwiched between the anode and cathode is known as an organic-LED (OLED). The emission mechanism of an OLED is superficially similar to that of a standard LED, i.e., holes and electrons are injected from the anode and cathode, respectively, and these carriers recombine to form excited states (excitons) that lead to light emission. In recent years, smartphones and TVs with OLED displays have rapidly become widespread because OLEDs provide high contrast, a wide color gamut, light weight, thinness, and flexibility for the displays. OLEDs also have great potential for the creation of new lighting applications. The high power consumption and short lifetime of OLEDs, however, remain key issues.

“Using femtosecond lasers to grow nonlinear optical crystals in glass,” Carl Liebig, Jonathan Goldstein, Sean McDaniel, Eric Glaze, Doug Krein and Gary Cook [ref. 9958-5, Sunday 28 August, 10:30 a.m.]
Non-centrosymmetric crystals whose optical response does not vary linearly with the strength of an electric field — known as nonlinear optical (NLO) crystals — are the fundamental building blocks for most electro-optic applications. The production of novel NLO crystals is very difficult because it entails bulk techniques that require long growth times and expensive equipment, and that often result in low-quality crystals. For more than three decades, lasers have been used to make modifications to glass refractive indices in the fabrication of high-efficiency waveguides. In recent work, the use of femtosecond laser sources facilitates the fabrication of multidimensional structures composed of many types of NLO crystals.

“Synchrotron ‘pink beam’ tomography for the study of dynamic processes,” Mark Rivers [ref. 9967-33, Tuesday 30 August, 3:00 p.m.]
Computerized axial tomography scanning has revolutionized medical imaging, and through microtomography, its spatial resolution can be reduced from the millimeter scale to the micrometer scale. Microtomography has developed rapidly, driven by developments in x-ray sources, computers, and particularly in detectors. There are now microtomography systems available for laboratory use and microtomography has been applied to fields including biology, geology, soil science, and the study of meteorites. Monochromatic beams are generally unsuitable for dynamic studies. So-called pink beam microtomography is an alternative.

“Making unique IR observations with an airborne 2.5m telescope,” Eric Becklin, Maureen Savage, Erick Young and Dana Backman [ref. 9973-17, Tuesday 30 August, 8:30 a.m.]
Large parts of the IR spectrum are inaccessible in observations made from ground-based telescopes because of absorption by water vapor in the atmosphere. For this reason, the Stratospheric Observatory for IR Astronomy (SOFIA) — a joint project between NASA and the German Aerospace Center (DLR) — was designed and has been operational since 2010. SOFIA has become a key facility for several astronomy investigations, e.g., for studying regions of star formation, observing objects obscured by interstellar dust, and making time-critical measurements of transient events.

“Robust photon-pair source survives rocket explosion,” Zhongkan Tang, Rakhitha Chandrasekara, Yue Chuan Tan, Cliff Cheng, Kadir Durak and Alexander Ling [ref. 9980-8, Monday 29 August, 8:05 a.m.]
Quantum key distribution (QKD) is of much interest for quantum communications because of its high level of privacy (underpinned by quantum mechanics). In particular, entanglement-based QKD is a powerful technique in which quantum correlations between photons are leveraged. In this process, the entangled photons can be distributed with the use of optical fibers or ground-level free-space links. Current QKD networks, however, suffer from a distance limit because of fiber losses and the lack of quantum repeaters.

16 August 2016

Keeping nighttime lighting under control

Yosemite National Park offers stunning views of mountain vistas during the day and star-filled skies at night. This view often includes the Milky Way -- invisible to almost one third of Earth’s population due to light pollution.

Artificial lighting is restricted in Yosemite, but some areas in the park require lighting, such as parking lots and pathways between buildings. Light pollution can not only have a negative effect on visitors’ experiences, but can also change the natural rhythms of the park’s wildlife.

University of California, Merced (UC Merced) graduate student Melissa Ricketts has found a solution – by turning one of her professor’s inventions upside down. In an article from UC Merced’s University News, Ricketts describes what she calls “prescribed irradiance distribution.”

Ricketts is a member of UC Solar, a multicampus research institute headquartered at UC Merced headed by Roland Winston, the inventor of nonimaging optics. His compound parabolic concentrator (CPC) is a key piece of solar-collecting equipment in the emerging solar energy industry. Ricketts has developed a way to make Winston’s CPC emit light rather than gather it.

“It’s the reverse of the solar collector,” Ricketts said. “We can make a perfect square of LED light, or a circle, or whatever shape works best to illuminate only what needs to be illuminated.”
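The shaping Ricketts describes rests on the same nonimaging-optics relation that governs Winston's concentrator. This sketch uses the textbook ideal-concentration limit; the numbers are illustrative assumptions, not project data:

```python
import math

# Hedged sketch of the nonimaging-optics limit behind the CPC: an ideal 3D
# concentrator with acceptance half-angle theta achieves C_max = 1/sin^2(theta).
# Run in reverse as a luminaire, the same geometry confines an LED's output
# to that half-angle. All numbers below are illustrative assumptions.

def max_concentration_3d(half_angle_deg):
    """Ideal concentration ratio for acceptance half-angle theta (degrees)."""
    return 1.0 / math.sin(math.radians(half_angle_deg)) ** 2

def footprint_radius_m(mount_height_m, half_angle_deg):
    """Radius of the lit circle for a reversed CPC mounted at a given height."""
    return mount_height_m * math.tan(math.radians(half_angle_deg))

print(f"C_max at 30 deg: {max_concentration_3d(30):.1f}x")                 # 4.0x
print(f"lit radius, 4 m pole, 30 deg: {footprint_radius_m(4, 30):.2f} m")  # 2.31 m
```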

Ricketts has been working with Steve Shackelton, a UC Merced staff member and former Yosemite chief ranger, on what they call “The Sand Pile Project.” Although most of their work is done in the lab, designs are occasionally tested in Yosemite on a large pile of sand that snowplow operators spread on the park roads when needed. The park needs to keep the sand pile well-lit so it can be accessed at any time, but the lighting should have minimal effect on the surrounding areas.

UC Merced graduate student Melissa Ricketts sets up her LED
 lighting solution in the Sand Pile at Yosemite National Park
Credit: Courtesy of UC Merced

Yosemite is cautious about introducing new technology into the park, but officials have been supportive of Ricketts’ research into managing light, letting her use the area as a test site where her work could eventually have global implications for wildlife and park visitors.

“We’re hoping to show the park we can eliminate the unnecessary light,” Ricketts said. She’s currently seeking funding to make the project viable for Yosemite and other parks.

08 August 2016

Laser-induced removal of space debris

If you never thought something as small as a paint chip could have the potential to destroy the International Space Station, think again. Traveling at speeds upwards of 17,500 mph, the ISS could be torn apart in an instant by debris smaller than a marble. NASA is currently tracking more than 500,000 objects orbiting Earth, including non-operational satellites and discarded stages from past rocket missions. But the greatest risk to active satellites and space missions comes from the millions of pieces of debris that are nearly impossible to track.
7 mm chip on ISS window caused by a small
fragment of space debris no larger than
a few microns across

An article from 12 May 2016 in the Washington Post reported the International Space Station’s recent collision with “something as unassuming as a flake of paint or a metal fragment just a few thousandths of a millimeter across.”

The fragment left a 7-millimeter chip in a window of the European-built Cupola module. ESA astronaut Tim Peake was the first to snap a picture of the damage, then shared it with the world on his Twitter account.

So how might we deal with all this hazardous space material? Lasers!

In “Laser-based removal of irregularly shaped space debris,” Stefan Scharring, Jascha Wilken, and Hans-Albert Eckel of the German Aerospace Center discuss a new method that applies laser-induced damage principles to cleaning up space junk: high-energy laser pulses modify the orbit of debris, causing it to burn up in the atmosphere.

The greatest improvement over previous studies of laser-based debris removal is the ability to target irregularly shaped objects – a characteristic shared by most space material.
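For a feel of the scale involved, here is a back-of-the-envelope of the pulse-by-pulse velocity change that laser ablation can impart. Every number is an assumed, order-of-magnitude value for illustration; none are taken from the paper.

```python
# Hedged back-of-the-envelope: laser ablation imparts an impulse per pulse of
# dp = C_m * E_pulse, where C_m is the momentum-coupling coefficient.
# Repeated pulses slow the debris until its orbit decays into the atmosphere.
# All values below are assumed, order-of-magnitude illustrations.
C_M = 75e-6      # momentum coupling, N*s per joule (typical order for ablation)
E_PULSE = 10e3   # laser energy delivered to the target per pulse, J
MASS = 1.0       # debris mass, kg

dv_per_pulse = C_M * E_PULSE / MASS  # velocity change per pulse, m/s
DV_NEEDED = 150.0                    # rough retro delta-v to deorbit from LEO, m/s
pulses_needed = DV_NEEDED / dv_per_pulse

print(f"delta-v per pulse: {dv_per_pulse:.2f} m/s")            # 0.75 m/s
print(f"pulses for {DV_NEEDED:.0f} m/s: {pulses_needed:.0f}")  # 200
```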

To get a better picture of how much debris we’re working with, watch this short video simulating the increasing amount of space junk that has accumulated over the years in low Earth orbit (LEO).

Claude Phipps of Photonic Associates, LLC and his colleagues have been researching laser orbital debris removal (LODR) for over 15 years and have concluded that it is a very promising technique. Laser technology is improving at an astounding rate and is proving to be the most cost-efficient solution to space junk clean up.

20 July 2016

Grilling robot takes over backyard barbecue

Photonics has already made profound contributions to such areas as medicine, energy, and communications to make our everyday lives more efficient. (Hence the name of this blog.) People in all walks of life benefit from the incorporation of photonics technologies. We look forward to future advancements when the technology may help find a cure for cancer, monitor and prevent climate change, and pave the way to other advancements we can’t even visualize yet.

But here’s a photonics-based invention -- already demonstrated -- that breaks ground in a new area: the backyard barbecue. Talk about hot fun in the summertime!

The BratWurst Bot made its appearance at the Stallwächter-Party of the Baden-Württemberg State Representation in Berlin. It’s made of off-the-shelf robotic components such as the lightweight Universal Robots arm UR-10, a standard parallel gripper (Schunk PG-70) and standard grill tongs. A tablet-based chef’s face interacted with party guests.

Two RGB cameras and a segmentation algorithm with background subtraction were used to localize each sausage on the grill. A special challenge was the changing color of the sausages. As one was completely cooked and served, the robot replaced it with a new one on the grill.
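Segmentation by background subtraction can be sketched in a few lines of NumPy. This is an assumed, minimal stand-in for illustration, not the FZI team's implementation:

```python
import numpy as np

# Hedged sketch of background-subtraction segmentation: pixels that differ
# enough from an empty-grill reference frame become foreground candidates.
# The threshold and frame sizes are assumed for illustration.
def segment(frame, background, threshold=40):
    """Boolean foreground mask for an RGB frame of shape (H, W, 3), uint8."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff.max(axis=-1) > threshold  # largest per-pixel channel change

# Toy frames: uniform grey background with one bright "sausage" patch.
background = np.full((4, 6, 3), 100, dtype=np.uint8)
frame = background.copy()
frame[1:3, 2:5] = 180  # a 2x3 patch of brighter pixels

mask = segment(frame, background)
print(int(mask.sum()))  # 6 foreground pixels
```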

The BratWurst Bot was developed by the FZI Research Center for Information Technology at the Karlsruhe Institute of Technology; FZI also has a branch office in Berlin. A press release from FZI points out that hardware and software components of the grilling robot are also applicable to other processes like cooperative assembly or difficult manipulation tasks.

24 June 2016

Sky survey, AMA recommendations say it's time to reduce light pollution

A major focus of the International Year of Light and Light-Based Technologies was raising awareness of light pollution. With the rapid dissemination of LED lighting, one unfortunate side-effect is the proliferation of a higher color-temperature illumination. This has many documented negative effects on wildlife behavior and migration, as well as on human circadian rhythms. In addition, scientists are studying further problems in human health that may be indirectly related to different lighting, including higher incidence of some cancers.

Meanwhile, cities and towns across the globe enthusiastically switch to LED street lighting. The energy savings are significant, but in news reports of the plans and projects, there is usually no mention of the technical specifics (or “warmth”) of the light. Early bright white LED streetlights were mostly above 4000K, whereas warmer versions are now available, 3000K or below.

One problem with extremely bright light is that it impairs vision in darker areas, so any illusion of safety at night vanishes as visibility diminishes once you get into a shadow. For drivers as well as pedestrians, this can be dangerous.

Now the American Medical Association (AMA) has issued guidance encouraging communities to adopt LED lighting that minimizes blue-rich light. The AMA also recommended that “all LED lighting should be properly shielded to minimize glare and detrimental human health and environmental effects, and consideration should be given to utilize the ability of LED lighting to be dimmed for off-peak time periods,” according to a press release.

“The guidance adopted by grassroots physicians who comprise the AMA's policy-making body strengthens the AMA's policy stand against light pollution and encourages public awareness of the adverse health and environmental effects of pervasive nighttime lighting,” the release says.

We requested a copy of the original report that led to these recommendations from the AMA. Here’s an excerpt:

More recently engineered LED lighting is now available at 3000K or lower. At 3000K, the human eye still perceives the light as “white,” but it is slightly warmer in tone, and has about 21% of its emission in the blue-appearing part of the spectrum. This emission is still very blue for the nighttime environment, but is a significant improvement over the 4000K lighting because it reduces discomfort and disability glare. Because of different coatings, the energy efficiency of 3000K lighting is only 3% less than 4000K, but the light is more pleasing to humans and has less of an impact on wildlife.

“Disability glare” is defined by the Lighting Research Center at Rensselaer Polytechnic Institute as “the reduction in visibility caused by intense light sources in the field of view [because of] stray light being scattered within the eye.”

One city that put the brakes on the brighter LEDs is Davis, California. In 2014, as new lights were being installed, the Davis city council put the project on hold after multiple complaints from residents. Later, the city decided to spend an additional $325,000 to replace those too-bright streetlights in residential areas. However, Davis is the exception. Places that have not yet committed to the switch are encouraged by the International Dark-Sky Association (IDA), the Lighting Research Center, and others to ask the right questions and study the issues involved beyond the simple let’s-save-energy approach. (In fact, that justification is up for debate as well -- it seems that when something gets cheaper, people tend to use more of it.)

Last fall, SPIE Newsroom published an article exploring these issues and collecting the advice of lighting experts. Recommendations for municipalities considering a change are included. (See “LED light pollution: Can we save energy and save the night?” by Mark Crawford.)

Just this month, a world atlas of artificial sky luminance, described in Science Advances, reported that 80% of North Americans and one third of all humans are unable to see the Milky Way because of light pollution. Calculated with data from professional researchers and citizen scientists, the atlas also takes advantage of newly available low-light imaging data from the VIIRS DNB sensor on the Suomi National Polar-orbiting Partnership (NPP) satellite. The authors conclude:

"Light pollution needs to be addressed immediately because, even though it can be instantly mitigated (by turning off lights), its consequences cannot (for example, loss of biodiversity and culture)."

The IDA says this is a "watershed moment." The sky atlas and the AMA recommendations offer "an unprecedented opportunity to implore cities to transition to LEDs in the most environmentally responsible way possible." It's a good chance to start a conversation with your elected officials.

07 June 2016

Photonics on the farm: robotics to help feed the world

Simon Blackmore talks about farming with robots
for precision agriculture in an
SPIE Newsroom video interview [6:58].
Ten to 15 years ago, farmers used to laugh when Simon Blackmore and his colleagues talked about deploying robotics for such chores as weeding, protecting crops from disease or pests, or selecting harvest-ready vegetables — all while helping to cut costs and limit chemical and other impacts on the soil.

Now, he said in an SPIE Newsroom video interview posted last week, they’re asking questions about how robotics and other photonics-enabled technologies can help save energy and money, minimize soil damage, and improve crop yield.

Blackmore, who is Head of Engineering at Harper Adams University in Shropshire, director of the UK National Centre for Precision Farming (NCPF), and project manager of FutureFarm, also shared his ideas in a new conference at SPIE Defense and Commercial Sensing in April on technologies with applications in precision agriculture.

Blackmore and his NCPF colleagues are working to overhaul current farming practices by intelligently targeting inputs and energy usage. Their lightweight robots are capable of planting seeds in fields even at full moisture capacity, replacing heavy tractors that compact and damage the soil.

Simon Blackmore
Robots have also been designed with micro-tillage capabilities, to target the soil at individual seed positions, and for selective harvesting of crops for quality assurance.

“Now one of my former PhD students has developed a laser weeding system that probably uses the minimum amount of energy to kill weeds, by using machine vision to recognize the species, biomass, leaf area, and position of the meristem, or growing point,” Blackmore said.

A miniature spray boom of only a few centimeters wide can then apply a microdot of herbicide directly onto the leaf of the weed, thus saving 99.9% by volume of spray. Or, a steerable 5W laser can heat the meristem until the cells rupture and the weed becomes dormant. These devices could be carried on a small robot no bigger than an office desk and work 24/7 without damaging the soil or crop.

Not surprisingly, data is a hot topic in the field of precision agriculture.

Several speakers at the April event — among them John Valasek and Alex Thomasson of Texas A&M University (TAMU), chairs of the conference, and Elizabeth Bondi of the Rochester Institute of Technology (RIT) — spoke about best practices for collecting data, and Kern Ding of California State Polytechnic University discussed data processing techniques.

Valasek also described several sensors and different ways they may be flown. Factors such as weather, speed, altitude, and frame rate can dramatically change the quality of the data products from UAV imagery.

Bondi discussed the calibration of imagery from UAVs (unmanned aerial vehicles, such as drones) to maintain consistency over time and under different illumination conditions.

Other speakers — Haly Neely of TAMU, Carlos Zuniga of Washington State University, and Raymond Hunt of the U.S. Agricultural Research Service — focused on the use of UAVs for such applications as soil variability, irrigation efficiency, insect infestation, and nitrogen management for crops including cotton, grapes, and potatoes.

Plant phenotyping — the analysis of crop characteristics such as growth, height, disease resistance, nutrient levels, and yield — is vital to increase crop production. Taking these data with current methods can damage plants, and is time-consuming and expensive. UAVs, carrying the right sensors, have the potential to make phenotyping more efficient and less damaging.

Speakers Yu Jiang of the University of Georgia, Andrew French of the U.S. Arid-Land Agriculture Research Center, and Grant Anderson of RIT described ground-based systems to expedite phenotyping, and Joe Mari Maja of Clemson University, Yeyin Shi of TAMU, Maria Balota of Virginia Polytechnic Institute, and Lav Khot of Washington State University discussed UAV-based systems.

With images and measurements from such devices, for example, cotton height can be determined and cotton bolls counted, soil temperature can be mapped, and nutrient levels in wine grapes can be assessed remotely.

Small- and mid-sized farms are expected to see the largest yield increase from these initiatives. The ultimate result of all this photonics-enabled precision agriculture is profound: more productive farms and gardens, and healthier, more nutritious food for a growing world population.

Thanks to Elizabeth Bondi and Emily Berkson, both of RIT, for contributions to this post.

06 April 2016

Cataract surgery: misnomer?

On left, the patient’s left eye has no cataract and all structures are visible. On right, retinal image from fundus camera confirms the presence of a cataract. (From Choi, Hjelmstad, Taibl, and Sayegh, SPIE Proc. 85671Y, 2013)

Article by guest blogger Roger S. Reiss, SPIE Fellow and recipient of the 2000 SPIE President's Award. Reiss was the original Ad Hoc Chair of SPIE Optomechanical Working Group. He manages the LinkedIn Group “Photonic Engineering and Photonic Instruments.”

The human eye and its interface with the human brain fit the definition of an "instrument system."  The human eye by itself is also an instrument by definition.

After the invention of the microscope and the telescope, the human eye was the first and only detector for hundreds of years, only to be supplemented, and in most cases supplanted, by electro-optical detectors of various configurations.

The evolution of the eye has been and still is a mystery.  An excellent article titled "Seeing the Light" in National Geographic (February 2016) gives a very good explanation of the eye's development.

Having recently had cataract surgery, my interest in the eye was stimulated. First, I wondered why "cataract surgery" is called "cataract surgery."

In cataract surgery, no surgery is performed on the cataracts (cataract material) themselves. A very small incision is made in the lens pocket, a flushing substance is introduced through the opening, and that substance carries the cataract material back out through the same opening. The cataract material may require ultrasonic fracturing to reduce particle size.  An artificial, machine-made lens is then inserted through the opening, which may or may not require suturing. This procedure would more accurately be known as "lens replacement surgery."

Why are a large number of measurements made on the eye before the eye surgery?

Without invading the eyeball, a great many measurements must be made from outside it to determine the required focal length of the replacement lens. (Some people still need corrective glasses afterward to achieve the correct value.) When I asked about all these measurements (made by high-precision lasers), other important factors were brought up, including the operator's knowledge of the instruments, guesswork, and finally…some luck.  Luckily, my distance vision after surgery is sharp out to infinity without glasses, but reading glasses are a necessity. Today, there are many options available to cataract patients, including multifocal lenses, which may enable complete independence from glasses.
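For readers curious how those preoperative measurements become a lens prescription: one widely published regression approach is the SRK formula (not necessarily the method my surgeon used). A minimal sketch, with illustrative numbers rather than any real patient's data:

```python
# The SRK formula estimates intraocular lens (IOL) power for emmetropia:
#   P = A - 2.5*L - 0.9*K, where
#   A = lens-specific constant supplied by the IOL manufacturer,
#   L = axial length of the eye in millimetres (from laser biometry),
#   K = average corneal power in diopters (from keratometry).

def srk_iol_power(a_constant: float, axial_length_mm: float,
                  corneal_power_d: float) -> float:
    """Return the suggested IOL power in diopters per the SRK formula."""
    return a_constant - 2.5 * axial_length_mm - 0.9 * corneal_power_d

# Illustrative values for a typical eye: A = 118.4, L = 23.5 mm, K = 43.5 D.
power = srk_iol_power(118.4, 23.5, 43.5)
print(f"Suggested IOL power: {power:.1f} D")
```

Modern clinics use newer-generation formulas with more variables, but the principle is the same: external optical measurements feed a calculation that fixes the replacement lens's power.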

After having lens replacement surgery myself, two haunting questions remain unanswered in my mind.

A. Where did the optical-quality fluid (vitreous) in the original eye lens and the eyeball come from, and how did it know where it belonged? Optical-quality liquid or gel occurs in the human eye but nowhere else in the human body.

B. How did the Creator (or whoever or whatever -- a religious question), without Physics 101 or Optics 101 or Warren Smith's book on basic optics, determine the focal length of the eye lens -- the distance from the eye lens to the retina? The focal length of the human eye lens is a mathematical value based on measurements, calculations, or both, and could not have just evolved without some knowledge and information about basic optics.

I wish I could answer either of these two questions but I will have to wait for someone smarter than me. Until then let’s at least change the name of the operational procedure to reflect what is actually being performed, so that people will understand that their cataracts are not being operated on but that their eye lens is being replaced.

Meanwhile, beyond these everyday procedures for improving vision, exciting advances are emerging from labs around the world, enabled by photonics. These include smart contact lenses for monitoring and even treating disease. Artificial retinas under development at Stanford, USC, and elsewhere offer the promise of vision to the blind. The results might not be as clear as what we are used to (yet), but imaging technologies and/or nanomaterials that send visual signals to the brain are helping counter the effects of age-related macular degeneration and other vision problems. New devices and treatments may offer a bright future to those with previously intractable vision problems.

09 March 2016

Graphene: changing the world with 2D photonics

In existing technologies, 2D materials can be introduced into products such as silicon electronics, semiconductor nanoparticles, plastics, and more for added new functionality. Above: a flexible 2D prototype sensor.
Graphene, anticipated as the next "killer" app to hit optical sensing, is expected to offer an all-in-one solution to the challenges of future optoelectronic technologies, says Frank Koppens. A professor at the Institute of Photonic Sciences (ICFO) in Barcelona, Koppens leads the institute's Quantum Nano-Optoelectronics Group.

Koppens, along with Nathalie Vermeulen of B-PHOT (Brussels Photonics Team, Vrije Universiteit Brussel), will lead a daylong workshop in Brussels on 5 April on transitioning graphene-based photonics technology from research to commercialization.

In his article on Light and Graphene in the current issue of SPIE Professional magazine, Koppens describes the 2D material's tunable optical properties, broadband absorption (from UV to THz), high electrical mobility for ultrafast operation, and novel gate-tunable plasmonic properties.

Two-dimensional materials-based photodetectors are among the most mature and promising solutions, Koppens notes. Potential applications include expanded communications networking and data storage, increased computing speeds, enhanced disease control utilizing increasingly larger and more complex data sets, and more accurate fire, motion, chemical, and other sensor systems including the next generation of wearables.

Graphene is gapless, absorbing light in the ultraviolet, visible, short-wave infrared, near-infrared, mid-infrared, far-infrared, and terahertz spectral regimes. A few of many advantages include:
  • Ability to be monolithically integrated with silicon electronics
  • Extremely fast response -- exceeding 250 GHz -- as a photodetector material
  • Ability to bend, stretch, and roll while maintaining useful properties
  • Low-cost production, with potential for integration on thin, transparent, flexible substrates
  • Potential to be competitive in applications across health, safety, security, and automotive systems

Koppens notes that the €1 billion European Union Graphene Flagship program is aiming to work through academia and industry to bring graphene into society within the next 10 years.

For more, read the complete article in the SPIE Professional, and watch Koppens' SPIE Newsroom video interview [7:09] on manipulating light with graphene.

08 February 2016

UPDATE! Gravitational waves ... detected!

Prior to sealing up the chamber and pumping the vacuum system down, a LIGO optics technician inspects one of LIGO’s core optics (mirrors) by illuminating its surface with light at a glancing angle. It is critical to LIGO's operation that there is no contamination on any of its optical surfaces. Credit: Matt Heintze/Caltech/MIT/LIGO Lab
Update, 11 February: A hundred years after Einstein predicted them, gravitational waves from a cataclysmic event a billion years ago have been observed.

For the first time, scientists have observed gravitational waves, ripples in the fabric of spacetime arriving at Earth from a cataclysmic event in the distant universe. This confirms a major prediction of Albert Einstein's 1915 general theory of relativity and opens an unprecedented new window to the cosmos.

The discovery was announced on 11 February at a press conference in Washington, DC, hosted by the National Science Foundation, the primary funder of the Laser Interferometer Gravitational Wave Observatory (LIGO).

The gravitational waves were produced during the final fraction of a second of the merger of two black holes to produce a single, more massive spinning black hole. This collision of two black holes had been predicted but never observed.

The event was detected on 14 September 2015 at 5:51 a.m. EDT (09:51 UTC) by both of the twin LIGO detectors, located in Livingston, Louisiana, and Hanford, Washington. The LIGO observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by the California Institute of Technology (Caltech) and the Massachusetts Institute of Technology (MIT).
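To get a feel for the sensitivity involved: the quantity LIGO measures is strain, the fractional change in arm length. A back-of-the-envelope sketch using LIGO's published 4 km arm length and the roughly 1e-21 order of magnitude of this event's peak strain:

```python
# Strain h = delta_L / L, so the arm-length change is delta_L = h * L.
arm_length_m = 4_000.0   # each LIGO interferometer arm is 4 km long
peak_strain = 1e-21      # approximate order of magnitude of the peak strain

delta_L = peak_strain * arm_length_m
print(f"Arm length change: {delta_L:.1e} m")
```

That change -- on the order of 1e-18 m, a small fraction of a proton's diameter -- is why the mirror contamination and vibration isolation described above matter so much.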

Earlier this week (on 8 February), we wrote:

Gravitational wave rumors pulsate through media

The control room of the LIGO Hanford detector site near Hanford, Washington. Credit: Caltech/MIT/LIGO Lab
The cosmic rumor mill has been busy lately with tweets and lots of buzz about a potential announcement of the observation of gravitational waves by the Laser Interferometer Gravitational Wave Observatory (LIGO). Predicted by Einstein 100 years ago, gravitational waves are ripples in space-time caused by collisions of massive objects like black holes and neutron stars.

The LIGO interferometers in Louisiana and Washington State were recently retooled, based on what researchers learned from their first years of observations, from 2002 to 2010. They are already several times more sensitive, and as the instruments are tuned toward design sensitivity, they will become better still. According to some reports, their first observing run may have already found something. We’ll know on 11 February, and we will update this post after that.

The SPIE Newsroom visited LIGO Hanford Observatory last fall, just after the first observing run began for Advanced LIGO. In the following video, observatory Head Frederick Raab and LIGO Hanford Lead Scientist Mike Landry introduce us to the instrumentation and setup of LIGO Hanford, from the laser whose beam travels through the 4-km tubes of LIGO, to the stabilization needed for the interferometer’s mirrors:

25 January 2016

The photonics of Star Trek: 6 ways sci-fi imagined the future that is today

Sci-fi meets reality in this 1976 NASA photo: The Shuttle Enterprise rolls out of its Palmdale, California, manufacturing facilities with Star Trek television cast members on hand for the ceremony. From left to right are James Fletcher (NASA), DeForest Kelley (“Dr. ‘Bones’ McCoy”), George Takei (“Mr. Sulu”), James Doohan (“Chief Engineer Montgomery ‘Scotty’ Scott”), Nichelle Nichols (“Lt. Uhura”), Leonard Nimoy (“Mr. Spock”), Star Trek creator Gene Roddenberry, an unnamed NASA official, and Walter Koenig (“Ensign Pavel Chekov”).

Fifty years after Gene Roddenberry launched the Star Trek series on American television, many of the then-futuristic devices and ideas on the award-winning show have become commonplace on Earth.

Roddenberry’s creativity and extensive homework in consultation with scientists and engineers of his day infused the show with technology such as photodynamic therapy, laser weapons, and handheld sensors and communication devices. In the process, his sci-fi world colored our expectations, inspiring more than a few young people with a level of interest that led to STEM careers.

The short list that follows notes photonics-enabled ideas and props from the initial series (1966–69) that have become reality. See the January 2016 SPIE Professional magazine article for more.

1. The Replicator: today’s 3D printer

Star Trek’s replicator synthesized food, water, and other provisions on demand.

Today, the company 3D Systems sells consumers a popular 3D printer based on stereolithography, a solid-imaging technology for which company founder Chuck Hull received a patent in 1986. General Electric uses laser-powered 3D printers to create jet-engine fuel nozzles and other complex components.

In space, a 3D printer from the company Made in Space was delivered to the International Space Station to test the effects of microgravity on 3D printing.

2. The Communicator: the first flip phone

Captain Kirk and other Enterprise crew members flipped open their personal communicators to speak to someone elsewhere on the starship or on a planet below.

Motorola engineer Martin Cooper, who invented the first mobile phone, told Time magazine that his invention was inspired by the Star Trek communicator.

The flip phone has already been succeeded by smartphones and tablets: photonics devices with displays, lenses, cameras, and more. Lasers are used to manufacture the processors, cases, and batteries, and to mark a serial number on each device.

3. The Long-Range Scanner: today’s space-based sensors

Scanners on the Enterprise could detect atmospheric chemistry and presence of water on faraway planets, and even count life forms.

All of this is possible today via satellites or aircraft equipped with photonics sensors.

This year, the European Space Agency will launch a spacecraft equipped with sensors -- two infrared spectrometers and one ultraviolet spectrometer -- that optical engineers developed to search Mars for evidence of methane and other trace atmospheric gases that could be signatures of active biological or geological processes.

4. The Tricorder: tomorrow’s Tricorder!

The Star Trek tricorder (a TRI-function reCORDER) was a black rectangular device, carried on a shoulder strap, with three functions: to scan a person or unfamiliar area, record technical data, and analyze that data.

For today’s Tricorder, contestants for the $10 million Qualcomm Tricorder XPRIZE are competing to develop a consumer-friendly device capable of diagnosing 15 medical conditions and capturing health metrics. Consumer testing of finalist teams’ solutions is scheduled for this September, with the winner to be announced in early 2017.

5. Invisibility Cloak: object cloaking

Metamaterials have been demonstrated to effectively cloak objects by manipulating the paths of lightwaves through a novel optical material, demonstrating the basic physics used to make Romulan and Klingon spacecraft invisible in Star Trek.

Sir John Pendry of Imperial College is one of the real-life pioneers of invisibility cloaking with negative-refractive-index metamaterials, and many others report on their research at various SPIE conferences on metamaterials and plasmonics.

6. Healing with light: photodynamic therapy

Star Trek’s chief medical officer, “Bones,” used light for surgery, wound care, accelerated bone healing, and as a dermal regenerator to rebuild skin -- all of which will be discussed at SPIE BiOS during SPIE Photonics West in San Francisco next month.

Lasers and specific wavelengths of light are used today to treat cancer, to help skin heal faster, and for aesthetic treatments, dentistry, and eye surgery. Transcranial near-infrared laser therapy (NILT) has been used to reduce the severity of stroke. Complex skin cancers have been treated at University of Lund and elsewhere using light-activated (photodynamic) medicine.

What fueled your dreams?

Theoretical physicist Stephen Hawking once wrote that "Science fiction such as Star Trek is not only good fun but it also serves a serious purpose, that of expanding the human imagination.” With a nod to such inspiration, the SPIE Photonics West 2016 welcome reception will celebrate the Star Trek anniversary.

What other light-based technologies depicted by Star Trek or elsewhere in science fiction serve a real purpose today or inspired your STEM career?