Monday, December 24, 2012

Optics for Peace of Mind

Demonstration of phase gradient microscopy in thick tissue with back-illumination suitable for endoscopic integration. (a,c,e) Amplitude images; (b,d,f) phase gradient images of mouse intestinal epithelium. From T. Ford, J. Chu, and J. Mertz, Nature Methods 9, 1195 (2012). Jerome Mertz, Boston University, among other biomedical researchers, will be presenting the latest breakthroughs in endoscopic imaging during invited talks at CLEO 2013 Applications and Technology: Biomedical.
In the last two months, I have gained a much larger appreciation for optical technology. Abdominal pain and pressure sent me to a number of doctors' visits and a handful of endoscopic procedures: an upper-GI endoscopy, a colonoscopy, and a capsule endoscopy (the video camera in a pill). Before these, the most serious medical procedure I had undergone was the setting of a broken arm from a failed skateboarding trick when I was 11 years old. The stomach pain frightened me. It was deep inside, where I couldn't see it or get at it, and it was making daily tasks and living difficult. I was relieved to be prescribed the first endoscopy and then the follow-up procedures; they gave me an element of control. The thought repeatedly running through my head before and after these procedures was, "How fortunate I am to live in the time I am in." The upper-GI procedure took less than 15 minutes, was painless, and I found out immediately afterward that my esophagus and stomach looked healthy. Tests from biopsies less than a week later confirmed this. I had similar experiences with the other endoscopies. I was given amazing information about my internal organs in fairly non-invasive, short outpatient visits. The figure below shows one of the video frames of my stomach.

Stomach tissue from my own recent upper GI endoscopy using a conventional commercial endoscope.

Because my own work in ultrafast laser systems has applications in nonlinear endoscopic imaging, I have used the words "optical biopsy" (the idea that tissue is cleverly analyzed with photons during the procedure instead of "barbarically" excised and sent to a lab for later analysis) and "non-invasive" in introductions to papers, in talks, and in explanations to lab visitors of how an ultrafast laser is relevant to the average person. In promoting ultrafast lasers for optical biopsy, I have sometimes talked about how running biopsied tissue through histology is long and arduous: it needs to be sliced thin and stained in order to be viewed with a conventional microscope, and then analyzed by an expert. The patient waits, distressed, for a diagnosis and also pays a non-trivial sum of money for the professional time involved in the analysis.

I couldn't have imagined the importance of these motivations before my own endoscopic procedures. What was part of my ultrafast-laser stump speech was suddenly very real and worthy. My own experiences were definitely non-invasive. What would my options have been when endoscopes were larger and bulkier? What would my options have been prior to the widespread use of endoscopic diagnosis? And though my wait for histology was short, it was still difficult and definitely costly. What advantages will the next generations have as optical researchers and engineers push endoscopes to use more imaging modalities? Push them to smaller sizes with more functionality? What peace of mind can we pass on?

No doubt many contributed talks and postdeadline papers at CLEO 2013 will address advances in endoscopic procedures, endoscopes, and catheter-based probes. Last year's postdeadline session saw two papers on endoscopic imaging: one from a collaboration between Johns Hopkins University and Corning, Inc., led by Xingde Li, on efficient, high-resolution nonlinear endomicroscopy, and the other from Chris Xu's lab at Cornell University, which piggy-backed wide-field one-photon imaging with high-resolution two-photon imaging in the same device for optical zoom capability. There were also a number of contributed submissions on advances in endoscopy, such as the work by Adela Ben-Yakar's group at the University of Texas at Austin, whose endoscope used the same ultrafast laser both for two-photon imaging, to target tissue, and for subsurface precision microsurgery through athermal ablation. Last year's CLEO also hosted an invited talk by Brett Bouma, a pioneer of optical coherence tomography (OCT), on translating OCT into GI endoscopy.

This year's invited speakers in CLEO's Applications and Technology: Biomedical will also address future directions in endoscopes and endoscopic procedures. Invited speaker Jerome Mertz of Boston University will discuss his work on phase-contrast endomicroscopy, which was just published in this week's Nature Methods. His technique cleverly uses two diametrically opposed off-axis sources to provide oblique back-illumination in a reflection-mode geometry. Traditional phase-contrast microscopy using oblique illumination requires transillumination and is therefore not suitable for in vivo endoscopic imaging. Mertz's back-illumination technique allows his microscope to be miniaturized and integrated into an endoscope, for which the source and detection optics must reside on the same side of the sample. Unlike traditional oblique-illumination phase contrast, his technique can also be used to image thick samples.
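
For intuition about how two diametrically opposed oblique illuminations yield phase-gradient contrast, here is a toy sketch (my own illustration of the general idea, not the processing published in the paper): the normalized difference of the two images cancels absorption contrast and leaves the antisymmetric contribution from phase gradients.

```python
import numpy as np

def phase_gradient_image(img_a, img_b, eps=1e-6):
    """Toy reconstruction: normalized difference of two images taken with
    diametrically opposed oblique illumination. Contrast common to both images
    (absorption) cancels, while the asymmetry introduced by phase gradients
    survives. A sketch of the general idea only, not the published algorithm."""
    img_a = np.asarray(img_a, dtype=float)
    img_b = np.asarray(img_b, dtype=float)
    return (img_a - img_b) / (img_a + img_b + eps)

# Hypothetical synthetic example
rng = np.random.default_rng(0)
absorption = rng.uniform(0.8, 1.0, (64, 64))      # identical in both images
tilt = np.linspace(-0.1, 0.1, 64)[None, :]        # stand-in for a phase gradient
image_left = absorption * (1.0 + tilt)
image_right = absorption * (1.0 - tilt)

pg = phase_gradient_image(image_left, image_right)
print(pg.min(), pg.max())   # roughly -0.1 to +0.1, tracking the simulated gradient
```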

Invited speaker Laura Marcu of the University of California, Davis will address the use of fluorescence lifetime imaging (FLIM) and time-resolved fluorescence spectroscopy (TRFS) in clinical diagnostics. FLIM and TRFS can provide optical molecular contrast for the diagnosis of atherosclerotic cardiovascular disease. However, arterial studies using FLIM and TRFS have so far been primarily ex vivo, and in vivo studies have only interrogated exposed plaque. In a recent article in the Journal of Biomedical Optics, Marcu demonstrates a catheter-based ("endoscopic") scanning-TRFS system for intravascular interrogation of the arterial wall. Catheter systems are a crucial step toward translating TRFS and FLIM into clinical diagnosis.

Finally, invited speaker Andrew M. Rollins of Case Western Reserve University will discuss OCT image guidance of radiofrequency ablation (RFA) for treating ventricular tachycardia. RFA is a standard treatment for curing many cardiac arrhythmias. However, in cases of ventricular tachycardia (VT), the arrhythmia circuits are located deep in the myocardium or epicardium, and because no technology exists to image cardiac tissue directly during RFA treatment, the procedure has had limited success in curing VT. OCT, which can image 1-2 mm deep within tissue at micron resolution, could be the silver bullet for successful VT RFA procedures. In a recent article in the Journal of Biomedical Optics, Rollins and collaborators from the North Ohio Heart Center's Cardiac Electrophysiology group imaged freshly excised swine hearts using a microscope-integrated bench-top scanner and a forward-imaging catheter probe to demonstrate the high functionality of OCT for RFA image guidance.

This CLEO, I will be looking at imaging research from a different perspective. Whether at the research level or market-ready, advances in endoscopes serve a larger altruistic purpose: potentially giving patients a higher quality of life, some control over their health, and dignity. These aren't just empty words to promote optics research; they are very real. What a wonderful profession we're in.

Thursday, September 6, 2012

Ultrafast Optical Pulses and Hip Hop Play a Critical Role on Mars Rover

Artist Rendering of ChemCam Laser
Analysis on Mars Science Laboratory. From libs.lanl.gov/ChemCam.html
What do the front man for the Black Eyed Peas, Will.i.am, and ultrafast optical pulses have in common? They are both playing a crucial role in the newest Mars rover mission. On August 28, Will.i.am's song "Reach for the Stars" became the first musical composition to be transmitted to Earth from another planet, in this case from Curiosity, twelve days after its Seven Minutes of Terror landing, complete with a state-of-the-art supersonic parachute and sky-crane. I'm still a bit shocked at this science-fictionesque feat of impressive engineering that seems to border on hubris. Really, a sky-crane? Really?

While Will.i.am's interplanetary music transmission is playing a critical role in science and engineering outreach as part of the Google+ and Lockheed Martin sponsored initiative SYSTEM (Stimulating Youth for Science Technology Engineering and Math), ultrafast optics is playing a critical role in analyzing the geology of the Martian surface. On August 19, ChemCam, an instrument that is part of the Mars Science Laboratory on board Curiosity, ablated part of a rock with laser pulses and performed chemical analysis on the emitted plasma to determine rock and soil composition, a first for exogeology. Though the technique, laser-induced breakdown spectroscopy (LIBS), is almost as old as the laser itself, it had never been performed on another planet. What makes LIBS so useful for Mars exploration is that, as an active remote-sensing technique, no physical contact needs to be made with the rock or soil under test, not even to clean the sample area.

The previous Mars rovers required a rock abrasion tool to remove dust and outer layers in order to analyze the more interesting, unweathered interior of rock and soil samples. On Curiosity, initial laser pulses "clean" the area and subsequent pulses create the plasma whose spectrum is analyzed. For this instrument, standoff distances can be as long as 7 m. The LIBS instrument has been combined with a Remote Micro-Imager (RMI), which gives contextual information around the approximately 0.5 mm LIBS interrogation points, in a single instrument called ChemCam. The figure below shows the precision of the laser system as well as the resolution of the micro-imager at a 3 m standoff. The choice to burn precision holes in a U.S. dollar and a Euro note (near Toulouse, France on the Euro map) is an homage to the locations of the collaborating institutions: Los Alamos National Laboratory, the Centre National d'Etudes Spatiales, and the Centre National de la Recherche Scientifique.

Demonstration of ChemCam's shooting accuracy and micro-imager resolution at 3 m standoff after ablating holes in U.S. and European currency, respectively. The inset (lower left) shows the difference image. Image from the poster "Progress on Calibration of the ChemCam LIBS Instrument on the Mars Science Laboratory Rover," by principal investigator R.C. Wiens, 2010.

Besides the ultrafast laser system, ChemCam is a goldmine of optical engineering and instrumentation. There is honestly something for almost every kind of optical scientist in this instrument. Details can be found both on the ChemCam website and in a review of the instrument suite (an easy, geeky read that I had trouble putting down). The laser and imaging optics reside in the mast unit of ChemCam (the seemingly periscope-like eye of the rover), and the spectrometers and supporting equipment live in the body unit. The mast and body units are connected by optical fiber.

Schematic of ChemCam. From "The ChemCam Instrument Suite on the Mars Science Laboratory Rover: Body Unit and Combined System Tests," Space Sci. Rev., DOI 10.1007/s11214-012-9902-4, (2012).

As a laser scientist, when I read the initial news of ChemCam, I wanted to know as much as possible about the ultrafast laser on board. What kind of pulses do you need in order to create a plasma from rock? What power? What pulse width? What wavelength or wavelength range is required for LIBS analysis? Is it a fiber laser or a solid-state design?

The laser on ChemCam is a neodymium-doped potassium gadolinium tungstate (Nd:KGW) laser with a center wavelength of 1067 nm, made by Thales Optronique. Pulses are 5 ns in duration with more than 10 mJ of energy in order to deliver more than 10 MW per square millimeter to the target. The repetition rate is very low, 1-10 Hz, in order to maximize pulse energy. It turns out that the wavelength is not very special and could be anywhere from the visible to the near-IR; the field strength is what matters most for creating the plasma. The ChemCam team chose a laser in the 1.0 micron region because of the simplicity and practicality of obtaining the necessary energy density for LIBS. Another advantage of choosing excitation light at 1.0 microns, however, is that it is longer than the reddest wavelength of the expected characteristic emission lines from the plasma, which range from 240-850 nm.
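
These numbers hang together with the roughly 0.5 mm interrogation spot mentioned above. A quick back-of-the-envelope check (the 0.5 mm spot diameter is my assumption here, not an official focusing specification):

```python
import math

# Back-of-the-envelope check of the ChemCam LIBS numbers quoted above.
pulse_energy = 10e-3      # J, ">10 mJ" per pulse
pulse_duration = 5e-9     # s, 5 ns
spot_diameter_mm = 0.5    # mm, assumed interrogation-spot size

peak_power = pulse_energy / pulse_duration                 # ~2e6 W
spot_area_mm2 = math.pi * (spot_diameter_mm / 2.0) ** 2    # ~0.196 mm^2
irradiance_MW_per_mm2 = peak_power / spot_area_mm2 / 1e6   # ~10 MW/mm^2

print(f"peak power ~ {peak_power/1e6:.1f} MW")
print(f"irradiance ~ {irradiance_MW_per_mm2:.0f} MW/mm^2")
```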

ChemCam's first spectrum, using data collected from a rock near the landing site dubbed "Coronation." Image from Andy Shaner's ChemCam blog.

So far, preliminary data from ChemCam's LIBS instrument show clean, "textbook" spectra. The first spectrum (shown above) is consistent with basalt, a type of volcanic rock known to be present on Mars. The carbon peak in the spectrum comes from the carbon dioxide-rich Martian atmosphere. Hydrogen disappeared after the first shot, indicating it was only on the surface, and the concentration of magnesium decreased with subsequent laser shots. ChemCam began ablating rocks on the Martian surface on August 19 and has since been taking more target practice. The latest picture from ChemCam, from August 25 (see below), shows a 5 x 1 raster scan to investigate chemical variability across the sample.

A before-and-after photo of a 5x1 raster scan from the August 25 chemical variability analysis.


The feats being accomplished by Curiosity are truly amazing. It seems the sky is the limit when it comes to what this rover can do...no wait, to quote "Reach for the Stars",

"Why do they say the sky is the limit when I've seen footprints on the moon?...let's reach for the stars"

I take that back, the stars are the limit. Thanks Will.i.am.

Monday, May 14, 2012

Transistor Moment

Flying back home from San Jose, I couldn't help wondering with excitement whether our field is on the verge of a "transistor moment." Maybe it was just my CLEO conference euphoria coupled with the high-end caffeine from Cafe Frascati still in my system. However, I feel like something big is going to happen, particularly in the field of photonic circuits and nanophotonics.

The explosion of work in this subarea is impressive, and CLEO hosted a number of talks from the leaders and pioneers in the field. You can still watch a handful of these on CLEO On Demand, such as Yurii Vlasov's plenary talk on the fundamentals and applications of silicon nanophotonics, Larry Coldren's tutorial, CW1K.1, on single-chip transmitters and receivers, and Dave Welch's tutorial, JM4I.1, on semiconductor photonic integrated circuits, just to name a few. Cutting-edge science is interfacing with better fabrication processes: repeatability and low cost. At the poster session on Wednesday night, it seemed every group was using some kind of micro-ring resonator. Simple photonic circuits are becoming standard. Will our children have the nanophotonic equivalent of a Heathkit radio, something like "My First Fab"? It seems a sure thing to me that my daughters will be using optical/electrical hybrid computers in their lifetime. And it seems even more certain to me that nanophotonics is the future of our field.

However, will something even bigger, more profound, and unexpected happen, like when Walter Brattain dumped his amplifier experiment in a thermos of water in 1947 to successfully demonstrate electrical gain in what was to become the transistor? That same little amplifier gave birth to a small startup company named Sony, and later to Texas Instruments, Intel, and the entire business of integrated circuits and computation as we know it. The transistor was at first a "mere" amplifier. Later it became the foundation for all computer logic and a new era of technology. I wonder what is within our grasp that we don't realize.

Yurii Vlasov used imagery from The Wizard of Oz in his plenary talk: a road to follow to the Emerald City (our goals in nanophotonics and computation, and the winding road we will take to get there). However, I wonder what ruby slippers we are wearing right now. What "transistor potential" is waiting to be unlocked? It's a good time to be in the field of photonics. We will be the leaders of the new information age and of the technology that drives and supports it.

Thursday, May 10, 2012

Limber-up for the Postdeadline Session Tonight

From Postdeadline paper CTh5D.1 "Wavelength-Size Silicon Modulator." Scanning electron micrograph of the silicon integrated waveguide modulator.


Make sure to stretch your legs if you want to move from session to session in this frenzy of fantastic photonics research (say that five times fast). Tonight from 8:00-10:00 pm marks the crème de la crème of contributed papers at CLEO. I haven't quite made up my mind about which to attend, but I found a number of them particularly exciting:

CTh5D.1, "Wavelength-size Silicon Modulator"  from V.J. Sorger et al.

This is work out of the Zhang Lab at Berkeley showing an optical modulator with 1 dB/micron extinction (a 20-micron-long device gives 20 dB of extinction). The modulator is based on tuning the carrier concentration of an active, nanometer-thin layer of indium tin oxide sandwiched in a MOS structure. Just yesterday, Larry Coldren from UCSB was gently ribbing the silicon folks in his tutorial CW1K.1, "Single-chip Integrated Transmitters and Receivers," for the dearth of practical active components such as modulators. Coldren sees InP-based photonic circuits as the more robust platform for photonic integrated circuits. However, great work like this from the Zhang group will keep pushing silicon to the forefront.
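
In decibel terms, the quoted extinction translates directly into how much light leaks through in the off state. A quick illustrative calculation (my own numbers, not from the paper):

```python
# The quoted extinction scales linearly in dB with device length (1 dB per micron),
# so transmitted power in the "off" state drops exponentially with length.
def transmitted_fraction(length_um, extinction_db_per_um=1.0):
    extinction_db = extinction_db_per_um * length_um
    return 10.0 ** (-extinction_db / 10.0)

for length in (5, 10, 20):
    print(f"{length:2d} um device: {transmitted_fraction(length)*100:.2f}% transmitted when off")
# 20 um -> 20 dB extinction, i.e. only 1% of the light leaks through.
```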

CTh5C.4, "In Vivo Three-Photon Microscopy of Subcortical Structures within an Intact Mouse Brain" from N. Horton et al.

This work from the Xu Group at Cornell University uses a clever choice of longer excitation wavelength, coupled with the improved localization of three-photon fluorescence, to image deep through intact tissue. Even though longer wavelengths are more readily absorbed in tissue, they are significantly less scattered; the overall effect is higher throughput and deeper penetration. Combine that with the 1/z^4 fall-off of the three-photon fluorescence signal (tighter localization), and you can make beautiful images of intact tissue. The Xu group shows a 1.2 mm stack of brain tissue taken in 4 micron increments. The broader goal is to eventually use this for optical biopsy in humans. I would prefer to have my tissue scanned with a laser rather than excised from my body with a knife by a surgeon, wouldn't you?
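
To get a feel for why the 1/z^4 scaling helps, here is a toy comparison of out-of-focus background for two- and three-photon excitation (an illustrative scaling argument only; the paper's depth analysis is more involved):

```python
import numpy as np

# Toy scaling only: far from the focal plane the beam area grows as z^2, so the
# n-photon signal generated in an out-of-focus plane scales as 1/z^(2(n-1)):
# ~1/z^2 for two-photon and ~1/z^4 for three-photon excitation (scattering and
# attenuation are ignored here).
z = np.linspace(0.05, 1.0, 1000)          # distance from focus, arbitrary units
two_photon = 1.0 / z**2
three_photon = 1.0 / z**4

# Fraction of the total generated signal that comes from planes far from focus
# (z > 0.2 in these units): smaller is better for imaging contrast at depth.
def out_of_focus_fraction(profile):
    return profile[z > 0.2].sum() / profile.sum()

print("two-photon out-of-focus fraction  :", round(out_of_focus_fraction(two_photon), 3))
print("three-photon out-of-focus fraction:", round(out_of_focus_fraction(three_photon), 4))
```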


CTh5C.1, "Demonstration of a Bright 50 Hz Repetition Rate Table-Top Soft X-Ray Laser Driven by a Diode-Pumped Laser" from B. Reagan et al.

This work from the Rocca Group at Colorado State University and the Research Center for Extreme Ultraviolet Science and Technology shows a significant improvement in table-top soft x-ray lasers. To see how quickly this group is improving these systems, just look at a March 2012 Laser Focus World feature article highlighting their work: it is already outdated. The aim of table-top soft x-ray research is to bring systems typically found at shared national lab facilities to the many optics tables of university labs and industry. Applications for coherent soft x-rays include laser-induced materials processing at the nanoscale as well as ultrafast characterization of nanoscale motion. Spectra-Physics and Coherent may not be selling ultrafast soft x-ray lasers just yet, but this paper shows a 5-fold increase in repetition rate (important for higher-average-power applications) and a 20-fold increase in pulse energy over previous best efforts.
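
If the repetition-rate and pulse-energy improvements hold for the same system, the average soft x-ray power scales as their product; this is my own inference from the numbers above, not a figure quoted in the paper:

```python
# Average power of a pulsed laser is pulse energy times repetition rate, so a 20x
# energy gain combined with a 5x repetition-rate gain implies a 100x improvement
# in average power (treating the quoted gains as relative factors).
def average_power(pulse_energy, rep_rate):
    return pulse_energy * rep_rate

relative_gain = average_power(20.0, 5.0)
print(f"relative average-power improvement: {relative_gain:.0f}x")
```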

ATh5A.4 "Highly Efficient GaAs Solar Cells with Dual Layer of Quantum Dots and a Flexible PDMS Film"  from C. Lin et al.

In this paper, a Taiwanese collaboration from the Institute of Photonic Systems, National Chiao Tung University, and the Research Center for Applied Sciences has shown a 22% efficiency enhancement in a GaAs solar cell by spraying a coating of UV-absorptive quantum dots onto a polydimethylsiloxane film at the top surface of the cell. The collaboration has found a clever way not to let so many of the sun's UV photons go to waste.

Protecting Troops and Civilians with Light

From Joint IED Defeat Organization (JIEDDO) https://www.jieddo.dod.mil. Soldiers from the 713th Engineer Company, out of Valparaiso, Ind., conducted counter improvised explosive device training at Camp Atterbury Joint Maneuver Training Center Aug. 20. Photo by Staff Sgt. Matthew Scotten.
The panelists from Tuesday's 2:00 pm Market Focus, Defense: Laser Interrogation for Standoff Detection of Hazardous Materials, presented the audience with a difficult problem to which the U.S. Department of Defense is allocating many resources and substantial funding:

How can you accurately detect threats from chemical, biological, radiological, nuclear, or high-yield explosive (CBRNE) materials from a safe standoff distance to protect or warn those in harm's way?

Laser spectroscopy is the short answer, be it UV Raman, NIR Raman, long-wave absorption spectroscopy, laser-induced breakdown spectroscopy (LIBS), photoacoustic spectroscopy, or ultrafast spectroscopy, just to name a few. However, the kind of spectroscopy you use to identify a threat is just the beginning of building a system that can function in rugged battlefield environments and accurately deliver the information you need in the time you need it.

Panelist Scott Robertson, Research Senior Manager at Lockheed Martin, illustrated just how difficult this can be with some specific targets for the type of systems needed in the field. One project, whose objective was to analyze threats from the vapors and residues of vehicles, needed a standoff detection distance of 400 m, a total scan, detect, and process time of 1.0 second, a false alarm rate of only 1 in one million, and packaging in a volume of 1 cubic meter. Another specification target was to be able to scan an area of 2,700 square meters per second while searching a road 100 m wide at 60 mph.
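
Those last two numbers are consistent with each other, as a quick unit conversion shows:

```python
# Consistency check of the scanning targets quoted above.
speed_mph = 60.0
road_width_m = 100.0

speed_m_per_s = speed_mph * 1609.344 / 3600.0      # ~26.8 m/s
area_rate_m2_per_s = speed_m_per_s * road_width_m  # area swept per second

print(f"{speed_m_per_s:.1f} m/s x {road_width_m:.0f} m = {area_rate_m2_per_s:.0f} m^2/s")
# ~2680 m^2/s, i.e. roughly the 2,700-square-meters-per-second target.
```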

There are other constraints as well. Tom Stark (no relation to Tony of the Iron Man series), of Landmark Technologies, Joint IED Defeat Organization, reminded the audience that 99.9% of the people in an area you want to scan are not the threat. You can't, and don't want to, blatantly scan a crowd with a potentially dangerous high-power laser system. Another constraint, therefore, is laser safety, particularly eye safety. Add this to the checklist of specification targets and you start bumping up against fundamental limits on the power needed to detect a spectroscopic signature of a threat, as well as on the selectivity and sensitivity needed to identify molecules.

Augustus Fountain, Senior Research Scientist in Chemistry at the Edgewood Chemical Biological Center, spoke to some of these issues, including choosing the wavelength and spectroscopic method. In the UV you gain sensitivity but lose selectivity; the opposite is true as you move into the IR. Other problems to consider in system design are 1/r^2 loss and atmospheric attenuation. What kind of time window do you have available for scanning? Is the analyte a mixture of compounds, which is harder to detect spectroscopically, or something simple? Scott Robertson echoed many of these remarks. Do you want to identify the threat, or do you just want to know whether it is going to kill you? The specific use and system dictate different design constraints. Robertson also argued that most users want the latter: "Just give me a green or red indicator light," not a beautiful Raman spectrum that requires interpretation. More often you just want to know "threat or no threat" for fast decision making in an environment of potential threats.
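
To make the 1/r^2 and atmospheric-loss point concrete, a crude return-signal scaling might look like the following (my own illustrative sketch with made-up numbers, not anything presented by the panel):

```python
import math

def relative_return_signal(range_m, attenuation_per_km=0.1):
    """Crude standoff-detection scaling: the collected signal falls off as
    1/r^2 for light scattered back to a fixed-aperture receiver, and the beam
    is attenuated on both the outgoing and return paths (Beer-Lambert).
    Numbers are illustrative only."""
    alpha_per_m = attenuation_per_km / 1000.0
    geometric = 1.0 / range_m ** 2
    atmospheric = math.exp(-2.0 * alpha_per_m * range_m)
    return geometric * atmospheric

for r in (10, 100, 400):
    print(f"{r:4d} m standoff: {relative_return_signal(r):.2e} (relative units)")
```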

Much of the panel discussion centered on the do's and don'ts of collaborating with companies for defense money and contracts, or of directly submitting proposals to broad agency announcements from the DoD. If you are a small business or researcher trying to connect with defense contractors or apply directly for funding, the advice was: follow the rules, connect with partners and collaborators early, ask lots of questions early, and, once again, follow the rules.

The panel did offer some specific areas where there is a need for new technology. Fountain spoke about how the 785 nm laser has inappropriately become the workhorse for Raman; this wavelength region has many problems. He would like sources further into the IR or deeper into the UV, particularly solid-state sources. Edwin Dottery, President of Alakai Defense Systems, pleaded for a UV laser source below 250 nm; specifically, 220-240 nm would be ideal for UV Raman.

The difficult obstacles to overcome for practical standoff detection are worth the effort, and the end-users are worth the time: soldiers continually putting their lives in harm's way, as well as civilians who want to carry on a normal life and provide for their families without fear of attack. Lasers just may make this possible.

Tuesday, May 8, 2012

Interest in 2.0 micron Light is Growing

Wavelength modulation spectrum using a tunable 2.0 micron VCSEL. From JTh1L.6, A. Kahn et al., "Open-Path Green House Gas Sensor for UAV Applications"
Today at CLEO I spent a large amount of time at the expo hunting down which companies are selling 2.0 micron wavelength products, and why. In the technical program there are a large number of contributed talks on 2.0 micron lasers, both pulsed and continuous wave. On Monday I attended session CM1B, Ultrafast Mid-IR, in which 5 out of 8 papers demonstrated ultrafast pulses at about 2.0 microns. Today there was a session titled CTu2D, 1.5 to 5 micron Lasers, in which 5 out of 8 talks also showed laser systems operating near 2.0 microns. There have been, and will be, a handful of talks not pinned to these topic categories as well:

-CM2B.2, "A Broadband 1850-nm 40-Gb/s Receiver Based on Four-Wave Mixing in Silicon Waveguides"

-CTu3M.7, "All-fiber 10-GHz Picosecond Pulse Generation at 1.9 microns without Mode-locking"

-JTh1L.6, "Open-Path Green House Gas Sensor for UAV Applications"

-CF1K.1, "Single-Frequency kHz-Linewidth 2-μm GaSb-Based Semiconductor Disk Lasers With Multiple-Watt Output Power"   

-CF1N.4, "Double-wall carbon nanotube Q-switched and Mode-locked Two-micron Fiber Lasers" 

However, what we like to research and what we can actually bring to market are often two very different things. I am therefore excited that it is not just 2.0 micron papers that are cropping up at this year's conference, but 2.0 micron products at the expo as well.

So why is anyone interested in light in the 2.0 micron region? My personal interest stems from a research talk I saw by the analytical chemist Mark Arnold of the University of Iowa. Arnold is trying to perform some hard analytical chemistry with 2.0 micron light shone through the skin on the back of one's hand. He hopes that by looking at the absorption spectra, he can measure blood glucose levels without having to draw blood. This noninvasive testing would be a boon to diabetics, who are not thrilled about pricking their fingers regularly. Wavelengths that are helpful for pinning down glucose, but that are not absorbed as readily by tissue, are 2.13 microns, 2.27 microns, and 2.33 microns.

In short, there are some interesting molecules around 2.0 microns on which to perform spectroscopy. For environmental sensing, there is a well-defined water absorption line at 1877 nm, a good carbon dioxide line at 2004 nm, and many more.

Many of the companies I spoke with that sell 2.0 micron components and sources confirmed such spectroscopic applications among their customers:

-Oz Optics now sells passive fiber components at 2.0 microns as well as DFB sources.

-Sacher Lasertechnik and Nanoplus make DFB lasers extending through the 2.0 micron region, depending on your molecule of interest.

-Advalue Photonics makes thulium-based fiber laser systems and sells passive 2.0 micron products.

-New Focus will be developing tunable laser diodes about 2.0 microns in the next few months.

-Nufern and CorActive are selling Tm- and Ho-doped fiber for 2.0 micron amplification and for fiber sources.

-IPG sells a number of lasers from 2.0-2.8 microns based on Cr:ZnSe, as well as 2.0 micron fiber lasers using thulium-doped fibers.

There are other advantages to 2.0 micron light as well. Around 2.2 microns is where the two-photon absorption coefficient in silicon drops to nearly nothing. If you are interested in confining light in a silicon waveguide, doing so at 1550 nm could be the worst choice, since it lies near the peak of two-photon absorption. Going above 2.2 microns, however, allows higher throughput as well as access to other nonlinear effects like parametric amplification.
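
The 2.2 micron figure follows from silicon's band gap: degenerate two-photon absorption requires the combined energy of the two photons to exceed the gap, so it shuts off for wavelengths longer than roughly twice the band-edge wavelength (a simple estimate that ignores phonon-assisted processes):

```python
# Estimate where degenerate two-photon absorption in silicon shuts off: two photons
# can only bridge the band gap if their combined energy exceeds it.
hc_eV_um = 1.23984      # photon energy (eV) times wavelength (um)
E_gap_Si_eV = 1.12      # indirect band gap of silicon at room temperature

one_photon_edge_um = hc_eV_um / E_gap_Si_eV        # ~1.11 um: one-photon absorption edge
two_photon_cutoff_um = 2.0 * one_photon_edge_um    # ~2.2 um: two photons no longer bridge the gap

print(f"one-photon absorption edge   ~ {one_photon_edge_um:.2f} um")
print(f"two-photon absorption cutoff ~ {two_photon_cutoff_um:.2f} um")
```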

This is one of the pursuits of the Thorlabs startup PicoLuz. Among other optical instrumentation, PicoLuz is developing 2.0 micron amplifiers, which will eventually support its pursuit of chip-scale optical parametric amplification.

-Thorlabs' quantum electronics division is also selling a laser that is tunable across the gain bandwidth of thulium, 1800-2000 nm, an FTIR spectrometer that goes out to 2500 nm, a handful of moderate-speed long-wavelength detectors, passive fiber components for 2.0 microns, as well as pumps for Tm-doped amplifiers.

Another reason for generating 2.0 micron light is to open up new spectral bandwidth for telecommunications or signal processing, whether in fiber or on silicon. To that end,

-Eospace is now offering 20 Gb/s speed LiNbO3 intensity and phase modulators at 2.0 microns.

-Electro-Optics Technology is pushing past 1 GHz speed for 2.0 micron detectors.

Malcom Minty, a project manager at New Focus, told me that New Focus was actively pursuing thulium-based laser systems in the '90s. The thought was that the C- and L-bands would be entirely used up during the telecom boom, requiring expansion into new bands. Thulium has wide, efficient gain and was a natural choice. New Focus dumped the project when the telecom bubble burst. Now they will be rejuvenating it, but more likely to sell to customers interested in spectroscopic pursuits.

Minty conjectured that 2.0 microns is becoming an interesting color to customers, vendors, and researchers because it is in a region (or getting close to a region) of spectroscopic and biomedical interest, yet it is close enough to the L-band that much telecom technology can still be used. It just squeaks by with some efficiency for detection on InGaAs-based detectors, whereas detection above 3.0 microns starts getting tricky.

I think Minty likely has this right. If so, I will be interested to see what new research we can carry out with 2.0 micron light while leveraging what we already know from fiber systems and telecom.

Sunday, March 25, 2012

Highlights from the Program Chairs


(One of seventeen YouTube shorts from the program chairs highlighting hot topics for CLEO 2012)

For a few years now, CLEO conference organizers have been posting YouTube shorts highlighting contributed talks, symposia, research trends, and any new or unique directions for the upcoming conference. This year there are seventeen videos from the program chairs, all worth watching. However, for those who prefer text over A/V, I thought it might be helpful to highlight the highlights here.

Conference Program Stats

-The 2012 program has been selected from a record number of submissions.

-In just its second year, CLEO's new Applications and Technology conference saw a 50% increase in submissions.

-350 papers, 15 % of all submissions, live in the subcommittee sections "Nano-optics and Plasmonics" or "Micro- and Nano-Photonic Devices"

-The subcommittee "Fiber Amplifiers, Lasers and Devices" received the most submissions of any single committee.

CLEO Applications and Technology: Government and National Science, Security and Standards Applications

In his YouTube short, subcommittee chair Ian McKinnie of Lockheed Martin Coherent Technologies briefly discusses the two tracks of this subcommittee: 1) Ultrafast Laser Applications and 2) Instrumentation and Sensing.

McKinnie talks about how the ultrafast program covers a broad range of ultrafast laser applications, spanning those performed at large facility-class systems to those on a bench top or operating table. These are exemplified by the tutorial talk AW3J1, "Enabling Science at the Advanced Light Source X-ray Facility," which will be given by Roger Falcone of Lawrence Berkeley National Laboratory from 4:30-5:30 pm on May 9, and the invited talk AW3J4, "Applications of Ultrafast Lasers," by Mike Mielke of Raydiance Inc., also on May 9 but from 6:00-6:30 pm.

The Advanced Light Source (ALS) is a large synchrotron source that produces intense light over an extremely broad spectrum, including the hard-to-reach soft x-ray region. Falcone will discuss the use of the coherent radiation at this user facility for applications such as precise material processing and biomedical research.

On the other hand, Mielke will discuss the use of compact fiber systems for micromachining and laser surgery. See the blog post "Machining with Ultrafast Pulses" for some stunning videos and more information on these compact micromachining systems.

On the remote sensing side, Masayuki Fujita, from the Institute for Laser Technology in Osaka, will give an invited talk on an application of remote sensing not typically found in the CLEO conference program: nondestructive inspection for heavy industrial processes. Fujita's talk, ATuG3, "Nondestructive Inspection for Heavy Construction," can be heard on Tuesday, May 8, at 2:30 pm.

CLEO Applications and Technology: Biomedical

In his YouTube short, subcommittee chair Yu Chen of the University of Maryland mentions a number of specific talks you won't want to miss:

In the session "In Vivo Imaging," there will be two talks on image-guided spectroscopy. The first will be a tutorial talk by Brian Pogue of Dartmouth on integrating optical molecular spectroscopy techniques into standard medical imaging equipment, ATh4C1, "Image-Guided Spectroscopy of Cancer: Translating Optical Technology into Clinical Tools," on May 10 at 4:30 pm. The second will be an invited talk from Brian Benaron of Spectros Corporation, ATh4C4, "Molecular Spectroscopy and Imaging: A Multibillion-Dollar Industry Reshaping Biotech and Medicine," also on May 10, but at 6:00 pm.

Chen also mentions the contributed talk from the In Vivo session by Siavash Yazdanfar of GE Global Research, who will speak about fluorescence image-guided procedures in ATh4C2, "Fluorescence Image Guided Surgical Instruments and Contrast Agents for Intraoperative Visualization of Nerves," on May 10 at 5:30 pm, as well as the contributed talk from Adam Straub of Cornell University, who will present work on increasing multiphoton image acquisition speed by a whopping two orders of magnitude, ATh4C3, "Multiphoton Multifoci Modulation Microscopy for High-Speed Fluorescence Lifetime Imaging," at 5:45 pm on May 10.

Chen goes on to highlight other talks in the session "OCT and Microscopy," which will be held on Thursday, May 10, from 2:00-4:00 pm, perhaps most notably talk JTh3J, "Recent Advances in Translating OCT into GI Endoscopy," by Brett Bouma of Massachusetts General Hospital, one of the pioneers of OCT. There will also be a host of cutting-edge talks in the session "Cellular Imaging and Therapy," 8:00-10:00 am on Thursday, May 10. That session kicks off with an invited talk by Adam Wax of Duke University, ATh1M1, entitled "Coherence Imaging for Early Cancer Detection."

CLEO Applications and Technology: Industrial Applications

In his video short, subcommittee chair Eric Mottay of Amplitude Systemes discusses the two major trends of the Industrial Applications subcommittee: 1) micro- and nanofabrication techniques and 2) applications of graphene.

Talks in the latter category can be found in a joint session with CLEO: Science and Innovations subcommittee six, "Graphene and Carbon Advanced Photonic Materials," which will be held from 11:00 am-1:00 pm on May 8. This session will host talks presenting graphene-based devices such as detectors, modulators, and tunable resonators. Recall that Andre Geim and Konstantin Novoselov were awarded the 2010 Nobel Prize for demonstrating the "exceptional" properties of graphene, such as being simultaneously the thinnest and strongest material, conducting electricity better than copper, conducting heat better than all other known materials, and having nearly 100% transparency yet being so dense that helium atoms cannot pass through it. Be sure to see how this "magical" material is being translated into devices that may be on the market in the next three to five years.

On the other hand, the invited talks for this subcommittee all center on micro- and nanofabrication processes. Arnold Gillner of the Fraunhofer Institute will discuss how ultrafast lasers can be used for surface processing at the micro- and nanoscale for applications in light guiding and the fabrication of low-friction or wear-resistant surfaces. His talk, ATu3L1, "Micromanufacturing and Nano Surface Functionalisation with Ultrashort Pulsed Lasers," is scheduled for May 8 at 4:30 pm. Additionally, Paul Webster of Queen's University will discuss online monitoring during fabrication, particularly the control of depth, in the invited talk ATu3L5, "Inline Coherent Imaging: Measuring and Controlling Depth in Industrial Laser Processes," on May 8 at 5:45 pm, and Rick Russo of Lawrence Berkeley National Laboratory will speak about real-time spectroscopy of a sample after it has been turned into a plasma through laser ablation in the talk AW1H3, "Laser Plasmas for Spectrochemistry," on May 9 at 11:00 am.

CLEO Applications and Technology: Energy and Environment

In his video short, subcommittee chair Christian Wetzel of Rensselaer Polytechnic Institute discusses two trends in paper submissions: 1) environmental sensing, particularly atmospheric sensing using quantum cascade lasers (QCLs), and 2) breakthroughs in LED lighting, for which many contributed papers address ways of overcoming "droop" (the reduction in efficiency when driving at high current).

These two topics will also be discussed, directly and indirectly, in the special symposium "50th Anniversary of the Semiconductor Laser," in which one of the pioneers of the QCL, Jerome Faist of the Institute of Quantum Electronics in Zurich, will give an invited talk, "Quantum Cascade Lasers: Coming of Age," as well as in the plenary talk "Development of Nonpolar and Semipolar InGaN/GaN Light-Emitting Diodes (LEDs) and Laser Diodes" by solid-state lighting giant Steven DenBaars of the University of California, Santa Barbara.

Wetzel mentions two must-see invited talks in his short. One is talk JTh4J1, "Hydrogen Generation Using Nitride Photoelectrode," by Kazuhiro Ohkawa of the Tokyo University of Science on May 10 at 4:30 pm. Ohkawa will show results of solar-powered water splitting on a nitride-based electrode for which the incident photon-to-electron conversion efficiency (IPCE) is upwards of 70%. The other is JTh1L3, "III-Nitride Optochemical Nanosensors," in which Jörg Teubert of Justus Liebig University in Giessen will discuss nitride-based nanosensors for spectroscopic measurement and pH detection.

CLEO: Science and Innovation

In his YouTube short, program co-chair René-Jean Essiambre of Bell Labs, Alcatel-Lucent, discusses some of the trends across the various subcommittees.

In subcommittee 11, Fiber Amplifiers, Lasers and Devices, Essiambre notes a trend of papers demonstrating lasers between 1.8-2.0 microns. This is a region where thulium and holmium provide efficient and broad gain. Specifically, many submissions show increased wavelength tunability or higher-power operation. Since Essiambre mentions this track, I figured it gives me license to shamelessly promote my own contributed talk: I will be presenting a paper in this category, CTu3M7, "All-fiber 10-GHz Picosecond-Pulse Generation at 1.9 μm without Mode-locking," which demonstrates an unconventional method for pulse generation in this spectral region.

What is so exciting about 2.0 micron light is that there are good gain media in this spectral region, and it sits just on the edge of the mid-IR, where various interesting molecules have sharp, unique absorption lines: the fingerprint region. Therefore, 2.0 micron sources may be good seed sources to frequency-shift to redder, more spectroscopically significant wavelengths. Two-micron light also holds interest for silicon photonics, since two-photon absorption, a hindrance for many processes involving tightly confined and/or pulsed light, drops off rapidly in silicon at 2.0 microns and beyond.

Essiambre also notes other trends in the various subcommittees in Science and Innovations. In subcommittee 12, Lightwave Communications and Optical Networks, many submissions address new modulation formats and constellations, spatial multiplexing, and high-spectral-efficiency systems.

Subcommittee 13, Active Optical Sensing, saw a focus on frequency combs, particularly on making comb sources more accurate, with narrower linewidths, while at the same time keeping them simple, portable, inexpensive, and usable in harsh environments. This was also the trend for papers submitted to subcommittee 14, Optical Metrology. In addition, topics for this subcommittee address applications in astronomy, spectroscopy, and high-precision standards, not to mention the distribution of high-precision combs.

Check out the videos for more details and information. One of the marvelous things about CLEO is that it has so much breadth and hosts so many talks. However, this also makes it overwhelming and difficult to decide what to attend and to decipher new trends in research and applications. I recommend taking some time to hear what the chairs have to say so that they can make your work a little easier.

Monday, February 27, 2012

Semiconductor Laser's Golden Anniversary

(Above: The first room-temperature CW semiconductor nanolaser with a subwavelength cavity, presented at CLEO 2011. From K. Ding et al., CTuG2, CLEO 2011.)

The year 2012 marks the impressive 50th anniversary of the invention of the prolific and ubiquitous semiconductor laser. Almost every household in the industrialized world owns at least one, be it in a DVD player (maybe two if it is a Blu-ray player), a CD player, or an optical mouse, or depends on them indirectly for long-distance phone service, digital cable, or internet access. Besides making telecommunications a practical possibility, semiconductor lasers have paved the way for the development of silicon photonics and will be pivotal in the future of optical information storage and processing. Despite their primary use in mass consumer markets for communications, information processing, multimedia, and teasing cats (you can even get semiconductor laser pointers with phase masks and lens attachments that project images of mice or fish on the floor for your feline to chase), many subfields have profited from the low cost and small footprint of these robust laser sources. Take, for example, the handful of semiconductor sources offered commercially by Thorlabs for optical coherence tomography, or the inexpensive semiconductor laser diodes used by the Ozcan group for field-portable, ultra-low-footprint holographic microscopes.

There are too many other technologies and subfields that have profited to name them all. Just think of the numerous optics applications that live at the telecom wavelengths near 1300 nm and 1550 nm, or at the DVD and Blu-ray player wavelengths of 635 nm and 405 nm. Such lasers offer unbelievable device characteristics at such a low price that researchers and venture capitalists often build their technologies to fit these wavelengths instead of the other way around.

Amnon Yariv and Pochi Yeh write in the 2007 edition of their book Photonics that,

"The semiconductor laser invented in 1961 is the first laser to make the transition from a research topic and specialized applications to the mass consumer market...It is by economic standards and the degree of its applications, the most important of all lasers."

To celebrate the most important laser of lasers, CLEO will host a special symposium with talks from pioneers of semiconductor laser technology. The list of speakers and subjects has been well crafted to paint not only a historical picture but also to address current research and trends in this ever-evolving technology.

From a fundamentals perspective, Russell Dupuis of Georgia Tech will talk about device materials. Nobel Laureate Herbert Kroemer of the University of California, Santa Barbara will discuss the double heterostructure, which is still the basic framework for almost all semiconductor light sources and solar cells, and without which there would be no continuous-wave (CW) lasing in semiconductor devices at room temperature. To this end, Morton Panish, formerly of Bell Laboratories, will describe the development of the first room-temperature semiconductor laser.

(Above: Evolution of threshold current. From Nobel Laureate Z. Alferov, IEEE J. Sel. Top. Quant. Elec. 6, 832, 2000.)

Charles Henry, formerly of Bell Laboratories, will discuss the quantum well structure, which was pivotal in reducing active layer thickness and therefore in significantly reducing threshold current (see the figure above). Yasuhiko Arakawa of the University of Tokyo will discuss quantum dot lasers, which reduced threshold current densities even further and remain a developing area of semiconductor laser physics research.
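
A rough way to see why thinner active layers help (a textbook-level scaling sketch with illustrative numbers, not anything from the talks): the threshold current must sustain the transparency carrier density throughout the active volume, so to first order the threshold current density scales with the active layer thickness d.

```python
# First-order scaling of threshold current density with active-layer thickness d:
# J_th ~ q * n_tr * d / tau, the current needed to hold a layer of thickness d at
# its transparency carrier density n_tr against a carrier lifetime tau. Gain, loss,
# and confinement effects are ignored; the numbers below are illustrative only.
q = 1.602e-19          # C
n_tr = 1.5e24          # transparency carrier density, m^-3 (illustrative)
tau = 2e-9             # carrier lifetime, s (illustrative)

for d_nm in (200.0, 10.0):   # thick double-heterostructure layer vs a single quantum well
    d_m = d_nm * 1e-9
    J_th_A_per_cm2 = q * n_tr * d_m / tau / 1e4
    print(f"d = {d_nm:5.0f} nm  ->  J_th ~ {J_th_A_per_cm2:7.0f} A/cm^2")
```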

On the more practical side, Jack Jewell of Green VCSEL will discuss the vertical-cavity surface-emitting laser (VCSEL), which, among other important device attributes, may be the best laser for high-yield production. VCSELs are grown, processed, and tested in wafer form, allowing parallel fabrication and testing, minimizing labor and maximizing yield. They also take up less space on a wafer (about three times less than edge emitters of similar power) and can be made in 2-D arrays. Jewell will likely discuss the benefits of the lower power consumption of VCSELs for short-reach, high-speed networks. My understanding is that the "green" in "Green VCSEL" refers to environmental considerations, not wavelength.

There will also be talks discussing the semiconductor laser's role in telecommunications, quantum cascade lasers, integrated and hybrid optical circuits, and high-power devices, as well as progress in nanolaser structures with subwavelength volumes (see the figure at the top).

Whether you come to learn the history or fundamental principles, to pay homage to the pioneers, or to catch new trends, be sure to mark your calendar for the 50th Anniversary of the Semiconductor Laser symposium and celebrate "the most important of all lasers."

Thursday, January 26, 2012

Why a Temporal Cloak is so Great: Uncovering the Hype


(Figure from R. Boyd and Z. Shi, Jan. 5, "News and Views" Nature, explaining temporal-cloaking)

At Frontiers in Optics 2011 just this last October, Moti Fridman of Alex Gaeta's group presented the first experimental demonstration of temporal cloaking using a time-lens system. The work was based on a theoretical paper by Martin McCall et al. in the February issue of the Journal of Optics and, at the beginning of this month, appeared in an in-depth treatment in the January 5 issue of Nature. Besides the usual barrage of bloggers latching onto the science-fictionesque results of new research, time cloaking was also written up in traditional news media such as the Christian Science Monitor.

Temporal cloaking certainly sounds like something out of Star Trek, but what is it, and why is it so great? What makes a temporal cloak truly exciting, and what a majority of the recent articles and posts fail to highlight, is that it allows cloaking over an infinite section of space, albeit for a finite duration of time.

Let's imagine Harry Potter and his invisibility cloak. If the invisibility cloak is a temporal cloak, Harry can move as far as he wants left-and-right and up-and-down without being seen for the duration of the cloaking window. Harry can also move a little bit forward and backward without being seen, but not much, or else he will walk out of the cloaking time window (which is 50 ps in the Gaeta group's work, or about 1.0 cm in fiber). It is crucial that he be in the right place in the axial dimension (forward/backward), since the window occurs at a specific place in space, but he has total freedom in the transverse dimensions for the duration of the cloak. Conceivably, Harry could pull off a bank robbery as long as the bank and the vault were inside that particular infinite pancake of a cloaking window and within its duration.
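
The 1.0 cm figure is just the 50 ps window converted into a propagation length in fiber (assuming a group index of about 1.46 for silica, which is my assumption here):

```python
# Convert the 50 ps cloaking window into a propagation length in optical fiber.
c = 2.998e8        # speed of light in vacuum, m/s
n_group = 1.46     # approximate group index of standard silica fiber (assumed)
window_s = 50e-12  # cloaking window duration

length_cm = c / n_group * window_s * 100.0
print(f"50 ps window ~ {length_cm:.1f} cm of propagation in fiber")   # ~1.0 cm
```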

Contrast that with a spatial cloak, which provides cloaking for an infinite amount of time but only over a finite section of space. If Harry has a spatial invisibility cloak, then he can stand in one spot for as long as he wants without being seen.

Finally, if Harry has a spatio-temporal cloak, conceivably he can maintain invisibility for any duration of time and throughout any volume of space.

The temporal cloak shown by the Gaeta group is not a practical cloak. If you scrutinize the setup, you'll find that the way they detect a cloaked event is through the lack of nonlinear mixing: a nonlinear signal tells them the event was detected, and no signal tells them the event was cloaked. You could just turn the power down to get the same result. They also couple into and out of the cloaking window with fiber couplers placed between the stages of the cloaking apparatus. You can't send both the probe signal and the event to be cloaked down the same fiber, because if the "event" goes through the same time-lens system as the "signal," the event will appear superposed instead of cloaked. Basically, they had to sneak it into the right spot at the right time along a different propagation path.

However, the point of the work was not to show practical temporal cloaking for masking or encryption, but to demonstrate the very odd, very fundamental, and very cool phenomenon of creating and tailoring gaps in time. So even if the temporal cloak won't be used anytime soon for cracking safes, it does bring the optics community closer to a true spatio-temporal invisibility cloak. It might be time to start brushing up on the rules of Quidditch.