Thursday, June 13, 2013

Compound eye fabricated by Song et al. Photo from UIUC College of Engineering

Browsing the postdeadline papers, whose sessions will run from 8:00-10:00 pm this evening, it seems the Applications and Technology session exhibits a zoological theme.

Fly in the Ointment:

In postdeadline paper ATh5A.5, beginning at 8:48 pm, Song et al. will present work recently published in Nature on compound-eye cameras that mimic the physiology of a fly's eye. Unlike the single-lens eyes of humans and other vertebrates, most animals use compound eyes built from many optical units (facets), each with its own lens and set of photoreceptors. Though compound eyes lack the sensitivity and resolution of single-lens eyes, which work by forming images on a detection plane, they offer nearly infinite depth-of-field without any need to adjust focal length. This makes compound eyes very adept at perceiving relative motion; a good set of them gives the fly high-precision navigation in flight. Digital compound eyes therefore show great promise for micro air vehicles (MAVs) used for reconnaissance, sensing, and diagnostics in tight spaces: locating people in a collapsed building, say, or flying inside and around machinery and other cramped environments with extreme conditions such as high radioactivity or temperature.

What makes Song et al.'s work, a multi-institutional collaboration led by the Beckman Institute at the University of Illinois Urbana-Champaign, so compelling is its use of recent advances in stretchable electronics and hemispherical detector arrays to create a compact, monolithic, scalable compound eye. Essentially, the collaboration fabricated a planar layer of elastomeric microlenses and a planar layer of flexible photodiodes and blocking diodes, then aligned the two layers and stretched them into a hemispherical shape. Serpentine metal interconnects in the electronics provide the needed flexibility. The result achieves nearly infinite depth-of-field and a 160-degree field-of-view.

Crocodile Smile:

Postdeadline paper ATh5A.3, beginning at 8:12 pm, from Yang et al. of Case Western Reserve University actually addresses clinical diagnostics of human tooth decay using Raman imaging, though they image an alligator tooth to help demonstrate proof-of-concept (note that alligators are crocodilians, so the cute colloquialism above is only reaching a bit!).

Current clinical practice for dental caries (decay) lacks early-stage detection. Because x-ray and visual observation are insufficient to detect early lesions, late-stage cavities often require multiple fillings over the tooth's lifetime, or costlier measures such as crowns, bridges, or even replacement of the entire tooth.
If tooth lesions could be detected early, they could be remineralized at an early stage of decay, thereby preventing future costly, invasive procedures. 

Yang et al. use global Raman imaging, which employs a 2D CCD array and images at a single wavenumber over the full field of view, rather than point inspection across a spectrum of wavenumbers. Their Raman images show a clear border between the dentin and enamel of an alligator tooth, with high contrast in mineral signal intensity. They also show similar images for human teeth, indicating the technique holds good promise for early clinical detection of tooth decay.

Wednesday, June 12, 2013

Laser Fusion for Sustainable Energy

View from inside the target chamber at NIF showing the pencil-shaped target positioner. Image from LLNL
Yesterday began two days of laser-driven fusion talks, punctuated by a visit to the nearby National Ignition Facility (NIF) at Lawrence Livermore National Lab (LLNL), as part of the CLEO Applications and Technology Special Symposium: The Path to Sustainable Energy: Laser Driven Inertial Fusion Energy.

The session began with The Physics of Laser Driven Inertial Confinement Fusion (ICF) and continued with the Technology of ICF Drive Lasers and Laser Facilities, and Optical and Nuclear Diagnostics. After the tour of NIF today, the symposium will pick up again on Thursday culminating in Future Perspective of ICF as Sustainable Energy Source.
A tutorial on ICF on the NIF website gives a cute recipe for creating the temperatures and pressures needed for fusion on Earth, conditions otherwise found in our universe only in stars:

Recipe for a Star:
- Take a hollow, spherical plastic capsule about two millimeters in diameter (about the size of a small pea)

- Fill it with 150 micrograms (less than one-millionth of a pound) of a mixture of deuterium and tritium

- Take a laser that for about 20 billionths of a second can generate 500 trillion watts

- Wait ten billionths of a second

- Result: one miniature star
Figure of the hohlraum and a cross-sectional view (right) showing the fuel capsule. Figure from LLNL

Of course, the devil is always in the details. Ignition, in which more energy is generated by the reaction than went into creating it, has yet to be achieved. In 2009, NIF reached its laser-energy goal, and it was thought ignition would be achieved by fall of 2012.

John Lindl of LLNL began today's session by speaking about many of these devilish details, particularly at NIF. For example, besides delivering the necessary peak power, the 20 ns, 500 TW laser must have the proper pulse shape: a strangely shaped series of four pulses with tailored delays and powers, designed to deliver four shocks to the target at the proper intervals.

The target capsule, which may seem a trivial piece of the puzzle, has undergone an intense 20-year effort. The ablator materials, size, shape, density, concentricity, and surface smoothness of the different shells are all key factors in a symmetric collapse (the attempt to get the correct "spherical rocket"). Lindl spent a good portion of his talk showing diagnostic images of the collapse and describing efforts to optimize the system to better ensure symmetric spherical collapse and confinement.
Other factors include whether to use direct drive (hitting the capsule directly with the many laser beams) or indirect drive (hitting a cavity called a hohlraum with the beams to generate a symmetric barrage of x-rays that initiates collapse). NIF uses a hohlraum and 192 beams. Omega, in Rochester, NY, uses direct drive, which accelerates more fuel to burn, potentially yielding better energy production (when that day comes). Beam configurations, target placement and position, and much more also come into play.
Of course, simulation has been a key factor in design, interpretation of results, and future direction. The immense ICF efforts at NIF, as well as at other facilities in the U.S. and around the world, are extremely impressive, and the problems are complicated, beautiful, and rich.
Laser inertial fusion energy (LIFE) is a worthy goal that could deliver a sustainable, carbon-free energy source. There is no enrichment, no long-lived radioactive waste, and no worry of a meltdown; unlike a fission chain reaction, when you turn fusion "off," it is off. NIF is an experimental facility built to understand the physics and technology necessary for LIFE; it is not scalable to a power plant. Scaling ignition toward operable power plants is its own direction of physics, engineering, and optics research.
Schematic of how laser ignition fusion may interface with a power plant to deliver a sustainable source of electricity. Image from LLNL
Currently, targets are fixed and the laser delivers only a few shots a day, so that experiments can be changed out and realigned, and optics and components can cool down. In a power-plant facility, the hope is to use a higher-repetition-rate laser delivering 20 shots a second. Targets would be injected at speeds greater than 100 m/s to burn fuel continually, heating a low-activation coolant of lithium-bearing liquid metals or molten salts surrounding the target, which in turn converts water to steam to drive a turbine.

Lindl said that NIF is just a factor of 2 to 3 away from achieving ignition, meaning the output energy from the fusion reaction is one-third to one-half of the input photon energy from the laser. Though nature has provided some delays relative to earlier expectations, ignition is realistically around the corner. Laser-ignition fusion power plants may be as close as 2030.
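As a back-of-the-envelope sketch of what "a factor of 2 to 3 away" means, target gain can be framed as fusion yield divided by delivered laser energy, with ignition corresponding to gain ≥ 1. The 1.8 MJ figure below is my own assumed round number for NIF-scale delivered laser energy, used purely for illustration:

```python
def target_gain(fusion_yield_mj, laser_energy_mj):
    """Gain = fusion energy out / laser energy in; ignition means gain >= 1."""
    return fusion_yield_mj / laser_energy_mj

laser_mj = 1.8  # assumed NIF-scale delivered laser energy, MJ (illustrative)

# "2 to 3 times away" => fusion yield is one-half to one-third of the laser input
for factor in (2, 3):
    yield_mj = laser_mj / factor
    print(f"{factor}x away: yield {yield_mj:.2f} MJ, gain {target_gain(yield_mj, laser_mj):.2f}")
```

At gain 0.33-0.5 the remaining gap is small on a logarithmic scale, which is why the talk framed ignition as "around the corner."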

Sunday, April 21, 2013

New Trends in Fiber and Fiber Applications

Top: Microplasma ignition in an argon-filled kagome-latticed hollow-core photonic crystal fiber. Bottom: scanning electron micrograph of the fiber facet, from B. Dabord et al., CLEO 2013 talk CTu3K.6, "Microconfinement of microwave plasma in photonic structures." Microplasmas show promise for applications requiring tight confinement of short-wavelength visible or UV light, such as photolithography or compact UV laser emission sources.
Microwave plasmas, optical vortices, gravitational wave detection, and mode-division multiplexing for high-capacity telecom systems are just some of the topics in CLEO Science and Innovations 11: Fiber, Fiber Amplifiers, Lasers and Devices. I recently had an opportunity to speak with subcommittee chair Siddharth Ramachandran of Boston University, U.S.A., about this year's program on fundamental fiber technology and devices. Though at first glance we may think fiber and fiber applications are very conventional or already "all figured out," Ramachandran noted that this subcommittee continues to receive many submissions year after year (in fact, the second most in the entire conference for 2013), indicating that this is still an extremely active area of fundamental and applied research.

Ramachandran said that contributed and invited talks for the subcommittee could be divided into two main categories: 1) novel fiber, and 2) fiber applications. The latter represents breakthroughs in engineering, instrumentation, and devices built on fiber technology introduced five to fifteen years ago; it is the product of well-tended ideas, hard work, and ingenuity coming to fruition. The former, on the other hand, will likely seed the cutting-edge instruments and systems of five to fifteen CLEOs from now. Within novel fiber work, Ramachandran discussed two trends: 1) kagome-lattice structures, and 2) mode-division multiplexing for high-capacity communications.

“We are still developing all sorts of novel fibers. What a fiber is, in terms of being a high-index region that guides light surrounded by a low-index region, is not a settled issue. There are actually a lot of innovations going on.”

Ramachandran spoke of how, a decade back, the excitement in fiber research centered on photonic band-gap (PBG) fiber, which guides light in air (or in a structure of silica/air cores) yet still provides many of the properties of standard single-mode fiber, particularly confinement and guidance over many kilometers of length. "That was very exciting, and then what happened afterwards is people found out these band-gap effects are nice for guiding light, but they tend to have very small spectral regions where they can guide light, so it is not as universal as our old fibers."

Kagome-lattice fibers, named for the trihexagonal pattern of air holes resembling the weave of a Japanese kagome basket, may offer the versatility of air-guided fibers while allowing large-bandwidth propagation.

“What Kagome lattice fibers essentially do is solve this spectrum-limiting problem we had with photonic band-gap fibers. You can get huge bandwidth out of these, albeit with slightly higher (theoretical) losses. And so they have been very interesting for doing nonlinear optics of gases filled in these fibers, to do all sorts of dispersive applications where you need crazy high bandwidth, and for instance to create plasmas. And then there are people who are trying to make ignition torches with fibers, which one would never have thought of doing maybe even five years ago,” said Ramachandran.

Left: Spiral interference pattern of twelve distinct orbital angular momentum states (vortex modes) after propagating through 2 m of the air-core fiber shown on the right. Right: photo of the facet of the core (top) and index profile (bottom). From P. Gregg et al., CLEO 2013 talk CTu2K.2, "Stable Transmission of 12 OAM States in Air-Core Fiber." The potential for simultaneous propagation of so many modes shows promise for mode-division multiplexing in high-capacity telecom systems.

The other category of novel-fiber submissions to this subcommittee has centered on mode-division multiplexing for high-capacity telecom systems. Ramachandran explained:

“The simplest way to scale information capacity might be to not just use a single mode in a fiber, but to start using multiple modes. And that brings with it a lot of complexities: how do different modes interact with each other, what impact does dispersion have, what does the area of the fiber do, etcetera, etcetera? Which cycles back to being a fiber design and fiber fabrication problem. So there is a lot of innovation going on there. Even figuring out what modes one wants to send. Are they the standard modes that we have seen in textbooks? Or are they these more exotic orbital angular momentum or vortex modes?”

Top: Aerial view of the Laser Interferometer Gravitational-Wave Observatory (LIGO) at the Hanford Observatory site showing one of the 4 km arms. Photo from image library. Bottom: One of the possible 3rd-generation fiber-amplified laser sources for gravitational wave detection, designed by the QUEST Centre for Quantum Engineering and Space-Time Research and Laser Zentrum Hannover e.V. Photo from Thomas Damm, QUEST. Peter Wessels of Laser Zentrum Hannover e.V. will describe many of the stringent requirements for laser sources used in gravitational wave detection, such as high average power (~100 W to kW), single-frequency emission, ultra-low amplitude and phase noise, and diffraction-limited beam quality, in CLEO 2013 invited talk CW3M.5, "Single Frequency Laser Sources for Gravitational Wave Detection."

In addition to contributed submissions in these areas, four of the invited talks concern novel fibers and their propagation effects. The remaining invited talks, the tutorial, and the other contributed submissions focus on fiber applications. The tutorial, by Michael Marhic of Swansea University, U.K., entitled “Fiber Optical Parametric Amplifiers in Optical Communications,” will be given Thursday, June 13, from 2:00-3:00 pm. The invited talks in fiber applications, which are indicative of the contributed submissions, span topics as diverse as fiber parametric devices, microwave plasmas, gravitational wave detection, mid-IR sensing, and ultrafast laser combs.

Ramachandran notes, “And the interesting thing about that space is the fiber itself that people are using is perhaps something that was developed anywhere between five years ago to maybe even fifteen years ago. We are now beginning to see all the promise that we initially thought that fibers could deliver and actually seeing applications across different disciplines of science and technology.”

Monday, December 24, 2012

Optics for Peace of Mind

Demonstration of phase gradient microscopy in thick tissue with back-illumination suitable for endoscopic integration. (a,c,e) Amplitude images; (b,d,f) phase gradient images of mouse intestinal epithelium. From T. Ford, J. Chu, and J. Mertz, Nature Methods 9, 1195 (2012). Jerome Mertz, Boston University, among other biomedical researchers, will present the latest breakthroughs in endoscopic imaging during invited talks at CLEO 2013 Applications and Technology: Biomedical.
In the last two months, I gained a much larger appreciation for optical technology. Abdominal pain and pressure sent me to a number of doctors’ visits and a handful of endoscopic procedures: an upper-GI endoscopy, a colonoscopy, and a capsule endoscopy (the video camera in a pill). Before these, the most serious medical procedure I had undergone was the setting of a broken arm from a failed skateboarding trick when I was 11 years old. The stomach pain frightened me. It was deep inside, where I couldn’t see it or get at it, and it was making daily tasks and living difficult. I was relieved to be prescribed the first endoscopy and then the follow-up procedures; it gave me an element of control. The thought repeatedly running through my head before and after these procedures was, “how fortunate I am to live in the time I am in.” The upper-GI procedure took less than 15 minutes and was painless, and I found out immediately afterward that my esophagus and stomach looked healthy. Biopsy results less than a week later confirmed it. I had similar experiences with the other endoscopies. I was given amazing information about my internal organs in fairly non-invasive, short outpatient visits. The figure below shows one of the video frames of my stomach.

Stomach tissue from my own recent upper GI endoscopy using
a conventional commercial endoscope.

Because my own work in ultrafast laser systems has applications in nonlinear endoscopic imaging, I have used the words “optical biopsy” (the idea that tissue is cleverly analyzed with photons during the procedure instead of "barbarically" excised to be sent to a lab and analyzed later) and “non-invasive” in introductions to papers and talks, and in explaining to lab visitors how an ultrafast laser is relevant to the average person. In promoting ultrafast lasers for optical biopsy, I have sometimes talked about how running biopsied tissue through histology is long and arduous: it must be sliced thin and stained in order to be viewed with a conventional microscope, then analyzed by an expert. The patient distressingly waits for a diagnosis and also pays a non-trivial sum for the professional analysis involved.

I couldn’t have imagined the importance of these motivations before my own endoscopic procedures. What was part of my ultrafast laser stump speech was suddenly very real and worthy. My own experiences were definitely non-invasive. What would have been my options when endoscopes were larger and bulkier? What would have been my options prior to widespread use of endoscopic diagnosis?  And though my waiting for histology was short, it was still difficult and definitely costly. What advantages will the next generations have as optical researchers and engineers push endoscopes to use more imaging modalities? Push them to smaller sizes and with more functionality? What peace of mind can we pass on?

No doubt many contributed talks and postdeadline papers at CLEO 2013 will address advances in endoscopic procedures, endoscopes, and catheter-based probes. Last year’s postdeadline session saw two papers on endoscopic imaging: one from a collaboration between Johns Hopkins University and Corning, Inc., led by Xingde Li, on efficient, high-resolution nonlinear endomicroscopy, and the other from Chris Xu’s lab at Cornell University, which piggy-backed wide-field one-photon imaging with high-resolution two-photon imaging in the same device for optical zoom capability. There were also a number of contributed submissions on advances in endoscopy, such as the work by Adela Ben-Yakar’s group at the University of Texas at Austin, whose endoscope used the same ultrafast laser both for two-photon imaging to target tissue and for subsurface precision microsurgery through athermal ablation. Last year’s CLEO also hosted an invited talk by Brett Bouma, pioneer of optical coherence tomography (OCT), on translating OCT into GI endoscopy.

This year’s invited speakers in CLEO's Applications and Technology: Biomedical will also address future directions in endoscopes and endoscopic procedures. Invited speaker Jerome Mertz of Boston University will discuss his work on phase contrast endomicroscopy, just published in this week’s Nature Methods. His technique cleverly uses two diametrically opposed off-axis sources to allow oblique back-illumination in a reflection-mode geometry. Traditional phase contrast microscopy with oblique illumination requires transillumination and is therefore not suitable for in vivo imaging. Mertz's back-illumination technique allows his microscope to be miniaturized and integrated into an endoscope, where the source and detection optics must reside on the same side of the sample. Unlike traditional oblique-illumination phase contrast, his technique can also image thick samples.

Invited speaker Laura Marcu of the University of California, Davis will address the use of fluorescence lifetime imaging (FLIM) and time-resolved fluorescence spectroscopy (TRFS) in clinical diagnostics. FLIM and TRFS can reveal optical molecular contrast for diagnosis of atherosclerotic cardiovascular disease. However, current FLIM and TRFS studies of arteries have been primarily ex vivo, and in vivo studies have interrogated only exposed plaque. In a recent article in Biomedical Optics Express, Marcu demonstrates a catheter-based (“endoscopic”) scanning-TRFS system for intravascular interrogation of the arterial wall. Catheter systems are a crucial step in translating TRFS and FLIM to clinical diagnosis.

Finally, invited speaker Andrew M. Rollins of Case Western Reserve University will discuss OCT image guidance of radio frequency ablation (RFA) for treatment of ventricular tachycardia. RFA is a standard treatment for curing many cardiac arrhythmias. However, in ventricular tachycardia (VT), the arrhythmia circuits are located deep in the myocardium or epicardium, and because no technology exists to image cardiac tissue directly during RFA treatment, the procedure has limited success in curing VT. OCT, with its capability of imaging 1-2 mm deep within tissue at micron resolution, could be the silver bullet for successful VT RFA procedures. In a recent article in the Journal of Biomedical Optics, Rollins and collaborators from the North Ohio Heart Center, Cardiac Electrophysiology, imaged freshly excised swine hearts using a microscope-integrated bench-top scanner and a forward-imaging catheter probe, demonstrating the high functionality of OCT for RFA image guidance.

This CLEO, I will be looking at imaging research from a different perspective. Whether at the research level or market-ready, advances in endoscopes serve a larger altruistic purpose: potentially giving patients a higher quality of life, some control over their health, and dignity. These aren't just empty words to promote optics research; they are very real. What a wonderful profession we’re in.

Thursday, September 6, 2012

Ultrafast Optical Pulses and Hip Hop Play a Critical Role on Mars Rover

Artist Rendering of ChemCam Laser
Analysis on Mars Science Laboratory. From
What do, front man for the Black Eyed Peas, and ultrafast optical pulses have in common? Both are playing a crucial role on the newest Mars rover mission. On August 28,'s song "Reach for the Stars" became the first musical composition to be transmitted to Earth from another planet, in this case from Curiosity, twelve days after its Seven Minutes of Terror landing, complete with state-of-the-art supersonic parachute and sky-crane. I'm still a bit shocked at this science-fictionesque feat of impressive engineering, which seems to border on hubris. Really, a sky-crane? Really?

While's interplanetary music transmission is playing a critical role in science and engineering outreach, as part of the Google+ and Lockheed Martin sponsored initiative SYSTEM (Stimulating Youth for Science Technology Engineering and Math), ultrafast optics is playing a critical role in analyzing the geology of the martian surface. On August 19, ChemCam, an instrument in the Mars Science Laboratory payload aboard Curiosity, ablated part of a rock with laser pulses and performed chemical analysis on the emitted plasma to determine rock and soil composition, a first for exogeology. Though the technique, laser-induced breakdown spectroscopy (LIBS), is almost as old as the laser itself, it had never before been performed on another planet. What makes LIBS so useful for Mars exploration is that, as an active remote-sensing technique, no physical contact needs to be made with the rock or soil under test, even to clean the sample area.

The previous Mars rovers required a rock abrasion tool to remove dust and outer layers in order to analyze the more interesting, unweathered interior of rock and soil samples. On Curiosity, initial pulses "clean" the area and subsequent pulses create the plasma whose spectrum is analyzed. Standoff distances for this instrument can be as large as 7 m. The LIBS instrument is combined with a Remote Micro-Imager (RMI), which gives contextual information around the approximately 0.5 mm LIBS interrogation points, in a single instrument called ChemCam. The figure below shows the precision of the laser system as well as the resolution of the Micro-Imager at 3 m standoff. The choice to burn precision holes in the U.S. dollar and the euro (near Toulouse, France, on the euro's map) is an homage to the locations of the collaborating institutions: Los Alamos National Laboratory, Centre National d'Etudes Spatiales, and Centre National de la Recherche Scientifique.

Demonstration of ChemCam's shooting accuracy and micro-imager resolution at 3 m standoff after ablating holes in U.S. and European currency, respectively. The inset (lower left) shows the difference image. Image from the poster "Progress on Calibration of the ChemCam LIBS Instrument on the Mars Science Laboratory Rover," by principal investigator R.C. Wiens, 2010.

Besides the ultrafast laser system, ChemCam is a goldmine of optical engineering and instrumentation; there is honestly something for almost any kind of optical scientist on this instrument. Details can be found both on the ChemCam website and in a review of the instrument suite (an easy, geeky read I had trouble putting down). The laser and imaging optics reside in the mast of ChemCam (the periscope-like "eye" of the rover), while the spectrometers and supporting equipment live in the body unit. The mast and body are connected by optical fiber.

Schematic of ChemCam. From "The ChemCam Instrument Suite on the Mars
Science Laboratory Rover: Body Unit and Combined System Tests," Space
Sci. Rev., DOI 10.1007/s11214-012-9902-4, (2012).

As a laser scientist, when I read the initial news of ChemCam, I wanted to know as much as possible about the laser on board. What kind of pulses do you need in order to create a plasma from rock? What power? What pulse width? What wavelength or wavelength range is required for LIBS analysis? Is it a fiber laser or a solid-state laser?

The laser on ChemCam is a neodymium-doped potassium gadolinium tungstate (Nd:KGW) laser with a center wavelength of 1067 nm, made by Thales Optronique. Pulses are 5 ns in duration with more than 10 mJ of energy, delivering 10 MW of power per square millimeter to the target. The repetition rate is very low, 1-10 Hz, in order to maximize pulse energy. It turns out the wavelength is not very special; it could be anything from the visible to the near-IR, since the field strength is what matters most for creating the plasma. The ChemCam team chose a laser in the 1.0 micron region for the simplicity and practicality of obtaining the necessary energy density for LIBS. Another advantage of 1.0 micron excitation, however, is that it is longer than the reddest of the expected characteristic emission lines from the plasma, which range from 240-850 nm.
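The quoted numbers hang together nicely, which makes for a quick sanity check. A sketch, assuming the ~0.5 mm LIBS interrogation spot mentioned earlier serves as the focal diameter (my assumption; the actual spot size varies with standoff distance):

```python
import math

pulse_energy_j = 10e-3   # >10 mJ per pulse (quoted)
pulse_width_s = 5e-9     # 5 ns (quoted)
spot_diam_mm = 0.5       # assumed: the ~0.5 mm interrogation spot

peak_power_w = pulse_energy_j / pulse_width_s       # energy / time = peak power
spot_area_mm2 = math.pi * (spot_diam_mm / 2) ** 2   # ~0.196 mm^2
irradiance_mw_per_mm2 = peak_power_w / spot_area_mm2 / 1e6

print(f"peak power: {peak_power_w / 1e6:.1f} MW")
print(f"irradiance: {irradiance_mw_per_mm2:.1f} MW/mm^2")  # ~10 MW/mm^2, matching the spec
```

A 10 mJ, 5 ns pulse carries 2 MW of peak power; spread over a ~0.5 mm spot, that is right at the 10 MW per square millimeter figure.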

ChemCam's first spectrum, using data collected from a rock near the landing site dubbed "Coronation." Image from Andy Shaner's ChemCam blog.

So far, preliminary data from ChemCam's LIBS instrument show clean, "textbook" spectra. The first spectrum (shown above) is consistent with basalt, a type of volcanic rock known to be present on Mars. The carbon peak in the spectrum comes from the carbon dioxide-rich martian atmosphere. Hydrogen disappeared after the first shot, indicating it was only on the surface, and the concentration of magnesium decreased with subsequent laser shots. ChemCam began ablating rocks on the martian surface August 19 and has since been taking more target practice. The latest picture from ChemCam, on August 25 (see below), shows a 5 x 1 raster scan to investigate chemical variability across the sample.

A before-and-after photo of a 5 x 1 raster scan from the August 25 chemical variability analysis.

The feats being accomplished by Curiosity are truly amazing. It seems the sky is the limit when it comes to what this rover can do. But wait; to quote "Reach for the Stars":

"Why do they say the sky is the limit when I've seen footprints on the moon?...let's reach for the stars"

I take that back: the stars are the limit. Thanks,

Monday, May 14, 2012

Transistor Moment

Flying back home from San Jose, I couldn't help but wonder with excitement whether our field is on the verge of a "transistor moment." Maybe it was just my CLEO conference euphoria coupled with high-end caffeine from Caffe Frascati still in my system. However, I feel like something big is going to happen, particularly in the field of photonic circuits and nanophotonics.

The explosion of work in this subarea is impressive, and CLEO hosted a number of talks from the leaders and pioneers of the field. You can still watch a handful of these on CLEO On Demand, such as Yurii Vlasov's plenary talk on fundamentals and applications of silicon nanophotonics, Larry Coldren's tutorial, CW1K.1, on single-chip transmitters and receivers, and Dave Welch's tutorial, JM4.I.1, on semiconductor photonic integrated circuits, to name a few. Cutting-edge science is interfacing with better fabrication processes: repeatability and low cost. At the poster session on Wednesday night, it seemed every group was using some kind of micro-ring resonator. Simple photonic circuits are becoming standard. Will our children have the nanophotonic equivalent of a Heathkit radio, something like "My First Fab"? It seems a sure thing to me that my daughters will use optical/electrical hybrid computers in their lifetime, and even more certain that nanophotonics is the future of our field.

However, will something even bigger, more profound, and unexpected happen, as when Walter Brattain dunked his amplifier experiment in a thermos of water in 1947 to successfully demonstrate electrical gain in what was to become the transistor? That same little amplifier gave birth to a small startup company named Sony, and later to Texas Instruments, Intel, and the entire business of integrated circuits and computation as we know it. The transistor was at first a "mere" amplifier; later it became the foundation for all computer logic and a new era of technology. I wonder what is within our grasp that we don't realize.

Yurii Vlasov used imagery from The Wizard of Oz in his plenary talk: a road to follow to the Emerald City (our goals of nanophotonics and computation, and the winding road we will take to get there). However, I wonder what ruby slippers we are wearing right now. What "transistor potential" is waiting to be unlocked? It's a good time to be in the field of photonics. We will be the leaders of the new information age and of the technology that drives and supports it.

Thursday, May 10, 2012

Limber-up for the Postdeadline Session Tonight

From Postdeadline paper CTh5D.1 "Wavelength-Size Silicon Modulator." Scanning electron micrograph of the silicon integrated waveguide modulator.

Make sure to stretch your legs if you want to move from session to session in this frenzy of fantastic photonics research (say that five times fast). Tonight from 8:00-10:00 pm marks the crème de la crème of contributed papers at CLEO. I haven't quite made up my mind which to attend, but I found a number of them particularly exciting:

CTh5D.1, "Wavelength-size Silicon Modulator"  from V.J. Sorger et al.

This is work out of the Zhang Lab at Berkeley showing an optical modulator with 1 dB/micron extinction (a 20-micron-long device gives 20 dB of extinction). The modulator is based on tuning the carrier concentration of an active nm-thin layer of indium tin oxide inside a MOS structure. Just yesterday, Larry Coldren of UCSB was gently ribbing the silicon folks in his tutorial CW1K.1, "Single-chip Integrated Transmitters and Receivers," about the dearth of practical active components such as modulators. Coldren sees InP-based photonic circuits as the more robust platform for photonic integration. However, great work like this from the Zhang group will keep pushing silicon to the forefront.
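Because extinction in dB adds linearly with device length, the 1 dB/micron figure translates directly into on/off power ratios. A quick sketch (the helper names are mine, not from the paper):

```python
def extinction_db(length_um, db_per_um=1.0):
    # 1 dB/micron is the figure quoted for the Sorger et al. modulator
    return length_um * db_per_um

def power_ratio(db):
    # convert dB of extinction to a linear on/off power ratio
    return 10 ** (db / 10)

for length in (5, 10, 20):
    db = extinction_db(length)
    print(f"{length} um device: {db:.0f} dB extinction = {power_ratio(db):.0f}x on/off ratio")
```

The quoted 20-micron device thus suppresses the "off" state by a factor of 100 in power, in a footprint comparable to the wavelength-scale devices the paper's title promises.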

CTh5C.4, "In Vivo Three-Photon Microscopy of Subcortical Structures within an Intact Mouse Brain" from N. Horton et al.

This work from the Xu Group at Cornell University uses a clever choice of longer excitation wavelength, coupled with the improved localization of three-photon fluorescence, to image deep through intact tissue. Even though longer wavelengths are more readily absorbed in tissue, they are significantly less scattered; the net effect is higher throughput and deeper penetration. Combine that with the 1/z^4 fall-off of the three-photon fluorescence signal (tighter localization), and you can make beautiful images of intact tissue. The Xu group shows a 1.2 mm stack of brain tissue taken in 4-micron increments. The broader goal is eventually to use this for optical biopsy in humans. I would prefer to have my tissue scanned with a laser rather than excised from my body with a knife by a surgeon, wouldn't you?
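The localization advantage can be sketched numerically. For n-photon excitation, the out-of-focus signal from a plane at defocus z falls off roughly as 1/z^(2(n-1)), the standard multiphoton scaling; the paper's 1/z^4 figure is the n = 3 case. A toy comparison (normalized to 1 at one unit of defocus):

```python
def oof_signal(z, n_photon):
    """Out-of-focus signal falloff ~ 1/z^(2*(n-1)):
    1/z^2 for two-photon, 1/z^4 for three-photon.
    z is defocus in arbitrary units; normalized so signal = 1 at z = 1."""
    return 1.0 / z ** (2 * (n_photon - 1))

for z in (1, 2, 5, 10):
    print(f"z={z:>2}: 2-photon {oof_signal(z, 2):.4f}  3-photon {oof_signal(z, 3):.6f}")
```

At ten units of defocus the three-photon background is already 100 times lower than the two-photon background, which is why the deep-tissue images stay so clean.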

CTh5C.1, "Demonstration of a Bright 50 Hz Repetition Rate Table-Top Soft X-Ray Laser Driven by a Diode-Pumped Laser" from B. Reagan et al.

This work from the Rocca Group of Colorado State University and the Research Center for Extreme Ultraviolet Science and Technology shows a significant improvement in table-top soft x-ray lasers. To see how quickly this group is improving these systems, just look at a March 2012 Laser Focus World feature article highlighting their work, now already outdated. The aim of table-top soft x-ray research is to bring systems typically found at shared national-lab facilities to the many optics tables of university and industry labs. Applications of coherent soft x-rays include laser-induced materials processing at the nanoscale as well as ultrafast characterization of nanoscale motion. Spectra-Physics and Coherent may not be selling ultrafast soft x-ray lasers just yet, but this paper shows a 5-fold increase in repetition rate (important for higher-average-power applications) and a 20-fold increase in pulse energy over previous best efforts.
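Why the repetition-rate jump matters: average power is simply pulse energy times repetition rate, so the 5-fold rate and 20-fold energy gains compound to a 100-fold average-power increase. The baseline numbers below are hypothetical, chosen only to show the scaling (50 Hz is the demonstrated rate from the paper's title):

```python
def avg_power_w(pulse_energy_j, rep_rate_hz):
    # average power = energy per pulse x pulses per second
    return pulse_energy_j * rep_rate_hz

baseline = avg_power_w(1.0, 10.0)   # hypothetical earlier system (arbitrary energy units)
improved = avg_power_w(20.0, 50.0)  # 20x the pulse energy at 5x the rate (50 Hz)
print(f"average-power gain: {improved / baseline:.0f}x")
```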

ATh5A.4 "Highly Efficient GaAs Solar Cells with Dual Layer of Quantum Dots and a Flexible PDMS Film"  from C. Lin et al.

In this paper, a Taiwanese collaboration among the Institute of Photonic Systems, National Chiao Tung University, and the Research Center for Applied Sciences shows a 22% efficiency enhancement in a GaAs solar cell, obtained by spraying a coating of UV-absorptive quantum dots onto a polydimethylsiloxane (PDMS) film at the top surface of the cell. The collaboration has found a clever way to keep so many of the sun's UV photons from going to waste.