If we can remotely probe the structure and nature of an object, we can make an image
of it and use that image to develop human comprehension.
Hubble Telescope Spots an Intergalactic String of Pearls
Astronomy and Space Science
No jeweler on the planet can beat the string of pearls spotted by the Hubble Space Telescope: The merger of two elliptical galaxies has created a "necklace" of infant stars stretching for 100,000 light-years.
NASA / ESA / RIT / HUBBLE HERITAGE
Jul. 10, 2014
Everything about this picture is big: The stellar string would stretch from one end of our Milky Way galaxy to the other, between two galaxies that are both three times wider than our own. The galaxies are contained in a cluster known as SDSS J1531+3414, a formation that's so massive its gravitational field bends the images of background galaxies into bluish arcs.
When the images were acquired, astronomers assumed the chain of stars was merely an illusion created by the galaxy cluster's gravitational lens. But follow-up observations using the Nordic Optical Telescope in the Canary Islands ruled out that hypothesis.
"We were surprised to find this stunning morphology, which must be very short-lived," Grant Tremblay of the European Southern Observatory said in a news release from the Space Telescope Science Institute. "We've long known that the 'beads on a string' phenomenon is seen in the arms of spiral galaxies and in tidal bridges between interacting galaxies. However, this particular supercluster arrangement has never been seen before in giant merging elliptical galaxies."
Tremblay compared the phenomenon to "two monsters playing tug-of-war with a necklace."
When Steve Sasson made the first digital camera in 1975, he never dreamed it would end up on display at the George Eastman House.
Nov. 23, 2014
Bennett J. Loudon
"I just never thought about the significance of it, to be honest with you, until the press started asking me lots of questions in the early 2000s," Sasson said.
Sasson took part in a panel discussion Saturday at the George Eastman House about the effect digital photographic technology has had.
Despite the enormous impact his invention has had on modern life, Sasson remains startled by the celebrity it has bestowed.
"The camera represents to me a personal memory and a critical decision point in my career. From a broader perspective, I never expected people to be this interested in the prototype. I never really thought about it being historically that interesting. It motivated me, but I never thought about it in historical terms," said Sasson, who lives in Hilton.
Over the years, the mailbox-sized camera has been shown with Sasson in news interviews and documentaries, but it was never before put on public display. The public can see it until Jan. 4 at George Eastman House as part of the exhibit titled, "Innovation in the Imaging Capital."
Roger Easton, a professor of imaging science at Rochester Institute of Technology, who visited the George Eastman House Saturday to see the camera in person and hear Sasson talk, called the invention "a revolution."
"In my line of work, I use them all the time. I take digital images of historical documents, manuscripts," said Easton, who took a picture of the Sasson camera with his own, modern Kodak digital camera.
"Right now I'm working on a 1490 map of the world. That technology is what allows us to pull writings out that people haven't been able to read for hundreds of years," Easton said.
Sasson travels extensively, giving talks about the camera, innovation and inventing.
"I'm embarrassed to say that, but yeah, people do come up to me and want their picture taken with me all the time, which is fine," he said.
A few years ago, Sasson started getting letters from people all over the country asking for his picture and autograph.
"I collect them in a book. My book is getting pretty full. They're kind of moving stories," he said.
The camera was the result of a research project assigned to Sasson in 1975 when he worked at Eastman Kodak Co.
He retired from Kodak in 2012 and now has his own consulting firm.
"Nobody really told me to build a camera. They asked me to look at the imaging properties of this new type of imager called the charge-coupled device," Sasson said.
"I was just sort of the right person at the right place at the right time," he said, explaining that he was fresh out of Rensselaer Polytechnic Institute with a master's degree in electrical engineering.
"I liked to build things. As a kid I built all kinds of radios and transmitters at my house in Brooklyn. I used to scavenge parts from old TV sets people used to leave on the sidewalk," he said.
He didn't get much direction for what he described as a low-key project conducted without much funding or resources.
"I thought, if I'm going to measure the imaging properties of a device, it would be nice if I could capture images at will. That sort of sounds like a camera. And then I thought, wouldn't it be really cool to build an all-electronic camera, no moving parts at all," he said.
The pictures taken by the first camera weren't very good, but it demonstrated a concept. Back then, he guessed it would be 15 to 20 years before it would be perfected. He was off by about 10 years, and by that time the Kodak patent had expired.
CIS Undergraduate Research Sparks Prestigious Professorship in Astronomy
Astronomy and Space Science
Dec. 14, 2011
A poster book of huge, colorful photographs of the giant planets taken by the Voyager 2 spacecraft captivated Sally Dodson-Robinson as a child in Los Angeles. “I thought it was really cool. All these pictures of planets and moons,” she says. “I always liked astronomy when I was a child, but I didn’t know how you would go about having a career in astronomy.”
She toyed with photography and always enjoyed science, but everything fell into place as a junior at the Center for Imaging Science. That year and the following year, Dodson-Robinson carried out a research project on binary stars with then-CIS Professor Elliot Horch. This close contact with a working astronomer motivated her to pursue astronomy as a career.
Dodson-Robinson, now 31, received a Bachelor of Science degree from CIS in May 2002, graduating summa cum laude as the College of Science Student Delegate. She also accepted the College of Science Outstanding Scholar Award. She is in the midst of her third year as assistant professor of astrophysics at the University of Texas, Austin.
The access to such stimulating research projects at CIS, Dodson-Robinson says, directly led to her current, highly coveted, position. With Horch, Dodson-Robinson observed stars using the fine guidance sensors on the Hubble Space Telescope with the goal of finding binary, or double, stars. “The project was ongoing and when I started, we were in the very first phases of it. I think that first batch included about 12 stars. I enjoyed discovering things. I wanted to keep discovering and building knowledge.”
With the eventual goal of attending graduate school in astrophysics, Dodson-Robinson took a year off to teach English in Japan. In 2003, she started doctoral work at University of California, Santa Cruz. She received a National Science Foundation Graduate Research Fellowship between 2003 and 2006, which funded the bulk of her graduate career.
Almost immediately, as a result of a successful master’s level class project, she began working with Professor Greg Laughlin, who studies planetary astrophysics. While it included an analysis of observational data, Dodson-Robinson’s doctoral thesis took a more theoretical direction than her undergraduate work. She investigated the chemistry of planet formation and, specifically, how the composition of gas and dust determines planets’ growth. The boundary between observations and theory did not faze her. “If I get interested in a question, I will use any method I can to answer the question. I’m not particular about the method,” she says.
As Dodson-Robinson put the finishing touches on her dissertation in 2008, the University of Texas offered her a faculty position. At the time, she had already accepted a Spitzer Space Telescope post-doctoral position at the NASA Exoplanet Science Institute at the California Institute of Technology in Pasadena, and so she deferred her UT offer until 2009-10.
At UT, Dodson-Robinson focuses on planet formation and planet archeology, but she has extended her research to include planet-forming accretion disks around stars. Analytical theory and numerical simulations of the dynamical and chemical environment of planet growth allow her to uncover the formation histories of exoplanets and Solar System objects. Using spectroscopy, she also chemically analyzes stars and their orbiting dust, reading the fossil record of planet growth. In addition, analyses of infrared observations enable her to see the composition of dust grains that make up planets.
Dodson-Robinson’s collaborations are mainly rooted in her doctoral and post-doctoral work. She works extensively with two JPL researchers, Karen Willacy and Neil Turner, as well as with her “grand advisor” from UCSC, Professor Emeritus Peter Bodenheimer. Harkening back to her initial astronomical inspiration as a child, this year Dodson-Robinson obtained a five-year National Science Foundation CAREER Award to study the formation of giant planets.
While Dodson-Robinson, a Southern California native, does not miss the cold weather of Rochester, she looks fondly on her years at CIS and appreciates that all the students are encouraged to get involved in research. “I think that was one of the most valuable things for me, since I have a career in research now,” she says.
CIS Alumnus Takes on Instrumental Role in Next Landsat Satellite
Matthew Montanaro—a Chester F. Carlson Center for Imaging Science doctoral alumnus—has played a key role in the development of a state-of-the-art infrared imager aboard the newest Landsat mission scheduled for launch in February.
Jan. 24, 2013
Collaborating with a team of scientists and engineers at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, Montanaro is applying his knowledge of remote sensing theory and imaging methods to the Landsat Data Continuity Mission. Landsat 8 is the most advanced of eight remote sensing satellite missions designed to collect images of the Earth from space for land use research.
After completing his doctorate in May 2009, Montanaro joined Goddard in September 2009. His work focuses on a specialized camera called the Thermal Infrared Sensor (TIRS), which is designed to detect changes in temperature on the Earth’s surface. The TIRS camera takes images in infrared wavelengths, which are beyond the visible range. Each image arrives in the form of raw numbers, representing intensities across the image, and Montanaro calibrates those numbers to produce an image of the Earth’s temperature variations. “My job is to figure out algorithms and connect the data to meaningful temperatures that scientists can use in their research,” Montanaro says. “When you are done, you have a temperature map of the Earth’s surface.”
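The calibration pipeline Montanaro describes can be sketched in miniature: a linear gain and bias turn each raw digital number (DN) into radiance, and an inverse Planck relation turns radiance into a brightness temperature. All coefficients and function names below are hypothetical stand-ins, not actual TIRS values; this only illustrates the shape of such an algorithm.

```python
import math

# Illustrative (not actual TIRS) calibration coefficients: a gain and bias
# convert a raw digital number (DN) to spectral radiance.
GAIN, BIAS = 3.342e-4, 0.1   # W / (m^2 sr um) per DN, hypothetical
# Band-specific constants for the inverse Planck relation (hypothetical,
# typical in magnitude of a ~11 um thermal band).
K1, K2 = 774.89, 1321.08

def dn_to_radiance(dn):
    """Linear radiometric calibration: raw counts -> radiance."""
    return GAIN * dn + BIAS

def radiance_to_temperature(radiance):
    """Invert the Planck relation to get brightness temperature (kelvin)."""
    return K2 / math.log(K1 / radiance + 1.0)

# A raw pixel value becomes a physically meaningful surface temperature.
temp_k = radiance_to_temperature(dn_to_radiance(25000))
```

Under these made-up coefficients, a DN of 25,000 maps to roughly 292 K, a plausible Earth-surface temperature; applying the two functions to every pixel yields the temperature map Montanaro describes.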
Practically speaking, this new sensor will help farmers, scientists, and enforcement agencies track water management in the Midwest, for example, in times of drought. “If you fly over the Midwest and look out the window, you see these huge circles that are irrigation fields,” Montanaro explains. “If you think of heat, you’re spraying a lot of water [on the field] and when that water evaporates, it cools the surface. Our instrument is designed to detect that change in temperature. You would be able to see temperature differences between these fields.”
The TIRS team has worked on an accelerated schedule to be ready for the February 2013 launch date of Landsat 8, which is equipped with an optical camera as well as TIRS. NASA launched the first satellite in the Landsat series in 1972; a new mission is put in orbit every five to 10 years, and each version becomes more sophisticated in its instrumentation.
Engineers built the new TIRS instrument at Goddard and shipped it to Orbital Sciences Corporation in Gilbert, Arizona, last February, where engineers installed both TIRS and the mission’s visible-light camera aboard the satellite. During the late summer and fall, Montanaro made numerous trips to Gilbert to assist in observatory-level testing. In these tests, TIRS took images simulating the satellite in orbit, and Montanaro checked the data to ensure their reliability. The satellite has been shipped to the launch site at Vandenberg Air Force Base in California, where it will be mounted on top of a United Launch Alliance Atlas V rocket, according to the NASA website. After launch, the team will spend 90 days taking test images and ensuring the reliability of the data; then the U.S. Geological Survey takes over the operation of collecting the data.
Montanaro says his training in the Carlson Center for Imaging Science at RIT, where he studied thermal remote sensing theory and imaging methods with Professor Carl Salvaggio, prepared him well for his job duties at NASA.
“Matt's research leading to his doctoral degree was to look at the interactions of optical radiation in a complex, cavernous target, and predict what a remote sensing system would observe,” Salvaggio says. “Matt developed an in-depth understanding of the way thermal infrared energy interacts with targets, the atmosphere, and the sensor, which prepared him in a very solid way for his current work at NASA producing the next generation of infrared sensing systems.”
While at RIT Montanaro focused exclusively on the science side of remote sensing, whereas TIRS has also required him to comprehend the engineering end of camera design. “You see [the work] from a different perspective here. You see everything that has to go into building something like this and then launching it,” Montanaro says. “You get to see the inner workings of stuff that the public doesn’t normally get to see. And, hopefully, in February, it will actually launch and then something we worked on will be orbiting the earth.”
Can we replicate any object by sending a file to a machine that will in turn re-create the original object in 3D? Dr. Alvaro Rojas wants to know.
Feb. 3, 2014
Rojas is the only CIS graduate student who has earned a PhD in printing and electrophotography. Electrophotography is the technology behind laser printers and copiers, and Rojas has been studying its application to the manufacturing of 3D objects.
According to Dr. Rojas, such technology could potentially have applications not only in manufacturing, but also in medicine. Tissues, bone, and even organs could possibly be customized for patients who might, for example, need a kidney transplant. "This seems to be an application area that is being pushed forward," says Rojas.
Since the advent of rapid 3D printing, there has been an explosion both in access to the machines and in the range of applications. Some printers are more portable but provide a lower resolution; others are more accurate. Either way, they represent a viable method for producing results far beyond just a picture on a computer screen.
There are instructions online for 3D printers that can be built by anyone, and these kits "print" using various media such as plastics, food substances, or liquids that solidify after printing. "We are trying to explore a different technology; we are working on using electrophotography to print with powders," says Rojas. "Other techniques use powders but need some sort of glue. Laser printers use powders which don't need to be suspended in liquid and so we are working to use particles that fuse together, which means they don't require a binding agent."
The exploration of this technology means Dr. Rojas and his team might one day be able to print objects using materials such as ceramics or metals. "So far we are in the beginning stages," says Rojas. Because 3D printers work by layering material from the bottom up, microscopic surface defects, such as holes or bumps, can build up across hundreds of layers, causing major surface defects in the printed object. Without solving this issue, says Rojas, the technology cannot go much further. So his research has revolved around avoiding such defects. "Microscopic bumps on hundreds of layers can really show up, so we are investigating ways of sensing the surface layer by layer as an object is being built."
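A toy model shows why per-layer defects matter: if each layer's thickness deviates slightly at random, the deviations accumulate over hundreds of layers instead of staying at the scale of any single layer. The numbers below are made up for illustration; this is not Rojas's model.

```python
import random

# Toy build-up of layer errors (illustrative only, not Rojas's model):
# each printed layer deviates slightly from its nominal thickness.
random.seed(1)
nominal_um = 10.0          # hypothetical nominal layer thickness, microns
layers = 500
height = 0.0
for _ in range(layers):
    # A microscopic bump or hole on each layer, ~0.5 um in scale.
    height += nominal_um + random.gauss(0.0, 0.5)

# The accumulated error can be many times a single layer's defect,
# which is why sensing the surface layer by layer matters.
error_um = height - nominal_um * layers
```

Monitoring the surface after each layer, as Rojas proposes, would let the printer correct `error_um` before it compounds.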
Are some materials more susceptible to defects than others? Rojas explains that so far he has only tried printing with toner that consists of tiny five-micron particles. "We are using toner because that is what was available to us. We have seen other people using thirty-micron particles, but all materials are susceptible. Larger particles might not bring more defects, but they might be visible earlier in the printing process."
Alvaro Rojas came to RIT in 2006 from Colombia on a scholarship to earn his master's degree in Industrial Engineering. He earned a second master's degree, in Systems Engineering, at the University of Illinois at Urbana-Champaign; he then returned to RIT for his PhD. He successfully defended his dissertation in the fall of 2013 and earned his doctorate. He has now traveled home to Cali, Colombia, where he is on the faculty at Universidad Autonoma de Occidente. Dr. Rojas also plans to continue collaborating with his advisor, Marcos Esterman.
"I am looking forward to it and I love to teach, but I have mixed feelings because I love RIT and this will be a big change."
RIT Imaging Science Doctoral Students Win National Awards
Canham, Pahlevan win use of novel imaging instrument
Apr. 6, 2011
Access to a specialized imaging device that measures reflectance was awarded to two doctoral students at Rochester Institute of Technology in support of their thesis research.
Kelly Canham and Nima Pahlevan, students in the Digital Imaging and Remote Sensing Laboratory in the Chester F. Carlson Center for Imaging Science, won temporary use of spectroradiometers. These instruments measure the amount of light reflected from a material at each wavelength along the electromagnetic spectrum. The awards were made through the Alexander Goetz Instrument Program, co-sponsored by Analytical Spectral Devices Inc. and the Institute of Electrical and Electronics Engineers Geoscience and Remote Sensing Society. A total of seven 2011 award winners were named.
Canham, a resident of Palmyra, Mo., shares her award with David Messinger, director of the Digital Imaging and Remote Sensing Laboratory, and William Middleton, associate professor of sociology and anthropology. They are developing image-processing tools that will aid Middleton’s archeological research pertaining to the Zapotec civilization in Oaxaca, Mexico.
In December, Canham will use the spectroradiometer, a FieldSpec Pro, in Oaxaca to measure the amount of light reflected from soils and vegetation common to the area. The library of spectral signatures—not images—she builds will help the archeological team decide where to dig. Distinct spectral signatures or “fingerprints” will help Canham distinguish between different vegetation and minerals in the soil in Oaxaca.
The team will compare the spectra to images processed in an earlier stage of the project using data collected by NASA’s Earth Observing 1 satellite and its Hyperion hyperspectral sensor. Hyperspectral imaging combines bands of spectral information from across the electromagnetic spectrum into three-dimensional data cubes.
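The cube structure is easy to picture with an array sketch: two spatial axes plus one spectral axis, so every pixel carries a full spectrum and every band index gives an image. The array sizes here are illustrative (Hyperion records on the order of 220 bands).

```python
import numpy as np

# A hyperspectral "data cube": rows x columns x spectral bands.
# Sizes are illustrative; Hyperion collects on the order of 220 bands.
rows, cols, bands = 100, 100, 220
cube = np.zeros((rows, cols, bands))

# Each spatial pixel holds a full spectrum...
spectrum = cube[42, 17, :]      # shape (220,): one pixel, all wavelengths
# ...and each band index gives a single-wavelength image of the scene.
band_image = cube[:, :, 30]     # shape (100, 100): one wavelength, all pixels
```

Slicing along the spectral axis is what lets researchers compare a pixel's measured spectrum against a library of known material signatures.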
“The overall result of this research is to predict archeologically interesting locations using the hyperspectral imagery,” Canham says. “This will help Dr. Middleton and other archaeologists focus their time and efforts in their research. They will not need to rely only on time- and resource-consuming ground surveys to determine a site. Instead, they may simply look at a map created from this research to determine where they would like to focus a more extensive dig-site.”
Pahlevan, a resident of Tehran, Iran, and John Schott, the Fredrick and Anna B. Weidman Professor in the Center for Imaging Science, also won temporary access to a spectroradiometer through the Alexander Goetz Instrument Program. Pahlevan and Schott will use the hand-held device in July to analyze optical properties of coastal waters.
“We will investigate the water quality of the southern shores of Lake Ontario at the mouth of the Genesee and the Niagara rivers,” Pahlevan says.
Their research will also examine the potential of the next generation of the Landsat Earth-observing satellite, scheduled for launch in December 2012.
“This effort introduces a different approach, based on satellite remote sensing, to provide environmentalists and decision makers with better insights on the state of the ecosystem in coastal waters on a regular basis,” Pahlevan says.
“The neat part of this project is establishing a link between satellite imagery and modeling efforts to improve our ability to monitor water quality in the receiving waters near the river discharge.”
In addition to the award from the Goetz program, Pahlevan was recognized for having the best presentation in the engineering/modeling session at the 21st annual Great Lakes Research Consortium student-faculty conference in March in Syracuse. He presented “The Potential of Landsat/LDCM Coupled with a Hydrodynamic Model for Quantitative Mapping of Water Constituents in Inland Waters.”
Clinicians across the world now have access to an image de-noising “toolbox” that will allow them to improve the diagnosis and treatment of their patients with schizophrenia or bipolar disorder. This pre-processing algorithm enables scientists to compare, with greater specificity, brain scans of schizophrenic patients and healthy control subjects.
May. 17, 2011
Siddharth Khullar, a second-year doctoral student at RIT’s Chester F. Carlson Center for Imaging Science, is pioneering new techniques that process functional magnetic resonance imaging (fMRI) data in more precise detail, allowing scientists to discern the brain functionality of schizophrenic patients better than ever before.
“Neurodiseases don’t show up in a brain scan as readily as, for example, a brain tumor,” says Khullar, who studies with Center Director Stefi Baum. “If someone has a symptom of schizophrenia, or a similar disease such as bipolar disorder, they can take a cognitive test inside the fMRI scanner and, with this de-noising method, clinicians are able to compare and analyze the resulting images more accurately with the same scan of a healthy person.”
Even as Khullar works with Baum at RIT, he lives in sunny Albuquerque, New Mexico. He works as a graduate research associate in the Medical Image Analysis Lab led by Vince Calhoun at the Mind Research Network for Neurodiagnostic Discovery. Khullar received a master’s degree in electrical engineering at RIT in 2009, then started the imaging science PhD program and interned at the Mind Research Network that summer. After a year of graduate classes at the Center, Khullar continued to work on his research at the Mind Research Network and is currently funded by a federal grant (National Institutes of Health, PI Calhoun).
During his first year at the Mind Research Network, Khullar has already published a journal article and presented his findings at three major medical imaging conferences. And, he says, another journal article is in the works.
“I am thrilled about our association with the Mind Research Network,” says Baum, who is also an astrophysicist. “These types of partnerships emphasize the interdisciplinary and collaborative nature of our work at the Center. The expertise that Khullar is developing as an imaging science student will help medical professionals’ understanding and diagnoses of schizophrenia and other mental disorders.”
An fMRI scan captures the level of blood flow in the brain over time, similar to capturing a movie of the brain, yet ordinarily the image sequence produced is extremely difficult to quantify. Khullar’s method identifies and quantifies regions of activity in the fMRI brain images in a way that allows clinicians to differentiate characteristics of healthy and schizophrenic patients.
“We have shown, through our published work, that our algorithm is better in terms of preserving vital information about neural activation patterns within the brain,” Khullar says.
Next steps? In his most recent research, Khullar is working on a new pre-processing methodology that allows clinicians to compare a patient’s brain activity at rest and while the patient performs a simple task, such as pressing a button in response to a noise. Khullar’s new technique uses observations of the patient’s resting-state brain activity to build an atlas of the patient’s brain function, which can then be used to align the data obtained when the patient performs a task. This functional alignment enables improved fusion of data when studying a group of individuals who suffer from the same neurological ailment, such as schizophrenia or even autism. The aim is then to use that fused data to better understand what is happening in the brains of individuals with specific neurological conditions. “My inherent goal is to make a broader impact on mankind,” Khullar says.
Figure: This difference image, computed from healthy controls (HC) and schizophrenia patients (SZ), shows regions in the temporal lobe that are relatively hyperactive in healthy controls (red) and in schizophrenia patients (blue). There is diminished activity in schizophrenia patients, probably a result of this neurodegenerative disease. These images were obtained using Khullar’s image de-noising technique in combination with other segmentation algorithms.
Imaging Science Grad Student Responds to Disaster Needs
A disaster, such as the devastating magnitude 9.0 earthquake and resulting tsunami that struck Japan this spring, can happen at any time and assessing damage proves difficult even when using aerial photography. Given the need to respond quickly when natural or man-made disasters occur, Chester F. Carlson Center for Imaging Science (CIS) graduate student Richard Labiak is developing a simple and quick tool that will supply emergency crews with rapid damage assessment within hours after a catastrophic event.
Jul. 19, 2011
Just days after Haiti experienced a magnitude 7.0 earthquake on January 12, 2010, scientists from CIS’s Digital Imaging and Remote Sensing Laboratory flew over the severely damaged Port-au-Prince region in a small plane equipped with a light detection and ranging (LiDAR) sensor. These airborne laser scanners can rapidly collect high-resolution, three-dimensional scans over large areas. The instrument, which emits up to 150,000 light pulses a second that hit a target and bounce back, is often used for surveying and elevation mapping. To assess damage without this technology, emergency crews rely on high-resolution photos taken from airplanes, a painstaking process that does not always accurately account for building heights and is limited by poor illumination, clouds, and smoke.
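The ranging principle behind LiDAR is simple enough to sketch: the sensor times each pulse's round trip, and half that travel time multiplied by the speed of light gives the distance to the surface. The function names and the nadir-pointing simplification below are illustrative; operational systems also correct for scan angle, aircraft attitude, and GPS position.

```python
C = 299_792_458.0  # speed of light, m/s

def pulse_range(round_trip_s):
    """Range to target: the pulse travels out and back, so halve the path."""
    return C * round_trip_s / 2.0

def target_elevation(sensor_alt_m, round_trip_s):
    """Surface elevation for a straight-down (nadir) pulse: a simplification,
    since real systems also account for scan angle and platform attitude."""
    return sensor_alt_m - pulse_range(round_trip_s)

# A return arriving ~6.67 microseconds after emission corresponds to a
# surface about 1 km below the aircraft.
rng_m = pulse_range(6.67e-6)
```

At 150,000 pulses per second, each timed this way and geolocated, a single pass yields the dense 3D point cloud from which Labiak's tool extracts building heights.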
When Labiak saw the LiDAR images, he realized he’d found an enormous, real-world opportunity to create an “interesting and relevant” master’s thesis using the DIRS’ dataset.
Labiak envisions that while flying above a disaster scene, scientists would collect the LiDAR data and then his processing tool would use the data to produce a damage assessment map for disaster management workers. “The ultimate goal is that an emergency will happen, we acquire airborne data, and then within a few hours, we are able to extract the relevant information, map it, and get it to people on the ground to show what is damaged and what is not,” says Labiak, who works with CIS Professor Jan van Aardt.
This year Labiak discovered the less romantic side of scientific research as he spent hours analyzing data that combined high-resolution Wildfire Airborne Sensor Program imagery collected simultaneously with LiDAR data. The grueling process of attempting to discern buildings from vegetation in one small section around Haiti’s National Palace, however, paid off. Labiak has come up with an important and useful tool that can provide a building map as well as an initial damage assessment. He presented his findings at the SPIE (International Society for Optics and Photonics) Defense, Security, and Sensing conference in April.
So far, the tool still requires a person to manipulate the data. “The tool is designed to be used in an operational setting, and hopefully will be helpful to disaster managers,” he says. “Eventually, the idea is to get it done pretty quickly and with as little human involvement as possible.”
Doctoral Student Takes Her Archeological Imaging Research On the Road
Oct. 7, 2011
Just back from a conference in Rio de Janeiro where she presented her research and received an international award, CIS doctoral student Kelly Canham is gearing up to pack her toothbrush, phrase book, and a spectroradiometer for a trip to Oaxaca, Mexico in December. There Canham will collaborate with William Middleton, associate professor of sociology and anthropology, to take ground-based spectral measurements to fill in some of the gaps left after analyzing the satellite data of the Nochixtlan Valley.
“The project in Nochixtlan will be for ground truthing various landscape taxa that Kelly has identified, and taking on-the-ground spectral measurements to better identify and interpret the satellite data,” Middleton says.
Canham found herself on her way to the Latin American GeoSpatial Forum in Brazil this August after winning Phase 2 of the DigitalGlobe 8-Band Research Challenge, one of five winners out of roughly 300 applicants worldwide. Canham proposed to apply an algorithm that she developed for examining hyperspectral data to DigitalGlobe’s multispectral WorldView-2 satellite data. The WorldView-2 satellite data include only eight spectral imaging bands, compared to the hundreds typical of hyperspectral data. The results Canham presented at the conference surpassed what was expected of a multispectral dataset.
Canham and David Messinger, Canham’s advisor and director of the Digital Imaging and Remote Sensing Laboratory, are developing image-processing tools to analyze hyperspectral satellite images so that Middleton can better understand the area of Oaxaca the Zapotec civilization once populated. Previously, archeologists had taken advantage of remote sensing imagery to identify individual sites by eye or to use a few spectral bands to find archaeological markers in the vegetation and terrain. The hyperspectral satellite data, obtained by the Hyperion sensor aboard NASA’s Earth Observing 1 satellite, includes many more spectral bands that form a data cube and allow researchers to map the area in more detail.
Canham’s algorithm allows her to automatically analyze and identify the spectral signature of the materials located within each pixel in hyperspectral, and now multispectral, satellite images. The algorithm allows Canham to hypothesize, for example, that a given pixel might contain a small fraction of a material with a prominent spectral signature, which might be indicative of limestone rock. A larger fraction of that pixel might have a weak spectral signature, which could indicate an asphalt road. However, Canham and Middleton cannot verify this hypothesis until they actually survey the area by land to determine if the spectral signatures obtained through the analysis indicate the presence of limestone and asphalt.
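The pixel-level idea can be illustrated with a generic linear-unmixing sketch (not Canham's actual algorithm): model each pixel's spectrum as a weighted sum of known endmember spectra, then solve for the weights, which estimate the fraction of each material present in that pixel. The endmember spectra below are random placeholders standing in for measured signatures such as limestone or asphalt.

```python
import numpy as np

# Generic linear spectral unmixing (an illustration, not Canham's method).
bands = 8                              # e.g. a WorldView-2-like band count
rng = np.random.default_rng(0)
# Columns are endmember spectra; placeholders for real material signatures.
endmembers = rng.random((bands, 2))

# Build a synthetic mixed pixel: 30% of material A, 70% of material B.
true_fracs = np.array([0.3, 0.7])
pixel = endmembers @ true_fracs

# Least squares recovers the per-material fractions from the pixel spectrum.
est, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
```

With a real spectral library in place of the random endmembers, the estimated fractions are the per-pixel material abundances that Canham's field measurements in Oaxaca are meant to verify.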
In order to take the measurements necessary to create a spectral library of the area, Canham will use the FieldSpec Pro spectroradiometer. She won the use of this instrument from the Alexander Goetz Instrument Program last spring. A CIS microgrant will fund Canham’s travel expenses to Mexico. (See http://www.rit.edu/news/story.php?id=48248)
The eventual goal is to create a viable land-use map to add to Middleton’s research on the Zapotec civilization. “Overall, what we can do with hyperspectral data will never replace archaeologists, but we can help them look for ‘candidate’ sites of interest,” Canham says. “Our goal is just to make their life a little easier and possibly, eventually, a bit cheaper than the old ground walking surveys.”
CIS Student Enhances Digital Reconstruction of Tumors: Presents Findings at San Diego Medical Imaging Conference
Biomedical researchers now have access to a more elegant method to digitally reconstruct microscopic tissue slices, or histological sections, of tumor specimens into three-dimensional models thanks to the work of Shaohui Sun.
Mar. 20, 2012
Sun—a Center for Imaging Science graduate student—presented his findings in February at the SPIE Medical Imaging Conference on Image Processing in San Diego.
CIS Professor Nathan Cahill discovered the problem in a conversation with Nzola de Magalhaes, an RIT Biomedical Engineering professor who studies tumor vascularization. Magalhaes wanted to find a way to stack successive histological sections of tumors in chicken embryos to eliminate the usual distortion and registration problems associated with digital reconstruction of these images.
Cahill, who is also a faculty member in the School of Mathematical Sciences, turned to his first-year graduate research assistant Sun to generate a mathematical algorithm to help solve the problem. “Shaohui spent a quarter learning about the limitations of prior techniques, developing the theory behind our new algorithm, implementing the new algorithm, and validating it on Nzola's data,” Cahill says.
Before he figured out the new algorithm, Sun says that Magalhaes used a much more laborious process of aligning the slices by hand. Previous techniques—extremely time consuming and difficult—produced only a coarse volumetric reconstruction of the tumors. Sun also needed to tackle the “aperture problem,” which is one of the main obstacles biomedical researchers face when attempting to digitally reconstruct three-dimensional specimens. When stacking tens or hundreds of these slices, the final result becomes twisted and distorted when examined next to the original specimen.
In the past, researchers have only looked at two successive images in the registration process. “I compared five to 10 images and then figured out the similarities between the slices. Once you have the matching features, you know what is in common and you can model a mathematical formula to solve the problem,” Sun says. Sun’s algorithm allowed him to compensate for rotational, scale, shear, and minute geometrical variations between the slices.
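Compensating for rotation, scale, and shear between matched slices, as Sun describes, corresponds to fitting an affine transform to pairs of matched feature points. The sketch below is a generic least-squares affine fit, not the algorithm from the paper; the point correspondences are synthetic stand-ins for the matched features Sun mentions.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2-D affine transform (rotation, scale, shear, translation)
    mapping matched points src -> dst; both are (N, 2) arrays, N >= 3."""
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src   # rows for the x-coordinate equations
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src   # rows for the y-coordinate equations
    A[1::2, 5] = 1.0
    b = dst.reshape(-1)  # interleaved [x0, y0, x1, y1, ...]
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)  # 2x3 affine matrix [linear part | translation]

def apply_affine(M, pts):
    return pts @ M[:, :2].T + M[:, 2]

# Toy check: recover a known rotation plus translation from four matched points.
theta = np.deg2rad(10)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src @ R.T + np.array([2.0, -1.0])
M = estimate_affine(src, dst)
```

With exact correspondences the fit recovers the transform exactly; with real feature matches, fitting over many slices at once, as Sun did, averages out noisy or mismatched points.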
Cahill encouraged Sun to see this practical problem through to the end. “When Shaohui was able to establish that the new algorithm seemed to work well on Nzola's data (and performed better than more basic approaches), I knew that he had a good topic to submit to an international conference,” Cahill says.
Cahill and Sun wrote an abstract and submitted it to the SPIE Medical Imaging Conference in the summer of 2011. It was accepted for an oral presentation and, with Magalhaes, they wrote the full paper to submit to the conference proceedings. Sun, who is first author on the paper, presented his research at the RIT Graduate Research Symposium and at a seminar hosted by RIT's Center for Applied and Computational Mathematics. “By the time he gave the conference presentation, he had practiced enough so that he was able to do a great job,” Cahill says.
The work has useful applications, most directly in research on tumor vascularization, but also in any field where histology is used, such as cell growth and development, cancer research, and identification of pathologies, Cahill says.
Sun, 27, from Qingdao, China, says he appreciated the opportunity to attend the San Diego conference. “It was a great opportunity for me to gain experience from academic communities such as SPIE for my professional development, and it will also enhance RIT's reputation in the field of medical imaging.”
Sun is currently working with CIS Professor Carl Salvaggio on a remote sensing study involving virtual three-dimensional building reconstruction of the RIT campus and downtown Rochester.