Nancy Grace Roman Space Telescope

Research and Support Participation Opportunities

Abstracts of Selected Proposals

This document provides the abstracts of proposals selected for funding within the Roman program. Principal Investigator (PI) name and institution, lead Co-I name and institution, proposal title, and category are also included. A total of 91 proposals were received in response to this opportunity; on July 5, 2023, 30 proposals were selected for funding.






Wide-Field Science

Exploiting Deep Learning to Improve Roman Photometric Redshifts
Galaxy Intrinsic Alignments for Cosmology with the Roman Space Telescope
Detecting Microhertz Gravitational Waves with Roman
Precursor Strong Lensing Science with Roman Towards Precision Cosmology
Roman Reference Fields and SNe Ia Calibration
Rubin Increases the Power of Roman
Machine Learn the Roman Universe
Kinematic Lensing with the Roman Space Telescope
DeepDISC-Roman
ROSALIA: Roman Sky Analyst for Low surface brightness Imaging & Astronomy
A Statistical Framework for Optimizing Roman Spectroscopic Training Sets
Asteroseismology Using The Galactic Bulge Time Domain Survey
Laying the Foundation for a Comprehensive View of Transiting Exoplanets with GBTDS
SPQR: Spectroscopic Probes of Quantitative Reionization
A new theoretical framework for globular cluster science with the Roman WFI
Enhancing the Roman Cosmology Program with Strongly Lensed Supernovae
Spots, Faculae, and Ages: The Promise of Rotation with Roman and Deep Learning
Roman Infrared Nearby Galaxies Survey




Exploiting Deep Learning to Improve Roman Photometric Redshifts

Wide-Field Science – Regular

Brett Andrews / University of Pittsburgh, PI

Photometric redshifts (photo-zs) will be a critical ingredient for studies of both galaxy evolution and cosmology with Roman. The Roman Science Operations Center will estimate photo-zs, but these will only use integrated photometry in Roman bands, disregarding the key color information in Rubin bands and morphological information that recent deep learning methods have unlocked. At low redshift (z < 0.2), algorithms that exploit resolved imaging of galaxies via deep learning methods have delivered much better photo-z performance than those that rely on integrated photometry alone. However, at higher redshifts the potential constraining power of deep, multi-band resolved imaging has been inaccessible because the angular sizes of galaxies are small compared to the ground-based seeing.

Roman will bypass this limitation by providing deep and well-resolved images of galaxies – even for those at z>1 – in multiple bands from the red-optical to the near-IR. We propose to apply bleeding-edge deep learning methods to existing Roman-like data sets with resolved imaging from Hubble to test how much improvement the incorporation of resolved imaging will yield for deep samples. This should be particularly effective for removing the degeneracies between low and high redshift that plague many photo-z algorithms.

Specifically, we propose to train a state-of-the-art type of deep neural network – a masked autoencoder – to distill color and morphology information from Roman-like images from CANDELS and 3D-DASH HST imaging into low-dimensional encodings. We will then test our ability to predict the high-precision multi-band photo-zs from COSMOS2020 and from spectroscopic redshifts using regression from the low-dimensional encodings and integrated Roman + Rubin-like broad-band photometry. We will compare the results to photo-zs estimated with methods that only use integrated Roman + Rubin-like broad-band photometry.
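As an illustration of the final step described above, the sketch below regresses redshift from a concatenation of low-dimensional image encodings and integrated broad-band photometry. It is only a schematic stand-in for the proposed approach: random tensors replace the masked-autoencoder encodings and the Roman + Rubin photometry, and the network sizes are invented for illustration.

```python
# Minimal sketch (not the proposed pipeline): regress redshift from a concatenation of
# low-dimensional image encodings (here random stand-ins for masked-autoencoder outputs)
# and integrated broad-band photometry. All dimensions and names are illustrative.
import torch
import torch.nn as nn

N_GAL, ENC_DIM, N_BANDS = 1000, 16, 10     # assumed sizes: galaxies, encoding dim, Roman+Rubin bands

encodings = torch.randn(N_GAL, ENC_DIM)    # stand-in for masked-autoencoder encodings
photometry = torch.randn(N_GAL, N_BANDS)   # stand-in for integrated magnitudes/colors
z_true = torch.rand(N_GAL, 1) * 3.0        # stand-in for COSMOS2020 / spectroscopic redshifts

regressor = nn.Sequential(
    nn.Linear(ENC_DIM + N_BANDS, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 1),                      # point estimate of redshift
)

optimizer = torch.optim.Adam(regressor.parameters(), lr=1e-3)
features = torch.cat([encodings, photometry], dim=1)
for epoch in range(200):
    z_pred = regressor(features)
    loss = nn.functional.mse_loss(z_pred, z_true)   # could be replaced by a density/NLL loss
    optimizer.zero_grad(); loss.backward(); optimizer.step()

# Baseline comparison: train the same architecture on photometry alone
# (drop the encodings) and compare scatter, bias, and outlier rates.
```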

The strikingly distinct morphologies of low-z and high-z galaxies, as well as the radically different resolved morphologies observed in bands above versus below the 4000Å break, should be instrumental in breaking the photo-z degeneracies that beset photometry-only methods. Based upon the performance of deep learning-based photo-z methods with resolved SDSS imaging in past work, we expect that the more modern techniques we plan to employ should produce the most accurate Roman photo-zs across a broad range of magnitude-redshift-color space. We propose to develop and test the algorithms needed to realize that potential with existing HST optical and NIR imaging. Should our method prove valuable, additional time and effort (beyond the scope of this proposal) will be required to implement it at scale on Roman data (in coordination with the Roman Science Operations Center) early in the mission’s lifetime for maximum benefit, highlighting the urgency of our development effort.

We will share the resulting code, produce thorough documentation (including lessons learned), and release the new CANDELS photo-z catalogs with the community. The resulting photometric redshift catalogs could significantly enhance the legacy value of the CANDELS data sets as a free byproduct of our work to develop new methodologies for Roman.

Galaxy Intrinsic Alignments for Cosmology with the Roman Space Telescope

Wide-Field Science – Regular

Jonathan Blazek / Northeastern University, PI

The Nancy Grace Roman Space Telescope brings exciting new capabilities for cosmology. The High-Latitude Wide-Area Survey on the Wide-Field Instrument will provide the positions and shapes of hundreds of millions of galaxies, and over ten million galaxy spectra, from which analysis of weak gravitational lensing and galaxy clustering will form a main pillar of cosmology studies.

Despite its tremendous statistical power, the ultimate success of Roman will depend on our understanding of a range of astrophysical, observational, and instrumental effects. With current cosmology projects, we have already reached the point where cosmological constraints are determined by these systematic uncertainties. While observing from space reduces several challenges induced by the atmosphere, the impact of astrophysical effects is unavoidable. Of particular interest to this proposal are galaxy “intrinsic alignments” (IA), correlations in the true shapes of galaxies as influenced by their local environment and formation. Galaxy IA induces observed correlations between galaxy shapes that mimic the weak lensing signal we are trying to measure. IA is one of the most significant astrophysical systematics for Roman and other future cosmology projects – without properly accounting for it, analyses can be significantly biased. Conversely, if sufficiently understood, IA can provide a new probe of cosmology and astrophysics.

The constraining power of Roman makes it critical that IA and other astrophysical effects are understood to a high degree of accuracy. At the same time, Roman’s depth means that we are probing galaxies that are fainter and more distant than those reached in current surveys, and we are thus more reliant on other approaches to predict their behavior. This proposal aims to address this challenge, using a combination of observational data, simulations, and analytic modeling to achieve three key objectives. First, we will combine current measurements to predict the plausible range of IA scenarios and the impact on Roman cosmology. These results will allow us to determine optimal modeling strategies. Second, we will provide simulation tools and proof-of-concept simulated Roman galaxy catalogs with realistic IA, to be used for pipeline validation and as an input for community-wide simulation efforts. Third, we will determine how to utilize IA as a new physical probe with Roman data.

The methods employed in this proposed research include analysis of existing observational data, processing of numerical simulations, and analytic modeling. Forecasts will be carried out as realistic analyses of synthetic Roman data vectors. The simulation component will utilize “gravity-only” simulations, avoiding several challenges that arise when using hydrodynamics. We will refine and employ a method that is able to populate simulated dark matter structure with realistic galaxies, including alignments. These simulated galaxies will improve our understanding of galaxy IA and their impact on Roman analyses. We will also use them to develop a simulation-based model of IA and to improve and test analytic modeling in the context of Roman. Finally, these simulation and modeling tools will be used to develop new analyses for Roman that will utilize observed galaxy shapes to probe both astrophysics (e.g. galaxy formation) and fundamental physics (e.g. inflation and dark matter interactions).
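To make the idea of "populating gravity-only structure with aligned galaxies" concrete, the toy sketch below computes the tidal tensor of a density field with FFTs and orients test-galaxy ellipticities along it, in the spirit of linear (tidal) alignment models. The Gaussian random field, grid, and alignment amplitude are placeholders; this is not the team's method.

```python
# Toy sketch of tidal-field-based intrinsic alignments: build the tidal tensor of a
# density field with FFTs and align test-galaxy ellipticities with its local values.
# The Gaussian random field and all amplitudes are placeholders, not the proposed model.
import numpy as np

n, box = 64, 100.0                                   # grid cells per side, box size [Mpc/h]
rng = np.random.default_rng(42)
delta = rng.normal(size=(n, n, n))                   # stand-in for a simulated density field

k = 2 * np.pi * np.fft.fftfreq(n, d=box / n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = 1.0                                    # avoid division by zero at k = 0
delta_k = np.fft.fftn(delta)

def tidal_component(ki, kj):
    """Real-space tidal tensor component s_ij from (k_i k_j / k^2 - delta_ij / 3) * delta_k."""
    kron = 1.0 if ki is kj else 0.0
    return np.fft.ifftn((ki * kj / k2 - kron / 3.0) * delta_k).real

s_xx, s_yy, s_xy = tidal_component(kx, kx), tidal_component(ky, ky), tidal_component(kx, ky)

# Linear-alignment-style ellipticities for galaxies at random grid sites:
# e1, e2 proportional to the traceless tidal field in the plane of the "sky" (x-y).
amp = 0.1                                            # placeholder alignment amplitude
ix, iy, iz = rng.integers(0, n, size=(3, 5000))
e1 = -amp * (s_xx[ix, iy, iz] - s_yy[ix, iy, iz])
e2 = -2 * amp * s_xy[ix, iy, iz]
print("rms intrinsic ellipticity from alignment:", np.sqrt(np.mean(e1**2 + e2**2)))
```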

This proposal is highly responsive to the criteria of this WFS opportunity and to NASA science goals more broadly. It will enable core Roman cosmology analyses through better understanding and treatment of IA, including guidance to the relevant Project Infrastructure Team(s). It will expand the potential scientific impact of WFI data, including opportunities to learn about both astrophysics and fundamental physics of the Universe. It will provide simulation products and software that enable projects and collaboration across the Roman science community.

Detecting Microhertz Gravitational Waves with the Nancy Grace Roman Space Telescope

Wide-Field Science – Regular

Tzu-Ching Chang / Jet Propulsion Laboratory, PI

Gravitational waves (GWs) offer a new avenue for observing our Universe. So far, we have seen them in the ~10-100 Hz range, and there are hints that we might soon detect them in the nanohertz regime. Multiple efforts are underway to access GWs across the frequency spectrum; however, parts of the frequency space are currently not covered by any planned or future observatories. Our recent work has shown that photometric surveys can bridge the microhertz gap in the spectrum between LISA and Pulsar Timing Arrays (PTAs) through relative astrometric measurements. Similar to PTA measurements, these astrometric measurements rely on the correlated spacetime distortions produced by gravitational waves at Earth, which induce coherent, apparent stellar position changes on the sky. To detect microhertz GWs with an imaging survey, a combination of high relative astrometric precision, a large number of observed stars, and a high cadence of exposures is needed. The Roman Galactic Bulge Time Domain Survey (RGBTDS) would have all of these components. Our program seeks to simulate relevant data and explore survey designs for the Roman mission to better estimate the sensitivity of Roman to gravitational waves from supermassive black holes in the microhertz regime.

Using analytic estimates, we calculated that the RGBTDS is sensitive to GWs with frequencies ranging from 7.7 × 10⁻⁸ Hz to 5.6 × 10⁻⁴ Hz, which opens up a unique GW observing window for supermassive black hole (SMBH) binaries and their waveform evolution. While the detection threshold assuming the currently expected performance proves too high for detecting individual GWs given the expected SMBH binary population distribution, we showed that Roman would still be sensitive to the stochastic gravitational-wave background (GWB) with an estimated signal-to-noise ratio (SNR) ~ 1 and would set interesting limits on the SMBH population and its evolution. If the mean astrometric deflection, which is normally lost because guide stars serve as the reference for the spacecraft pointing solution, could be recovered, a factor of ~100 improvement in sensitivity could be expected. We will investigate this exciting prospect, which would allow confident detection of SMBHs with a chirp mass Mc > 10⁷ M☉ out to 50 Mpc, and detection of the stochastic GWB with an estimated SNR ~ 70.
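For orientation, the short calculation below evaluates the standard leading-order strain of a circular SMBH binary and the rough size of the astrometric deflection it induces (of order h/2). The chirp mass and distance follow the example quoted above, the frequency is an arbitrary value inside the quoted band, and the scaling is only approximate, not the full sensitivity analysis.

```python
# Order-of-magnitude sketch: strain of a circular SMBH binary and the rough size of the
# astrometric deflection it induces (delta_theta ~ h/2). Chirp mass and distance follow the
# abstract's example (1e7 Msun at 50 Mpc); the frequency and scaling are only illustrative.
import numpy as np
from astropy import constants as const, units as u

M_c = 1e7 * u.Msun          # chirp mass from the abstract's detection example
d_L = 50 * u.Mpc            # luminosity distance
f_gw = 1e-6 * u.Hz          # a microhertz GW frequency within Roman's projected band

h = (4 * (const.G * M_c) ** (5 / 3) * (np.pi * f_gw) ** (2 / 3)
     / (const.c ** 4 * d_L)).decompose()
deflection = (h / 2) * u.rad    # characteristic apparent position shift

print(f"strain h ~ {h:.2e}")
print(f"astrometric deflection ~ {deflection.to(u.uas):.2e}")
```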

In this proposal, working with mission and RGBTDS experts, we propose to study and simulate several key aspects of the RGBTDS, including the recovery of the mean astrometric deflection, and to study how modifications to the RGBTDS design could improve the sensitivity to GWs. Relying on these estimates, we will also develop an optimal statistic tailored to Roman to detect spatially and temporally coherent modulations and stochastic gravitational wave background modulations (e.g., from the superposition of many SMBHBs) in the RGBTDS data. Our program will guide the future development of the full simulation and analysis pipeline that will be required to make Roman a space-based GW detector.

Preparing for a leap: Precursor Strong Lensing Science with Roman Towards Precision Cosmology

Wide-Field Science – Regular

Tansu Daylan / Washington University, PI

The Roman Telescope will offer a unique opportunity to study strong lenses across cosmic time, enabling thorough investigations of the substructure and microphysics of dark matter. For the first time, the discovery images will have the required depth and resolution to perform high-precision measurements. What the Hubble Space Telescope could do on a limited selected sample, Roman will be able to achieve over the entirety of planned surveys. There is currently no realistic set of simulated strong lenses for Roman, a gap that could undermine progress towards extracting the maximal cosmological return from Roman. Therefore, a set of realistic strong-lens simulations must be produced in time to train and test the strong-lensing pipelines that will process the Roman data.

We propose to generate a comprehensive suite of simulated Roman images of realistic strong lenses with substructure, organize a data challenge for the research community to test their detection and characterization pipelines on these data, and develop robust retrieval pipelines to determine the selection functions on the population-level properties of dark matter substructure indicative of its microphysics.

We will produce a detailed and realistic lens population over cosmic time and then generate synthetic Roman data that can be used for training, validation, or survey planning. We will include in our simulations both galaxy-galaxy-type lenses that are optimal for inferring substructure and quasar lenses that can enable other science cases such as time-delay tomography. Using our multiband simulations, we will enable precise determinations of the expected yields of static galaxy-galaxy strong-lenses, strongly-lensed quasars, and supernovae, as well as their population properties. Toward this goal, we will develop a simulation pipeline that accurately incorporates up-to-date detector characteristics of the Wide Field Instrument (WFI) on Roman and a population model for strong lenses informed by the Hubble Space Telescope and DES. Thus, we will help prepare Roman for the operational phase of the mission and enhance the science return of the WFI. Our work will be a Regular Wide Field Science (WFS) effort over two years, from September 2023 to September 2025.
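As a minimal illustration of the ray-tracing step at the core of any strong-lens image simulator, the sketch below deflects a pixel grid through a singular isothermal sphere and samples a Gaussian source in the source plane. The PSF, noise, substructure, and the actual WFI detector characteristics described above are omitted, and all parameter values are placeholders.

```python
# Toy strong-lens image: ray-trace a pixel grid through a singular isothermal sphere (SIS)
# and evaluate a Gaussian source in the source plane. PSF, noise, substructure, and real
# WFI detector characteristics are omitted; all parameter values are placeholders.
import numpy as np

npix, scale = 200, 0.011           # pixels per side; arcsec per pixel (placeholder WFI-like scale)
theta_E = 0.9                      # Einstein radius of the lens [arcsec]
beta_x, beta_y, sigma_src = 0.08, 0.02, 0.05   # source position and size [arcsec]

half = npix * scale / 2
x = np.linspace(-half, half, npix)
tx, ty = np.meshgrid(x, x, indexing="xy")      # image-plane angles [arcsec]

r = np.hypot(tx, ty)
r[r == 0] = 1e-12                              # avoid division by zero at the lens center
alpha_x, alpha_y = theta_E * tx / r, theta_E * ty / r   # SIS deflection field

# Lens equation: source-plane position beta = theta - alpha(theta)
bx, by = tx - alpha_x, ty - alpha_y
image = np.exp(-((bx - beta_x) ** 2 + (by - beta_y) ** 2) / (2 * sigma_src ** 2))

print("total lensed flux (arbitrary units):", image.sum() * scale**2)
```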

The primary merit of our proposed effort is to facilitate the timely simulation and survey planning to enable optimal detection and characterization of the strong lenses following the launch of Roman. In particular, the early timing of our precursor effort is crucial to deliver a strong-lens simulation pipeline for Roman on time so that subsequent efforts from 2025 to launch can use the simulation pipeline for yield simulations and data reduction pipelines. Accordingly, our proposed effort is especially suited as a precursor effort ~3-4 years before launch.

Our work is especially relevant as a regular WFS effort in this cycle since our simulations will open the path towards designing strong lens surveys with Roman and optimizing other community surveys for ancillary strong lensing science. Therefore, it is a uniquely suited and timely precursor science effort to be optimally completed ~2 years before the launch of Roman.

Establishing SI-traceable Standard Reference Fields and Low Redshift Type Ia Supernova Calibration for Roman Wide Field Science

Wide-Field Science – Regular

Susana Deustua / National Institute of Standards & Technology, Co-PI

Establishing Infrared Flux Standards for Roman Wide Field Instrument Science

The Nancy Grace Roman Space Telescope (Roman) is NASA’s next large flagship mission scheduled for launch by 2027. Roman’s Wide Field Instrument (WFI) will have a large field of view (0.28 sq deg), providing Hubble-like sensitivity and resolution in the infrared and enabling transformational investigations in cosmology, exoplanet science, and general astrophysics. The core community surveys include a High Latitude Wide Area survey, a High Latitude Time Domain survey, and a Galactic Bulge Time Domain survey.

In order to meet Roman’s dark energy goals, the imaging component of the High Latitude Wide Area Survey is required to enable color measurements accurate to better than 0.5%, photometry precise to 0.3% over an 11 mag range in brightness for thousands of Type Ia supernovae, and accurate photometric redshifts for millions of galaxies. These observations will be used to measure the accelerated expansion of the universe and to better constrain the nature of dark energy. Further, synergies with ground-based programs like the LSST at the Rubin Observatory and with space-based observatories like Euclid will also require high fidelity for accurate cross-mission calibration.

At present the principal limitation to achieving these requirements is the paucity of standard stars that are both well characterized and have NIST-traceable spectral energy distributions in the critical brightness range between about V = 14 and 20 mag. We propose to use a combination of ground-based and space-based assets to conduct a multi-year campaign to establish several hundred flux standards, with an accuracy of a few millimags, that will help Roman meet its demanding photometric requirements that enable transformational science.

Observatory Microlensing and Binary Self Lensing with Roman and with Rubin

Wide-Field Science – Regular

Rosanne Di Stefano / Harvard-Smithsonian Center for Astrophysics, PI

One of the primary goals of the Nancy Grace Roman Space Telescope mission is to establish the population properties of planetary systems in the Galactic Bulge. Its Galactic Bulge Time Domain Survey will be keenly sensitive to microlensing events. With its 15-minute cadence, it is particularly well suited to discover short events that could be free-floating planets, and also the short-term subtle deviations in a stellar-lensing light curve that are caused by planets. Predictions have been made that Roman will discover ~1400 planets via microlensing.

The Rubin Observatory’s Legacy Survey of Space and Time (LSST) is presently slated to cover the Roman field. It will start observations roughly a year earlier, observe the field in between Roman’s 72-day observation intervals, and continue for several years after Roman concludes. Our team, experts in microlensing and members of the Rubin microlensing group, proposes work that will use the Roman/Rubin Synergy to enable Roman to achieve its bold microlensing goals.

For long events, whether caused by black hole lenses or slow-moving stars, Rubin can play a crucial role in making sure we identify lensing events that do not start and complete within Roman’s observing window. We have been deeply engaged in developing event detection within LSST data, and will be working to identify the roughly 1200 Rubin-detectable events that are expected each year in the 2 square degree Roman field. Many Rubin-detected events that can be detected by Roman, even near baseline, will still be active when Roman starts observations. This applies not only to black-hole lenses. In fact, the events that can be detected by both observatories will be dominated by events with durations of weeks or months.

To increase the numbers of lensing events Roman will identify, and to increase the efficiency for the extraction of correct system parameters, we have devised a two-part plan.

First, we will conduct a sequence of light curve simulations in which we use, sequentially, a variety of Roman and Rubin sampling strategies in the Roman field. Our analyses of the simulated data will help us to identify optimal observing strategies, and should shape the approach both observatories take in the Roman Bulge field.
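As a toy version of the kind of sampling experiment described above, the sketch below evaluates a standard point-source point-lens magnification curve with a Roman-like 15-minute cadence inside 72-day seasons and a sparse Rubin-like cadence over the rest of the year. Season start times, cadences, and event parameters are all placeholders rather than the simulation setup the team will adopt.

```python
# Toy cadence experiment: sample a point-source point-lens (PSPL) microlensing light curve
# with a Roman-like 15-minute cadence inside 72-day seasons and a sparse Rubin-like cadence
# outside them. Season timing, cadences, and event parameters are placeholders.
import numpy as np

def pspl_magnification(t, t0, u0, tE):
    """Standard PSPL magnification A(u) with u(t) = sqrt(u0^2 + ((t - t0)/tE)^2)."""
    u = np.hypot(u0, (t - t0) / tE)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

t0, u0, tE = 260.0, 0.1, 45.0            # event peak [days], impact parameter, Einstein time [days]

# Roman: two 72-day seasons at 15-minute cadence (placeholder season start times)
roman_t = np.concatenate([np.arange(s, s + 72, 15 / (60 * 24)) for s in (0.0, 183.0)])
# Rubin: one visit every ~3 days over the same year (placeholder cadence)
rubin_t = np.arange(0.0, 365.0, 3.0)

roman_A = pspl_magnification(roman_t, t0, u0, tE)
rubin_A = pspl_magnification(rubin_t, t0, u0, tE)
print("Roman epochs with A > 1.34 (u < 1):", int(np.sum(roman_A > 1.34)))
print("Rubin epochs with A > 1.34 (u < 1):", int(np.sum(rubin_A > 1.34)))
```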

Second, we will create a Roman Input Microlensing Catalog for the Galactic Bulge. The entries in the catalog will be the event coordinates and the full multi-band LSST light curves for each lensing event candidate. We will also create a complementary catalog of microlensing impostors: events that could easily be confused with microlensing, but which we have found to be of a different nature. With these catalogs in hand (or rather, downloaded from IPAC), Roman will start observations knowing the locations and prior histories of, potentially, hundreds of events. This will increase Roman’s early science return.

The science return will also be significantly increased by another aspect of the Roman/Rubin Synergy: the complementarity of their spectral coverage. Rubin’s multiband (up to six filters) coverage will provide spectral information about the lensed source and other objects along the line of sight.

From a physical perspective, the importance of this work is not simply an increase in the number of events Roman discovers. Our work will play crucial roles in allowing Roman to correctly characterize the events and, eventually, the full population responsible for planetary microlensing in the Bulge. If our work conducted under the aegis of this program is successful, as we expect it to be, we will take advantage of future funding opportunities to transform the catalog into a dynamic resource that is continuously updated with the combination of Roman and Rubin microlensing data.

Our team is uniquely qualified to conduct this research, which should be started as soon as possible if Roman is to derive the full scientific advantage of the Roman/Rubin microlensing synergy.

Machine Learn the Roman Universe

Wide-Field Science – Regular

Shirley Ho / New York University, PI

The Roman Space Telescope (Roman) with its large-scale structure (LSS) survey will provide us with data of unprecedented information content to elucidate fundamental questions about our Universe, such as its origins, content, and its future. Progress in any one of these directions could constitute a groundbreaking discovery in physical cosmology. However, nonlinear gravitational evolution makes extracting the pertinent information with traditional methods challenging, as none of the current methods deployed by LSS surveys are able to extract the full information content of the Universe.

To address this challenge, we propose to develop three Machine Learning (ML) based methods to learn the information in the data and determine the cosmological parameters and initial conditions of the Universe. The proposed methods have the potential to optimally (information theoretically) extract information from the Roman LSS data. The first method is based on a Bayesian statistical inference framework, where one first reconstructs the initial conditions and uses that information to learn the data likelihood. The second method is based on unsupervised learning, where we learn the data likelihood as a function of cosmological parameters via a Normalizing Flow. The third method is based on diffusion models, which generate posterior samples of the initial conditions and properties of the Universe from non-linear large-scale structure using score-based generative models.
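To illustrate the second approach, the sketch below implements a minimal conditional normalizing flow in PyTorch: an affine-coupling flow that learns a data likelihood p(x | theta) for toy two-dimensional summary statistics conditioned on two cosmological parameters. The architecture, data, and training loop are placeholders, not the proposed method's implementation.

```python
# Minimal sketch (toy 2-D summary statistic x conditioned on two cosmological parameters theta).
# Illustrates learning a conditional data likelihood p(x | theta) with an affine-coupling
# normalizing flow; this is not the team's actual pipeline.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One coupling layer: transform half of x conditioned on the other half and theta."""
    def __init__(self, dim_x=2, dim_theta=2, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        self.net = nn.Sequential(
            nn.Linear(dim_x // 2 + dim_theta, hidden), nn.ReLU(),
            nn.Linear(hidden, dim_x),      # outputs scale and shift for the other half
        )

    def forward(self, x, theta):
        x1, x2 = x.chunk(2, dim=-1)
        if self.flip:
            x1, x2 = x2, x1
        s, t = self.net(torch.cat([x1, theta], dim=-1)).chunk(2, dim=-1)
        s = torch.tanh(s)                  # keep the Jacobian well conditioned
        z2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=-1)
        z = torch.cat([x1, z2], dim=-1) if not self.flip else torch.cat([z2, x1], dim=-1)
        return z, log_det

class ConditionalFlow(nn.Module):
    def __init__(self, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [AffineCoupling(flip=(i % 2 == 1)) for i in range(n_layers)])

    def log_prob(self, x, theta):
        log_det_total = torch.zeros(x.shape[0])
        z = x
        for layer in self.layers:
            z, log_det = layer(z, theta)
            log_det_total = log_det_total + log_det
        base = torch.distributions.Normal(0.0, 1.0)
        return base.log_prob(z).sum(dim=-1) + log_det_total

# Training on mock (x, theta) pairs drawn from simulations (random stand-ins here):
flow = ConditionalFlow()
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
x_mock = torch.randn(512, 2)        # stand-in for simulated summary statistics
theta_mock = torch.rand(512, 2)     # stand-in for cosmological parameters
for _ in range(200):
    loss = -flow.log_prob(x_mock, theta_mock).mean()   # maximize the conditional likelihood
    opt.zero_grad(); loss.backward(); opt.step()
```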

We will pay special attention to robustness of all of the methods against systematic errors and astrophysical effects, leveraging astrophysical nuisance parameters that can be marginalized over, and utilizing scale separation information. An important contribution of this proposal is the generation of mock survey datasets via deep-learning accelerated simulations of the galaxy surveys. They will also serve as a testbed for our ML methods. As an example use case, we will apply these tools to the problem of extracting information about the initial conditions of the universe via primordial non-Gaussianity from space-based galaxy survey data.

A second major goal of the proposal is to develop a community framework within which different ML methods can be tested and compared. We will create deep-learning accelerated simulated datasets with survey realism that can be used for benchmarking and for blind analyses of different methods using realistic computational simulations. We will promote open access ML tools by releasing both the software and simulated datasets into the public domain, and by providing community support for these products. We will encourage community engagement through data challenges.

Results of this study will provide new ML methods that promise to considerably improve on the information extracted by existing methods of LSS analysis, which could unlock an expanded potential not only for Roman but also for other space-based LSS missions to illuminate the fundamental physics of the Universe.

Kinematic Lensing with the Roman Space Telescope

Wide-Field Science – Large

Elisabeth Krause / University of Arizona, PI

Weak gravitational lensing (WL) is one of the core probes of the Nancy Grace Roman Space Telescope to study multiple high-profile NASA science goals such as the origin and composition of the Universe and the processes of structure formation and galaxy evolution.

WL, however, is a very challenging measurement. Most importantly, the fact that the intrinsic shape of each lensed galaxy is unknown results in large statistical uncertainties in WL shear measurements. This so-called shape noise dilutes the desired measurement of the shear effect and, as a consequence, traditional WL requires a large ensemble of galaxies to boost the signal-to-noise of the shear signal. Unsurprisingly, this implies the inclusion of a substantial sample of faint galaxies that are affected by systematic uncertainties in shape and redshift measurement algorithms; controlling these systematics perfectly is impossible, and the residuals can limit the constraining power of Roman cosmology.

Kinematic Lensing (KL) combines imaging and spectroscopic data into a new type of lensing inference. This reduces the shape noise uncertainty haunting WL by more than an order of magnitude. Further, the KL estimator automatically bypasses one of the most severe astrophysical systematics of weak lensing, so-called intrinsic galaxy alignments. The need for spectroscopic information, however, implies that the size of our KL galaxy sample will be smaller compared to that of standard WL. Nevertheless, the smaller KL galaxy sample will be significantly more robust to the two main observational systematics that are haunting standard WL galaxies. Firstly, the spectroscopic information renders redshift uncertainties obsolete; secondly, we can select large, bright galaxies, for which shape measurement uncertainties are well-controlled. Our team has run initial forecasts using the Roman spectroscopic sample as a KL galaxy sample, and we find that the KL constraining power on dark energy equation of state parameters is increased significantly over that of standard WL.
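The gain from reducing shape noise can be made concrete with a one-line scaling: the shear uncertainty of an ensemble goes as the per-galaxy ellipticity dispersion divided by the square root of the sample size. The dispersion values below are illustrative numbers (roughly 0.26 for standard WL and a few percent for KL, i.e. the "more than an order of magnitude" reduction quoted above), not the proposal's adopted figures.

```python
# Back-of-the-envelope scaling: shear noise per ensemble goes as sigma_e / sqrt(N), so
# matching the statistical power of standard WL requires N_KL = N_WL * (sigma_KL / sigma_WL)**2.
# The dispersion values below are illustrative, not the proposal's adopted numbers.
sigma_wl = 0.26     # typical per-galaxy intrinsic ellipticity dispersion (standard WL)
sigma_kl = 0.025    # order-of-magnitude smaller effective dispersion with kinematic lensing

sample_ratio = (sigma_wl / sigma_kl) ** 2
print(f"Each KL galaxy is statistically worth ~{sample_ratio:.0f} standard WL galaxies")
```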

In this proposal we plan to develop a full KL inference pipeline that can ingest imaging and spectroscopic data from Roman and produce a corresponding KL shape catalog. We will also build the software infrastructure for the cosmological interpretation of the extracted KL signal, and we will use this infrastructure to create precision forecasts of KL science performance as a function of Roman survey strategy and systematics control.

DeepDISC-Roman: Detection, Instance Segmentation, and Classification for Roman with Deep Learning

Wide-Field Science – Regular

Xin Liu / University of Illinois – Urbana-Champaign, PI

The Nancy Grace Roman Space Telescope will deliver deep, high-quality images in the near infrared for precision cosmology and beyond. As both the sensitivity and depth increase, larger numbers of blended (overlapping) sources will occur. If left unaccounted for, blending would result in biased measurements of sources that are assumed to be isolated, contaminating key inferences such as photometry, photometric redshift, galaxy morphology, and weak gravitational lensing. In the Roman era, efficient deblending techniques are a necessity and thus have been recognized as a high priority. However, an efficient and robust deblending method that meets the demands of next-generation deep-wide surveys is still lacking.

Leveraging the rapidly-developing field of computer vision, the open-source DeepDISC-Roman will provide a new versatile deep learning framework for the Roman research community. It makes it easy to efficiently process Roman images and accurately identify blended galaxies with the lowest latency to maximize science returns. The approach is interdisciplinary and fundamentally different from traditional methods, combining state-of-the-art survey data with the latest deep learning tools. A unique feature of DeepDISC-Roman is the robust quantification of the uncertainty of the prediction, which can be then propagated into the final error budget for precision cosmology. Because of limitations with the previous deep learning applications, a new framework is under development, leveraging Detectron2 – Facebook AI Research’s next-generation open-source platform for object detection and segmentation. This program will support transforming DeepDISC-Roman from a proof-of-concept pilot study to a fully featured, developed and science-ready platform for astronomical object detection, instance segmentation, classification, and beyond. Leveraging existing software infrastructure and production-ready packages, DeepDISC-Roman will be trained and validated using a hybrid of real data and more realistic simulations by combining traditional image simulations with deep generative models. DeepDISC-Roman can be applied to many other higher-level downstream science applications such as photometric redshift estimation and galaxy morphology inferences. It will combine Roman with Rubin and Euclid to leverage the wider optical-to-near-infrared coverage to improve the reliability of photometric redshifts and to facilitate the deblending of Rubin ground-based images. The program has strong implications for a wide range of subjects, from efficiently detecting transients and solar system objects to the nature of dark matter and dark energy.
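For readers unfamiliar with the underlying platform, the snippet below shows the standard Detectron2 quick-start pattern for running a pre-trained Mask R-CNN on an image cutout. It is shown only to indicate the kind of framework DeepDISC-Roman extends; the astronomical data loaders, training on Roman-like simulations, and uncertainty quantification described above are not part of this snippet, and the config name is a stock model-zoo entry rather than a project file.

```python
# Standard Detectron2 usage pattern (pre-trained COCO Mask R-CNN on an image cutout),
# shown only as the kind of starting point DeepDISC-Roman builds on. This is not the
# DeepDISC training or calibration pipeline; the config name is a stock model-zoo entry.
import numpy as np
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.3   # keep lower-confidence detections for blends
cfg.MODEL.DEVICE = "cpu"                      # or "cuda" if a GPU is available

predictor = DefaultPredictor(cfg)
cutout = (np.random.rand(256, 256, 3) * 255).astype("uint8")  # stand-in for a scaled image cutout
outputs = predictor(cutout)                   # expects an HxWx3 uint8 BGR array
instances = outputs["instances"].to("cpu")
print("detections:", len(instances))          # per-object masks live in instances.pred_masks
```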

DeepDISC-Roman will deliver several key scientific, software, and data products to maximize Roman science. It will produce a versatile deep-learning framework which can be integrated into the analysis software provided by the Roman Science Centers. It addresses the Roman research and support participation program through “Development of Roman analysis software beyond that provided by the Science Centers” and “Development of algorithms for joint processing with data from other space- or ground-based observatories such as deblending algorithms, photometric redshift training and calibration, or forced photometry”. DeepDISC-Roman will complement and augment activities of the Roman Science Centers but does not overlap/duplicate them. The proposed work should be performed now, rather than closer to launch or post-lauch, because deblending is fundamental to the Roman data analysis infrastructure and critical for enabling many downstream applications. As part of the proposed work, the program will train undergraduate students through the Students Pushing Innovation at the National Center for Supercomputing Applications. Finally, the program will develop a diverse and inclusive scientific workforce and clearly defines roles and responsibilities for all team members toward pursuing those goals.

ROSALIA: Roman Sky Analyst for Low surface brightness Imaging & Astronomy

Wide-Field Science – Large

Pamela Marcum / NASA – Ames Research Center, PI

Deep imaging is the next frontier for many studies in galaxy evolution and cosmology, providing unprecedented views of ultra-low surface brightness realms of the universe that are thousands of times dimmer than the sky background, including stellar galactic halo structure, intracluster light, and the traces of galaxy assembly (tidal tails, stellar streams, shells, faint satellites). Detector sensitivity is a double-edged sword: while enhancements facilitate detection, they also result in a dramatic rise in systematic biases such as flat-fielding residuals, scattered light from bright objects in the field of view, and loss of extended sources due to sky over-subtraction. The objective of this proposal is to develop, test, and implement a complete suite of low surface brightness processing tools for the analysis of space imaging observations that will minimize the undesirable gradients in images caused by these effects. The proposed application of these tools is the Wide Field Instrument (WFI) of the Nancy Grace Roman Space Telescope. These methods will improve the photometric calibration and maximize the quality of the data product, enabling a new range of science objectives beyond the original mission. The proposed pipeline will be developed and tested with a set of end-to-end simulations of Roman/WFI mosaics based on cosmological simulations that include the complexity of high-z objects and local universe objects. We will implement these tools as additional modules to support the astronomical community’s observations with the Nancy Grace Roman Space Telescope.
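One ingredient of such a pipeline, sky modeling that does not swallow faint extended light, can be sketched as follows: mask detected sources, then fit a very coarse two-dimensional background so that real extended structure is not absorbed into the sky model. The thresholds, box sizes, and synthetic image below are placeholders, and the proposed ROSALIA modules will go well beyond this simple step.

```python
# Minimal low-surface-brightness-friendly sky model: mask detected sources, then fit a very
# coarse 2-D background so extended faint light is not over-subtracted. Thresholds, box sizes,
# and the synthetic image are placeholders; the proposed ROSALIA modules go far beyond this.
import numpy as np
from astropy.convolution import Gaussian2DKernel, convolve_fft
from photutils.background import Background2D, MedianBackground
from photutils.segmentation import detect_sources, detect_threshold

rng = np.random.default_rng(1)
image = rng.normal(100.0, 5.0, size=(1024, 1024))          # stand-in for a WFI mosaic section
yy, xx = np.mgrid[:1024, :1024]
image += 30.0 * np.exp(-((xx - 512) ** 2 + (yy - 512) ** 2) / (2 * 150.0 ** 2))  # faint halo

# Aggressively mask sources (and grow the mask) before estimating the sky
threshold = detect_threshold(image, nsigma=1.5)
segm = detect_sources(image, threshold, npixels=10)
mask = segm.data > 0 if segm is not None else np.zeros(image.shape, bool)
mask = convolve_fft(mask.astype(float), Gaussian2DKernel(5)) > 0.01   # dilate the mask

# Large boxes so the background model cannot follow real extended structure
bkg = Background2D(image, box_size=(256, 256), filter_size=(3, 3),
                   mask=mask, bkg_estimator=MedianBackground())
residual = image - bkg.background
print("median sky model:", np.median(bkg.background), "residual halo peak:", residual[512, 512])
```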

A Statistical Framework for Optimizing Roman Spectroscopic Training Sets

Wide-Field Science – Regular

Jeffrey Newman / University of Pittsburgh, PI

The vast majority of galaxies in Roman imaging will lack spectroscopic redshifts, so photometric redshifts (photo-zs) will be crucial for both cosmology and galaxy evolution studies. We propose to build on machine learning methods that can interpolate between the limited sampling of the color-redshift relation from deep but small-area spectroscopic surveys and clean incorrect redshifts out of spectroscopic training sets for Roman photo-zs, enabling the construction of nearly-ideal spectroscopic training data from sparse and imperfect samples. The methods we will explore should enable improvements in both the performance of photo-z algorithms at predicting redshifts for individual objects as well as in the calibration of the outputs of those algorithms, which otherwise may be a dominant systematic uncertainty in Roman cosmology studies.

First, we will apply a powerful non-linear dimension reduction technique, UMAP (Uniform Manifold Approximation and Projection), to compress galaxy SEDs into a low-dimensional continuous space, using data from fields with existing multiwavelength and spectroscopic coverage. In contrast to the Self-Organizing Maps (SOMs) often used to map observed galaxy SEDs onto a 2-D rectangular and discrete grid for photo-z applications, UMAP provides a continuous, topologically flexible, and robust low-D representation of optical-IR color space, which can be trained using large photometric galaxy samples. We expect that observed SEDs should intrinsically occupy a roughly 3-D manifold, since apparent colors are determined primarily by redshift, specific star formation rate, and a degenerate combination of dust/metallicity. Supervised variants of this algorithm trained using high-quality redshifts may help to make the structure of the low-dimensional color-redshift manifold more informative.
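As a sketch of this dimension-reduction step, the snippet below uses the umap-learn package to embed broad-band photometry into a 3-D manifold, with an optional supervised variant informed by known redshifts. The synthetic colors and hyperparameter choices are placeholders for real Roman + Rubin photometry.

```python
# Sketch of the dimension-reduction step with umap-learn: embed broad-band photometry into a
# 3-D manifold, optionally supervised by known redshifts. The synthetic colors below are only
# placeholders for real Roman + Rubin photometry.
import numpy as np
import umap

rng = np.random.default_rng(0)
n_gal, n_bands = 5000, 10
colors = rng.normal(size=(n_gal, n_bands))      # stand-in for observed colors/magnitudes
z_train = rng.uniform(0, 3, size=n_gal)         # stand-in for high-quality training redshifts

# Unsupervised embedding of the color space
reducer = umap.UMAP(n_components=3, n_neighbors=50, min_dist=0.0)
embedding = reducer.fit_transform(colors)

# Supervised variant: let known redshifts shape the manifold for a training subset
supervised = umap.UMAP(n_components=3, target_metric="l2")
embedding_sup = supervised.fit_transform(colors[:2000], y=z_train[:2000])
print(embedding.shape, embedding_sup.shape)
```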

We will then train a robust Gaussian process regression algorithm, which can interpolate optimally while identifying and ignoring outliers, to map from location in the low-dimensional UMAP space to redshift. Current spectroscopic and many-band photo-z samples have incorrect-redshift rates that are large enough to compromise the calibration of redshift distributions for cosmology; however, such incorrect redshifts should be easily identifiable in the lower-dimensional UMAP space, as they will be out of line with other redshifts in the same region of the color manifold. If the robustness to outliers is great enough, the numerous but less-certain low-resolution spectroscopic redshifts and many-band photo-zs could be incorporated into Roman photo-z training and characterization.
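A crude stand-in for this robust regression step is shown below: a standard Gaussian process mapping embedding coordinates to redshift, with a simple iterative sigma-clipping loop standing in for the outlier-aware GP described above. The kernel, thresholds, and synthetic data are placeholders.

```python
# Stand-in for the robust GP step: map UMAP coordinates to redshift with a standard Gaussian
# process, iteratively sigma-clipping points that disagree with the local prediction (a crude
# proxy for the outlier-aware GP described above). Kernel and thresholds are placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
u_coords = rng.uniform(0, 10, size=(800, 3))                  # stand-in for 3-D UMAP coordinates
z = 0.3 * u_coords[:, 0] + 0.05 * rng.normal(size=800)        # smooth color-redshift relation
outliers = rng.random(800) < 0.05
z[outliers] += rng.uniform(0.5, 2.0, size=outliers.sum())     # 5% catastrophically wrong redshifts

keep = np.ones(800, dtype=bool)
kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.01)
for _ in range(3):                                            # fit, clip, refit
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(u_coords[keep], z[keep])
    pred, std = gp.predict(u_coords, return_std=True)
    keep = np.abs(z - pred) < 4.0 * std                       # reject likely incorrect redshifts

print("flagged as outliers:", int((~keep).sum()), "of", int(outliers.sum()), "injected")
```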

Our procedures will also address the problem that objects with spectroscopic redshifts provide only a sparse and inconsistent sampling of the relationship between the colors of galaxies and their redshifts due to shot noise/limited sample size, selection effects in spectroscopic data sets, and sample/cosmic variance. For instance, galaxy populations that only inhabit the densest regions of the Universe may be missing entirely from training sets built from small fields at some redshifts but will be present at others; no amount of re-weighting can make up for their absence. With a mapping from UMAP coordinates to redshift in hand, however, we can construct augmented spectroscopic samples of arbitrary size that perfectly match the distribution of photometric samples in UMAP space (and hence color space) and that include objects that fill in gaps in the existing sparse spectroscopic samples, with minimal impact from sample/cosmic variance. Such training sets would be the ideal inputs for machine-learning-based photo-z algorithms and would be invaluable for determining accurate redshift distributions for any photometrically-defined samples. Our results could have considerable impact on the strategies and requirements for Roman spectroscopic training sets, so completing this work in the next two years is critical.

Asteroseismology Using The Galactic Bulge Time Domain Survey

Wide-Field Science – Regular

Marc Pinsonneault / Ohio State University, PI

Overview: Evolved red giant stars are detected as high-amplitude non-radial oscillators in time-domain space missions. Precise, regularly sampled, long-duration photometric observations can be used to characterize their oscillation frequency pattern. Masses, radii, and ages have been measured for tens of thousands of stars observed by Kepler, K2 and TESS by combining this asteroseismic data with stellar metallicity and Teff. We propose to quantify the unique asteroseismic capability of the Nancy Grace Roman Space Telescope by: 1) generating a detailed model of asteroseismic detectability with Roman light curves; 2) simulating the expected population of bulge red giants that would be detectable; 3) curating a target list with existing ancillary spectrophotometric data; and 4) exploring the potential of Roman astrometry and photometry to greatly expand the sample size, precision, and accuracy. Roman asteroseismology will constrain the properties of an important Galactic population and will be important for interpreting the results of the Roman Galactic Bulge Time Domain Survey (GBTDS).

Background: Roman is especially well-suited for asteroseismology of core He-burning, or red clump (RC), stars in the Galactic bulge because of its high spatial resolution, IR passband, aperture, and observing cadence. Asteroseismology requires the ability to detect oscillations and key stellar properties (Teff and metallicity) to infer mass and age. If L and Teff are known, R can be inferred, reducing the information needed for masses. Our project therefore begins with detectability and proceeds to catalog stars with Teff and metallicity (of order 110,000). We then explore the prospect of using Roman astrometry and photometry to provide R and Teff for numerous fainter stars (of order 420,000).

Methods: Prior work has established the feasibility of asteroseismology in Roman. Saturation will be important for RC stars in the Galactic bulge. We will develop an improved asteroseismic detection model, including the effect of saturation on variability and detection probabilities as a function of magnitude and intrinsic luminosity. We will then generate a mock RC catalogue to predict detections once Roman is launched. It will also serve as a reference for testing how different color and magnitude cuts, and different survey footprints, affect the predicted yields.
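A short worked example of the frequencies involved: using the standard asteroseismic scaling relations, a typical red clump star oscillates at a few tens of microhertz, far below the Nyquist frequency of a 15-minute cadence. The stellar parameters below are representative values, not entries from the proposed catalog.

```python
# Worked example with the standard asteroseismic scaling relations: a typical red clump star's
# nu_max and Delta_nu versus the Nyquist frequency of a 15-minute cadence. Stellar parameters
# are representative values, not entries from the proposed catalog.
NU_MAX_SUN = 3090.0    # muHz
DNU_SUN = 135.1        # muHz
TEFF_SUN = 5777.0      # K

def nu_max(mass, radius, teff):
    """Frequency of maximum oscillation power, scaling relation [muHz]."""
    return NU_MAX_SUN * mass * radius**-2 * (teff / TEFF_SUN) ** -0.5

def delta_nu(mass, radius):
    """Large frequency separation, scaling relation [muHz]."""
    return DNU_SUN * (mass * radius**-3) ** 0.5

m, r, teff = 1.0, 11.0, 4800.0          # typical red clump star (solar units, K)
cadence_s = 15 * 60
nyquist_muhz = 1e6 / (2 * cadence_s)    # ~556 muHz for a 15-minute cadence

print(f"nu_max ~ {nu_max(m, r, teff):.1f} muHz, Delta_nu ~ {delta_nu(m, r):.2f} muHz")
print(f"GBTDS Nyquist frequency ~ {nyquist_muhz:.0f} muHz, so RC oscillations are well sampled")
```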

Based on the mock catalog, we will then assemble an asteroseismic target list (ATL) based on real data in the GBTDS footprint. The target list will consist of RC stars with Gaia and 2MASS photometry whose solar-like oscillations we determine to be detectable, and we will use it to motivate spectroscopic survey follow-up. We will also provide target lists with metallicities and temperatures from low-resolution optical Gaia BP/RP spectra and high-resolution infrared APOGEE spectra. We will use this data to iteratively calibrate the mock catalog parameters and infer the selection function for the ATL.

Roman astrometry is precise but needs to be tied to the Gaia system to place it on an absolute scale. We will combine an improved astrometric and photometric model with data from our mock catalog to infer the density of Gaia calibrators and make predictions for uncertainties in Roman RC parallaxes and radii, crucial for extending asteroseismology to fainter targets.

Future impact: The catalogs and codes will be resources for the Roman centers and the microlensing PIT. The detectability and astrometry models that we develop are applicable to other science cases, especially those involving saturated stars. Our projections for what Roman will achieve for asteroseismology will motivate broader science cases for Roman, such as insights into the ages and chemical properties of the bulge stellar population from which stellar exoplanet hosts are sampled. Population-level discoveries will likewise revolutionize our understanding of the primordial bulge from a Galactic archaeology standpoint.

Laying the Foundation for a Comprehensive View of Transiting Exoplanets with the Galactic Bulge Survey

Wide-Field Science – Large

Elisa Quintana / NASA – Goddard Space Flight Center, PI

The primary science driver of the Roman Galactic Bulge Time-Domain Survey (GBTDS) is the detection and demographics of cold exoplanets via microlensing (Astro2010, Penny et al. 2019). However, additional science can be extracted from this survey (Gaudi et al., 2019), including the potential to detect an unprecedented ~100,000 transiting exoplanets (Bennett & Rhie, 2002; Montet et al., 2017) and the transformative science this would enable.

We propose to investigate and build the infrastructure necessary to develop a transiting exoplanet science case with Roman that will benefit the astronomical community. Specifically, we will develop a GBTDS transiting exoplanet science case by:

1. Developing accurate and precise pixel-level simulations of the GBTDS,
2. Building a robust transit search and vetting infrastructure based on proven techniques developed for Kepler, K2, and TESS,
3. Identifying and providing recommendations for GBTDS design trades that maximize the transiting planet science return, and
4. Performing simulations of transiting exoplanet atmospheres to develop Roman’s potential for transformative atmosphere population studies.

This work will produce publicly-available pipelines to simulate GBTDS pixel-level data, generate light curves, detect transiting exoplanets and identify false positives, as well as simulation software for producing synthetic transit populations. We will work with the Roman project and other selected teams to leverage and augment their simulation tools. The proposed work must begin now in order to provide timely recommendations to the Roman project on survey design trades and prepare the community to maximize the science return of Roman’s first few years.
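As a minimal illustration of the transit-search step in item 2 above, the sketch below runs a Box Least Squares period search on a synthetic light curve sampled at a 15-minute cadence over one 72-day season. The cadence, noise level, and injected planet parameters are placeholders, and the pixel-level simulation and vetting stages of the proposed pipelines are not shown.

```python
# Minimal transit-search step of the kind the proposed pipelines build on: a Box Least Squares
# period search on a synthetic light curve sampled at a 15-minute cadence. Cadence, noise, and
# planet parameters are placeholders; pixel-level simulation and vetting are not shown.
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(7)
t = np.arange(0, 72, 15 / (60 * 24))          # one 72-day GBTDS season, 15-minute cadence [days]
period, duration, depth = 3.7, 0.12, 0.004    # injected planet: period [d], duration [d], depth

flux = 1 + 5e-4 * rng.normal(size=t.size)     # white-noise light curve
in_transit = (t % period) < duration
flux[in_transit] -= depth

bls = BoxLeastSquares(t, flux)
result = bls.autopower(duration)
best = np.argmax(result.power)
print(f"recovered period: {result.period[best]:.3f} d (injected {period} d)")
```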

SPQR: Spectroscopic Probes of Quantitative Reionization

Wide-Field Science – Large

James Rhoads / NASA – Goddard Space Flight Center, PI

Reionization of intergalactic hydrogen was the first time that stars and galaxies had a global impact on the universe around them, and the last phase transition for ordinary matter in the universe. The Nancy Grace Roman Space Telescope will enable critically needed wide field surveys for galaxies in the epoch of reionization (EoR). In particular, Roman’s wide field spectroscopic capabilities will allow direct observation of the Lyman alpha line from EoR sources, providing direct, local tests for neutral intergalactic gas, over scales that are large enough to study this inherently inhomogeneous process and that are unachievable with other facilities.

This proposal builds upon the work of the previous Cosmic Dawn Science Investigation Team to prepare for WFI investigations of Cosmic Dawn through deep imaging and spectroscopy. For this we will:

(1) Simulate cosmic dawn galaxies, by combining large semi-analytic simulations with high-resolution hydrodynamic simulations and radiative transfer of Lyman-alpha at z > 7. These simulations will be informed and constrained by deep observations from JWST.

(2) Develop high fidelity scene simulations, custom built for Roman Grism and Prism, starting with observed deep and wide field imaging from JWST.

(3) Optimize selection methods for high redshift line emitters, quasars, and Lyman Break Galaxies, incorporating spectroscopic information that will increase the robustness and efficiency compared to photometric searches.

(4) Explore quantitative metrics to measure reionization using robust measures such as clustering of Lyman alpha sources, topology of Lyman alpha emitter distribution, and cross-correlation with 21cm measures.

Applying wide field slitless spectroscopy to studies of cosmic dawn requires optimized extraction of spectral lines 10 times fainter than those sought in the High Latitude Wide Area Survey. Such observations will have more significant crowding than shallower surveys. This requires new strategies for both observations and source detection algorithms, and data simulations with unprecedented detail that go beyond the tools and methods currently planned.

We will develop the required tools to simulate Roman slitless data. We will produce high fidelity simulations that include (a) the defocused higher diffractive orders in grism data, whose wavelength-dependent pattern cannot be simulated by existing packages; (b) the highly nonlinear prism dispersion; (c) position-dependent trace, dispersion solution, and passband edges; and (d) wavelength- and position-dependent point spread functions.
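Items (b) and (c) above can be illustrated with a toy dispersion solution: a nonlinear polynomial that maps wavelength to pixel offset along a spectral trace, with coefficients that drift across the detector. Every coefficient below is invented for illustration; the defocused higher orders and the wavelength-dependent PSF are not modeled here.

```python
# Toy version of items (b) and (c) above: a nonlinear, position-dependent dispersion solution
# mapping wavelength to pixel offset along a spectral trace. Every coefficient here is invented
# for illustration; defocused higher orders and a wavelength-dependent PSF are not modeled.
import numpy as np

def dispersion_offset(wavelength_um, x0, y0):
    """Pixel offset along the trace for a source at detector position (x0, y0)."""
    dl = wavelength_um - 1.0                     # expand around a 1.0 micron reference
    # Polynomial dispersion whose coefficients drift with field position (all values invented)
    c1 = 110.0 + 0.002 * x0
    c2 = -18.0 + 0.001 * y0
    c3 = 4.0
    return c1 * dl + c2 * dl**2 + c3 * dl**3     # nonlinear, prism-like mapping [pixels]

# Disperse a flat-spectrum source at two field positions and compare local dispersions
wl = np.linspace(0.8, 1.8, 500)                  # microns
for x0, y0 in [(200.0, 200.0), (3800.0, 3800.0)]:
    offsets = dispersion_offset(wl, x0, y0)
    local_dispersion = np.gradient(wl, offsets) * 1e4   # Angstrom per pixel
    print(f"source at ({x0:.0f}, {y0:.0f}): trace spans {offsets.min():.1f}..{offsets.max():.1f} px, "
          f"dispersion {local_dispersion.min():.1f}-{local_dispersion.max():.1f} A/px")
```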

Using these tools, we will examine how depth, area, number of distinct roll angles, and other observational parameters affect the ability to recover input structures. We will also develop figures of merit that will help quantitatively evaluate future tradeoffs such as depth vs area. We will define strategies to study the ionizing photon budget as a function of redshift, and to study the dependence of epoch-of-reionization galaxy properties on their local environments.

The results will ultimately benefit all applications of Roman slitless spectroscopy, reaching a wide user community. We will provide tools to inject simulated sources with user-provided spectra and spatial profiles into realistic background scenes, in order to test detection efficiency and measurement fidelity. This may be applicable to diverse general astrophysics survey programs, and also to galaxy redshift survey cosmology results from the high latitude wide area survey. We expect to work with the science centers to share results and algorithms, and also to offer data challenges and training for the general community. By developing these tools now, we will help the community to be ready for scientific applications of Roman spectroscopy, from galaxies to kilonovae, on day one of science operations.

A new theoretical framework for globular cluster science with the Roman Wide-Field Imager

Wide-Field Science – Regular

Robyn Sanderson / University of Pennsylvania, PI

The Roman Space Telescope will be an excellent tool for finding thin stellar streams, i.e. those formed from globular clusters (GCs), in external galaxies for the first time. To date, we only know of globular cluster tidal streams within our own Milky Way. In combination with the population of still intact globular clusters, these tidal streams have provided unique insights into the early formation and accretion history of our Galaxy, as well as the shape of its dark matter potential and constraints on the clumpiness of dark matter. Despite their demonstrated power in our own Galaxy, however, there are currently no theoretical predictions of the expected number of GC streams in nearby galaxies, or how many Roman will be able to find, severely limiting our ability to interpret Roman observations. Studying the formation of GC streams in cosmologically-evolving galaxies is computationally challenging, and previous work has been forced to oversimplify models either for the formation and evolution of GCs or for the galactic tidal field.

In this proposal we will, for the first time, include all components of globular cluster formation and evolution in a cosmologically evolving galaxy. Our project will build upon an existing model for the formation of globular clusters based on zoomed cosmological-hydrodynamical simulations, and explore variations on these initial conditions. We will then build a stream formation model that will inject test particles of the growing streams into the same cosmological simulations used to determine the cluster population, optimized to reproduce the changing potential of the full halo and galaxy environment at all cosmic times. The result will be the first predictions for the population of thin streams and globular clusters around galaxies in a self-consistent and fully cosmological context.
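The basic mechanism behind such a test-particle stream model can be sketched in a few lines: release particles from a cluster on an orbit and integrate them with a leapfrog scheme. The toy below uses a static logarithmic halo and arbitrary units as a stand-in for the evolving simulation potential described above, and the stripping prescription is deliberately crude.

```python
# Toy particle-injection sketch: release test particles from a cluster orbiting in a simple
# logarithmic halo and integrate them with leapfrog, the basic mechanism behind stream formation
# models. The static analytic potential is a stand-in for the evolving simulation potential, and
# all quantities are in arbitrary units.
import numpy as np

def accel(pos, v0=1.0, rc=0.1):
    """Acceleration in a spherical logarithmic potential Phi = 0.5 * v0^2 * ln(r^2 + rc^2)."""
    r2 = np.sum(pos**2, axis=-1, keepdims=True)
    return -v0**2 * pos / (r2 + rc**2)

rng = np.random.default_rng(11)
dt, n_steps, n_particles = 0.005, 4000, 300

# Cluster center on a mildly eccentric orbit; test particles start near it with a small
# positional and velocity scatter (a crude stand-in for tidal stripping).
cluster_pos = np.array([1.0, 0.0, 0.0])
cluster_vel = np.array([0.0, 0.8, 0.0])
pos = cluster_pos + 0.02 * rng.normal(size=(n_particles, 3))
vel = cluster_vel + 0.05 * rng.normal(size=(n_particles, 3))

for _ in range(n_steps):                     # kick-drift-kick leapfrog
    vel += 0.5 * dt * accel(pos)
    pos += dt * vel
    vel += 0.5 * dt * accel(pos)
    cluster_vel += 0.5 * dt * accel(cluster_pos)
    cluster_pos = cluster_pos + dt * cluster_vel
    cluster_vel += 0.5 * dt * accel(cluster_pos)

offsets = np.linalg.norm(pos - cluster_pos, axis=1)
print(f"stream extent after {n_steps * dt:.0f} time units: median offset {np.median(offsets):.2f}, "
      f"max {offsets.max():.2f} (cluster at r = {np.linalg.norm(cluster_pos):.2f})")
```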

These results will let us answer crucial questions related to globular cluster and galaxy formation science for the first time. We will predict the mass and metallicity distributions of surviving and destroyed globular clusters and the morphology of thin stellar streams, and explore their dependence on the violence of the host galaxy’s formation, the cluster formation model, and the presence or absence of dark matter substructure and satellite galaxies. Most importantly, however, we will use tools already developed by our team to produce full Roman synthetic observations of thin stellar streams, their morphology, and their stellar populations, in their cosmologically-motivated host halos, connecting the detectable set of clusters and streams to the origin and evolution of the full underlying population of globular clusters.

Enhancing the Roman Cosmology Program with Strongly Lensed Supernovae

Wide-Field Science – Large

Louis-Gregory Strolger / Space Telescope Science Institute, PI

One of the primary mission objectives for the Nancy Grace Roman Space Telescope is to investigate the nature of dark energy with a variety of methods. Observations of Type Ia supernovae (SNe Ia) will be one of the principal anchors of the Roman cosmology program, through traditional luminosity distance measurements from the High Latitude Time Domain Survey (HLTDS). The Wide Field Instrument (WFI) can provide another valuable cosmological probe, without altering the mission strategy: time delay cosmography with gravitationally lensed SNe. These rare events manifest when the light from a stellar explosion propagating along different paths is focused by a lensing potential (a galaxy or galaxy cluster), forming multiple images of the SN on the sky. Depending on the relative geometrical and gravitational potential differences of each path, the SN images appear delayed by hours to months (for galaxy-scale lenses) or years (for cluster-scale lenses). These time delays can be used to measure a combination of angular diameter distances that constrain the Hubble constant (H0) and other cosmological parameters, including the dark energy equation of state (e.g., w), in a single step. Constraints on cosmological parameters from lensed SNe are highly complementary to, and fully independent of, those from non-lensed SNe Ia, providing a valuable check on systematics. This proposal will lay the groundwork for including this new probe in the Roman cosmology toolkit, which can provide a <1% H0 measurement, by producing simulations and tools needed for efficient detection and modeling of each new lensing system.

Previous work predicts the planned Roman HLTDS will discover over 40 strongly lensed SNe, but that assumes a robust detection pipeline with high efficiency. We have two avenues for identifying strongly lensed SNe in public Roman data. First, we can simply search all publicly available images for lensed SNe using convolutional neural networks (CNNs) designed to identify the combination of lensed arcs and multiple transient sources. Second, we can maintain a catalog of all known galaxy and cluster-scale lenses (currently >10,000), and continuously search for transients in those locations. This method requires that we continuously update the catalog of known lenses that have been discovered throughout the Roman mission. CNN lens-finding algorithms will therefore be repeatedly run on all publicly available data, and newly identified lenses will be added to the catalog for future observations at the same location. Both CNN architectures will require a large (~50,000 lenses) training set to be reliable at the start of the mission.
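A minimal example of the kind of CNN lens/non-lens classifier described above is sketched below in PyTorch, trained here on random stand-in cutouts. The architecture, cutout size, and band count are placeholders; the large (~50,000 lens) simulated training set described above is what would make such a network reliable in practice.

```python
# Minimal CNN lens/non-lens classifier of the kind described (PyTorch), trained here on random
# stand-in cutouts. Architecture, cutout size, and band count are placeholders; the proposed
# ~50,000-lens simulated training set is what would make such a network useful in practice.
import torch
import torch.nn as nn

class LensFinder(nn.Module):
    def __init__(self, n_bands=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_bands, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)      # logit for P(lensed)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = LensFinder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

cutouts = torch.randn(64, 4, 64, 64)            # stand-in for multiband Roman cutouts
labels = torch.randint(0, 2, (64, 1)).float()   # 1 = simulated lens, 0 = non-lens
for step in range(50):
    logits = model(cutouts)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad(); loss.backward(); optimizer.step()
print("final training loss:", float(loss))
```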

The early success of Roman Lensed SN cosmology therefore depends upon a suite of accurate and detailed pixel- and catalog-level simulations of Roman lensed SN observations. The simulations are critical for developing and testing detection/analysis pipelines, developing plans and proposals for external follow-up resources, and optimizing existing tools for the unique capabilities of Roman. This proposal will produce the necessary simulations, which are an extension of (but not included in) those planned for the HLTDS. Using the simulations, we will create a data challenge to identify the optimal static lens finding, lensed SN detection, and time delay measurement algorithms to be used with Roman data products. Adapting existing tools and incorporating the best performing methods from the data challenge, we will produce robust, well-documented, and public detection and analysis pipelines. Strongly lensed SNe have the potential to drastically improve the Roman cosmology program with a probe fully independent of SNe Ia. Roman and the HLTDS truly offer the first opportunity to create a gold standard sample of lensed SNe, and this program produces all the components needed to ensure we can effectively leverage the enormous potential for cosmology.

Spots, Faculae, and Ages: The Promise of Rotation with Roman and Deep Learning

Wide-Field Science – Regular

Jamie Tayar / University of Florida – Gainesville, PI

We propose to improve our existing open-source tools to determine the efficacy of the proposed Roman time-domain survey strategies in measuring stellar rotation periods, inferring gyrochronological ages, and distinguishing between magnetic structures on stellar surfaces.

The Roman mission is set to perform infrared time domain surveys, particularly in the Galactic bulge, but also at high latitudes. The Kepler mission showed us that large time domain datasets allow the measurement of rotation periods of stars across the HR diagram through the modulation of stellar brightness by magnetic spots. Those rotation periods can be used to infer precise (<10% error) ages using gyrochronology, even into the M-dwarf regime where most of Roman’s exoplanet hosts will be. However, work with the TESS mission has suggested that systematics and complex observing strategies like those proposed for Roman can make the extraction of periods with conventional techniques extremely challenging.

We have developed a deep learning technique that can estimate rotation periods in these challenging conditions, as well as a suite of simulations to determine how much additional information about star spot configurations (e.g., relative spot temperatures, sizes, evolution) can be extracted from the data. This technique has been used successfully to obtain periods from TESS and is flexible and adaptable to other missions. We will therefore 1) construct a suite of simulated spot-modulated light curves following Roman’s observing strategy in multiple photometric bands, 2) adapt our deep learning framework to predict what ranges of rotation periods, gyrochronological ages, spot amplitudes, and spot characteristics Roman will detect, and 3) evaluate the tradeoffs of the Roman survey strategy for measuring stellar rotation.
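A toy version of the simulation step, together with a conventional Lomb-Scargle period search as the baseline against which the deep learning framework would be compared, is sketched below. The seasonal sampling, spot model, and noise level are simplified placeholders, not the proposed Roman observing strategy.

```python
# Toy version of the simulation step: a spot-modulated light curve sampled only during
# Roman-like observing seasons, with a conventional Lomb-Scargle period search as the baseline
# the deep learning framework would be compared against. Spot model and seasons are simplified.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(5)
p_rot, amp = 17.0, 0.01                             # rotation period [days], spot amplitude

# Five 72-day seasons, one per year, at 15-minute cadence (placeholder survey structure)
seasons = [np.arange(s, s + 72, 15 / (60 * 24)) for s in np.arange(0, 5 * 365, 365)]
t = np.concatenate(seasons)

spot_phase = 2 * np.pi * t / p_rot
flux = 1 + amp * np.sin(spot_phase) * (1 + 0.3 * np.sin(2 * np.pi * t / 400.0))  # evolving spots
flux += 2e-3 * rng.normal(size=t.size)              # photometric noise

frequency, power = LombScargle(t, flux).autopower(maximum_frequency=1.0)
print(f"recovered period: {1 / frequency[np.argmax(power)]:.2f} d (injected {p_rot} d)")
```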

We will predict yields for stellar rotation and spot properties observable with Roman and suggest a survey optimization to extract the most stellar astrophysics science from Roman. Our modeling, simulation, and machine learning tools will provide public Roman analysis software beyond what the Science Centers will provide. Finally, the proposed work will support NASA objectives in studying the evolution of stars and enhance Roman’s exoplanetary science impact by placing its discovered planets into the context of galactic evolution.

Roman Infrared Nearby Galaxies Survey

Wide-Field Science – Large

Benjamin Williams / University of Washington – Seattle, PI

We propose a large Wide Field Science program to develop software for the community to simulate and analyze Roman surveys of nearby galaxies. Roman’s wide field, superb sensitivity to faint stars and compact sources, and near-infrared bandpasses will permit panoramic mapping of hundreds of nearby galaxies out to their virial radii, simultaneously giving insight into the star formation histories of the central galaxy, its satellites, and its streams; charting its assembly from the cosmic web; and mapping its dark matter halo. With the help of powerful planning tools to optimize its potential, Roman will improve our current sample sizes of these resolved star maps a hundredfold, down to surface brightness limits comparable to those currently reached only in the Local Group and >4 magnitudes fainter than achievable from the ground. The tools we develop will build mock Roman imaging data from numerical simulations to test model predictions related to all these long-standing science topics, making it simple and reliable to optimize observing strategies to answer key science questions. Many of the tools necessary for planning Roman observations in this level of detail are the same as those necessary to reduce and analyze those observations once they are made. Our suite will thus enable the community to efficiently plan and analyze Roman observations of resolved stellar populations, in both crowded and low surface brightness regions, from the very start of the mission.

Roman can resolve individual stars to map galaxy structure down to an equivalent surface brightness of 35 mag/arcsec², for any galaxy within 10 Mpc of the Sun. Such maps will reveal tidal streams in stellar halos and constrain the mass function of dwarf satellites around hundreds of hosts, providing stringent tests of galaxy formation and dark matter models on galactic and even sub-galactic scales, where the Lambda-CDM model has the most tension with observations. Roman imaging will also transform studies of star formation in galaxy disks, by tracking stellar mass growth as a function of time and position within a galaxy. Roman’s precision photometry will constrain critical stellar evolution models of the near-infrared bright, rapidly evolving stars that contribute significantly to the integrated light of galaxies in the near-infrared. Thus, with Roman we can derive the detailed evolution of individual galaxies, reconstruct the complete history of star formation in the nearby universe, and put crucial constraints on the theoretical models used to interpret near-infrared extragalactic observations.
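The notion of an "equivalent surface brightness" from resolved stars can be made concrete with a short worked example: sum the fluxes of the individually detected stars in a small patch and convert back to magnitudes per square arcsecond. The star list below is invented; real analyses use measured photometry and completeness corrections.

```python
# Worked example of "equivalent surface brightness" from resolved stars: sum the fluxes of the
# individually detected stars in a patch of sky and convert back to mag/arcsec^2. The star list
# is invented; real analyses use measured photometry and completeness corrections.
import numpy as np

star_mags = np.array([26.8, 27.1, 27.5, 27.9, 28.3, 28.6])   # detected stars in the patch [mag]
patch_area = 25.0                                             # patch area [arcsec^2]

total_flux = np.sum(10 ** (-0.4 * star_mags))                 # flux in units of a 0-mag source
mu = -2.5 * np.log10(total_flux / patch_area)                 # equivalent surface brightness
print(f"equivalent surface brightness: {mu:.1f} mag/arcsec^2")
```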

To revolutionize the study of nearby galaxies, the community must have the necessary tools in place to take full advantage of Roman’s capabilities. The tools we propose to develop will make it straightforward and efficient to produce and analyze simulated (and real) Roman imaging of nearby galaxies and their halos to maximize the scientific yield in the limited observing time available, ensuring the most effective use of the mission and maximizing the value of the final data archive. Consequently, these tools will allow both optimization of observing strategies (e.g., filters, coverage, depth) based on numerical simulation predictions and efficient analysis of real Roman observations.

These tools will be built, tested, and released by a Wide Field Science team that has decades of experience using nearby galaxies to inform fundamental topics in astrophysics and working with numerical simulations to test observables related to near-field cosmology, dark matter, and galaxy formation and evolution. Our team members have led the charge in observational and theoretical studies of resolved stellar populations and stellar halos. With our combined background, we are poised to help the community to take full advantage of the opportunities for discovery that Roman will offer to scientists studying the galaxies in our backyard.

