Posts Tagged lsst

Recent Postings from lsst

Cosmic Visions Dark Energy: Science

Cosmic surveys provide crucial information about high energy physics including strong evidence for dark energy, dark matter, and inflation. Ongoing and upcoming surveys will start to identify the underlying physics of these new phenomena, including tight constraints on the equation of state of dark energy, the viability of modified gravity, the existence of extra light species, the masses of the neutrinos, and the potential of the field that drove inflation. Even after the Stage IV experiments, DESI and LSST, complete their surveys, there will still be much information left in the sky. This additional information will enable us to understand the physics underlying the dark universe at an even deeper level and, in case Stage IV surveys find hints of physics beyond the current Standard Model of Cosmology, to revolutionize our current view of the universe. There are many ideas for how best to supplement and aid DESI and LSST in order to access some of this remaining information, and for how surveys beyond Stage IV can fully exploit this regime. These ideas point to potential projects that could start construction in the 2020s.

Modeling the Performance of the LSST in Surveying the Near-Earth Object Population

We have performed a detailed survey simulation of the LSST performance with regard to near-Earth objects (NEOs) using the project's current baseline cadence. The survey shows that if the project is able to reliably generate linked sets of positions and times (so-called "tracklets") using two detections of a given object per night, and can link these tracklets into a track with a minimum of 3 tracklets within a ~12 day length-of-arc, it would be able to discover 62% of the potentially hazardous asteroids (PHAs) larger than 140 m in its projected 10 year survey lifetime. This completeness would be reduced to 58% if the project is unable to implement a pipeline using the two-detection cadence and has to adopt the four-detection cadence more commonly used by existing NEO surveys. When including the estimated performance of the currently operating surveys, assuming these continue running until the start of LSST and perhaps beyond, the completeness fraction for PHAs larger than 140 m would be 73% for the baseline cadence and 71% for the four-detection cadence. This result is lower than the estimates of Ivezic et al. (2007, 2014); however, it is comparable to that of Jones et al. (2016), who find a completeness of ~70%. We also show that the traditional method of using absolute magnitude H < 22 mag as a proxy for the population with diameters larger than 140 m results in completeness values that are too high by ~5%. Our simulation makes use of the most recent models of the physical and orbital properties of the NEO populations, as well as simulated cadences and telescope performance estimates provided by LSST. We further show that while neither LSST nor a space-based IR platform like NEOCam can individually complete the survey for 140 m diameter NEOs, the combination of these systems can achieve that goal after a decade of observation.
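
The tracklet-and-track criteria can be made concrete with a toy filter. The sketch below (Python; the function names and data layout are hypothetical, and this is not the MOPS pipeline) reads the track requirement as at least 3 nightly tracklets falling within a ~12 day window:

```python
# Toy illustration of the tracklet/track criteria described above.
# Not the actual LSST/MOPS pipeline; names and data layout are hypothetical.
from collections import defaultdict

def nightly_tracklets(detections, min_per_night=2):
    """Group one object's detection times (MJD) by night; a night with at
    least `min_per_night` detections yields one tracklet (list of MJDs)."""
    nights = defaultdict(list)
    for mjd in detections:
        nights[int(mjd)].append(mjd)
    return [sorted(v) for v in nights.values() if len(v) >= min_per_night]

def is_discoverable(detections, min_tracklets=3, max_arc_days=12.0):
    """Object counts as 'discovered' if >= min_tracklets tracklets fall
    within a length-of-arc of max_arc_days."""
    starts = sorted(t[0] for t in nightly_tracklets(detections))
    for i in range(len(starts) - min_tracklets + 1):
        if starts[i + min_tracklets - 1] - starts[i] <= max_arc_days:
            return True
    return False

# Example: detection pairs on three nights spanning ~8 days -> discoverable.
mjds = [59000.1, 59000.2, 59003.1, 59003.15, 59008.3, 59008.35]
print(is_discoverable(mjds))  # True
```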

The LSST Data Management System

The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field, ground-based survey system that will image the sky in six optical bands from 320 to 1050 nm, uniformly covering approximately $18,000$ deg$^2$ of the sky over 800 times. The LSST is currently under construction on Cerro Pachón in Chile, and is expected to enter operations in 2022. Once operational, the LSST will explore a wide range of astrophysical questions, from discovering "killer" asteroids to examining the nature of Dark Energy. The LSST will generate on average 15 TB of data per night, and will require a comprehensive Data Management system to reduce the raw data to scientifically useful catalogs and images with minimal human intervention. These reductions will result in a real-time alert stream and eleven data releases over the 10-year duration of LSST operations. To enable this processing, the LSST project is developing a new, general-purpose, high-performance, scalable, well-documented, open source data processing software stack for O/IR surveys. Prototypes of this stack are already capable of processing data from existing cameras (e.g., SDSS, DECam, MegaCam), and form the basis of the Hyper Suprime-Cam (HSC) Survey data reduction pipeline.

The LSST Data Processing Software Stack: Summer 2015 Release

The Large Synoptic Survey Telescope (LSST) is an 8-m optical ground-based telescope being constructed on Cerro Pachón in Chile. LSST will survey half the sky every few nights in six optical bands. The data will be transferred to NCSA, where within 60 seconds they will be reduced using difference imaging techniques, and detected transients will be announced to the community in the VOEvent format. Annual data releases will be made from all the data during the 10-year mission, with unprecedented coadd depth and catalog time resolution for such a large region of sky. In this paper we present the current status of the data processing software, and describe how to obtain it.

Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope (LSST)

The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5 m effective diameter primary mirror, a wide 9.6 square degree field of view, 3.2 Gigapixel camera, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten-year survey lifetime. With a single visit limiting magnitude of 24.5 in r-band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each visit being a pair of back-to-back 15s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100, across all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), that will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus discovery) will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.

Computational Intelligence Challenges and Applications on Large-Scale Astronomical Time Series Databases

Time-domain astronomy (TDA) is facing a paradigm shift caused by the exponential growth of the sample size, data complexity and data generation rates of new astronomical sky surveys. For example, the Large Synoptic Survey Telescope (LSST), which will begin operations in northern Chile in 2022, will generate a nearly 150 Petabyte imaging dataset of the southern hemisphere sky. The LSST will stream data at rates of 2 Terabytes per hour, effectively capturing an unprecedented movie of the sky. The LSST is expected not only to improve our understanding of time-varying astrophysical objects, but also to reveal a plethora of yet unknown faint and fast-varying phenomena. To cope with this paradigm shift toward data-driven astronomy, the fields of astroinformatics and astrostatistics have recently emerged. The new data-oriented paradigms for astronomy combine statistics, data mining, knowledge discovery, machine learning and computational intelligence, in order to provide the automated and robust methods needed for the rapid detection and classification of known astrophysical objects as well as the unsupervised characterization of novel phenomena. In this article we present an overview of machine learning and computational intelligence applications to TDA. Future big data challenges and new lines of research in TDA, focusing on the LSST, are identified and discussed from the viewpoint of computational intelligence/machine learning. Interdisciplinary collaboration will be required to cope with the challenges posed by the deluge of astronomical data coming from the LSST.

Metrics for Optimization of Large Synoptic Survey Telescope Observations of Stellar Variables and Transients [Replacement]

The Large Synoptic Survey Telescope (LSST) will be the largest time-domain photometric survey ever. In order to maximize the LSST science yield for a broad array of transient stellar phenomena, it is necessary to optimize the survey cadence, coverage, and depth via quantitative metrics that are specifically designed to characterize the time-domain behavior of various types of stellar transients. In this paper we present three such metrics built on the LSST Metric Analysis Framework (MAF) model (Jones et al. 2014). Two of the metrics quantify the ability of LSST to detect non-periodic and/or non-recurring transient events, and the ability of LSST to reliably measure periodic signals of various timescales. The third metric provides a way to quantify the range of stellar parameters in the stellar populations that LSST will probe. We provide example uses of these metrics and discuss some implications based on these metrics for optimization of the LSST survey for observations of stellar variables and transients.
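
To illustrate the MAF approach, here is a minimal sketch of a custom metric written against the historical lsst.sims.maf API; the base class, the column names ('expMJD', 'fiveSigmaDepth'), and the toy detection criterion are assumptions for illustration, not code from the paper:

```python
# Minimal sketch of a MAF-style transient-detection metric.
# Assumes the historical lsst.sims.maf API and its standard column names;
# this is illustrative, not the paper's metric code.
import numpy as np
from lsst.sims.maf.metrics import BaseMetric

class TransientDetectMetric(BaseMetric):
    """Count visits that fall within a transient's window and are deep
    enough to detect it at its (assumed constant) peak magnitude."""
    def __init__(self, mjd_col='expMJD', m5_col='fiveSigmaDepth',
                 peak_mag=22.0, duration=30.0, **kwargs):
        self.mjd_col = mjd_col
        self.m5_col = m5_col
        self.peak_mag = peak_mag
        self.duration = duration  # days
        super().__init__(col=[mjd_col, m5_col], **kwargs)

    def run(self, dataSlice, slicePoint=None):
        mjd = dataSlice[self.mjd_col]
        # Place one toy event at the start of the observed window.
        in_window = (mjd - mjd.min()) < self.duration
        detected = in_window & (dataSlice[self.m5_col] > self.peak_mag)
        return float(np.sum(detected))
```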

Mapping charge transport effects in thick CCDs with a dithered array of 40,000 stars

We characterize the astrometric distortion at the edges of thick, fully-depleted CCDs in the lab using a bench-top simulation of LSST observing. By illuminating an array of forty thousand pinholes (30 $\mu$m diameter) at the object plane of an f/1.2 optical reimager, thousands of PSFs can be imaged over a 4Kx4K pixel CCD. Each high-purity silicon pixel, 10 $\mu$m square by 100 $\mu$m deep, can then be individually characterized through a series of sub-pixel dithers in the X/Y plane. The unique character [response, position, shape] of each pixel as a function of flux, wavelength, back side bias, etc. can be investigated. We measure the magnitude and onset of astrometric error at the edges of the detector as a test of the experimental setup, using an LSST prototype CCD. We show that this astrometric error at the edge is sourced by non-uniformities in the electric field lines that define pixel boundaries. This edge distortion must be corrected in order to optimize the science output of weak gravitational lensing and large scale structure measurements for the LSST.

The impact of intrinsic alignment on current and future cosmic shear surveys

Intrinsic alignment (IA) of source galaxies is one of the major astrophysical systematics for ongoing and future weak lensing surveys. This paper presents the first forecasts of the impact of IA on cosmic shear measurements for current and future surveys (DES, Euclid, LSST, WFIRST) using simulated likelihood analyses and realistic covariances that include higher-order moments of the density field in the computation. We consider a range of possible IA scenarios and test mitigation schemes, which parameterize IA by the fraction of red galaxies, and the normalization, luminosity, and redshift dependence of the IA signal (for a subset we consider joint IA and photo-z uncertainties). Compared to previous studies we find smaller biases in time-dependent dark energy models if IA is ignored in the analysis; the amplitude and significance of these biases vary as a function of survey properties (depth, statistical uncertainties), luminosity function, and IA scenario. Due to its small statistical errors and relatively shallow observing strategy, Euclid is significantly impacted by IA. LSST and WFIRST benefit from their increased survey depth, while the larger statistical errors for DES decrease IA's relative impact on cosmological parameters. The proposed IA mitigation scheme removes parameter biases due to IA for DES, LSST, and WFIRST even if the shape of the IA power spectrum is only poorly known; successful IA mitigation for Euclid requires more prior information. We explore several alternative IA mitigation strategies for Euclid; in the absence of alignment of blue galaxies, we recommend the exclusion of red (IA-contaminated) galaxies in cosmic shear analyses. We find that even a reduction of 20% in the number density of galaxies leads to only a 4-10% loss in cosmological constraining power.

Curvature Wavefront Sensing for the Large Synoptic Survey Telescope

The Large Synoptic Survey Telescope (LSST) will use an active optics system (AOS) to maintain alignment and surface figure on its three large mirrors. Corrective actions fed to the LSST AOS are determined from information derived from four curvature wavefront sensors located at the corners of the focal plane. Each wavefront sensor is a split detector such that the halves are 1 mm on either side of focus. In this paper we describe the extensions to published curvature wavefront sensing algorithms needed to address challenges presented by the LSST, namely the large central obscuration, the fast f/1.23 beam, off-axis pupil distortions, and vignetting at the sensor locations. We also describe corrections needed for the split sensors and the effects from the angular separation of different stars providing the intra- and extra-focal images. Lastly, we present simulations that demonstrate convergence, linearity, and negligible noise when compared to atmospheric effects when the algorithm extensions are applied to the LSST optical system. The algorithm extensions reported here are generic and can easily be adapted to other wide-field optical systems including similar telescopes with large central obscuration and off-axis curvature sensing.
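
For context, the intra- and extra-focal measurements enter through the standard curvature-sensing relation, shown here schematically in its common Roddier form (the notation is ours; the paper's extensions modify the geometry and boundary handling, not this basic structure):

```latex
% Schematic Roddier curvature-sensing relation (textbook form, not the
% paper's extended equations): the normalized difference between intra-
% and extra-focal intensities measures wavefront curvature plus an edge term.
\[
  \frac{I_1(\vec{r}\,) - I_2(\vec{r}\,)}{I_1(\vec{r}\,) + I_2(\vec{r}\,)}
  \;\propto\;
  \frac{\partial W}{\partial n}\,\delta_c \;-\; \nabla^2 W ,
\]
% where $W$ is the wavefront, $\nabla^2 W$ its curvature, and
% $(\partial W/\partial n)\,\delta_c$ a pupil-boundary contribution.
```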

Chromatic CCD effects on weak lensing measurements for LSST [Replacement]

Wavelength-dependent point spread functions (PSFs) violate an implicit assumption in current galaxy shape measurement algorithms that deconvolve the PSF measured from stars (which have stellar spectral energy distributions (SEDs)) from images of galaxies (which have galactic SEDs). Since the absorption length of silicon depends on wavelength, CCDs are a potential source of PSF chromaticity. Here we develop two toy models to estimate the sensitivity of the cosmic shear survey from the Large Synoptic Survey Telescope to chromatic effects in CCDs. We then compare these toy models to simulated estimates of PSF chromaticity derived from the LSST photon simulator PhoSim. We find that even though sensor contributions to PSF chromaticity are subdominant to atmospheric contributions, they can still significantly bias cosmic shear results if left uncorrected, particularly in the redder filter bands and for objects that are off-axis in the field of view.

I. Apples to apples $A^2$: realistic galaxy simulated catalogs and photometric redshift predictions for next-generation surveys [Replacement]

We present new mock catalogues for two of the largest stage-IV next-generation surveys in the optical and infrared, LSST and Euclid, based on an N-body simulation plus semi-analytical lightcone, subsequently modified with PhotReal. This technique adjusts the original photometry using an empirical library of spectral templates to make it more realistic. The reliability of the catalogues is confirmed by comparing the resulting color-magnitude relation, luminosity and mass functions, and angular correlation function with those of real data. Consistent comparisons between the expected photometric redshifts for the different surveys are also provided. Very deep near-infrared surveys such as Euclid will deliver very good performance ($\Delta z/(1+z) \sim 0.025-0.053$) down to $H\sim24$ AB mag and up to $z\sim3$, depending on the optical observations available from the ground, whereas extremely deep optical surveys such as LSST will obtain an overall lower photometric redshift resolution ($\Delta z/(1+z) \sim 0.045$) down to $i\sim27.5$ AB mag, considerably improved ($\Delta z/(1+z) \sim 0.035$) if we restrict the sample to $i\sim24$ AB mag. These numbers can be substantially improved by selecting a subsample of galaxies with the best quality photometric redshifts. We finally discuss the impact that these surveys will have for the community in terms of photometric redshift legacy. This is the first of a series of papers where we set a framework for comparability between mock catalogues and observations, with a particular focus on cluster surveys. The Euclid and LSST mocks are made publicly available at the following link: http://photmocks.obspm.fr/.
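
The quoted $\Delta z/(1+z)$ figures are the standard normalized photo-z scatter. Below is a minimal sketch of how such a statistic is commonly computed (the $\sigma_{\rm NMAD}$ convention is assumed here; the paper's exact estimator may differ):

```python
# Minimal sketch of the usual normalized photo-z scatter statistics.
# The sigma_NMAD convention is assumed; the paper's estimator may differ.
import numpy as np

def photoz_scatter(z_phot, z_spec, outlier_cut=0.15):
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    # Normalized median absolute deviation: robust to outliers.
    sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
    outlier_frac = np.mean(np.abs(dz) > outlier_cut)
    return sigma_nmad, outlier_frac

# Example with mock values:
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.1, 3.0, 10000)
z_phot = z_spec + 0.03 * (1 + z_spec) * rng.standard_normal(10000)
print(photoz_scatter(z_phot, z_spec))  # sigma_NMAD ~ 0.03
```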

Transiting Planets with LSST II. Period Detection of Planets Orbiting 1 Solar Mass Hosts

The Large Synoptic Survey Telescope (LSST) will photometrically monitor ~1 billion stars for ten years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al. (2015), LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 d. We find that typical LSST observations will be able to reliably detect Hot Jupiters with periods shorter than ~3 d. At the same time, we find that the LSST deep drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 d.

Transiting Planets with LSST II. Period Detection of Planets Orbiting 1 Solar Mass Hosts [Replacement]

The Large Synoptic Survey Telescope (LSST) will photometrically monitor approximately 1 billion stars for ten years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al. (2015), LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than approximately 3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes approximately 98% of these from photometric (i.e. statistical) false positives.
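
A minimal sketch of a BLS period search of this kind, using astropy's BoxLeastSquares on a mock sparse light curve; it illustrates the algorithm, not the paper's simulated LSST cadences or detection thresholds:

```python
# Minimal BLS period-recovery sketch on a mock, sparsely sampled light curve.
# Uses astropy's BoxLeastSquares; this illustrates the method, not the
# paper's pipeline, cadence model, or false-positive criteria.
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(42)
true_period, depth, duration = 2.8, 0.01, 0.12  # days, fractional, days

# Sparse, irregular sampling loosely mimicking a multi-year survey.
t = np.sort(rng.uniform(0, 3650, 900))
phase = (t % true_period) / true_period
y = 1.0 - depth * (phase < duration / true_period)  # box-shaped transit
y += 0.003 * rng.standard_normal(t.size)            # photometric noise

model = BoxLeastSquares(t, y)
results = model.autopower(duration)
best_period = results.period[np.argmax(results.power)]
print(f"recovered period: {best_period:.3f} d (true {true_period} d)")
```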

The Gaia-LSST Synergy

We discuss the synergy of Gaia and the Large Synoptic Survey Telescope (LSST) in the context of Milky Way studies. LSST can be thought of as Gaia's deep complement because the two surveys will deliver trigonometric parallax, proper-motion, and photometric measurements with similar uncertainties at Gaia's faint end at $r=20$, and LSST will extend these measurements to a limit about five magnitudes fainter. We also point out that users of Gaia data will have developed data analysis skills required to benefit from LSST data, and provide detailed information about how international participants can join LSST.

Periodograms for Multiband Astronomical Time Series

This paper introduces the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb-Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
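
The GitHub implementation referenced is the authors' gatspy package; here is a minimal sketch of its multiband periodogram interface on mock data (the usage follows gatspy's documented API; the light curve is synthetic):

```python
# Minimal multiband period-finding sketch with gatspy (the authors' package).
# API usage follows gatspy's documentation; the data here are mock values.
import numpy as np
from gatspy.periodic import LombScargleMultiband

rng = np.random.default_rng(1)
true_period = 0.61  # days, RR Lyrae-like

bands = np.array(list('ugrizy') * 50)           # 300 visits across 6 bands
t = np.sort(rng.uniform(0, 180, bands.size))    # ~6 months of observations
offsets = dict(zip('ugrizy', [18.1, 17.6, 17.5, 17.4, 17.4, 17.3]))
y = np.array([offsets[b] for b in bands]) \
    + 0.3 * np.sin(2 * np.pi * t / true_period)  # shared periodic signal
dy = np.full_like(t, 0.02)
y += dy * rng.standard_normal(t.size)

model = LombScargleMultiband(fit_period=True)
model.optimizer.period_range = (0.1, 1.0)
model.fit(t, y, dy, bands)
print(f"best period: {model.best_period:.4f} d (true {true_period} d)")
```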

The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST

The focus of this report is on the opportunities enabled by the combination of LSST, Euclid and WFIRST, the optical surveys that will be an essential part of the next decade's astronomy. The sum of these surveys has the potential to be significantly greater than the contributions of the individual parts. As is detailed in this report, the combination of these surveys should give us multi-wavelength high-resolution images of galaxies and broadband data covering much of the stellar energy spectrum. These stellar and galactic data have the potential of yielding new insights into topics ranging from the formation history of the Milky Way to the mass of the neutrino. However, enabling the astronomy community to fully exploit this multi-instrument data set is a challenging technical task: for much of the science, we will need to combine the photometry across multiple wavelengths with varying spectral and spatial resolution. We identify some of the key science enabled by the combined surveys and the key technical challenges in achieving the synergies.

Improving the LSST dithering pattern and cadence for dark energy studies [Replacement]

The Large Synoptic Survey Telescope (LSST) will explore the entire southern sky over 10 years starting in 2022 with unprecedented depth and time sampling in six filters, $ugrizy$. Artificial power on the scale of the 3.5 deg LSST field-of-view will contaminate measurements of baryonic acoustic oscillations (BAO), which fall at the same angular scale at redshift $z \sim 1$. Using the HEALPix framework, we demonstrate the impact of an "un-dithered" survey, in which $17\%$ of each LSST field-of-view is overlapped by neighboring observations, generating a honeycomb pattern of strongly varying survey depth and significant artificial power on BAO angular scales. We find that adopting large dithers (i.e., telescope pointing offsets) of amplitude close to the LSST field-of-view radius reduces artificial structure in the galaxy distribution by a factor of $\sim$10. We propose an observing strategy utilizing large dithers within the main survey and minimal dithers for the LSST Deep Drilling Fields. We show that applying various magnitude cutoffs can further increase survey uniformity. We find that a magnitude cut of $r < 27.3$ removes significant spurious power from the angular power spectrum with a minimal reduction in the total number of observed galaxies over the ten-year LSST run. We also determine the effectiveness of the observing strategy for Type Ia SNe and predict that the main survey will contribute $\sim$100,000 Type Ia SNe. We propose a concentrated survey where LSST observes one-third of its main survey area each year, increasing the number of main survey Type Ia SNe by a factor of $\sim$1.5, while still enabling the successful pursuit of other science drivers.
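
A minimal sketch of the HEALPix diagnostic described above, using healpy to measure the angular power spectrum of a depth map; the map here is a mock with an artificial honeycomb-like modulation, not LSST simulation output:

```python
# Minimal sketch of measuring artificial power in a survey-depth map with
# HEALPix, as in the analysis described above. The depth map is mock data.
import numpy as np
import healpy as hp

nside = 256
npix = hp.nside2npix(nside)

# Mock coadded-depth map: uniform depth plus a honeycomb-like modulation
# standing in for un-dithered field-overlap structure.
theta, phi = hp.pix2ang(nside, np.arange(npix))
depth = 27.5 + 0.2 * np.cos(50 * theta) * np.cos(50 * phi)

# Convert to a density-contrast-like map and measure the angular power
# spectrum; excess C_ell near the BAO angular scales would flag a problem.
delta = depth / depth.mean() - 1.0
cl = hp.anafast(delta, lmax=512)
ell = np.arange(cl.size)
print("peak ell of spurious power:", ell[np.argmax(cl[2:]) + 2])
```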

Crosstalk in multi-output CCDs for LSST [Replacement]

LSST's compact, low power focal plane will be subject to electronic crosstalk with some unique signatures due to its readout geometry. This note describes the crosstalk mechanisms, ongoing characterization of prototypes, and implications for the observing cadence.

Synergy between the Large Synoptic Survey Telescope and the Square Kilometre Array

We provide an overview of the science benefits of combining information from the Square Kilometre Array (SKA) and the Large Synoptic Survey Telescope (LSST). We first summarise the capabilities and timeline of the LSST and overview its science goals. We then discuss the science questions in common between the two projects, and how they can be best addressed by combining the data from both telescopes. We describe how weak gravitational lensing and galaxy clustering studies with LSST and SKA can provide improved constraints on the causes of the cosmological acceleration. We summarise the benefits to galaxy evolution studies of combining deep optical multi-band imaging with radio observations. Finally, we discuss the excellent match between one of the most unique features of the LSST, its temporal cadence in the optical waveband, and the time resolution of the SKA.

Probing Neutrino Hierarchy and Chirality via Wakes

The relic neutrinos are expected to acquire a bulk relative velocity with respect to the dark matter at low redshifts, and neutrino wakes are expected to develop downstream of dark matter halos. We propose a method of measuring the neutrino mass based on this mechanism. The neutrino wake will cause a dipole distortion of the galaxy-galaxy lensing pattern. This effect could be detected by combining upcoming lensing surveys, e.g. the LSST and Euclid surveys, with a low redshift galaxy survey or a 21cm intensity mapping survey, which can map the neutrino flow field. The data obtained with LSST and Euclid should enable a positive detection if the three neutrino masses are quasi-degenerate, and a future high precision 21cm lensing survey would allow the normal and inverted hierarchy cases to be distinguished; even right-handed Dirac neutrinos may be detectable.

LSST optical beam simulator

We describe a camera beam simulator for the LSST which is capable of illuminating a 60 mm field at f/1.2 with realistic astronomical scenes, enabling studies of CCD astrometric and photometric performance. The goal is to fully simulate LSST observing, in order to characterize charge transport and other features in the thick fully-depleted CCDs and to probe low-level systematics under realistic conditions. The automated system simulates the centrally obscured LSST beam and sky scenes, including the spectral shape of the night sky. The doubly telecentric, nearly unit-magnification design consists of a spherical mirror, three BK7 lenses, and one beam-splitter window. To achieve the relatively large field, the beam-splitter window is used twice. The motivation for this LSST beam test facility was the need to fully characterize a new generation of thick fully-depleted CCDs and assess their suitability for the broad range of science planned for LSST. Because of the fast beam illumination and the thick silicon design [each pixel is 10 microns wide and over 100 microns deep], photon transport and charge transport effects can arise in the high-purity silicon at long wavelengths. The focal surface covers a field more than sufficient for a 40x40 mm LSST CCD. Delivered optical quality meets design goals, with 50% energy within a 5 micron circle. The tests of CCD performance are briefly described.

Data Mining for Gravitationally Lensed Quasars

Gravitationally lensed (GL) quasars are brighter than their unlensed counterparts and produce images with distinctive morphological signatures. Past searches and target selection algorithms, in particular the Sloan Quasar Lens Search (SQLS), have relied on basic morphological criteria, which were applied to samples of bright, spectroscopically confirmed quasars. The SQLS techniques are not sufficient for searches in new surveys (e.g. DES, PS1, LSST), because spectroscopic information is not readily available and the large data volume requires higher purity in target/candidate selection. We carry out a systematic exploration of machine learning techniques and demonstrate that a two-step strategy can be highly effective. In the first step we use catalog-level information ($griz$+WISE magnitudes, second moments) to preselect targets, using artificial neural networks. The accepted targets are then inspected with pixel-by-pixel pattern recognition algorithms (Gradient-Boosted Trees) to form a final set of candidates. The results from this procedure can be used to further refine the simpler SQLS algorithms, with a twofold (or threefold) gain in purity and the same (or $80\%$) completeness at the target-selection stage, or a purity of $70\%$ and a completeness of $60\%$ after the candidate-selection step. Simpler photometric searches in $griz$+WISE based on colour cuts would provide samples with $7\%$ purity or less. Our technique is extremely fast, as a list of candidates can be obtained from a stage III experiment (e.g. DES catalog/database) in a few CPU hours. The techniques are easily extendable to Stage IV experiments like LSST with the addition of time domain information.
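
A schematic of the two-step strategy, assuming scikit-learn and placeholder feature arrays (catalog features for step one, pixel cutouts for step two); the models are stand-ins for the paper's neural networks and Gradient-Boosted Trees, not the trained classifiers themselves:

```python
# Schematic two-step lens-candidate selection, mirroring the strategy above.
# Feature arrays and labels are random placeholders; the models are
# scikit-learn stand-ins, not the paper's trained classifiers.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 5000
X_catalog = rng.standard_normal((n, 9))       # e.g. griz+WISE mags, moments
X_pixels = rng.standard_normal((n, 20 * 20))  # flattened image cutouts
y = rng.integers(0, 2, n)                     # 1 = lens, 0 = contaminant

# Step 1: catalog-level preselection with a neural network.
step1 = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X_catalog, y)
keep = step1.predict_proba(X_catalog)[:, 1] > 0.5

# Step 2: pixel-level classification of the surviving targets.
step2 = GradientBoostingClassifier().fit(X_pixels[keep], y[keep])
scores = step2.predict_proba(X_pixels[keep])[:, 1]
candidates = np.flatnonzero(keep)[scores > 0.9]
print(f"{keep.sum()} targets -> {candidates.size} candidates")
```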

Spectroscopic Needs for Calibration of LSST Photometric Redshifts

This white paper summarizes the conclusions of the Snowmass White Paper "Spectroscopic Needs for Imaging Dark Energy Experiments" (arXiv:1309.5384) which are relevant to the calibration of LSST photometric redshifts; i.e., the accurate characterization of biases and uncertainties in photo-z's. Any significant miscalibration will lead to systematic errors in photo-z's, impacting nearly all extragalactic science with LSST. As existing deep redshift samples have failed to yield highly-secure redshifts for a systematic 20%-60% of their targets, it is a strong possibility that future deep spectroscopic samples will not solve the calibration problem on their own. The best options in this scenario are provided by cross-correlation methods that utilize clustering with objects from spectroscopic surveys (which need not be fully representative) to trace the redshift distribution of the full sample. For spectroscopy, the eBOSS survey would enable a basic calibration of LSST photometric redshifts, while the expected LSST-DESI overlap would be more than sufficient for an accurate calibration at z>0.2. A DESI survey of nearby galaxies conducted in bright time would enable accurate calibration down to z~0. The expanded areal coverage provided by the transfer of the DESI instrument (or duplication of it at the Blanco Telescope) would enable the best possible calibration from cross-correlations, in addition to other science gains.

Spectroscopic Needs for Training of LSST Photometric Redshifts

This white paper summarizes those conclusions of the Snowmass White Paper "Spectroscopic Needs for Imaging Dark Energy Experiments" (arXiv:1309.5384) which are relevant to the training of LSST photometric redshifts; i.e., the use of spectroscopic redshifts to improve algorithms and reduce photo-z errors. The larger and more complete the available training set is, the smaller the RMS error in photo-z estimates should be, increasing LSST's constraining power. Among the better US-based options for this work are the proposed MANIFEST fiber feed for the Giant Magellan Telescope or (with lower survey speed) the WFOS spectrograph on the Thirty Meter Telescope (TMT). Due to its larger field of view and higher multiplexing, the PFS spectrograph on Subaru would be able to obtain a baseline training sample faster than TMT; comparable performance could be achieved with a highly-multiplexed spectrograph on Gemini with at least a 20 arcmin diameter field of view.

Maximizing LSST's Scientific Return: Ensuring Participation from Smaller Institutions

The remarkable scientific return and legacy of LSST, in the era that it will define, will not only be realized in the breakthrough science that will be achieved with catalog data. This Big Data survey will shape the way the entire astronomical community advances -- or fails to embrace -- new ways of approaching astronomical research and data. In this white paper, we address the NRC template questions 4, 5, 6, 8, and 9, with a focus on the unique challenges for smaller, and often under-resourced, institutions, including institutions dedicated to underserved minority populations, in the efficient and effective use of LSST data products to maximize LSST's scientific return.

The Variable Sky of Deep Synoptic Surveys [Replacement]

The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria -- a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky -- Galactic stars, QSOs, AGNs and asteroids. It is found that LSST will be capable of discovering ~10^5 high latitude (|b| > 20 deg) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20 deg is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100/night within 2 years. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discovery of variable galactic nuclei (AGNs) and Quasi Stellar Objects (QSOs) are each predicted to begin at ~3000 per night, and decrease by 50X over 4 years. SNe are expected at ~1100/night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at > 10^5 per night, and if orbital determination has a 50% success rate per epoch, will drop below 1000/night within 2 years.

The Variable Sky of Deep Synoptic Surveys

The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria -- a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky -- Galactic stars, QSOs, AGNs and asteroids. It is found that LSST will be capable of discovering ~10^4 high latitude (|b| > 20 deg) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20 deg is 2 orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100/night within less than 1 year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discovery of variable galactic nuclei (AGNs) and Quasi Stellar Objects (QSOs) are each predicted to begin at ~3000 per night, and decrease by 50X over 4 years. SNe are expected at ~1100/night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at > 10^5 per night, and if orbital determination has a 50% success rate per epoch, will drop below 1000/night within 2 years.

The Variable Sky of Deep Synoptic Surveys [Replacement]

The discovery of variable and transient sources is an essential product of synoptic surveys. The alert stream will require filtering for personalized criteria -- a process managed by a functionality commonly described as a Broker. In order to understand quantitatively the magnitude of the alert generation and Broker tasks, we have undertaken an analysis of the most numerous types of variable targets in the sky -- Galactic stars, QSOs, AGNs and asteroids. It is found that LSST will be capable of discovering ~10^5 high latitude (|b| > 20 deg) variable stars per night at the beginning of the survey. (The corresponding number for |b| < 20 deg is orders of magnitude larger, but subject to caveats concerning extinction and crowding.) However, the number of new discoveries may well drop below 100/night within 1 year. The same analysis applied to GAIA clarifies the complementarity of the GAIA and LSST surveys. Discovery of variable galactic nuclei (AGNs) and Quasi Stellar Objects (QSOs) are each predicted to begin at ~3000 per night, and decrease by 50X over 4 years. SNe are expected at ~1100/night, and after several survey years will dominate the new variable discovery rate. LSST asteroid discoveries will start at > 10^5 per night, and if orbital determination has a 50% success rate per epoch, will drop below 1000/night within 2 years.

Strong Lens Time Delay Challenge: II. Results of TDC1

We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g. COSMOGRAIL) and expected from future synoptic surveys (e.g. LSST), and include "evilness" in the form of simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) $A$, goodness of fit $\chi^2$, precision $P$, and success rate $f$. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e. give $|A|<0.03$, $P<0.03$, and $\chi^2<1.5$, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range $f = $20--40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher $f$ than does sparser sampling. We estimate that LSST should provide around 400 robust time-delay measurements, each with $P<0.03$ and $|A|<0.01$, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that $A$ and $f$ depend mostly on season length, while $P$ depends mostly on cadence and campaign duration.
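
The four statistics have compact definitions, computed over the $fN$ submitted time delays. A sketch of their evaluation follows (the averaging conventions are assumed from the challenge description; variable names are ours):

```python
# Sketch of the TDC summary statistics for a set of submitted time delays.
# dt_est/dt_err are a team's estimates and claimed uncertainties; dt_true
# are the true delays. Averaging conventions are assumed from the text.
import numpy as np

def tdc_metrics(dt_est, dt_err, dt_true, n_total):
    f = dt_est.size / n_total                            # success rate
    chi2 = np.mean(((dt_est - dt_true) / dt_err) ** 2)   # goodness of fit
    P = np.mean(dt_err / np.abs(dt_true))                # claimed precision
    A = np.mean((dt_est - dt_true) / dt_true)            # accuracy (bias)
    return f, chi2, P, A

# Mock submission: 300 of 1000 light curves, ~3% errors, small bias.
rng = np.random.default_rng(7)
dt_true = rng.uniform(5, 100, 300)
dt_err = 0.03 * dt_true
dt_est = dt_true * 1.005 + dt_err * rng.standard_normal(300)
print(tdc_metrics(dt_est, dt_err, dt_true, n_total=1000))
```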

Strong Lens Time Delay Challenge: II. Results of TDC1 [Replacement]

We present the results of the first strong lens time delay challenge. The motivation, experimental design, and entry level challenge are described in a companion paper. This paper presents the main challenge, TDC1, which consisted of analyzing thousands of simulated light curves blindly. The observational properties of the light curves cover the range in quality obtained for current targeted efforts (e.g., COSMOGRAIL) and expected from future synoptic surveys (e.g., LSST), and include simulated systematic errors. Seven teams participated in TDC1, submitting results from 78 different method variants. After describing each method, we compute and analyze basic statistics measuring accuracy (or bias) $A$, goodness of fit $\chi^2$, precision $P$, and success rate $f$. For some methods we identify outliers as an important issue. Other methods show that outliers can be controlled via visual inspection or conservative quality control. Several methods are competitive, i.e., give $|A|<0.03$, $P<0.03$, and $\chi^2<1.5$, with some of the methods already reaching sub-percent accuracy. The fraction of light curves yielding a time delay measurement is typically in the range $f = $20--40%. It depends strongly on the quality of the data: COSMOGRAIL-quality cadence and light curve lengths yield significantly higher $f$ than does sparser sampling. Taking the results of TDC1 at face value, we estimate that LSST should provide around 400 robust time-delay measurements, each with $P<0.03$ and $|A|<0.01$, comparable to current lens modeling uncertainties. In terms of observing strategies, we find that $A$ and $f$ depend mostly on season length, while $P$ depends mostly on cadence and campaign duration.

Improving LSST Photometric Calibration with Gaia Data

We consider the possibility that the Gaia mission can supply data which will improve the photometric calibration of LSST. After outlining the LSST calibration process and the information that will be available from Gaia, we explore two options for using Gaia data. The first is to use Gaia G-band photometry of selected stars, in conjunction with knowledge of the stellar parameters Teff, log g, and AV, and in some cases Z, to create photometric standards in the LSST u, g, r, i, z, and y bands. The accuracies of the resulting standard magnitudes are found to be insufficient to satisfy LSST requirements when generated from main sequence (MS) stars, but generally adequate when generated from DA white dwarfs (WD). The second option is to combine the LSST bandpasses into a synthetic Gaia G band, which is a close approximation to the real Gaia G band. This allows synthetic Gaia G photometry to be directly compared with actual Gaia G photometry at a level of accuracy which is useful for both verifying and improving LSST photometric calibration.
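
The second option rests on standard synthetic photometry. Below is a minimal sketch of the basic operation, computing a synthetic AB magnitude of a spectrum through a single bandpass (the actual construction of a synthetic Gaia G band from the LSST bandpasses combines several such integrals; the bandpass and source here are toys):

```python
# Minimal synthetic-photometry sketch: integrate a spectrum through a
# bandpass to get an AB magnitude. Building a synthetic Gaia G band from
# the LSST bandpasses combines several such integrals; this shows only
# the basic operation, with a toy bandpass and a flat-spectrum source.
import numpy as np

def synth_ab_mag(wave_nm, flux_fnu_jy, throughput):
    """Photon-weighted AB magnitude of a spectrum f_nu (Jy) in a band."""
    num = np.trapz(flux_fnu_jy * throughput / wave_nm, wave_nm)
    den = np.trapz(throughput / wave_nm, wave_nm)
    return -2.5 * np.log10(num / den) + 8.90  # AB zero point for Jy

wave = np.linspace(320, 1050, 2000)             # LSST wavelength range, nm
band = np.exp(-0.5 * ((wave - 620) / 60) ** 2)  # toy r-like bandpass
flux = np.full_like(wave, 1e-3)                 # flat 1 mJy source
print(f"synthetic mag: {synth_ab_mag(wave, flux, band):.2f}")  # ~16.40
```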

Transiting Planets with LSST I: Potential for LSST Exoplanet Detection [Replacement]

The Large Synoptic Survey Telescope (LSST) has been designed in order to satisfy several different scientific objectives that can be addressed by a ten-year synoptic sky survey. However, LSST will also provide a large amount of data that can then be exploited for additional science beyond its primary goals. We demonstrate the potential of using LSST data to search for transiting exoplanets, and in particular to find planets orbiting host stars that are members of stellar populations that have been less thoroughly probed by current exoplanet surveys. We find that existing algorithms can detect in simulated LSST light curves the transits of Hot Jupiters around solar-type stars, Hot Neptunes around K dwarfs, and planets orbiting stars in the Large Magellanic Cloud. We also show that LSST would have the sensitivity to potentially detect Super-Earths orbiting red dwarfs, including those in habitable zone orbits, if they are present in some fields that LSST will observe. From these results, we make the case that LSST has the ability to provide a valuable contribution to exoplanet science.

Too Many, Too Few, or Just Right? The Predicted Number and Distribution of Milky Way Dwarf Galaxies

We predict the spatial distribution and number of Milky Way dwarf galaxies to be discovered in the DES and LSST surveys, by completeness-correcting the observed SDSS dwarf population. We apply "most massive in the past", "earliest forming", and "earliest infall" toy models to a set of dark matter-only simulated Milky Way/M31 halo pairs from Exploring the Local Volume In Simulations (ELVIS). The observed spatial distribution of Milky Way dwarfs in the LSST era will discriminate between the earliest-infall and other simplified models for how dwarf galaxies populate dark matter subhalos. Inclusive of all toy models and simulations, at 90% confidence we predict a total of 37-114 L $\gtrsim 10^3$L$_{\odot}$ dwarfs and 131-782 L $\lesssim 10^3$L$_{\odot}$ dwarfs within 300 kpc. These numbers of L $\gtrsim 10^3$L$_{\odot}$ dwarfs are dramatically lower than previous predictions, owing primarily to our use of updated detection limits and the decreasing number of SDSS dwarfs discovered per sky area. For an effective $r_{\rm limit}$ of 25.8 mag, we predict: 3-13 L $\gtrsim 10^3$L$_{\odot}$ and 9-99 L $\lesssim 10^3$L$_{\odot}$ dwarfs for DES, and 18-53 L $\gtrsim 10^3$L$_{\odot}$ and 53-307 L $\lesssim 10^3$L$_{\odot}$ dwarfs for LSST. These enormous predicted ranges ensure a coming decade of near-field excitement with these next-generation surveys.

On the Performance of Quasar Reverberation Mapping in the Era of Time-Domain Photometric Surveys

We quantitatively assess, by means of comprehensive numerical simulations, the ability of broad-band photometric surveys to recover the broad emission line region (BLR) size in quasars under various observing conditions and for a wide range of object properties. Focusing on the general characteristics of the Large Synoptic Survey Telescope (LSST), we find that the slope of the size-luminosity relation for the BLR in quasars can be determined with unprecedented accuracy, of order a few percent, over a broad luminosity range and out to $z\sim 3$. In particular, major emission lines for which the BLR size can be reliably measured with LSST include H$\alpha$, MgII $\lambda 2799$, CIII] $\lambda 1909$, CIV $\lambda 1549$, and Ly$\alpha$, amounting to a total of $\gtrsim 10^5$ time-delay measurements for all transitions. Combined with an estimate for the emission line velocity dispersion, upcoming photometric surveys will facilitate the estimation of black hole masses in AGN over a broad range of luminosities and redshifts, allow for refined calibrations of BLR size-luminosity-redshift relations in different transitions, as well as lead to more reliable cross-calibration with other black hole mass estimation techniques.

Radio Astronomy in LSST Era

A community meeting on the topic of "Radio Astronomy in the LSST Era" was hosted by the National Radio Astronomy Observatory in Charlottesville, VA (2013 May 6--8). The focus of the workshop was on time domain radio astronomy and sky surveys. For the time domain, the extent to which radio and visible wavelength observations are required to understand several classes of transients was stressed, but there are also classes of radio transients for which no visible wavelength counterpart is yet known, providing an opportunity for discovery. From the LSST perspective, the LSST is expected to generate as many as 1 million alerts nightly, which will require even more selective specification and identification of the classes and characteristics of transients that can warrant follow up, at radio or any other wavelength. The LSST will also conduct a deep survey of the sky, producing a catalog expected to contain over 38 billion objects. Deep radio wavelength sky surveys will also be conducted on a comparable time scale, and radio and visible wavelength observations are part of the multi-wavelength approach needed to classify and understand these objects. Radio wavelengths are valuable because they are unaffected by dust obscuration and, for galaxies, contain contributions both from star formation and from active galactic nuclei. The workshop touched on several other topics on which there was consensus, including the placement of other LSST "Deep Drilling Fields," inter-operability of software tools, and the challenge of filtering and exploiting the LSST data stream. There were also topics for which there was insufficient time for full discussion or for which no consensus was reached; these included the procedures for following up on LSST observations and the nature of future support for researchers desiring to use LSST data products.

Optical selection of quasars: SDSS and LSST

Over the last decade, quasar sample sizes have increased from several thousand to several hundred thousand, thanks mostly to SDSS imaging and spectroscopic surveys. LSST, the next-generation optical imaging survey, will provide hundreds of detections per object for a sample of more than ten million quasars with redshifts of up to about seven. We briefly review optical quasar selection techniques, with emphasis on methods based on colors, variability properties and astrometric behavior.

Cosmic shear without shape noise

We describe a new method for reducing the shape noise in weak lensing measurements by an order of magnitude. Our method relies on spectroscopic measurements of disk galaxy rotation and makes use of the Tully-Fisher (TF) relation in order to control for the intrinsic orientations of galaxy disks. For this new proposed experiment, the shape noise ceases to be an important source of statistical error. Using CosmoLike, a new cosmological analysis software package, we simulate likelihood analyses for two spectroscopic weak lensing survey concepts (roughly similar in scale to Dark Energy Survey Task Force Stage III and Stage IV missions) and compare their constraining power to a cosmic shear survey from the Large Synoptic Survey Telescope (LSST). Our forecasts in seven-dimensional cosmological parameter space include statistical uncertainties resulting from shape noise, cosmic variance, halo sample variance, and higher-order moments of the density field. We marginalize over systematic uncertainties arising from photometric redshift errors and shear calibration biases considering both optimistic and conservative assumptions about LSST systematic errors. We find that even the TF-Stage III is highly competitive with the optimistic LSST scenario, while evading the most important sources of theoretical and observational systematic error inherent in traditional weak lensing techniques. Furthermore, the TF technique enables a narrow-bin cosmic shear tomography approach to tightly constrain time-dependent signatures in the dark energy phenomenon.

Strong Lens Time Delay Challenge: I. Experimental Design

The time delays between point-like images in gravitational lens systems can be used to measure cosmological parameters as well as to probe the dark matter (sub-)structure within the lens galaxy. The number of lenses with measured time delays is growing rapidly as a result of dedicated efforts; the upcoming Large Synoptic Survey Telescope (LSST) will monitor ~1000 lens systems, each consisting of a foreground elliptical galaxy producing multiple images of a background quasar. In an effort to assess the present capabilities of the community to accurately measure time delays in strong gravitational lens systems, and to provide input to dedicated monitoring campaigns and future LSST cosmology feasibility studies, we invite the community to take part in a "Time Delay Challenge" (TDC). The challenge is organized as a set of "ladders", each containing a group of simulated datasets to be analyzed blindly by participating independent analysis teams. Each rung on a ladder consists of a set of realistic mock observed lensed quasar light curves, with the datasets increasing in complexity and realism from rung to rung to incorporate a variety of anticipated physical and experimental effects. The initial challenge described here has two ladders, TDC0 and TDC1. TDC0 contains a small number of datasets and is designed to be used as a practice set by the participating teams as they set up their analysis pipelines. The (non-mandatory) deadline for completion of TDC0 is the TDC1 launch date, December 1, 2013. TDC1 will consist of some 1000 light curves, a sample designed to provide the statistical power needed to make meaningful statements about the sub-percent accuracy that will be required for competitive Dark Energy constraints in the LSST era.
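For readers unfamiliar with the measurement, here is a toy time-delay estimator of the kind TDC entrants might automate: a grid search over trial delays applied to interpolated light curves. It is a deliberately simplified stand-in, not any entrant's actual method and not the TDC data format.

import numpy as np

def estimate_delay(t_a, f_a, t_b, f_b, trial_delays):
    """Return the trial delay minimizing the summed squared difference
    between curve B and curve A shifted by that delay (toy estimator)."""
    cost = []
    for dt in trial_delays:
        # Interpolate image A's flux onto image B's epochs, shifted by dt.
        f_a_shifted = np.interp(t_b - dt, t_a, f_a)
        cost.append(np.sum((f_b - f_a_shifted) ** 2))
    return trial_delays[int(np.argmin(cost))]

# Example: recover a 30-day delay injected into noisy mock light curves.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1000.0, 200))       # irregularly sampled epochs
delay_true = 30.0
f_a = np.sin(t / 50.0) + 0.05 * rng.normal(size=t.size)
f_b = np.sin((t - delay_true) / 50.0) + 0.05 * rng.normal(size=t.size)
print(estimate_delay(t, f_a, t, f_b, np.arange(-100.0, 100.0, 0.5)))

Real analyses must additionally contend with seasonal gaps, microlensing, and correlated noise, which is precisely what the higher TDC rungs are designed to probe.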

Growth of Cosmic Structure: Probing Dark Energy Beyond Expansion

The quantity and quality of cosmic structure observations have greatly accelerated in recent years. Further leaps forward will be facilitated by imminent projects, which will enable us to map the evolution of dark and baryonic matter density fluctuations over cosmic history. The way these fluctuations vary over space and time is sensitive to the nature of dark matter and dark energy. Dark energy and gravity both affect how rapidly structure grows: the greater the acceleration, the more the growth of structure is suppressed, while the stronger the gravity, the more growth is enhanced. While distance measurements also constrain dark energy, the comparison of growth and distance data tests whether General Relativity describes the laws of physics accurately on large scales. Modified gravity models are able to reproduce the distance measurements, but at the cost of altering the growth of structure (these signatures are described in more detail in the accompanying paper on Novel Probes of Gravity and Dark Energy). Upcoming surveys will exploit these differences to determine whether the acceleration of the Universe is due to dark energy or to modified gravity. To realize this potential, both wide-field imaging and spectroscopic redshift surveys play crucial roles. Projects including DES, eBOSS, DESI, PFS, LSST, Euclid, and WFIRST are in line to map a volume of the Universe exceeding 1000 cubic billion light years, and to measure the cosmic structure growth rate to 1% over the redshift range 0<z<2, covering the last 3/4 of the age of the Universe.
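The suppression and enhancement described above can be read off from the standard linear growth equation in General Relativity; modified-gravity models effectively replace $G$ with a scale- and time-dependent $G_{\rm eff}$:

% Linear growth of the matter overdensity \delta; H is the Hubble rate and
% \bar{\rho}_m the mean matter density. Acceleration boosts the drag term
% 2H\dot{\delta} (suppressing growth); stronger gravity boosts the source term.
\[
  \ddot{\delta} + 2H\dot{\delta} - 4\pi G \bar{\rho}_m\, \delta = 0 ,
  \qquad
  f(a) \equiv \frac{d\ln\delta}{d\ln a} .
\]

Surveys typically report the growth rate through the combination $f\sigma_8$, which is the quantity the 1% measurement above refers to.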

Prospects for Detecting Gamma Rays from Annihilating Dark Matter in Dwarf Galaxies in the Era of DES and LSST [Replacement]

Among the most stringent constraints on the dark matter annihilation cross section are those derived from observations of dwarf galaxies by the Fermi Gamma-Ray Space Telescope. As current (e.g., Dark Energy Survey, DES) and future (Large Synoptic Survey Telescope, LSST) optical imaging surveys discover more of the Milky Way's ultra-faint satellite galaxies, they may increase Fermi's sensitivity to dark matter annihilations. In this study, we use a semi-analytic model of the Milky Way's satellite population to predict the characteristics of the dwarfs likely to be discovered by DES and LSST, and project how these discoveries will impact Fermi's sensitivity to dark matter. While we find that modest improvements are likely, the dwarf galaxies discovered by DES and LSST are unlikely to increase Fermi's sensitivity by more than a factor of ~2. However, this outlook may be conservative, given that our model underpredicts the number of ultra-faint galaxies with large potential annihilation signals actually discovered in the Sloan Digital Sky Survey. Our simulation-based approach focusing on the Milky Way satellite population demographics complements existing empirically-based estimates.
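For context, the standard expression (not taken from this abstract) for the gamma-ray flux from annihilation in a dwarf factorizes into a particle-physics term and an astrophysical "J-factor"; it is the J-factors of newly discovered satellites that would raise Fermi's sensitivity:

% <\sigma v> is the annihilation cross section, m_\chi the dark matter mass,
% dN_\gamma/dE the photon spectrum per annihilation; the second integral
% (squared density over the line of sight and solid angle) is the J-factor.
\[
  \Phi_\gamma \;=\;
  \frac{\langle \sigma v \rangle}{8\pi m_\chi^2}
  \int \frac{dN_\gamma}{dE}\, dE
  \;\times\;
  \int_{\Delta\Omega} \int_{\rm l.o.s.} \rho_\chi^2 \, d\ell \, d\Omega .
\]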

Prospects for Detecting Gamma Rays from Annihilating Dark Matter in Dwarf Galaxies in the Era of DES and LSST [Replacement]

Among the most stringent constraints on the dark matter annihilation cross section are those derived from observations of dwarf galaxies by the Fermi Gamma-Ray Space Telescope. As current (e.g., Dark Energy Survey, DES) and future (Large Synoptic Survey Telescope, LSST) optical imaging surveys discover more of the Milky Way's ultra-faint satellite galaxies, they may increase Fermi's sensitivity to dark matter annihilations. In this study, we use a semi-analytic model of the Milky Way's satellite population to predict the characteristics of the dwarfs likely to be discovered by DES and LSST, and project how these discoveries will impact Fermi's sensitivity to dark matter. While we find that modest improvements are likely, the dwarf galaxies discovered by DES and LSST are unlikely to increase Fermi's sensitivity by more than a factor of ~2-4. However, this outlook may be conservative, given that our model underpredicts the number of ultra-faint galaxies with large potential annihilation signals actually discovered in the Sloan Digital Sky Survey. Our simulation-based approach focusing on the Milky Way satellite population demographics complements existing empirically-based estimates.
