# Posts Tagged software package

## Recent Postings from software package

### $\mathtt{ComEst}$: a Completeness Estimator of Source Extraction on Astronomical Imaging

The completeness of source detection is critical for analyzing the photometric and spatial properties of the population of interest observed by astronomical imaging. We present a software package, $\mathtt{ComEst}$, which calculates the completeness of source detection on charge-coupled device (CCD) images of astronomical observations, especially optical and near-infrared (NIR) imaging of galaxies and point sources. The completeness estimator $\mathtt{ComEst}$ is designed for the source finder $\mathtt{SExtractor}$ run on CCD images saved in the Flexible Image Transport System (FITS) format. Specifically, $\mathtt{ComEst}$ estimates the completeness of source detection by deriving the detection rate of synthetic point sources and galaxies simulated on the observed CCD images. In order to capture any observational artifacts or noise properties while deriving the completeness, $\mathtt{ComEst}$ carries out the detection of the simulated sources directly on the observed images. Given an observed CCD image saved in FITS format, $\mathtt{ComEst}$ derives the completeness of source detection from end to end as a function of source flux (or magnitude) and CCD position. In addition, $\mathtt{ComEst}$ can also estimate the purity of the source detection by comparing the catalog of detected sources to the input catalogs of the simulated sources. We run $\mathtt{ComEst}$ on images from the Blanco Cosmology Survey (BCS) and compare the derived completeness as a function of magnitude to the limiting magnitudes derived from the signal-to-noise ratio (SNR) and the number-count histogram of the detected sources. $\mathtt{ComEst}$ is released as a Python package with an easy-to-use syntax and is publicly available at https://github.com/inonchiu/ComEst
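
The injection-and-recovery procedure described above can be sketched in a few lines. This is an illustrative outline of the idea, not the $\mathtt{ComEst}$ API; the sigmoid detection probability and all numbers are stand-ins for a real source-injection experiment.

```python
import numpy as np

# Sketch of a completeness measurement: inject synthetic sources of known
# magnitude, re-run detection, and take the recovered fraction per magnitude bin.
rng = np.random.default_rng(0)

def completeness_curve(inject_mags, recovered_mask, bins):
    """Detection rate of injected sources as a function of magnitude."""
    injected, _ = np.histogram(inject_mags, bins=bins)
    detected, _ = np.histogram(inject_mags[recovered_mask], bins=bins)
    # Guard against 0/0 in empty magnitude bins
    return np.where(injected > 0, detected / np.maximum(injected, 1), np.nan)

# Fake experiment: fainter sources are recovered less often (sigmoid drop-off).
mags = rng.uniform(20.0, 26.0, size=5000)
p_detect = 1.0 / (1.0 + np.exp((mags - 24.5) / 0.3))
recovered = rng.random(mags.size) < p_detect

bins = np.arange(20.0, 26.5, 0.5)
comp = completeness_curve(mags, recovered, bins)  # ~1 at bright end, ~0 at faint end
```

In the real package the `recovered` mask would come from matching the $\mathtt{SExtractor}$ output catalog against the input catalog of injected sources, and the curve would additionally be binned by CCD position.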

### APFELgrid: a high performance tool for parton density determinations

We present a new software package designed to reduce the computational burden of including hadron collider measurements in Parton Distribution Function (PDF) fits. The APFELgrid package converts the interpolated weight tables provided by APPLgrid files into a more efficient format for PDF fitting by combining them with the PDF and $\alpha_s$ evolution factors provided by APFEL. This combination significantly reduces the number of operations required to calculate hadronic observables in PDF fits and simplifies the structure of the calculation into a readily optimised scalar product. We demonstrate that our technique can lead to a substantial speed improvement over existing methods without any reduction in numerical accuracy.
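
The "readily optimised scalar product" mentioned above can be illustrated schematically. This is a toy sketch, not the APFELgrid interface: the table shapes and random values are assumptions standing in for real precomputed weights.

```python
import numpy as np

# Once APPLgrid weights are combined with the evolution factors, a hadronic
# observable reduces to a single contraction between a fixed table of
# precomputed weights and the PDF values at the initial scale.
rng = np.random.default_rng(0)

n_x, n_flav = 30, 13                    # x-grid points and PDF flavours (assumed sizes)
fk_table = rng.random((n_x, n_flav))    # precomputed combined weights (stand-in values)
pdf_values = rng.random((n_x, n_flav))  # f_alpha(x_i) at the fitting scale

# The whole convolution collapses to one scalar product, which is cheap to
# re-evaluate at every iteration of a PDF fit.
observable = np.einsum('ia,ia->', fk_table, pdf_values)
```

The point of the format conversion is that the expensive evolution is done once, up front, so each fit iteration costs only this contraction.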

### TRIPPy: Trailed Image Photometry in Python

Photometry of moving sources typically suffers from a reduced signal-to-noise ratio (SNR) or flux measurements biased to incorrectly low values through the use of circular apertures. To address this issue we present the software package TRIPPy: TRailed Image Photometry in Python. TRIPPy introduces the pill aperture, the natural extension of the circular aperture appropriate for linearly trailed sources. The pill shape is a rectangle with two semicircular end-caps, and is described by three parameters: the trail length, the trail angle, and the radius. The TRIPPy software package also includes a new technique to generate accurate model point-spread functions (PSFs) and trailed point-spread functions (TSFs) from stationary background sources in sidereally tracked images. The TSF is simply the convolution of the model PSF, which consists of a Moffat profile and a super-sampled lookup table, with the source's linear motion. From the TSF, accurate pill aperture corrections can be estimated as a function of pill radius with an accuracy of 10 millimags for highly trailed sources. Analogous to the use of small circular apertures and associated aperture corrections, small-radius pill apertures can be used to preserve the signal-to-noise ratio of low-flux sources, with the appropriate aperture correction applied to provide an accurate, unbiased flux measurement at all SNRs.
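
The pill geometry described above (a rectangle with two semicircular end-caps) is simple to write down. The following is a minimal sketch of that geometry, not the TRIPPy implementation; function names are hypothetical.

```python
import numpy as np

def pill_area(length, radius):
    """Area of a pill aperture: rectangle plus two semicircular end-caps."""
    return 2.0 * radius * length + np.pi * radius**2

def in_pill(x, y, length, radius):
    """True where (x, y) falls inside a pill centred on the origin,
    with the trail aligned along the x-axis (zero trail angle)."""
    # Distance along x beyond the rectangular segment [-length/2, +length/2]
    dx = np.maximum(np.abs(x) - length / 2.0, 0.0)
    return dx**2 + y**2 <= radius**2

# A circular aperture is the degenerate case of zero trail length.
assert np.isclose(pill_area(0.0, 3.0), np.pi * 9.0)
```

A non-zero trail angle is handled by rotating the pixel coordinates into the trail frame before applying the same test.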

### BEANS - a software package for distributed Big Data analysis

BEANS is a new web-based tool, easy to install and maintain, for storing and analysing massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining, and finding new relations in so-called Big Data. BEANS was created in answer to the growing need of the astronomical community for a versatile tool to store and analyse complex astrophysical numerical simulations and compare them with observations (e.g. simulations of the Galaxy or of star clusters against the Gaia archive). However, the software was built in a general form and is ready for use in any other research field, or in other open-source software.

### AstroImageJ: Image Processing and Photometric Extraction for Ultra-Precise Astronomical Light Curves

ImageJ is a graphical user interface (GUI) driven, public domain, Java-based software package for general image processing, traditionally used mainly in the life sciences. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy-specific image display environment and tools for astronomy-specific image calibration and data reduction. Although AIJ maintains the general-purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard FITS files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System (WCS) aware, including an automated interface to the astrometry.net web portal for plate solving images. Although AIJ provides research-grade image calibration and analysis tools, its GUI-driven approach and cross-platform compatibility enable new users, even undergraduate students, high school students, or amateur astronomers, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.

### MIDAS: Software for the detection and analysis of lunar impact flashes

Since 2009 we have been running a project to identify flashes produced by the impacts of meteoroids on the surface of the Moon. For this purpose we employ small telescopes and high-sensitivity CCD video cameras. To identify these events automatically, a software package called MIDAS was developed and tested. The package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (a known meteoroid stream or the sporadic background). The main features of this computer program are analysed here, and some examples of lunar impact events are presented.
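
The automatic identification step can be sketched as follows. This is a generic outline of the kind of detection an impact-flash pipeline performs, assuming a simple running-reference thresholding scheme; it is not the MIDAS implementation, and the threshold and window sizes are illustrative.

```python
import numpy as np

def detect_flashes(frames, n_ref=10, k_sigma=5.0):
    """Return the indices of video frames containing a candidate flash.

    frames : (n_frames, ny, nx) array of video frames
    n_ref  : number of preceding frames used to build the reference
    k_sigma: detection threshold in units of the frame noise
    """
    hits = []
    for i in range(n_ref, len(frames)):
        ref = np.median(frames[i - n_ref:i], axis=0)  # running reference frame
        diff = frames[i] - ref                        # transient brightening only
        noise = np.std(diff)
        if noise > 0 and diff.max() > k_sigma * noise:
            hits.append(i)
    return hits

# Synthetic demo: noisy video with one bright flash injected in frame 15.
rng = np.random.default_rng(42)
frames = rng.normal(0.0, 1.0, size=(20, 8, 8))
frames[15, 4, 4] += 100.0
```

A real pipeline would add checks against satellites, cosmic-ray hits, and sensor artifacts before promoting a candidate to an impact flash.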

### The NIFTY way of Bayesian signal inference

We introduce NIFTY, "Numerical Information Field Theory", a software package for the development of Bayesian signal inference algorithms that operate independently of any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, and 3D tomography appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms prototyped in 1D can be applied to real-world problems in higher-dimensional settings. As a versatile library, NIFTY is applicable to, and has already been applied in, 1D, 2D, 3D, and spherical settings. A recent application is the D3PO algorithm, targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high-energy astronomy.

### LanHEP - a package for automatic generation of Feynman rules from the Lagrangian. Updated version 3.2 [Cross-Listing]

We present a new version, 3.2, of the LanHEP software package. New features include UFO output, colour sextet particles, and new substitution techniques which allow the user to define new routines.

### FlexibleSUSY - a meta spectrum generator for supersymmetric models

FlexibleSUSY is a software package that takes as input descriptions of (non-)minimal supersymmetric models written in Wolfram/Mathematica and generates a set of spectrum generator libraries and executables, with the aid of SARAH. The design goals are precision, reliability, modularity, speed, and readability of the code. The boundary conditions are independent C++ objects that are plugged into the boundary value problem solver together with the model objects. This clean separation makes it easy to adapt the generated code for individual projects. The current status of the interface and implementation is sketched.

### Framework for Model Independent Analyses of Multiple Extra Quark Scenarios [Replacement]

In this paper we present an analysis strategy and a dedicated tool to determine the exclusion confidence level for any scenario involving multiple heavy extra quarks with generic decay channels, as predicted in several extensions of the Standard Model. We have created, validated and used a software package, called XQCAT (eXtra Quark Combined Analysis Tool), which is based on publicly available experimental data from direct searches for top partners and from Supersymmetry inspired searches. By means of this code, we recast the limits from CMS on new heavy extra quarks considering a complete set of decay channels. The resulting exclusion confidence levels are presented for some simple scenarios with multiple states and general coupling assumptions. Highlighting the importance of combining multiple topology searches to obtain accurate re-interpretations of the existing searches, we discuss the reach of the SUSY analyses so as to set bounds on new quark resonances. In particular, we report on the re-interpretation of the existing limits on benchmark scenarios with one and multiple pair-produced top partners having non-exclusive couplings to the third Standard Model generation of quarks.

### Model Independent Framework for Analysis of Scenarios with Multiple Heavy Extra Quarks

In this paper we present an analysis strategy and a dedicated tool to determine the exclusion confidence level for any scenario involving multiple heavy extra quarks with generic decay channels, as predicted in several extensions of the Standard Model. We have created, validated and used a software package, called XQCAT (eXtra Quark Combined Analysis Tool), which is based on publicly available experimental data from direct searches for top partners and from Supersymmetry inspired searches. The code will soon be publicly available and will be upgraded to include data from new searches. By means of this code, we recast the limits from CMS on new heavy extra quarks considering a complete set of decay channels. The resulting exclusion confidence levels are presented for some simple scenarios with multiple states and general coupling assumptions. Highlighting the importance of combining multiple topology searches to obtain accurate re-interpretations of the existing searches, we discuss the reach of the SUSY analyses so as to set bounds on new quark resonances. In particular, we report on the re-interpretation of the existing limits on benchmark scenarios with one and multiple pair-produced top partners having non-exclusive couplings to the third Standard Model generation of quarks.

### Molecfit: A Package for Telluric Absorption Correction

Correcting for the sky signature usually requires supplementary calibration data, which are very expensive in terms of telescope time. In addition, scheduling flexibility is restricted, as these data usually have to be taken directly before or after the science observations owing to the high variability of the telluric absorption, which depends on the state and chemical composition of the atmosphere at the time of observation. A tool for sky correction that does not require such supplementary calibration data therefore saves a significant amount of valuable telescope time and increases observing efficiency. We have developed a software package aimed at performing telluric feature corrections on the basis of synthetic absorption spectra.
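
The core idea of a synthetic telluric correction can be shown in miniature. This is a hedged sketch of the principle only, not the molecfit API: the spectra below are fabricated, and a real tool fits the atmospheric model to the data before dividing it out.

```python
import numpy as np

# Fake wavelength grid and intrinsic science spectrum (illustrative only).
wavelength = np.linspace(1.0, 2.5, 500)           # microns
intrinsic = 1.0 + 0.2 * np.sin(3.0 * wavelength)

# Synthetic atmospheric transmission with two Gaussian absorption bands.
transmission = (1.0
                - 0.6 * np.exp(-((wavelength - 1.4) / 0.05) ** 2)
                - 0.4 * np.exp(-((wavelength - 1.9) / 0.07) ** 2))

observed = intrinsic * transmission

# Telluric correction: divide by the model transmission, masking regions
# where the atmosphere is nearly opaque and the division is ill-posed.
good = transmission > 0.1
corrected = np.where(good, observed / transmission, np.nan)
```

The advantage over empirical standard-star correction is exactly the one stated in the abstract: no supplementary calibration observations are needed, since the transmission model is computed rather than observed.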

### Symbolic computation of the Birkhoff normal form in the problem of stability of the triangular libration points

The problem of stability of the triangular libration points in the planar circular restricted three-body problem is considered. A software package, intended for the normalization of autonomous Hamiltonian systems by means of computer algebra, is designed so that normalization problems of high analytical complexity can be solved. It is used to obtain the Birkhoff normal form of the Hamiltonian in the given problem. The normalization is carried out up to the 6th order of the expansion of the Hamiltonian in the coordinates and momenta. Analytical expressions for the coefficients of the normal form of the 6th order are derived. Although intermediate expressions occupy gigabytes of computer memory, the resulting coefficients of the normal form are compact enough for presentation in typographic format. The analogue of the Deprit formula for the stability criterion is derived in the 6th order of normalization. The obtained floating-point numerical values for the normal form coefficients and the stability criterion confirm the results of Markeev (1969) and Coppola and Rand (1989), while the obtained analytical and exact numerical expressions confirm the results of Meyer and Schmidt (1986) and Schmidt (1989). The given computational problem is solved without constructing a specialized algebraic processor, i.e., the designed computer algebra package has a broad field of applicability.

### PkANN - II. A non-linear matter power spectrum interpolator developed using artificial neural networks

In this paper we introduce PkANN, a freely available software package for interpolating the non-linear matter power spectrum, constructed using Artificial Neural Networks (ANNs). Previously, using Halofit to calculate the matter power spectrum, we demonstrated that ANNs can make extremely quick and accurate predictions of the power spectrum. Now, using a suite of 6380 N-body simulations spanning 580 cosmologies, we train ANNs to predict the power spectrum over the cosmological parameter space spanning the $3\sigma$ confidence level (CL) around the concordance cosmology. When presented with a set of cosmological parameters ($\Omega_{\rm m} h^2, \Omega_{\rm b} h^2, n_s, w, \sigma_8, \sum m_\nu$ and redshift $z$), the trained ANN interpolates the power spectrum for $z\leq2$ at sub-per-cent accuracy for all modes $k\leq0.9\,h\textrm{Mpc}^{-1}$. PkANN is far faster than computationally expensive N-body simulations, yet provides a worst-case error $<1$ per cent fit to the non-linear matter power spectrum deduced through N-body simulations. The overall precision of PkANN is set by the accuracy of our N-body simulations, at the 5 per cent level for cosmological models with $\sum m_\nu<0.5$ eV at all redshifts $z\leq2$. For models with $\sum m_\nu>0.5$ eV, predictions are expected to be at the 5 (10) per cent level for redshifts $z>1$ ($z\leq1$). The PkANN interpolator may be freely downloaded from http://zuserver2.star.ucl.ac.uk/~fba/PkANN
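
The emulation idea, a trained network mapping cosmological parameters to $P(k)$ on a fixed grid, can be illustrated with a toy forward pass. This is not PkANN itself: the architecture is a guess, and the weights below are random stand-ins rather than trained values.

```python
import numpy as np

rng = np.random.default_rng(1)

# 7 inputs: (Om h^2, Ob h^2, n_s, w, sigma8, sum m_nu, z); 50 output k-modes.
n_params, n_hidden, n_k = 7, 32, 50
W1, b1 = rng.normal(size=(n_hidden, n_params)), rng.normal(size=n_hidden)
W2, b2 = rng.normal(size=(n_k, n_hidden)), rng.normal(size=n_k)

def emulate_pk(params):
    """One forward pass: cosmological parameters in, P(k) on a fixed k-grid out."""
    hidden = np.tanh(W1 @ params + b1)  # hidden layer with tanh activation
    return W2 @ hidden + b2             # linear output layer

theta = np.array([0.14, 0.022, 0.96, -1.0, 0.8, 0.06, 0.0])
pk = emulate_pk(theta)  # a few matrix multiplications instead of an N-body run
```

The speed claim in the abstract follows directly from this structure: once trained, evaluating the emulator costs only two small matrix products, versus days of CPU time for a simulation.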

### PsrPopPy: An open-source package for pulsar population simulations

We have produced a new software package for the simulation of pulsar populations, \textsc{PsrPopPy}, based on the \textsc{Psrpop} package. The codebase has been re-written in Python (save for some external libraries, which remain in their native Fortran), utilising the object-oriented features of the language and improving the modularity of the code. Pre-written scripts are provided for running the simulations in 'standard' modes of operation, but the code is flexible enough to support the writing of personalised scripts. The modular structure also makes the addition of experimental features (such as new models for period or luminosity distributions) more straightforward than with the previous code. We also discuss potential additions to the modelling capabilities of the software. Finally, we demonstrate some potential applications of the code. First, using the results of surveys at different observing frequencies, we find pulsar spectral indices are best fit by a normal distribution with mean $-1.4$ and standard deviation $1.0$. Second, we model pulsar spin evolution to calculate the best fit for a relationship between a pulsar's luminosity and spin parameters. We used the code to replicate the analysis of Faucher-Gigu\`ere & Kaspi, and have subsequently optimized their power-law dependence of radio luminosity, $L$, on period, $P$, and period derivative, $\dot{P}$. We find that the underlying population is best described by $L \propto P^{-1.39 \pm 0.09} \dot{P}^{0.48 \pm 0.04}$, very similar to that found for $\gamma$-ray pulsars by Perera et al. Using this relationship, we generate a model population and examine the age-luminosity relation for the entire pulsar population, which may be measurable after future large-scale surveys with the Square Kilometre Array.
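
The best-fit luminosity law quoted above, $L \propto P^{-1.39}\dot{P}^{0.48}$, is easy to evaluate directly. The normalisation `L0` and the $\dot{P}$ reference scale below are arbitrary stand-ins for illustration, not fitted values from the paper.

```python
def radio_luminosity(period_s, pdot, L0=1.0):
    """Pseudo-luminosity from spin parameters via L ~ P^-1.39 * Pdot^0.48.

    period_s : spin period in seconds
    pdot     : period derivative (dimensionless, s/s), scaled to 1e-15
    L0       : arbitrary normalisation (units of the returned luminosity)
    """
    return L0 * period_s**-1.39 * (pdot / 1e-15)**0.48

# The law favours fast spin and strong spin-down:
faint = radio_luminosity(2.0, 1e-16)    # slow, weakly spinning-down pulsar
bright = radio_luminosity(0.1, 1e-14)   # fast, strongly spinning-down pulsar
```

In a population synthesis run, a relation of this form assigns each simulated pulsar a luminosity from its spin parameters before survey selection effects are applied.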

### Automated Searches for Variable Stars

With recent developments in imaging and computer technology, the amount of available astronomical data has increased dramatically. Although most of these data sets are not dedicated to the study of variable stars, much of the data can, with the application of proper software tools, be recycled for the discovery of new variable stars. The Fits Viewer and Data Retrieval System (FVDRS) is a new software package that takes advantage of modern computing advances to search astronomical data for new variable stars. More than 200 new variable stars have been found with FVDRS in a data set taken with the Calvin College Rehoboth Robotic telescope. One particularly interesting example is a very fast subdwarf B star with a 95-minute orbital period, the fastest currently known of the HW Vir type.

### The Software Package for Astronomical Reductions with KMOS: SPARK

KMOS is a multi-object near-infrared integral field spectrometer with 24 deployable cryogenic pick-off arms. Inevitably, data processing is a complex task that requires careful calibration and quality control. In this paper we describe all the steps involved in producing science-quality data products from the raw observations. In particular, we focus on the following issues: (i) the calibration scheme which produces maps of the spatial and spectral locations of all illuminated pixels on the detectors; (ii) our concept of minimising the number of interpolations, to the limiting case of a single reconstruction that simultaneously uses raw data from multiple exposures; (iii) a comparison of the various interpolation methods implemented, and an assessment of the performance of true 3D interpolation schemes; (iv) the way in which instrumental flexure is measured and compensated. We finish by presenting some examples of data processed using the pipeline.

### Precision measures of the primordial abundance of deuterium [Replacement]

We report the discovery of deuterium absorption in the very metal-poor ([Fe/H] = -2.88) damped Lyman-alpha system at z_abs = 3.06726 toward the QSO SDSS J1358+6522. On the basis of 13 resolved D I absorption lines and the damping wings of the H I Lyman-alpha transition, we have obtained a new, precise measure of the primordial abundance of deuterium. Furthermore, to bolster the present statistics of precision D/H measures, we have reanalyzed all of the known deuterium absorption-line systems that satisfy a set of strict criteria. We have adopted a blind analysis strategy (to remove human bias), and developed a software package that is specifically designed for precision D/H abundance measurements. For this reanalyzed sample of systems, we obtain a weighted mean of (D/H)_p = (2.53 +/- 0.04) x 10^-5, corresponding to a Universal baryon density 100 Omega_b h^2 = 2.202 +/- 0.046 for the standard model of Big Bang Nucleosynthesis. By combining our measure of (D/H)_p with observations of the cosmic microwave background, we derive the effective number of light fermion species, N_eff = 3.28 +/- 0.28. We therefore rule out the existence of an additional (sterile) neutrino (i.e. N_eff = 4.046) at 99.3 per cent confidence (2.7 sigma), provided that N_eff and the baryon-to-photon ratio (eta_10) did not change between BBN and recombination. We also place a strong bound on the neutrino degeneracy parameter, xi_D = +0.05 +/- 0.13, based only on the CMB+(D/H)_p observations. Combining xi_D with the current best literature measure of Y_p, we find |xi| <= +0.062. In the future, improved measurements of several key reaction rates, in particular d(p,gamma)3He, and further measures of (D/H)_p with a precision comparable to those considered here should allow even more stringent limits to be placed on new physics beyond the standard model.
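
The "weighted mean" quoted above is the standard inverse-variance combination of independent measurements. The sketch below shows that calculation; the sample values are illustrative stand-ins, not the actual measurements from the paper.

```python
import numpy as np

def weighted_mean(values, errors):
    """Inverse-variance weighted mean and its 1-sigma uncertainty."""
    values = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(errors, dtype=float) ** 2  # weight = 1/sigma^2
    mean = np.sum(w * values) / np.sum(w)
    sigma = 1.0 / np.sqrt(np.sum(w))
    return mean, sigma

# Illustrative D/H sample in units of 1e-5 (fabricated for this example):
dh = [2.55, 2.48, 2.58, 2.51]
err = [0.08, 0.10, 0.07, 0.09]
mean, sigma = weighted_mean(dh, err)
```

Precise individual measurements dominate the combination, which is why a handful of systems meeting strict quality criteria can yield a sub-2-per-cent determination of (D/H)_p.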

### Visualizing Astronomical Data with Blender

Astronomical data take on a multitude of forms -- catalogs, data cubes, images, and simulations. The availability of software for rendering high-quality three-dimensional graphics lends itself to the paradigm of exploring the incredible parameter space afforded by the astronomical sciences. The software program Blender gives astronomers a useful tool for displaying data in a manner used by three-dimensional (3D) graphics specialists and animators. The interface to this popular software package is introduced with attention to features of interest in astronomy. An overview of the steps for generating models, textures, animations, camera work, and renders is outlined. An introduction is presented on the methodology for producing animations and graphics with a variety of astronomical data. Examples from sub-fields of astronomy with different kinds of data are shown with resources provided to members of the astronomical community. An example video showcasing the outlined principles and features is provided along with scripts and files for sample visualizations.

### User Guide for the Discrete Dipole Approximation Code DDSCAT 7.3 [Cross-Listing]

DDSCAT 7.3 is an open-source Fortran-90 software package applying the discrete dipole approximation (DDA) to calculate scattering and absorption of electromagnetic waves by targets with arbitrary geometries and complex refractive index. The targets may be isolated entities (e.g., dust particles), but may also be 1-d or 2-d periodic arrays of "target unit cells", allowing calculation of absorption, scattering, and electric fields around arrays of nanostructures. The theory of the DDA and its implementation in DDSCAT are presented in Draine (1988) and Draine & Flatau (1994), its extension to periodic structures in Draine & Flatau (2008), and efficient near-field calculations in Flatau & Draine (2012). DDSCAT 7.3 includes support for MPI, OpenMP, and the Intel Math Kernel Library (MKL). DDSCAT supports calculations for a variety of target geometries. Target materials may be both inhomogeneous and anisotropic. It is straightforward for the user to "import" arbitrary target geometries into the code. DDSCAT automatically calculates total cross sections for absorption and scattering and selected elements of the Mueller scattering intensity matrix for user-specified scattering directions. DDSCAT 7.3 can efficiently calculate E and B throughout a user-specified volume containing the target. This User Guide explains how to use DDSCAT 7.3 to carry out electromagnetic scattering calculations, including use of DDPOSTPROCESS, a Fortran-90 code to perform calculations with E and B at user-selected locations near the target. A number of changes have been made since the last release, DDSCAT 7.2.

### PEACE: Pulsar Evaluation Algorithm for Candidate Extraction -- A software package for post-analysis processing of pulsar survey candidates

Modern radio pulsar surveys produce a large volume of prospective candidates, the majority of which are polluted by human-created radio frequency interference or other forms of noise. Typically, large numbers of candidates need to be visually inspected in order to determine if they are real pulsars. This process can be labor-intensive. In this paper, we introduce an algorithm called PEACE (Pulsar Evaluation Algorithm for Candidate Extraction) which improves the efficiency of identifying pulsar signals. The algorithm ranks the candidates based on a score function. Unlike popular machine-learning-based algorithms, no prior training data sets are required. This algorithm has been applied to data from several large-scale radio pulsar surveys. Using the human-based ranking results generated by students in the Arecibo Remote Command Center programme, the statistical performance of PEACE was evaluated. It was found that PEACE ranked 68% of the student-identified pulsars within the top 0.17% of sorted candidates, 95% within the top 0.34%, and 100% within the top 3.7%. This clearly demonstrates that PEACE significantly increases the pulsar identification rate, by a factor of about 50 to 1000. To date, PEACE has been directly responsible for the discovery of 47 new pulsars, 5 of which are millisecond pulsars that may be useful for pulsar-timing-based gravitational-wave detection projects.
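
The score-and-rank idea can be sketched generically. The features and weights below are illustrative assumptions, not PEACE's published score function; the point is only the workflow: compute a score per candidate, sort, and have humans inspect the top of the list.

```python
def score(candidate):
    """Higher is more pulsar-like: reward S/N and a narrow, persistent pulse.
    Feature names and weights are hypothetical stand-ins."""
    return (candidate["snr"]
            - 5.0 * candidate["pulse_width_frac"]     # narrow profiles score higher
            + 2.0 * candidate["subint_persistence"])  # present in most sub-integrations

candidates = [
    {"id": "cand_a", "snr": 9.0,  "pulse_width_frac": 0.50, "subint_persistence": 0.2},  # RFI-like
    {"id": "cand_b", "snr": 12.0, "pulse_width_frac": 0.05, "subint_persistence": 0.9},  # pulsar-like
]

# Rank all candidates by score; inspection effort then concentrates on the top.
ranked = sorted(candidates, key=score, reverse=True)
```

Because the score is a fixed function of measurable candidate properties, no labelled training set is needed, which is the contrast with machine-learning classifiers drawn in the abstract.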

### Forthcoming mutual events of planets and astrometric radio sources

Radio astronomy observations of close approaches of Solar system planets to compact radio sources, as well as radio source occultations by planets, may be of great interest for planetary science, dynamical astronomy, and testing gravity theories. In this paper, we present extended lists of occultations by planets of astrometric radio sources observed in the framework of various astrometric and geodetic VLBI programs, together with close approaches of planets to radio sources expected in the coming years. The computations are made using the EPOS software package.

### A software package for stellar and solar inverse-Compton emission: Stellarics

We present our software to compute gamma-ray emission from inverse-Compton scattering by cosmic-ray leptons in the heliosphere, as well as in the photospheres of stars. It includes a formulation of modulation in the heliosphere, but it can be used for any user-defined modulation model. Profiles and spectra are output to FITS files in a variety of forms for convenient use. Also included are general-purpose inverse-Compton routines with other features like energy loss rates and emissivity for any user-defined target photon and lepton spectra. The software is publicly available and it is under continuing development.

### The Pan-STARRS Moving Object Processing System

We describe the Pan-STARRS Moving Object Processing System (MOPS), a modern software package that produces automatic asteroid discoveries and identifications from catalogs of transient detections from next-generation astronomical survey telescopes. MOPS achieves > 99.5% efficiency in producing orbits from a synthetic but realistic population of asteroids whose measurements were simulated for a Pan-STARRS4-class telescope. Additionally, using a non-physical grid population, we demonstrate that MOPS can detect populations of currently unknown objects such as interstellar asteroids. MOPS has been adapted successfully to the prototype Pan-STARRS1 telescope despite differences in expected false detection rates, fill-factor loss and relatively sparse observing cadence compared to a hypothetical Pan-STARRS4 telescope and survey. MOPS remains >99.5% efficient at detecting objects on a single night but drops to 80% efficiency at producing orbits for objects detected on multiple nights. This loss is primarily due to configurable MOPS processing limits that are not yet tuned for the Pan-STARRS1 mission. The core MOPS software package is the product of more than 15 person-years of software development and incorporates countless additional years of effort in third-party software to perform lower-level functions such as spatial searching or orbit determination. We describe the high-level design of MOPS and essential subcomponents, the suitability of MOPS for other survey programs, and suggest a road map for future MOPS development.

### GOLIA: an INTEGRAL archive at INAF-IASF Milano

We present the archive of INTEGRAL data developed and maintained at INAF-IASF Milano. The archive comprises all the public data currently available (revolutions 0026-1079, i.e., December 2002 - August 2011). INTEGRAL data are downloaded from the ISDC Data Centre for Astrophysics, Geneva, on a regular basis as they become public, and a customized analysis using the OSA 9.0 software package is routinely performed on the IBIS/ISGRI data. The scientific products include individual pointing images and the associated detected-source lists in the 17-30, 30-50, 17-50 and 50-100 keV energy bands, as well as light curves binned over 100 s in the 17-30 keV band for sources of interest. Dedicated scripts to handle such vast datasets and results have been developed. We make the analysis tools used to build the archive publicly available. The whole database (raw data and products) enables easy access to the hard X-ray long-term behavior of a large sample of sources.

### NIFTY - Numerical Information Field Theory - a versatile Python library for signal inference

NIFTY, "Numerical Information Field Theory", is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTY offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically, without burdening the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTY permits its user to rapidly prototype algorithms in 1D and then apply the developed code in higher-dimensional settings of real-world problems. The set of spaces on which NIFTY operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined.
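The Wiener filter mentioned as NIFTY's demonstration case is easy to sketch in plain NumPy on a 1D regular grid; the point of NIFTY is that the same abstract formulation then runs unchanged on spheres, point sets, or product spaces. The sketch below uses an assumed power spectrum and noise level and does not use NIFTY's own classes:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 256
k = np.fft.fftfreq(n) * n                 # harmonic modes of a 1D grid
P = 1.0 / (1.0 + k**2)**2                 # assumed signal power spectrum P(k)
N = 0.1                                   # assumed white-noise power per mode

# draw signal and noise directly in the harmonic domain
s = np.sqrt(P / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
d = s + np.sqrt(N / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Wiener filter: per-mode posterior mean m_k = P_k / (P_k + N) * d_k
m = P / (P + N) * d
```

The reconstruction `m` has a smaller mean-square error with respect to `s` than the raw data `d`; in NIFTY the same filter is written once in terms of field and operator objects, with the grid-dependent normalizations handled internally.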

### NIFTY - Numerical Information Field Theory - a versatile Python library for signal inference [Replacement]

NIFTY, "Numerical Information Field Theory", is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTY offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically, without burdening the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTY permits its user to rapidly prototype algorithms in 1D and then apply the developed code in higher-dimensional settings of real-world problems. The set of spaces on which NIFTY operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined.

### Inside catalogs: a comparison of source extraction software

The scope of this paper is to compare the catalog-extraction performance of the new combination of SExtractor with PSFEx against the more traditional and widespread combination of DAOPHOT with ALLSTAR; the paper may therefore serve as a guide for selecting the most suitable catalog extraction software. Both software packages were tested on two kinds of simulated images having, respectively, a uniform spatial distribution of sources and an overdensity in the center. In both cases, SExtractor is able to generate a deeper catalog than DAOPHOT. Moreover, the use of neural networks for object classification, plus the novel SPREAD\_MODEL parameter, pushes the possibility of star/galaxy separation down to the limiting magnitude. DAOPHOT and ALLSTAR provide an optimal solution for point-source photometry in stellar fields and very accurate and reliable PSF photometry, with robust star-galaxy separation. However, they are not useful for galaxy characterization and do not generate catalogs that are very complete for faint sources. On the other hand, SExtractor, with its new capability to derive PSF photometry, turns out to be competitive and also returns accurate photometry for galaxies. We can assert that the new version of SExtractor, used in conjunction with PSFEx, represents a very powerful software package for source extraction, with performance comparable to that of DAOPHOT. Finally, by comparing the results obtained for a uniform and for an overdense spatial distribution of stars, we notice, for both software packages, a decline in the quality of the magnitudes and centroids produced in the latter case.

### SEDfit: Software for Spectral Energy Distribution Fitting of Photometric Data

This paper describes SEDfit, the earliest --- but continually upgraded --- software package for spectral energy distribution fitting (SED fitting) of high-redshift photometric data, and the only one to properly treat non-detections. The principles of maximum-likelihood SED fitting are described, including the formulae used for fitting both detected and undetected (upper-limit) photometric data. The internal mechanics of the SEDfit package are presented and several illustrative examples of its use are given. The paper concludes with a discussion of several issues and caveats applicable to SED fitting in general.
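The formulae themselves are given in the paper; the fragment below only sketches the standard maximum-likelihood treatment of mixed photometry: a Gaussian chi-square term for detections, and for each non-detection the Gaussian probability integrated up to the limiting flux (an error-function term). All fluxes and errors in the example are hypothetical.

```python
import math

def neg2lnL(model, flux, err, is_limit):
    """-2 ln(likelihood) of a model SED against photometry.

    Detections contribute ((f - m) / s)^2.  For a band flagged as a
    non-detection, flux[i] is the limiting flux and the band contributes
    -2 ln of the Gaussian integral from -inf to that limit.
    """
    val = 0.0
    for m, f, s, lim in zip(model, flux, err, is_limit):
        if lim:
            p = 0.5 * (1.0 + math.erf((f - m) / (math.sqrt(2.0) * s)))
            val -= 2.0 * math.log(max(p, 1e-300))  # guard against log(0)
        else:
            val += ((f - m) / s) ** 2
    return val

# a model that respects the limit in band 2 beats one that violates it
chi_ok = neg2lnL([10.0, 0.0], [10.0, 2.0], [1.0, 1.0], [False, True])
chi_bad = neg2lnL([10.0, 5.0], [10.0, 2.0], [1.0, 1.0], [False, True])
```

Treating limits this way lets undetected bands actively constrain the fit instead of being discarded, which is the point the abstract emphasizes.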

### ASPECT: A spectra clustering tool for exploration of large spectral surveys

We present the novel, semi-automated clustering tool ASPECT for analysing voluminous archives of spectra. The heart of the program is a neural network in the form of Kohonen's self-organizing map. The resulting map is designed as an icon map suitable for inspection by eye. The visual analysis is supported by the option to blend in individual object properties such as redshift, apparent magnitude, or signal-to-noise ratio. In addition, the package provides several tools for the selection of special spectral types, e.g. local difference maps which reflect the deviations of all spectra from one given input spectrum (real or artificial). ASPECT is able to produce a two-dimensional topological map of a huge number of spectra. The software package enables the user to browse and navigate through a huge data pool, helps them gain insight into underlying relationships between the spectra and other physical properties, and conveys the big picture of the entire data set. We demonstrate the capability of ASPECT by clustering the entire data pool of 0.6 million spectra from the Data Release 4 of the Sloan Digital Sky Survey (SDSS). To illustrate the results regarding quality and completeness, we track objects from existing catalogues of quasars and carbon stars, respectively, and connect the SDSS spectra with morphological information from the GalaxyZoo project.

### Solving inverse problems with the unfolding program TRUEE: Examples in astroparticle physics

The unfolding program TRUEE is a software package for the numerical solution of inverse problems. The algorithm was first applied in the FORTRAN77 program RUN. RUN is an event-based unfolding algorithm which makes use of Tikhonov regularization. It has been tested and compared against different unfolding applications, and stood out with notably stable results and reliable error estimation. TRUEE is a conversion of RUN to C++ that works within the powerful ROOT framework. The program has been extended for greater user-friendliness and delivers unfolding results identical to those of RUN. Besides the simple installation of the software and the generation of graphics, there are new functions which facilitate the user's choice of unfolding parameters and observables. In this paper, we introduce the new unfolding program and present its performance by applying it to two exemplary data sets from astroparticle physics, taken with the MAGIC telescopes and the IceCube neutrino detector, respectively.
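TRUEE itself is event-based and lives inside ROOT, but the core of Tikhonov-regularized unfolding can be sketched in a few lines: minimize |Ax - y|² + τ|Lx|², where A is the detector response matrix, L a curvature (second-difference) operator, and τ the regularization strength. The response matrix and τ below are arbitrary illustrations, not TRUEE's internals.

```python
import numpy as np

def tikhonov_unfold(A, y, tau):
    """Tikhonov-regularized unfolding via the normal equations.

    Minimizes |A x - y|^2 + tau |L x|^2, where L penalizes curvature
    of the unfolded spectrum x; tau controls the smoothing strength.
    """
    n = A.shape[1]
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]   # discrete second derivative
    return np.linalg.solve(A.T @ A + tau * (L.T @ L), A.T @ y)

# trivial check: with an identity response and a curvature-free truth,
# the regularizer costs nothing and the unfolding returns the data
A = np.eye(5)
y = np.arange(5.0)
x = tikhonov_unfold(A, y, 1e-6)
```

Choosing τ (and the observables to unfold) is exactly the step the new TRUEE functions are said to assist with.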

### New Techniques for High-Contrast Imaging with ADI: the ACORNS-ADI SEEDS Data Reduction Pipeline [Replacement]

We describe Algorithms for Calibration, Optimized Registration, and Nulling the Star in Angular Differential Imaging (ACORNS-ADI), a new, parallelized software package to reduce high-contrast imaging data, and its application to data from the SEEDS survey. We implement several new algorithms, including a method to register saturated images, a trimmed mean for combining an image sequence that reduces noise by up to ~20%, and a robust and computationally fast method to compute the sensitivity of a high-contrast observation everywhere on the field-of-view without introducing artificial sources. We also include a description of image processing steps to remove electronic artifacts specific to Hawaii2-RG detectors like the one used for SEEDS, and a detailed analysis of the Locally Optimized Combination of Images (LOCI) algorithm commonly used to reduce high-contrast imaging data. ACORNS-ADI is written in Python. It is efficient and open-source, and includes several optional features which may improve performance on data from other instruments. ACORNS-ADI requires minimal modification to reduce data from instruments other than HiCIAO. It is freely available for download at www.github.com/t-brandt/acorns-adi under a BSD license.

### New Techniques for High-Contrast Imaging with ADI: the ACORNS-ADI SEEDS Data Reduction Pipeline

We describe Algorithms for Calibration, Optimized Registration, and Nulling the Star in Angular Differential Imaging (ACORNS-ADI), a new, parallelized software package to reduce high-contrast imaging data, and its application to data from the SEEDS survey. We implement several new algorithms, including a method to centroid saturated images, a trimmed mean for combining an image sequence that reduces noise by up to ~20%, and a robust and computationally fast method to compute the sensitivity of a high-contrast observation everywhere on the field-of-view without introducing artificial sources. We also include a description of image processing steps to remove electronic artifacts specific to Hawaii2-RG detectors like the one used for SEEDS, and a detailed analysis of the Locally Optimized Combination of Images (LOCI) algorithm commonly used to reduce high-contrast imaging data. ACORNS-ADI is efficient and open-source, and includes several optional features which may improve performance on data from other instruments. ACORNS-ADI is freely available for download at www.github.com/t-brandt/acorns-adi under a BSD license.

### Using radiative transfer models to study the atmospheric water vapor content and to eliminate telluric lines from high-resolution optical spectra

The Radiative Transfer Model (RTM) and the retrieval algorithm incorporated in the SCIATRAN 2.2 software package, developed at the Institute of Remote Sensing / Institute of Environmental Physics of Bremen University (Germany), allow one to simulate, among other things, radiance/irradiance spectra in the 2400-24 000 {\AA} range. In this work we present applications of the RTM to two case studies. In the first case the RTM was used to simulate direct solar irradiance spectra, with different water vapor amounts, for the study of the water vapor content in the atmosphere above Sierra Nevada Observatory. Simulated spectra were compared with those measured with a spectrometer operating in the 8000-10 000 {\AA} range. In the second case the RTM was used to generate telluric model spectra to subtract the atmospheric contribution and correct high-resolution stellar spectra for atmospheric water vapor and oxygen lines. The results of both studies are discussed.

### The Long Wavelength Array Software Library

The Long Wavelength Array Software Library (LSL) is a Python module that provides a collection of utilities to analyze and export data collected at the first station of the Long Wavelength Array, LWA1. Due to the nature of the data format and large-N ($\gtrsim$100 inputs) challenges faced by the LWA, currently available software packages are not suited to process the data. Using tools provided by LSL, observers can read in the raw LWA1 data, synthesize a filter bank, and apply incoherent de-dispersion to the data. The extensible nature of LSL also makes it an ideal tool for building data analysis pipelines and applying the methods to other low frequency arrays.
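Incoherent de-dispersion, one of the steps LSL supports, amounts to shifting each frequency channel of the dynamic spectrum back by its cold-plasma dispersion delay before summing over frequency. The NumPy sketch below illustrates the technique only; it is not LSL's API, and the frequencies, DM, and sampling time are made up.

```python
import numpy as np

KDM = 4.148808e3  # dispersion constant, MHz^2 s per (pc cm^-3)

def dedisperse(dyn, freqs_mhz, dm, tsamp):
    """Incoherent de-dispersion of a dynamic spectrum.

    dyn       : (n_chan, n_samp) array of channelized power vs. time
    freqs_mhz : channel center frequencies in MHz
    dm        : dispersion measure in pc cm^-3
    tsamp     : sampling time in seconds
    Each channel is rolled back by its delay relative to the highest
    frequency, rounded to whole samples (hence "incoherent").
    """
    f_ref = freqs_mhz.max()
    out = np.empty_like(dyn)
    for i, f in enumerate(freqs_mhz):
        delay = KDM * dm * (f**-2 - f_ref**-2)        # seconds
        out[i] = np.roll(dyn[i], -int(round(delay / tsamp)))
    return out
```

After this shift a dispersed pulse lines up across channels, so summing over frequency concentrates it into a single time bin.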

### The 2012 Interferometric Imaging Beauty Contest

We present the results of the fifth Interferometric Imaging Beauty Contest. The contest consists of blind imaging of test data sets derived from model sources and distributed in the OIFITS format. Two scenarios of imaging with CHARA/MIRC-6T were offered for reconstruction: imaging a T Tauri disc and imaging a spotted red supergiant. Eight different teams competed this time: Monnier with the software package MACIM; Hofmann, Schertl and Weigelt with IRS; Thi\'ebaut and Soulez with MiRA; Young with BSMEM; Mary and Vannier with MIROIRS; Millour and Vannier with independent BSMEM and MiRA entries; Rengaswamy with an original method; and Elias with the radio-astronomy package CASA. We describe the contest model images, the data delivered to the contestants, and the rules, as well as the results of the image reconstruction obtained by each method. These results are discussed, along with the strengths and limitations of each algorithm.

### The 2012 Interferometric Imaging Beauty Contest [Replacement]

We present the results of the fifth Interferometric Imaging Beauty Contest. The contest consists of blind imaging of test data sets derived from model sources and distributed in the OIFITS format. Two scenarios of imaging with CHARA/MIRC-6T were offered for reconstruction: imaging a T Tauri disc and imaging a spotted red supergiant. Eight different teams competed this time: Monnier with the software package MACIM; Hofmann, Schertl and Weigelt with IRS; Thi\'ebaut and Soulez with MiRA; Young with BSMEM; Mary and Vannier with MIROIRS; Millour and Vannier with independent BSMEM and MiRA entries; Rengaswamy with an original method; and Elias with the radio-astronomy package CASA. We describe the contest model images, the data delivered to the contestants, and the rules, as well as the results of the image reconstruction obtained by each method. These results are discussed, along with the strengths and limitations of each algorithm.

### A Systematic Review of Strong Gravitational Lens Modeling Software

Despite expanding research activity in gravitational lens modeling, there is no particular software which is considered a standard. Much of the gravitational lens modeling software is written by individual investigators for their own use. Some gravitational lens modeling software is freely available for download but is widely variable with regard to ease of use and quality of documentation. This review of 12 software packages was undertaken to provide a single source of information. Gravitational lens models are classified as parametric models or non-parametric models, and can be further divided into research and educational software. Software used in research includes GravLens / LensModel, Lenstool, Lensperfect, Glafic, PixeLens, SimpLens, Lensview, and GRALE. In this review, GravLensHD, G-Lens, Gravitational Lensing and Lens are categorized as educational programs that are useful for demonstrating various aspects of lensing. Each of the 12 software packages is reviewed with regard to software features (installation, documentation, files provided, etc.) and lensing features (type of model, input data, output data, etc.) as well as a brief review of studies where they have been used. Recent studies have demonstrated the utility of strong gravitational lensing data for mass mapping, and suggest increased use of these techniques in the future. Coupled with the advent of greatly improved imaging, new approaches to modeling of strong gravitational lens systems are needed. This is the first systematic review of strong gravitational lens modeling software, providing investigators with a starting point for future software development to further advance gravitational lens modeling research.

### Using the pulsar timing software package, TEMPO2

This paper contains details on the algorithms implemented in the TEMPO2 pulsar timing software package and describes how the software is used. Information is given on how to download and install the software, use the various interfaces, simulate realistic data sets and develop the software. The use of TEMPO2 in predictive mode is also described.

### Chameleon: A Reconstruction package for a KM3NeT detector

In this note we describe the Chameleon software we developed for event reconstruction in a KM3NeT detector. Development of this software package started as a standalone application before the KM3NeT consortium's endorsement of the SeaTray software framework, but it was adapted to that framework along the way. Chapter 1 outlines the techniques we developed for pattern recognition and track fitting. In Chapter 2, we demonstrate the performance of the Chameleon reconstruction.

### The missing CMB quadrupole in WMAP data

In cosmic microwave background (CMB) experiments, foreground-cleaned temperature maps are still disturbed by dipole contamination caused by uncertainties in the dipole direction and in the microwave radiometer sidelobe. To obtain reliable CMB maps, the dipole contamination has to be carefully removed from the observed data. We built and improved a software package for map-making with dipole-contamination removal and power spectrum estimation from the Wilkinson Microwave Anisotropy Probe (WMAP) data. With this software we obtain a negative result for CMB quadrupole detection with WMAP data: the amplitude of the CMB quadrupole is -3.2\pm3.5 {\mu}K^2 from the seven-year WMAP (WMAP7) data, which is evidently incompatible with the \sim1000 {\mu}K^2 expected from the standard cosmological model LCDM. The complete absence of the CMB quadrupole poses a serious challenge to the standard model and sets a strong constraint on possible cosmological models. Given the importance of this result for understanding the origin and early evolution of our universe, the software codes we used are open for public checking.

### RR Lyrae type stars, ST Boo and RR Leo: 2007 Observations and the preliminary results of the frequency analysis

We present BVR light curves of the pulsating stars ST Boo and RR Leo, obtained between March and September 2007 at the Ankara University Observatory (AUG) and the T\"{U}B{\.I}TAK National Observatory (TUG). Although these observational data are insufficient to obtain reliable results from a frequency analysis of ST Boo and RR Leo, in this study we investigated, as an overview, the pulsation phenomena of these two stars using the Period04 software package. As preliminary results, we present the possible frequencies for ST Boo and RR Leo.

### Reconstruction efficiency and discovery potential of a Mediterranean neutrino telescope: A simulation study using the Hellenic Open University Simulation & Reconstruction (HOURS) package

We report on the evaluation of the performance of a Mediterranean very large volume neutrino telescope. We present results of our studies concerning the capability of the telescope in detecting/discovering galactic (steady point sources) and extragalactic, transient (Gamma Ray Bursts) high energy neutrino sources as well as measuring ultra high energy diffuse neutrino fluxes. The neutrino effective area and angular resolution are presented as a function of the neutrino energy, and the background event rate (atmospheric neutrinos and muons) is estimated. The discovery potential of the neutrino telescope is evaluated and the experimental time required for a significant discovery of potential neutrino emitters (known from their gamma ray emission, assumedly produced by hadronic interactions) is estimated. For the simulation we use the HOU Reconstruction & Simulation (HOURS) software package.

### Reconstruction efficiency and discovery potential of a Mediterranean neutrino telescope: A simulation study using the Hellenic Open University Reconstruction & Simulation (HOURS) package [Replacement]

We report on the evaluation of the performance of a Mediterranean very large volume neutrino telescope. We present results of our studies concerning the capability of the telescope in detecting/discovering galactic (steady point sources) and extragalactic, transient (Gamma Ray Bursts) high energy neutrino sources as well as measuring ultra high energy diffuse neutrino fluxes. The neutrino effective area and angular resolution are presented as a function of the neutrino energy, and the background event rate (atmospheric neutrinos and muons) is estimated. The discovery potential of the neutrino telescope is evaluated and the experimental time required for a significant discovery of potential neutrino emitters (known from their gamma ray emission, assumedly produced by hadronic interactions) is estimated. For the simulation we use the HOU Reconstruction & Simulation (HOURS) software package.

### Integrated spectra extraction based on signal-to-noise optimization using Integral Field Spectroscopy

We propose and explore the potential of a method to extract high signal-to-noise (S/N) integrated spectra related to physical and/or morphological regions on a two-dimensional field from Integral Field Spectroscopy (IFS) observations, employing an optimization procedure based on either continuum (stellar) or line (nebular) emission features. The optimization method is applied to a set of IFS VLT-VIMOS observations of (U)LIRG galaxies; we describe the advantages of the optimization by comparing the results with a fixed-aperture, single-spectrum case and by implementing some statistical tests. We demonstrate that the S/N of the IFS optimized integrated spectra is significantly enhanced when compared with the single-aperture, unprocessed case. We provide an iterative, user-friendly, and versatile IDL algorithm that also allows the user to spatially integrate spectra following more standard procedures. This is made available to the community as part of the PINGSoft IFS software package.
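The core of such an S/N optimization can be illustrated with a simple greedy rule: rank spaxels by their individual S/N and keep co-adding them (noise added in quadrature) until the S/N of the summed spectrum stops improving. This is a sketch of the general idea in Python rather than the package's IDL implementation, and the numbers in the example are invented.

```python
import numpy as np

def optimize_coadd(signal, noise):
    """Greedy S/N optimization over spaxels.

    signal, noise : per-spaxel signal and 1-sigma noise (same shape).
    Sort spaxels by individual S/N, form the cumulative S/N of the
    co-added spectrum, and return the indices of the spaxel subset
    that maximizes it.
    """
    order = np.argsort(signal / noise)[::-1]       # best spaxels first
    s = np.cumsum(signal[order])                   # summed signal
    n = np.sqrt(np.cumsum(noise[order] ** 2))      # noise in quadrature
    k = int(np.argmax(s / n))                      # best cut point
    return order[: k + 1]

# two bright spaxels and one faint one: adding the faint spaxel
# would add noise faster than signal, so it is excluded
signal = np.array([10.0, 10.0, 0.1])
noise = np.ones(3)
selected = optimize_coadd(signal, noise)
```

In practice the ranking would be computed on a continuum or emission-line feature, matching the two optimization modes the abstract describes.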

### Faraday synthesis: The synergy of aperture and rotation measure synthesis

We introduce a new technique for imaging the polarized radio sky using interferometric data. The new approach, which we call Faraday synthesis, combines aperture and rotation measure synthesis imaging and deconvolution into a single algorithm. This has several inherent advantages over the traditional two-step technique, including improved sky-plane resolution, fidelity, and dynamic range. In addition, the direct visibility- to Faraday-space imaging approach is a sounder foundation on which to build more sophisticated deconvolution or inference algorithms. For testing purposes, we have implemented a basic Faraday synthesis imaging software package including a three-dimensional CLEAN deconvolution algorithm. We compare the results of this new technique to those of the traditional approach using mock data. We find many artifacts in the images made using the traditional approach that are not present in the Faraday synthesis results. In all, we achieve a higher spatial resolution, an improvement in dynamic range of about 20%, and a more accurate reconstruction of low signal-to-noise source fluxes when using the Faraday synthesis technique.
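The rotation-measure-synthesis half of the combined algorithm is a Fourier-like transform from λ²-space to Faraday depth φ. A minimal sketch of that transform alone is shown below; the λ² coverage and source parameters are invented, and the actual package additionally performs the aperture-synthesis imaging and 3D CLEAN described in the abstract.

```python
import numpy as np

def rm_synthesis(P, lam2, phis):
    """Transform complex polarization P(lambda^2) into the Faraday
    dispersion function F(phi), derotating about the mean lambda^2
    to stabilize the phase of the result."""
    lam2_0 = lam2.mean()
    return np.array([np.mean(P * np.exp(-2j * phi * (lam2 - lam2_0)))
                     for phi in phis])

# a Faraday-thin source at depth phi0 = 20 rad m^-2
lam2 = np.linspace(0.01, 0.10, 64)      # observed lambda^2 coverage, m^2
phi0 = 20.0
P = np.exp(2j * phi0 * lam2)            # its complex polarization vs lambda^2
phis = np.arange(-100.0, 101.0)         # Faraday-depth search grid
F = rm_synthesis(P, lam2, phis)
```

|F(φ)| peaks at the source's Faraday depth; in Faraday synthesis this transform is folded together with the visibility-to-sky imaging step rather than applied to already-deconvolved channel maps.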