Lower mass normalization of the stellar initial mass function for dense massive early-type galaxies at z ~ 1.4
This paper aims to understand whether the normalization of the stellar initial mass function (IMF) of massive early-type galaxies (ETGs) varies with cosmic time and/or with mean stellar mass density Sigma = M*/(2 pi Re^2). For this purpose we have collected a sample of 18 dense (Sigma > 2500 M_sun/pc^2) ETGs at 1.2 < z < 1.6 with available velocity dispersion sigma_e. We have constrained their mass normalization by comparing their true stellar masses (M_true), derived through the virial theorem and hence IMF independent, with those inferred from fits to the photometry assuming a reference IMF (M_ref). By adopting the virial estimator as a proxy for the true stellar mass, we have assumed zero dark matter (DM) for these ETGs; this assumption is supported by dynamical models and numerical simulations of galaxy evolution, which show that the DM fraction within Re in dense high-z ETGs is negligible. We have also considered the possible bias of the virial theorem in recovering total masses, and have shown that for dense ETGs the virial masses agree with those derived through more sophisticated dynamical models. The variation of the parameter Gamma = M_true/M_ref with sigma_e shows that, on average, dense ETGs at <z> = 1.4 follow the same IMF-sigma_e trend as typical local ETGs, but with a lower mass normalization. Nonetheless, once the IMF-sigma_e trend we have found for high-z dense ETGs is compared with that of local ETGs with similar Sigma and sigma_e, the two turn out to be consistent. The similarity between the IMF-sigma_e trends of dense high-z and low-z ETGs over 9 Gyr of evolution, together with their lower mass normalization with respect to the mean value of local ETGs, suggests that, independently of formation redshift, the physical conditions characterizing the formation of a dense spheroid lead to a mass spectrum of newly formed stars with a higher ratio of high- to low-mass stars than the IMF of normal local ETGs.
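As an illustrative sketch only (not the paper's actual pipeline), the quantities defined above can be computed as follows. The virial coefficient beta = 5 is a commonly used calibration assumed here for illustration; the paper may adopt a different (e.g. structure-dependent) coefficient, and all numerical inputs below are invented examples, not data from the sample.

```python
# Illustrative sketch of the abstract's quantities: the virial mass
# estimate (proxy for M_true), the mean stellar mass density Sigma,
# and the IMF mass-normalization parameter Gamma = M_true / M_ref.
# All numbers are hypothetical examples, not values from the paper.
import math

G = 4.301e-3  # gravitational constant in pc * (km/s)^2 / M_sun


def virial_mass(sigma_e_kms, Re_pc, beta=5.0):
    """Virial mass M_vir = beta * Re * sigma_e^2 / G.

    beta ~ 5 is a common calibration (an assumption here); the paper
    may use a different or structure-dependent coefficient.
    """
    return beta * Re_pc * sigma_e_kms**2 / G


def mean_stellar_mass_density(M_star, Re_pc):
    """Sigma = M* / (2 pi Re^2), in M_sun / pc^2."""
    return M_star / (2.0 * math.pi * Re_pc**2)


# Hypothetical example galaxy: sigma_e = 250 km/s, Re = 1 kpc = 1000 pc.
M_true = virial_mass(250.0, 1000.0)          # IMF-independent mass proxy
M_ref = 2.0e11                               # hypothetical SED-fit mass
Gamma = M_true / M_ref                       # mass-normalization parameter
Sigma = mean_stellar_mass_density(M_true, 1000.0)

print(f"M_true = {M_true:.2e} M_sun")
print(f"Gamma  = {Gamma:.2f}")
print(f"Sigma  = {Sigma:.0f} M_sun/pc^2 (dense if > 2500)")
```

Gamma < 1 in such a comparison would indicate a lower mass normalization than the reference IMF, which is the sense of the result reported for the high-z dense sample.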