Theoretical models of protoplanetary disks that include stellar irradiation often show a spontaneous amplification of scale height perturbations, produced by the enhanced absorption of starlight in puffed-up regions. In turn, such regions cast shadows on adjacent zones, which consequently cool down and shrink, eventually leading to an alternating pattern of overheated and shadowed regions. Previous investigations have proposed this to be a real, self-sustained process, the so-called self-shadowing or thermal wave instability, which could naturally form frequently observed disk structures such as rings and gaps, and could even enhance the formation of planetesimals. All of these studies, however, have assumed, in one way or another, vertical hydrostatic equilibrium and instantaneous radiative diffusion throughout the disk. In this work we present the first study of the stability of accretion disks to self-shadowing that relaxes these assumptions, relying instead on radiation hydrodynamical simulations. Our results suggest that radiative cooling and gas advection at the disk surface prevent a self-shadowing instability from forming, by damping temperature perturbations before they reach the lower, optically thick regions of the disk.