Published online by Cambridge University Press: 12 April 2016
The origin and evolution of neutron star magnetic fields have been hotly debated for a long time. Spontaneous field decay was originally proposed, with timescales of (5–10) × 10⁶ years, while an alternative model associates field decay with mass accretion during the evolution of binary systems (see Bhattacharya & van den Heuvel 1991 for a review). The aim of this paper is to examine, through quantitative calculations, whether accretion-induced field decay can reproduce the observed properties of the wide binary radio pulsars.
In a binary system consisting of a neutron star and a low-mass giant companion, if the initial orbital period is longer than 1 day, mass transfer, taking the form of Roche-lobe overflow, is driven by the nuclear evolution of the giant through radius expansion (Webbink et al. 1983). We assume that the mass accretion rate Ṁ of the neutron star is limited to the Eddington accretion rate Ṁ_E ≃ 10⁻⁸ M_⊙ yr⁻¹. If the mass transfer rate exceeds Ṁ_E, the excess matter is blown from the system in the form of jets or beams.
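The Eddington limit quoted above can be recovered from standard constants. The following sketch (not from the paper; the 1.4 M_⊙ mass and 10 km radius are conventional assumed values) computes the Eddington luminosity for electron scattering, converts it to the corresponding accretion rate, and applies the cap described in the text:

```python
import math

# Physical constants (cgs units)
G = 6.674e-8          # gravitational constant, cm^3 g^-1 s^-2
C = 2.998e10          # speed of light, cm s^-1
M_P = 1.673e-24       # proton mass, g
SIGMA_T = 6.652e-25   # Thomson cross-section, cm^2
M_SUN = 1.989e33      # solar mass, g
YR = 3.156e7          # year, s

def eddington_mdot(mass_g, radius_cm):
    """Eddington-limited accretion rate in g/s.

    L_Edd = 4*pi*G*M*m_p*c / sigma_T (electron-scattering opacity),
    and the accretion luminosity G*M*Mdot/R equals L_Edd at the limit,
    so Mdot_E = L_Edd * R / (G*M).
    """
    l_edd = 4.0 * math.pi * G * mass_g * M_P * C / SIGMA_T
    return l_edd * radius_cm / (G * mass_g)

def accreted_rate(mdot_transfer, mdot_edd):
    """Matter actually accreted; any excess is assumed ejected."""
    return min(mdot_transfer, mdot_edd)

# Assumed canonical neutron star: 1.4 M_sun, 10 km radius
mdot_e = eddington_mdot(1.4 * M_SUN, 1.0e6)
mdot_e_msun_yr = mdot_e * YR / M_SUN
print(f"Mdot_E ~ {mdot_e_msun_yr:.1e} M_sun/yr")  # of order 1e-8, as in the text
```

With these canonical parameters the result is ≃ 1.5 × 10⁻⁸ M_⊙ yr⁻¹, consistent with the ≃ 10⁻⁸ M_⊙ yr⁻¹ adopted in the text; the `accreted_rate` cap expresses the assumption that super-Eddington transfer is ejected rather than accreted.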