Article

Magnetic field evolution in neutron star crusts due to the Hall effect and ohmic decay

Journal

ASTROPHYSICAL JOURNAL
Volume 609, Issue 2, Pages 999-1017

Publisher

UNIV CHICAGO PRESS
DOI: 10.1086/421324

Keywords

stars: magnetic fields; stars: neutron

Abstract

We present calculations of magnetic field evolution by the Hall effect and ohmic decay in the crust of neutron stars (NSs). In accreting NSs, ohmic decay is always the dominant effect because of the large resistivity. In isolated NSs with relatively pure crusts, the Hall effect dominates ohmic decay after a time t_switch ≃ 10^4 yr B_12^-3, where B_12 is the magnetic field strength in units of 10^12 G. We compute the evolution of an initial field distribution by ohmic decay and give approximate analytic formulae for both the surface and interior fields as a function of time. Because of the strong dependence of t_switch on B_12, early ohmic decay can alter the currents down to the base of the crust for B ~ 10^11 G, neutron drip for B ~ 10^12 G, and near the top of the crust for B ≳ 10^13 G. We then discuss magnetic field evolution by the Hall effect. Several examples are given to illustrate how an initial field configuration evolves. Hall-wave eigenfunctions are computed, including the effect of the large density change across the crust. We estimate the response of the crust to the magnetic stresses induced by Hall waves and give a detailed discussion of the boundary conditions at the solid-liquid interface. Finally, we discuss the implications for the Hall cascade proposed by Goldreich & Reisenegger.
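The switch timescale quoted in the abstract is a simple power law, so a short sketch makes the field dependence concrete. This is not the authors' code; it only evaluates t_switch ≃ 10^4 yr B_12^-3 from the abstract at the three field strengths the abstract mentions (10^11, 10^12, and 10^13 G). The function name and script structure are illustrative.

```python
# Minimal sketch: evaluate the abstract's scaling relation
#   t_switch ~ 10^4 yr * B_12^-3,
# where B_12 is the crustal field in units of 10^12 G.
# The coefficient and exponent come from the abstract; everything else
# (names, structure) is an illustrative assumption.

def t_switch_yr(B_gauss: float) -> float:
    """Approximate time (in years) after which the Hall effect dominates ohmic decay."""
    B12 = B_gauss / 1e12        # field in units of 10^12 G
    return 1e4 * B12 ** -3      # t_switch ~ 10^4 yr B_12^-3

if __name__ == "__main__":
    for B in (1e11, 1e12, 1e13):  # field strengths quoted in the abstract
        print(f"B = {B:.0e} G  ->  t_switch ~ {t_switch_yr(B):.1e} yr")
```

Running this gives roughly 10^7 yr at 10^11 G, 10^4 yr at 10^12 G, and 10 yr at 10^13 G, which illustrates the abstract's point: weaker fields spend much longer in the ohmic-decay-dominated phase, so ohmic decay can modify currents deeper in the crust before the Hall effect takes over.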
