Quantitative interpretation of magnetic data has played an important role in mineral exploration in recent years. The most commonly used method is three-dimensional inversion to recover a distribution of magnetic susceptibility. Traditional inversion algorithms require knowledge of the magnetization direction, which means one must assume there is no remanent magnetization and no self-demagnetization; consequently, the magnetization direction is taken to be the same as the current geomagnetic field direction. However, strong remanent magnetization is common, and under complicated geological conditions the total magnetization direction can differ significantly from the geomagnetic field direction. In such cases traditional inversion algorithms become ineffective and may yield erroneous results. In addition, the computer memory required to store the sensitivity matrix and the computational cost both grow rapidly with the size of the inversion problem, so traditional algorithms cannot be applied to large-scale data sets.

To address these two problems, I first invert the magnetic anomaly modulus and the normalized source strength (NSS), two transforms calculated from the total-field anomaly that have low sensitivity to the source magnetization direction and high centricity, using both a smooth inversion method and a data-space inversion method to reduce the effects of remanent magnetization. I then combine an adaptive sampling method with wavelet transforms to reduce the memory required to store the sensitivity matrix and the time required for matrix-vector multiplication in large-scale inversion.

Building on results of other researchers, I propose a data-space inversion method for the magnetic anomaly modulus, which produces a more focused solution and reduces computational time compared with a standard model-space least-squares inversion.
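To illustrate why the NSS is attractive here: it is commonly computed from the eigenvalues of the magnetic gradient tensor, and its value depends only weakly on the magnetization direction. The sketch below is my own illustration under that standard definition, not the thesis code; the function name and the example tensor are assumptions.

```python
import numpy as np

def nss_from_tensor(G):
    """Normalized source strength from a 3x3 symmetric magnetic
    gradient tensor G, using the eigenvalue form
        NSS = sqrt(-l2**2 - l1*l3),  with l1 >= l2 >= l3."""
    lam = np.sort(np.linalg.eigvalsh(G))[::-1]  # eigenvalues, descending
    mu = -lam[1] ** 2 - lam[0] * lam[2]
    return np.sqrt(max(mu, 0.0))                # clamp tiny negative round-off

# Example: a symmetric, traceless tensor, as a potential-field gradient
# tensor must be (values in arbitrary units; purely illustrative).
G = np.array([[2.0, 0.5, 0.1],
              [0.5, -1.0, 0.3],
              [0.1, 0.3, -1.0]])
print(nss_from_tensor(G))
```

Because the eigenvalues are invariant under rotation of the source magnetization, inverting NSS data sidesteps the unknown magnetization direction that defeats susceptibility inversion in the presence of remanence.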
In addition, combining the adaptive sampling method with wavelet transforms to remove the redundant information hidden in the original data and the dense sensitivity matrix reduces the storage requirement and likewise increases solution speed by up to three orders of magnitude. As a result, large-scale inversion problems, whether they involve remanence or not, can now be solved quickly and effectively on a mainstream computer.
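The wavelet-compression idea can be sketched in a few lines. This is a minimal one-level Haar illustration of my own, not the thesis implementation: each row of a smooth, dense kernel is transformed, small coefficients are dropped, and the matrix-vector product is carried out in the wavelet domain. The kernel, tolerance, and function names are all assumptions for the sketch.

```python
import numpy as np

def haar_rows(A):
    """One-level orthonormal Haar transform of each row of A
    (row length must be even): [approximation | detail]."""
    a = (A[:, 0::2] + A[:, 1::2]) / np.sqrt(2.0)  # smooth averages
    d = (A[:, 0::2] - A[:, 1::2]) / np.sqrt(2.0)  # local differences
    return np.hstack([a, d])

# Synthetic smooth kernel mimicking a potential-field sensitivity matrix:
# entries decay with distance from the diagonal.
n = 64
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
A = 1.0 / (1.0 + np.abs(i - j)) ** 1.5

# Transform rows, then zero out coefficients below a relative threshold.
W = haar_rows(A)                       # W = A @ H.T with H orthogonal
tol = 0.01 * np.abs(W).max()
Wc = np.where(np.abs(W) < tol, 0.0, W)  # sparsified operator

# Matrix-vector product in the wavelet domain: A @ x == W @ (H @ x),
# so the sparse Wc gives a cheap approximation to A @ x.
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
exact = A @ x
approx = Wc @ haar_rows(x[None, :]).ravel()

sparsity = 1.0 - np.count_nonzero(Wc) / Wc.size
rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
print(f"dropped {sparsity:.0%} of coefficients, relative error {rel_err:.1e}")
```

In practice multiple wavelet levels and a sparse matrix format are used, so the dropped fraction, and hence the storage and matvec savings, is far larger than in this toy example; that is the mechanism behind the order-of-magnitude speedups claimed above.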