We show that the U-Net neural network architecture provides an efficient and effective way of locating sources in SKA Data Challenge datasets. The improved performance relative to PyBDSF is quantified and U-Net is proposed as an efficient source finder for real radio surveys.
In this paper, we present a convolutional neural network (CNN)-based architecture trained on two datasets: one of meteorites and terrestrial rocks, and another of meteors and other light sources. For meteorites, the dataset comprises augmented images from the meteorite collection at the Sharjah Academy for Astronomy, Space Sciences, and Technology (SAASST). For meteors, the images are taken from the United Arab Emirates (UAE) Meteor Monitoring Network (MMN). The significance of this project is to extend machine learning applications in astronomy to the solar system’s small bodies as they encounter the Earth’s atmosphere. It also serves as a deep learning study, examining a computer’s ability to match a human’s skill in distinguishing meteorites from rocks, and meteors from airplanes and other noise sources. When testing the CNN models, both the meteorite and meteor models reached accuracies above 80%.
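The augmentation step mentioned above (expanding a small meteorite image collection before CNN training) typically amounts to flips and rotations. A minimal sketch, assuming images are represented as 2-D lists of pixel values; the actual SAASST pipeline is not specified in the abstract:

```python
def hflip(img):
    """Mirror an image (2-D list of pixel rows) left-to-right."""
    return [row[::-1] for row in img]

def rot90(img):
    """Rotate an image 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def augment(img):
    """Return the original image plus flipped and rotated variants,
    a simple way to multiply a small training set."""
    variants = [img, hflip(img)]
    rotated = img
    for _ in range(3):
        rotated = rot90(rotated)
        variants.append(rotated)
    return variants

sample = [[1, 2],
          [3, 4]]
for variant in augment(sample):
    print(variant)
```

Real pipelines usually add brightness jitter and random crops as well, but the flip/rotate core is the same.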
The Gaia mission DR3 provides accurate data for around two billion stars in the Galaxy, including a classification of objects into astronomical classes. In this work we present a web visualization tool to analyze one of the products published in DR3, the Outlier Analysis Self-Organizing Map.
In a recent search (Kim et al. 2022), we looked for microlensing signatures in gravitational waves from spectrograms of the binary black hole events in the first and second gravitational-wave transient catalogs. For the search, we implemented a deep learning-based method (Kim et al. 2021) and found that one event out of forty-six, GW190707_093326, is classified into the lensed class. However, upon estimating the p-value of this event, we observed that the uncertainty of the p-value still includes the possibility of the event being unlensed. Therefore, we concluded that no significant evidence of beating patterns has been found among the evaluated binary black hole events. As a follow-up study, we discuss the distinguishability between microlensed GWs and signals from precessing black hole binaries.
As the scale of cosmological surveys increases, so does the complexity in the analyses. This complexity can often make it difficult to derive the underlying principles, necessitating statistically rigorous testing to ensure the results of an analysis are consistent and reasonable. This is particularly important in multi-probe cosmological analyses like those used in the Dark Energy Survey (DES) and the upcoming Legacy Survey of Space and Time, where accurate uncertainties are vital. In this paper, we present a statistically rigorous method to test the consistency of contours produced in these analyses and apply this method to the Pippin cosmological pipeline used for type Ia supernova cosmology with the DES. We make use of the Neyman construction, a frequentist methodology that leverages extensive simulations to calculate confidence intervals, to perform this consistency check. A true Neyman construction is too computationally expensive for supernova cosmology, so we develop a method for approximating a Neyman construction with far fewer simulations. We find that for a simulated dataset, the 68% contour reported by the Pippin pipeline and the 68% confidence region produced by our approximate Neyman construction differ by less than a percent near the input cosmology; however, they show more significant differences far from the input cosmology, with a maximal difference of 0.05 in $\Omega_{M}$ and 0.07 in w. This divergence is most impactful for analyses of cosmological tensions, but its impact is mitigated when combining supernovae with other cross-cutting cosmological probes, such as the cosmic microwave background.
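The Neyman construction described above can be illustrated on a toy problem. This is a minimal sketch assuming a single Gaussian-mean parameter rather than the full ($\Omega_{M}$, w) supernova analysis: for each trial value of the true parameter we simulate many estimates, record the central 68% interval of the estimator, and then invert the belt so the confidence set is every trial value whose interval contains the observed estimate.

```python
import random
import statistics

def estimator_interval(mu_true, n_obs=20, n_sims=2000, cl=0.68):
    """Central `cl` interval of the sample-mean estimator when the
    true parameter is `mu_true` (unit-variance Gaussian data)."""
    rng = random.Random(0)  # fixed seed for a reproducible belt
    estimates = sorted(
        statistics.fmean(rng.gauss(mu_true, 1.0) for _ in range(n_obs))
        for _ in range(n_sims)
    )
    lo = estimates[int(n_sims * (1 - cl) / 2)]
    hi = estimates[int(n_sims * (1 + cl) / 2) - 1]
    return lo, hi

def neyman_confidence_set(observed, grid):
    """Invert the confidence belt: keep every trial parameter value
    whose estimator interval contains the observed estimate."""
    accepted = []
    for mu in grid:
        lo, hi = estimator_interval(mu)
        if lo <= observed <= hi:
            accepted.append(mu)
    return accepted

grid = [i / 10 for i in range(-20, 21)]      # trial parameter values
accepted = neyman_confidence_set(0.0, grid)  # observed estimate of 0.0
print(min(accepted), max(accepted))
```

The approximation developed in the paper replaces the brute-force simulation loop with far fewer simulations; this sketch shows only the exact (expensive) construction being approximated.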
With the advent of deep, all-sky radio surveys, the need for ancillary data to make the most of the new, high-quality radio data from surveys like the Evolutionary Map of the Universe (EMU), GaLactic and Extragalactic All-sky Murchison Widefield Array survey eXtended, Very Large Array Sky Survey, and LOFAR Two-metre Sky Survey is growing rapidly. Radio surveys produce significant numbers of Active Galactic Nuclei (AGNs) and have a significantly higher average redshift when compared with optical and infrared all-sky surveys. Thus, traditional methods of estimating redshift are challenged, with spectroscopic surveys not reaching the redshift depth of radio surveys, and AGNs making it difficult for template fitting methods to accurately model the source. Machine Learning (ML) methods have been used, but efforts have typically been directed towards optically selected samples, or samples at significantly lower redshift than expected from upcoming radio surveys. This work compiles and homogenises a radio-selected dataset from both the northern hemisphere (making use of Sloan Digital Sky Survey optical photometry) and southern hemisphere (making use of Dark Energy Survey optical photometry). We then test commonly used ML algorithms such as k-Nearest Neighbours (kNN), Random Forest, ANNz, and GPz on this monolithic radio-selected sample. We show that kNN has the lowest percentage of catastrophic outliers, providing the best match for the majority of science cases in the EMU survey. We note that the wider redshift range of the combined dataset allows for the estimation of redshifts for sources up to $z = 3$ before random scatter begins to dominate. When binning the data into redshift bins and treating the problem as a classification problem, we are able to correctly identify $\approx$76% of the highest redshift sources—sources at redshift $z > 2.51$—as being in either the highest bin ($z > 2.51$) or second highest ($z = 2.25$).
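The kNN regression tested above has a very simple core: a source's photometric redshift is estimated as the mean spectroscopic redshift of its k nearest neighbours in photometry space. A minimal sketch using hypothetical toy magnitudes rather than the real SDSS/DES photometry:

```python
import math

def knn_photoz(train_mags, train_z, query_mags, k=3):
    """Estimate a redshift as the mean redshift of the k training
    sources nearest to `query_mags` in Euclidean magnitude space."""
    dists = sorted(
        (math.dist(query_mags, mags), z)
        for mags, z in zip(train_mags, train_z)
    )
    nearest = [z for _, z in dists[:k]]
    return sum(nearest) / k

# Hypothetical toy data: two magnitudes per source, known redshifts.
train_mags = [(18.0, 17.5), (19.0, 18.4), (21.0, 20.1), (22.5, 21.8)]
train_z    = [0.1, 0.2, 0.8, 1.5]
print(knn_photoz(train_mags, train_z, (18.5, 18.0), k=2))
```

Production runs would use a spatial index (e.g. a k-d tree) instead of the brute-force sort, and many more photometric bands, but the estimator itself is unchanged.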
We present the third data release from the Parkes Pulsar Timing Array (PPTA) project. The release contains observations of 32 pulsars obtained using the 64-m Parkes ‘Murriyang’ radio telescope. The data span is up to 18 yr with a typical cadence of 3 weeks. This data release is formed by combining an updated version of our second data release with $\sim$3 yr of more recent data primarily obtained using an ultra-wide-bandwidth receiver system that operates between 704 and 4032 MHz. We provide calibrated pulse profiles, flux density dynamic spectra, pulse times of arrival, and initial pulsar timing models. We describe methods for processing such wide-bandwidth observations and compare this data release with our previous release.
The putative host galaxy of FRB 20171020A was first identified as ESO 601-G036 in 2018, but as no repeat bursts have been detected, direct confirmation of the host remains elusive. In light of recent developments in the field, we re-examine this host and determine a new association confidence level of 98%. At 37 Mpc, this makes ESO 601-G036 the third closest FRB host galaxy to be identified to date and the closest to host an apparently non-repeating FRB (with an estimated repetition rate limit of $<$$0.011$ bursts per day above $10^{39}$ erg). Due to its close distance, we are able to perform detailed multi-wavelength analysis on the ESO 601-G036 system. Follow-up observations confirm ESO 601-G036 to be a typical star-forming galaxy with H i and stellar masses of $\log_{10}\!(M_{\rm{H\,{\small I}}} / M_\odot) \sim 9.2$ and $\log_{10}\!(M_\star / M_\odot) = 8.64^{+0.03}_{-0.15}$, and a star formation rate of $\text{SFR} = 0.09 \pm 0.01\,{\rm M}_\odot\,\text{yr}^{-1}$. We detect, for the first time, a diffuse gaseous tail ($\log_{10}\!(M_{\rm{H\,{\small I}}} / M_\odot) \sim 8.3$) extending to the south-west that suggests recent interactions, likely with the confirmed nearby companion ESO 601-G037. ESO 601-G037 is a stellar shred located to the south of ESO 601-G036 that has an arc-like morphology, is about an order of magnitude less massive, and has a lower gas metallicity that is indicative of a younger stellar population. The properties of the ESO 601-G036 system indicate an ongoing minor merger event, which is affecting the overall gaseous component of the system and the stars within ESO 601-G037. Such activity is consistent with current FRB progenitor models involving magnetars and the signs of recent interactions in other nearby FRB host galaxies.
Next-generation astronomical surveys naturally pose challenges for human-centred visualisation and analysis workflows that currently rely on the use of standard desktop display environments. While a significant fraction of the data preparation and analysis will be taken care of by automated pipelines, crucial steps of knowledge discovery can still only be achieved through various levels of human interpretation. As the number of sources in a survey grows, there is a need to both modify and simplify repetitive visualisation processes that must be completed for each source. As tasks such as per-source quality control, candidate rejection, and morphological classification all share a single instruction, multiple data (SIMD) work pattern, they are amenable to a parallel solution. Selecting extragalactic neutral hydrogen (Hi) surveys as a representative example, we use system performance benchmarking and the visual data and reasoning methodology from the field of information visualisation to evaluate a bespoke comparative visualisation environment: the encube visual analytics framework deployed on the 83 Megapixel Swinburne Discovery Wall. Through benchmarking using spectral cube data from existing Hi surveys, we are able to perform interactive comparative visualisation via texture-based volume rendering of 180 three-dimensional (3D) data cubes at a time. The time to load a configuration of spectral cubes scales linearly with the number of voxels, with independent samples of 180 cubes (8.4 Gigavoxels or 34 Gigabytes) each loading in under 5 min. We show that parallel comparative inspection is a productive and time-saving technique which can reduce the time taken to complete SIMD-style visual tasks currently performed at the desktop by at least two orders of magnitude, potentially rendering some labour-intensive desktop-based workflows obsolete.
Having developed the necessary mathematics in chapters 4 to 6, chapter 7 returns to physics. Evidence for homogeneity and isotropy of the Universe at the largest cosmological scales is presented and Robertson-Walker metrics are introduced. Einstein’s equations are then used to derive the Friedmann equations, relating the cosmic scale factor to the pressure and density of matter in the Universe. The Hubble constant is discussed and an analytic form of the red-shift distance relation is derived, in terms of the matter density, the cosmological constant and the spatial curvature, and observational values of these three parameters are given. Some analytic solutions of the Friedmann equation are presented. The cosmic microwave background dominates the energy density in the early Universe and this leads to a description of the thermal history of the early Universe: the transition from radiation dominated to matter dominated dynamics and nucleosynthesis in the first 3 minutes. Finally the horizon problem and the inflationary Universe are described and the limits of applicability of Einstein’s equations, when they might be expected to break down due to quantum effects, are discussed.
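For reference, the Friedmann equations summarised above relate the scale factor $a(t)$ to the density $\rho$, pressure $p$, spatial curvature $k$, and cosmological constant $\Lambda$; this is the standard textbook form, not a quotation from the chapter itself:

```latex
\left(\frac{\dot a}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho - \frac{k c^2}{a^2} + \frac{\Lambda c^2}{3},
\qquad
\frac{\ddot a}{a}
  = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3}.
```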
Geodesics are introduced and the geodesic equation is analysed for the geometries introduced in chapter 2, using variational principles from classical mechanics. Geodesic motion on a sphere is described, as well as the Coriolis effect and the Sagnac effect. Newtonian gravity is derived as the non-relativistic limit of geodesic motion in space-time. Geodesics in an expanding universe and the heat death of the Universe are described. Geodesics in Schwarzschild space-time are treated in detail: the precession of the perihelion of Mercury; the bending of light by the Sun; Shapiro time delay; black holes and the event horizon. Gravitational waves and gravitational lensing are also covered.
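The geodesic equation analysed in this chapter takes the standard form below, with $\Gamma^{\mu}_{\ \nu\rho}$ the Christoffel symbols of the metric and $\tau$ an affine parameter; this is standard notation, not taken from the chapter itself:

```latex
\frac{d^2 x^{\mu}}{d\tau^2}
  + \Gamma^{\mu}_{\ \nu\rho}\,
    \frac{dx^{\nu}}{d\tau}\frac{dx^{\rho}}{d\tau} = 0.
```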