In The Secret Life of Copyright, copyright law meets Black Lives Matter and #MeToo in a provocative examination of how our legal regime governing creative production unexpectedly perpetuates inequalities along racial, gender, and socioeconomic lines while undermining progress in the arts. Drawing on numerous case studies – Harvard’s slave daguerreotypes, celebrity sex tapes, famous Wall Street statues, beloved musicals, and dictator copyrights – the book argues that, despite their purported neutrality, key rules governing copyrights – from the authorship, derivative rights, and fair use doctrines to copyright’s First Amendment immunity – systematically disadvantage individuals from traditionally marginalized communities. Since laws regulating the use of creative content increasingly mediate participation and privilege in the digital world, The Secret Life of Copyright provides a template for a more robust copyright system that better addresses egalitarian concerns and serves the interests of creativity.
Sharp, nonasymptotic bounds are obtained for the relative entropy between the distributions of sampling with and without replacement from an urn with balls of $c\geq 2$ colors. Our bounds are asymptotically tight in certain regimes and, unlike previous results, they depend on the number of balls of each color in the urn. The connection of these results with finite de Finetti-style theorems is explored, and it is observed that a sampling bound due to Stam (1978) combined with the convexity of relative entropy yield a new finite de Finetti bound in relative entropy, which achieves the optimal asymptotic convergence rate.
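As a rough numerical illustration of the quantity studied here (not taken from the paper), the following Python sketch computes the relative entropy between the law of the colour count in a sample drawn without replacement (hypergeometric) and with replacement (binomial) for a two-colour urn; the urn sizes and sample size are hypothetical. It shows the divergence shrinking as the urn grows while the sample size is held fixed.

```python
from math import comb, log

def hypergeom_pmf(k, r, b, n):
    """P(k red) when drawing n balls without replacement from r red, b blue."""
    return comb(r, k) * comb(b, n - k) / comb(r + b, n)

def binom_pmf(k, r, b, n):
    """P(k red) when drawing n balls with replacement from the same urn."""
    p = r / (r + b)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def relative_entropy(r, b, n):
    """D(without replacement || with replacement) in nats, for the red-ball count."""
    d = 0.0
    for k in range(max(0, n - b), min(n, r) + 1):
        p = hypergeom_pmf(k, r, b, n)
        if p > 0:
            d += p * log(p / binom_pmf(k, r, b, n))
    return d

# Fixed sample size n = 4; the divergence decays as the urn grows.
print(relative_entropy(10, 10, 4))    # small urn: noticeably non-i.i.d.
print(relative_entropy(100, 100, 4))  # larger urn: much closer to i.i.d.
```

Note that this sketch compares only the colour counts (a sufficient statistic for the exchangeable c = 2 case), whereas the paper's bounds concern the full sampling distributions for general c.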
This study examines how human activities influenced soil development at two contrasting Arctic sites: Maiva, a 19th-century farmstead, and Snuvrejohka, a seasonal Sámi reindeer herding settlement in the Lake Torneträsk region, northern Sweden. Using geochemical and geophysical soil analyses, we explore the spatial distribution and vertical development of anthropogenic signals in the soil. At Maiva, prolonged agricultural use and earthworm bioturbation have led to extensive soil mixing and altered soil horizons, resulting in elevated phosphate, lead, and organic matter concentrations in Ap and Ah horizons. In contrast, Snuvrejohka displays more stratified profiles with localized chemical enrichment around hearths, primarily within E horizons. These results highlight how different land-use practices leave distinct geochemical fingerprints in Arctic soils and emphasize the need for sampling strategies adapted to site-specific soil formation processes. Our findings demonstrate that even short-term or seasonal human activities can leave distinct and detectable signatures in Arctic soils. Through an integrated approach combining soil science, geoarchaeological methods, and historical data, this study provides new insights into the reconstruction of past land-use practices and highlights the vulnerability of archaeological soil records in Arctic environments facing rapid climate-driven change.
This chapter discusses more specialized examples of how machine learning can be used to solve problems in the quantum sciences. We start by explaining the concept of differentiable programming and its use cases in quantum sciences. Next, we describe deep generative models, which have proven to be an extremely appealing tool for sampling from unknown target distributions in domains ranging from high-energy physics to quantum chemistry. Finally, we describe selected machine learning applications for experimental setups such as ultracold systems or quantum dots. In particular, we show how machine learning can help with tedious and repetitive experimental tasks in quantum devices or with validating quantum simulators via Hamiltonian learning.
Who gets to have a voice, and what does it mean? Questions of vocal ontology and ethics are perennial, but in a world where the ability to sample the voices of others or to synthesize new ones in pursuit of both creative and commercial endeavours is available more widely than ever before, the relationship of the voice to the individual body, agency, and rights is invested with a new urgency. Through a discussion ranging from The Little Mermaid to Kanye West, Cathy Berberian to Holly Herndon, this short provocation considers the manifold ways in which we find, have, and borrow voices.
In epidemiology, we are interested in conducting studies to measure disease occurrence and to look for the causes of disease. Such studies can be applied in public health, allowing us to modify those causes for disease prevention. In the previous chapters, we learned about several commonly used public health measures and routine collections of health data. These form the basis of descriptive epidemiology and enable us to describe the frequency and patterns of health-related issues in relation to person, place and time characteristics. It is important to note that descriptive studies cannot be used to establish causal relationships, but they are useful for generating hypotheses. These hypotheses need to be tested in analytical studies to determine whether the ‘exposure’ of interest is associated with changes in disease morbidity or mortality, in the search for possible causes of the disease.
Despite their numerous advantages, exit polls are not a common tool in the study of Canadian electoral behaviour. In this methodological note, we use data from two pilot projects to test the accuracy of small-scale exit polls when estimating party support. We mobilize exit-polling data collected in the 2018 Quebec provincial election (four voting locations) and the 2019 federal election in Quebec (two voting locations). We focus on chance error and bias error in small samples. Results obtained using parametric linear models suggest that small-sample exit polls achieve relatively precise estimations. We do find, however, that right-of-centre parties’ vote share tends to be underestimated. These findings shed light on the strengths and shortcomings of small-scale exit polls in Canada.
This chapter describes the data collection strategy and multimethod research design employed to test the theory in the subsequent chapters of the book. The structure of the empirical analysis mirrors the book’s primary argument: to show how peacekeeping works from the bottom up, from the individual to the community to the country. Given that UN peacekeepers deploy to the most violent areas, the design needed to account for selection bias as well as other confounding variables in order to make causal inference possible. Using individual- and subnational/community-level data from Mali as well as cross-national data from the universe of multidimensional PKOs deployed in Africa, the book employs a three-part strategy to test the hypotheses in the next few chapters. First, the book considers the micro-level behavioral implications of the theory using a lab-in-the-field experiment and a survey experiment, both implemented in Mali. Second, it tests whether UN peacekeepers’ ability to increase individual willingness to cooperate aggregates upward to prevent communal violence in Mali. Third, the book considers whether these findings extend to other countries.
The implicit revolution seems to have arrived with the declaration that “explicit measures are informed by and (possibly) rendered invalid by unconscious cognition.” What is the view from survey research, which has relied on explicit methodology for over a century, and whose methods have extended to the political domain in ways that have changed the landscape of politics in the United States and beyond? One survey researcher weighs in. The overwhelming evidence points to the continuing power of explicit measures to predict voting and behavior. Whether implicit measures can do the same, especially beyond what explicit measures can do, is far more ambiguous. The analysis further raises doubts, as others have done before, about what exactly implicit measures measure, and in particular questions implicit researchers’ co-opting of the word “attitude” when such measures instead represent associations. The conclusion: Keep your torches at home. There is no revolution.
In applications of maximum likelihood factor analysis, the occurrence of boundary minima instead of proper minima is by no means exceptional. In the past, the causes of such improper solutions could not be detected, because the matrices containing the parameters of the factor analysis model were kept positive definite. By dropping these constraints, it becomes possible to distinguish between the different causes of improper solutions. In this paper, some of the most important causes are discussed and illustrated by means of artificial and empirical data.
Much of the literature on first language (L1) and second language (L2) reading agrees that there are noticeable behavioral differences between L1 and L2 readers of a given language, as well as between L2 speakers with different L1 backgrounds (e.g., Finnish vs German readers of English). Yet, this literature often overlooks potential variability between multiple samples of speakers of the same L1. This study examines this intersample variance using reading data from the ENglish Reading Online (ENRO) database of English reading behavior, comprising 27 university student samples from 15 distinct L1 backgrounds. We found that the intersample variance within L2 readers of English with the same L1 background (e.g., two samples of Russian speakers) often overshadowed the difference between samples of L2 readers with different L1 backgrounds (e.g., Russian vs Chinese speakers of English). We discuss these and other problematic methodological implications of representing each L1 background with a single participant sample.
The move from theory to empirics requires figuring out how to collect evidence that could support or disconfirm hypotheses derived from your theory. Empirically studying the network in your theory requires two steps: determining which nodes to include in your data and operationalizing the link type. This chapter helps readers select the boundary that contains the nodes of interest, pointing out some subtle downsides to random sampling in network studies. It also helps readers determine whether they want to measure full networks or ego networks and offers pointers on operationalizing link types.
Students are introduced to the logic, foundations, and basics of statistical inference. The need for samples is discussed first, and then how samples can be used to make inferences about the larger population. The normal distribution is then discussed, along with Z-scores, to illustrate basic probability and the logic of statistical significance.
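As a minimal sketch of the Z-score logic this chapter covers (the distribution parameters and observed score below are hypothetical, not from the text):

```python
from statistics import NormalDist

# Hypothetical example: exam scores assumed normally distributed
# with mean 70 and standard deviation 10.
mean, sd = 70, 10
score = 85

# Z-score: how many standard deviations the score lies above the mean.
z = (score - mean) / sd  # 1.5

# One-tailed probability of observing a score at least this extreme,
# from the standard normal distribution.
p = 1 - NormalDist().cdf(z)
print(z, round(p, 4))
```

A Z-score of 1.5 corresponds to roughly the top 7% of the distribution, which is how the chapter's link between probability and statistical significance is usually operationalized.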
Gives a brief overview of the book. Notations for signal representation in continuous time and discrete time are introduced. Both one-dimensional and two-dimensional signals are introduced, and simple examples of images are presented. Examples of noise removal and image smoothing (filtering) are demonstrated. The concept of frequency is introduced, and its importance and role in signal representation are explained, using musical notes as examples. The history of signal processing, the role of theory, and the connections to real-life applications are mentioned in an introductory way. The chapter also draws attention to the impact of signal processing in digital communications (e.g., cell-phone communications), gravitational-wave detection, deep space communications, and so on.
This introductory chapter sets out the aim of the project, which is to reassess the social and cultural relations between the Aegean and the Mediterranean through a new examination of some of the earliest Greek pottery finds overseas. The focus is on Protogeometric and Geometric ceramics from Greek and Phoenician colonies, certain Phoenician metropolises and further Indigenous sites in the Aegean and the Mediterranean, which were analysed by Neutron Activation Analysis. The analytical results are examined against the background of the social and economic relations that were generated through the production, exchange and consumption of the pottery finds under scrutiny.
Chapter 3 focuses on WTO disputes about anti-dumping issues. Anti-dumping, or more specifically zeroing, is the single most litigated issue under WTO law. Although the Appellate Body has found the zeroing method inconsistent with the ADA several times, it is still being used with small alterations. Chapter 3 shows that the so-called jewel in the crown is sometimes ineffective. To do so, the role of the DSM in anti-dumping is presented and anti-dumping cases dealing with procedural issues are analysed. The procedural issues addressed in this chapter are as follows: calculation methods, transparency, public notice/notification, selection of investigated parties (sampling), submission of evidence and rebuttals, access to non-confidential files, hearings, newcomers and enforcement.
The album Slave to the Rhythm is typical of the exaltation of pop stars but atypical in its presentation and interaction with biographical material. Three crossings are considered in this assessment of the work: technological, cultural, and structural. These are presented with a detailed track-by-track analysis using a range of signal processing techniques, some adapted specifically for this project. This Element focuses on the combination of digital, novel, and analogue technology that was used, and the organisational and transformational treatments of recorded material it offered, along with their associated musical cultures. The way in which studio technology functions, and offers interaction with its users, has a direct influence over the sound of the music that is created with it. To understand how that influence is manifested in Slave, there is considerable focus on the development and use of music technology.
This chapter explores the evolution of the djent subgenre from the perspective of the musical, technological and environmental factors that have shaped its identity. The chapter considers the early circumstances of djent’s emergence during the early-to-mid 2000s, with particular reference to the online culture which contributed to its wider transmission and proliferation. Key musical influences are also discussed, including djent’s roots in progressive metal and the work of bands such as Meshuggah and SikTh, as well as the subgenre’s interaction with electronic music aesthetics and popular music. A principal focus of the chapter is the role of emerging digital technologies, particularly Digital Audio Workstations (DAWs) and digital amplifier and drum kit modelling software, in the formation of djent’s musical and sonic characteristics. Finally, the chapter considers djent’s position as a subgenre within modern metal music and evaluates, with reference to the critical reception literature, the debates that persist concerning its legitimacy within metal.
This chapter describes how relationship scientists conduct research to answer questions about relationships. It explains all aspects of the research process, including how hypotheses are derived from theory, which study designs (e.g., experiments, cross-sectional studies, experience sampling) best suit specific research questions, how scientists can manipulate variables or measure variables with self-report, implicit, observational, or physiological measures, and what scientists consider when recruiting a sample to participate in their studies. This chapter also discusses how researchers approach questions about boundary conditions (when general trends do not apply) and mechanisms (the processes underlying their findings) and describes best practices for conducting ethical and reproducible research. Finally, this chapter includes a guide for how to read and evaluate empirical research articles.
The Introduction sets out how the number of forcibly displaced persons in the world is the highest ever recorded. Violence associated with armed conflict has become the main cause of forced displacement in the twenty-first century, and most refugees are fleeing armed conflicts. Most asylum seekers in the European Union (EU) originate from Syria, Afghanistan and Iraq. However, there are many misconceptions about whether persons fleeing armed conflicts are refugees as defined by the Refugee Convention. This book is thus an enquiry into the continued relevance of the Refugee Convention and examines the extent to which asylum appellate authorities in the EU take into account the changing nature of contemporary armed conflicts. The book also explores how the Refugee Convention may be interpreted in a manner that better responds to the changed nature of contemporary armed conflicts from a gender perspective, thus reconceptualising the concept of the refugee. The Introduction sets out the conceptual notions adopted in the book, such as the importance of distinguishing between violence and armed conflicts, as well as the research methodology and the sampling of 320 asylum appeal decisions from Belgium, Denmark, France, the Netherlands, Spain and the UK. Finally, it sets out the structure of the book.