Two well-known lower bounds to the reliability in classical test theory, Guttman's λ2 and Cronbach's coefficient alpha, are shown to be terms of an infinite series of lower bounds. All terms of this series are equal to the reliability if and only if the test is composed of items that are essentially tau-equivalent. Some practical examples, comparing the first seven terms of the series, are offered. It appears that the second term (λ2) is generally worthwhile computing as an improvement over the first term (alpha), whereas going beyond the second term is not worth the computational effort. An exception should possibly be made for very short tests whose inter-item covariances have widely spread absolute values. The relationship of the series to previous work on lower bound estimates for the reliability is briefly discussed.
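As a concrete illustration (ours, not the article's), both coefficients can be computed directly from an item covariance matrix using Guttman's formulas. A minimal Python sketch, with a hypothetical four-item covariance matrix:

```python
import numpy as np

def alpha_and_lambda2(C):
    """Cronbach's alpha (= Guttman's lambda-3) and Guttman's lambda-2,
    computed from a k x k item covariance matrix C."""
    C = np.asarray(C, dtype=float)
    k = C.shape[0]
    total_var = C.sum()                    # variance of the total score
    lam1 = 1.0 - np.trace(C) / total_var   # Guttman's lambda-1
    alpha = k / (k - 1) * lam1
    # sum of squared off-diagonal covariances:
    off_sq = (C ** 2).sum() - (np.diag(C) ** 2).sum()
    lam2 = lam1 + np.sqrt(k / (k - 1) * off_sq) / total_var
    return alpha, lam2

# Hypothetical 4-item covariance matrix:
C = np.array([[1.0, 0.4, 0.5, 0.3],
              [0.4, 1.0, 0.4, 0.4],
              [0.5, 0.4, 1.0, 0.5],
              [0.3, 0.4, 0.5, 1.0]])
a, l2 = alpha_and_lambda2(C)
print(f"alpha = {a:.3f}, lambda2 = {l2:.3f}")  # alpha = 0.741, lambda2 = 0.743
```

On this toy matrix the step from alpha (≈0.741) to λ2 (≈0.743) is a modest improvement, consistent with the abstract's observation; λ2 never falls below alpha.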
Bowker's test for marginal equality in contingency tables provides a familiar chi-square test to determine whether the marginal distributions are the same across two or more factors or occasions. In this note it is shown how latent trait theory provides a theoretical framework for the development and application of this test.
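For readers who want to try the statistic itself (the standard construction, not specific to this note), Bowker's test compares each pair of symmetric cells in a square table, with one degree of freedom per non-empty pair; for a 2×2 table it reduces to McNemar's test. A minimal Python sketch on a hypothetical 3×3 table:

```python
import numpy as np
from scipy.stats import chi2

def bowker_test(n):
    """Bowker's chi-square test of symmetry for a square contingency
    table n. Pairs with n[i, j] + n[j, i] == 0 are skipped, as is
    common in practice."""
    n = np.asarray(n, dtype=float)
    k = n.shape[0]
    stat, df = 0.0, 0
    for i in range(k):
        for j in range(i + 1, k):
            denom = n[i, j] + n[j, i]
            if denom > 0:
                stat += (n[i, j] - n[j, i]) ** 2 / denom
                df += 1
    return stat, df, chi2.sf(stat, df)

# Hypothetical table: rows = category at occasion 1, cols = occasion 2
table = [[20, 5, 3],
         [10, 30, 6],
         [4, 8, 25]]
stat, df, p = bowker_test(table)
print(f"chi2 = {stat:.2f}, df = {df}, p = {p:.3f}")
```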
There are three fundamental problems in Sijtsma (Psychometrika, 2008): (1) contrary to the name, the glb is not the greatest lower bound of reliability but rather is systematically less than ωt (McDonald, Test theory: A unified treatment, Erlbaum, Hillsdale, 1999), (2) we agree with Sijtsma that when considering how well a test measures one concept, α is not appropriate, but recommend ωt rather than the glb, and (3) the end user needs procedures that are readily available in open source software.
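Since the authors stress open-source availability, it may help to see what ωt estimates: under a fitted factor model Σ ≈ ΛΛ′ + Ψ, ωt is the share of total-score variance attributable to the common factors. Procedures of this kind are implemented in R's psych package; the sketch below is only a simplified Python analogue under stated assumptions (plain maximum-likelihood factor extraction, no hierarchical structure), not the authors' procedure:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

def omega_total(X, n_factors=1):
    """McDonald's omega_t under the factor model Sigma = Lambda Lambda' + Psi.
    X is an (observations x items) data matrix. Extraction details differ
    from psych::omega in R, so results are illustrative only."""
    fa = FactorAnalysis(n_components=n_factors).fit(X)
    loadings = fa.components_                     # (n_factors, n_items)
    common = (loadings.sum(axis=1) ** 2).sum()    # 1' Lambda Lambda' 1
    unique = fa.noise_variance_.sum()             # 1' Psi 1
    return common / (common + unique)

# Usage with hypothetical one-factor data:
rng = np.random.default_rng(0)
f = rng.normal(size=(500, 1))
X = f @ np.array([[0.8, 0.7, 0.6, 0.5]]) + rng.normal(scale=0.6, size=(500, 4))
print(omega_total(X))
```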
Nineteenth- and twentieth-century West African writer-intellectuals harnessed their Atlantic networks to explore ideas of race, regeneration, and nation-building. Yet the ultimately cosmopolitan nature of these political and intellectual pursuits has been overlooked by dominant narratives of anti-colonial history. In contrast, Cosmopolitan Nationalism in Ghana uses cosmopolitanism as a primary theoretical tool, interrogating in a new light the anti-colonial writings that prop up Ghana's nationalist history. Mary A. Seiwaa Owusu highlights the limitations of accepted labels of nationalist scholarship and confirms that these writer-intellectuals instead engaged with ideas from around the globe. This study offers a more complex account of the nation-building project, arguing for the pivotal role of other groups and factors in addition to Kwame Nkrumah's leadership. In turn, it proposes a historical account that assumes a cosmopolitan setting, highlights the centrality of debate, and opens a vista for richer understandings of Ghanaians' longstanding questions about thriving in the world.
This chapter provides the historical background necessary to understand the book’s empirical analysis. It discusses the political decisions that led to the displacement of Germans and Poles at the end of WWII and challenges the assumption that uprooted communities were internally homogeneous. It then zooms in on the process of uprooting and resettlement and introduces data on the size and heterogeneity of the migrant population in postwar Poland and West Germany.
Bentonite is considered an ideal buffer/backfill material for the engineered barriers used in high-level radioactive waste (HLW) disposal. During initial sample preparation, wet bentonite powder tends to gather into large agglomerates, and the traditional water-content adjustment process spreads water unevenly; both effects decrease the homogeneity of compacted bentonite. The main purpose of this study was to solve this problem with a new wetting method that mixes ice powder with bentonite powder (the ice-bentonite mixing method). The new method was used to adjust the water distribution in Gaomiaozi (GMZ) bentonite powder from Gaomiaozi County, China, and was compared to the traditional spray method. Screening was used to separate macro-agglomerates (≥ 0.25 mm) from the water-bentonite mixture. The properties, the content of the various agglomerate size fractions in loose mixtures, and the heterogeneity defects observed in compacted bentonite were compared. An index (P) was defined to quantitatively evaluate the water distribution in a loose bentonite/water mixture. Macro-agglomerates in loose mixtures produced heterogeneities in water content, density, and shrinkage. With the ice-bentonite mixing method, fewer macro-agglomerates formed and the water distribution in the compacted bentonite was homogeneous. A homogeneous water distribution tended to reduce the number of shrinkage cracks after drying and to maintain high mechanical strength in the compacted bentonite. Although producing the ice powder was laborious, the ice-bentonite mixing method has workability advantages: (i) a high mixing efficiency, (ii) a low mass-loss rate, and (iii) a small deviation between measured and target water contents. The low thawing efficiency of ice-bentonite mixtures can be addressed with a microwave-assisted thawing method. This research can improve the sample preparation methods used to produce compacted buffer/backfill materials for HLW disposal.
Having developed the necessary mathematics in chapters 4 to 6, chapter 7 returns to physics. Evidence for homogeneity and isotropy of the Universe at the largest cosmological scales is presented and Robertson-Walker metrics are introduced. Einstein's equations are then used to derive the Friedmann equations, relating the cosmic scale factor to the pressure and density of matter in the Universe. The Hubble constant is discussed, and an analytic form of the redshift-distance relation is derived in terms of the matter density, the cosmological constant, and the spatial curvature; observational values of these three parameters are given. Some analytic solutions of the Friedmann equation are presented. The cosmic microwave background dominates the energy density in the early Universe, and this leads to a description of the thermal history of the early Universe: the transition from radiation-dominated to matter-dominated dynamics and nucleosynthesis in the first three minutes. Finally, the horizon problem and the inflationary Universe are described, and the limits of applicability of Einstein's equations, when they might be expected to break down due to quantum effects, are discussed.
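For reference, the standard forms of the Robertson-Walker line element and the first Friedmann equation it yields (textbook conventions vary, e.g. in the normalization of the curvature constant k; these are not quoted from the chapter):

```latex
\[
  ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2}
       + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right]
\]
\[
  \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho
       - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3}
\]
```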
We give a broad-brush overview of cosmology, including a timeline of events from the Big Bang to the present day. We introduce the three pillars of the Big Bang cosmological model, the concepts of homogeneity and isotropy, and the parsec as a unit of distance. We also introduce natural units and develop intuition for how to adopt and use them.
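As a quick illustration of natural units (a standard convention, not quoted from the chapter): setting ħ = c = k_B = 1 makes energies, masses, temperatures, and inverse lengths dimensionally interchangeable, and one converts back to conventional units via ħc:

```latex
\[
  \hbar c \simeq 197.3\ \mathrm{MeV\,fm}
  \quad\Longrightarrow\quad
  1\ \mathrm{fm} \simeq (197.3\ \mathrm{MeV})^{-1},
  \qquad
  1\ \mathrm{pc} \simeq 3.086\times 10^{16}\ \mathrm{m}.
\]
```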
Monolingualism, bilingualism, and multilingualism represent concepts of individual upbringing and social organization of extreme impact and scope. All in all, the book attempted to guide the reader from a multilingualism-as-problem to a multilingualism-as-resource perspective. However, it also argued that multilingualism cannot work wonders and should not be considered a goal in itself. Running a multilingual society can produce many beneficial effects, but maintaining several languages at the same time also incurs costs that a society must be prepared to bear and share. It is crucial to know which boundary conditions tip the balance from burden to benefit, or vice versa. The book further argues for a continuum from monolingualism to multilingualism based on the dimensions of homogeneity and heterogeneity. It also introduces a novel typology of English in multilingual contexts, distinguishing between English in heritage contexts, English in bilingual heritage contexts, English in contexts of balanced bilingualism, English in indigenous multilingual contexts, English in postcolonial multilingual contexts, and English as a lingua franca in modern multilingual immigrant contexts.
Critics of commercial country music say that the music is homogeneous and clichéd, and that the so-called bro-country subgenre has taken over. This chapter uses interviews with hit songwriters in Nashville to examine the social and structural factors that influence the way songwriters practice their craft. One such factor, the “360 deal,” is a type of recording contract introduced as a way for record labels to recoup some of the revenue lost with the decline of recorded-music sales. Though these contracts are legal agreements between artists and their labels, they have entirely restructured the careers of professional songwriters and the music that they create. This analysis of country music in the twenty-first century draws on a deep understanding of the occupational arrangements that underlie the creation of songs, arguing that the structures shaping the songwriting community are critical to the formation of country songs.
GC II 7 supports Aristotle’s elemental theory, according to which the four elements possess a common matter that enables their inter-transformation, over the superficially similar Empedoclean one, by arguing that the former theory, and it alone, can accommodate the formation of homogeneous stuffs like flesh and bone from the four elements. According to the interpretation offered here, these stuffs are mixtures in the sense spelled out in GC II 10, and appear to exhibit the kind of strong uniformity that some interpreters have denied to Aristotelian mixtures. Special attention is devoted to bringing out the significance of elemental mixture for Aristotle’s twin projects in GC: understanding the causes of generation and destruction and establishing a theory of the elements. Explaining the formation of elemental mixtures is a crucial step in showing how the generation of more complex substances is possible and how the four elements, as he conceives them, function as elements of more complex substances.
This is the first book to present a comprehensive, up-to-date overview of archaeological and environmental data from the eastern Mediterranean world around 6000 BC. It brings together the research of an international team of scholars who have excavated at key Neolithic and Chalcolithic sites in Syria, Anatolia, Greece, and the Balkans. Collectively, their essays conceptualize and enable a deeper understanding of times of transition and changes in the archaeological record. Overcoming the terminological and chronological differences between the Near East and Europe, the volume expands from studies of individual societies into regional views and diachronic analyses. It enables researchers to compare archaeological data and analysis from across the region, and offers a new understanding of the importance of this archaeological story to broader, high-impact questions pertinent to climate and culture change.
During the seventeenth century, the advent of what were known as the “common” and “new” analyses fundamentally changed the landscape of European mathematics. The widely accepted narrative is that these analyses, analytic geometry and calculus (mostly due to Descartes and Leibniz, respectively), occasioned a transition from geometrical to symbolic methods. In dealing with the science of motion, mathematicians abandoned the language of proportion theory, as found in the works of Galileo, Huygens, and Newton, and began employing the Newtonian and Leibnizian calculi when differential and fluxional equations first appeared in the 1690s. This was the advent of a more abstract way of practicing mathematics, which culminated with the algebraic approach to calculus and mechanics promoted by Euler and Lagrange in the eighteenth century. In this chapter, it is shown that geometrical interpretations and mechanical constructions still played a crucial role in the methods of Descartes, Leibniz, and their immediate followers. This is revealed by the manner in which they handled equations and how they sought their solutions. The passage from proportions to equations did not occur in a single step; it was a process that took a century to reach completion.
Chapter 7 returns to Kant for a deeper analysis of his views and their relation to the Euclidean mathematical tradition. Chapter 6 revealed that Euclid defined neither magnitude nor homogeneity, so that these notions are at best implicitly defined by the Euclidean-Eudoxian theory of proportions. Kant reworks the Euclidean theory of magnitudes, defining magnitude in terms of his own understanding of homogeneity, which admits of no qualitative difference of the manifold and which I call strict homogeneity. Most importantly, he thinks that intuition is required to represent either a continuous or a discrete manifold without qualitative difference. This role for intuition in Kant’s philosophy of mathematics and experience has not been appreciated, but finds support in various texts, especially Kant’s lectures on metaphysics and his criticisms of Leibniz’s views in the Amphiboly. Given Kant’s understanding of qualities, differences in dimension correspond to qualitative differences, so that Kant’s account corresponds to Euclid’s understanding of homogeneous magnitudes. Understanding the role of intuition allows us to appreciate the role of the categories of quantity and intuition in part–whole relations and the composition of magnitudes. The chapter closes by clarifying the sense in which intuition is required for the representation of magnitudes.
The plebiscitarian leader’s legitimacy is based on the tenets of populism: the people versus an elite opposition, and the promise that this time the government will follow the will of the authentic people. The primacy of popular will turns the regime against the self-constraining institutions of democracy. Populism as a movement and ideology is based on exclusion. Its anti-institutionalism and rejection of political mediation are contrary to democracy as rational decision-making. This rough democracy is deprived of its constitutional protection against the arbitrariness of the genuine will of genuine people. With this point of departure, the natural choice of the leader is to turn plebiscitarian. The leader claims that there is a direct relation between government and people. “The people,” as a concept used by populists, is highly problematic for constitutional culture, even if the collective action of citizens (in specific circumstances) forms a bulwark of the constitutional system and the people’s properly institutionalized action plays an indispensable role in the system of modern checks and balances.
Cosmology is the part of science concerned with the structure and evolution of the universe on the largest scales of space and time. Gravity governs the structure of the universe on these scales and determines its evolution. General relativity is thus central to cosmology, and cosmology is one of the most important applications of general relativity. Our understanding of the universe on the largest scales of space and time has increased dramatically in recent years – both observationally and theoretically. This book concentrates on the role of relativistic gravity in cosmology, introducing only the most basic observational facts and working out the simplest theoretical models. This chapter sketches the three basic observational facts about our universe on the largest distance scales: the universe consists of stars and gas in gravitationally bound collections of matter called galaxies, diffuse radiation, dark matter of unknown character, and vacuum energy; the universe is expanding; averaged over large distance scales, the universe is isotropic and homogeneous.
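To make “the universe is expanding” quantitative (a standard relation, not quoted from the chapter): nearby galaxies recede with speeds proportional to their distances, the Hubble law,

```latex
\[
  v = H_0 d, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}},
\]
```

so, for example, a galaxy 100 Mpc away recedes at roughly 7000 km/s.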
After the financial crisis of 2008, the European Union (‘EU’) not only substantially expanded its legislation on financial services, but also built up a strong and unified system of financial market supervision. In particular, central surveillance authorities were created and given far-reaching competences to substitute for dysfunctional national authorities or players in the financial services sector. The three European Economic Area (‘EEA’) and European Free Trade Association (‘EFTA’) States—Iceland, Liechtenstein, and Norway—participate in the EU's internal market through their membership of the EEA. In order to continue participating on an equal footing in the internal market for financial services and to honour their duty to maintain homogeneity, the EEA EFTA States also had to incorporate the new institutional setup for financial services supervision. This obligation, however, in particular as it relates to certain intrusive powers of the new surveillance authorities, collided with constitutional reservations, above all in the two Nordic EEA EFTA States. This article will show how these conflicting aims could be merged into a system that, on the one hand, guarantees the unified overall approach needed for strengthened surveillance of the internal market for financial services and, on the other hand, safeguards certain constitutional reservations of the EEA EFTA States. It also looks at how third countries that do not (fully) participate in the internal market, such as the United Kingdom and Switzerland, are likely to be treated in this context by the EU.
Chapter 4 scrutinises the notion of unity as an essential characteristic of the EU legal order and the internal market, and the aims of the multilateral agreements to achieve ‘homogeneity’ in the expanded internal market. The chapter examines the notions of ‘unity’ and ‘homogeneity’ and the nature of the homogeneity provision in the acquis-exporting agreements. It seeks to establish the level of legislative commonality – as opposed to flexibility and differentiation – necessary in the extended internal market for third-country market participants to be considered equal to their EU counterparts.
Borders and boundaries can represent old narratives, which often, however, cannot deal with new realities. Borders are inflexible, but reality is flexible and fluid. This is amplified in crisis situations. Multi-ethnicity and history run in parallel, as shared cultures often precede and transcend Westphalia and institutionally imposed borders. For cultures with roots in antiquity, top-down established borders appear to lack legitimacy, as these cultures place more emphasis on historical similarities and traditions of peoples. Thus, what is more important: cultural and historical commonalities, or institutional top-down constructions? This article examines the impact of prioritizing top-down ethno-religious homogeneity over lasting conflict resolution. Through an interdisciplinary approach, the article draws a number of hypotheses from the fields of conflict resolution, territoriality, and nation-building, and tests them on the specific case of the 1923 Compulsory Population Exchange (CPE) between Greece and Turkey and the dual role of the Mediterranean as a security bridge or barrier. The article highlights a “how-not-to” scenario in conflict resolution and argues that efforts to form apparently homogeneous nation-states led to short-term, incomplete conflict termination with a lasting impact, while genuine conflict resolution remained elusive.