What defines a correct program? What education makes a good programmer? The answers to these questions depend on whether programs are seen as mathematical entities, engineered socio-technical systems or media for assisting human thought. Programmers have developed a wide range of concepts and methodologies to construct programs of increasing complexity. This book shows how those concepts and methodologies emerged and developed from the 1940s to the present. It follows several strands in the history of programming and interprets key historical moments as interactions between five different cultures of programming. Rooted in disciplines such as mathematics, electrical engineering, business management or psychology, the different cultures of programming have exchanged ideas and given rise to novel programming concepts and methodologies. They have also clashed about the nature of programming; those clashes remain at the core of many questions about programming today. This title is also available as Open Access on Cambridge Core.
Everywhere one looks, one finds dynamic interacting systems: entities expressing and receiving signals between each other and acting and evolving accordingly over time. In this book, the authors give a new syntax for modeling such systems, describing a mathematical theory of interfaces and the way they connect. The discussion is guided by a rich mathematical structure called the category of polynomial functors. The authors synthesize current knowledge to provide a grounded introduction to the material, starting with set theory and building up to specific cases of category-theoretic concepts such as limits, adjunctions, monoidal products, closures, comonoids, comodules, and bicomodules. The text interleaves rigorous mathematical theory with concrete applications, providing detailed examples illustrated with graphical notation as well as exercises with solutions. Graduate students and scholars from a diverse array of backgrounds will appreciate this common language by which to study interactive systems categorically.
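As a minimal orienting example (not taken from the book itself), a polynomial functor on the category of sets is a coproduct of representables: given a set of positions $I$ and, for each $i \in I$, a set of directions $A_i$, it acts as
$p(X) = \sum_{i \in I} X^{A_i},$
so the polynomial $p = y^2 + y + 1$ sends a set $X$ to $X^2 + X + 1$; in this framework an interface is such a polynomial, and the theory studies maps between polynomials as the way interfaces connect.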
This up-to-date introduction to type theory and homotopy type theory will be essential reading for advanced undergraduate and graduate students interested in the foundations and formalization of mathematics. The book begins with a thorough and self-contained introduction to dependent type theory. No prior knowledge of type theory is required. The second part gradually introduces the key concepts of homotopy type theory: equivalences, the fundamental theorem of identity types, truncation levels, and the univalence axiom. This prepares the reader to study a variety of subjects from a univalent point of view, including sets, groups, combinatorics, and well-founded trees. The final part introduces the idea of higher inductive types by discussing the circle and its universal cover. Each part is structured into bite-size chapters, each the length of a lecture, and over 200 exercises provide ample practice material.
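For orientation, one standard formulation of the univalence axiom (notation may differ from the book's): for types $A$ and $B$ in a universe $\mathcal{U}$, the canonical map
$(A =_{\mathcal{U}} B) \to (A \simeq B)$
sending an identification to an equivalence is itself an equivalence, so that equivalent types can be identified.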
Petri nets are one of the most popular tools for modeling distributed systems. This book provides a modern look at the theory behind them, by studying three classes of nets that model (i) sequential systems, (ii) non-communicating parallel systems, and (iii) communicating parallel systems. A decidable and causality-respecting behavioral equivalence is presented for each class, followed by a modal logic characterization for each equivalence. The author then introduces a suitable process algebra for the corresponding class of nets and proves that the behavioral equivalence proposed for each class is a congruence for the operators of the corresponding process algebra. Finally, an axiomatization of the behavioral congruence is proposed. The theory is introduced step by step, with ordinary-language explanations and examples provided throughout, to remain accessible to readers without specialized training in concurrency theory or formal logic. Exercises with solutions solidify understanding, and the final chapter hints at extensions of the theory.
This book provides in-depth coverage of the fundamentals of computation and programming in the C language. Essential concepts, including operators and expressions, input and output statements, loop statements, arrays, pointers, functions, strings and preprocessors, are described in a lucid manner. A unique 'Learn by quiz' approach features questions based on a confidence-based learning methodology; it helps the reader identify the right answer, with adequate explanation and reasoning as to why the other options are incorrect. Computer programs and review questions are interspersed throughout the text. The book is appropriate for undergraduate students of engineering, computer science and information technology. It can be used for self-study and assists in the understanding of theoretical concepts and their applications.
Session types are type-theoretic specifications of communication protocols in concurrent or distributed systems. By codifying the structure of communication, they make software more reliable and easier to construct. Over recent decades, the topic has become a large and active research area within the field of programming language theory and implementation. Written by leading researchers in the field, this is the first text to provide a comprehensive introduction to the key concepts of session types. The thorough theoretical treatment is complemented by examples and exercises, suitable for use in a lecture course or for self-study. It serves as an entry point to the topic for graduate students and researchers.
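As a small orienting example in one common syntax (which may differ from the book's): a channel endpoint on which a server receives an integer, sends back a boolean and then stops is described by the session type
$S = {?}\mathrm{Int}.\,{!}\mathrm{Bool}.\,\mathbf{end},$
while the client side follows the dual type $\overline{S} = {!}\mathrm{Int}.\,{?}\mathrm{Bool}.\,\mathbf{end}$; typing the two endpoints against dual types is what rules out communication mismatches on that channel.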
Mobile systems, whose components communicate and change their structure, now pervade the informational world and the wider world of which it is a part. The science of mobile systems is as yet immature, however. This book presents the pi-calculus, a theory of mobile systems. The pi-calculus provides a conceptual framework for understanding mobility, and mathematical tools for expressing systems and reasoning about their behaviours. The book serves both as a reference for the theory and as an extended demonstration of how to use pi-calculus to describe systems and analyse their properties. It covers the basic theory of pi-calculus, typed pi-calculi, higher-order processes, the relationship between pi-calculus and lambda-calculus, and applications of pi-calculus to object-oriented design and programming. The book is written at the graduate level, assuming no prior acquaintance with the subject, and is intended for computer scientists interested in mobile systems.
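A minimal example of mobility in standard pi-calculus notation: in the reduction
$\bar{x}\langle z\rangle.\,0 \;\mid\; x(y).\,\bar{y}\langle v\rangle.\,0 \;\longrightarrow\; 0 \;\mid\; \bar{z}\langle v\rangle.\,0,$
the communication on channel $x$ transmits the name $z$, which the receiver then uses as a channel in its continuation, so the communication topology of the system changes as it runs.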
In a technologically advanced and competitive landscape dominated by major tech companies and burgeoning start-ups, the key asset is the ability to grow monthly active users. Traditionally, product design has relied on fragmented insights from personal experience, common sense, or isolated experiments. This book endeavours to establish a theoretical framework for predicting and influencing the digital behaviour of technology users. Drawing on over a century of scientific research in behaviour, cognition, and physiology, it presents a comprehensive approach to customizing digital stimuli, with the objective of enhancing user interactions with digital and virtual environments. Through real-world and cost-effective examples, diagrams, and formulas, the text offers theoretical knowledge and a practical methodology for elevating digital product designs and setting them apart from the competition. With the potential to reshape the digital design landscape, this book promises to change how digital products and services are conceived and delivered.
There is a canonical and efficient way to extend a convergent presentation of a category, given by a 2-polygraph, into a coherent one. Precisely, the 3-cells used in this extension procedure are in one-to-one correspondence with the confluence diagrams of critical branchings in the polygraph. Now, if the polygraph is finite, so is the set of its critical branchings, and therefore the set of 3-cells generating coherence can be taken to be finite. In such a situation, the polygraph is said to have finite derivation type, or FDT. The relevance of this concept, introduced by Squier, lies in the following invariance property: if a category admits a finite presentation having finite derivation type, then all finite presentations of that category also have FDT. This invariance will prove essential in showing that some finitely presented categories do not admit convergent presentations. Using this property, Squier managed to produce an explicit example of a finitely presented monoid with decidable word problem but no finite convergent presentation. This provides a negative answer to the question of the universality of finite convergent rewriting.
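A minimal illustration of the extension procedure (not one of Squier's own examples): the convergent presentation $\langle a \mid \alpha : aa \Rightarrow a \rangle$ of the two-element monoid $\{1, a\}$ with $a^2 = a$ has a single critical branching, on the word $aaa$, whose two reductions $\alpha a$ and $a\alpha$ both yield $aa$ and are closed by a further application of $\alpha$; the resulting confluence diagram is the single 3-cell of the coherent extension, so this presentation has FDT.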
This chapter presents techniques for proving the termination of 3-polygraphs. A first method is based on a certain type of well-founded order, called a reduction order. Attention then turns to functorial interpretations: these amount to constructing a functor from the underlying category to another category which already bears a reduction order. This covers quite a few useful examples. To address more complex cases, a powerful technique, due to Guiraud, is presented, based on the construction of a derivation from the polygraph. Here, termination is obtained by specifying quantities on 2-cells which decrease during rewriting, based on information propagated by the 2-cells themselves.
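As a toy illustration of the reduction-order method, stated in the simpler setting of rewriting on words: if every rule of a presentation strictly decreases word length, as in $aba \Rightarrow b$, then the length function is a well-founded measure compatible with contexts and no infinite rewriting sequence exists; the techniques of this chapter carry this idea up one dimension, to 3-polygraphs whose rewriting steps transform 2-cells rather than words.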
The study of universal algebra, that is, the description of algebraic structures by means of symbolic expressions subject to equations, dates back to the end of the 19th century. It was motivated by the large number of fundamental mathematical structures fitting into this framework: groups, rings, lattices, and so on. From the 1970s on, the algorithmic aspect became prominent and led to the notion of term rewriting system. This chapter briefly revisits these ideas from a polygraphic viewpoint, introducing only what is strictly necessary for understanding. Term rewriting systems are introduced as presentations of Lawvere theories, which are particular cartesian categories. It is shown that a term rewriting system can also be described by a 3-polygraph in which variables are handled explicitly, i.e., by taking into account their duplication and erasure. Finally, a precise meaning is given to the statement that term rewriting systems are "cartesian polygraphs".
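As a standard example, the Lawvere theory of monoids is presented by a signature with one binary operation $\mu$ and one constant $\eta$, subject to the rewrite rules
$\mu(\eta, x) \Rightarrow x, \qquad \mu(x, \eta) \Rightarrow x, \qquad \mu(\mu(x, y), z) \Rightarrow \mu(x, \mu(y, z));$
in the corresponding 3-polygraph the variables $x$, $y$, $z$ are no longer implicit, and their duplication and erasure are performed by explicit structural cells.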
This appendix provides an explicit description of the free n-category generated by an n-polygraph. It is mostly inspired by the work of Makkai. A formal definition of the syntax of n-categories is first provided, describing morphisms in an (n+1)-category freely generated by an n-polygraph and allowing reasoning by induction on its terms in order to prove results about free categories. It turns out that this syntax for n-categories, which corresponds to the one used throughout the book, is very "redundant", in the sense that there are many ways to express a composite of cells which give rise to the same result, and it is sometimes not very practical for this reason. An alternative syntax, which suffers less from these problems, is obtained by restricting compositions. Finally, the word problem for free n-categories is briefly mentioned.
This chapter is dedicated to the definition of 2-polygraphs, which are a 2-dimensional generalization of 1-polygraphs. Before introducing this notion, a refined viewpoint on 1-polygraphs is given. Instead of merely focusing on the set presented by a 1-polygraph as a set of equivalence classes of generators modulo the relations, the free category generated by the polygraph is now considered. The notion of 2-polygraph naturally appears as soon as arbitrary, not necessarily free, small categories are considered. In order to present such a category, one starts with a polygraph whose 1-generators generate the morphisms of the category, but one must now take into account the relations that the category induces among the morphisms of the free category generated by the resulting 1-polygraph. These relations will be generated by a set of 2-generators, consisting of certain pairs of morphisms intended to be equalized in the category. Following the same pattern, it will be explained that a 2-polygraph can also be seen as a system of generators for a free 2-category, thus preparing the study of 3-polygraphs. The variant where a (2,1)-category is freely generated is also examined.
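A small example may help fix the pattern: the 2-polygraph with one 0-generator, one 1-generator $a$ and one 2-generator $aa \Rightarrow 1$ presents the monoid $\mathbb{Z}/2\mathbb{Z}$, seen as a category with a single object: the free category consists of the words $a^n$, and the 2-generator identifies two words exactly when their lengths have the same parity.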
This chapter introduces in full generality the central concept of this book, namely the notion of polygraph. Given an n-category, a cellular extension of it consists of attaching cells of dimension n+1 between certain pairs of parallel n-cells. This operation freely generates an (n+1)-category. Polygraphs are then obtained by starting with a set, considered as a 0-category, and inductively repeating the above process in all dimensions. The construction yields a fundamental triangle of adjunctions between omega-categories, polygraphs, and globular sets. A brief description of (n,p)-polygraphs, that is, the notion of polygraph adapted to (n,p)-categories, concludes the chapter.
This chapter introduces the notion of an acyclic extension of a 2-category, which consists of the additional data of 3-generators "filling all the spheres". This leads to the notion of a coherent presentation of a category, which consists of a 2-polygraph presenting the category together with an acyclic extension of the free (2,1)-category on the polygraph. Coherent presentations are then constructed from convergent ones, and the appropriate notion of Tietze transformation between coherent presentations is studied: this allows the formulation of a coherent variant of the Knuth-Bendix completion procedure, as well as a reduction procedure, which can be used to obtain smaller coherent presentations. Finally, coherent presentations of algebras are studied, thereby defining the proper notion of coherent extension for linear polygraphs.
The purpose of this chapter is to introduce the notion of a polygraphic resolution of an ω-category. This notion was introduced by Métayer to define a homology theory for ω-categories, now known as polygraphic homology. It was then shown by Métayer and Lafont that this homology recovers the classical homology of monoids for ω-categories coming from monoids. It is now known, by work of Lafont, Métayer, and Worytkiewicz, that these polygraphic resolutions are resolutions in the sense of a model category structure on ω-categories, the so-called folk model structure. Every ω-category is shown to admit such a resolution, and the relationship between two resolutions of the same ω-category is examined.