In everyday tasks such as pick and place, the robot is required to reach a desired target location with its hand tip while operating in its environment. Such tasks become more complex in environments cluttered with obstacles, since the constraint of collision-free movement must also be taken into account. This paper presents a new technique based on genetic algorithms (GAs) to solve the path planning problem of articulated redundant robot manipulators. The efficiency of the proposed GA is demonstrated through multiple experiments carried out on several robots with redundant degrees of freedom. Finally, the worst-case computational complexity of the proposed solution is estimated.
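To make concrete the kind of objective such a GA optimises, here is a minimal Haskell sketch under our own assumptions (a planar arm with unit-length links, circular obstacles, collisions checked only at the joints, a hand-picked penalty weight); it illustrates the general idea only and is not the paper's encoding or fitness function.

-- Hypothetical encoding: a chromosome is one joint angle per joint.
type Angles = [Double]
type Point  = (Double, Double)

-- Forward kinematics of a planar arm with unit-length links: the positions
-- of every joint, ending with the hand tip.
jointPositions :: Angles -> [Point]
jointPositions thetas = scanl step (0, 0) (scanl1 (+) thetas)
  where step (x, y) a = (x + cos a, y + sin a)

dist :: Point -> Point -> Double
dist (x1, y1) (x2, y2) = sqrt ((x1 - x2) ^ 2 + (y1 - y2) ^ 2)

-- Lower cost is better: distance of the tip from the target, plus a large
-- penalty for every joint that enters a circular obstacle (a real planner
-- would also check points along each link, not just the joints).
cost :: Point -> [(Point, Double)] -> Angles -> Double
cost target obstacles thetas = dist tip target + 100 * fromIntegral hits
  where
    pts  = jointPositions thetas
    tip  = last pts
    hits = length [ () | p <- pts, (c, r) <- obstacles, dist p c < r ]

A GA over such chromosomes would then select, cross over and mutate angle vectors so as to minimise this cost.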
This paper describes an accurate and robust text alignment system for structurally different languages. Between structurally different languages such as Japanese and English, there is a limit to the number of word correspondences that can be acquired statistically. The main reason is that the systems of functional (closed) words are quite different in the two languages. The proposed method makes use of two kinds of word correspondences in aligning bilingual texts. One is a general-use bilingual dictionary. The other is the set of word correspondences acquired statistically during the alignment process. Our method gradually determines sentence pairs (anchors) that correspond to each other by relaxing parameters. By combining the two kinds of word correspondences, the method obtains word correspondences adequate for complete alignment. As a result, texts of various lengths and various genres in structurally different languages can be aligned with high precision. Experimental results show that our system outperforms conventional methods on various kinds of Japanese–English texts.
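As a rough illustration of how the two kinds of correspondences might be combined when scoring a candidate anchor pair, here is a small Haskell sketch under our own assumptions (sentences as lists of lemmatised words, both resources as maps from Japanese words to sets of English words); the names and the scoring rule are ours, not the paper's.

import qualified Data.Map as M
import qualified Data.Set as S

type Sentence = [String]
type Lexicon  = M.Map String (S.Set String)   -- Japanese word -> English words

-- Score a candidate Japanese/English sentence pair by the fraction of
-- Japanese words that can be linked to some English word in the pair,
-- using either the general-use dictionary or the correspondences acquired
-- statistically in earlier passes.
pairScore :: Lexicon -> Lexicon -> Sentence -> Sentence -> Double
pairScore dict acquired ja en =
    fromIntegral linked / fromIntegral (max 1 (length ja))
  where
    enWords      = S.fromList en
    candidates w = S.union (M.findWithDefault S.empty w dict)
                           (M.findWithDefault S.empty w acquired)
    linked       = length [ w | w <- ja
                               , not (S.null (S.intersection enWords (candidates w))) ]

-- Sentence pairs scoring above a threshold become anchors; the threshold is
-- then relaxed pass by pass, as the abstract describes.
isAnchor :: Double -> Lexicon -> Lexicon -> Sentence -> Sentence -> Bool
isAnchor threshold dict acquired ja en = pairScore dict acquired ja en >= threshold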
The Joint Venture Programme, newly launched by the Cranfield School of Management, offers help to companies that are looking to expand their business in developing countries. It is claimed that valuable European Community (EU) grants are being lost, and the Cranfield initiative offers a fast track through what they describe as the bureaucracy involved in making applications for aid to develop partnerships abroad. This is particularly true of companies involved with automation and robotics who, because of their intense development and marketing activities, fail to take advantage not only of EU programmes that offer financial assistance, but even of the help that is at hand from their own government's initiatives.
A practical implementation of a genetic algorithm for routing a real autonomous robot through a changing environment is described. Moving around in a production plant, the robot collects information about its environment and stores it in a temporal map, essentially a square grid, which takes account of changing obstacles. The evolutionary optimizer continuously searches for short paths in this map, using string representations of paths as chromosomes. The main features of the implementation include physical realization, random-walk exploration, temporal mapping, and dedicated genetic operators.
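A minimal Haskell sketch of this kind of string encoding and its evaluation against the temporal map, under our own assumptions (moves on a square grid, blocked cells held in a set, hand-picked penalty weights); the genetic operators themselves are omitted and the details are not the paper's.

import qualified Data.Set as S

type Cell = (Int, Int)
data Move = U | D | L | R deriving (Show, Eq)

step :: Cell -> Move -> Cell
step (x, y) U = (x, y + 1)
step (x, y) D = (x, y - 1)
step (x, y) L = (x - 1, y)
step (x, y) R = (x + 1, y)

-- Decode a chromosome (a string of moves) into the sequence of visited cells.
decode :: Cell -> [Move] -> [Cell]
decode start = scanl step start

-- Lower cost is better: path length, plus penalties for visiting cells that
-- the temporal map currently marks as blocked and for ending far from the
-- goal (Manhattan distance).
cost :: S.Set Cell -> Cell -> Cell -> [Move] -> Int
cost blocked start goal moves =
    length moves + 50 * hits + 10 * manhattan (last path) goal
  where
    path = decode start moves
    hits = length (filter (`S.member` blocked) path)
    manhattan (x1, y1) (x2, y2) = abs (x1 - x2) + abs (y1 - y2)

Because the map is updated as obstacles move, the same chromosome can change cost over time, which is what the continuously running optimizer exploits.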
This paper presents the software package SYMORO+ for the automatic symbolic modelling of robots. The package can generate the direct geometric model, the inverse geometric model, the direct kinematic model, the inverse kinematic model, the dynamic model, and the inertial parameter identification models.
The structure of the robots can be serial, tree-structured, or contain closed loops. The package runs on Sun workstations and PCs; it has been developed using MATHEMATICA and the C language. In this paper we give an overview of the algorithms used in the different models; the computational costs of the dynamic models of the PUMA robot are given.
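For orientation, the direct geometric model referred to here is the map from joint variables to the end-effector frame, obtained as a product of elementary homogeneous transforms between consecutive link frames; the sketch below uses one common modified-DH-style convention with parameters (alpha_j, d_j, theta_j, r_j), and is not necessarily SYMORO+'s exact notation.

% Direct geometric model as a product of frame-to-frame transforms
% (a common modified-DH-style convention; the notation is ours, not SYMORO+'s):
\[
  {}^{0}T_{n} \;=\; {}^{0}T_{1}\,{}^{1}T_{2}\cdots{}^{n-1}T_{n},
  \qquad
  {}^{j-1}T_{j} \;=\; \mathrm{Rot}(x,\alpha_j)\,\mathrm{Trans}(x,d_j)\,
                      \mathrm{Rot}(z,\theta_j)\,\mathrm{Trans}(z,r_j),
\]
% where (alpha_j, d_j, theta_j, r_j) are the geometric parameters of joint j.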
The popular method of enumerating the primes is the Sieve of Eratosthenes. It can be programmed very neatly in a lazy functional language, but runs rather slowly. A little-known alternative method is the Wheel Sieve, originally formulated as a fast imperative algorithm for obtaining all primes up to a given limit, assuming destructive access to a bit-array. This article describes functional variants of the wheel sieve that enumerate all primes as a lazy list.
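For reference, the "very neat but rather slow" lazy formulation that the abstract alludes to is usually written along the following lines in Haskell (a folklore trial-division version, not the article's wheel sieve).

-- The classic lazy "sieve": neat, but every surviving candidate is tested
-- against every smaller prime, which is why it runs rather slowly.
primes :: [Integer]
primes = sieve [2 ..]
  where
    sieve (p : xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

-- For example, take 10 primes yields [2,3,5,7,11,13,17,19,23,29].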
An [n, k, r]-partite graph is a graph whose vertex set, V, can be partitioned into n pairwise-disjoint independent sets, V1, …, Vn, each containing exactly k vertices, and the subgraph induced by Vi ∪ Vj contains exactly r independent edges, for 1 ≤ i < j ≤ n. An independent transversal in an [n, k, r]-partite graph is an independent set, T, consisting of n vertices, one from each Vi. An independent covering is a set of k pairwise-disjoint independent transversals. Let t(k, r) denote the maximal n for which every [n, k, r]-partite graph contains an independent transversal. Let c(k, r) be the maximal n for which every [n, k, r]-partite graph contains an independent covering. We give upper and lower bounds for these parameters. Furthermore, our bounds are constructive. These results improve and generalize previous results of Erdős, Gyárfás and Łuczak [5], for the case of graphs.
Lemke and Kleitman [2] showed that, given a positive integer d and d (not necessarily distinct) divisors of d, a1, …, ad, there exists a subset Q ⊆ {1, …, d} such that d = Σi∈Q ai, answering a conjecture of Erdős and Lemke. Here we extend this result, showing that, provided Σp|d 1/p ≤ 1 (where the sum is taken over all primes p), there is some collection from a1, …, ad which both sums to d and can itself be ordered so that each element divides its successor in the order. Furthermore, we shall show that the condition on the prime divisors is in some sense also necessary.
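Restated in symbols (our own paraphrase of the extension described above, not the paper's wording):

% Extension of the Lemke-Kleitman result, as paraphrased from the abstract:
% each a_i divides d, and under the condition on the prime divisors of d,
% some subcollection of the a_i sums to d and forms a divisor chain.
\[
  \sum_{p \mid d} \frac{1}{p} \le 1
  \;\Longrightarrow\;
  \exists\, i_1, \dots, i_m \ \text{(distinct indices)}:\quad
  a_{i_1} \mid a_{i_2} \mid \cdots \mid a_{i_m}
  \quad\text{and}\quad
  a_{i_1} + a_{i_2} + \cdots + a_{i_m} = d.
\]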
There is a wide diversity in the functioning and programming of robots designed and programmed to assist individuals with disabilities. The planning and structure of four rehabilitation robot implementations are presented. The first is the CURL language, developed for the human interface and the most widely used in this field. The second, MUSIIC, explores methods for direct manipulation of objects. RoboGlyph uses symbolic constructs to assist with the direction and programming of rehabilitation robots. Finally, a multi-tasking operating executive that controls a bilateral head-operated telerobot is discussed. These four implementations reflect a wide range of interface concepts for the intended users.
We study three comonads derived from the comma construction. The induced coalgebras correspond to the three concepts displayed in the title of the paper. The comonad that yields the *-autonomous categories is, in essence, the Chu construction, which has recently awakened much interest in computer science. We describe its couniversal property: it is right adjoint to the inclusion of *-autonomous categories among autonomous categories, with lax structure-preserving morphisms. Moreover, this inclusion turns out to be comonadic: *-autonomous categories are exactly the Chu-coalgebras.
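As a very small reminder of how the Chu construction is usually seen from the computer-science side, here is a Haskell rendering of a Chu space over a set k of "answers"; the names and the choice of sets as the base are ours and purely illustrative of the underlying duality, whereas the paper works at the level of comonads on autonomous categories.

-- An object of Chu(Set, k), concretely: two carriers (the type parameters
-- a and x) together with a pairing into k.
data Chu k a x = Chu { pairing :: a -> x -> k }

-- Duality swaps the two carriers and flips the pairing; this involution is
-- the heart of the *-autonomous structure carried by Chu.
dual :: Chu k a x -> Chu k x a
dual (Chu r) = Chu (flip r)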
In this paper we define a logical framework, called λTT, that is well suited for semantic analysis. We introduce the notion of a fibration ℒ1 : ℱ1 → 𝒞1 being internally definable in a fibration ℒ2 : ℱ2 → 𝒞2 (the definability used in this paper should not be confused with Bénabou's ‘definability’ (Bénabou 1985)). This notion amounts to distinguishing an internal category L in ℒ2 and relating ℒ1 to the externalization of L through a pullback. When ℒ1 and ℒ2 are the term models of typed calculi L1 and L2, respectively, we say that L1 is an internal typed calculus definable in the frame language L2. We will show by examples that if an object language is adequately represented in λTT, then it is an internal typed calculus definable in the frame language λTT. These examples also show a general phenomenon: if the term model of an object language has categorical structure S, then an adequate encoding of the language in λTT imposes an explicit internal categorical structure S in the term model of λTT, and the two structures are related via internal definability. Our categorical investigation of logical frameworks indicates a sensible model theory of encodings.
The traditional notions of strong and weak normalization refer to properties of a binary reduction relation. In this paper we explore an alternative approach to normalization, in which we bypass the reduction relation and instead focus on the normalization function, that is, the function that maps a term to its normal form. We work in an intuitionistic metalanguage, and characterize a normalization function as an algorithm that picks a canonical representative from the equivalence class of convertible terms. This means that we also get a decision algorithm for convertibility.
Such a normalization function can be constructed by building an appropriate model and a function quote, which inverts the interpretation function. The normalization function is then obtained by composing the quote function with the interpretation function. We also discuss how to get a simple proof of the property that constructors are one-to-one, which is usually obtained as a corollary of Church–Rosser and normalization in the traditional sense.
We illustrate this approach by showing how a glueing model (closely related to the glueing construction used in category theory) gives rise to a normalization algorithm for a combinatory formulation of Gödel System T. We then show how the method extends in a straightforward way when we add cartesian products and disjoint unions (full intuitionistic propositional logic under a Curry–Howard interpretation) and transfinite inductive types such as the Brouwer ordinals.
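To give a flavour of the glueing idea in code, here is a minimal Haskell sketch for untyped combinators only (our own simplification: just K and S, no types, no products or ordinals, and not the paper's formal development). Each semantic value carries the syntax it denotes together with, when it is applicable, the function it computes, and quote is simply the projection back to syntax.

data Tm = K | S | App Tm Tm
  deriving Show

-- A "glued" value: a syntactic term paired with its applicative behaviour
-- (Nothing means the value is not a function, or is stuck).
data Glu = Glu { syn :: Tm, fun :: Maybe (Glu -> Glu) }

-- Semantic application: use the function component if there is one,
-- otherwise build a stuck syntactic application.
app :: Glu -> Glu -> Glu
app f a = case fun f of
  Just g  -> g a
  Nothing -> Glu (App (syn f) (syn a)) Nothing

-- Interpretation: K and S get their usual meanings, glued to syntax.
eval :: Tm -> Glu
eval K         = glue2 K (\x _ -> x)
eval S         = glue3 S (\f g x -> app (app f x) (app g x))
eval (App t u) = app (eval t) (eval u)

-- Helpers that keep rebuilding the syntactic part as arguments arrive.
glue2 :: Tm -> (Glu -> Glu -> Glu) -> Glu
glue2 t f = Glu t (Just (\x ->
              Glu (App t (syn x)) (Just (\y -> f x y))))

glue3 :: Tm -> (Glu -> Glu -> Glu -> Glu) -> Glu
glue3 t f = Glu t (Just (\x ->
              Glu (App t (syn x)) (Just (\y ->
                Glu (App (App t (syn x)) (syn y)) (Just (\z -> f x y z))))))

-- "quote" is the projection to syntax; composed with eval it normalises.
normalise :: Tm -> Tm
normalise = syn . eval

-- For example, normalise (App (App K K) S) yields K.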
This article describes some mathematical methods for verifying properties of programs in higher-order, functional languages. We focus on methods for reasoning about equivalence of expressions. Such methods are often based upon a denotational semantics of the programming language in question, using the mathematical theory of domains (Scott 1982; Plotkin 1981a). Here I will describe some methods which are based upon operational semantics (Plotkin 1981b). These operationally-based techniques have several attractive features. For example, there is relatively little mathematical overhead involved in developing the basic theory—in contrast with that needed to develop the existence and properties of recursively defined domains, the sine qua non of denotational semantics. On the other hand, domain theory provides an extremely powerful tool for analysing recursive program constructs. I believe that any serious attempt to develop a useful theory for verification of program properties has to involve both operational and denotational techniques.
The main purpose of this article is to advertise the usefulness, for proving equivalences between functional programs, of co-inductive techniques more familiar in the context of concurrency theory (de Roever 1978; Park 1981; Milner 1989). They were imported into the world of lambda calculus and functional programming by several people: see Dybjer and Sander (1989); Abramsky (1990); Howe (1989, 1996); Egidi, Honsell and della Rocca (1992); and Gordon (1994).
This book is based on material presented at a summer school on Semantics and Logics of Computation that took place at the Isaac Newton Institute for Mathematical Sciences, Cambridge UK, in September 1995. The school was sponsored by the EU ESPRIT Basic Research Action on Categorical Logic in Computer Science (CLiCS I & II) and aimed to present some modern developments in semantics and logics of computation in a series of graduate-level lectures. Most of the material presented here has not previously been accessible in such a coherent and digestible form. This Preface gives a thematic overview of the contents of the book. It also briefly sketches the history of the two CLiCS projects which came to an end with the summer school.
Games, proofs and programs
One of the most exciting recent developments in programming semantics has been the use of games and strategies to provide a more fine-grained semantics than that provided by domain-theoretic models. This ‘intensional semantics’ aims to combine the good mathematical and structural properties of traditional denotational semantics with the ability to capture dynamical and interactive aspects of computation, and to embody the computational intuitions of operational semantics. More pragmatically, game semantics have been used to solve long-standing ‘full abstraction’ problems for PCF (a simple language for Programming Computable arithmetic Functions of higher type) and Idealised Algol. In other words, games have provided syntax-independent models of these languages which precisely capture the operationally defined notion of ‘contextual equivalence’ of program expressions.
This course is an introduction to the research trying to connect the proof theory of classical logic and computer science. We omit important and standard topics, among them the connection between the computational interpretation of classical logic and the programming operator callcc.
Instead, here we put the emphasis on actual mathematical examples. We analyse the following questions: what can be the meaning of a non-effective proof of an existential statement, a statement that claims the existence of a finite object that satisfies a decidable property? Is it clear that a non-effective proof has a meaning at all? Can we always say that this proof contains implicitly, if not explicitly, some effective witness? Is this witness unique? By putting the emphasis on actual mathematical examples, we follow Gentzen who founded natural deduction by analysing concrete mathematical examples, like Euclid's proof of the infinity of prime numbers.
We present general methods that can be used to compute such witnesses effectively. The three methods we shall analyse are:
The negative translation. This is a quite natural translation of classical logic into intuitionistic logic, which is both simple and powerful; one standard set of its defining clauses is sketched after this list.
A game-theoretic interpretation of classical logic, which is a suggestive reformulation of the Gentzen-Novikov sequent calculus and may in some cases be more convenient than the negative translation.
The use of formal topology. This consists of realising some conditions in a topological model. In some cases it is then possible to realise these conditions effectively, while an “absolute” realisation of these conditions is impossible.
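For the first method, one standard formulation of the negative (Gödel-Gentzen) translation A ↦ A^N is the following; we include it only as a reminder, since the course itself works through mathematical examples rather than the bare clauses, and the notes may use a different but equivalent variant.

% The Gödel-Gentzen negative translation, clause by clause (a standard
% formulation; assumed here, not quoted from the course notes):
\begin{align*}
  P^N                &= \neg\neg P \quad\text{($P$ atomic)} \\
  (A \wedge B)^N     &= A^N \wedge B^N \\
  (A \vee B)^N       &= \neg(\neg A^N \wedge \neg B^N) \\
  (A \to B)^N        &= A^N \to B^N \\
  (\neg A)^N         &= \neg A^N \\
  (\forall x\, A)^N  &= \forall x\, A^N \\
  (\exists x\, A)^N  &= \neg \forall x\, \neg A^N
\end{align*}
% If A is provable classically, then A^N is provable intuitionistically,
% which is what makes the translation a tool for extracting witnesses.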