This chapter first establishes the fundamental definitions necessary to construct the approach: technique and technology, machine and dispositif. It discusses Foucault, Simondon, Crary, and Albera/Tortajada in the process. It then argues that there is a fundamental link between machines, images, and movement within the history of culture. It analyses the apparatuses invented by Filippo Brunelleschi during the Renaissance, before exploring the depiction of machines from the Renaissance to industrial drawing. Given these relations, this chapter argues that machines should be considered as archives, materializing the history of performance gestures and of the systems they have been a part of. A detailed analysis of the camera obscura and its historical variants, connecting the histories of art, of spectacles, and of science, exemplifies the approach.
Keywords: Machine, technology, dispositif, Gilbert Simondon, camera obscura, media epistemology
Today's proliferation of media, with their supports and equipment, has given urgency to the need to theorize the issues they raise and has, consequently, brought back into film theory, and media theory more generally, a vocabulary borrowed from the description of what Gilbert Simondon called ‘technical objects’: devices, instruments, machines, technologies, techniques, dispositifs. Because of the structural importance of these terms to the approach taken in this volume, it is important that we establish distinctions between them.
A Few Definitions
Technique/Technology
Historically, ‘technology’ is a term initially used to describe a field of study that emerged in English- and German-speaking milieux. The concept was invented in its modern sense by Christian Wolff in 1728 in his Preliminary Discourse on Philosophy in General. His work had no concrete consequences, but the concept was adopted more successfully, as a simultaneously theoretical and pedagogical project, by Johann Beckmann in 1772 and then in 1776 in the latter's Anleitung zur Technologie. Traces of it can be found in English in Jacob Bigelow's Elements of Technology of 1829. The goal of technology was to describe, classify, and analyse the technical operations of the mechanical arts, or ‘the science of the arts and of the works of art,’ in the words of Christian Wolff.
Kinemacolor, the first commercially exploited ‘natural colour’ process, has often been considered a step in the wrong direction for colour cinema. Yet it was an extraordinarily coherent system, based on a mechanical apparatus and embodying a whole conception of what cinema was, what it should be, how it should be made and sold, and what its place within culture was to be. Moreover, the characteristics of the process involved highly original perceptual traits that are of major theoretical interest today. Although it was technically invented by George Albert Smith, it was its promoter, Charles Urban, who gave it its real coherence. For Urban, Kinemacolor was conceived as a true reinvention of cinema. Cinema thus never ceases to be confronted with reinvention projects.
Keywords: Kinemacolor, Charles Urban, George Albert Smith, technical network, colour cinema, film technology
In the introduction to his doctoral dissertation on ‘the conquest of the snapshot’, the photography historian André Gunthert writes:
Any photographic image, the product of technique, contains an ensemble of information about the operational modalities which presided over its creation: an iconic document offered up for aesthetic reading, it is also a technological monument capable of becoming the subject of an archaeological interrogation.
This ‘also’ is tied up with a shift: attention to the technological, and to its traces in the image, transforms our gaze and moves us to the archaeological level. But this passage involves taking non-verbal elements into account: images, devices, diagrams, graphics, etc. Pierre Francastel, in his article ‘Valeurs sociologiques de l’espace-temps figuratif’, clearly demonstrated the importance of these non-verbal sources, and yet they create methodological problems for an archaeology: how (and of what) can they make an archive? It was questions of this sort that led Michel Foucault to pass from an archaeology based on discourses to an epistemology that takes dispositifs into account. I would like to illustrate these questions with a concrete object, one sometimes mentioned by archivists because of the singular problems it poses: the first cinematic natural colour process to be marketed commercially, Kinemacolor.
This process was invented in 1906 in England by George Albert Smith and was financed and marketed by Charles Urban until 1915.