Galileo's inclined planes and curves for the study of motion
First good news: our seismic data processing paper has been accepted at ICASSP 2014 (an acceptance rate just below 50%: 1745 papers out of 3492). Second good news: it takes place in Florence, a magnificent city not only for the arts (the conference motto is "The Art of Signal Processing") but for science too. The Galileo (1564-1642) History of Science Museum is a must-see.

In our era, we can afford to run time-consuming, brute-force computations, changing only one hyper-parameter at a time, to solve inverse restoration problems. We benefit from neat integro-differential settings to solve convex optimization problems. It is difficult to imagine a time when people endeavored to build science as we know it today with very imprecise time measurements: imagine comparing the duration of free fall of two objects (a 1 kg lead ball and a 1 kg feather ball, for instance) with a water clock, or clepsydra. Infinitesimal calculus only blossomed with Newton and Leibniz between 1666 and 1674. Yet people were able to "prove" important properties of motion, such as the tautochronism of the cycloid (the curve traced by a point on a rolling circle). Take a wooden gutter shaped as a cycloid and drop two balls from different heights along the frame: they arrive at the bottom at the same time. The video is available here, at the Galileo virtual museum. This "tautochrone" phenomenon (meaning "same time" in Greek) was discovered by Christiaan Huygens around the 1650s. It was instrumental in the development of "more perfect pendulum motions", at the core of modern pendulum clocks, which allowed increased precision in science as well as longer and less hazardous sea voyages around the world. The clock-making industry is closely tied to the longitude problem.
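For the curious, here is a brief mathematical aside (standard textbook material, not part of the original story) recalling what the cycloid is and what the tautochrone property states:

```latex
% Cycloid traced by a point on a circle of radius a rolling on a line:
\begin{align}
  x(\theta) &= a\,(\theta - \sin\theta), &
  y(\theta) &= a\,(1 - \cos\theta), \qquad \theta \in [0, 2\pi].
\end{align}
% Tautochrone property: on the inverted arch, the descent time to the
% lowest point under gravity g is the same for every release point:
\begin{equation}
  T = \pi \sqrt{\frac{a}{g}}.
\end{equation}
```

This amplitude independence is precisely what Huygens exploited to regulate pendulum motion.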
The special session "Seismic Signal Processing" is organized by Leonardo Duarte, Daniela Donno, Renato R. Lopes and João Romano.
A fundamental problem in geophysics is to estimate the properties of the Earth's subsurface from measurements acquired by sensors located over the area to be analyzed. Among the different methods available to accomplish this task, seismic reflection is the most widespread and has been intensively applied to hydrocarbon exploration. Characterization of the subsoil with seismic reflection techniques is conducted by recording the wave field originating from the interaction between the environment under analysis and a seismic wave generated by controlled active sources (e.g. a dynamite explosion in land acquisition). Signal processing (SP) plays a fundamental role in seismic reflection: to extract relevant information from seismic data, one has to perform tasks such as filtering, deconvolution, and signal separation. Originally, there was a close interaction between the signal processing and geophysics communities; for instance, important achievements in deconvolution and the wavelet transform were obtained in the context of seismic data. Nowadays, however, this interaction has been partially lost. As a consequence, geophysicists are not always aware of the most recent SP methods and, on the other hand, the SP community pays only weak attention to this interesting application. Given this panorama, the main goals of this special session are to shed some light on research in seismic signal processing, and to broaden and reinforce collaboration between the signal processing and geophysics research communities. With this goal in mind, the session comprises works on important theoretical and practical topics that arise in seismic signal processing.
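To make the deconvolution task mentioned above concrete, here is a minimal sketch of the classical convolutional model of a seismic trace. The wavelet choice, sampling, reflector positions and noise level are illustrative assumptions, not taken from the session description:

```python
import numpy as np

def ricker_wavelet(f0=25.0, dt=0.002, length=0.128):
    """Ricker (Mexican hat) wavelet of peak frequency f0 [Hz], sampled at dt [s]."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Sparse reflectivity series: a few reflectors at assumed sample positions.
n = 500
reflectivity = np.zeros(n)
reflectivity[[80, 210, 350]] = [0.8, -0.5, 0.3]

# Convolutional model: trace = wavelet * reflectivity + noise.
wavelet = ricker_wavelet()
rng = np.random.default_rng(0)
trace = np.convolve(reflectivity, wavelet, mode="same") + 0.02 * rng.standard_normal(n)

# Deconvolution aims at recovering `reflectivity` from `trace`, knowing (or
# estimating) `wavelet`: an ill-posed inverse problem, hence the need for
# regularization and constraints.
```

The sketch only illustrates the forward model; actual seismic processing chains add filtering and signal separation steps on top of it.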
The accepted presentation is: "A constrained-based optimization approach for seismic data recovery problems".
Abstract: Random and structured noise both affect seismic data, hiding the reflections of interest (primaries) that carry meaningful geophysical interpretation. When the structured noise is composed of multiple reflections, its adaptive cancellation is obtained through time-varying filtering, compensating inaccuracies in given approximate templates. The under-determined problem can then be formulated as a convex optimization one, providing estimates of both filters and primaries. Within this framework, the criterion to be minimized mainly consists of two parts: a data fidelity term and hard constraints modeling a priori information. This formulation may avoid, or at least facilitate, some parameter determination tasks, usually difficult to perform in inverse problems. Not only classical constraints, such as sparsity, are considered here, but also constraints expressed through hyperplanes, onto which the projection is easy to compute. The latter constraints lead to improved performance by further constraining the space of geophysically sound solutions.
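To illustrate the "hard constraints with easy projections" idea from the abstract, here is a small, self-contained toy sketch. It uses a plain projected-gradient / POCS-style iteration, not the primal-dual proximal algorithm of the actual paper, and every matrix, constraint and parameter below is an assumption made for demonstration:

```python
import numpy as np

def project_hyperplane(x, a, b):
    """Closed-form projection of x onto the hyperplane {z : <a, z> = b}."""
    return x - (a @ x - b) / (a @ a) * a

def project_l1_ball(x, radius):
    """Euclidean projection onto the l1 ball of given radius (sparsity constraint)."""
    if np.abs(x).sum() <= radius:
        return x
    u = np.sort(np.abs(x))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u - (css - radius) / np.arange(1, len(x) + 1) > 0)[0][-1]
    tau = (css[k] - radius) / (k + 1)
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Toy constrained recovery: minimize ||y - H x||^2 subject to
#   ||x||_1 <= radius  and  <a, x> = b,
# handled by alternating projections after each gradient step.
rng = np.random.default_rng(1)
H = rng.standard_normal((60, 120))
x_true = np.zeros(120)
x_true[[5, 40, 90]] = [1.0, -0.7, 0.4]
y = H @ x_true + 0.01 * rng.standard_normal(60)
a, b = np.ones(120), x_true.sum()      # hyperplane constraint (assumed known)
radius = np.abs(x_true).sum()          # l1 budget (assumed known)

x = np.zeros(120)
step = 1.0 / np.linalg.norm(H, 2) ** 2
for _ in range(500):
    x = x - step * H.T @ (H @ x - y)   # gradient step on the data-fidelity term
    x = project_l1_ball(x, radius)     # sparsity constraint
    x = project_hyperplane(x, a, b)    # hyperplane constraint, cheap to apply
```

The point of the sketch is simply that both constraint sets admit inexpensive projections, which is what makes such constrained formulations practical; the journal paper cited below handles the full time-varying filtering model with a proper primal-dual proximal scheme.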
This conference presentation is strongly related to the journal paper "A Primal-Dual Proximal Algorithm for Sparse Template-Based Adaptive Filtering: Application to Seismic Multiple Removal" [page|pdf|blog|arxiv], accepted in 2014 for publication in IEEE Transactions on Signal Processing.