At the invitation of Prof. Jianwei Ma of the Department of Mathematics, Prof. François-Xavier Le Dimet of the Department of Mathematics, Université Grenoble 1, France, an internationally renowned expert in data assimilation, will visit our department for one month and give a series of lectures on data assimilation. All are welcome to attend.

Prof. Le Dimet introduced the variational data assimilation method in 1982; the paper has since been cited more than a thousand times, and the method is used in operational weather forecasting by many institutions (e.g., the European, Japanese, and Canadian meteorological centres). In recent years, Prof. Le Dimet has focused on applications of data assimilation to marine pollution, flood disasters, and related fields.

First lecture: October 15, 9:00-10:00, Room 503, Gewu Building. Abstracts of the lectures appear below; the times and venues of the subsequent lectures will be announced later.

**DATA ASSIMILATION: A PARADIGM FOR MODELING**
**François-Xavier Le Dimet**
**Université Grenoble Alpes and INRIA**
During the last decades, mathematical modeling has undergone enormous development from the theoretical, physical, and computational viewpoints. Its domains of application range from physics, geophysics, and biology to the human and social sciences. A question we may ask is: modeling for what purpose, and how? Modeling has two main goals: understanding and predicting the evolution of a system, either in the "real world" or after a virtual perturbation. Of course, understanding is a preliminary step before predicting. It is necessary to distinguish two basic situations:

- Linear models: the model has global properties, summarized by its eigenvalues and eigenfunctions. A good knowledge of these elements makes it possible to understand and predict. Of course, the analysis of the model is far from a trivial problem!
- Nonlinear models: most of the time the model has no global properties, only local ones in the vicinity of particular states; understanding or predicting the system therefore requires additional information. This information may be physical (observations of the state of the system) or statistical (on the behavior of the system or on some approximate state of it).

Data assimilation is the ensemble of methods able to link heterogeneous sources of information together in order to retrieve "at best" the state of a system. In these talks we will mainly consider methods based on variational formulations and on optimal control theory.

In the first talk we will present the general formalism of Variational Data Assimilation (VDA) and some applications to geophysical fluids (atmosphere, ocean, continental waters). The second and third talks will be devoted to error propagation: both models and data carry errors, and the question we ask is: what is their impact on the final analysis? It is crystal clear that a prediction makes sense only if we can estimate this error. We will see how to obtain this estimation. The fourth talk will consider new sources of information, for instance images and their dynamics issued from the observation of the system. How can images be coupled with mathematical models? Even though we will mainly focus on geophysical applications, all these methods can be applied, and sometimes are, to other scientific domains ranging from nuclear physics to medical imaging and agronomy.
**TALK 1.**
**A GENERAL FORMALISM FOR VARIATIONAL DATA ASSIMILATION.**
Classical meteorological prediction is based on the numerical integration of the equations governing the atmosphere, from an initial time (today) to the date of the prediction (tomorrow). But this initial condition is unknown; therefore, prior to the prediction, we need to retrieve the state of the system. The ingredients we have are:

- a mathematical model;
- a set of observations distributed in space and time;
- statistics of the errors on the observations and on past predictions;
- an a priori estimate of the state at the initial time.

Applying optimal control methods (the control variable being the initial condition) leads to an Optimality System (O.S.), which is the Euler-Lagrange equation of the optimization problem. The O.S. contains the model and the so-called adjoint model, whose backward integration gives the gradient of the function measuring the discrepancy between the model and the observations. This method is presently used by the main meteorological services worldwide. We will describe the difficulties from the mathematical, computational, and coding points of view. Several examples will be shown in meteorology, oceanography, and hydrology.
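The adjoint recipe can be sketched on a toy problem: a small linear model, noisy observations of every state, and the cost J(x0) = 1/2 Σ_k ||x_k − y_k||², minimized over the initial condition with the gradient supplied by one backward integration of the adjoint. The model matrix, dimensions, and noise level below are illustrative assumptions, not an operational setup.

```python
import numpy as np

# Toy variational data assimilation sketch: retrieve the initial condition of
# a small linear model from noisy observations; the gradient of the cost is
# obtained by one backward integration of the adjoint model.

rng = np.random.default_rng(0)
n, N = 3, 8                          # state dimension, number of time steps
M = np.array([[0.9, 0.1, 0.0],      # linear model: x_{k+1} = M x_k
              [0.0, 0.8, 0.2],
              [0.1, 0.0, 0.9]])

def forward(x0):
    """Integrate the model, returning the trajectory x_0, ..., x_N."""
    xs = [x0]
    for _ in range(N):
        xs.append(M @ xs[-1])
    return xs

x_true = np.array([1.0, -0.5, 2.0])
obs = [xk + 0.01 * rng.standard_normal(n) for xk in forward(x_true)]  # noisy observations

def cost_and_grad(x0):
    """J(x0) = 1/2 sum_k ||x_k - y_k||^2 and dJ/dx0 via the adjoint model."""
    xs = forward(x0)
    J = 0.5 * sum(np.sum((xk - yk) ** 2) for xk, yk in zip(xs, obs))
    lam = xs[N] - obs[N]                # adjoint variable at the final time
    for k in range(N - 1, -1, -1):      # backward: lam_k = M^T lam_{k+1} + (x_k - y_k)
        lam = M.T @ lam + (xs[k] - obs[k])
    return J, lam                       # lam is now the gradient dJ/dx0

x0 = np.zeros(n)                        # first guess
for _ in range(500):                    # steepest descent (real systems use quasi-Newton)
    _, g = cost_and_grad(x0)
    x0 -= 0.05 * g
print(np.round(x0, 3))                  # close to x_true
```

The point of the adjoint is the cost: one forward plus one backward integration yields the full gradient, whatever the dimension of the initial condition, where naive finite differences would require one forward run per component.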
**TALK 2**
**SECOND ORDER METHODS; SENSITIVITY ANALYSIS.**
The problem of Variational Data Assimilation, considered in the framework of optimal control theory, leads to an Optimality System containing all the available information: model, data, and statistics. Its solution gives a necessary condition for optimality. Introducing a second-order adjoint, a second-order analysis can be carried out, with important applications such as:

- improvement of the optimization algorithms;
- sensitivity analysis;
- estimation of a posteriori statistics on the solution.

In this talk we will mainly consider sensitivity analysis. A sensitivity study starts from a response function depending on a parameter through the state of the system, the solution of the model; by definition, the sensitivity is the gradient of the response function with respect to this parameter. If we consider the impact of observation errors on the retrieved fields, then these variables are linked only through the optimality system. Sensitivity analysis therefore requires going one step further: it becomes necessary to differentiate the O.S. itself, which brings to light a second-order adjoint (SOA). The SOA gives access to the product of the Hessian of the cost function with any vector; it then becomes possible to carry out second-order optimization methods and to access the spectrum of the Hessian. Sensitivity analysis will be applied to a water pollution problem to identify a source.
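The key SOA deliverable, Hessian-vector products without ever forming the Hessian, can be illustrated on a toy quadratic cost J(x0) = 1/2 Σ_k ||M^k x0 − y_k||² for a small linear model: one forward (tangent) sweep plus one backward (adjoint) sweep returns H v. The model matrix and cost are illustrative assumptions.

```python
import numpy as np

# Hessian-vector products in the spirit of the second-order adjoint (SOA):
# one forward (tangent) sweep and one backward (adjoint) sweep give H v
# without assembling the Hessian H.

N = 8
M = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.1, 0.0, 0.9]])

def hessian_vector(v):
    """Return (d^2J/dx0^2) v; for this quadratic cost H = sum_k (M^k)^T M^k."""
    dxs = [v]                        # tangent linear model, forward in time
    for _ in range(N):
        dxs.append(M @ dxs[-1])
    mu = dxs[N]                      # backward sweep, forced by the tangent trajectory
    for k in range(N - 1, -1, -1):
        mu = M.T @ mu + dxs[k]
    return mu

# With H v available, power iteration exposes the top of the Hessian spectrum,
# and H v is exactly what Newton-type (second-order) optimization methods need.
v = np.ones(3)
for _ in range(200):
    v = hessian_vector(v)
    v /= np.linalg.norm(v)
lam_max = v @ hessian_vector(v)
print(lam_max)
```

The same two-sweep structure carries over to nonlinear models, where the backward sweep involves the second derivative of the model along the reference trajectory; here the model is linear, so that term vanishes.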
**TALK 3**
**SECOND ORDER METHODS; EVALUATION OF A POSTERIORI ERROR COVARIANCES.**
In the definition of VDA we introduce the covariance matrix of the prediction error; for the next step of the prediction, we therefore have to determine this operator for the "next day". How should we proceed? For linear models this covariance matrix is the inverse of the Hessian of the cost function, so the information can be obtained from the SOA. For nonlinear problems this is no longer true. Much work has been devoted to "ensemble methods", which compute the covariance by sampling the trajectories of the predictions, but this is very costly from the computational point of view because a large number of trajectories is necessary for the covariance estimation. We will present a method that adds a correcting term to the inverse of the Hessian, leading to an improved estimation of the covariance at a lower cost. The comparison is carried out on a weakly nonlinear model and a strongly nonlinear model.
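In the linear-Gaussian case, the statement "the covariance matrix is the inverse of the Hessian" can be checked directly against a brute-force ensemble. The sketch below (model, dimensions, and noise level are illustrative assumptions, and this is not the correction method of the talk) compares σ²H⁻¹ with the sample covariance of thousands of repeated analyses:

```python
import numpy as np

# Linear-Gaussian toy check of "a posteriori covariance = inverse Hessian".
# For a linear model and Gaussian observation noise, the covariance of the
# analysis error equals sigma^2 H^{-1}; an ensemble of repeated analyses
# recovers the same matrix, but at a much higher cost.

rng = np.random.default_rng(1)
n, N, sigma = 3, 8, 0.1
M = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.1, 0.0, 0.9]])
Mk = [np.linalg.matrix_power(M, k) for k in range(N + 1)]

H = sum(A.T @ A for A in Mk)              # Hessian of J(x0) = 1/2 sum_k ||M^k x0 - y_k||^2
P_theory = sigma ** 2 * np.linalg.inv(H)  # a posteriori covariance (linear case)

# Brute-force ensemble: redo the whole analysis for many noise realizations.
x_true = np.array([1.0, -0.5, 2.0])
errors = []
for _ in range(4000):
    ys = [A @ x_true + sigma * rng.standard_normal(n) for A in Mk]
    x_hat = np.linalg.solve(H, sum(A.T @ y for A, y in zip(Mk, ys)))  # normal equations
    errors.append(x_hat - x_true)
P_ens = np.cov(np.array(errors).T)

print(np.linalg.norm(P_ens - P_theory) / np.linalg.norm(P_theory))   # small
```

The ensemble needs thousands of complete analyses to approach the exact answer, which is precisely the computational cost the abstract's Hessian-based correction aims to avoid.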
**TALK 4**
**ASSIMILATION OF IMAGES.**
This talk deals with information issued from satellite imagery. At the present time the Earth is permanently observed by a large number of satellites in several wavelengths, and for the ocean this imagery is a rich source of information. Images can be considered from two viewpoints:

- Eulerian approach: the image is considered as a set of pixels; to each pixel is associated a radiance, and then a physical quantity measuring an oceanic variable such as the sea surface temperature (SST). These data, when available, can be plugged into a regular scheme of data assimilation.
- Lagrangian approach: an image with a uniform gradient, even if it has potential Eulerian information, does not carry any Lagrangian information; on the contrary, the evolution of images with strong gradients, i.e., images with visible discontinuities (e.g., filaments, fronts, vortices), provides information on the evolution of the fields.

While this information is currently used in a qualitative way, the remaining question is how to couple it with a numerical model.
In this talk we will see how data assimilation methods can be used to combine these sources of information in order to retrieve the state of a system. We will consider several approaches and an application to oil spills.
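The Lagrangian point, that an image with strong gradients constrains the motion, can be illustrated with the classical optical-flow constraint I_t + u I_x + v I_y = 0, solved here in a least-squares sense for a single uniform velocity on two synthetic frames. The pattern, grid, and velocity are illustrative assumptions, a simplified stand-in for real SST imagery rather than an image-assimilation scheme.

```python
import numpy as np

# Lagrangian information in image dynamics: the optical-flow constraint
#   I_t + u * I_x + v * I_y = 0
# solved in the least-squares sense for one uniform velocity (u, v).

x = np.linspace(0.0, 2.0 * np.pi, 64)
X, Y = np.meshgrid(x, x)                 # X varies along axis 1, Y along axis 0

def frame(sx, sy):
    # a pattern with strong gradients ("fronts"), hence usable Lagrangian content
    return np.tanh(3.0 * np.sin(X - sx) * np.cos(Y - sy))

dt = 0.1
u_true, v_true = 0.2, 0.1
I0 = frame(0.0, 0.0)
I1 = frame(u_true * dt, v_true * dt)     # same pattern advected by (u, v) during dt

Iy, Ix = np.gradient(I0, x, x)           # spatial gradients (axis 0 is y)
It = (I1 - I0) / dt                      # temporal derivative

# Least squares for (u, v): minimize ||It + u*Ix + v*Iy||^2 over all pixels.
A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
uv, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
print(uv)                                # close to (u_true, v_true)
```

For an image with a spatially uniform gradient, the matrix A has rank one and only the velocity component along the gradient is determined (the aperture problem), which is one way to read the remark above that such images carry no Lagrangian information.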