Colloquium: 2004-2005

FALL 2004

September 9

The Role of the Pressure for Navier-Stokes Blow-Up

Jens Lorenz, University of New Mexico

Abstract: In this talk I consider the blow-up question for the solutions of the Navier-Stokes equations, one of the millennium problems. After reviewing blow-up conditions for some simpler models, I will focus on the particular difficulties posed by the Navier-Stokes equations.

To analyze the equations, it is common to first eliminate the pressure by applying the Helmholtz projector. However, since pressure differences drive the flow, the pressure term is essential for understanding blow-up conditions. In this lecture, I will describe properties of the pressure, assuming that blow-up does occur.

September 23

Complex Approximation

Laszlo Lempert, Purdue University

Abstract: Originally, complex approximation was about approximating a holomorphic function, defined on an open subset of the complex plane, by polynomials. Subsequently one was led to consider more general problems: the function to be approximated had several, perhaps infinitely many variables, and the approximating functions could come from classes of functions other than polynomials.

In my talk I will describe one line of research, starting with Carl Runge's discovery from 1885 and leading up to recent results. I will also explain why approximations are in a sense inevitable in complex analysis/geometry.

September 30

The Art and Science of Achieving Harmonics on Stringed Instruments

Steven Cox, Rice University

Abstract: One may elicit the qth tone of a string by applying the 'correct touch' at one of its q-1 nodes during a simultaneous pluck or bow. This effect was first scored for violin by Mondonville in 1735. Though it captured the attention of the 19th century masters, Chladni, Tyndall, Helmholtz and Rayleigh, it remained for Bamberger, Rauch and Taylor in 1982 to develop and analyze the first mathematical model of harmonics. Their 'touch' is a damper of magnitude b concentrated at the node p/q. The 'correct touch' is that b for which the modes that do not vanish at p/q are maximally damped. We here examine the associated spectral problem. We find the spectrum to be periodic and determined by a polynomial of degree q-1. We establish lower and upper bounds on the spectral abscissa and show that the set of associated root vectors constitutes a Riesz basis in the natural energy space, and so identify the 'correct touch' with the b that minimizes the spectral abscissa.
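
As a companion to the abstract above, the following is a rough numerical sketch, not the analysis of the talk, of the spectral problem it describes: a wave equation on (0,1) with a point damper of strength b at the node x = p/q, discretized by finite differences, with the spectral abscissa of the damped modes scanned over b. All parameter values (q = 3, the grid size, the range of b) are illustrative choices.

    import numpy as np

    p, q = 1, 3                # touch the node at x = 1/3 to elicit the 3rd tone
    Np1 = 120                  # number of grid intervals, chosen divisible by q
    h = 1.0 / Np1
    node = Np1 * p // q - 1    # 0-based index of the interior grid point at x = p/q
    N = Np1 - 1                # number of interior points

    # Discrete Laplacian with Dirichlet boundary conditions
    A = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
         + np.diag(np.ones(N - 1), -1)) / h**2

    def damped_abscissa(b):
        """Largest real part among the damped eigenvalues of the first-order system."""
        C = np.zeros((N, N))
        C[node, node] = b / h             # point damper of magnitude b, approximated on the grid
        M = np.block([[np.zeros((N, N)), np.eye(N)],
                      [A, -C]])
        lam = np.linalg.eigvals(M)
        damped = lam[lam.real < -1e-8]    # modes vanishing at p/q are untouched by the damper
        return damped.real.max() if damped.size else 0.0

    bs = np.linspace(0.1, 6.0, 40)
    alphas = [damped_abscissa(b) for b in bs]
    print("approximate 'correct touch' b:", bs[int(np.argmin(alphas))])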

October 7

Using Auxiliary Information in Survey Sampling: A Quantile Regression Approach

Constantin Georgescu, Tulane University

Abstract: Improved estimates of a survey population parameter can be obtained by using information about a correlated auxiliary variable. There is a huge literature on such methods for estimating the mean. In this talk we explore some median versions of these mean-based methods.

Following a brief overview of quantile regression, the new quantile-based estimator is introduced and some of its properties are examined. A proof of consistency for the marginal quantile estimator and of the validity of a Bahadur expansion for the conditional quantile estimator is included. Then, a look through the lens of the conditional double-exponential likelihood model reveals the connections of the quantile-based estimator to some classical estimators.

Some simulation results exploring the performance of the new estimator conclude the presentation.
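
The following toy sketch illustrates the general idea of borrowing strength from a known auxiliary variable through a median (quantile) regression fit; it is purely illustrative and is not the estimator studied in the talk. The simulated population, the pinball-loss fit, and the projection-type median estimate are all assumptions made for this example.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    N_pop, n = 10_000, 200                    # population and sample sizes
    x_pop = rng.gamma(shape=2.0, scale=1.0, size=N_pop)            # auxiliary variable, known for every unit
    y_pop = 3.0 + 2.0 * x_pop + rng.standard_t(df=3, size=N_pop)   # study variable
    s = rng.choice(N_pop, size=n, replace=False)                   # simple random sample
    x_s, y_s = x_pop[s], y_pop[s]

    def pinball(beta, tau=0.5):               # check-function loss; tau = 0.5 gives median regression
        r = y_s - (beta[0] + beta[1] * x_s)
        return np.mean(np.maximum(tau * r, (tau - 1) * r))

    beta = minimize(pinball, x0=np.array([0.0, 1.0]), method="Nelder-Mead").x
    est = np.median(beta[0] + beta[1] * x_pop)   # median of fitted values over the whole population
    print("naive sample median:        ", np.median(y_s))
    print("auxiliary-assisted estimate:", est)
    print("true population median:     ", np.median(y_pop))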

October 13

Can Viscosity and Forcing in a Fluid Flow Result in Chaotic Advection?

Chris Jones, University of North Carolina

Abstract: The Lagrangian (fluid particle) trajectories of a steady Euler flow in 2D are determined by an integrable system and hence exhibit no chaotic motion. The question then naturally arises as to whether a perturbed flow, which incorporates the physical effects of viscosity and forcing, can be chaotic. This problem is non-trivial since the perturbation is added at the level of the full partial differential equation but the potential chaos is at the level of the particle trajectories. It brings up many issues including long-time existence for 2D Navier-Stokes and finite-time Melnikov theory.

October 14

Renormalized Chern Forms

Dan Burns, University of Michigan

Abstract: Renormalized Chern forms are invariants of a complex manifold X with strongly pseudoconvex boundary M. They give rise to numerical invariants because, though X may have infinite (invariant) volume, the characteristic numbers are given by convergent integrals. Prompted by numerous examples in two complex dimensions, JS Ryu and the speaker have shown that if the boundary M is locally spherical (CR equivalent to the sphere), then these forms give rational cohomology classes. The methods of proof involve analytic continuation, a proof that monodromy eigenvalues are roots of unity, and a residue calculation. Speculations on the real analogue and analytic applications will also be discussed, as well as further applications of the analytic continuation technique.

October 21

No Activity

October 28

No Activity

November 4

Do Shadows Leave Impressions?

John Mayer, University of Alabama at Birmingham

Abstract: A simply-connected open set U in the plane R² can have a "nice" boundary (a circle, or homeomorphic image thereof) or a "nasty" one—with lots of interesting possibilities (to a topologist) for "nasty". There is a theory, useful in plane topology and in dynamics of the complex plane, that views even nasty boundaries from the point of view of the nicest one, a circle. Imagine that you have a map of U with polar coordinates provided by the Riemann Mapping Theorem. That is, there is a complex analytic homeomorphism h from the open unit disk D (with nice radial rays and concentric circles as its coordinates) onto U (with images of those rays and circles to provide coordinates). You stand at the image h(0) in U of the center 0 of D and walk along an image of a radial ray toward the boundary of U. Assume the sun is at h(0) and its illuminating rays follow the radial coordinates that h imposes on U. You are thicker than a ray, so you cast a shadow on the boundary. Suppose this shadow gets narrower as you get further from the sun and closer to the boundary. If the boundary is "nice" (like a circle, say), then your limiting shadow is a point on the boundary—surely a trivial impression! What is your limiting shadow if the boundary is "nasty" in the direction you are walking? Further progress on understanding the connected Julia sets of certain polynomials requires understanding shadows that leave nasty impressions.

November 11

Composite Materials: An Old Field of Study Full of New Surprises

Graeme Milton, University of Utah

Abstract: Composite materials have been studied for centuries, and have attracted the interest of renowned scientists such as Poisson, Faraday, Maxwell, Rayleigh, and Einstein. Their properties are usually not just a linear average of the properties of the constituent materials and can sometimes be strikingly different. The beautiful red glass one sees in old church windows is a suspension of small gold particles in glass. Sound waves travel slower in bubbly water than in either water or air. In the last few decades composites have been found to have some surprising properties. Most materials, such as rubber, get thinner when they are stretched, but it is possible to design composites which get fatter as they are stretched. Electromagnetic signals can travel faster in a composite than in the constituent phases. It is possible to combine materials which expand when heated to obtain a material which contracts when heated. It is still an open question as to what properties can be achieved when one mixes two or more materials with known properties. This lecture will survey some of the progress which has been made.

November 18

Complex Dynamics and Geometry

Lex Oversteegen (VIGRE), University of Alabama at Birmingham

Abstract: A dynamical system is a function ƒ : X → X from a space X to itself. In dynamics one is interested in understanding the behavior of points under iteration (i.e., the behavior of the sequence ƒⁿ(x) as n → ∞). Even in very simple cases a complete description of the behavior of all points is not possible.

Much progress has been made in the case where X = ℂ is the complex plane and ƒ : ℂ → ℂ is a complex polynomial, but important problems remain. In this case, it is well known that it suffices to consider polynomials of the form ƒ(z) = z^d + a_{d-2}z^{d-2} + ⋯ + a₁z + a₀. Hence the space of all relevant polynomials of degree d can be parameterized by ℂ^{d-1}.

For any polynomial ƒ, ℂ = J ∪ F with J ∩ F = ∅, where J is perfect and compact (the Julia set) and F is open and dense (the Fatou set). For all z ∈ F, ƒⁿ(z) converges to either a periodic point, infinity, or a periodic circle (on which the polynomial is an irrational rotation). Hence all interesting dynamics occurs in the Julia set.

In an unpublished preprint W.P. Thurston outlined a method by which questions about the behavior of ƒ on J can be translated to the study of very simple geometric objects in the unit disk, invariant under the map z ↦ z^d. Moreover, this method can also be used to obtain information on subsets of the parameter space ℂ^{d-1}. Thurston's preprint is now 20 years old and left many questions unresolved. In this talk we will present Thurston's approach and answer one of his questions.
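
A small numerical sketch of the setup described above: pick a point (a₁, a₀) of the parameter space for d = 3, iterate ƒ(z) = z³ + a₁z + a₀ on a grid of starting points, and record which orbits escape to infinity. Points with bounded orbits approximate the filled Julia set, whose boundary is J. The coefficients, grid, and escape radius below are arbitrary illustrative choices.

    import numpy as np

    a1, a0 = -0.5 + 0.1j, 0.2j            # one point (a1, a0) of the parameter space for d = 3

    def f(z):
        return z**3 + a1 * z + a0

    xs = np.linspace(-1.5, 1.5, 400)
    ys = np.linspace(-1.5, 1.5, 400)
    Z = xs[None, :] + 1j * ys[:, None]    # grid of starting points z
    escaped = np.zeros(Z.shape, dtype=bool)
    for _ in range(60):                   # follow the orbits f^n(z)
        Z = f(np.where(escaped, 0.0, Z))  # reset escaped points to avoid overflow; their flag is already set
        escaped |= np.abs(Z) > 2.0        # once |z| is this large the orbit tends to infinity

    print("fraction of the grid with bounded orbit:", round(float(1.0 - escaped.mean()), 3))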

December 2

High-Resolution Finite Volume Methods for Modeling Volcanos and Tsunamis

Randy LeVeque, University of Washington, Seattle

Abstract: Hyperbolic systems of partial differential equations often arise when modeling phenomena involving wave propagation or advective flow. Finite volume methods are a natural approach for conservation laws of this form since they are based directly on integral formulations and are applicable to problems involving shock waves and other discontinuities. High-resolution shock-capturing methods developed originally for compressible gas dynamics can also be applied to many other hyperbolic systems. A general formulation of these methods has been developed in the CLAWPACK software that allows application of these methods, with adaptive mesh refinement, to a variety of problems in fluid and solid dynamics.
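
As an illustration of the basic finite volume idea, and not of CLAWPACK's wave-propagation algorithms themselves, here is a minimal first-order upwind scheme for the simplest hyperbolic equation, linear advection q_t + u q_x = 0, on a periodic grid of cell averages. High-resolution methods add limiters, Riemann solvers, and adaptive refinement on top of this cell-average/flux-difference structure; all numbers below are illustrative.

    import numpy as np

    u = 1.0                               # constant advection speed, u > 0
    N = 200                               # number of finite volume cells on [0, 1], periodic
    dx = 1.0 / N
    x = (np.arange(N) + 0.5) * dx         # cell centers
    q = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)   # cell averages of a square pulse

    dt = 0.8 * dx / u                     # Courant number 0.8
    mass0 = q.sum() * dx
    for _ in range(int(0.5 / dt)):
        flux = u * np.roll(q, 1)          # upwind flux through the left edge of each cell
        q = q - dt / dx * (np.roll(flux, -1) - flux)   # conservative flux-difference update

    print("mass before and after:", mass0, q.sum() * dx)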

I will describe these methods in the context of some recent work on modeling geophysical flow problems, particularly in the study of volcanos and tsunamis. Volcanos generate many challenging flow problems, and accurate simulation is required both to further scientific understanding and to aid in hazard assessment and mitigation. The initial blast wave can cause devastation in a large region around the volcano; the continuing eruption leads to lava flows or pyroclastic flows on the flanks of the volcano and to ash plumes that are a danger to aircraft far away. Melting glaciers on snow-capped volcanos can lead to debris flows endangering nearby cities. Tsunamis generated by earthquakes or underwater landslides can cause damage and loss of life far away from the source, and accurate prediction of their propagation through the ocean and interaction with coastal topography is essential in issuing early warnings.

SPRING 2005

January 13

Conjugate Coupling: The Romantic Adventure of the Quintessential Quadratic

Edward B. Burger, Williams College

Abstract: Here we will come to understand the "personality" of real numbers. Along the way we will encounter some beautiful ideas from number theory and develop an appreciation for an area known as "diophantine approximation". Results of both the ancient and recent variety will be offered. No number theory background is required beyond a desire to explore the alluring notion of number.

January 20

New Random Walk Models

David Levin, University of Utah & MSRI

Abstract: I will describe research on recent models involving random walks. The first topic concerns the statistical problem of reconstructing the labeling of a graph from data generated by an unobservable random walker on the graph. (Joint work with Y. Peres, and Y. Peres and R. Pemantle.) I will also discuss work on dynamical random walk. This is a walk-valued stationary stochastic process whose equilibrium measure is the law (on sequence space) of an ordinary random walk. Of particular interest are the "exceptional times" for the dynamical random walk—times at which events of zero probability for the ordinary random walks occur in the dynamical version. (Joint work with D. Khoshnevisan and P. Mendez.)
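
A toy discrete-time caricature of a dynamical random walk, not the continuous-time model of the talk: the increments of a simple random walk are stored, and at each tick a random subset of them is refreshed, so that at every fixed time the configuration is an ordinary ±1 walk. The sizes and refresh rate below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    n, ticks, refresh_prob = 10_000, 2_000, 0.01
    steps = rng.choice([-1, 1], size=n)           # increments of the walk
    endpoints = []
    for _ in range(ticks):
        refresh = rng.random(n) < refresh_prob    # each increment is refreshed independently
        steps[refresh] = rng.choice([-1, 1], size=int(refresh.sum()))
        endpoints.append(steps.sum())             # S_n at this dynamical time

    endpoints = np.array(endpoints)
    print("typical endpoint scale, sqrt(n):", int(np.sqrt(n)))
    print("largest |S_n| seen along the dynamics:", int(np.abs(endpoints).max()))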

January 27

Construction and Analysis of Unstructured Mesh Generation Algorithms

Noel J. Walkington, Carnegie Mellon University

Abstract: Given a collection of points, edges, and faces in a bounded two or three dimensional region, the meshing problem is to construct a triangulation which (i) conforms to the given region, (ii) has triangles or tetrahedra of bounded aspect ratio, and (iii) is as coarse as possible. These requirements can lead to very complicated algorithms; so much so that it can be difficult to verify correctness. I will give an overview of the ideas and issues that arise when constructing algorithms to solve the meshing problem, and will indicate how the mesh generation problem touches on many areas of mathematics and computer science such as approximation/interpolation theory, computational geometry, sphere packing, graph theory, and algorithm design.

February 3

Valuations in Algebraic Geometry

Laura Ghezzi, Florida International University

Abstract: This is an introduction to valuation theory and the role it plays in Algebraic Geometry.

After giving the necessary background and definitions we discuss Zariski's theorem of local uniformization and we give examples of valuations that arise naturally in Algebraic Geometry.

This is joint work with S.D. Cutkosky.

February 3

Turing Patterns and Concentration Phenomena in Differential Equations

Wei-Ming Ni, University of Minnesota

Abstract: "Diffusion-driven instability" was first formulated in the celebrated work of Turing (1952) in an attempt to model the regeneration phenomenon of hydra, which is one of the earliest examples observed in morphogenesis (Trembley, 1744). In this talk, I will describe recent mathematical progress on the Gierer-Meinhardt system (1972), which was derived based on Turing's original idea. The strikingly nontrivial patterns, namely spike-layer steady states, exhibited by this system will be discussed mathematically. Other related concentration phenomena (such as the CIMA reaction in chemistry) will be mentioned as well.
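
For concreteness, here is a minimal 1-D simulation sketch of a Gierer-Meinhardt-type activator-inhibitor system in a commonly used nondimensional form; the parameter values, periodic boundary conditions, and explicit time stepping are illustrative choices, not those of the talk. Starting near the homogeneous state (a, h) = (1, 1), small noise grows into a spiky activator profile of the kind alluded to above.

    import numpy as np

    # a_t = eps2 * a_xx - a + a^2/h,   tau * h_t = D * h_xx - h + a^2
    N, L = 50, 1.0
    dx = L / N
    eps2, D, tau = 5e-3, 1.0, 0.5
    dt, steps = 8e-5, 250_000             # explicit Euler, so dt must stay small

    rng = np.random.default_rng(2)
    a = 1.0 + 0.01 * rng.standard_normal(N)   # small noise around the homogeneous state (1, 1)
    h = np.ones(N)

    def lap(u):                           # second difference with periodic boundaries
        return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

    for _ in range(steps):
        a, h = (a + dt * (eps2 * lap(a) - a + a**2 / h),
                h + dt / tau * (D * lap(h) - h + a**2))

    print("activator max and min after t = %g:" % (steps * dt), a.max(), a.min())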

February 17

Renormalization and Quantitative Equidistribution for Parabolic Flows

Giovanni Forni, Northwestern University

Abstract: A flow is called parabolic if nearby orbits diverge with at most polynomial speed in time. Examples of such flows include billiards in polygons, conservative flows with saddle singularities on surfaces (related to interval exchange transformations), horocycle flows and nilflows. For the typical parabolic flow all trajectories tend to equidistribution, and for applications, for instance to number theory, it is important to know the equidistribution speed (for smooth functions). In this talk we will describe an approach to this question based on the introduction of an appropriate renormalization dynamics and on the study of the cohomological equation and of invariant distributions of the flow. The renormalization dynamics is hyperbolic and can be studied with tools of hyperbolic theory such as Lyapunov exponents. For instance, in the case of conservative flows on surfaces the renormalization is given by the Teichmueller flow on the moduli space of holomorphic differentials, and for horocycle flows by the corresponding geodesic flow. The cohomological equation can be studied by tools of Fourier analysis/representation theory, although in some cases a dynamical approach is also possible. Interesting applications to number theory come from the study of nilflows.
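
As a toy numerical illustration of equidistribution speed, consider the very simplest setting, an irrational circle rotation (the abelian case of a nilflow), rather than one of the genuinely parabolic flows of the talk. For the smooth zero-mean observable cos(2πx), the Birkhoff averages along the orbit of the golden rotation decay at rate O(1/N), much faster than the O(1/√N) of random sampling; the starting point and sample sizes below are arbitrary.

    import numpy as np

    alpha = (np.sqrt(5) - 1) / 2          # golden rotation number
    x0 = 0.1
    for N in [10**2, 10**3, 10**4, 10**5]:
        orbit = (x0 + alpha * np.arange(N)) % 1.0
        birkhoff = np.cos(2 * np.pi * orbit).mean()   # average of a zero-mean observable along the orbit
        print(N, abs(birkhoff))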

February 23

Leader Election with Quantum Resources

Prakash Panangaden, McGill University

Abstract: The idea of using the phenomenon of quantum entanglement for more efficient implementation of algorithms is now 20 years old. In this talk I consider a relatively new variation on this theme: solving tasks in distributed systems using quantum resources. The task that we consider is the problem of a fully connected network of processors selecting a leader. This is an abstraction of a key step in many tasks involving distributed decision making. When the network is anonymous (no processor can be named and each processor runs the same protocol) the task is known to be unsolvable, essentially because there is no way to break the symmetry. If the system can use probabilistic resources then it can break the symmetry, with high probability, and elect a leader. With entangled states, however, one can break the symmetry and get a solution that always works, and always within a fixed number of steps. In fact we show that there is one very special state, called the W state, that must be used; no other entangled state will work. This state thus embodies very different properties from the usual "Bell" states used in quantum algorithms. The arguments are essentially based on symmetry breaking. This talk will include an introduction to the background material needed to understand the results; in particular, I will not assume familiarity with quantum computing. This is joint work with Ellie D'Hondt.
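
A toy sketch of why the W state does the job described above: measuring |W⟩ = (|100⟩ + |010⟩ + |001⟩)/√3 in the computational basis always yields exactly one '1', and by symmetry each party is equally likely to hold it, so the party seeing '1' becomes the leader in a single round. The snippet below simply samples the Born-rule distribution classically; it does not simulate the distributed protocol of the talk.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 3
    w = np.zeros(2 ** n, dtype=complex)   # amplitudes of the n-qubit W state
    for k in range(n):
        w[1 << k] = 1 / np.sqrt(n)        # the basis states with a single '1'

    probs = np.abs(w) ** 2                # Born rule: measurement distribution
    outcomes = rng.choice(2 ** n, size=10_000, p=probs)
    leaders = [format(o, f"0{n}b").index("1") for o in outcomes]
    print("every run elects exactly one leader; counts per party:",
          np.bincount(leaders, minlength=n))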

February 24

Examining the Evolutionary Principal Components of a Multivariate Time Series with Application to Stock Sector Data

Ginger Davis, Rice University

Abstract: Financial data are heavily analyzed due to the potential payoff of useful models. Many models exist for the joint analysis of several financial instruments, such as securities, since they are not independent. These models often assume some type of constant behavior between the instruments over the time period of analysis. Instead of imposing that assumption on our system of securities, we are interested in modeling the dynamics of the overall system. Specifically, we model individual stock data that belong to one of three market sectors and examine the behavior of the market as a whole and the behavior of the sectors. Our aim is to detect and forecast unusual changes in the system, such as market crashes and outliers.
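
The following is a toy sketch in the spirit of the abstract, not the authors' model: track the leading principal component of a simulated multivariate return series over sliding windows, so that a change in the evolving eigenstructure (here, an artificial common shock) becomes visible. The factor structure, window length, and shock are all invented for the illustration.

    import numpy as np

    rng = np.random.default_rng(4)
    T, k = 1000, 9                         # time points; 9 stocks, 3 per sector
    sector = np.repeat([0, 1, 2], 3)
    market = rng.standard_normal(T)        # market-wide factor
    sector_f = rng.standard_normal((T, 3)) # one factor per sector
    returns = 0.5 * market[:, None] + 0.7 * sector_f[:, sector] + rng.standard_normal((T, k))
    returns[600:] += 1.5 * rng.standard_normal((T - 600, 1))   # a common shock regime

    window = 100
    share = []
    for t in range(window, T):
        cov = np.cov(returns[t - window:t].T)
        eig = np.linalg.eigvalsh(cov)
        share.append(eig[-1] / eig.sum())  # fraction of variance explained by the leading PC
    share = np.array(share)
    print("mean PC1 share before vs. after the shock:",
          share[:400].mean().round(2), share[-300:].mean().round(2))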

March 3

Mathematical Models for the Spread of Epidemics

Luc Tartar, Carnegie Mellon University

Abstract: Problems in Mechanics and Physics have played an important role in the past for inducing mathematicians to create new mathematical tools. Nowadays, most mathematicians have no serious knowledge in Mechanics or Physics, which explains why so many fall prey to fashions, whose leaders play with terms from Mechanics or Physics but show a poor understanding of these fields. Even for those with enough critical judgment for avoiding some traps, learning Mechanics or Physics is difficult, as it is not easy to guess what one should believe from all that is said about the real world and about the mathematical models which are used for describing it.

Classical Mechanics is an 18th Century point of view, studied with Ordinary Differential Equations. Continuum Mechanics is a 19th Century point of view, studied with Partial Differential Equations, at least for the linear effects. Because the mathematical tools which existed in the late 1960s were only adapted to linear problems, part of my research work looked at the difficulties posed by nonlinear effects. I first developed Homogenization and Compensated Compactness (partly with François Murat), and then H-measures for a few particular questions (also introduced by Patrick Gérard for another question). I will describe why I consider all this as a (small) part of a new theory, which lies Beyond Partial Differential Equations, and should be adapted to 20th Century problems in Continuum Mechanics, like Plasticity or Turbulence, and 20th Century Physics, i.e. quantum effects.

Anyone who has understood Continuum Mechanics knows that Turbulence has nothing to do with letting time tend to infinity, a game which is one of the deluded fashions alluded to. Turbulence is a problem of Homogenization in the general framework that I had introduced in the late 1970s, which has little to do with Γ-convergence, a game which has turned into another deluded fashion (because it is played with entirely wrong topologies for being useful for Continuum Mechanics); however, the corresponding Homogenization problems have not been solved yet in general situations, but the simplified examples which have been understood explain why physicists have been misled in their use of probabilities in the laws of Physics.

March 10

Nonlinear Optics in Photonic Structures

Alejandro Aceves, University of New Mexico

Abstract: An area of intense research is that of photonics, where light propagation features are controlled by clever engineering of periodic optical structures. One example is the fiber Bragg grating, where an additional intensity-dependent nonlinear index of refraction allows soliton-like propagation with tunable velocities. In this work we consider nonlinear periodic geometries. We show that the additional transverse dimension allows for a richer dynamics of light trapping, bending and switching, provided stable gap-soliton-like bullets exist.

March 17

Polynomial Mappings

Dale Cutkosky, University of Missouri

Abstract: A vector of m polynomials in n variables gives an algebraic mapping from the complex space ℂⁿ to ℂᵐ. This is an example of an algebraic mapping of algebraic varieties. The simplest mappings are the locally monomial, or toroidal, mappings. The toroidal mappings from ℂⁿ to ℂᵐ are given by polynomials which are monomials in the coordinate variables. We discuss the problem of toroidalization of mappings, and discuss our proof that algebraic mappings of 3-dimensional varieties can be toroidalized.

March 18

Multiplicity Bounds

Hema Srinivasan, University of Missouri

Abstract: If R is a polynomial ring and I is a homogeneous ideal, then the multiplicity of R/I, a basic invariant, can be easily computed from any homogeneous free resolution of R/I over R. For this reason, it seems reasonable to bound the multiplicity by the shifts in a free resolution. The conjectured bounds in terms of the shifts in the minimal resolution of R/I are still open in general. In this talk we will survey various bounds for the multiplicity conjectured by Herzog-Huneke-Srinivasan and discuss the current status of these conjectures.
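
For reference, the conjectured bounds referred to above are usually recorded as follows (a sketch of the standard statement for the Cohen-Macaulay case, not a result of the talk): if R/I is Cohen-Macaulay of codimension c and m_i, M_i denote the smallest and largest shifts in homological degree i of the minimal graded free resolution of R/I, then

    \frac{1}{c!}\prod_{i=1}^{c} m_i \;\le\; e(R/I) \;\le\; \frac{1}{c!}\prod_{i=1}^{c} M_i

with the upper bound conjectured to hold without the Cohen-Macaulay hypothesis.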

March 29

Federal Funding of Research in the Mathematical Sciences

Philippe Tondeur (VIGRE), University of Illinois and former Director of the Division of Mathematical Sciences at NSF

Abstract: TBA

April 4

Information is Physical, but Physics is Logical

Samson Abramsky, Oxford University

Abstract: The new fields of quantum information and quantum computation are causing a re-examination of basic ideas in both Physics and Computer Science. One of the key ideas which has emerged is that quantum entanglement—Einstein's "spooky action at a distance"—can be seen as a computational resource. Entanglement arises because the state of a compound (e.g. two-particle) quantum system can encode correlations between the components, even when they are spatially separated, so that measuring one component has an instantaneous effect on the other. This receives a spectacular application in the teleportation protocol, which uses just two classical bits to transport an unknown qubit q from one site to another. Teleportation is simply the most basic of a family of quantum protocols, including logic-gate teleportation, entanglement swapping, and quantum key exchange, which form the basis for novel and potentially very important applications to secure and fault-tolerant communication and computation.
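
As a concrete companion to the protocol just described, here is a self-contained state-vector simulation of basic teleportation written directly with numpy; it illustrates the standard textbook protocol, not the categorical formalism of the talk, and the particular qubit amplitudes are arbitrary. Two classical bits, the measurement outcomes m0 and m1, are all that Alice sends.

    import numpy as np

    rng = np.random.default_rng(5)
    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    P0, P1 = np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)

    def kron(*ops):
        out = np.eye(1, dtype=complex)
        for op in ops:
            out = np.kron(out, op)
        return out

    # Unknown qubit |psi> held by Alice (qubit 0) and a shared Bell pair on qubits 1, 2
    psi = np.array([0.6, 0.8j])                       # alpha|0> + beta|1>, normalized
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    state = np.kron(psi, bell)                        # qubit order (0, 1, 2)

    # Alice: CNOT (control 0, target 1), then Hadamard on qubit 0
    state = (kron(P0, I2, I2) + kron(P1, X, I2)) @ state
    state = kron(H, I2, I2) @ state

    # Alice measures qubits 0 and 1 in the computational basis
    amps = state.reshape(2, 2, 2)                     # indices (q0, q1, q2)
    probs = np.abs(amps) ** 2
    pm = probs.sum(axis=2).ravel()                    # distribution of (m0, m1)
    outcome = rng.choice(4, p=pm)
    m0, m1 = divmod(outcome, 2)
    bob = amps[m0, m1] / np.sqrt(pm[outcome])         # Bob's collapsed qubit

    # Bob applies the corrections dictated by the two classical bits
    if m1:
        bob = X @ bob
    if m0:
        bob = Z @ bob

    print("measurement bits sent classically:", m0, m1)
    print("fidelity with the original qubit:", abs(np.vdot(psi, bob)))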

The current tools available for developing quantum algorithms and protocols are deficient on two main levels: Firstly, they are too low-level, because quantum algorithms are currently mainly described using the 'network model' corresponding to circuits in classical computation. One finds a plethora of ad hoc calculations with 'bras' and 'kets', normalizing constants, matrices etc.

At a more fundamental level, the standard mathematical framework for quantum mechanics (which is essentially due to von Neumann) is actually insufficiently comprehensive for informatic purposes. In describing a protocol such as teleportation, or any quantum process in which the outcome of a measurement is used to determine subsequent actions, the von Neumann formalism leaves feedback of information from the classical or macroscopic level back to the quantum implicit and informal, and hence not subject to rigorous analysis and proof.

In this talk I'll describe recent work with Bob Coecke, in which we recast the von Neumann formalism at a more abstract and conceptual level, and then use the extra structure made available by the category-theoretic framework to remedy the deficiencies in the standard approach noted above. This enables a high-level but effective approach to modelling and reasoning about all the key features of quantum information processing. The effectiveness of these methods is demonstrated by a detailed treatment of three of the main quantum protocols: teleportation, logic-gate teleportation (which is universal for quantum computation), and entanglement swapping. Because of the explicit treatment of 'classical communication'—i.e., the use of measurement outcomes to determine subsequent actions, possibly at other sites in a compound system than the site at which the measurement was performed—it can reasonably be claimed that these are the first completely formal descriptions and proofs of correctness of these protocols. From some very practical considerations we are led to a new fundamental axiomatization of quantum mechanics which can be cast in an essentially logical form, where the calculations to derive the information flow inherent in an entangled quantum system are performed diagrammatically, and correspond to the logical notion of Cut-elimination. Thus we find a new kind of "Categorical Quantum Logic", radically different from the Birkhoff-von Neumann quantum logic.

Professor Abramsky holds the Christopher Strachey Chair of Computer Science at Oxford University, UK, and was recently elected a Fellow of the Royal Society.

April 5

The Smallest Projective Varieties

David Eisenbud (VIGRE), Director, MSRI, and UC Berkeley

Abstract: Any algebraic curve in projective 3-space that is not contained in a plane has degree at least 3 -- that is, it meets any plane in at least 3 points. Moreover, any curve of degree 3 can be parametrised (in suitable coordinates) by t → (t, t², t³).

This was known 150 years ago, and since that time many mathematicians have used and generalized the result. I will describe some of the ideas involved, including recent work of mine with Mark Green, Klaus Hulek, and Sorin Popescu.

April 7

Binomial Complete Intersections

Eduardo Cattani, University of Massachusetts

Abstract: A binomial ideal in a polynomial ring is an ideal generated by binomials. They are quite ubiquitous in various contexts such as toric geometry, semigroup algebras, and hypergeometric equations. Although binomial ideals are very amenable to Gröbner and standard bases techniques, they also provide some of the "worst-case" examples in computational algebra.

In this talk I will discuss some joint work with Alicia Dickenstein (U. of Buenos Aires), where we attempt to obtain properties of a binomial ideal such as characterization of complete intersections, number of "solutions" in the zero-dimensional case, etc., purely in terms of the monomials appearing in a set of generators of the ideal.

April 14

Modelling Aspects of Vascular Cancer

Phillip Maini, Oxford University, UK

Abstract: The modeling of cancer provides an enormous mathematical challenge because of its inherent multi-scale nature. For example, in vascular tumours, nutrient is transported by the vascular system, which operates on a tissue level. However, it affects processes occurring on a molecular level. Molecular and intra-cellular events in turn affect the vascular network and therefore the nutrient dynamics. Our modeling approach is to model, using partial differential equations, processes on the tissue level and couple these to the intracellular events (modeled by ordinary differential equations) via cells modeled as automaton units. Thus far, within this framework we have modeled structural adaptation at the vessel level and we have modeled the cell cycle in order to account for the effects of p27 during hypoxia. These results will be presented.

April 18

Hilbert Functions and Castelnuovo-Mumford Regularity

Brent D. Strunk, Purdue University

Abstract: Suppose G is a standard graded ring over an infinite field. From the minimal graded free resolution of G, it is possible to derive several invariants, among them the multiplicity, the Castelnuovo-Mumford regularity, the Hilbert series, and the postulation number. I discuss a sharp lower bound for the regularity of G in terms of the postulation number, depth, and dimension. I also present a class of examples in dimension 1 where the postulation number is 0 and the regularity of G can take on any value between 1 and the embedding codimension of G.

April 21

Modular Functions and Continued Fractions

Bill Duke, University of California at Los Angeles

Abstract: In this mostly expository talk I will show the connection between Ramanujan's work on the special values of certain continued fractions and Klein's theory of the icosahedron. In addition to explaining some of Ramanujan's identities, this observation opens some avenues for the study of special values of certain modular functions defined by continued fractions and generalizations.
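
A small numerical illustration of the kind of special value at issue: the Rogers-Ramanujan continued fraction R(q) = q^{1/5}/(1 + q/(1 + q²/(1 + q³/(1 + ...)))) evaluated at q = e^{-2π}, compared with Ramanujan's closed form √((5+√5)/2) − (1+√5)/2. The truncation depth below is an arbitrary choice; the continued fraction converges extremely fast at this q.

    import math

    q = math.exp(-2 * math.pi)
    tail = 1.0
    for n in range(60, 0, -1):            # evaluate the continued fraction from the bottom up
        tail = 1.0 + q ** n / tail
    R = q ** 0.2 / tail

    closed_form = math.sqrt((5 + math.sqrt(5)) / 2) - (1 + math.sqrt(5)) / 2
    print(R, closed_form)                 # both are approximately 0.2840790...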
