Clifford's theorem on special divisors

In mathematics, Clifford's theorem on special divisors is a result of W. K. Clifford (1878) on algebraic curves, showing the constraints on special linear systems on a curve C.

Statement

If D is a divisor on C, then D is (abstractly) a formal sum of points P on C with integer coefficients, and in this application it is used as a set of constraints on functions on C (if C is a Riemann surface, these are meromorphic functions, and in general they lie in the function field of C). A function in this sense has a divisor of zeros and poles, counted with multiplicity; the divisor D is here of interest as a set of constraints on functions, requiring that a pole at a point of D be no worse than the positive coefficient of that point indicates, and that a zero at a point of D with a negative coefficient have at least that multiplicity. The dimension of the vector space

L(D)

of such functions is finite, and denoted ℓ(D). Conventionally the linear system of divisors attached to D is then attributed dimension r(D) = ℓ(D) − 1, which is the dimension of the projective space parametrizing it.
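
Concretely, one standard way to write these definitions (with k(C) denoting the function field of C and div(f) the divisor of zeros and poles of a function f) is

L(D) = { f ∈ k(C) : f = 0 or div(f) + D ≥ 0 },  ℓ(D) = dim L(D),  r(D) = ℓ(D) − 1.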

The other significant invariant of D is its degree, d, which is the sum of all its coefficients.
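
Explicitly, writing D = Σ n_P · P as a finite sum over points P of C with integer coefficients n_P, the degree is

d = deg D = Σ n_P.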

A divisor is called special if ℓ(K − D) > 0, where K is the canonical divisor.[1]
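
This condition can be read through the Riemann–Roch theorem, which for a curve of genus g states

ℓ(D) − ℓ(K − D) = d − g + 1,

so that D is special precisely when ℓ(D) exceeds the value d − g + 1 determined by the degree alone.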

In this notation, Clifford's theorem is the statement that for an effective special divisor D,

ℓ(D) − 1 ≤ d/2,

together with the statement that equality holds only when D is zero or canonical, or when C is a hyperelliptic curve and D is linearly equivalent to an integral multiple of a hyperelliptic divisor.
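
For example, on a hyperelliptic curve of genus g the hyperelliptic divisor (the divisor of degree 2 cut out by the double cover of the projective line) has d = 2 and r(D) = 1, and its m-th multiple has d = 2m and r(D) = m for m ≤ g − 1; in each case r(D) = d/2, so the bound is attained.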

The Clifford index of C is then defined as the minimum value of d − 2r(D), taken over all special divisors (that are not canonical or trivial). Clifford's theorem is then the statement that this is non-negative. The Clifford index for a generic curve of genus g is the floor function of (g − 1)/2, that is, ⌊(g − 1)/2⌋.

The Clifford index measures how far the curve is from being hyperelliptic. It may be thought of as a refinement of the gonality: in many cases the Clifford index is equal to the gonality minus 2.[2]
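
For instance, a hyperelliptic curve has gonality 2 and Clifford index 0, and a trigonal curve of genus at least 5 has gonality 3 and Clifford index 1, in line with this relation; a standard exception is a smooth plane quintic, of genus 6 and gonality 4, whose Clifford index is 1, that is, the gonality minus 3.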

Green's Conjecture

A conjecture of Mark Green states that the Clifford index of a curve over the complex numbers that is not hyperelliptic should be determined by the extent to which C, as a canonical curve, has linear syzygies. In detail, the invariant a(C) is determined by the minimal free resolution of the homogeneous coordinate ring of C in its canonical embedding, as the largest index i for which the graded Betti number β_{i, i + 2} is zero. Green and Lazarsfeld showed that a(C) + 1 is a lower bound for the Clifford index, and Green's conjecture is that equality always holds. There are numerous partial results.[3]
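
In symbols, writing Cliff(C) for the Clifford index, the Green–Lazarsfeld bound and Green's conjecture read

Cliff(C) ≥ a(C) + 1,  with conjectured equality Cliff(C) = a(C) + 1.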

Claire Voisin was awarded the Ruth Lyttle Satter Prize in Mathematics for her solution of the generic case of Green's conjecture in two papers.[4][5] The case of Green's conjecture for generic curves had attracted a huge amount of effort by algebraic geometers over twenty years before finally being laid to rest by Voisin.[6] The conjecture for arbitrary curves remains open.
