“Traditional” Marriage or a Break with Tradition?

The recent California court ruling in favor of same-sex marriage has elicited a new round of warnings about the threats to “traditional” marriage. Marriage, say foes of the ruling, has always been a union of one man and one woman, with procreation as its central purpose. Christianity in particular, they contend, has historically surrounded marriage with sacred ceremonies reserved for those who understand its solemn meaning. Compelling either church or state to accept the validity of same-sex unions would force these institutions, in defiance of tradition, to condone marriages of which they disapprove.

But these arguments rest on a misunderstanding of the unique legal and religious history of Western marriage. It is true that Western law and religion have long held that marriage must consist of one man and one woman. But this represented a profound break with tradition. The most commonly preferred model of marriage through the ages (and the type of marriage mentioned most often in the first five books of the Old Testament) was not one-man, one-woman, but one-man, many-women: polygyny. Even where polygyny was not the norm, a man whose marriage did not produce a child was traditionally allowed to either divorce his original wife or add another wife or concubine to his household.

The establishment of monogamy required the Church to deny that procreation was central to the definition of marriage. In fact, one of Christianity’s major innovations was its insistence that a marriage remained valid even if the couple could not reproduce. The Church would annul a marriage if the man was impotent, but not if one of the partners was sterile.

This principle became the foundation of subsequent Anglo-American law. English and American courts traditionally voided a marriage if a person was incapable of sexual intimacy and had hidden this from his or her partner. But they never made the validity of a marriage dependent on the ability or willingness of a couple to reproduce. As a New York court ruled in 1898, “it cannot be held, as a matter of law, that the possession of the organs necessary to conception are essential to entrance to the married state, so long as there is no impediment to the indulgence of the passions incident to this state.” The ability to have sex, not to reproduce, was the primary foundation of marriage in Western religious and secular traditions alike.

Nor did Christianity insist that marriage be approved by church or state. Here the Church was hewing to an even older tradition. In most ancient societies, marriage had been a private contract between two families. If the parents agreed to the match, that confirmed its validity. Those individuals who, for whatever reason, could marry without consulting their parents did not need anyone else’s permission. Long before Christianity arose, the Roman state incorporated this principle into its legal system. In the Roman Empire, if a court had to decide whether a marriage was valid and whether the partners or children were subject to the rights and duties attached to marital law, it did so on the basis of the couple’s intentions. If a couple regarded each other as husband and wife, and neither was a slave, their marriage was deemed valid.

After the fall of the Roman Empire, one early pope suggested that the Church depart from tradition by decreeing that a marriage was valid only if it took place in church, with the approval of a priest. When his advisors pointed out that this would render the majority of Christians illegitimate, the pope backed off. For the first 16 centuries of its existence, Christianity held that the validity of a match was determined by a couple’s stated intention to be married, rather than by any formal ceremony or licensing process. This doctrine of consent took the traditional acceptance of private agreement to marriage to a new level, requiring the Church to support the validity of a marriage even if the parents had not given permission. If a man and woman claimed they had exchanged marital vows—out by the haystack or behind the stable, without any witnesses—then they were validly married in the eyes of the Church, unless they were slaves or non-believers.

In 1215, the Church decreed that a “licit” marriage required that the bride have a dowry (which implied parental approval), that the banns be published three weeks in advance, and that the marriage take place in a church. But an “illicit” marriage was equally binding in the Church’s eyes: the children were seen as legitimate; the wife was entitled to her “widow’s third” of the inheritance; and the couple was subject to the same prohibitions against divorce as a couple married in church.

Secular authorities were similarly accepting of informal marriages. Not until the 16th century—and not until 1754 in England—did states require couples to obtain a license to marry. And even after governments began requiring couples to register their marriages, they did not initially enforce the requirement. In America, authorities traditionally “inferred” marriage from a couple’s behavior rather than demanding a public ceremony or a license. Until the latter half of the 19th century, American courts routinely ruled that cohabitation was sufficient evidence of a valid marriage. When one woman in New York laid claim to her brother’s estate because his “widow” had not had a registered wedding, the judge indignantly declared that “society would not be safe for a moment … if an open and public cohabitation as man and wife for ten years … could be overturned.”

During the late 19th century, however, American courts and legislatures began to depart from the tradition of recognizing informal and common-law marriages. This was part of a broader attempt to exert more governmental control over who could marry and who could reproduce. By the 1920s, 38 states had laws prohibiting whites from marrying blacks, “Mulattos,” Japanese, Chinese, Indians, “Mongolians,” “Malays,” or Filipinos. Twelve states forbade marriage to a “drunk” or a “mental defective.” A Washington state statute forbade marriage to any “drunkard, habitual criminal, epileptic, imbecile, feeble-minded person, idiot or insane person,” or anyone with advanced tuberculosis or contagious venereal disease. Interestingly, these prohibitions applied only to marriages that involved a woman under age 45, suggesting that marriage between such “undesirables” was fine as long as the couple was unable to reproduce.

After the 1920s, governments began to retreat from the non-traditional business of determining who was fit to marry or to reproduce. Statutes that denied marriage to epileptics or people with low IQs were gradually repealed. Beginning in the 1960s, the Supreme Court invalidated laws against interracial marriage and later struck down rules that allowed prison officials and employers to prohibit inmates or workers from marrying. In most states today, the only barrier to marriage for heterosexuals is if people are too “impaired by reason of mental illness or mental retardation” to make decisions about their property or person.

These changes amount to a reassertion of older traditions wherein the state allowed people to decide for themselves if they were married. But in the past 50 years, the concept of individual choice has been greatly expanded, with states and courts defining marriage as a personal decision that cannot be denied by parents or authorities even to groups that traditionally lacked individual rights, such as “unbelievers,” paupers, members of subordinate religions or races, or people confined to institutions.

In the middle of the 20th century, however, the state increasingly—and untraditionally—began using marital status as the main criterion for distributing social and economic benefits and determining people’s interpersonal rights and obligations. The Social Security Act provided survivors’ benefits for the wives and minor children of men who died before age 65—if the couple was legally married. The federal tax code was rewritten to provide special benefits to married couples. Private employers followed suit, using marital status to determine whether they would provide health insurance or pension benefits to employees’ dependents. Legal statutes strengthened the rights of a spouse at the expense of other kin, including parents and adult children. Lack of a marriage license meant that courts, hospitals, or landlords would refuse to accept a couple’s claim that they were entitled to such privileges. This was a departure from the long tradition in which private, informal agreements to live as a couple entitled a man and a woman to public recognition of the rights and obligations attached to their relationship.

Today, the American government is much more insistent than it traditionally was that couples who want the rights and protections of a committed relationship must first get a marriage license and be formally married by a judge or member of the clergy. But the state is much more willing than in the past to guarantee that all individuals—except gays and lesbians—have access to these legal formalities. These two innovations—channeling more benefits through marriage than in the past while also lifting the restrictions that once denied individual choice to many groups—have given gays and lesbians a strong socioeconomic incentive to demand access to marriage and a strong moral argument to press their case on the basis of equal justice. And contrary to conventional wisdom, their case is also supported by the Western legal and religious tradition, which has never made the ability to procreate a precondition for marriage and which traditionally accorded legal rights to many unions that religious leaders considered illicit or immoral.

---

This article was originally posted on The Immanent Frame, a project of the SSRC.

For a related article by Stephanie Coontz, see “The Validity of Marriage: Who Gets to Decide? Who Gets to Choose?” in Conscience, Winter 2007-2008.