
We need highly formal rituals in order to make life more democratic

Comfort has won, and most formality is gone.

(Photo: Stephen Maturen/Getty Images)
Benedictus, Benedicat, per Jesum Christum, Dominum Nostrum. Amen.

Please be seated. It's dinner time in St Paul's College, Sydney, where I'm dean and head of house at Graduate House.


The members of the High Table, wearing academic gowns, have processed into the refectory to a table laden with candelabra and silver accoutrements from the college treasury, each place set with cutlery and glasses. The students, also in gowns, rise from their seats to acknowledge the High Table, and stand until the presider has finished the Latin grace (this is the shorter one – a longer version is kept for feasts). Now that all are seated, a three-course meal is served, accompanied by poetry, music, announcements and general well-dressed merriment. Port is served. A final grace is said after dinner, then all retire to the common room for coffee (or more port) and further conversation. The men wear ties. The women dress up. Diners bow to the High Table when excusing themselves, and the High Table bows back when departing from dinner.

This is by no means a unique ritual. Everywhere the British empire planted its flag, its two great universities of Oxford and Cambridge spread their collegiate model, and so Australia, Canada, New Zealand and the United States all have their colleges, each with their traditional ways of dining and living. St Paul's is the oldest such college in Australia, but it's different from the others (and from those in Britain) in a significant respect. St Paul's contains two communities – undergraduate and postgraduate – each with their own buildings, dining halls, common rooms, and leadership; each almost a college unto itself, but joined in many endeavours. The undergraduate community was founded in 1856, and Graduate House, which I lead, in 2019. Yet, despite this difference in antiquity, the scene above describes dinner in either community, every week.

When I started as dean of Graduate House, there was no Graduate House, only an incomplete construction site and an idea. My brief was to recruit the students and academics, fill the buildings with people, set up student leadership, and design and define the culture and practices of a new college-within-a-college.

I didn't want for unsolicited advice. The most common sentiments I heard were unsurprising: 'a new college can be modern', 'you don't need gowns', 'you don't need formal dinner', 'graduate students in a new college will want it casual!'

We wear gowns. To formal dinners. It is not casual. It is not 'modern'.

I hold an unpopular view. I believe, firmly and invariably, that life in the 21st century is too informal and empty of ritual, and that we should encourage and erect more needless formality. Formality, ritual and ceremony – not casual approachability – are among the most effective ways of making the world and its institutions more inclusive and egalitarian. We all need much more formality in our lives.

The past century has been a good one for individual freedoms – in almost every respect. This wholesale liberalisation has included the freedom of individuals to dress, dine and discourse how they like it. And how they like it is invariably: 'casual', 'low key', 'without too much fuss', 'not too precious', 'not too pretentious', 'not ostentatious' or, as I heard just the other day, 'not too "bougie"' (qua 'bourgeois')… in short, informal. Comfort is king in the modern world; and comfort is the excuse proffered for the evaporation of formality from daily life.

While formality and its rituals persist in little pockets, they do so only where they are bolstered by elaborate protective struts. In general (though decreasingly), government ceremonies remain somewhat formal. With ever increasing exceptions, weddings and funerals cling to formal traditions. The High Church has positioned itself as the last refuge of formal practice – a claim that would have no teeth had not the Low Church so effectively abolished the bells and smells and hymns and ceremony in favour of appealing to parishioners who want a service that 'isn't too fussy'.

Comfort has won, and most formality is gone. But the freedom of informality comes at a cost. Formality is the bulwark against some of the nastiest human impulses, and acts as a vaccine against our most dangerous tendency: forming in-groups and out-groups.

There's nothing you or I or the Pope or the United Nations could do to stop humans from forming clubs, inventing or elevating meaningful markers of difference, and building fences and corrals that keep one's group together while keeping the 'others' out. We are a tribal ape with a brain built to exaggerate our allegiance to our small band while manning the barricades against others distinguished by vanishingly tiny differences. Individuals can, with great effort, consciously suppress this nasty bit of programming, but populations on the whole will fail.

Groups can form around any distinguishing feature, from the harmless, such as sporting teams, schools attended or favourite novels, to the nefarious, such as race, class or sex. Each person can disavow some marks of difference while clinging to others – and no person can disavow them all.

This mental virus might be incurable, but there is a vaccine: formality. Formality gives us something harmless around which to form an in-group: namely, knowledge of the rules of that particular formality, with its own trials of membership and rules of initiation.

'Ah yes, the dress code is a bit difficult to understand… You see, it's based on Edwardian standards, of course, so "semiformal" actually means black tie! No, no, don't worry a bit, it is unusual…'

The opportunity to be a crowing pedant about the rules of formality gives one something to do instead of in-grouping around more exclusionary traits, such as to which expensive school one went. More importantly, the rules of formality are ultimately accessible to all. Anyone can learn the etiquette and wear the tie, and so become part of the ever larger, ever more diverse in-group that practises the formality of the event.

The livery companies of the City of London are some of the more formal and traditional institutions in the United Kingdom today; formal dinners, ceremonies in Tudor (or mock-Tudor) garb, and incredibly convoluted elections are their standard fare. Despite their finery and antiquity, they aren't – nor have they ever been – aristocratic. More than a century ago, they were already associated with upwardly mobile plebs, so much so that Gilbert and Sullivan poked fun at the House of Lords' collective disdain for the Common Council (composed of many livery company members) in their comic opera Iolanthe (1882). The companies began as workmen's guilds and preserve those class associations, but they are formal, traditional organisations, because this helps to bind their members together, despite their differences, making them all feel as one.

This is a common pattern. While the London gentlemen's clubs are well-dressed and traditional, they're largely devoid of ceremony; instead, they're well-appointed places to relax over meals or drinks and sniffingly observe shibboleths of the upper classes, from which syllable to stress in 'patina', to why one ought not to own fish knives. Meanwhile, foundationally working-class clubs, such as the Knights of Columbus or the Freemasons, deck themselves in formal ceremony and ritual. The already powerful can afford not to make too much fuss. For the up-and-coming, or the downtrodden, formality gives an unparalleled sense of membership to a grander body.

Universities and colleges once knew this well. They remain some of the only institutions still using formality to their advantage, though often grudgingly and falteringly. I lived and worked in a number of colleges in Oxford before moving to Australia, and watched as various members of the leadership tried – sometimes successfully, sometimes not – to strike away little elements of salubrious formality, when they felt the striking was good. And so, dinner's fourth course went, but second dessert was preserved. Another night of the week became informal, but Sunday was still black tie. They chip away at traditions, forgetting that, for students, visiting fellows and new academics, these are the very things that cause rapture and delight.

In 2019, it was an act of fortitude to stand before 100 newly enrolled graduate students – mostly Australian, few with any experience of an ancient college – and insist that in this brand-new, modern building, at our very first dinner, we would wear academic gowns, say grace in Latin, and pass decanters to the left. It was harder still to say the same to a dozen busy and seasoned academics who joined us. But it was the right choice, and the college is better for it. In this modern university, my students and academics come from every political, religious, social and economic background one can imagine; they don't have anything extrinsic in which to believe together. College gives them something to believe in as a whole.

The college needs ritual, tradition, anachronism and whispers of the numinous to bind together this diversity. Not to smooth it out, but to unite it in true engagement. Any apartment building can fill itself with diverse residents who politely acknowledge each other in the hallways, then keep to themselves. It takes a formal, traditional, ritual-filled ancient college to make them all feel as though they're truly of one kind – even if that ancient college is only a year old.

Benedicto, Benedicatur, per Jesum Christum, Dominum Nostrum. Amen.

Postscript: This Idea was conceived and written in early 2020, in a time when COVID-19 was but a suppressed whisper. Reading it now, when ceremony and togetherness are rightly halted for the good of global health, feels like reading a dispatch from a different world. But I do hope this crisis, which is, underneath the medical crisis, a social one, will provide a chance for reflection on how we interact, and that a global community resuming its usual business will embrace the opportunity to repair our broken institutions of formality and ceremony. In short, I hope we all come out of quarantine wearing our Sunday best, ringing bells, lighting candles and burning incense.

This article was originally published at Aeon and has been republished under Creative Commons. Read the original article.


Two MIT students just solved Richard Feynman’s famed physics puzzle

Richard Feynman once asked a silly question. Two MIT students just answered it.


Here's a fun experiment to try. Go to your pantry and see if you have a box of spaghetti. If you do, take out a noodle. Grab both ends of it and bend it until it breaks in half. How many pieces did it break into? If you got two large pieces and at least one small piece you're not alone.


Two-thirds of parents say technology makes parenting harder

Parental anxieties stem from the complex relationship between technology, child development, and the internet's trove of unseemly content.

  • Today's parents believe parenting is harder now than 20 years ago.
  • A Pew Research Center survey found this belief stems from the new challenges and worries brought by technology.
  • With some schools going remote next year, many parents will need to adjust expectations and re-learn that measured screen usage won't harm their children.

Parents and guardians have always endured a tough road. They are the providers of an entire human being's subsistence. They keep that person fed, clothed, and bathed; they help them learn and invest in their enrichment and experiences; they also help them navigate social life in their early years. And they do all this with limited time and resources, while simultaneously balancing their own lives and careers.

Add to that a barrage of advice and reminders that they can always spend more money, dedicate more time, or flat-out do better, and it's no wonder that psychologists worry about parental burnout.

But is parenting harder today than it was, say, 20 years ago? The Pew Research Center asked more than 3,600 parents this question, and a majority (66 percent) believe the answer is yes. While some classic complaints made the list—a lack of discipline, a disrespectful generation, and the changing moral landscape—the most common reason cited was the impact of digital technology and social media.

A mixed response to technology


Parents worry that their children spend too much time in front of screens while also recognizing technology's educational benefits.

(Photo: Chris Hondros/Getty Images)

This parental concern stems not only from the ubiquity of screens in children's lives, but the well-publicized relationship between screen time and child development. Headlines abound citing the pernicious effects screen time has on cognitive and language development. Professional organizations, such as the American Academy of Child and Adolescent Psychiatry, issue warnings that too much screen time can lead to sleep problems, lower grades, weight problems, mood problems, poor self-image, and the fear of missing out—to name a few!

According to Pew's research, parents—whom Pew defines as adults or guardians with at least one child under their care, though they may also have adult children—have taken these warnings to heart. While 84 percent of those surveyed are confident they know how much screen time is appropriate, 71 percent worry their child spends too much time in front of screens.

To counter this worry, most parents take the measured approach of setting limits on the length of time children can access screens. Others limit which technologies children have access to. A majority of parents (71 percent) view smartphones as potentially harmful to children, believing the devices impair children's ability to learn effective social skills, develop healthy friendships, or be creative. As a result, about the same percentage of parents believe children should be at least 12 years old before owning a smartphone or using social media.

But a deeper concern than screen time seems to be what content those screens can access. An overwhelming 98 percent of those surveyed say parents and guardians shoulder the responsibility of protecting children from inappropriate online content. Far fewer place that responsibility on tech companies (78 percent) or the government (65 percent).

Parents of young children say they check the websites and apps their children use and set parental controls to restrict access. A minority of parents admit to looking at call and text records, tracking their child's location with GPS, or following their child on social media.

Yet, parents also recognize the value of digital technology or, at least, have acquiesced to its omnipresence. The poster child for this dichotomy is YouTube, with its one billion hours watched daily, many of them before children's eyes. Seventy-three percent of parents with young children are concerned that their child will encounter inappropriate content on the platform, and 46 percent say they already have. Yet, 80 percent still let their children watch videos, many letting them do so daily. Some reasons cited are that they can learn new things or be exposed to different cultures. The number one cited reason, however, is to keep children entertained.

For the Pew Research Center's complete report, check out "Parenting Children in the Age of Screens."

Screens, parents, and pandemics

Perhaps most troubling, Pew's survey was conducted in early March. That's before the novel coronavirus spread widely across the United States. Before shelter-in-place orders. Before schools shuttered their doors. Before desperate parents, who suddenly found themselves their child's only social and educational outlet, needed a digital lifeline to help them cope.

The COVID-19 pandemic has led many parents to rely on e-learning platforms and YouTube to supplement their children's education—or just let the kids enjoy their umpteenth viewing of "Moana" so they can eke out a bit more work. With that increase in screen time comes a corresponding increase in guilt, anxiety, and frustration.

But are these concerns overblown?

As Jenny Radesky, M.D., a pediatrician and expert on children and the media at the University of Michigan's C.S. Mott Children's Hospital, told the New York Times, parents don't always need to view screen time as a negative. "Even the phrase 'screen time' itself is problematic. It reduces the debate to a black and white issue, when the reality is much more nuanced," Radesky said.

Radesky helped the American Academy of Pediatrics craft its statement about screen time use during the pandemic. While the AAP urges parents to preserve offline experiences and maintain limits, the organization acknowledges that children's media use will, by necessity, increase. To make it a supportive experience, the statement recommends parents make a plan with their children, be selective of the quality of media, and use social media to maintain connections together. It also encourages parents to adjust their expectations and notice their own technology use.

"We are trying to prevent parents from feeling like they are not meeting some sort of standard," Radesky said. "There is no science behind this right now. If you are looking for specific time limits, then I would say: Don't be on it all day."

This is good advice for parents, now and after the pandemic. While studies show that excessive screen time is deleterious, others show no harm from measured, metered use. For every fear that screens make our kids stupid, there's a study showing the kids are all right. If we maintain realistic standards and learn to weigh quality and quantity within those standards, maybe parenting in the digital age won't seem so darn difficult.

How meditation can change your life and mind

Reaching beyond the stereotypes of meditation and embracing the science of mindfulness.

  • There are a lot of misconceptions when it comes to what mindfulness is and what meditation can do for those who practice it. In this video, professors, neuroscientists, psychologists, composers, authors, and a former Buddhist monk share their experiences, explain the science behind meditation, and discuss the benefits of learning to be in the moment.
  • "Mindfulness allows us to shift our relationship to our experience," explains psychologist Daniel Goleman. The science shows that long-term meditators have higher levels of gamma waves in their brains even when they are not meditating. The effect of this altered response is yet unknown, though it shows that there are lasting cognitive effects.
  • "I think we're looking at meditation as the next big public health revolution," says ABC News anchor Dan Harris. "Meditation is going to join the pantheon of no-brainers like exercise, brushing your teeth and taking the meds that your doctor prescribes to you." Closing out the video is a guided meditation experience led by author Damien Echols that can be practiced anywhere and repeated as many times as you'd like.