
Was Paul a Jihadist?

 

I heard it a second time, “The apostle Paul was a jihadist.”  He wasn’t really though, was he?

Before his conversion, Paul, then called Saul, persecuted Christians.  In his own words, he says that he

persecuted the church of God and tried to destroy it. I was advancing in Judaism beyond many of my own age among my people and was extremely zealous for the traditions of my fathers. Galatians 1:13–14

Does this make him a jihadist?  Does the term mean anyone who persecutes Christians?  Or maybe those who do so for religious reasons?

In Arabic, the term means “struggle.”  It might mean a struggle for a better Muslim society or a war defending the faith against unbelievers.

If we used the term correctly, I suppose we could say that Paul was as much a jihadist after his conversion, as he struggled to spread the Christian faith, as before it, in the defense of the Jewish one.

One of two conditions must be met in order to apply the term jihadist to anyone.  First, the person must actually be a jihadist.  Paul, a first-century Jew, can no more be a jihadist than Nero could be a Nazi.  In researching for this post, I discovered that most Muslims don’t use the term jihadist to describe the radical, violent Islamist.  Instead, they use a term similar to the word “deviant.”  If this is the case, it would be inappropriate for Christians to use the term for a first-century Jew who was persecuting the Christian faith.  The second condition: you need to be an Arab speaking to Arabs–that way everyone knows what you mean.

If neither of these conditions is met, then we end up using the term like Western politicians and media do: generally and sensationally, to evoke fear and anger, to create unity through demonization.

I get it, we want to make the point that Paul was a really, really bad dude.  But what’s wrong with simply saying he “persecuted the church of God and tried to destroy it”?

If we are to be a light to the nations, it is really important for Christians to use language responsibly and to communicate with integrity.

12 Keys to Writing a Great Exam Composition

The view of an exam composition marker

Of course you want the highest mark possible on your exam composition.

I’ve been marking the BC English 12 provincial exam for many years—more than 20, I think.  I just finished marking the composition for this year’s exam and I decided to write about it while my thoughts are still fresh.

The best way to get a 6 out of 6 on your composition on the English 12 exam is to be an excellent writer.  Not everyone is an excellent writer, but there is much an average writer can do to earn the highest score of which they are capable.

Here are what I consider the 12 most important things to keep in mind when writing Part D: The Composition.

1. It’s all about conflict.

Don’t start your narrative with the alarm clock ringing. This is a very short composition; you don’t have time to lollygag–don’t write about life, write about conflict!

2. Don’t be like everyone else!

Markers read a lot of compositions—well over 700 per day. All have been written by English 12 students and are based on the same prompt. This can lead to a lot of “sameness”—same diction, same topics, same perspective, same approach, same structure. It feels like you are reading the same five essays, hour after hour, day after day.

It stands to reason, then, that if you can write an exam composition that is not like all the others, yours will stand out. Most of my 12 keys are about how to make your composition stand out, in a good way.

3. Go with your fifth idea.

If the composition prompt is, “Beauty can be found in simple things,” do not write about smiles, snowflakes, kittens, rainbows or babies.  Everyone is writing about these things because they are the first things that pop into their minds. The solution: don’t write about the first thing that pops into your mind–or the second. Go down your list.  When you get to oatmeal, socks or the word “and”–you have arrived.  I’m excited just thinking about an essay about the word “and.”

When the prompt was, “Surprises can make life more interesting,” over 90% of the compositions were about surprise birthday parties.  Because of the lack of surprises when they opened the exam book, markers were ready to jump off the building by the middle of the first afternoon.

Oh, and don’t think that you are original if you argue that surprises don’t make life more interesting—negation of the prompt is not clever, it’s cliché.  Speaking of clichés . . . 

4. Don’t use clichés.

No matter what the prompt is, markers can always count on frequent encounters with all of the following:

“When life gives you lemons, make lemonade”
“Life can throw you a curveball . . .”
“Life is a rollercoaster full of ups and downs.”
“Life can hit you like a tonne of bricks.”
“Life is like a box of chocolates . . . .” This one always includes the appropriate textual or parenthetical citation.

By using clichés, you are screaming to your reader that you are an average writer, at best.  You may well be an average writer; there is nothing wrong with that, but there is no sense advertising it.  And who knows, if you are deliberate about avoiding clichés, you just might infuse a bit of freshness that gives your exam composition a boost.

5. Don’t write about death.

It feels like at least a third of the compositions are about death. As markers, we are forced to vicariously experience the death of every family member in every possible combination from every possible disease. Then there are the accidents, usually car accidents. These often involve drunk driving and the loss of a best friend or lover.  Markers don’t like death essays, not because they make the process too difficult emotionally.  Quite the opposite, in fact.

I am pretty sure that many English 12 teachers encourage their students to write a composition that is “emotionally engaging.”  This is not bad advice, but when students hear the words “emotionally engaging” they instantly settle on the death of a loved one, because they can think of nothing else that produces stronger emotions than the death of a loved one.

This is probably true, but most high school students lack the skill to deal sensitively with topics like death.  Consequently, these “death essays” become cliché, which is the opposite of the effect you are trying to achieve.  Save writing about death for when you are a more experienced writer, and when you have more than an hour and a 600-word limit.

6. Don’t preach.

Nobody wants to be preached at.  This is a “stance” issue.  Readers don’t like to be talked down to.  As a marker, I’ve been preached to about recycling and about how I should be Christian, and about why all religions are dumb.  I’ve been told repeatedly that I need to be tolerant.  I’m sick of lectures instructing me how to face the hardships of life and how I should respect the elderly.  I now know what I should think about every issue imaginable.

You can still write about these things but take a different stance—write about when you discovered the importance of recycling.  Or how faith adds meaning to your life.  Write about a difficulty that you experienced and what you learned from it.  See the difference? Rather than tell me what I should do and learn, talk about what you did and what you learned.

7.  Have a strong first paragraph.

Many first paragraphs read like the students are warming up for the real task of writing the essay. They throw down their first thoughts on the subject searching for the point from which they can push off into the first paragraph of their composition. The warm-up or the search for a point of departure should happen someplace else. Write for a few minutes on a piece of scrap paper till you find your direction, then carefully craft the first paragraph to set up what is to come. The reader makes all sorts of judgments from the first lines of your composition; first impressions are powerful—make a good first impression.

I have found that many student compositions benefit from simply drawing a line through the first paragraph.

I’ve read compositions that use “the word” five times in the introduction.  If the prompt tells you to write about beauty or surprises, challenges, maturity, change, dreams or relationships, consider not using that word in the essay at all. Or if you do use “the word,” use it in the last line of the composition.   As with all creative writing, it is better to show than to tell.

8. Be Specific.

Generalities are boring.

This is true for everything you write, be it an email, an essay or a narrative. Don’t say, “We went out for my favourite meal”; say, “We went out for chicken wings and Shirley Temples.” Don’t write, “My boyfriend pulled his car into the driveway”; have him pull up “in the family mini-van” or “his red convertible,” or his “1978 windowless van with the words ‘Reefer Madness’ airbrushed on the side.”  Specifics make a difference.  If you don’t believe me, ask your father.

9. Don’t write a “5 paragraph essay.”

First of all, by the time you are in grade 12 you should never write a 5 paragraph essay (although there is nothing wrong with an essay of five paragraphs). The body of a 5 paragraph essay consists of 3 examples from your life that show the prompt to be true.

So if the topic is something like “Certain situations lead to maturity,” don’t write an essay in which you briefly and superficially discuss three of the following:

  • entry into kindergarten,
  • the magic of puberty,
  • your parents’ divorce,
  • a torn ACL,
  • getting a driver’s license,
  • your first job,
  • your first kiss,
  • the death of a loved one,
  • or moving to, or within, Canada.

Pick one of these (except the death of a loved one) and meaningfully explore the factors and forces that contributed to your maturation.

You understand, of course, that adding or dropping a paragraph from the essay I’ve just described does not fix the problem.

10. Punctuation, spelling and capitalization, etc.

There’s no avoiding the fact that the mechanics of writing are important. The good news is that all of the writing on the exam is marked as a first draft, so if you misspell the odd word, or miss a comma or two, your mark will not go down.

So it doesn’t matter if I don’t know the difference between “then” and “than”?  Or “there,” “their” and “they’re”?

Technically no, but no upper-level writer will ever confuse these words.  So by confusing them, you are proclaiming, loudly, that you are not an upper-level writer.

Work hard to understand basic usage and punctuation rules.

As for spelling, two words that, for some reason, come up again and again in the composition essays are obstacle and opportunity. For years I’ve been telling my students to make sure they can spell these two words correctly—because “obsticle” and “oppertunity” scream out that you may not be a competent writer.

11. Be yourself.

You are not a 47-year-old drug addict living in Detroit.

You are not a soldier storming the beaches of Normandy.

You are not a mother deer concerned about your fawn.

Like death, this sort of writing requires a maturity of thought and style that most young writers don’t have.

The thing is, you are the preeminent expert on one subject–YOU.  You know things about this topic that no one else does.  Play to your strengths.  Human beings, even markers, respond to stories–good stories, well told.  I recommend that you walk into your exam with three stories, true stories.  If it’s a true story, you can draw from an actual setting with actual characters doing actual things (and you can embellish a little).  And don’t just tell me what happened, tell me what you thought and felt as well.  But go even further–What did this event mean?  How did it change you?  You may never have thought about it, but think about it now.  This type of essay is generally called a personal narrative.  It involves a true story and some reflection about what the story means.  Consider practicing these three stories beforehand.  Then, when you see the prompt, adapt one of them to fit.  Or write a brand new one if you are so inspired.

For these reasons, I strongly recommend you write a personal narrative for your exam composition.

12. “In Conclusion”

Don’t end the last paragraph of your exam composition with the words, “in conclusion.”

In conclusion, I must tell you that I’ve read papers that ignored half of my 12 keys to writing a great composition and still earned a 6 out of 6.  They did so because their authors are excellent writers, but these 12 keys will give you your best chance at earning the highest mark of which you are capable.

I hope you read this post long before you take your exam–years preferably–because all of these things take practice.  Good luck, and I look forward to reading your Composition.

Is there any other advice you give your students?

What else has your teacher suggested for writing the composition?

Truth is a Fad

On a recent trip to downtown Vancouver, my wife and I popped into Christ Church Cathedral on the corner of Georgia and Burrard.  I find it hard to resist a cathedral and always try the doors to see if I can get a look inside.  The door was unlocked and a pleasant woman offered to answer our questions.  I asked about the beautiful interior and she was delighted to tell us about the recent renovations.  There was even a photo album.

The original church was filled with local cedar, but in a previous renovation, the original wood had been covered.  The red cedar ceiling had been covered by fiber-board.  It was the same story with the floor.  With this new renovation, the foul fiber-board and hideous carpeting have been removed, and the original red and yellow cedar, covered up for decades, is once again gracing parishioners and visitors with its beauty.

Why had the natural wood of the ceiling and floor been covered in the previous renovation?  It seems preposterous that anyone could think that fiber-board and carpeting were an improvement on the natural cedar, but they apparently did.

Changing Fashions, Changing Ideas

This got me thinking about change, more specifically, changing tastes.  It’s a truism that fashions change, but they don’t just change; they change radically–what is all the rage in one age is hideous and vile in another.  This is true whether we are talking about clothing, church interiors or ideas.

The second truism is that, though we are completely aware of the first, we exempt ourselves from it.  We are somehow convinced that the way we think at the present moment is, at long last, the end of changing “truth”–with today’s thinking, we have arrived.

Previous generations had it wrong, but we have figured it out.  As dumb as it seems now, there was a time when it was generally thought that wood ought to be covered by synthetic materials, and in fifty years the congregation will likely vote to cover the wood with synthetic polar bear fur.  So goes fashion.  So also go our ideas.

The Fashion of Truth

I look at some of the ideas that are spreading throughout culture, replacing the old ones, and I think they are beautiful changes.  Others are more like ghastly fiberboard and anemic pink carpeting obscuring beautiful red and yellow cedar.   And we take these new ways of thinking as absolute truth.  Consequently, in our conversations and disagreements, we condemn those with whom we disagree as bigots and freaks and ogres.  Given that our most recent truth is just a phase, perhaps we ought to be a little less certain about everything–a little less venomous.


Doomed to Relativism?

I believe that there is something under the intellectual fads and whims of our culture that never changes.  Core ideas: that courage is better than cowardice, that it’s evil to harm a child for one’s own pleasure, that the ocean is sublime.

Just because it’s new and in fashion, doesn’t mean it’s objectively true.  I say objectively because, although I’m not entirely sure which ideas are cedar and which are fiber-board, I firmly believe that there is an objective truth.  We will continue down our slide of subjectivism for a time, we will continue to believe that we create our own reality, but I hope at some point we will look back and wonder what the heck we were thinking.  And rip up the pasty carpet to expose the rich wood beneath.


Are students prepared for university?

Education has changed.  I’m teaching differently.  Students are learning differently.

How well do the new approaches to learning and teaching prepare students for university?

Back in the Day

When I first started teaching Literature 12 there were provincial exams.  These were content focussed.  One of the purposes of the exam was to ensure students were prepared for the rigors of university.  There was a prescribed reading list of over 40 literary works from the literary canon extending from Beowulf to a poem by Margaret Atwood.  Students were also required to understand over 100 literary terms and devices.  Back in those days, I did a lot of talking and students took copious notes.  Given that the exam scores would be used to rank students against other students, schools against other schools, teachers against other teachers, exam performance mattered a great deal on many levels.  So we worked very hard on exam preparation.  Students created very detailed study sheets on each of the literary works on the prescribed reading list.  These were collated into large packets and students spent hours reviewing this material.  At the end of the process, they knew a lot, and my students generally did very well on the Provincial English Literature exam.

“Nowadays”

I still teach Lit 12, but I do so very differently.  My class looks much more like a graduate seminar than a lecture hall.  Students discuss and unpack the literary works, rather than listen to me tell them what they would notice if they were as smart as I am.  Through this dialogue, students analyze, synthesize, evaluate, propose, inquire, challenge, concede, admire, and connect the ideas they encounter to life and society.  After we talk, we write.  They use their laptops for this task.  Sometimes they journal; other times they write an academic essay or a personal narrative; we mix it up.  My assessment has changed as well.  We no longer end the year with an exam.  We end the year with presentations–students explore a topic of their choice, making connections to the literature, often beyond the material we worked through over the course of the semester.

Are students today as knowledgeable as in the days of yore?

Last year, I dusted off an old provincial exam, one of the same exams for which I used to work so hard to prepare my students.  We didn’t review the material in class–students didn’t create review sheets for each other, and they didn’t study for it.  I passed it out one day and they wrote it.  I used to mark the Literature 12 exam, so the marks students received on this test were valid.  I was surprised that their scores were significantly higher than those of students of similar ability from two decades ago.  I realize this observation is anecdotal and does not meet the standards of a proper study, but I am convinced by the results.  My students know the literature better now than they did when learning was primarily focussed on content rather than projects and discussions.  With the new approach to learning, students are performing better on exams designed to measure university preparedness.

The beauty is, they don’t just know the content–they have a much broader and deeper understanding of the literature than they used to.   They can talk about it and bring it into dialogue with other artistic expressions and with life and society.  They are better readers and thinkers and moviegoers.  Almost all are reporting great success in university classes.

But not all reports are positive.  One of my students excitedly entered her Literature course at the local university this fall, and dropped it after only a few classes.  It was clear to her that, in this particular university class, the study of her favourite high school subject would involve transferring what she heard in a lecture onto an exam paper at the end of the term.  There is no doubt in my mind that she could have passed this course with an A.  Many of my less gifted students frequently do.

Do modern instructional techniques prepare students for university?

My little experiment shows that if students are expected to know the material well, then they are prepared for university.  If their university courses expect them to analyze and synthesize information and concepts, they are ready.  If they are expected to evaluate ideas, to challenge assumptions, to recognize a strong argument and concede, they are ready.  If they are expected to communicate clearly and effectively, both verbally and in various written forms, they are ready and very well prepared for university.

If, on the other hand, students are expected to passively listen to a professor talk for hours, collecting information to be transferred onto an examination paper at the end of the term, then perhaps my students are ill-prepared for university.

Does anyone really want me to change my approach to teaching literature?

From the Pew: Reflections on the Reformation

Reflections on the Reformation in the Cathedrals 

When I am in Europe, I worship with the Roman Catholics.  On a recent trip, my wife and I attended services in the cathedrals of three different cities–Salzburg, Vienna and Prague.  Because I speak neither German nor Czech, I didn’t get much out of what was said in the sermons, but I did walk out of each service having learned something very important about our Lord and Saviour, His Grace, my faith, our worship–all of which have a little something to do with Protestantism and the Reformation.

Salzburg Cathedral – from exquisite beauty to holiness

The interior of Salzburg Cathedral is beautifully ornate, but not gaudy.  From my humble pew, I looked up and was overwhelmed, and I realized that the encounter with holiness is facilitated by exquisite beauty.

My church at home is very nice, but the priorities are different–utility, stewardship and the comfort of the congregants are the guiding principles for construction.  The seats in my home church are very comfortable. Even in this most beautiful of cathedrals, one always sits on very hard wooden benches.  The back is set at almost 90 degrees to the seat.  The seatback is capped with a board upon which the kneeling parishioner behind you can rest his elbows, so you can’t really lean back.  These were definitely not designed with my comfort in mind.

It begins to dawn on me that very little thought has gone into my experience of this service.

The music that Sunday morning included an organ, a small orchestra, and more than one choir.  These were located behind and above me, so I couldn’t see the musicians.  Obviously, the music was not performed for me–I was graciously allowed to listen in.  In my home church, it’s not about me either, but the excellent praise band occupies the same place as a performance band would, so I have to do the work of remembering that they aren’t there for my listening pleasure.  I sometimes forget.

Pretty much everything in any church service is directed toward the worship of the triune God, but in Salzburg Cathedral, it was so obvious.  The building and the music represent the very best of human achievement, and none of it was for me.  That I can see, hear and enjoy them is pure grace.

 

St. Augustine in Vienna – where divine Grace intersects with nature

The Hapsburgs were christened, married and buried at St. Augustine so this cathedral has seen a lot of pageantry and ceremony over the years.  The mass still reflects a polish and flair consistent with this history.  I particularly noticed this in the treatment of the elements of the Eucharist.

In all Catholic services, the host–what we call “the bread”–is treated with a great deal of respect.  When congregants enter the door, they make the sign of the cross, and they genuflect before entering the pew.  Both these actions are directed toward the host.  Before, during and after the Eucharist, the actions of the officiating priests all reflect the veneration of the host.  This reverence is seen in every Catholic service. In the mass at St. Augustine, all this was done with particular precision and flourish.

This elaborate treatment of the Communion elements is easily explained.  Jesus Christ is present in the elements.  From the New Testament until the sixteenth century, all Christians believed that those who partook of communion somehow received the body and blood of Jesus.  Exactly how this happened was an unfathomable mystery, and according to the church fathers, it was supposed to remain that way.  In 1215, against the warnings of the church fathers, the church accepted transubstantiation, that is, the conversion of the communion elements into the body and blood of Christ, as the explanation for Christ’s presence in the Eucharist.  Protestants reject transubstantiation as the explanation for what happens in Communion, but some of us have duplicated the error of the Roman Church in 1215; we attempt to explain “the unfathomable mystery.”

The explanation for the unfathomable mystery is that there is no unfathomable mystery—the elements are just plain ol’ bread and wine.  These are “just symbols” of the body and blood of Christ, and the entire point of the celebration of the Last Supper is to remember this past event.  This explanation is a result of the modern tendency to separate the physical from the spiritual, to the detriment of both.  The physical elements, then, are reduced to mere “crackers and juice,” and the spiritual dimension of the meal is reduced to mere remembrance.

What God did through the death and resurrection of Jesus is inseparably physical and spiritual.  The sacrament by which we remember this redemptive work is not simply a physical symbol, but also an actual spiritual event where Grace intersects with nature, in which a rational human eats and drinks in faith and so encounters the divine.

Does that sound a little perplexing?  It ought to.  That’s why it was called an “unfathomable mystery.”  While I watched the Eucharist at St. Augustine, I was impressed with the mystery and wonder of it all.  I thought I could bring some of this wonder to my own participation in the Lord’s Supper.

 

St. Vitus in Prague – The Kingdom of Heaven is not a democracy

St. Vitus has a commanding view of Prague.  This Gothic structure stands within the castle complex situated on top of the highest point, so it is visible from anywhere in the city.

Getting into the service was a little bit tricky because church officials stood at the door to prevent tourists from entering the church before mass, while at the same time allowing worshipers to pass.  We fell into both categories, I suppose, but we entered unimpeded because we approached the door with the confidence of a parishioner.  Other camera-toting tourists would have to wait in the long line until the last mass was over.

This was the oldest of the cathedrals in which we worshiped on this trip; parts of the structure date back to the 14th century.   The age of the church facilitated a connection to the medieval worshipers who also looked up into these same ceiling vaults.

I couldn’t understand the homily, so I looked at the windows.

Highest and most central in the central window was a depiction of God the Father embracing his crucified Son.  Beneath these dominant figures were haloed, major saints.  All these figures were attended by angels, arranged according to their heavenly status.  Beneath these were still smaller images of other saints.  From my position in the pew, I looked up to them all, reinforcing my place in the hierarchy of the universe.

I’m way down here—the least of these.

This is a good position for a North American Protestant to be in every now and then.  The window presents a medieval reality–a hierarchy.  I am a product of the Reformation and the ensuing centuries.  So is my church at home.  I was recently disturbed to hear a church leader describe the Reformation as a holy response to church corruption.  He implied that God’s endorsement of this step toward purity was certain.  It is certainly true that corruption pushed Luther and others away from the Roman Church, but we tend to forget that the Reformation was also a movement toward something else.  It was a step toward freedom–freedom from authority–and it would eventually lead to one of our culture’s most serious idolatries.

The Reformation was the first major step toward this freedom, a movement that continues to this day.   Rebellion against the authority of the pope was followed by rebellions against monarchies in 1640, 1776 and 1789.  In 1882, Nietzsche’s madman, declaring that “God is dead,” captured the spirit of our rebellion against the authority of God.  The last century has seen freedom spread in all directions–and some of these are very good directions.  But have you noticed how much we talk about freedom these days?  We sing its praises between innings at baseball games and before NASCAR races–it is now linked with military power and has become the reason we fight all our wars.   Freedom–individual freedom–has become our god; submission has become sin.  Any form of authority is evil–even that of our biology.

In the absence of this hierarchy, we have placed the autonomous individual at the top of a flattened, dark and lonely reality.  This is why individualism ultimately leads to despair.  Individualism dominates Western culture and has seeped into our churches as well.  It is countered by the windows in St. Vitus.  Looking up at the window, I experience my smallness through awe rather than loneliness, and see that the cosmos is full of light and love.

C. S. Lewis says that the medieval model, as presented in this stained-glass window, has a “serious defect,” that being “it is not true.”  Nor, he goes on to say, is any model “true.”

The cathedral and mass do not necessarily represent a wholly true view of reality, but they do represent a different reality from the modern ideas that are woven into the fabric of our Protestantism.  We might benefit from the admission that some aspects of worship in the cathedral do a little better job of engaging our imagination and allowing us to experience the significance of things.

Carl Grimes, Hope and Merry Christmas

At the end of the mid-season finale of the 8th season of AMC’s The Walking Dead, Carl Grimes reveals that he’s been bitten.  Fans are upset; 50,000 of them have signed a petition to remove showrunner Scott Gimple from the show.

Why are they so upset? I think it’s because children–Rick’s children, Carl and Judith, and Maggie’s unborn child–mean something. They are a glimmer of future hope in a very dark world. Perhaps they represent our hope as well, because, for some of us, the real world is very dark too.  To kill a child is to kill hope, and we don’t like that.

The thing is, we shouldn’t be all that upset with Gimple for killing off Carl Grimes. Gimple is doing what all zombie storytellers do–they give us characters that embody things we value, and then they kill them.  This goes all the way back to Night of the Living Dead, in which most of the traditional values are murdered.

Barbara embodies devotion–dead.

Johnny, cynicism of every kind–dead.

Ben, the hero–dead.

Tom and Judy, romantic love–dead and dead.

The Coopers, the nuclear family–dead, dead and dead.  This, of course, includes little Karen, representative of innocence, who slays her mother with a cement trowel.

Conclusion:

If you are watching a show about zombies, get ready for the things you hold dear, and the characters who represent them, to snuff it.

Zombies are trying to tell us something

Zombie narratives force us to face the contradictions between what we profess and what we actually believe. It’s why monsters appear, and why zombies have been so popular for the last fifty years.  On the one hand, we profess that there is no God, no universal truth, no ultimate meaning in life–just what we create for our individual selves.  On the other hand, we believe that families and promises and honesty and courage and fair play matter. We live and act as if things like these are universal and objective.  We believe it’s wrong to deny someone their rights.  We believe that it’s wrong to exploit the weak, that it’s wrong to use women for sex against their will.  We believe it’s wrong to kill and eat other people. We believe these things to be universally wrong.  We profess that life has no universal meaning, but we love the parts in TWD where the characters talk of the “something else” that we are fighting for that goes beyond survival.  Zombie narratives don’t let us get away with these inconsistencies.

Much of what Carl did in the final episode was to make his life have some meaning before he died–I can’t recall exactly, but I think his last words included, “I did this” as he pointed to all the people he safely evacuated from exploding Alexandria. But, unless the show does something totally inconsistent, and lame, Carl will die.  Will his life have meaning?  Will his death?  His future is now certain–he will either be dead or he will be lurching-dead–that’s it.  In the fictive world of The Walking Dead, millions have already met one of these two ends.

But the central question of zombie narratives is this: if there is no transcendent meaning, is our existence really any different from Carl’s? Death is certainly at the end.  Perhaps we can say, “I did this.”  Is this adequate?  Is this all there is?

Don’t get mad at Gimple.  This is all our idea.

Unless, of course, there is a transcendent God in whom Truth and Meaning dwell–who Loves the world and has come to live among us to show us the way out of zombieland.

Merry Christmas

Why Tolkien?

A few pastors from my church are very wise and godly men; they are literate and literary, discerning and spiritually intuitive–then we have those who hosted this past week’s Extra Podcast.

I can forgive their derision of Star Wars fandom, and even the ridiculous claim that Episodes I–III are terrible–evidenced by writing like this.  But I cannot tolerate groundless ridicule of Tolkien.

Mocking those who know the difference between a dwarf and an elf is like mocking someone for reading a book without pictures.

This segment of the podcast was a celebration of ignorance.  The host who has read the most Tolkien couldn’t get past the middle of The Two Towers, yet derides those who are able to read beyond chapter six of the Silmarillion.  He went on to characterize The Fellowship of the Ring as “one of the most boring reads you will ever have in your life.”  This is the sort of reaction we usually get from people who think Transformers: Revenge of the Fallen (2009) was a good movie.  I can only surmise that this host equates reading with the recognition of words on the page.

Did the other host really suggest that Tolkien was a troll?  Motivated only by testing the limits of his future fans’ ability to digest his “drivel”?  In a pathetic attempt at concession, it was acknowledged that some might appreciate Tolkien for his “cultural impact” or his membership in a “Christiany” group called “The Foundlings.”  Allowing that some might appreciate Tolkien’s work because there is “all sorts of biblical imagery,” this host may fail to realize that simple imagery, biblical or otherwise, does little to recommend a literary work, and hardly goes further than lexical comprehension.

Then the bombastic leader of this triumvirate asked why so many people who love J. R. R. Tolkien and C. S. Lewis anathematize J. K. Rowling’s Harry Potter series.

Here’s my answer:  To do so is indefensible.

Harry Potter and Lord of the Rings counter our cultural individualism.

The Harry Potter series and The Lord of the Rings trilogy are in the same genre, and both ought to be read for the same reasons.

In both stories, the protagonists are collaborative and interdependent.  Harry Potter often contributes to victory over evil, but no more than Hermione, Ron, and any number of secondary characters who step up and do their part.  In the climax of the final book, Harry does very little except willingly lay down his life for his friends.  Neville Longbottom, it might be argued, does as much to defeat evil as the eponymous hero of the Harry Potter series.

In Tolkien’s fictive world, one of the main characteristics of the Good is its movement toward fellowship, and that of Evil, toward fragmentation.  The examples are plentiful.  The fellowship begins with the hobbits, including a Baggins, a Took and a Brandybuck.  Hobbits usually stick with their own–not their own species, but their own family group.  But the mixing has only just begun.  When the Fellowship of the Ring is created, it includes not only three types of hobbits, but a wizard, two men, an elf and a dwarf.  Perhaps podcast hosts don’t know the difference, but the dwarves and elves certainly do–and they don’t like each other at all.  Yet, in the context of the Fellowship, they become fast friends.  Difference is celebrated and fellowship is formed.

The mock-fellowship of the Ringwraiths, by contrast, is a fellowship of sameness, a loss of individual identity.  Further, the forces of evil fragment.  Sauron has become a disembodied eye, and the emblem of Saruman is the white hand.  Exposure to the Ring, the embodiment of evil, has separated Sméagol from himself–he has two identities, the other being Gollum.  Is not this imaginative encounter with biblical truth at least as effective as a rational understanding?

One of the reasons Harry Potter and The Lord of the Rings ought to be read is that they counter our cultural individualism.  I think this is a big deal.  We need to counter the cultural narratives with which we are bombarded, narratives that proclaim the autonomous individual as the solution to every antagonist.  In the case of individualism, these stories are countercultural in the same sense and direction that the church is, or ought to be.

Escape FROM Reality

It is important to counter individualism, but even more vital to challenge the materialism that so dominates our culture.  Those under the spell of materialism slander these stories as being an “escape from reality.”  Tolkien was familiar with the argument that escape through fantasy literature is harmful.  His response to this charge is found in his essay “On Fairy Stories.” He accepts the term “escape,” but he says it is not an escape from reality, but an escape to reality.  His argument is that we misunderstand reality, and in so doing, misunderstand the nature of escape.  Materialists will certainly be threatened by fantasy literature, but those who believe in an “enchanted” reality, as do Christians, ought to embrace it.  Those who feel compelled to mock Tolkien and authors like him, ought to take an honest look at their attitude to determine if they are possibly walking too deeply into materialist territory.  If so, one of the best ways to recover enchantment and to escape materialism is to read the very books you mock.

Escape TO Reality

G. K. Chesterton reminds us that there is no such thing as an ordinary thing.  In a faerie story, we encounter a golden apple, and this brings back to us the “forgotten moment,” and the ensuing thrill, when we first discovered that apples were green.  We come to see the creation, not as slavishly following a deterministic law, but joyfully producing green apples again and again, like a child who wants to be thrown into the air one more time.  “Again . . . again . . . again.”  It is not law, but “magic” that we find in creation.  There is no wonder associated with law, but there always is with magic.  It is because they ought to evoke our sense of wonder that Chesterton can claim “[a] tree grows fruit because it is a magic tree.  Water flows downhill because it is bewitched.”

So modern individualism and materialism are countered by Tolkien and Rowling.  Tolkien also challenges modernism’s Myth of Progress.  Our culture believes that we are progressing–that humanity is getting better and better.  We take our advances in technology, transportation, communication and medicine as proof of this progress.  These are indicators of certain kinds of progress, but just watching the evening news tells us that we are not making progress in some very important ways.  In many respects, we are little different than when we lived in caves.  We still lie, cheat, steal and kill.  Tolkien’s world is an ancient world, and the men and women of ages past were better than we are today.  If you read the Silmarillion, you learn of the great ancient race of men and women called the Númenóreans, superior to modern men in every way.  But they were proud, and this resulted in their downfall.  This is the pattern of human beings–we can make and do some awesome things, but we never change morally–we always fall.

These are just the beginnings of a thousand reasons why, when asked, “What book would you want with you if you were stuck on a desert island, and you’re not allowed to say The Bible?” I would say, without hesitation, The Lord of the Rings.

We don’t get people to be less individualistic, less materialistic, less confident in progress, by telling them to stop being that way.  In order to effect change, people need to be convinced at a level deeper than reason, deeper than emotion, deeper even than belief (where things like “worldview” live).  We need to live out of a different story, and a transformed imagination.  We have the Bible, and we also have experience and tradition, but it is a foolish thing to read these with reason alone.  Lewis would probably say that to look at these imaginatively is at least as important as exploring them rationally.

Let’s talk Glorfindel–apparently deserving of ridicule by one of our would-be spiritual leaders.  Contrary to his representation on the podcast, Glorfindel finds Frodo and his companions and rushes them along to Rivendell, taking breaks only for the exhausted hobbits.  When they are set upon by Black Riders, Glorfindel sets Frodo upon his horse for a mad dash to Rivendell.  When the Ford of Bruinen separates the exhausted Frodo from his pursuers, the Ringwraiths attempt to lure him back to them.  Frodo musters his last bit of strength and says, “You shall have neither the Ring nor me.”  Here we have, perhaps, the weakest representative of the forces of good standing before nine of the top ten representatives of evil in Middle Earth.  And he defies them.  He defies them with confidence, even though he has no clue that the river has been enchanted.  For all he knows, they can cross it as easily as he did, but he defies evil anyway.  It is not in his own power that Frodo is confident.  The Ringwraiths have never read Romans.  This is a minor event in the story, yet, because of Tolkien’s genius, we can see such profound theology behind almost every act, or under every mushroom.

It seems to me that Tolkien’s work should be regularly referred to in sermons as we educate not only the mind but also the imagination.  And the Silmarillion ought to be one of the central resources for every Master of Divinity degree.

Lest my readers misunderstand, this blog was written knowing full well that the hosts of the Extra Podcast were probably not serious in their ridicule of Tolkien or his admirers.  However, if there is even the tiniest piece of sincerity in their critique of Tolkien, or in that of their listeners, I submit this rebuttal. And I recommend From Homer To Harry Potter: A Handbook on Myth and Fantasy (see link below) for further reading on this subject.

 

I couldn’t lose this one.

http://trentdejong.com/i-couldnt-lose-this-one/

We are not addicted!

“We are not ADDICTED!” My students were upset. They didn’t like the tone of the article we had just read, about the effects of continuous access to the internet and social media. For the author, the effects were mostly negative, especially on young people.  When I said that the author might be right–that they were blind to how social media was programming them (I think I may have used the word “addiction”)–they were indignant.

They wrote off the author and their teacher as being part of the older generation that didn’t understand the technology–we were, consequently, blind to the vast benefits of the internet and social media.

My Social Media Feeds

A few days later I read an article that popped up on Facebook called “‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia.”  In it, Paul Lewis tells us that the people responsible for making Google, Twitter and Facebook so addictive are disconnecting themselves and their families from the internet.  They are, apparently, even more concerned about the negative effects than I am.

Another article came across my desk through Twitter: “How a half-educated tech elite delivered us into chaos.”  Here John Naughton argues that tech leaders know a lot about science, technology, engineering and math, but very little about the Humanities–and had they received a whole education–studied some Literature, History and Philosophy–they wouldn’t be surprised that their inventions have not taken us to the bright future they anticipated.  They thought they were providing a beautiful link between producers and consumers for the benefit of all.  Instead, we have ended up with “fake news and the weaponisation of social media.”

On my Facebook again, I came across the New York Times video called “How China is changing your internet.”  It warns of an Orwellian future if we allow internet giants to go the way of China’s “Super Apps.”  Since we have been reading Orwell, I thought I’d show this video in class.

Either Way, I Win

Perhaps I am too hard on the internet and social media.  Perhaps I have been reading too many articles bashing these sacred cows, and not enough material from the other side.  But if I am, it’s because the algorithms that run the social media platforms are feeding me a steady diet from only one side of the argument.  These articles confirm and reinforce my views and biases–they begin to shape my thinking.

Either my negative view of the internet and social media is accurate, or my inaccurate negative view has been shaped by the internet and social media.

Either way, I win.

Biblical Inerrancy: The Spoken Word or Revelation of God?

Growing up in the Christian Reformed Church (CRC), I don’t recall ever hearing the term “inerrancy,” not from the pulpit, and not in catechism classes or at youth conferences.  At Calvin College, I stayed up many nights until 3 am discussing all sorts of theological issues, and I don’t think a single one of them had anything to do with biblical inerrancy.  I didn’t know that in 1978 there had been a major conference that generated The Chicago Statement on Biblical Inerrancy.  You’d have thought that this sort of thing would have trickled up the highway to my dorm room by 1980.  It didn’t.  And it didn’t really come up in the following decades either.

Now, I hear the term inerrancy a lot.  Part of the reason is that I go to a different church, but it’s also because the internet has me reading what other Christians are writing.  I get the sense that those who use the word inerrancy read the Bible differently than I do.  I struggle with embracing the idea of biblical inerrancy, not because I think the Bible is wrong, but because of something else.  I couldn’t put my finger on it until I read James R. Payton Jr.’s Getting the Reformation Wrong: Correcting Some Misunderstandings (IVP Academic, 2010).

Inspiration or Revelation?

Biblical inerrancy is a bigger deal for some Christians than for others, and it comes down to what you think the Bible is: revelation or inspiration.  Of course, it’s both.  It is not an either/or proposition.  But they are not the same thing, and one will often (perhaps inevitably) be subordinated to the other.

After reading Payton’s book, I realized that in the CRC, as I experienced it, the Bible was revelation, more than inspiration: God reveals himself through his creation–general revelation.  He reveals himself through the written word–special revelation.  And he reveals himself through his incarnate word–his Son.  God reveals himself.  Biblical inerrancy isn’t much of a concern for those who see the Bible as revelation because the focus is on the relationship between the readers and the person we find in its pages–inerrancy is not descriptive of a relationship.

The inerrancy of scripture is necessary and logical when we see the Bible as inspiration, or “the inspired Word of God.”  In this view, God speaks to us through scripture, and because he is all-knowing and doesn’t lie, anything contained therein is objectively true.  The Bible is, therefore, inerrant.

The inerrancy of scripture is logical if the Bible is inspired. But is this primarily what it is?

These different views of the Bible have a long history. According to James R. Payton Jr., the Reformers saw the Bible as revelation.  Later, after the Reformers passed on, the Protestant scholastics emphasized the divine inspiration of scripture in order to defend Protestant ideas against Catholic attackers, who used scholastic methods to argue—a “fight fire with fire” approach.  Because of its emphasis on objective truth, the Bible as inspiration is a more useful tool in such debates.

To illustrate these different approaches, Payton uses the analogy of studying a frog:

One way is to watch frogs for hours on end; the other is to dissect them.  The Reformers watched the frogs, and they kept doing so, repeatedly and at great length.  The Protestant scholastics dissected the frogs and probably came to quicker conclusions about what could be said about them; the frogs never jumped again, though.

These contrasting approaches have resulted in different views of scripture that are still with us today.

Although Christians generally believe that God had an active role in shaping scripture, the degree to which He was active in the process is a subject of debate.  Was it 100% God, or were the historical human authors significantly responsible for the contents?  Zondervan’s Five Views on Biblical Inerrancy explores some of the main points on the continuum.  Where you end up on the continuum has something to do with how much you take the idea of revelation into account to shape your ideas of inspiration.

Another essential question is, what is it that God inspired?  Did he inspire the text, the authors, or the overall sense or spirit of scripture?  If one starts with the primacy of revelation, the spirit or sense of scripture is the object of inspiration.  When inspiration is the starting point, the object of God’s inspiration moves from sense toward text.  And again, if God actively inspired the text of the Bible, then it can’t possibly contain erroneous information–it must be objectively true.

Those who are concerned with inerrancy will usually include the text as one of the objects of God’s direct inspiration.

Informational or Relational?

When we think of the Bible primarily as revelation, we are emphasizing the relational dimension of scripture.  The Bible is a revelation between persons–God and his people.

When we think of the Bible as the Word of God, we think in terms of God speaking to us through scripture, telling us things that we need to know.  The scriptures are thought of, primarily, as a source of true information.

Is the Bible, primarily, the Spoken Word or the Special Revelation of God?

James R. Payton Jr. says that the emphasis on the inspiration of scripture leads us to a “depersonalized” reading of the Bible.  It can have a significant effect on, among other things, the way we understand sin and faith. With a relational reading of the Bible, sin is understood as unfaithfulness to God and the effect on this relationship is like that which occurs in any broken relationship—estrangement.  These terms describe a personal relationship.  The Bible as inspired text may result in a depersonalized view of sin.  Sin is thought of as an infraction against God’s divine law and the effect is guilt.  The terms, and the feelings they describe are less personal.

How one understands faith is also affected by the relationship of revelation to inspiration.  The relational understanding of faith is thought of as “cleaving to God.”  The more depersonalized approach understands faith as the acceptance of right doctrine.  Both perspectives are found in the Bible; they don’t cancel each other out, but the emphasis of one idea over the other is not without effect.

One last example, on which I have previously written.  Do the sermons in your church end with applications or implications?  It’s not so much the word, as it is the idea behind the word.  The term application implies an impersonal adhesion of the object to the subject.  We stick the lesson onto the listeners like a band-aid onto a scraped elbow.  It makes the recipient feel better, but it doesn’t do much else.  Although implication suggests a lot more ambiguity than application, it is usually a better term because the clarity of the application is often achieved through a reduction of the truth, either factual or moral, to information.  Implication is not about how the sermon fits into, or onto, my life; it’s about how I fit into the story of the Bible and into a relationship with the person of God behind the scriptures.  Implication bridges the gap between subject and object because I enter the story and it enters me–I experience the story and in so doing, I encounter the truth.

There was a long stretch in my life where I hardly read the Bible at all.  My problem was that I thought of the Bible as if it were primarily informative and the reason for reading it was to acquire the right knowledge. I didn’t feel as if I needed more knowledge than I could glean from sermons and books about the Bible.  I’m much more inspired to read the Bible when it is about a person-to-person relationship, about finding in its pages the God who made us and loves us and seeks a relationship with us.  I have a strong desire for this relationship.

It’s sort of like the difference between reading an encyclopedia and reading a long letter received from a distant and cherished friend or lover.  Are the contents of each true–inerrant, if you will?  Sure they are, but it is not the veracity of the information in the love letter that motivates you to devour every word–even the description of yesterday’s weather.

 

Laptops in the Classroom

As a teacher with a classroom full of laptops, I had to read the article in my Twitter feed entitled–“Ban the Laptops, Yes.”   This article by Mark Bauerlein cites a study that appeared in Education Next, under the title “Should Professors Ban Laptops?” which suggests that the implementation of classroom technologies, such as laptops, may be detrimental to student learning.

The results were striking—and disappointing for people who believe that better classroom technology and implementation will produce higher student achievement.

I was troubled and confused–troubled because last year the high school at which I teach required every student to come to school with a laptop, confused because, from my experience, laptops are improving student learning.

I read a little further.  The article explains, “The decisive measure was performance by students on the final exam.” Ah, there it is.

I was no longer troubled or confused.

It is clear from the study that the classrooms wherein laptops are causing problems are the ones in which professors are lecturing and students are taking notes.  In this context, exams are a measurement of how well a student transfers the content of the lecture to the examination paper.  The study shows that technology interferes with this simple process because students “update social-media sites, order takeout, and watch YouTube videos during lectures.”  It is easy to see why the study concludes that “unrestricted laptop use reduced students’ exam scores.”

This all leaves me with a few questions:

Why are laptops a detriment to student learning, but an indispensable tool for the professors who deliver the lectures and who research and publish their papers, articles, and books?  In the so-called “real world,” personal computers and other digital devices are used by adults all the time–presumably because they are effective tools for accomplishing important tasks. Is it simply that adults are more mature and therefore better able to resist the temptation to watch YouTube videos?

Or is it because the work that adults are doing is relevant and the results really matter, and because the work is challenging, requiring creativity and critical thinking?

Is it because it’s personal–involving the whole person–the unique gifts and abilities of the adult individual?  Or is it because it’s interpersonal, involving collaboration with others?

Is it because it’s complex, varied–interdisciplinary?

Or is it because the responsibility for the success and failure of our efforts rests heavily on our shoulders?

Perhaps the problem is not the laptops, but a pedagogy that lacks all of the things that keep adults motivated to do good work.

In my grade 9 humanities class, we are studying World War I.  One of the students’ tasks is to produce several documentary videos telling the story of the First World War–the causes, key figures and events, and the effects.  Their laptops are vital tools in this project.  They research their topics using the internet.  They write and edit the script for their documentary film using a word processor.  They find out how to properly cite their sources using online resources.  They record their scripts, then create and edit videos on their laptops.  In the process, they give and receive feedback as to how to improve their documentaries.  They then share these videos on a social media platform so that others may learn from their work.  The use of technology isn’t to make learning about WW1 more fun, nor is it a distraction from the learning.  It is a vital tool in the process of completing a complex project where students learn, not only about World War 1 but about research, primary and secondary sources, how to discern internet sources, documentary script writing, plagiarism, providing feedback, voice recording, video editing, and a lot more.

Students are not passive; they are active and motivated to complete a project of high quality–they don’t have time to check in on their social media accounts.  For a student passively listening to a lecture, it’s almost impossible to resist the lure of the distractions.

What is the problem here?  Are the laptops the issue?

If your primary task as a teacher is to cover content and communicate information, and if your students are passively listening and taking notes, then this study shows that it is important that you “should draw back, return to pencil and paper and chalkboards.”

But if your students transform, rather than transfer information; if the boundaries between your classroom and the “real world” are blurry; if what your students are learning will have relevance 30 years beyond the exam. . .

then bring in the laptops!

 

© 2018 crossing the line
