Blog

Star Wars Seven

Writing/literature - Posted by Jim Baker Fri, January 01, 2016 11:03PM
(cross-posted from the ZBB)

I really enjoyed the film. The one big fault I would find with it is a lack of originality - it was really well done, but well done using ideas from the existing films rather than coming up with ideas of its own. Fault the prequel trilogy all you like, but you can't deny there are a lot of new ideas in there: not necessarily good new ideas, but new ideas all the same. Pretty much everything in this new film, however, felt derivative: BB-8 was just a souped-up R2-D2; Kylo Ren was (even in-universe!) just a cheap knock-off of Darth Vader - and these were amongst the film's more original elements!

I'm not sure the lack of originality was necessarily a bad thing. Like I said, I really enjoyed the film, and so have lots of other people. It's not a model that can last, though: this series isn't going to get very far if it just keeps rehashing old ideas.



Some blog posts

Religion - Posted by Jim Baker Sat, April 04, 2015 12:08PM
Some good blog posts I have read today:

Spontaneity vs Craftsmanship [in Christian art]

The Moral Urgency of Anna Karenina

Thoughts on Note-Taking During Sermons

America's Muddled Morality About The Unborn

And also this one, though I'm not sure how much I can agree with it.

The creation of man

Religion - Posted by Jim Baker Fri, November 14, 2014 07:53PM

http://www.worldmag.com/2014/11/interpretive_dance/page1:

"The BioLogos website states, “Genetic evidence shows that humans descended from a group of several thousand individuals who lived about 150,000 years ago.”

But Stephen Meyer, a Discovery Institute leader of the intelligent design movement, told WORLD BioLogos leaders are using “an unsubstantiated and controversial claim to urge pastors and theologians to jettison a straightforward reading of Genesis about the human race arising from one man and one woman. They think ‘the science’ requires such a reinterpretation, but apart from speculative models that make numerous question-begging assumptions, the science does no such thing.”"

The version of Genesis in my Bible has the creation of humans of undetermined number and both sexes (1:27), followed by what appears to be a separate account of the creation of one particular man, Adam, and subsequently his wife Eve (2:5-25). That these are separate accounts is obvious from the fact that the relative order of creation differs between the two: the creation of man precedes the growth of plants (and possibly animals) in chapter 2 but follows them in chapter 1. Possibly Genesis 2 refers to a "second" act of creation in Eden specifically: there is no particular reason to believe that the "mankind" of chapter 1 refers to the Adam and Eve of chapter 2, though of course it's a possibility.

Chapter 4 introduces us to Cain, who is scared someone might kill him (v14), implying there are other people around who might do so; who then marries a previously unmentioned woman (v17); and who subsequently starts building a city, implying there are plenty of people around who might want to live in it. One possible interpretation is that Adam and Eve had loads of children we simply aren't told about, but another - and it seems to me the more natural one, given chapters 1 and 2 - is to assume that there were a number of other people around, separately created by God.

The idea that there existed humans who were not descendants of Adam and Eve is therefore not definitely in contradiction with the text, and is arguably supported by it. Dismissing as unbiblical anyone who holds this view is, consequently, a somewhat careless thing to do.

The real difficulty comes in trying to reconcile the genetic data with the story of Noah.

7:50

Religion - Posted by Jim Baker Sun, July 20, 2014 12:04AM
The "seven-fifty" in "seven-fifty.net" does not refer to a Bible verse, but if it did these are the ones it could potentially refer to (NNIV):

Numbers 7:50 - one gold dish weighing ten shekels, filled with incense;

1 Kings 7:50 - the pure gold basins, wick trimmers, sprinkling bowls, dishes and censers; and the gold sockets for the doors of the innermost room, the Most Holy Place, and also for the doors of the main hall of the temple.

Nehemiah 7:50 - Reaiah, Rezin, Nekoda,

Luke 7:50 - Jesus said to the woman, “Your faith has saved you; go in peace.”

John 7:50 - Nicodemus, who had gone to Jesus earlier and who was one of their own number, asked,

Acts 7:50 - Has not my hand made all these things?



Re: changes to GCSE Eng Lit syllabus

Writing/literature - Posted by Jim Baker Mon, May 26, 2014 08:35PM

Gove has made many bad decisions, but I'm not convinced this is one of them. Yes, To Kill A Mockingbird is a Great Novel - but it's not as if it's been suggested that it be replaced with something that's complete rubbish. It's a shame that kids won't get to read it, but surely that's made up for by the fact that they get to read another work of equal merit instead. Or do people seriously want to argue that the nineteenth century classics in question are, in fact, badly written or irrelevant or whatever? If so, I know who I think are the philistines.

At the end of the day there are only so many books you can fit into a GCSE syllabus. You're going to have to make difficult choices like Charles Dickens vs Harper Lee or whatever. It's just the practicalities of the thing. I rather suspect that if Great Expectations or something had been the number 1 set text for however many years and it was mooted that it be replaced, people - the same people - would be complaining just as much, on the same grounds: that teenagers are being denied good literature. But that isn't the case. Thinking maybe we should have another book in its place isn't the same as thinking To Kill A Mockingbird is terrible. It's not as if it's universally considered to be The Single Best Book Ever Written or anything anyway.

The main argument in favour of some of the current books seems to be that they're easier for teenagers to engage with or whatever. Perhaps reading a book that's too hard will put them off reading forever! This seems a bit condescending to me - it's as if it's saying 15-year-olds can't actually cope with nineteenth century stuff, so give them something easier. Likely there are some kids who wouldn't be able to cope - but that doesn't mean everyone else should be denied access to the "harder" stuff. And I rather suspect that if you've reached the age where you take GCSEs and you're still at risk of being put off reading by a book that's too hard, reading great literature probably isn't ever going to be your thing anyway.

There's also, relatedly, the length argument. There's not much time: short books are better. Hence the popularity of Of Mice and Men. And it's true, a lot of nineteenth century books are very long - but not all of them. And it's not as if all the reading has to be done in the classroom, or that every last part of the book need be analysed. Most people should be perfectly capable of reading even a very long book over the course of a term or two as homework.



-

Politics - Posted by Jim Baker Tue, April 29, 2014 10:00PM

And it's the second verse

Nobody knows the words

Sing anyway!

La-la-la-la-la-la

Blah-blah-blah-blah-blah-blah

Oh good we're almost there

God save the Queen!



Linguistics and prescriptivism

Linguistics - Posted by Jim Baker Fri, January 10, 2014 01:13PM

There is an article in the Telegraph this week in which Lynne Truss complains about the failure of linguistics to involve itself in prescriptivism.

Her particular prescriptive argument here seems to be that if we are not careful with the way we write – or at least, if we are not careful with what spell-checking software we choose to use, which isn't really the same thing – we may end up with such ambiguities as being unable to distinguish between “everyday” and “every day”, “anyway” and “any way” etc. There are a few ways I can think of in which one might react to ambiguity in language. One is to throw out absolutely everything that could lead to ambiguity, which would in fact be tantamount to throwing out the whole of language and starting again. Another is to accept it unless it actually causes problems. But the examples Truss gives don’t actually cause major problems, because with most of them it’s always going to be clear what is meant, and those cases in which it might not be are rather contrived and not likely to actually occur very often.

If we actually look at linguistic facts – and presumably when Truss calls for linguists to get involved in prescriptivism she does so because she recognises that we probably have a better understanding of actual linguistic facts than most people – we find clear examples of this. Spellings like “any body” for “anybody”, which presumably Truss would condemn, are (I believe) frequent in the works of at least one major author (Jane Austen), something which does not appear to have caused many people to throw Pride and Prejudice out of the window in a fit of confusion at what the words mean. The ancient Romans and Greeks – and the modern Chinese – did/do not use spaces at all, writingallwordsinonelongstringlikethis, and this does not seem to have led to a great deal of cultural impoverishment.

I strongly suspect that if linguists were actually to get involved in trying to “improve” language, traditional prescriptivists wouldn’t like it very much. (I am now going to get slightly Marxist.) The problem is that prescriptivism, typically, isn’t actually about “improving” language, or “trying to prevent its decline”. It just deludes itself that it is. It’s actually about preventing members of lower-status social groups from enjoying the benefits and privileges open to higher-status groups. It’s about taking one variety of language, the choice of which would appear entirely arbitrary if we were looking at purely linguistic criteria, and saying “this way of speaking/writing is better”. In fact the only tangible way in which this variety is “better” is that it is associated with higher-status groups – people from a certain social background or with a particular type of education. But now that it’s been set up as superior – a sort of superiority that typically has a quasi-moralistic tone to it, or else is used to cast aspersions on a person’s intelligence – it can be used to exclude people who don’t speak or write it: “You don’t speak or write correctly, therefore you can’t come to this university / work in this job.” Of course, it’s no fault of the people who don’t speak or write in this way that they grew up in an environment where they learned to speak differently, or went to a school where the standard rules of writing were not so well taught.

This isn’t how Truss presents things, of course. Indeed, she implies a rather more egalitarian motive – prescriptivism has a positive role in “remedy[ing] literacy levels” and so on. Now, let us not pay too much attention to the fact that literacy levels in this country actually seem to be improving. Would typical prescriptivism help matters? The answer is that it would almost certainly make things worse. Prescriptivism rests on a sizeable foundation of basically arbitrary rules like the following:

– The apostrophe is used:

(a) to mark possessives, except when it isn’t (its not it’s);

(b) to mark omitted letters, except when it isn’t (can’t not ca’n’t).

Forcing people to waste time learning such rules clearly does nothing to help literacy, and as a linguist I am strongly inclined to suggest that the best thing to do may be to get rid of apostrophes altogether. I might also, for instance, suggest a more regular spelling system. But these are precisely the sorts of things prescriptivists don’t want. Their interests are in maintaining the status quo, not in creating a new system that might actually serve people better.

Let’s look at some other examples of the sort of thing a prescriptively-minded linguist might be tempted to recommend. A couple of things prescriptivists tend to be big on are “clarity” and “logic”. Let’s take “clarity” first. Standard English currently has only one second-person pronoun in common use, namely you, which is both singular and plural. Now, there are a number of instances in which it might be fairly useful to have a distinction here, as most languages do, e.g. when talking to two people: “You, come with me – you, not you.” Many English dialects actually do make such a distinction, saying for instance you in the singular and youse in the plural. I happen to think this is eminently sensible and everyone should do it – and it is certainly defensible in the interests of clarity. But I imagine your average traditional prescriptivist would be horrified by such a suggestion! It goes exactly against everything they believe about language – that the standard variety is always superior to non-standard ones.

Secondly, let’s look at “logic”. There is a recent example of a school banning the use of various spoken forms including “we was” and “you was”. (They actually spell these things “we woz” and “you woz”, which represents exactly the same pronunciation – is even, in fact, a more logical way of spelling that pronunciation – but cleverly makes it look even more non-standard and is thus a good way of stigmatising these forms even further.) But from a logical perspective, we might argue that it makes more sense to use was consistently as the past tense of be, because no other verb makes a distinction along the lines of that made by was and were. Again, no traditional prescriptivist would ever actually endorse this. Logic isn’t really the point; the relative social status of the two varieties is.

I don’t actually think linguists should be devoting their time to telling people how to speak or write, but, as I’ve illustrated, if we did there’s no guarantee that what we’d say would be the kind of thing Lynne Truss would like us to be saying. We can be content to describe (and explain) language. Which, contrary to Truss’s slur at the end of her article, is actually what most researchers in most fields do with their respective objects of study. I have no clear sense of what the average epidemiologist does all day, but I would guess that a large chunk of them devote all or nearly all of their time purely to describing diseases, because without that not much can be done to prevent them. Actually curing diseases is largely someone else’s job, namely doctors’. If a building fell down I would be at least as much inclined to blame the engineers and the builders as the architect – and of course, there are lots of academics who are more concerned with describing existing architecture than with trying to build new buildings. Certainly it is the job of a subatomic physicist to describe subatomic particles, not to try and invent a better way of constructing a universe. It is the job of a historian to describe and explain history, not to try and change the past. Linguistics is not actually alone in being a descriptive discipline.




Gender segregation

Politics - Posted by Jim Baker Sat, December 14, 2013 11:27PM
There is a bit of a furore at the moment over gender segregation in talks at UK universities; see e.g. http://www.bbc.co.uk/news/education-25378713.

Perhaps I'm missing something, but I can't really see how this is meaningfully different from having female-only colleges or designating the student union Women's Campaign meetings (feminist meetings, essentially) as "women-only spaces", both of which exist in Cambridge. Arguably these are even worse, as one gender is excluded entirely. Now, either or both of these may be problematic for various reasons, and equally are defensible for various other reasons, but I don't see anybody in the media attacking them. Perhaps this is just out of ignorance, but it does lead me to suspect that this is as much an anti-Muslim thing as anything else. It's OK for (mostly white) liberal feminists to enforce segregation, but for brown-skinned strongly religious people to do it - outrageous!
