I was surprised to see this article on concussions in hockey. In it, Hall-of-Famers Eric Lindros and Ken Dryden argue for radical action on concussions, including a seemingly unthinkable ban on hitting. You might expect that from the intellectual goalie Dryden, but it’s a shock coming from Lindros, who made his living as a physical player. Yet after a career derailed by concussions, he now plays non-contact hockey for recreation, and he has been surprised to find that he still enjoys it.
That brings up Thing I Don’t Get About Hockey, #973: much of the hockey that actually gets played is non-contact. Women’s hockey doesn’t have body checking, young children’s leagues don’t, and neither do many recreational leagues. And yet, whenever anyone discusses violence in hockey, experts talk about the physical side of the game as integral: take it away and it’s not the same sport anymore. By implication, then, women, children, and part-time players are playing some other sport, and those ads we keep seeing during commercial breaks, pleading with parents to let their kids play hockey, are in fact encouraging them to play some weird, de-fanged facsimile of it. It’s hypocrisy: the sport’s gatekeepers want to keep the game violent, but they also want to maintain the idea that it’s a wholesome part of Canadian society.
To be clear, I don’t for a second believe that removing physical contact from the game will happen any time soon, but having respected players talk seriously about it will shift what’s acceptable to discuss. The game is going to need major changes to become safe, and so far the people who run it, unlike the NFL, haven’t been willing to make changes that affect how the sport is played. But if respected figures are going around talking about a massive overhaul, then those play-changing alterations will start to seem more palatable.
I’m also glad that Dryden brings up another point I wish someone would acknowledge: just how much the game has changed over the years. He notes that if you look back at games from the ’50s and ’60s, there was actually less hitting than there is now. That’s something I’ve noticed whenever the CBC shows old games late at night. Even in games from as late as the 1980s, you can see distinctly less physicality. Specifically, there seems to have been a different philosophy, with players often passing up the opportunity to hit. Today, a player in the same position would take the hit every time; there’s no question of tactics or choosing when to hit. You hit anyone you can, any time you can; that’s just how it’s done.
But Dryden points out that it’s not hard to understand why there was less hitting: in the original-six era, shifts were much longer than they are now. Players didn’t have the energy to hit everything that moved, and even if they had wanted to, no one could build up enough speed to hit particularly hard.
Oh, and this leads to Thing I Don’t Get About Hockey, #1,562: hockey is an extremely traditional sport, yet no one seems to care that it keeps changing. I’ve always thought the fundamental difference between hockey and baseball is embodied in the sports’ implements: the bat and the stick. Both were traditionally made of wood. But as soon as someone considered making bats out of something else, the major leagues immediately banned the idea and assured everyone that bats would forever be made of wood. In hockey, they started making sticks out of carbon fibre, and no one said a word. Players quietly made a revolutionary change without much pushback from the traditionalists who howl at any rule change. Now wood sticks are as rare as players with perfect teeth.
Similarly, if you ask anyone, they’ll say they never want the game to change, even though it’s now totally different from what it once was. Hopefully, more people will recognize how much and how frequently the sport changes on its own, and realize that we can improve it when necessary.
Sunday, August 19, 2018
Who’s A Good Ancestral Wolf?
There’s this new movie called Alpha, about the first-ever pet dog. It looks like another of those movies that goes straight for heartwarming, without any big stars or promises of explosions or special effects. I don’t know how it got a summer release instead of going straight to download, or straight to the Walmart $5 DVD bin that somehow keeps going no matter how many people abandon DVDs.
And, of course, it has dogs, which can carry a movie all on their own.
I wondered if it might even have religious overtones. I know that seems like a stretch, but in today’s movie industry, if a movie doesn’t have stars, action, comedy, or art-house chic, it’s a mystery how it even exists, and I start to wonder whether it’s an extension of some brand that’s really popular in the American Christian subculture.
But then it hit me: this movie is set, according to the ads, 20,000 years ago. Apparently that number involves some guesswork, since dogs might have been domesticated before or after that date. But still: 20,000 years. That’s an American family movie taking place 20,000-screw-you-Young-Earth-Creationists years ago. So it’s not just non-religious; it’s going to offend the super-Christian. That’s kind of surprising. For entertainment aimed at the entirety of American culture, I’m sure they would take science’s side and assume that a Christian-fundamentalist boycott wouldn’t amount to much. But for family-friendly entertainment, I would think they would rather keep hard-core religious families on side and just let Neil deGrasse Tyson rant about the inaccuracies, especially since this movie will appeal to people who aren’t really in love with the usual Hollywood fare.
So this is kind of a weird concept: heartwarming family science. And I just read a review that pointed out that the stars in the sky are accurate, even accounting for how the stars have drifted over those twenty millennia. So now I’m picturing Neil deGrasse Tyson in the theatre, weeping along with everyone else, having never seen such scientific accuracy in a movie.
Tuesday, August 7, 2018
The Twenty-First Century's Yesterday
Comic book author Warren Ellis pointed out something interesting. He's looking forward to next year, because it's the year in which the original Blade Runner takes place. No, it's not because he's saving up for one of those cool fluorescent-light umbrellas. He points out that it's the last of the big science-fiction futures. We all lived through 2001 without any monoliths, and went right through 2015 without hoverboards, but next year will be our last future year. I would point out that it's not just Blade Runner: Akira also takes place that year. That will be -20 geek points, Mr. Ellis.
Of course, it's not really the last sci-fi future: Buck Rogers is still waiting for us in the twenty-fifth century, and all of Star Trek has yet to happen, even Khan conquering a quarter of the world, which was supposed to take place in the 1990s; he's really taking his time about it. The point is that when it comes to near-future science fiction, the predictions run out after the first couple of decades of the twenty-first century.
That's not too surprising. Back in the twentieth century, the year 2000 was pretty much what people pictured when they thought of the future, so there wasn't much need for a sci-fi author to set a story any further out; that would be like saying the story took place the day after the future. Now those of us who lived in the twentieth century have to get used to the idea that we're living beyond what was once our future.
(As an aside: I mentioned the light-up umbrellas from Blade Runner above even though I was pretty sure I remembered someone actually selling them. So I looked it up and found that they were indeed available at one point, but are no longer sold. That seals the argument that we are now ahead of the future.)
Ellis thinks this development will be liberating, because there will no longer be any expectations handed to society by our fiction. That's good, because it's always been disappointing when we cross one of these future anniversaries. Above, I referred to how 2015 didn't measure up to the future depicted in Back to the Future Part II. And to me, the big symbol of disappointment was 2001, which represented incredible discovery as a movie, while the real-life year will always be associated with a shocking act of terror that showed humanity at its worst.
So I'll also look forward to no longer having it occasionally shoved in my face that the world's future isn't as good as I was promised. I'll miss the invitations to nostalgia, but in today's world, there's no shortage of that.
Saturday, August 4, 2018
Hey, You, Get On To My Cloud
I’ve noticed a problem creeping into technology: so often, we don’t know what to call things. It used to be simple: WordPerfect was a “program” or “application.” MySpace was a “website.” Angry Birds was an “app.”
But now those categories have blurred together. Facebook is a website that is also a social network, but many people access it through an app. Instagram caught people’s attention as a photography app, yet it too gives people access to a social network, and you can reach that social network through a website.
Yes, I know, this isn’t exactly rocket science. But the distinctions start to break down whenever you’re talking to people who aren’t very knowledgeable about technology. People want to classify a thing, and that’s difficult when so much of today’s technology is really an abstract service available in a number of different ways.
In their effort to understand things, people with little acquaintance with technology will often cling to one particular classification. Say they first encountered Facebook on the web: in their minds it is, and will always be, a website. But they first used Twitter as an app, so it will always be an app. They then think of the two as apples and oranges, even though the two do essentially the same thing. This sometimes leads to bizarre arguments, such as the claim that Facebook is less popular than Instagram because people prefer apps to websites.
That’s why the “cloud” analogy, annoying though it may be, is quite useful. The concept may be poorly understood by much of the mainstream population, but in today’s world that vagueness might actually help: an amorphous, loosely defined idea is a good fit for services that manifest themselves in our lives in multiple ways.