Wednesday, November 26, 2014

Out in the cold

I've attached a photo of Windows 8 updating to 8.1.  I'd assumed my convertible tablet already had 8.1 because I'd checked for updates a few different ways and the 8.1 update was never offered (until recently).  I bought this device after 8.1 was released, so it was reasonable to assume it was pre-loaded.  Before today I didn't need 8.1; I tend to stay on the desktop side of things.  Now I need it to try out a new app that requires it.

Anyway, if you can see the photo, Windows 8 has the least verbose install screen I've ever seen.  You can hover the mouse cursor over the progress bar to see a percentage; otherwise all you see is the name of what's being installed, a progress bar, and a brief description of what it's currently doing.

It's design decisions like this that make me feel I could not be more differently minded than whoever designed and approved this screen.  Why obscure useful information?  Why no download bitrate (while it was downloading)?

Is it clean?  Yes.  Would a percentage indicator clutter it up too much, or a bitrate counter?  I really don't think so.

I have to say I'm ambivalent about Google's new Material Design rules as well.  They seem to be raising contrast in their interfaces, which I think is good.  Every update to the web Gmail dashboard seemed to be an attempt to remove contrast, to make it harder to pick options and features out at a glance.  Though now that I think about it, that could have been because of their advanced email tagging system, which raises contrast between the messages themselves.

One thing I absolutely dislike about Material Design in Android 5.0 is the recent apps screen.  On previous versions the apps are neatly spaced in a vertical or horizontal column; the most recent few are on screen, but you can scroll over to the others.  In 5.0, at least in a preview I saw, you can only see the most recent app, and you have to slide the top one away to see the next one under it, "like pieces of paper on top of each other".  So instead of opening recent apps and clicking on the one I want, I have to wade through them to find the right one.  It's a small thing, but it doesn't seem more convenient to me.

By the way, Chrome for Android has had similar functionality for a while: the tabs aren't evenly spaced out, they're usually clustered to one side, and you have to dig through them to find the one you want.  That and its inability to disable the address-bar-hiding full screen mode are why I don't use Chrome for Android much anymore.


Would the good people at MasterCard please fuck off?

So let me tell you about my night.  I have a long drive coming up tomorrow, about seven hours.  I'm going to be stuck out in the boonies, in a part of the US where internet access is a line-of-sight issue, and if you can get it, it's pretty bad quality.

So here's where my desire to have DRM-free media pays off.  I've just learned that 2000 AD has DRM-free digital comics available direct from their web site.  Awesome, and they have a couple (hell, a ton) of series I keep hearing about available to purchase digitally.  So I buy two books to copy to my tablet so I can read them out in the middle of no-internet-land.  Or, I try to buy them.

My bank card is a MasterCard, and 2000 AD's payment processor (Sage Pay, I think?) participates in this ridiculous program MasterCard set up called SecureCode.  What is MasterCard SecureCode?  Why, it's a system that requires users to create yet another password to remember every time they want to use their card online!  Yes, isn't that great?

It firstly annoys me and secondly worries me that MasterCard, a company that holds a lot of my financial security in its hands, doesn't understand basic security.  From having this bank card I already have one password, my online banking password.  Then I have another password with the site I'm purchasing from.  Don't forget the password I need to log into my computer in the first place.  How many passwords would MasterCard like me to have in order to buy something online?

Pro tip:  Forcing someone to create an unnecessary password isn't more secure, it's less secure.  It's one more thing to remember, so most people will either write it down or reuse the same password they use for every other service.  Either option is insecure.  In some ways it's worse than not requiring a password at all, because now if MasterCard SecureCode is hacked, the poor slob who only has one password has handed that one password to some hacker.

I'm sure it makes MasterCard feel better, because it's essentially a CYA maneuver.  Cover Your Ass.  Unfortunately it's a shit implementation of a shit idea that doesn't help me one iota.

The first time I faced a SecureCode screen it allowed me to opt out.  I tried many times on many devices to opt out tonight, and it wouldn't give me the option again.  Googling "SecureCode opt out" turns up many articles announcing SecureCode, assuring people that of course you'll be able to opt out.  Ha!

Here's another pro tip, free of charge, MasterCard:  If you want more secure transactions, do one of two things:  1) Set up an authorization system where I can grant or revoke a particular store or service's permission to charge my card, kind of like Twitter and Google do for access to my data.  2) Get my cell number and text or email me every time my card is charged.
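To make idea (1) concrete, here's a rough sketch in Python of what per-merchant, revocable authorizations could look like.  Everything here (the class name, the merchant strings) is hypothetical, just to illustrate the model:

```python
import secrets

class CardAuthorizations:
    """Hypothetical sketch: the cardholder grants each merchant its own
    revocable token instead of sharing one extra password everywhere."""

    def __init__(self):
        self._tokens = {}  # merchant name -> active token

    def grant(self, merchant):
        # Issue a fresh random token scoped to this merchant only.
        token = secrets.token_hex(16)
        self._tokens[merchant] = token
        return token

    def revoke(self, merchant):
        # Cut off one merchant without touching anything else.
        self._tokens.pop(merchant, None)

    def authorize_charge(self, merchant, token):
        # A charge only goes through with that merchant's current token.
        return self._tokens.get(merchant) == token

card = CardAuthorizations()
token = card.grant("2000 AD")
print(card.authorize_charge("2000 AD", token))   # True
card.revoke("2000 AD")
print(card.authorize_charge("2000 AD", token))   # False
```

The point of the sketch: revoking one merchant leaks nothing about any other relationship, which is exactly what a single shared SecureCode password can't offer.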

These aren't new ideas, they're in use by a lot of large companies.  I decided not to set up a Secure Code password, and I cancelled the transaction.  I might set it up in the future, but honestly I doubt I will.

For bonus reading, here's a little article I found (pdf) called "Verified by Visa and MasterCard SecureCode: or, How Not to Design Authentication".

Also this, this, and this.

I have to say, I can't wait until MasterCard (and Visa, which has its own version I've yet to deal with) goes the way of the dinosaur, with its ridiculous processing fees and its security theater.  I mean come on, MasterCard, when you have me wishing I could use Paypal instead you know you've done something horribly, terribly wrong.


PS - I eventually purchased the comics using an American Express Serve card, which I was planning on cancelling, but it looks like I might have to keep it loaded for just such an occasion in the future.

Tuesday, November 25, 2014

A little post about snobbery:

So maybe a year ago a friend of mine told me that it isn't "proper" to put two spaces after a period anymore.  Last week my sister mentioned it, and today my brother did.  This wasn't them telling me because I do it; this was them mentioning it because using two spaces is how we were all taught.

So there are two types of fonts: proportional and monospace.  Proportional "should" have one space after ending punctuation; monospace "should" have two.  Note the quotes, because it's all really down to taste, training, and usage, and there's no punctuation jail for people who choose to do it their own way, unfortunately.  Here's a bit of dickishness about "proper" spacing some guy chose to inflict on the world; go read that if you want lessons in looking down your nose at people.  I'm not going to waste a lot of time writing about this because:

1) There's no objective right and wrong, it's all down to taste, and
2) Even if there were an objective right it wouldn't matter anyway, because the number of spaces after a period has no tangible impact on the movings and dealings of the world at large or its inhabitants.

In the article I linked above, the guy mentioned that every typographer (actually, every "modern" typographer, kind of implying that some do disagree, but they're not modern so they don't matter) knows to use one space, the same way waiters know which fork goes where.  That's an apt comparison, because neither thing matters.  They're both social signals.  Having a bunch of rules at the dinner table was a way for people of "high" breeding to signal their place in the social structure, as well as an easy way for those people to weed out the lowborns.

Do yourself a favor.  Re-read that Slate article and ask yourself, is the author helping anybody?  Do his tone and style make you think he's trying to make anybody's life easier with his advice?  Or is he just trying to make his preferred social signal the dominant one?  I mean, you might save a few minutes across your entire life by not typing that extra space, but you've blown those extra minutes in the time spent reading his article in the first place.  He actually acknowledges this:
And yet people who use two spaces are everywhere, their ugly error crossing every social boundary of class, education, and taste.
Because without social boundaries, how could we be dicks to all of our friends?  Without appearing to be dicks, I mean.  The best part of social boundaries is there's so many of them.  Is this guy dressed better than you?  Oh, and he has better manners?  Bummer.  Wait, he used two spaces after a period in a text!  Ooh, you can use that to cement your superiority over him.  Hooray for creating hoops people have to jump through to hang out with you, or to be seen as human beings in your eyes!!!

Full disclosure: I do most of my writing in a monospace font for a dozen reasons, and it's a hundred times easier to proofread (and just easier on the eyes) when I enter two spaces after a period.  However, most places where I publish writing, including this blog, remove those extra spaces as part of their formatting system.  For instance, even if you type two spaces after a period in an HTML document, it only renders one.  I'm pretty sure Blogger ignores the second space too.  I don't worry about it in either case, because it doesn't matter.
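As a quick illustration of that collapsing behavior, here's a sketch in Python of what a formatting pipeline like Blogger's might effectively do.  The real HTML whitespace rules are more involved, and this regex is just an assumption about the net effect:

```python
import re

def collapse_spaces(text):
    # Collapse any run of two or more spaces into one,
    # roughly what HTML rendering does to inter-sentence spacing.
    return re.sub(r" {2,}", " ", text)

typed = "First sentence.  Second sentence.  Third."
print(collapse_spaces(typed))
# First sentence. Second sentence. Third.
```

So whichever camp you're in, the reader of the published page usually sees one space either way.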

Let's get this straight.  When writing, here are things that matter:

  • Tenses matter
  • Proper punctuation matters
  • Proper spelling matters (only every now and then)
  • Pluralization matters

Clarity matters above all, and these other things matter because they introduce redundancy and secure the clarity of the information you're trying to communicate.  If I write an email and put a comma in the wrong place, the meaning of the entire sentence (and, by clouding the context, the entire paragraph or email) could be messed up.  If I write my friend an email reminiscing about an old camping trip but only use the future tense, he might ask me if I'm trying to invite him on another trip.  If I use one space or two after a period it affects, wait for it... absolutely nothing.

No! the Slate article says, It looks worse.  It hurts my delicate eyes to see two spaces after a period instead of one.  Also this way is right, and that way is wrong, because Argument From Authority!

Oh shit, so if I use two spaces after a period, that could end up distracting you to the point that you'll crash the plane you're flying?  It'll cost lives?  It'll make you blind in one eye, it'll break your computer screen, it'll bring God's wrath down upon me, it will have some sort of tangible effect on the world?  Nope?  It will actually have no effect at all except for annoying you?

Sounds like a great reason to keep following ending punctuation with two spaces to me!



PS - You know what?  He mentioned social boundaries, I'm honor-bound to quote Cloud Atlas:
I understand now that boundaries between noise and sound are conventions. All boundaries are conventions, waiting to be transcended. One may transcend any convention if only one can first conceive of doing so.

Monday, November 17, 2014

Day One Patches

And while I'm thinking about the games industry, what's with all the complaining about day one patches? Far Cry 4 has a patch waiting to download as soon as it's unlocked.  It'd be cool if they downloaded the patch with the pre-load data, but I know it's not likely.

Day one DLC sucks, because it means they left content out of the game to try to get more money, a few more dollars on top of the $60 they're already charging you.  It just feels gross.  Why not give us a complete game?  Expansions or episodes like the ones Bioshock Infinite gave us make more sense.  The two episodes back in Rapture wouldn't really have fit in the game; they're their own stories.  And really, if they had released those episodes day one I wouldn't have minded, because they're not part of the finished game, they're their own thing.  Day one DLC with extra characters, missions, guns, etc., is content that was designed to fit in the main game but that studios chose to leave out so they could charge for the extras later.

Bitching about day one patches, though?  Are you fucking kidding me?  Are people really upset that they're fixing bugs this quickly?  Yes, the days of the NES and PS1 are over, when a game had to be completely bug-free because there were no updates.  Is it an inconvenience?  Yes.  But on PC at least I don't blame a studio for not testing every possible configuration.  It would be impossible.  People are going to report issues, and Ubisoft is actually so on top of it that many of the issues were fixed by the time the game came out.  And let's not talk about all the games and shitty ports that never saw a single patch at all.  You'd think we might be a little more appreciative when someone is jumping on it.

A brief aside: I don't want to come off like an Ubisoft fanboy.  I thought it was really cool when they released Rayman Origins DRM-free, but apparently it was just a phase, because Rayman Legends had DRM.  I don't really like Uplay, but you only need to log into it to activate games and then you can leave it in offline mode, so it's not that big of a deal.  I did really like Far Cry 3, though, and I'm really looking forward to Far Cry 4.  It's supposed to be more of the same, which sounds good to me.  I do hope the story is a bit better; I see what they were trying to do with Far Cry 3, but I think the game got away from them.

Anyway, I know the frustration stems from wanting to play a game and having to wait for patches, and I know this is an even bigger deal because it affects the biggest fans, the ones going out on a limb and buying the game before they get any word of mouth from friends.  But do me a favor and look at the Saints Row 2 PC port.  Look at Deadly Premonition.  Be glad that Ubisoft is on top of it.  It's annoying, but it sure as hell beats the alternative.


Midnight Local Release Times

So because I'm not the type to pre-order games, there's a practice I've been unaware of in the gaming world: local midnight release times.  That means if you live on the east coast your game is unlocked and playable a few hours before the west coast.  That means if you live in New Zealand your game is unlocked something like 18 hours before the US.  The game I've just pre-ordered, for anyone curious, is Far Cry 4.

Let's not pretend release dates for any corporate media have ever been consistently reasonable; album releases still get the short end of the stick, with releases months apart from country to country and even different track listings between them.  Rock Paper Shotgun loves to say that there are no oceans on the internet, and there should be no release date delays.  And look, a globally consistent release date is far more fair than a Tuesday release in one country and a Friday release in another.  But it still rubs me the wrong way.

I'm in US Central time.  In a few hours people I know on the east coast will be playing Far Cry 4, and I'll be waiting an hour.  In the grand scheme of things it's not a big deal, and I'm not angry, just a little flummoxed.  It just seems weird to me.  I don't fully get it.

What are they trying to achieve here?  Is it supposed to be more fair that people somewhere are getting it 18 hours in advance?  Is it a tactic to save their activation servers from getting hammered (if so, that's actually a good strategy)?  Is it because executives in corporations have brain disorders where they jump to the most convoluted of a series of options?  (I only ask that one because I've had conversations with higher-ups at many corporations, and their requirements for a good idea versus a bad idea are incredibly foreign to me.  I'm serious: when talking to executives from different companies at different times, each time I got the strange sense they were from other planets.  Like they all took logic courses from the same terrible teacher.  Honestly, I think the corporate structure brings its own challenges and rituals, and they tune their minds to that system rather than the actual nuts and bolts of what their business does.  That's the only explanation I can think of.  And really, sometimes the difference between something like a good logo and a bad logo is intangible, indescribable, so perhaps they forgo their personal taste for a choice they can properly defend, even if it isn't the best option.  I don't know.  What a strange rant.)

Anyway, it's better than properly staggered releases, and perhaps it does help keep their network from collapsing.  It's not the worst idea I've ever heard of, but I'd rather have a global release date and time set by GMT, not local time, so everybody has the same opportunity.
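A single GMT-set unlock moment is easy to reason about.  Here's a small Python sketch, with a made-up unlock time and hard-coded offsets (DST ignored), showing how one UTC instant maps to everyone's local clock:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical global unlock: midnight UTC on the release day.
unlock_utc = datetime(2014, 11, 18, 0, 0, tzinfo=timezone.utc)

# Fixed offsets for illustration only (real zones shift with DST).
zones = {
    "New Zealand (UTC+13)": timezone(timedelta(hours=13)),
    "US Eastern (UTC-5)": timezone(timedelta(hours=-5)),
    "US Central (UTC-6)": timezone(timedelta(hours=-6)),
}

for name, tz in zones.items():
    local = unlock_utc.astimezone(tz)
    print(f"{name}: {local:%b %d, %H:%M}")
# New Zealand (UTC+13): Nov 18, 13:00
# US Eastern (UTC-5): Nov 17, 19:00
# US Central (UTC-6): Nov 17, 18:00
```

Everyone gets the game at the same instant; only the clock reading differs, which is the fairness point.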

Though now that I think about it, if you're going to do a global time, why not 8pm?  Why midnight, when most people are in bed or about to go to bed?  Why not announce the 18th as the release date but unlock it at 8pm on the 17th?  (And in the week before, make that distinction clear: let people know when they can actually play it.)

I don't know, I've only just been made aware of this type of release, perhaps I need to think about it some more.

Til then,


Thursday, November 13, 2014

What is really dead?

(Wow, in a bit of design brilliance, entering a title in Blogger and then hitting enter twice publishes the post.  A blank post, with only a title, which is why some of you may get some RSS silliness over this.)

Okay, let's keep this short and sweet.  Can we stop declaring the death of things?  I mean concepts, I mean media.  People are still using vinyl.  You can still order a telegram delivery.  Cars did not kill horses, digital will not kill books, video did not kill radio, etc., etc.  I can't tell you how many times in the last few years I've read an article about how blogging is dead written on a blog.  Tumblr is a blogging platform.  So is Facebook, to some degree.  Sites like TechCrunch and HuffPost are debatable, but the line has been blurred; they're certainly blog-like.  Twitter is a microblogging platform (among other things).  YouTube is a videoblogging platform (among other things).

Nothing dies.  I guarantee you someone right now is engraving writing on a stone tablet either to be cute or to create something designed to outlast paper and silicon.

The proper response when someone cites the death of music, or of movies, or of platformer games, or whatever, is to roll your eyes and blow a raspberry in their general direction.

It can be a matter of personal taste or historical analysis if you want to declare the end of an era, a zeitgeist transferal, a demographic shift, a disruption, etc.  That's fine, and change is constant.  But seeing as calligraphy is still widely practiced, live music is still considered a treat, and DOS and the Amiga and even the NES can easily be emulated on our phones, let's stop pretending that any platform or format really dies.