Understanding extend() and object inheritance in Backbone

Asking how best to do object inheritance in JavaScript is a bit like asking how to raise your kids: everyone has a different system, but as long as they live to adulthood, mission accomplished. When you're working across several different libraries, plugins, and code styles, though, doing inheritance right can quickly become confusing.

Let's start with a simple example using Underscore's (or jQuery's) extend() method. We'll pretend that we're writing an image Gallery module, and that we can set a timeout value to determine how long an image stays visible:
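The original example didn't survive here, so what follows is a hypothetical reconstruction (the `Gallery` shape and property names are illustrative). `Object.assign` stands in for Underscore's `_.extend` / jQuery's `$.extend`, which behave the same way for this purpose:

```javascript
// A hypothetical Gallery module built with object-literal "extension".
// Object.assign stands in for _.extend / $.extend here.
var Gallery = {
  timeout: 4000,
  setTimeout: function (ms) {
    this.timeout = ms;
  }
};

// Each call merges properties into a brand-new object.
var mainGallery = Object.assign({}, Gallery);
var sidebarGallery = Object.assign({}, Gallery, { timeout: 2000 });

mainGallery.setTimeout(6000);

console.log(mainGallery.timeout);    // 6000
console.log(sidebarGallery.timeout); // 2000
console.log(Gallery.timeout);        // 4000 -- three independent copies
```

Changing one gallery's timeout never affects the others, because nothing is shared: every "extension" produced a full copy.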

This is usually known as object-literal inheritance, or, as I like to call it, the "hash-smash" pattern. But it isn't technically inheritance at all. Each call to extend() simply merges the properties of the two objects and returns a new object. So in reality, we've got three copies of Gallery floating around in memory.

Contrast this with the more traditional approach of using object prototypes and constructors:
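Again reconstructing the missing example, here's a sketch of the same Gallery as a constructor with a shared prototype:

```javascript
// The same Gallery, this time as a constructor with a shared prototype.
function Gallery() {
  this.timeout = 4000; // unique to each instance
}

// One function object, shared by every instance via the prototype.
Gallery.prototype.setTimeout = function (ms) {
  this.timeout = ms;
};

var main = new Gallery();
var sidebar = new Gallery();
main.setTimeout(6000);

console.log(main.timeout);    // 6000
console.log(sidebar.timeout); // 4000 -- unaffected
console.log(main.setTimeout === sidebar.setTimeout); // true -- shared
```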

Notice that because we've assigned setTimeout() to the object prototype, that property is shared between all of its children. The timeout property, however, is unique to each instance object (because we hung it off of 'this' in our constructor function).

Okay, now let's look at another version of an extend() method. This time from Backbone.js:
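In a real app this is just `var Gallery = Backbone.Model.extend({...})`; since the original snippet is missing, here's a hypothetical reconstruction with a tiny `Model` stand-in (so it runs without Backbone loaded), mimicking just enough of the real behavior:

```javascript
// Stand-in for Backbone.Model so this snippet runs standalone;
// in a real app you'd write: var Gallery = Backbone.Model.extend({...});
function Model() {
  if (this.initialize) this.initialize.apply(this, arguments);
}
Model.extend = function (protoProps) {
  var parent = this;
  var child = function () { return parent.apply(this, arguments); };
  child.prototype = Object.create(parent.prototype);
  Object.assign(child.prototype, protoProps);
  return child;
};

var Gallery = Model.extend({
  timeout: 4000,
  setTimeout: function (ms) {
    this.timeout = ms;
  }
});

var main = new Gallery();
var sidebar = new Gallery();
main.setTimeout(6000);

console.log(main.timeout);    // 6000
console.log(sidebar.timeout); // 4000 -- looks just like the prototype version
```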

Looking at this, it would seem to do exactly the same thing as the plain-prototype example above. We're using 'new' to get an instance object that has its own timeout property, while setTimeout() stays attached to the prototype. At the same time, the syntax is closer to the Underscore extend() we used earlier, where we pass in an object literal that extends Backbone's base Model. It seems like we're getting the benefits of prototypes with the simplicity of the object-literal hash-smash.

But in reality, there's a pretty serious defect in the above code. Those tests are simply providing a false positive. Let's look at a more complex example where it's easier to see what we're doing wrong. Here, we're adding another property, an object to describe our gallery's data source:
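The original example is missing, so this is a hypothetical reconstruction (the dataSource shape and URLs are made up), again using a minimal stand-in for Backbone.Model so it runs standalone:

```javascript
// Stand-in for Backbone.Model so this snippet runs standalone.
function Model() {}
Model.extend = function (protoProps) {
  var parent = this;
  var child = function () { return parent.apply(this, arguments); };
  child.prototype = Object.create(parent.prototype);
  Object.assign(child.prototype, protoProps);
  return child;
};

var Gallery = Model.extend({
  timeout: 4000,
  dataSource: { url: null, limit: 10 }, // hypothetical shape
  setTimeout: function (ms) { this.timeout = ms; },
  setUrl: function (url) { this.dataSource.url = url; }
});

var main = new Gallery();
var sidebar = new Gallery();

main.setTimeout(6000);
main.setUrl('/images/main');
sidebar.setUrl('/images/sidebar');

console.log(main.timeout);        // 6000 -- still fine
console.log(main.dataSource.url); // '/images/sidebar' -- clobbered!
```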

Holy failing tests! What's going on? It looks like the timeout property is still being set correctly, but when we try to set the URL on one instance, it's overwritten by the other. Let's look at Backbone's extend() method directly to see how it works:
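Here is Backbone's extend, lightly adapted: the Underscore helpers (`_.extend`, `_.has`) are swapped for their standard-library equivalents so the sketch runs without Underscore; see backbone.js itself for the canonical source.

```javascript
// Backbone's extend, lightly adapted from the original source.
var extend = function (protoProps, staticProps) {
  var parent = this;
  var child;

  // Use the constructor from protoProps if one was supplied,
  // otherwise default to calling the parent's constructor.
  if (protoProps && Object.prototype.hasOwnProperty.call(protoProps, 'constructor')) {
    child = protoProps.constructor;
  } else {
    child = function () { return parent.apply(this, arguments); };
  }

  // Copy static properties (the parent's, then any passed in).
  Object.assign(child, parent, staticProps);

  // The Surrogate sits in the chain between parent.prototype and the instances.
  var Surrogate = function () { this.constructor = child; };
  Surrogate.prototype = parent.prototype;
  child.prototype = new Surrogate();

  // Everything passed to extend() lands on the prototype, not the instance.
  if (protoProps) Object.assign(child.prototype, protoProps);

  child.__super__ = parent.prototype;
  return child;
};
```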

Indeed, we can see that Backbone's extend() works quite differently from its Underscore and jQuery counterparts. Much like a constructor function, it sets up the prototype chain, but it returns a child constructor rather than an instance. And whereas a plain constructor is limited to assigning a prototype, Backbone's extend() actually creates a Surrogate object that sits in the prototype chain between the Model object and the instance objects. All the properties we pass into extend() are added to that prototype (extend() also takes an optional second argument for static properties).

So why did our tests fail? Why are we able to set the timeout value on the instance objects, but not the dataSource? The problem lies in how we are (or rather, aren't) declaring these properties on the instance object itself. Timeout holds a simple scalar value, and when we assign it through 'this', we implicitly declare a new timeout property on the instance object, which shadows the prototype's property.

When it comes to the dataSource property, however, we're using 'this' to access the dataSource object on the prototype, not to set it, and then mutating that one shared object in place.
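The assignment-versus-mutation distinction, stripped to its essentials (a small sketch, independent of Backbone):

```javascript
// Two objects sharing one prototype.
var proto = { timeout: 4000, dataSource: { url: null } };
var a = Object.create(proto);
var b = Object.create(proto);

a.timeout = 2000;              // assignment: a new own property shadows the prototype
a.dataSource.url = '/images';  // lookup-then-mutate: changes the one shared object

console.log(b.timeout);        // 4000 -- b never sees a's shadow
console.log(b.dataSource.url); // '/images' -- b sees the shared mutation
```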

Is this a design flaw in Backbone? Not at all. We simply need to be careful about where we're assigning properties. Here's a final example, where we're using Backbone's initialize() to properly declare and assign instance properties:
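Since the final example didn't survive, here's a hypothetical reconstruction. In a real app, Backbone.Model calls initialize() for you on construction; the stand-in Model below mimics that so the snippet runs standalone:

```javascript
// Stand-in for Backbone.Model; the real one also calls initialize() for you.
function Model() {
  if (this.initialize) this.initialize.apply(this, arguments);
}
Model.extend = function (protoProps) {
  var parent = this;
  var child = function () { return parent.apply(this, arguments); };
  child.prototype = Object.create(parent.prototype);
  Object.assign(child.prototype, protoProps);
  return child;
};

var Gallery = Model.extend({
  initialize: function () {
    // Assigned via `this` at construction time, these become own
    // properties of each instance -- nothing mutable is shared.
    this.timeout = 4000;
    this.dataSource = { url: null, limit: 10 };
  },
  setTimeout: function (ms) { this.timeout = ms; },
  setUrl: function (url) { this.dataSource.url = url; }
});

var main = new Gallery();
var sidebar = new Gallery();
main.setUrl('/images/main');
sidebar.setUrl('/images/sidebar');

console.log(main.dataSource.url);    // '/images/main' -- each has its own
console.log(sidebar.dataSource.url); // '/images/sidebar'
```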

Until we can all standardize on ECMAScript 6, it's likely we'll continue to see different methods for extending objects and setting up prototypes.

Using Games to Model Complex Thinking

Ian Bogost coined the term "procedural rhetoric" to describe the use of rule-based logic to further rhetorical arguments. Or, as he puts it, "the practice of using processes persuasively."

My big takeaway from this is how Bogost views games as a vector to expose the complexity of social problems that are difficult to express in words or pictures. This should be obvious to any gamer who's picked up Sim City or Dwarf Fortress. Applying it to more topical issues, I wonder what would happen if we gave this generation of school children the Healthcare Cost Simulator just as our generation got Oregon Trail.


The Itch of Disconnect

Dave Pell writes:

Technology used to be a way to solve life’s little problems. Now, technology is used to solve the little problems caused by technology. On some level, we know that doesn’t make sense, but we don’t have an app to convince us. Where’s the computer algorithm to prove that the quiet walk without the phone calls is the balance?

I would describe the feeling of being disconnected as a constant itch. It's muscle memory unable to fulfill the gestures in its programming. Where getting off a plane once involved a dozen lit cigarettes, it now involves the glow of a hundred phone screens lighting up.

Successfully disconnecting takes practice and discipline. Like an ascetic monk, you must suppress your natural urges. Learn to recognize the urges in your fingers or your brain for another quick hit of e-dopamine, and tell them to shut up for a few seconds while you unwind.


Password "hell" isn't your fault

Peter Cohen recently wrote over at iMore about the huge potential in the new OS X Mavericks for fixing the "password hell" that the typical user finds themselves in today. This hell is typically characterized by two problems:

  • Your password is insecure because it's too easy to crack, too widely reused across your many, many internet accounts, or both.
  • The usual solution is to create a complex password for each account, either randomly generated or produced by some kind of algorithmic mnemonic that yields something both unique and cryptographically secure. This shifts the burden of complexity onto the end user.

The problem with putting the responsibility on the user is that remembering and storing these passwords becomes incredibly cumbersome, carrying the risk that they'll either give up on such a system or adopt a less-than-secure practice for storing the passwords. And that's why iCloud Keychain seems so appealing.

iCloud Keychain promises to behave like many other applications, such as 1Password and LastPass, but with one key difference: you have no say in how your passwords get stored. With a tool like 1Password, the master password file can be stored locally on a computer or in the cloud; it's up to you to decide where that information lives.

Every time I get an email from a large company notifying me of a data breach, I lose more faith in allowing corporations like Google or Apple to take control of my internet credentials. These companies have enormous targets painted on them and are a much richer source of wealth to hackers than the PC sitting in my basement.

To consumers who are paying attention, it should be obvious by now that companies don't always follow security best practices: forgoing encryption of user data, relying on easily compromised hashing methods, failing to salt data, or simply storing passwords in plaintext. Despite news of major data leakage at increasingly regular intervals, these companies keep pushing cloud-based solutions on us without assurances of basic, common-sense security practices.

Emails about compromised servers only elicit a groan from me anymore. We treat them as a modern inconvenience instead of a major breach of trust. As long as companies insist on routinely capturing our details by asking us to "sign up", people will continue to supply insecure passwords. And that shifts the responsibility onto companies to protect not just our account with them, but all the accounts that exist under those credentials.

Companies want control over our data, and we want more convenient ways to identify ourselves online. But until we start fixing the trust equation over how our personal data is protected and stored at both ends, "password hell" is going to persist indefinitely.


Defend and Attack: Flat Design

Flat design is a new favorite punching bag on Twitter, and I've been hearing quite a lot of criticism (and by criticism, I mean flippant remarks masking a seething hatred of questionable proportions). 

I should state up front that I join the ranks of being unimpressed by many of the flat designs I see. That is no reason, however, to ignore the trend or fail to understand why it exists.

The goal of flat design is straightforward. The history of HCI design has been a long, slow march to transform the experience of using toggle switches into a more palatable form. But the experience of using a computer is flat to begin with: it has almost always taken the shape of a human sitting stationary in front of a flat screen, unmoving and fixed in place. Designers have worked hard to provide us with an evolving virtual reality of layers, depth, and skeuomorphisms (gasp!) to ease the burden of being hooked up to a machine all day.

Desktop as a literal desktop. Xerox Star, 1981.

Flat design is an approach from the other end of things. It makes the assumption that you are using your computer and existing in the "real world" at the same time. When machines are no longer constrained to a desk, some of the desire to be cloaked in the blanket of a windowing system goes away. New computers are smaller, aware of our 3D space and less reliant on hardware abstractions (like a pointing device). Mobile computer use is simply less isolating and emphasizes speed of action over completeness of metaphor.

The promise of flat design is that it requires less physical and mental acuity for operation. Of course, in practice, I find the opposite to be true. Lack of visual hierarchy and an unclear interaction model are the biggest problems. Understanding what the design asks of the user, and how the user should expect the system to react are the two fundamentals that seem to be missing in any flat design I've used.

Where do I click first? Windows 8, 2012.

Additionally, in newer Google and Microsoft designs, I've seen a decidedly anti-contrast philosophy, which is mystifying at best. In Windows 8 Style (formerly Metro), we see an unfortunate partitioning of information into uniform boxes (screens rarely need more boxes constraining them), all of it embracing the idea that the entire surface of the screen should be filled with information, every piece of it important and equally urgent. This fractures any ability to focus, and it's only made worse by the complete lack of restraint governing animations and moving parts.

I don't think flat design is all bad; in fact, I tend to appreciate its borderless (and canvas-less) nature, and the treatment given to typography. As a design philosophy, however, it fails to achieve much, and in many cases takes us too far backward.

If I were to speculate on flat design's popularity among companies and its hatred among the design rabble, I think the reason is pretty evident: flat design looks fantastic on flat material. It's extremely compelling in static images, mock-ups, and advertising banners. In controlled demo environments it appears smooth and easy to use. It sells itself really well. The ultimate problem is that it breaks apart under real-world use, where OS software isn't custom-tailored to the precise tasks outlined in the demo and needs to interoperate nicely with the things people actually need to do with it.

God forbid anyone has to use this. The Island, 2008.

In other words, flat design seems like the brain-child of someone who has watched too many Hollywood interpretations of computer systems.