
Welcome : About this site

* Audi A2: the user cannot open the bonnet
* Bench designed to prevent lying down: 'redesigned to face contemporary urban realities'
* Some HP printers shut down the cartridges at a pre-determined date regardless of whether they are empty

Increasingly, many products are being designed with features that intentionally restrict the way the user can behave, or enforce certain modes of behaviour. The same intentions are also evident in the design of many systems and environments.

This site aims, with readers’ input, to examine and analyse the ideas and techniques of these architectures of control in design, through examples and anecdotes, and by keeping up to date with relevant developments. If you can suggest an example, please get in touch, or add a comment; all help is much appreciated.

The intentions may be purely commercial, socially or environmentally beneficial, or a mixture, but the implications of such architectures of control in the years ahead are likely to become significant across many fields–e.g. the operation of markets, innovation growth, freedom of individual action and consumer engagement with technology.

My starting-point is the research I did for my Master’s dissertation, Architectures of Control in Consumer Product Design [750k PDF], which, although limited in detail, has a fairly wide scope. This has been split into a series of pages; see the links in the sidebar. Since starting the blog in November 2005, many more examples have come to light, and with comments and suggestions from readers, it’s grown into what’s hopefully a useful resource for designers and others interested in the relationship between technology and society.

Best wishes
Dan Lockton

Datchet, Windsor, England
November 2005 (updated October 2006)


  1. I’m a lot less clued up on this than you Dan so I doubt I could tell you anything about this that you don’t already know.

    My day job is with American Express and, as you can imagine, we work under Smart Filter. Now this alone isn’t news; it’s been around for a while. I never gave it much thought before: so what if I can’t look at porn at work? No thanks, I get enough of that at home.

    However, Smart Filter is getting increasingly pervasive. It’s not just for porn anymore: it blocks out all kinds of stuff that I want to read. Now I’ve got broadband at home that isn’t so choosy, but it’s still worrying. There are, after all, millions of people who live under Smart Filter. Is this a path we should be on? We’re not just on it, we’re barreling down it at increasing speed.

    The mountain biker in me says hit the brakes before we go over the handlebars.

  2. Dan

    “The mountain biker in me says hit the brakes before we go over the handlebars.”

    That’s a good way of putting it – I guess the rejoinder would be that in many cases, it’s no longer the rider who operates the brakes.

    Someone else, whether it’s a company that sells software, hardware or content, or a government that believes it can use the law to control how technology operates, or a government being influenced by the company that sells the stuff, has his or her hands on your brakes, and will decide whether or not (and when) you can use them. And if you cut the cable and fit your own, they’ll know.

  3. The biggest problem is that there is a prevalent assumption among the public that this isn’t going to be a problem. They just can’t seem to wrap their heads around it. It’s yet another obscure science/technology thingy they don’t understand that they have to pay attention to.

    I’ve discussed this with a great many people and this keeps coming up like a brick wall.

    I perceive this to be part of a larger problem. It’s becoming increasingly complicated to be an informed citizen in a democracy nowadays. The public is overloaded with science and technology news, much of it with political implications. Large segments of the public are overwhelmed by it and tune out whenever the subject comes up. This is a situation that is only going to get worse as time goes on.

    Let’s hope grassroots internet activism can save the day. I personally am very confident that we’re standing on the threshold of a new politics. But I’m Canadian and we’re a maddeningly optimistic lot.

    I’d like to interview you at some point Dan. You’re a good speaker and I’d like to pick away at your brains for about five minutes on tape.

  4. Dan

    “It’s yet another obscure science/technology thingy they don’t understand that they have to pay attention to.
    … The public is overloaded with science and technology news, much of it with political implications. Large segments of the public are overwhelmed by it and tune out whenever the subject comes up.”

    I know plenty of people like that too. It isn’t easy to articulate even the issues of intellectual property reform without getting into arguments at cross-purposes about what property is, and so on.

    But on the ‘architectures of control’ issue, there are at least some ‘scare story’ headlines that might grab the public’s attention. People want – and expect – to be able to record TV programmes and watch them whenever they want. They want – and expect – to be able to share music with friends. They don’t want to have control of their cars taken out of their hands by roadside speed control beacons.

    The problem I’ve come up against time and time again is the old classic “If you’re not doing anything wrong, you have nothing to fear”. There are some pretty convincing Socratic irony-type ways of showing otherwise uninformed people that they really would be better off not spouting that as a mantra whenever issues such as DRM, ID cards, surveillance, etc., are raised. I’m working on a post investigating this but am still gathering examples.

  5. Dan

    On the issue of public understanding of science & technology, there’s bound to be debate as to how good or bad it is, and how much of a problem it is even if it’s bad.

    My own view is that people would generally be happier if they felt they had a better understanding of science & technology issues that affect their lives, but this might not always be the case. For example, if you live near a nuclear power station, you might be happier having only a limited understanding of everything that goes on inside it; a little education may scare you more than none at all, and a proper education would probably scare you less, but leave you with an appreciation for the complexity of the issues.

    The lack of scientific and technological expertise among our elected politicians explains a lot, I think, because it impacts on how science & technology issues are presented in education, budgets, and so on.

  6. Based on the feedback on your Speakers’ Corner rant, er, speech, the best point you make in terms of switching on light bulbs in people’s heads is the one about non-transferable music on Japanese mobiles. A lot of people whose eyes glaze over at the DRM geekspeak perk up at that point.

    That’s one particular nail you may want to hammer again.

  7. From the article:

    “The intentions may be purely commercial, socially or environmentally beneficial, or a mixture, but the implications of such architectures of control in the years ahead are likely to become significant across many fields–e.g. the operation of markets, innovation growth, freedom of individual action and consumer engagement with technology.”

    I’d argue that there are at least five motivations, which apply individually and in varied combinations in different instances:

    * Funnel money to the “architect”. Directly (enforcing a “sell the blades” business model is a popular way — locking out competing ink cartridges, game cartridges, game discs, razor blades…) or indirectly (enforced ad-watching, enforced extraction of personal information that can be used for marketing or sold to unscrupulous data brokers like ChoicePoint…).

    Examples: HP (printer cartridges); Sega (Genesis console cartridges — there was a trademark lawsuit over a Sega console’s architecture of control, as I recall)…

    * Social engineering for sincerely-believed benevolent reasons. (I think all such social engineering is bad in its effects, so there are only two kinds of social engineering: “misguided” and “malicious”. Flexibility is a strong survival advantage, for people and cultures and species; and thus, so is freedom. Social engineering by its nature involves curtailing freedom beyond the minimum necessary to prevent violent chaos.)

    Examples: Numerous filesharing apps have deliberately quirky or crippled user interfaces, generally in an attempt to maximize upload-to-download ratios: making it hard to select many files to download at once, or limiting the app to only a few transfers at a time even on a machine with broadband and the capacity to handle many simultaneous network connections. They neglect to realize that, network-wide, there must obviously be equally many uploads and downloads no matter what, so curtailing downloading beyond the limits that the limited availability of uploaders already imposes is pointless, and in fact degrades the network by keeping it below its theoretical maximum utilisation. They should instead encourage users to share more files, or to function as hubs, supernodes, or whatever they are calling them this week. Making the former less traceable and the latter less performance-degrading would be a start.

    Specific examples: Shareaza has purposely hacked a standard Windows list-box control to make shift-clicking in search-result lists behave like control-clicking. And ONLY in search-result lists. (Other versions may vary.) More intelligently, torrent and eDonkey clients enforce sharing the already-received portion of a file you are downloading, and throttle downloading if you upload too little. (The eDonkey clients do more, and less benign, social engineering as well. Both could stand to enforce upload ratios only to the extent warranted by demand, so that if few people want the file you aren’t penalized for that!)
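    The demand-aware ratio enforcement suggested above can be sketched in a few lines. This is a hypothetical illustration, not code from any real client; all names and the scaling choices are my own assumptions.

```python
# Hypothetical sketch of ratio-based throttling: a peer's permitted
# download rate depends on its upload/download ratio, but the requirement
# relaxes when there is little demand for what the peer is sharing, so
# peers with unpopular files aren't penalized for that.

def allowed_download_rate(uploaded_bytes, downloaded_bytes,
                          base_rate, demand_factor):
    """Return a permitted download rate in bytes/sec.

    demand_factor: 0.0 (nobody wants this peer's files) to 1.0 (high demand).
    """
    if downloaded_bytes == 0:
        return base_rate  # new peers get full speed to bootstrap
    ratio = uploaded_bytes / downloaded_bytes
    # The required ratio scales with demand: high demand expects ~1.0,
    # no demand expects nothing.
    required = demand_factor * 1.0
    if required == 0 or ratio >= required:
        return base_rate
    # Throttle proportionally to the shortfall, with a floor so nobody
    # is cut off entirely.
    return base_rate * max(0.1, ratio / required)

# A peer that uploaded half of what it downloaded, under full demand,
# gets half the base rate:
print(allowed_download_rate(50_000_000, 100_000_000, 1_000_000, 1.0))
```

    The point of the `demand_factor` term is exactly the comment's complaint: a fixed ratio requirement punishes sharers of unpopular files, while a demand-scaled one only throttles free-riding where uploading would actually help the network.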

    * Social engineering with ulterior motives. I have no concrete example at this time, only hypotheticals, but it’s certainly possible, and in fact likely to be done here and there.

    Hypothetical example: a Web site for American political discussions may make it hard to indicate a preference for a candidate or party other than Republican or Democratic. Or, more generally, a political site may make it hard to indicate a neutral position on an issue. Consider a panel of pairs of radio buttons for pro/anti abortion, pro/anti nuclear power, and so forth. There is no way to suggest “yes, with reservations” or “no, except under dire circumstances” or whatever. Mark anything and you cannot unmark it (without starting over, anyway), and you are forced to indicate one or the other extreme position. Worse, it might simply have you pick “left” or “right”, and then you must claim to be either for both abortion and nuclear power or against both!

    More concrete example: Perhaps the infamous butterfly ballot in the 2000 US Presidential election in Florida was not actually badly designed, but architected perfectly to suit the designer’s intentions after all? (There’s probably a Pulitzer in it for anyone with proof.)
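    The forced-binary form above can be made concrete with a tiny sketch (entirely hypothetical; it models no real site): if the schema's answer type simply has no neutral or qualified value, no amount of user intent can record one.

```python
# Hypothetical sketch: a survey whose data model embodies an
# 'architecture of control' by making the two extreme positions the
# only representable answers. All names are illustrative.

ALLOWED_ANSWERS = {"pro", "anti"}  # no "neutral", no "yes, with reservations"

def record_answer(responses, issue, answer):
    """Store an answer; anything outside the two extremes is rejected."""
    if answer not in ALLOWED_ANSWERS:
        raise ValueError(f"{answer!r} is not an option the form offers")
    responses[issue] = answer
    return responses

responses = {}
record_answer(responses, "abortion", "pro")  # accepted
try:
    record_answer(responses, "nuclear power", "undecided")
except ValueError as err:
    print(err)  # the architecture simply cannot express a middle position
```

    The control here is invisible at the UI layer: the radio buttons look like a neutral instrument, but the set of representable answers was chosen before the user ever arrived.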

    * Pushing an environmental agenda.

    Example (somewhat hypothetical): a hybrid car that makes it difficult to manually override the computer and force it to run the engine, which you might want to do for added horsepower to tow something or climb a steep hill if it fails to switch over automatically.

    * Political. Some of the malign social engineering examples above double as political, but other possibilities exist as well. For example, making voting for a write-in candidate difficult benefits both major parties in any jurisdiction. Most voting machines require hoop-jumping to do this, if it works at all. Voting machines also discourage voter-verifying the paper trail (often by simply failing to provide one) and generally discourage election procedure transparency by merely being used at all. This can serve a specific political aim (rig one election) and more general political aims (keep citizens uninvolved and passive; discourage inquiry; discourage accountability; encourage apathy and a jaded acceptance of being treated like a mushroom, i.e. kept in the dark and fed … well, you know).

    Specific example: Linux distros (packaged in some form, such as install discs or their ISO images) often deliberately omit popular freeware that isn’t open source, or even software that is but whose license espouses the “wrong” philosophy, depending. This is a relatively benign example, but deliberately making access easier to software that comes from politically like-minded folks is definitely a political architecture of control. And if you don’t believe me, look again — almost nothing is as heatedly politicised internally as the free software movement!

    In a similar vein, Microsoft (and, via MS coercion/bribery/blackmail, OEMs) makes it easy to get a computer with Windows on it and hard to get one with Linux on it, as well as likewise encouraging use of some other software over alternatives. Internet Exploder is the most consistent of these. Computers these days come with trialware (and often spyware!) that people are likely to ignore or even use, which serves marketing (i.e. money-making) purposes as well as political ones (anti-open-source; pro-apathetic consumers accepting of being told what to consume, and in what quantity, by whoever’s selling it).

    Most architectures of political control, you’ll notice, also effect more general social engineering (e.g. Linux distros) or else money-making (e.g. Microsoft and OEM-preinstalled software) aims in addition to the strictly political ones.

    Ultimately, though, you can sum up the free-software tug-of-war over political control this way: it’s easiest to get a Windows computer and use it as such, and next easiest to get a MacOS one and use it as such. Commercial interests and an anti-free-software political agenda.

    Next easiest is a Linux computer, where the large barrier of having to install and configure an operating system yourself must be leapt. Also, it’s likely you don’t actually save any money upfront, because you probably end up buying a Windows box and wiping it to install Linux. Microsoft exacts its tax even if you won’t use the copy of Windows you’re supposedly paying for. (And no, you can’t decline to activate Windows, return the disc, if you even got one, and demand your money back. Not and actually get it, anyway. Often you can’t avoid activating Windows at all; it’s often pre-activated by the computer vendor!) If you want to avoid the MS tax you generally have to either build the PC yourself (a much bigger barrier to entry than install-the-OS-yourself!) or at least get it from an obscure, hard-to-locate store that probably requires traveling to an out-of-the-way, fairly distant location. At current prices, you may spend more than getting a Windows PC from the local Best Buy — it’s just that instead of Microsoft getting the “tax” money, Exxon-Mobil does. (Or Shell, or someone…)

    Finally, it’s harder to have a PC set up with the mix of free software you want than with an all-GNU or all-BSD or some such all-one-license-type scheme; or a Windows + Firefox PC vs. a Windows + IE PC; or a Windows-without-spyware PC, for that matter. (The barrier is low here, though: it’s easy to find and uninstall much of the crudware on a new PC, and to find and install Firefox, Ad-Aware, and Spybot S&D, and run the latter two once each. Well, unless you are on dial-up or don’t even have Internet access, in which case you’re probably stuck with IE, and Linux is also not likely in the offing: it’s hard to find install discs without the right connections, almost as hard as it is to download 600+MB files (plural) over dial-up successfully and without p@*!ing off your Internet provider.)

  8. Dan

    Not sure – I think it’s just ones from your IP being dropped by SpamKarma; I’m out of the office at present but will look into it when I can. Sorry for any inconvenience.

  9. Dan

    Thanks for the above (now recovered!) comment – fantastic wide-ranging analysis of the control motivations (and implementations, real and possible) in many different areas.

    Your assessment of the “barriers to freedom” inherent in new PCs is especially thorough and you’re exactly right: the problem is, of course, that Mr/s Average will put up with that dreadful, rapidly expiring McAfee subscription, and continue using IE, etc, in ignorance, probably as part of some vast botnet, rather than risk “messing around” with anything.

    As with so many issues in this area, technical ignorance, or poor technological (and rational) education, is responsible for most duping and coercion. Embedding control in products and systems themselves makes it even more difficult for the user with ‘average’ technical knowledge to escape the system. Thirty years ago John Doe could fix his car. He can’t now. Once upon a time he could see how the voting system worked and raise the alarm if he noticed something awry. Today he’d be told “it’s just too complex, computerised, whatever”, and possibly be arrested too.

    Equally, you’re right with the ‘social engineering being either malicious or misguided’ viewpoint. I tried to get that across with frequent qualification and use of the word ‘contentious’ in my original dissertation: cowardly perhaps, but I wanted to pass, and was unsure of the mindset of the people who would be assessing it.

    With your permission I will quote your analysis of the issues around software lock-in in a future post.

  10. Consider all my comments Creative Commons ShareAlike.

    Your remark regarding caution in your dissertation, using words like “contentious” due to uncertainty about “the mindset” of those who would “assess” it touches on a related matter to architectures of control in design. One that may be termed architectures of control in society. There are rules in society intended to minimize harms and the infringement of one person’s freedom and autonomy by another. (I would argue that all sensible laws can be deduced from one “natural right”, that to self-determination, or by the codifying of a convention that, while arbitrary, is from a set of choices where one must be consistently applied by all to minimize danger. The side of the road you drive on would be the textbook example.)

    But there are unwritten rules that, to a large extent, seem designed to reinforce behaviors that benefit a small minority and punish anything else. A lot of these rules seem designed around “forcing” a variety of interactions into a “zero-sum” form, creating a form of “taxation”. Most of these benefit the monied at the expense of everyone else, naturally; the textbook examples can all be inferred by the observation that living frugally gets you ostracised. Your clothes are out of fashion because you keep them until they actually are nonfunctional. You don’t have the latest model of cell phone, if you have one at all. You aren’t hip to the latest movie and TV related trends, perhaps because you spend the time more productively, or find other shows more interesting (or even educational?). It’s mighty suspicious that everything that helps you “fit in” either is expensive or exposes you to a lot of advertising (oftentimes both), isn’t it? Mighty suspicious.

    Of course, that sister site about “How they Change Your Mind” goes a long way to explaining how “they” can arrange to distort social interaction rules to funnel money to “them”. And sometimes other forms of concession.

    Consider what it takes to (pardon my choice of example) get laid. You either “fit in” (with the aforementioned expenses), join a “singles group” (and they all either cost money or are religious, so it’s again either pay or be advertised to by someone with an ulterior motive), or simply pay up-front (which, to top it off, is usually illegal, besides being risky in other ways). (It used to cost a diamond ring, so I suppose the price has actually been dropping. :))

    Oh and did I mention that most “singles” anythings discriminate against men?

    Some of this is the overvaluing of phony things, but a lot of it’s gotta be Big Business finding a way to charge monopoly rents for yet one more thing that ought to be free. Albeit indirectly.

    One spot of maybe-disagreement: John Doe not being able to fix (all) car problems anymore is partly due to a legitimate increase in technical complexity. We all reap the benefits of better fuel economy through smart fuel-injection systems, in lower prices for commodities (which are shipped using fuel) and personally if we use motor vehicles ourselves. It crosses the line into illegitimacy when the chips and such are made proprietary, particularly via DMCA-enabled lock-in codes, to limit entry into the secondary market for repairs and improvements to a vehicle.

    Really, all John Doe can’t be expected to fix is the chip itself. The chips should be for sale at a small markup over cost from wholesalers, and at retail at Radio Shack. It’s easier for John Doe to fix an electronics glitch in his computer than in his car right now, because standard computer chips (memory, CPU, graphics-related, etc.) are commodity standardized parts with, as a rule, many competing manufacturers (Intel and AMD; nVidia and ATI; too many RAM manufacturers to name).

  11. Hi. I found your link, I dunno how… This subject touches me (all of it!!! well, almost) deeply, and just reading the “about” post and comments I felt some old déjà vu shit… (we’re bad). Unfortunately I’m too lazy to write, AND Spanish, so if I contribute in any way other than reading, it will be just a bit… In my language, I would say: “Esto es todo una movida muy complicada relacionada con no follar” (roughly: “this is all a very complicated mess to do with not getting laid”). I think you get my point. There’s something “Defective By Design” deep down.

    Thanks, Dan, Jose. Special thanks to none of, have fun, keep posting (or not)

  12. Pingback: Welcome, new readers at fulminate // Architectures of Control
