The iPad Luddites

Is it possible for a Geek God to also be a Luddite? That was the question that popped into my head as I read Cory Doctorow’s impassioned anti-iPad diatribe at Boing Boing. The device that Apple calls “magical” and “revolutionary” is, to Doctorow, a counterrevolutionary contraption conjured up through the black magic of the wizards at One Infinite Loop. The locked-down, self-contained design of the iPad – nary a USB port in sight, and don’t even think about loading an app that hasn’t been blessed by Apple – manifests “a palpable contempt for the owner,” writes Doctorow. You can’t fiddle with the dang thing:

The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+ …

The way you improve your iPad isn’t to figure out how it works and making it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

Doctorow is not the only Geek God who’s uncomfortable with Apple’s transformation of the good ole hacktastic PC into a sleek, slick, sterile appliance. Many have accused Apple of removing from the personal computer not only its openness and open-endedness but also what Jonathan Zittrain, a founder of Harvard’s Berkman Center for Internet & Society, calls its “generativity” – its capacity for encouraging and abetting creative work by its users. In criticizing the closed nature of the iPhone, from which the iPad borrows its operating system, Zittrain, like Doctorow, invoked the ancient, beloved Apple II: “a clean slate, a device built – boldly – with no specific tasks in mind.”

Tim Bray, the venerated programmer who recently joined Google, worries that the iPad, which is specifically designed to optimize a few tasks and cripple others, could lead to “a very nasty future scenario”:

At the moment, more or less any personal computer, given enough memory, can be used for ‘creative’ applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people … I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.

What these folks are ranting against, or at least gnashing their teeth over, is progress – or, more precisely, progress that goes down a path they don’t approve of. They want progress to, as Bray admits, follow their own ideological bent, and when it takes a turn they don’t like they start grumbling like granddads, yearning for the days of their idealized Apple IIs, when men were men and computers were computers.

If Ned Ludd had been a blogger, he would have written a post similar to Doctorow’s about those newfangled locked-down mechanical looms that distance the weaver from the machine’s workings, requiring the weaver to follow the programs devised by the looms’ manufacturer. The design of the mechanical loom, Ned would have told us, exhibits a palpable contempt for the user. It takes the generativity out of weaving.

And Ned would have been right.

I have a lot of sympathy for the point of view expressed by Doctorow, Zittrain, Bray, and others of their ilk. The iPad, for all its glitzy technical virtuosity, does feel like a step backwards from the Apple II and its progeny. Hell, I still haven’t gotten over Apple’s removal of analog RCA plugs for audio and video input and output from the back of its Macs. Give me a beige box with easily accessible innards, a big rack of RAM, and a dozen or so ports, and I’m a happy camper.

But I’m not under any illusion that progress gives a damn about what I want. While progress may be spurred by the hobbyist, it does not share the hobbyist’s ethic. One of the keynotes of technological advance is its tendency, as it refines a tool, to remove real human agency from the workings of that tool. In its place, we get an abstraction of human agency that represents the general desires of the masses as deciphered, or imposed, by the manufacturer and the marketer. Indeed, what tends to distinguish the advanced device from the primitive device is the absence of “generativity.” It’s useful to remember that the earliest radios were broadcasting devices as well as listening devices and that the earliest phonographs could be used for recording as well as playback. But as these machines progressed, along with the media systems in which they became embedded, they turned into streamlined, single-purpose entertainment boxes, suitable for living rooms. What Bray fears – the divergence of the creative device from the mass-market device – happened, and happened quickly and without much, if any, resistance.

Progress may, for a time, intersect with one’s own personal ideology, and during that period one will become a gung-ho technological progressivist. But that’s just coincidence. In the end, progress doesn’t care about ideology. Those who think of themselves as great fans of progress, of technology’s inexorable march forward, will change their tune as soon as progress destroys something they care deeply about. “We love the things we love for what they are,” wrote Robert Frost. And when those things change we rage against the changes. Passion turns us all into primitivists.

55 thoughts on “The iPad Luddites”

  1. Chris Duffy

    So you think Google and the Internet are making us dumb, but the iPad is a refinement not understood by Luddites?

    “When something exceeds your ability to understand how it works, it sort of becomes magical, and that’s exactly what the iPad is.” -Jonathan Ive in the iPad video on Apple.com

    At least progress is shiny.

  2. CS Clark

    I suppose sooner or later music always sounds like noise.

    But are they right to complain not because it’s happening as it has before but because it’s happening *again*, and this time, this time it was going to be different? It was supposed to be a thing of beauty!

    They might also be thinking of the iPad as a new form, rather than the mass-market, closed-off end version of an existing form which has previously been generative. And – do devices automatically lose their generative properties over time, or do they lose them as new devices that are better for that sort of thing are invented or become widespread? (Did people stop buying radios for transmitting as well as receiving because of the increasing ubiquity of the telephone?) In which case it seems retrograde that the generative device that will provide these abilities already exists as a mass-market product.

    I also wonder whether it’s beside the point to debate how many people will actually create things using a specific tool. Firstly, we don’t know (no, not even with IQ tests) who the people are who will create worthwhile stuff that the rest of us will consume, so there’s a benefit to all of us in exposing everyone to a wide range of tools. Secondly, given that many are excited not about what is available now but about what could/will be available, if you start with the wrong philosophy you are less likely to end up with worthwhile results. For example, you might end up with a majority of things being clones of existing games and brain functions.

    @Seth Finkelstein – I’ve always thought of Google more as Nyarlathotep.

  3. Daniel Peiser

    I would rather buy a “less magical” but more usable device, and Apple seems to think the same: A rumored smaller iPad could be launched in 2011, and I’m sure it will benefit from the impact of the current, larger and more impressive (but clumsy) iPad.

  4. Nick Carr

    Kevin Kelly (and Bratton),

    I can’t think of a device that incorporates more “generativity” (I keep putting that word in quotes, because the word itself annoys me) than a piece of coal used to draw a picture on the wall of a cave. Here’s why: the piece of coal, as a simple byproduct of a natural process (burning wood), incorporates no manufacturing or design intent. It doesn’t come into existence as a device, i.e., a tool. Therefore, when our ancient cave-dwelling ancestor picked it up to draw a picture, he was not only using a tool to do creative work, he was actually inventing the tool. That’s a profoundly generative act – one of the very highest order.

    (Also, on a related note, read Charles’s eloquent comment above.)

    Having said that, what I particularly had in mind when I wrote the sentence you quoted are manufactured devices (those designed for a purpose) geared toward a general, or mass, market. (As I noted, there will always be specialized devices geared toward a particular creative class.) And (reflecting the comments of the iPad critics) I’m thinking of two sorts of generativity (separate but related and sometimes intertwined). First is the user’s ability to involve himself in the actual workings of the device, to express creativity or at least personal agency in modifying, repairing, or maintaining the device. (Changing the oil in a car is more generative than driving the car into a Jiffy-Lube in response to a message appearing on a dashboard display. Adding more RAM to a computer is more generative than putting a decal on the computer’s case.) This is the sense of generativity that Doctorow was talking about when he mentioned that computers (and other electronic devices) used to come with schematics. It’s pretty clear, particularly in the last few decades, that one of the major commercial thrusts of manufacturers is to remove this type of generativity from their products – to, in effect, hide their workings from the consumer. (See Matthew Crawford’s book “Shop Class as Soulcraft” for an in-depth discussion of this trend.) The second type of generativity is the user’s ability to use the device for creative work. I gave the examples of the phonograph and the radio, both of which incorporated creative functions originally but no longer do (the creative functions moved into specialized devices). The diminishment of this type of generativity, in the iPhone/iPad as compared to earlier general-purpose PCs, is Zittrain’s point. That doesn’t mean that the iPad can’t be used for creative work; it just means that, as we saw in the evolution of the radio and phonograph, the iPad places a strong emphasis on the consumption of creative works rather than their creation, and as such probably manifests a broader trend in the design of computing devices.

    So that’s what I was getting at when I wrote that “what tends to distinguish the advanced device from the primitive device is the absence of ‘generativity.'” The consumer eagerly trades off generativity for convenience and ease of use (and a “high tech” image and feel), and the manufacturer responds (quite happily, since the tradeoff makes the consumer a more compliant consumer) by reducing the scope for personal agency in the product’s design.

    Nick

  5. Seth Finkelstein

    Nick, I think there’s something of an error right at this point in your argument, which has given your post a geek-bashing tinge (note, a tinge – I’m not saying you set out to go bashing) – “The consumer eagerly trades off generativity for convenience and ease of use”. One part of the criticism is not of the consumer, but of the manufacturer. It doesn’t change anything for the consumer if the manufacturer publishes schematics. It’s the manufacturer’s decision, whether or not to support that sort of openness. In fact, some of this has been made illegal with the DMCA.

    I don’t know if it would be a good idea for me to get into a long debate on this, but this post has a thread of a type of criticism that’s irritating. To oversimplify to get it into a comment-box, it’s a subtext of “Those elitist pointy-heads, those self-centered geeks, don’t they know that not everyone is a double-dome like them, and that ordinary people like their gadgets that just work?”. Well, yes, that is understood; Zittrain, Doctorow, etc. are not stupid. The argument being made is one about socially and legally promoting certain values – and this is often misunderstood (indeed, per my previous point, you seem to deride this very advocacy itself for being advocacy).

  6. Nick Carr

    Seth,

    I don’t particularly disagree with anything you say – and I believe I indicated in the piece that I’m as much in agreement with the Doctorow/Zittrain view as not – but I think you’re oversimplifying when you contend that “It’s the manufacturer’s decision, whether or not to support that sort of openness.” On the particular subject of schematics, one of the reasons they’re rarely included anymore is that the devices are so complex that fiddling with them requires professional competence – and being “locked down” becomes essential to proper functioning. And this is not purely the choice of the manufacturer; it also responds to the desires of the consumer – for a sleek, idiot-proof machine that works without requiring any involvement on his part. (Including schematics, I’d also argue, makes a device feel less “modern” and “high tech” by suggesting it’s not as mysterious – and “magical” – as it seems, so it’s also important to branding and image, which consumers also value.) Apple, I’m pretty sure, restricts iPad/iPhone apps not only to protect a new source of revenue but to tightly control the user’s experience (and, as part of that, to protect the user from problems). So it responds to the wishes of the general consumer (who does not take the hobbyist’s view of devices) even as it furthers Apple’s commercial interests.

    Anyway, this is a very complex and fascinating subject, and I wouldn’t want to suggest that it doesn’t have a whole lot of facets.

    Nick

  7. Karim

    My first computer didn’t have a serial port. I had to buy a handful of discrete components and solder them onto a circuit board before I had an RS-232 port. Did this experience, as Doctorow suggests, put me “firmly in the camp of people who believe you should forever be rearranging the world to make it better?” No. It put me firmly in the camp of people who believe computers should come with a damn serial port.

    The sad thing is that we keep having this argument about abstractions and making things easier for people. You lower the barrier to something for millions of people, and there’s always some gearhead screaming that it’s too easy, too simple, the controls aren’t granular enough, you can’t customize it, etc. ad nauseam.

  8. Albo Fossa

    There are geeks who use “real” computers and “non-geeks” who use “appliance computers” (e.g. iPads). The geeks may often forget that the non-geeks are valid users of appliances.

    The non-geeks are people who live lives in the “real” (three-dimensional, not-on-the-screen) world. Imagine that: spending more, maybe even SUBSTANTIALLY more, of your time AWAY from your computer device than you spend WITH it! Imagine life as a landscape gardener, or an oil painter, or a doctor, or a carpenter, or a race car driver, or a solar panel installer, or a gymnast, or a movie star, or a great chef!

    Some of these non-geeks don’t have, and don’t wish to have, the time to spend all day learning and fiddling with a danged computer device. They may spend an hour or two in the evening checking and sending email and ordering something online: for them, an iPad is plenty, and they shouldn’t be looked down upon as unsophisticated for that.

  9. twitter.com/epobirs

    The mistake here is regarding the iPad as a computer rather than an appliance. Recall that Jobs was seeking an appliance model for making computers more accessible to non-technical people from the beginning of the first Mac’s development. The Web is the killer app that ties it all together. The universal application for the universal info appliance.

    The real triumph here is Nintendo’s. This is a direct descendant of the business model they applied to revive the video game console: a closed environment where all publishing passed through the maker of the hardware. The Web makes things a bit more open, but the best native apps rely on developers using hardware and software not part of the delivery system most consumers see.

    Go back to 1982. If you just wanted to play video games, you could buy an Atari 5200, among others. If you wanted to create video games, you bought an Atari 800 computer. Not a lot has changed in three decades other than that the appliance has gotten a lot more traction. If you bought an Atari 800 back then, it had no software out of the box except for a BASIC interpreter, as was the case with most other computers for a long time. If you weren’t going to take up programming, the appeal was very limited.

    Personally, I’d rather have a Tablet PC slate model than an iPad but I’m one of those Atari 800 guys from way back. My concept of what such a device should offer is very different from those 20-somethings who can barely remember a world without the Web. The ones who want to make things know where to get the gear for that. Those who merely want to consume the latest info no longer need to pretend they’re going to understand what is happening under the hood, any more than they would do an oil change on their car instead of paying someone to deal with it.

  10. Chris Duffy

    I’ve been thinking all morning about what it is that really bothers me (and perhaps other Luddites like me) about this device.

    Then it occurred to me: it’s the ambiguity surrounding its identity in the current mix of technology products.

    If I look at the iPad like a big iPod or media consumption device, I have no problem with it. It’s fine as a peripheral, but not suitable as a netbook or lightweight laptop. After all, the very first thing you do when taking it out of the box is sync it with iTunes on a computer.

    But then I read reviews – many linked to from Apple.com – that treat it as if it were a netbook, while brushing aside its numerous limitations from the perspective of someone who actually uses one. Moreover, if it’s not intended to compete with laptops, why are educated journalists comparing it to them?

    I think it’s pretty clear that Apple wants people to look at it that way, because they want that market segment’s dollars, but not on that market’s terms.

    The “user experience” of netbooks may well suck, and the iPad’s may be insanely great, but if they’re fundamentally apples and oranges, one isn’t a suitable replacement for the other.

    So when the tech press sings the chorus of how this is the future of computing, we’re either buying into a misleading conception, or aren’t really talking about the same thing.

    Is it the future of Human-computer interaction? Sure, that I can buy. Is it the future in terms of appliance-centric consumer electronics? Okay, there again, I can see that in some cases. Is it the future of computing? (i.e. totalitarian OEM control, little adherence to industry standards, complete disregard for interoperability and data portability, etc) No way. Or rather, ‘Way Frightening,’ if it is. I’m talking about practical user function, not “passion.”

    It’s a question of “new and improved” over “new and removed.” If it’s a toy, that’s fine. I don’t think Apple wants you to feel that way about it, though – even if that’s what it really is.

  11. www.facebook.com/profile.php?id=691981658

    I don’t think the line between geek and non-geek is absolute. I have a netbook running Linux. I install all sorts of things on it. Sometimes it is working, sometimes not. I certainly have a place in my life for something that always works. Doctorow seems to have infinite free time. After working a full day, cooking, cleaning, paying the bills, walking the dog three times, working out, and hacking my computer, I damn well want the thing I read or watch TV shows on to just work. The thing I am tinkering with might not be working when I am just too tired to tinker anymore.

  12. Sporitus

    Progress is too abstract a word. If you do not set the speed and direction, you are not able to measure progress. In any other case it would be not progress, but just process :-)

  13. Jason Treit

    A terrific post and discussion, well worth bookmarking. JZ’s generativity thesis has taken a while to catch ears. It’s good to see a pendulum swing of thoughtfulness in the other direction now.

    Amused, tho, at the Professor Frink-like extrapolations of fingers directly applied to surfaces being the future. The iPad, magical or not, doesn’t erase the discovery that begat technology: our meaty paws are neither the strongest nor the subtlest thing for every task.

  14. Len Bullard

    For the status-conscious, it is a must-have, despite the fact that in nine months it will be an obsolete coffee-table book. For the Apple stockholders, it is an ‘everyone must buy’ for obvious reasons.

    For a not-insignificant number, it is a $1K choice between the iPad and a set of tires and a front-end alignment for the second car they have to keep running to make the mortgage. In short, in times like these, despite the “brilliance of execution” or the lack or presence of “generativity”, iPad lust is ludicrous. Apple stepped into the zeitgeist and has yet to clean its shoes.

  15. Andytedd.wordpress.com

    I was sceptical about the iPad at first, but now I realise it will be the first computing device specifically aimed at the ‘Shallows’. And thanks to the magic of Jonathan Ive, a highly covetable one.

    I can see why people might reject this as a backwards step, but on the other hand, if a lot of functionality in a device is redundant, why not get rid of it?

    The only thing I find hard to reconcile is Nick’s apparent contradiction – he seems to be in favour of depth, yet I think this gadget will be at its best as a shallow, hypermedia, use-it-while-you’re-doing-something-else thing. The iPad is a product which reduces complexity for people who don’t want it, but that does not make its critics Luddites.

    Cheers

    A

  16. Detritus

    The idea of “progress” as some sort of impersonal, implacable force is a myth. Progress is whatever we as a society come up with as some sort of collective “consensus” – and I put that in quotes because it’s not real consensus, usually, but mostly involves some kind of struggle ending in compromise between a variety of stakeholders.

    Take the example of DDT. That was a beneficial chemical, that was “progress” – until people discovered how bad it was for wildlife and the rest of the ecosystem. People, at least in the “developed” world, decided as a society to not use it anymore and banned it. That was “progress” too. It’s still being sold and used in the “developing world”, because the people there don’t have the political/economic power to get it banned.

    To brand people Luddites because they have a different idea of what “progress” means seems like a narrow and shallow view.

  17. Karim

    Criticizing a new technology or gadget doesn’t make one a Luddite, but maybe FEARING it does. Geeks usually criticize things on a logical, no-nonsense, quantifiable basis: if the argument were about crappy price/performance (i.e., the argument made about Microsoft’s tablet PCs and UMPCs), that would be par for the course. Instead, we have people AFRAID of what the iPad might represent: more DRM, less “generativity,” less hacking & tinkering. We even have people who phear teh iPad because it doesn’t fall into some tidy, predefined category of computing device.

    These people should take their own advice and hack the iPad. Create an IDE app and interpreter for it. Write an app that uses the iPad Bluetooth to talk to an Arduino board. Publish your music and books without DRM. Or come out with a $500 tablet that is more hackable, as I imagine is planned at several of Apple’s competitors. If you don’t like the future, CHANGE IT. “Doctorow, heal thyself.”

  18. Blake Merriam

    Been looking forward to your weighing in on the iPad for the last few weeks now, Nick. :) I’m surprised by your sympathy for Doctorow and others. I’ve found that you have often been eager to critique those who champion “technology for technology’s sake” (going back to your argument in “Does IT Matter?”), as Doctorow seems to do.

    What I find interesting is this debate over the iPad and how it may change the fortunes of the languishing publishing business. Jobs and company know like the back of their hand the dynamics and requirements of establishing a new computing platform/category. The iPad is not supposed to be like a PC. My guess is 99% of iPad owners have a PC at home and can commune with the internet by blogging, videotaping, uploading all they want. It’s the whole holistic nature of the iPad that strikes me as unique. So relax on your couch, at Starbucks, on the train, etc., and consume the rich multimedia content that the New York Times and others will create for you with the touch of your fingers (not something that can be done quite the same with a laptop). Apple and the news industry have gone all out in one of the biggest PR blitzes I have seen in a long time. Is it any wonder that Walter Mossberg was on Charlie Rose last Friday singing the iPad’s praises? It may well save his job and everyone else’s at the WSJ.

    The extent to which the iPad’s content delivery will be proprietary (to save the news industry’s business model, presumably) is *fascinating* to me. But content is still king, and Apple can PR the iPad all they want, but it will now be up to others to create the content that will determine if this “3rd category” (after PC and Phone/Music Player) establishes a foothold. The iPod had all the music in the world as its content, no problem there. The big question is: will developers create the kind of content/apps to justify the iPad?

  19. Datruss.wordpress.com

    I too wrote a little diatribe about the iPad, http://pairadimes.davidtruss.com/ipads-are-for-iconsumers/ “I’m a huge Mac fan, but I have no interest in a bigger version of my iPhone that isn’t a phone, isn’t a camera, doesn’t like to multitask, requires me to have a laptop on the side and then doesn’t fit in my pocket.”

    The key point being the laptop ‘on the side’. I’m probably going to end up getting an iPad (a later version), but I want to see schools putting content-creator, not content-consumer, tools in the hands of kids! Spend that money on a 2G/160G 12-inch netbook with a camera & a full-sized keyboard and save the iPad for home entertainment.

    You say, “In the end, progress doesn’t care about ideology.” Perhaps, but it seems progress does care about consumerism and product placement, and so some not-so-altruistic ideologies might just be influencing progress in a way that – at least to me – doesn’t need to be thrown into schools (when other ‘tools’ would be more effective at helping students be active creators of digital content).

  20. Kevin Kelly

    Nick, the two facets, and the two definitions, you give for “generativity” suggest to me that the word you are using needs more clarity before you invoke it for your theory.

    Let me suggest a different metaphor, one which may answer your challenge.

    The nature of every invention is to start out vague, incomplete, and open to change. It begins primed for hacking, and for re-definition. It is many things to many people. At this stage, the device is in the hands of tinkerers, nerds, fans, and hackers who will make it do all kinds of things no one had thought of. This skeletal generality enables one kind of generativity. The tool itself is being invented, which is, as you note, the highest kind of genesis. In this mode a device or invention or idea is thrilling and its naked potential appeals to a set of early adopters (who are often called an elite) who explore this glorious incompleteness in many ways.

    But for many others (who are often called the masses), this very openness, this ill-defined thing, requires too much expertise, or control, or knowledge, or care, or time to use, and to them its skeletal generality is a turn-off. One has only to think of the early days of automobiles, or windmills, or radios, or cell phones, and how unappealing most people found their incompleteness (“they hardly work and are hard to use”).

    But the masses usually don’t have to wait long for the natural history of an invention to kick in. A device becomes more specialized and “complete” as it evolves. As it does, it becomes more specific in what it does, more closed in its identity. Yet at the same time, it becomes more powerful in its evolving identity. It becomes more completed, more approachable, more understandable, more able to do things for more people. For instance, the first cameras gave great latitude to photographers. Since you had to make your own film, you could make it do all kinds of things – favor the infrared spectrum, or embed it in fabric, or make it three feet wide. But as the outlines of popular photography became clearer, the camera honed in on a certain specific design, film was manufactured, and the equipment became more certain of what it was for. The result was cameras that anyone, not just photo geeks, could use, and the result was an incredible generativity as millions and eventually billions of people started photographing EVERYTHING. As the craft became more specific, it became more ubiquitous, and the levels of creativity unleashed by this easy-to-use device vastly exceeded the amazing creativity of its founders. In this way a refined device is more “generative” than it was earlier, when it was vague and incomplete.

    So there is a natural arc in each invention which moves it from the generative openness when it is new-born to the refined generativity as it becomes well defined.

    Nick, you seem to suggest that it is modern manufacturing/consumerism that closes off all inventions to the first kind of generativity, but I suggest this maturation has always happened, long before the industrial age. It is merely being accelerated now.

    But if that were the total story, it would be a pretty small world. What happens is much greater. Each new unformed, hackable, potential invention is refined by use, and this use makes it more specific, more conditional, more open to use by know-nothings, more powerful in defined ways.

    And then these mature products enable entirely new tools. In turn these new tools are again open to the first kind of generativity. Hackers and nerds and tinkerers flock to the immature zone, where they can help define the new new thing.

    So, the same story is told over and over again. Once upon a time the early adopters made their own electrical parts – capacitors, resistors, etc. – from which they cobbled together radios and other equipment. Once it was clear what a capacitor was, the frontier moved on to making the gadget – but not without the old guard complaining about the loss of the joy of making your own capacitors. Then as radios became more defined, hams did not make radios, but at least repaired them. So they made their own Altair computers. For a while. What, you don’t make your own computer? No, but I wrote the operating system. What, you don’t write your own OS? No, but I wrote my own programs. What, you don’t write your own programs? No, but I code my own website. What, you don’t code your own website? No, but I write my own apps. What, you don’t write your own apps? No, but I weave my own lifestreams…

    New-borns with infinite potential but low productivity become middle-agers generating great productivity and unleashing fantastic creativity; in turn, the mature keep the frontiers expanding by generating more new-borns. I speak here, of course, of ideas and devices.

  21. Erik Sherman

    I took Cory Doctorow’s remarks a bit differently. Although he expresses his distaste for the iPad in hardware terms, I think what bothers him is that it’s a device that you have to use Apple’s way, which means buying only the software that Apple approves of, in the way Apple allows. To use the appliance analogy, that would be like having the manufacturer of the toaster be the arbiter of what you could put into the toaster and be the only place to buy bread. He was also talking of having the power to make things better for yourself and maybe for others. But when everything is locked down – and the hardware aspect is only one, and really an analogy at that – then you don’t get to work “outside the box,” because the box, and what is approved for the box, is all that there is.

    Personally, I think that the way things are moving, you need to be able to create as well as consume media on the fly. The thoughtful comments on this post are an example. And progress might be a media device that encourages such interaction by making it easier to do. From that view, progress has been the movement from top-down communications to bi-directional. Perhaps enforcing the “either consume or produce” model is the truly Luddite stance.

  22. len

    So now Apple also gets to choose the developer languages, thus setting itself up for a developer smackdown, and claiming that limiting choice is a means of improving the user experience. Isn’t this what Apple claimed about Microsoft, the sort of thing Microsoft has been repeatedly sued for? Isn’t this Apple becoming the antithesis of the famous Mac commercial? So rather than being a war on Adobe, as Jobs claims, they’ve gone to war on the developer ecosystems themselves.

    Wow. All of this to push video to a weak Wi-Fi tablet through Verizon high-speed networks? Wow.
