
I think the whole Tablet vs. Computer question is pretty dumb in the first place. It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

Tablets and PCs simply correspond to two totally different uses of media. A tablet is consumption-centered, whereas the PC is able to produce complex goods and services that can then be consumed on a tablet. Will the tablet push PC usage down to very low levels in some areas? Probably! Will it replace a PC for every usage? Are you kidding me: development, 3D effects, advanced audio processing, advanced photo processing. That's just as stupid as saying that Photoshop is dead because Instagram can do sepia...

PS: I will absolutely never buy a tablet. I fight every day to stay on the creator side of society; with a tablet I would just give up!



It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

I wouldn't call it a war, but Atwood (who, by the way, is neither an evangelist nor a journalist with an agenda) makes the point that tablets are about to take over large shares of the PC market. And my point was that this way, people (non-hackers) get even further removed from the "close to the metal" experience that I and others in my generation used to have.

With all respect, I think you and dozens of other posters in this thread are missing the point by "defending" the notebook and saying it won't get fully replaced. Of course it won't, but that's not an interesting question. The interesting question, in my mind, is: how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device? How can we make sure they can find out that this "magical device" isn't actually driven by fairy dust? Whether or not some share of the population still uses notebooks doesn't matter very much at that point, when the default for the millions and billions of non-hackers out there is to use a tablet.

Fortunately there are initiatives like the Raspberry Pi. It's telling that one of the main initiators of that project, David Braben, is also an old-school guy who grew up with computers in the 80s. Or so I'm guessing -- at any rate, he wrote some of my favorite 80s games, Elite and later Frontier. But I fear that if you plot the Raspberry Pi's sales curve onto Atwood's graph, it won't look anything like the iPad's.


Don't worry, you're full of respect ;).

It's maybe a philosophical question, but who doesn't have an agenda? I mean, when you buy a device, don't you want to convince others that you haven't bought a stupid toy or a useless object? Everybody has an agenda; even someone who just bought an iPad has an incentive to convince others that his device is great. The only ones who don't have an agenda are the ones who genuinely don't care.

I have an agenda: I am trying to push people to see computers as something other than a tool to consume media and slack around with. But given the current design of tablets, especially Apple's iOS devices, it's getting really hard to see them as anything other than a "pure-consumption" product.

To continue on the philosophical level, I think humans deep down aspire to do very little while receiving a lot of pleasure. The most productive of us have managed to channel this desire towards building systems that, once they're working, will provide us with a lot of reward for little additional work (the startup!). In a way, self-driven programmers have achieved a Post-Slacker era in their minds, and god knows how hard it is to keep that up. The iPad/iPhone/iPod Touch/Android phones are enablers of the slacktivist part in every one of us, and that's one of the big reasons for their success.

Sure, the Raspberry Pi is great, but, again, it's in the realm of post-slacktivism and thus will probably NEVER be popular. That doesn't mean it won't be successful.

I hope this comment will find interested minds!


I think tablets will replace "PCs". But it won't be as bad as you think: you'll still have your dev environment, your mouse, keyboard, and touchpad all working in harmony. Here's a snippet from another post I made discussing mobile vs. desktop OSes (hope it provides some alternative viewpoint):

"I'm pretty sure this is the direction all OS's we will be using are going to end up in the coming years. Some people will cry and yell and holler that there is no way something can be made for both keyboard, mouse, and touch screen. I see this as simply a failure of vision. That's absolutely the way things are going to be going, and it's coming sooner rather than later. (I actually think the OSX launchpad is pretty close to allowing this implementation) Devices are getting smaller and smaller. I use a Mackbook air. but guess how it gets hooked up at home? That's right, wireless keyboard and mouse, nice big monitor, I never see the actual computer. It's only a matter of time before my laptop gets replaced by a tablet that has comparable hardware specs. The OS allow for my normal desktop interfaces, along with a nice touch screen interface. I'll use it to pick out movies to play on my TV, from my couch. I"ll use the same device to write code at my desk (with big monitor and keyboard). My kids will use it to play video games, both mobile, and on the TV. I'll use it to send email in the backseat of a car moving at 65mph. (As a wireless comms guy, I fully appreciate the technology it takes to perform that last action.) But the bottom line is, It's going to happen. Sooner rather than later."

I guess my point is, mobile and desktop can blend perfectly. Those who don't need a monitor or keyboard won't get one; devs and graphics guys will. Technology will keep shrinking the size of processing power. Could you ever have imagined something as small as an MBA being a full-on computer?

Last bit, and my only reservation about this whole "movement": I hope they keep it more open, more like OS X, vs. an overly walled garden à la iOS. But the signs are there. I think it can be done, and done very well; it's only a matter of time.

edit: It seems your biggest problem is that these devices are built for consumption rather than creation. And I'll agree, that's my biggest reservation about the mobile concept; I guess I could have been clearer about that. I'd like my iPad (not that I actually have one) to be more like OS X, where I can hook up a monitor and keyboard and go to town on my OS, while being able to switch into "Launchpad" mode when mobile.


I'm interested. I agree.

Apple's devices (and other embedded devices) will never replace PCs, and exactly for the reasons you state. Thus the Raspberry Pi and the PC platform have a bright future.

Apple's devices will live until Apple EOL's them.


Well, things will need to be produced somehow; I imagine this will still be done with some form of computer. Whether or not you keep the keyboard has little to do with the "freedom" of the device.

The question is what the difference will be between the consumption devices and the production devices.

Will the devices for creating content become specialized pieces of equipment, priced out of reach of the general market, so that you get to touch one for the first time when you turn 18 and start your computer science course? Or perhaps, with the increasing viability of virtual machines, you'll simply download a program that gives you all of your power tools inside a sandbox, so if you mess it up you can just re-image and go again.

There will also be, as you suggested, "hacker hardware" like the Raspberry Pi. I think the key for these is to make sure the costs are low and that they are available to kids in school.

The "hacker spirit" can overcome many obstacles. Hell in the 70s nobody had a computer at home and the original hackers used to break the law by breaking into companies computer systems via telephone lines just to play around with them.


Thanks for this optimistic reply. You're right, the "hacker spirit" has proven time and again that it can overcome any obstacle. Those who explicitly seek out open hardware will always be able to find it.

My concern is just that the lure of simplicity (and parents' paranoia) will mean that kids will be more likely to end up with a closed system rather than an open one, and consequently deprived of the ability (and, more importantly, the inspiration) to tinker. But maybe you're right that hacker souls will always seek out systems that allow them to do what they want to do, and it won't make a difference in the end. I hope so!


Yes, I think inspiration is the key here. I was largely lucky in this regard, though: I had very liberal parents who let me have a computer that I could mess around with, as well as unfiltered internet, etc.

Most people in my peer group at the time were only allowed limited access to their home computers, and often were not allowed to install any programs on them.

Really though, I think it is in the government's best interest to make sure that kids are inspired to tinker with things if we want to stay competitive with the BRIC countries in terms of creating and innovating.

I can only speak as a British person here, but I think that from Alan Turing to BBC computers (and now the Raspberry Pi), the "hacker mentality" is very much a part of our national DNA, and it would be a great shame if that were lost.


I would rather buy my child her own computer than have her mess up mine, but I guess that was not as affordable a decade ago.


I wouldn't call it a war, but Atwood ... makes the point that tablets are about to take over large shares of the PC market.

No, he doesn't. He says that everyone now owns a PC, so innovation is moving to post-PC devices.

I think the term "post-pc" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax, game consoles didn't kill off programming, this is no different.

Personally, I prefer the term "portable device" but I guess that's not as link-baity.


>I think the term "post-PC" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax: game consoles didn't kill off programming, and this is no different.

The iOS locked-down norm will likely migrate to the conventional computing world. Mountain Lion's Gatekeeper establishes a default of refusing to run unsigned software. This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.


> This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.

And that will lead to more open platforms being where consumers find the most innovative software, as well as being where commodity software is cheaper and/or better because competition will necessitate it.

Locking down your platform to prevent crapware that users don't want from getting onto it is one thing. Locking down your platform when non-crapware that users do want is available only on someone else's gear is how you turn Apple back into what they were before the iPhone: a footnote.

Of course, this does mean those developing for more open rival platforms will have to actually produce non-crapware, not the poor excuse for software many places ship today. It's about time our industry grew up and stopped pretending that shipping junk and charging for it is acceptable anyway.


What about virtual machines?

Let's say your OS X is locked down, but you can get a virtual PC app from the store that allows you to install Linux without affecting your OS X setup.

Now you can use your Linux VM to program anything you want and share it with a whole community of others who run Linux inside a VM.

Perhaps at this point this whole community grows big enough that it becomes the main target market for companies who produce "power user" software.
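
(Purely as a hypothetical sketch of that sandboxed-power-tools idea: with something like VirtualBox on the host, the whole "mess it up, re-image, go again" cycle can even be scripted. The VM name, sizes, and ISO path below are made up for illustration, and the commands assume VirtualBox 5+ with its VBoxManage CLI on the PATH.)

    # Hypothetical sketch: provision a disposable Linux sandbox VM,
    # assuming VirtualBox's VBoxManage CLI is installed on the host.
    import subprocess

    VM = "linux-sandbox"  # illustrative name

    def vbox(*args):
        """Run one VBoxManage subcommand, raising if it fails."""
        subprocess.run(["VBoxManage", *args], check=True)

    # Register a 64-bit Linux VM with 2 GB RAM and a 20 GB disk.
    vbox("createvm", "--name", VM, "--ostype", "Linux_64", "--register")
    vbox("modifyvm", VM, "--memory", "2048", "--cpus", "2")
    vbox("createmedium", "disk", "--filename", VM + ".vdi", "--size", "20000")
    vbox("storagectl", VM, "--name", "SATA", "--add", "sata")
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "0",
         "--device", "0", "--type", "hdd", "--medium", VM + ".vdi")

    # Boot from an installer ISO (path is illustrative). If the guest
    # gets messed up later, delete the .vdi and re-run this script --
    # that's the "re-image and go again" part.
    vbox("storageattach", VM, "--storagectl", "SATA", "--port", "1",
         "--device", "0", "--type", "dvddrive", "--medium", "ubuntu.iso")
    vbox("startvm", VM)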


I would like to see a network-enabled virtual machine approved for the iOS App Store, or the basic human decency of allowing sideloading on iOS, before trusting Apple with the OS of the future.

Now, Tim Cook has shown a good track record of undoing the most extreme abuses of Steve Jobs' megalomania (supporting employee charitable contributions, paying dividends to investors, admitting imperfections in factory working conditions), so there is hope that he will do the right thing for humanity on this issue too.


>how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device?

Why do we need to concern ourselves with this? The desktop is already "TV-ized" for most home users. People who are interested in producing will continue to do so, but the majority most likely never will be, and trying to make them will only turn them against us further. We should just get out of their way and make it as easy as possible for them to do what they want to do.


Fortunately there are initiatives like the Raspberry Pi

With its USB ports, video acceleration, web browser... The RPi has more in common with a PlayStation than a BBC Micro.


>Will it replace a PC for every usage? Are you kidding me: development, 3D effects, advanced audio processing, advanced photo processing...

No. But for a large proportion of the PC market this doesn't apply. They only ever used their PC for email, YouTube, etc. So for what is probably a majority of PC users (i.e. not you, me, or most HN readers) this is 'Tablet vs Computer'.

To call the argument 'dumb' is dumb. Make your point without attempting to dismiss the argument in such a cheap manner.


I hope the barrier to entry for being a producer will stay as low as it is now. I worry that as PC demand goes down, the cost of getting started in programming will go up. I started to learn how to code web sites while procrastinating one afternoon during finals week; I discovered Apache on my MacBook. Crossing the barrier to entry was almost accidental, because I had all this great stuff on my computer waiting for me to discover it. Those experiences will be rarer in the post-PC era.


Funny you mention that, but in a world totally dominated by iPads, the price just to have your application legally available is $99. When I began to code (around age 12-15), $99 was WAY above anything I could imagine. I don't know if my parents would have allowed me to spend such a sum. Thankfully, you can get started and publish for Android for free!

So, who knows what the future holds for the cost of content creation... Again, people often ignore the long-term externalities they produce when they make their choices, buying a PC or a tablet being one of those choices.


Are they ignoring the "long-term externalities" or is it just impossible to actually know what they are? Or maybe they just disagree and are putting their money where their mouth is. To even expect a non-programmer to consider this issue strikes me as pretty crazy.

I bought an iPad and I don't think that it's going to affect distribution of software in any meaningful way, besides making it easier for developers to reach users through the App Store. For one, anyone can deploy an app to Heroku for free. You can host plain HTML, CSS, and JS for free. You can learn how to program an iOS device for free with tons of great guides and materials online. When you want to distribute it, there is a hurdle to clear, but (and this might be a cultural thing) $100 as a 12-15-year-old is not such a wild sum. It also makes my experience browsing the App Store better.


Xcode is free (as in beer): https://developer.apple.com/xcode/index.php

You only need to register in the iOS Developer Program if you plan on distributing your apps in the App Store.


You have to join (and pay) if you want to run your apps on hardware -- the free download only lets you run in the simulator.


I don't know when you were 12-15, but $99 at 2010 prices was about $34 in 1980 and $23 in 1975 [1].

[1] http://www.westegg.com/inflation/infl.cgi
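
(For the curious, that conversion is just a ratio of consumer price indices. A rough sketch in Python; the index values are approximate US CPI-U annual averages that I'm assuming here, not taken from the linked calculator, so the results land near, not exactly on, the quoted figures.)

    # Inflation adjustment by CPI ratio. Index values are my own rough
    # CPI-U annual averages, so expect ballpark agreement only.
    CPI = {1975: 53.8, 1980: 82.4, 2010: 218.1}

    def convert(amount, from_year, to_year):
        """Re-express `amount` (in from_year dollars) in to_year dollars."""
        return amount * CPI[to_year] / CPI[from_year]

    print(round(convert(99, 2010, 1980)))  # ~37, vs. the quoted $34
    print(round(convert(99, 2010, 1975)))  # ~24, vs. the quoted $23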


I'm 25. Maybe I was born into a family conservative on spending, but I would have had to seriously push my parents for weeks to obtain $60 worth of something as abstract as an SDK (as if they even knew what that is...)!

And families that live paycheck to paycheck often have a computer, but affording a $100 license can be very painful. I was in that situation!


True, but it's only recently that there has been so much good development software available for free.

When I was first looking at computers, a computer cost $1,000 and a copy of Microsoft C++ was something like $300.


True, but it's only recently that there has been so much good development software available for free.

I absolutely disagree. When I was 12 (in 1994) I became interested in Linux because it had so many development tools available. I thought computers and programs were magical, and Linux/bash/Perl/gcc et al. made it possible for me to learn programming. And Slackware Linux could be obtained for 10 guilders or less.

My generation became hackers through GNU and Linux, just as the generation before used MSX, C64, or a ZX machine with a free BASIC interpreter.

IMO, the sickening development is not so much that Mac and iOS developer accounts cost $99 per year. It's that the world (Apple and Microsoft) is slowly moving to a model where there is a gatekeeper who decides what gets in and what does not. As a bonus the gatekeeper gets 30% of every purchase. I can sympathize with the need to provide a 'trusted' source of software, but it should also be possible to install whatever the heck you like.

What would the Internet be if it followed this model?


I hadn't used Linux in 1994; my first experiences with it were ~1998, but I remember that getting it to install and work correctly with my hardware, as well as getting X to work, was no mean feat.

So I imagine that would have been quite a large hurdle to jump through in 1994 for somebody with a casual interest in learning to program. I also remember paying somewhere in the region of $100 for my Linux distribution back then (SUSE, I think).

Even once you had installed it, you would be compiling binaries with GCC that targeted Linux/glibc, so you wouldn't have been able to share many of your creations with the rest of the world, apart from a slim minority of Linux users.

I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.

Most people seem quite happy with the App Store lock-in; in fact, most people I know who own Android devices are not even aware of the sideloading feature, they just get stuff from the Android Market.

Of course, if Apple becomes overly restrictive, they do risk damaging their own ecosystem to the benefit of competing platforms.

Unfortunately, there does seem to be more popular support than I had previously expected for the Internet to develop towards something closer to this model. I recently watched a documentary on cyber-bullying in which groups of parents were calling for a government authority to be able to control the content of social networking websites, as well as remove any anonymity from the Internet.


I hadn't used Linux in 1994; my first experiences with it were ~1998, but I remember that getting it to install and work correctly with my hardware, as well as getting X to work, was no mean feat.

X was no worry - our machine only had 4 MB of RAM, so I was practically forced to use pseudo-terminals. The learning curve for Linux distributions was not that steep. Slackware, especially in those days, was orders of magnitude less complex than current distributions. I picked up a cheap bargain UNIX book, which was enough to get started.

Even once you had installed it, you would be compiling binaries with GCC that targeted Linux/glibc, so you wouldn't have been able to share many of your creations

Well, sharing my creations with the world wasn't really possible anyway, since we had no internet connection. Besides, for me the magic was in creating a program.

Besides that, your comment is factually incorrect, since a DOS port of gcc was fairly quickly available (DJGPP). In fact, IIRC Id Software's Quake was later compiled with it.

I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.

Me neither. I see how it is beneficial for some family members and friends to have a controlled software ecosystem. Also, app stores have improved usability a lot.

On the other hand, if kids only get their hands on devices that are controlled completely by corporations, how will the next generation of hackers learn?


I think maybe they will learn different things.

I imagine virtual machines, both cloud and local, will be a commodity at that point, so whilst they may not be taking their iPad apart or replacing its software, it could potentially provide a dumb terminal to an infrastructure of disposable Linux instances, all loaded with state-of-the-art FOSS.

I can see them doing things like building mashups of their social data, and possibly using the next generation of Arduino-like devices to create real-world interfaces.

They will still "hack" just their building blocks will be different. Hell in the 70s you probably weren't a real hacker if you weren't a whizz with a soldering iron, how many of the RoR type hackers today practice that?


It's really weird to hear someone say that - because I grew up in the 80s with a computer that had a commercial-quality (for the time) assembler built-in! Sure you could buy Pascal or Lisp too, but out-of-the-box you got the same tools the pros were using, and all the documentation with it too.


Well, truthfully, my first computer (an Acorn BBC) had a BASIC interpreter built in. I think you could also do some assembler out of the box, although I never tried that at the time.

Even my first DOS PC had QBasic preinstalled, but it had no ability to make .EXE files, which was what you needed if you wanted to submit your software to shareware libraries.

It seems that at that point Microsoft wanted a divide between "toy" programming languages like QBasic and "professional" ones like Visual Basic, which cost money.

If you wanted to create "proper" Windows software, you needed to fork out some cash. Contrast that with now, when you can download Visual Studio Express, along with all the documentation, compilers, etc., for free, as well as a whole load of open-source languages and libraries.

I now make my living as a programmer and I use almost no commercial tools at all; I doubt many people could say that in ~1993.


Thanks for reminding me why I make more money now than I dreamed possible as a child, yet can't seem to afford anything beyond AmazonBasics and store-brand cereal.

Inflation is such an insidious way to pick the pockets of the working class. I wish the government would tax cash and equivalents instead of inflating the currency.


There's nothing inherent in the nature of tablets or iOS that prevents them from being used for learning programming. It's purely an issue of the availability of suitable apps.

Have a look at Codea (http://twolivesleft.com/Codea/) for an example of such a tool.

(Disclaimer: I know the guy who wrote it, but I genuinely believe it's a very cool environment for learning programming and writing basic games.)


Agree with your point overall, but look at it this way: there were lots of people who were previously using a computer just as a "consumer" of media, internet, music, information, etc. They had to get a PC because there was nothing else that could replace that tool. Now, for THESE people, a tablet makes more sense: it fits their needs. So the conversion we may be seeing right now is among those who were simple consumers in the first place.

Creators, who need to consume and to create, will of course prefer to have a desktop or a laptop on top of a tablet. And there may not be much evolution on the desktop or laptop side because there is no need for it: the desktop environment has had dozens of years to evolve, and its concepts are solid for its intended usages.


I don't think the point is that the market will shift entirely to tablets. It is more that, given an upbringing within a "consumer" household, the individuals there will have little opportunity to experience, and gain familiarity with, something that can produce. You and I will continue to use PCs; our kids and our friends will likely be using tablets more and more.


The idea of the general-purpose mechanical vehicle died a long time ago; the idea of the all-purpose computing device should too. While these ARE general-purpose machines as far as their internals go, even with PCs and laptops different tools are required to perform different tasks.

Artists use Wacom pads and Cintiqs to "create", audio experts use mixing boards and MIDI instruments, and 3D artists use even more specialized tools (this is my profession, so I happen to know more about it than the others). A lot of what is happening with tablet computers is that the keyboard/mouse combination, once a first-class input device used by coders and office employees, is becoming a peripheral just like all the others I have been talking about.


Great comparison. If you look at things like GarageBand on the iPad, you can certainly make music with the touchscreen only. But any musician would probably still be more productive using the real instrument as the input device. Same with office work: yes, you can write on the virtual keyboard, but you can still attach a physical one if you want.


>It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

The issue is: lots of people who never should have had desktops had to get them because there was no other option. It's not that the desktop market will die or go away; it's simply that it will contract to what it should have been all along: a very small home market and a large business one.


The thing we need to look at differently is that there are different types of creator. Damon Albarn recorded a massive chunk of one of the Gorillaz albums on an iPad; David Hockney sketches on one.

There are ways in which tablets have massive creative potential. The interface, for instance, allows very direct interaction for some mediums, and something like GarageBand has a very shallow learning curve (and price point) which will potentially pull in a lot of people who would never have looked at that sort of thing on a desktop or laptop.

I think it's important to differentiate between the things that allow developers to be creative and the things that allow others to be so.

To say you're on the side of the creator and therefore anti-tablet is to subscribe to a very fixed and narrow definition of what enables creativity.


It's not impossible to remain a creator while using an iPad (or other tablet), at least when it comes to development. There are certainly people who are happily developing on an iPad (albeit with a more powerful remote backend) [1]. I think we are likely to see even more of a shift in this direction as these devices become ever more capable.

[1] http://yieldthought.com/post/12239282034/swapped-my-macbook-...


OK, granted, it seems to work, but when you look at the configuration:

iPad 2 (16Gb, WiFi)

Apple wireless keyboard

Stilgut adjustable angle stand/case

iSSH

Linode 512 running Ubuntu 11.04

Apple VGA adapter

This is basically the same configuration as an iMac! To take an example: most of the programmers I know, when doing some serious coding, have 2+ monitors, because it's always good to run something on the side and get the result live, or just because you need to compare two files. On such a small screen, that's close to impossible. Also, for a programmer a lot of time goes into browsing through long files; with an iPad and, again, a small screen, that's a nightmare.


This is just not really that serious. There is not going to be widespread adoption of serious programmers doing this in anger. I promise. It's not going to happen. It's fitting a square peg into a round hole. This attitude wants me to adopt an overly complicated setup to get a really horrible version of what I can do swimmingly well with a little ThinkPad or whatever notebook you like. Done and done. No nonsense.


To be fair, that one article is the only evidence I have seen thus far of anybody using an iPad as their primary development machine.

Wonder how he is getting on with it?


The iPad is a _great_ couch, porch, or kitchen computer. Just because you buy one doesn't mean it needs to replace everything else :)


The question is what makes a thing a tablet or a PC for the purposes of what we're discussing. We can probably all agree that a thing with a bigger screen and a keyboard will always be a big part of the mix.

But something could fit that description while being more similar to a tablet. I think that's the point Atwood is making.





