Making Curriculum Pop

This great article from Wired offers a kaleidoscope of views on the impact of the tablet. Check out the great "Fake Steve Jobs" post and the blurb by someone who channels Marshall McLuhan. While those two were funny and interesting, my favorite entry was the "window" idea from Kevin Kelly.

Enjoy...

How the Tablet Will Change the World

Photo: Dan Winters; tablet: Stan Musilek

The iPad is the first embodiment of an entirely new category, one that Apple CEO Steve Jobs hopes will write the obituary for the computing paradigm that Apple itself helped develop.

13 OF THE BRIGHTEST TECH MINDS SOUND OFF ON THE RISE OF THE TABLET

Everyone who jammed into the Yerba Buena Center for the Arts in San Francisco on January 27, 2010, knew what they were there for: Apple CEO Steve Jobs’ introduction of a thin, always-on tablet device that would let people browse the Web, read books, send email, watch movies, and play games. It was also no surprise that the 1.5-pound iPad resembled an iPhone, right down to the single black button nestled below the bright 10-inch screen. But about an hour into the presentation, Apple showed something unexpected — something that not many people even noticed. In addition to the lean-back sorts of activities one expects from a tablet (demonstrated by Jobs while relaxing in a comfy black armchair), there was a surprising pitch for the iPad as a lean-forward device, one that runs a revamped version of Apple’s iWork productivity apps. In many ways, Jobs claimed, the iPad would be better than pricier laptops and desktops as a tool for high-end word processing and spreadsheets. If anyone missed the point, Apple’s design guru Jonathan Ive gushed in a promotional video that the iPad wasn’t just a cool new way to gobble up media — it was blazing a path to the future of computing.

Models of the Form

1900-1800 BCE
The wedge-shaped cuneiform on this Assyrian tablet is actually early legalese.

1700s
Erasable slates used by schoolkids put a premium on memorization.

1888
A paper mill employee cut, ruled, and bound reject sheets to create the legal pad.

1987
Star Trek introduced the PADD — Personal Access Display Device.

1993
The Apple MessagePad—and the Newton OS—almost recognized handwriting.

Illustrations: Elizabeth Traynor

Even though the iPad looks like an iPhone built for the supersize inhabitants of Pandora, its ambitions are as much about shrinking our laptops as about stretching our smartphones. Yes, the iPad is designed for reading, gaming, and media consumption. But it also represents an ambitious rethinking of how we use computers. No more files and folders, physical keyboards and mouses. Instead, the iPad offers a streamlined yet powerful intuitive experience that’s psychically in tune with our mobile, attention-challenged, super-connected new century. Instant-on power. Lightning-fast multitouch response. Native applications downloaded from a single source that simplifies purchases, organizes updates, and ensures security. Apple has even developed a custom chip, the A4, that both powers the machine and helps extend its battery life to 10 hours. The iPad’s price puts it in the zone of high-end netbooks: $500 for a basic 16-gig, Wi-Fi-only model. (A version with AT&T 3G connectivity will cost $130 more, plus $30 a month for unlimited data.) But don’t call it a netbook, a category Jobs went out of his way to trash as a crummy compromise. The iPad is the first embodiment of an entirely new category, one that Jobs hopes will write the obituary for the computing paradigm that Apple itself helped develop. If Jobs has his way, before long we may be using our laptops primarily as base stations for syncing our iPads.

The fact is, the way we use computers is outmoded. The graphical user interface that’s still part of our daily existence was forged in the 1960s and ’70s, even before IBM got into the PC business. Most of the software we use today has its origins in the pre-Internet era, when storage was at a premium, machines ran thousands of times slower, and applications were sold in shrink-wrapped boxes for hundreds of dollars. With the iPad, Apple is making its play to become the center of a post-PC era. But to succeed, it will have to beat out the other familiar powerhouses that are working to define and dominate the future.

There’s a lot to love about Apple’s vision. As we start to establish the conventions made possible by advanced multitouch, we’ll perform ever more complicated tasks by rolling, tapping, and drumming our fingers on screens, like pianists tickling the ivories. The iTunes App Store model gives us a safe and easy means to get powerful programs at low prices. Rigidly enforced standards of aesthetics will ensure that the iPad remains an easy-to-navigate no-clutter zone. And since we’re obligated to link our credit cards to Apple, micropayments are built in, providing traditional media companies with at least a hope of avoiding the poorhouse.

But there’s also a lot to worry about. It’s a pain to lug around an external keyboard, which many people will require if they’re serious about banging out documents. (My brief exposure to the iPad’s onscreen keyboard wasn’t encouraging.) Apple’s system is closed in a way that the Mac (and even Windows) OS never was — all apps are cleared through Cupertino, and developers and publishers are a step removed from their users, who make transactions through the App Store.

That Apple-centric vision assures a nasty fight ahead. In particular, the iPad represents a head-butt to another bold new model for computing: Google’s Chrome OS.

In some ways, Chrome is even more radical than the iPad. Spawn of a pure Internet company, it is itself pure Internet. While Apple wants to move computing to a curated environment where everything adheres to a carefully honed interface, Google believes that the operating system should be nearly invisible. Good-bye to files, client apps, and onboard storage — Chrome OS channels users directly into the cloud, with the confidence that the Web will soon provide everything from native-quality applications to printer drivers. Google hopes that a wave of Chrome-powered netbooks set for release this fall will hasten that day, and its designers are already sketching out the next generation of Chrome OS devices, including touchscreen tablets.

Google vice president Sundar Pichai contends that having an iTunes-like app store is unnecessary, because desktop software is just about dead. “In the past 10 years, we’ve seen almost no new major native applications,” he says, ticking off the few exceptions: Skype, iTunes, Google Desktop, and the Firefox and Chrome browsers. “We are betting on the fact that all the user will need are advanced Web apps.” (Pichai acknowledges that the Web can’t currently handle powerful games but says that new technologies like Native Client and HTML5 will fix that problem.)
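
To make Pichai's bet a little more concrete, here is a minimal, hypothetical sketch (not from the article) of the sort of plugin-free HTML5 app he has in mind: a small animation loop drawn on a canvas element using only standard browser APIs, written in TypeScript. The canvas id and dimensions are illustrative assumptions.

```typescript
// Minimal sketch of a browser-only "advanced Web app": an HTML5 canvas
// animation driven by requestAnimationFrame, with no native code or plugins.
// Assumes a page containing <canvas id="game" width="320" height="240">.

const canvas = document.getElementById("game") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

let x = 0;                        // horizontal position of a moving square
let last = performance.now();     // timestamp of the previous frame

function frame(now: number): void {
  const dt = (now - last) / 1000; // seconds elapsed since the last frame
  last = now;

  x = (x + 60 * dt) % canvas.width; // move 60 pixels per second, wrapping

  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillRect(x, canvas.height / 2 - 10, 20, 20); // draw the square

  requestAnimationFrame(frame);   // let the browser schedule the next frame
}

requestAnimationFrame(frame);
```

Everything above runs in any modern browser; richer versions of the same idea (games, editors, spreadsheets) are what Pichai means by "advanced Web apps."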

Though critics of Google worry about the company’s power, Chrome OS is an open source system, and the Web apps Google encourages will, unlike Apple’s, be available on any device or browser.

Apple won’t talk on the record about Google’s browser-centric approach, but Jobs did address the notion when I interviewed him about interfaces several years ago. “While we love the Web and we’re going to have the best Web browser in the world, we do not want to make our UI look like a Web page,” he said. “We think that’s wrong.” Clearly, he still thinks so. Apple prefers the pristine orderliness of autocracy to the messy freedom of an open system.

While Google and Apple are each positioning themselves as pioneers of the next paradigm, Microsoft — the company that dominates the current one — has a more iterative approach. It’s taking an evolutionary path that integrates the seismic changes in the digital world into its flagship products, without any jarring leaps. Three years back, Microsoft introduced Surface, a technology that lets people use their fingers and objects to interact with table-sized displays. Later this year, the Xbox will implement a motion-tracking system called Project Natal. Chief strategy officer Craig Mundie, Redmond’s delegated seer, says it’s all part of a transition from the GUI — the graphical user interface that began with Mac and Windows — to the NUI — a natural user interface based on touch, gestures, and voice recognition.

The full article can be read here.
