
‘For the past thirty-three years,’ said Steve Jobs, in remission from cancer and looking back on his life while talking to students in 2005, ‘I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?”’ Jobs had changed the world: ‘Of all the inventions of humans, the computer is going to rank near or at the top as history unfolds.’ Intolerant and intolerable, unkind and often cruel, Jobs believed creativity was about following your instincts – ‘connecting the dots’. Jobs was the son of two graduate students – a Syrian and his American lover – but ‘my biological mother was a young, unwed graduate student, and she decided to put me up for adoption’ – and he was adopted by a former coastguard and his wife. As a schoolboy he worked at the business-machine company Hewlett-Packard; he later dropped out of college (staying on to take a calligraphy course), travelled to India, embraced Zen Buddhism, then at twenty-one founded a company in his parents’ garage, where he started to design the first consumer computer. He called it Apple.

The idea of computers was not new.*

Their development made inevitable the arrival of computers and smartphones cheap and small enough for ordinary people to use, but it took forty years to happen. In 1959, Robert Noyce at Fairchild Semiconductor invented a single piece – a monolithic integrated circuit, a chip – that made the revolution possible, just as Paul Baran was developing a messaging network that could function after a nuclear apocalypse. In 1968, Alan Kay, later of Xerox PARC, predicted a ‘personal, portable information manipulator’ that he called a Dynabook, just as the first liquid-crystal displays were demonstrated. In 1975, IBM created its first portable device, the same year that a Seattle lawyer’s son, Bill Gates, dropped out of Harvard to develop sets of instructions for computers to use – software; in 1980 IBM licensed his operating system, and five years later Gates launched a more sophisticated system, Windows.

In 1974, the Pentagon’s ARPA network, designed to keep the leadership connected after a nuclear war, was extended into academia by Vint Cerf and Bob Kahn, who called their linking of networks the inter-network – the internet. ARPANET itself was decommissioned in 1990, but by then the European nuclear research organization CERN had adopted the system, which in 1989 inspired a thirty-four-year-old computer scientist there, Tim Berners-Lee: ‘I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and – ta-da! – the World Wide Web.’ Like Edison or Watt before him, he did not claim to have invented it: ‘Most of the technology involved in the web, like the hypertext, like the Internet, multi-font text objects, had all been designed already. I just had to put them together.’ He invented a system of addresses – //www – that became so universal the internet almost became a groove of the human brain. ‘I never foresaw how big the Net would become,’ Berners-Lee told this author, ‘but I had designed it to be totally universal. And there was a moment as it grew exponentially that I realized it would change the world.’

In 1984, Jobs, a visionary bundler of ideas, tweaker of inventions and crafter of exquisite designs, launched the Macintosh, a computer that a consumer could use to move between different programs, adding a hand control – the mouse, adapted from Douglas Engelbart’s 1960s invention – and the ability to choose new fonts, inspired by the calligraphy course he had once taken. ‘I was lucky,’ he explained. ‘I found what I loved to do early in life.’ But ‘Then I got fired. The heaviness of being successful was replaced by the lightness of being a beginner again.’

When Jobs returned to Apple he devised, starting in 1998, a series of devices beginning with an ‘i’ (standing for ‘internet, individual, instruct, inform and inspire’). In 2007, his iPhone, a fashionable yet indispensable machine, changed human behaviour. By 2020, around 2.2 billion iPhones had been sold – 19 billion smartphones altogether – tiny mechanisms that altered human nature in ways not yet clear. Smartphones became so essential that they were almost extensions of the body. By 2005, at least 16 per cent of humans were using the internet; by 2019, the figure was 53.6 per cent, and 86.6 per cent in the west. The internet opened a mass of new knowledge to citizens, and many abandoned more laborious yet more trustworthy sources of information. The internet thickened society, adding new layers of discourse and power that gave a fresh dynamic to already pluralistic societies – a further shift, in Foucault’s analysis, from ‘sovereign power’ to ‘disciplinary power’.
