Netymology: From Apps to Zombies: A Linguistic Celebration of the Digital World

by Tom Chatfield

eBook

$3.99 


Overview

Composed of 100 bite-sized entries of 400 to 600 words each, Netymology weaves together stories, etymologies and analyses around digital culture's transformation, and creation, of words.

Tom Chatfield presents a kaleidoscopic, thought-provoking tour through the buried roots of some of the digital age's most common terms: from the @ and Apple symbols, to HTML and Trojan horses, to the twisted histories of new forms of slang, memes, text messages and gaming terms.

There's also discussion of the trends behind digital words, and of the ways language itself is being shaped by new forces - and revelations about how these forces are, in turn, reshaping us.


Product Details

ISBN-13: 9781780879949
Publisher: Quercus Publishing
Publication date: 03/28/2013
Sold by: Hachette Digital, Inc.
Format: eBook
File size: 2 MB

About the Author

Tom Chatfield is the author of four previous books of non-fiction exploring digital culture. A fortnightly columnist for the BBC, a TED veteran, and an international speaker and broadcaster, he has worked with some of the world's leading technology firms. He took a doctorate and taught at St John's College, Oxford, before moving to London to write.

A freelance author, consultant, game writer and theorist, Chatfield published his first book, Fun Inc., worldwide in 2010. He has done design, writing and consultancy work for games and media companies including Google, Mind Candy, VCCP, Preloaded, Grex, Red Glasses and Intervox, and has spoken widely on technology, media and gaming at forums including TED Global, the Cannes Lions Festival, the House of Commons, the RSA, the ICA and the World IT Congress. A former senior editor at Prospect magazine, he writes widely in the national press, including for the Observer, Independent, Sunday Times, Wired, New Statesman, Evening Standard and Times Literary Supplement, and for the site Boing Boing.

Read an Excerpt

It’s easy to forget that, for most of its existence, the English word ‘computer’ referred not to machines but to people who performed calculations. First used in the seventeenth century, the term arrived via French from the Latin computare, meaning to count or add up. Computare itself derived from the combination of com, meaning ‘with’, and putare, which originally meant ‘to prune’ in the sense of trimming something down to size, and which came to imply ‘reckoning’ by analogy with mentally pruning something down to a manageable estimate.

Long before eminent Victorians like Charles Babbage had even dreamed of calculating machines, human computing had been vital to such feats as the ancient Egyptians’ understanding of the motion of the stars and planets, with mathematicians like Ptolemy laboriously determining their paths (he also managed to calculate pi accurately to the equivalent of three decimal places: no mean feat for the first century AD).

As mathematics developed, the opportunities for elaborate and useful calculations increased – not least through the development of tables of logarithms, the first of which were compiled by the English mathematician Henry Briggs in 1617. Such tables immensely simplified the complex calculations vital to tasks like navigation and astronomy by providing pre-calculated lists of the ratios between large numbers – but their construction demanded immense feats of human calculation, both by mathematicians and by increasingly necessary groups of trained assistants.

Even as recently as the Second World War, when Alan Turing and his fellows were establishing the revolutionary foundations of modern computing, the word ‘computers’ still referred to dedicated human teams of experts – like those working around Turing at Bletchley Park in England.

According to the Oxford English Dictionary, it wasn’t until 1946 that the word ‘computer’ itself was used to refer to an ‘automatic electronic device’. This was, of course, only the beginning; and since then both the senses and the compound forms of the word have multiplied vastly. From ‘microcomputers’ to ‘personal computers’ and, more recently, ‘tablet computers’, we live in an age defined by Turing’s digital children.

It’s important to remember, though, just how recently machines surpassed men and women in the computation stakes. As late as the 1960s, teams of hundreds of trained human computers housed in dedicated offices were still being used to produce tables of numbers: a procedure that the first half of the twentieth century saw honed to a fine art, with leading mathematicians specializing in breaking down complex problems into easily repeatable steps.

It’s a sign of how fast and how completely times have changed since then that human computation is almost forgotten. And yet, in different forms, its principles remain alive in the twenty-first century – not least under the young banner of ‘crowdsourcing’, a word coined in 2006 in an article for Wired magazine by the writer Jeff Howe to describe the outsourcing of a task to a large, scattered group of people.

From identifying the contents of complex photographs to answering fuzzy questions or identifying poorly printed words, there remain plenty of tasks in a digital age that people are still better at than electronic computers. We may not call it ‘human computation’ any more, but the tactical deployment of massed brainpower to solve some problems remains more potent than ever.

Table of Contents

Introduction 1

1 Selfie Consciousness 5

2 #WhyDoWeDoThis? 8

3 Transistor (Not Iotatron) 11

4 Emoji and Emoticons 14

5 Computers 17

6 Signs of Our Times: @ and 20

7 Marking Up 23

8 Myths and Monsters 26

9 Speak, Memory 28

10 Why Wiki? 30

11 Buffed-Up Gamers 32

12 Very, Very Big and Very, Very Small 34

13 The Names of Domains 37

14 Rise of the Robots 40

15 Cyber-Everything 43

16 Three-Letter Words 46

17 Everyone's an Avatar 49

18 On Memes 51

19 Hacking Through the Net 54

20 Do You Grok It? 57

21 Sock Puppets and Astroturf 60

22 Bluetooth 63

23 The Cupertino Effect 65

24 The Scunthorpe Problem 68

25 The Coming of the Geeks 71

26 Beware of the Troll 74

27 Bitten by Bugs 77

28 Bits, Bytes and Other Delights 80

29 Twinks, Twinked, and Twinking 83

30 Talking Less About Trees 85

31 ZOMGs, LOLZ 88

32 Lifehacking 90

33 The Multitasking Illusion 93

34 The Streisand Effect 96

35 Acute Cyberchondria 99

36 Casting the Media Net 102

37 Bionic Beings and Better 105

38 Technological Singularities 108

39 Google and Very Big Numbers 111

40 Status Anxiety 114

41 The Zombie Computing Apocalypse 117

42 To Pwn and Be Pwned 120

43 Learning to Speak l33t 122

44 Getting Cyber-Sexy 124

45 Slacktivism and the Pajamahadeen 127

46 Gamification and the Art of Persuasion 129

47 Sousveillance 132

48 Phishing, Phreaking and Phriends 135

49 Spamming for Victory 138

50 Gurus and Evangelists 141

51 CamelCase 144

52 The Blogosphere and Twitterverse 147

53 Phat Loot and In-Game Grinding 150

54 Meta- 153

55 TL;DR 156

56 Apps 159

57 Fanboys and Girls 162

58 Welcome to the Guild 165

59 Facepalms and *Acting Out* 168

60 Finding Work as a Mechanical Turk 171

61 Geocaching 174

62 The Beasts of Baidu 177

63 Snowclones 179

64 Typosquatting 182

65 Egosurfing and Googlegangers 185

66 Infovores, Digerati and Hikikomori 187

67 Planking, Owling and Horsemanning 190

68 Unfriend, Unfavorite (and Friends) 193

69 Sneakernets and Meatspace 196

70 Going Viral 198

71 Dyson Spheres and Digital Dreams 201

72 Welcome to Teh Interwebs 204

73 On Good Authority 206

74 A World of Hardware 209

75 Darknets, Mysterious Onions, and Bitcoins 212

76 Nets, Webs and Capital Letters 215

77 Praying to Isidore and Tweeting the Pope 218

78 QWERTY and Dvorak 221

79 Apples Are the Only Fruit 224

80 Eponymous Branding 226

81 Mice, Mouses and Grafacons 228

82 Meh 230

83 Learn Olbanian! 232

84 Booting and Rebooting 235

85 Cookie Monsters 237

86 Going Digitally Native 240

87 Netiquette and Netizens 243

88 The Names of the Games 246

89 Flash Crowds, Mobs, and the Slashdot Effect 249

90 Godwin's Law 252

91 From Beta to Alpha to Golden Master 254

92 Mothers and Daughters, Masters and Slaves 256

93 Bit Rot 258

94 Nonprinting Characters 260

95 Wise Web Wizards 262

96 Disk Drives 265

97 Easter Eggs 268

98 Why Digital? 273

99 Filing Away Our Data 275

100 Artificial Intelligence and Hiring Tests 276

… and Finally 279

Acknowledgments 281

Select Bibliography and Further Reading 283

Notes and References 285

Index 299
