Software is a Long Con
I had a conversation with a bridge engineer one evening not long ago. I said, “Bridges, they are nice, and vital, but they fall down a lot.”
He looked at me with a well-worn frustration and replied, “Falling down is what bridges do. It’s the fate of all bridges to fall down, if you don’t understand that, you don’t understand bridges.”
“Ok, I do understand that,” I replied. “But they fall down a lot. Maybe if we stepped back and looked at how we’re building bridges –”
“You can’t build a bridge that doesn’t fall down. That’s just not how bridges work.”
I took a deep breath. “What if you could build a bridge that didn’t fall down as often?”
“Not practical — it’s too hard, and besides, people want bridges.” By now, he was starting to look bored with the conversation.
“I bet if you slowed down how you build bridges, you could make ones that lasted decades, even in some cases, centuries. You might have to be thoughtful, set more realistic expectations, do a lot more of the design of a bridge before you start building it, but…”
He interrupted me again. “Look, you’re not a bridge engineer, so you don’t really understand how bridges work, but people want bridges now. So no one is going to build a bridge like that, even if it were possible, and I’m not saying it is.”
“But people get hurt, sometimes die, on these bridges.”
“Bridges fall down. Sometimes people are on them when they do. That’s not my fault as a bridge engineer, that’s literally how gravity works,” he said.
“I know there will always be accidents and problems with bridges, but I really do think that you could build them with careful planning, and maybe shared standards and even regulations in such a way that bridge collapses could be rare. Some of the problems with bridges are faults we’ve known about for decades, but they still get built into bridges all the time.”
He took a deep breath, and pinned me with a stare. “Even if we could, and it’s still entirely possible that no one can build these mythical bridges you’re talking about, that would slow down the building of bridges. People need bridges to get places. No one could afford to build bridges that slowly, and people would complain.” He stretched out the –plaaaain in complain, in a way that made clear this was the end of the argument and he’d won.
“They might not complain if they didn’t fall off bridges so often,” I mumbled.
He heard me. “Unlike you, people know that bridges fall down.”
Just then, a friend of mine, also a writer, also interested in bridges, stopped by.
“Hey guys!” he said. “So it looks like there’s a crew of Russian bridge destroyers with hammers and lighters who are running around in the middle of the night setting fire to bridges and knocking off braces with hammers. They started in Ukraine but they’re spreading around the world now, and we don’t know if our bridges are safe. They’ve studied bridges carefully, and they seem to be good at finding where a bridge is most flammable and which braces to knock off with their hammers.”
We both regarded my friend a long moment, letting it sink in. I turned back to the bridge engineer and said, “Maybe we need to make them out of non-flammable material and rivet them instead of using exposed braces and clamps.”
But he was already red in the face, eyes wide with anger and fear. “GET THE RUSSIANS!” he screamed.
OK, obviously it’s not bridges I’m talking about, it’s software. And that other writer is Wired’s Andy Greenberg, who wrote a piece not that long ago on Russian hacking.
Greenberg’s detailed and riveting story focuses largely on the politics of hacking, and the conflict between an increasingly imperialist Russia and Ukraine, with an eye towards what it means for America. For people who respond to such attacks, like FireEye and Crowdstrike, these kinds of events are bread and butter. They have every reason to emphasize the danger Russia (or, a few years ago, China) poses to the USA. It’s intense, cinematic stuff.
It’s also one of a long sequence of stories in this vein. These stories, some of which I’ve written over the years, show that our computers and our networks are the battlegrounds for the next great set of political maneuvers between nation-states. We the people, Americans, Russians, whomever, are just helpless victims for the coming hacker wars. We have Cyber Commands and Cyber attack and Cyber defense units, all mysterious, all made romantic and arcane by their fictional counterparts in popular media.
But there’s another way to look at it. Computer systems are poorly built, badly maintained, and often locked in a maze of vendor contracts and outdated spaghetti code that amounts to a death spiral. This is true of nothing else we buy.
Our food cannot routinely poison us. Our electronics cannot blow up and burn down our houses. If they did, we could sue the pants off whoever sold us the flawed product. But not in the case of our software.
The Software Is Provided “As Is”, Without Warranty of Any Kind
This line is one of the most common in software licenses. In developed nations, it is a uniquely low standard. I cannot think of anything else infrastructural that is held to such a low standard. Your restaurants are inspected. Your consumer purchases are enveloped in regulations and liability law. Your doctors and lawyers must be accredited. Your car cannot stop working while it is going down the freeway and kill you without consequences, except maybe if the failure is caused by a software bug.
It is to the benefit of software companies and programmers to claim that software as we know it is the state of nature. They can do stupid things, things we know will result in software vulnerabilities, and they suffer no consequences because people don’t know that software could be well-written. Often this ignorance includes developers themselves. We’ve also been conditioned to believe that software rots as fast as fruit. That if we waited for something, and paid more, it would still stop working in six months and we’d have to buy something new. The cruel irony of this is that despite being pushed to run out and buy the latest piece of software and the latest hardware to run it, our infrastructure is often running on horribly configured systems with crap code that can’t or won’t ever be updated or made secure.
People don’t understand their computers. And this lets people who do understand computers mislead the public about how they work, often without even realizing they are doing it.
Almost every initial attack comes through a phishing email. Not just initial attacks on infrastructure: initial attacks on everything begin with someone clicking on an attachment or a link they shouldn’t. This means most attacks rely on a shocking level of digital illiteracy and bad IT policy, allowing malware to get to end-user computers, and failing to train people to recognize when they are executing a program.
From there, attackers move laterally through systems that aren’t maintained, or are written in code so poor it should be a crime, or more often, both. The code itself isn’t covered by criminal law, or consumer law, but contract law. The EULAs, or End User Licensing Agreements (aka the contracts you agree to in order to use software), which are clicked through by infrastructure employees, are as bad as, or worse than, the ones we robotically click through every day.
There are two reasons why I gave up reporting on hacking attacks and data breaches. One is that Obama’s Department of Justice had moved its policies towards making that kind of coverage illegal, as I’ve written about here. But the other, more compelling reason, was that they have gotten very, very boring. It’s always the same story: no one is using sophisticated techniques, because why would you bother? It’s dumb to burn a zero day when you can send a phishing mail. It’s dumb to look for an advanced zero day when you can just look for memory addressing problems in C, improperly sanitized database inputs, and the other programmatic problems we solved 20 years ago or more.
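To make “improperly sanitized database inputs” concrete: this is SQL injection, a bug class that has been understood for roughly two decades. A minimal sketch using Python’s standard-library sqlite3 module; the table and inputs are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: attacker-controlled text is spliced into the query string,
# so the quote in the input rewrites the query's logic.
vulnerable = f"SELECT name FROM users WHERE name = '{attacker_input}'"
print(conn.execute(vulnerable).fetchall())   # every row comes back

# The decades-old fix: parameterized queries keep data out of the
# query's structure, so the same input matches nothing.
safe = "SELECT name FROM users WHERE name = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())   # empty list
```

The fix has been in every mainstream database API for a very long time; the bug survives anyway, which is the point of the paragraph above.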
Programmers make the same mistakes over and over again for decades, because software companies suffer no consequences when they do. Like pollution and habitat destruction, security is an externality. And really, it’s not just security, it’s whether the damn things work at all. Most bugs don’t drain our bank accounts, or ransom our electrical grids. They just make our lives suck a little bit more, and our infrastructure fail a little more often, even without any hackers in sight.
When that happens with a dam, or a streetlight, or a new oven, we demand that the people who provided those things fix the flaws. If one of those things blows up and hurts someone, the makers of those things are liable for the harm they have caused. Not so if any of these things happen because of software. You click through our EULA, and we are held harmless no matter how much harm we cause.
When I became a reporter, I decided I never wanted my career to become telling the same story over and over again. And this is, once again, always the same story. It’s a story of software behaving badly, some people exploiting that software to harm other people, and most people not knowing they could have it better. I’m glad people like Andy Greenberg and others at my old Wired home, the good folks at Motherboard and Ars Technica, and others, are telling these stories. It’s important that we know how often the bridges burned down.
But make no mistake, as long as we blame the people burning the bridges and not the people building them, they will keep burning down.
And shit software will still remain more profitable than software that would make our lives easier, better, faster, and safer. And yeah, we would probably have to wait a few more months to get it. It might even need a better business model than collecting and selling your personal information to advertisers and whomever else comes calling.
I could keep writing about this; there’s a career’s worth of pieces to write about how bad software is, and how insecure it makes us, and I have written many of those pieces. But like writing about hackers compromising terrible systems, I don’t want to keep writing the same thing telling you that software is the problem, not the Chinese or the Russians or the bogeyman du jour.
You, the person reading this, whether you work in the media or tech or unloading container ships or selling falafels, need to learn how computers work, and start demanding they work better for you. Not everything, not how to write code, but the basics of digital and internet literacy.
Stop asking what the Russians could do to our voting machines, and start asking why our voting machines are so terrible, and often no one can legally review their code.
Stop asking who is behind viruses and ransomware, and ask why corporations and large organizations don’t patch their software.
Don’t ask who took the site down, ask why the site was ever up with a laundry list of known vulnerabilities.
Start asking lawmakers why you have to give up otherwise inalienable consumer rights the second you touch a Turing machine.
Don’t ask who stole troves of personal data or what they can do with it, ask why it was kept in the first place. This all goes double for the journalists who write about these things — you’re not helping people with your digital credulity, you’re just helping intel services and consultants and global defense budgets and Hollywood producers make the world worse.
And for the love of the gods, stop it with emailing attachments and links. Just stop. Do not send them, do not click on them. Use WhatsApp, use Dropbox, use a cloud account or hand someone a USB drive if you must, but stop using email to execute programs on your computer.
Thanks to my Patrons on Patreon, who make this and my general living possible. You can support more of this work at Patreon.
Image CC Skez
Well, I think we should get the Russians and the hackers.
But otherwise, wow. Well said, Quinn. We have been shipping our software engineering offshore, and shipping people who have no stake in America here on temporary visas to do work Americans could do for $3 an hour more. It’s like having your bridges built by engineering firms with no legal liability and no intention of ever using the bridge.
May the road rise up and meet us, metaphorically speaking. Or at least the software.
“Stop using email to execute programs on your computer”? Might as well stop communicating electronically, when every picture, every document, every video, as well as every link, is an opportunity to execute malware, whether it is attached or on Dropbox or wherever. In the interim I act as if my computer and communications are always compromised. You left out “big data” tracking every click and mouse movement and selling them. Running netstat -f, I have 28 allegedly Akamai TCP ports open right now.
The “two factor” identification nightmare of phone companies supplying passwords to callers with minimal spoofing effort is a bridge designed to fail. This is how CWA got Brennan’s email account and phone hacked. And, shades of Murdoch’s UK and US phone hacking, it was probably done to Weiner as well.
Notice Bannon’s involvement with gamer virtual money sales, when does a reseller just become a simple fence? When does gamer hacking roll into vote hacking. This is not rocket science connecting the dots when you have a gamergater and a Russian stooge standing together on Assange’s embassy steps.
I see what you did here. I suggest you just wear a “Putin bought me” t-shirt.
Sorry about the coarseness. That is my problem; I have a foot-in-mouth disorder. I think most of us are grumpy because there is no solution to the people-will-die, law-of-the-jungle business model short of going into communities. No technical person really believes their computer, phone, ATM, transactions, or “savings” are secure. Non-technical people think, “why should I not use my debit card instead of cash, it is so easy, that is the way it should work. Why should I not use my phone to purchase items over the net.” Have multiple emails, be careful of two factor ID hacks, never use a debit card.
The people who brought technology to the world are not “nice” people. Gates, Jobs, Zuckerberg, Ellison, Bezos to name a few are known by the gossip of their ruthlessness.
I am a bridge engineer.
Or rather, I am a software engineer. I once built a bridge, but it was made out of logs and I was 12. It’s probably fallen down and rotted by now. Still, I’ve learned a little about engineering since then. Knowing how bridges and software are built, there are some fundamental differences.
The first is the halting problem. Bridges, and other infrastructure governed by the laws of physics, do not suffer from an equivalent of the halting problem. By way of example, there are Roman bridges still in use today. They were some of the first bridges, and they may very well outlast human civilization. A bridge made out of nickel would see the death of our sun intact. Unlike civil engineering, in software engineering it is exceedingly difficult to prove mathematically that some Turing machines have no bugs. As they increase in complexity, this becomes impossible. This is an immutable property of software; it is not an immutable property of bridges.
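The diagonalization behind the halting problem can be sketched in a few lines. Suppose, hypothetically, someone handed us an oracle `halts(f)` that correctly decides whether calling `f()` ever returns; we could then build a program the oracle must get wrong. The names below (`make_troublemaker`, the `pessimist` oracle) are invented for illustration, not a real API:

```python
def make_troublemaker(halts):
    """Given a claimed halting oracle, build a program that defies it."""
    def troublemaker():
        if halts(troublemaker):
            while True:        # the oracle said "halts", so loop forever
                pass
        return "halted"        # the oracle said "loops forever", so halt
    return troublemaker

# Any concrete oracle is wrong on its own troublemaker. Try one that
# answers "loops forever" for every program:
pessimist = lambda f: False
t = make_troublemaker(pessimist)
print(t())   # prints "halted": the program the oracle said never halts, halts
```

This is only a toy, not a proof, but it shows the shape of the argument: whatever the oracle predicts, the constructed program does the opposite, so no such oracle can exist.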
The second fundamental difference is that software is primarily a social science, not an engineering discipline. Primarily being the operative word. Computer science is all about the study of Turing machines. It’s a branch of mathematics. Software engineering is a misnomer; it’s partly about communication with Turing machines, but mostly about communication with the other humans interacting with the software and code. The Turing machines just execute the results of those human interactions over and over again. Regulating good code would be more like regulating polite speech, or a soccer match, than building material standards. There are more variants of communication with Turing machines than there are languages between humans at this point. Among them there are too many dialects, accents, and pidgins to count. You can count the types of bridges on your hands and toes.
I think the first problem is intractable, because mathematics says it is intractable. I also think that “perfect is the enemy of good”, and that the second problem is where efforts should be spent. We now exist in an age where our algorithms have collectively become a form of super consciousness which is changing our society as fast as we change them. By changing the economic pressures which govern how software is written, we can change how software is written and how it influences our society. In other words, the solutions to improving software will be social, not technical, as you point out.
“Bridges” will still fall down though. They will always fall down, but we can change how fast they fall, and how much damage they cause when they do.
Great article :)
this is certainly a discussion whose time has come. for the last ~40 years individuals have been mesmerized and delighted with the machines they could buy (or build) for personal use. those machines, personal computers, were novel, mysterious, and allowed much easier typing, much faster communications with others, and rapid access to an enormous universe of information.
but those machines had problems, problems we happily overlooked in part because of our fascination with and desire for the newest and best of that technology, and in part because we believed each time when we were told that we were buying “greatly improved” equipment – storage capacity of a megabyte, then 10 megabytes, processing speed from 3 mhz to 3.3 ghz (and multiple processors on the same cpu chip) – amazing. hypnotizing. but dysfunctional in the long run.
from its beginning in the 80s, the personal computer’s hardware changed by the month. in five years time an expensive and well-built machine could be passe or worse, no longer compatible with other equipment.
software similarly became outdated within years, either because it was not sufficiently capable, because it didn’t fit with newer equipment, or because its originator refused to support it in favor of a newer of its products.
hard drives were reliable, mostly. but occasionally catastrophically not, and fragile. over time the hardware interface, the plug, that connected a device like a hard drive or a disk drive to the cpu would change repeatedly, then become unsupported.
worst of all were several extremely serious general problems with using a personal computer (used by individuals and organizations):
– the data stored on round floppy disks, square floppy disks, compact disks, digital video disks, and flashdrives could not be reliably stored over time. this is due to the storage medium becoming outmoded and thus unreadable, and to the “fading” over time of the stored electronic characters.
consider photographs printed on photopaper 100 yrs ago. they are still readily revealing their content. consider a typed manuscript from the 1920’s. still readable and copiable.
– it has come to pass in the last decade that no information stored on a personal computer is safe from being stolen without extraordinary effort to secure it. no information stored on a computer is safe from government(s) spying.
– the ordinary user of email, whatsapp, cloud storage, etc. has no idea of exactly where info he sends to others is going to go, nor to whom. further, he has no idea of how much information about himself he is revealing/making available to unknown others and organizations.
– the use of computer input devices (peripherals) like scanners, flashdrives, and hard drives, and output devices like sd cards, flashdrives, etc, PLUS widely varying and obtuse instructions for transferring data, absolutely guarantee that a lot of data the originator would rather have kept is lost in transfer efforts.
– the operation of this machine (personal computer) and its components is completely opaque to the user without extraordinary effort.
humans have been way too accepting and complacent about this useful new machine, whose manufacturers have saddled individual users with most of the costs of poor engineering, very rapid obsolescence, lack of privacy, and incompetent long term storage capability.
it’s time for an accounting. making great fortunes by constantly dumping new deficient machines and software on the market is no longer acceptable.
out of time for now, but this is an extremely important topic to discuss – and cuss :)
“Over time, all data approaches deleted, or public.”
It was about 70 years after the Model T before we collectively decided that you shouldn’t sell a model of car unless it passes a crash test.
Generations had to come and go. By then the vast majority of Americans couldn’t remember a time when ordinary folk didn’t use cars every day. That’s what it takes. Nobody has any interest at all in real quality or safety until the product has become so completely boring that everyone has one, but nobody thinks about it. Only then does actual quality (as opposed to superficial, showy quality) start to matter.
But, maybe we’re getting there. Slowly but surely.
You know, you were so close to the actual cause, but you didn’t say it. If governments, states and individuals allowed bridges that would fall down in a year, they could pay like 1/1000 of the price, perhaps less. Heck, you could throw a plank over a creek and drive your car over it, instant bridge!
If software purchasers (Often you) refused to buy software that was poorly built and not guaranteed, you might have some safe software that would last. You aren’t interested in paying 1000x the price, are you? So enjoy these planks and give me my $30.
” an increasingly imperialist Russia,”
This wholesale attack on an entire profession using a phony analogy is amazing. Those of us who have made our careers in IT have much to answer for, but this stupid post could well have been produced by a monkey at a typewriter.
The analogy sucks. Equating mechanical structures with logic is false. If you were going to insist on making that kind of silly construction, bridges and highways would be the equivalent of hardware while cars, trucks and busses along with the all too human people who drive them would be software and the liveware that operates on the hardware. To get an idea how silly the post’s proposition is, turn it around. If bridges and cars had evolved as profoundly as hardware and software have over the last 50 years we would have bridges over the oceans, cars would traverse them in seconds and the cost per crossing would be pennies instead of the trillion or so dollars we desperately need to spend domestically on aging bridges and transportation infrastructure to keep it from collapsing.
Reliability testing/debugging is hard, and expensive. At the gross level it is pretty straightforward, most cars run off the showroom floor while most hardware and software executes out of the box. But, I would no more buy the first model year of a new car than the initial release of a new generation of hardware or software. There are too many unknowns, things that do not work exactly as expected, or have short serviceable lives or plain old screw ups. By the second model year, or version 2.0, early adopters have identified many of the bugs, and manufacturers have fixed them. The more standard deviations out you get from base functioning the harder it is to identify potential or low incidence problems and the sometimes bizarre ways people will operate cars or software. The amount of testing needed to find lower order bugs rises exponentially and varies inversely with probability.
Consider the space shuttle. It used rudimentary guidance computers designed in the ’60’s (you know, when lots of our bridges were new). It was so primitive it used magnetic core memory, tiny magnetic rings strung on wires; semiconductor memory had not been invented yet. The programs were kept on magnetic tape. For each mission they used three tapes: one to control launch, one to run orbital maneuvers, and the third to control re-entry. They mounted the first tape for lift off, loaded the second one when in orbit, and the third one to return to earth. That system was in use into this millennium while the astronauts were taking notebook computers, tablets and digital cameras to do their jobs. Why? Because that extremely simple system had been thoroughly debugged. NASA knew exactly how it worked and exactly what each of a limited number of instructions would do every time for every possible path through the programs.
Are any of us excited about the opportunity to trade our current systems for the extra time and effort required by simpler systems that have been debugged? Will you take 3 tapes to pay for your groceries or gas at a pump? One tape to initialize the transaction, the second tape to process it or run the gas pump, and the third to complete the transaction and turn off the pump. That kind of thing is the price of holding back technology until it is completely debugged and absolutely reliable. As it is today people are bitching at the extra seconds it takes to authenticate credit cards with chips. Mag stripes were easier and quicker, but they were hackable.
Moore’s law drove hardware development for half a century. It was pretty simple. Semiconductor density doubled roughly every 18 months, so 36 months, 3 years, is two Moore cycles which yields a 4 to 1 density increase. That makes everything that had gone before obsolete. Thank an electrical engineer. Since I started in the IT business in the late ’70’s that is more than two dozen Moore cycles. That is why IT is cheap today. There has been no equivalent change in bridge technology, and bridges built in the ’70’s are falling down. Disk drives, for example, went for about $1,000 per megabyte when I started, a 20mb drive was about $20k and the size of a washing machine. Current 3.5″ drives that hold several terabytes are around $25 per terabyte, that’s a few thousandths of a cent per megabyte. Bridge prices have not come down equivalently. We have reached a point where densities are no longer doubling every year and a half, but other changes in technology keep happening so the Moore equation still holds. Every three years it is all obsolete. Embrace what it does for us, don’t rage at it.
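The comment’s disk-price drop can be sanity-checked with rough arithmetic; the years and prices below are approximations taken from the comment itself, not precise figures:

```python
# Late-'70s disk storage: ~$1,000 per megabyte. Today: ~$25 per terabyte.
price_then_per_mb = 1_000.0
price_now_per_mb = 25.0 / 1_000_000          # $25 per TB, 1e6 MB per TB

improvement = price_then_per_mb / price_now_per_mb
print(f"price per MB fell by a factor of about {improvement:,.0f}")

# Compare with Moore's law: a doubling every 18 months, ~1978 to ~2018.
cycles = (2018 - 1978) / 1.5                 # roughly 27 doublings
moore_factor = 2 ** round(cycles)
print(f"{round(cycles)} Moore cycles predict a factor of {moore_factor:,}")
```

The observed factor (~40 million) and the Moore prediction (2^27, about 134 million) land in the same ballpark, which is the commenter’s point; it also shows that $25 per terabyte works out to thousandths of a cent per megabyte, not cents.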
Semiconductors got faster along with getting denser. That has meant that periodically interfaces between components have had to change to keep up. Think cars: if we had maintained the Model T starting user interface we would still be using hand cranks. Technology changed and so did the starting interface. Happens with IT too, but quicker, think Moore. Who’d a thunk it?
Some things like homes and coffee pots are commonly purchased. Others, like apartments, mineral rights and software are commonly leased. Cars roll both ways. If you don’t like leasing software or anything else, or don’t like the terms, the answer is simple, don’t do it. No one is holding a gun at your head compelling you to lease software under terms you do not agree with. Ain’t capitalism wunnerful?
In short, the author can put his phony bridge analogy and profound ignorance where the sun don’t shine. Maybe if he goes back to his typewriter in a million years he will produce “War and Peace”.
Remember, don’t click on those links in your email. Phishing catches someone every few minutes, be careful it is not you. Liveware is the biggest bug, always has been, always will be. Regards, Archy
lefty, i have read your long comment twice now. it is thoughtful and informative.
my beginning statement below (“as usual lefty you miss the point”) was inaccurate, reflexive, and churlish. i apologize.
your fine comment and the comments of some others here represent the views of people who have worked to construct this new computer-based communications system we are using more and more.
other comments like mine represent the views of users/consumers of the system and its products.
both sides have valid stories to tell.
thanks again for a point of view filled with some very interesting historical background.
Thanks orion, I’d be pleased to get along better with you.
Hi. Hope these facts aren’t too shocking, but I worked in tech for many years before I was a writer, and I’m a woman.
I know all this history. I was there, helping, for some of it. I’m comfortable with my analogy. I’m comfortable both embracing, and raging at, things I care about.
Hi back to ‘ya. Another note of appreciation for you. Working in a field dominated by male geeks, many of whom are solidly embedded in the Autism spectrum and largely having minimal social skills is not easy, plus the sharks in marketing… You are resilient, it has been a very rough ride.
We agree on a lot of issues, things like the profound naivete of many American technology users and especially your list of specifics at the end of your piece. It seems much of our corporate and public sectors are not much better informed than run of the mill users, but ready to prosecute, persecute and bumble along regardless of how little they understand.
Your 6 points at the end of your posting are specific and clear. They are a wonderful place for people, and institutions, to start building an informed and technically literate society. Without understanding there is little hope for achieving change, or ensuring that the change we get is for the better. It is the Pogo Syndrome once again, “We have met the enemy and he is us”.
Hiding your message under a sketchy analogy does not help communicate, it obfuscates. At least that’s the way it seems to me as a (semi informed) consumer of your writing here.
Your linked article on the risks to technology journalists and researchers was elegant and compelling. I will look for more of your work. You have made a fan.
Moore’s law drove hardware development for half a century.
This is only true with a whole bunch of caveats. Otherwise, how would you explain the stagnation of processor clock speeds over the last ten or so years?
This statement gets touted around like it’s gospel, and it’s most assuredly not. The only great gains recently have been on the power-efficiency side. When that dries up, we’ll be at the physical limitations of known materials science.
There may be more to come in advancements, but it’s not at all obvious how the gluttony enjoyed in the first couple of decades of the industry will continue.
Part of why there has been so little concern with the ‘correctness’ of code is because of this early quick advancement in speed and power. A developer could roll any kind of trash together assured they could be sloppy. Those days are at their end.
We absolutely need the same rigor in all software as is enjoyed by pacemakers. Hopefully as it becomes more obvious that advancement in raw power can no longer be taken for granted, folks will demand better quality. I do believe Quinn has it right. Software needs to become a true engineering discipline, accountable for its bullshit.
EULAs have to go. Standards must be made. It’s better that professionals in the field do it themselves before someone ignorant does it for them.
We have reached a point where densities are no longer doubling every year and a half, but other changes in technology keep happening so the Moore equation still holds. Every three years it is all obsolete. Embrace what it does for us, don’t rage at it.
I see on re-read that you covered the major caveat, but seem to think advancement will be always forthcoming. This is some kind of faith that needs a new word. Scientism maybe?
Wouldn’t surprise me to find there are limits somewhere. We have not hit them in 50-75 years, so it’s not in my living memory. I would not bet against Moore’s Law and EEs in the foreseeable future. One of the things that happens is that when we hit limits, alternatives are found. CPUs are a good example: when we hit the wall on processor speed and heat around 4 GHz and the millennium, multiple processor cores picked up the throughput. Rotating disks giving way to SSDs is another example. Dunno what to call it, beyond faith in electronics engineering. Wouldn’t want to confuse it with Christian Scientists:)
Pacemakers are not a good example. I can tell you horror stories of hardware failures like heart leads and batteries shorting, and software interfaces that rely, as in early computer communications, on obscurity for security. Is there any reason to believe the applications software running implantable devices is engineered any better than their wireless communications software or the flaky hardware?
It’s a question of how many nines is acceptable. Too much software can’t even get two nines in normal use.
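The “nines” arithmetic is simple but worth seeing spelled out: each nine of availability divides the allowed downtime by ten, and even two nines permits days of outage per year. A minimal sketch of that arithmetic (the function name is mine, not anything from the thread):

```python
# Rough downtime budget per year implied by N "nines" of availability.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes_per_year(nines: int) -> float:
    """Allowed downtime, in minutes per year, for e.g. nines=3 -> 99.9%."""
    unavailability = 10 ** -nines
    return MINUTES_PER_YEAR * unavailability

for n in range(2, 6):
    print(f"{n} nines: ~{downtime_minutes_per_year(n):.1f} min/year")
```

Two nines allows roughly 5,256 minutes (about 3.65 days) of downtime a year, which is why “can’t even get two nines” is such a damning bar to miss.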
as usual lefty you miss the point.
– the hardware is unreliable over not so long a time (cpu burn out, hard drive failure), and constantly being changed over not so long a time, e. g., isa, eisa, pci, epci, agp, etc, etc
– storage devices are constantly changing and eventually become obsolete – round floppy disk, square floppy disk, hard drive, cd, dvd, flash memory, solid state memory
– data saved, e. g., photos, documents, is not secure from loss over a relatively short time (a few years) due to characters fading or due to obsolescence of the storage device
– personal information is essentially sprayed out unprotected by personal computers to any number of ready organizations- google, usg,
– patent protection excesses allow organizations like microsoft to read the ids of devices attached to your motherboard
– the way personal computers – phones to desk tops – work and the instructions on how to use them effectively are opaque, ambiguous where ambiguity is fatal, and often missing critical detail.
– increasingly both hardware (android) and software are offered as bait to get us to give up personal info that can be monetized.
– it is impossible for ordinary users to track or follow info coming into or going out of our computers, or difficult to understand even roughly what they are doing.
we accept these products for what they allow us to do. we do not get good products that we can rely upon for work or storage over time, or that we can trust.
these are not good consumer products. they are designed to make money for someone other than the purchaser and user.
What can you expect, you named yourself after a 3rd rate router, a low rent rendezvous.
It’s not just IT. People are getting further and further removed from understanding what makes the devices they depend on work. A hundred years ago it was a mostly mechanical world. Devices were pretty simple; you could look at them and see how they worked. Not so much today. Even worse, most people can’t even describe the principles involved. Pretty scary. Still worser is that our governments are, and have been, methodically underfunding maintenance of the systems that make our country work, like bridges and highways and the electrical grid. Almost makes me glad I’m old, but I fear for my grandkids.
this is the key, lefty; it couldn’t have been put any better:
“… People are getting further and further removed from understanding what makes the devices they depend on work. A hundred years ago it was a mostly mechanical world. Devices were pretty simple, you could look at them and see how they worked. Not so much today… ”
it is very difficult for a consumer of electronic communications equipment like desktops, laptops, tablets, phones to have any idea of how his equipment works or, equally or more important, exactly where his intentionally communicated data + his personal metadata are going.
it boils down to needing to trust an opaque communications system that can use and abuse and hurt a consumer’s finances or reputation. the trust required to use this system is blind trust, and it erodes with abuse. adding to the problem is that computer usage is becoming widely mandatory (a little 4th-grade neighborhood boy whose family has no internet connection has a computer assignment for homework).
i don’t have an answer, but the situation is unstable. some regulation, e. g., of google’s vacuuming up personal info, of government and corporate spying, of microsoft’s absurdly favorable patent protections might help. but everything can’t be regulated. that still leaves, e. g., unreliable long term storage of info, as only one of several remaining major problems.
and we’re only now just entering the quantum computer/communications world. if only we could see those fucking little electrons zipping along, we might feel a little bit more in control :)
It’s getting harder even for those of us with a background. As an example, when I first looked inside a radio (long long ago) it was easy to see where the signal and power came in and to trace what each did until sound came out the speaker. Today I need a magnifying glass to even identify the miniaturized surface-mount components, and I’ve got no soldering equipment small enough to actually work on boards. OTOH, stuff works far better, has lower failure rates, and is orders of magnitude cheaper. Once I diagnose a problem I’m a parts pusher; I replace at the board level. Old industry story: long ago the Datapoint folks commissioned Intel to reduce a board-level processor of discrete components to a single chip. Once Intel did it, Datapoint decided they didn’t want it, because with the discrete-component processor they could diagnose a problem and replace a single component to put the processor back in service. The chip was the 8008, and when Intel remaindered it, that became the first generation of personal computers. The world changed, and Datapoint is long gone.
You’re right, we’ve got huge problems with the new world our connectedness has created, and damned little insight from regulators/legislators. Technology is moving far faster than those attempting to regulate how it is used and abused, and the gap is growing. OTOH my spouse and her sister, who is currently in Paris, are txting, and the round trip from txt to reply is running about 2 seconds. That is technically amazing, and cheap too, in addition to cluttering up NSA’s storage and adding hay to the stacks they’re searching.
Where we hit the physical limits on Moore’s Law is when transistors switch on the change of a single electron. At that point, stuff like cosmic rays, rather than intentional changes, could be in the driver’s seat. As you note, quantum comm is a whole different world. Ah, for the good old days when we could see what those little electron critters were doing, NOT.
Tks again for the chance to have a constructive conversation.
i’m stunned that this author would write this. shall I comb back through the long history of this author’s works and point out the many times she has railed against the very principles she is now looking down her nose at us from? Government regulation? Quinn Norton is writing in favor of government regulation?
I mean, she is 100% right, finally. but how about taking some responsibility for being one of the individuals, and even more so part of the worldwide community, who has spent years belittling anyone who dared suggest that government regulation of the software industry was a good idea?
I have no idea what you’re talking about. I’ve been writing about this for years. I’ve never been anti-regulation. I’ve been anti-gov, but I freely admit that regs and standards are one of the only good reasons to have governments — something I’ve been saying since 1994. I think you must be talking about some other Quinn Norton.
Hi, one who calls himself “Tim”, “this author” is named Quinn Norton.
Please use her name or get the fuck out. And, no, I will not tell you this a second time.
Nicely done. This needs wide circulation.
The crap-filled operating system that most PC users have had to put up with since the late 1980s and the entire third-world business milieu it spawned, a matter of ol’ fashioned robber baron skills rather than technological advancement, went a long ways toward setting the foundation for the current state of play Ms. Norton describes so well.
Standing ovation for not only tackling this issue in the first place, but nailing the problems perfectly. I want to cry from relief that *finally* someone with deep knowledge of software and its industry is saying this openly.
Of course you’re getting push back from the very people who have caused the problem and refuse to listen to anyone outside their echo chamber.
I’ve been sitting across from those people in meetings for 40 years trying to tell them that their view of the world through the limited and limiting lens of “the software development lifecycle” (from some holy writ, if you listen to the reverent tones of its acolytes) is damaging institutions, the software industry itself, and mostly, the people that software developers like to think of as the cause of all their problems: Users and anyone dependent on developers getting it right.
The software industry and the people who adapt and apply its products in organizations and businesses (IT shops) have caused a range of problems that really must be listed and talked about finally before we are ever going to solve them.
The standard IT shop is perfectly captured by this quote in your article: “People don’t understand their computers. And this lets people who do understand computers mislead the public about how they work” (and I think you’re very generous to add “often without even realizing they are doing it”). I’ve sat across from these guys as they knowingly lied to customers, knowing damn well there was no way they would be caught (and doing everything in their power to get me to stop telling the customer). But even giving them the benefit of the doubt, the way software is developed and promulgated has convinced generations of IT staff that it’s perfectly normal that bridges not only fall down frequently, but don’t work very well as bridges while they’re still standing, and the flies in the ointment they’ll admit exist are, in their minds, generally the fault of users.
Much has been written about this by venerated UX people, only to be generally ignored by the developers who could do something about it. This issue has been tackled in person by people like me, who are quickly marginalized by the very powerful IT hegemony. Strong words that I’m well aware sound hysterical to the Great Minds who are being criticized, but it’s long past due that we look at what their arrogance has wrought.
Examples from just my career of trying to get them to fix critical systems on behalf of users include:
Critical child welfare data goes missing because of poor software design and children died;
Vital medical research data goes missing because of poor software design; who knows how many lives were changed;
Highly classified (for a good damn reason) public health data spread far and wide because of poor software design.
In every one of those cases the software that the subject matter experts were using had built-in flaws that they had no way of understanding, let alone anticipating; but when the problems were exposed, it was the users who were blamed. “Well, bridge users should have known not to use bridges if they have no idea how they’re built; they’re obviously idiots because they didn’t know how to avoid falling off because of bridge design flaws.”
The inmates are running the asylum. Maybe we can get more people to care about that finally.
Yeah, I did UI for a few years, I guess that shows in my shouty hair pulling :D
My cap’s off to you on UI. It is harder and more frustrating than application programming. It is amazing how many different ways there are for users to misunderstand or mis-enter. OOP event-driven programming made it harder, and CGI is a stinker. Both have contributed to generally poorer user experiences, but there are a lot more of them.
As I indicated in my initial reply, we in the IT business have huge issues to answer for. However, promoting a false analogy as you have done does not help understanding or further the discussion. The user interface did not work.
As Quinn points out, this is an old, repeating story. If you care about it, I recommend listening to a keynote by Dan Geer from BlackHat 2015 in which he covers this topic and other related topics and proposes some thought provoking policy suggestions.
Well worth your time:
I worked as a Structural Engineer early in my career, now I work as a Software Engineer. There are probably many reasons this analogy doesn’t quite work. One difference is initial cost. Another is cost of failure. Another is visibility of the design. Also, I think bridges CAN be designed so they can be repaired in place. I often think in analogies as well, but this analogy (software is like a structure) shouldn’t be taken too far or too seriously.
…I’m not sure where I said I didn’t think bridges can be designed to be repaired in place?
Lefty – reactive / cynical / ego bruised much? You make some good points, but your absolute arrogance completely undermines you.
I thought the bridge analogy was a brilliant litteraty device to get people who otherwise would switch off at the mention of anything software / IT related – to pay attention. It’s a serious issue and both lives and livelihoods are impacted.
There’s no reason to take the analogy overly seriously / literally. It reaches its mark.
The bridge of your “litteraty” (sic) device has collapsed. Getting people to pay attention is great, and there is lots to pay attention to. You are correct that IT impacts both lives and livelihoods, and the impact increases by the minute. But attracting attention with an invalid analogy does not help people understand the issues, or address them. ‘=’ and ‘≠’, as in ‘= a bridge’ and ‘≠ a bridge’, represent fundamentally different logical states.
You seem to have read what I posted, and I appreciate that, but you are on a bridge to nowhere with your mis-characterizations of my emotions.
Here’s what I included when I posted: “Enjoyed this piece. The writer in me enjoyed the literary device. The geek in me enjoyed the inherent truths beautifully illustrated. The privacy & security (aka community) advocate arghhh-yes’d!! the “it begins with someone clicking on an attachment or a link they shouldn’t… most attacks rely on a shocking level of digital illiteracy”. The ascetic monk in me (a weak voice I’ve been nurturing) enjoyed the bruised egos and overly rational responses. Your mileage may vary. 🙂”