Computers are extremely complicated machines, to say the least. The modern IT world is so complex and moves so fast that even the most cutting-edge technology ages in dog years, times three. As such, it's no surprise that many people still believe things that are inaccurate, outdated or just downright fabricated.
Human beings have been described as the "storytelling ape". We seek out patterns and stories to explain everyday life, and myths, both computing and urban, are an example of this. In some cases they are useful - parables are an important feature of learning by example - but in others they can be counterproductive.
This week, we take a look at some of the more prevalent urban legends of Silicon Valley. Some have a basis in truth, while others are just a good tale to tell. Let us know if you have any favourites we've missed.
Honourable mention: Macs cost more than PCs
Shaun Nichols: This one only made honourable mention because, well, it's true on some levels. You can get a Dell or Gateway notebook or desktop PC for less than an iMac or MacBook. The catch is that you also get less hardware.
Apple likes to load even its low-end models with a certain amount of power and connections. If you were to check the option boxes for all the bells and whistles on a Mac, the cost difference shrinks dramatically, and in some cases the PC is even more expensive.
So the rub here is whether you actually want and need the extras found on the Apple computer. If not, then it's cheaper to go with a PC. Pound for pound, however, the idea that Apple arbitrarily prices its systems higher is wrong.
Iain Thomson: Apple has always concentrated on the high end of the computer market because it likes making quality products, and that costs money. But I have to say that a quick trawl through web sites shows that you do seem to get less for your money from Apple.
Looking at base specifications, an Apple MacBook Pro 17in shipped to California sells for $3,036 (£2,137). A Dell Studio 17, pretty much the same machine (although it's not as pretty), costs a touch over $2,000 (£1,407), albeit with a $375 (£263) sale discount. In trying times like these, that's a big saving and it's difficult to see how Apple justifies the price.
Honourable mention: Pirated material makes up the bulk of internet traffic
Iain Thomson: In one legal case after another this same statistic gets trotted out: pirated material, especially torrents, makes up the vast bulk of internet traffic. This is a highly useful fact if you're trying to limit some people's bandwidth, for example, or pressing for the jailing of a suspected software pirate. But how true is it?
I've heard figures of 50, 60 or even 70 per cent but, when it comes to specifics, people seem less certain. The fact is that no-one's particularly sure and those that are, aren't telling, or at least providing the data to back up their claims. One court case in Canada forced an ISP to confess that, in fact, such material made up less than 10 per cent of traffic.
What also makes this claim suspicious is that not all files sent by torrents or peer-to-peer systems are pirated. Most Linux distributions are sent using these methods, and plenty of people, myself included, use the technology to send large files between systems. Until I see hard data, I'm treating this with a pinch of salt.
Shaun Nichols: When Comcast laid out its plan to cap bandwidth usage it said that, in order to reach the limit, a user would have to download an average of three full-length movies per day. As few people download entire movies, and even fewer do so every day, the idea that pirated media traffic is clogging systems to the tune that some companies are crying seems improbable.
Maybe this has a little more to do with copyright law than it does actual bandwidth problems. Cable providers certainly don't want to find themselves in the crosshairs of the Motion Picture Association of America or the Recording Industry Association of America for 'enabling' users to pirate material.
Being able to claim that peer-to-peer traffic is clogging the tubes is a nice excuse to discourage users from pirating movies and songs.
10. Companies must replace their systems every two or three years
Shaun Nichols: IT salespeople avert your eyes now: the idea that all systems need to be replaced every couple of years is hogwash. The mantra of the IT world is that the latest and greatest is a necessity you can't live without. The reality is that, if you fit an average business with the cutting-edge every couple of years, you're wasting a ton of cash.
Except for industries like biotech or computer animation that require every bit of power they can get, the latest technology is often much more than is needed. If the system is well maintained, your average employee can get by just fine on one that is five or even 10 years old.
Think for a second about the programs most businesses use every day: an office suite, a web browser and an email client. None of those is really the sort of thing that will stress out a decent PC built in 2005 or 2006. With budgets being slashed left and right these days, opting to add a little extra RAM or reinstall Windows rather than simply buying a sleek new computer is something more buyers should consider.
Iain Thomson: The replacement cycle was actually a necessity at one point, but I can't see it being relevant today. In the 1980s and 1990s, when processing power really was a problem for day-to-day applications, there was a case for a two- or three-year upgrade cycle. Processor technology was in its infancy, and you could get serious lag trying to crunch a spreadsheet.
However, for most of today's functions, processors have got so fast that they exceed the bounds of what is needed in all but the most niche of areas. If a netbook running an Atom processor can do 90 per cent of what you need it to, then why bother getting the latest Nehalem systems?
Actually there is a reason: future proofing. If you're buying a new computer, you want it to last as long as possible. Back in the day this meant it would be out of date in a few years but this is less true now, particularly if you are prepared to be flexible with software. I have a desktop system built in 1999 that runs Ubuntu quite happily and will continue to do so until a critical component fails.
9. Bill Gates is a whizz-kid programmer
Iain Thomson: Bill Gates is many things - billionaire, philanthropist, businessman, anti-fashion (and initially anti-grooming) icon and a mean poker player. But a master programmer he is not.
Gates certainly can program better than the bulk of the population, and at one time was probably one of the top 1,000 programmers in the world, but that was in the 1970s when programmers were distinctly thin on the ground.
In fact, most of Microsoft's early success was down to Paul Allen, who was an excellent programmer in the most artistic of ways. It was Allen who built the emulators that allowed Microsoft to test its software, and he came up with many of the early tricks that made Microsoft so successful. By the time he stepped down to deal with illness, Allen had also recruited a cadre of similarly gifted folk who took his work forward.
Gates is a brilliant businessman who saw the market opportunity opening up for him and exploited it ruthlessly and to great effect. But his business acumen far outstrips his programming skills.
Shaun Nichols: This myth cuts both ways. While some people may praise Gates as a genius by mistakenly thinking he's responsible for the bulk of Microsoft's early code, just as many people like to put him personally at fault for everything that doesn't work in Windows. As if he somehow left bugs in Office or designed the new Vista interface on a personal console in his office.
The truth about the origins of Microsoft's biggest products does in fact make Gates look like a genius, but not for his programming. Gates made some brilliant decisions regarding the negotiation to buy and sell the rights to the code for things such as DOS, and he is said to have personally spurred the company to develop Windows after seeing the first Macintosh. That doesn't mean, however, that Bill was down in the trenches writing the source code at three in the morning.
Bill Gates is without a doubt a marketing and management whizz kid. But a programming genius? Not so much.
8. Macs aren't compatible with anything
Shaun Nichols: Macs run OS X, PCs run Windows. Therefore, if you want to run a Windows-only program or document, you can't use a Mac. Right?
Wrong. Since Apple went to Intel chips for its computers, they have essentially been dual-boot machines. Apple's own Boot Camp tool allows you to create a partition on your Mac that runs Windows just as it would on a machine from any other PC vendor.
If you don't want to reboot, products such as Parallels or VMware Fusion will load up Windows programs right from the Mac desktop. Even if you don't want to install Windows at all, most of the big programs, such as Office and Photoshop, have file formats that move seamlessly between the Mac and PC versions.
This is old hat for anyone who follows the computing world even casually, but many would-be buyers still don't realise just how much has changed over the past five years with regard to Mac/PC compatibility.
Iain Thomson: This myth does have a grounding in truth. It used to be hell trying to take data across platforms, and I still have a stack of discs that refuse to give up their data because of formatting issues.
Credit where credit is due: this is seldom, if ever, a problem these days. Manufacturers have recognised that locking people into a particular system isn't worth the hordes of annoyed customers it creates.
7. Computers last longer when left running
Shaun Nichols: Not sure where this one started or how it got perpetuated. Perhaps by someone who was really into flying toasters. The idea is that the process of shutting down and starting up takes such a toll on a computer that it is better for the machine if you leave it running non-stop.
This is of course utter nonsense. Starting up really doesn't put any more of a strain on a system than anything else, and in fact regular shut-downs and restarts actually help clear out memory and can prevent crashes.
Never mind the obvious waste of energy that comes from leaving computers on all night: the extra running time also makes moving parts such as fans die faster, and you suck more dust into the computer. So you're doing your PC more harm than good by leaving it running round the clock.
Iain Thomson: Actually, in the early days of computers this was true, and I suspect that the aging sysadmin population is keeping this myth going. Back in the dawn of computing it did make sense to leave your computer on. The reasoning was that computers heat up when they are in operation, not just the processor but connecting wires and components.
When the system was shut down the components cooled and, by turning a computer on and off, you greatly increased the thermal stress on the hardware itself with the constant cycle of hot and cool. But manufacturers got smart to this decades ago and it is no longer the case that systems should be kept running.
6. Hackers could bring about World War Three
Iain Thomson: The 1983 film WarGames did a lot of good things, sparking my interest in computers and Ally Sheedy for starters. But I could kick the scriptwriter sometimes for the fears that the film invoked. It was about as useful for home computer users as The Birth of a Nation was for racial harmony.
The chances of anyone being able to use a standard modem to break into missile command and launch weapons, or even to access the supercomputers that control them, are almost exactly zero, particularly considering the technology of the day. After WarGames came out, anxious parents were reportedly ripping computers away from their precious little snowflakes in case of accidental Armageddon.
The command and control systems used by the military are some of the most locked-down computer networks on the planet. The military is paranoid enough to take any kind of networking extremely seriously (something we should be grateful for) and hackers stand no chance.
Actually the chances of accidental nuclear war were very high. In September 1983 civilisation nearly went up in smoke after a faulty Soviet satellite warning system reported multiple missile launches from the US. Only the quick thinking of the duty officer, Colonel Stanislav Petrov, saved the day, which is why I raise a glass of vodka to him every 25 September.
Things were no better on the American side. When nuclear warheads were originally delivered to the US military they needed an eight-digit arming code to be activated, the kind of thing that comes in handy for red flashing numerals in James Bond films. What was the code used? 00000000. Words fail me.
Shaun Nichols: In the 1980s and 1990s there was a certain romance around the dashing young hacker who could infiltrate the most secret of computer systems without being detected. Surely the public's lack of technological knowledge in the early days of the internet helped to fuel this.
What people never really considered was the billions of dollars spent each year by governments, and the private companies that they contract with, on intrusion detection and prevention systems, and the speed with which authorities will respond to possible threats. Just look at what's happening to poor Gary McKinnon just for hacking some Nasa machines. Imagine the response had he accessed a truly highly sensitive system.
As Iain also points out, human error or technological failure is far more of a concern than the ability of one rogue hacker to somehow infiltrate any sort of system that controls nuclear weapons.
5. Apple/Linux code is more secure than Windows
Iain Thomson: As I've said before, from a security standpoint there is nothing inherently superior about Windows, Linux or Apple's operating system. All contain flaws; it's a question of how they are examined and treated.
Apple's operating system is a masterful piece of work but, from a security standpoint, it does have exploitable flaws. The only thing is that malware writers aren't really interested in Apple software. They are in it for the profit, and write code for 90 per cent of computers rather than 10 per cent.
If Apple's and Microsoft's market positions were reversed I sometimes wonder if I would get the same volume of hate mail from Microsoft fanboys telling me I'm a stooge in the pocket of Steve Jobs's marketing machine.
Linux, too, is not inherently secure, as any number of hacks have shown. The ace in Linux's deck is that it has an army of fans who constantly check and recheck the code, and who fix faults as soon as they arise. Commercial companies will never be able to match this tech support resource, either in size or dedication.
Meanwhile, as I've mentioned, Windows is the target everyone is pushing to beat. It doesn't help that Microsoft software is not exactly the best in the world; it's good enough to do the job and that's it. That sums up Microsoft's business strategy in a nutshell.
Add in the fact that, because it's so ubiquitous, the company is hyper-sensitive about issuing dodgy patches. If a duff patch takes down servers then Microsoft's business clients - its core customers - scream loud enough to wake the dead.
Shaun Nichols: As the largest target, Microsoft understandably gets the most press for its vulnerabilities. The fact that the company issues patches far more often than others doesn't help either. But one look at the massive patches Apple issues every few months should tell you that OS X isn't really bulletproof either.
Really, the argument is rather pointless when it comes to overall security. A well-patched system with basic anti-virus protection should be secure enough in the right hands, regardless of who wrote the OS. Likewise, a foolish user lacking in common sense can get just about any machine infected with malware.
This brings up one of the more under-discussed parts of IT security: the so-called 'meatware' problem. Nothing infects a computer more reliably than human error. All the security software and proper application coding in the world won't help much when a user is determined to install software that is hiding a Trojan. It's why social engineering tricks such as fake codecs and phoney anti-virus scans remain by far the most popular ways of delivering malware.
Personally, I'll take a well-patched system and some common sense over heavy-duty protection and smug naivety any day.
4. Artificial intelligence just needs a fast enough computer
Shaun Nichols: Perhaps thanks to science fiction movies, many people seem to believe that the creation of artificial intelligence is simply a matter of transistors - that once a fast enough computer is put together, true artificial intelligence will soon follow.
This, of course, is not really true. While the computing power of the human brain is still much, much greater than that of the average desktop, there is much we don't know about psychology. As a result, there's still a ton of theory that needs to be worked out on the nature of intelligence and the function of the brain before scientists can even start to construct true artificial intelligence on any sort of computer hardware.
So rest easy, folks. We've still got a while before Proteus asks to be let out of the box.
Iain Thomson: The point when computers overtake humans in intelligence terms, known as 'The Singularity', has long been predicted, but I have my doubts. That we will have computers with as many switching elements as the brain has synapses is not in doubt, probably in about 2030 based on current rates of technology development and the assumption that society won't have broken down, leaving us hunting each other for food.
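For what it's worth, here is the kind of back-of-the-envelope extrapolation that produces dates like 2030. The synapse count, starting transistor count and doubling period below are assumed figures for illustration only, and depending on which you pick the crossover lands anywhere between roughly 2020 and 2040, which is partly why such predictions deserve a pinch of salt:

```python
import math

# Assumed figures, for illustration only
SYNAPSES_IN_BRAIN = 1e14      # rough estimate of synapses in a human brain
TRANSISTORS_PER_CHIP = 2e9    # transistors on a high-end chip circa 2009
DOUBLING_PERIOD_YEARS = 2     # Moore's law-style doubling period
START_YEAR = 2009

def crossover_year(chips: int) -> float:
    """Year when a system of 'chips' chips has as many transistors as the brain has synapses."""
    doublings = math.log2(SYNAPSES_IN_BRAIN / (chips * TRANSISTORS_PER_CHIP))
    return START_YEAR + doublings * DOUBLING_PERIOD_YEARS

print(f"Single chip: around {crossover_year(1):.0f}")       # roughly 2040
print(f"1,000-chip system: around {crossover_year(1000):.0f}")  # roughly 2020
```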
But so what? Synapses aren't transistors: they can make and remake their own connections, so the comparison is a bit of a red herring. True, there is a lot of promising work in using different types of computer to mimic the human brain, but there's another fundamental block to true AI: software.
Human 'software' comes from so many sources that it would be nearly impossible to build a similar system for computers, particularly as we operate at such a fuzzy level of logic. It will be a very long time indeed before this problem is cracked and I suspect I will not live to see HAL refusing to open my front door.
3. The internet was developed to survive a nuclear war
Iain Thomson: This piece of technology folklore has been trotted out so many times that it's become received wisdom. The argument goes that the military developed the internet protocols so that, in the event of a nuclear attack, damaged parts of the network would be automatically routed around and data flows would continue, allowing for retaliation and the eventual triumph of the West etc. It's a nice story, but unfortunately it's not even remotely true.
How am I so sure of this? Well, a few years ago I was fortunate enough to share a two-hour taxi ride with Bob Taylor, one of the creators of the internet's precursor, Arpanet. I asked him about this and he had a good chuckle. Yes, he said, it was possible that that excuse was made at the time by some official to Congress in order to get funding, but it was rubbish and a logical impossibility when you think about it.
For electronics the most damaging thing about a nuclear war is not the blast itself, which has only local effects, but the electromagnetic pulse. The minute a nuclear device goes off, the pulse knocks out pretty much every unshielded circuit for miles around.
Both US and Soviet war plans called for the detonation of nuclear warheads in space over each other's countries to cripple as much infrastructure as possible. About four or five relatively small warheads detonated over the US would destroy around 90 per cent of unprotected electronics in the country. The internet would have had no chance.
Shaun Nichols: 'Beating the Reds' was a bit like a magic phrase when it came to securing research funding in the 1960s and 1970s. This is probably how the idea got started.
Given that most of the early infrastructure for the internet would have had trouble making it through a big earthquake, thinking that it could emerge unscathed through nuclear Armageddon is rather laughable.
Add to this that by the 1960s most of the country's technological hotbeds, places like Minneapolis, San Jose and Boston, were among the highest-priority nuclear targets for the USSR, and the outlook for Arpanet would have been pretty bleak had World War III ever broken out.
On the plus side, you can rest easy knowing that, should the internet ever gain self-awareness and set about eliminating humanity à la Skynet from the Terminator movies, it most likely wouldn't survive either.
2. More CPU power equals more speed
Shaun Nichols: This is a misconception that has spanned two eras. In the 1990s and the first half of this decade, the thinking was that higher clock speeds translated to pure performance, that twice as many megahertz meant twice as fast in practice. Then dual-core chips came along, and the thinking changed to twice as many cores means twice as fast.
While this is convenient marketing jargon, it's also a poor measure of real-world performance. The CPU is one of many components in a PC, and as such is also one of many potential bottlenecks. Things such as memory and hard drive speeds can have just as much of an impact on a system's overall speed as the processor, or more.
The multi-core argument only further muddies the water. While two cores can of course do more than one, they're not always twice as fast. Certain instructions, for example, depend on the results of the ones before them and simply can't be run in parallel, effectively limiting many operations to a single core.
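This is the intuition behind Amdahl's law: the portion of a job that has to run serially puts a hard cap on how much extra cores can help. A quick sketch in Python makes the point (the serial fractions below are made-up figures for illustration, not measurements of any real program):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Best-case speed-up when only part of a workload can be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Illustrative serial fractions only
for serial in (0.05, 0.25, 0.50):
    for cores in (2, 4, 8):
        speedup = amdahl_speedup(serial, cores)
        print(f"{serial:.0%} serial, {cores} cores: {speedup:.2f}x speed-up")
```

Even with only a quarter of the work stuck on a single core, a second core buys you roughly a 1.6x improvement rather than the 2x the marketing implies.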
Perhaps the problem is that the CPU is the most macho of all the computer parts. Many of us nerd types have to fight off the urge to let out a big Tim Allen "cave man" grunt when rattling off the specs for our quad-core beasts. The fact is, however, that the CPU isn't the only star of the speed show.
Iain Thomson: Shaun has this spot on. For years the computer industry, both processor manufacturers and system builders, staged a computing arms race in advertising and PR. Each increase in clock speed was hailed as a competitive advantage beyond price.
The prime example of this was the race between Intel and AMD to build the first 1GHz processor. One of my fondest memories is of an Intel spokesman coming into the office just after AMD beat them to this mark. Obviously my first question was how he felt about losing. He looked me in the eye and said "Well you know, Iain, speed isn't everything," and managed to keep a straight face - with a little effort. I'm not surprised that he's now running the UK operation - that took balls.
As you rightly point out, however, processor speed has little to do with overall performance. Cache sizes, graphics capability and hard drive access times all play their part.
Software too is critical; code has to be written to perform on multi-core systems and older software won't see much of a speed bump.
The shift in emphasis from processor speed is no bad thing. I was a little ashamed reading your description of attitudes towards it. I, and I suspect a fair few readers, have displayed such sad characteristics. Yes, the phrase 'Oh, you're running a 486, how retro!' has crossed my lips.
1. Anti-virus companies write most malware
Iain Thomson: If you want to make a security software specialist spitting mad, trot this one out. I've heard it everywhere, even from rational people who understand a little about computers. It's not true and never has been.
There are actually very few proper malware writers. Until recently the vast majority of attacks came from script kiddies, who took someone else's malware code, tweaked it slightly and then released it into the wild. This has changed slightly as malware has become more about profit, but it is still the case.
Anti-virus specialists are adept at spotting the hallmarks of the true virus writers, and if one of them started writing the stuff themselves it is highly likely that they would be spotted fairly quickly. But even that misses the key reason this myth falls down.
The teams of anti-virus researchers in the industry are driven people, in a way that makes the average coding geek look like a stoned slacker. They see themselves as the thin blue line between computers succeeding and failing, and go to unusual lengths to hold it.
It's also one of the few industries where competitors share secrets. Once a signature file for a specific piece of malware has been developed, it gets emailed to every competitor that also shares information (which is almost all of them, even Microsoft). That means that, whichever security software you use, you get roughly similar protection.
So what, I hear you say: there are cases of firefighters who set fires just so they can play the hero and put them out. Well yes, but if one researcher suddenly started producing signature files for brand-new malware without a good explanation, questions would be asked.
Shaun Nichols: This myth is insulting to the good and the bad guys. I think a large part of it comes from a misunderstanding as to the nature of vulnerability disclosures and proof-of-concept code.
What usually happens is that a researcher discovers a vulnerability in a product. The researcher then either directly contacts the company, or contacts a third party such as TippingPoint, which passes it on to the company, which then patches it. The researcher usually releases a sample proof-of-concept script to show that he or she actually did find the flaw. Around 99 per cent of the time, this is done before the public even knows about the flaw.
This, to some people, seems unethical. Why would one try and create ways to attack a system? The answer is because the bad guys are really smart people too. The 'white hat' researchers who find and report vulnerabilities for a living are plugging holes that those who create malware and attack kits would otherwise find in time and exploit as 'zero day' attacks for which there are no fixes.
The bottom line is that the bad guys really don't need any help in finding flaws, and getting a vulnerability out in the open is almost always better than sticking your head in the sand and hoping nobody writes an exploit.