Category: technology

The beeps, boops, and blops that engulf modern life

  • HP C7000 Bladecenter: awesome new test lab in one unit!

    Last year I picked up a surplus HP C7000 Bladecenter (generation 1, complete with BL460c and BL480c G1 blades, Fibre Channel, Cisco switches, etc) to run OpenStack test loads. At about 150 kg, I paid just under £3 per kg, which is pretty damn good for a self-contained test lab (near scrap-metal prices, actually).

    It took me a while to get somewhere, because my super-cheap home-friendly rack wasn’t deep enough. Eventually I acquired a second-hand Kell Systems 24U silent rack (the line now sold as the APC NetShelter CX) and plopped it in the corner of the living room (after taking the front door to the flat off the hinges). Now I’m building up a new routing and switching core for the house, and enabling the C7000 at the same time. The rack also came with a rackmount APC Smart-UPS with external battery pack, which is a nice addition.

    I know the C7000 isn’t new, but let me tell you, from someone used to bootstrapping with generic white boxes, it is a pretty cool piece of hardware. I just spent an hour going through basic technical documents to get ready to put everything into service, and I have to say I’m impressed. For the price of two rackmount servers I’m putting 10 machines online, complete with remote management, Cisco switching, high-speed interconnects, and full redundancy. HP clearly knows what they’re doing.

    Total cost so far — Bladecenter, Blades, Switching, UPS/PDU, silent rack, spares — including shipping, about £1,200. I’ve paid more for a laptop. Resale value (if I wanted to) probably three times that. But seeing as the newer HP BL servers are the same form-factor, and second-hand prices for something slightly older are extraordinarily reasonable, incremental upgrades aren’t expensive. So my test lab is set for the next several years.

    HP engineers, my hat’s off to you. The architecture of the C-Class (or whatever your marketing wonks call it) is amazing, especially at second-hand prices. For anyone who is looking to test or start up with some in-house hardware (yes, I know, ‘the Cloud’, etc), take a look at a second-hand C7000.

    When I get it all running, I’ll post some pictures. But, since the rack looks like decent office furniture, there’s nothing exciting to see until you open it up.

    Now, anyone have a stack of HP driver DVDs they need to get rid of?

  • Pi-Top First Impressions

    So, my Pi-Top arrived last week. Today I took everything out and put it together. Here are some initial observations:

    Slick packaging, well thought-out, and very professional. It looks like a real product the minute you open the box.

    This is a cool educational device that will help kids tinker, but it is definitely a work in progress. It is not, however, the ARM-based laptop I’ve wanted for years. When I get some time I’ll fire up my Samsung ARM Chromebook and see what modern Ubuntu looks like on it.

    Each ‘education’ drive seems to want to roll its own UI. Sugar was pretty radical, but Pi OS just seems like a launcher on top of Raspbian, and is buggy out of the box. I haven’t used a Kano, so I can’t comment there. I do wonder if there’s any real benefit in not sticking to a standard interface — they’re not creating value here, and including LibreOffice would also probably be a good idea.

    The keyboard and trackpad really suck. I mean, really suck. There is no way to touch-type, keypresses often get lost, and the touchpad-beside-the-keyboard arrangement isn’t working for me. I was going to write a review of the Pi-Top on it, but I’m already so frustrated with the keyboard that I don’t think that’s possible. It is the main method of input, so it would be nice if it worked well. Perhaps it will get better the more it’s used. For the record, the OLPC keyboard also sucked.

    I’ll kick the tyres on it for a few days, then write up a more detailed review.

  • Moving to the public cloud? Yes, you still need operations staff.

    A quick note, following yesterday’s Google Compute Engine outage and the outages caused by DNS changes at Amazon S3 slightly more than a week ago: it’s important to remember that moving to the cloud still requires operations (sysadmin, devops, whatever we want to call it).

    There is a belief that moving to the public cloud allows companies to outsource most, if not all, of their operations staff. But there is a very real danger in abdicating responsibility for operations.

    If you are outsourced to the cloud, do you have a disaster recovery plan? What happens if the systems that are ‘too big to fail’ do just that?

    I’m not saying the cloud is bad — it enables companies to go to the web with next to no capex. But that doesn’t mean it is the be-all and end-all, and if you’re not taking care of operations in-house, it’s very likely that you will regret it.

    Here’s some further interesting reading: TechTarget Cloud outage report 2014.

  • Why Privacy Matters, A Real-World Example

    Earlier this month, someone I went to school with a long time ago was arrested and charged with some fairly serious crimes. I wouldn’t call him a friend in pre-social media terms, but we knew each other a long time ago, and that’s good enough in the era of Facebook. Which is telling.

    I found out about the arrest through his friends posting on his Facebook page, which showed up on my Facebook news feed. Curiosity engaged, a quick search with Big Brother G turned up reports from the local newspaper and television station, which had some pretty gory detail of the accusations. Of course, in the USA, suspects are innocent until proven guilty, right? And these are unproven accusations, right?

    Within an hour, the article text from one of the two primary sources had spread throughout at least two dozen sites. Whether they are affiliates or simply stole the content is irrelevant. The information, including the name of the suspect and all the details of what he allegedly did, is now out in the open. And with tools like the Internet Archive (AKA the Wayback Machine) and Google, it can never be sequestered. His life, as he knows it, is over.

    When I studied journalism many years ago, we discussed the ethics of police log and courtroom reporting at length. The need to balance the public’s right to know and a subject’s right to privacy was central to those discussions. We knew that publishing misleading information could destroy the life of an innocent person. There was a lot of talk about the difference between a private person and a public figure. Simply because something was in the public record didn’t make it newsworthy, and we understood the tenet of innocent until proven guilty; anonymous and alleged references were common.

    Perhaps the new generation of journalists didn’t participate in similar discussions. In the last few years, the local newspaper where I grew up, the San Luis Obispo Tribune, started putting every suspect’s mug shot (for non-Americans, that’s the picture the police take of a suspect when they are booked into jail) online, along with the details of any crime the suspect is alleged to have committed. While technically legal — the Tribune is not overtly presuming guilt — very few of those on the booking sheets are public figures, and I would argue that there is no public interest served. This practice is little more than gossipmongering, and anyone who purports to be a journalist and participates in it should be ashamed.

    Even worse for my old classmate, and stereotypically so for those of us with a print versus broadcast bias, the local television news station, KSBY, decided to go out on the street to solicit responses to a presumably innocent man being charged with a crime that he denies. They gave airtime to the man’s flustered boss, who didn’t know quite what to say, and a local lady who doesn’t like the man’s house, because his curtains are usually drawn when she walks her dog past it. And, yes, this report was also put up online.

    Perhaps you would argue that the fault here is not one of lack of privacy, but the ethics of the journalists involved. I agree; however, in the interconnected world anyone can push an agenda and watch it spread. The nature of journalism has fundamentally changed, and the role of the journalist as gatekeeper of the spread of news is long over. Now we pull news with search engines as often as we are pushed it by editors. Sometimes the pull, like Google News, is packaged automatically for us. Algorithms will get better, and as more of our personal data seeps out onto the public Internet, we will be hard pressed to control it. Once you do something people seem to find interesting, it will be available for more and more people to see.

    And, there is no such thing as exclusively local news now — as evidenced by the spread of news of my old classmate’s arrest. Memories are stored in the cloud now, not on yellowing newsprint, and the lifespan has gone from years to infinity. There is no way to pull back what has already been released into the wild — the information multiplies like the rabbits outside Stansted Airport. And a retraction or update, usually buried on an interior page and not even addressed by broadcast news, is much less juicy than an accusation; good luck getting it carried by a dozen more websites.

    The European Union’s Right to be Forgotten is, at best, a bandage over a seeping wound. Google wasn’t Google 20 years ago, and the next sea-change will be just as profound. Until we realise that our data, our reputation and person, are integral to our freedom, we cannot effect a change in the world. Most people don’t even know why they might want to do so.

    Perhaps the story of my old classmate can serve some purpose there. Without being proven guilty of any crime, his reputation has been ruined for all time. A quick search online will, now and forever, turn up the details of this arrest. Presuming his innocence (as is the law), that is the opposite of American ideals. If he is guilty, a report on conviction removes any need to report on the arrest.

    This is both an issue of privacy and of journalistic ethics, so the lines are a bit blurred, but no matter what your opinion on this case, the reality is that there is nothing stopping the same thing happening to you tomorrow. Even if you have nothing to hide, you have something to protect.

    P.S. While I’ve tried to anonymise things as much as possible, there are enough identifiers here to find the case, if you care to.

  • Do Not Throw Away That LaserWriter, Ramsay Wood!

    Yesterday I was having dinner with Ramsay Wood at the Priory Tavern. We hadn’t seen each other for a little while, and were having a nice catch up. Then he said something which offended my every sensibility.

    ‘I have to throw away my LaserWriter,’ he said. ‘My guy came around to install a new printer, and he couldn’t figure out how to make it work. He said something about the drivers.’

    ‘That doesn’t make any sense,’ I said. ‘Drive me over to your house, and take me home after I fix it. I’ll have it done in 15 minutes.’

    As a bit of history, I used to work with Ramsay, then for Ramsay, and part of that was to help him out with his Mac and small network. I told him to buy the aforementioned LaserWriter more than a decade ago (second-hand — those things were expensive). It is a LaserWriter 16/600 PS, one of the last couple of models made, definitely more than 15 years old, and a total workhorse. I also knew that Ramsay couldn’t have printed more than a couple thousand pages — nothing for this printer — in the last 10 years.

    A bit of an aside, here. I expect that even if Ramsay was printing 10,000 pages per year, this printer would likely still be working. But my lovely Lexmark X544 colour laser multifunction printer lasted only two years, and not very many pages, at that. And why, Lexmark, does the scanner stop working if the printer fuser needs to be replaced? But I digress.

    What followed was a series of simple troubleshooting steps. The printer was on the wrong IP subnet (confirmed via the config page that prints at power up). Corrected that via telnet (there aren’t any Macs old enough to run the LaserWriter Utility at Ramsay’s), and configured the computers with the new IP using the generic PostScript driver (really, new guy, no driver for a PostScript printer?). I also took a minute with Ramsay’s new Sky router to make sure that there would be no IP conflicts by setting DHCP static assignments — Sky’s brilliant netadmins specify all IPs from .2 to .254 for DHCP, as well as using a default admin password of ‘sky’. After that, the LaserWriter was back online. I also set the new iOS-friendly printer to a static IP, so it would, you know, continue to work, even if someone else asked for the same IP. Total time elapsed, about an hour.
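    For anyone repeating this at home, the CUPS side of that fix can be sketched in a couple of commands. This is a minimal sketch, assuming a CUPS-based machine; the address and queue name are made up (Ramsay’s actual subnet isn’t shown here):

```shell
# Hypothetical address -- substitute the one from the printer's config page.
PRINTER=192.168.0.20

# Confirm the printer answers on its corrected static address.
ping -c 1 "$PRINTER"

# The 16/600 PS speaks LPD and plain PostScript, so no vendor driver is
# needed: queue it through CUPS with the generic PostScript PPD.
lpadmin -p laserwriter -E -v "lpd://$PRINTER/ps" \
        -m drv:///sample.drv/generic.ppd

# Sanity check: show the queue status.
lpstat -p laserwriter
```

    The same idea applies on the router side: shrink the DHCP pool (say, .100 to .199) so that static devices like printers can live outside it without conflict.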

    What lessons were hammered home by all this?

    1. I miss sometimes having simple problems that I can fix myself. Although I’m not sure that I miss doing desktop support for a living.
    2. My current printer is a POS compared to this one, which is definitely more than 15 years old. Increasing revenue with consumables and repairs is definitely part of the business model.
    3. Ramsay’s new guy definitely didn’t go looking too far into the IP side of things. Or he didn’t understand how things work.
    4. Sky definitely do not care about making their router setup sensible or secure. Even O2 was better.
    5. Time to get the Powerbook G3 Pismo from the office and figure out what’s wrong with it. Who has a copy of Mac OS 9 kicking around?
    6. I am not as good as I think I am, but it still was fun!

    Excelsior!

    PS — Anyone have A/UX floppies and a clean ROM for an SE/30? It has ethernet….

  • A bit of PC history (personal computing, that is)

    I just finished reading Walter Isaacson’s biography of Steve Jobs. If you’re interested in the personal computer revolution, you should read it. It’s an insightful look at one of the men who definitely helped shape the digital world. I also realised that, as of April 2014, I’ve been running Linux on the desktop for more than 10 years.

    Both of these made me think a lot about my computing history. Maybe this isn’t interesting to everyone. Maybe it is. I’ll leave it up to you to decide.

    I originally wrote a blow-by-blow description of every computer I owned, and how each affected my development as a computer user, person, and proud geek. Then I thought more about the stages of my computing life, and through decoupling the individual items, I think there are more interesting themes. For those who are curious, the original list of primary workstations I’ve used is below.

    The first part of my computing life was, like many of my generation, running games and basic word processing. My first computer was an Apple IIc. Complete with an Imagewriter II, it was an efficient platform for word processing and strategy games that ripped off Star Trek. I guess I spent a lot of time on it, but didn’t do much. I got it when I was 10, and immediately started making AD&D character sheets.

    My second computer was a DEC Rainbow 100 with an XT attached via a slave interface. It was a hand-me-down from my mother, and frankly it was a POS. But it had a hard drive, so it was relatively more awesome than my IIc was. I think I had DOS 5 on it. Not that it could do much (even less than my IIc). I don’t recall having much in the way of software.

    My third computer was a 286, complete with a new monochrome monitor. I think it even had a modem for the BBSs. 5.25″ floppy, still. One of my friends showed me that MODs were cool, and we soldered a headphone jack onto the speaker cable so I could play music. MODs are still cool, BTW. I had DOS and Windows 3.1, and I probably had MS Works or something like it. It was okay for school work, but not inspiring. Still, it marked the transition of my computing life from simple user to tinkerer, someone who pushed beyond the boundaries of what was easy to do with computers.

    Two of my friends had 486 boxes back then. I was envious. They also had SVGA and soundcards, which meant they played games, and I didn’t. I still don’t, much.

    I started working on the high school newspaper the summer between my sophomore and junior years. They had these funny little microwave-oven-looking computers that all the students had to use. They were monochrome, with a 3.5″ drive, and could print out to a laser printer. And they were really expensive, so we were told not to screw around with them. There were two more capable computers, with colour screens. Those were for the advisers, of course. This was my first experience with Macintosh.

    At first, I was reluctant. How could Macintosh be good, if it looked so little like other computers? But, with Aldus Pagemaker (the Ragemaker), a laser printer, an art waxer, and some backlit layout tables, I began to see what we could make with computers. And I began to love Macintosh. So much that I bought my own Macintosh IIsi, sporting 9 MB of RAM and an 80 MB hard drive, hooked up to a 12″ colour monitor (my first). I made newspapers with that computer. We installed a network and made one of the older Macs a server. We made money, had fun, and won awards. And we were just high school students. The Mac empowered us to do more. By the time I went to university, I was fully in love with Macintosh.

    And thus began the second part of my computing life. I like to call it advanced tinkering. The Mac made things like networking — which was frankly awful on WinTel machines — easy.

    One of the first things I did at university was to buy one of the new Powermacs. An 8100/80 — the top of the range. There was a good deal through the university, and I had some money. And I knew that it would be even better than my IIsi. I collected it from the university bookstore, and disassembled it on the hood of my car in the parking lot to ‘see how it worked.’

    And it was better, in every way (except for the case). I had been on the Internet before, but only with shell access. I added a 14.4 Kbps modem to my 8100, and with my university SLIP connection, I was on the Internet from home. Ragemaker was running faster than ever, and I discovered the joy of Filemaker Pro. I was doing more things with my Mac. I traded the 8100 for a Powerbook 520, and enjoyed my first Mac laptop, and also the first Mac I didn’t love. I traded it for the second Mac I didn’t love: a 7200/75, one of the first PCI-based Macs.

    I call this period of my computing life: when Macs sucked. It lasted a good long time, even though I remained a Mac user, and advocate, for another 10 years or so. The 520 had a crappy passive-matrix monochrome screen and an underpowered processor. The 7200 was totally unstable — bad, rushed hardware combined with an ageing operating system. Even the 8100 probably wasn’t all that good, but I had never owned a new computer before, so it was exciting.

    I was hanging out with an interesting crowd in those days, the slo.punks, a group of computer science students, systems administrators, and, well, authentic individuals. I also worked in computing on campus — I didn’t come from money, and computing paid a lot more than other jobs. And through these social and work roles, I began to run into Unix. I didn’t understand how these computers — which often had fewer resources than my Powermac — could handle multiple users, and operate on the Internet so well. I needed to replace the 7200, which I managed to sell off, but what could I replace it with?

    Normal people would have bought a Powermac 7500 or maybe an 8500. They were well regarded. But I knew the Mac backwards and forwards, and frankly this was not a good time for Mac hardware or the OS. Clones were coming in and muddying the waters. So I thought about it for a while.

    During that while, I cobbled together a 386 and installed Slackware Linux. My hard drive was too small to install X Windows, so I made do with a text console. It got me by for a few weeks, and reminded me how much I hated PC hardware. Remember IRQ conflicts, Himem, and all that other crap we put ourselves through to make computers work? Well, I had seen Microsoft Plug and Play in operation at the university, and I was not impressed. So PC clones, Windows, and by extension Linux, were out of the running. If the Mac was out, and PC clones were out, then I was looking at Unix workstations. I was not rich enough to buy new, and neither Sun nor SGI impressed me. But something black, cubic, and monochrome did. So I set off to Mill Valley to pick up my NeXT (040) Cube.

    I call this part of my computing life, Unix, and the rebirth of Steve Jobs.

    It should have been obvious that the combination of a GUI with a fanatical eye for detail and Unix would lead me to NeXT. And, while I was stepping down significantly in processing power — the NeXT was discontinued when I was in high school — the integration of GUI and Unix was the most exciting computing experience of my life. I also began to inherently understand the Unix Way — small, elegant solutions could be linked together to solve larger problems. On the Mac, I had dozens of very complicated programmes. On NeXT, most of my computing revolved around the command line, Textedit.app, and Mail.app.

    That Spring, I applied to study for a year in Sweden. Taking a NeXT Cube to Europe in economy class didn’t seem a wise idea. I managed to buy an ex-demo Apple Powerbook Duo 280C for the trip. Leaving my beloved NeXT in California, I began to install applications to make the Mac as NeXT-like as possible. Greg’s Browser gave me Miller Columns. Dragthing gave me something like a dock. All in all, the Duo 280C was a great portable computer for the time that I needed it. But, when I got back to California I sold it, and went back to my NeXT. Over the following two years, I moved off the Cube onto a Turbo Colour Station (faster, with colour). I also had a desktop Mac (7500/100) for DTP and Filemaker. This was just as MacOS X Server was coming out, and the NeXTies in the city (and there were more than a few of us) were excited that we might get a modern version of NeXTStep/OpenStep for Mach.

    I finished university and moved to London for the first time. I knew that I would be between places, and that I needed a portable computer. Thinkpads were very expensive, as were Twinheads (those were the brands that were certified for OpenStep). So, while I was still a student, I purchased a Powerbook G3/266 (Wallstreet II). The G3 was a big upgrade in processor power, and with two batteries loaded, it was a real road warrior. As much as I loved the NeXT, I didn’t intend on coming back to California any time soon, so I sold my Turbo Colour Station (as well as my desktop Mac). MacOS was at version 9 (System 9 to some of us in those days), and I was making some money with Filemaker, BBEdit, and Webstar. As a NeXT fan, when I looked at MacOS X, all I could see were what I thought were UI design mistakes — like a single dock for launching apps, and one for running apps. There were lots of small annoyances, and frankly all the software I used would run in emulation on MacOS X. I didn’t think that was a great idea at the time.

    But time marches on. I went back to California, replaced my Powerbook with a newer version, and embraced MacOS X as much as I could. I separated launching and running applications by moving the dock and using Dragthing. It was a good laptop, which was unfortunately stolen when I lived in San Francisco (along with my 5 GB iPod, my favourite ever backpack, and my stainless steel espresso mug). In San Francisco, I acquired another Turbo Colour Slab, as well as a Turbo Cube. I also had a prototype G4 Mac loaned from a friend that served as an in-house Web and Filemaker server. I replaced the lost Powerbook with an iBook, since I needed something portable for travel to the UK — during my years in San Francisco I was chronically under-employed (it was the dotcom bust, after all), and did most of my consulting work for UK contacts.

    Eventually, one of those contacts turned into a real job. So I left (my heart) in San Francisco, and moved full-time to London. Not long after, I sold my iBook, and bought a Powerbook G4. The iBook wasn’t a bad computer, but it wasn’t as good as the G4. Or so it said on paper. To be honest, both of those computers, along with my subsequent Powerbook G4 Aluminium, were not great. They all had hardware problems, which necessitated AppleCare.

    And let me tell you, if there is one word for the reason I stopped buying Apple computers, it’s AppleCare. If you were lucky enough to have AppleCare in Europe, you could send your computer away for repair in Amsterdam. It would probably take only two weeks or so. I wasn’t rich enough to have a second computer for when my primary one needed repairs, so I looked around for another solution, much as I had years before when I started with NeXT after a string of disappointing Macs.

    Also, something was happening to Apple in those days. Apple was fiercely in the camp of small developers, with a huge and active community of freeware, shareware, and commercial companies. But something was turning inside Apple, and they began a long run to eventually control their application marketplace. Even back then, I knew that taking the ideas small developers had used and making system software that did the same thing wasn’t a good idea. Also, the one application which could have kept me on the Mac — OpenOffice — wasn’t ported to MacOS X, and Apple didn’t care to do so.

    So, a lot of people said that Dell support was exceptional. And if your hardware isn’t exceptional, but your support is, well, does it really matter? I bought a Dell Latitude D600 and started trying to get a decent desktop Linux running on it — I still didn’t like the Windows UI very much. I went through Debian, Mepis, and eventually landed on Ubuntu Warty Warthog. Ubuntu took the software library of Debian, packaged a good-enough UI (GNOME 2), and made everything work.

    I call this part of my computing life Linux. It was an evolutionary change from what came before, but what a change.

    Now, the UI and software were not as complete as MacOS, or NeXT, but Linux is all about options, both in application and user interface. Most of the options were, and are, somehow better than Windows, and somehow worse than the Mac. But they were free, both as in beer, and as in speech. All the time I spent on the Mac, people were passing back and forth pirated and cracked applications, and on Windows it has always been worse — just go to any ‘developing market’ computer store and you can buy every application you need for $2, complete with malware which will steal your passwords, if not your bank accounts and identity. I honestly don’t know why people bother.

    And so began my time with Linux — mostly (K)Ubuntu and Mint — through four Dells, two Asus Eee PCs, a Lenovo Thinkpad (service not nearly as good as Dell), and now back to Dell with an XPS 13 Linux Edition (Sputnik). I also had a Macbook Air for a month or so, but not having the flexibility of Linux was too confining — my wife has that computer, now. I also have a Samsung Chromebook that I bought to run Ubuntu on — ARM is awesome — but hardware support is sketchy, so I went back to ChromeOS (itself a build of Gentoo Linux).

    I was incredibly fortunate to begin my computing life as the personal computer came to maturity, and even more so to see the mainstreaming — at least amongst technical people — of the open-source and free software movements. The change from mainframe and minicomputer days to the personal computer was a liberation of computer users. The opportunity to participate in the FOSS revolution was likewise a revolutionary moment. Today’s resurgence of the mainframe model as cloud computing is perhaps a step back, but at least there is a healthy market for competition on where and when individuals can store their data — if they choose to do so at all.

    With the recent confirmation of mass data collection — which isn’t surprising to most technical people, even if it is disappointing — we sit at the cusp of another groundshift in computing. Whether it means that users will be empowered and own their own data, or whether they will give up all rights to the people who run cloud services, is anyone’s guess. But at least today we have an option.

    For me, I have slowed my changes over the years. I’ve never twiddled every setting, and there are times that I miss the combination of elegance and convenience that the NeXT had. It could be that I don’t have to miss it, though — FOSS gives freedom and choice that we never thought about before. The citizen-programmer is not far off, and as long as we remember that we are making a purchase every time we trade privacy for convenience, there should be a way to move forward that is convenient, elegant, and useful.

    Or we could all just go buy a NeXT. But there are precious few of them left.

    A somewhat incomplete list of my workstations over time:

    Apple IIc
    DEC Rainbow 100 w/ XT slave
    286/12 (upgraded MB to 286/25)
    Mac IIsi (12″ monitor)
    Powermac 8100/80 (16″ monitor, 14.4K modem)
    Powerbook 520 20/500
    Powermac 7200/75
    386 (Slackware)
    NeXT 040 Cube
    Powerbook Duo 280c
    Color Turbo Slab
    Powermac 7500/100
    Color Turbo Slab (Polycon)
    Powerbook G3/266 (Wallstreet II)
    Powerbook G3/333 (Pismo)
    Turbo Cube (SF)
    iBook G3/500 (Icebook) (SF)
    Prototype Powermac G4 (SF)
    Powerbook G4 (Titanium) (London)
    Powerbook G4 (Aluminium)
    Dell Latitude D600 (Ubuntu Hoary)
    Dell Vostro 1400
    Asus Eee PC 900
    Asus Eee PC 1000HE
    Dell Vostro V13
    Apple Macbook Air 11
    Lenovo Thinkpad X220
    Samsung Chromebook Series 3
    Dell XPS 13

    Palmtops, Mobile Phones, and the like:

    Newton 2000
    Palm Pilot IIIxe
    Newton 2100
    Nokia Brickphone (maybe a 5110?)
    SE T20e Tomb Raider Edition
    SE T68i
    SE Z600
    Nokia E61
    Nokia E71
    Nokia N810
    Nokia N900
    Nokia N9
    Nexus One
    Nexus S
    Samsung Galaxy S Duos
    Nexus Galaxy
    Nexus 4
    Nexus XT1033 Dual-SIM
    Moto G XT
    Nexus 7
    Nexus 7 2013
    Jolla
    Blackberry Z10
    T2Mobile Flame FirefoxOS Reference Phone

    Servers and Museum Pieces:

    Mac SE/30 (w/ Ethernet)
    Raspberry Pi (the cluster grows at home)
    HP Microserver
    HP Tower Server
    Apple TV (2)
    Pivos Xios Android Set Top Box

    Lustlist:

    Cobalt Qube
    NeXT Turbo Dimension
    Quadra 700
    SPARC and HP NeXTSTEP workstations
    SGI Indy
    BeBox

  • Troubleshooting

    Computers are not magic. Nothing in technology is. Computers are tools that follow the operator’s instructions. Commands can be complex, and underlying code can send incorrect instructions, but in the end, there is no room for interpretation. Computers cannot ‘get mad’ at you, or have a bad day, or do something that they’re not told to do. Not yet, anyway.

    I deal with technical problems every day. Generally, I try to solve them, with methodical and unambiguous techniques that produce quantifiable results. That’s a complicated way of saying I work on a problem until it’s solved and I know why it happened. This is a cornerstone of my entire professional life, and the process is simple. No matter how complex your problem, troubleshooting follows the same steps:

    1. You have a problem. Define it.
    2. Identify the variables (what can change, especially what you can change).
    3. Change one variable.
    4. Test. Do you still have the problem? If not, stop: you have solved it.
    5. Change your previous variable back.
    6. Change another variable.
    7. Test.
    8. Repeat until a solution is found. Eventually, you will be rewarded.
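    The loop above is mechanical enough to write down as code. Here is a minimal sketch in Python — the function, setting names, and test callback are all hypothetical; it is the shape of the process that matters:

```python
def find_culprit(baseline, candidates, run_test):
    """Toggle one candidate change at a time against a known-bad baseline.

    baseline   -- dict of current settings that exhibit the problem
    candidates -- dict mapping setting name -> alternative value to try
    run_test   -- function(settings) -> True if the problem is gone
    """
    for name, new_value in candidates.items():
        trial = dict(baseline)       # step 5: always revert to the baseline
        trial[name] = new_value      # step 3: change exactly one variable
        if run_test(trial):          # step 4: test
            return name, new_value   # solved, and we know why
    return None                      # no single change fixed it: interactions?


# A toy run: the 'problem' is a printer on the wrong subnet.
fix = find_culprit(
    {"subnet": "10.0.0.0/24", "driver": "vendor"},
    {"subnet": "192.168.0.0/24", "driver": "generic-postscript"},
    lambda s: s["subnet"] == "192.168.0.0/24",
)
# fix is now ("subnet", "192.168.0.0/24")
```

    The important property is that each trial starts from the unchanged baseline, so a passing test pins the fix on exactly one variable; if no single change passes, you are in multi-variable territory, where the same discipline matters even more.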

    Sometimes you get a complicated problem, with interaction between multiple variables. But that’s when your process has to be absolutely methodical and boring. Even if the system is burning down around you — especially if the system is burning down around you — you must stay calm; troubleshooting takes as much time as it takes.

    I bring this up because, over the years, I have encountered a staggering number of technical people — engineers, computer scientists, systems and network administrators — who do not manage to be methodical, for one reason or another. Many people in the IT field don’t have a good troubleshooting process, and waste a lot of time and effort as a result — both their own, and that of those they work with (like me). Even if they solve a problem, they won’t know the cause, won’t be able to recreate the problem, cannot come up with a permanent fix, and cannot apply this experience to future problems.

    Sometimes these folks are highly pressured and attempt everything they can think of at once. Sometimes they ‘don’t care what the problem is, as long as it’s fixed.’ Many times they simply do not have a background in or experience of problem solving, and also don’t understand what benefits a step-by-step process brings. But a cool head, methodical work habit, and good documentation, combined with sensible precautions (you did back up, right?) will always yield the desired results. Rushing and not knowing why things are working will only lead to problems down the road.

    I would like to thank the science teachers I had in California public school, who taught me how to design an experiment at an early age. I’m not sure if it was third grade or seventh, but valid experimental procedure has become my ingrained response to solving technical problems. Without it, I wouldn’t have had a good job in university, wouldn’t have managed a technology career, and would not have the life I lead today. My hat is off to you, my former teachers. Here’s hoping there are still some people out teaching the basics.