Chicago Boyz

                 
 
 

The End of An Industry

Posted by Carl from Chicago on February 14th, 2015

When Best Buy first moved into town maybe 15-20 years ago I was excited. I could spend hours in there looking at gadgets, components, routers, and TVs, and had thoughts and dreams of tying them all together. Later, Fry's opened up, and you could walk through the aisles and buy all the pieces to build your own PC from parts and make it the hottest gaming platform in town.

Recently I saw this article in Business Insider (I really like that app / site / etc…) about how to upgrade your MacBook Pro (the machine I am writing this blog post on). If you have an earlier model (2011 or 2012), you can spend less than $200 to upgrade your RAM and install an SSD (a drive without moving parts, essentially a big memory chip) in place of your old mechanical hard drive, and your machine will then give you many more years of excellent Apple service. Apple's integrated operating system / hardware plan means that my older machine takes advantage of all the new features in every upgrade of the operating system (now my Mac "rings" when I get an iPhone call, which is a bit annoying, but who's complaining) as long as it has the horsepower to keep up.

So I took the (minor) plunge and went on Amazon and bought an SSD and upgraded RAM, and they arrived in a couple of days for less than $200. I am going to take this over to my friend Brian's house, since he's better at this than I am, and we are going to take apart the machine and put in the new drive and memory.

The real point of this story, however, is that the implicit industry of "taking apart devices and rebuilding them" that existed on the consumer side for the last 30 or so years (that I have been part of, at least) is dying. You can't take apart newer Apple machines and upgrade them. While you can theoretically "jailbreak" your iPhone, fewer and fewer people I know even think of that; instead they are part of the world that views phones as integrated devices that you can either use, take to a tech, or replace.

On the Windows side you can absolutely still take it apart, but the stakes are getting to be so low that it makes little sense. By the time you fix up or rebuild your old system, you could have just bought a new Windows machine for almost no cost. There are excellent Windows machines that are very cheap, and you will have to get used to Windows 8 anyway, or Windows 10 (soon).

The new Chromebooks take this to an extreme, in that you get a completely integrated device with OS for $200-$300. These devices have made a giant splash at schools; they all use SSDs and are so cheap that there is no economic point to pulling them apart, either. At least on a Windows machine you might have some incentive in order to "save" the operating system or an expensive copy of MS Office that you bought.

It isn't that "tinkering" is dead – look at the Raspberry Pi, which you can buy for almost pennies – but that it is pretty pointless economically. There was an entire industry of people who opened up machines and upgraded them with strange screwdrivers, and that industry is mostly in our rear view mirror.

    Cross posted at LITGM

     

    11 Responses to “The End of An Industry”

    1. Bruce Hoult Says:

I did a similar MacBook Pro upgrade a year ago, except I bought a used 2011 17″ MacBook Pro (the last 17″ they made) specifically to upgrade it, and I left the old 750 GB hard disk in place and instead replaced the almost never used DVD drive with the SSD.

I still choose the parts for and build my own desktop machines, even if I'll be running OS X on them, most recently in July when I built a 4.5 GHz quad-core machine using the new overclockable i7 4790K CPU. It's a beast.

      The reason for the decline in hackability of laptops and phones is twofold:

      – they are now so cheap that it’s not economically worth it, whether you pay someone else to do it, or use your own time that you could otherwise bill for.

– the drive to make everything smaller, thinner, and more reliable means that components are no longer mounted in sockets or with connectors. They are permanently soldered together, or even integrated into fewer and fewer chips. The modern cell phone or tablet chip (the Raspberry Pi uses an older one) contains not only the CPU but also RAM, the graphics processor, possibly the flash storage ("hard" disk/SSD), and the radios for Wi-Fi, Bluetooth, and cellular as well.

    2. Mike K Says:

I have an older MacBook Pro that is slow, and maybe that upgrade will be something to think about. It will not run the latest OS, but maybe a RAM upgrade will make it work. I bought a MacBook Air and mostly use that, but its memory is smaller, so the old machine sits on my desk.

    3. DirtyJobsGuy Says:

This is true with just about everything. As a kid in the 1970s I was interested in amateur astronomy. Prices for actual telescopes and parts were, to say the least, "astronomical"! So the amateur made most everything, from grinding mirrors to fabricating supports from pipe fittings. I also got kits to make goose down jackets, tents, and sleeping bags, since the low-cost Asian factories had not yet arrived. And please don't get me started on Heathkit. Trying to explain to youngsters that it was somewhat cost effective to build your own color TV is pointless.

    4. TMLutas Says:

The first generation of any technology will be limited in usefulness and expensive. Tinkerers abound. At a certain point, somebody on the production side realizes that they can make money simply by analyzing what the tinkerers are modifying the stock machines to do, and that doing that as a next-generation feature will increase sales. As more and more needs are met by the stock model, tinkering declines. It's more expensive. It takes longer. You can buy easier than you can build, so even the builders will buy on occasion. Tinkering never dies out entirely. It just becomes a hobby that is inexplicable to those not bitten by the bug.

      Tinkering is on the verge of a new renaissance. We’re starting to realize that all the stock features are dangerous to our security by virtue of them being stock. I would be open to a voice controlled TV, but only if I’m doing voice control within the bounds of my network security perimeter. Samsung can’t afford to make such a beast in every TV. They send the traffic outside secure perimeters to a specialist on the cloud. That is nuts, in my opinion. The secure alternative simply will not sell.

But even when voice recognition costs drop, you are still left with a machine subsystem whose major feature is that it inherently spies on you. Create enough of those and the black hats will attack that feature and sell the results. Tinkered solutions require tinkered attacks and are resistant to the economies of scale that make most attacks affordable. If you've got all custom, tinkered solutions, anybody seriously targeting you is a personal enemy, not somebody out to harvest a few hundred thousand credit cards.

      Security is durably going to provide an advantage to the tinkerers and that’s going to change the world.

    5. dearieme Says:

      I haven’t soldered anything since I was fourteen.

    6. dearieme Says:

      I’ve just found this sad story from Chicago’s history.
      http://www.thehistoryblog.com/archives/34829

      RIP

    7. Kirk Parker Says:

      DirtyJobsGuy,

      Trying to explain to youngsters that it was somewhat cost effective to build your own color TV is pointless.

      I have a different take on the matter. Not saying it’s easy to pull it off, but if you can successfully explain that to youngsters, you’ve not only given them something interesting from the recent past, you’ve also helped them along the way to actually understanding economics.

    8. Mike K Says:

      Mike Rowe on credentials vs qualification.

QVC had a serious recruiting problem. Qualified candidates were applying in droves, but failing miserably on the air. Polished salespeople with proven track records were awkward on TV. Professional actors with extensive credits couldn't be themselves on camera. And seasoned hosts who understood live television had no experience hawking products. So eventually, QVC hit the reset button. They stopped looking for "qualified" people, and started looking for anyone who could talk about a pencil for eight minutes.
QVC had confused qualifications with competency.
      QVC had confused qualifications with competency.

      Perhaps America has done something similar?

The whole thing is on Facebook, and the story of how he got started on TV is hilarious.

    9. Michael Hiteshew Says:

      >>They stopped looking for “qualified” people, and started looking for anyone who could talk about a pencil for eight minutes.

      I was always infatuated with Mary Tyler Moore. Even as a child, I found her combination of sweet and sexy very alluring. I was reading a bit about her one day and came across an anecdote on how she got her spot on the Dick Van Dyke show. They’d apparently been looking and looking and holding auditions with actress after actress and couldn’t find the right girl to play Rob Petrie’s wife.

      One of the people on the production team was sitting in on an audition for a commercial and she was there. He got up, walked down the hall to the producers meeting, walked in and said ‘I think I found her!’ She had never even been considered till that moment. Complete happenstance. She wasn’t qualified but she was capable. And then some.

      It was years ago that I read this but I think that’s basically accurate.

    10. Michael Kennedy Says:

I used to interview applicants to UCI medical school. I was always interested in applicants who had real life experience. Mostly the other interviewers were interested in things like, "Did you volunteer in a hospital?" It got so students applying knew what they needed on their resumes.

      It was a racket. The worst doctors I have seen are usually child prodigies. No real world experience.

      An early example.

He did well in school, graduating from high school by the age of 16. When drafted by the US Army during World War II, he scored exceptionally highly on the Army General Classification Test, with the result that the Army sent him to medical school.

      Brown graduated from University of Utah School of Medicine in 1947, and worked as a general practitioner for almost two decades. However, after almost losing a patient during a thyroidectomy, he decided to undertake formal surgical training.

      Despite excelling in the written aspects of certification for the American Board of Plastic Surgery, he failed the oral assessment (blaming his ‘domineering’ father).

We had another case, which has pretty much dropped off the radar, of a plastic surgeon in Orange County who allowed a woman to die in his office and was convicted of second degree murder. He had graduated from Johns Hopkins medical school at 19. He was an outstanding student. But unfortunately a psychopath.

      Real people make the best doctors. Even heart surgeons, which I once was.

    11. dearieme Says:

"Real people make the best doctors." I saw a doctor in our local teaching hospital yesterday. He was running more than an hour and a half late; when I joined him in his consulting room he apologised. I said that I didn't mind the delay; it was the appallingly high temperature in the waiting room that I objected to. He jumped up, reached for his window, and opened it wide. Well done that man. His doctoring was pretty good too.