
(It must be said, at this point, that 'useful' and 'appropriate' are relative terms. In 2009 I gave a good friend a computer that had been built for Windows 98. It was running Puppy Linux from a CD, and saving data to a USB flash drive over USB 1.1. It did word processing, email, and basic web browsing. It had a whopping 64MB of RAM, and was covered in glitter, googly eyes, and carpet samples. But it was free, and it wasn't useless, and that was important.)

I went to school to become a programmer, and discovered that I don't enjoy programming as it exists today. I understand it well enough, and I *can* do it, but I don't *want* to. I make websites, and I build tools to help other people use computers.

I make my living as a systems administrator and support engineer. (And I'm looking for a new gig, if you're hiring.) That's a fancy way of saying that I solve people's computer problems. Professionally, I'm responsible for identifying and mitigating the shortcomings of various computer systems.

Guess what?
There are a lot of these shortcomings. Like, a lot. More than I ever expected.

Some of these shortcomings are legitimate bugs. Some of them are bafflingly short-sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature-incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow and cranky. I'm responsible for handling the fallout of this unfortunate situation.

I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.

So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.

Before we go any further, let's talk about The Computer Chronicles.

The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.

The guests were, frequently, people who were proud of the things they made, or the software they represented.

Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye opening.

On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.

Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.

And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.

It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas and brought us the Mac.

Nothing is ever new.

The whole video of Engelbart's Online System (NLS) demo is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our tool shed could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using the tools.

That is to say, in the 60s and 70s, computers were weak and slow and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the Microcomputer Market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrase I mentioned earlier was coined: "Human Literate Computers," or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies. A computer that was so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with HyperCard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that made me happy.

Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why Hypercard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.

HyperCard, if you're unfamiliar, is PowerPoint + instructions.

Here's a great introduction/example: loper-os.org/?p=568

The author walks you through building a calculator app in about 5 minutes, step by step.

Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.

You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.

Want a visual interface for your database of client data? Great! Slap together a Rolodex card and drop in a search function.

Go from concept to presentation-ready in an hour or two (or less, if you've done this before!)
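HyperTalk scripts won't run outside a HyperCard-style environment, so here's a purely illustrative Python sketch of what that "Rolodex card plus a search function" boils down to. The clients.db file and its table and column names are invented for the example, and HyperCard would also hand you the card layout for free, which this sketch doesn't even attempt.

```python
# Purely illustrative: the "Rolodex card with a search function" idea,
# minus HyperCard's visual card layout. The clients.db file and its
# clients(name, phone, notes) table are invented for this sketch.
import sqlite3

def find_clients(search_term, db_path="clients.db"):
    """Return every client 'card' whose name matches the search term."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT name, phone, notes FROM clients WHERE name LIKE ?",
            (f"%{search_term}%",),
        ).fetchall()

if __name__ == "__main__":
    for name, phone, notes in find_clients("Smith"):
        # Each row is one "card"; HyperCard would draw it for you.
        print(f"{name}\n  {phone}\n  {notes}\n")
```

The point of HyperCard was that you didn't even write this much: you drew the card, dropped in a field and a button, and attached a few lines of script to the button.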

HyperCard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.

Jobs killed it because he couldn't control it.

Microsoft doesn't ship any tools for building programs with their OS anymore, either.

They used to. There was a time when you could sit down at any Windows or DOS machine and code up a program that would run on any other Windows or DOS machine.

But we can't have that anymore.

In the name of Ease of Use, they left out the Human aspect.

Use your computer how you're told to use it, and everything is easy.

Do anything new or novel and it's a struggle.

My nephew has an iPad.

He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.

My nephew asked me how to learn to write games.

I gave him a Raspberry Pi and a copy of PICO-8.

Now he writes computer games.

He couldn't do that on his iPad.

@ajroach42 Raspberry Pi == the new C64/Beeb.

Lots of IT careers will be started with the Pi, just as they were with the C64/Beeb.

@profoundlynerdy @ajroach42 I'm all for all the little things that spark an interest in coding in young minds, but in all honesty I'd expect the JavaScript console in all non-mobile browsers will introduce two orders of magnitude more people to coding than Raspberry π?

@22 @ajroach42 Because JS is limited to the browser in a lot of ways. Yes, I know you can run JS from a console if you want, but that's not obvious to the novice.

Python, by contrast, is a good general purpose programming language that is syntactically easy to read (like BASIC, unlike Perl) and not so low level that you have to worry about pointers (as in C) directly.
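To make the "reads like BASIC" point concrete, here's a tiny, purely illustrative snippet of the kind of first program a beginner might write: no declarations, no pointers, no boilerplate.

```python
# A beginner-sized example: average some test scores.
scores = [72, 95, 88, 61]

total = 0
for score in scores:
    total = total + score

print("average:", total / len(scores))
```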

@profoundlynerdy thanks for weighing in! What do you think of James Hague’s gentle arguments (which I agree with and practice myself, as a professional JavaScript and Python dev, full disclosure 😝): prog21.dadgum.com/203.html

@22 I understand his argument on the graphics front.

1. The lack of a GUI keeps things down to the brass tacks. If you give a new programmer a GUI they'll play with the GUI and not much else.

2. While JS dominates for client-side scripting in the browser, I think it's going to have competition soon. I expect browsers will add Python interpreters soon-ish as MS Office recently did, if I'm not mistaken.

As for standalone executable portability, yeah, that's a weak point. Learn Python, then learn C.

@profoundlynerdy If I may continue to impose on your time—

1. I shudder to think how many kids we keep away from math and engineering by making them think that they have to be good at grade school arithmetic and calculation first. Similarly, I am ecstatic that visual (Processing.js, Cinder, openFrameworks, Scratch) and audible (Sonic Pi) programming draw so many in. As a learner and a teacher, I do not believe beginners should be made to prove themselves with stodgy, boring "basics".

@22 This.

I taught myself BASIC to a limited degree in the 90s on a Commodore 128D and an Apple II clone. I was pushed away from programming by people who expected I needed to be a math whiz.

Even as a dyslexia sufferer with math issues as a result, I'm still able to "grok" Python just fine. So much for the naysayers.

@profoundlynerdy You know, this is probably where we get all these stupid people asking math/puzzle questions on coding interviews—this absurd 1990s notion that skill in grade school math (not really math, more like calculation) is correlated with skill in coding. There's no valid reason to ask for Gauss' sum formula tricks to see if you can code—if there was, you might as well ask about KD-trees or FFTs. If you like people who like math, fine, but don't pretend you're interviewing for coders.


@22 I couldn't agree more.

You're much better off asking algorithm questions that are actually relevant:

* What is the purpose of a linked list?
* How might I optimize this to use less memory? Insert [example of a giant left-hand grab of SQL data that uses 2GB data]. Expected answer: stack/queue one 2 MB record at a time [example code].
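A hypothetical sketch of that expected answer (the original just says "[example code]", so the events table and column names here are invented): pull rows from the cursor in small batches instead of loading the whole result set into memory.

```python
# Hypothetical sketch: stream rows in small batches instead of
# fetching a 2GB result set into memory at once. Table and column
# names (events, id, payload) are invented for the example.
import sqlite3

def process_events(db_path="events.db", batch_size=1000):
    conn = sqlite3.connect(db_path)
    try:
        cursor = conn.execute("SELECT id, payload FROM events")
        while True:
            batch = cursor.fetchmany(batch_size)  # only a small slice in memory
            if not batch:
                break
            for event_id, payload in batch:
                handle(event_id, payload)  # stand-in for the real work
    finally:
        conn.close()

def handle(event_id, payload):
    """Placeholder for whatever the actual task does with one record."""
    pass
```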

Yadda Yadda Yadda...
