Computer History Documentaries – part 2

It’s been a while since I posted this list of some of the computer history documentaries and dramas that I’ve found most interesting, so I have a few more to recommend and add to the list:

  • Silicon Cowboys – fascinating documentary about Compaq, the development of their luggable PC, and their impact on the growth of the PC-compatible market. (4/27/17 – this is currently on Netflix)
  • Bedrooms to Billions – The Amiga Years: incredibly well-put-together indie documentary about the Amiga
  • Bedrooms to Billions: documentary about the development of the home computer games industry in the UK and Europe. Includes many interviews with the original developers and others involved in the industry at the time. If you had any interest in computer games in the mid-to-late '80s in the UK, this is a must-watch
  • Beep – documentary about sound and music development for computer games
  • Get Lamp – documentary by Jason Scott, covering text based computer adventure games
Nicola Caulfield & Anthony Caulfield (who produced the Bedrooms to Billions documentaries) currently have a new documentary called 'The PlayStation Revolution' that just reached its funding goal on Kickstarter; if you'd like to back it, you can do so via MegaFounder (linked from the Late Backer link on the Kickstarter page)

Updating/installing node.js on the Raspberry Pi

The latest versions of Raspbian (e.g. Jessie) come with an older version of node.js preinstalled. If you search around for how to install node.js on the Pi you'll find a number of different approaches, as it seems there's no official ARM-compiled version of the latest releases in the Debian repos.

This project provides later versions compiled for ARM. Follow the instructions on their site to download and install from the .deb file.

Before you start, if you already have an older version installed (check with 'node -v'), uninstall it first. The version on my fresh Jessie install came from nodejs-legacy, so 'sudo apt-get remove nodejs-legacy' did the trick.
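Besides 'node -v' from the shell, you can also confirm the runtime version (and that you're on the ARM build) from within node itself:

```javascript
// prints the node.js runtime version, e.g. 'v7.7.2'
console.log(process.version);

// on the Raspberry Pi this should report 'arm'
console.log(process.arch);
```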

Amateur Radio homebrew: Raspberry Pi + Packet Radio + social networking integration

I’m putting something together for our River City Amateur Radio Comms Society homebrew show-and-tell later this year. Here are my ingredients so far:

I’m thinking of building a bridge between Amateur Radio Packet and social networking, like Twitter.

So far I’ve roughed out node.js-to-Twitter integration using node-oauth, and I’ve put together a simple prototype using the node-ax25 library to connect to the KISS virtual TNC in Direwolf. It receives packets and writes callsigns and messages to the console.

Right now I’m testing this on a PC running Debian, with a Rigblaster Plug n Play connected to an Icom 880h. Later when my TNC-Pi arrives I’ll migrate this to the Pi.

So far the node-ax25 library looks pretty easy to use. Here's some code to dump received callsigns to the console:

var ax25 = require("ax25");

var tnc = new ax25.kissTNC({
    serialPort: "/dev/pts/1",
    baudRate: 9600
});

tnc.on("frame", function (frame) {
    var packet = new ax25.Packet({ 'frame': frame });
    console.log(new Date() + " From "
        + formatCallsign(packet.sourceCallsign, packet.sourceSSID)
        + " to "
        + formatCallsign(packet.destinationCallsign, packet.destinationSSID));

    if (packet.infoString !== "") {
        console.log(">  " + packet.infoString);
    }
});

/**
 * Formats a callsign, optionally including the SSID if present
 */
function formatCallsign(callsign, ssid) {
    // no SSID (or SSID 0): just the callsign, otherwise CALLSIGN-SSID
    if (ssid == "" || ssid == "0") {
        return callsign;
    }
    return callsign + "-" + ssid;
}
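As a quick check, here's the helper on its own with a couple of example inputs (the callsigns here are just made up for illustration):

```javascript
// standalone copy of the helper above, for illustration
function formatCallsign(callsign, ssid) {
    // no SSID (or SSID 0): just the callsign, otherwise CALLSIGN-SSID
    if (ssid == "" || ssid == "0") {
        return callsign;
    }
    return callsign + "-" + ssid;
}

console.log(formatCallsign("AE6OR", "0"));  // AE6OR
console.log(formatCallsign("K6JAC", "6"));  // K6JAC-6
```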

The output for received messages so far looks like this:

Wed Apr 19 2017 23:10:00 GMT-0700 (PDT) From KBERR  to KJ6NKR

Wed Apr 19 2017 23:12:05 GMT-0700 (PDT) From AE6OR  to BEACON

>  Š¤¤@à–¤ˆŽ@@`–„Š¤¤@`®žžˆ²@`–„Š¨@`¨‚žŠ@aðHello from 5W Garage packet node AUBURN 73's Steli !

Wed Apr 19 2017 23:12:08 GMT-0700 (PDT) From AE6OR  to BEACON

>  Š¤¤@à–¤ˆŽ@@à–„Š¤¤@ஞžˆ²@`–„Š¨@`¨‚žŠ@aðHello from 5W Garage packet node AUBURN 73's Steli !

Wed Apr 19 2017 23:16:49 GMT-0700 (PDT) From K6JAC -6 to ID

>  K6JAC-6/R BBOX/B KBERR/N

Wed Apr 19 2017 23:19:31 GMT-0700 (PDT) From K6WLS -4 to ID

>  Network Node (KWDLD)

Wed Apr 19 2017 23:19:50 GMT-0700 (PDT) From NM3S   to BEACON

>  Mike in South Sac.  Please feel free to leave a message. 73's

Calling the Twitter REST API from JavaScript with OAuth

I’ve started on a project where I need to call Twitter’s REST APIs from a Node.js JavaScript app. I’ve built a Java app before that integrated with Twitter, but used a library to help with the OAuth authentication. Looking around for a JavaScript library, it looks like node-oauth does what I need, so I gave it a go.

Twitter’s API docs are pretty good, but I ran into an issue with node-oauth with an error coming back from:

POST /statuses/update.json

which returned this message:

{ statusCode: 401,
 data: '{"errors":[{"code":32,"message":"Could not authenticate you."}]}' }

which is odd, because using the node-oauth library to authenticate and then call any of the GET APIs was working fine. If you Google this error there are numerous posts about it occurring for any number of reasons, so I’m not sure it’s a particularly specific message.

Here’s a working example for a GET:

First, use node-oauth to authenticate with your Twitter key, secret, and app access token and secret (which you can set up here):

var OAuth = require('oauth');

var oauth = new OAuth.OAuth(
    'https://api.twitter.com/oauth/request_token',
    'https://api.twitter.com/oauth/access_token',
    config.twitterConsumerKey,
    config.twitterSecretKey,
    '1.0A',
    null,
    'HMAC-SHA1'
);

Next, using this oauth object, make the GET request (the library builds the request including the OAuth headers for you):

//GET /search/tweets.json
oauth.get(
    'https://api.twitter.com/1.1/search/tweets.json?q=%40twitterapi',
    config.accessToken,
    config.accessTokenSecret,
    function (e, data, res){
        if (e) console.error(e);
        console.log(data);
    });

Attempting a POST to update the status for this account using the similar API:

var status = "{'status':'test 3 from nodejs'}";

oauth.post('https://api.twitter.com/1.1/statuses/update.json',
    config.accessToken,
    config.accessTokenSecret,
    status,
    function (error, data) {
        console.log('\nPOST status:\n');
        console.log(error || data);
    });

And this is the call that returned the “Could not authenticate you” error.

Looking through a few other tickets for the node-oauth project, this one gave a clue – that particular issue was about special chars in the body content of the POST, but it gave an example of the body formed like this:

var status = ({'status':'test from nodejs'});

I would have thought passing a String would have worked, but it seems the API expects an object – node-oauth appears to form-encode an object body and include the parameters in the OAuth signature, which doesn’t happen with a raw string. Anyway, this is working, and looks good so far.
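For reference, here's a sketch of the working POST with the status passed as an object rather than a string. This assumes the same oauth and config objects as the earlier snippets; buildStatusBody is just a hypothetical helper to make the shape of the body obvious:

```javascript
// Hypothetical helper: the body must be a plain object, not a JSON string,
// so node-oauth can form-encode it and include the parameter in the
// OAuth signature.
function buildStatusBody(text) {
    return { status: text };
}

// Assumes oauth and config are set up as in the earlier snippets:
// oauth.post('https://api.twitter.com/1.1/statuses/update.json',
//     config.accessToken,
//     config.accessTokenSecret,
//     buildStatusBody('test from nodejs'),
//     function (error, data) {
//         console.log(error || data);
//     });
```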

Retro Battlestation (update 2): dialing up a local BBS with the 2002 Power Mac G4 Quicksilver

Since my first post after receiving my 2002 Power Mac G4 Quicksilver, I’ve learned a number of things about this machine.

Following the Retro Battlestations subreddit, there was a group activity a while back to dial into the group’s BBS using, yes, a real dial up modem, from your retro machine of choice. So I thought I’d give it a go.

Internal modem missing

The internal modem is on a board about the size of a pack of cards, and it’s normally screwed to standoffs in the top left of the motherboard in the photo. It was obviously removed from this machine, which explains why an internal modem was not showing up as installed.

Not to be deterred, I noticed you can pick up an Apple USB modem for just $10 online, so I ordered one. Wanting to give it a go truly old-style, I wanted to dial in from OS 9 – this is when I found out that this USB modem is a ‘soft’ modem, in that it works mostly in software, and no, it’s not supported on OS 9.

In OS X 10.4 however, it gets recognized correctly as an External Modem in the System Preferences panel:
For a dial-up terminal emulator for the Mac, I found most references pointing to ZTerm, so I downloaded and installed it, and it sees the USB modem. I got the BBS number configured, and off we go!

Success! Dialed in to the Level 29 BBS! I was expecting to see ANSI colors in the text display, so I’m not sure if I need to change a setting in ZTerm, but so far I’m pretty excited this works!

Using Netflix Eureka with Spring Cloud / Spring Boot microservices

I’ve been taking a look at this article on using Spring Cloud‘s integration/support for Netflix Eureka. I’ve started to put together a simple example using Eureka as a service registry for a couple of Spring Boot services, and what this would look like if deployed in Docker containers.

So far I’ve created the initial service that uses Spring Cloud’s @EnableEurekaServer annotation to start up the Eureka service.

Jumping ahead of the instructions: by default, if you run this app it will attempt to reach out and find a locally running Eureka server and register with it. But since this app is the Eureka server, you need to add config to tell it not to do this. Otherwise you’ll see errors like:

com.sun.jersey.api.client.ClientHandlerException: java.net.ConnectException: Connection refused
at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:187) ~[jersey-apache-client4-1.19.1.jar:1.19.1]
at com.sun.jersey.api.client.filter.GZIPContentEncodingFilter.handle(GZIPContentEncodingFilter.java:123) ~[jersey-client-1.19.1.jar:1.19.1]
at com.netflix.discovery.EurekaIdentityHeaderFilter.handle(EurekaIdentityHeaderFilter.java:27) ~[eureka-client-1.6.2.jar:1.6.2]

Adding the recommended config per the article:

server:
  port: ${PORT:8761}

eureka:
  client:
    registerWithEureka: false
    fetchRegistry: false
  server:
    waitTimeInMsWhenSyncEmpty: 0

Now when I start up I see this:

2017-04-11 22:23:17.040  INFO 37607 - o.s.c.n.eureka.InstanceInfoFactory       : Setting initial instance status as: STARTING
2017-04-11 22:23:17.100  INFO 37607 - com.netflix.discovery.DiscoveryClient    : Initializing Eureka in region us-east-1
2017-04-11 22:23:17.101  INFO 37607 - com.netflix.discovery.DiscoveryClient    : Client configured to neither register nor query for data.
2017-04-11 22:23:17.110  INFO 37607 - com.netflix.discovery.DiscoveryClient    : Discovery Client initialized at timestamp 1491974597110 with initial instances count: 0
2017-04-11 22:23:17.192  INFO 37607 - c.n.eureka.DefaultEurekaServerContext    : Initializing ...
2017-04-11 22:23:17.195  INFO 37607 - c.n.eureka.cluster.PeerEurekaNodes       : Adding new peer nodes [http://localhost:8761/eureka/]
2017-04-11 22:23:17.359  INFO 37607 - c.n.d.provider.DiscoveryJerseyProvider   : Using JSON encoding codec LegacyJacksonJson
2017-04-11 22:23:17.359  INFO 37607 - c.n.d.provider.DiscoveryJerseyProvider   : Using JSON decoding codec LegacyJacksonJson
2017-04-11 22:23:17.359  INFO 37607 - c.n.d.provider.DiscoveryJerseyProvider   : Using XML encoding codec XStreamXml
2017-04-11 22:23:17.359  INFO 37607 - c.n.d.provider.DiscoveryJerseyProvider   : Using XML decoding codec XStreamXml
2017-04-11 22:23:22.479  INFO 37607 - c.n.eureka.cluster.PeerEurekaNodes       : Replica node URL:  http://localhost:8761/eureka/
2017-04-11 22:23:22.486  INFO 37607 - c.n.e.registry.AbstractInstanceRegistry  : Finished initializing remote region registries. All known remote regions: []
2017-04-11 22:23:22.486  INFO 37607 - c.n.eureka.DefaultEurekaServerContext    : Initialized

Hitting http://localhost:8761 I get the fancy Eureka dashboard:

Looks good so far! More to come later.

Github repo for the code so far is here.

git error: “refusing to merge unrelated histories”

When working with local git repos, whenever I add a remote repo on GitHub and then try to pull down master from remote to local, I get errors about the local repo not being on the same branch as the remote, or more recently this error:

fatal: refusing to merge unrelated histories

I think in the past I’ve done a ‘reset --hard’ as described here, but that didn’t work for me this time; I got the same error about ‘unrelated histories’.

Turns out this might be related to a change in git behavior, as described here.

Doing:

git pull github master --allow-unrelated-histories

and then followed with:

git push github master

fixed my issue.

Retro Battlestation: just received my 2002 Power Mac G4 Quicksilver

I just picked up a pretty good eBay deal on a 2002 Power Mac G4 Quicksilver. It was sold as working, and yes, it boots up, and it did come with OS X 10.4.11 installed as advertised.

Inside, it looks almost new. When I recently took some old PC towers to the electronics recycling, inside they looked like they’d accumulated 100 years’ worth of dust and god knows what. By comparison, for a 15-year-old machine, this one looks like it was kept sealed in a box for most of that time – it’s spotless, with no dust in sight.

Clean!

It looks like it has 10.4.11 cleanly installed, but I also picked up a used OS X 10.4 Tiger DVD to do a clean install myself.

The DVD drive in the machine does not want to open. It whirs and clicks when you hold F12, but no go. Using the paper-clip trick in the manual-open hole on the front of the drive, it opens up and there’s nothing jammed in there; it just doesn’t want to open. I tried putting the DVD in, manually closing the drive and then powering on, but it doesn’t spin up and read the disk.

By the way, on this Power Mac G4 Quicksilver, the DVD drive’s manual-open hole is obscured by the front of the case, so the only way to get a paper clip in is to physically remove the drive from the case.

Given the issues with the DVD drive, I discovered that this machine will boot from a USB flash drive (there’s a discussion in this thread about how all Intel Macs will boot from USB; this feature was apparently supported on some G4 and G5 machines, but not all).

To install Mac OS 9 I copied the ISO of the OS9Lives universal installer to a USB drive using Infrarecorder on a Windows 10 desktop. Holding down Option/Alt at startup brings up the boot menu, which shows the USB drive, and clicking on it starts the boot. I wasn’t sure about using the ‘Restore’ option in the OS9Lives universal installer, as it seems from the instructions that it wipes your partition.

Instead, I’ve read in a few different forum posts that you can just copy the ‘System Folder’ from an OS 9 image to the drive, along with ‘Applications’ (rename it ‘Applications (OS 9)’ if you’re dragging them to the same partition as OS X; if it’s a different partition the name can stay as Applications).

Interesting that this just works – if you select the OS 9 System Folder as the Startup Disk in System Preferences, then when you reboot it just starts up.

To get a copy of the OS X 10.4 DVD onto a USB flash drive, I used Infrarecorder again to make an image, and then used ‘dd’ on my MacBook Pro to write the image to a flash drive.

I’m going to do a fresh install, but booting it up and looking around at what’s already on there: for OS X 10.4 on a single-CPU (not dual) 800MHz PowerPC with 512MB, performance is not bad; it’s pretty responsive. Both Tiger and OS 9 boot pretty quickly (Tiger boots a few seconds faster, which is surprising).

Quick observations:

  • Safari on OS X is terribly slow, practically unusable
  • TenFourFox on OS X is usable but sluggish scrolling any page. Makes you appreciate how fast modern-day machines are
  • Classilla on OS 9 is pretty snappy. Of the browsing options available, this is the better choice on this machine so far.

Next up I’ll be trying to boot from the image of the 10.4 DVD and doing a fresh install. More to come later.

Getting status from dd when writing disk images on MacOS

dd is a pretty useful tool for creating and writing disk images from a source to a destination, for example writing disk .img files to SD Cards for your Raspberry Pi (see here, and here).

The trouble is, if you’re writing images of several GB the write can run for 20 minutes or so, and you don’t get any feedback on progress until it’s complete. Well, it turns out if you send a ‘kill -INFO’ signal to the PID of the dd process, it will output the current status of bytes written and bytes remaining. Found this tip here.

Serving static content, REST endpoints and Websockets with Express and node.js

I’ve yet to see a framework that is as simple as Express for developing REST endpoints. I’m experimenting with a React app that receives push updates from the server using Websockets. Is it possible to use Express to serve all the requests for this app: static content (the React app), REST endpoints and Websocket? Turns out, yes, and it’s pretty easy too.

Starting first using Express to serve static content:

This uses the static middleware for serving the static content.

Handling REST requests with Express is simple using the get(), post(), put(), and delete() functions on the Router. Adding an example of a GET for /status, now we have this:

 

Next, adding support for Websockets, using the ws library. Incrementally adding to the code above, now we create a WebSocket.Server, using the option to pass in the already created HTTP server:

const SocketServer = require('ws').Server;

const wss = new SocketServer({ server });

At this point we add callbacks for ‘connection’ and ‘message’ events, and we’re in business:

This is the starting point for a React app to build a websockets client, more on that in a future post. The code so far is available in this github repo: https://github.com/kevinhooke/NodeExpressStaticRESTAndWebsockets