Screenshot features in Mac OS X and macOS

10.13 High Sierra and earlier: use key shortcuts (see below) or Grab.app (see here)

10.14 Mojave and later: Screenshot.app (more info here):

  • Similar to Grab, but after you’ve taken the screenshot it lets you annotate/mark it up, and then you can either copy and paste from there or save it as a file.
  • Additionally, Screenshot remembers the on-screen position of an area screenshot, so you can easily take repeated screenshots of the same portion of your screen.

Key shortcuts in all Mac OS X / macOS versions (see here)

Shift + Cmd + 3 : captures whole screen

Shift + Cmd + 4 : starts a region screenshot; select a region of the screen

  • before 10.14 Mojave – saves a .png to your desktop
  • 10.14 Mojave and later – opens the screenshot in the Screenshot editor app
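
If you’d rather script your captures, macOS also ships a screencapture command-line tool that covers the same ground (the output paths below are just examples):

```shell
# Capture the whole screen to a file (like Shift + Cmd + 3)
screencapture ~/Desktop/fullscreen.png

# Interactively select a region with the mouse (like Shift + Cmd + 4)
screencapture -i ~/Desktop/region.png

# -c sends the capture to the clipboard instead of a file
screencapture -c
```

See `man screencapture` for the full list of options.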

PiAware dump1090 no longer loading on 8080 (and easy solution)

I recently noticed that my dump1090/PiAware web UI had stopped loading. The unusual thing was that part of the web app appeared to load, but then it would hang with the ‘please wait’ spinner.

Looking at dev tools in Chrome, there were a bunch of 404s on the data feeds.

A quick search found a couple of useful posts, especially this one, suggesting first checking the logs with:

sudo journalctl -u dump1090-fa -n50 --no-pager

And here’s what I found:

Apr 22 20:24:02 raspberrypi systemd[1]: dump1090-fa.service holdoff time over, scheduling restart.
Apr 22 20:24:02 raspberrypi systemd[1]: Stopping dump1090 ADS-B receiver (FlightAware customization)…
Apr 22 20:24:02 raspberrypi systemd[1]: Starting dump1090 ADS-B receiver (FlightAware customization)…
Apr 22 20:24:02 raspberrypi systemd[1]: Started dump1090 ADS-B receiver (FlightAware customization).
Apr 22 20:24:02 raspberrypi dump1090-fa[27365]: Wed Apr 22 20:24:02 2020 PDT dump1090-fa 3.5.1 starting up.
Apr 22 20:24:02 raspberrypi dump1090-fa[27365]: rtlsdr: no supported devices found.
Apr 22 20:24:02 raspberrypi systemd[1]: dump1090-fa.service: main process exited, code=exited, status=1/FAILURE
Apr 22 20:24:02 raspberrypi systemd[1]: Unit dump1090-fa.service entered failed state.

Here’s your problem, right here:

dump1090-fa[27365]: rtlsdr: no supported devices found.

Quick fix: I pushed the RTL-SDR USB dongle back in (it had come slightly loose) and rebooted:

-- Logs begin at Wed 2020-04-22 20:27:02 PDT, end at Wed 2020-04-22 20:27:55 PDT. --
Apr 22 20:27:05 raspberrypi systemd[1]: Starting dump1090 ADS-B receiver (FlightAware customization)…
Apr 22 20:27:05 raspberrypi systemd[1]: Started dump1090 ADS-B receiver (FlightAware customization).
Apr 22 20:27:05 raspberrypi dump1090-fa[571]: Wed Apr 22 20:27:05 2020 PDT dump1090-fa 3.5.1 starting up.
Apr 22 20:27:05 raspberrypi dump1090-fa[571]: rtlsdr: using device #0: Terratec T Stick PLUS (Realtek, RTL2838UHIDIR, SN 00000001)
Apr 22 20:27:05 raspberrypi dump1090-fa[571]: Found Elonics E4000 tuner
Apr 22 20:27:05 raspberrypi dump1090-fa[571]: rtlsdr: enabling tuner AGC

Problem solved – look at that empty sky over Sacramento!
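
For future reference, if reseating the dongle hadn’t worked, a quick way to check whether the RTL-SDR stick is visible on the USB bus at all (assuming the standard Raspbian tools) would be:

```shell
# RTL-SDR dongles usually enumerate as a Realtek RTL283x device
lsusb | grep -i -e rtl -e realtek

# Then restart the service and tail the logs again
sudo systemctl restart dump1090-fa
sudo journalctl -u dump1090-fa -n 20 --no-pager
```

If lsusb shows nothing, it’s a hardware/connection problem rather than a dump1090 one.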

Using Avro Serializer with Kafka Consumers and Producers

Some of the Avro serializer/deserializer and Schema Registry classes are not available in jars from the usual Maven Central repo. Confluent maintains its own repository, which you can add to your pom.xml with:

<repositories>
  <!-- For io.confluent jars not in Maven Central -->
  <repository>
    <id>confluent</id>
    <url>http://packages.confluent.io/maven/</url>
  </repository>
</repositories>

Then you can add the dependency:

<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-avro-serializer</artifactId>
  <version>5.4.1</version>
</dependency>

This dependency allows you to use the KafkaAvroSerializer in your properties:

value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
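
On the consumer side, the mirror-image settings use the KafkaAvroDeserializer; setting specific.avro.reader=true tells it to deserialize into your generated specific-record classes rather than GenericRecord:

```properties
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
specific.avro.reader=true
```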

To generate Avro specific-record classes from an .avsc file, following the Avro developer guide here, add the Avro dependency and the generator plugin:

<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro</artifactId>
  <version>1.9.2</version>
</dependency>

and the plugin:

<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.9.2</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>

The plugin configuration looks for .avsc schema files in the src/main/avro folder and generates the corresponding Java classes during the generate-sources phase. An example schema file looks like this:

{
  "namespace": "kh.kafkaexamples.avro",
  "type": "record",
  "name": "TestMessage",
  "fields": [
    {"name": "firstName", "type": "string"},
    {"name": "lastName", "type": "string"}
  ]
}

The plugin will generate an Avro class for each .avsc file it finds in the configured folder.

To use Avro messages with Confluent Platform (or Confluent Cloud), you also need to specify a URL for the Schema Registry, otherwise you’ll see this error:

Caused by: io.confluent.common.config.ConfigException: Missing required configuration "schema.registry.url" which has no default value.
at io.confluent.common.config.ConfigDef.parse(ConfigDef.java:251)

You also need to prefix the URL with http:// or https://, otherwise you’ll see this exception:

Exception in thread "main" org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.net.MalformedURLException: unknown protocol: localhost

Assuming you’re running Confluent Platform locally, the Schema Registry property is:

schema.registry.url=http://localhost:8081
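
A quick way to sanity-check that the Schema Registry is actually up is its REST API (this assumes a locally running registry on the default port):

```shell
# Lists registered subjects; after your first publish you should see
# an entry like test-avro-topic-value
curl http://localhost:8081/subjects
```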

To publish a message using the generated TestMessage class from the above schema:

Producer<String, TestMessage> producer = new KafkaProducer<>(props);
TestMessage message = new TestMessage();
message.setFirstName("firstname");
message.setLastName("lastname");
producer.send(new ProducerRecord<>("test-avro-topic", "1", message));
producer.flush();
producer.close();
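
And a minimal consumer sketch for the same topic, in the same fragment style as the producer above. This assumes props contains the KafkaAvroDeserializer settings (including schema.registry.url, specific.avro.reader=true, and a group.id) rather than the producer serializers:

```java
Consumer<String, TestMessage> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("test-avro-topic"));

// Poll once for up to 5 seconds; a long-running consumer would loop here
ConsumerRecords<String, TestMessage> records = consumer.poll(Duration.ofSeconds(5));
for (ConsumerRecord<String, TestMessage> record : records) {
    System.out.println(record.value().getFirstName() + " " + record.value().getLastName());
}
consumer.close();
```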

Done!