The Programs of the Week of the H'ackathon
This Week’s Program: Aug 7 - Aug 11
Last week’s Tinyletter was just all out of whack. The subject line of the email got mangled: it was “The Programs of a Week of `#lang`”, and Markdown processing chewed that up. I also messed up the date in the title line. I’m playing fast and loose with this little letter. Move fast and break things, amirite?
60b6c919d770f69c2b18e0a5a86b7ff2fc4e9806
In this commit, I move the functionality for creating an RTMP sink for Twitch into its own module. In the process, I also make `twitch-stream-key` a Racket parameter. If you’ve worked in Clojure, you are probably familiar with *earmuffs* (a var surrounded by asterisks on either side). This is a Clojure convention for vars that are intended for rebinding. Long-time readers of This Week’s Program might recall how I used this in sonic-sketches with `clojure.data.generators/*rnd*` and `binding` to get deterministic results with pseudorandomness.

In Scheme, this kind of dynamic rebinding feature has a first-class representation called a parameter. Now when streaming to Twitch from within Overscan, you can `parameterize` your program with a Twitch stream key (it defaults to getting one from the environment).
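As a minimal sketch of the pattern (the `stream-key` name here is mine for illustration, not Overscan’s actual API), a parameter with an environment-variable default and a dynamic rebinding looks like this:

```racket
#lang racket

;; A parameter whose default value comes from the environment,
;; in the spirit of the twitch-stream-key parameter described above.
(define stream-key
  (make-parameter (getenv "TWITCH_STREAM_KEY")))

;; parameterize rebinds the parameter for the dynamic extent of its
;; body; outside that extent, the default value is restored.
(parameterize ([stream-key "live_example_key"])
  (printf "Streaming with key: ~a\n" (stream-key)))
```

Like Clojure’s `binding`, the rebinding is scoped dynamically rather than lexically, which is what makes it handy for configuration like a stream key.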
Through the rest of the week I spend a fair bit more time working on documentation and scratching my head about how best to expose parts of the Introspection API.
Harry’s H’ackathon
This week, the engineering team at my employer did our inaugural Hackathon! For one day, the engineering team split up into teams to work on a bunch of fun projects. I worked on a team where we decided to pursue doing some stuff with Augmented Reality (lol it was kind of a fun goof to learn about ARKit).
This was my first time really trying to get my hands dirty with the Swift programming language, which I found interesting. What makes Swift particularly challenging for learning in a day is contorting your brain to also accept Objective-C as input. In the end I got some exposure to ARKit, but I spent the better part of the day working through some stuff with Vision, Apple’s new computer vision framework that’s a smaller part of Core ML, Apple’s new machine learning framework.
I was exposed to a heck of a lot of new stuff all at once, and had a great time with the quick immersion. Apple is really good at creating frameworks. I would really love to return to app development at some point in the future, but the important thing is that I came away from this experience with two distinct learnings.
- You cannot do both computer vision/feature detection and augmented reality at the same time. The phone is just not powerful enough. We repeatedly hit this wall with both Vision and Core Image. If anyone has any examples otherwise, I’d love to see them!
- Apple’s programming monospace font, SF Mono, is really lovely. This font came as a part of macOS Sierra, and is used in Xcode, but is not part of the Mac system fonts. To extract this font to use in your preferred tools and editor, follow this tutorial. I’m using SF Mono in Emacs now and it is really nice.
Have a good evening and a great weekend!
📱 Mark