I’m reflecting on my information strategy and how to improve it at the moment. Here’s the intro post to that, giving a bit of context.

In this particular post I’m going to chat specifically about discovery, AKA using the Intertubes to find out about interesting things. I can’t help but call my strategy for this my discostrat, apologies for that. I’ll talk a bit about how I’ve had it set up for a while, and some recent tweaks I’ve made to it, and maybe some ways I want to improve it more.

Photo by Vale Zmeykov on Unsplash


I’ve gradually shifted away from Twitter and Facebook over the past few years, migrating to the open alternatives of the IndieWeb and the Fediverse as replacements. I really like them both: each goes some way towards providing the things that I want from social media (and from being online in general), while removing some of the things that I don’t want.

It’s far from perfect yet though. In this post (and more to come probably) I’m reflecting a little bit on my ‘information strategy’ – how to get the stuff I want from the big ol’ world wide web. I am mostly inspired to start on this by reading Ton’s thoughts on information strategies, as well as a vague feeling I have that while I get a lot out of the web, it comes with plenty of pitfalls and its usage can in fact be to my detriment if unchecked and unmanaged.

As I’m more into the IndieWeb, I guess I’m kind of looking at all this through an IndieWeb lens, but the Fediverse is a big part of it too for me. Sometimes you hear them both referred to as ‘the open web’ as a catch-all.

What I want from the web

I’m totally guilty of being technology-focused (including in this post already…), but I think for reviewing my information strategy it’s better to try to start from a question of ‘where do you get your info from and what do you do with it’ rather than ‘what technology do you use’. So in a really high-level sense, what I want to get from spending time online is:

  • discovery (finding out about interesting things)
  • writing & reflection (producing not just consuming, posting publicly about things I’ve read or seen, and having to think about it before I do)
  • discourse & learning (getting others’ perspective on things, having my horizons expanded and my views challenged)
  • relationships: I think the ‘social’ part of social media should mean forming long-lasting bonds with people, not just being ephemeral blips on each other’s radars

I also want to give back and contribute to a healthy ecosystem – so I want to share helpful information with others, and contribute to others’ discussion and reflection. Good info is good praxis.

What I don’t want

I want to spend my time well, basically. With intention.

So that means avoiding:

  • staring at a screen excessively, doing the zombie scroll of doom trying to get the next hit of information or interaction
  • FOMO
  • information overload
  • filter bubbles
  • feeding of corporate beasts


This post originally also included me delving into my discovery strategy, but with the preceding waffle it started to become a huge post, so I’ve farmed off the discovery bit into a separate post – coming soon (UPDATE: here it is).

I also hope in upcoming posts to think out loud about the other pieces – of ‘reflection’, ‘discourse’ and ‘relationships’. I’m assuming this will all evolve over time, too, so they might actually be different topics by then. Which is where personal knowledge management and a wiki comes in, but that’s also for another day…

Yesterday was the Festival of Maintenance in Liverpool. I went along and thoroughly enjoyed it. Lots of interesting talks and nice people. The theme was basically the maintenance of things, sometimes purely engineering-focused (e.g. bridges and potholes and data), and sometimes with a pretty lefty alternative economics, post-growth, social care focused thread running through it too.

It was at The Fashion Hub, a really awesome space in the old fashion district of Liverpool, with the excellent DoES makerspace just upstairs.

It was chock full of interesting things. I have loads of notes that I hope to digest and probably microblog about over the coming days, but a quick note of a few themes that formed for me:


Students need to learn more about maintenance and repair, not just how to make new things. Tom of Holland mentioned how textile repair used to be in most textbooks, but no more. Civil engineer Mujib Rahman mentioned how students come into university wanting to know how to build things, and with less interest in how to maintain them.

maintenance as care

I’ve thought before how social care can kind of be seen as a form of maintenance in a fairly abstract sense. But I liked the point made by Juliet Davis about how maintaining something for someone can be seen as an act of care for the person, too.

patterns of maintenance

Common themes seemed to arise throughout the day – e.g. the importance of monitoring in maintenance. Chris Adams mentioned during the panel discussion that a pattern language of maintenance could be a useful resource – I agree. The ODI have just released their pattern catalogue of collaborative data maintenance, but there could be scope for a general pattern catalogue of maintenance techniques and considerations.

I’ve been reading through the first chapter of The Little Schemer, and following along in Emacs (specifically: spacemacs).

I jotted down a few notes on getting set up to do this.

First off you’ll need to install a Scheme implementation. There’s a few of them out there. This Reddit thread has some useful discussion on the pros and cons of each of them.

As I’m on Linux and using Emacs, I went with Guile.

To install on Linux Mint, you just need:

sudo apt install guile-2.2

At this point, you could simply fire up Guile and work within the REPL there, if you wanted.

I want to write my Scheme in Emacs, and then send it to the REPL from there. The preferred Emacs package for that seems to be geiser. In spacemacs, geiser comes included with the scheme layer, so all you need to do is add that layer into your config and you’ve got geiser (and some other handy scheme bits and pieces).

Once you’re in a scheme file, run M-x run-geiser, choose guile from the dropdown, and that’ll start up the Guile REPL and allow you to send parts of your file to it for evaluation. C-x C-e for example will send the sexp before the cursor.

I’ve just gotten ProtonMail email working in mu4e in emacs via IMAP, with ProtonMail’s Bridge application and mbsync.

Most of these moving parts are new to me, so I’m not sure if what follows is 100% accurate or correct. But in case it’ll be of help to others (and to me in 3 months) here are the steps I needed and the resources I used to get it up and running.

ProtonMail Bridge

To get an IMAP client working with ProtonMail, you also need to set up a separate app called Bridge. I haven’t looked that much into the nitty-gritty of it, but I think the main point of Bridge is to keep the end-to-end encryption when using ProtonMail with IMAP – i.e. so the encryption continues all the way until it reaches your mail client. I think Bridge functions as a local IMAP server (and SMTP server) that you point your client apps at, and Bridge does all the encryption/decryption locally before communicating with ProtonMail’s servers.

To be honest, given the caveats around the utility of ProtonMail’s encryption (e.g. that it only really works end-to-end if it’s two ProtonMail users communicating), setting up the Bridge is a bit of an extra annoyance that I imagine would put plenty of people off. But anyway kudos to ProtonMail for making it so at least you can use IMAP.

It’s probably not too hard to set up on Windows or MacOS. But the Bridge app on Linux is in beta, so it’s a bit of extra work. You have to email ProtonMail to get access to it first of all, then they send you a bunch of instructions to set it up. On Debian derivatives, it’s basically just installing a deb file, but first with a bunch of steps for verifying the provenance of that deb.

Then you run ProtonMail Bridge and set up your account in there, same as on the other platforms. That’s easy enough.


So that’s all fine and dandy. Next step is to point your IMAP client at ProtonMail Bridge. There’s instructions on the ProtonMail site for using Thunderbird with Bridge on Linux, which is probably the easiest next step if you’re happy to use Thunderbird.

I wanna use mu4e in emacs though. Because emails are predominantly text, and anything involving a fair amount of text manipulation I would prefer to do in emacs.

So to use mu4e, you need your emails in Maildir format. I don’t know much about what this is. From what I’ve seen I’m guessing a file-based, Unix-y way of storing mail. Possibly better/more efficient, possibly just historical intransigence. I dunno.
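For what it’s worth, the format itself is simple: a Maildir is just a directory with three subdirectories (cur, new, tmp), and each message is a single file – which is what makes it play nicely with Unix tools. A sketch of an empty one (the paths here are just for illustration):

```shell
# a Maildir is a plain directory tree; each message will be one file inside it
demo=$(mktemp -d)
mkdir -p "$demo/INBOX/cur" "$demo/INBOX/new" "$demo/INBOX/tmp"
# the three Maildir subdirectories: cur new tmp
ls "$demo/INBOX"
```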

mbsync is a utility to sync between IMAP and Maildir. I gotta say, any project still knocking around on sourceforge I immediately expect to be dead as a dodo. But mbsync kept coming up in recent-ish reddit posts about ProtonMail and IMAP, so it seems to be an OK choice for the syncing. Another option is offlineimap.

On Mint, I installed mbsync with:

sudo apt install isync

So then you need to configure mbsync to point at an IMAP server, and to map your IMAP folders to your Maildir folders. This goes in your ~/.mbsyncrc config file.

This reddit post has a handy minimal guide of how to set up your mbsync config… or at least it would, if it hadn’t been posted on ghostbin and ghostbin hadn’t gone kaput. Luckily though someone made a copy of it in a gist on github.

I’ll copy it here too for posterity:

IMAPAccount protonmail
Host 127.0.0.1
Port 1143
User USERNAME_HERE@protonmail.com
# you can enter a command to retrieve your password
PassCmd "gpg2 -q -d ~/.authinfo.gpg | awk 'FNR == 1 {print $8}'"

SSLType NONE

IMAPStore remote
Account protonmail

# can change .mail to something else
MaildirStore local
Path ~/.mail/
Inbox ~/.mail/INBOX/

Channel inbox
Master :remote:
Slave :local:
Patterns * !"Drafts" !"All Mail"
Create Slave
SyncState *

Group protonmail
Channel inbox
I actually got an error with that as is, something about unknown section keyword 'SSLType'. You need to delete the blank line before SSLType NONE.

I also found a different .mbsyncrc that was helpful, posted on pastebin. Copying below:

IMAPAccount protonmail
Host 127.0.0.1
Port 1143
User USERNAME_HERE@protonmail.com
# you enter a command to retrieve your password
PassCmd "gpg2 -q -d ~/.authinfo.gpg | awk 'FNR == 1 {print $8}'"
SSLType NONE

IMAPStore pm-remote
Account protonmail

MaildirStore pm-local
Path ~/.mail/
Inbox ~/.mail/INBOX/

Channel pm-inbox
Master :pm-remote:
Slave :pm-local:
Patterns "INBOX"
Create Both
Expunge Both
SyncState *

Channel pm-sent
Master :pm-remote:"Sent"
Slave :pm-local:"sent"
Create Both
Expunge Both
SyncState *

Channel pm-trash
Master :pm-remote:"Trash"
Slave :pm-local:"trash"
Create Both
Expunge Both
SyncState *

Channel pm-spam
Master :pm-remote:"Spam"
Slave :pm-local:"spam"
Create Both
Expunge Both
SyncState *

Group protonmail
Channel pm-inbox
Channel pm-sent
Channel pm-trash
Channel pm-spam

This one has a few other folders mapped, and with a pm prefix, which is probably useful if you’re also syncing other accounts (but maybe there’s a better way of doing that).

I don’t know what Create, Expunge, and SyncState do.

Once configured, you should be able to sync with

$ mbsync protonmail

At this point we should be able to read emails in mu4e.


I’m using spacemacs, for which there’s a mu4e layer, so I just added that to my list of active layers in my .spacemacs config. This promptly gave an error when refreshing the config, something to do with maildirs. Commenting out the offending line in .emacs.d/layers/+email/mu4e/packages.el sorted it out. Haven’t looked into exactly what that line does, so I’ll have to go back and figure out some other solution at some point.

OK, now I could start mu4e with M-x mu4e… and got another error:

error in process sentinel: Database empty; try indexing some messages

Don’t know why, but running:

mu index --maildir=~/.mail

sorted that out. And mu4e worked, woohoo!


To also send mail via ProtonMail from within mu4e, a bit of extra config is needed for the smtp side of things.

The previously mentioned ghostbin/gist had all the details for that, copying again:

Create a file ~/.authinfo with the following contents:

machine 127.0.0.1 login USERNAME_HERE@protonmail.com port 1143 password PASSWORD_PROVIDED_BY_BRIDGE
machine 127.0.0.1 login USERNAME_HERE@protonmail.com port 1025 password PASSWORD_PROVIDED_BY_BRIDGE

This should be secured so the password isn’t included in plaintext in the authinfo file; there’s more info on that in the gist.
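As an aside on how this ties in with the PassCmd in the mbsync config: an authinfo line has the form machine HOST login USER port PORT password PASS, so the password is the 8th whitespace-separated field of the line, and that’s exactly what the awk in PassCmd extracts. A quick check of just the awk part, using a dummy line (fake host and password) and no gpg:

```shell
# a dummy authinfo line in the standard format (nothing real here)
line='machine 127.0.0.1 login user@protonmail.com port 1143 password s3cret'
# FNR == 1 keeps only the first line; $8 is the 8th field, i.e. the password
echo "$line" | awk 'FNR == 1 {print $8}'
# prints: s3cret
```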

Note that the email domain doesn’t need to be protonmail.com – I used my custom domain and it works fine.

mu4e config

The same mini-tutorial also has some handy (and I think required) mu4e config:

(setq mu4e-maildir "~/.mail"
    mu4e-attachment-dir "~/downloads"
    mu4e-sent-folder "/Sent"
    mu4e-drafts-folder "/Drafts"
    mu4e-trash-folder "/Trash"
    mu4e-refile-folder "/Archive")

(setq user-mail-address "USERNAME_HERE@protonmail.com"
    user-full-name  "YOUR_NAME")

;; Get mail
(setq mu4e-get-mail-command "mbsync protonmail"
    mu4e-change-filenames-when-moving t   ; needed for mbsync
    mu4e-update-interval 120)             ; update every 2 minutes

;; Send mail
(setq message-send-mail-function 'smtpmail-send-it
    smtpmail-auth-credentials "~/.authinfo.gpg" ;; Here I assume you encrypted the credentials
    smtpmail-smtp-server "127.0.0.1"
    smtpmail-smtp-service 1025)


With all that, everything seems to be just about working – now I just need to get used to using mu4e. But I’m liking it already – much quicker to navigate around emails and to write responses.

I also need to figure out if all of the mbsync mappings are correct, for things like Archives, Spam and Trash folders.

Just came across the poem All Watched Over by Machines of Loving Grace by Richard Brautigan (via the Adam Curtis documentary of the same name).

It’s kind of fascinating. I like it. I know it came from a period whose technological utopianism certainly didn’t come to pass, and might have been a bit off-key in the first place, but it’s sweetly optimistic (…or bitingly critical, depending on which way you squint at it).

All Watched Over by Machines of Loving Grace

I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.

I like to think
(right now, please!)
of a cybernetic forest
filled with pines and electronics
where deer stroll peacefully
past computers
as if they were flowers
with spinning blossoms.

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

If it were written today it would surely be ironic. But I wonder if it was heartfelt back in the 60s? I find what it paints to be kind of a mixture of pleasantly bucolic and desirable, and weird and creepy, all at once. Not sure if I want it or not. I like the idea of a cybernetic ecology, where we are free of our labours, and joined back to nature. Not so keen on being watched over by machines of loving grace. (Though the benevolent AIs in Iain M. Banks’ Culture novels could be good role models if we did want machines of loving grace…)

It’s interesting that the poem doesn’t really make a case for technology, other than the nod towards a kind of fully automated luxury communism at the end. It just sort of assumes that tech is the route to liberation – I guess that’s the flavour of the time. I’m not a primitivist, but I’m not sure that an IoT meadow would have all that much more benefit than the analogue equivalent.


We had a fun EvalApply session today, during which we decided to add to our homepage on evalapply.space a sentence about our interest in examining programming and technology in a wider societal context.  Early on we discussed that this was important to us all, and we often end up chatting about these topics when we meet – perhaps more so than SICP, so far!

We were pondering EvalApply as a name for the group for a short period – from an early email:

In addition, thinking about it further, it also has a double meaning to me that I really like.
“Before we apply a function we must first evaluate its arguments.”
Taken metaphorically I feel that this captures [our] philosophical and political views towards technology in a broader sense.  We consider the social ramifications of technology before recommending its use.  We evaluate the arguments before applying its function.

Some things that I remember we chatted about today:

The importance of constraints, or having a limited palette.  I can’t remember how it came up, but for me it recalled some of the ideas from old tracker music software and the demoscene, where using restricted hardware and software can be a useful creative constraint.

We talked about community moderation (further to a short note about it earlier this week), with Panda making a strong point that not everyone has the resources to extensively moderate a community.  It had come up for me recently in the context of the Fediverse, and the discussion over the defederation of the Gab instance, and the problems with freedom 0.  “The freedom to run the program as you wish, for any purpose” – this is not good if the purpose is, let’s say, neo-nazism.

Dan discussed the philosopher Simondon (an inspiration for Deleuze and Guattari, I understand), and the topic of alienation and technology.  Not just alienation as a result of losing autonomy in a capitalist system; but also alienation from technology – not knowing how things work or being able to tinker with them.  Emacs being a beautiful example of software you can see the innards of and tinker with, should you wish to.

Dan did a bit of SICP.

Dan and Panda chatted about another French philosopher, whose name I have unfortunately forgotten, and the philosophy of autism.

I read a couple of paragraphs of SICP.

Dan described the difference between technics and technology, which is really interesting – a distinction between the machines themselves, and the analysis of them.


Eval’ing and apply’ing in the MayDay Rooms

Been thinking lately that it could be a good municipal function to provide people with access to an ‘online home’, analogous to ensuring provision of physical homes.

In the same way it could be social, affordable, in a co-op, heck even (but hopefully not) private and rented. The municipality provides some infrastructure and codes/regulations to make sure there’s a home for everyone and that everyone can move freely if they want. But equally you can build your own home or move into an intentional community if you want and have the wherewithal to do so.

Not talking about a StateBook – if the state has any function in it, I think it should be regulating for open protocols and standards, or even just bare minimum access to data and data portability (newsocialist.org.uk/do-we-really-need-a-statebook/). I’m thinking more like Indienet – (indienet.info/) – the project in Ghent (coordinated by @aral@mastodon.ar.al) to provide each denizen with their own connected node in a wider p2p/federated network. I mean municipal more in the sense of libertarian municipalism, self-determination and federation of villages, towns, cities.

Obviously access to physical housing is a mess, at least where I’m currently living, so maybe not the best reference point. But I’m finding it an interesting framing. Every Facebook or Twitter profile is currently a home on the web, and it’s as if billions of people all have the same corrupt landlord.

This is kind of implicitly assuming that everyone *needs* a home on the web. That is certainly a debatable point. It is definitely becoming more of a part of the fabric of everyday life, and you could argue that it shouldn’t be.  I vacillate on this a bit but overall I tend to think that the benefits can outweigh the negatives, when it has a social motive and not a profit motive.

I just spent a few days away in the Lake District.  It’s a beautiful part of England, and a great place to get away to relax and slow your pace down a bit.  Living in London at present, I notice that it takes a couple of days for me to properly unwind and appreciate the peace and quiet and nature on offer when coming to the Lakes.  Day 1 my head is usually wrapped up in something and I don’t fully appreciate my surroundings.  By day 3 or so I can happily just stare at a tree for 30 minutes (well, maybe 15).  At the moment, however, I usually come back after three or four days.  It’d be good to spend a full week or two there and see what happens, or even go full Walden and spend a year there.

This time I did one big walk, hiking from Skelwith up to Swirl How in the Coniston set of fells.  It’s 2630 feet high, just 3 feet shorter than the Old Man of Coniston.  The walk there and back took about 7 hours.

The view from Swirl How down into the Greenburn valley. Wetherlam in the foreground to the right.

One of my favourite views that I’ve come across in the Lakes is en route to Little Langdale, looking through the Blea Tarn pass towards the Langdale Pikes.  This is a set of peaks rising from the Langdale Valley.  They have great names like Pike O’Blisco, Harrison Stickle, Sergeant Man, Pavey Ark, etc.

(Question: If you cross a stream near Harrison Stickle, is it a Harrison Ford?  Answer: yes. yes it is.)

Looking over towards the Langdale Pikes

We did a couple of shorter walks too.  One up the excellently named Iron Keld, leading towards Black Crag.  The signpost on the way is great – you have a choice of paths leading to either “Sunny Brow”, or “Iron Keld” and “Black Crag”.  It feels a bit like choosing between Hobbiton and Mordor.  But for reference, Iron Keld is much more fun than Sunny Brow – it’s an old pine plantation.

The other short walk was up Loughrigg Fell, which joins Skelwith Bridge and Ambleside.   It’s a low fell but a beauty.  When you get near the top it has lots of gentle undulations, lots of little paths to explore, and some great panoramic views – down towards Ambleside, over to Windermere, great views of Grasmere and Rydal Water.  This time of year it is covered in ferns and looks a little bit like Tellytubby Land in my opinion.

Top o’ Loughrigg

Loughrigg Tarn is a total beauty spot.  An idyllic smallish tarn on the south side of Loughrigg.  A good spot for taking a dog for a swim and looking over towards the Langdale Pikes from a different angle.  You get a good view down to it from the top of Loughrigg.

Loughrigg Tarn below

I can highly recommend a trip to the Lakes.

Kicks Condor discusses his ‘infostrat’ (information strategy), as filtered through a reading of Ton‘s writings on the topic.

What’s an infostrat?  Picking up from Ton and Kicks:

“deciding what and how to bookmark or archive stuff, sorting through conflicting news stories and accusations, and alternating “periods of discovery with periods of digesting and consolidating”


“what is my strategy to comb through the gigs and gigs of input I can plug myself into on the Web?”

I find it all very interesting and would like to work out an infostrat for myself.  Quite often I fall into the pit of infinite scroll and end up in a mess of information overload.  Need to change my filters.

What do I want from the world of information out there?  I would separate my goals into the social and the informational.

For the social side: I want to not only communicate with people, but to over time become close to some of them.  I must say that until recently, social media has always felt remarkably asocial to me.  Ton seems to have achieved sociality very well over time through blogging. I’d like to explore if there’s a knack to that, other than just giving it time.

For the informational side: this is more what social media has traditionally given me.  However, so far, it’s facilitated more consumption than consolidation I would say.  So I am very intrigued by Kicks’ mention of the linkage between blogs and wikis.  I like the idea of the blog timeline crystallising into a personal wiki over time.

Thanks Ton and Kicks for the discussion.  I have some reading to do!

Despite the liberatory potential of technology, of which I see free software playing a big role, there’s a very real concern of ending up with a kind of technocratic ‘vanguard party’.

You can debate the merits of vanguardism in general, but couple it with the current disproportionate skew of tech roles to white and male – which is even more pronounced in free software at present – and throw in the ‘scratch your own itch’ trope.

That’s a huge systemic problem as vanguard becomes regime.

Some things I am learning: if you’re white and male and into free software (I am), recognise that you have a very blinkered and narrow view of the world.

* Spend half the time you’d otherwise use learning Yet Another Technology on educating yourself about race, gender and class struggles (historical and present).

* Pipe down and listen to others when it comes to discussions about what is needed in software.

* Don’t ‘scratch your own itch’ – serve a community. If you’re white, male and technically proficient you’ve got enough privilege in the bank to pay it back by building for others rather than yourself.

Watched the Lecture 1a video for SICP. (Not sure how exactly these correspond to the book – is it by chapter?)

It’s Hal Abelson giving the lectures. I find him really engaging. And, I did not know this until I looked him up just now, but he is a founding director of both Creative Commons and the Free Software Foundation. Rad!

Anyway, here’s a few notes I made while watching the video.


Last weekend was IndieWebCamp Utrecht. I went along and had a great time learning, hacking, and seeing some parts of Holland.

IndieWebCamps are brainstorming and building events where IndieWeb creators gather semi-regularly to meet in person, share ideas, and collaborate on IndieWeb design, UX, & code for their own sites. — IndieWebCamps

They’re a great way to learn more about the IndieWeb and also a great excuse to visit a new place you’ve never been before.

I travelled over on a rail and sail ticket from London -> Utrecht with an overnight ferry.  I went to IWC over the weekend, plus a day in Utrecht before, and a day in Rotterdam afterwards.


Carrying on the sporadic series (here are parts one and two), this is my next tinkering around with a means to connect a WordPress-based IndieWeb site to the Fediverse.

For my hackday project at IWC Utrecht I set up Matthias’ ActivityPub plugin that fully fedifies your WordPress site. It’s dead simple and most excellent.

Yes way!


In part 1, I discussed why you might want to bridge your Indieweb site to the Fediverse.

In this follow up post, I’ll describe one way of doing it that I’ve been tinkering with recently.

The tl;dr: using WordPress Indieweb plugins and Bridgy Fed; it works pretty smoothly; still a few quirks at present; it’s awesome and lots of fun to mess around with.


I experimented recently with setting up an Indieweb WordPress site as a standalone actor on the Fediverse. Thanks to the WordPress indieweb plugins, and Bridgy Fed, it’s pretty easy to do, with a few quirks.

This post is a bit of preamble as to what this means and why you might want to do it. Part 2 will go into the details of one way to do it that I’ve played with – via Bridgy Fed.


When I was younger, mid-teens to mid-twenties, I had really debilitating social anxiety disorder (www.nhs.uk/conditions/social-anxiety/).

To the point that I often did not want to leave my room. I found it difficult to be in a room with other people, eat in public, stand in line at the supermarket. It really affected my mental health and development of relationships.

I still have remnants of it now, in that I’m fairly quiet in social situations and not the most gregarious. But it has mostly gone away, to the point that I can go to events, even do occasional public speaking, and not really worry about it.

So I guess I wanted to say, if you currently have it, or know someone who does – you can definitely overcome it.


UPDATE 7th Nov 2019: snarfed’s (author of granary) preference is for people to use a different one of his tools, twitter-atom.appspot.com/, for this use case (see the discussion here).  There are various reasons to do so, but the chief one I would say is that with twitter-atom you use your own app key, versus everyone piling on to granary and risking it getting into trouble with Twitter.  I’ll keep this post here, but please use that tool instead!

My online social experience is mostly through the indieweb. For following people and blogs, I use Aperture, a Microsub server, to subscribe to various social feeds. And then I read and interact with those feeds in various clients – e.g. Indigenous on Android, and Monocle on the web.

Although I don’t use Twitter anymore, lots of people I find interesting still do. So I want a way of following their posts from within my Indieweb readers.

I use granary.io to follow Twitter people in my indieweb reader. What it does is convert a Twitter feed into a feed in a format that I can subscribe to via Microsub.


For future peoples possibly encountering this same error:

"stream_socket_client(): SSL operation failed with code 1. OpenSSL Error messages: error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert handshake failure stream_socket_client(): Failed to enable crypto stream_socket_client(): unable to connect to ssl://some.site:443 (Unknown error)

I was getting this on a server that was running PHP 7.1.  Upgrading to 7.3 resolved it.

(I also enabled SSL on the site that the call was being made from – a test site so it didn’t previously have any SSL – but I don’t think that made any difference.)

So where possible I would like to get my ebooks directly from an independent publisher.  epub format, DRM free.  I can do this from Verso for example.  They watermark the books, putting my email address at the end of every chapter.  While I disagree with that, I can definitely live with it, and kudos to them for eschewing DRM.

For most other books I don’t have that option.  Next option: I’d like to buy from hive.co.uk, which, to a greater or lesser degree, supports independent bookstores.  I think it’s open for debate how and how much that support actually manifests, but at least the intent is there (fingers crossed that it’s not some cynical ethical-washing).

But Hive uses Adobe Digital Editions, a baleful DRM system from Adobe.  It takes a book and licenses it to me only, first demanding I create an account with Adobe, then that I install Adobe’s app on my machine, and finally that I tie my device to this Adobe account.  All my books must go through this app, most likely sending information about my purchases to Adobe too.  This for books that I have dutifully paid for, to load onto a device that I have purchased and own.  Somehow into this equation Adobe have insidiously inserted themselves.


Chapter 2: The Commons: From Tragedy to Triumph

The chapter starts by outlining Garrett Hardin’s tragedy of the commons argument.  In short, my understanding of the argument is that due to the inherent selfishness of individuals, the commons are doomed to overuse — unless they are turned into private property or turned over to the state, and unless the users of a resource are regulated through coercion. Hardin’s paper is more generally about population limits and his views appear quite bluntly Malthusian.

Having seen functioning commons, Ostrom disagreed with Hardin’s analysis.  She studied commons that worked (and also those that didn’t), and captured her analysis of what made a commons sustainable in her work “Governing the Commons.”


This chapter starts out with a brief biography of Ostrom and her work, providing some context. I think it’s the right amount – the ideas are more important, but it is interesting to get some biographical context. The patriarchal system she faced early on is pretty galling – difficulties in getting where she got to, just by virtue of being a woman.

Ostrom doesn’t slot into a particular predefined school of thought, with some ties to some conservative right thinkers, yet some radical views. I like that Wall approaches it not so much trying to pin her ideas down to any particular ideology, but looking at what practical effects the ideas have had (and can have).


In case it helps anyone, here are the steps I needed to take to get geben working in spacemacs in order to debug local web apps (this in Ubuntu 16.04):

  • Add geben package to .spacemacs and reload
 dotspacemacs-additional-packages '(some-other-package geben)

  • Assuming you have xdebug installed, enable xdebug remote debugging in your php.ini file (/etc/php/7.1/apache2/php.ini) and restart Apache

  • Open the file you’re interested in debugging
  • Start geben in spacemacs with M-x geben
  • Navigate to localhost/some-app.php in a browser

That should trigger geben. Debugging time!
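Since the exact php.ini lines vary by xdebug version, here’s the sort of configuration geben needs – a sketch assuming xdebug 2.x listening on the default port 9000 (xdebug 3 renamed these settings, e.g. to xdebug.mode):

```ini
; load xdebug and turn on remote debugging (xdebug 2.x setting names)
zend_extension=xdebug.so
xdebug.remote_enable=1
xdebug.remote_host=127.0.0.1
xdebug.remote_port=9000
```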

A bonus note: I didn’t have any luck with geben-find-file when trying to add breakpoints to other files in the project, but using geben-open-file worked (just a little bit more cumbersome.)

At HWC London tonight, I worked on a small thing – figuring out why my avatar was appearing blurry when pulled in to other sites following a webmention I’ve sent to them.

For example, on this like of one of Chris’ posts at boffosocko.com:

There’s me at the bottom left, cheerfully blurry.

I wasn’t quite sure why, because the h-card I added into my WordPress theme links to a profile image on my site that is 654×654.

Looking at this with Calum we saw that I have multiple h-cards appearing on any given page, and (other than the one I’ve hard coded) they all point to my image on Gravatar.  Not only that, they are specifically pulling out a 40px square version of my gravatar.

With a little inspection it turns out that every post on my site has an h-card embedded in it.  It’s in the post footer that is added to each post.  So the bit that says ‘by’ and my name also includes h-card microformats.  And in that h-card markup, the image source is my gravatar image, at size 40px.

I wasn’t sure if having an h-card in every single post even made sense, but a bit of discussion with Barry helped me to understand the places you might have the h-card, and that while there’s various ways of doing it, an explicit h-card per post is certainly fine.  Barry pointed me to the authorship page on the wiki for more details on this.

OK, so where does the h-card per post come from in my site?  Given that it contains microformats, and I don’t think WordPress has much microformats support built in, the most likely candidate is that it’s somewhere in my fork of the Sempress theme.

A quick search for h-card in the code of my theme shows that yup, that’s where that post footer is being rendered.  It’s in the sempress_posted_on function – there’s a call to get_avatar, a built-in WordPress function.  In that call, the argument for the desired avatar size is being passed in as 40.

So I’ve bumped that up to 96, and all should now be well.