+ https://innoscholcomm.silk.co/
Amy added '101 Innovations in Scholarly Communication - Silk' to Bookmarks
Productive SocialWG f2f day 1. Many resolution, such spec, so convergence, wow.
@dustyweb points out that the SocialWG is already in Federation Space.
In reply to:
They're easy to make you know!
PHP session_start docs: "..will magically contain the session data."
Seems legit.
In reply to:
Give me chance, we only started deciding this yesterday :p
I just took a 17 hour train (Starlight 14) from San Francisco to Portland, right up the west coast of the USA. At 9 last night, a connecting bus (the Thruway) took me from SF to Emeryville, Oakland, where I boarded the train at 10. I bought my ticket in advance online ($132, which includes the bus and also a Portland to Seattle stretch for later this week) and was assigned a seat upon boarding via the sophisticated system of a lady with a hand-drawn grid and a pen.
Totally different from any other train I've been on, which is why I'm writing about it. The train is huge. I didn't manage to count the carriages, but there are many. The cheapest Coach seats are wide - probably the same as the Caledonian Sleeper - but recline further with more legroom, and have an extra leg rest bit that pops out underneath, so it's a totally viable bed. The whole train is two storey, with most seats above, and loads of bathrooms below.
Overnight I pretty much slept through, waking at Sacramento to close the curtain as the station lights were bright. There were many stops, and people came and went from the seat beside me. At one point there was some kerfuffle from someone who thought her phone had been stolen, and also from the seat assignment system putting two people in the same place, but mostly I wasn't really disturbed. At one point I woke up to nobody, and rotated to lie across the two seats. I woke again at 0730 when the sun was up and we arrived in Klamath Falls. Plenty of empty seats in my carriage.
From my vantage point it was largely dilapidated sheds situated amongst unkempt scrubland. Some tourist-looking people disembarked here, so there must be something to do. There are both a restaurant and a cafe car, the latter of which serves decent coffee with a smile for $2, and there is at least one vegan option on the menu (a burger, unsampled). The restaurant took reservations for breakfast and lunch, with a member of staff passing through the train to offer this. Above the cafe is an 'observation lounge', with seats clustered around tables, and reclining armchair-types, all with power outlets, but the special feature is double-sized windows. So really light, great vantage. I considered relocating, but wasn't sure about the etiquette of permanently occupying space here, and couldn't be bothered dragging my stuff back and forth.
We proceeded and the scenery picked up with lakes and mountains, gradually becoming more epic, peaking at alpine-esque snow-covered pine forests. Scenery became less interesting after Eugene. Flat, industrial-farmland-y. Some stops are long enough for 'smoke breaks', with passengers allowed to disembark to stretch their legs for around half an hour.
Way more interaction with fellow passengers than I'm used to in the UK. Always takes me by surprise. A bunch of friendly (I think?) comments on my hair, plus a few people who wander through the train rambling good wishes at everyone. Someone giving out homebaked cookies. An old couple dressed as Santa Claus. Most people - in great variety - seem consistently intimidating when silent, and friendly when they open their mouths. Maybe I just don't know how to read Americans yet. The expectation/demand of interaction with strangers here is something I'm still figuring out, and I think I unintentionally offend people when I'm not very good at making conversation.
At 1430 there was a wine tasting. Who boards a train and goes to a wine tasting?
Arrived in Portland over 30 minutes early. No data connection here. Not even 2G. Wilf's Wine Bar just beside Union Station let me use their wifi.
Plotting vegan food places on a map of Portland. Plots whole of Portland. Going to have to find another way to narrow this down. Haven't even opened Happy Cow yet.
Installed letsencrypt on my local machine (a Chromebook running Ubuntu in crouton, didn't melt):
git clone https://github.com/letsencrypt/letsencrypt
cd letsencrypt
Ran in manual mode:
./letsencrypt-auto certonly --manual
Followed instructions for domain validation (dealt with rogue .htaccess file that stopped .well-known being accessible).
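If you hit something similar, a quick way to check (the domain and path here are placeholders): request a nonexistent file under .well-known and look at the response. A plain 404 is fine; a redirect or a rewritten error page suggests something like a stray .htaccess rule is still intercepting the path.
curl -i http://mydomain.tld/.well-known/acme-challenge/test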
fullchain.pem, chain.pem, cert.pem and privkey.pem were generated into /etc/letsencrypt/live/mydomain.tld.
cPanel -> SSL/TLS Manager.
Uploaded privkey.pem to Private Keys. Uploaded cert.pem to Certificates.
Manage SSL Hosts -> Browse Certificates -> picked the cert (it prefilled the domain automatically). The first two boxes were prefilled with the private key and cert. Pasted the CA bundle from the Let's Encrypt site into the third box.
Worked!
property parameter and will be able to verify mentions marked up with RDFa.
Specifically, all apps running in Docker containers, served on subdomains through the nginx reverse proxy container.
tl;dr I didn't figure it out yet, if you're looking for a guide you're not in the right place.
Disclaimer:
Things I currently run through the reverse proxy:
Decided off the bat not to bother with Etherpad, as I haven't looked under the hood and don't know how I'd do domain validation at all. Figured the others would be doable.
Ran the letsencrypt Docker container. The current docs run it with -p 80:80 -p 443:443 and the auth command and no plugins. I left out the ports and used the manual plugin, as I can't conceive of how it would do domain validation across containers, so I'm not even going to try to rig that up:
sudo docker run -it --rm --name letsencrypt \
    -v "/etc/letsencrypt:/etc/letsencrypt" \
    -v "/var/lib/letsencrypt:/var/lib/letsencrypt" \
    quay.io/letsencrypt/letsencrypt:latest certonly --manual
I gave it several subdomains, for each of the different apps I have running, and worked through domain validation for all of them (putting a file in .well-known).
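Roughly what each round of manual validation looks like on the box serving that subdomain, in case it's useful; the token and webroot path are placeholders for whatever the client actually prints:
mkdir -p /path/to/webroot/.well-known/acme-challenge
printf '%s' "TOKEN-CONTENTS-FROM-CLIENT" > /path/to/webroot/.well-known/acme-challenge/TOKEN
curl http://sub.mydomain.tld/.well-known/acme-challenge/TOKEN   # check it's served before telling the client to continue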
Down to three subdomains, I (finally!) generated one cert for all of them. It stored the files (cert.pem, chain.pem, fullchain.pem and privkey.pem) under the name of the first one on the list (in /etc/letsencrypt/live/sub.mydomain.tld).
I had it on good advice that if I restarted the proxy container I wouldn't need to restart all of the other containers. The instructions for SSL in the docs for the proxy say:
$ docker run -d -p 80:80 -p 443:443 -v /path/to/certs:/etc/nginx/certs -v /var/run/docker.sock:/tmp/docker.sock:ro jwilder/nginx-proxy
It also says "The certificate and keys should be named after the virtual host with a .crt and .key extension". I had .pems so I threw caution to the wind and renamed:
fullchain.pem -> sub.mydomain.tld.crt
privkey.pem -> sub.mydomain.tld.key
Put them in a directory that I mounted to the proxy container when I relaunched it (above command). Given the rename, the proxy is supposed to just find them I guess?
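Concretely, the rename-and-move was along these lines (a sketch; /path/to/certs stands in for whatever directory gets mounted to /etc/nginx/certs in the proxy container):
cp /etc/letsencrypt/live/sub.mydomain.tld/fullchain.pem /path/to/certs/sub.mydomain.tld.crt
cp /etc/letsencrypt/live/sub.mydomain.tld/privkey.pem /path/to/certs/sub.mydomain.tld.key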
Failure 1: port 443 is already in use. o.O I couldn't figure out what was already using it, but later wondered if Gold is maybe sneakily hijacking it. Otherwise I found this docker bug which may be it, but I am in no position to upgrade docker right now (I know, I know). Faced with imminently needing to relocate to somewhere without wifi, and not being able to cope knowing my Etherpad was down, I relaunched the proxy container as above, without -p 443:443, but still with the path to certs. Whew, everything came back up.
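(What I should have run to find the squatter on 443, for next time; either of these would name the process holding the port:)
sudo lsof -i :443
sudo ss -tlnp | grep :443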
Except... The first subdomain, the one that the certs are named after, is now serving Gold, not the Apache container it was supposed to point to. The subdomain that is supposed to be serving Gold is also serving Gold. Neither are using the correct cert.
So... that's where I'm at.
Open questions:
Whether you can really go from .pem to .crt just by renaming the file. I will attempt to convert with openssl.
Whether to use fullchain or one of the others. Since nginx only asks for cert and key, I assume fullchain is the right one.. I've seen tutorials which cat the cert with the CA bundle, which I believe is what fullchain is, so..
Whether the other containers need the CERT_NAME=sub.mydomain.tld flag, as they all point to one shared cert.
If you've read this far and have the remotest clue about any of these puzzle pieces, please let me know all the things I'm getting totally wrong.
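For what it's worth, one way to test the fullchain hunch (a sketch, run in the letsencrypt live directory for the cert): if this prints no differences, fullchain.pem really is just cert.pem with the CA bundle appended.
cd /etc/letsencrypt/live/sub.mydomain.tld
cat cert.pem chain.pem | diff - fullchain.pem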
See also: HTTPS: Not a terrible experience
+ http://martin.atkins.me.uk/activity-streams/
Amy added http://martin.atkins.me.uk/activity-streams/ to https://rhiaro.co.uk/bookmarks/
The online social world is simultaneously and varyingly a mirror, augmentation, and reduction of our offline social lives. Human behaviours are complex and nuanced and context-dependent, inconsistent, sprinkled across many different media, subject to multiple interpretations.. and all kinds of other words that mean it's difficult to wrangle this space into a coherent or complete data model. But, we (computer scientists, software developers) do it anyway.
For some reason, nobody has come up with a solution that pleases everybody yet.
It seems probable, given the diversity of systems, requirements, desires and general worldviews, that nobody ever will. So here I come, blazing in with neither history nor experience, to try to write about how I see things. There's a lot of interesting past work in this space, and a full literature(/mailing list/blog post) review is worth doing. But for now, in the interests of time and sanity, I'm sticking to the current state of things as I understand them. In this post, I look at only two different (but overlapping) models. I also want to emphasise that there can't possibly be a single right way of looking at this space. Such models aren't in conflict (as some may think) but simply different ways of looking at the same thing. Where proponents of one may see shortcomings in the other, the opposite is equally true. People's different experiences and needs have led them to a way of looking at things that they find perfectly intuitive. Often, this means alternative perspectives are seen as unintuitive and perhaps even in competition. Of course, there's a fine line between accounting for different perspectives, and losing interoperability entirely.
So, let's take stuff. We start with the idea that there is stuff on the social web that we want to describe. One subset of stuff is people. I'm not thinking about modelling people and relationships here.
Other subsets of stuff, which I want to focus on right now, are things and happenings. I'm deliberately trying to pick silly terminology to keep this abstract. You might also think of things as content, items or objects, and happenings as events, interactions or activities.
The two social data models I've spent the most time thinking about recently are ActivityStreams 2.0 and microformats2 (AS2 and mf2 henceforth) and the relationship between them. To various degrees, I've written about them before.
ActivityStreams was designed as a model for logging happenings (as opposed to a command language, which in some systems it has been used for). AS2's strength is happenings. A happening is modelled as an Activity, which may have a relation to a thing, aka an Object. (Technically an Activity is a type of Object, as is everything else, Actor, Collection, etc, but what I'm referring to here are generic Objects that don't belong to one of these other special classes). There are more specific types of Activity in the AS1 vocab than there are Objects.
Microformats was designed for marking up all kinds of web documents, and a subset of mf2 focuses on social, specifically feeds and blog posts. Most people who work with mf2 find that they can adequately represent anything they need to as a thing. mf2's strength is things. The most common type of thing is an entry (a post), and you're also likely to see feeds and cards (profiles). All things can have relationships with other things. There's no particular language for logging happenings with mf2; the existence of a set of things serves that purpose. And no specific types of thing; rather, types of thing are inferred from the semantics of their properties and relationships with other things.
Both have JSON representations. This is the default for AS2, but mf2 is embedded in html, which means its JSON representation is generated by a parser. Even though there is a set of parsing rules, output can be complicated and vary depending on the structure of the html being parsed, and I've found myself writing my own 'parsing rules' to extract what I want from parsed mf2. Fortunately Aaron and Ben are working on a vastly simplified JSON representation of mf2, called jf2.
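To make that concrete, a rough sketch of how the same post might look in each flavour; property names are as I understand the AS2 and jf2 drafts, and the URLs and content are invented:
# The same post, roughly, as an AS2 activity and as a jf2 entry (invented values):
cat <<'EOF'
AS2:
{
  "@context": "http://www.w3.org/ns/activitystreams",
  "type": "Create",
  "actor": "https://example.org/amy",
  "object": { "type": "Note", "content": "Hello world" }
}

jf2:
{
  "type": "entry",
  "author": "https://example.org/amy",
  "content": "Hello world"
}
EOF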
AS2 is already under the remit of the W3C Social Web WG (edited by James), and at the recent face-to-face, jf2 was (somewhat controversially) accepted as an Editor's Draft for the group as well.
The WG is chartered to produce a 'social syntax'. AS2 works great for those with happenings-centric worldviews, but doesn't always sit right with those who want to focus on things. On that basis, I think the two syntaxes have the potential to complement each other. I don't see why the WG can't produce two syntaxes for these two perspectives, which can be used together in systems which mix and match things and happenings. Not everyone agrees. There are certainly some redundancies and overlaps with both specs in their current state, and thus a lot more work to be done. But the editors are all keen to work together to iron these out, and I'm hopeful that now there's more of a feeling that these specs are complementary rather than competing, working on this will improve both. The strengths of one can plug the weaknesses of the other, and everyone can benefit. We also need to leave room for expansion in the future, as obviously this is anything but a static space.
An addendum for those who are adamantly opposed to one or the other of the models, because the other is obviously right/better: I encourage you to accept that different experiences lead to different conclusions, and if there was an obviously right/better way of doing this we'd have solved it long ago. Bringing together the different experiences of different people in order to benefit everyone is kind of the point of the WG.
These are the two models being considered because their supporters wrote them up as specifications and can demonstrate history and implementation experience. Anyone have a different model of the social world they wish to bring to the table..?
This post is my own opinion, and does not necessarily represent the opinion of the Social Web WG!
See also day 1 minutes and day 2 minutes.
What follows is more detail on my perspective of the main conversations we had over the two days. Clarifications and corrections welcome.
We steamrollered through AS2 issues, which included core simplifications, vocabulary changes and reductions, and editorial clarifications. James made most edits during the meeting, so there's a much updated working draft available now. You should read it and post any issues you see to be CR-blocking by the 15th of December.
JSON-LD related stuff has been moved to its own section; since it is optional, having references sprinkled throughout was confusing for people who don't necessarily want to deal with it. Chris pointed out that even for extensions, if you know the extension you want to handle you can still do that in plain JSON.
Still some consternation about which alternative syntax examples should be in the spec. Since they're non-normative/editorial anyway they're not a CR-blocker so left alone for the time being.
We don't need a version number in the URL because any future widely-deployed extensions that want to make it into the core will likely be incorporated as part of a full new version (ie. AS3). At some point we should start a registry for extensions; James is keeping track himself on github, we could move this to a wiki page and ultimately a CG when the WG wraps up.
Lots of discussion about testing frameworks and how to meaningfully test production and consumption of syntax and vocabulary. I refer you to the minutes, as this is beyond my ability to summarise well enough, but it seemed like the people working on this gained some clarity and a plan to move forwards.
There's some contention around what it means to accept a spec as an Editor's Draft in the WG. Our general consensus was that it means a spec isn't necessarily going to be rec-track, or even the direction the group is going to take, but it's in-scope and worth some portion of our attention, even if that is just to inform other things. Most specs we've picked up are work that the editors were doing anyway, and this just means the WG should explicitly not ignore them. It's expected that the specs will change significantly going forward, in response to input from the WG.
As such, it's okay that some of our now-ED specs currently cover overlapping territory. We hope that WG attention will serve to refine, cut, expand, merge, and otherwise sort this out. We may end up with multiple small specs derived from our current set of EDs which cover pieces of the social puzzle, or several specs which demonstrate multiple viable ways of doing the same thing. My preference is for the former, but the latter is better than nothing. (I tentatively extrapolate the latter into either one gets wide adoption and the others quietly fade out, or they all get equal adoption and people build bridges between them; either way not a total loss).
I finally convinced people to stop calling this "Amy's SocialAPI Document" and renamed everything to Social Web Protocols. This document describes the individual components we are trying to standardise (based on user stories), covering both the API and Federation. Given the potential for the WG to produce multiple small specs, work on this is to continue to describe and serve as a guide to each building block. This should highlight both points of convergence between separate specs, and gaps that no existing specs are filling adequately.
Issues filed should be to that end. Point out obvious points of commonality between specs that I haven't noted, or where it would be worth replacing the vague overview with more spec-like details.
I aim to take this to FPWD next week (which does not imply WG consensus on the contents yet) with the expectation that this is currently an overview document: a guide to the different areas being standardised by the WG. If we end up with a bunch of small (or overlapping) specs, this could end up as a Note, detailing how they relate to each other as a guide to implementors. If we end up with one spec that covers everything, either this becomes it following input from the other drafts, or this has done its job at converging things and is dropped completely. The rationale behind going to FPWD with this document is to better advertise and explain the different angles of work the WG is doing to other WGs and the public, and to seek wider feedback thereon. The issues become a place to discuss features based on functionality, where they are not specific to one of the other individual specs.
Both now EDs. Micropub covers a subsection of functionality of ActivityPump, but is uncoupled from any other pieces of functionality, whereas ActivityPump intertwines lots of things. There are distinct similarities - both POST JSON to a specific endpoint to create, update and delete content. The editors are keen to cooperate so there's value in working on them both in sync.
jf2 is a social syntax to complement (not compete with!) AS2, where jf2 is content-centric and AS2 is activity-centric. The editors of both agreed that it would be beneficial to work on these in conjunction. I've written more about the relationship between them here.
PTD is an algorithm to help consumers who find themselves with implicitly typed objects to derive explicit AS2 types, if that's what they prefer to work with. My concern is that it's biased towards the microformats2 vocabulary and currently useful for a niche - but a very small niche. However, James pointed out that types are actually optional in AS2 in general, so it could be expanded to help go from untyped AS2 objects to typed ones as well.
My other concern is that this space is such a moving target, and having a fixed algorithm to derive post types based on properties is going to get dated really fast. A constantly shifting algorithm doesn't fit into W3C workflow, and of course brings its own problems (who has implemented which version as it changes over time). I don't have an answer to this.
Evan forced us to slow down and really think about what we meant by federation. I don't know if anyone else did, but I had a bit of a lightbulb during this discussion. Bogged down in the particular bits of federation that are within reach to me (basically, notifications), I forgot there are a great many other things one might want to federate, like search, following topics, user discovery, recommendations... The WG is not required to tackle all of these, as a 'federation protocol' is the icing on our charter cake, but we absolutely don't want to de-prioritise federation completely as it's important and actually ties in well with a lot of the work we're already doing. But taking a step back to reassess - and clarify to the outside world - what we're actually doing is important, and there's some effort going into that now.
Webmention has been an ED for a short while, and there were some issues to work through. Some really interesting points have been raised around technical details, security concerns and functionality enhancements, and lots of different ways to refine this spec are emerging. We talked through open issues, and group resolutions were made for most of them.
The one I'm most interested in is the addition of the property parameter for better disambiguation of the assertion being made by a webmention. It surfaces a need of people outside of the current core webmention implementors to send and verify claims more precisely, whilst adding minimal additional overhead for those who don't need it. Since bringing this spec to the WG is an effort to gain wider adoption and investigate broader use cases than what we have at present, taking these kinds of expansions seriously is good. The benefits were positively acknowledged overall, and I was hoping we'd see this added to the current version, even if marked 'at risk' pending future implementations. But the decision was taken to leave it out, in the same vein as AS2 currently relegating all suggested additions to extensions. On the plus side, the webmention spec will link to all proposed extensions which are written up as specs, and if any extensions see enough adoption over the course of the work they can be integrated into the core.
Bridging between worlds has been an ongoing theme in this WG.
Chris has done some great work on Activipy, a Python library for handling AS2, and overnight between meeting days he added support for jf2 as well by simply passing in an alternative JSON-LD context. This means you can pass in AS2 and serialize out again as jf2, and vice versa! More or less, at least... this might even be improved by adding PTD in between for where mappings aren't already obvious.
In general, tensions between specs that could be overlapping have changed to editors supporting each other to drive all of the work forward, and, I'm hoping, to optimise where redundancies exist. We all have the same goals, after all. We've still got a long way to go, but we ended on the feeling that we can probably get there.
And we played almost no SocialWG bingo, so it must have been a good meeting.
Following on from a terrible experience, I discovered that indeed Gold had taken over port 443. As soon as I kicked it off, I could launch the nginx proxy container with -p 80:80 and -p 443:443. So now the proxy knows where to find the key and cert, and is trying to load the subdomain over https, but is getting connection refused. The proxy docs say this might happen "if the container does not have a usable cert" so now I have to find out what is wrong with my cert?
Is it permissions? It's usually permissions. Nginx might reject things if the key is world readable? Tried setting the key and cert to 0600. No dice.
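One more thing worth checking, a sketch assuming the renamed files from before and an RSA key: whether the cert and key actually pair up. If the two hashes differ, the proxy is right to refuse them.
openssl x509 -noout -modulus -in sub.mydomain.tld.crt | openssl md5
openssl rsa -noout -modulus -in sub.mydomain.tld.key | openssl md5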
What next?
i hate everything i hate everything i hate everything i hate everything i hate everything i'm going for vegan corn dogs
+ http://www.w3.org/2015/Process-20150901/#transition-reqs
Amy added http://www.w3.org/2015/Process-20150901/#transition-reqs to https://rhiaro.co.uk/bookmarks/
Amy added 'AMiner - Open Science Platform' to Bookmarks
Unexpectedly found the ever-sought golden trio in Portland train station - seats and power outlets in just the right place to pick up wifi from the restaurant next door.
Arrived in Portland after a 17 hour train journey. Had no data connection and had done no prior planning, including caching a map or getting the address of where I was staying. Pantsed around Union Station looking for wifi for a while, and eventually the restaurant next door (Wilf's) were sympathetic. I made it to my destination, discussed the boundless possibilities of vegan food with my hosts for a few hours, and then we went for dinner at Epif Restaurant & Pisco Lounge; charquicán. Also headed to a bar where I tried kombucha. Not sure how I feel about it.
Breakfast tacos. Went to Heart coffee to hack for a bit; had a latte with house-made cashew milk that I forgot to take a photo of. Walked to Wolf and Bear's for falafel, and ate it in Townshend's tea shop along with coconut milk mate red bean bubble tea.
Breakfast tacos. Rain. Lunch at Vtopia, a dedicated vegan cheese shop o.O Salami and cheese toastie, and shared a cheese tasting platter. Also bought an expensive block of 'camembert'.
Ground Kontrol for a vegan corndog, weird Liam Neeson movie and pinball. The nostalgia for Windows pinball I got from playing IRL pinball was incredible. Reverse nostalgia?
Went to White Owl for vegan s'mores, but they don't sell them any more due to complaints from the fire department. Stopped by the entirely vegan mall (!?!), but everything was closed. Finally ended up at Rimsky's for tea, a truly fantastical place that would have been the highlight of the day had the day not been entirely highlights. "Sit at a table which hums."
Breakfast tacos. Torrential rain. The best burrito, empanadas rellenas, I've ever had in my life from Los Gorditos.
I went on a walking tour of 'underground' Portland. I was the only person, and the tour guide wasn't super keen to go out in the downpour, but he humoured me. I learnt lots about racism and crimping and tunnels.
Two big pizza slices from Sizzle Pie.
Discovered Scandal and Please Like Me.
Breakfast tacos. No rain! Fish tacos, gravy mash and sweet potato fries from Veggie Grill. I'm reliably informed that breakfast tacos don't count, so I technically didn't have tacos twice in one day.
Walked to and through Washington Park. Picked up an orange and olive oil donut from Blue Star. Lots of trees and mud and a pretty sunset. Walked about 9 miles in total. Homemade fancy mac'n'cheeze.
Leftover fish taco. Stumptown coffee. Vegan biscuits and gravy and other things at Paradox. Train to Seattle!
Not much to report. Pantsed around with HTTPS too much, ate a lot and walked a lot.
S3E13, Deja Q...
Q: "Simple: Change the gravitational constant of the universe."
Geordi: "What?"
Q: "Change the gravitational constant of the universe, thereby altering the mass of the asteroid."
Geordi: "Redefine gravity. And how the hell am I supposed to do that?"
Q: "You just DO it. GAHH! Where's that doctor, anyway?"
Data: "Geordi is trying to say that changing the gravitational constant of the universe is beyond our capabilities."
Q: "Well, then... never mind.
Proposition: Scandal is better than House of Cards. Thoughts?
Emergency code brownie
+ https://research.science.ai/article/on-the-marginal-cost-of-scholarly-communication
Amy added 'On the marginal cost of scholarly communication | read.science.ai' to Bookmarks
Amy added 'Can't Disrupt This: Elsevier and the 25.2 Billion Dollar A Year Academic Publishing Business — Medium' to Bookmarks
+ http://alexbilbie.com/2014/11/oauth-and-javascript/
Amy added http://alexbilbie.com/2014/11/oauth-and-javascript/ to https://rhiaro.co.uk/bookmarks/
+ https://matthew-brett.github.io/pydagogue/gh_delete_master.html
Amy added https://matthew-brett.github.io/pydagogue/gh_delete_master.html to https://rhiaro.co.uk/bookmarks/
After years of occasional trying and giving up, I finally figured out the foundation single crochet stitch. Now I'm bored of it.. Only 189 more to go for this round.
today i am writing javascript and i hate it everything is gibberish.
+ http://www.eurekalert.org/pub_releases/2015-12/pp-aos122215.php
Amy added 'An Open Science plan: Wikidata for Research | EurekAlert! Science News' to Bookmarks
If there's nothing wrong with me... maybe there's something wrong with the universe.
(Beverley Crusher, TNG S4E5)
+ http://www.tokiwinter.com/apache-httpd-mod_rewrite-one-rewritecond-many-rewriterules/
Amy added http://www.tokiwinter.com/apache-httpd-mod_rewrite-one-rewritecond-many-rewriterules/ to https://rhiaro.co.uk/bookmarks/
+ http://htaccesscheatsheet.com/
Amy added http://htaccesscheatsheet.com/ to https://rhiaro.co.uk/bookmarks/
+ https://apps.rhiaro.co.uk/burrow/
Amy added https://apps.rhiaro.co.uk/burrow/ to https://rhiaro.co.uk/bookmarks/