During GUADEC we had a Birds-of-a-feather session (BoF) for what eventually became the Safety Team. In this post I'll summarize the very raw minutes of the BoF.
What is safety in the context of GNOME?
Matthew Garrett's excellent keynote at GUADEC made a point that GNOME should be the desktop that takes care of and respects the user, as opposed to being just a vehicle for selling stuff (apps, subscriptions) to them.
I'll digress for a bit to give you an example of "taking care and respecting the user" in another context, which will later let me frame this for GNOME.
In urbanism circles, there is a big focus on making streets safe for everyone, safe for all the users of the street. "Safe" here means many things at once.
It turns out that focusing on safety automatically gives you many desirable properties in cities — better urbanism, not just a dry measure of "streets with few accidents".
There is a strong correlation between vehicle speed and the proportion of accidents that are fatal. Cities that reduce maximum speeds in heavily-congested areas get fewer fatal accidents, and fewer accidents in general — the term that urbanists like to use is "traffic calming". In Strasbourg you may have noticed the signs that mark the central island as a "Zone 30", where 30 km/h is the maximum speed for all vehicles. This lets motor vehicles, bicycles, and pedestrians share the same space safely.
Along with traffic calming, you can help vulnerable people in other ways. You can put ramps on curbs at street crossings; this helps people in wheelchairs, people pushing children in strollers, people dragging wheeled suitcases, skaters, cyclists. On sidewalks you can put tactile paving — tiles with special reliefs so blind pedestrians can feel where the "walking path" is, or where the sidewalk is about to end, or where there is a street crossing. You can make pedestrian traffic lights emit a special sound when it is the pedestrians' turn to cross the street — this helps the blind, as well as those who are paying attention to their cell phone instead of the traffic signals. You can make mass transit accessible to wheelchairs.
Once you have slow traffic, accessible mass transit, and comfortable/usable sidewalks, you get more pedestrians. This leads to more people going into shops. This improves the local economy, and reduces the amount of money and time that people are forced to waste in cars.
Once you have people in shops, restaurants, or cafes at most times of the day, you get fewer muggings — what Jane Jacobs would call "eyes on the street".
Once people can walk and bike safely to places they actually want to go (the supermarket, the bakery, a cafe or a restaurant, a bank), they automatically get a little exercise, which improves their health, as opposed to sitting in a car for a large part of the day.
Etcetera. Safety is a systemic thing; it is not something you get by doing one single thing. Not only do you get safer streets; you also get cities that are more livable and human-scaled, rather than machine-scaled for motor vehicles.
And this brings us to GNOME.
"Computer security" is not very popular among non-technical users, and for good reasons. People have friction with sysadmins or constrained systems that don't let them install programs without going through bureaucratic little processes. People get asked for passwords for silly reasons, like plugging a printer to their home computer. People get asked questions like "Do you want to let $program do $thing?" all the time.
A lot of "computer security" is done from the viewpoint of the developers and the administrators. Let's keep the users from screwing up our precious system. Let's disallow people from doing things by default. Let's keep control for ourselves.
Of course, there is also a lot of "computer security" that is desirable. Let's put up a firewall so that vandals can't pwn your machine, and so that criminals don't turn your computer into part of a botnet. Let's keep rogue applications (or rogue users) from screwing up the core of the system. Let's authenticate users so a criminal can't access your bank account.
Security is putting an armed guard at the entrance of a bank; safety is having enough people in the streets at all times of the day so you don't need the police most of the time.
Security is putting iron bars on lower-storey windows so robbers can't get in easily; safety is putting iron railings on upper-storey balconies so you don't fall over.
Security is disallowing end-user programs from reading /etc/shadow so they can't crack your login passwords; safety is not letting a keylogger run while the system is asking you for your password. Okay, it's security as well, but you get the idea.
Safety is doing things that prevent harm to users.
A good chunk of the discussion during the meeting at GUADEC was about existing things that make our users unsafe, or that inadvertently reveal users' information. For example, we have some things that don't use SSL/TLS by default. Gnome-weather fetches the weather information over unencrypted HTTP. This lets snoopers figure out your current location, or your planned future locations, or the locations where people related to you might live. (And in more general terms, the weather forecasts you check are nobody's business but yours.)
Gnome-music similarly fetches music metadata over an unencrypted channel. In the best case it lets a snooper know your taste in music; in the worst case it lets someone correlate your music downloads with your music purchases — the difference is a liability to you.
Gnome-maps fetches map tile data over an unencrypted connection. This identifies places you may intend to travel; it may also reveal your location.
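Where the server supports it, the client-side fix is mostly a matter of requesting the same data over TLS and letting the connection be verified. Here is a minimal sketch with a made-up forecast URL; GNOME apps would do this through libsoup in C rather than Python, but the idea is the same:

```python
import urllib.request

# Made-up endpoints for illustration; the real services and URLs differ.
INSECURE_URL = "http://forecast.example.org/city/strasbourg"
SECURE_URL = "https://forecast.example.org/city/strasbourg"

def fetch_forecast(url=SECURE_URL):
    # For https:// URLs, urlopen() verifies the server's certificate by
    # default, and a snooper on the network only learns which host was
    # contacted, not which city's forecast was requested.
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read()
```

The change is invisible to the user; they simply stop broadcasting their location to whoever happens to be watching the network.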
While the examples above may seem far-fetched, they go back to one of the biggest problems with the Internet: unencrypted content is being used against people. You may not have someone to hide from, but you wouldn't want to be put in an uncomfortable situation just from using your software.
You may not be a reckless driver, but you still put on seatbelts (and you would probably not buy a car without seatbelts).
We are not trying to re-create Tails, the distro that tries to maintain your anonymity online, but we certainly don't want to make things easy for the bad guys.
During the meeting we agreed to reach out to the Tails / Tor people so that they can tell us where people's identifying information may leak inadvertently; if we can fix these things without a specialized version of the software, everyone will be safer by default.
While auditing code, or changing code to use encrypted connections, can be ongoing "everyday" work, there's a more interesting part to all of this. We are moving to sandboxed applications, where running programs cannot affect each other, or where an installed program doesn't affect the installed dependencies for other programs, or where programs don't have access to all your data by default. See Allan Day's posts on sandboxed apps for a much more detailed explanation of how this will work (parts one and two).
We have to start defining the service APIs that will let us keep applications isolated from the user's personal data, that is, to avoid letting programs read all of your home directory by default.
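None of these service APIs exist yet; purely as an illustration of the shape one might take, here is a hypothetical sketch (invented names, not an actual GNOME interface) where the service owns all access to the photo collection and the sandboxed application only ever sees what the user explicitly handed over:

```python
# Hypothetical "get photos" service; the names are invented for
# illustration and do not correspond to an existing GNOME API.

class GetPhotosService:
    """Runs outside the application's sandbox and owns all access to
    the user's photo collection."""

    def pick_photos(self, app_id):
        """Show the photo chooser on behalf of app_id and return paths
        to copies of the photos the user picked, with metadata already
        scrubbed according to the user's consent choices.  The calling
        application never reads ~/Pictures itself."""
        raise NotImplementedError
```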
Some services will also need to do scrubbing of sensitive data. For example, if you want to upload photos somewhere public, you may want the software to strip away the geolocation information, the face-recognition data, and the EXIF data that reveals what kind of expensive camera you have. Regular users are generally not aware that this information exists; we can keep them safer by asking for their consent before publishing that information.
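As a sketch of what the scrubbing step could do for JPEGs, here is a minimal example using Pillow; a real service would presumably handle more formats and avoid the lossy re-encode, but the principle is the same: keep the pixels, drop everything else.

```python
from PIL import Image

def scrub_jpeg(src_path, dest_path):
    """Write a copy of a JPEG with its metadata removed.

    Copying only the pixel data into a fresh image drops the EXIF
    blocks (GPS coordinates, camera make and model, timestamps) as
    well as embedded thumbnails, face regions, and tags.
    """
    with Image.open(src_path) as src:
        clean = Image.new(src.mode, src.size)
        clean.putdata(list(src.getdata()))
        clean.save(dest_path, "JPEG", quality=95)
```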
A lot of uncomfortable, inconvenient, or unsafe software is like that because it doesn't respect you.
Siloed software that doesn't let you export your data? It denies you your agency to move your data to other software.
Software that fingerprints you and sends your information to a vendor? It doesn't give you informed consent. Or as part of coercion culture, it sneakily buries that consent in something like, "by using this software, you agree to the Terms of Service" (terms which no one ever bothers to read, because frankly they are illegible).
Software that sends your contact list to the vendor so it can spam them? This is plain lack of respect, lack of consent, and more coercion, as those people don't want to be spammed in the first place (and you don't want to be the indirect cause).
Allan's second post has a key insight:
[...] the primary purpose of posing a security question is to ascertain that a piece of software is doing what the user wants it to do, and often, you can verify this without the user even realising that they are being asked a question for security purposes.
We can take this principle even further. The moment when you ask a security question can be an opportunity to present useful information or controls – these moments can become a valuable, useful, and even enjoyable part of the experience.
In a way, enforcing the service APIs upon applications is a way of ensuring that they ask for your consent to do things, and that they respect your agency in doing things which naive security-minded software may disallow "for security reasons".
Here is an example:
Agency: "I want to upload a photo"
Safety: "I don't want my privacy violated"
Consent: "Would you like to share geographical information, camera
information, tags?"
We can get very interesting things if we distill these ideas into GNOME's Pattern Language.
Assume we had patterns for Respect the user's agency, for Obtain the user's consent, for Maintain the user's safety, and for Respect the user's privacy. These are not written yet, but they will be, shortly.
We already have prototypal patterns called Support the free ecosystem and User data manifesto.
Pattern languages start being really useful when you have a rich set of connections between the patterns. In the example above about sharing a photo, we employ the consent, privacy, and agency patterns. What if we add Support the free ecosystem to the mix? Then the user interface to "paste a photo into your instant-messaging client" may look like the mockup described below.
Note the defaults:
Off for sharing metadata which you may not want to reveal by default: geographical information, face recognition info, camera information, tags. This is the Respect the user's privacy pattern in action.
On for sharing the license information, and to let you pick a license right there. This is the Support the free ecosystem pattern.
If you accept the dialog box with "Insert photos", then GNOME would do two things: 1) scrub the JPEG files so they don't contain metadata which you didn't choose to share; 2) note in the JPEG metadata which license you chose.
In this case, Empathy would not communicate with Shotwell directly — applications are isolated. Instead, Empathy would make use of the "get photos" service API, which would bring up that dialog, and which would automatically run the metadata scrubber.
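Continuing the hypothetical sketch from earlier, the service side of that call might look roughly like this (again, invented names rather than a real GNOME API):

```python
# Hypothetical service-side flow for the "get photos" call; invented
# names, not a real GNOME API.

def pick_photos(self, app_id):
    # 1. The service, not Empathy, shows the consent dialog with the
    #    defaults described above: metadata sharing off, license on.
    choices = self.show_consent_dialog(app_id)

    results = []
    for photo in choices.selected_photos:
        # 2. Scrub whatever the user did not agree to share: location,
        #    face-recognition data, camera information, tags.
        clean = self.scrub_metadata(photo, keep=choices.shared_fields)

        # 3. Note the license the user picked in the copy's metadata.
        self.embed_license(clean, choices.license)
        results.append(clean)

    # 4. Only these scrubbed copies cross the sandbox boundary into
    #    Empathy; Shotwell and the original files are never touched.
    return results
```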
Federico Mena-Quintero <federico@gnome.org> Mon 2014/Aug/25 18:37:42 CDT