From stacky wiki




  • post to write: lego blocks make a decent (and cheap) substitute for setup blocks
  • post to write: my woodworking clamps are more useful than I thought, because I can put them through holes in my bench.
  • idea to implement: flush toilet by foot
  • idea to implement: good picture of clock on wall
  • "don't be so clothes-minded"
  • is there a way to put phone keyboards on the back of the phone so that you can type with your fingers instead of your thumbs?

Privacy thoughts

Since I'm going to be working for Google soon, I feel compelled to sort out my opinions about how the company should work, if only to document how my opinions change over time. In particular, I want to record my thoughts (and to the extent possible, reasons) about privacy.

The basic problem is that humanity as a whole benefits greatly from the democratization and openness of data. I think anybody who wants to should be able to know how many emails were sent in the last month. However, some data is sensitive and shouldn't be publicly accessible. Worse, the data that I think should be available is often aggregate statistics about sensitive data. I don't think anybody should be able to know how many emails you sent last month.

Data without trust

In many situations, it is possible to gather and distribute aggregate data without any need to trust another party. For example, if a dozen friends and I want to know the average salary in our group, but none of us wants to reveal their own salary, there are ways to minimize the amount of individual information shared.

I could think of a random number (maybe negative), add my salary to it, and tell the sum to the next person. They'd add their salary to that and tell the sum to the next person, and so on. The last person tells their final sum to me, I subtract off the original random number and announce the true sum.
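A minimal Python sketch of this ring protocol (the `secure_sum` helper and the salary figures are made up for illustration; here one process simulates all participants, whereas in a real run no single party would ever see anything but the running total):

```python
import random

def secure_sum(salaries, mask_range=10**9):
    """Simulate the ring protocol: the initiator adds a random mask to
    their salary, each participant adds theirs to the running total,
    and the initiator subtracts the mask at the end."""
    mask = random.randint(-mask_range, mask_range)  # initiator's secret
    running = salaries[0] + mask      # initiator passes this value on
    for s in salaries[1:]:            # each friend adds their own salary
        running += s                  # and forwards the new total
    return running - mask             # initiator removes the mask

salaries = [52000, 61000, 48000, 75000]
print(secure_sum(salaries))  # 236000, the true sum
```

Note that each intermediate `running` value reveals nothing on its own: to everyone but the initiator it looks like a random number.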

Note that you are trusting people to be honest in this protocol (i.e. report their actual salary rather than garbage), but you aren't trusting them with your personal information. At least not much. If the first person has some information about the distribution from which I sample the original random number, then they do get some information about my salary from the number I give them, but not much. Also, I'm probably in a position of advantage in the above protocol, but as far as I can tell, it's not significant.

(This is only meant as an illustration. This protocol has other problems. Your neighbors (the person before you and the person after you) can share information after the fact to figure out your salary. Instead, I could ask you to encrypt your salary with my public key before adding (or multiplying, whichever operation the encryption preserves) your salary into the total. Then your neighbors would also have to collude with me to figure out your salary. It might even be possible to compose all of our public keys, so that every single person would have to decrypt with their private key in series before we got the end result; that'd be pretty cool.)
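The "encrypt before adding" idea needs an additively homomorphic scheme, where some operation on ciphertexts corresponds to addition of plaintexts. As an illustration only (not something the original text proposes, and hopelessly insecure with primes this tiny), here is a toy Paillier cryptosystem, in which multiplying two ciphertexts adds the numbers inside them:

```python
import math
import random

# Toy Paillier keys. Real deployments use primes hundreds of digits long.
p, q = 1789, 2003
n = p * q
n2 = n * n
g = n + 1                                   # standard choice of generator
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
mu = pow(lam, -1, n)                        # valid because g = n + 1

def encrypt(m):
    """Encrypt m < n as g^m * r^n mod n^2 with random r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover m via the Paillier L-function L(x) = (x - 1) // n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
a, b = 52000, 61000
print(decrypt((encrypt(a) * encrypt(b)) % n2))  # 113000
```

With a scheme like this, each friend could multiply their encrypted salary into the running ciphertext, and only the key holder could decrypt the final sum.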

Data with trust

It's not always possible or practical to salt your personal data like that. For one thing, everybody needs to agree on the protocol, and then they need to perform that protocol. So people sometimes compromise and simply tell their salary to a party they've decided to trust. To the extent possible, people should have the option to avoid this.

Tracking web behavior

Should somebody be tracking your behavior on the web? I'm tempted to say no, but I also want to have access to aggregate information about how people use the web, and information like how many people visit my site and where they're coming from. Access to this kind of information makes the world better in principle. By studying data about how we behave, we can make ourselves better and happier people.

Question: what should tracking cookies collect? Should the resulting data be public? How easy should it be to block specific actions from being logged? Not too easy; otherwise the collected data is severely crippled by people blocking collection of whatever data they consider unacceptable, even given a guarantee of anonymity.

You should be able to change what you show me based on my past behavior (this is a good thing!), but I should know that it's going on, and have the option of getting (and linking to) a "vanilla version" which does not incorporate previous behavior, location, or any other information specific to me.

To what extent should companies have policies about sensitive information? Google psychic and instant don't autocomplete or show results for "bisexual". I don't know how this happened, but I don't like it. I think it's okay to show psychic/instant results based on your identity (maybe you don't want results about bisexuals), but this seems to be across the board. Porn has the same sorts of issues. People are eager to say that porn is bad, but we should not impose that opinion on others without a good reason for doing so. If there is a possibility that in the 24th century people will have different (presumably better) values than ours, then we should aim to discover those values as soon as possible. I believe that we can do that best by reflecting on the present as accurately as possible. That means not monkeying with psychic and instant in a content-dependent way. The behavior of these services should be data-driven; the word "bisexual" should be treated just like every other word, and be subject to the same algorithmic analysis.

Changing our minds

How can we make the internet less of an echo chamber? It would be really nice to know what content is likely to change my mind about something. Such content is extremely valuable: even if it doesn't actually change my mind, it is likely to make me grow as a person.

Social networking meets online dating

Social networking software uses its relationship graph to suggest people to you. I guess the idea is that it's trying to discover the true relationship graph since you are likely to be friends with your friends' friends. But data about people could be used to generate fruitful relationships. Dating sites do this already, but why couldn't the same approach be used to match people for other kinds of relationships? Is data about people's behavior online good enough to predict when two people would enjoy hanging out or corresponding?
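The friend-of-friend heuristic can be sketched in a few lines (the `suggest_friends` helper and the example graph are hypothetical; real sites weight many more signals than mutual-friend counts):

```python
from collections import Counter

def suggest_friends(graph, person, k=3):
    """Rank non-friends of `person` by number of mutual friends."""
    counts = Counter()
    for friend in graph.get(person, set()):
        for fof in graph.get(friend, set()):
            if fof != person and fof not in graph[person]:
                counts[fof] += 1          # one more mutual friend
    return [name for name, _ in counts.most_common(k)]

graph = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice", "dave", "erin"},
    "dave": {"bob", "carol"},
    "erin": {"carol"},
}
print(suggest_friends(graph, "alice"))  # ['dave', 'erin']
```

A matching system for other kinds of relationships would replace the mutual-friend count with some richer compatibility score over behavioral data, but the ranking skeleton would look much the same.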

Getting rid of stuff

We see lots of ads inviting us to buy stuff. This is fine, since stuff often makes our lives better. But I find that I have more stuff than I want. It would be great if there were the same kind of constant pressure for me to reduce my collection of stuff.

Example app from the future: I go into spring cleaning mode in my apartment. Now everything I look at comes with bids. For example, as I go through my bookshelf, every book comes with an offer to sell it for $X. If this existed, I'm sure I'd have many fewer things. The problem is that generating bids takes time, as does inputting information about things. How would the other side of the system work?

The idea would be to take advantage of the fact that people often have long-standing desires. For example, I'd kind of like to have a skateboard. I'd be willing to authorize you (or a competent algorithm) to buy a skateboard on my behalf. It would be best if you (the algorithm) already knew quite a lot about skateboards, so you'd know what specifics to ask me about. You'd say something like "there are three kinds of skateboards. Depending on what you plan to use it for, you may only want a certain kind." Then I could answer your questions to whatever level of detail I wanted to. I could also specify a maximum price. Then I'd trust you to bid on my behalf when other people are cleaning their garages. It'd be even better if you algorithmically adjusted my bid based on the exact product available.

This leaves open the problem of how you get information about the things you're bidding on. When I'm going through my bookshelf, I guess I could provide photos of the books. Then you might be able to judge things like shelf wear. Of course, you could rely on me to some extent to give you accurate information. But this is a bit dangerous. You don't want people to game the system by adjusting their descriptions incessantly to get the maximum possible bid.