This is a Test
One of the things I’ve been spending a lot of time and frustration on is backfilling the testing. It’s sometimes tricky to figure out how all the pieces fit together. Thus far I’ve elected to avoid Cucumber and just use WebRat directly in RSpec for the higher level tests.
Down in the unit tests, there were a few hairy places where it took multiple mocks setting up a system of objects to get an operation to pass. I installed Factory Girl to handle dummy objects. I haven’t looked to see whether it’s really inserting objects into the test database, or just faking out all the relevant bits of code to make it look that way.
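For what it’s worth, factory_girl’s classic API makes that distinction explicit: `Factory(:thing)` creates and saves a record in the test database, while `Factory.build(:thing)` returns an unsaved instance. A hypothetical factory sketch – the `:vote` factory and its attributes are invented for illustration, not Siggnal’s real schema:

```ruby
# spec/factories.rb -- attribute names here are made up for illustration.
Factory.define :vote do |v|
  v.voter_name 'alice'
  v.subject_id 12345
  v.value      1
end

# Factory(:vote)       -> builds AND saves a Vote row in the test database
# Factory.build(:vote) -> returns an unsaved instance; no tables touched
```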
I’m making things especially difficult for myself by not only integrating with an outside service, but relying on OAuth for user login. That also makes testing harder, because I can’t test against those services when offline. Fortunately, I’ve heard about web mocking tools. VCR is a gem built on top of others to streamline recording actual web interactions and then playing them back during tests. It not only allows me to work offline, but probably spares me from running into API limits. I do have one ‘for real’ test to minimize the danger of hiding actual problems.
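A minimal VCR setup along these lines might look like the following (this uses VCR’s later `configure` API; the cassette directory, record mode, and filtered environment variable are assumptions, not Siggnal’s actual config):

```ruby
# spec/support/vcr.rb -- a sketch, not Siggnal's real configuration.
require 'vcr'

VCR.configure do |c|
  c.cassette_library_dir = 'spec/cassettes'        # recorded HTTP interactions live here
  c.hook_into :webmock                             # intercept HTTP at the library level
  c.default_cassette_options = { record: :once }   # record on first run, replay thereafter
  # Keep OAuth secrets out of committed cassettes (the env var name is hypothetical).
  c.filter_sensitive_data('<TWITTER_TOKEN>') { ENV['TWITTER_ACCESS_TOKEN'] }
end
```

With `record: :once`, the first run against the live API writes a cassette, and every later run replays it – which is what makes offline work possible.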
Cooking the Guestbook
Even with VCR, OAuth is an issue because it’s based on the browser redirecting. So I had to abstract over the login process a little bit, providing a method that the real callback invokes with the login information, and which the tests could call as well. Fortunately, Twitter provides a set of static tokens for single-user applications that I can use during testing.
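The shape of that abstraction, sketched as a plain object rather than Siggnal’s actual controller code (the class and method names here are hypothetical):

```ruby
# A sketch of pulling "what to do with login info" out of the OAuth
# callback, so specs can skip the browser redirect entirely.
class LoginHandler
  def initialize(session)
    @session = session
  end

  # The real OAuth callback calls this after the token exchange;
  # tests call it directly with Twitter's static single-user tokens.
  def complete_login(service, screen_name, token, secret)
    @session[:account] = { service: service, name: screen_name,
                           token: token, secret: secret }
  end

  def logged_in?
    @session.key?(:account)
  end
end
```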
Accounting for Everything
Siggnal is an interesting project because it has to deal with accounts on multiple services, many of which are not users of Siggnal itself. So I’ve got an account class which is separate from a user. At the moment every account does in fact represent a user: there is no other information to save yet, every login goes through OAuth, and all I need to keep track of is which account a login represents.
An account object is basically the host service plus some user name and/or id. It then provides methods for getting related vote-by and vote-for information. At the moment, several places still assume Twitter, but I’m trying to leave some hooks in place for abstracting over other services.
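A rough sketch of that shape – the class and its methods are assumptions based on the description above, not Siggnal’s actual model:

```ruby
# An account is a host service plus a user name and/or id.
class Account
  attr_reader :service, :user_id, :screen_name

  def initialize(service, user_id: nil, screen_name: nil)
    @service = service
    @user_id = user_id
    @screen_name = screen_name
  end

  # Still Twitter-centric for now, but keeping the service explicit
  # leaves a hook for other networks later.
  def twitter?
    service == :twitter
  end

  def to_s
    "#{service}/#{user_id || screen_name}"
  end
end
```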
Somewhere along the line, I added a logout action as well.
Somewhere along the line, I’ll probably have to create some concept of actual Siggnal users represented in my own database. In the meantime, a login is just a cookie. As a hedge against other services, I’ve started breaking the session down into service-specific accounts and OAuth tokens. I’ve discovered that it’s possible to store Ruby objects (at least simple ones) in the session, which I assume uses some form of marshaling.
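That assumption is right for Rails of this era: the default cookie session store serializes its contents with Ruby’s `Marshal`. A quick demonstration of the round trip with a stand-in struct (the struct itself is illustrative, not Siggnal’s session shape):

```ruby
# A simple value object standing in for what goes into the session.
SessionAccount = Struct.new(:service, :screen_name, :token, :secret)

account = SessionAccount.new(:twitter, 'alice', 'tok', 'sec')

# This is essentially what the session store does under the hood.
restored = Marshal.load(Marshal.dump(account))
```

Structs compare by value, so the restored object is equal to the original – which is why simple objects survive the trip through the cookie.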
Somebody Might See This
Since I’m thinking about doing a sort of soft launch for a little feedback, I’ve been looking at cleaning things up a little bit. For one, the page layout now includes a rather direct warning that it’s still a toy with no guarantees. I also put up a Mad Mimi e-mail form in case anybody is actually interested.
Out With the Old
Part of the cleanup was removing things that were not being used. I started working with Twitter search to avoid the OAuth hassle, but it has languished since. I’m not quite sure what to do with search since the whole point is collecting data about your own feed. It might be interesting to rate what is interesting for a search term, but I’m not sure how to prevent abuse. For now, I’ve just removed search as a problem for another day.
I also removed a lot of the vote scaffolding, since there isn’t an obvious need for most of it.
I never planned to make Siggnal a full-featured client – ultimately, it would be a service that other clients send votes to. However, a little while ago I stumbled across Twitter Intents, which reduce reply/retweet/favorite to constructing a trivial URL.
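The URL construction really is trivial – a sketch of a helper, using the intent endpoints Twitter documented at the time (the helper itself is hypothetical):

```ruby
# Build a Twitter Web Intent URL for a given action and tweet id.
def intent_url(action, tweet_id)
  base = 'https://twitter.com/intent'
  case action
  when :reply    then "#{base}/tweet?in_reply_to=#{tweet_id}"
  when :retweet  then "#{base}/retweet?tweet_id=#{tweet_id}"
  when :favorite then "#{base}/favorite?tweet_id=#{tweet_id}"
  end
end
```

Linking to one of these URLs hands the whole interaction off to Twitter’s own UI, so a not-quite-a-client app gets reply/retweet/favorite for free.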
More significantly, I had the insight that a two-dimensional graph of siggnal vs. noise would give a much better overview than a single number, as in Followcost. A 45-degree line shows the break-even point, and it clearly separates high from low volume. I’d long known that newsyc50 was a big part of my feed, but seeing it as a huge outlier (even a high-siggnal one) convinced me to switch to newsyc150. This allowed me to see everything else as more than a blob. I might want some sort of zooming, but low-volume users don’t affect things much.
Actual use is a mixed bag. The data is making me more conscious of the realities of my feed, but I’m also wondering if there is enough value added for the amount of data which needs to be gathered.
A Tweet of Many Parts
Behind the scenes, I’ve been slowly moving away from URLs. I had basically been doing a wildcard search for a user name, then extracting the name and rechecking to make sure it wasn’t a bad match. I’ve broken out the identifiable parts, and then converted user names to ids – Twitter users can change names on a whim, and yet the name shows up in the relevant URLs. And of course, it makes searches more direct, and hopefully faster.
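The “identifiable parts” extraction amounts to a focused pattern match instead of a wildcard search – a sketch, where the regex and helper are illustrative rather than Siggnal’s actual code:

```ruby
# Match both the /status/ and older /statuses/ URL forms.
STATUS_URL = %r{twitter\.com/([A-Za-z0-9_]+)/status(?:es)?/(\d+)}

# Pull the screen name and status id out of a tweet URL, or nil
# if it isn't one. The id is the stable half; the name would then
# be resolved to a user id via the API, since names can change.
def parse_status_url(url)
  m = STATUS_URL.match(url)
  { screen_name: m[1], status_id: m[2].to_i } if m
end
```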
An unfortunate side effect was more verbose submission forms, with separate (hidden) inputs for each piece. On the other hand, I took out the voter field – it will always be the currently logged-in user, which is how it probably should have worked to begin with.
In preparation for the name-to-id lookups, I started abstracting the interactions with Twitter a little bit. I was able to hide some, but far from all, of the OAuth mechanics. It also moves me closer to having a service-neutral internal API at some point in the future.
Migration Dependency Hell
In the end I elected not to use this abstraction in the actual migration, however, because it would make the migration dependent on the code as it now exists. I still had to depend on the model, which worries me considerably, but that’s just the way Rails (or at least ActiveRecord) rolls. I already had to set up some conditional model code to bridge one migration, and I fear further fragility in the future. I’ve actually been wondering if it would make sense to use a custom simplified model for each migration.
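That per-migration model idea is a known Rails pattern: define a bare ActiveRecord class inside the migration itself, so it’s frozen at that migration’s schema and unaffected by whatever app/models becomes later. A hypothetical sketch – the migration name and column are invented:

```ruby
class NormalizeVotes < ActiveRecord::Migration
  # Migration-local model: none of the real Vote model's validations,
  # callbacks, or future refactorings are loaded. The nested class
  # still maps to the 'votes' table by name.
  class Vote < ActiveRecord::Base; end

  def self.up
    Vote.reset_column_information  # pick up any columns added just above
    Vote.find_each do |vote|
      # ...rewrite each row using only columns that exist right now...
    end
  end
end
```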