Heroku will log an R10 - Boot Timeout error when a web process takes longer than 60 seconds to bind to its assigned port. This error is often caused by a process that cannot reach an external resource, such as a database, or by an application that loads so many gems from its Gemfile that it simply takes too long to boot.

Dec 12 12:12:12 prod heroku/web.1:
  Error R10 (Boot timeout)
  Web process failed to bind to $PORT within 60 seconds of launch

There’s currently no way to increase this boot timeout, but we can beat it with a proxy implemented by our new heroku-forward gem.
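
The rough idea: have the web dyno boot a tiny proxy that binds to $PORT almost immediately, while the real Rack application warms up behind it. Below is a hedged sketch of what the dyno's config.ru might look like; the class names follow the gem's README as of this writing and the application.ru path is illustrative.

# config.ru run by the web dyno: bind to $PORT right away via a proxy,
# then boot the slow Rack application (application.ru) behind it.
require 'heroku-forward'
require 'heroku/forward/backends/thin'

application = File.expand_path('../application.ru', __FILE__)

# The backend is the real application, booted by Thin behind the proxy.
backend = Heroku::Forward::Backends::Thin.new(application: application, env: :production)

# The proxy binds to $PORT within seconds, which satisfies Heroku's 60-second
# check, and starts forwarding requests once the backend is up.
proxy = Heroku::Forward::Proxy::Server.new(backend, host: '0.0.0.0', port: (ENV['PORT'] || 3000).to_i)
proxy.forward!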

Read on →

Knowing how well your API performs in real time is essential to any successful project. That’s because you can’t fix what you can’t measure.

We use and heavily contribute to Grape, a Ruby API DSL. Grape is Rack middleware, and we have been reporting API performance data to New Relic with code from my older blog post.
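
For context, that older approach boils down to wrapping the Grape API in a small piece of Rack middleware that reports each call to New Relic as a web transaction. A hedged sketch of that pattern follows; the middleware class name and the Api constant are illustrative, not our actual code.

require 'newrelic_rpm'

# Rack middleware that reports each API call to New Relic as a transaction.
class ApiNewRelicInstrumenter
  include NewRelic::Agent::Instrumentation::ControllerInstrumentation

  def initialize(app)
    @app = app
  end

  def call(env)
    perform_action_with_newrelic_trace(name: env['PATH_INFO'], category: :rack) do
      @app.call(env)
    end
  end
end

# config.ru: run the Grape API behind the instrumenter.
# use ApiNewRelicInstrumenter
# run Api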

It’s time to improve the reporting implementation and address performance monitoring in both development and production environments. Here’s what a single API request breakdown is going to look like.

Read on →

All Artsy URLs shared publicly are human-readable. For example, you’ll find all of Barbara Kruger’s works at artsy.net/artist/barbara-kruger and a post by Hyperallergic entitled “Superfluous Men Can’t Get No Satisfaction” at artsy.net/hyperallergic/post/superfluous-men-cant-get-no-satisfaction. This is a lot prettier than having id=42 in the browser’s address bar and a big improvement for SEO.

We construct these URLs with a gem called mongoid_slug. Interesting implementation details under the cut.
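
Here is a hedged sketch of what this looks like on a model; the Artist class and name field are illustrative, and the calls reflect the version of mongoid_slug we use.

class Artist
  include Mongoid::Document
  include Mongoid::Slug

  field :name
  slug :name  # generates and stores a URL-safe slug, e.g. "barbara-kruger"
end

# to_param returns the slug, so URL helpers produce the pretty path.
# Artist.where(name: 'Barbara Kruger').first.to_param  # => "barbara-kruger"
# Artist.find_by_slug('barbara-kruger')                # slug-aware lookup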

Read on →

There is a great deal of misinformation on the web about detecting an iPad or an iPhone in JavaScript. The top answer on Stack Overflow, and the many blog posts that use the same technique, are all incorrect.

The conventional wisdom is that iOS devices have a user agent for Safari and a user agent for the UIWebView. This assumption is incorrect as iOS apps can and do customize their user agent. The main offender here is Facebook, whose iOS app alone accounts for about 1-3% of Artsy’s daily traffic.

Compare these user agent strings from iOS devices:

# iOS Safari
iPad: Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9B176 Safari/7534.48.3
iPhone: Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9A334 Safari/7534.48.3

# UIWebView
iPad: Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9B176
iPhone: Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_1 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Mobile/8B117

# Facebook UIWebView
iPad: Mozilla/5.0 (iPad; U; CPU iPhone OS 5_1_1 like Mac OS X; en_US) AppleWebKit (KHTML, like Gecko) Mobile [FBAN/FBForIPhone;FBAV/4.1.1;FBBV/4110.0;FBDV/iPad2,1;FBMD/iPad;FBSN/iPhone OS;FBSV/5.1.1;FBSS/1; FBCR/;FBID/tablet;FBLC/en_US;FBSF/1.0]
iPhone: Mozilla/5.0 (iPhone; U; CPU iPhone OS 5_1_1 like Mac OS X; ru_RU) AppleWebKit (KHTML, like Gecko) Mobile [FBAN/FBForIPhone;FBAV/4.1;FBBV/4100.0;FBDV/iPhone3,1;FBMD/iPhone;FBSN/iPhone OS;FBSV/5.1.1;FBSS/2; tablet;FBLC/en_US]
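
The practical takeaway from these strings: the Safari and Version tokens disappear inside web views, but the device token never does. Here is a minimal sketch of a check keyed on the device token, shown in Ruby for consistency with the rest of this page (the post itself performs the check client-side in JavaScript):

IOS_DEVICE = /\b(iPad|iPhone|iPod)\b/

# Returns "iPad", "iPhone", "iPod" or nil. This matches Safari, the plain
# UIWebView and Facebook's customized UIWebView alike, because every variant
# above keeps the device token even when it drops "Safari" and "Version".
def ios_device(user_agent)
  user_agent[IOS_DEVICE, 1]
end

ios_device('Mozilla/5.0 (iPad; CPU OS 5_1 like Mac OS X) ...')  # => "iPad"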

Read on →

This post details the first of many challenges we faced in 3D transforming the homepage of Artsy (inspired by Meny): detecting CSS 3D transform support.

Front-end development is messy in today’s fragmented world. At Artsy, our goal is to do what it takes to provide an incredible experience for all of our users (IE8+, iOS and the usual suspects). Deploying bleeding-edge tech, like CSS 3D transforms, is an exercise in compromising principles for practicality, and in managing these “compromises” in well-documented code.

We looked to Modernizr’s feature detection approach to give us a reliable way to detect CSS3 3D transform support across browsers. They have some well-documented struggles around the issue. After flipping most of the tables in the office ┻━┻ ︵ヽ (`Д´)ノ︵ ┻━┻ , we settled on user agent sniffing as the most robust method for detecting CSS3 3D transform support. But why did none of the available methods work for us?

Read on →

The public launch of Artsy via the New York Times is a good opportunity to describe our current technology stack.

What you see when you go to Artsy is a website built with Backbone.js and written in CoffeeScript. It renders JSON data served by Ruby on Rails, Ruby Grape and Node.js services. Text search is powered by Apache Solr. We also have an iOS application that talks to the same back-end Ruby API. We run all our web processes on Heroku and all job queues on Amazon EC2. Our data store is MongoDB, operated by MongoHQ, and we also run some Redis instances. Our assets, including images, are served from Amazon S3 via the CloudFront CDN. We rely heavily on the Memcached Heroku add-on, and we use SendGrid and MailChimp to send e-mail. Systems are monitored by a combination of New Relic and Pingdom. All of this is built, tested and deployed with Jenkins.

In this post I’ll dig into our current system architecture and tell the story of how all these parts came together.

Read on →

We now have over 4700 RSpec examples in one of our projects. They are stable, thanks to the techniques described in an earlier post, and organized into suites. But they now take almost 3 hours to run, which is clearly unacceptable.

To solve this, we have parallelized parts of the process with existing tools, and can turn a build around in just under an hour. This post will dive into our Jenkins build flow setup.

To keep things simple, we’re only going to build the master branch. When a change is committed on master, we’ll push master to a master-ci branch and trigger a distributed build on master-ci. Once all the parts have finished, we’ll complete the build by pushing master-ci to master-succeeded and notify the dev team of success or failure.
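
As a rough illustration of that branch choreography (our actual jobs script these steps inside Jenkins; the Rake tasks and the origin remote below are just for illustration):

# Rakefile sketch of the promotion steps the Jenkins jobs perform.
desc 'Kick off a distributed build of the current master'
task :ci_start do
  # Jenkins watches master-ci and fans the spec suites out across parallel jobs.
  sh 'git push origin master:master-ci --force'
end

desc 'Mark the build green once every parallel part has passed'
task :ci_succeeded do
  sh 'git push origin master-ci:master-succeeded --force'
  # A final step notifies the dev team of success or failure.
end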

Here’s a diagram of what’s going on.

Read on →

Artsy Folio, our free iPad app for Gallery Partners, had been in the App Store for a couple of weeks before the iPad with a Retina display was announced. We had expected this internally and felt the application would be ready for it. We had all our image assets available in @2x versions and an image pipeline that took scaling into account. With that in mind, we changed our artwork grid view to show a double-resolution image. Finally, once we were happy that it worked fine in the simulator, we sent the build off to Apple for review.

The app passed review, and was Retina-ready before the actual release. But within hours of getting our hands on a real Retina iPad, we had to pull the app. This post will explain why, and what we did to work it out.

Scrolling the grid view was slow. Extremely slow. The reason wasn’t obvious at first, but after digging around in Instruments we saw that a great deal of time was spent in Apple’s image processing libraries. This was a strong hint that the problem lay in getting the file from disk to the screen.

In our naiveté, Folio originally used UIImage’s initWithContentsOfFile: to load (without caching) a JPEG from the file system. Once the file was loaded into memory, we displayed it on screen in a UIImageView. This was fast enough for our small 240x240 thumbnails, but the moment you ask it to pull 3 or 4 480x480 JPEGs off the filesystem, decompress them and put them on the screen, you’re not going to get a smooth scroll.

As we knew that we were looking at an issue with getting images from a file, it made sense to start looking at ways to move image processing off the main thread. This Stack Overflow thread on UIImage lazy loading proved to be an essential start to dealing with our issue. We needed a thread-safe way to load a file’s contents and hand the image back only once it had been decoded. What we needed was initImmediateLoadWithContentsOfFile, a thread-safe way to go from a file path to a UIImage.

Now that we had a way to load an image that was safe to run on a background thread, we gave our grid an NSOperationQueue and created a method to kick off an NSInvocationOperation with the cell we’re looking at and the address it needs to load the thumbnail from.

- (void)setImageAsyncAtPath:(NSString *)imageAddress forGridCell:(ARImageGridViewCell *)cell {
    // Package the image path and the target cell, then hand asyncLoadImage:
    // off to the operation queue so decoding happens off the main thread.
    NSDictionary *operationOptions = @{@"address": imageAddress, @"cell": cell};
    NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithTarget:self selector:@selector(asyncLoadImage:) object:operationOptions];

    [_operationQueue addOperation:operation];
}

When we had the simplest implementation of asyncLoadImage, we found that scrolling would sometimes result in grid cells displaying the wrong image. It turned out that in the time it took to decode the JPEG, the cell had already been reused for a different artwork. This one totally caught us off guard!

- (void)asyncLoadImage:(NSDictionary *)options {
    @autoreleasepool {
        NSString *address = options[@"address"];
        ARImageGridViewCell *cell = options[@"cell"];

        // don't load if it's on a different cell
        if ([cell.imagePath isEqualToString:address]) {
            UIImage *thumbnail = [[UIImage alloc] initImmediateLoadWithContentsOfFile:address];

            // double check that during the decoding the cell's not been re-used
            if ([cell.imagePath isEqualToString:address] && thumbnail) {
                [cell performSelectorOnMainThread:@selector(setImage:) withObject:thumbnail waitUntilDone:NO];
            }
        }
    }
}

This meant we could have our UI thread dealing with scrolling, whilst the operation queue ensured the image processing was done asynchronously and as fast as possible.

However, this still wasn’t enough. We found that if you scrolled fast enough, you could still see images pop in after the grid cell was visible. For this, we went back to the beginning and made our image pipeline create a 120x120 thumbnail for each artwork, which we load with initImmediateLoadWithContentsOfFile on the UI thread. This is fast enough to keep scrolling smooth, and the thumbnail is replaced by the higher-resolution image practically instantly.

The rest of the story is pretty straightforward. We wrapped all this up within a few days and shipped a version of Folio for the Retina iPad. I ended up giving a talk about the issues involved at LSxCafé in Leeds, and you got a blog post out of it.

At Artsy Engineering we encourage a culture of experimentation with something called labs.

A new feature released into production is usually only turned on for a handful of users. We get feedback from our own team and a tiny group of early adopters, iterate, fix bugs, toss failed experiments and work on promoting complete, well-behaved features to all users. The labs infrastructure gives us a chance to sleep on an idea and polish details. It also allows us to make progress continuously and flip a switch on the very last day.

My favorite labs features push our collective imagination and give birth to productive brainstorms over coffee at a popular startup hangout around the corner from our Manhattan office. But the team’s favorite labs are, by far, those that ship as easter eggs. These are fun and sometimes useful features that don’t make much business sense. So, before I explain our rudimentary labs system, I want to invite you to our easter egg hunt. Check out https://artsy.net/humans.txt for instructions.
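
Conceptually, a lab is just a feature flag scoped to a list of opted-in users. Here is a purely hypothetical sketch of that kind of check; none of the class, field or helper names below are our actual code.

# A lab is a named feature plus the set of users it is enabled for.
class Lab
  include Mongoid::Document

  field :name, type: String
  field :enabled_for_all, type: Boolean, default: false
  has_and_belongs_to_many :users

  def enabled_for?(user)
    enabled_for_all || (user && users.include?(user))
  end
end

# In a view or helper:
# if Lab.where(name: 'easter-eggs').first.try(:enabled_for?, current_user)
#   render_easter_egg
# end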

Read on →