We're a growing engineering team, and that tends to mean people become more specialized. This is great for an individual's technical depth, but it can weaken their breadth of knowledge of new technology. We want to encourage the growth of both within our engineering team. To address this, our front-end engineers worked together to run a series of workshops for the whole engineering team.
Our mission at Artsy has been to make a world where everyone is moved by art every day, and at a high level, the way that our engineering team supports that mission is through building software. We have built systems and databases and user interfaces that represent different facets of the art world, and throughout our work, we have... made some mistakes.
That's okay! Programmers make mistakes all the time. There's a long list of blog posts describing various programmer misconceptions, from subjects you might expect would be simple to model in computers, like units of measurement and time, to subjects rooted more in the human condition, like postal addresses and marriage.
In the interest of openness and sharing what we've learned, the Artsy Engineering team has come up with the following list of misconceptions programmers believe about art. Thank you to everyone at Artsy who contributed to this list.
When we talk about our React Native setup in the abstract, there are two kinds of "now draw the rest of the owl" questions for iOS developers:
- How do I build this React Native as a CocoaPods setup?
- How do I consume that CocoaPod in my existing app?
We're going to address the first part in this post. By the end of this post we're going to get an Emission-like repo set up for an existing OSS Swift iOS app called GitHawk. The aim is to introduce new UIViewControllers via a CocoaPod which is consumed by GitHawk.
To do this we're going to use CocoaPods' `pod lib create` template and React Native's `react-native init` to make a self-contained React Native repo. It will export a JS file and some native code which the Podspec will reference. This keeps the tooling complexity for iOS and React Native separate. Read on to start digging in.
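The contract between the two sides is worth making concrete: the JS bundle registers root components by name, and the native side looks them up to back its UIViewControllers. Here's an illustrative TypeScript sketch of that idea, using a tiny stand-in registry rather than React Native's real `AppRegistry`, with a hypothetical component name:

```typescript
// A stand-in for React Native's AppRegistry, just to illustrate the contract:
// the pod's JS bundle registers components by name, and the native side
// (the pod's UIViewControllers) resolves them by that same name.
type Component = (props: Record<string, string>) => string;
type ComponentFactory = () => Component;

const registry = new Map<string, ComponentFactory>();

export function registerComponent(name: string, factory: ComponentFactory): void {
  registry.set(name, factory);
}

export function getComponent(name: string): Component {
  const factory = registry.get(name);
  if (!factory) throw new Error(`No component registered for "${name}"`);
  return factory();
}

// The pod's JS entry point would do something like this
// (the component name and props are hypothetical):
registerComponent("RepositoryScreen", () => (props) => `Repository: ${props.name}`);
```

The important property is that this name mapping is the entire surface area the host app needs to know about; everything behind it can change without touching the consuming app's build.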
In 2017, Apple released ARKit to universal acclaim. It's a solid foundation for application developers to build Augmented Reality (AR) experiences without learning a whole new skillset in computer vision. Like a lot of Apple's technology, it's a clever blend of existing projects: SceneKit, CoreMotion, CoreML and some very sophisticated camera work. From the developer's perspective, ARKit has an API which fits perfectly with the rest of Apple's APIs. You spend most of your time working with a few delegate functions and you're good to go.
For the last 2 months, I've been working with ARKit on a replacement for our View in Room feature on modern iOS devices to support a "View in My Room". I'm going to try to cover how we approached the project and the abstractions we created, and give you a run-through of how it works.
We believe that our implementation is a solid improvement over similar features in other apps that allow users to place artworks on walls, and we're making the source code available free and open-source under the MIT license.
@alloy first mentioned React Native as an option for Artsy back in March 2015, and in February 2016 he made our first commit to get the ball rolling. Since then, we've grown a new codebase, Emission, which has slowly taken over the responsibility for creating the UIViewControllers presented inside our iOS app.
We've come quite far from where we started, and I was asked to give a talk summarizing what we've learned in the last 2 years as a set of native developers using React Native.
On the engineering team at Artsy, we've built a CMS for both internal and external editors to write and publish articles. We have a team of a dozen in-house editors creating new content on a daily basis. As many people started using the app simultaneously, a problem became apparent. Editors would unintentionally override each other's work because there was no way to tell if someone else was currently editing an article. As a workaround, team members were forced to edit drafts in another editor, such as Google Docs, and copy their work over once ready. This made for a lackluster collaborative experience.
So we decided to implement a system that would make our editors more confident in our CMS by ensuring only one editor could edit an article at any given time. I was tasked with coming up with an elegant technical solution for this feature. Here's the approach I took.
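One common way to build this kind of single-editor guarantee is an expiring lock that the active editor's client periodically renews, so a crashed browser can't hold an article hostage. The sketch below is illustrative only; the class name, method names, and TTL are assumptions, not Artsy's actual implementation:

```typescript
// Illustrative article-lock manager: one editor holds the lock per article,
// and the lock expires unless the editor's client keeps renewing it.
// All names and the TTL are hypothetical, not Artsy's actual code.
interface Lock {
  editorId: string;
  expiresAt: number; // epoch milliseconds
}

export class ArticleLockManager {
  private locks = new Map<string, Lock>();

  constructor(
    private ttlMs: number = 30_000,
    private now: () => number = Date.now, // injectable clock for testing
  ) {}

  // Returns true if `editorId` now holds the lock for `articleId`.
  // Re-acquiring your own lock renews it (this is the heartbeat).
  acquire(articleId: string, editorId: string): boolean {
    const lock = this.locks.get(articleId);
    const isFree = !lock || lock.expiresAt <= this.now() || lock.editorId === editorId;
    if (isFree) {
      this.locks.set(articleId, { editorId, expiresAt: this.now() + this.ttlMs });
    }
    return isFree;
  }

  release(articleId: string, editorId: string): void {
    if (this.locks.get(articleId)?.editorId === editorId) {
      this.locks.delete(articleId);
    }
  }

  heldBy(articleId: string): string | undefined {
    const lock = this.locks.get(articleId);
    return lock && lock.expiresAt > this.now() ? lock.editorId : undefined;
  }
}
```

The client would call `acquire` when opening an article and again on a heartbeat interval shorter than the TTL; other editors' clients can use `heldBy` to show who is currently editing.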
We've previously covered what Apogee is and how it's deployed, so all that's left to cover is the technology used to build it. As a refresher: Apogee is a Google Sheets Add-on we built to help our Auctions Ops team transform the data given to us by our partners into a format that our CMS can understand. This process, done manually up until now, takes a long time and is a perfect candidate for automation.
Apogee had some really interesting technical challenges that I enjoyed solving, and I'm excited to share some lessons I learned. So let's dive in!
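To make the kind of work involved concrete, here's an entirely hypothetical sketch of the core operation: normalizing a partner-supplied spreadsheet row into a record shaped for a CMS importer. The field names and money format are invented for illustration, since the post doesn't describe the actual schemas:

```typescript
// Hypothetical sketch of a partner-row-to-CMS transformation.
// Column names, the CmsLot shape, and the currency format are all
// invented for illustration; Apogee's real schemas aren't shown here.
type PartnerRow = Record<string, string>;

interface CmsLot {
  lotNumber: number;
  artistName: string;
  title: string;
  lowEstimateCents: number;
  highEstimateCents: number;
}

function parseDollars(value: string): number {
  // e.g. "$1,200" -> 120000 cents
  const dollars = Number(value.replace(/[$,\s]/g, ""));
  if (Number.isNaN(dollars)) throw new Error(`Unparseable amount: ${value}`);
  return Math.round(dollars * 100);
}

export function transformRow(row: PartnerRow): CmsLot {
  return {
    lotNumber: Number(row["Lot #"]),
    artistName: row["Artist"].trim(),
    title: row["Title"].trim(),
    lowEstimateCents: parseDollars(row["Low Est."]),
    highEstimateCents: parseDollars(row["High Est."]),
  };
}
```

In a Google Sheets Add-on, a function like this would run over every row the Ops team selects, turning an afternoon of manual cleanup into a single menu click.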
Apogee: the point in the orbit of two objects at which they are furthest apart.
In 2017, the Artsy Auctions Operations team coordinated and ran 190+ sales on our platform. This year, our ambitions are set even higher. But scaling up the number of sales we run will require scaling up our tools and processes, too. This post describes Apogee, a tool I built to help us scale our business processes. I never thought I would be so excited to build a spreadsheet plugin, but I honestly had a blast. So let's dive in!