I've only been doing this mobile engineering thing for a short time (~3 years) compared to most developers in their fields. Coming from a design background, I've noticed a few patterns in rotation at my current role that don't mesh well with my designer mindset.
First, let me start off by saying, I am in no way set in my ways. Everything from the moment I opened Xcode for the first time has been an open learning experience and nothing less.
The following opinions are my thoughts and my thoughts only, not to be taken too literally. This post is more of a metaphor for the balance between logic and design, efficiency and effectiveness.
There are certain phrases and ideologies that get tossed around the shop that I'd like to hear a lot less often — best practices being one of them. I tend to approach most of my engineering from a designer's perspective. I have always considered myself a product designer more than anything else in my career.
I strive to be a problem solver, a motivator, and an end-user advocate. I don't see much of that in the development community, and that sickens me. Too many engineers are focused on crunching code rather than solving the problem, which, in the long run, would make everyone's lives easier.
Whether it's a business plan, a proof of concept, a side project, or even a hack week idea, a product must deliver on all fronts. A product needs to be flexible and easily extendable. Simple to use, yet hard to break; very secure, while still feeling lightweight. This is the user experience people are looking for. This is what most successful mobile products do.
The team building the product also needs the ability to pivot as quickly as possible on new features and ideas to keep their market advantage and maintain their edge. This means having highly modular code and the ability to leverage past projects and codebases to make implementation a lot less difficult.
At the highest level, our goals for our mobile engineering process are to:
- Consistently ship high-quality software
- Achieve a rapid (10-day) delivery schedule
- Reduce the overhead of performing a release as much as possible
- Maximize code modularity and reusability
- Empower everyone on the team to ship, gather data and iterate on features they want to work on
Our typical workflow is a pseudo-agile process. Sprints are 2 weeks long, and we aim to ship the finished software to the App Store, if applicable, or to the proper GitHub (public or private) repository. Unless an impromptu issue bubbles up, we begin our sprints on Tuesdays and finish them on Mondays (2 weeks later).
- Tuesday: Sprint Planning (3:00pm to +/-6:00pm)
- Friday: Initial internal release of a proof of concept or minimum viable product (if available)
- Monday: Minor code review and product owner feedback from Week 1
- Wednesday: Iterate on scope and initial planning to incorporate the feedback received
- Thursday: Major code review and team feedback
- Friday: Iterate until code complete
- Monday: Final product review (via code or live simulation)
- Tuesday: Ship product and begin next sprint
- Wednesday: Start new project (concurrent with finishing the last sprint, if delayed)
Pull Requests & Code Review
We send out a lot of pull requests. It's actually quite manageable, regardless of the volume. GitHub truly has made it easier for our team to work remotely and collaborate on issues and solutions for the business (and, sometimes, the open-source community). Typically toward the middle of the week, the mobile repositories fill up with pull requests waiting to be massaged and analyzed. It's the foundation of creating, and reusing, great code.
We try our best to keep reviews quick; they may sit in the queue for a few hours, or a day if submitted late in the week. But we always keep the conversation in the comments to track changes or suggestions. And when the pull request is approved, we append a random silly animated GIF.
The majority of our projects are fully tested. Admittedly, it would be hard for us to work fully test-driven, but we try our best to implement a test suite at least after the initial launch.
At the moment, we only focus on unit testing and leave behavioral or UI testing to the QA team, but I'd like to eventually move past that. Automation makes things much simpler (and quicker) in the long run.
Imagine testing a user flow and being 100% confident that your code has not introduced any unwanted effects on the buttons or form validations.
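To make that concrete, here's a minimal sketch of the kind of unit-level check I mean. The `SignupForm` type and its validation rules are hypothetical, invented purely for illustration (in practice this would live in an XCTest case):

```swift
// Hypothetical form-validation model — illustrative only, not code from a real app.
struct SignupForm {
    var email: String
    var password: String

    // A form is submittable when the email looks plausible and the
    // password meets a minimum length. (Toy rules for the sketch.)
    var isValid: Bool {
        email.contains("@") && email.contains(".") && password.count >= 8
    }
}

// The kind of checks a unit test would assert, so a refactor of the
// form logic can't silently break submission:
let good = SignupForm(email: "user@example.com", password: "hunter2234")
let bad = SignupForm(email: "not-an-email", password: "short")
assert(good.isValid)
assert(!bad.isValid)
print("form validation checks passed")
```

Once checks like these run on every pull request, "did my change break the form?" stops being a manual question.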
Or even better, imagine being able to test performance across a wider range of scenarios without degrading code quality or having to stash code blocks you've tweaked for optimization.
All of these scenarios would make it vastly easier to get ahead of the issues our users would potentially have. The IDE and the CI would, in effect, be our in-house end user. We could even program the server to throw a fit if a feature, UX flow, or UI element is busted.
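As a sketch of that last idea, a CI step could run a set of smoke checks and exit nonzero when anything is busted, which is all it takes to fail the build. The check names and structure here are invented for illustration; a real setup would drive this from XCTest results or screenshot diffs:

```swift
// Hypothetical CI smoke gate — a sketch, not a real pipeline.
import Foundation

// Stand-ins for real checks (build artifacts, screenshot diffs, flow runs).
func smokeChecks() -> [String: Bool] {
    return [
        "login flow renders": true,
        "signup button tappable": true,
    ]
}

let failures = smokeChecks().filter { !$0.value }.map { $0.key }
if failures.isEmpty {
    print("all smoke checks passed")
} else {
    print("busted: \(failures.joined(separator: ", "))")
    exit(1) // a nonzero exit is how the CI server "throws a fit"
}
```

The nice part is that the gate is dumb on purpose: it doesn't know how a check is computed, only that every check must pass before a release can proceed.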