I'm usually not an early adopter of anything. I hardly ever get the 1.0, I tend to wait for at least the 1.3 or, even better, the 2.0.
But after experiencing VR for the first time with an Oculus Rift DK2 back in 2015, I was sure that (this incarnation of) VR would change gaming entirely, at least for me. I eagerly awaited the arrival of the consumer version of the Rift.
When the Oculus Rift CV1 (Consumer Version 1) arrived in March of 2016, I initially balked at the price and decided to wait for a price drop. I wasn't gaming much at the time anyway. I usually game in "bursts", and then don't touch anything but the most casual iOS games for weeks or even months. So holding out for a cheaper Rift didn't seem too hard.
Oculus Rift or PSVR?
Come October, a neighbor of mine told me he had pre-ordered a PSVR. My initial plan was to wait until he got it, try it out for a bit and then maybe either get one too or get a Rift. Before I even got to try the PSVR, something clicked in me. Just through some general discussions about VR and watching some YouTube gameplay videos, my VR spark was relit. Since I already own a fairly capable gaming PC (Intel Core i5, GTX 980, 16 GB RAM), I decided to skip the whole PSVR evaluation and got an Oculus Rift CV1.
At that point, the Rift still shipped with an Xbox One controller instead of true VR hand controllers. But the Oculus Touch controllers had already been announced, due for early December. Looking at the PS4's dated "Move" wands and the Vive's awkward and bulky-looking donuts-on-a-stick, I felt pretty sure that the Oculus Touch would put the Rift well ahead of the competition and it would therefore be worth the wait.
Now that I've got both the Rift and the Touch controllers and have had a chance to try them out with a handful of games, it's time for some (obviously totally subjective) pros and cons.
Oculus Rift Pros
- Touch controllers are lighter and feel more natural than PSVR and Vive controllers
- Built-in on-ear headphones (one less cable dangling from your head, and they sound great as far as I'm concerned)
- Headset is lighter than PSVR and Vive, overall pretty comfortable
- Allegedly less screen-door-effect than on the Vive (I have no first-hand experience)
- Very good first-run experience and quick setup
- Ability to retrofit some 2D games with VR capability through vorpX (this also works with the Vive, but not with the PSVR)
Oculus Rift Cons
- Shorter headset cables (although I extended mine from 4 to 6 meters without any issues)
- You have to run USB cables from the sensors to your PC
- Slightly more expensive than the Vive (with Touch controllers and 3 sensors), much more expensive than the PSVR (unless you already have a gaming PC, which I did)
- Slightly smaller play area (not an issue for me, as the Rift's capabilities max out my available space)
VR Headset Wars
You might wonder why I'm not listing the Oculus Rift's (supposedly) "closed ecosystem" approach to games as a con. While I do acknowledge that this approach is controversial (although I don't seem to see as many complaints about Sony's equally closed approach), right now I don't think I'm missing out on much, as many games for the Vive are also playable on the Rift (and vice versa) through Steam VR.
Secondly, I also think that if VR is going to become a serious "thing", there will have to be incentives for developers to make great VR games. Because great games will get more people to buy VR headsets, which will lower prices, which will make VR more accessible, which will make more people buy the hardware and games,... and so on.
Some argue that VR headsets are just another peripheral like a keyboard or a mouse. I would argue that VR is a platform, one that makes a completely new and separate category of games and experiences possible. And as with any platform, it takes initial investment to get it to the tipping point where it goes from early-adopter niche gadget to mass-market household staple.
What that means is that to make VR a viable platform, it's going to take money. Money the platform owners need to invest to get the platform "over the hump". One way to approach that is through subsidized development, and that usually means platform exclusivity.
Right now VR headsets are a bit like consoles, in that you need so-called "system sellers" for consumers to consider making the investment in the platform. Without system sellers like "The Last of Us" or the "Uncharted" series, the PlayStation would probably have died by now.
The economics of VR are still significantly different from those of conventional PC or mobile games. Different rules apply, and things still have to shake out before we know whether the VR train is headed anywhere at all, and if so, where. Until then, it's up to the platform makers, game developers and early adopters to each play their part in establishing VR not just as a new platform, but as a viable business.
The Outlook for VR
I'm completely hooked on VR, but even I'm not convinced it will catch on, at least not enough to become a market big enough to support the development of big-budget titles. Because that's what VR is lacking right now: truly big-budget, large-scale games that can justify a $40+ price tag.
With very few exceptions, VR games currently fall into one of the following categories:
- Demo-like experiences with minimal interactivity
- Fully interactive games, but either short or small (in scope)
- Retrofitted: Non-native VR through vorpX ranging from "cool with compromises" to "unplayable"
It remains to be seen where the VR platform will go and how fast we'll get there. Right now it's too early to tell because even the platform makers and game developers are still experimenting with what works and what doesn't.
Here's my totally personal and therefore subjective take on what I think is holding back VR the most right now:
- Cost of entry (hardware and game prices)
- Lack of "blockbuster" games
- Setup (clearing the room for roomscale experiences, cabling, etc.)
- Comfort (all headsets get uncomfortable after a while)
- Motion sickness (for some people and some games/experiences)
I considered putting image quality on this list too. But while I think image quality definitely needs to (and, over the next couple of years, definitely will) improve, I think it's the least of VR's problems.
Yes, you have some "screen door effect" on the Rift and Vive (less so on the PSVR), the PSVR's (nominal) resolution is fairly low and the brightness of all headsets could be higher. But with the games I've played so far, I tend to not even notice these issues after a minute or two of gameplay.
VR games are so incredibly immersive that I tend to forget I'm even playing a game. The illusion is so perfect that I instinctively maneuver my arms around virtual obstacles, like the hills or buildings in Final Approach. I feel like I'm in the game, not looking at a screen that's displaying the game, and the hand controllers increase the immersion even more.
I think the biggest achievement of VR is that it removes a layer of abstraction, much like touch on smartphones and tablets removes the layer of abstraction inherent to a mouse and a keyboard. And with this loss of abstraction comes the loss of the sense that you're looking at something "artificial" and that you need to use unnatural devices to indirectly manipulate the environment you're seeing.
Instead, you feel like you're manipulating this alternate reality directly, and the fact that you're still seeing it through a (pair of) screen(s) fades away after a while. Additionally, since you're completely isolated inside the VR headset, whatever graphics quality you're seeing quickly becomes the new normal due to a lack of immediate reference. If anything, it's the other way around: when you take off the headset and pop back into reality, you think, at least for a second or two, how high-res, but ultimately dull, true reality is.
I don't know if VR will become a thing, but I sure hope it will. As enjoyable as computer games have been up to this point, they all pale in comparison to the immersiveness of VR.
VR certainly isn't for everyone or every game genre. But it's interesting - and speaks volumes about how much we're still fumbling around in the dark when it comes to VR - that genres you'd think would lend themselves best to VR (like first-person shooters) aren't ideal when you actually play them, while genres you wouldn't even have considered playing in VR, like platformers or real-time strategy, are not just playable, but a lot more fun once you're immersed in the middle of the action.
There's no doubt VR is still very much early-days and the landscape will look a lot different five years from now. I just hope VR will be able to establish itself as a viable platform and won't die an early death like 3D TV did. Given the amount of money I've spent on VR hardware and games, I'm betting on it.
I recently found myself in a situation where, during a merge, I wanted to keep a file in the target branch unchanged, i.e. ignore that file's version from the branch being merged in and keep the version already in the branch I was merging into. This proved to be a lot less straightforward than I would have thought.
I have to mention that I'm still more or less a Git noob. So the way I'm doing this might be totally wrong. Instead of the setup I describe below, it might make more sense to rethink my development workflow and how I manage my projects. But judge for yourself and if you have a better idea how to handle this, feel free to leave a comment.
To explain why this particular problem even needed to be solved, I'll start out by giving you an overview of the project I'm working on and how it's organized.
A partner and I are developing a theme for Shopify. From what I've learned so far, it appears to be good practice to do all your development in Git branches and continuously merge them into a master branch when they're "done". So each of us is working on a specific feature in a dedicated branch and then merging into the master branch. We've also set up a policy to the effect that the master branch is "merge-only", which means we never make changes to files in the master branch other than by merging another branch into it.
The requirement to keep files in master untouched by merges comes from the fact that, to develop Shopify themes, you need to continuously upload changed files to a development store to test your changes. You can't develop (i.e. test) a Shopify theme locally, as there's no way to render Shopify's Liquid templating language locally. So where you could just fire up a web server on your local development machine when developing HTML, PHP, etc., with Shopify you need to upload changes to Shopify's servers before you can check the results in your browser.
The way we've set up our development workflow is that for every Git branch we create, a corresponding theme is set up in our development store. So while I'm developing some feature or whatever in a particular Git branch, file changes get uploaded to a corresponding theme in the development store. Themes have an ID, and that ID is used by the upload tool we use to identify to which theme it should upload the changes.
The theme ID is kept alongside other settings in a file called config.yml in the root of the theme folder. Since the theme ID corresponds directly to a Git branch, the config.yml is version-controlled with Git, just like the theme files themselves. So every Git branch has a complete set of theme files alongside a config.yml file with the theme ID that corresponds to the appropriate theme in our development store.
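To make this concrete, here's a sketch of what such a config.yml might look like (keys and values are made up for illustration, not any specific upload tool's schema):

```yaml
# Hypothetical config.yml - keys and values are illustrative only
store: my-dev-store.myshopify.com
theme_id: 123456789
```

Each branch would carry its own version of this file, differing only in the theme_id.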
The problem we ran into was that when we merged a branch into master, the config.yml file would also get merged. So after merging, say, some-branch into master, the config.yml in master would now have the theme ID set to the theme of some-branch. But since the master branch should correspond to the master theme in the development store, if we uploaded the newly merged master to Shopify's servers, we would be uploading to the wrong theme.
So we needed some way of keeping the config.yml in the master branch untouched when we merged development branches into the master branch.
Git hooks and merge options to the rescue
After doing some googling, my initial plan was to use a merge driver. I don't fully understand this mechanism yet, but what I have learned is that it doesn't help in our particular case.
In short, and as far as I understand it, a merge driver lets you configure how you want merge conflicts to be handled by default. In our case, I set up merges of config.yml to always favor the "local" or target branch, i.e. master. In general, this approach would work, but only if there was an actual merge conflict. If merging the config.yml from a branch into the master branch could be achieved by a simple fast-forward merge, the merge driver didn't trigger. So I had to look for a different solution.
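For reference, a merge-driver setup along those lines looks roughly like this (the widely used "keep the local version" recipe; not taken from our actual repo):

```
# .gitattributes in the repo root:
config.yml merge=ours

# and in .git/config (set via `git config merge.ours.driver true`):
[merge "ours"]
	driver = true
```

The driver command `true` does nothing, which leaves the local version of the file in place whenever the driver is invoked - but as described above, it's only invoked when an actual content-level merge happens.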
The solution I came up with isn't exactly elegant. But it's the best (read: only) way I could find and despite not being elegant, it works.
The general approach is to configure Git to not auto-commit merges and then, after the merge but before the commit, restore the config.yml from the master branch and then commit the merge. What this effectively achieves is that all files except the config.yml file are merged normally, but config.yml always remains the unchanged "original" file from the master branch.
Here's how to set this up.
First, I had to set up Git so it doesn't auto-commit changes after a merge. I set this up only for the Git project in question, not globally. I work on non-Shopify projects too, and for those I want Git to work as usual. The command to prevent Git from doing an automatic commit for a project is this (run inside the project folder):
git config branch.master.mergeoptions "--no-ff --no-commit"
What this does is add the following settings to the project's .git/config file for the master branch:
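In my case, the added section looks like this:

```ini
[branch "master"]
	mergeoptions = --no-ff --no-commit
```

These options are then applied automatically to every merge into master.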
--no-commit means exactly what you'd expect: it prevents Git from automatically committing after a merge. When the merge completes, the changes to the files have already been made, but you can still roll them back selectively - which is exactly what we're going to do for config.yml in the next step, detailed below.
--no-ff stands for "no fast-forward". From what (little) I understand, this forces Git to always create a real merge commit, even when it could simply fast-forward. To be honest, I don't fully understand if this parameter is required in this case. But it doesn't seem to hurt, so I left it in (it's probably a good thing I don't program rockets or surgery robots).
Now that we've prevented Git from automatically committing the changes the merge already applied to our files, we need to use this "pause button" in the default merge process to insert a little magic that will restore the config.yml from the master branch.
The config.yml in the working tree has already been changed by the merge, but the changes have not been committed yet. So what we need to do now is restore the config.yml to the state we want and only then do we commit.
The config.yml we want is in the master branch. So all we need to do is restore it in our working tree to that state. This is achieved with a git reset HEAD config.yml, which unstages the changes made by the merge, and then a git checkout config.yml, which actually restores the config.yml in the working tree to its master branch state.
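Put together, the manual version of the whole merge looks like this. The sketch below is self-contained and runs in a throwaway repo; the file contents and the branch name some-branch are made up, only the config.yml trick itself is from this post:

```shell
#!/bin/sh
set -e

# Throwaway repo to demonstrate the merge
dir=$(mktemp -d)
cd "$dir"
git init -q
git symbolic-ref HEAD refs/heads/master   # make sure the initial branch is called master
git config user.email demo@example.com
git config user.name "Demo User"

# master: config.yml points at the master theme
echo "theme_id: 111" > config.yml
echo "v1" > theme.liquid
git add .
git commit -qm "initial commit on master"

# feature branch: its own theme ID, plus actual theme work
git checkout -qb some-branch
echo "theme_id: 222" > config.yml
echo "v2" > theme.liquid
git commit -qam "work on some-branch"

# merge into master, but stop before the commit
git checkout -q master
git merge --no-ff --no-commit some-branch

# restore master's config.yml before committing
git reset -q HEAD config.yml
git checkout -- config.yml
git commit -qm "merge some-branch, keeping master's config.yml"

cat config.yml    # still master's theme_id: 111
cat theme.liquid  # merged work from some-branch: v2
```

After the final commit, master has all the merged theme work, but its config.yml is untouched.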
Of course you could do all this manually. But that would be nuts. Fortunately, Git has a thing called hooks.
Hooks allow you to run shell scripts at predefined points in the Git workflow. In our case, the hook we need is the pre-commit hook. As the name suggests, it runs before every commit.
To set this up, I created a file called pre-commit in the project folder under .git/hooks and made it executable (chmod +x pre-commit). Here are the contents of this file:

#!/bin/sh
git reset HEAD config.yml
git checkout config.yml
Now, when I do a merge into master, the merge stops just before the commit. I then run a manual commit, which triggers the pre-commit hook; the hook resets the config.yml file to the version in the master branch, and everything else is committed as usual.
You might be asking yourself why the whole --no-commit thing is even necessary. A merge normally ends in a commit, so you'd think the pre-commit hook would run then too, right? Unfortunately, wrong. At least in my tests, the pre-commit hook did not run on the automatic commit at the end of a merge. It appears to only run on an explicit commit.
I've been using MailMate for years now and I'm a supporter of the ongoing development of MailMate 2.0 through the 2013 crowdfunding campaign. Today MailMate got a pretty big beta update that, among other, more important things, also includes a fresh new app icon and toolbar icons. Yes, we didn't have those before, at least not for all the toolbar buttons. It never really bothered me, but it's nice to see some attention paid to the GUI.
With this update MailMate's developer also introduced an interesting model to continue to support the ongoing development of MailMate. The original crowdfunding campaign for MailMate 2.0 raised roughly $42k, and that was all the way back in 2013. I think I paid $50 and have been using MailMate daily since then.
The new support model is called MailMate Patron: you pay $10 (or a multiple thereof) every 3 months. This doesn't get you a license - you still have to buy that as a one-off purchase (which I already did via the crowdfunding campaign). But if you want to help ensure MailMate will still be around for years to come, you can choose to pay this recurring patronage for however long you like. If you stop paying, you still keep your license.
Next to a web browser, my email client is probably the second most used app on my Mac. Unfortunately, the state of email clients on the Mac is generally a pretty sad one. Apple Mail is all fine and dandy, but it lacks many power features and is a bit too sluggish and GUI-heavy for my taste. All the other options available (AirMail, Thunderbird, Outlook and others) never really held up under closer inspection. I found all of them either to be too flimsy, bloated, unstable or awkward.
Given that it's an app I use hundreds of times a day and a major part of my daily business, supporting the future development of MailMate is a no-brainer for me. I signed up for $10 every 3 months - that's $40 a year for an app I'd really, really miss if it were to disappear. I think it's a bargain.
When he's released from jail after a 15-year sentence he got for stealing diamonds, a master thief seeks out his former partner and sweetheart, who has since settled into family life with another man in the small town of Banshee, PA. She's not happy to see him because she's hiding her former life from her family, he's not happy because she doesn't have the diamonds anymore. After some highly unlikely, yet highly entertaining and violent events, the thief finds himself in the position of Banshee's new sheriff and that's where the real story begins.
The first episode of this show left me doubtful whether I'd enjoy it. But episode by episode, the show, its characters and the setting grew on me, and now it's a candidate for my all-time top 10.
The story has logical flaws, and some parts of the plot seemed so ill-placed that I was a bit disoriented at times, wondering whether I was seeing a flashback (of which there are plenty) or had accidentally skipped an episode and was missing a part of the plot that would have made the episode I was watching make more sense.
But that somehow didn't lessen the show's appeal. The interesting, colorful (especially Hoon Lee's Job. Oh. My. Lord.) and well-drawn characters keep things moving along and I found myself liking even the bad guys on some level.
Be warned, though, this show has lots of violence, some of it made me squirm, and I don't squirm easily. It fits the show well, though. It doesn't seem like it's just there for show.
All in all, this one's a keeper, one that I'll probably watch at least once a year.
I think it's safe to say that any kind of advice is usually understood as a general recommendation, one that is supposed to deliver a certain result when put into action. So most advice could probably be summarized as "Do [action] to get [result]."
But when you consider that lots of advice is just the result of the experience of the person dispensing the advice, wouldn't it be more accurate to see advice more along the lines of "This is what worked for me, your own results may vary?"
Everyone's experiences will vary based on personal, societal, economic, political, temporal and many other factors. So assuming that actions taken by one person will have the same results for any other person, regardless of all these factors, is pretty naive. Furthermore, how closely someone is willing and able to actually follow the advice will also have an impact on the results.
My personal formula for evaluating any kind of advice is this:
Value of advice in % = (action x accuracy x applicability)/10000
To explain: Advice is useless without action. So I multiply the action I take by how closely I follow the advice and by how applicable the advice is to my personal situation. All parameters accept values between 0 and 100, and I divide the result by 10,000 to get a percentage value.
Of course this isn't meant to be an actual mathematical formula. It's basically impossible to come up with accurate values for each parameter. But even with some "educated" guesses, I think it can help me make a more realistic assessment of how valuable a piece of advice actually is to me, and adjust my expectations accordingly. At the very least, it will make me think more closely about advice I receive rather than simply taking it as gospel.
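Plugged into a quick shell sketch (the scores below are made-up examples, not measurements of any real advice):

```shell
# Made-up scores between 0 and 100 for one piece of advice
action=80         # how much of the advice I actually acted on
accuracy=60       # how closely I followed it
applicability=90  # how well it applies to my situation

# (action x accuracy x applicability) / 10000, truncated to an integer
value=$(( action * accuracy * applicability / 10000 ))
echo "Value of advice: ${value}%"   # prints "Value of advice: 43%"
```

Even great-sounding advice scores poorly if I barely act on it: with action=10 and everything else at 100, the result is only 10%.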
Feel free to run this very advice through that same formula and share your results in the comments ;)