I’m back from some travels – plural, which is extremely rare for me.
New York City (first-time visit) followed by the usual annual trip to Stockholm
for EA’s Frostbite DevDays conference, where various game devs from the company
converge from all around the world to chat and drink.
EA is a weird company in the sense that, for a video game company, people tend
to stay there for very long stretches of time. It’s very common to talk to
people who have been at EA for more than 10 years – at EA Vancouver, DICE,
Bioware, whatever. This makes it difficult to find new points of view on
technical problems… although, well, maybe it’s the same in many other big
companies like Activision or Ubisoft, I don’t know.
Either way, I was happy to meet several people who not only have shipped AAA
games at other companies, but also worked on those games’ cameras – which is my
current area of interest, being in charge of the Frostbite Camera System.
Finding people who work on (and care about!) cameras is a challenge to begin
with, seeing how little infrastructure and long-term investment generally go into that
crucial aspect of any game (more on that in a future post), so I’m pretty happy
with this year’s conference for that, at least.
It’s been an interesting week. Apple announced some new MacBook Pros and everybody’s unhappy in the Apple blogosphere – something I wasn’t sure could happen anymore. Just look at Michael Tsai’s roundup and be amazed. All those people unhappy because they finally realized Apple doesn’t care about “pro” users. Apple effectively made a new version of the MacBook Air, but called it “Pro”, and that’s obviously not a great move.
It’s not a great move because it means a lot of compromises. A shitty keyboard. Mediocre specs. No useful ports. Some people are getting into the wrong debate, discussing how Apple designs for the future, but the reality is different. Will DSLRs use USB-C keys to store photos? Will network switches use USB-C for connections? What kind of future does Apple have in mind where you won’t need adapters and dongles to efficiently transfer gigantic RAW pictures onto your NAS or other safe storage?
The truth is that this has been coming for a long time. This is a company that killed their pro-sumer photo editing software Aperture and replaced it with the family-friendly Photos. The company that crushed their pro video editing software into Final Cut Pro X and only looked back when petitions grew big. The company that is, slowly, inexorably, removing or hiding pieces of macOS’s underlying Unix system. The company that, over the years, removed the ability of customers to hack their own machines, whether it’s just replacing the battery or RAM or hard-drive in a notebook, or more important upgrades like replacing the CPU or GPU in a desktop machine.
At first this was all dismissed as “reasonable” compromises because hey, look how sleek those Macs look and work compared to the competition… but more and more people started getting annoyed. And now maybe we’ve hit some kind of point of no return? It does look like the majority (or a minority much more vocal than before) is saying “that’s enough”.
I wonder if it’s too late. Apple has made 1/6th of the keyboard into a touch screen, while the remaining keys are slowly disappearing into the frame – it’s a matter of time until that keyboard is so flat that they have no problem replacing it with a giant touch screen. Actually, hardware is becoming so integrated that I wouldn’t be surprised if next year they announced a yearly subscription for MacBooks, similar to the one for iPhones. You were licensing your media, and then you were licensing your software – soon you’ll be licensing your hardware. And all the while they’ll continue their (timid, for now) attempts at hiding the file-system from users. Phil Schiller may say now that they will never merge macOS and iOS together, but that doesn’t mean they can’t replicate the iPhone’s success formula with Macs… and it wouldn’t be the first time an Apple exec (or any exec, for that matter) flat out lied about something they were doing.
I wonder if it’s too late. Even Microsoft is doing Apple-ish stuff. Their new Surface Studio looks amazing, but instead of being a monitor you plug into a computer you can replace or upgrade, it’s an all-in-one, tightly integrated system. Either you’re rich, or you learn to live with the same specs for 6 years.
I wonder if it’s too late. Tim Cook thinks you can replace PCs with iOS devices, and that the iPad Pro is the “future of personal computing”. Sure, he’s probably talking about the average, mass-market customer here, but that tells you all you need to know about where Apple’s focus is. Apple’s focus is not on the million-dollar markets anymore. It’s on the billion-dollar ones. They’ve tasted absolute power and boy, did it taste sweet.
I wonder what Apple programmers will have on their desk in 5 years… Maybe that’s what will keep Apple in check eventually – can they build software and cloud services on average consumer hardware?
There’s not much to say except that, even before the Kickstarter campaign ended, half of us backers knew it would be a shit show. It’s just fascinating to see how exactly the shit show is going – from the totally dysfunctional project and scope management to the size of Chris Roberts’ balls for selling non-existent digital items for several hundreds of dollars… with the nice addition of fans so extreme they can make some Apple or Linux fanboy look balanced.
I personally backed Star Citizen for the same reason I backed Richard Garriott’s Shroud of the Avatar: as a big “Thank You” for having made, in the past, some of my all-time favourite games. I mean, I was so in love with Wing Commander that I wrote my school notes in its iconic font for several weeks after finishing it. And I still have, to this day, the t-shirt that came with the awesome collector’s edition of Wing Commander III… but those new games? Meh. Star Citizen was suspicious from the moment I learned they were using CryEngine. Shroud of the Avatar’s use of separate zones with loading screens (probably because of limitations with Unity’s streaming features) and antiquated UI made it vastly unappealing to me – although I give it a try once every few months to see the progress.
Hey, at least, in terms of pure entertainment, we can’t say we didn’t get some of our money’s worth with Star Citizen ;-)
One thing stuck out for me: Molyneux’s obsession with creating “living worlds”, i.e. games where you’re free to do many things (plant trees, build a house, have kids) and choose many paths (be good, be evil, choose this or that in each situation), and all the while witnessing the consequences of such acts. He’s not the only one trying to do this in video games, but he’s probably the one who tried it the most – or at least talked about trying it the most.
Technically speaking, this is a potentially fascinating problem. Will video game RPGs have to implement advanced AI and machine learning techniques for the game to truly react to your actions? Maybe. Hey, who knows, maybe Fallout 9 will be where the first sentient computer program emerges, after some guy in North Carolina has played it for 7 hours straight or something. But I’m wondering – is that even the point? Should video game designers strive for this kind of “perfect” sandbox experience? Or are they just working in the wrong medium?
There’s already a type of game where you’re free to do whatever you want, and the game world reacts accordingly – not only in a logical or plausible way, but also a narratively interesting way: tabletop, pen & paper RPGs… or, you know, just “RPGs”, as we called them back in the day¹. If you’re writing a comic book while covering all pages with descriptions and inner monologues, maybe you should be writing a novel instead… and if you’re struggling to make a video game where you can do whatever you want, maybe you should be writing RPG books?
¹ Damn you, video game RPGs – especially JRPGs, which have close to zero “RP” in their “G”. ↩
I have shows for my kids that I’d rather they didn’t binge-watch. For example, a show like Dragon Ball – which aired weekly, six months a year – is supposed to evolve with its audience. But if my kid watches 7 or 8 episodes a week because that’s all he ever wants to see when he gets TV privileges, it would take him only a few months to end up in front of the teenage power fantasies of the Saiyan Saga.
Say your kids watch stuff on Plex or Kodi or whatever. You can remove all the episodes of the show they’re watching by putting them in some separate folder, out of your HTPC’s reach. Then you use SaturdayMorning to bring the video files back, one by one, every weekday or every Saturday or whatever you want.
With only one new episode ahead of them, you may find that your kids ask for TV slightly less often, diversify their shows, and/or get more excited about a “new” episode being available to watch.