The Stochastic Game

Ramblings of General Geekery

The iOS-ification of hardware

It’s been an interesting week. Apple announced some new MacBook Pros and everybody’s unhappy in the Apple blogosphere – something I wasn’t sure could happen anymore. Just look at Michael Tsai’s roundup and be amazed. All those people are unhappy because they finally realized Apple doesn’t care about “pro” users. Apple effectively made a new version of the MacBook Air, but called it “Pro”, and that’s obviously not a great move.

It’s not a great move because it means a lot of compromises. A shitty keyboard. Mediocre specs. No useful ports. Some people are getting dragged into the wrong debate, discussing how Apple designs for the future, but the reality is different. Will DSLRs use USB-C keys to store photos? Will network switches use USB-C for connections? What kind of future does Apple have in mind where you won’t need adapters and dongles to efficiently transfer gigantic RAW pictures onto your NAS or other safe storage?

The truth is that this has been coming for a long time. This is a company that killed their pro-sumer photo editing software Aperture and replaced it with the family-friendly Photos. The company that crushed their pro video editing software into Final Cut Pro X and only looked back when petitions grew big. The company that is, slowly, inexorably, removing or hiding pieces of macOS’s underlying Unix system. The company that, over the years, removed customers’ ability to hack their own machines, whether it’s just replacing the battery, RAM, or hard-drive in a notebook, or more important upgrades like replacing the CPU or GPU in a desktop machine.

At first this was all dismissed as “reasonable” compromises because hey, look how sleek those Macs look and work compared to the competition… but more and more people started getting annoyed. And now maybe we’ve hit some kind of point of no return? It does look like the majority (or a minority that’s much more vocal than before) is saying “that’s enough”.

I wonder if it’s too late. Apple has made 1/6th of the keyboard into a touch screen, while the remaining keys are slowly disappearing into the frame – it’s a matter of time until that keyboard is so flat that they have no problem replacing it with a giant touch screen. Actually, hardware is becoming so integrated that I wouldn’t be surprised if next year they were announcing a yearly subscription for MacBooks, similar to the one for iPhones. You were licensing your media, and then you were licensing your software – soon you’ll be licensing your hardware. And all the while they’ll continue their (timid, for now) attempts at hiding the file-system from users. Phil Schiller may say now that they will never merge macOS and iOS together but that doesn’t mean they can’t replicate the iPhone’s success formula with Macs… and it wouldn’t be the first time an Apple exec (or any exec for that matter) flat out lied about something they were doing.

I wonder if it’s too late. Even Microsoft is doing Apple-ish stuff. Their new Surface Studio looks amazing, but instead of being a monitor you plug into a computer you can replace or upgrade, it’s an all-in-one, tightly integrated system. Either you’re rich, or you learn to live with the same specs for 6 years.

I wonder if it’s too late. Tim Cook thinks you can replace PCs with iOS devices, and that the iPad Pro is the “future of personal computing”. Sure, he’s probably talking about the average, mass-market customer here, but that tells you all you need to know about where Apple’s focus is. Apple’s focus is not on the million-dollar markets anymore. It’s on the billion-dollar ones. They’ve tasted absolute power and boy, did it taste sweet.

I wonder what Apple programmers will have on their desk in 5 years… Maybe that’s what will keep Apple in check eventually – can they build software and cloud services on average consumer hardware?


Inside Star Citizen

Speaking of fallen video game superstars, I also recently finished reading through Kotaku UK’s various impressively thorough articles about Star Citizen.

There’s not much to say except that, even before the Kickstarter campaign ended, half of us backers knew it would be a shit show. It’s just fascinating to see how exactly the shit show is going – from the totally dysfunctional project and scope management to the size of Chris Roberts’ balls for selling non-existent digital items for several hundred dollars… with the nice addition of fans so extreme they can make an Apple or Linux fanboy look balanced.

I personally backed Star Citizen for the same reasons I backed Richard Garriott’s Shroud of the Avatar: as a big “Thank You” for having made, in the past, some of my all-time favourite games. I mean, I was so in love with Wing Commander that I wrote my school notes in its iconic font for several weeks after finishing it. And I still have, to this day, the t-shirt that came with the awesome collector’s edition of Wing Commander III… but those new games? Meh. Star Citizen was suspicious from the moment I learned they were using CryEngine. Shroud of the Avatar’s use of separate zones with loading screens (probably because of limitations with Unity’s streaming features) and antiquated UI made it vastly unappealing to me – although I give it a try once every few months to see the progress.

Hey, at least, in terms of pure entertainment, we can’t say we didn’t get some of our money’s worth with Star Citizen 😉


The Rise and Fall of Peter Molyneux

I recently discovered Kim Justice’s YouTube channel about video game history, starting with this video on legendary publisher Psygnosis, and I quickly ended up watching this epic, 4-part Peter Molyneux series:

One thing stuck out for me: Molyneux’s obsession with creating “living worlds”, i.e. games where you’re free to do many things (plant trees, build a house, have kids) and choose many paths (be good, be evil, choose this or that in each situation), and all the while witnessing the consequences of such acts. He’s not the only one trying to do this in video games, but he’s probably the one who tried it the most – or at least talked about trying it the most.

Technically speaking, this is a potentially fascinating problem. Will video game RPGs have to implement advanced AI and machine learning techniques for the game to truly react to your actions? Maybe. Hey, who knows, maybe Fallout 9 will be where the first sentient computer program emerges, after some guy in North Carolina has played it for 7 hours straight or something. But I’m wondering – is that even the point? Should video game designers strive for this kind of “perfect” sandbox experience? Or are they just working in the wrong medium?

There’s already a type of game where you’re free to do whatever you want, and the game world reacts accordingly – not only in a logical or plausible way, but also a narratively interesting way: tabletop, pen & paper RPGs… or, you know, just “RPGs”, as we called them back in the day1. If you’re writing a comic book while covering all pages with descriptions and inner monologues, maybe you should be writing a novel instead… and if you’re struggling to make a video game where you can do whatever you want, maybe you should be writing RPG books?


  1. Damn you, video game RPGs – especially JRPGs, which have close to zero “RP” in their “G”. ↩︎


Saturday Morning

There are shows I’d rather my kids didn’t binge-watch. A show like Dragon Ball – which originally aired weekly, six months a year – is supposed to evolve with its audience. But if my kid watches 7 or 8 episodes a week because that’s all he ever wants to see when he gets TV privileges, it would take him only a few months to end up in front of the teenage power fantasies of the Saiyan Saga.

Enter SaturdayMorning, my little week-end coding project.

Say your kids watch stuff on Plex or Kodi or whatever. You can remove all the episodes of the show they’re watching by putting them in some separate folder, out of your HTPC’s reach. Then you use SaturdayMorning to bring the video files back, one by one, every weekday or every Saturday or whatever you want.

With only one new episode ahead of them, you may find that your kids ask for TV slightly less often, diversify their shows, and/or get more excited about a “new” episode being available to watch.

You can head over to the SaturdayMorning website, or to the GitHub repository.


Richard Garfield’s A Game Player’s Manifesto

Richard Garfield, widely known as the creator of Magic: The Gathering, recently posted this “Game Player’s Manifesto” against what he calls “skinnerware”:

I believe that in recent years, while looking for revenue models that work for electronic games, game designers and publishers have stumbled upon some formulae that work only because they abuse segments of their player population. Games can have addictive properties – and these abusive games are created – intentionally or not – to exploit players who are subject to certain addictive behavior.

It’s a good read, as Garfield tries to formalize what’s OK and not OK in games, with clear guidelines about gameplay aspects that make a game become “skinnerware”, while still allowing some gray areas. Of course, many people were quick to point out that his own game, Magic, falls, at least, into these gray areas. After all, a certain percentage of Magic players are known to spend huge amounts of money to acquire rare cards, and, generally speaking, buying more packs gives you better cards, which gives you some advantage.

What saves Magic from the skinnerware category, in my opinion, is largely that it’s a physical game, not a video game1, so whatever you buy still has value and can be sold back. The other thing is that its “power-ups for money” mechanism is not quite open-ended. True skinnerware games typically let you buy an endless amount of coins or jewels or energy charges or whatever. Magic, on the other hand, has a limited (although quite big) catalog of cards. Trying to get them all by buying booster packs gets you diminishing returns fast because of the rarity of many cards, so you would quickly turn to individual purchases at market price. It’s still a shitload of money, but it’s a finite shitload.
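Those diminishing returns are just the classic coupon collector’s problem. Here’s a quick simulation sketch – a deliberately simplified model where every card in the set is equally likely (real rarity tiers make the drop-off even steeper):

```python
import random

def new_cards_per_pack(set_size: int, cards_per_pack: int,
                       packs: int, seed: int = 42) -> list[int]:
    """Open `packs` random booster packs and count how many NEW cards each adds."""
    rng = random.Random(seed)
    owned: set[int] = set()
    gains = []
    for _ in range(packs):
        pack = {rng.randrange(set_size) for _ in range(cards_per_pack)}
        gains.append(len(pack - owned))  # cards in this pack we didn't own yet
        owned.update(pack)
    return gains

gains = new_cards_per_pack(set_size=300, cards_per_pack=15, packs=40)
# The first few packs are almost all new cards; by pack 40, most of what
# you open is duplicates, so completing a set via packs gets very expensive.
```

Each pack costs the same, but late packs contribute almost nothing new – which is exactly why buying the last few cards individually beats buying more boosters.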


  1. Although there’s a digital version you can play. ↩︎


Live Asynchronously

Quincy Larson of FreeCodeCamp recently posted an article about work productivity:

Last year I turned off all my notifications. I stopped booking meetings. I started living asynchronously.

Now instead of being interrupted throughout the day — or rushing from one meeting to the next — I sit down and get work done.

Using one of the most awesome webcomics on the subject of interrupting a programmer as a starting point, he makes the usual attempts at convincing people that open floor plans are bad, and that meetings are better replaced by asynchronous communication.

Offices and emails

I’ve never had even the slightest opportunity to get my own office1, so I frankly have no idea whether a private office would be an improvement – I just don’t know any better.

We do a fair bit of asynchronous communication, however. This is pretty much unavoidable since, over here on the Frostbite Engine team, we have to deal with customers and co-workers spread across a dozen places on Earth, with up to 9 hours of time difference.

In some ways, however, it’s funny that Larson recommends replacing meetings with emails, since a lot of my coworkers already complain about having to deal with too much email. Also, the way he describes how a “quick” email conversation can replace a lengthy meeting is misleading: with everyone having turned off all notifications and checking email only a couple of times a day to improve productivity, this “quick” 4-message back-and-forth would actually take 2 days to complete.

In the zone

The part that caught my eye the most is the one about reaching a “flow state” – something most people call being “in the zone”.

I have almost no problem reaching that state – even in an open floor plan.

Arguably, I’m not important enough to receive enough emails or meeting invites to experience the problems a lot of other people (most of them more senior than me, I assume) complain about, so that must help… but I basically get “in the zone” often enough that, on a regular basis, I finish a task, take off my headphones, and realize that it’s 2pm and that everybody had lunch already.

While most people use the Pomodoro technique to help protect themselves from distractions, I was, for some time, using that technique to help me take a break every now and then… because being “in the zone” for too long would frequently give me painful migraines (at least once a week). Even when I used Pomodoro timers on my phone, I would frequently not notice them going off!

Then again, I’m one of those people that most of you probably hate: the ones who can fall asleep in less than 5 minutes. So I suppose my brain and I really get along well when it’s time to shut off distractions. Yay brain.


  1. Video game companies are almost all using open floor plans, and nothing will change that any time soon. ↩︎


Missing The Point

It’s September 2016, and Apple showed once again some pretty cool hardware: dual cameras, clever asymmetrical core design, water resistance, blah blah. I’m not interested since I already have the very recent 6S (I’m not that rich or desperate) but it’s a very nice piece of technology.

The change that will create the most ripples across the rest of the market, however, is, I think, the removal of the headphone jack. Actually, scratch that. The removal in itself is not that important – it’s what they replaced it with that’s important. Yet, 90% of the press gets hung up on the removal.

I think they’re all missing the point.

The jack port removal itself is a temporary matter. It’s going to be very annoying for those of us whose audio needs are not limited to “one phone and one pair of headphones”, but it will be temporary. Hopefully.

I don’t imagine Lightning port headphones will take off – as a manufacturer, you’d have to be crazy to invest in a proprietary connector that requires licensing fees you never had to pay before, and that would prevent you from selling your products to half of the market. Plus, even low-cost manufacturers are already able, to some degree, to produce relatively cheap Bluetooth headphones. So that’s where the market will go, and where Apple wants to go anyway.

The real problem is that in my opinion Apple opened a can of worms with their wireless headphones: they run with a proprietary “secret sauce” layer on top of Bluetooth. Some people are worried about the potential for DRM but I’m mostly wondering if we’ll see some kind of “wireless protocol war” starting in the next couple years.

Right now, Apple’s “secret sauce” is supposed to be backwards compatible with normal Bluetooth devices, but you know how these things go. The proprietary layer will get bigger with each new release – I’m even expecting that you’ll soon have to download firmware updates for your headphones on a regular basis – but all those cool features will create envy. You can bet that someone like Samsung will come up with their half-assed version of another proprietary layer on top of Bluetooth, as a “me too” feature. Maybe there will be a couple of those out there. Some of those implementations may have some kind of DRM, added under pressure from the movie or music industry, in exchange for some short-term IP, marketing, or financial boost.

Eventually the Bluetooth SIG will try and draft some new version of Bluetooth that tries to fix all the basic problems that really should have been fixed before anybody decided to remove the jack port… and meanwhile, Apple has a 5+ year lead on wireless technology, keeps growing their accessory licensing revenue, and is laughing at how everybody else is still having trouble pairing headphones correctly. It’s like the dark ages of the W3C all over again, for audio.

So yeah, Apple is really clever here. I’ve got no doubt iPhone users will be buying increasingly more “W1” enabled headphones from approved manufacturers… it’s a smart move. But not a courageous one. Courage would be to open-source their Bluetooth layer. Courage would be to work with the Bluetooth SIG (which they’ve been a member of since last year) to improve wireless audio for everyone.

Hopefully Apple finds some real courage soon.


PieCrust 2.0rc2

PieCrust 2.0rc2 was published a couple days ago, and it’s mostly a bug fix and clean-up release, as you might expect. Run your pip install piecrust --pre -U to update, or whatever you do usually. More info after the jump.

Since this release mostly contains bug fixes, there are only a few small user-facing changes to discuss here.

SFTP publisher

Are you using the recent new publishing features? You should! Now there’s an SFTP publisher. See the publishers reference for details.

Simplified URL functions

I did some simplification of the way URL routing is handled. The biggest change is that URL functions now don’t need to specify their parameters. Here’s how it works:

  • You define a custom source… for instance the PieCrust documentation site defines 2 separate sources, one for the user documentation and one for the API documentation.
  • In the routes defined for those 2 sources, you can specify a func property which is the name of a function available through the template engine to generate URLs for these routes. In the aforementioned documentation configuration, these functions are docurl and apiurl (so one documentation page can link to another by writing {{docurl('something/else')}} for example).
  • The parameters to pass to the URL function are the same as the placeholders in the URL itself… so if the route is /blog/%year%/%slug%, you need to pass the year and the slug into the function ({{posturl(2016, 'my-new-post')}}).
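Put together, the configuration for the documentation example above might look something like this – a sketch from memory rather than a copy of the real site’s config, so double-check the exact key names against the routes documentation:

```yaml
site:
    sources:
        docs: {}   # user documentation pages
        api: {}    # API documentation pages
    routes:
        - url: /docs/%path:slug%
          source: docs
          func: docurl
        - url: /api/%path:slug%
          source: api
          func: apiurl
```

A page can then link with {{docurl('something/else')}}, passing one argument because the route has one placeholder.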

You can see the (updated) documentation on routes for more details.

Merged taxonomy terms

By default, PieCrust “slugifies” your taxonomy terms (like tags) using URL encoding, so that something like “étrange” (“strange” in French… note the accented first letter) is transformed into “%C3%A9trange” (which your browser will properly show as “étrange”).

However, you can also tell PieCrust to do different slugification1, with the site/slugify_mode configuration setting. So say you set it to lowercase,transliterate… it will transform your tag names to lowercase and replace accented and non-latin characters with their closest equivalent. In this case, “étrange” gets slugified to “etrange” (without the accent), and “Technology” gets replaced with “technology” (lowercase).
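To illustrate what the lowercase,transliterate mode does, here’s a rough equivalent in plain Python using only the standard library – not PieCrust’s actual implementation, just the general idea:

```python
import unicodedata

def slugify(term: str) -> str:
    """Lowercase a term and replace accented characters with their closest
    ASCII equivalent (a rough take on `lowercase,transliterate`)."""
    # NFKD decomposition splits "é" into "e" plus a combining accent mark...
    decomposed = unicodedata.normalize("NFKD", term)
    # ...which the ASCII encoding step then drops.
    ascii_only = decomposed.encode("ascii", "ignore").decode("ascii")
    return ascii_only.lower()

print(slugify("étrange"))     # etrange
print(slugify("Technology"))  # technology
```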

Of course, it means that 2 slightly different tags can resolve to the same slugified tag – for example, “étrange” and “Etrange”. In this case, PieCrust will now warn you that you spelled things differently in some pages, but will combine both – i.e. the page listing for the term “étrange” will also list the posts where you spelled it “Etrange”.

As always, the full list of changes is available on the CHANGELOG.


  1. Yes, that’s totally a word I made up. ↩︎


Wireless is better than wired

Gruber, about Apple probably removing the jack and maybe shipping wireless earbuds by default with the next iPhone:

[…]not that one port is better than another, but that wireless is better than wired.

It’s not that wireless is better than wired – it’s that Bluetooth (as it’s widely speculated that Apple would still stick to that technology) is a far better alternative than something proprietary like the Lightning port. The complete absence of the “proprietary vs. compatible/open” considerations from the Apple bloggers is not only baffling but incredibly worrying to me. If Apple was to use, say, AirPlay, that would still be a terrible choice.

A transition towards Bluetooth as the replacement to the venerable jack port would be tolerable in the short term, and would maybe (hopefully) drive improvement on how OSes and devices work with it – for instance, the experience of switching your wireless headphones between devices is far from ideal, and that’s a very common scenario for me, almost on a daily basis. But frankly, I’m not looking forward to having to manage yet another battery level, and having headphones become obsolete with wireless protocol updates.


Baking faster with hoedown

Update: the times reported by the Bench utility are CPU times, i.e. they represent the time spent working by your various CPUs. The “real/wall” time, i.e. the time you effectively have to wait as a user, is usually a third less than that. So the “real” time for my blog went roughly from 7 seconds to 5 seconds.

In a previous blog post about PieCrust performance, I mentioned how static site generators are dependent on the performance of their formatting and templating libraries. Among the most common formatters are Markdown formatters and, by default, PieCrust uses Python Markdown. It’s the easiest one to install and use, but it’s far from the fastest.

As far as I know, the fastest one that’s still maintained is Hoedown, for which some Python bindings exist. And if you have a recent enough version of PieCrust, there will be support for a Hoedown formatter, as long as you install Hoedown. You can do that by running pip install hoedown.

Once installed, you can replace Markdown with Hoedown by writing this in your config.yml, or in a config variant:

site:
    default_format: hoedown
    auto_formats:
        md: hoedown

Any extensions you have declared for the Markdown formatter generally also translate directly to Hoedown:

hoedown:
    extensions: [fenced_code, footnotes, smartypants]

The performance increase can be pretty noticeable. For instance, on my ancient MacBook Pro (2.4GHz Core 2 Duo), this blog takes almost 9 seconds to bake1.

With Hoedown, the time goes down to 7.2 seconds.

Pretty worth it if you ask me! Now most of the time spent baking happens during templating with Jinja2… time to look for a faster alternative?


  1. I used the Bench tool to generate those reports. ↩︎