The Tomb Raider I’d like

(Image source: Core Design Tribute Fan Site.)

This year, the Tomb Raider franchise turned 25, and it’s only by chance that I even found out about that. I’ve been a fan ever since I was a teen, but before this year, I hadn’t touched anything Tomb Raider in quite a while. About two months ago, I reluctantly decided to try out Legend for the third time, and I fully expected to put it down without completing it for the third time.

Little did I know that I would get so hooked that, in addition to finishing the whole LAU trilogy, trying out (and hating the guts of) Optional Tomb Raider, and beginning a playthrough of all the classic games, I’d start following several related Twitter accounts, YouTube channels, and the /r/tombraider subreddit, and even join a forum. (That’s how I found out about the 25th anniversary.)

Attitude, young lady!

The first time I tried Legend was in 2012, when I bought LAU. I think I quit after a couple of levels at most. The second time was about five years later, when I pushed all the way to the first Kazakhstan level and then quit, not particularly pleased with the experience. There were very minor things irking me, but the real reason I quit both times was just one: Lara was too nonchalant about murdering people.

In Anniversary (a prequel), Lara did show remorse and shame for taking a life, but then I guess it kind of grew on her? (Credit: David Angle.)

If you take a look at things like this or this, you’ll see that I’m not a great fan of death in general, let alone of murder. Yes, I know—it’s just a game, and telling that to myself enough times was how I managed to overcome my aversion and enjoy LAU. Besides, Lara never made a big deal out of killing people in the classic games either, and I played almost all of them, so why was this a problem now?

Well, in Legend, it was different.

I distinctly remember my first encounter with enemies in the very first level of Legend. They were both armed and didn’t exactly come across as very friendly people, but they were talking about their own business and hadn’t even seen me yet. At that point, Lara didn’t even know for a fact who they were. I knew that, had I tried to just sneak past them, they’d probably open fire, so I did what Lara does and killed them. While I did that, she and her friends back in London were amiably talking over the phone about Lara’s new quest and even cracking jokes. I found that off-putting enough that I quit playing.

She literally said she didn’t have the foggiest clue who he was before killing him. Also, that he was unremarkable. Well, that’s kinda rude… (Screenshot from SourceSpy91’s video.)

As I said, when I gave the game a second chance, I quit early in Kazakhstan, but I think what really struck me as bad taste was something Lara said in the Japan levels. The local Japanese mafia boss she was facing was understandably complaining about her killing his henchmen, to which Lara replied: “I’ve simplified your payroll.”

The video starts just before the payroll quote. It’s a very witty punchline, but it sort of makes her come across as so self-important that she gets to decide who lives and who dies.

She’s a badass, I get it. I like her that way. However, there’s a difference between being a badass and trivialising murder, even that of criminals. I still love LAU and all the preceding titles (most definitely not the reboots, and not just because they take the concept of mass-murder to a whole new level), but I think I would like them and Lara more if she wasn’t so casual about killing people.

If I could do it…

Before anyone plays the sexism card: this doesn’t have anything to do with her being a woman. I don’t play FPSs or war games for the same reason (also, I find them boring as hell), regardless of the genitals that their main characters were born with. I like Lara Croft as a character, and I think she could be a more positive character if she dropped the cavalier attitude when it comes to gunning other living creatures down. Yes—when she kills animals, that bugs me too.

Toby Gard. I think he’s a very interesting chap. Sooner or later, I need to reach out to him… (Credit: Jon Jordan, CC Attribution 2.0 Generic.)

I’m not alone in this: Lara’s father himself, Toby Gard, said in a Gamasutra interview that he’s “not keen on just mindlessly killing humans in games.” That was one of the reasons why the first Tomb Raider had so few human enemies. (Their number went up significantly in subsequent installments, but by then Gard had left the team, having quit shortly after Tomb Raider was released.)

Not that I count the survivor timeline as Tomb Raider, but even if I did, it certainly didn’t solve that problem. If anything, the survivor games exacerbated it, and the jarring dissonance between reboot Lara’s careless brutality in the gameplay and her relentless whimpering during the cutscenes made it even worse. Classic Lara may be too casual about murder, but at least she is consistent.

Currently, Square Enix and Crystal Dynamics seem to be a little too busy stuffing square pegs into round holes, so I don’t hold out much hope. However, on the off chance that anyone from SE or CD looking for ideas ever reads this, here are a couple of suggestions for a nearly murder-free Tomb Raider game.

…I’d do it like this.

The first thing to keep in mind is that Tomb Raider was created with the exploration of ancient, lost tombs in mind, and typically, the people you might chance upon in such places are already dead. I also wouldn’t expect to find too many dangerous live animals in an ancient tomb where no one has set foot in ages—especially not in tiny, locked crypts with no food, water, or air. (Finding usable medikits or ammo for just the kind of guns you happen to have with you is also not very likely, but at least it’s not logically impossible. Maybe Lara is just lucky like that.)

The Obelisk of Khamoon level in Tomb Raider (1996). Those pumas (panthers? I don’t really know) just came out of a locked chamber with a floor area of maybe three or four square metres. I have so many questions my head is going to explode. (Screenshot from Kawaii Games’ video.)

Naturally, Lara Croft without her trademark dual pistols would be just as much of a heresy as Super Mario without his mustache (corollary: reboot Lara is not Lara), but having no humans or animals to kill doesn’t mean having nothing to kill.

This might not be everyone’s cup of tea, but I’d be absolutely thrilled about a Tomb Raider game where I can lose myself in a mysterious, ancient temple or tomb to explore, knowing that supernatural creatures (like the thralls in Underworld, or the mummies in The Last Revelation) might be lurking behind every corner. Importantly, enemies like that are unrealistic enough that I would have no qualms about gunning them down, nor would I mind Lara’s witticisms about it (“This is a tomb: I’ll make them feel at home” just doesn’t sound that funny to me when it refers to people.)

Shoot them to your heart’s content, Lara. They’re unrealistic enough to be disposable. (Screenshot from Games 4k’s video.)

In contrast, human enemies that fire guns at you are simply trite. If I wanted that, I’d play Call of Duty or some other FPS. They also don’t fit very well in the kind of game I’m describing, and they detract from the very feeling of isolation that helped make the first game so enjoyable.

However, ruthless human enemies who wouldn’t hesitate to fire on Lara may still make sense in a number of plots—for example, one where they’re trying to find a relic before she does. Even in those cases, there is a way for Lara to take the high ground while pursuing her objectives and keeping the game entertaining: Batman’s way.

She doesn’t need to go around wearing a cape or doing the Christian Bale voice (entertaining as that would be), but instead of shooting human enemies dead, she could use the same stealth combat techniques as the Caped Crusader, and for the very same reason: she doesn’t have it in her to take a life. Besides, I personally find that sneaking up behind enemies to knock them out, performing silent takedowns, and engaging in some good ol’ melee fights would make the games more stimulating. (Yes, I know the reboots did that. No, I still don’t like them, and they still don’t count as Tomb Raider in my book.)

Okay, maybe the costume, detective vision, and that kind of stuff aren’t very tomb-raidery, but I would love something along the lines of the Batman Arkham series’ predator encounters in a Tomb Raider game. (Screenshot from dalleval’s video.)

At the time of writing, I haven’t managed to play Angel of Darkness yet, but I understand that it featured stealth combat and that it sort of sucked. However, as far as I know, the game was essentially a bunch of bugs strung together with a few lines of working code, so it can hardly be taken as proof that stealth combat can’t work in a Tomb Raider game.

Original concept art of Lara Croft. When she wasn’t yet an aristocrat, I take it. (Source: Core Design Tribute Fan Site.)

To be fair, a change like what I’m proposing might be a departure from how Toby Gard had envisioned Lara—she was supposed to be a dangerous, austere character of aristocratic descent, very attractive and yet unattainable and hard to approach. One of Gard’s sources of inspiration for Lara was Tank Girl (probably not for the aristocratic-austere thing), and I guess my idea would push Lara further away from her. At the same time, it could be a way to evolve her character into a more mature one who did away with murder for the sake of achieving her goals.


As a side note, knocking people out for hours on end without causing them permanent injuries the way Batman does isn’t really a thing. If you beat someone unconscious, or do a blood choke on them, and they don’t wake up within seconds, they are brain-damaged at best and dead at worst. Nothing that a little suspension of disbelief can’t fix, though, and actually, making Lara drop her murdering habit might also add a little bit of realism in other ways.

I’m willing to believe that, if she had killed just one person in her entire life, she could be lucky enough not to get caught; but someone must surely have stumbled upon at least some of the many bodies she’s left in her wake over the years, right? And none of those cases were ever traced back to her?

In Underworld, after her manor burned down, Lara said she would go search for Thor’s belt after dealing with the authorities about the arson. Yeah, right. I mean—hello, there are more notches on your gun than there are hairs on your head, and speaking of guns, you have an assault rifle on your back. I think the coppers might want to talk to you about more than just the fire.

After you deal with the authorities, Lara dear, I’m afraid you’ll spend the next twenty-five hundred years in prison. (Screenshot from Herbie Games’ video.)

And do we want to talk about Tomb Raider II? Just how did she get rid of the bodies of all the mobsters she killed in her own house without anyone ever noticing, for chrissake? And what about Angel of Darkness? In that game, she’s on the run, attempting to clear herself of being suspected of killing her former mentor. Given her impressive total body count and how she’s somehow always got away with it, I’m not sure why she should care if anyone mistakenly thinks she’s killed one more guy.

Why yes, I did see enough. Just pray the police won’t.

Dear Santa…

So, if I could ask Santa Croft for any present at all, what would I ask?

First and foremost, I would ask that Square Enix and Crystal Dynamics forget about unifying the timelines. Many fans like the survivor timeline, so by all means, continue it if you must, but keep it separate from the classic and LAU timelines. Make it a parallel universe or something, whatever you like, but please, consider picking up from Underworld and developing that timeline from there, without dragging all the survivor drama into it. You could make more fans happy that way, and you’d have more games to sell to fans who wouldn’t touch anything survivor with a ten-foot pole.

Huh? Oh, no. Back in the day, Core Design went a lot crazier than this when it came to promotional renders. (Source: Core Design Tribute Fan Site.)

Any new game in the style of LAU would be great, but if I could choose, I’d ask for a game heavily focused on tomb exploration with supernatural enemies, with only stealth combat available whenever Lara is dealing with human enemies. I’d ask for a game with today’s realistic graphics, but without any full-body plastic surgery done on Lara. She looked just fine in LAU. (In the classic games—eh, I’m not sure. I might or might not have something about that going on behind the scenes.) Oh, and yes, Santa—do check if Keeley Hawes is available to voice Lara once again. If I hear Camilla Luddington go “Aaa you thaaa?” once more, I swear I myself might become too casual about murder…

20 interesting facts about Peanuts

Image credit: Geordie, from Pixabay

I have been a fan of Peanuts for as long as I can remember. I don’t recall an exact moment when I became a fan or when I first encountered Charlie Brown and the rest of the gang, but I do know I was already in love with them all in elementary school. I would get a Peanuts-themed school diary each year, and I even had a couple of Peanuts dolls that came as free gifts with the laundry detergent my mother used to buy. (Charlie Brown, Peppermint Patty, and a few of Snoopy’s alter egos, if memory serves.)

I went on buying Peanuts-themed school diaries for as long as I was in school, and during my late teens, I began collecting all the comic books of the series I could get my hands on. Peanuts was extremely popular even where I lived—Italy—but I never quite managed to find all the books. I’m not even entirely sure they were all translated and published, but I’d say I got most of them. Recently, I got the last three books of The Complete Peanuts in the original language, and now I can safely say I’ve read each and every one of the nearly 18,000 Peanuts strips Charles Schulz drew during his life. That’s how I learned some interesting trivia about the series, which I thought I’d share here for the benefit of whoever might be interested.

You’ll forgive the lack of pictures in this post, but you need permission to legally publish syndicated comics on your own website. I’m neither going to pay tons of money for it nor risk a cease-and-desist, so I’ll just link to them.

1. Charlie Brown is not bald.

With the exception of a lock of hair on the front and one on the back, Charlie Brown’s hair never really appeared in the strips, but it is there. Schulz himself confirmed this in a 1990 interview with NPR. According to Schulz, Charlie Brown’s hair is very fair and cut very short, so that it’s practically invisible. Supporting this claim, Linus describes Charlie Brown as “sort of blond” in the Sunday page of July 9, 1989. That seems to contradict an earlier Sunday page in which Charlie Brown said to Schroeder that “at least I don’t have yellow hair” (July 17, 1955).

2. “Charlie Brown” was one of Schulz’s fellow teachers.

The last volume of The Complete Peanuts (1999-2000) published by Canongate also features the Li’l Folks strips. Mostly single-panel, these strips were a precursor to Peanuts, and some of the themes and characters that would become recurring in Peanuts can already be seen in Li’l Folks. In a short introduction to Li’l Folks in the same volume, Gary Groth states that Charlie Brown was “the name of one of Schulz’s fellow teachers at Art Instruction”, where Schulz worked in 1946. A character named Charlie Brown appears multiple times in Li’l Folks, though he looks nothing like the modern Charlie Brown.

3. Linus was named after Linus Maurer.

Linus Maurer was an American cartoonist friend of Schulz’s. According to Schulz himself, Maurer was the first person to see the first sketch of Linus Van Pelt, who was then named after Maurer. Unfortunately, Maurer passed away in 2016 at age 90.

4. Peanuts is (possibly) set in Pinecrest, California.

To my knowledge, the January 8, 1990 strip is the only one mentioning the probable place where the Peanuts gang lives. In that strip, Linus mentions that the school where he and Sally go is the Pinecrest Elementary School. According to my research, there are only two Pinecrests in the US: Pinecrest, Florida, and Pinecrest, California. I don’t know for sure which of the two it is (if any), but my guess would be Pinecrest, California because Schulz used to live in California. Also, Snoopy’s brother Spike lives in Needles, California.

5. Schulz was a friend of tennis star Billie Jean King.

Billie Jean King was among the many athletes referenced in Schulz’s work. They knew each other personally, and as King herself stated in her preface to the 1973-1974 Complete Peanuts, mentioning her in a strip was “his way of letting me know that we needed to talk or just catch up with one another.”

Speaking of mentions, athletes weren’t the only people, fictional or real, that Schulz named in his work. Something that caught me by surprise was that Harry Potter was mentioned in the November 8, 1999 strip. Sometimes I forget that Harry Potter is a rather old series by now, and that Peanuts ran until fairly recent times.

6. “Happy birthday, Amy!”

Several August 5 strips have the text “Happy birthday, Amy!” written somewhere on them. These birthday wishes were indeed meant for Amy, one of Schulz’s daughters.

7. Coconut hatred.

Several characters in Peanuts, including Charlie Brown and Snoopy, hate coconut with a passion. The reason is that Schulz himself did. In a Facebook post, the Schulz Museum said that “Charles Schulz first ate coconut when he was a child, and he disliked the taste so much he was determined never to eat it again. When Charlie Brown came along he shared the cartoonist’s loathing for coconut, and he was very clear how he felt about it. Schulz himself once proudly stated ‘…I’ve taught all my children to hate it too’.” According to the New York Times, Schulz “hated cats, coconut and sleeping away from home.” (I guess his hatred for cats was milder, in that only Snoopy out of the entire gang went on to inherit it.)

8. Poochie started it all.

The vast majority of the characters call Charlie Brown by his full name. The only exceptions are Peppermint Patty (“Chuck”), Marcie (“Charles”), Snoopy (“the round-headed kid”), and Peggy Jean (“Brownie Charles”, see below). This has been the case from the very first time Charlie Brown appeared in Peanuts, although within the strip’s own continuity it wasn’t always so. A January 1973 strip reveals that the trend was started by Poochie, a minor character who was mentioned in just a handful of strips and appeared in only one. Poochie was Charlie Brown’s neighbour, who moved away during Snoopy’s puppyhood, and it was she who first called him by his full name.

9. Snoopy wasn’t always Charlie Brown’s dog.

At the beginning of the strip, it wasn’t exactly clear whose dog Snoopy was. Regardless, a series of 1968 strips reveals that Snoopy used to be the dog of Lila, a minor character who appeared in only a few strips. Lila’s family could not keep Snoopy, who was returned to the puppy farm where he was born and later bought by Charlie Brown’s parents.

10. Charlotte Braun and the axe.

Charlotte Braun is a very early minor character who appeared in ten strips between November 1954 and February 1955. She is a dominating personality who constantly shouts. It’s unclear why Schulz named her so obviously after Charlie Brown. What’s really interesting about her is that in 1955, a fan named Elizabeth Swaim wrote to Schulz and asked him to remove the character, for some reason. Schulz took her suggestion, possibly because he himself hadn’t seen a lot of potential in the character; he replied to Swaim as follows:

“Dear Miss Swaim, 

I am taking your suggestion regarding Charlotte Braun and will eventually discard her. If she appears anymore it will be in strips that were already completed before I got your letter or because someone writes in saying that they like her. Remember, however, that you and your friends will have the death of an innocent child on your conscience. Are you prepared to accept such responsibility? Thanks for writing, and I hope that future releases will please you. 


Charles M. Schulz.”

The reply included a drawing of Charlotte Braun with an axe in her head. That’s way grimmer than I would ever have expected.

11. Peggy Jean and Brownie Charles

Charlie Brown’s long-standing love interest was the fabled little red-haired girl, but she wasn’t the only one. Peggy Jean, a minor character from the 90s, stayed on Charlie Brown’s mind pretty much till the end of the strip, and she actually kissed him. (She would break up with him, eventually.) The first time they introduced themselves to each other, Charlie Brown was so nervous that he said his name was “Brownie Charles”—which Peggy Jean liked so much that she started using it as a nickname for him.

12. The mystery girl

On March 2, 1994, an unknown girl walks up to Snoopy’s doghouse to tell him to get up and chase rabbits. That’s something Frieda would usually do, but the girl looks nothing like her. According to Wikipedia, Schulz claimed that the girl was Patty, but she looks nothing like Patty either. Indeed, the claim on Wikipedia has no source, so whoever that girl was is still a mystery.

13. Adults in Peanuts

Adults almost never appear in Peanuts. They are mentioned, or their presence may be implied, but they are usually not seen. A few exceptions do exist: the first was on May 16, 1954, when adult legs were shown during a golf tournament in which Lucy participated; indistinct adult figures are also shown from a distance in the May 30, 1954 strip. Another notable exception was the November 11, 1998 strip, in which Willie and Joe, two characters by Schulz’s fellow cartoonist Bill Mauldin, appear alongside Snoopy to celebrate Veterans Day.

14. Snoopy didn’t invent the “It was a dark and stormy night” incipit.

It’s quite possible that you know this already and I’m just very ignorant, but even though Snoopy did contribute a lot to the popularisation of the incipit “It was a dark and stormy night”—a quintessentially banal opener—he didn’t invent it. It was the opening sentence of the 1830 novel Paul Clifford by English novelist Edward Bulwer-Lytton.

15. Snoopy had siblings.

Snoopy wasn’t an “only dog”. (Which flies right in the face of what he himself said in the June 6, 1959 strip.) As stated in the strip from June 18, 1989, Snoopy was one of a litter of eight: Spike, Belle, Andy, Olaf, Marbles, Rover, Molly, and Snoopy himself. While Spike is arguably the most famous of Snoopy’s siblings, they all appear at some point in the strip, with the exception of Molly and Rover, who only appear in the TV special Snoopy’s Reunion. A recurring theme of the last few years of the series was Andy and Olaf trying to reach Spike in Needles, but invariably getting lost somewhere.

16. Snoopy’s alter egos.

Probably everybody knows about Snoopy’s most famous alter ego—the World War I pilot whose archnemesis was the Red Baron—but it was far from the only one. The list is long, and includes everything from simple impressions (mostly of other animals, which Snoopy envies for one reason or another) to actual personas that recur throughout the series: surgeon, lawyer, grocery clerk, various coaches, and many, many more.

17. The Great Watermelon.

Yes. Yes, I know. It’s “pumpkin”, not watermelon. Except in Italy it was watermelon, because flimsy reasons. The translation stuck, and I grew up reading about the Great Watermelon instead of the Great Pumpkin. And no, Schulz’s pumpkins look nothing like watermelons.

18. The little red-haired girl was actually shown in the strip.

That’s right. Charlie Brown’s elusive love interest appeared in the strip. It happened only once, and it was just a silhouette, but it was her. It was on May 25, 1998.

19. The reason Spike lives in the desert is rather grim.

Snoopy’s brother Spike lives all alone in the desert, despite the fact that it obviously makes him miserable, and no reason was given until September 18, 1994. The reason is that one day, Spike was out walking with some people, and they ordered him to chase a rabbit that darted in front of them. Spike didn’t really want to, but did it anyway. To escape Spike, the rabbit ran into the road and was hit by a car, for which Spike hated himself and the people who made him do it. He escaped to the desert so that he could never hurt anything again. That’s right—guilt and perhaps a desire to punish himself are what led Spike to a life of isolation. Why Schulz gave him such a sad backstory is anyone’s guess—I am not aware of a specific reason, anyway. (If you are, please let me know.)

20. A selection of last times.

On October 16, 1999, Charlie Brown put away his baseball gear for the last time. The “next year” he refers to in the strip never came, as Schulz died around four months later. The last time baseball was mentioned in the strip was on December 27, 1999.

On October 24, 1999, the last football gag took place. Rerun took Lucy’s place, and neither we nor she will ever know whether Rerun pulled the football away. (That’s what makes Lucy go “Aaugh!”, and that, too, is the last time the cry appears in the strip.) According to Wikipedia (and the Peanuts Wikia as well), Schulz said that having Charlie Brown finally kick the football after so many years would be a disservice to the character; however, upon signing his final strip, Schulz realised that it was a “dirty trick” that the “poor kid” never got (and never would get) to kick the football. (I have no reason to doubt the authenticity of these quotes, but I could not find actual interviews or documents proving he said them.)

Schulz always did everything by himself, lettering included, but because of his declining health in late 1999, the lettering on the strips of December 30 and 31, 1999, and January 1, 2000 was done either by someone else or by computer.

The last daily strip was published on January 3, 2000. From that point until February 13, 2000 (the day after Schulz’s death), only Sunday pages were published. The final daily strip re-announces Schulz’s retirement (which had already been announced on December 14, 1999) and thanks the fans and editors of the strip.

It’s too bad that Schulz died. I would have loved to see how the Peanuts gang would have evolved in the age of the Internet, social media, and ubiquitous cell phones. Had he still been alive, he would have been 98 years old at the time this post was published.

Back in my day…

(Image credit: Christiaan Colen, licenced under CC BY-SA 2.0)

If you’ve browsed the Internet recently, you probably noticed how every-f#$@ing-one is dying to know whether you are going to accept their non-essential cookies or not. (I know, right? Weirdest sexual innuendo ever.) You’ll also have noticed how you’re asked to subscribe to something for every damn thing you need to do, and how receiving an email or a notification is no longer an exciting sign that somebody cares.

Okay, I admit it. I sound like a grumpy old man who’s making a big deal out of nothing. Still, while I thankfully am nowhere near being old yet, and while I prefer looking ahead over looking back, there are a few things that I like looking back to. One of them is the Internet of 20+ years ago.

If you weren’t born in the early 90s at the very latest, you probably have no idea what I’m talking about. You hardly remember a time when Facebook and social media weren’t a thing, or when “google” wasn’t a verb. You almost certainly never used Yahoo! Directory, and I would be surprised if you knew about Yahoo! at all. (I doubt I’d know about it if I hadn’t lived through the times when it was the go-to search engine, but maybe it’s more popular than I think and I’ve been living under a rock all these years.)

I got my first computer in early 1998, when connections were all dial-up and the next level was ISDN. I wouldn’t hear about ADSL for another five years, I think. That was the time when Windows 95 was all the rage (for most home users anyway, I guess), the first edition of Windows 98 was just about to be inflicted on the world, and 3D accelerator cards like the 3dfx Voodoo were add-in boards that worked alongside your regular 2D card.

This, younglings, was the Google of those days. (Found on the Wayback Machine, February 1998)

It was a shiny new world for me, and I was in my teens, so I guess it’s understandable if I look back on it so fondly. However, there is something about the Internet of those days that I miss.

Cosiness. The Internet of the late 90s was cosy. It felt small and quiet. Despite really annoying things like animated backgrounds, background music, and pop-ups, most websites felt calm and homely, like nice little living rooms where only you and the website owner were sitting, chatting amiably. No one trying to get you to like or subscribe, little-to-no ads (though, on the flipside, there was no AdBlock to block them, as far as I know), no trillion cookie options to tick or GDPR notices to read. (Not that anybody ever does.)

An old 14.4 kbps dial-up modem. (Credit: Lawrence Sinclair, licenced under CC BY-NC-ND 2.0)

To be fair, finding what you were looking for, if it was there to be found in the first place, wasn’t as easy as it is now. Today, if what you need isn’t among the first few Google search results, it probably doesn’t exist; in the 90s, you’d comb through each and every last page of Yahoo! Search (or Directory), because the website you wanted might easily be at the very bottom of the list. Don’t get me wrong: I appreciate not having to waste hours looking for things, but back then I had a lot more time to kill, and searching for something on the Internet was like a treasure hunt. I was really into emulated games, and finding a reliable ROM website after patiently looking for it the entire afternoon was an assured dopamine hit, just like finding a large MIDI collection, or simply the hobby website of someone who shared my interests.

And, oh, those websites. Visiting your favourite ones over and over again, reading them from top to bottom, looking for updates, was kinda like going over to a friend’s place for tea and cookies. (Only the essential ones, though.) It was a way to get to know the people behind them without ever having met them. Instant messaging wasn’t really a thing (unless you count IRC), and your best shot at talking to them (especially if they lived abroad, which was often the case for the people running the websites I visited) was sending them an email. You’d wait for a reply like you would for a Christmas present. (Yes, I’m exaggerating again, but it was very pleasant nonetheless.) Does anyone else remember the excitement of the chime sound in Internet Mail when you got new messages?

Speaking of sounds, depending on how old you are, you might not know that back then your devices (which were just desktop or laptop computers at best) weren’t connected to the Internet all the time. Dial-up meant that you were making a phone call to connect, and the longer you were connected, the more you’d pay. That sucked big time, but if visiting your favourite website was like being at a friend’s place, switching on your modem and hearing the dial tone was like wearing your coat and going out to get there. I know a lot of people are very nostalgic about that sound.

Needless to say, at the time there was no YouTube, no Netflix, and no streaming. As far as I recall, AVI was one of the most popular video formats, it wasn’t very common to find videos to download, and when you did, your 33.6 kbps modem (or 56 kbps, if you had the latest gear) would take hours to download a 10MB video. So, yeah, watching movies online wasn’t really a thing. The anticipation of finally completing a large download was actually quite pleasant, though—less so when it failed at 99% after hours of waiting. (Yes. Yes, it did happen to me.)
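Out of curiosity, those numbers are easy to sanity-check with a quick back-of-the-envelope calculation. The sketch below is mine, not anything from the period, and it assumes the modem’s nominal line rate with zero protocol overhead, which real connections never achieved:

```python
# Rough dial-up download times, assuming the modem's nominal line rate
# (modem speeds are in bits per second; 33.6 kbps = 33,600 bit/s) and
# zero protocol overhead -- real transfers were noticeably slower.

def download_time_seconds(size_mb: float, line_rate_kbps: float) -> float:
    """Ideal time to transfer size_mb megabytes at line_rate_kbps kilobits/s."""
    bits = size_mb * 1024 * 1024 * 8       # file size in bits
    return bits / (line_rate_kbps * 1000)  # line rate in bits per second

for rate in (33.6, 56.0):
    minutes = download_time_seconds(10, rate) / 60
    print(f"10 MB at {rate} kbps: about {minutes:.0f} minutes")
```

Even under those ideal assumptions, a 10 MB file took the better part of an hour at 33.6 kbps; add line noise, overhead, a shared phone line, and the occasional failed transfer that had to restart from zero, and multi-hour downloads were entirely plausible.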

I take it it must still be at it. (Source: Reddit)

In the early 2000s, say until 2005, things began to change. From my perspective, that was the rise of Flash and Java games, of ADSL, of VoIP, and the time when discussion forums were cool (they probably were earlier on too for many people, and for many still are). I am no Internet historian and I might be wrong, but I think that’s about when blogging was born. Before then, only true nerds had a hobby website: you either needed to know how to code, or be happy with whatever result you could produce with the horrible WYSIWYG editors of the time. (Also, no backend; you’d be lucky to have a visit counter and a guestbook.) I have good memories of that epoch too, whose new hidden treasures were games like Submachine, Daymare Town, and too many others to remember (by the way, RIP Flash). That’s also when peer-to-peer grew in popularity, which, combined with faster connections, made it possible to download full movies—which could still take days, carried the risk of downloading a bunch of malware and viruses, and by the way was rather illegal.

Does anyone still remember computer viruses, by the way? It’s not like they’re gone, but they turned from trolls that messed with your screen and files to sneaky little bastards that try to keep as low a profile as possible—until they need to let you know that your files are encrypted and that you need to pay a ransom to get them back, anyway. Maybe I’m just out of the loop, but I don’t hear about things like ILOVEYOU or Melissa anymore.

The feeling of cosiness I was talking about, which used to apply to the whole Internet, began to fade away when social media arrived; it didn’t just decrease in intensity, but also in scope. The number of websites that felt cosy plummeted as the Internet grew more “social”: comments, likes, shares, and so on. There were no nice little living rooms anymore, only big market squares where everybody was talking (and sometimes shouting) all the time. Catching other people’s attention became important, and that’s how having a personal website went from a hobby thing to a business where you need to know who your audience is, what the trendiest topics are, how to do SEO, and all sorts of marketing strategies. (Just so I don’t come across as a huge hypocrite, it’s not like I don’t care about growing an audience; I do, but words like “marketing” make me sick to my stomach. I’m one of those delusional romantics who believe that, as long as they focus on doing stuff they like, the right audience will come to them without having to resort to all the tricks in the marketing bag.)

I was never big on social media. I joined Facebook only in 2011, Twitter only in 2020 (except for a brief fling in 2016 that ended with me deleting my account), and there’s tons of others whose purpose I still don’t quite understand. Social media websites aren’t cosy pretty much by definition, but believe it or not, there was a time when Facebook felt small and welcoming. For a few years after I joined, it felt like a larger but still cosy living room with several friends instead of just one, except they were friends I knew in real life. Seeing the red notification icon was nice: some of my friends cared about something I said! Friend requests, whether I sent them or received them, were also very pleasant: they generally were from or to people whom I’d recently met in real life, and a friend request felt as though we were getting closer.

But then groups and pages became more and more popular, which eventually led to your feed being invaded by tons of people you didn’t even know existed. Ever received a notification about someone whose name you’ve never heard commenting on something you don’t care about in a group you forgot you’d joined? That’s what I’m talking about. Thankfully, I’m through with comment fights with strangers whose opinion I disagree with; I tend not to go much past “Happy birthday!” or “Nice cat!” But there still are people who think that, just because they happen to have a shared interest with you, it’s okay to send you a friend request even though you don’t have the foggiest clue who the heck they may be. Not cosy by a long shot.

These days, when I want to enjoy that cosiness again, I visit pages like this one. (Yes, for some reason that I myself don’t understand I’m a big Mega Man fan, and someday I should write about it.) It’s one of the few websites I know that somehow managed to survive this long without becoming a relic and without losing its original cosiness. In general, that cosiness may be lost forever, but like I said, I like to look ahead more than I like to look back: it’s possible that something new will come along, either on the Internet or some entirely new medium that we can’t even imagine yet, and with it, a new cosiness just waiting to be discovered and savoured. I look forward to that.

The blocky charm of pixel graphics games

(Image credit: user DoubleOMURFY, Steam community)

From time to time, I like popping over to browse for new indie games—anything goes, really, but I tend to prefer pixel graphics games. It’s a habit I’ve formed over the past few years, dating all the way back to Halloween 2015, I think. My girlfriend and I were looking for games that would fit the mood of the season, and we stumbled upon the Deep Sleep series—a little psychological horror gem that at the time wasn’t yet on Steam.

Granted, I had played many other pixel graphics games before—Monkey Island, Day of the Tentacle, Broken Sword, and the lot—but my encounter with the Deep Sleep series marks the moment when I became interested in retro graphics. It’s not just me: modern games dressed in pixel clothes are becoming increasingly common, and it’s probably not just because of a bunch of nostalgics hellbent on bringing back the good ol’ games; those games have something that new ones lack.

DOTT was one crazy ride. (Credit: Steam)

Okay, surely the nostalgia effect does play a role, but that can’t be all. I think there’s something special about pixel graphics. The way it strips down all the bells and whistles from a game, leaving only what you really need to focus on, has a special appeal to me. It’s like playing in distraction-free mode, fully immersed in the world of the game. (Well, the game itself needs to be interesting, of course; some games just suck, and no amount of pixel art can change that fact.)

Being pixelated doesn’t make this landscape any less beautiful. (Credit: Steam)

Of course, there’s pixel art and pixel art. Though not as much as they could be in a modern game, the landscapes in Monkey Island 2 were lively and beautifully detailed, certainly a lot more than they can be in a Bitsy game. Yet both kinds of pixel art have that certain je ne sais quoi I was talking about. I can’t quite put into words what it is, but I think it has to do with the way pixel art captures the essence of things.

By its very nature, the medium forces you to leave out the finer details and focus on the basic qualities of what you’re trying to represent. People’s faces boil down to two pixels for the eyes, and a handful more for things like mouths, hair, and maybe beards; in extreme cases, your main character could easily be just a single-sprite stick figure. The same goes for objects and places, and while this constraint used to be just that, a limitation, nowadays it’s one of many brushes in your set.

A finer brush will allow you to paint finer details that you don’t want to leave up to the player’s imagination; it’ll let you better define what your characters and environments look like, and establish certain facts about them. A larger brush, like the one painting the broad strokes of pixel art, will let the player fill in the blanks that aren’t a mandatory part of the experience you’re creating. Guybrush Threepwood had to have a certain look that was part of his persona; he wasn’t merely an interface between the player and the game environment. The brush used to paint him, so to speak, was finer than what you’d use to paint characters in this game, and a lot finer than the brush used in this one; it was also a lot coarser than the brushes used to paint Arkham Asylum.

You want weird? ‘Cause this is weird. (Credit: Rusty Lake)

Obviously, a character in a game isn’t just looks. Good writing can turn a moving square into the most charming character of all time, but a character that is no more than a walking stick figure is a good choice when you want the player to be the real main character. For a truly immersive experience, you may want to go with a first-person interface, like the aforementioned Deep Sleep series or the disturbingly weird games by Rusty Lake.

(Rusty Lake don’t do pixel games, and their games do have main characters with actual faces, but you hardly ever see them, as the games are often first-person.) Main characters like that (or the lack thereof) remove a barrier between the player and your plot, letting the game speak directly to the player and allowing them to live the events of the game first-hand rather than vicariously through a protagonist. That, I think, can make the game a lot more immersive.

And no game needs immersion more than a good psychological horror game: if it’s supposed to scare you, you need to be 100% in it. That’s why I think pixel graphics is an excellent choice (though by no means the only one) for this type of game. Pixel art horror games have given me some of the best jumpscares and strongest feelings of dread and isolation I’ve ever experienced. (For example, see again the Deep Sleep series, or The Last Door—its first four chapters, anyway.)

Deep Sleep 2. (I think?) Way scarier than you might think. (Credit: Steam)

I think the reason for this is again the limited amount of detail present in a pixel art game. We’re most afraid of things we don’t fully understand, or that look off somehow. Pixel graphics presents players with a model of reality sufficiently “complex”, for lack of a better word, to get the gist of what’s going on, but simplified enough to be uncanny. Creating an eerie, unnaturally quiet and lonely environment is a lot easier in pixel graphics than in any modern 3D engine. 

When I say it’s easier, I don’t mean from a technical point of view. (Which might or might not be true; I’ve never made a game—yet.) I mean that pixel graphics, by its own nature, already sits right in the middle of a sort of uncanny valley for art, unlike modern 3D graphics. As an example, I’m willing to bet that none of these games would be half as eerie if they were remade in Unreal Engine. 

This is just my opinion, of course; it’s entirely up for debate, and it’s not a rule without exceptions. I’m not a pixel game purist, either. Knock Knock is a 2D-ish, cartoony game that had me flipping on every light switch in my apartment after each play session; it’s a glaring example of something that was scary because I didn’t fully (or at all) understand what was going on. Amnesia needs no introduction but bears special mention; The Survey is another example of a 3D game that I found utterly terrifying. (By the way, if you have psychological horror games of any kind to suggest, please do; I’m always looking for a new one to play.)

Knock-knock! Who’s there? A beheaded sanatorium patient, I think? (Credit: Steam)

Finally, and here I’m circling right back all the way to the beginning, there’s the nostalgia effect, which of course is not inherent to pixel art itself. When I was a child, pixel graphics was nearly the state of the art. When I got my NES, at age seven, it was still all the rage; I began playing point-and-click classics only in 1998, when they were already a few years old at least and I was a teen. I guess that playing games during childhood was by default a more immersive and imaginative experience, which I must have linked mentally to the graphics of the time. For example, for me, The Legend of Zelda was an incredibly mysterious and surreal adventure (which I didn’t beat until I was a fourteen-year-old playing emulated games on my PC *ahem*); maybe some of that feeling rubbed off on my perception of pixel art.

This might not be all, and I don’t fully understand the reasons why pixel graphics is so special to me anyway. Maybe that’s a reason in and of itself: what’s more fascinating than something that appeals to you but you don’t quite know why?

Can we figure out the truth?

(Image credit: Ryoji Iwata, Unsplash)

Recently I found myself thinking that it’s really difficult to figure out what the truth about something is. It’s so difficult that it’s not surprising if some people choose believing things over knowing them—though I think that these aren’t the only possible values of a binary choice, but rather they lie on the opposite ends of a spectrum. 

You don’t just either believe something or know something. Whatever ‘something’ is, and no matter how much you know about it, at least some of that knowledge builds upon a set of beliefs: (preferably few) axioms that are accepted on the grounds of their self-evidence until proven wrong. That’s not a problem—not as long as you’re willing to change your beliefs whenever compelling evidence suggests you should do so.

As you question your beliefs and the set of your beliefs shrinks as you gain more knowledge, you get closer to the ‘knowing’ end of the spectrum. That’s kinda like how science works. I have a hunch that you can never get to be exactly on the ‘knowing’ end: you can only get closer and closer to it. You can, however, sit right on the ‘believing’ end of the spectrum and fully believe something you know nothing about—or even worse, something that is contradicted by the evidence. That’s not uncommon.

There’s more than one reason why finding the truth is so hard, I think. One of them is that the search for the truth is often a complicated process of reverse engineering. One fine day, we found ourselves on this rock flying through space and had to start figuring out how the universe worked billions of years after it was born. Worse still, we got brains that are primarily geared to figure out how to survive, not how quantum mechanics works, and it’s only by a serendipitous coincidence that they can do both. (Most animals survive just fine even without basic arithmetic, and so did we, once upon a time.) How far down the rabbit hole we can venture before we start scraping the bottom of the barrel of our brain power is a good question that probably nobody has an answer to. (Yes, yes, we might augment our brains with tech or have AI solve all the toughest problems for us. I’ve heard all about it, and it may well be the case. That’s beside the point.)

That makes the search for truth hard, but sort of in an engaging way: it’s the kind of challenge that gives you a dopamine boost as you make it through every step of the way—not to mention when you finally get to the end of it. However, there are other challenges that might be less fun.

One of them that is frequently discussed lately is finding reliable sources of information. There’s tons of bullshit out there, and unlike the truth, bullshit travels way faster than the speed of light. We’re talking pretty much warp 10. Some bullshit is spread unintentionally, simply because it’s just that catchy. Other bullshit is spread intentionally, often because it paints the truth the way we would like it to be, and more importantly, the way we would like others to see it. 

Some bullshit isn’t even 100% genuine; that too is a spectrum, and a claim can sit anywhere between ‘100% bullshit’ and ‘100% true’. This can make it hard to tell facts from fiction even if you’re a critical thinking pro and are knowledgeable on the specific subject; it’s even worse when you consider that there are so many sources to comb through, and sometimes we absorb information from them without even noticing. That’s how factoids and ‘common knowledge’ are born, and they aren’t always correct.

This makes looking for the truth harder, and not in a nice way. It gets even less nice when you consider that you might have reasons not to want to know the truth. That’s when you start sliding very close to the ‘believing’ end of the spectrum.

For reasons that I don’t understand (and I’m not trying to be condescending here), some beliefs can be so intertwined with our identity (either our own or that of a group we feel we belong to) that they become sacrosanct. Questioning a belief of this kind triggers our fight-or-flight response, immediately shutting down our critical thinking. Screw the evidence: this heretic here had the effrontery to say that my god is not real, my favourite politician is a thief, my sworn enemy is a good person, my conviction about thing X is not true! I say burn ‘em at the stake!

We’ve all felt that, haven’t we? Sometimes even when we were in the right. I know I have. There were times when somebody questioned things I deeply cared about, and even though I could have backed up my case with solid evidence, the fact alone that the veracity of those things was being called into question was enough to shut down my prefrontal cortex, making me unable to think straight and deploy my argument. (I did, though, get a steady supply of sarcastic retorts, courtesy of my amygdala.)

That’s the wrong footing for someone looking for the truth. It pushes you off the path of rational research and onto that of confirmation bias. That doesn’t happen with things we have no stake in: things we just want to know the truth about, whatever it turns out to be. That, by the way, is the perfect attitude for truth-seekers: they care for the truth, regardless of what it might be. Often, though, it doesn’t work like that: we want our truth to be true.

Politics and religion rank fairly high on the list of topics prone to causing conflict when discussed, and beliefs about them (and about the rest of the items on the list) can be excruciatingly difficult to let go of. I guess a possible reason is that they’re extremely polarising topics that leverage the deep-rooted tribalism of our species. We’ve always liked to reduce the universe down to a matter of true or false, right or wrong, left or right, us or them, because—well, that’s easier to understand than a more nuanced world where things don’t just either belong to a set or not, but can and often do belong to it only to a certain degree. For example, nobody is fully inside the “good people” set, nor fully outside of it; each of us can be closer to or farther from its centre, so to speak, which means the set doesn’t even have a definite border. (This example is further complicated by the fact that defining “good” and “bad” is not easy, but let’s not go there.)
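For the programmers in the room, the “belonging to a set only to a certain degree” idea is essentially what fuzzy sets formalise. Here’s a toy sketch (my own illustration, with a made-up “goodness score”; not something from any rigorous ethics framework) contrasting the binary view with the graded one:

```python
# Classical sets: membership is binary (in or out).
# Fuzzy sets: membership is a degree between 0 and 1.

def classical_membership(score: float, threshold: float = 0.5) -> int:
    """Binary view: a person is either in the 'good people' set or not."""
    return 1 if score >= threshold else 0

def fuzzy_membership(score: float) -> float:
    """Graded view: membership is a degree, clamped to [0, 1]."""
    return max(0.0, min(1.0, score))

# Two people with nearly identical scores land on opposite sides of a
# hard border in the classical view, but nearly together in the fuzzy one.
print(classical_membership(0.49), classical_membership(0.51))  # 0 1
print(fuzzy_membership(0.49), fuzzy_membership(0.51))          # 0.49 0.51
```

The hard threshold is exactly the “definite border” the set doesn’t really have: move the score by a hair and the classical answer flips completely, while the fuzzy answer barely changes.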

Things would be simpler if there were just two camps for everything, only one of which we knew to be “right”, but most of the time it doesn’t work that way. Our pet peeve politician is probably not 100% bad, our idol isn’t 100% perfect, our favourite sources of information aren’t perfect or fully bias-free, newspapers with a political ideology opposite to ours don’t always publish falsehoods, etc. Good luck finding the truth in this mess!

Regardless of why certain ideas become part of our identity (and hence untouchable) more easily than others, I guess another reason it’s so hard to question them is that, once the floodgates are open, they’re open. If one of your core beliefs is bullcrap, others might be too; admitting you were wrong, to yourself and others, is shameful; readjusting your worldview to accommodate new facts takes effort. None of this sounds remotely as fun as discovering a well-hidden truth you didn’t feel so strongly about.

Yet, moving forward, telling fact from fiction will probably become more and more important, and so will letting go of what we want to be true in favour of what is true. It might be that learning not to get too emotionally attached to possible truths will be essential to ensure we can discover the “true truths” out there; how we can do that, I frankly have no idea.