This is a self-contained cheatsheet I created for TIC-80:

Here’s the PDF

And a high-quality PNG:

[image: tic80ref]

Published 02 August 2020 · Category: gamedev

I resurrected a bunch of old projects I’d done either for fun or while job hunting:

This is a demo made to breathe life into a mockup screenshot I found online. I don’t remember where I got the mockup from, but I like the vibe :)

The demo is written in JavaScript/PhaserJS directly on Cloud9, the website Amazon sadly ruined to milk for money. I find it infuriating that rather than supporting the c9 platform and the growing community that used it for small projects, Amazon ate the service and now only allows using it through their hyper-complex, absurdly massive AWS mess of enterprise stuff.

This is a simple shapes application written in TypeScript/Angular. It lets you create/delete shapes and select and transform existing shapes, and it saves everything by default, so if you refresh the page it should restore your scene.

This is a demo that tries to imitate Super Mario Bros character controls as closely as possible. It’s written in TypeScript/PhaserJS in Cloud9 (when it was still usable). I didn’t intend for it to be a full game.

Remastering Zamron

Through October, my brother and I worked to remaster an old game of mine, Zamron Encounter. We had two goals:

  1. Port the game graphics to hardware-accelerated Stage3D in Flash. The game used Flash/AS3 software rendering for graphics, which is awfully slow and buggy.
  2. My brother wanted to remake the game art for fun, and so far he has done a great job!

Technically, the remaster is 95% done; there are two bugs left to resolve (albeit tricky ones) and then we can re-release a much better version of Zamron.

Artistically, my brother has finished the level art and a bunch of other stuff. He’s still working on the monster and player textures.

There is a larger goal of eventually starting a new project to remake Zamron from the ground up in a modern engine as a full product. The remastering effort serves two purposes: having a usable prototype to base the remake on, and developing a work pipeline between the two of us.

I’ll talk more about Zamron’s remaster progress in the future.

Here’s a sneak peek at what we currently have:

Currently working on: Game Off 2019

Game Off 2019 is a relaxed 30-day game jam that just started a few days ago, and my brother and I are participating. The theme for this one is Leaps and Bounds. So far we have a few concepts in progress, and I hope by next week we’ll be able to show some progress.

Published 06 November 2019 · Category: gamedev

A while ago I used a simple benchmark to very roughly compare performance across multiple platforms (PC-6002 vs 80s Computers Benchmark) and got some interesting results.

Recently I got a Raspberry Pi 4 and wanted to figure out how its new CPU compares to other platforms, so I went back to that simple benchmark, scaled it up by 1000x, and ran it in many different ways on many different devices and platforms. I think the results are noteworthy :) but it’s still just for fun; this is by no means a benchmark that should be taken seriously.

The simple bench I used looks like this:

import math
import time

# Count primes below 100,000 by trial division up to sqrt(i), and time it
all_primes = []
t1 = time.time()
skip = False
for i in range(2, 100000):
    skip = False
    k = math.floor(math.sqrt(float(i))) + 1.0
    for j in range(2, int(k)):
        # i is composite if it divides evenly by some j <= sqrt(i)
        k1 = i / float(j)
        k2 = int(k1)
        if k1 == k2:
            skip = True
            break
    if skip:
        continue
    all_primes.append(i)
elapsed = (time.time() - t1)
print("Prime count = " + str(len(all_primes)))
print("Python Time=" + str(elapsed))

I rewrote it in C++, C#, Lua, JavaScript, and GDScript (the results table below also includes a Rust build on a few machines).
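
As an example, here’s roughly what the Lua port looks like. This is a sketch that mirrors the Python version above rather than the exact file I benchmarked; in particular, using os.clock() for timing is an assumption here:

-- Lua sketch of simplebench: count primes below 100,000 by trial division.
-- (An approximation of the port, not necessarily the exact benchmarked file.)
local all_primes = {}
local t1 = os.clock()
for i = 2, 99999 do
    local skip = false
    local k = math.floor(math.sqrt(i)) + 1
    for j = 2, k - 1 do
        -- i is composite if it divides evenly by some j <= sqrt(i)
        local k1 = i / j
        if k1 == math.floor(k1) then
            skip = true
            break
        end
    end
    if not skip then
        all_primes[#all_primes + 1] = i
    end
end
local elapsed = os.clock() - t1
print("Prime count = " .. #all_primes)
print("Lua Time=" .. elapsed)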

Here are the systems I ran simplebench on:

  • Raspberry Pi 4: Cortex A72 1.5GHz, Raspbian
  • Raspberry Pi 3: Cortex A53 1.2GHz, Raspbian
  • Raspberry Pi Zero W: ARMv6 1GHz, Raspbian
  • Desktop PC: AMD Ryzen 5 1600 3.2GHz, Windows 10
  • Desktop PC: AMD Ryzen 7 3700X 3.6GHz, Windows 10
  • Mini PC: Intel Pentium 4415U 2.3GHz, Kubuntu 18.04
  • Laptop: Intel Core i5-4258U 2.4GHz, Kubuntu 18.04
  • Laptop: Intel Core i7-8550U 1.8GHz, Kubuntu 18.04
  • Laptop: Intel Pentium 4415Y 1.6GHz, Windows 10
  • Laptop: AMD E-450 1.6GHz, Debian
  • Mobile: Snapdragon 855 2.84GHz+1.78GHz, Android 9
  • ShieldTV: nVIDIA Tegra X1 2GHz, Android 8

Here are the runtimes and compilers I used to run the benchmarks:

  • GCC 7+ (alternatively clang) with -std=c++14 and -O3 flags
  • Mono
  • Lua 5.1+ and luajit
  • NodeJS 8
  • Godot 3.1
  • Python 2.7
  • PyPy
  • termux for Android devices

I wanted to put minimal time into this, so I didn’t try to run everything on every platform, just what was easily doable.

For each measured time, I ran the simplebench script/binary more than 10 times and took the shortest achieved time:

| Platform | Lua | Python 2.7 | PyPy | NodeJS | GCC/C++ | Mono/C# | LuaJIT | Godot/GD | Rust |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Raspberry Pi 4 / Cortex A72 1.6GHz | 0.61s | 4.3s | 0.14s | 0.061s | 0.045s | 0.136s | 0.087s | | |
| Raspberry Pi 3 / Cortex A53 1.4GHz | 1.191s | 10.22s | 0.34s | 0.157s | 0.092s | 0.49s | 0.163s | | |
| Raspberry Pi Zero W / ARMv6 1GHz | 5.32s | 60.62s | 1.78s | 0.682s | 0.29s | 2.091s | 1.004s | | 0.25s |
| Laptop / Core i5-4258U 2.4GHz | 0.22s | 1.2s | 0.041s | 0.018s | 0.015s | 0.045s | 0.025s | 0.637s | |
| Laptop / AMD E-450 1.65GHz | 1.17s | 7.2s | 0.210s | 0.110s | 0.079s | 0.233s | 0.152s | | |
| Mobile / Snapdragon 855 2.84GHz | 0.36s | 1.68s | | 0.013s | 0.020s | | | 0.758s | |
| Laptop / Core i7-8550U 1.8GHz | 0.18s | 1.0s | 0.037s | 0.012s | 0.011s | 0.053s | 0.018s | | 0.004s |
| Laptop / Pentium 4415Y 1.6GHz | 0.46s | 3.65s | 0.094s | 0.020s | 0.030s | | | 1.411s | |
| Desktop / AMD Ryzen 5 1600 3.2GHz | 0.22s | 1.38s | 0.031s | 0.008s | 0.011s | | 0.011s | 0.725s | |
| Desktop / AMD Ryzen 7 3700X 3.6GHz | 0.156s | 1.26s | 0.026s | 0.007s | 0.008s | 0.014s | | 0.496s | |
| ShieldTV / Cortex A57 2.01GHz | 0.745s | 4.73s | | 0.049s | 0.012s | | 0.056s | 1.496s | 0.012s |
| Laptop / Atom x5-Z8350 1.44GHz | 0.998s | 8.78s | 0.433s | 0.095s | | | | | |
| Mini PC / Pentium 4415U 2.3GHz | 0.269s | 1.52s | 0.043s | 0.014s | 0.014s | 0.063s | 0.016s | | 0.007s |

Conclusions:

Snapdragon 855

Mobile phone processors are catching up to laptop processors very quickly. I’ve read that the Snapdragon 855 is similar in performance to a current-gen Core i3, and these results are consistent with that. It’s especially impressive that the Snapdragon 855 almost exactly matches a desktop Ryzen 5 1600 in Godot/GDScript and comes very close in Lua and Python.

Python

CPython is incredibly slow compared to everything else, which comes as no surprise: some of my old CPython/PyGame games struggled to hit 60 fps on the laptop processors of the time without some sort of just-in-time compilation thrown in (back then I used Psyco).

The PyPy runtime (a descendant of Psyco, with full JIT compilation) is impressively quick in comparison to CPython; it almost matches Mono, actually. It’s a surprise PyPy hasn’t become the dominant Python runtime yet! I think it ought to be.

Lua and LuaJIT

As expected, Lua is very fast for a fully interpreted language. I wish CPython were closer to that.

LuaJIT is damn impressive: actually really close to native C++ performance, which is insane.

Javascript

The anomaly here is NodeJS, which uses the excellently optimized Google V8 engine. Not only did it match native C++ performance, it actually surpassed it on several platforms. I have no explanation other than blaming it on timer precision, but I’m not surprised its performance is that good, as JavaScript not only runs most of the web but also a growing list of desktop/mobile applications, like the very editor I’m using to write these words now.

It’s worth noting that the startup time when running node simplebench.js is almost as long as compiling the C++ version. This suggests some hardcore JIT compilation is taking place before actual execution of the script starts.

Godot/GDScript

Godot’s GDScript is not as bad as I thought it would be, at only about 2x faster than CPython. On its own that would make Godot a terrible solution for bigger games, but luckily Godot allows C++ modules to be used for critical bits, and Mono/C# support is almost ready for prime time. I’m hoping at some point they decide to reimplement GDScript to compile to Mono.

Published 03 October 2019 · Category: benchmarking

With the weather not being so nice this weekend, and a quick two-week vacation coming up where it’s summertime, I thought I’d do a fun indoors thing on Saturday.

A while ago I set up TIC-80 on the NVIDIA Shield TV and hooked up a Bluetooth keyboard and mouse; the idea is to do some light gamedev directly on the TV while chilling on the couch. It’s about the most relaxed form of development possible :)

I thought I’d try something very simple, so I went with a minimal Tetris written as straightforwardly as possible. I enjoyed it thoroughly, and developing it in Lua was easy and fast. It only took me a few hours to cover all the features I wanted. I think this might be the fastest game I’ve ever finished to date!
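
To give an idea of how little code it takes, below is a stripped-down sketch of the kind of core loop TVTetris is built around. It’s an illustration rather than the actual cart source: one hard-coded piece, no rotation, no line clearing or scoring.

-- Minimal TIC-80 Tetris core sketch: a 10x17 well, one falling piece,
-- left/right movement, and a gravity tick. TIC() is TIC-80's 60 fps callback;
-- cls/rect/btnp/time are TIC-80 API calls.
local W, H, CELL = 10, 17, 8
local well = {}                        -- well[y][x] holds a color when occupied
for y = 1, H do well[y] = {} end

-- one hard-coded 2x2 piece for the sketch; a real Tetris has 7 tetrominoes
local piece = {x = 4, y = 1, cells = {{0,0},{1,0},{0,1},{1,1}}, color = 12}
local last_drop = 0

local function hits(px, py)
  -- would the piece collide with the walls, floor, or locked cells at (px, py)?
  for _, c in ipairs(piece.cells) do
    local x, y = px + c[1], py + c[2]
    if x < 1 or x > W or y > H or (well[y] and well[y][x]) then return true end
  end
  return false
end

local function lock_piece()
  -- stamp the piece into the well, then respawn at the top (no game-over check)
  for _, c in ipairs(piece.cells) do
    well[piece.y + c[2]][piece.x + c[1]] = piece.color
  end
  piece.x, piece.y = 4, 1
end

function TIC()
  -- buttons 2/3 are left/right on TIC-80's default gamepad mapping
  if btnp(2) and not hits(piece.x - 1, piece.y) then piece.x = piece.x - 1 end
  if btnp(3) and not hits(piece.x + 1, piece.y) then piece.x = piece.x + 1 end

  -- gravity: drop one row every 500 ms, lock the piece when it can't fall
  if time() - last_drop > 500 then
    last_drop = time()
    if not hits(piece.x, piece.y + 1) then piece.y = piece.y + 1 else lock_piece() end
  end

  -- draw the locked cells and the falling piece
  cls(0)
  for y = 1, H do for x = 1, W do
    if well[y][x] then rect((x-1)*CELL, (y-1)*CELL, CELL, CELL, well[y][x]) end
  end end
  for _, c in ipairs(piece.cells) do
    rect((piece.x + c[1] - 1)*CELL, (piece.y + c[2] - 1)*CELL, CELL, CELL, piece.color)
  end
end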

Play TVTetris here

[image: tvtetris]

Published 04 August 2019 · Category: gamedev

When I bought my Amiga A500 from TradeMe, it came with a compatible monitor that had some old-monitor problems related to picture quality. I expected that, as CRT monitors do tend to deteriorate in quality over the years and also tend to die suddenly.

[image: my_amiga_a500]

I was not proven wrong: the monitor did die a few weeks later with a click and a whine.

The Amiga A500 has a video-out port, but puzzlingly it outputs in greyscale only.

I searched for how to connect the Amiga to a modern monitor using either RGB or HDMI, and found that the only (indirect) way to do it is an ugly Commodore adapter device called the A520, which provides an RCA video-out signal in color as well as RF out.

[image: a520]

Someone was selling a pair of these on eBay, so I got them. They both worked, but one of them seemed to need some maintenance, as it required a bit of fiddling around when plugged in to output in color; sometimes it insists on only outputting blurry greyscale.

I ran the output to an LCD TV using the RCA video and audio out; both produced an awful, blurry picture that made it very difficult to read any text. Since then the Amiga has been all but unusable.

Modifying A520 to output S-Video

I had to decide what to do with the Amiga: either sell it, or find a way to get good-quality output and make it usable again. While doing a quick search, I found a great step-by-step guide for converting the A520 to output an S-Video signal.

I decided to try it. I ordered all the electronic components needed two weeks ago, and when they arrived I set up a work area and spent a day going through all the steps. After many hours and three soldering-iron burns I got it done; I just needed to test it.

S-Video to RCA Video Out

S-Video output means I get two signals out of the modified A520: one called Chroma (the C signal) and one called Luma (the Y signal, for some reason). It wasn’t clear in the guide how to convert that to a single RCA video-out signal. There are commercial S-Video-to-RCA converters, but I felt that since I’d gone this far, I might as well try adding the conversion to the circuit.

Upon googling, I was surprised to find that this conversion requires a single component and is extremely easy to do: just a single 470 pF capacitor bridging the Chroma and Luma outputs.

[image: svideo_to_rca]

I did a quick breadboard test and voila! The video out displayed on the TV:

[image: svideo_testing]

[image: svideo_testing2]

The difference was very clear even through my phone’s camera:

[image: svideo_before_after]

After adding that capacitor to the output, then rewiring and reseating the board, we have a modified A520 ready for use:

[image: a520_ready]

I left it running for several hours to make sure everything was working as it should.

[image: svideo_ready]

Published 01 June 2018 · Category: amiga