
etaioinshrdlu

It looks like you're using an iCE40 FPGA?

If you can make your project work with 38 I/O pins, you could probably get it fabricated as an ASIC by Efabless for free. You just need to meet their repository requirements and make your Verilog module conform to this interface: https://github.com/efabless/caravel/blob/master/verilog/rtl/...

The OpenLane tool converts your Verilog all the way down to a final ASIC design...

You'd need to license everything Apache 2.0 though. And it would have to be done soon; the deadline is Nov 30th.

nickmqb

In terms of I/O pins that should actually be fine (current pinout: VGA (14x), SPI (4x), CLK (1x), N64_controller (1x)), though I'm still working on the project and I don't think it'll be done by the deadline. It looks like they might do future batches though -- it's a cool idea!

primis

If you're using an N64 controller, you might want to consider using a GameCube one instead (it uses the same control logic, it just adds a 5V line for rumble features).

Unless you're particularly fond of the N64's controller design, that is.

nickmqb

That's a good idea. The two analog sticks on the GC controller would be an improvement over the single stick on the N64 controller for movement in 3D. I think that the main benefit of the N64 controller (besides a nostalgia factor, though that may just be me ;)) is how easy it is to connect. I actually just got some wire from the local hardware store, plugged pieces of it into the controller connector, and then attached some IC clips. For the GC controller, things are a bit trickier due to its connector layout, though I just found [1] which might be a nice solution; alternatively, buying a GC controller extension cord and wire stripping could be an option. I'll consider it!

[1] https://www.raphnet-tech.com/products/gc_controller_connecto...

etaioinshrdlu

It's just as well :) None of the code for the submission process they have you use actually works. It's rather insane what they are asking developers to do.

mysterydip

Don't you only need 5 pins for VGA: R, G, B, HSync, and VSync?

nickmqb

VGA is an analog protocol, but the FPGA can only output a 0 (GND) or 1 (3.3v) on its I/O pins. I'm using a Digilent VGA Pmod [1] which uses a set of resistor ladders to map each color component from a 4-bit value to an analog voltage that goes to the monitor. This means that we have 14 pins: R (4x), G (4x), B (4x), HS (1x) and VS (1x).

[1] https://store.digilentinc.com/pmod-vga-video-graphics-array/
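The resistor-ladder idea can be modeled numerically: each of the 4 bits drives its own resistor into the monitor's 75-ohm termination, and the bits driven high form a voltage divider against the bits driven low. The resistor values below are illustrative guesses (not the Pmod's actual values), chosen so that code 15 lands near VGA's nominal 0.7 V full scale:

```python
# Model of a 4-bit binary-weighted resistor DAC driving a 75-ohm
# VGA input. Resistor values are illustrative, not the Pmod's actual ones.
V_HIGH = 3.3      # FPGA output high level (volts)
R_TERM = 75.0     # VGA monitor input termination (ohms)
# One resistor per bit; bit 3 (MSB) gets the smallest resistor so it
# contributes the most current.
R_BITS = [4120.0, 2050.0, 1020.0, 510.0]  # bits 0..3, illustrative

def dac_voltage(code):
    """Output voltage at the monitor for a 4-bit color code (0..15)."""
    # Conductance of bits driven high (to 3.3 V) and low (to GND).
    g_high = sum(1.0 / r for bit, r in enumerate(R_BITS) if code >> bit & 1)
    g_low = sum(1.0 / r for bit, r in enumerate(R_BITS) if not code >> bit & 1)
    # Node voltage from the divider formed with the termination resistor.
    return V_HIGH * g_high / (g_high + g_low + 1.0 / R_TERM)

for code in (0, 1, 8, 15):
    print(code, round(dac_voltage(code), 3))
```

With these values, code 0 gives 0 V and code 15 comes out around 0.71 V, close to the 0.7 V that analog VGA expects for full brightness.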

throwaway122kk

This looks very good, well done! Microsoft needs to hire you for the team ASAP!

It always cracks me up playing Minecraft on Xbox One X: once your village hits a few hundred villagers, the framerate (shown on screen when running the beta from the Insider program) drops to like 3-4 fps.

makapuf

Wow, any details? Some code? A repo? What graphical output, VGA or HDMI? What input? This is very intriguing...

nickmqb

The screen is connected to the FPGA over VGA. The output resolution is 1024x768 @ 60Hz, but the 3D portion of the screen is rendered at 256x128 @ 30Hz. The design consists of a custom 16-bit CPU (running at 32.6 MHz) and a custom raytracing "GPU" that can handle up to 4 rays in parallel.

Input happens via an N64 controller! Those are actually fairly easy to work with at a low level.
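For context on why the N64 controller is considered easy: it uses a single bidirectional data line where, per the commonly documented protocol, each bit takes 4 us and the width of the low pulse distinguishes a 0 (3 us low, 1 us high) from a 1 (1 us low, 3 us high). A sketch of that encoding and a width-measuring decoder (not the author's actual FPGA logic):

```python
# Decoding the N64 controller's single-wire protocol from a sampled
# waveform. Each bit takes 4 us: a 0 is 3 us low + 1 us high, a 1 is
# 1 us low + 3 us high, so the low-pulse width identifies the bit.
# Timings follow the commonly documented protocol; this is a sketch,
# not the project's actual implementation.

def encode_bits(bits, samples_per_us=8):
    """Render a bit sequence as line samples (1 = high)."""
    wave = []
    for b in bits:
        low_us, high_us = (1, 3) if b else (3, 1)
        wave += [0] * (low_us * samples_per_us)
        wave += [1] * (high_us * samples_per_us)
    return wave

def decode_bits(wave, samples_per_us=8):
    """Recover bits by measuring each low pulse's width."""
    bits, i = [], 0
    while i < len(wave):
        # Skip high samples to find the next falling edge.
        while i < len(wave) and wave[i] == 1:
            i += 1
        if i >= len(wave):
            break
        start = i
        while i < len(wave) and wave[i] == 0:
            i += 1
        low_us = (i - start) / samples_per_us
        bits.append(1 if low_us < 2 else 0)
    return bits

poll = [0, 0, 0, 0, 0, 0, 0, 1]  # 0x01: the standard poll command
assert decode_bits(encode_bits(poll)) == poll
```

On an FPGA the same idea reduces to a counter that measures how long the line stays low after each falling edge.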

The code is not public, though I'm considering open sourcing the project when it's done. Moreover, there are a lot of additional details that I could potentially go into, so I'm also considering writing a few blog posts with more info if people are interested!

guiambros

Pretty impressive; definitely interested in hearing more.

Did you use your Wyre [1] language to develop it? I saw the examples on GitHub, and it seems pretty interesting. It cuts quite a bit of the verbosity out of Verilog. I'll give it a try.

[1] https://github.com/nickmqb/wyre

nickmqb

Yes, I'm using Wyre for this. Feedback is always welcome, so once you've had a chance to try it, don't hesitate to let me know what you think!

anfractuosity

It looks very cool! What's the GPU & CPU written in out of interest, I think I saw on your twitter you've written something to translate to verilog, is that used for this?

nickmqb

Yup, that's correct, the design is implemented in Wyre, which is a hardware definition language that I created. The language compiles to Verilog so it can work with existing hardware development toolchains. The language is open source and can be found here: https://github.com/nickmqb/wyre

kingosticks

Very interested, please do!

young_unixer

Are you using Minecraft's textures? Because those look very similar to Minecraft's actual textures.

nickmqb

Yes, I am. This also means that any open source distribution won't include those textures. However, if I do end up open sourcing the work I'll make sure to include instructions for people that already own Minecraft; the textures are just .png files that can be extracted from the game's .jar file and can then be transformed to be used on the FPGA.
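Since a .jar file is just a zip archive, the extraction step can be sketched with the standard library. The internal path below matches recent Minecraft versions but is an assumption (it varies by version), and the 4-bits-per-channel quantization is one plausible way to match a 12-bit VGA pixel format, not necessarily the author's:

```python
# Sketch of pulling block textures out of a Minecraft .jar (a zip
# archive). The internal asset path is an assumption and varies by
# game version. Quantizing each channel to 4 bits gives a 12-bit
# pixel, matching 4 bits per color on the VGA side.
import zipfile

def extract_block_textures(jar_path, out_dir):
    """Extract the block texture .png files from a Minecraft jar."""
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            if (name.startswith("assets/minecraft/textures/block/")
                    and name.endswith(".png")):
                jar.extract(name, out_dir)

def quantize_rgb888_to_rgb444(r, g, b):
    """Drop each 8-bit channel to its top 4 bits for a 12-bit pixel."""
    return (r >> 4) << 8 | (g >> 4) << 4 | (b >> 4)
```

Reading the pixel data out of the extracted .png files would additionally need an image library such as Pillow.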

guavaNinja

I suggest you look at Minetest [1] textures. It's an open source Minecraft clone. Most textures have CC or MIT licenses; read license.txt for each texture mod before you use them. They may help in open sourcing your project.

[1]: https://github.com/minetest/minetest_game

franga2000

I can't remember any off the top of my head, but I'm pretty sure there are a few open source texture packs for Minecraft that would be a drop-in replacement (same filenames and structure).

takenpilot

This is insane and I love it.

mentos

How is the rendering performance with large worlds?

nickmqb

The FPGA that I'm using for this (the Lattice iCE40 UP5K) is really limited when it comes to RAM, which is the main constraint on world size. As per the title, there's only 143kb, which is insanely low for doing any kind of 3D stuff :). 48kb is used by the frame buffer and 19kb for textures, which leaves 76kb. 48kb of that is used for the map, which currently limits it to 32x32x32 blocks. However, I do have some plans to improve on that in the future!
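The budget checks out if both the frame buffer and the map use 12 bits per entry (12 bits/pixel matches the 4-bits-per-channel VGA output; the bits-per-block figure is inferred from the sizes given, not confirmed by the author):

```python
# Checking the RAM budget: a 256x128 frame buffer at 12 bits/pixel
# and a 32x32x32 map at 12 bits/block each come to exactly 48 KB.
# The bits-per-entry figures are inferred, not stated by the author.
KB = 1024
total = 143 * KB

frame_buffer = 256 * 128 * 12 // 8   # bytes
map_storage  = 32 ** 3 * 12 // 8     # bytes
textures     = 19 * KB

print(frame_buffer // KB)                                      # 48
print(map_storage // KB)                                       # 48
print((total - frame_buffer - textures - map_storage) // KB)   # 28 KB left
```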

The FPS (30Hz) is rock steady though! One of my pet peeves when doing DirectX/OpenGL development is that it's really hard to completely avoid frame drops, e.g. if the OS decides to schedule some other thread, or because the GPU driver decided to do something other than render your app. With hardware development, you can sidestep all of those problems. As a result, the Minecraft clone is guaranteed to not drop frames :).

gnramires

Have you thought of going Shadertoy style and doing everything procedurally? Or every block procedurally? That way you can cut RAM as much as you wish. For example, if you have a procedural formula to determine if a block is populated, you don't need to store it in RAM, just use this formula in the renderer directly (in Shadertoys this usually would repeat per-pixel).
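To make the suggestion concrete, "procedural blocks" means deciding whether a voxel is solid from a formula instead of a stored map. This toy heightmap uses a small integer hash; the constants and the hash itself are made up for illustration, and any real version on this FPGA would have to fit within a 16x16-bit multiplier budget:

```python
# Toy procedural terrain: a voxel (x, y, z) is solid iff y is at or
# below a height derived from a cheap 16-bit hash of (x, z). The hash
# and constants are illustrative, not from the project.
def hash16(x, z):
    """Cheap 16-bit integer hash (illustrative)."""
    h = (x * 0x9E37 + z * 0x79B9) & 0xFFFF
    h ^= h >> 8
    return (h * 0x6A09) & 0xFFFF

def is_solid(x, y, z):
    """Decide solidity from a formula; no map storage needed."""
    height = 8 + (hash16(x, z) & 0x7)   # terrain height in 8..15
    return y <= height

column = [is_solid(5, y, 7) for y in range(16)]
```

The renderer would evaluate `is_solid` at every voxel a ray visits, trading RAM for per-ray compute.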

nickmqb

It did cross my mind. However, a problem with that approach is that evaluating such a formula is too costly/inaccurate on a small FPGA like this, which has just 8 DSPs (that can only do 16x16 bit multiplication), and some of these are already in use in other parts of the design.

jfries

If you have a guarantee for the worst case of generating a pixel (which you indicate by saying that you never drop frames), couldn't you get rid of the framebuffer? Schedule pixel generation so that pixels complete just in time for when they're needed for output.

This would free up RAM for other things (and be a fun exercise to get right).

nickmqb

That's a good observation! This technique is also known as "racing the beam". The problem is a mismatch of refresh rates; the VGA display operates at 60 Hz but the ray tracer is not capable of producing that many pixels, it can only do 30 fps. So we need a frame buffer to store the result.
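The mismatch is easy to quantify from the numbers given earlier (counting visible pixels only, ignoring blanking intervals):

```python
# Why racing the beam doesn't fit here: the VGA output consumes
# pixels far faster than the ray tracer produces them, so completed
# frames must be buffered. Figures from the resolutions given above.
vga_pixels_per_sec = 1024 * 768 * 60   # visible output pixels
rays_per_sec       = 256 * 128 * 30    # ray tracer throughput

print(vga_pixels_per_sec // rays_per_sec)   # 48x mismatch
```

The 48x factor decomposes as 4x horizontal upscale, 6x vertical upscale, and 2x frame-rate doubling, which is exactly why each rendered frame has to be held in a buffer and scanned out twice.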

Impossible

On consoles and embedded systems, the OS is either non-existent or gives your game guarantees about when it will schedule something and how often. Hardware obviously gives you way more control, but a bare-metal Raspberry Pi project, Arduboy, console homebrew, etc. can give you some of that control back in software. Awesome project btw

roblabla

As an example of this, on the Nintendo Switch, games run on three cores dedicated to the game, while the rest of the OS tasks run on the fourth core. Furthermore their scheduler gives precise guarantees about the scheduling order of threads spawned on the same core. It makes sustained 60fps achievable through careful design.

mentos

Thanks for the reply incredible work!

Are there any FPGAs out there with an order of magnitude more memory you have considered?

nickmqb

There are definitely FPGAs that are a lot more capable than the FPGA that I'm using. For example, see my reply to flatiron here: https://news.ycombinator.com/item?id=25172781

While it would have been easier to use a larger/faster FPGA, part of the fun of such a low level project is to work within harsh constraints and see what can be done regardless :).

b20000

this is super awesome! did you build the raytracing stuff from the ground up?

nickmqb

Thanks! That's correct, I built the ray tracing "GPU" from scratch. It's highly optimized because it needs to trace 256 * 128 * 30 = 0.98 million rays per second on very underpowered hardware. It's specifically tailored to fast traversal of voxel grids. There are too many details to go into here, but as I wrote in another comment, I'm considering writing a few blog posts to explain how everything works in more detail!
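A standard approach to fast voxel-grid traversal is the Amanatides-Woo DDA: step one voxel at a time along whichever axis boundary the ray crosses next, using only additions per step. This is a generic sketch of that technique, not the project's actual GPU logic:

```python
# Amanatides-Woo style voxel grid traversal (generic sketch).
import math

def traverse(origin, direction, grid_size=32, max_steps=96):
    """Yield the voxel coordinates a ray visits, in order."""
    step = [1 if d > 0 else -1 for d in direction]
    # t_max: ray distance to the next boundary on each axis;
    # t_delta: distance between successive boundaries on that axis.
    t_max, t_delta = [], []
    for c, d, s in zip(origin, direction, step):
        if d == 0:
            t_max.append(math.inf)
            t_delta.append(math.inf)
        else:
            next_boundary = math.floor(c) + (1 if s > 0 else 0)
            t_max.append((next_boundary - c) / d)
            t_delta.append(abs(1.0 / d))
    pos = [int(c) for c in origin]
    for _ in range(max_steps):
        if not all(0 <= p < grid_size for p in pos):
            return
        yield tuple(pos)
        axis = t_max.index(min(t_max))   # nearest boundary wins
        pos[axis] += step[axis]
        t_max[axis] += t_delta[axis]

path = list(traverse((0.5, 0.5, 0.5), (1.0, 0.0, 0.0)))
```

The inner loop needs only comparisons and additions, which is what makes it a natural fit for FPGA hardware with scarce multipliers.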

Shared404

> I'm considering writing a few blog posts to explain how everything works in more detail!

If you do, I'd love to read them.

flatiron

Did you consider porting it to mister?

nickmqb

I probably won't port it to other platforms, however, as I wrote in another comment, I am considering open sourcing it when it's done, so others can port it if they like.

I just looked up the MiSTer board [1]. They are using a DE10-Nano FPGA [2], which is a lot more powerful than the iCE40 UP5K [3] that I'm using (for comparison, the DE10-Nano has 110,000 lookup elements, whereas the iCE40 UP5K just has 5,280, so ~20x difference!). So porting should be pretty easy. It also opens up opportunities to increase the resolution, frame rate and render distance.

[1] https://github.com/MiSTer-devel/Main_MiSTer/wiki [2] https://www.digikey.com/en/product-highlight/t/terasic-tech/... [3] https://www.latticesemi.com/en/Products/FPGAandCPLD/iCE40Ult...

MaxBarraclough

Minecraft on 40mW. Very cool.

layoutIfNeeded

Wow, this is some Fabrice Bellard tier stuff! Very impressive!

vaccinator

a Minecraft clone in hardware means all code is hardware?

nickmqb

Yes and no. The design includes a custom-built 16-bit CPU, which uses a custom instruction set that I wrote an assembler for. There is a small 4kb bank of RAM that contains a program written in this instruction set. From a hardware perspective it is just data, but from a software perspective it's that program that is ultimately responsible for running the game (by reading input from the gamepad module, setting up GPU registers, etc.).
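To illustrate what "an assembler for a custom 16-bit instruction set" involves, here is a minimal two-pass assembler for a made-up ISA. The real instruction set is not public, so every opcode and the 4-bit-op/12-bit-operand encoding below are hypothetical:

```python
# Minimal two-pass assembler for a hypothetical 16-bit ISA:
# pass 1 records label addresses, pass 2 emits 16-bit words with the
# opcode in the top 4 bits and the operand in the low 12.
OPCODES = {"LDI": 0x1, "ADD": 0x2, "JMP": 0x3, "HLT": 0xF}  # made up

def assemble(lines):
    labels, words, addr = {}, [], 0
    # Pass 1: assign an address to each label.
    for line in lines:
        line = line.strip()
        if line.endswith(":"):
            labels[line[:-1]] = addr
        elif line:
            addr += 1
    # Pass 2: encode each instruction as one 16-bit word.
    for line in lines:
        line = line.strip()
        if not line or line.endswith(":"):
            continue
        parts = line.split()
        op = OPCODES[parts[0]]
        arg = 0
        if len(parts) > 1:
            arg = labels.get(parts[1])
            if arg is None:
                arg = int(parts[1], 0)   # numeric literal
        words.append(op << 12 | arg & 0xFFF)
    return words

prog = assemble(["start:", "LDI 5", "ADD 1", "JMP start", "HLT"])
```

The resulting word list is exactly the kind of data that would be loaded into the 4kb program RAM at startup.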

function_seven

Yup.

The assembly language for this chip will be redstone :)


I'm building a Minecraft clone in hardware, on a tiny FPGA with 143kb of RAM - Hacker News