
17 Jan 2010

node.js on cygwin

I've been playing with node.js a bit recently and really loving it. The only problem I have is that it doesn't run on cygwin, and my WebGL stuff doesn't yet work well on OS X, and I really don't feel like messing with VMs and all that stuff. So, now I'm trying to get it working under cygwin. Here are some of the hacks I've used:

  • install cygwin 1.7, as it has IPv6 support and other things needed by node.js and its dependencies
  • add to /usr/include/pthread.h: int pthread_atfork(void (*prepare)(void), void (*parent)(void), void (*child)(void)); (the function exists, but isn't declared in the header)
  • sync to the latest master branch of nodejs
  • apply the patches
  • ./configure && make install

Patch 1 is a small diff that allows the V8 build tools to identify cygwin as 'win32'. Whether or not this is the right thing to do I don't know, but it seems to work. The other part replaces __MINGW32__ guards with __GNUC__ guards in the Win32-specific platform code wherever the guarded functionality is the same on MinGW and cygwin; there are only a few minor differences.

Patch 2 contains some minor changes that allow cygwin to be recognized as a platform (maybe not the right thing to do?) and to switch off the LINKFLAGS for V8 based on that. The trick to getting a successful link of node.exe is to make sure that libwinmm.a and libws2_32.a are passed to g++ AFTER libv8.a; if the ordering is any different it will fail to link. Unfortunately, I don't know anything about this WAF stuff, so I just found a reasonable location and hacked the references in there. It works, but may not be 'correct'.

Patch 3 adds utils.IsCygwin() to make some of the logic cleaner and fixes up the V8 and node test scripts. The node tests run, but a few fail with segfaults for some unknown (to me) reason. I suck at debugging with gdb, and until I get some crashes in my code I'm going to ignore them.

These patches should be rolled up, but I don't know a clean way to do that.

node.exe should appear under /usr/local/bin (with extras under /usr/local/lib and /usr/local/include). Cygwin has these paths set up by default, so you should be able to just run 'node -v' and be ready to go!
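As a quick sanity check that the build actually works, here's about the smallest script I can think of - a minimal sketch assuming the node 0.1.x-era 'sys' module (its helpers were later folded into util/console), so adjust if your build is newer:

```javascript
// hello.js - smoke test for the cygwin build of node
var sys = require('sys');            // early-node logging module
sys.puts('node ' + process.version); // prints whatever version the build reports
```

Run it with 'node hello.js'; if a version string comes back, the binary under /usr/local/bin is doing its job.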

If you're lazy like me, you can download a pre-built version of node for cygwin 1.7 and extract it to /usr/local/.

Disclaimer: I have no clue what I'm doing and this is all probably very, very wrong. I haven't tested this with anything besides a few of my little node projects, and I can't support you if your stuff doesn't work.

8 Jan 2010

MegaTextures in WebGL broken

I decided to update my WebGL code to run against the latest spec, and also look into the Firefox compatibility issues (it runs in Chromium just fine). Turns out that the 'compatibility issue' is not a bug in Firefox, but instead a bug in Chromium!

The root of the problem is the same-origin policy that browsers (should) implement. It's a security feature: for a given page with script running on a given domain, that page can only access privileged information from that same domain (or 'origin'). If you've played with AJAX, it's why there are a bunch of hacks for setting up proxies to grab content from other services. Fortunately you won't hit this issue very often in WebGL, because to keep the web working the same-origin policy only restricts the page from reading the information, not the user from seeing it. This is what allows you to have <img src="http://someotherdomain.com/foo.jpg"> - the page never looks at the pixels of the image, but the user can.
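A concrete way to see this in action, as a hypothetical sketch using a plain 2D canvas rather than WebGL: the page may draw a cross-origin image, but the moment it tries to read the pixels back the canvas is treated as 'tainted' and the read is refused.

```javascript
// Sketch only: same-origin tainting with a 2D canvas.
var img = new Image();
img.onload = function () {
  var canvas = document.createElement('canvas');
  var ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0);            // fine: the user gets to see the pixels
  try {
    ctx.getImageData(0, 0, 1, 1);      // blocked: now the page would see them
  } catch (e) {
    // security error: the canvas was tainted by cross-origin data
  }
};
img.src = 'http://someotherdomain.com/foo.jpg'; // the cross-origin URL from above
```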

Unfortunately, MegaTextures relies on reading back the contents of the rendered scene to figure out what to draw next. A texture is loaded from another domain (ok) and drawn (also ok), but as soon as you draw a frame with that texture you can no longer read back the contents, because then you, the page, could access that privileged information. This works in Chromium today because Chromium does not check the same-origin policy in WebGL (yet). Firefox does.
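For reference, this is roughly the call sequence that trips the check as described above - upload ok, draw ok, read-back refused. It's written against the current WebGL API shapes (the early drafts had slightly different texImage2D/readPixels signatures), and 'gl', 'texture', and drawFeedbackPass() are assumed to already exist:

```javascript
// Sketch of the MegaTextures feedback read-back that Firefox blocks.
var img = new Image();
img.onload = function () {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                gl.RGBA, gl.UNSIGNED_BYTE, img); // cross-origin upload: allowed
  drawFeedbackPass();                            // drawing with it: also allowed

  var w = gl.drawingBufferWidth, h = gl.drawingBufferHeight;
  var pixels = new Uint8Array(4 * w * h);
  gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  // ^ this read-back is exactly what the feedback buffer needs, and exactly
  //   what gets refused once cross-origin data has touched the framebuffer
};
img.src = 'http://someotherdomain.com/tile.jpg';  // hypothetical tile URL
```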

Just to be clear, I hate this restriction, and it always ends up biting me in the ass. It's great for security and all, and as a user of a web browser I'm glad it's there, but damn is it annoying!

So... what next? There are a few possible solutions. One is to just move all the content into the same domain as the running page. This is the easiest fix, as it requires no code changes, but it prevents hosting content on AWS/Akamai/etc. (or, in my case, seadragon.com) or even other local image farms. Another solution, which requires a lot of coding, would be to have two WebGL canvases and draw the feedback buffer (with no textures) into one and the real scene into the other. The complication is that you would need a copy of every shader and every piece of geometry in both canvases, as they are separate GL contexts and cannot share anything.
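A rough sketch of what that two-canvas approach could look like - createShaders() and createGeometry() are hypothetical helpers, the canvas IDs are made up, and the context name here is the modern 'webgl' (back then it was vendor-prefixed/experimental):

```javascript
// Two separate canvases => two separate GL contexts that share nothing.
var feedbackGL = document.getElementById('feedback-canvas').getContext('webgl');
var sceneGL    = document.getElementById('scene-canvas').getContext('webgl');

// Every shader and every piece of geometry has to be created once per context.
var feedback = { shaders: createShaders(feedbackGL), geometry: createGeometry(feedbackGL) };
var scene    = { shaders: createShaders(sceneGL),    geometry: createGeometry(sceneGL) };

// The feedback pass draws untextured geometry, so its canvas never sees
// cross-origin pixels and readPixels() on feedbackGL stays legal; the real
// scene, with its cross-origin megatexture tiles, is only ever drawn into
// sceneGL and never read back.
```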

Bleh. Stupid Firefox doing the right thing - I wish it would just ignore it all like Chromium does ^_^