HWND Hunter

When using Windows API functions such as SendMessage, you often need to locate the HWND of a particular control on a particular window at runtime. HWND Hunter speeds up this process: it generates code that determines HWNDs programmatically, using a Spy++-style window-finding method. The following function, FindHWND_AIM(), was generated by HWND Hunter and locates the HWND of an open AOL Instant Messenger 5.9 window’s text box:

HWND FindHWND_AIM()
{
   // class type: Ate32Class
   HWND base = FindWindowS("AIM_IMessage", "- Instant Message");
   if ( base == NULL ) return NULL;
   int scan[2] = {8, 0};
   return IterateHWNDs(base, 2, scan);
}

This function can then be used in code such as the following:

#include <windows.h>
int main()
{
   HWND aim_hwnd = FindHWND_AIM();
   if ( aim_hwnd != NULL )
      SendMessage(aim_hwnd, WM_SETTEXT, 0, (LPARAM)"Voodoo!");
   return 0;
}

All FindHWND_*() functions use a custom function (provided with HWND Hunter) called FindWindowS, which works much like FindWindow except that it searches for a substring match instead of an exact match. Thus, "- Instant Message" matches the window caption "thekrispykremlin: thekrispykremlin - Instant Message", which would be the window for me sending an IM to myself. A parameter of NULL can be passed for the class name, the window caption, or both, and causes that parameter to match any value, much like an empty search string.
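The matching rule FindWindowS uses is easy to model on its own: a NULL pattern matches anything, and a non-NULL pattern matches if it appears anywhere in the candidate string. Here’s a minimal sketch of that predicate (the name matches_pattern is mine, not HWND Hunter’s):

```c
#include <stddef.h>
#include <string.h>

/* Substring matching as FindWindowS uses it: a NULL pattern
   matches any value; otherwise the pattern must occur somewhere
   within the candidate string. */
int matches_pattern(const char *pattern, const char *candidate)
{
   if (pattern == NULL)
      return 1;                          /* NULL matches everything */
   if (candidate == NULL)
      return 0;                          /* nothing to search in */
   return strstr(candidate, pattern) != NULL;
}
```

A full FindWindowS would presumably apply this predicate to each top-level window’s class name and caption while enumerating windows, returning the first HWND for which both match.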

I wrote most of the code with a sinus headache, and thus it’s not exactly the cleanest code I’ve ever written. If you plan to venture into the source, you have been warned.

Download HWND Hunter

Download Source

Using HWND Hunter to locate the Notepad textbox.

Silicon Foundry

Silicon Foundry is my latest game in the games for engineers series. You play as the new owner of an integrated circuit factory who must produce chips to meet orders from your customers, and use the profit to pay back the loan for the factory. It’s surprisingly complicated, so I’ve forgone written documentation and made a video tutorial instead. I suggest watching the video as you download the game.

Basic gameplay is divided into two components – an economic macrogame, in which you buy machines, lease and research components, and produce devices to sell to customers, and an engineering minigame in which you must place components on the IC dies, ensuring that the proper connections are brought to the pins and that internal requirements, such as CPU speed and memory, are met. Much like with real ICs, simpler chips result in higher yields – by minimizing the area used for each design, you can produce cheaper chips and turn a higher profit, allowing you to meet the economic macrogame’s victory conditions. While the game has a somewhat steep learning curve, the economic conditions are fairly forgiving, making the game very winnable. Just remember not to produce more chips than your customers will buy!

Update: There was a bug in Silicon Foundry involving localization settings that caused it to crash when formatting numbers. I’ve fixed it and uploaded a new installer. Use the link below to download the new, fixed version.

Download Silicon Foundry



Pulse

Pulse, previously titled Ruckingenur: Puzzle, is the second game in my games for engineers series. In it, you must arrange devices on a grid to redirect and filter pulses in order to achieve different objectives, depending on the mission. It’s sort of like The Incredible Machine, in that you have to assemble pre-created parts to achieve an effect. Once you get past the training missions, each of the missions (there are two) comes in two separate flavors – slim, which contains the exact set of parts required for a specific solution, and normal, which contains far more and allows for open-ended solutions.

There are five different kinds of pulses in Pulse: red, green, and blue pulses are considered data pulses, while grey pulses are clock pulses; all of these can be considered signal pulses. The yellow pulses are power pulses, which cannot always be routed like signal pulses, and are traditionally used to trigger special features in devices.

One of the biggest problems people seem to have with Pulse is figuring out how to use the data latch. The data latch is used to copy data pulses – to use it, send a data pulse into the left side, storing the color of the data pulse inside the data latch. From that point on, hitting the right side with a clock pulse will cause a copy of the data pulse to be shot out to the right. To replace the data stored in the data latch, simply send a new data pulse into the left side.
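In other words, the data latch behaves like a one-slot register: a data pulse on the left overwrites the stored color, and a clock pulse on the right emits a copy of whatever is stored. A toy model of that logic (the type and function names here are mine, not from Pulse):

```c
#include <stddef.h>

typedef enum { NONE, RED, GREEN, BLUE } Color;

typedef struct {
   Color stored;   /* color currently held by the latch */
} DataLatch;

/* A data pulse arriving on the left side replaces the stored color. */
void latch_data(DataLatch *latch, Color pulse)
{
   latch->stored = pulse;
}

/* A clock pulse arriving on the right side emits a copy of the
   stored color (NONE if nothing has been latched yet). */
Color latch_clock(const DataLatch *latch)
{
   return latch->stored;
}
```

Note that clocking the latch copies the stored pulse without consuming it – only a new data pulse on the left replaces it.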

Download Pulse

A screenshot of Pulse in action.

Ruckingenur

I’ve wanted to do something like this for a long time, and finally I’ve been able to. Behold – Ruckingenur! (Roughly German for “reverse engineer”, or so the internet tells me.) Combining real-life electronics concepts with a few made-up technologies (such as iDBG), you can now experience the fun and excitement of hacking fake consumer and industrial hardware (or something like that).

Help is built into the game in the form of an “OMG I AM TOTALLY LOST” button, but here’s a brief overview: use the Zachtronics Industries hypermeter to measure and override voltages at test points on the circuit board; look up datasheets for the various chips to figure out what everything does; use the iDBG interface to connect directly to chips, giving you the ability to read and write data, status, and control registers; use the terminal to connect to more advanced devices, such as computers (not used in this episode). Aside from opening the blast door, there are a few other things on the board you can reverse engineer, such as the LCD and other pins on the chips – try to figure them out!

Unless you’re somewhat familiar with electronics, you’ll probably find this game a little confusing. You’ve been warned.

Download Ruckingenur

Manufactoid

Manufactoid is probably one of my favorite games that I’ve made; in it, you must build a factory to assemble blocks into different products, with each product being a separate puzzle. It’s most definitely a game for engineers, as it requires you to both think like an engineer and write Lua code to control your factory. If you’re okay with a little programming, though, you shouldn’t have any problem getting past the learning curve and might even enjoy the game.

There’s a good chance that it’ll be confusing at first, which is why I went through the effort of writing documentation, which can be found in your start menu after installing. It’s relatively terse, yet explains the basic game mechanics and acts as a reference for both Lua and the factory components.

Download Manufactoid

Assembling Mountain Dew 6-packs in the Manufactoid factory!

XVRM Manifesto

What is xVRM?

My original plans for virtual reality, code-named VRM (virtual reality for the masses), outlined a simple virtual reality system. Featuring a homebrew magnetic tracker and a cheap projection screen, it would deliver the full potential of virtual reality while staying flexible and inexpensive. However, not everything worked as planned; the magnetic tracker turned out to be next to impossible with what I had at the time, and the screens were far from cheap. Alas, virtual reality seemed rather out of reach.

It is from these ashes that xVRM was born.

The first tenet of xVRM is that the typical ideas of what virtual reality should be are rather unreasonable. The expectation of large screens or a head-mounted display that projects a perfect virtual world is rather difficult to meet; a virtual reality tracking system requires an insane degree of accuracy over a wide range, while the displays require a fairly high resolution at a large viewing size – all requirements that can currently be met only at rather ridiculous prices. Not to mention that most of it is old technology that’s probably not even available anymore.

  • Ascension Space Pad Tracker (“affordable”): $1500, and runs off an ISA card
  • Ascension Flock of Birds Tracker: ~$25000, or something like that
  • head mounted displays: $500-$10000+, and probably aren’t that healthy
  • digital projector: $800-$2000+, and you’d most likely need more than one

Which, quite frankly, sucks.

The second tenet of xVRM is that real life handles reality much better than computers do. The laws of physics react instantaneously, meaning there is no lag; let them handle rendering, collision detection, and sensory feedback. Leaning more towards being a simulation than a “virtual experience”, xVRM offers a chance to realize virtual worlds, now.

What do I need for xVRM?

As of writing this, xVRM doesn’t really exist in any form past an idea; however, if it were to be done, a few things would be required:

An Arena

Otherwise known as a fairly large open space. Be it outside, inside, or both, some sort of staging area is necessary for an xVRM simulation. Which kind depends on what you’re simulating.

Some Players

xVRM is generally suited toward some sort of game; instead of AI, you need real people to play with you. Unfortunately, it’s a bit harder than adding bots to a game, but fortunately, real people are trickier opponents (well, most of them).

Cool Technology

To make the simulation come alive, you need some form of simulation technology. Be it something like laser tag, some sort of wireless fencing, or a vehicle simulator, the technology brings it all together. In fact, it’s essentially the whole thing.

What would an xVRM game be like?

For the most part, it depends on what you’re trying to simulate. Some examples:

  • Modern Combat: realistic laser tag weapons and rules, some sort of objective gear (bombs to be defused?)
  • Unreal Tournament: insane laser tag guns, less realistic suits, capturable flags, respawn points
  • Medieval Combat: cableless fencing-like gear, perhaps some type of siege engine / base system?
  • Planetary Combat: simulated capital spaceships with flight hangars and miniature fighter plane simulators, mech simulators and futuristic laser tag gear for ground missions, tied in with mission support from the capital ship

Ideally, they’d all use some sort of universal, cross-compatible protocol. Maximum hardware compatibility would allow for reusability between simulations, which would save quite a bit of development time, meaning more time to play!

Virtuality Manifesto

Update (02/27/08): It’s been over a year since I wrote this article, and I just stumbled upon the “Virtuality Continuum”, which, described in a research paper authored in 1994, strikes me as being fairly close to what I imagined in the “Barthian Virtuality Gradient”. A brief analysis of the paper leads me to believe that the authors, anchored to “augmented reality” as their reference point, considered the axis of the VC to describe the ratio of “real” content to “virtual” content – the real world is 100% real content, a typical computer game is nearly 100% virtual content, an augmented reality game is perhaps half of each, and an MMO like World of Warcraft is very nearly all virtual content, aside from that which evolved from the players themselves and could thus be considered real. The BVG, on the other hand, deals more with the placement of the window between real data and virtual data; while less linear, I think it captures an entirely different and much more interesting relationship.

I use the term virtuality to differentiate away from the term “virtual reality”, which implies a reality that is somehow fake and separate from “actual reality”. In all actuality, I envision a broader spectrum, with the goal of simply tweaking reality – meshing real, physical things with a virtual environment, or bringing virtual objects into a real environment, or simply playing with virtual objects in a virtual world. The gradient between what is real and what is virtual is what defines virtuality.

As of now, I currently use the above model, creatively named the “Barthian Virtuality Gradient”. For the most part, it conveys my feelings on the matter fairly well. Chances are, it’s been previously thought of by someone before me, or even more likely superseded, but I put it out into the open regardless.

Realities begin at the bottom – absolute reality. Using a wargame as an example, this would be actual war. Getting shot would suck. From here, we can begin to virtualize certain aspects, or essences, of things; tanks now fire “virtual” bullets, carried by infrared light. You’re still driving a real tank, but a certain portion now has a virtual essence (this is quite similar to the MILES laser tag system the US Army uses). At this point the gradient isn’t so simple. Working backwards from a true neural representation of a completely virtual environment, we first reach sensory simulations, by which we simply reproduce the sensory inputs we would normally get. This is where the more traditional technologies fit, such as head mounted displays, data gloves with haptic feedback, and direct body tracking – since we can’t yet interface directly to the nerves, we simply throw the inputs as close to the brain as we can and attempt to block out what we consider to be reality. Working back even more, we pull back the virtual reality and leave room for more actual reality, such as with CAVEs. I call this environmental virtuality, in which we modify the environment with more distinct windows into the virtual world. And thus, we find ourselves back at the tank, looking out of a virtual periscope and virtual windows. It may be a real tank, or simply a mock up of one, sitting inside of an essential object, interacting with its own virtual environment. The two often mingle at this point. With good reason.

Where environmental and essential technologies meet is where we need to focus, at least for now with our current level of technology. Too many people equate virtual reality with sensory virtuality, which is both expensive and overly complicated; that’s why I try not to use the term.

RSVP Abstract

What is Virtuality?

I use the term virtuality to differentiate away from the term “virtual reality”, which implies a reality that is somehow fake and separate from “actual reality”. In all actuality, I envision a broader spectrum, with the goal of simply tweaking reality – meshing real, physical things with a virtual environment, or bringing virtual objects into a real environment, or simply playing with virtual objects in a virtual world. The gradient between what is real and what is virtual is what defines virtuality.

So, what is RSVP?

The Relatively Simple Virtuality Platform, or RSVP, is a set of devices and a software API that serve a specific goal: to aid in the creation of simple environmental/essential virtuality simulations that are easy to pick up and play, and therefore ideal for high-volume entertainment purposes. The system consists of the following hardware:

  • a large, pressure sensitive floor mat, which detects player positions
  • an overhead camera, to track arbitrary beacons and players via optical recognition
  • a projector screen spanning the width of the floor mat, with left and right side speaker system
  • a flexible RF communications system for player “objects”
  • the agent console, at which a human agent sits and can input human-specific data

Because of the use of RF, cameras, and pressure sensors, there are no wires to be attached to players – all that is necessary is to grab the required objects and literally jump into the game.

Designing an RSVP Module

Because of the common hardware, only two things must be developed for each RSVP game/module/simulation: the software, which does everything software normally does, and the objects, which utilize custom RF modules that are easily swapped between devices, meaning that the RF technology doesn’t have to be built into EVERY device, cutting costs and adding simplicity. These features, I hope, will make RSVP ideal for educational purposes, which I plan to test in my current lecture circuit.

Module Examples

Super Mario

  • Video: NES Super Mario remake, adapted to track the player with the pressure pad (jumping and running)
  • Audio: standard, balanced L/R audio (no real need for “surround”)
  • Optical: no optical tracking needed, as pressure pad provides faster, more relevant data
  • Objects: two button punch-stick, one button to enable running, one button to enable fireball (requires “punch”)
  • Agent: optional, would trigger player ducking into and out of tubes and doing random actions (max flexibility)

Megaman Battle Network

  • Video: mini-boss fighting game (one boss at a time), front-on pseudo first person
  • Audio: left channel to mono-out, right channel to object via audio radio
  • Optical: light up strip on object plots a vector to the screen
  • Object: Megaman style hand controller with rumble, audio (charges up), and a button on top (change weapon)
  • Agent: nothing as of now

Jedi Trainer

Ever since I found out about the Wii, I wanted one. However, what interested me more than playing the games was making them. Lucky for me, the Wiimote is a fully capable Bluetooth HID device, which the folks at WiiLi have been able to pretty effectively figure out. I was able to actually find a Wiimote (at MSRP!) and proceeded to hack away.

Jedi Trainer is an extremely cliche lightsaber combat game that uses the Wiimote. It runs in Linux, and requires Python, pygame, bluez, and pybluez. After you’ve got those, download the .tar.gz, extract it (try tar -zxvf jedi-trainer-0.1.tar.gz), cd into the directory, and run main.py (python main.py). It will ask you to sync the Wiimote, which takes a moment, and will then launch. You may have to edit the Wiimote MAC address if it’s hardcoded into the program (I believe it is).

Although it’s not really a fully complete game, as I just threw it together over the past two days from the ground up, it’s got a few neat features I’d like to point out. I tried to capture the sort of ambient lightsaber feel, to make it a bit more special than holding a piece of shiny white plastic. You have to first turn it on by pushing the “up” button, which makes it “spring” to life and turns on ambient lightsaber noises. Whenever you swing it, regardless of whether you’re performing an action, it’ll make that snazzy lightsaber “swoosh” noise. All the controls use only the accelerometer features of the Wiimote, as I don’t have a sensor bar, which should be good news for anyone like me who only has the Wiimote. It’s all pretty fun – I hope to get a video up of me playing in a day or two.

Provided below are helpful visual instructions to make your life easier. Be sure to hold the Wiimote so it is facing you for defense and facing up for attack.

Download Jedi Trainer 0.1

Press the UP button to ignite the lightsaber. To deflect a blaster shot, hold the Wiimote in the direction of the arrow, with the

To defend against a lightsaber, hold the Wiimote PERPENDICULAR to the arrow, with the To attack, hold down the

Flight of the Atropos

I’ve always dreamed of creating a fantastic space ship simulation along the lines of what I’ve described in my xVRM Manifesto, something I wrote years ago. In the past month, working alongside Kenneth Bowen and Andrew Armenia for my Experimental Game Design class (taught by RPI’s Professor Ruiz), that dream took shape in a project called Atropos.

Inside of a structure best described by Vicarious Visions’ Karthik Bala as a “space shanty”, three players work together to pilot a virtual spaceship through a virtual world. The tactical officer, working at the left console, can fire missiles and lasers, scan targets, start electromagnetic, biological, and radiological sweeps, and both hail and accept hails from other ships in the game world. The navigation officer, working at the right console, is responsible for flying the craft and has the ability to engage an FTL (faster-than-light) jump, allowing the players to jump across large areas of space. The command officer, sitting in the center, can monitor the status of the ship and other scan targets via the center dual-screen display, and is responsible for communication with NPCs over the radio (the device with the phone handset).

An outside view of the Atropos. An interior view of the Atropos.

Overall, Atropos was a major success. We had it fully set up for the exhibition component of the RPI Game Symposium, where we took home 5th place, drew a fairly large crowd, and ran about ten teams of three through our 15-minute mission. Players were instructed to destroy a ‘radioactive asteroid’ in a nearby asteroid field, and were then ambushed by Captain Nagel, a pirate with somewhat aggressive intentions.

We were able to incorporate a few interesting concepts into Atropos that I’ve wanted to try out for a while. We used a number of interesting interface peripherals, such as a flight yoke and two foot throttle lever for flying, a giant potentiometer and lever for firing the laser, a dual-screen CRT for providing ship status and scan information, a radio that players could talk to NPCs through, and a dot-matrix printer (which everyone absolutely loved) for providing a physical copy of the mission briefing and debriefing for players to take with them after the game was finished. By fully enclosing the ship, we had complete control over lighting and sound, which consisted of an incandescent bulb for main lighting, a red CCFL for emergency lighting, and a fairly loud speaker set. When the players’ ship was damaged, the lights would flicker as explosions sounded; when the players were ambushed, they were plunged into darkness as the lights turned off, their screens went black, and alarms sounded. When players used their FTL drives, the lights would flicker while the screens “glitched”, and our obnoxiously loud and over-phasered warp sound effect played.

The Tactical Officer's console. The dot-matrix printer used for printing briefing and debriefing reports.

In addition to an over-the-top interface and ridiculous special effects, Atropos features absolutely no artificial intelligence or pre-scripted events aside from special effects timelines. As the fourth player, the game master is responsible for controlling all other ships and events that happen in the game, along with improv-acting out the dialogue that all NPCs have with the players. Using video-portraits and creative voice acting, we brought the NPCs to life and held interactive dialogues with the players, allowing us to both immerse the players and give them in-game assistance on some of the more confusing technical aspects of the ship.

Although working with such a large project on a short timeline was somewhat exhausting, I am extremely pleased with how everything turned out, and will most definitely be exploring the simulation-LARP genre much more in the near future.

Download the Atropos GDD
Download Atropos Video Footage

A scan of the mission briefing used in Atropos. A mission debriefing for when the players fled the pirate.

A page out of the ARS-24 technical manual. The galaxy map used by the FTL operator.