First tests...

... demonstrate multi-touch coordinates being transferred to the Flash app and visualized as circles together with a list of coordinates. The app uses them as the source of the user's input. Some fine-tuning is still necessary, such as defining immovable objects.


Physical context of use

The table's design makes it possible to change the angle between the table-top and the frame from a horizontal to a vertical setup, with a continuum of intermediate positions (e.g. 30º, 60º etc.).

When positioned vertically (90º), the table-top suggests a context in which gravity works top-down across the screen. This can be used to introduce a dynamic gravity factor that depends on the physical context of use. It might be interesting to add a sensor (a resistive slider, a multi-axis accelerometer or a gyroscopic sensor) that measures the table-top's position (the angle between the table-top and the frame, from horizontal to vertical), together with a simple software routine that reads this angle and passes it as a parameter to the application, which would then react by changing attributes of the virtual environment (the gravity vector).

For example, setting the table-top to 90º lets gravity act vertically (top-down), making the objects fall to the bottom of the screen. Changing the angle to 0º redirects gravity so that it acts perpendicularly to the screen: the objects no longer fall, but can slide across the whole area of the screen, decelerated by the environment's friction. This feature could be used to create an application that reacts to changes of the display's physical position in real time, similarly to Newton Virus. A special version of the application, adapted to portable hand-held computers equipped with position sensors, could work the same way, similarly to iPhysics on the iPhone.
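A minimal sketch of how the tilt angle could be turned into a gravity vector for the virtual environment. All names here are hypothetical (the actual app is written in Flash, and a real implementation would read the angle from the sensor rather than take it as an argument); the physics is just the standard decomposition of gravity on an inclined plane: an in-screen component g·sin(θ) that pulls objects toward the bottom edge, and a perpendicular component g·cos(θ) that presses them against the surface and so scales the sliding friction.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def gravity_for_tilt(angle_deg):
    """Split gravity according to the table-top's tilt angle.

    angle_deg: 0 = horizontal table-top, 90 = vertical (upright screen).
    Returns (in_plane, normal):
      in_plane - component acting down the screen (moves objects),
      normal   - component pressing objects against the surface
                 (scales the friction that decelerates sliding).
    """
    theta = math.radians(angle_deg)
    in_plane = G * math.sin(theta)  # at 90º: full gravity, objects fall
    normal = G * math.cos(theta)    # at 0º: full normal force, only friction
    return in_plane, normal


# Vertical table-top: objects fall to the bottom of the screen.
print(gravity_for_tilt(90))
# Horizontal table-top: no in-plane gravity, objects only slide and
# are decelerated by friction proportional to the normal component.
print(gravity_for_tilt(0))
```

In a running application this function would be called every frame (or whenever the sensor reports a new angle) and the `in_plane` value fed into the physics engine's gravity vector, giving the real-time reaction to the display's physical position described above.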