The latest controller for Pong uses a cardboard plane attached to a potentiometer to control both the speed and direction of the virtual paddle. The potentiometer’s rotation is divided in half to determine direction, and within each half, speed is modulated.
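As a rough sketch of that mapping (assuming a 10-bit ADC and names of my own invention, not the project’s actual code):

```cpp
#include <cstdlib>

// Illustrative sketch: split a 10-bit potentiometer reading (0-1023)
// in half. Which half the reading falls in sets the paddle's
// direction; the distance from the midpoint sets its speed.
struct Paddle {
    int direction;  // -1 = up, +1 = down
    int speed;      // 0..8 speed steps
};

Paddle readPaddle(int adc) {
    const int mid = 512;
    Paddle p;
    p.direction = (adc < mid) ? -1 : +1;
    p.speed = std::abs(adc - mid) / 64;  // farther from center = faster
    return p;
}
```

Turning the plane fully one way gives maximum speed in that direction; resting near the center yields a nearly stationary paddle.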
Process
Because the ESP chip is somewhat expensive (relatively), I invested in prototyping my circuit on breadboards and milled boards onto which the chip could plug in temporarily.
In order to prototype directly with the chip rather than a breakout board, I needed a programming jig for connecting via USB and closing certain routes on particular pins. When programming…
GPIO0 needs to be connected to GND. A button was held while uploading code.
Reset needs to be pulsed low. A button was pressed once just before programming.
Tx and Rx connections between the ESP chip and the FTDI cable were accommodated with header pins.
The Board
The ESP chip draws a significant amount of power; however, conflicting advice online made it difficult to size the capacitors. That said, I found these tips to be the most helpful. While they recommend a very large capacitor (470 uF) across Vcc to Gnd, the Adafruit breakout uses only 10 uF. I included two 470 uF capacitors, but my next iteration would explore smaller sizes. Regardless, a 0.1 uF decoupling capacitor across the ESP8266’s Vcc and Gnd pins, placed very close to the chip, was a critical addition.
In my initial version of ‘Many Games of Pong‘, the act of switching between controllers was so frustrating that it deterred players from doing so. That design used a 6-pin USB-to-serial adapter, which the user had to plug in “just right” before they could even play. The diagram below illustrates this initial schematic and system.
This point of friction detracted from the key idea of players testing different controllers and de-standardizing the tools used to play games. As such, I’ve continued to develop the circuit with a focus on selecting and switching-between controllers.
The new circuit, illustrated above, uses an ESP8266 Wi-Fi microcontroller to eliminate the wiring for sending data between the game and the controllers. Additionally, a toggle switch, coupled with an LED, sets whether or not a controller is actually connected to the game (as opposed to the previous hardwired connection). The server-side code keeps track of which controllers are connected and which player each is associated with.
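A minimal sketch of that server-side bookkeeping (the names and structure here are my own illustration, not the project’s actual code):

```cpp
#include <map>
#include <string>

// Illustrative registry: which controller IDs are connected and
// which player each one is assigned to.
class ControllerRegistry {
public:
    void connect(const std::string& id, int player) { players_[id] = player; }
    void disconnect(const std::string& id) { players_.erase(id); }
    bool isConnected(const std::string& id) const { return players_.count(id) > 0; }
    int playerFor(const std::string& id) const { return players_.at(id); }
private:
    std::map<std::string, int> players_;  // controller id -> player number
};

// Example: a controller's toggle switch turns it on, it is assigned
// to player 1, then the toggle turns it off again.
bool demoToggle() {
    ControllerRegistry r;
    r.connect("dial", 1);
    bool wasOn = r.isConnected("dial") && r.playerFor("dial") == 1;
    r.disconnect("dial");
    return wasOn && !r.isConnected("dial");
}
```

The toggle switch on each controller would drive the connect/disconnect calls, with the LED mirroring the connected state.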
A lingering issue is the delay between pressing a button and seeing the effect on screen, as well as the question of power. The video shows a wired connection from each controller to the computer, but that is only for power, not data transfer.
Code Updates
This iteration also focused on simplifying the code. My previous code for moving the paddle based on each controller was verbose and repetitive. I’ve since rewritten it to use switch cases, which greatly simplified things.
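A hypothetical sketch of that consolidation (the controller types and thresholds are invented for illustration, not taken from the project):

```cpp
// One handler maps a controller type and raw reading to a paddle
// delta, replacing a separate, near-duplicate block per controller.
enum class Controller { Dial, Buttons, Photocell };

int paddleDelta(Controller type, int reading) {
    switch (type) {
        case Controller::Dial:
            return (reading - 512) / 64;     // signed speed from the pot
        case Controller::Buttons:
            return (reading == 0) ? -5 : 5;  // which button is pressed
        case Controller::Photocell:
            return (reading > 600) ? 3 : -3; // bright vs. dark
    }
    return 0;
}
```

Adding a new controller then means adding one case, rather than duplicating the whole movement routine.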
Next Steps
The next step in the project is to create custom boards rather than use the Huzzah. Adafruit’s ESP8266 breakout board is a good reference for starting the schematic. Since some of the controllers use digital sensors (simple push buttons) while others rely on analog (photocell), the boards will be slightly different for each. Additionally, I’d like to put more time into the tangibility and tactile qualities of the sensors I use and have been hunting down various products.
Future References
Programming jig tutorials: this one by Alex Glow helped me understand how to use the GPIO0 and Reset pins for programming.
When a controller is to be operated while not being seen, legibility of sensors/inputs and feedback of action are important considerations.
Legibility asks: how do you know which control does what? This may be indicated through its shape, texture, and relational configuration or orientation. Regarding configuration: How are the controls oriented with respect to each other? How are the controls oriented with respect to the body? How is the entire controller oriented with respect to the body?
Feedback asks: how do you know you’ve done something? Without sight, this could be communicated through sound, touch, or haptics. Haptics may be additional feedback beyond any physical alteration of the sensor itself, such as a vibration, but it may also be a result of the sensor itself. For example, a maintained push button will feel different relative to the enclosure when depressed versus released.
The Scale of Physical Interaction with an Object
As the controller is not seen while being operated, does the scale of interaction and of the controller increase? When controllers are considered as objects, interaction can occur at the scale of the whole hand rather than individual fingers. Some initial thoughts I considered:
Is it a glove interface between two hands? Touching different fingers together triggers different functions.
Are objects held and placed down on a surface to trigger different events?
Does an object have different faces which are touched to control different functions?
Sustained versus Momentary Interactions
How are interactions different when they are sporadic events, such as momentarily pushing a button or flipping a switch, versus sustained interactions, such as steadily holding down a button or shaking an object? What is the difference between an event-based interaction and an interaction of duration? Does each of these respond differently to the “directionality” of a controller or interaction?
Controllers
When considering the music controller at the scale of an object interacted with by the whole hand, I considered three scenarios.
All Functions Concentrated into a Single Object, Functions are Fixed to Faces (Absolute Faces)
Different sides of an object trigger different commands. When all functions are combined into a single object, their configuration with respect to each other is important, as the user must understand the orientation of the object itself. The tactile difference between each side is key to this orientation. Further Development: Rather than a cube, does the object have two rounded sides for “previous” and “next,” as they are momentary actions rather than sustained states? How is each face differentiated: many materials, texturing of the same material, shapes, etc.?
Each Function is a Different Object (Absolute Objects)
Moving any object in any axis triggers the associated function. In this instance, orientation and configuration between objects does not matter, which allows individual users to configure the objects as desired. However, the legibility of each individual object is incredibly important, as their shape, size, and texture allow the user to discern one from the other. Further Development: Explore different sets of objects: all the same material yet different forms, versus all different textures but the same form.
All Functions Concentrated into a Single Object, Functions are Associated with Gesture and Direction (Relative Faces)
Similar to the first consideration, all functions exist within one object. However, the functions are not fixed to faces but rather to spatial direction or gesture. The orientation of the object is remapped after each gesture in order to allow gestures to occur in succession. Currently in development.
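As a sketch of what that remapping could look like (purely hypothetical, since the actual mapping is still in development):

```cpp
// Functions occupy slots relative to the last gesture. After each
// gesture, the frame re-zeroes on that direction, so repeating the
// same physical motion repeats the same function.
struct RelativeMapper {
    int offset = 0;  // world direction currently treated as "forward"

    // worldDir is 0..3 (e.g. four compass directions); returns the
    // function slot (0..3) triggered, then remaps the frame.
    int trigger(int worldDir) {
        int slot = (worldDir - offset + 4) % 4;
        offset = worldDir;
        return slot;
    }
};

// Example: two identical world-space gestures. After the first,
// the frame re-zeroes, so the second reads as "forward" (slot 0).
int secondGestureSlot() {
    RelativeMapper m;
    m.trigger(1);
    return m.trigger(1);
}
```

The key property is that successive identical motions are always read the same way, regardless of the object’s absolute orientation.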
Circuit Boards
Keen to develop the enclosures further after this week, I chose to mill the circuit boards and use ATtiny84s as the microcontrollers. As the controller is based around interacting with an object, I prioritized making them wireless and configuring different setups in order to allow fabrication with a number of different enclosures.
Building on the previous week’s progress of a breadboard circuit in which a rotary encoder controls an LED, the images below show that circuit translated into a board layout ready for milling.
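For the encoder side of that circuit, the decoding logic is well established; here is a standard quadrature lookup-table sketch (pin reading is abstracted away, so only the state-transition logic is shown):

```cpp
// Standard quadrature decode: index a 16-entry table by the
// previous and current 2-bit A/B pin states. Valid transitions
// yield +1 or -1; unchanged or invalid states yield 0.
int quadratureStep(int prevAB, int currAB) {
    static const int table[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0
    };
    return table[((prevAB & 3) << 2) | (currAB & 3)];
}
```

Accumulating these steps gives a signed position count, which can then drive the LED (or, later, a paddle).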