
Final Group Project

Tiffany Shen, Crystal Kwan, Panru Jing

Mental Model and API

Our Gigglebot control system is designed as a prototype of a larger, more comprehensive warehouse robot system that will allow workers to deliver inventory remotely. With this system, warehouse workers will be able to drive a robot to a distant area of the warehouse, then pass control to another worker who can take over seamlessly. Because of the complexity and scale of these systems, a prototype is valuable in helping us understand the fundamental steps needed to pass control between controllers successfully.

 

When passing control of a bot to another user, we want to minimize the amount of work users have to do. This decreases users’ mental workload, increases speed, and reduces the opportunities for human error. We therefore decided to have one action to pass control, and one action to enter swarm control.

The Gigglebots will each be connected to their own individual channel, and controllers will be able to rotate through the channels when button A is pressed. When passing control, the user will leave the channel they are currently connected to and enter the channel the incoming Gigglebot is connected to. 
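
As a rough illustration of this hand-off, the sketch below shows how a controller could cycle through radio groups with button A. Our actual software was built in the Microbit block editor; this is a MicroPython rendering of the same idea, and the specific channel numbers are only examples.

from microbit import *
import radio

CHANNELS = [1, 2, 3, 4]   # one radio group per Gigglebot (example values)
current = 0

radio.on()
radio.config(group=CHANNELS[current])
display.show(str(CHANNELS[current]))

while True:
    if button_a.was_pressed():
        # Leave the current bot's channel and join the next one.
        current = (current + 1) % len(CHANNELS)
        radio.config(group=CHANNELS[current])
        display.show(str(CHANNELS[current]))
    sleep(50)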

​

When swarm control is requested, a signal will be sent from the initiating controller for all Gigglebots to enter the swarm channel. The controller then connects itself to the swarm channel, and is able to communicate with all Gigglebots within that channel simultaneously and control them as a group. When swarm control is no longer needed, the swarm controller will send a signal for all Gigglebots to return to the channel they were previously connected to. 
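
One way the receiver side of this hand-off could look is sketched below, again as a MicroPython approximation of our block code. The message names ("SWARM", "RELEASE") and the swarm group number are illustrative choices, not the exact values used in our program.

from microbit import *
import radio

SWARM_GROUP = 0      # shared swarm channel (example value)
home_group = 3       # this bot's own channel, chosen at startup

radio.on()
radio.config(group=home_group)

while True:
    msg = radio.receive()
    if msg == "SWARM":
        radio.config(group=SWARM_GROUP)   # join the swarm channel
    elif msg == "RELEASE":
        radio.config(group=home_group)    # return to the previous channel
    # ...drive commands would be handled here as well...
    sleep(20)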

Task Analysis and Automation Strategy

Scope:

The task being studied is the driving and control system of the Gigglebot, which is a prototype for a more comprehensive warehouse robot system. The task will begin with a Gigglebot already in one room of the model warehouse, and end with the user successfully navigating through the room, entering an adjacent room, passing control of their current bot to the next user, and taking control of the next bot coming in. The user also has the option to take control of all bots at any time and drive them simultaneously with one controller. 

 

System Constraints:

The Gigglebots will be contained within an established warehouse with distinct rooms separated by doorways. The bot can move sequentially from room to room, with each pass through a single room counting as one task. 

 

Automation & Task Allocation:

In our prototype, most of the data collection and analysis is allocated to the user, since our Gigglebot lacks sensors and artificial intelligence. The Hierarchical Task Analysis below reflects our current system, where most of the tasks are still completed by the user. As the technology improves, the bot will be able to perform more of the tasks previously allocated to the user. In the final design, the bot will be able to scan for obstacles, make pathing decisions, automatically check for available channels, send messages to other users, and perhaps even navigate without the need for a human controller. Tasks that will be automated in the final design are marked with an asterisk (*).

 

Task Analysis:

[Hierarchical task analysis diagrams: page 1.png, page 2.png, page 3.png]

Software

Two different programs were written: one for the receiver Microbit (attached to the Gigglebot) and one for the controller Microbit. The radio channel the Gigglebot is connected to can be changed by pressing A on the receiver Microbit, and only needs to be set once at startup. The receiver waits for commands sent over the radio channel from the controller and executes them.
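
The receiver logic boils down to a loop like the sketch below. We wrote the real program with Microbit blocks, so this MicroPython version is only an approximation; the command strings and the set_wheels() helper (standing in for the GiggleBot motor calls) are placeholders.

from microbit import *
import radio

def set_wheels(left, right):
    # Placeholder for the GiggleBot motor API: set left/right wheel power.
    pass

radio.on()
radio.config(group=1)    # channel chosen with button A at startup (channel-change code omitted)

while True:
    cmd = radio.receive()
    if cmd == "FWD":
        set_wheels(30, 30)
    elif cmd == "BACK":
        set_wheels(-30, -30)
    elif cmd == "LEFT":
        set_wheels(10, 30)
    elif cmd == "RIGHT":
        set_wheels(30, 10)
    elif cmd == "STOP":
        set_wheels(0, 0)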


The controller Microbit can also change channels by pressing A. When the controller is tilted, it sends a signal to the receiver to move the Gigglebot forward, backward, left, or right. When button B is pressed, the signal to change to the swarm channel is sent.
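
On the controller side, the core loop maps tilt to drive commands and button B to the swarm request, roughly as below. The tilt thresholds, the sign convention (which depends on how the Microbit sits in the controller), and the command strings are all illustrative.

from microbit import *
import radio

radio.on()
radio.config(group=1)    # matches the target Gigglebot's channel

while True:
    if button_b.was_pressed():
        radio.send("SWARM")          # ask the bots to join the swarm channel
    x = accelerometer.get_x()        # left/right tilt (milli-g)
    y = accelerometer.get_y()        # forward/back tilt (milli-g)
    if y < -300:
        radio.send("FWD")
    elif y > 300:
        radio.send("BACK")
    elif x < -300:
        radio.send("LEFT")
    elif x > 300:
        radio.send("RIGHT")
    else:
        radio.send("STOP")
    sleep(100)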


The biggest challenge we came across when writing the software was programming the turns. The default tilt sensitivity on the Microbit is very high, making it difficult to steer the Gigglebot, which would instead spin rapidly to the left or right.


To resolve the issue, we first wrote a version of the software that eliminated the need for tilt control altogether: button A turned the Gigglebot left, B turned it right, and pressing A and B simultaneously moved it forward. The buttons would be pressed once to start a movement and again to stop. While this system of control made it easier to get the bot to its destination, it was not very intuitive and made the control feel unresponsive and stiff. Using the buttons for steering also meant we needed alternate input modes for “change channel” and “swarm control,” such as “free fall” or “impact” conditions, which were easy to trigger accidentally. We decided to revert to the original plan of using tilt to control the movement of the bot.


By looking through the documentation for the Gigglebot, we discovered that we could throttle the motor speed and the power supplied to the wheels, both of which decrease the Gigglebot’s turn speed. We also programmed the left and right wheels separately to have finer control over turns.
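
The sketch below shows the shape of that fix, assuming a set_wheels() stand-in for the GiggleBot motor API: cruising power is reduced overall, and during a turn the inner wheel is slowed rather than reversed, so the bot arcs instead of spinning in place. The specific power values are guesses for illustration.

def set_wheels(left, right):
    # Placeholder for the GiggleBot motor API: set left/right wheel power.
    pass

FULL = 40     # reduced cruising power (the default full power turned too fast)
TURN = 15     # inner-wheel power during a turn

def forward():
    set_wheels(FULL, FULL)

def turn_left():
    set_wheels(TURN, FULL)   # slow the left wheel, keep the right moving

def turn_right():
    set_wheels(FULL, TURN)   # slow the right wheel, keep the left moving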
 

UI/UX & Controller Design

Materials:

This controller is made of wood. Using a laser cutter, the shape was cut out of thin, soft pine boards. After extensive sanding, the soft pine left the controller very smooth to the touch, which allows users to comfortably operate it for a long time with little fatigue. None of the edges are sharp or rough, which could have caused discomfort for users. Using a scroll saw, we cut a filler layer out of birch to go between the pine boards. Birch is stronger and harder, which makes the controller more durable. The filler also increases the thickness of the handles, giving users a more secure grip and allowing the controller to be stood up on its handles when not in use. A slot for the battery was cut out of the thick birch. Because we didn’t want the wires to get bent and broken, we added small wooden boards between the birch and pine so that the slot holds the battery snugly while the wires on the side have space to come out and connect to the Microbit. Rubber bands on the sides of the controller provide a rubber grip, hold the Microbit securely in its slot, and reduce the risk of users dropping the controller. All these materials together create a very lightweight controller that users can easily move and hold for a long time.

Shape:

The shape of the controller is similar to that of a Microsoft Xbox One controller. A rectangular center holds the Microbit, and the controller is sized so that users can efficiently reach and press the Microbit’s buttons with their thumbs. The handles are thick, slant away from each other, and taper towards the end so that they fit into the palms and users can wrap their pinkies, ring fingers, and middle fingers around them. Curved with no sharp edges, the handles are comfortable to hold. The indent at the top of the controller allows users to hook their index fingers right behind the Microbit. With a secure grip, users can move their thumbs with speed and tilt the controller with precision. The slot for the Microbit is tight enough for the Microbit to slide in and stay in during use, and the rubber bands add extra security so that it doesn’t fall out during rough movements.

Image: https://edsurge.imgix.net/uploads/post/image/11684/GamingClassroom-1548049731.jpg?auto=compress%2Cformat&w=640&h=259&fit=crop

Instructions:

The shape of this video game-esque controller affords its intuitive tilt function and does not require instructions. The functions of the buttons on the Microbit change according to the software uploaded, so we did not include instructions for the buttons either. However, we added labels for the placement of the Microbit and battery in the controller. The shape of the Microbit slot is an effective affordance for what goes in the slot, but we added written instructions for extra clarity. For the battery slot, we included a symbol rather than words, because words would be hard to read from the angle at which they are seen. Although the battery symbol does not look like the battery for the Gigglebot, it is a universal symbol that users will understand to mean the Gigglebot battery. These instructions are concise and legible because of their large size and neat font, and there is good contrast between the black text and the wood background. The arrows tell users that movement is required and that they need to slide the Microbit and battery down into their slots.

[Images: controller.png, detail.png, ps4.png, prototype.png]

Naive User Walkthrough

User:

Rachel, an 18-year-old college student who wants to set up and use the Gigglebot and controller for the first time.

When first presented with the controller, Microbit, and battery, Rachel will see the instructions on the controller. Seeing the orientation of the battery symbol, she will try to fit the battery in sideways. The battery will slide in easily, and she will have successfully placed it into its slot. If she tries to put the battery in the long, incorrect way, she will see that it falls out easily and that the slot was not made for that orientation.

Seeing the instructions for the Microbit slot, Rachel will match the Microbit label on the controller with the name on the Microbit itself. If she tries to place the Microbit the wrong way, with the buttons facing into the controller, the protruding buttons will not fit against the flat back of the slot. This will create an obviously awkward fit, and the Microbit might fall out. Rachel will then take the Microbit out, notice the buttons, flip it, and try again. This time, the Microbit will fit perfectly and Rachel will have successfully followed the instructions given by the controller.

Seeing the unattached wires, Rachel will then connect the battery wires to the Microbit. Through trial and error, she will find the right place to connect the two. Once connected, the Microbit is programmed to display the channel the controller is on. This tells Rachel that she has successfully turned on the controller and that it is ready for her to use.

If the Gigglebot does not move, Rachel will notice that the number on her controller does not match the number on the Microbit in her Gigglebot. By pressing the A button on the Microbit, Rachel learns that she can change the number on the screen, which corresponds to the channel her controller is on. If she presses the B button, an M for Master shows up. This gives Rachel control over her Gigglebot as well as any other bots running the same receiver code. She can press B again to exit master control and return to the channel numbers.

After matching the number on her controller with the number on her Gigglebot, the bot will move. Rachel will quickly learn that the Gigglebot responds to her tilting movements. It will be intuitive for her to tilt forward to move the bot forward, tilt backward to move it backwards, and tilt side to side to turn it left and right.

Video

Reflections and Future Directions

Software:

The Microbit coding language was a fun visual way of coding that we had not tried before. We enjoyed looking at the number of options available in the menus, and were often able to solve bugs by reading the documentation provided with the tiles. One challenging aspect of creating the software was figuring out how to make changes to the behavior of the hardware through the limited options the Microbit and Gigglebot afford.

We went through several iterations of the code as we discovered various bugs and unexpected interactions. In the end, we felt we had robust software that is resistant to user error and only needs a one-time setup.

The software could be improved by adding more sensors to the Gigglebot to gather data about its surroundings, feeding machine learning and computer vision algorithms that could help the Gigglebot navigate without human assistance. The software could also be improved by having the controller check for open channels and join them automatically, without needing the human operator to verify with teammates and cycle through channels manually.

 

Hardware:

Creating the controller by hand was a very informative and fun experience. It was easy to continually test the shape and feel of the controller and make small adjustments accordingly. This handmade, wooden controller is different from the generic, industrial controllers on the market today. In future iterations, the wood could be stained or painted for a more personalized controller, and labels or a brand logo etched into the wood would be a nice touch. To ensure a better grip, we could add rubber handle sleeves. To keep the Microbit from falling out, we could install a different retention mechanism, such as the rubber guards seen in card holders on the backs of phones.

We considered 3D printing the controller but were afraid that the material would be rough on users’ hands, and that any adjustment would require printing the entire controller again and again. This would make the iterative design process tedious, expensive, and environmentally irresponsible. If we were able to confirm that everything is the right size, shape, and texture using 3D glasses and other technologies, 3D printing would be a possible option, but it gives a generic, industrial look that a handmade, wooden controller does not have.

For the Gigglebot itself, we could have added decorations to give the bot more personality. The lights on the Gigglebot could have been used to notify users of warnings such as low battery, “bot is stuck,” or “two controllers on this channel are controlling me,” which posed an issue during testing. In the future, collision sensors to stop the Gigglebot from running into walls, people, or other bots would be a very useful addition.
