
Revised user interaction flow


Following our latest changes, we rewrote our user interaction flow. We are still developing the app and are excited about our progress.

...

______________________________

User action: User attaches the "Clicks" to a smartphone.

System's computation: The smartphone recognizes the NFC tag and launches the "Clicks" app.

System's feedback to user: The app opens on the user's phone and displays a question.

What to test?: Does the user understand how to use "Clicks"? A possible way to test this is to hand the user a smartphone and a "Clicks" and see whether they understand that the two must be attached together.
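The tag-recognition step above can be sketched in a few lines. This is a minimal Python simulation of the phone's behavior, not our actual app code; the record format and the package name `com.example.clicks` are hypothetical placeholders.

```python
# Hypothetical sketch: the phone reads the NFC tag's records and decides
# which app to launch. The record structure and app ID are assumptions.

CLICKS_APP_ID = "com.example.clicks"  # hypothetical package name

def handle_nfc_tag(tag_records):
    """Return the app to launch for the given tag records, or None."""
    for record in tag_records:
        # An Android Application Record-style entry naming the target app
        if record.get("type") == "application" and record.get("payload") == CLICKS_APP_ID:
            return CLICKS_APP_ID  # the phone would launch the "Clicks" app here
    return None

# A tag programmed to open the "Clicks" app
tag = [{"type": "application", "payload": "com.example.clicks"}]
```

On a real Android phone this dispatch is handled by the OS via an NDEF intent filter rather than by app code, but the sketch captures the decision the system makes.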

______________________________

User action: User chooses the answer (a color) they prefer.

System's computation: The phone transmits the choice via NFC to the "Clicks", which triggers its RGB LED.

System's feedback to user: The color of the "Clicks" changes according to the user's answer.

What to test?: Does the user understand that they can change their answer until they are satisfied? Does the user notice that choosing an answer affects the "Clicks"?
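The color-transfer step can be illustrated with a small sketch of how the chosen answer might be packed into an NFC payload and decoded on the "Clicks" side to drive the LED. The 3-byte RGB framing is an assumption for illustration, not our real protocol.

```python
# Hypothetical sketch: pack the chosen answer color into a 3-byte payload
# the phone writes over NFC, and decode it on the "Clicks" to set the LED.

def encode_color(r, g, b):
    """Pack an RGB answer into a 3-byte payload (assumed framing)."""
    for channel in (r, g, b):
        if not 0 <= channel <= 255:
            raise ValueError("channel out of range")
    return bytes([r, g, b])

def set_led_from_payload(payload):
    """Decode the payload and return the (r, g, b) the LED should show."""
    if len(payload) != 3:
        raise ValueError("unexpected payload length")
    return tuple(payload)

# The user picks blue; the "Clicks" LED shows blue until they change it
payload = encode_color(0, 0, 255)
led_color = set_led_from_payload(payload)
```

Because the user can keep re-sending colors until they are satisfied, the decoder simply overwrites the LED state on every payload received.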

______________________________

User action: User wears "Clicks".

System's computation: The magnets snap together. The RGB LED remains on in the chosen color.

System's feedback to user: The click of the magnets reassures the user that the "Clicks" is securely in place. The "Clicks" RGB LED remains on.

What to test?: Does the user understand how to wear "Clicks"? How many ways can they find to wear it? Where will users most often choose to wear "Clicks"?

______________________________

User action: User walks around wearing the "Clicks".

System's computation: The RGB LED remains on in the chosen color. No further computation is needed.

System's feedback to user: RGB LED is on.

What to test?: Does "Clicks" actually encourage interaction? Do people relate to others who chose the same answer? Do they relate to people who chose a different answer?

