Collaborators: Ruoxiao Sun, Genggeng Wu
In this project, we propose to build a physical interactive device that uses hand-gesture input to select the stroke style and change the stroke weight while painting on a p5.js canvas.
As per the specifications of Project 1, we brainstormed a number of ideas and finally narrowed them down to applications that use hand gestures as input. Below are the different applications we considered:
(Sign language also uses touchpoints on the face, which makes it tedious to sense with a physical device. Computer-vision recognition may be better suited for this application.)
(Subtle emotions are a mixture of many factors beyond hand gestures, such as facial expressions, so computer vision may be better suited here as well.)
(Doable with a physical device.)
(Final choice: doable with a physical hand-gesture sensor, and we found it the most interesting; see the sketch below.)
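To give a rough sense of the intended interaction, here is a minimal p5.js sketch of the chosen idea. This is only an assumption-laden stand-in, not the final implementation: the hand-gesture sensor is not connected yet, so keyboard input stands in for the gesture-derived stroke weight and stroke-style selection.

```javascript
// Minimal p5.js sketch of the intended interaction (a stand-in, not the final device).
// gestureWeight and strokeStyle would eventually come from the hand-gesture sensor;
// here they are stubbed with keyboard controls so the sketch runs on its own.

let gestureWeight = 4; // stroke weight the gesture sensor would control
let strokeStyle = 0;   // 0 = continuous line, 1 = dotted; gesture-selected later

function setup() {
  createCanvas(600, 400);
  background(255);
}

function draw() {
  stroke(0);
  strokeWeight(gestureWeight);
  if (mouseIsPressed) {
    if (strokeStyle === 0) {
      line(pmouseX, pmouseY, mouseX, mouseY); // continuous stroke
    } else {
      point(mouseX, mouseY);                  // dotted stroke
    }
  }
}

// Keyboard stand-ins for gesture input: arrow keys change weight, 's' toggles style.
function keyPressed() {
  if (keyCode === UP_ARROW) gestureWeight = min(gestureWeight + 1, 30);
  if (keyCode === DOWN_ARROW) gestureWeight = max(gestureWeight - 1, 1);
  if (key === 's') strokeStyle = 1 - strokeStyle;
}
```

In the actual device, the two stubbed variables would instead be updated from the sensor readings sent by the physical controller.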