Touchscreen
#1
Android supplies some useful statistics about the touchscreen when you enable them in the developer options.
Can we at least have a log of the touch coordinates? The screen can track ten fingers (last time I checked), and conductive rubber-tipped pens (or tape-covered foil) can actuate it.
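A rough, hypothetical sketch (not phyphox code) of how such multi-touch coordinates could be logged on Android from a custom View via MotionEvent; the class name and log tag are made up for illustration:

Code:
import android.content.Context
import android.util.Log
import android.view.MotionEvent
import android.view.View

// Sketch only: a View that logs the coordinates of all active touches.
class TouchLoggerView(context: Context) : View(context) {

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // One MotionEvent carries all current pointers; most screens track up to 10.
        for (i in 0 until event.pointerCount) {
            val id = event.getPointerId(i)   // stable id per finger
            val x = event.getX(i)            // coordinates in pixels, relative to this view
            val y = event.getY(i)
            Log.d("TouchLog", "t=${event.eventTime} id=$id x=$x y=$y")
        }
        return true   // consume the event so move/up events keep arriving
    }
}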

As for ways to interpret the data (per touch); a finite-difference sketch of the speed and acceleration quantities follows the list:
  • X/Y acceleration
  • X/Y speed
  • trajectory speed
  • trajectory acceleration
  • angular deviation
  • the de-rotation quantities from my "map" thread (cumulative angle, unroll, and the same set again after applying the X/Y speed to the unrolled result)
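A rough sketch (hypothetical names, not phyphox API) of how X/Y speed, trajectory speed, and trajectory acceleration could be derived from a logged coordinate trace of one touch by finite differences:

Code:
import kotlin.math.hypot

data class Sample(val t: Double, val x: Double, val y: Double)          // time in s, position in px
data class Velocity(val vx: Double, val vy: Double, val speed: Double)

// X/Y speed and trajectory speed between consecutive samples.
fun velocities(trace: List<Sample>): List<Velocity> =
    trace.zipWithNext { a, b ->
        val dt = b.t - a.t
        val vx = (b.x - a.x) / dt
        val vy = (b.y - a.y) / dt
        Velocity(vx, vy, hypot(vx, vy))
    }

// Trajectory acceleration: change of trajectory speed between consecutive intervals.
fun accelerations(trace: List<Sample>): List<Double> {
    val v = velocities(trace)
    return v.zipWithNext().mapIndexed { i, (a, b) ->
        val dt = (trace[i + 2].t - trace[i].t) / 2.0   // time between interval midpoints
        (b.speed - a.speed) / dt
    }
}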
#2
We put it on “the list”… :) It would certainly help if you could describe a concrete use case, so that we can reasonably allocate resources for this.

On smartphones, there is certainly a problem with lack of screen real estate: we would need space for the user interface, data input, and data display. There is currently no reliable general way to split the screen accordingly in phyphox, so even on a tablet some fundamental work would likely be required…
#3
The volume keys (on the device or a headset) could switch between activities; a sketch of intercepting them follows.
The one-input use case is probably better served by the accelerometer, unless you have to measure an insect.
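A rough, hypothetical sketch of how either volume key could be intercepted in an Android Activity to toggle between two modes (the class and field names are made up):

Code:
import android.app.Activity
import android.view.KeyEvent

class ExperimentActivity : Activity() {

    private var inputMode = false   // e.g. false = display, true = data entry

    override fun onKeyDown(keyCode: Int, event: KeyEvent?): Boolean {
        return when (keyCode) {
            KeyEvent.KEYCODE_VOLUME_UP,
            KeyEvent.KEYCODE_VOLUME_DOWN -> {
                inputMode = !inputMode   // switch the active mode
                true                     // consume the key so the volume stays unchanged
            }
            else -> super.onKeyDown(keyCode, event)
        }
    }
}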
#4
Audio feedback might make sense for such experiments, so overloading the volume keys is not optimal.
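For illustration only, a minimal sketch of what such audio feedback could look like on Android, using the platform ToneGenerator (stream, volume, and tone are arbitrary example values):

Code:
import android.media.AudioManager
import android.media.ToneGenerator

// Sketch: a short system beep per event as audio feedback.
fun beep() {
    val tone = ToneGenerator(AudioManager.STREAM_MUSIC, 80)   // volume 0..100
    tone.startTone(ToneGenerator.TONE_PROP_BEEP, 150)         // ~150 ms beep
    // In real use, release the ToneGenerator when done: tone.release()
}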

