
Touchscreen
#1
Android exposes some useful statistics about the touch screen when you enable them in the developer menu.
Could we at least have a log of touch coordinates? The screen can track up to ten fingers (last time I checked), and conductive-rubber-tipped pens (or tape-covered foil) can actuate it.
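If such a coordinate log existed, splitting it into per-finger tracks would be easy. A minimal sketch, assuming the log were a CSV with columns `t,pointer_id,x,y` (the format and column names are my assumption, not anything phyphox actually produces):

```python
import csv
from collections import defaultdict
from io import StringIO

def tracks_from_log(csv_text):
    """Group a touch log into per-pointer tracks.

    Assumed columns: t (seconds), pointer_id, x, y (pixels).
    Returns {pointer_id: [(t, x, y), ...]} so each finger can be
    analysed separately.
    """
    tracks = defaultdict(list)
    for row in csv.DictReader(StringIO(csv_text)):
        tracks[int(row["pointer_id"])].append(
            (float(row["t"]), float(row["x"]), float(row["y"])))
    return dict(tracks)
```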

As for ways to interpret (per touch):
  • X/Y acceleration
  • X/Y speed
  • trajectory speed
  • trajectory acceleration
  • angular deviation
  • the de-rotation quantities from my "map" thread (cumulative angle, its unrolled form, and the same quantities again after applying the X/Y speed to the unrolled result)
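To make the first few list items concrete, here is a minimal sketch (my own illustration, not phyphox code) of deriving per-touch X/Y speed, trajectory speed, and trajectory acceleration from logged `(t, x, y)` samples by finite differences:

```python
import math

def kinematics(samples):
    """Finite-difference kinematics for one touch track.

    samples: list of (t, x, y) tuples, t in seconds, x/y in pixels.
    Returns one dict per sample interval with vx, vy, speed along the
    trajectory, and (from the second interval on) the tangential
    acceleration, i.e. the rate of change of trajectory speed.
    """
    out = []
    prev_speed = None
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        speed = math.hypot(vx, vy)
        accel = None if prev_speed is None else (speed - prev_speed) / dt
        out.append({"vx": vx, "vy": vy, "speed": speed, "accel": accel})
        prev_speed = speed
    return out
```

Real data would need smoothing before differentiating twice, since touch coordinates are noisy; this only shows the bookkeeping.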
#2
We have put it on “the list”… :) It would certainly help if you could describe a concrete use case, so that we can reasonably allocate resources for this.

On smartphones, there is certainly a problem with lack of screen real estate: we would need space for the user interface, data input, and data display. There is currently no reliable general way to split the screen accordingly in phyphox, so even on a tablet some fundamental work would likely be required…
#3
The volume key (on the device or a headset) could switch between activities [i.e., between using the screen as trajectory input and as phyphox's user interface].
The single-input use case is probably better served by the accelerometer, unless you have to measure something like a crawling insect. It is harder to envision what else it could be used for, but ten concurrent inputs are alluring.
#4
Audio feedback might make sense for such experiments, so overloading the volume keys is not optimal.
#5
I think you have misunderstood, so I have added more detail to my previous post.
#6
I have thought of some experiments:
-Challenge: try doing a task with one hand while drawing a circle with the other. You can use the first graphing method (see my map thread) to see who is best; perfection will show as a straight line.
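The "straight line" check for the circle challenge could look like this (my own sketch of the cumulative-angle/unroll idea from the map thread; the circle's centre is assumed known or fitted beforehand):

```python
import math

def unrolled_angle(points, cx, cy):
    """Cumulative (unwrapped) angle of touch points around a centre.

    points: list of (x, y) coordinates tracing a circle around (cx, cy).
    Plotted against sample index (or time), a perfectly even circular
    motion gives a straight line; hesitations and wobbles show as bumps.
    """
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    unrolled = [angles[0]]
    for a in angles[1:]:
        d = a - (unrolled[-1] % (2 * math.pi))
        # wrap each step into (-pi, pi] so full turns keep accumulating
        while d <= -math.pi:
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        unrolled.append(unrolled[-1] + d)
    return unrolled
```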

-Rotation-touching can indicate progress (like your water meter), and thus can graph speeds and compare different inputs. A different (and potentially less sensitive) approach is to use sound: your app can show the amplitude of different frequencies. Shall I add a frequency-modulation thread? It should be capable of tracking multiple inputs, and it is more flexible than rotation-touching.
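The sound alternative, showing the amplitude of a few known frequencies, could be sketched with the Goertzel algorithm (my own illustration of the idea, not how phyphox implements its audio analysis):

```python
import math

def goertzel(samples, sample_rate, freq):
    """Estimate the amplitude of one frequency in a block of samples.

    The Goertzel algorithm evaluates a single DFT bin, which is a cheap
    way to track a handful of known tones (e.g. one tone per input)
    without computing a full spectrum.
    """
    n = len(samples)
    k = round(freq * n / sample_rate)      # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2
    return math.sqrt(power) * 2 / n        # magnitude scaled to amplitude
```

Running this once per tone per audio block gives one amplitude stream per input, which could then be graphed exactly like the rotation-touching speeds.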
#7
I still do not really get the second example… (rotation-touching sounds to me more like a water tap)

