Posts: 24
Threads: 11
Joined: Mar 2025

03-17-2025, 03:42 AM
(This post was last modified: 03-17-2025, 03:44 AM by noitarud.)
	
Android supplies some useful stats on the touch screen when you enable them in the developer menu.
Can we at least have a log of coordinates? The screen can track ten fingers (last time I checked), and conductive-rubber-tipped pens (or foil covered with tape) can actuate it; a sketch of logging these coordinates follows.
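A minimal sketch of what capturing such a log could look like on Android, assuming a plain custom View; the class name and log tag are made up for illustration and nothing here is part of phyphox:

```kotlin
import android.content.Context
import android.util.Log
import android.view.MotionEvent
import android.view.View

// Logs timestamped coordinates for every active pointer (finger).
class TouchLogView(context: Context) : View(context) {
    override fun onTouchEvent(event: MotionEvent): Boolean {
        // One MotionEvent carries all current pointers; most screens
        // report up to ten of them.
        for (i in 0 until event.pointerCount) {
            val id = event.getPointerId(i) // stable per finger across moves
            Log.d("TouchLog", "t=${event.eventTime} id=$id x=${event.getX(i)} y=${event.getY(i)}")
        }
        return true // consume the event so move updates keep coming
    }
}
```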
As for ways to interpret the data (per touch):
- X and Y acceleration
- X and Y speed
- trajectory speed
- trajectory acceleration
- angular deviation
- the de-rotation approach from my "map" thread (cumulative angle, "unroll", and then the same quantities again after applying the X/Y speed to the result of the unroll); see the sketch after this list
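A finite-difference sketch of those per-touch quantities, assuming time-ordered samples for a single pointer id; the `Sample` type and function name are made up for illustration. The X/Y and trajectory accelerations would follow by differencing the speeds once more:

```kotlin
import kotlin.math.PI
import kotlin.math.atan2
import kotlin.math.hypot

// One logged touch point: time in seconds, screen coordinates in pixels.
data class Sample(val t: Double, val x: Double, val y: Double)

// Returns rows of (t, vx, vy, trajectory speed, cumulative angle).
fun derive(samples: List<Sample>): List<DoubleArray> {
    val out = mutableListOf<DoubleArray>()
    var unroll = 0.0              // cumulative signed heading change
    var prevHeading: Double? = null
    for (i in 1 until samples.size) {
        val (t0, x0, y0) = samples[i - 1]
        val (t1, x1, y1) = samples[i]
        val dt = t1 - t0
        if (dt <= 0.0) continue
        val vx = (x1 - x0) / dt   // X speed
        val vy = (y1 - y0) / dt   // Y speed
        val speed = hypot(vx, vy) // trajectory speed
        val heading = atan2(vy, vx)
        prevHeading?.let { h ->
            var d = heading - h       // per-step angular change
            if (d > PI) d -= 2 * PI   // wrap into (-pi, pi]
            if (d < -PI) d += 2 * PI
            unroll += d
        }
        prevHeading = heading
        out.add(doubleArrayOf(t1, vx, vy, speed, unroll))
    }
    return out
}
```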
 
Posts: 671
Threads: 31
Joined: Apr 2020
Institution: RWTH Aachen University
	
We put it on “the list”…

It would certainly help if you could describe a concrete use case, so that we can reasonably allocate resources for this.
On smartphones, there is certainly a problem with the lack of screen real estate: we would need space for the user interface, data input, and data display. There is currently no reliable, general way to split the screen accordingly in phyphox, so even on a tablet some fundamental work would likely be required…
		
Posts: 24
Threads: 11
Joined: Mar 2025

03-17-2025, 04:22 PM
(This post was last modified: 03-20-2025, 07:44 AM by noitarud. Edit Reason: Finishing)
	
The volume keys (on the device or a headset) could switch between activities, i.e. between using the screen as trajectory input and using phyphox's normal user interface; a sketch of this key handling follows below.
The one-input use case is probably better served by the accelerometer, unless you have to measure something like a crawling insect. It is harder to envision what that could be used for, but ten concurrent inputs are alluring.
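A sketch of the suggested key overloading, assuming a plain Android Activity; the class and field names are made up, and this is not how phyphox actually handles input:

```kotlin
import android.app.Activity
import android.view.KeyEvent

class ExperimentActivity : Activity() {
    private var captureMode = false // false = normal UI, true = trajectory input

    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean =
        when (keyCode) {
            KeyEvent.KEYCODE_VOLUME_UP, KeyEvent.KEYCODE_VOLUME_DOWN -> {
                captureMode = !captureMode // swallow the key and flip the mode
                true
            }
            else -> super.onKeyDown(keyCode, event)
        }
}
```

Note that consuming the event means the keys no longer change the media volume while the experiment screen is active.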
		
Posts: 671
Threads: 31
Joined: Apr 2020
Institution: RWTH Aachen University
	
Audio feedback might make sense for such experiments, so overloading the volume keys is not optimal.
		
Posts: 24
Threads: 11
Joined: Mar 2025
	
I think you have misunderstood, so I have added more words to my previous post.
		
Posts: 24
Threads: 11
Joined: Mar 2025
	
I have thought of some experiments:
- Challenge: try doing a task with one hand while drawing a circle with the other. You can use the first graphing method (see my map thread) to see who is best; perfection will show as a straight line (a scoring sketch follows this list).
- Rotation touching can indicate progress (like your water meter), and can thus graph speeds and compare different inputs. A different (and potentially less sensitive) way is to use sound: your app can show the amplitude of different frequencies (a per-tone sketch also follows this list). Shall I add a frequency modulation thread? It should be capable of tracking multiple inputs, and it is more flexible than rotation touching.
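A sketch of the "straight line" scoring for the circle challenge: for a perfect circle, the cumulative angle (the unroll from the map thread) grows linearly with arc length, so a least-squares line fit leaves zero residual. Function and variable names are made up for illustration:

```kotlin
import kotlin.math.PI
import kotlin.math.atan2
import kotlin.math.hypot
import kotlin.math.sqrt

// RMS deviation of the unroll-vs-arc-length plot from a straight line;
// 0.0 means a perfectly circular trace.
fun circleScore(xs: DoubleArray, ys: DoubleArray): Double {
    val s = mutableListOf(0.0) // arc length so far
    val u = mutableListOf(0.0) // cumulative angle ("unroll")
    var prevHeading: Double? = null
    for (i in 1 until xs.size) {
        val dx = xs[i] - xs[i - 1]
        val dy = ys[i] - ys[i - 1]
        val step = hypot(dx, dy)
        if (step == 0.0) continue
        val heading = atan2(dy, dx)
        var du = 0.0
        prevHeading?.let { h ->
            du = heading - h
            if (du > PI) du -= 2 * PI // wrap into (-pi, pi]
            if (du < -PI) du += 2 * PI
        }
        prevHeading = heading
        s.add(s.last() + step)
        u.add(u.last() + du)
    }
    // least-squares fit u ≈ a + b*s, then measure the residual
    val n = s.size.toDouble()
    val sMean = s.sum() / n
    val uMean = u.sum() / n
    var cov = 0.0
    var varS = 0.0
    for (i in s.indices) {
        cov += (s[i] - sMean) * (u[i] - uMean)
        varS += (s[i] - sMean) * (s[i] - sMean)
    }
    val b = if (varS > 0.0) cov / varS else 0.0
    val a = uMean - b * sMean
    var sse = 0.0
    for (i in s.indices) {
        val r = u[i] - (a + b * s[i])
        sse += r * r
    }
    return sqrt(sse / n)
}
```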
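For the sound alternative, one way to show the amplitude of several chosen frequencies at once is a Goertzel filter per tone. This is a generic sketch, not phyphox's actual audio analysis, and all names are illustrative:

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.roundToInt
import kotlin.math.sqrt

// Goertzel filter: amplitude of one target frequency in a block of audio
// samples taken at sample rate fs (Hz). One filter per tone tracks
// several concurrent "inputs".
fun toneAmplitude(samples: DoubleArray, freqHz: Double, fs: Double): Double {
    val n = samples.size
    val k = (n * freqHz / fs).roundToInt() // nearest DFT bin
    val omega = 2.0 * PI * k / n
    val coeff = 2.0 * cos(omega)
    var s1 = 0.0
    var s2 = 0.0
    for (x in samples) {
        val s0 = x + coeff * s1 - s2
        s2 = s1
        s1 = s0
    }
    val power = s1 * s1 + s2 * s2 - coeff * s1 * s2
    return 2.0 * sqrt(power) / n // approximate tone amplitude
}

// Example: track three tones in one audio block sampled at 48 kHz.
// val levels = listOf(440.0, 880.0, 1320.0).map { toneAmplitude(block, it, 48000.0) }
```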
		
Posts: 671
Threads: 31
Joined: Apr 2020
Institution: RWTH Aachen University
	
I still do not really get the second example… (rotation touching sounds to me more like a water tap)