11-10-2018, 12:50 PM
It would be great if there was a way to automatically batch-download a CSV file every 5 seconds or so (with a timestamp in the name to prevent overwriting and to allow for easy sorting), particularly when streaming to a PC.
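In rough terms, here is the kind of loop I'm imagining on the PC side (a minimal Python sketch; DATA_URL is just a hypothetical placeholder for whatever endpoint would serve the current data as a CSV):

import time
import requests  # pip install requests

DATA_URL = "http://PHONE_ADDRESS/hypothetical_csv_endpoint"  # placeholder, not a real phyphox URL

while True:
    stamp = time.strftime("%Y%m%d_%H%M%S")                # timestamp in the name: no overwriting, easy sorting
    csv_bytes = requests.get(DATA_URL, timeout=5).content
    with open(f"phyphox_{stamp}.csv", "wb") as f:
        f.write(csv_bytes)
    time.sleep(5)                                          # grab a fresh file every 5 seconds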
This would be great because I have a number of algorithms I want to develop on top of this platform, but I'm about 500 trillion times faster programming in Python/MATLAB than in Android. A "realtime" uplink over a wifi network would enable a TON of really cool things. The uses range from practical stuff like closed-loop systems with PCs, to your cell phone acting like a remote control with the accelerometer used to sense gestures, to your phone being a 3D mouse, or ... lots of options.
I was trying to dissect the interface and do it myself, but I think it would be better if done on the developer end lol
Ok, so I have made some progress on my own problem.
This isn't the ultimate solution, but it's miles closer.
http://10.0.0.236:8080/export?format=0
When I run the above command (substitute http://10.0.0.236:8080/ with whatever your realtime phyphox IP address is), it automatically saves a CSV of the current data, which is perfect. This allows for batching: you can write a script in a language like MATLAB that executes these calls and acts as a data logger. The only downside is that it does not clear the collected data over time; it keeps appending, so you have to load progressively larger files.
FORTUNATELY, the file is appended to in a predictable way, so you can keep track of your position in the file and only load from that point onward the next time you fetch it. If you go this route, you might also need a garbage collector to remove old files periodically; otherwise you're going to generate a lot of data very fast.
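Here is a minimal Python sketch of that bookkeeping, assuming the export comes back as a single plain CSV with one header row (your experiment's export may be packaged differently, so treat this as a template rather than a finished logger):

import time
import requests  # pip install requests

EXPORT_URL = "http://10.0.0.236:8080/export?format=0"  # substitute your own phyphox address
rows_seen = 0  # number of data rows already processed

while True:
    text = requests.get(EXPORT_URL, timeout=5).text
    lines = text.splitlines()
    header, rows = lines[0], lines[1:]
    new_rows = rows[rows_seen:]        # only the part of the file we have not seen yet
    for row in new_rows:
        print(row)                     # hand the fresh samples to your own processing here
    rows_seen = len(rows)
    time.sleep(10)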
The next question is whether I can run a series of commands that save the data, pause the collection, clear the previous data (maybe only periodically, since it will cause breaks in the file), start a new collection, and then continue saving data for a while. This would at least limit the size of the file so it doesn't result in massive data transfers that gunk up the system. I'll post an update if I figure out a good way to moderate the file size.
Alright, so here is how you make a continuous data logger over wifi. It comes down to the following commands:
http://10.0.0.236:8080/export?format=0
(saves the current data)
http://10.0.0.236:8080/control?cmd=clear
(clears the data)
http://10.0.0.236:8080/control?cmd=start
(starts a new data collection)
Of course, you need to replace http://10.0.0.236:8080 with the URL provided to you by phyphox.
You start a data session on your phone, then you pull data as frequently as you like, keeping track of your position in the file and loading that data in sequentially.
After maybe 2 or 4 minutes, however long you like, you clear the memory (this brings the file size back down to about 1 kB).
Then you restart the data collection and begin pulling data again. It looks something like this (I am using # at the end of the line to indicate comments):
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
....... (keep doing this until the file size gets too big)
http://10.0.0.236:8080/control?cmd=clear #Clear the data
http://10.0.0.236:8080/control?cmd=start #Start a new data collection
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
... repeat forever
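For reference, here is what that loop looks like as an actual Python script, assuming the three URLs behave exactly as described above (export returns the current data, clear wipes it, start begins a fresh recording); adjust the address, polling interval, and cycle length to taste:

import time
import requests  # pip install requests

BASE = "http://10.0.0.236:8080"   # substitute the URL provided to you by phyphox
POLL_INTERVAL = 10                # seconds between saves
POLLS_PER_CYCLE = 12              # clear/restart after ~2 minutes of data

def save_export():
    stamp = time.strftime("%Y%m%d_%H%M%S")                             # timestamped filename, nothing gets overwritten
    data = requests.get(f"{BASE}/export?format=0", timeout=5).content
    with open(f"phyphox_{stamp}.csv", "wb") as f:
        f.write(data)

while True:
    for _ in range(POLLS_PER_CYCLE):
        save_export()                                                   # Save data
        time.sleep(POLL_INTERVAL)                                       # Wait 10 seconds
    requests.get(f"{BASE}/control?cmd=clear", timeout=5)                # Clear the data
    requests.get(f"{BASE}/control?cmd=start", timeout=5)                # Start a new data collection

The clear/start pair does cause a short gap in the recording, so run it wherever a break in the data hurts the least.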