Real time data download
#1
It would be great if there was a way to automatically batch download a CSV file every few seconds, say every 5 seconds (using the timestamp in the name to prevent overwriting and to allow for easy sorting). Particularly when streaming to a PC.

This would be great because I have a number of algorithms I want to develop on top of this platform, but I'm about 500 trillion times faster if I program in Python/MATLAB versus Android. A "realtime" uplink over a WiFi network would enable a TON of really cool things. The uses range from practical stuff like closed-loop systems with PCs: your cell phone could act as a remote control, the accelerometer could sense gestures, your phone could be a 3D mouse, or ... lots of options

I was trying to dissect the interface and do it myself, but I think it would be better if done on the developer end lol

Ok so I have made some progress on my own problem

This isn't the infinite solution, but it's miles closer.

http://10.0.0.236:8080/export?format=0

When I run the above command (substitute http://10.0.0.236:8080/ with whatever your realtime phyphox IP address is), it automatically saves a CSV of the current data, which is perfect. This allows for batching. You can make a script in some language like MATLAB that executes these calls and creates a data logger. The only downside is that it does not clear the currently collected data over time; it appends. Therefore you need to load progressively larger files.

FORTUNATELY, the files are appended in a predictable way, so you could dynamically keep track of your position in the file and then only load from that point forward the next time you load in a file. If you go this route, you might also need a garbage collector to remove old files periodically, otherwise you're going to generate a lot of data very fast.
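If it helps, the position-tracking idea sketches out to just a few lines of Python. (This is only a sketch: the file name and CSV layout are hypothetical, and the offset is just a byte position remembered between reads.)

```python
# Sketch of the position-tracking idea: remember a byte offset into the
# growing CSV and only read what was appended since the last call.
# The file name and CSV layout are hypothetical.

def read_new_rows(path, offset):
    """Return (new_lines, new_offset) for everything appended past `offset`."""
    with open(path, "r") as f:
        f.seek(offset)          # jump past everything we already processed
        chunk = f.read()
        return chunk.splitlines(), f.tell()

# usage sketch:
# offset = 0
# while logging:
#     rows, offset = read_new_rows("phyphox_export.csv", offset)
#     process(rows)   # only the newly appended rows
```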

The next question is whether I can run a series of commands that save the collected data, pause data collection, clear the previous data (maybe only periodically, as it will cause breaks in the file), start a new data collection, and then continue saving data for a while. This will at least limit the size of the file so it does not result in massive data transfers that gunk up the system. I'll post an update if I figure out a good way to moderate the file size.

Alright, so here is how you make a continuous data logger over WiFi. It comes down to the following commands:

http://10.0.0.236:8080/export?format=0
(saves data)

http://10.0.0.236:8080/control?cmd=clear
(clear data)

http://10.0.0.236:8080/control?cmd=start
(start data)

Of course, you need to replace http://10.0.0.236:8080 with the URL provided to you by phyphox.

You start a data session on your phone, then you pull data as frequently as you like, keeping track of your position in the file and loading that data in sequentially.

After maybe 2 or 4 minutes, however long you like, you clear the memory (this will bring the file size back down to about 1 kB).

Then you restart the data collection and begin polling data again. It looks something like this (I am using # at the end of the line to indicate comments):

http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
....... (keep doing this until you hit a file size that's too big)

http://10.0.0.236:8080/control?cmd=clear #Clear the data

http://10.0.0.236:8080/control?cmd=start #Start a new data collection

http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds
http://10.0.0.236:8080/export?format=0 #Save data
pause(10) #Wait 10 seconds

... repeat forever
#2
I have been working on a Python 3 implementation for the data logger. The logger sequentially calls URLs to trigger data collection and cleanup. I use Chrome to call the data download URL because it opens a tab, saves the data, and then closes the tab automatically. I'm not sure if autosaving to the download folder is the default for all Chrome installations, but it is the case for me. You should probably make sure "Ask where to save each file before downloading" is unchecked in Chrome's download settings.


Here is the code to create sequential data files (Python 3):


#########################################

import time
import urllib.request
import webbrowser


IPAddress = '10.0.0.236:8080'  # IP address and port; this differs per person and is shown by the phyphox app
num_data = 5   # number of data chunks to take
pause_tm = 2   # seconds to wait between data collections


save_dat = 'http://' + IPAddress + '/export?format=0'     # save the current data
clear_dat = 'http://' + IPAddress + '/control?cmd=clear'  # clear the data collection
start_dat = 'http://' + IPAddress + '/control?cmd=start'  # start a data collection

# The quotes around the path are needed because "Program Files (x86)" contains
# spaces. If you are not on Windows, change this to wherever Chrome lives.
chrome = webbrowser.get('"C:/Program Files (x86)/Google/Chrome/Application/chrome.exe" %s')


# Here is where the program actually starts; everything beforehand was just prep

urllib.request.urlopen(start_dat)  # start collecting data!

for v in range(num_data):
    chrome.open(save_dat)   # open a Chrome tab and save the data
    time.sleep(pause_tm)    # wait a bit before collecting again

urllib.request.urlopen(clear_dat)  # clear the data collection
urllib.request.urlopen(start_dat)  # restart the data collection

# Collect data again, for fun, why not
for v in range(num_data):
    chrome.open(save_dat)
    time.sleep(pause_tm)

urllib.request.urlopen(clear_dat)  # clear the data collection

###############################


You need to set your 'IP address:port' in the IPAddress variable and make sure the phyphox app has the remote connection enabled. The experiment should be paused before you run the Python code. When you run the Python code, it will trigger a recording to start. Then it will generate data logs every few seconds. Afterwards it will clear the current data session and then create a few more data logs. This is just to illustrate the basic mechanism of logging data, reducing the file size, and logging data again.

Next I need to have a parallel thread that pulls the data periodically and creates a continuous data segment within python.
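A minimal sketch of what that parallel puller could look like, assuming the same /export URL as above (the `sink` list is just a stand-in for whatever hands data to the main program):

```python
# Hedged sketch of a background puller: a thread that downloads the export
# URL every few seconds while the main program keeps working.
import threading
import urllib.request

def pull_loop(url, period, stop_event, sink):
    """Download `url` every `period` seconds until `stop_event` is set."""
    while not stop_event.is_set():
        with urllib.request.urlopen(url) as resp:
            sink.append(resp.read())      # hand the raw bytes to the main program
        stop_event.wait(period)           # sleep, but wake early if stopped

# usage sketch:
# stop, chunks = threading.Event(), []
# t = threading.Thread(target=pull_loop,
#                      args=("http://10.0.0.236:8080/export?format=0", 2, stop, chunks),
#                      daemon=True)
# t.start()
# ... process chunks in the main thread ...
# stop.set(); t.join()
```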

I ran the above code and generated a few files. I opened two files that were generated back to back and lined up the data that was collected. The data does appear to be directly appended in subsequent files, which should greatly reduce the computational load. Obviously I will load in the last few values and do an error check just to make sure I didn't drop a packet, but I can probably load segments of files instead of whole files.

Then... The signal processing can begin!!
#3
Nice work and thanks for sharing your solution.

If you want to improve on this, you should have a look at our documentation of the remote interface:
https://phyphox.org/wiki/index.php?title...munication

You can directly request the data as a JSON object, and you can transfer only the new data from a given moment on instead of transferring everything every few seconds. Also, if you look at the YouTube video in the remote interface documentation (https://www.youtube.com/watch?v=sFx9zZKe4E4), there is a short example Python script that requests the latest value.
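For anyone following along, requesting a single value over the JSON interface looks roughly like this in Python. (This is a sketch: the buffer name "accX" and the phone's address are assumptions for your experiment, and the JSON shape follows the remote-interface documentation.)

```python
# Sketch of requesting the latest value of one buffer via /get?<buffer>.
# Buffer name and address are placeholders for your own setup.
import json
import urllib.request

PHONE = "http://10.0.0.236:8080"   # replace with the address phyphox shows you

def latest_from_json(payload, buffer_name):
    """Pull the newest sample of one buffer out of a /get JSON response."""
    return payload["buffer"][buffer_name]["buffer"][-1]

def get_latest(buffer_name, base=PHONE):
    """Ask the phone for the latest value of `buffer_name`."""
    with urllib.request.urlopen(base + "/get?" + buffer_name) as resp:
        payload = json.loads(resp.read().decode())
    return latest_from_json(payload, buffer_name)

# e.g. get_latest("accX")  ->  most recent x-acceleration sample
```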
#4
Thank you so much for your reply!! And on a Saturday, that's devotion!

I have a horrible tendency to not read information that already exists and thus reinvent things. It can be a useful practice though, as it helps you develop skills.

I am so glad to see there is a clean mechanism in place to enable real-time communication. I have so many things I want to develop for this. I can see a tremendous amount of opportunity here, ranging from robotics to "fun smart tools". I think this could be a very powerful platform with a couple of extra doodads on top of it. Right now a limitation is the development of iterative app releases: getting an update depends on new app versions, but if you also have a "marketplace" with PC software that runs alongside phyphox, you can allow for rapid algorithm development, deployment, and utilization. Think of it as Steam for phyphox. You need the platform to enable the algorithm development.

I have some ideas for getting a large number of cell phones streaming simultaneous data and being pooled at the PC. Now you have a sensor network with continuous sensor streams from several domains..... What greatness might come of such an accessible learning environment??
#5
(11-10-2018, 07:08 PM)Sebastian Staacks Wrote: [see post #3 above]

Here is another question, though... Using your specified method, how much dropped data would you expect? I am looking for high-frequency data transmission upwards of 100 Hz on as many sensors as possible. Would I get a perfect sampling rate if I query it as you suggested?
#6
(11-11-2018, 03:03 AM)jbshute Wrote: Here is another question, though... Using your specified method, how much dropped data would you expect? I am looking for high-frequency data transmission upwards of 100 Hz on as many sensors as possible. Would I get a perfect sampling rate if I query it as you suggested?

No, not with the method in the example, which just requests the latest value. But if you have a look at the documentation, you can add parameters to specify which part of the buffer you need:

Code:
/get?abc

Just returns the latest value of the buffer "abc".

Code:
/get?abc=full
Returns the whole buffer "abc" (similar to your export method).

Code:
/get?abc=42
Returns all values of the buffer "abc" starting from the first one that is larger than 42.

Code:
/get?abc=42|def
Returns all values of the buffer "abc" starting from the first value that is larger than 42 in the buffer "def".
The latter two only make sense if you have two buffers that represent value pairs and if one of them is monotonic. The typical example would be a buffer "time" for time data (which would be the monotonic one) and "value" with some sensor data. If you already have the first 30 seconds of your measurement (or don't care about the older data), you could do the following request:

Code:
/get?time=30&value=30|time
This returns two arrays: all time entries larger than 30 (seconds in this example) and all value entries at indices that match time values after 30 s.
The reason that we do not directly request a specific index is that phyphox can also handle finite buffers (i.e. a queue) that only contain the last n values. New values push older ones to lower indices, so a client cannot reliably keep track of these indices if finite buffers are used.
You might also want to have a look at our file format for the experiments in the app. This is how we (as well as teachers) deploy new experiments without updating the app. A good example of what our file format can achieve is our old Christmas experiment: https://phyphox.org/xmas. With the upcoming version 1.1.0, new experiments can also simply be added via QR codes.
#7
(11-11-2018, 10:45 AM)Sebastian Staacks Wrote: [see post #6 above]

Hi,

I tried your Python script for real-time access of acceleration data.

It worked really well.

But what I am trying to achieve is high-frequency, equidistant samples at up to 400 Hz. The spectrum in the app shows frequencies up to 200 Hz, so I should have 400 Hz sampling.

The current time values show that the data is not equidistant and only arrives at about 30-100 Hz.

I am connected via hotspot on the 5 GHz band.

Is there some improvement possible?

Thanks.
#8
You won't be able to make 400 requests per second, especially as the web server on the phone is not that efficient. In fact, you would need an even higher request frequency to make sure not to miss a value due to "bad timing". You need to request multiple values at once at a lower request rate. To do so, have a look at the last example, "/get?time=30&value=30|time".

For example, let's assume you want to read acceleration data from a buffer "acc" and there is a matching timestamp buffer "t". You will need to request /get?acc=full&t=full first to get all data points that are already there. Then your script needs to take the last entry of "t" and request any data that has been taken since this last "t". For example, if the last data point was acc=7.2 and t=0.34, you would request anything since 0.34s like this: /get?t=0.34&acc=0.34|t which means that you request all values of t larger than 0.34 and all values of acc that correspond to t values larger than 0.34.

This way you can lower the request rate and even if it is limited by your network you will not miss any data point.
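The request pattern described above sketches out roughly like this in Python. (A sketch only: the buffer names "t" and "acc" are the example names from this thread and may differ in your experiment, and the JSON shape follows the remote-interface documentation.)

```python
# Sketch of the incremental polling described above: request everything once,
# then only samples newer than the last timestamp we have seen.
import json
import time
import urllib.request

BASE = "http://10.0.0.236:8080"   # replace with your phone's address

def next_query(last_t):
    """Build the /get query string: full buffers first, then only new samples."""
    if last_t is None:
        return "t=full&acc=full"
    return "t={0}&acc={0}|t".format(last_t)

def poll(period=0.5):
    last_t = None
    while True:
        with urllib.request.urlopen(BASE + "/get?" + next_query(last_t)) as resp:
            data = json.loads(resp.read().decode())
        t = data["buffer"]["t"]["buffer"]
        acc = data["buffer"]["acc"]["buffer"]
        if t:
            last_t = t[-1]             # resume from the newest timestamp next round
            print(list(zip(t, acc)))   # hand the new pairs to your processing
        time.sleep(period)
```

Because each request resumes from the newest timestamp already received, a slow network only delays the data; it does not drop any samples.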
#9
Edit:

It does not work in the sense you proposed.

I was experimenting a little bit, but line 28 and line 38 do not work.

My Code is:

Code:
#MIDI configuration
M_OUTPUT = "Midi Through:Midi Through Port-0 14:0"
M_CHANNEL = 0
M_CONTROLS = [1, 2, 3, 4] #You can send on different CC channels
# M_CONTROLS = [70]

#phyphox configuration
PP_ADDRESS = "http://192.168.178.29:8080"
PP_CHANNELS = ["accX", "accY", "accZ", "acc_time"] #If using different CC channels, define multiple phyphox buffers
# PP_CHANNELS = ["accY"]

import requests
import time
import numpy

value_temp = 0

time_update = 0
count = 0
while True:
    if count == 0:
        PP_CHANNELS_def = ["acc_time=full"]
        PP_CHANNELS_def_x = ["accX=full"]
        PP_CHANNELS_def_y = ["accY=full"] #If using different CC channels, define multiple phyphox buffers
        PP_CHANNELS_def_z = ["accZ=full"] #If using different CC channels, define multiple phyphox buffers
    else:
        PP_CHANNELS_def = ["acc_time="  + str(time_update)]
        PP_CHANNELS_def_x = ["acc_time=" + str(time_update), "accX=" + str(time_update) + "|acc_time"] #If using different CC channels, define multiple phyphox buffers
        PP_CHANNELS_def_y = ["accY=" + str(time_update)] #If using different CC channels, define multiple phyphox buffers
        PP_CHANNELS_def_z = ["accZ=" + str(time_update)] #If using different CC channels, define multiple phyphox buffers

    url = PP_ADDRESS + "/get?" + ("&".join(PP_CHANNELS_def))
    url_x = PP_ADDRESS + "/get?" + ("&".join(PP_CHANNELS_def_x))
    url_y = PP_ADDRESS + "/get?" + ("&".join(PP_CHANNELS_def_y))  
    url_z = PP_ADDRESS + "/get?" + ("&".join(PP_CHANNELS_def_z))  

    data = requests.get(url=url).json()
    data_x = requests.get(url=url_x).json()
    data_y = requests.get(url=url_y).json()
    data_z = requests.get(url=url_z).json()

    time.sleep(0.04)
    count = count + 1
    time_update = count * 0.04;

    # Uncomment to send pitch bend
    time_value = numpy.array(data["buffer"]["acc_time"]["buffer"])
    value_x = numpy.array(data_x["buffer"]["accX"]["buffer"])
    value_y = numpy.array(data_y["buffer"]["accY"]["buffer"])
    value_z = numpy.array(data_z["buffer"]["accZ"]["buffer"])

    # values_all = numpy.transpose(numpy.vstack((value_x,value_y,value_z,time_value,numpy.hstack((numpy.mean(1/(time_value[1:-1]-time_value[0:-2])),1/(time_value[1:]-time_value[0:-1]))))))
    # print("Sending" + ", values " + str(values_all))

The error is:

Code:
During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "/home/roman/Phyphox/phyphox2.py", line 38, in <module>
    data_x = requests.get(url=url_x).json()

  File "/home/roman/anaconda3/envs/phyphox/lib/python3.9/site-packages/requests/api.py", line 75, in get
    return request('get', url, params=params, **kwargs)

  File "/home/roman/anaconda3/envs/phyphox/lib/python3.9/site-packages/requests/api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)

  File "/home/roman/anaconda3/envs/phyphox/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)

  File "/home/roman/anaconda3/envs/phyphox/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)

  File "/home/roman/anaconda3/envs/phyphox/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    raise ConnectionError(err, request=request)

ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

And is it possible to get a frequency higher than 400 Hz? The built-in sensor (a BMI260) should be able to sample at up to 1600 Hz. Is this restricted by the phone?

Thanks
#10
Hm, we are talking about smartphone sensors, aren't we? There is a difference between the capabilities of the built-in sensors and what the API provides us. The rates in https://phyphox.org/sensordb/ are upper limits, and I could not find anything (common) above 500 Hz…