The next phyphox version, 1.1.12, will automagically support the puck.js. For the moment, you can find a reference implementation in our wiki.
The most amazing thing is that we can already embed the JS code for the puck.js in our file format, so pretty much any functionality can be shared without changing anything in phyphox. We should have tried this years ago.
Let us know if you need any support or if you created something cool with this.
The just-released version 1.1.11 adds support for 3D depth sensors to the many smartphone sensors that were already available. These sensors augment the pixels of a camera with spatial depth information for better photos and augmented-reality applications. The technology is known as light detection and ranging (LiDAR), time of flight (ToF), and TrueDepth. The associated distance information opens up entirely new possibilities for physics experiments, but also for mathematics. In this context, phyphox is the first app in the education sector that also utilizes Face ID front cameras for this purpose, which are included in almost all current iPhones, for instance.
Exactly six years ago today, on September 12, 2016, phyphox was released in both major app stores for Android and iOS. Originally intended for university lectures, the app – whether used with a salad spinner to analyze circular motion or to determine the speed of sound with just two smartphones – quickly found its way into schools as well, and around the world. This summer, we exceeded four million installations, more than one million of them as part of volume packages at schools.