Using bulk-updates for larger sample rates
Hi - I'm currently using the code from this article: https://uk.mathworks.com/help/thingspeak/continuously-collect-data-and-bulk-update-a-thingspeak-channel-using-an-arduino-mkr1000-board-or-an-esp8266-board.html
My plan was to use this to update my ThingSpeak server every second with data recorded on my Arduino at higher than 1 Hz. I am currently taking vibration sensor data that, for my project, requires a higher sample rate than ThingSpeak can handle natively, so bulk updates seemed to be the solution.
I have gotten this to work as specified in the article: updates every 2 minutes with data points every 15 seconds. I have pushed it as far as data points every second with an update to the server every 10 seconds; however, the system fails at any data-point interval shorter than a second. I believe this is a problem with using delta_t, but I am not sure.
To clarify, I do not have a computer science background; this is a mechanical engineering project, and I've truly hit a wall with this problem. Any help would be appreciated!
My system is an Arduino MEGA 2560 with an Ethernet shield. In the sample code they use RSSI as the sample data; I simply changed this to a random number between 0 and 50.
delta_t cannot be shorter than a second at present. I recommend you use absolute times, but start them far in the past so you can keep them separated. You cannot have duplicate timestamps in the same channel, so you want to make sure to avoid that. If you have a free field, you can use it to encode the actual timestamp, or use some consistent transformation from the data-point timestamp to the actual time on your device. For example, if you start your absolute timestamps at January 1, 1900, 12:00:00, then each one-second step could represent a tenth of a second in your system.
As of April 2021, the time resolution of data stored in a ThingSpeak channel is 1 second. If you're trying to upload raw data of higher resolution, you will have to pack the data into a field using some custom schema, and the built-in charts will not work.