MATLAB: Scoping Data over Serial in Real Time – Is it feasible?

Tags: plotting, real time, serial

I am using a TI microcontroller to control a system. I sample at 10 kHz, and each sample is a 16-bit unsigned integer value. The development board is connected by USB to my PC, and the USB connection carries two serial channels – one for the JTAG emulation and one that is available for other uses.
My question is this: Is it possible to log data back to the PC over serial and then use MATLAB as an oscilloscope to view it in real time? This would be extremely useful for debugging and for gathering experimental data. A very rough calculation suggests it is possible: 10,000 samples/s * 16 bits = 160,000 bps. The microcontroller datasheet shows the serial interface operating at 6.25 Mbps in one of the applications.
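The back-of-envelope arithmetic above can be written out as a quick sanity check. This is an illustrative Python sketch using only the figures quoted in the question; the constant names are made up for clarity:

```python
# Throughput sanity check for the figures quoted in the question.
SAMPLE_RATE_HZ = 10_000        # 10 kHz sampling rate
BITS_PER_SAMPLE = 16           # 16-bit unsigned samples
LINK_CAPACITY_BPS = 6_250_000  # 6.25 Mbps figure from the datasheet example

payload_bps = SAMPLE_RATE_HZ * BITS_PER_SAMPLE
utilisation = payload_bps / LINK_CAPACITY_BPS

print(f"Raw payload rate: {payload_bps} bps")   # 160000 bps
print(f"Link utilisation: {utilisation:.1%}")   # 2.6%
```

So the raw payload is only a few percent of the quoted link rate; the bottleneck, as the answer below explains, is packetisation rather than bandwidth.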
Are there any examples out there of this being done, or resources that I could look at?

Best Answer

In theory a USB 2.0 port should be able to handle that data rate for a virtual COM port, provided that at least 160 bits of payload are sent per packet... so 10 or more samples at a time.
If you have configured the device to send a packet after every 2-byte sample, then USB 2.0 will not be able to handle that load.
Serial over USB is not the same, for performance purposes, as a true serial port. With a true serial port, you could in theory configure an interrupt for every byte (though that could be stressful on the interrupt service routine); with serial over USB, you are effectively limited to 1000 transfers per second for virtual COM ports, so you are forced to bundle multiple samples per packet. That is fine if you only care that the bytes get to the other end without overflow, but it is not fine if your "real time" requirements do not permit transport delays while samples are bundled together into a packet.
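The bundling arithmetic implied above works out as follows. This is an illustrative sketch, assuming the 1000-transfers-per-second figure for a virtual COM port and the 10 kHz / 16-bit parameters from the question:

```python
# Bundling arithmetic, assuming ~1000 USB transfers per second for a
# virtual COM port (the figure given in the answer above).
SAMPLE_RATE_HZ = 10_000
TRANSFERS_PER_SEC = 1_000
BYTES_PER_SAMPLE = 2

# Minimum samples that must share each packet to keep up with the stream:
samples_per_packet = SAMPLE_RATE_HZ // TRANSFERS_PER_SEC      # 10 samples
payload_bytes = samples_per_packet * BYTES_PER_SAMPLE         # 20 bytes/packet

# Worst-case delay while a packet fills before it can be sent:
bundling_latency_ms = 1000.0 * samples_per_packet / SAMPLE_RATE_HZ  # 1.0 ms

print(samples_per_packet, payload_bytes, bundling_latency_ms)
```

So each packet must carry at least 10 samples (20 bytes), and the oldest sample in a packet is about a millisecond stale before it even leaves the device.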
In terms of Simulink and an oscilloscope display: you would need to reconstruct time somehow. There are three approaches:
- simulate time, under the assumption that the data is being transmitted at a fixed frequency;
- time-stamp the packets as they come in and assume that the sampling frequency was fixed within the buffer of samples (count backwards from the present time by a fixed amount per sample to deduce the time each sample was generated); or
- have the device time-stamp the samples, which can multiply the number of bytes needed per sample – generally 8 bytes of timestamp if using absolute time, and generally 4 bytes of timestamp if using relative time (in which case there can be difficulties in synchronizing the meaning of the base time).
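The second option (back-timestamping a received buffer) can be sketched in a few lines. This is an illustrative Python sketch, not any real serial API; the function and parameter names are made up, and it assumes the last sample in the buffer was taken at the moment the packet arrived:

```python
# Back-timestamping sketch: stamp a packet on arrival, then count backwards
# at the nominal sample period to recover a time for each sample.
SAMPLE_PERIOD_S = 1.0 / 10_000  # 10 kHz nominal sampling rate

def back_timestamp(recv_time_s, samples):
    """Return (time, value) pairs, assuming the last sample is 'now'.

    recv_time_s -- arrival time of the packet, in seconds
    samples     -- list of sample values from one packet
    """
    n = len(samples)
    return [(recv_time_s - (n - 1 - i) * SAMPLE_PERIOD_S, s)
            for i, s in enumerate(samples)]

# Example: a 4-sample buffer whose packet arrived at t = 1.0 s
stamped = back_timestamp(1.0, [100, 101, 102, 103])
# The last sample is stamped 1.0 s; earlier ones step back 100 us each.
```

Note that this deduces times rather than measures them, so any jitter in packet delivery shows up as a small, systematic error in the reconstructed axis – usually acceptable for a debugging scope.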