I'm using the session-based DAQ interface in MATLAB 2013a with an NI PCI-6229 card. I have found that if I include digital channels (with the 'InputOnly' or 'Bidirectional' setting) for reading on/off signals, and I run the session in the background (startBackground), then MATLAB's memory usage steadily increases. Stopping and resetting the session does not release this memory, nor does deleting the session.
I can attach more detailed code if necessary, but I've found that the simplest way to replicate this is to create a session with one analog channel and one digital channel (the analog channel clocks the digital one), have the DataAvailable callback simply plot the data ( plot(event.TimeStamps, event.Data) ), and run it, as in the sketch below.
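For reference, here is a minimal sketch of what I'm running ('Dev1' and the channel names are placeholders standing in for my actual hardware, and I've kept only the pieces relevant to the leak):

s = daq.createSession('ni');
s.addAnalogInputChannel('Dev1', 'ai0', 'Voltage');        % analog channel provides the scan clock
s.addDigitalChannel('Dev1', 'Port0/Line0', 'InputOnly');  % digital on/off signal
s.Rate = 1000;
s.IsContinuous = true;
lh = s.addlistener('DataAvailable', ...
    @(src, event) plot(event.TimeStamps, event.Data));
s.startBackground();
% ... memory usage climbs steadily while this runs ...
% None of the following releases the memory:
s.stop();
delete(lh);
delete(s);
clear s lh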
Has anyone else encountered this issue and found a way around it?