
refactor PS[3,4,5,6]000 picoscope implementation #167

@RalphSteinhagen

Description


While operationally deploying the picoscope, some issues have been identified that need to be fixed and/or streamlined:

  • fix/improve the matching between emitted timing event Tags and corresponding detected sampled HW trigger edges in the picoscope block #168
    • Issues not covered by the merged initial re-implementation, to be followed up on:
      • Add tags indicating missing synchronisation and/or dropped tags
      • Track the clock drift between the system and sampling clocks, as well as latencies
      • gnuradio4 follow-up to properly support multiple events (= tags) on the same sample
      • systematically cover high-load testing of the matcher. Goal: 1 kHz trigger rate, ~250 MS/s sample rates, worst-case latency << 100 ms (the digitizer block itself should probably stay < 10 ms to account for other delays/uncertainties in the chain).
      • correct the shifted offsets for tags emitted with the same timestamp
      • Rework public TimingSource settings interface to select TimingEvents for hardware edges and/or emitting tags.
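One of the follow-ups above is tracking the drift between the system clock and the sampling clock. A minimal, hypothetical sketch of such a tracker is a windowed least-squares fit of system time versus sample index; all names (`ClockDriftEstimator`, `nsPerSample`) are illustrative, not the actual block API:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>
#include <utility>

// Hypothetical sketch: estimate the sampling-clock rate relative to the system
// clock from (sampleIndex, systemTimeNs) pairs via a windowed least-squares fit.
// In the real block these pairs would come from the timing receiver and the
// digitizer driver callbacks.
struct ClockDriftEstimator {
    std::deque<std::pair<double, double>> samples; // (sampleIndex, systemTimeNs)
    std::size_t                           window = 1024UZ;

    void add(double sampleIndex, double systemTimeNs) {
        samples.emplace_back(sampleIndex, systemTimeNs);
        if (samples.size() > window) {
            samples.pop_front();
        }
    }

    // estimated nanoseconds per sample (slope of the linear fit); comparing this
    // against the nominal 1e9 / sample_rate yields the drift
    [[nodiscard]] double nsPerSample() const {
        const double n = static_cast<double>(samples.size());
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (const auto& [x, y] : samples) {
            sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
        const double denom = n * sxx - sx * sx;
        return denom == 0.0 ? 0.0 : (n * sxy - sx * sy) / denom;
    }
};
```

A usage sketch: feeding pairs that advance by 100 ns per sample yields `nsPerSample() ≈ 100`, independent of a constant latency offset.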
  • change the picoscope block architecture from an 'inheritance'- to a 'composition'-based pattern.
    goal: the present inheritance approach is problematic w.r.t. GR4's CRTP pattern (introspection) and is too strongly coupled, making individual sub-component testing across the different picoscope models unnecessarily hard. The inheritance should be broken up into a templated pattern where the number of ports and the PS implementation are injected as NTTP and type parameters, in the style of:
    template<typename T, std::size_t nAnalog, typename TPicoscopeImpl>
    requires(std::is_same_v<T, std::int16_t> || std::is_same_v<T, float> || std::is_same_v<T, gr::UncertainValue<float>> ||
             std::is_same_v<T, gr::DataSet<std::int16_t>> || std::is_same_v<T, gr::DataSet<float>> || std::is_same_v<T, gr::DataSet<gr::UncertainValue<float>>>)
    struct Picoscope : gr::Block<Picoscope<T, nAnalog, TPicoscopeImpl>> {
        using TDigitalOutput = std::conditional_t<gr::DataSetLike<T>, gr::DataSet<std::uint16_t>, std::uint16_t>;
        static constexpr AcquisitionMode acquisitionMode = gr::DataSetLike<T> ? AcquisitionMode::RapidBlock : AcquisitionMode::Streaming;
    
        // port definitions
        gr::PortIn<std::uint8_t, gr::Async> timingIn;
        std::array<gr::PortOut<T>, nAnalog> out;
        gr::PortOut<TDigitalOutput>         digitalOut; // for non-MSO, this contains only the trigger on the first bit
    
        // settings
        A<std::string, "serial number">   serial_number;
        A<float, "sample rate", Visible>  sample_rate              = 10000.f;
        A<gr::Size_t, "pre-samples">      pre_samples              = 1000;
        // ... as before
    • the TPicoscopeImpl method abstraction should be (stand-alone) unit-testable and guided by the overarching spec, notably Figures 3 and 5.
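To illustrate why composition makes the model implementation stand-alone unit-testable, here is a deliberately minimal, hypothetical mock: the driver type is injected as a template parameter and replaced by a fake in tests. `DigitizerCore`, `FakeDriver`, `openUnit`, and `readBlock` are illustrative names, not the actual PicoScope API:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the composition pattern: the model-specific driver is
// injected as a type parameter (instead of being a base class), so it can be
// swapped for a fake in unit tests without touching the block logic.
template<typename TDriver>
struct DigitizerCore {
    TDriver driver; // composed, not inherited: testable in isolation

    std::vector<float> acquire(std::size_t n) {
        driver.openUnit();
        return driver.readBlock(n);
    }
};

// test double standing in for a real PSx000 driver wrapper
struct FakeDriver {
    bool opened = false;
    void openUnit() { opened = true; }
    std::vector<float> readBlock(std::size_t n) { return std::vector<float>(n, 1.0f); }
};
```

In a test, `DigitizerCore<FakeDriver>` exercises the acquisition logic without any hardware; in production the same core would be instantiated with the real PS3000/4000/5000/6000 wrapper.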
    • error handling should preferably use the gr::exception(..) and gr::Error(..) interfaces rather than emitErrorMessage(...), for better composability and unit-testability
    • all Picoscope return values need to be handled, not just the OK path. The 'retry' resilience pattern should be deployed for recoverable errors, and a hard exception emitted when the retries fail or the error cannot be recovered from.
  • Complete the full Trigger Implementation & Testing
    • Implement and thoroughly test (unit + integration) digital/external triggers, including edge cases such as HW errors, invalid configurations, fall-back behaviour, and absent signals.
    • Confirm that all relevant and specified modes (Streaming, RapidBlock, etc.) work as expected.
    • This should also include operational test-psX000 sanity-check programs similar to Alex's 'test-timing' program (i.e. covering the different DAQ cases, w/ and w/o trigger, retrieving S/N, firmware version, ...).
    • These tests should cover the basic streaming and soft- and HW-triggered use cases w/ and w/o the timing card, and allow comprehensive in-field failure diagnostics.
      • The test suites must be self-contained, exist for each model independently, and (configurably) output the measured results via the existing UTF-8-based ImChart plotting.
      • The configurations should be contained in the test suites, but the ability to additionally read and use user-provided '.grc' files would also be useful.
    • These need to be built and also deployed on the Yocto image to verify the correct HW functionalities in the field for the deployed devices.
  • Integrate the ps6000 model based on the existing PS3000/5000 version, and prepare the subsequent field testing (e.g. via the to-be-written 'test-ps6000' test program)
    • Verify specifically any features that differ from the PS3000/4000/5000 models (e.g. external clock or EXT trigger lines).
    • Provide a brief summary, documentation, and demonstration (logs, screenshots, or a short video) showing the final tests in action.
  • We need both the non-blocking and (optionally) blocking digitizer behaviour (e.g., using a configurable software timer) to avoid busy polling and burning up the CPU.
    • The present implementation ties up the CPU that the other blocks need for essential post-processing/data-transport functions (N.B. our FECs have only 4 cores).
    • N.B. the polling rate effectively defines the minimum number of available samples, and the buffer size the maximum number of samples, respectively.
    • Both modes need thorough (automated) testing and error handling (via either gr::exception(..) or std::expected<T, gr::Error> + PS model and block name info).
  • Verify correct forwarding of Picoscope settings (channel_name, sample_rate, etc.)
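The blocking digitizer behaviour requested above can be sketched with a condition variable and a timeout derived from the configured polling rate, instead of a busy-poll loop. This is a self-contained, hypothetical sketch; `SampleNotifier`, `notify`, and `waitForSamples` are illustrative names, not the actual block API:

```cpp
#include <atomic>
#include <cassert>
#include <chrono>
#include <condition_variable>
#include <cstddef>
#include <mutex>
#include <thread>
#include <utility>

// Hypothetical sketch of a configurable blocking wait for "samples ready".
// The driver's data-ready callback would call notify(); the block's work
// function calls waitForSamples() with a timeout derived from the configured
// polling rate, yielding the CPU to other blocks while waiting.
class SampleNotifier {
    std::mutex              m;
    std::condition_variable cv;
    std::size_t             available = 0;

public:
    void notify(std::size_t nNewSamples) {
        {
            std::lock_guard lock(m);
            available += nNewSamples;
        }
        cv.notify_one();
    }

    // blocks until at least minSamples are available or the timeout expires;
    // returns the number of samples claimed (0 on timeout)
    std::size_t waitForSamples(std::size_t minSamples, std::chrono::milliseconds timeout) {
        std::unique_lock lock(m);
        if (!cv.wait_for(lock, timeout, [&] { return available >= minSamples; })) {
            return 0;
        }
        return std::exchange(available, 0);
    }
};
```

The non-blocking mode would simply call the same wait with a zero timeout; either way the polling interval bounds the minimum batch size, matching the buffer-size note above.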

Status

🏗 In progress