Medtrum A6 CGM – A week (or two) in the life

After the initial couple of days with the Medtrum and the data it produced, I thought it would be a good idea to provide an update on wearing the sensor through to its end of life. The important questions for me were:

  • How long would it last?
  • How accurate would it be?
  • Would there be any skin reaction?
  • How was the app to live with?
  • Ultimately did I think it was worth it?

I can say that I don’t think one sensor is enough to answer these questions fully, but it gives an opportunity to provide a brief insight into life with the Medtrum. As per the last post, this was the Medtrum A6 using Easysense, up against the Dexcom G5 using xDrip+.

It lasted longer than the warranted seven days, but not as long as the oft-quoted 14 days. My experience was that it wasn’t as accurate as the Dexcom G5 during the warranted period, but after that it got a little more interesting, and the stickiness and application of the adhesive were a lot better than the G5’s. I had no skin reactions. And it mostly worked. But there are some other bits and pieces around this brief overview that need further explaining. So on we go.

First up. How long did it last?

The A6 with the transmitter on full charge lasted a full seven days. The transmitter made it through to day nine before the connection with the phone kept dropping. At this point, I took it off and recharged it for two hours, and presto, it worked again. This also revealed the mechanism by which the system learns about a new sensor.

When you reach seven days with the app set to 14, it just continues. When you disconnect the transmitter, it assumes a new sensor and goes through a warm-up, so you get two hours of charging and two hours of warm-up. While a four-hour break probably isn’t a disaster, it probably needs to be planned.

By the time I got to day 11, I again went into long cycles of “Lost Sensor” and, having tried restarting the sensor, recharging the transmitter, and pretty much anything else I could think of, after four hours of no data I came to the conclusion that the sensor was no longer functioning. I suspect that the transmitter removal I was forced to undertake won’t have helped. We’ll see if that is corroborated by the second sensor that I am now wearing.

Having said that, the glue on the adhesive pad was way better than that on the Dexcom, which I had to overtape at nine days. Apologies for the photos taken in the mirror!

The Dexcom sensor lasted a further five days before being replaced.

How accurate was it?

Just to be clear, this was only one sensor, so the data is unlikely to be wholly representative. Having said that, this is the distribution of results:

The G5 was much tighter in my limited test than the A6, although the G5 did exhibit one red-zone reading, which the A6 managed to avoid. What does this mean? Well, if we plot these values next to the blood testing, we get:

Each bar represents the variation of the sensor data from the blood test done at the same time. What I found interesting was that the A6 read higher than blood more regularly than the G5 did; in fact, if we look at the numbers:

During the warranted period, nearly two thirds of the readings were high compared to the blood test. What’s perhaps most unexpected in this is that the G5 in week two read much like the A6 in week one, and this perhaps needs further investigation. What was also noticeable was that as the sensor approached the end of its life, the VfB dropped away, almost suggesting that the sensor catalyst was being used up and that the twice-daily calibration wasn’t enough to overcome this.

In terms of the Variation from Blood (VfB) figure, if we look at the Mean Absolute VfB, we get the following:
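For those unfamiliar with the figure, the Mean Absolute VfB is essentially the same shape of calculation as a MARD: the average of the absolute percentage differences between paired sensor and fingerstick readings. A minimal sketch, using invented example values rather than my actual data:

```python
# Sketch of a Mean Absolute Variation-from-Blood (VfB) calculation.
# The readings below are made up for illustration -- not the study data.
def mean_absolute_vfb(sensor, blood):
    """Mean absolute relative variation of sensor vs. blood, as a percentage."""
    if len(sensor) != len(blood) or not sensor:
        raise ValueError("need equal-length, non-empty reading lists")
    return 100 * sum(abs(s - b) / b for s, b in zip(sensor, blood)) / len(sensor)

# Hypothetical paired readings in mg/dl at calibration times:
sensor_readings = [110, 152, 98, 201, 87]   # CGM values
blood_readings  = [100, 140, 105, 180, 90]  # fingerstick values

print(round(mean_absolute_vfb(sensor_readings, blood_readings), 1))  # → 8.0
```

A low figure here only tells you about the average error; as the charts show, the spread and direction of the individual variations matter just as much.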

The G5 appears to match the MARD represented in the documentation, whereas the A6 is much closer to the G4 or older Medtronic offerings. Week two told a different story though:

Neither wins awards for that set of results, but it does raise the question of continuing to use a closed loop on an older sensor.

Finally, how do these numbers track versus blood?

I think this picture demonstrates that both have a decent lag, but due to the lack of testing for timing, it’s difficult to state that one was worse than the other; that’s something worth continuing to test.

One thing that I noted over the life of the sensor was that, in general, if you are not good with calibration hygiene on the A6, it drifts a lot further than the Dexcom does using xDrip+. Even on the second sensor, I’m seeing a similar level of difference in my VfB.

How was the app to live with?

The app is surprisingly easy to use, and I prefer it to the Dexcom app. The summary data is pretty good, and it does give you a day-to-day “stats” view. And of course you can review your data on Easyview, although what you can easily view is a little inconsistent. You’d expect to be able to select the range of dates you want to review in all the screens provided; however, in some you can do just that, and in others your maximum look-back is 14 days. Not quite how I’d expect it to work.

Whilst this isn’t really an app complaint, I found it suffered compression hypos more than the G5 did, although this could just be linked to placement. I also found that overnight there was more frequent loss of connection, although the backfill functionality meant this wasn’t really an issue. The other side of this is that, as I was able to compare it to the G5, it felt less consistent, which matters when building up trust with a system like this. If you look at the difference between the sensor readings, and then consider what we saw in terms of VfB, you can see why:

Whilst this chart is in mg/dl, what it shows is that the A6 was often more than 1 mmol/l away from the G5 in one direction or the other. You can see why trust might become an issue.
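For anyone working in mmol/l: glucose converts between units by a factor of roughly 18.016 mg/dl per mmol/l (the standard molar-mass factor, nothing specific to either app), so a 1 mmol/l gap on this chart is about 18 mg/dl. A quick sketch:

```python
# Standard glucose unit conversion: ~18.016 mg/dl per mmol/l.
MGDL_PER_MMOLL = 18.016

def mgdl_to_mmoll(mgdl):
    """Convert a glucose reading from mg/dl to mmol/l."""
    return mgdl / MGDL_PER_MMOLL

def mmoll_to_mgdl(mmoll):
    """Convert a glucose reading from mmol/l to mg/dl."""
    return mmoll * MGDL_PER_MMOLL

# A 1 mmol/l gap between the two sensors is roughly 18 mg/dl:
print(round(mmoll_to_mgdl(1.0), 1))  # → 18.0
```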

So what conclusions do I draw from this?

As previously, an n=1 sample is very small, and this is n=1 of n=1 so is likely to be even less relevant. For me though, the takeaways have been:

  • I’m not sure I believe the stated MARD of the A6 at 9%. The performance has been much closer to that of the older Medtronic setups using Enlites with Minilink transmitters on an x22 pump than to that of a G5, and those had a stated MARD of around 13%.
  • The first sensor regularly reading above the blood glucose level was a little disconcerting. I’d definitely not recommend dosing from the CGM given those readings, but this may just have been the sensor, so more data is required to verify this.
  • Over the supposed 14-day life of the A6, the consistency of the VfB wasn’t great. In the second week, the results set was rather different from the first.
  • I’d be a little concerned about using the A6 CGM with the A6 pump and the Low Suspend functionality. I’m not sure that it’s really accurate enough for that.
  • Finally, the variation on the Dexcom G5 in the second week was much more significant than I expected. Whilst I’ve been using it for looping beyond seven days, I now wonder about the wisdom of this approach, given the second week’s positive VfB skew. As a result, I’m undertaking further testing to see just how big a difference I observe.

Ultimately I think it comes down to how it is being paid for. If you want something providing real-time data as core functionality and you’re paying yourself, the price of the A6 is very attractive. However, at that price, it is directly competing with the Libre, which can be cheaply modified to become a real-time provider, and with the Dexcom’s ability to extend, which so far the A6 has proven unable to do all that effectively.

I think that if the we-are-not-waiting community can get an xDrip+ or Spike version working with the A6, we may have an opportunity to provide a more accurate “version”, and at the price, that would make it an excellent option.

For now, I’m going to stick with the G5 as my primary CGM source, especially as I’m looping. I’m happy to continue to test out the A6 and see what it produces in terms of accuracy, reliability and consistency, but my initial impression is that while it’s a nice attempt, and it’s great to have competition in the market, it doesn’t feel like a 9% MARD (Mean or Median) system and it’s not proven, at least off the first sensor, to be consistent enough for me. At least with the Libre, when it’s out, it’s out consistently. That’s not the case with the A6.

Overall then, is it worth the £35 a sensor and £200 for the transmitter (or lower cost packs)? I’ll admit that I’m still not sure. In terms of ease of use and access to CGM, yes, I think it is. In terms of accuracy and consistency, it’s like a system from five years ago. The difficulty with that is that systems from now, and we have to include the Dexcom G6 in that, are far better.

2 Comments

  1. Thanks for the info Tim. Bit of a disappointing result, unfortunately. I am decidedly less excited about it now. Still a long ways away from release in Aus though.

    Also, how are your fingers?? Brutal.

    • Sadly. The second sensor hasn’t been much better averaging a 15% or so Mean Absolute relative variation. My fingers didn’t enjoy it.
