
GDPR & AI
What Do You Need To Do To Remain GDPR Compliant?
More and more people are now using AI in their clinics and businesses. AI can process lots of data very quickly, far quicker than people can, which makes it a useful tool. Where personal data is involved, however, you need to be very mindful of GDPR.
The ICO has very clear and comprehensive guidance on AI and data protection. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/artificial-intelligence/guidance-on-ai-and-data-protection/
So, what does this mean for you when putting GDPR compliance in place at your practice? There is a lot to consider here, so I will break it up into more than one blog.
Let’s start with the ICO’s description of what you need to do:
“A DPIA (Data Protection Impact Assessment) needs to describe the nature, scope, context and purposes of any processing of personal data, making clear how and why you are using AI to process data, including:
- How you will collect, store and use data.
- The volume, variety and sensitivity of the data.
- The nature of your relationship with individuals.
- The intended outcomes for individuals or wider society, as well as for you.”

AI Apps
There are lots of different AI applications available. These include general-purpose AI applications such as ChatGPT, Google Gemini and DeepSeek, as well as very task-specific AI applications such as Physiopedia AI Assistant and Flok Health.
Flok Health is the first AI-only physiotherapy clinic in the UK. A patient uses the app to report their back pain and is shown videos of exercises they can do to help with that issue. So, the data being collected is very specific, the variety is small (back pain) and the sensitivity is relatively low.
In contrast, Foresight AI is currently being trained on the medical data of 57 million people who have used the NHS in England, in the hope that it will be able to predict disease. The data here is much larger in terms of volume, and data used to predict disease is highly sensitive.
So, the ICO is asking you to give information on the processing of patient data. This will be much easier with a task-specific AI application than with a general-purpose one.
“You need to evaluate whether any system using AI is more or less risky than using a system that does not use AI and show evidence of your consideration in your DPIA.
- Evidence that you considered less risky alternatives, if any exist, and why you didn’t choose these. This is particularly relevant where you are using public task or legitimate interests as a lawful basis.”

Risk
The Flok Health clinic is an interesting example to use here. Do you, as a practitioner, think that a patient self-reporting their complaint and copying exercises shown to them via video is more risky than a patient explaining their complaint face to face to a practitioner and being guided through their exercises by that practitioner?
Reports are that Flok is proving very successful for many patients, but there is concern that people might not fully understand their back issue and so report it on the Flok app incorrectly. Physiotherapists are also concerned that a patient may follow an exercise instruction in a way that results in injury rather than alleviating it. Do you agree?
By answering this question, you have gone some way to evaluating whether the AI is more or less risky than a system not using AI. You have also considered less risky alternatives.
According to the newspapers, the NHS in Scotland (where Flok is being trialled) has evaluated the risk. It has chosen to trial the Flok app because it believes patients will be seen more quickly than if they waited for an appointment. If this is true, it could cite this as its reason for choosing to use AI.
OK, that is probably enough for one blog. Hopefully it has helped you to start evaluating the GDPR considerations for using AI in your practice, and to begin your DPIA.
Glen Mansbridge
July 2025