PCL attended the Making and Distributing Pharmaceuticals show on 21st & 22nd April in Coventry.
There was a lot of discussion around the use of AI in GxP from some of the experts in the field, which was exceptionally interesting and insightful.
If you are thinking about using AI, have you decided what kind of data you want the AI tool to obtain for you, and who your experts are in that particular field? These experts will be needed to train the AI tool. If you are going to introduce AI into your business, you need to ensure it is done in a relevant and responsible way.
They covered the risks involved with using an off-the-shelf LLM (Large Language Model) AI tool, the main ones being a lack of specialism, a lack of transparency and hallucinations. Hallucinations occur when the AI tool veers off track and starts making things up, which makes it difficult to fact-check and to understand where the AI has gone wrong. Ensure you cover the worst-case scenarios in your risk assessments, then test them.
Validation is the key to mitigating the risks of using AI within your business. The AI outputs need to be explainable, decisions must be auditable, and accountability must remain with a human. Look at the risks and how detectable they are. Train the AI tool, then stop and check the results against what you know is correct. Data integrity and data governance are key for the data being used to train the AI. Think about the human-in-the-loop and where and when the review and monitoring of the data is going to happen.
Make sure you have done your research and are using the correct AI tool for your business. Integrate AI into your business in a responsible way, making sure you have SOPs in place to show how it works and that training has been given to all staff members. Let them know the pitfalls of using it and what they need to be aware of. Ensure they know the risks of using AI that hasn't been approved by you: if you are not using a closed AI tool, confidential information could be lost.
Maybe another way to look at it is this: rather than starting with AI solutions, think of the problem you are trying to solve and what the risks look like for each route to a solution, including the routes that use AI. Patient safety needs to remain your main focus. Even when using AI, you need to ensure there is a Human-in-the-Loop; you just need to determine where.
If you are using, or going to use, AI within your business, then ensure you are able to explain how it is being used and what controls you have in place. The regulators will start to give findings against the use of AI in the near future if it is not governed properly!
Whatever your take on AI is, be ready for it as it is coming!
PCL will be holding a forum in August to discuss AI further. If you would like to join the discussion, book now; we would love to see you there.
