




Terms starting with A


Adopter

A person or organisation buying, implementing or using a technology to provide health or social care services.

The technology will typically have been developed by a different person or organisation (the developer). Some technologies are developed by the same people who use them (that is, they are both the developer and the adopter). But for the purposes of this website, developers and adopters are considered separate groups.

An adopter can be any commissioner or provider of health or social care including primary care, mental health or community services, social care in the community or care homes, and secondary and tertiary hospitals. Adopter roles include clinicians, administrators, managers, commissioners, roles responsible for integrating and governing technologies such as IT, information governance, and clinical safety.

Adopters can also be the people using the technology. This includes health and social care workers. It also includes patients, service users or members of the public using a technology themselves or with help from a health or social care professional.

(Source: AI and Digital Regulations Service)


Anonymisation

The process of rendering personal information anonymous through the application of one or more anonymisation techniques. When this is done effectively, the anonymised information cannot be used by the recipient to identify the data subject either directly or indirectly, taking into account ‘all the means reasonably likely’ to be used by them. This is otherwise known as ‘a state of being rendered anonymous in the hands of the recipient’.

(Source: HRA definitions, ICO)

Anonymous data (or effectively anonymised data)

Data that is no longer personally identifiable. Anonymised data is not considered as personal data under the UK General Data Protection Regulation. This means it is not subject to the same restrictions as personal data.

Anonymous data may be presented as general trends or statistics; for example, by removing direct identifiers such as NHS number and name, putting age into an age range (such as 25 to 40) and grouping postcodes together.

Information about small groups or people with rare conditions could potentially allow someone to be identified and would not be considered anonymous. But the risk of reidentification does not have to be completely removed for data to be considered anonymous, provided the risk is mitigated sufficiently to meet anonymisation requirements for the intended recipient. Any onward transfer of (or remote access to) the data may change its status back to personal data, depending on any additional information and means available to the onward recipient.

(Source: HRA definitions, ICO)
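As a rough illustration of the techniques mentioned above (an illustrative sketch only, not ICO-endorsed code), the example below removes direct identifiers, puts age into a range and keeps only the outward part of a postcode. All field names, the 15-year band size and the sample record are hypothetical.

```python
def anonymise(record: dict) -> dict:
    """Remove direct identifiers and generalise quasi-identifiers."""
    out = dict(record)
    # Remove direct identifiers such as NHS number and name.
    out.pop("nhs_number", None)
    out.pop("name", None)
    # Put age into a 15-year age range (the band size is an arbitrary choice).
    age = out.pop("age")
    low = (age // 15) * 15
    out["age_range"] = f"{low} to {low + 14}"
    # Group postcodes by keeping only the outward code (e.g. "LS1 4AP" -> "LS1").
    out["postcode_area"] = out.pop("postcode").split()[0]
    return out

record = {"nhs_number": "9434765919", "name": "A Patient",
          "age": 33, "postcode": "LS1 4AP", "condition": "asthma"}
anonymised = anonymise(record)
print(anonymised)  # {'condition': 'asthma', 'age_range': '30 to 44', 'postcode_area': 'LS1'}
```

As the entry above notes, generalising fields like this is not sufficient on its own: small groups or rare conditions may still allow reidentification, so the residual risk must be assessed for the intended recipient.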

Artificial intelligence (AI)

The use of digital technology to create systems capable of performing tasks commonly thought to require human intelligence.

For example, an AI system may analyse radiography images and detect tumours in cancer patients.

(Source: AI dictionary)

Terms starting with B

Best-practice guidance

It is not a legal requirement to follow best-practice guidance when developing or adopting digital healthcare technologies. But following the guidance makes it more likely the technology will be successfully adopted for use in health or social care. This is because the guidance sets out the minimum requirements expected by funders, commissioners and national organisations such as NICE.

(Source: AI and Digital Regulations Service)

Terms starting with C

Clinical performance

The output of a technology resulting from the analysis and evaluation of clinical data to assess whether the technology is performing as intended, to the benefit of patients or service users. This would be assessed both before a technology is granted market access and after it is deployed in a health or social care service.

(Source: Vasey et al. 2022)

Confidential Patient Information (or confidential patient and service user information)

A legal term defined in section 251 (11) of the National Health Service Act 2006. Confidential patient information both identifies the patient and includes some information about their medical condition or treatment. For purposes such as developing a technology (that is, not providing direct care) there are legal requirements for accessing and using health and care data.

For a full definition of confidential patient information see HRA definitions.


Controller

A legal data-protection term. A controller is a 'natural' or 'legal' person, public authority, agency, or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data. In health and care research, the controller is expected to be the research sponsor.

Note: a natural person is a living, breathing human being. A legal person is not a natural person but has some of their legal rights (for example, a corporation or partnership).

(Source: ICO controllers and processors)

Terms starting with D

Data-driven technologies

Technologies that work by collecting, using and analysing patient and service-user data to support the care of individuals, NHS services, public health, or medical research and innovation. Artificial intelligence and machine-learning technologies are types of data-driven technologies.

(Source: Our data-driven future in healthcare from the Academy of Medical Sciences)


Deployment

A term that has more than 1 meaning but for the purposes of this website is defined as:

The overall act of placing a technology into a health or care service for use within its intended purpose; that is, providing health or social care. Includes risk stratification and therapeutic, diagnostic, monitoring or screening purposes.

(Source: AI and Digital Regulations Service, HRA definitions)


Developer

A person or organisation responsible for a digital healthcare technology, typically because they developed the technology.

For medical devices, this website uses the term ‘developer’ to mean the ‘legal manufacturer’. The developer is legally responsible for any device it places on the UK market. But a developer based outside the UK has to appoint a UK Responsible Person to place the device on the market on its behalf. In this case, the UK Responsible Person is legally responsible and accountable for the device (not the developer).

(Source: AI and Digital Regulations Service)

Digital healthcare technologies

Software and apps used to improve health outcomes or to improve the health and social care system. These include:

  • regulated medical devices that are software as a medical device (SaMD), AI as a medical device (AIaMD) or part of a medical device
  • software including apps designed to help people manage their own health and wellbeing
  • software designed to help the health and care system run more efficiently or help staff manage their time, staffing or resources

Does not usually include software used in operational or administrative tasks such as electronic patient records or used for automating processes. But such software can be a digital healthcare technology if it impacts how care is provided or used by patients or service users. Examples include triaging patients to determine the urgency of their need for treatment, or deciding which type of professional delivers their care.

(Source: AI and Digital Regulations Service)

Direct care (or individual care)

A clinical, social or public health activity for preventing, investigating or treating a person’s illness or alleviating their suffering.

Includes some wider activities such as local clinical audit to check that care is being provided in line with standards. But only when these activities are done by health or social care professionals who have a legitimate relationship with the person for providing their care.

For a full definition of direct care see Information: To share or not to share? The Information Governance Review from the National Data Guardian.

(Source: AI and Digital Regulations Service)


Drift

A change in the performance of a machine learning model over time because the distribution of data it is applied to alters. Drift can cause a model's accuracy and reliability to decrease or increase, which may impact effectiveness.

Includes changes that occur to a technology, its environment or target variable that result in performance changes and the technology not meeting its intended purpose.

An example is data drift, when the ‘real-world’ data changes over time compared with the training data.

Continuous monitoring and retraining of models is necessary to track and manage drift.

(Source: AI and Digital Regulations Service)
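The data-drift example above can be sketched as a simple statistical check on one feature (an illustrative sketch only: the two-standard-error threshold and the age figures are invented, and real drift monitoring uses richer tests over many features).

```python
import statistics

def mean_drifted(training: list[float], live: list[float], z: float = 2.0) -> bool:
    """Flag drift when the live mean moves more than z standard errors
    away from the mean of the training data."""
    mu = statistics.mean(training)
    se = statistics.stdev(training) / len(live) ** 0.5
    return abs(statistics.mean(live) - mu) > z * se

# Patient ages seen at training time vs. in recent 'real-world' use.
training_ages = [52, 61, 58, 49, 55, 60, 57, 53]
live_ages = [31, 28, 35, 30, 33]  # the population served has changed

print(mean_drifted(training_ages, live_ages))  # True: retraining may be needed
```

A check like this is the kind of signal continuous monitoring would raise so the model can be retrained before its performance degrades unnoticed.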

Terms starting with E

End user

The person for whom the use of a technology is designed, as outlined in its intended purpose by the developer. This is different from a service user.

(Source: AI and Digital Regulations Service)


Explainability

A measure of how understandable, or explainable, the decisions of an AI system are to humans.

For example: an AI that predicts which patients are most in need of surgery should be able to explain why it has prioritised patients in a certain way.

XAI (eXplainable Artificial Intelligence) means humans can understand how the results of an AI model were obtained.

(Source: AI dictionary)

Terms starting with I


Integration

A term that has more than 1 meaning but for the purposes of this website is defined as:

The process that has to be followed before a technology is used to provide care to patients or users (deployed) in a health or care service. Includes the work needed to place the technology into an established system such as a care pathway or a suite of software. Can mean the entire process of moving the digital technology from the developer’s ownership into an existing service, so that it can be used for health or social care after deployment. Integration could include data sharing agreements, commercial agreements, software or data engineering to make the technology compatible with existing systems, local validation, training staff to implement and monitor the technology, and planning for monitoring processes.

(Source: AI and Digital Regulations Service)

Intended purpose

The objective intent of the manufacturer [developer] regarding the use of a technology, process or service as reflected in the specifications, instructions and information provided by the developer.

(Source: Software as a medical device (SaMD): key definitions from the International Medical Device Regulators Forum)

Terms starting with L

Legacy system

A technology that is still in use but is outdated or no longer supported by its developer. Relates to a digital healthcare technology that is one or more of the following:

  • considered an end-of-life technology
  • out of support from the developer
  • impossible to update
  • no longer cost effective

(Source: AI and Digital Regulations Service)

Local validation

The activities done before full deployment to check a digital healthcare technology will achieve its required performance levels and intended purpose when deployed in a specific health or social care service. May include activities such as local calibration, pilot studies or silent-mode testing alongside existing processes to evaluate the efficacy of the technology.

(Source: AI and Digital Regulations Service)

Terms starting with M

Medical device

Any instrument, apparatus, appliance, software, material or other article, whether used alone or in combination, together with any accessories, including the software intended by its manufacturer to be used specifically for diagnosis or therapeutic purposes or both and necessary for its proper application, which is intended by the manufacturer to be used for human beings for the purpose of:

  • diagnosis, prevention, monitoring, treatment or alleviation of disease
  • diagnosis, monitoring, treatment, alleviation of or compensation for an injury or handicap
  • investigation, replacement or modification of the anatomy or of a physiological process, or control of conception

A medical device does not achieve its main intended action by pharmacological, immunological or metabolic means although it can be assisted by these.

Includes devices intended to administer a medicinal product or which incorporate as an integral part a substance which, if used separately, would be a medicinal product and which is liable to act upon the body with action ancillary to that of the device.

(Source: The Medical Devices Regulations 2002)

Model training

A process, needed by most types of AI, which uses data to build a model or algorithm able to predict future cases.

For example: researchers have trained models to predict Covid-19 from patients' X-rays using the National Covid-19 Chest Image Database.

(Source: AI dictionary)
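As a minimal illustration of the idea (not the Covid-19 example above, which uses deep learning on images), the sketch below "trains" the simplest possible model, a one-variable least-squares line, on invented (dose, response) pairs and then uses it to predict a future case.

```python
# Invented training data: (dose, response) pairs.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.0)]

n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)

# The "training" step: a closed-form least-squares fit of y = slope*x + intercept.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def predict(x: float) -> float:
    """Use the trained model to predict a future case."""
    return slope * x + intercept

print(predict(5.0))
```

Real model training differs mainly in scale: far more data, far more parameters and an iterative fitting procedure rather than a closed-form solution, but the principle of fitting parameters to data so the model can predict unseen cases is the same.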

Terms starting with P

Patient or service user information

Any information (however recorded) that relates to the physical or mental health or condition of an individual, to the diagnosis of a condition, or to their care or treatment. Applies to any information (however recorded) which is to any extent derived, directly or indirectly, from such information, whether or not the identity of the individual in question is ascertainable from the information.

Note: this is different from information for users such as a product’s instructions for use.

(Source: Section 251(10) of the National Health Service Act 2006)

Post-market surveillance

Activities done by the manufacturer [developer] of a medical device after it is placed on the market to make sure it meets appropriate standards of safety and performance for as long as it is in use.

(Source: Medicines and Healthcare products Regulatory Agency)


Processor

A legal data-protection term. A person or organisation who has permission to use (process) personal data that belongs to a data controller. Permission to process this data will be for specific purposes, such as providing a healthcare service or assisting with diagnosis.

For full legal definitions, see the ICO's guidance on controllers and processors.

Product-specific user training

The training needed to use a specific digital healthcare technology safely and effectively in health or social care.

There is not a direct legal requirement for training under the Medical Devices Regulations 2002, but it is good practice. And if the developer determines that training of intended users is required for the device to meet safety requirements, the adopter is legally required to provide it.

(Source: AI and Digital Regulations Service, Health and Safety Executive training and competence)


Pseudonymisation

A technique that replaces or removes information from personal data so that a specific individual cannot be identified without additional information. Such additional information has to be kept separately from the personal data.

Encoding of personal data is an example of pseudonymisation. A specific individual cannot be identified from encoded data without a code key.

For full legal definitions, see the definition of pseudonymisation in the UK GDPR and ICO guidance.
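A minimal sketch of the encoding example above, assuming a keyed hash (HMAC) as the encoding technique. The key plays the role of the "additional information" (the code key) and must be stored separately from the pseudonymised data; without it, the individual cannot be identified from the code alone.

```python
import hashlib
import hmac
import secrets

# The code key: kept separately from the pseudonymised data,
# e.g. in a key vault, not alongside the records.
key = secrets.token_bytes(32)

def pseudonymise(nhs_number: str, key: bytes) -> str:
    """Replace an identifier with a keyed code. The same person always
    gets the same code, so records can still be linked."""
    return hmac.new(key, nhs_number.encode(), hashlib.sha256).hexdigest()[:16]

code = pseudonymise("9434765919", key)
same = pseudonymise("9434765919", key)  # deterministic: same person, same code
```

Because the mapping is deterministic under one key, records about the same person can be linked for research while the identifier itself is withheld; anyone holding the key can re-identify, which is why pseudonymised data is still personal data.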

Terms starting with S

Service evaluation

Evaluation designed and conducted solely to define or judge current care in a specific service. It should answer the question ‘what standard does this service achieve?’ for that service only. It should measure current service without reference to a standard and involve an intervention in-use only.

The choice of treatment is that of the clinician and patient or social care professional and service user according to guidance, professional standards or patient or service-user preference, and this should happen before service evaluation.

Usually involves analysis of existing data but may include collection of data using an interview or questionnaire. There should be no randomisation.

(Source: HRA's decision tool 'Is my study research?')

Service user

Any person whose health or care is affected by use of a digital healthcare technology. This is different from an end user.

(Source: AI and Digital Regulations Service)

Silent-mode testing

The process in which the technology is run in parallel to the existing processes and clinical workflow to assess its real-world performance. Done during the deployment and integration phases of the technology’s lifecycle. The purpose is to spot any risks or safety concerns that require mitigations and to calibrate to local requirements, and to do so before the technology is used to provide care.

(Source: AI and Digital Regulations Service)

Special categories of personal data

Personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a 'natural' person, data concerning health or data concerning a natural person's sex life or sexual orientation.

Note: a natural person is a living, breathing human being.

(Source: HRA definitions, Article 9 UK GDPR)

Sponsor

The organisation or partnership that takes overall responsibility for proportionate, effective arrangements being in place to set up, run and report a research project.

All health and social care research should have a sponsor. This includes all research involving NHS patients, their tissue or information.

Two or more organisations may agree to act as co-sponsors or joint sponsors. Co-sponsors allocate specific sponsor responsibilities between them. Joint sponsors each accept liability for all the sponsor responsibilities.

A sponsor can delegate specific tasks to another individual or organisation that is willing and able to accept them. Any co-sponsorship, joint sponsorship or delegation of tasks to another party should be formally agreed and documented by the sponsors.

(Source: Health Research Authority guidance on roles and responsibilities)

Synthetic data

Information that is artificially (algorithmically) created rather than generated by real-world events. Can simulate synthetic populations that resemble the characteristics and diversity of actual people. Can be generated to be statistically consistent with a real data set, which it may then replace or augment.

(Source: HRA definitions)
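A toy sketch of the idea: fit a simple distribution to a real data set (the ages below are invented stand-ins) and sample synthetic values that are statistically consistent with it. Real synthetic-data generators are far more sophisticated and must preserve relationships between many variables, not just one.

```python
import random
import statistics

# Stand-in for a real data set of patient ages.
real_ages = [34, 41, 29, 55, 47, 38, 62, 44, 51, 36]
mu, sigma = statistics.mean(real_ages), statistics.stdev(real_ages)

random.seed(0)  # reproducible for illustration only
# Generate a synthetic population resembling the real one's characteristics.
synthetic_ages = [round(random.gauss(mu, sigma)) for _ in range(1000)]

print(round(statistics.mean(synthetic_ages), 1), round(mu, 1))
```

The synthetic set can then replace or augment the real one for development or testing, without containing any actual individual's record.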

Terms starting with T


Properties and information associated with a technology that allow the user to understand how and why it was developed and how it produces its predictions (outputs). May include information about the datasets used to train and test the model and the performance metrics (such as accuracy) for the outputs, including key subgroups. These features or categories of transparency are normally publicly available.

A feature of transparency is model explainability; that is, using specific tools to provide insights into the inner processes of how a machine learning model produces recommendations, decisions or outputs. Details about explainability and the tools used may be part of transparency.

Transparency is sometimes used more broadly than this, such as an organisation being transparent about its use of AI. This might involve sharing information with the public and key stakeholders on why it chose to use AI, what goals it seeks to achieve and who will be affected by its use.

(Source: AI and Digital Regulations Service)

Terms starting with U

UK Responsible Person

A legal term relating to placing medical devices on the UK market. A developer (legal manufacturer) based outside the UK is required to appoint a UK Responsible Person to act on its behalf. The UK Responsible Person must provide written evidence of this.

When a UK Responsible Person places a device on the UK market on behalf of a developer, the UK Responsible Person has legal responsibility and accountability for that device.

For a full definition, see the Medicines and Healthcare products Regulatory Agency’s guidance on the UK Responsible Person.

Terms starting with V


Vendor

Any individual or organisation that is promoting, supplying, selling or planning to sell a digital healthcare technology to a health or social care provider. The vendor may be the developer of the technology or a third-party agent that has not been involved in developing the technology, such as a sales organisation.

If the vendor is a third-party agent, they should clarify who is accountable for the performance and safety of the technology. For medical devices, the party that placed the device on the UK market is legally responsible and accountable for it. The vendor should also provide information about the developer. This is needed for post-market surveillance.

(Source: AI and Digital Regulations Service)
