Acceptable Use Policy (AUP): Scope, Required Clauses, and Common Pitfalls

An Acceptable Use Policy (AUP) is the document that tells employees and contractors how they may and may not use the organisation's information systems, devices, networks and data. It is the bridge between ISO 27001 Annex A.5.10 and real disciplinary enforcement.

What this policy is

An Acceptable Use Policy is the principal human-behaviour control in an information security programme. While technical controls (firewalls, endpoint protection, DLP) prevent what they can, the residual risk always includes actions people take intentionally or inadvertently: clicking on a phishing link, pasting customer data into a public AI model, installing a browser extension, forwarding a confidential document to a personal address. The AUP is the organisation's documented statement of what is and is not permitted — and crucially, it is the reference document that makes enforcement actions legally defensible.

There is no single statute requiring an AUP by that name. But several overlapping obligations make it indispensable. ISO/IEC 27001:2022 Annex A.5.10 explicitly requires “rules for the acceptable use of information and associated assets” to be identified, documented and implemented. Article 32 of the UK GDPR requires “appropriate technical and organisational measures” to protect personal data — the AUP is the organisational anchor. SOC 2 Common Criteria CC6.1 treats acceptable-use rules as a prerequisite for effective logical access control. In regulated sectors the expectation is more specific still: the FCA's SYSC 3 for financial services, the NHS Data Security and Protection Toolkit, and the PCI DSS v4.0 requirement 12.3 all assume an AUP exists.

The policy matters because it is the bridge between technical controls and legal consequences. Without it, the organisation cannot credibly argue at tribunal that an employee knew a particular action was prohibited, cannot claim it had taken reasonable organisational measures under Article 32, and cannot satisfy the auditor that CC6.1 has been met.

Who needs one

Every organisation with more than a handful of employees or contractors using information systems needs an AUP. Specifically:

  • Any organisation pursuing ISO/IEC 27001 certification — the auditor will ask for it at Stage 1.
  • SOC 2 Type I or Type II candidates — evidence of an AUP and acknowledgement log is collected as part of the CC6.1 control test.
  • UK employers processing personal data — to satisfy Article 32 and to meet the ICO's expectation of documented security measures.
  • Organisations handling regulated data — payment card data (PCI DSS), health data (NHS DSP Toolkit, HIPAA), classified public-sector data (HMG Security Policy Framework), financial client data (FCA SYSC).
  • Any organisation intending to enforce disciplinary consequences for IT misuse — without a written policy the employee's defence at tribunal (“I did not know that was prohibited”) is dramatically stronger.

The AUP applies to employees, contractors, consultants, interns and third-party users of company systems. It should state so explicitly in its scope clause; the commonest gap at audit is an AUP that technically only binds “employees” while the breach came from a contractor.

What must be in it

A defensible, audit-ready AUP covers the following clauses. Each is explained briefly; a full policy extends each into operational detail appropriate to the environment.

  • Scope — who is bound (employees, contractors, consultants, interns, temporary staff, vendors), which devices (company-issued, BYOD, peripherals), which networks (corporate LAN, VPN, Wi-Fi, cloud services), and which data (all information created, received or stored in the course of work).
  • Permitted use — a positive statement that systems are provided for authorised business purposes and that limited reasonable personal use is permitted, subject to the prohibitions below. Some organisations opt for a stricter business-only rule; both are lawful if consistently applied.
  • Prohibited use — explicit list of forbidden behaviours: unlawful activity; harassment; discrimination; accessing or distributing obscene, hateful or violent material; circumventing security controls; sharing credentials; using personal email for company business; storing company data on unauthorised personal storage.
  • Email and messaging — use of company email for business communications, prohibition on auto-forwarding to personal addresses, rules on chain emails, expectations around tone and professionalism, handling of confidential attachments, signature requirements.
  • Web and social media — rules on browsing categories (gambling, adult content typically blocked), expectations when identifying as an employee online, prohibition on disclosing confidential or non-public information on social platforms, interaction with media enquiries.
  • AI tools and copilots — which generative-AI tools are approved, a prohibition on inputting confidential, personal or regulated data into consumer-tier models, requirement for human review of AI-generated output before it is used for decisions, prohibition on passing AI output off as independent work in regulated contexts, and a cross-reference to any dedicated AI use policy.
  • Personal use — explicit boundaries (time, bandwidth, content types), reminder that personal use remains subject to monitoring, clarification that personal data created on company systems may be caught by subject access requests.
  • Monitoring and privacy notice — what is monitored (email metadata, web browsing, endpoint telemetry, cloud access logs), the lawful basis (typically legitimate interests with an LIA documented), who has access to the logs, retention periods, and how covert monitoring would be authorised in the exceptional case. This clause is the monitoring notice for UK GDPR purposes; missing or inadequate notice here is a frequent tribunal finding.
  • Bring-your-own-device (BYOD) — high-level rules on personal devices touching company data, with a cross-reference to the separate BYOD Policy for detail.
  • Software installation — prohibition on installing unapproved applications, browser extensions, or developer tools; the request route for approved additions; specific mention of code-generation plugins and VS Code extensions which are increasingly scrutinised.
  • Acceptable handling of confidential data — reference to the information classification scheme (public, internal, confidential, restricted), rules on sharing each class, encryption expectations in transit and at rest, clean-desk and clean-screen expectations.
  • Incident reporting — the user's duty to report suspected incidents (phishing, lost device, suspected compromise, misaddressed email containing personal data) promptly and through the named channel, with an express no-blame statement to encourage reporting. Cross-reference the Incident Response Plan.
  • Enforcement and disciplinary consequences — tiering of breach severity, the range of possible outcomes from informal warning through formal disciplinary action to summary dismissal for gross misconduct, and the reservation of legal remedies for unlawful conduct.

The policy should also carry a version number, owner, review date, and a mandatory acknowledgement statement captured at onboarding and re-acknowledged annually or on material change.
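These governance fields lend themselves to a simple automated check. The sketch below validates that a policy record carries them before publication; the field names and the `aup` record are illustrative assumptions, not a PolicySuite schema.

```python
# Minimal sketch: confirm a policy record carries the governance fields
# listed above (version, owner, review date, acknowledgement requirement).
# Field names are illustrative assumptions, not a real product schema.
from datetime import date

REQUIRED_FIELDS = {"version", "owner", "review_date", "acknowledgement_required"}

def missing_governance_fields(policy: dict) -> set:
    """Return the required governance fields absent from a policy record."""
    return REQUIRED_FIELDS - policy.keys()

aup = {
    "title": "Acceptable Use Policy",
    "version": "3.1",
    "owner": "Head of Information Security",
    "review_date": date(2025, 6, 1),
    "acknowledgement_required": True,
}

assert missing_governance_fields(aup) == set()
assert missing_governance_fields({"title": "Draft"}) == REQUIRED_FIELDS
```

A check like this can run in a publication pipeline so that a policy without an owner or review date never reaches the intranet.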

Common pitfalls

1. Copying a US template into a UK policy

Template AUPs imported from US sources often treat monitoring as an unconstrained employer right. UK and EU law treat monitoring as processing of personal data requiring a lawful basis, a notice, and proportionality. A US-style “we may monitor at will for any reason” clause is not enforceable under UK law and undermines the rest of the policy.

2. No mention of AI tools

AUPs written before 2023 almost universally miss generative AI. Employees, inferring that silence equals permission, have pasted customer data into consumer ChatGPT accounts, generated code that incorporates unvetted open-source licences, and used image models in ways that raise IP questions. Adding an AI clause is now the single highest-leverage AUP update.

3. Inconsistent enforcement

A policy prohibiting personal use that is routinely ignored by managers is worse than no policy: at tribunal, selective enforcement is evidence of unfair treatment. Either enforce the rule consistently or rewrite it to match observed practice.

4. No acknowledgement record

A PDF sitting on the intranet is not an acknowledged policy. Use tracked acknowledgement (magic-link or HRIS-gated) so that for every employee, on every version, there is a time-stamped record. PolicySuite's distribution and acknowledgement module exists precisely for this.
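The requirement is concrete: for every employee, on every version, a time-stamped record. An illustrative sketch of that data model and the resulting coverage check (this is an assumption for exposition, not PolicySuite's actual implementation):

```python
# Illustrative sketch: a time-stamped acknowledgement log, and a check for
# employees who have not yet acknowledged the current policy version.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Acknowledgement:
    employee_id: str
    policy_version: str
    acknowledged_at: datetime  # the time-stamped record, per employee, per version

def outstanding(employees: set, log: list, current_version: str) -> set:
    """Employees with no acknowledgement recorded for the current version."""
    acknowledged = {a.employee_id for a in log if a.policy_version == current_version}
    return employees - acknowledged

log = [
    Acknowledgement("e001", "3.1", datetime(2025, 1, 6, 9, 15)),
    Acknowledgement("e002", "3.0", datetime(2024, 2, 1, 14, 0)),  # old version only
]
# e002 acknowledged an earlier version; e003 never acknowledged at all.
print(outstanding({"e001", "e002", "e003"}, log, "3.1"))
```

Whatever the storage mechanism, the audit question it must answer is the same: who has not acknowledged the version currently in force.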

5. Over-broad monitoring clause

An AUP stating “we may monitor all activity on company systems at any time for any reason” fails proportionality and transparency. State what is monitored, why, and who has access. Covert monitoring should be treated as exceptional and require documented authorisation.

6. Missing BYOD and remote-work scope

Older AUPs assume office-bound staff on company laptops. Post-2020 reality is that a large share of policy-relevant actions happen on home networks, personal devices, and cloud services outside the corporate perimeter. If the AUP scope does not follow the data, it does not meet Article 32.

7. Treating the AUP as a static document

Without a review cadence, the AUP drifts out of line with the technology stack. An annual review, plus a review on any material change (new SaaS platform, new AI tool, new device class), keeps it usable.

Framework mapping

A well-drafted AUP contributes directly to controls in the major information security frameworks.

Acceptable Use Policy mapped to frameworks
  • ISO/IEC 27001:2022, Annex A.5.10: rules for the acceptable use of information and associated assets identified, documented and implemented.
  • NIST CSF 2.0, PR.AA (Identity Management, Authentication, Access Control) and PR.DS (Data Security): user awareness and data-handling rules supporting the PROTECT function.
  • SOC 2 (AICPA TSC), CC6.1 and CC1.4: logical-access controls, user responsibility statements, personnel ethics and commitment to integrity.
  • UK GDPR, Article 32: appropriate organisational measures for the security of personal data; the AUP is the principal organisational measure.
  • PCI DSS v4.0, Requirement 12.3: acceptable use of end-user technologies defined, documented and acknowledged.
  • Cyber Essentials (UK), user access and secure configuration: documented rules supporting the five technical controls, particularly around software installation and access management.

How it fits with other policies

  • BYOD Policy — extends the AUP to personal devices; the AUP references it for device-specific rules.
  • Password Policy — covers credential hygiene (length, rotation, reuse prohibitions, MFA); the AUP points to it rather than duplicating.
  • Information Security Policy — the high-level governance document under which the AUP sits; the AUP operationalises the IS Policy for end users.
  • Incident Response Plan — the AUP obliges users to report; the IR plan is the machinery that processes the report.
  • AI Use Policy — where generative AI adoption has grown beyond a single clause, a standalone AI policy tightens specificity around prompts, data classes, attribution and IP.
  • Remote Work Policy — covers physical security, workstation hygiene, home network expectations; many AUPs fold this in, but a separate policy is cleaner in hybrid organisations.

Frequently asked questions

Is an Acceptable Use Policy legally required?

There is no single statute that names the AUP by title, but several obligations effectively require one. ISO/IEC 27001:2022 Annex A.5.10 requires documented rules for the acceptable use of information and associated assets. SOC 2's Common Criteria CC6.1 requires logical-access controls including user responsibility statements. Article 32 of the UK GDPR requires appropriate technical and organisational measures to secure personal data — an AUP is the organisational half of that. For regulated sectors (financial services under the FCA's SYSC, NHS organisations under the DSP Toolkit), the AUP is a specific expected artefact. In practical terms, any organisation above a handful of employees needs one.

Should the AUP ban personal use of company systems entirely?

A complete ban is usually counterproductive and rarely enforced consistently, which weakens the policy when it matters. The pragmatic position is to permit limited, reasonable personal use that does not interfere with work or expose the organisation to risk, while prohibiting specific behaviours (unlawful use, use that consumes disproportionate bandwidth or storage, use that creates confidentiality or reputational risk). The UK Employment Appeal Tribunal in Atkinson v Community Gateway Association (2014) treated a policy permitting reasonable personal use as the expected norm.

How does the AUP interact with employee monitoring?

Monitoring requires a separate lawful basis under UK GDPR (usually legitimate interests, assessed via a documented LIA) and a transparency notice. The AUP is where that notice is typically given: it tells employees that use of company systems is logged, who has access to the logs, and for what purposes. Monitoring without prior notice risks breach of both Article 5(1)(a) transparency and the implied term of trust and confidence in the employment contract. The ICO's 2023 employment practices guidance is explicit that covert monitoring should be exceptional.

What should the AUP say about generative AI tools?

At minimum, the AUP should state which AI tools are approved, prohibit the input of confidential, personal or regulated data into public-tier models, require human review before AI output is acted on, require attribution where policy or regulation demands it, and prohibit the use of AI to generate content that is then passed off as proprietary. Many organisations extend the AUP with a dedicated AI Use Policy as adoption deepens; the AUP then cross-references it.

How often should we have employees re-acknowledge the AUP?

At onboarding, on any material amendment, and at least annually thereafter. Annual re-acknowledgement is the practical floor because it demonstrates the organisation has kept the workforce aware — this matters at audit and in the event of a disciplinary proceeding where the employee claims they were unaware of a rule. Track acknowledgements formally; informal email confirmations are hard to defend.

Ready to implement this policy?

PolicySuite's InfoSec 38 Enterprise Policy Pack includes a full AUP pre-drafted for your jurisdiction, workforce model and tech stack, alongside 37 related security policies. Smaller organisations often start with the ISO 27001 Core Set (~£400) which includes the AUP and six companion documents.
